About Me
I am a PhD student (since Fall 2021) at the School of Software, Tsinghua University, and a member of THUML, advised by Prof. Mingsheng Long.
My research interests cover deep learning and time series analysis. I am currently working on time series foundation models and machine learning systems (MLSys) for time series. Beyond pure research, I am also dedicated to promoting research on valuable real-world applications. My work aims to advance intelligent systems capable of handling massive and complex temporal data across domains, including finance, healthcare, industry, and the environment.
For more information, please take a look at my Google Scholar and GitHub.
News
- [Jun. 2025] An open-source time series foundation model (Sundial) is released here, which can generate multiple plausible predictions.
- [Jun. 2025] A generative foundation model for time series (Sundial) was accepted as an ICML 2025 Oral!
- [Mar. 2025] An open-source large time-series model (Timer) is released here, which can make zero-shot predictions.
- [Jan. 2025] A multivariate attention model for time series (Timer-XL) was accepted at ICLR 2025.
- [Jan. 2025] iTransformer for Ant Group Green Computing was awarded as an Outstanding Project of the CCF Fund.
- [Sep. 2024] Two papers (AutoTimes and TimeXer) were accepted at NeurIPS 2024.
- [Jun. 2024] A large model for time series (Timer) was accepted at ICML 2024. Code is available.
- [Jan. 2024] A deep model for multivariate forecasting (iTransformer) was accepted as an ICLR 2024 Spotlight!
- [Dec. 2023] The native AI analytical engine (AINode) of the Apache IoTDB database was released!
- [Oct. 2023] A deep model for non-stationary forecasting inspired by Koopman theory (Koopa) was accepted at NeurIPS 2023.
- [Apr. 2023] TimesNet for general time series analysis was accepted at ICLR 2023.
- [Oct. 2022] Non-stationary Transformers was accepted at NeurIPS 2022.
Publications & Preprints
- [ICML] Yong Liu*, Guo Qin*, Zhiyuan Shi, Zhi Chen, Caiyin Yang, Xiangdong Huang, Jianmin Wang, Mingsheng Long#. International Conference on Machine Learning, 2025.
- [ICLR] Yong Liu*, Guo Qin*, Xiangdong Huang, Jianmin Wang, Mingsheng Long#. International Conference on Learning Representations, 2025.
- [ICLR] Yong Liu*, Tengge Hu*, Haoran Zhang*, Haixu Wu, Shiyu Wang, Lintao Ma, Mingsheng Long#. International Conference on Learning Representations, 2024.
- [NeurIPS] Yong Liu*, Haixu Wu*, Jianmin Wang, Mingsheng Long#. Conference on Neural Information Processing Systems, 2022.
- [ICML] Yong Liu*, Haoran Zhang*, Chenyu Li*, Xiangdong Huang, Jianmin Wang, Mingsheng Long#. International Conference on Machine Learning, 2024.
- [NeurIPS] Yong Liu*, Chenyu Li*, Jianmin Wang, Mingsheng Long#. Conference on Neural Information Processing Systems, 2023.
- [ICLR] Haixu Wu*, Tengge Hu*, Yong Liu*, Hang Zhou, Jianmin Wang, Mingsheng Long#. International Conference on Learning Representations, 2023.
- [NeurIPS] Yong Liu*, Guo Qin*, Xiangdong Huang, Jianmin Wang, Mingsheng Long#. Conference on Neural Information Processing Systems, 2024.
- [NeurIPS] Yuxuan Wang*, Haixu Wu*, Jiaxiang Dong, Guo Qin, Haoran Zhang, Yong Liu, Yunzhong Qiu, Jianmin Wang, Mingsheng Long#. Conference on Neural Information Processing Systems, 2024.
- [arXiv] Yuxuan Wang*, Haixu Wu*, Jiaxiang Dong, Yong Liu, Mingsheng Long, Jianmin Wang#. arXiv preprint, 2025.
- [JMLR] Kaichao You*, Yong Liu*, Ziyang Zhang, Jianmin Wang, Michael I. Jordan, Mingsheng Long#. Journal of Machine Learning Research, 2022.
- [ICML] Kaichao You*, Yong Liu*, Jianmin Wang, Mingsheng Long#. International Conference on Machine Learning, 2021.
Selected Projects
HuggingFace Models
- Sundial - Generative time series foundation model (pre-trained on 1 trillion time points).
- Timer - Large time-series model (pre-trained on 260 billion time points).
Deep Models for Time Series
- iTransformer - Multivariate time series forecasting model.

- Timer - Datasets, checkpoints, and code for developing large time series models.

- Non-stationary Transformers - Transformers for non-stationary forecasting.

- Koopa - Theory-inspired efficient non-stationary time series forecaster.

- AutoTimes - LLM-based time series forecasting with texts.

Model Library
System and Applications
Invited Talks
- Foundation Models and Analytical System for Time Series at Apache IoTDB. [Slides]
- Exploring Large Models for Time Series at IoA, CAS. [Slides]
- Deep Learning for Time Series Applications at DoA, THU. [Slides]
- Large Models for Native Database Analysis at TPCTC 2024. [PDF]
Services
Academic Services
- Conference Reviewer, International Conference on Learning Representations (ICLR) 2024-2025.
- Conference Reviewer, International Conference on Machine Learning (ICML) 2022-2025.
- Conference Reviewer, IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2023.
- Conference Reviewer, International Conference on Very Large Databases (VLDB) 2023.
- Conference Reviewer, Conference on Neural Information Processing Systems (NeurIPS) 2023-2025.
Teaching Experiences
- Teaching Assistant, Database System, Spring 2024, Prof. Jianmin Wang.
- Teaching Assistant, Machine Learning, Fall 2023, Prof. Mingsheng Long.
- Teaching Assistant, Introduction to Artificial Intelligence, Spring 2023, Prof. Mingsheng Long.
- Teaching Assistant, Deep Learning, Fall 2022, Prof. Mingsheng Long.
- Teaching Assistant, Introduction to Artificial Intelligence, Spring 2022, Prof. Mingsheng Long.
- Teaching Assistant, Machine Learning, Fall 2021, Prof. Mingsheng Long.
- Teaching Assistant, Introduction to Artificial Intelligence, Spring 2021, Prof. Mingsheng Long.
Education
Honors & Awards
- Outstanding Graduation Thesis of Beijing (北京市优秀毕业论文), 2021.
- Outstanding Graduates of Beijing (北京市优秀毕业生), 2021.
- Excellent Graduates of Tsinghua (清华大学优秀毕业生), 2021.
- Future Scholar Scholarship (未来学者奖学金), 2021.
- Shenzhen Stock Exchange Scholarship (深交所奖学金), 2023.
- Boeing Scholarship (波音奖学金), 2020.
- Tang Lixin Scholarship (唐立新优秀奖学金), 2020.
- Jiang Nanxiang Scholarship (蒋南翔奖学金), 2019.
- Huawei Scholarship (华为奖学金), 2018.