About Me
I'm currently a PhD student (since Fall 2021) at the School of Software, Tsinghua University, and a member of THUML, advised by Prof. Mingsheng Long.
My research interests cover Deep Learning and Machine Learning. I am currently working on deep learning applications for Time Series Analysis (e.g., Foundation Time Series Models, Large Models for Time Series, and Cross-modality Time Series Models). The goal of my research is to apply deep learning methodology to valuable real-world applications.
For more information, you may take a look at my publications.
Education
- Bachelor's Degree in Software Engineering (School of Software, Tsinghua University), 2017 to 2021.
- Bachelor's Degree in Economics (School of Economics and Management, Tsinghua University, Second Degree), 2018 to 2021.
- PhD Student in Software Engineering (School of Software, Tsinghua University), 2021 to present.
Publications & Preprints
LogME: Practical Assessment of Pre-trained Models for Transfer Learning
You, K., Liu, Y., Wang, J., & Long, M. (2021). LogME: Practical Assessment of Pre-trained Models for Transfer Learning. ICML 2021.
Non-stationary Transformers: Rethinking the Stationarity in Time Series Forecasting
Liu, Y., Wu, H., Wang, J., & Long, M. (2022). Non-stationary Transformers: Rethinking the Stationarity in Time Series Forecasting. NeurIPS 2022.
Ranking and Tuning Pre-trained Models: A New Paradigm for Exploiting Model Hubs
You, K., Liu, Y., Zhang, Z., Wang, J., Jordan, M. I., & Long, M. (2022). Ranking and Tuning Pre-trained Models: A New Paradigm for Exploiting Model Hubs. Journal of Machine Learning Research, 23, 1-47.
TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis
Wu, H., Hu, T., Liu, Y., Zhou, H., Wang, J., & Long, M. (2023). TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis. ICLR 2023.
Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors
Liu, Y., Li, C., Wang, J., & Long, M. (2023). Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors. NeurIPS 2023.
iTransformer: Inverted Transformers Are Effective for Time Series Forecasting
Liu, Y., Hu, T., Zhang, H., Wu, H., Wang, S., Ma, L., & Long, M. (2024). iTransformer: Inverted Transformers Are Effective for Time Series Forecasting. ICLR 2024.
AutoTimes: Autoregressive Time Series Forecasters via Large Language Models
Liu, Y., Qin, G., Huang, X., Wang, J., & Long, M. (2024). AutoTimes: Autoregressive Time Series Forecasters via Large Language Models. arXiv preprint.
Timer: Transformers for Time Series Analysis at Scale
Liu, Y., Zhang, H., Li, C., Huang, X., Wang, J., & Long, M. (2024). Timer: Transformers for Time Series Analysis at Scale. arXiv preprint.
TimeXer: Empowering Transformers for Time Series Forecasting with Exogenous Variables
Wang, Y., Wu, H., Dong, J., Liu, Y., Qiu, Y., Zhang, H., Wang, J., & Long, M. (2024). TimeXer: Empowering Transformers for Time Series Forecasting with Exogenous Variables. arXiv preprint.
Open Source
Deep Models for Time Series:
- Non-stationary Transformers - General Framework for Transformers, 400+ Stars.
- Koopa - Koopman Theory Inspired Non-stationary Time Series Predictor.
- iTransformer - Foundation Multivariate Time Series Model, 700+ Stars.
- Timer - GPT-style Large Time Series Model for General Time Series Analysis.
- AutoTimes - Adopting LLMs as Autoregressive Time Series Forecasters.
Algorithm Library:
- Time Series Library - Committer, 4.3k+ Stars
- Transfer Learning Library - Committer, 3.1k+ Stars
System and Applications
- Apache IoTDB AINode - Native machine learning engine of the Apache IoTDB time series database.
Academic Services
- Conference Reviewer, International Conference on Machine Learning (ICML) 2024.
- Conference Reviewer, International Conference on Learning Representations (ICLR) 2024.
- Conference Reviewer, International Conference on Machine Learning (ICML) 2023.
- Conference Reviewer, Conference on Neural Information Processing Systems (NeurIPS) 2023.
- Conference Reviewer, International Conference on Machine Learning (ICML) 2022.
- Conference Reviewer, IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2023.
Honors & Awards
- Shenzhen Stock Exchange Scholarship (深交所奖学金, Top 1%), 2023.
- Outstanding Graduation Thesis of Beijing (北京市优秀毕业论文, Top 1%), 2021.
- Outstanding Graduate of Beijing (北京市优秀毕业生, Top 1%), 2021.
- Outstanding Graduate of Tsinghua University (清华大学优秀毕业生, Top 1%), 2021.
- Future Scholar Scholarship (未来学者奖学金, Top 1%), Tsinghua University, 2021.
- Boeing Scholarship (波音奖学金, Top 1%), Tsinghua University, 2020.
- Tang Lixin Scholarship (唐立新优秀奖学金, Top 1%), Tsinghua University, 2020.
- Jiang Nanxiang Scholarship (蒋南翔奖学金, Top 1%), Tsinghua University, 2019.
- Huawei Scholarship (华为奖学金, Top 1%), Tsinghua University, 2018.
Experience
- Member, Tsinghua University Machine Learning Group (THUML), 2021 to present.
- Teaching Assistant, Database System, Spring 2024, Prof. Wang.
- Teaching Assistant, Machine Learning, Fall 2023, Prof. Long.
- Teaching Assistant, Introduction to Artificial Intelligence, Spring 2023, Prof. Long.
- Teaching Assistant, Deep Learning, Fall 2022, Prof. Long.
- Teaching Assistant, Introduction to Artificial Intelligence, Spring 2022, Prof. Long.
- Teaching Assistant, Machine Learning, Fall 2021, Prof. Long.
- Teaching Assistant, Introduction to Artificial Intelligence, Spring 2021, Prof. Long.