Li Shen
Associate Professor
Sun Yat-sen University
Educational Background:
- 2013–2017: South China University of Technology, School of Mathematics, Ph.D. (Full-time)
- 2009–2013: South China University of Technology, School of Mathematics, Bachelor's Degree (Full-time)
Work Experience:
- 2025–Present: Shenzhen Loop Area Institute, Center for Theory and Systems, Associate Professor
- 2024–Present: Sun Yat-sen University, School of Cyber Science and Technology, Associate Professor
- 2021–2024: JD.com, JD Explore Academy, Algorithm Scientist
- 2017–2021: Tencent, Tencent AI Lab, Senior Researcher
Li Shen is currently an associate professor at Sun Yat-sen University and the Shenzhen Loop Area Institute. Previously, he was a research scientist at JD Explore Academy, Beijing, and a senior researcher at Tencent AI Lab, Shenzhen. He received his bachelor's degree and Ph.D. from the School of Mathematics, South China University of Technology. His research interests include efficient deep learning, efficient reinforcement learning, optimization, and deep learning theory. He has served as a senior program committee member for AAAI and as an area chair for ICML, NeurIPS, ICLR, CVPR, and ACM MM. He is also an associate editor for IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Transactions on Knowledge and Data Engineering, and IEEE Transactions on Multimedia.
[1] Hongling Zheng, Li Shen* (co-corresponding author), Anke Tang, Yong Luo*, Han Hu*, Bo Du*, Yonggang Wen, Dacheng Tao: Learning from models beyond fine-tuning. Nat. Mach. Intell. 7(1): 6-17 (2025)
[2] Li Shen (first author), Anke Tang, Enneng Yang, Guibing Guo, Yong Luo, Lefei Zhang, Xiaochun Cao, Bo Du, Dacheng Tao: Efficient and Effective Weight-Ensembling Mixture of Experts for Multi-Task Model Merging. IEEE Trans. Pattern Anal. Mach. Intell. 48(3): 2329-2341 (2026)
[3] Li Shen (first author), Anke Tang, Yong Luo, Tao Sun, Han Hu, Xiaochun Cao: Targeted Low-rank Refinement: Enhancing Sparse Language Models with Precision. ICML 2025
[4] Zixuan Hu, Yongxian Wei, Li Shen (sole corresponding author), Zhenyi Wang, Baoyuan Wu, Chun Yuan, Dacheng Tao: Task-Distributionally Robust Data-Free Meta-Learning. IEEE Trans. Pattern Anal. Mach. Intell. 48(1): 436-447 (2026)
[5] Zhenyi Wang, Li Shen (sole corresponding author), Tiehang Duan, Yanjun Zhu, Tongliang Liu, Mingchen Gao, Dacheng Tao: Release the Potential of Memory Buffer in Continual Learning: A Dynamic System Perspective. IEEE Trans. Pattern Anal. Mach. Intell. 48(2): 1811-1824 (2026)
[6] Peng Mi, Li Shen (sole corresponding author), Tianhe Ren, Yiyi Zhou, Tianshuo Xu, Xiaoshuai Sun, Tongliang Liu, Rongrong Ji, Dacheng Tao: Systematic Investigation of Sparse Perturbed Sharpness-Aware Minimization Optimizer. IEEE Trans. Pattern Anal. Mach. Intell. 47(10): 8538-8549 (2025)
[7] Yan Sun, Li Shen (sole corresponding author), Hao Sun, Liang Ding, Dacheng Tao: Efficient Federated Learning Via Local Adaptive Amended Optimizer With Linear Speedup. IEEE Trans. Pattern Anal. Mach. Intell. 45(12): 14453-14464 (2023)
[8] Shiwei Liu, Yuesong Tian, Tianlong Chen, Li Shen (sole corresponding author): Don't Be So Dense: Sparse-to-Sparse GAN Training Without Sacrificing Performance. Int. J. Comput. Vis. 131(10): 2635-2648 (2023)
[9] Peng Wang, Li Shen (sole corresponding author), Zerui Tao, Shuaida He, Dacheng Tao: Generalization Analysis of Stochastic Weight Averaging with General Sampling. ICML 2024: 51442-51464
[10] Yan Sun, Li Shen (sole corresponding author), Shixiang Chen, Liang Ding, Dacheng Tao: Dynamic Regularized Sharpness Aware Minimization in Federated Learning: Approaching Global Consistency and Smooth Landscape. ICML 2023: 32991-33013