Congliang Chen (陈淙靓)
Research Assistant Professor
Shenzhen Hetao College (深圳河套学院)
Education (reverse chronological):
- 2018–2025: The Chinese University of Hong Kong, Shenzhen — Ph.D. in Information Engineering (full-time)
- 2014–2018: Peking University — B.S. in Computer Science and Technology (full-time)
Work experience (reverse chronological):
- 2025–present: Research Assistant Professor, Shenzhen Hetao College
Congliang Chen received his B.S. from the School of Information Science and Technology at Peking University and his Ph.D. from The Chinese University of Hong Kong, Shenzhen. He is currently a Research Assistant Professor at Shenzhen Hetao College. His research focuses on machine learning theory, numerical computation, optimization algorithms for large language models, and operator generation and optimization. He proved mini-batch (multi-machine) acceleration for distributed Adam and designed an Adam variant with 1-bit communication; he also contributed to Adam-mini and related work on large-model optimization. His results have been published in JMLR and IEEE TSP and at NeurIPS, ICLR, and other venues.
Selected publications:
Towards practical Adam: Non-convexity, convergence theory, and mini-batch acceleration
Congliang Chen*, Li Shen*, Fangyu Zou*, and Wei Liu. Journal of Machine Learning Research 23, no. 229 (2022): 1–47.
Communication efficient primal-dual algorithm for nonconvex nonsmooth distributed optimization
Congliang Chen, Jiawei Zhang, Li Shen, Peilin Zhao, and Zhi-Quan Luo. In International Conference on Artificial Intelligence and Statistics, pp. 1594–1602. PMLR, 2021.
Adam-mini: Use fewer learning rates to gain more
Yushun Zhang*, Congliang Chen*, Ziniu Li, Tian Ding, Chenwei Wu, Diederik P. Kingma, Yinyu Ye, Zhi-Quan Luo, and Ruoyu Sun. In The Thirteenth International Conference on Learning Representations, 2025.
Adam can converge without any modification on update rules
Yushun Zhang, Congliang Chen, Naichen Shi, Ruoyu Sun, and Zhi-Quan Luo, Advances in neural information processing systems 35 (2022): 28386-28399.