Congliang Chen
Research Assistant Professor
Shenzhen Loop Area Institute
Educational Background (in reverse chronological order):
- 2018-2025: The Chinese University of Hong Kong (Shenzhen), Computer and Information Engineering, Doctoral Degree (Full-time)
- 2014-2018: Peking University, Computer Science and Technology, Bachelor's Degree (Full-time)
Work Experience (in reverse chronological order):
- 2025-Present: Shenzhen Loop Area Institute, Research Assistant Professor
Congliang Chen received a B.S. from the School of EECS at Peking University and a Ph.D. from The Chinese University of Hong Kong, Shenzhen. He is currently a Research Assistant Professor at the Shenzhen Loop Area Institute. His research interests include machine learning theory, numerical computation, optimization algorithms for large language models, and kernel generation and optimization. He proved convergence guarantees for distributed Adam with multi-worker acceleration and developed a communication-efficient Adam variant that enables 1-bit-per-parameter training. He also contributed to Adam-mini and other work on optimization for large models. His work appears in JMLR, IEEE TSP, NeurIPS, and ICLR.
Towards practical Adam: Non-convexity, convergence theory, and mini-batch acceleration
Congliang Chen*, Li Shen*, Fangyu Zou*, and Wei Liu, Journal of Machine Learning Research 23, no. 229 (2022): 1-47.
Communication-efficient primal-dual algorithm for nonconvex nonsmooth distributed optimization
Congliang Chen, Jiawei Zhang, Li Shen, Peilin Zhao, and Zhi-Quan Luo, In International Conference on Artificial Intelligence and Statistics, pp. 1594-1602. PMLR, 2021.
Adam-mini: Use fewer learning rates to gain more
Yushun Zhang*, Congliang Chen*, Ziniu Li, Tian Ding, Chenwei Wu, Diederik P. Kingma, Yinyu Ye, Zhi-Quan Luo, and Ruoyu Sun, In The Thirteenth International Conference on Learning Representations, 2025.
Adam can converge without any modification on update rules
Yushun Zhang, Congliang Chen, Naichen Shi, Ruoyu Sun, and Zhi-Quan Luo, Advances in Neural Information Processing Systems 35 (2022): 28386-28399.