Congliang Chen

Research Assistant Professor

Center for AI Theoretical Foundation and Systems

Education Background
  • 2018-2025: The Chinese University of Hong Kong (Shenzhen), Computer and Information Engineering, Doctoral Degree (Full-time)
  • 2014-2018: Peking University, Computer Science and Technology, Bachelor's Degree (Full-time)
Work Experience

  • 2018-Present: Shenzhen Loop Area Institute, Research Assistant Professor
Research Fields
Core Research Interests: Machine Learning Theory, Numerical Computation, Optimization Algorithms for Large Language Models, Kernel Generation and Optimization
Class Type
AI Foundations & Systems
Personal Website
https://chcoliang.github.io
Email
chencongliang@slai.edu.cn
Biography

Congliang Chen received a B.S. from the School of EECS at Peking University and a Ph.D. from The Chinese University of Hong Kong, Shenzhen. He is currently a Research Assistant Professor at the Shenzhen Loop Area Institute. His research interests include machine learning theory, numerical computation, optimization algorithms for large language models, and kernel generation and optimization. He established convergence guarantees for distributed Adam with multi-worker acceleration and developed a communication-efficient Adam variant that enables training with roughly one bit of communication per parameter. He also contributed to Adam-mini and other work on optimization for large models. His work appears in JMLR, IEEE TSP, NeurIPS, and ICLR.

Academic Publications

Towards practical Adam: Non-convexity, convergence theory, and mini-batch acceleration
Congliang Chen*, Li Shen*, Fangyu Zou*, and Wei Liu, Journal of Machine Learning Research 23, no. 229 (2022): 1-47.

Communication efficient primal-dual algorithm for nonconvex nonsmooth distributed optimization
Congliang Chen, Jiawei Zhang, Li Shen, Peilin Zhao, and Zhi-Quan Luo, In International Conference on Artificial Intelligence and Statistics, pp. 1594-1602. PMLR, 2021.

Adam-mini: Use fewer learning rates to gain more
Yushun Zhang*, Congliang Chen*, Ziniu Li, Tian Ding, Chenwei Wu, Diederik P. Kingma, Yinyu Ye, Zhi-Quan Luo, and Ruoyu Sun, In The Thirteenth International Conference on Learning Representations.

Adam can converge without any modification on update rules
Yushun Zhang, Congliang Chen, Naichen Shi, Ruoyu Sun, and Zhi-Quan Luo, Advances in Neural Information Processing Systems 35 (2022): 28386-28399.
