Hu Baotian
Professor
HARBIN INSTITUTE OF TECHNOLOGY, SHENZHEN
Educational Background:
- 2012–2016: Ph.D. in Computer Science and Technology, Harbin Institute of Technology
- 2010–2012: M.S. in Computer Science and Technology, Harbin Institute of Technology
Work Experience:
- 2024–Present: Professor, School of Computing, Harbin Institute of Technology (Shenzhen)
- 2022–2024: Associate Professor, School of Computing, Harbin Institute of Technology (Shenzhen)
- 2019–2022: Assistant Professor, School of Computing, Harbin Institute of Technology (Shenzhen)
Hu Baotian, Ph.D., is a Professor and Doctoral Supervisor at Harbin Institute of Technology and Secretary-General of the Committee on Large Models and Generation of the Chinese Information Processing Society of China. A recipient of the National Science Fund for Outstanding Young Scholars, he focuses on language-centered large models and intelligent agents, including novel architectures, multimodal models, and agent systems. He has published over 100 papers in top venues such as NeurIPS, ACL, and ICML, with one first-authored paper cited over 1,800 times. He has led major projects funded by the NSFC and by industry partners including Huawei, Tencent, and Baidu. He served as Program Committee Co-Chair of IJCNLP-AACL 2023, is an Editorial Board Member of Neural Networks, and has served as Area Chair for ACL, EMNLP, and IJCAI. He welcomes research collaborations and applications from outstanding students.
Selected Publications:
1. Xinping Zhao, Xinshuo Hu, Zifei Shan, Shouzheng Huang, Yao Zhou, Zetian Sun, Zhenyu Liu, Dongfang Li, Xinyuan Wei, Qian Chen, Youcheng Pan, Yang Xiang, Meishan Zhang, Haofen Wang, Jun Yu, Baotian Hu*, Min Zhang. KaLM-Embedding-V2: Superior Training Techniques and Data Inspire A Versatile Embedding Model. International Conference on Learning Representations (ICLR), 2026 (CCF A, DOI: 10.48550/arXiv.2506.20923)
2. Yunxin Li, Xinyu Chen, Shenyuan Jiang, Haoyuan Shi, Zhenyu Liu, Xuanyu Zhang, Nanhao Deng, Zhenran Xu, Yicheng Ma, Meishan Zhang, Baotian Hu*, Min Zhang*. Uni-MoE-2.0-Omni: Scaling Language-Centric Omnimodal Large Model with Advanced MoE, Training and Data. arXiv preprint arXiv:2511.12609, 2025 (DOI: 10.48550/arXiv.2511.12609)
3. Zhenyu Liu, Yunxin Li, Xuanyu Zhang, Qixun Teng, Shenyuan Jiang, Xinyu Chen, Haoyuan Shi, Jinchao Li, Qi Wang, Haolan Chen, Fanbo Meng, Mingjun Zhao, Yu Xu, Yancheng He, Baotian Hu*, Min Zhang. UniMoE-Audio: Unified Speech and Music Generation with Dynamic-Capacity MoE. arXiv preprint arXiv:2510.13344, 2025 (DOI: 10.48550/arXiv.2510.13344)
4. Yunxin Li, Shenyuan Jiang, Baotian Hu*, Longyue Wang, Wanqi Zhong, Wenhan Luo, Lin Ma, Min Zhang. Uni-MoE: Scaling Unified Multimodal LLMs with Mixture of Experts. IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2025 (CCF A, DOI: 10.48550/arXiv.2405.11273)
5. Haoyuan Shi, Yunxin Li, Xinyu Chen, Longyue Wang, Baotian Hu*, Min Zhang*. AniMaker: Multi-Agent Animated Storytelling with MCTS-Driven Clip Generation. SIGGRAPH Asia, 2025 (CCF A, DOI: 10.48550/arXiv.2506.10540)
6. Yunxin Li, Baotian Hu*, Haoyuan Shi, Wei Wang, Longyue Wang, Min Zhang*. VisionGraph: Leveraging Large Multimodal Models for Graph Theory Problems in Visual Context. International Conference on Machine Learning (ICML), 2024 (CCF A, DOI: 10.48550/arXiv.2405.04950)
7. Dongfang Li, Zhenyu Liu, Xinshuo Hu, Zetian Sun, Baotian Hu*, Min Zhang. In-Context Learning State Vector with Inner and Momentum Optimization. Neural Information Processing Systems (NeurIPS), 2024 (CCF A, DOI: 10.48550/arXiv.2404.11225)
8. Yuxiang Wu*, Baotian Hu*. Learning to Extract Coherent Summary via Deep Reinforcement Learning. Proceedings of the AAAI Conference on Artificial Intelligence (AAAI), 2018 (CCF A, DOI: 10.1609/aaai.v32i1.11987)
9. Baotian Hu, Qingcai Chen, Fangze Zhu. LCSTS: A Large-Scale Chinese Short Text Summarization Dataset. Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2015 (CCF B, DOI: 10.18653/v1/D15-1229)
10. Baotian Hu*, Zhengdong Lu, Hang Li, Qingcai Chen. Convolutional Neural Network Architectures for Matching Natural Language Sentences. Neural Information Processing Systems (NeurIPS), 2014 (CCF A, DOI: 10.48550/arXiv.1503.03244)