Zhiwei Li

zli404 [AT] connect.hkust-gz.edu.cn

The Great Wall at Badaling, 2024.

Hello, I’m Zhiwei Li (李志伟). I’m currently pursuing my Ph.D. at the Hong Kong University of Science and Technology (Guangzhou), supervised by Prof. Zhijiang Guo. I previously completed my M.Sc. at the School of Data Science, Fudan University, where I was advised by Prof. Weizhong Zhang. Before that, I earned my B.Sc. from the School of Statistics and Data Science at Shanghai University of Finance and Economics in Summer 2023.

My current research centers on reasoning in large language models (LLMs), particularly multimodal LLMs, reinforcement learning, and multi-turn agents. Previously, I worked on model compression and acceleration, including LLM pruning, low-precision training, and accelerated sampling for diffusion models.

For a complete overview of my research and experience, please see my resume: CV.

news

Sep 19, 2025 Three papers accepted to NeurIPS 2025!
Aug 23, 2025 New journey begins at HKUST(GZ)!
Sep 27, 2024 One paper accepted to NeurIPS 2024!

selected publications

  1. From System 1 to System 2: A Survey of Reasoning Large Language Models
    Zhong-Zhi Li, Duzhen Zhang, Ming-Liang Zhang, Jiaxin Zhang, Zengyan Liu, Yuxuan Yao, Haotian Xu, Junhao Zheng, Pei-Jie Wang, Xiuyi Chen, Yingying Zhang, Fei Yin, Jiahua Dong, Zhiwei Li, Bao-Long Bi, Ling-Rui Mei, Junfeng Fang, Zhijiang Guo, Le Song, and Cheng-Lin Liu
    arXiv, 2025
  2. Computation and Memory-Efficient Model Compression with Gradient Reweighting
    Zhiwei Li*, Yuesen Liao*, Binrui Wu, Yuquan Zhou, Xupeng Shi, Dongsheng Jiang, Yin Li, and Weizhong Zhang
    NeurIPS, 2025
  3. Compress Large Language Models via Collaboration Between Learning and Matrix Approximation
    Yuesen Liao*, Zhiwei Li*, Binrui Wu, Zihao Cheng, Su Zhao, Shuai Chen, and Weizhong Zhang
    NeurIPS, 2025
  4. Low Precision Local Training is Enough for Federated Learning
    Zhiwei Li*, Yiqiu Li*, Binbin Lin, Zhongming Jin, and Weizhong Zhang
    NeurIPS, 2024