About Me

I am a final-year master's student at Tsinghua University, advised by Prof. Yangdong Deng. Before that, I received my bachelor's degree from Wuhan University in June 2022.

Currently, I am interested in efficient pre-training and inference for Large Language Models (LLMs), data efficiency, and related topics. I am committed to building software and algorithms that really work in practice.

Outside of research, I enjoy running, swimming, and reading books on history and sociology.

Email  /  Google Scholar  /  Github  /  Zhihu


🔥 News
  • [Aug. 2024] I was invited to serve as a reviewer for ICLR 2025.
  • [May 2024] CQIL was accepted to ACL 2024!
  • [May 2024] Attended ICLR 2024 in Vienna and came back inspired by many remarkable papers!
  • [Apr. 2024] Posted a preprint (CQIL) on arXiv.
  • [Jan. 2024] My first paper was accepted to ICLR 2024!
  • [Aug. 2022] Arrived at Tsinghua University and started my journey in Beijing.
  • [Jun. 2022] Received my bachelor's degree from the School of Computer Science at Wuhan University.

📑 Publications
CQIL: Inference Latency Optimization with Concurrent Computation of Quasi-Independent Layers
Longwei Zou, Qingyang Wang, Han Zhao, Jiangang Kong, Yi Yang, Yangdong Deng
ACL, 2024
github / arXiv

Parallelizes the layer structure of LLMs to reduce inference latency.

A Multi-Level Framework for Accelerating Training Transformer Models
Longwei Zou, Han Zhang, Yangdong Deng
ICLR, 2024
github / arXiv

Accelerates transformer training through a series of smaller models.


Design and source code from Jon Barron's and Tianxiang Sun's websites.