Jintao’s Homepage
About Me
I am a fifth-year Ph.D. student in the Department of Computer Science at Tsinghua University. Feel free to call me Qifan Zhang (张棋番), my preferred nickname.
Currently, I am a visiting scholar at UC Berkeley, working with Joseph E. Gonzalez in the Sky Computing Lab.
Advisors
Ph.D. Advisors: Prof. Jun Zhu and Prof. Jianfei Chen
 Master's Advisor: Prof. Guoliang Li
Research Interests
(1) Efficient Machine Learning Systems, specializing in accelerating model training and inference;
(2) Data Management, specializing in query optimization and high-quality data acquisition.
Selected Publications
Preprint SLA: Beyond Sparsity in Diffusion Transformers via Fine-Tunable Sparse–Linear Attention
 Jintao Zhang, Haoxu Wang, Kai Jiang, Shuo Yang, Kaiwen Zheng, Haocheng Xi, Ziteng Wang, Hongzhou Zhu, Min Zhao, Ion Stoica, Joseph E. Gonzalez, Jun Zhu, Jianfei Chen
 | paper |
Preprint A Survey of Efficient Attention Methods: Hardware-efficient, Sparse, Compact, and Linear Attention
 Jintao Zhang, Rundong Su, Chunyu Liu, Jia Wei, Ziteng Wang, Haoxu Wang, Pengle Zhang, et al. 
 | paper |
NeurIPS SageAttention3: Microscaling FP4 Attention for Inference and An Exploration of 8-Bit Training
 Jintao Zhang*, Jia Wei*, Pengle Zhang, Xiaoming Xu, Haofeng Huang, Haoxu Wang, Kai Jiang, Jun Zhu, Jianfei Chen
 2025, [spotlight paper], CCF-A, Research track, Full paper
 | paper |
ICML SpargeAttention: Accurate and Training-free Sparse Attention Accelerating Any Model Inference
 Jintao Zhang, Chendong Xiang, Haofeng Huang, Haocheng Xi, Jia Wei, Jun Zhu, Jianfei Chen
 2025, CCF-A, Research track, Full paper
 | paper |
ICML SageAttention2: Efficient Attention with Thorough Outlier Smoothing and Per-thread INT4 Quantization
 Jintao Zhang*, Haofeng Huang*, Pengle Zhang, Jia Wei, Jun Zhu, Jianfei Chen
 2025, CCF-A, Research track, Full paper
 | paper |
ICLR SageAttention: Accurate 8-Bit Attention for Plug-and-play Inference Acceleration
 Jintao Zhang, Jia Wei, Pengle Zhang, Jun Zhu, Jianfei Chen
 2025, TH-CPL-A, Research track, Full paper 
 | paper |
ICDE SAGE: A Framework of Precise Retrieval for RAG
 Jintao Zhang, Guoliang Li, Jinyang Su
 2025, CCF-A, Research track, Full paper
 |  paper |
SIGMOD PACE: Poisoning Attacks on Learned Cardinality Estimation
 Jintao Zhang, Guoliang Li, Chao Zhang, Chengliang Chai
 2025, CCF-A, Research track, Full paper 
 |  paper |
ICDE AutoCE: An Accurate and Efficient Model Advisor for Learned Cardinality Estimation
 Jintao Zhang, Chao Zhang, Guoliang Li, Chengliang Chai
 2023, CCF-A, Research track, Full paper
 |  paper |
TKDE A Lightweight Learned Cardinality Estimation Model
 Yaoyu Zhu, Jintao Zhang# (corresponding author), Guoliang Li#, Jianhua Feng
 2025, CCF-A, Research track, Full paper
 |  paper |
VLDB Learned Cardinality Estimation: A Design Space Exploration and A Comparative Evaluation
 Ji Sun*, Jintao Zhang* (equal contribution), Zhaoyan Sun, Nan Tang, Guoliang Li
 2022, CCF-A, Research track, Full paper
 | paper |
NeurIPS Sparse VideoGen2: Accelerate Video Generation with Sparse Attention via Semantic-Aware Permutation
 Shuo Yang, Haocheng Xi, Yilong Zhao, Muyang Li, Jintao Zhang, Han Cai, Yujun Lin, Xiuyu Li, Chenfeng Xu, Kelly Peng, Jianfei Chen, Song Han, Kurt Keutzer, Ion Stoica
 2025, [spotlight paper], CCF-A, Research track, Full paper
 | paper |
ICML Sparse VideoGen: Accelerating Video Diffusion Transformers with Spatial-Temporal Sparsity
 Haocheng Xi, Shuo Yang, Yilong Zhao, Chenfeng Xu, Muyang Li, Xiuyu Li, Yujun Lin, Han Cai, Jintao Zhang, Dacheng Li, Jianfei Chen, Ion Stoica, Kurt Keutzer, Song Han
 2025, CCF-A, Research track, Full paper
 | paper |
Awards
Oct. 2025: National Graduate Scholarship at Tsinghua University
May 2023: Siebel Scholars Award (Tsinghua's highest master's award)
Dec. 2020: President’s Scholarship of Xidian University (Top 0.02%)
Dec. 2020: National Undergraduate Scholarship
Dec. 2019: National Undergraduate Scholarship 