Jie Hao

Google Scholar ID: S8ZTkikAAAAJ
George Mason University
Bilevel optimization · Continual learning · Neural architecture search
Citations & Impact (all-time)
Citations: 123
H-index: 6
i10-index: 4
Publications: 14
Co-authors: 5
Publications
14 items (browse on Google Scholar)
Resume
Academic Achievements
  • Multiple papers accepted at top international conferences, including NeurIPS 2025, NeurIPS 2024, ICML 2024, ICLR 2024 (Spotlight), NeurIPS 2023, and UAI 2023.
  • Served as a reviewer for academic conferences including NeurIPS 2024, EMNLP 2024, and AISTATS 2024.
Research Experience
  • Participating in the AUDITION project, which aims to develop the mathematical foundations of a digital twin (DT) system for individuals with autism spectrum disorder (ASD), with a focus on dynamic modeling.
Education
  • Third-year Ph.D. student, Department of Computer Science, George Mason University, advised by Prof. Mingrui Liu and Prof. Jie Xu.
  • M.S. in Computer Science, University of Electronic Science and Technology of China, advised by Prof. William Zhu.
  • B.S. in Electrical Engineering, Sichuan University.
Background
  • Research interests: data selection for LLMs, bilevel optimization, continual learning, federated learning, and parameter-efficient fine-tuning of LLMs. Focused on designing efficient machine learning algorithms with theoretical guarantees for practical problems, particularly LLM pretraining and fine-tuning, continual learning, and meta-learning.
Miscellany
  • Looking for a research internship for summer 2026 related to large language model (LLM) training, optimization, or acceleration.