Scholar
Hae Beom Lee
Google Scholar ID: 50_nxq0AAAAJ
Korea Advanced Institute of Science and Technology
Meta-learning
Homepage
Google Scholar
Citations & Impact
All-time
Citations
728
H-index
12
i10-index
13
Publications
19
Co-authors
0
Contact
Email
haebeomlee@korea.ac.kr
GitHub
Publications
1 item
Cost-Sensitive Freeze-thaw Bayesian Optimization for Efficient Hyperparameter Tuning
2025
Cited
0
Resume (English only)
Academic Achievements
Global Ph.D. Fellowship Program, 2019–2021
Google Ph.D. Fellowship Program, 2021
Outstanding Reviewer at ICML 2020 (Top 33%) and ICML 2022 (Top 10%)
Published multiple papers at top-tier conferences including:
- 'Cost-Sensitive Freeze-thaw Bayesian Optimization for Efficient Hyperparameter Tuning' (NeurIPS 2025)
- 'Bayesian Neural Scaling Laws Extrapolation with Prior-Fitted Networks' (ICML 2025)
- 'Delta-AI: Local Objectives for Amortized Inference in Sparse Graphical Models' (ICLR 2024)
- 'Online Hyperparameter Meta-Learning with Hypergradient Distillation' (ICLR 2022, spotlight)
- 'Sequential Reptile: Inter-Task Gradient Alignment for Multilingual Learning' (ICLR 2022)
- 'Large-Scale Meta-Learning with Continual Trajectory Shifting' (ICML 2021)
- 'MetaPerturb: Transferable Regularizer for Heterogeneous Tasks and Architectures' (NeurIPS 2020, spotlight)
- 'Meta-Learning for Short Utterance Speaker Recognition with Imbalance Length Pairs' (Interspeech 2020)
- Preprint: 'Dataset Condensation with Latent Space Knowledge Factorization and Sharing' (arXiv 2022)
Co-authors
0 (list not available)