Publications
- ICLR 2025: The Optimization Landscape of SGD Across the Feature Learning Strength
- NeurIPS 2024: Why do Learning Rates Transfer? Reconciling Optimization and Scaling Limits for Deep Learning
- ICLR 2024: Towards Training Without Depth Limits: Batch Normalization Without Gradient Explosion
- RECOMB 2023: Aligning distant sequences to graphs using long seed sketches
Research Experience
Completed research internships at various institutions.
Education
Completed a Data Science MSc at ETH Zurich, working with Gunnar Rätsch and Francesco Orabona (KAUST), and a CS BSc at UPB, working with Iuliu Vasilescu.
Background
CS PhD student at Harvard, co-advised by Prof. Cengiz Pehlevan and Prof. Sham Kakade. Main interests are in deep learning theory and neural network optimization. Aims to understand how neural networks learn from data by developing a theoretical framework backed by empirical evidence.
Miscellany
Born in Romania. Did a lot of systems programming as an undergraduate. Advocate of open-source software. Hobbies include running, reading, swimming, and fixing bikes.