Tommi Jaakkola
Google Scholar ID: Ao4gtsYAAAAJ
MIT
machine learning · natural language processing · biomolecular design
Citations & Impact (all-time)
  • Citations: 38,265
  • H-index: 81
  • i10-index: 223
  • Publications: 20
  • Co-authors: 111
Resume
Academic Achievements
  • Introduced BoltzGen, an all-atom generative model for universal binder design, experimentally validated across eight wet-lab campaigns; model and code released under MIT license
  • Published 'Learning diffusion models with flexible representation guidance' at NeurIPS 2025
  • Published 'Next semantic scale prediction via hierarchical diffusion language models' at NeurIPS 2025
  • Published 'Thought calibration: Efficient and confident test-time scaling' at EMNLP 2025
  • Published 'Leaps: A discrete neural sampler via locally equivariant networks' at ICML 2025
  • Published 'Identifying biological perturbation targets through causal differential networks' at ICML 2025
  • Published 'Symmetry-driven discovery of dynamical variables in molecular simulations' at ICML 2025
  • Collaborates extensively with researchers, including Regina Barzilay, on interdisciplinary projects
Background
  • Thomas Siebel Professor of Electrical Engineering and Computer Science at MIT, with an appointment in the Institute for Data, Systems, and Society
  • Research focuses on enabling machines to learn, predict, or control in an efficient, principled, and interpretable manner at scale
  • Work spans foundational machine learning theory to modern applications, with emphasis on statistical inference and estimation in complex learning problems
  • Develops new methods, theory, and algorithms to automate the use and generation of semi-structured data such as text, images, molecules, and strategies
  • Applies these algorithms to recommender, retrieval, and inferential tasks (e.g., in biomedicine), to molecular design for drug discovery, and to modeling strategic game-theoretic interactions