Filip Szatkowski


Google Scholar ID: xjnAIOEAAAAJ
PhD Student, Warsaw University of Technology
Research interests: deep learning, efficiency, adaptive computation, continual learning
Citations & Impact (all-time)
  • Citations: 98
  • H-index: 4
  • i10-index: 4
  • Publications: 12
  • Co-authors: 14
Resume (English only)
Academic Achievements
  • Published several papers on early-exit networks, activation sparsity, and continual learning, such as 'Failure Prediction Is a Better Performance Proxy for Early-Exit Networks Than Calibration', 'Universal Properties of Activation Sparsity in Modern Large Language Models', 'Improving Continual Learning Performance and Efficiency with Auxiliary Classifiers', and 'Exploiting Activation Sparsity with Dense to Dynamic-k Mixture-of-Experts Conversion'.
Research Experience
  • Published at top conferences such as NeurIPS and ICML; collaborated with European institutions including the Computer Vision Center in Barcelona and Sapienza University of Rome; worked as an Applied Scientist Intern at Amazon AWS AI and as an NLP Intern at Samsung R&D in Warsaw.
Education
  • PhD student at Warsaw University of Technology, supervised by Professor Tomasz Trzciński.
Background
  • PhD student at Warsaw University of Technology, focusing on efficiency in deep learning, with work spanning adaptive computation, early exits, activation sparsity, speculative decoding, and continual learning.
Miscellany
  • Active in the Polish ML community, organizing major events such as the ML in PL conferences and summer schools, as well as the ELLIS Doctoral Symposium 2025 in Warsaw.