- ForAug: Recombining Foregrounds and Backgrounds to Improve Vision Transformer Training with Bias Mitigation, arXiv, March 2025
- Which Transformer to Favor: A Comparative Analysis of Efficiency in Vision Transformers, WACV 2025, February 2025
- TaylorShift: Shifting the Complexity of Self-Attention from Squared to Linear (and Back) using Taylor-Softmax, ICPR 2024 (oral), 2024
Projects
- Albatross: A research project on continual learning
- Sustainable Embedded AI: Energy- and data-saving methods for environmental perception in embedded AI systems, funded by the Carl Zeiss Foundation
- SustainML: Dedicated to creating a sustainable ML framework for Green AI, prioritizing energy efficiency
Research Experience
- Researcher, German Research Center for Artificial Intelligence (DFKI)
- PhD Student, RPTU Kaiserslautern-Landau
Education
- M.Sc. in Mathematics, 2022, Leibniz University Hannover
- B.Sc. in Computer Science, 2022, Leibniz University Hannover
- B.Sc. in Mathematics, 2019, Leibniz University Hannover
Background
- Research Interests: Efficient deep learning, Transformer models, multimodal learning, computer vision
- Professional Field: Artificial Intelligence
- Biography: Researcher in artificial intelligence at DFKI and RPTU Kaiserslautern-Landau, focusing on developing efficient Transformer models for vision, language, and multimodal tasks.
Miscellany
- Skills: Python, Linux Terminal, LaTeX
- Hobbies: Electric Guitar, Hiking, Bicycle Traveling