Has published several scientific articles, including 'Systematic Generalization in Language Models Scales with Information Entropy'.
Research Experience
A Doctoral Research Fellow at the Language Technology Group (LTG). He works on how out-of-distribution generalization relates to compositionality, specifically aiming to quantify this ability using knowledge graphs and synthetic languages generated from a context-free grammar (CFG).
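As an illustration of the CFG-based approach mentioned above, a synthetic language can be sampled by recursively expanding grammar rules. The toy grammar and function names below are hypothetical, not taken from his actual research code:

```python
import random

# Toy context-free grammar: nonterminal -> list of possible expansions.
# Purely illustrative; not the grammar used in the research.
GRAMMAR = {
    "S": [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"]],
    "Det": [["the"], ["a"]],
    "N": [["cat"], ["dog"]],
    "V": [["sees"], ["chases"]],
}

def generate(symbol="S", rng=random):
    """Recursively expand a symbol into a list of terminal tokens."""
    if symbol not in GRAMMAR:  # terminal symbol: emit as-is
        return [symbol]
    expansion = rng.choice(GRAMMAR[symbol])
    tokens = []
    for sym in expansion:
        tokens.extend(generate(sym, rng))
    return tokens

print(" ".join(generate()))  # e.g. "the cat sees a dog"
```

Because every sentence is produced by composing a small set of rules, one can hold out specific rule combinations at training time and test whether a model generalizes to them.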
Background
Research interests include Natural Language Processing, Machine Learning, and Knowledge Graphs. His research primarily focuses on the extent to which sequence models, such as language models, can generalize to novel compositions of previously seen information.
Miscellany
Teaching:
IN1140 - Introduction to Language Technology (2022-2024)
IN5550 - Neural Methods in Natural Language Processing (2023-2025)