2024: How Reliable Are Automatic Evaluation Methods for Instruction-Tuned LLMs?
2023: On the Generalization Ability of Retrieval-Enhanced Transformers (Findings of the Association for Computational Linguistics, pp. 1485–1493)
2023: Making Instruction Finetuning Accessible to Non-English Languages: A Case Study on Swedish Models (Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa), pp. 634–642)
2023: Surface-Based Retrieval Reduces Perplexity of Retrieval-Augmented Language Models (Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 521–529)
2022: On the Effects of Video Grounding on Language Models (Proceedings of the First Workshop on Performance and Interpretability Evaluations of Multimodal, Multipurpose, Massive-Scale Models)
Research Experience
Researcher at Linköping University, involved in multiple research projects on natural language processing and large language models.
Education
No detailed information about educational background is available.
Background
Research interests include artificial intelligence and its applications. Based in the Artificial Intelligence and Integrated Computer Systems division at the Department of Computer and Information Science, Linköping University.