How to Tune a Multilingual Encoder Model for Germanic Languages: A Study of PEFT, Full Fine-Tuning, and Language Adapters

📅 2025-01-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study systematically evaluates parameter-efficient fine-tuning (PEFT) methods—specifically LoRA and Pfeiffer adapters—against full fine-tuning on mDeBERTa for German (high-resource), Swedish, and Icelandic (low-resource). Experiments span named entity recognition (NER) and question answering (QA) tasks. Results reveal that PEFT yields an average +1.8% improvement over full fine-tuning for German but exhibits substantial performance variance on low-resource languages. Crucially, task type governs method selection: PEFT consistently outperforms full fine-tuning on QA, whereas full fine-tuning proves more robust for NER. Unsupervised pre-adaptation on monolingual text confers no measurable gain. The core contribution lies in empirically delineating the generalization boundaries of PEFT across resource levels and task types, thereby providing evidence-based guidance for lightweight multilingual model adaptation.
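The LoRA method evaluated here adds a small trainable low-rank update to each frozen pretrained weight matrix, which is what makes it parameter-efficient. A minimal NumPy sketch of the idea (the layer size, rank, and scaling below are illustrative assumptions, not the paper's actual mDeBERTa configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen pretrained weight of one projection layer (hypothetical sizes).
d_in, d_out, r, alpha = 768, 768, 8, 16
W = rng.standard_normal((d_out, d_in)) * 0.02

# LoRA trains only the low-rank factors A and B; W stays frozen.
A = rng.standard_normal((r, d_in)) * 0.01   # small random init
B = np.zeros((d_out, r))                    # zero init => update starts at 0

def lora_forward(x):
    # y = W x + (alpha / r) * B A x
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
# Before any training, the LoRA branch contributes nothing (B is zero),
# so the adapted layer reproduces the pretrained layer exactly.
assert np.allclose(lora_forward(x), W @ x)

full_params = W.size
lora_params = A.size + B.size
print(f"trainable fraction: {lora_params / full_params:.3%}")
```

With these illustrative sizes, only about 2% of the layer's parameters are trainable, which is the source of the storage and compute savings the study weighs against full fine-tuning.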

📝 Abstract
This paper investigates the optimal use of the multilingual encoder model mDeBERTa for tasks in three Germanic languages -- German, Swedish, and Icelandic -- representing varying levels of presence and likely data quality in mDeBERTa's pre-training data. We compare full fine-tuning with the parameter-efficient fine-tuning (PEFT) methods LoRA and Pfeiffer bottleneck adapters, finding that PEFT is more effective for the higher-resource language, German. However, results for Swedish and Icelandic are less consistent. We also observe differences between tasks: while PEFT tends to work better for question answering, full fine-tuning is preferable for named entity recognition. Inspired by previous research on modular approaches that combine task and language adapters, we evaluate the impact of adding PEFT modules trained on unstructured text, finding that this approach is not beneficial.
Problem

The research questions and friction points this paper aims to address.

Multilingual Model Optimization
Germanic Languages
Comparative Performance Analysis
Innovation

Methods, ideas, or system contributions that make the work stand out.

PEFT
Multilingual Models
Language-specific Performance