🤖 AI Summary
This work introduces LabelPigeon, a framework that challenges the prevailing assumption that jointly performing machine translation and label projection degrades translation quality. Unlike conventional cross-lingual label transfer approaches, which decouple label projection from translation at the cost of transfer efficiency, LabelPigeon integrates the two tasks by marking annotated spans with XML tags. The framework combines XML tag embeddings, end-to-end fine-tuning of a multilingual Transformer architecture, and a direct evaluation scheme tailored to label projection. Evaluated on three downstream tasks across 27 languages, LabelPigeon improves cross-lingual transfer by up to +39.9 F1 on NER over prior methods, while also delivering consistent translation-quality gains across 203 languages.
📝 Abstract
Label projection is an effective technique for cross-lingual transfer, extending span-annotated datasets from a high-resource language to low-resource ones. Most approaches perform label projection as a separate step after machine translation, and prior work that combines the two reports degraded translation quality. We re-evaluate this claim with LabelPigeon, a novel framework that jointly performs translation and label projection via XML tags. We design a direct evaluation scheme for label projection, and find that LabelPigeon outperforms baselines and actively improves translation quality in 11 languages. We further assess translation quality across 203 languages and varying annotation complexity, finding consistent improvement attributed to additional fine-tuning. Finally, across 27 languages and three downstream tasks, we report substantial gains in cross-lingual transfer over comparable work, up to +39.9 F1 on NER. Overall, our results demonstrate that XML-tagged label projection provides effective and efficient label transfer without compromising translation quality.
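The core mechanism of XML-tagged label projection, wrapping annotated spans in XML tags so the translation model carries them into the target sentence, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the tag format, span encoding, and the stand-in "translated" output are all assumptions, and the actual translation step (a fine-tuned multilingual Transformer) is omitted.

```python
import re

def insert_tags(text, spans):
    """Wrap each annotated (start, end, label) character span in XML tags,
    e.g. 'Paris' -> '<LOC>Paris</LOC>'. Spans are processed right-to-left
    so earlier character offsets remain valid as tags are inserted."""
    out = text
    for start, end, label in sorted(spans, reverse=True):
        out = out[:start] + f"<{label}>" + out[start:end] + f"</{label}>" + out[end:]
    return out

def extract_tags(tagged_text):
    """Recover projected spans from tagged translator output.
    Returns (plain_text, [(start, end, label), ...]) with offsets
    computed on the tag-free target sentence."""
    spans, plain, pos, last = [], [], 0, 0
    pattern = re.compile(r"<(\w+)>(.*?)</\1>")
    for m in pattern.finditer(tagged_text):
        plain.append(tagged_text[last:m.start()])
        pos += m.start() - last
        spans.append((pos, pos + len(m.group(2)), m.group(1)))
        plain.append(m.group(2))
        pos += len(m.group(2))
        last = m.end()
    plain.append(tagged_text[last:])
    return "".join(plain), spans

# Tag the source annotation before translation:
src = insert_tags("I live in Paris.", [(10, 15, "LOC")])
# src == "I live in <LOC>Paris</LOC>."

# Stand-in for the model's tagged translation (German, hypothetical output):
target_text, target_spans = extract_tags("Ich wohne in <LOC>Paris</LOC>.")
# target_text == "Ich wohne in Paris.", target_spans == [(13, 18, "LOC")]
```

Because the tags travel through translation with the surrounding text, projection and translation happen in a single decoding pass, rather than as a separate word-alignment step afterwards.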