To Word Senses and Beyond: Inducing Concepts with Contextualized Language Models

📅 2024-06-28
🏛️ Conference on Empirical Methods in Natural Language Processing
📈 Citations: 3
Influential: 1
🤖 AI Summary
This work addresses polysemy and synonymy, two interrelated facets of lexical ambiguity, by proposing Concept Induction: an unsupervised task that learns a soft clustering of words directly from raw corpora, defining cross-lexeme, semantically coherent meaning units. Methodologically, it generalizes word sense induction to cross-lexeme concept induction via a bi-level approach that combines a local lemma-centric view with a global cross-lexicon view. Leveraging contextualized language models, it derives static embeddings for the induced concepts. On SemCor, the approach achieves a BCubed F1 above 0.60, and the concept embeddings obtain performance on the Word-in-Context (WiC) task competitive with the state of the art.
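The pipeline starts from contextualized language models, i.e. one embedding per word occurrence rather than one vector per word. Below is a minimal sketch of that starting point, assuming a HuggingFace BERT model and mean-pooling over subword pieces; both are illustrative choices, not the authors' documented setup.

```python
# Sketch: contextual token embeddings, the raw material for concept
# induction. Model name and pooling are assumptions for illustration.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def token_embedding(sentence: str, target: str) -> torch.Tensor:
    """Embedding of `target` inside `sentence`, averaged over its
    subword pieces (an assumption; other poolings would also work)."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]      # (seq_len, dim)
    target_ids = tokenizer(target, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    for i in range(len(ids) - len(target_ids) + 1):     # locate target span
        if ids[i:i + len(target_ids)] == target_ids:
            return hidden[i:i + len(target_ids)].mean(dim=0)
    raise ValueError(f"{target!r} not found in {sentence!r}")

# The same lemma gets different vectors in different contexts, which is
# what makes inducing senses and concepts from raw corpora possible.
v1 = token_embedding("She sat on the bank of the river.", "bank")
v2 = token_embedding("He deposited cash at the bank.", "bank")
```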

📝 Abstract
Polysemy and synonymy are two crucial interrelated facets of lexical ambiguity. While both phenomena are widely documented in lexical resources and have been studied extensively in NLP, leading to dedicated systems, they are often considered independently in practical problems. While many tasks dealing with polysemy (e.g. Word Sense Disambiguation or Induction) highlight the role of a word's senses, the study of synonymy is rooted in the study of concepts, i.e. meanings shared across the lexicon. In this paper, we introduce Concept Induction, the unsupervised task of learning a soft clustering among words that defines a set of concepts directly from data. This task generalizes Word Sense Induction. We propose a bi-level approach to Concept Induction that leverages both a local lemma-centric view and a global cross-lexicon view to induce concepts. We evaluate the obtained clustering on SemCor's annotated data and obtain good performance (BCubed F1 above 0.60). We find that the local and the global levels are mutually beneficial to induce concepts and also senses in our setting. Finally, we create static embeddings representing our induced concepts and use them on the Word-in-Context task, obtaining competitive performance with the state of the art.
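The abstract reports clustering quality as BCubed F1. For reference, here is a sketch of the standard hard-clustering BCubed formulation (per-item precision and recall, averaged and combined harmonically); the paper evaluates a soft clustering, so its exact variant may differ.

```python
# Standard BCubed F1 (Amigó et al., 2009 formulation) for hard clusterings.
from collections import defaultdict

def bcubed_f1(pred: dict, gold: dict) -> float:
    """`pred` maps each item to a predicted cluster, `gold` to a gold class."""
    by_pred, by_gold = defaultdict(set), defaultdict(set)
    for item, c in pred.items():
        by_pred[c].add(item)
    for item, c in gold.items():
        by_gold[c].add(item)
    precisions, recalls = [], []
    for item in pred:
        cluster, klass = by_pred[pred[item]], by_gold[gold[item]]
        overlap = len(cluster & klass)
        precisions.append(overlap / len(cluster))  # purity of item's cluster
        recalls.append(overlap / len(klass))       # coverage of item's class
    p = sum(precisions) / len(precisions)
    r = sum(recalls) / len(recalls)
    return 2 * p * r / (p + r)

# Toy example: two occurrences of "bank" correctly split, one mis-clustered.
pred = {"bank_1": 0, "bank_2": 0, "bank_3": 1}
gold = {"bank_1": "river", "bank_2": "money", "bank_3": "money"}
print(round(bcubed_f1(pred, gold), 3))  # 0.667
```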
Problem

Research questions and friction points this paper is trying to address.

Inducing a soft clustering of words into concepts (see the sketch after this list)
Generalizing word sense induction through contextualized modeling
Addressing polysemy and synonymy via a bi-level clustering approach
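To make "soft clustering" concrete: each occurrence embedding receives a probability distribution over concepts rather than a single hard label. The Gaussian mixture below is a stand-in assumption for illustration, not the paper's own clustering model.

```python
# Toy soft clustering over synthetic occurrence embeddings.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Stand-in for contextual embeddings of word occurrences (e.g., from BERT).
embeddings = np.vstack([
    rng.normal(loc=-2.0, scale=0.5, size=(50, 16)),  # occurrences near concept A
    rng.normal(loc=+2.0, scale=0.5, size=(50, 16)),  # occurrences near concept B
])

gmm = GaussianMixture(n_components=2, random_state=0).fit(embeddings)
soft = gmm.predict_proba(embeddings)  # (n_tokens, n_concepts), rows sum to 1
print(soft[0].round(3), soft[-1].round(3))
```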
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unsupervised soft clustering for concept induction
Bi-level approach combining local lemma-centric and global cross-lexicon views
Static embeddings representing induced concepts for downstream tasks such as Word-in-Context (see the sketch after this list)
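As a usage sketch for the last point: induced concepts can back a simple Word-in-Context decision rule, mapping each occurrence of the target word to its nearest static concept embedding and predicting "same meaning" on a match. The helper names and the match rule are assumptions for illustration, not the paper's exact pipeline.

```python
# Sketch: a WiC decision rule on top of static concept embeddings.
import numpy as np

def nearest_concept(vec: np.ndarray, concept_embs: np.ndarray) -> int:
    """Index of the concept whose static embedding is most cosine-similar."""
    sims = concept_embs @ vec / (
        np.linalg.norm(concept_embs, axis=1) * np.linalg.norm(vec)
    )
    return int(sims.argmax())

def wic_predict(vec_a, vec_b, concept_embs) -> bool:
    """True iff both occurrences map to the same induced concept."""
    return nearest_concept(vec_a, concept_embs) == nearest_concept(vec_b, concept_embs)

concepts = np.eye(3)  # 3 toy concept embeddings
print(wic_predict(np.array([0.9, 0.1, 0.0]),
                  np.array([0.8, 0.2, 0.1]), concepts))  # True: same concept
```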
Bastien Liétard
University of Lille, Inria, CNRS, Centrale Lille, UMR 9189 - CRIStAL, F-59000 Lille, France
Pascal Denis
University of Lille, Inria, CNRS, Centrale Lille, UMR 9189 - CRIStAL, F-59000 Lille, France
Mikaela Keller
University of Lille, Inria, CNRS, Centrale Lille, UMR 9189 - CRIStAL, F-59000 Lille, France