🤖 AI Summary
This study addresses the scarcity of scalable, interpretable methods for organizing and semantically retrieving unstructured agricultural texts when labeled data are limited. To this end, the authors propose a unified framework that integrates topic modeling (BERTopic) with large language models by transforming discovered topics into structured prompts. This enables zero-shot topic labeling, interpretable summarization, and topic-guided semantic retrieval. The framework further incorporates dense embeddings and vector-based retrieval to support efficient querying, alongside modules for evaluating topic coherence and potential bias. Experimental results show that the proposed approach substantially improves topic coherence, interpretability, and retrieval effectiveness in unsupervised settings on agricultural text corpora.
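The topic-to-prompt step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, prompt wording, and field layout are assumptions, and the keyword format mirrors the `(word, weight)` pairs that BERTopic's `get_topic()` returns.

```python
# Hypothetical sketch: turning a BERTopic-style topic (top keywords with
# weights, plus representative documents) into a structured prompt that a
# language model can answer in a zero-shot manner. Names are illustrative.

def build_topic_prompt(topic_id, keywords, sample_docs, n_keywords=10):
    """Assemble a structured zero-shot labeling prompt from topic artifacts."""
    top_terms = ", ".join(word for word, _weight in keywords[:n_keywords])
    excerpts = "\n".join(f"- {doc[:200]}" for doc in sample_docs[:3])
    return (
        f"Topic {topic_id} keywords: {top_terms}\n"
        f"Representative excerpts:\n{excerpts}\n"
        "Task: provide a short descriptive label and a one-sentence "
        "summary for this topic."
    )

# Example topic in BERTopic's (word, weight) format.
keywords = [("irrigation", 0.91), ("drip", 0.72), ("soil moisture", 0.65)]
docs = ["Drip irrigation reduces water use in arid growing regions..."]
prompt = build_topic_prompt(7, keywords, docs)
```

The resulting string would then be sent to an LLM; because the prompt carries the topic's own keywords and excerpts, no labeled training data is needed.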
📝 Abstract
As the volume of unstructured text continues to grow across domains, there is an urgent need for scalable methods that enable interpretable organization, summarization, and retrieval of information. This work presents a unified framework for interpretable topic modeling, zero-shot topic labeling, and topic-guided semantic retrieval over large agricultural text corpora. Leveraging BERTopic, we extract semantically coherent topics. Each topic is converted into a structured prompt, enabling a language model to generate meaningful topic labels and summaries in a zero-shot manner. Querying and document exploration are supported via dense embeddings and vector search, while a dedicated evaluation module assesses topical coherence and bias. This framework supports scalable and interpretable information access in specialized domains where labeled data is limited.
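The dense-embedding retrieval described in the abstract can be sketched with cosine similarity over precomputed vectors. This is a simplified stand-in for a vector database, assuming documents and the query have already been embedded (e.g. by a sentence-transformer); the toy 2-D vectors and function name are illustrative only.

```python
import numpy as np

# Minimal sketch of semantic retrieval over dense embeddings: rank
# documents by cosine similarity to a query embedding. In practice the
# paper's framework uses a vector-search backend; this shows the idea.

def top_k(query_vec, doc_matrix, k=2):
    """Return indices of the k documents most similar to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_matrix / np.linalg.norm(doc_matrix, axis=1, keepdims=True)
    scores = d @ q                      # cosine similarity per document
    return np.argsort(-scores)[:k]      # highest-scoring indices first

# Toy 2-D document embeddings and a query embedding.
doc_embeddings = np.array([[1.0, 0.0], [0.6, 0.8], [0.0, 1.0]])
query_embedding = np.array([0.9, 0.1])
ranking = top_k(query_embedding, doc_embeddings)  # -> array([0, 1])
```

Topic-guided retrieval would additionally restrict or re-rank this candidate set using the topic assignments produced by BERTopic.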