🤖 AI Summary
Existing concept bottleneck models rely on manually predefined concept sets, limiting their generalizability and flexibility. This paper proposes the eXplanation Bottleneck Model (XBM), an end-to-end framework that generates natural-language explanations without requiring pre-specified concepts: given an input image, XBM first produces a task-relevant textual explanation and then makes the downstream prediction from that explanation. Built upon a pre-trained vision-language encoder-decoder (e.g., BLIP-2), XBM is trained by jointly optimizing the target task loss and an explanation distillation regularizer, in which the trainable explanation decoder is penalized for drifting from a frozen copy of the pre-trained decoder. In experiments including comparisons against state-of-the-art concept bottleneck models, XBM delivers accurate, fluent natural-language explanations while maintaining task accuracy. The authors state that code will be released.
📝 Abstract
Recent concept-based interpretable models have succeeded in providing meaningful explanations based on pre-defined concept sets. However, this dependency on pre-defined concepts restricts their applicability, because explanations are limited to the available concepts. This paper proposes a novel interpretable deep neural network called the explanation bottleneck model (XBM). XBMs generate a text explanation from the input without pre-defined concepts and then make the final task prediction based on the generated explanation, leveraging pre-trained vision-language encoder-decoder models. To achieve both target task performance and explanation quality, we train XBMs with the target task loss plus a regularization term that penalizes the explanation decoder via distillation from the frozen pre-trained decoder. Our experiments, including a comparison to state-of-the-art concept bottleneck models, confirm that XBMs provide accurate and fluent natural language explanations without pre-defined concept sets. Code will be available at https://github.com/yshinya6/xbm/.
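The joint objective described above (target task loss plus a distillation regularizer that keeps the trainable explanation decoder close to the frozen pre-trained decoder) can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the function names (`xbm_loss`, `kl_divergence`), the use of per-token KL divergence as the distillation penalty, and the weighting scheme `lam` are all assumptions standing in for the paper's actual formulation, and toy probability lists replace real model outputs.

```python
import math

def cross_entropy(class_probs, target_idx):
    """Target task loss: negative log-likelihood of the correct class."""
    return -math.log(class_probs[target_idx])

def kl_divergence(p, q):
    """KL(p || q) between two discrete distributions over the same vocabulary."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def xbm_loss(class_probs, target_idx,
             student_token_probs, teacher_token_probs, lam=1.0):
    """Sketch of the joint XBM objective (illustrative, not the paper's code).

    - `class_probs`: predicted class distribution from the explanation-based head.
    - `student_token_probs` / `teacher_token_probs`: per-position token
      distributions from the trainable explanation decoder and the frozen
      pre-trained decoder, respectively.
    - `lam` (hypothetical weight): balances task loss vs. distillation.
    """
    task = cross_entropy(class_probs, target_idx)
    # Distillation regularizer: average per-token divergence from the
    # frozen decoder, penalizing the explanation decoder for drifting.
    distill = sum(
        kl_divergence(t, s)
        for t, s in zip(teacher_token_probs, student_token_probs)
    ) / len(teacher_token_probs)
    return task + lam * distill
```

When the trainable decoder matches the frozen one exactly, the distillation term vanishes and the objective reduces to the task loss alone; any drift adds a non-negative penalty, which is the behavior the regularizer is meant to enforce.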