CountXplain: Interpretable Cell Counting with Prototype-Based Density Map Estimation

📅 2025-11-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
Biomedical cell counting faces challenges in deep learning model interpretability. This paper introduces prototype learning into density map estimation for the first time, proposing an interpretable cell counting framework: a prototype layer is embedded within the density estimation network to automatically learn biologically meaningful visual prototypes representing cells and background artifacts; cell regions are identified via prototype matching, and pixel-level interpretation heatmaps are generated using gradient-weighted class activation mapping. The method achieves competitive counting accuracy (MAE/ME) against state-of-the-art models on two public benchmarks while providing intuitive, biologist-validated visual explanations. Key contributions include: (i) the first prototype-based interpretable framework specifically designed for cell counting; (ii) joint optimization of counting performance and model interpretability; and (iii) significantly enhanced clinical trustworthiness and transparency for computer-aided diagnosis.
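The summary above says pixel-level interpretation heatmaps are produced with gradient-weighted class activation mapping (Grad-CAM). As a rough sketch of that general mechanism (plain numpy, not the authors' code — the feature maps and their gradients with respect to a prototype-similarity score are assumed to be given):

```python
import numpy as np

def grad_cam(feature_maps, gradients):
    """Grad-CAM-style heatmap over a feature map.

    feature_maps: (C, H, W) activations from a convolutional layer.
    gradients:    (C, H, W) gradients of the score of interest
                  (here, hypothetically, a prototype-similarity score)
                  with respect to those activations.

    Channel weights are the spatially averaged gradients; the heatmap
    is the ReLU of the weighted sum of channels, scaled to [0, 1].
    """
    weights = gradients.mean(axis=(1, 2))            # (C,) one weight per channel
    cam = np.tensordot(weights, feature_maps, axes=1)  # (H, W) weighted sum
    cam = np.maximum(cam, 0.0)                       # keep positively contributing regions
    if cam.max() > 0:
        cam = cam / cam.max()                        # normalize for visualization
    return cam

# Hypothetical example inputs, just to exercise the function.
rng = np.random.default_rng(0)
fmaps = rng.normal(size=(8, 16, 16))
grads = rng.normal(size=(8, 16, 16))
heatmap = grad_cam(fmaps, grads)
```

In a real pipeline the heatmap would be upsampled to the input image's resolution and overlaid on it, highlighting the pixels most responsible for a given prototype's activation.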

📝 Abstract
Cell counting in biomedical imaging is pivotal for various clinical applications, yet the interpretability of deep learning models in this domain remains a significant challenge. We propose a novel prototype-based method for interpretable cell counting via density map estimation. Our approach integrates a prototype layer into the density estimation network, enabling the model to learn representative visual patterns for both cells and background artifacts. The learned prototypes were evaluated through a survey of biologists, who confirmed the relevance of the visual patterns identified, further validating the interpretability of the model. By generating interpretations that highlight regions in the input image most similar to each prototype, our method offers a clear understanding of how the model identifies and counts cells. Extensive experiments on two public datasets demonstrate that our method achieves interpretability without compromising counting effectiveness. This work provides researchers and clinicians with a transparent and reliable tool for cell counting, potentially increasing trust and accelerating the adoption of deep learning in critical biomedical applications. Code is available at https://github.com/NRT-D4/CountXplain.
Problem

Research questions and friction points this paper is trying to address.

Addressing interpretability challenges in deep learning cell counting models
Developing prototype-based density map estimation for transparent cell counting
Providing reliable biomedical counting tools without compromising effectiveness
Innovation

Methods, ideas, or system contributions that make the work stand out.

Prototype-based density map estimation for cell counting
Learns representative visual patterns for cells and background
Generates interpretations highlighting prototype-similar regions in images
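The core idea listed above — scoring each image location against learned prototypes for cells and background, then combining those scores into a density map whose integral is the count — can be sketched as follows. This is a minimal illustration in numpy under assumed shapes and a ProtoPNet-style similarity activation, not the paper's implementation:

```python
import numpy as np

def prototype_similarity(features, prototypes, eps=1e-4):
    """Similarity of each spatial feature vector to each prototype.

    features:   (H, W, D) feature map from a CNN backbone (assumed).
    prototypes: (K, D) learned prototype vectors.
    Returns (H, W, K) similarity maps using the ProtoPNet-style
    log((d^2 + 1) / (d^2 + eps)) activation, which increases as the
    squared distance d^2 to a prototype shrinks.
    """
    diff = features[:, :, None, :] - prototypes[None, None, :, :]  # (H, W, K, D)
    d2 = np.sum(diff ** 2, axis=-1)                                # (H, W, K)
    return np.log((d2 + 1.0) / (d2 + eps))

def density_from_similarity(sim, weights):
    """Combine per-prototype similarity maps into a density map.

    weights: (K,) hypothetical linear head — positive for cell
    prototypes, near-zero or negative for background-artifact
    prototypes. A ReLU keeps the density non-negative.
    """
    return np.maximum(sim @ weights, 0.0)  # (H, W)

# Hypothetical example: 4 prototypes (2 cell, 2 background) over a 16x16 map.
rng = np.random.default_rng(0)
features = rng.normal(size=(16, 16, 8))
prototypes = rng.normal(size=(4, 8))
weights = np.array([1.0, 1.0, -0.5, -0.5])

sim = prototype_similarity(features, prototypes)
density = density_from_similarity(sim, weights)
count = density.sum()  # predicted count = integral of the density map
```

The interpretability claim follows from this structure: each prototype's similarity map can be visualized directly, showing which image regions the model treats as cell-like versus background-like before they are summed into the count.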
Abdurahman Ali Mohammed
Department of Computer Science, Iowa State University, Ames, IA 50011
Wallapak Tavanapong
Professor of Computer Science, Iowa State University
image/video analysis, colonoscopy quality, applied machine learning, multimedia systems, data science
Catherine Fonder
Department of Genetics, Development, and Cell Biology, Iowa State University, Ames, IA 50011
Donald S. Sakaguchi
Department of Genetics, Development, and Cell Biology, Iowa State University, Ames, IA 50011