Inclusive AI for Group Interactions: Predicting Gaze-Direction Behaviors in People with Intellectual and Developmental Disabilities

📅 2026-03-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the limitations of current AI systems in supporting individuals with intellectual and developmental disabilities (IDD): because these systems are trained predominantly on data from neurotypical populations, they fail to capture the atypical gaze behaviors characteristic of IDD. To bridge this gap, we introduce MIDD, the first multimodal dataset of group interactions involving IDD participants, enriched with fine-grained annotations and clinician feedback. Through systematic analysis, we uncover distinct patterns in gaze distribution, verbal engagement, and interaction dynamics within this population. Leveraging these insights, we design targeted features and fine-tune models—including SVM and FSFNet—significantly improving gaze-direction prediction accuracy for IDD individuals. Our work highlights critical challenges such as class imbalance and feature misalignment, offering empirical foundations and design guidelines for developing inclusive human–AI interaction systems.
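The summary names class imbalance as a key challenge (e.g., eye-contact frames being much rarer than gaze-away frames). One common mitigation, shown here as a generic sketch rather than the authors' implementation, is to reweight training examples inversely to class frequency; the gaze labels below are hypothetical toy data:

```python
# Sketch: "balanced" class weights to counter class imbalance in gaze labels.
# weight(c) = n_samples / (n_classes * count(c)) — rarer classes weigh more.
from collections import Counter

labels = ["away", "away", "partner", "away", "object", "away"]  # toy gaze labels
counts = Counter(labels)
n, k = len(labels), len(counts)
weights = {c: n / (k * counts[c]) for c in counts}
print(weights)  # {'away': 0.5, 'partner': 2.0, 'object': 2.0}
```

These per-class weights can then scale the loss of each training sample (for instance via the `class_weight` option of many off-the-shelf classifiers), so that the rare eye-contact class is not drowned out by majority classes.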

📝 Abstract
Artificial agents that support human group interactions hold great promise, especially in sensitive contexts such as well-being promotion and therapeutic interventions. However, current systems struggle to mediate group interactions involving people who are not neurotypical. This limitation arises because most AI detection models (e.g., for turn-taking) are trained on data from neurotypical populations. This work takes a step toward inclusive AI by addressing the challenge of eye contact detection, a core component of non-verbal communication, with and for people with Intellectual and Developmental Disabilities. First, we introduce a new dataset, Multi-party Interaction with Intellectual and Developmental Disabilities (MIDD), capturing atypical gaze and engagement patterns. Second, we present the results of a comparative analysis with neurotypical datasets, highlighting differences in class imbalance, speaking activity, gaze distribution, and interaction dynamics. Then, we evaluate classifiers ranging from SVMs to FSFNet, showing that fine-tuning on MIDD improves performance, though notable limitations remain. Finally, we present the insights gathered through a focus group with six therapists to interpret our quantitative findings and understand the practical implications of atypical gaze and engagement patterns. Based on these results, we discuss data-driven strategies and emphasize the importance of feature choice for building more inclusive human-centered tools.
Problem

Research questions and friction points this paper is trying to address.

Inclusive AI
Intellectual and Developmental Disabilities
Gaze Direction
Group Interaction
Non-verbal Communication
Innovation

Methods, ideas, or system contributions that make the work stand out.

Inclusive AI
Gaze Estimation
Intellectual and Developmental Disabilities
MIDD Dataset
Human-Centered AI