Emotional Support with Conversational AI: Talking to Machines About Life

📅 2026-03-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses a critical gap in existing research, which has predominantly focused on the efficacy of AI-provided emotional support while overlooking how such support is co-constructed through interaction. Reconceptualizing AI emotional support as a sociotechnical process shaped by community negotiation, this work analyzes qualitative data from user–AI companion dialogues and associated community discussions on Reddit. The analysis reveals three core interactional mechanisms—empathic validation, reflective questioning, and a sense of companionship—as well as three key tensions that emerge in practice. By foregrounding the role of social context and interactional dynamics in shaping AI-mediated support, the study offers both theoretical grounding and practical insights for designing responsible, context-sensitive affective support systems.

📝 Abstract
AI companion chatbots are increasingly used for emotional support, with prior work in the domain predominantly documenting their mixed psychosocial impacts, including both increased emotional expression and heightened loneliness. However, most existing research focuses on outcome-level effects, offering limited insight into how emotional support is produced through interaction. In this paper, we examine emotional support as an interactional and socially situated process. Drawing on qualitative analysis of Reddit discussions, we analyze how users engage with AI companions and how these interactions are interpreted and contested within online communities. We show that emotional support is co-constructed through conversational mechanisms such as validation, reflective prompting, and companionship, while also giving rise to tensions including support versus dependency, validation versus delusion, and accessibility versus harm. Importantly, support extends beyond human–AI interaction and is shaped by community responses that legitimize or challenge AI-mediated care. Hence, we reconceptualize AI emotional support as a negotiated socio-technical process and derive implications for the design of responsible, context-sensitive AI systems.
Problem

Research questions and friction points this paper is trying to address.

emotional support
conversational AI
human-AI interaction
socio-technical process
online communities
Innovation

Methods, ideas, or system contributions that make the work stand out.

conversational AI
emotional support
socio-technical process
co-construction
online communities
Olivia Yan Huang — University of Illinois Urbana-Champaign, USA
Monika Stodolska — University of Illinois Urbana-Champaign, USA
Sharifa Sultana — Assistant Professor, Computer Science, University of Illinois Urbana-Champaign

HCI · Responsible AI · Design