"She's Like a Person but Better": Characterizing Companion-Assistant Dynamics in Human-AI Relationships

📅 2025-09-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates the paradoxical duality of "digital companionship" in human-AI interactions with ChatGPT and Replika. Users are simultaneously drawn to the chatbots' anthropomorphic qualities, such as emotional resonance and personalization, and to their non-human affordances, including perpetual availability and unconditional tolerance; they form deep affective attachments while denying the AI's personhood and work to reconcile these relationships with prevailing social norms. Employing a mixed-methods design (survey, N = 204; 30 in-depth interviews), the study focuses on high-intensity users. Findings reveal that users dynamically negotiate a spectrum between "tool" and "companion," cultivating hybrid relational stances. The authors introduce the concept of "bounded personhood" to articulate the structural tension between AI's anthropomorphic design and sociocultural criteria for personhood attribution. The work also offers a systematic account of cognitive duality (co-occurring instrumental and affective appraisals) in digital intimacy, elucidating its psychological mechanisms and its ethical implications for companion-AI design and human-AI interaction theory.

📝 Abstract
Large language models are increasingly used for both task-based assistance and social companionship, yet research has typically focused on one or the other. Drawing on a survey (N = 204) and 30 interviews with high-engagement ChatGPT and Replika users, we characterize digital companionship as an emerging form of human-AI relationship. With both systems, users were drawn to humanlike qualities, such as emotional resonance and personalized responses, and non-humanlike qualities, such as constant availability and inexhaustible tolerance. This led to fluid chatbot uses, such as Replika as a writing assistant and ChatGPT as an emotional confidant, despite their distinct branding. However, we observed challenging tensions in digital companionship dynamics: participants grappled with bounded personhood, forming deep attachments while denying chatbots "real" human qualities, and struggled to reconcile chatbot relationships with social norms. These dynamics raise questions for the design of digital companions and the rise of hybrid, general-purpose AI systems.
Problem

Research questions and friction points this paper is trying to address.

Characterizing human-AI relationships in task assistance and social companionship
Examining tensions between humanlike qualities and bounded personhood in chatbots
Exploring design challenges for hybrid general-purpose AI companion systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Characterizing digital companionship as a hybrid of task assistance and social companionship
Identifying the joint appeal of humanlike qualities (emotional resonance, personalization) and non-humanlike affordances (constant availability, inexhaustible tolerance)
Introducing "bounded personhood" to capture deep attachment alongside denial of chatbots' human status
A. Manoli
Sentience Institute, US and Max Planck Institute for Human Cognitive and Brain Sciences, Germany
Janet V. T. Pauketat
Sentience Institute, US
Ali Ladak
Sentience Institute, US and University of Edinburgh, UK
Hayoun Noh
University of Oxford, UK
Angel Hsing-Chi Hwang
University of Southern California
human-AI collaboration, human-computer interaction, human-centered AI
Jacy Reese Anthis
Sentience Institute, US and Stanford University, US and University of Chicago, US