Is Seeing Believing? Evaluating Human Sensitivity to Synthetic Video

📅 2026-03-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates human perceptual sensitivity to audiovisual distortions and artifacts in synthetic videos—such as deepfakes—and their impact on perceived credibility and learning outcomes. Through three between-subjects experiments combining subjective credibility ratings with objective measures of learning performance, the research systematically evaluates how visual and auditory artifacts influence cognitive processing. The findings reveal, for the first time, that multimodal artifacts inherent in deepfakes significantly undermine content credibility and impair learning effectiveness. These results provide novel empirical evidence for theories of synthetic media perception and establish a foundational basis for developing more effective deepfake detection methods and mitigation strategies.

📝 Abstract
Advances in machine learning have enabled the creation of realistic synthetic videos known as deepfakes. As deepfakes proliferate, concerns about the rapid spread of disinformation and the manipulation of public perception are mounting. Despite the alarming implications, our understanding of how individuals perceive synthetic media remains limited, obstructing the development of effective mitigation strategies. This paper aims to narrow this gap by investigating human responses to visual and auditory distortions of videos, as well as to deepfake-generated visuals and narration. In two between-subjects experiments, we study whether audio-visual distortions affect cognitive processing, measured through subjective credibility assessments and objective learning outcomes. A third study reveals that artifacts from deepfakes influence credibility. Together, the three studies show that video distortions and deepfake artifacts can reduce credibility. Our research contributes to the ongoing exploration of the cognitive processes involved in the evaluation and perception of synthetic videos, and underscores the need for further theory development concerning deepfake exposure.
Problem

Research questions and friction points this paper is trying to address.

deepfakes
synthetic video
human perception
credibility assessment
disinformation
Innovation

Methods, ideas, or system contributions that make the work stand out.

deepfakes
synthetic video perception
audio-visual distortions
credibility assessment
cognitive processing
David Wegmann
Department of Media and Journalism Studies, School of Communication and Culture, Aarhus University
Emil Stevnsborg
Department of Computer Science, University of Copenhagen
Søren Knudsen
IT University of Copenhagen
Luca Rossi
IT University of Copenhagen
SNS, Twitter, SNA
Aske Mottelson
Associate Professor, IT University of Copenhagen
Human-Computer Interaction, Data Science, Experimental Psychology, Extended Reality