Differential Analysis of Pseudo Haptic Feedback: Novel Comparative Study of Visual and Auditory Cue Integration for Psychophysical Evaluation

📅 2025-10-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates whether audiovisual cues on commodity tablet devices can jointly elicit graded pseudo-haptic pressure perception without dedicated haptic actuators. Method: A Unity-based ball-rolling game was developed, with a Robotous RFT40 force-torque sensor quantifying real-time finger force variations across parametrically textured surfaces; customized audio (440 Hz–13.1 kHz rolling/tapping sounds) and visual textures were synchronized, and multisensory integration effects were evaluated via psychophysical experiments. Contribution/Results: The work provides the first systematic validation that audiovisual cues on off-the-shelf devices induce reliable, intensity-controllable pseudo-haptic feedback: high-frequency sounds coupled with high-density visual textures significantly reduced the mean finger force required to discriminate surface differences (below 0.90 N) and enhanced electromyographic muscle activation. These findings demonstrate that cross-modal audiovisual integration alone suffices to stably evoke, quantify, and modulate pseudo-haptic responses, establishing a novel, low-cost paradigm for immersive interaction.
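The summary above describes audio synchronized to visual texture within a 440 Hz–13.1 kHz band. As an illustrative sketch only (the paper does not publish its mapping; the function and parameter names here are hypothetical), one way to tie a normalized texture density to a rolling-sound frequency is logarithmic interpolation, so equal density steps correspond to equal perceived pitch steps:

```python
# Hypothetical sketch: map normalized visual texture density (0..1) onto the
# reported 440 Hz - 13.1 kHz rolling-sound range via log interpolation.
# Names and the mapping itself are illustrative, not taken from the paper.
F_MIN_HZ = 440.0
F_MAX_HZ = 13_100.0

def rolling_sound_freq(texture_density: float) -> float:
    """Return a rolling-sound frequency (Hz) for a texture density in [0, 1]."""
    d = min(max(texture_density, 0.0), 1.0)  # clamp input to the valid range
    # Geometric interpolation: d = 0 -> F_MIN_HZ, d = 1 -> F_MAX_HZ.
    return F_MIN_HZ * (F_MAX_HZ / F_MIN_HZ) ** d
```

A linear mapping would also work; the geometric form is chosen here only because pitch perception is roughly logarithmic in frequency.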

📝 Abstract
Pseudo-haptics exploit carefully crafted visual or auditory cues to trick the brain into "feeling" forces that are never physically applied, offering a low-cost alternative to traditional haptic hardware. Here, we present a comparative psychophysical study that quantifies how visual and auditory stimuli combine to evoke pseudo-haptic pressure sensations on a commodity tablet. Using a Unity-based Rollball game, participants (n = 4) guided a virtual ball across three textured terrains while their finger forces were captured in real time with a Robotous RFT40 force-torque sensor. Each terrain was paired with a distinct rolling-sound profile spanning 440 Hz–4.7 kHz, 440 Hz–13.1 kHz, or 440 Hz–8.9 kHz; crevice collisions triggered additional "knocking" bursts to heighten realism. Average tactile forces increased systematically with cue intensity: 0.40 N, 0.79 N and 0.88 N for visual-only trials and 0.41 N, 0.81 N and 0.90 N for audio-only trials on Terrains 1-3, respectively. Higher audio frequencies and denser visual textures both elicited stronger muscle activation, and their combination further reduced the force needed to perceive surface changes, confirming multisensory integration. These results demonstrate that consumer-grade isometric devices can reliably induce and measure graded pseudo-haptic feedback without specialized actuators, opening a path toward affordable rehabilitation tools, training simulators and assistive interfaces.
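The per-terrain values the abstract reports (e.g. 0.40 N, 0.79 N, 0.88 N for visual-only trials) are mean finger forces aggregated from force-torque sensor samples. A minimal sketch of that aggregation, with entirely made-up sample data (the helper name and data layout are assumptions, not from the paper):

```python
from statistics import mean

def mean_force_per_terrain(samples_by_terrain: dict[str, list[float]]) -> dict[str, float]:
    """Collapse per-trial force samples (in newtons) into a per-terrain mean,
    rounded to two decimals as in the abstract's reported values."""
    return {
        terrain: round(mean(forces), 2)
        for terrain, forces in samples_by_terrain.items()
        if forces  # skip terrains with no recorded samples
    }

# Illustrative sensor readings only; not the study's actual data.
demo = {
    "terrain1": [0.38, 0.41, 0.42],
    "terrain2": [0.78, 0.80],
}
```

Here `mean_force_per_terrain(demo)` yields one summary force per terrain, the statistic the psychophysical comparison is built on.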
Problem

Research questions and friction points this paper is trying to address.

Compares visual and auditory cues for inducing pseudo-haptic pressure sensations
Quantifies multisensory integration effects on perceived force and muscle activation
Evaluates consumer devices for affordable pseudo-haptic feedback applications
Innovation

Methods, ideas, or system contributions that make the work stand out.

Comparative study of visual and auditory pseudo-haptic cues
Unity-based Rollball game with force-torque sensor measurements
Consumer-grade devices induce multisensory pseudo-haptic feedback
Nishant Gautam
Advanced Robotics @ Queen Mary University of London, London E1 4NS, United Kingdom
Somya Sharma
Advanced Robotics @ Queen Mary University of London, London E1 4NS, United Kingdom
Peter Corcoran
Professor (personal chair), National University of Ireland, Galway
consumer electronics · computer vision · biometrics · deep learning · edge computing
K. Althoefer
Advanced Robotics @ Queen Mary University of London, London E1 4NS, United Kingdom