🤖 AI Summary
This study investigates whether audiovisual cues on commodity tablet devices can jointly elicit graded pseudo-haptic pressure perception without dedicated haptic actuators.
Method: A Unity-based ball-rolling game was developed with a Robotous RFT40 force-torque sensor to quantify real-time finger-force variations across parametrically textured surfaces; customized audio (440 Hz–13.1 kHz rolling and tapping sounds) was synchronized with the visual textures, and multisensory-integration effects were evaluated through psychophysical experiments.
Contribution/Results: We provide the first systematic validation that audiovisual cues on off-the-shelf devices can induce reliable, intensity-graded pseudo-haptic feedback: high-frequency sounds coupled with high-density visual textures significantly reduced the mean finger force required to discriminate surface differences (below 0.90 N) and enhanced electromyographic muscle activation. These findings demonstrate that cross-modal audiovisual integration alone suffices to stably evoke, quantify, and modulate pseudo-haptic responses, establishing a novel, low-cost paradigm for immersive interaction.
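The per-condition force comparison described above can be sketched in a few lines. This is a minimal illustration only: the trial values below are hypothetical placeholders, not the study's raw data, and `mean_force` is a name introduced here for illustration.

```python
# Hypothetical sketch: averaging per-trial fingertip forces (newtons) by cue
# condition. All force values below are illustrative placeholders, NOT the
# study's measurements.
from statistics import mean

# condition -> per-trial mean finger forces in newtons (hypothetical)
trials = {
    "visual_only": [0.38, 0.41, 0.40, 0.42],
    "audio_only":  [0.79, 0.82, 0.80, 0.83],
    "audiovisual": [0.70, 0.72, 0.74, 0.72],
}

def mean_force(samples):
    """Mean tactile force in newtons for one cue condition."""
    return mean(samples)

summary = {cond: mean_force(vals) for cond, vals in trials.items()}
```

With data of this shape, the reported pattern corresponds to the combined audiovisual condition yielding a lower mean force than either unimodal condition at the same perceived surface change.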
📝 Abstract
Pseudo-haptics exploit carefully crafted visual or auditory cues to trick the brain into "feeling" forces that are never physically applied, offering a low-cost alternative to traditional haptic hardware. Here, we present a comparative psychophysical study that quantifies how visual and auditory stimuli combine to evoke pseudo-haptic pressure sensations on a commodity tablet. Using a Unity-based Rollball game, participants (n = 4) guided a virtual ball across three textured terrains while their finger forces were captured in real time with a Robotous RFT40 force-torque sensor. Each terrain was paired with a distinct rolling-sound profile spanning 440 Hz–4.7 kHz, 440 Hz–13.1 kHz, or 440 Hz–8.9 kHz; crevice collisions triggered additional "knocking" bursts to heighten realism. Average tactile forces increased systematically with cue intensity: 0.40 N, 0.79 N and 0.88 N for visual-only trials and 0.41 N, 0.81 N and 0.90 N for audio-only trials on Terrains 1–3, respectively. Higher audio frequencies and denser visual textures both elicited stronger muscle activation, and their combination further reduced the force needed to perceive surface changes, confirming multisensory integration. These results demonstrate that consumer-grade isometric devices can reliably induce and measure graded pseudo-haptic feedback without specialized actuators, opening a path toward affordable rehabilitation tools, training simulators and assistive interfaces.
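The terrain-to-sound pairing above can be sketched as a simple frequency mapping. Only the endpoint frequencies (440 Hz up to 4.7, 13.1, or 8.9 kHz) come from the abstract; the exponential sweep shape, the one-second duration, and which range maps to which terrain number are assumptions made here for illustration.

```python
# Sketch of the rolling-sound profiles described in the abstract.
# Endpoint frequencies are from the text; the sweep shape (exponential),
# duration, and terrain-number assignment are assumptions.
TERRAIN_SWEEPS_HZ = {
    1: (440.0, 4700.0),   # assumed assignment
    2: (440.0, 13100.0),  # assumed assignment
    3: (440.0, 8900.0),   # assumed assignment
}

def rolling_freq(terrain, t, duration=1.0):
    """Instantaneous frequency (Hz) of an exponential sweep at time t."""
    f0, f1 = TERRAIN_SWEEPS_HZ[terrain]
    frac = min(max(t / duration, 0.0), 1.0)  # clamp progress to [0, 1]
    return f0 * (f1 / f0) ** frac
```

An exponential (rather than linear) sweep is assumed because pitch perception is roughly logarithmic in frequency, so equal time steps produce equal perceived pitch steps.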