🤖 AI Summary
Tactical augmented reality (AR) systems are vulnerable to cognitive attacks that induce perceptual distortions, leading to erroneous situational awareness and decision-making failures.
Method: This paper proposes a perception-graph-based computational reasoning model that formalizes human semantic parsing and perceptual behavior in mixed-reality environments as a structured graph—where nodes represent semantic entities and weighted edges quantify the degree of perceptual distortion induced by cognitive attacks. The model integrates cognitive modeling with graph neural reasoning to enable attack detection, impact scoring, and attribution analysis.
Contribution/Results: Experimental evaluation demonstrates high detection accuracy and strong interpretability across complex AR interaction scenarios. The model significantly enhances the measurability and defensibility of human–machine interaction security by enabling quantitative assessment of cognition-driven perceptual anomalies.
📝 Abstract
Augmented reality (AR) systems are increasingly deployed in tactical environments, but their reliance on seamless human-computer interaction makes them vulnerable to cognitive attacks that manipulate a user's perception and severely compromise their decision-making. To address this challenge, we introduce the Perception Graph, a novel model designed to reason about human perception within these systems. Our model operates by first mimicking the human process of interpreting key information from a mixed reality (MR) environment and then representing the outcomes using a semantically meaningful structure. We demonstrate how the model can compute a quantitative score that reflects the level of perception distortion, providing a robust and measurable method for detecting and analyzing the effects of such cognitive attacks.
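To make the core idea concrete, the following is a minimal sketch of a perception graph as described above: nodes are semantic entities parsed from the scene, weighted edges quantify induced perceptual distortion, and an aggregate score summarizes the distortion level. All class, method, and entity names here are illustrative assumptions, not the paper's implementation; the mean-of-edge-weights score is one simple choice among many possible scoring functions.

```python
from dataclasses import dataclass, field

@dataclass
class PerceptionGraph:
    # Nodes: semantic entities parsed from the AR/MR scene (labels are assumed).
    nodes: set = field(default_factory=set)
    # Edges: (entity_a, entity_b) -> distortion weight in [0, 1],
    # where higher weight means stronger attack-induced perceptual distortion.
    edges: dict = field(default_factory=dict)

    def add_distortion(self, a: str, b: str, weight: float) -> None:
        """Record a distorted perceptual relation between two entities."""
        self.nodes.update((a, b))
        self.edges[(a, b)] = weight

    def distortion_score(self) -> float:
        """Aggregate distortion: mean edge weight (a simple illustrative metric)."""
        if not self.edges:
            return 0.0
        return sum(self.edges.values()) / len(self.edges)

# Hypothetical usage: two perceptual relations, one heavily distorted.
g = PerceptionGraph()
g.add_distortion("enemy_marker", "friendly_unit", 0.9)
g.add_distortion("waypoint", "terrain_overlay", 0.1)
print(g.distortion_score())  # 0.5
```

A real system would derive the edge weights from the cognitive-modeling and graph-neural components described in the summary; this sketch only shows the graph structure and how a single quantitative distortion score falls out of it.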