A Logic of General Attention Using Edge-Conditioned Event Models (Extended Version)

📅 2025-05-20
🤖 AI Summary
Existing dynamic epistemic logics model attention only to atomic formulas, so they cannot capture attention to higher-order beliefs, to what other agents attend to, or to attentional biases; moreover, their models grow exponentially in the number of agents and announced literals. Method: the paper presents the first general logic of attention, treating attention as a modality in its own right (like belief or awareness) and thereby lifting the atomic-formula restriction, and it generalizes edge-conditioned event models, shown to be as expressive as standard event models yet exponentially more succinct. Contribution/Results: the logic comes with attention principles that impose closure properties on the attention modality and can be used in its axiomatization. The framework enables compact modeling and reasoning about complex inter-agent attention, and it is illustrated with AI agents that reason about, and discover, human attentional biases, advancing both theoretical epistemic logic and practical cognitive AI.

📝 Abstract
In this work, we present the first general logic of attention. Attention is a powerful cognitive ability that allows agents to focus on potentially complex information, such as logically structured propositions, higher-order beliefs, or what other agents pay attention to. This ability is a strength, as it helps to ignore what is irrelevant, but it can also introduce biases when some types of information or agents are systematically ignored. Existing dynamic epistemic logics for attention cannot model such complex attention scenarios, as they only model attention to atomic formulas. Additionally, such logics quickly become cumbersome, as their size grows exponentially in the number of agents and announced literals. Here, we introduce a logic that overcomes both limitations. First, we generalize edge-conditioned event models, which we show to be as expressive as standard event models yet exponentially more succinct (generalizing both standard event models and generalized arrow updates). Second, we extend attention to arbitrary formulas, allowing agents to also attend to other agents' beliefs or attention. Our work treats attention as a modality, like belief or awareness. We introduce attention principles that impose closure properties on that modality and that can be used in its axiomatization. Throughout, we illustrate our framework with examples of AI agents reasoning about human attentional biases, demonstrating how such agents can discover attentional biases.
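To make the edge-conditioned idea concrete, here is a minimal Python sketch, not the paper's formal definition: agent edges between events carry formulas, and an edge survives the product update at a world only if that world satisfies its condition. The tuple-based formula encoding, the dictionary layout, and names such as `product_update` and the "attending" atom `h` are our own illustrative assumptions.

```python
from itertools import product

# Formulas as nested tuples: ("top",), ("atom", p), ("not", f), ("and", f, g)
def holds(model, world, f):
    kind = f[0]
    if kind == "top":
        return True
    if kind == "atom":
        return f[1] in model["val"][world]
    if kind == "not":
        return not holds(model, world, f[1])
    if kind == "and":
        return holds(model, world, f[1]) and holds(model, world, f[2])
    raise ValueError(kind)

def product_update(model, ecm):
    """Product update with an edge-conditioned event model (sketch).

    New worlds are (world, event) pairs where the event's precondition
    holds; agent a relates (w, e) to (v, f) iff a relates w to v in the
    original model AND the condition on a's (e, f) event edge holds at w.
    """
    worlds = [(w, e) for w, e in product(model["worlds"], ecm["events"])
              if holds(model, w, ecm["pre"][e])]
    rel = {a: [((w, e), (v, f))
               for (w, e) in worlds for (v, f) in worlds
               if (w, v) in model["rel"][a]
               and (e, f) in ecm["edges"][a]
               and holds(model, w, ecm["edges"][a][(e, f)])]
           for a in model["rel"]}
    val = {(w, e): model["val"][w] for (w, e) in worlds}
    return {"worlds": worlds, "rel": rel, "val": val}
```

As a usage example, announcing `p` with an edge condition on an "attending" atom `h` lets the attentive agent learn `p` while the inattentive copy survives via a trivial `skip` event: a single two-event model covers both cases, where an unconditional encoding would need separate events per attention state.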
Problem

Research questions and friction points this paper is trying to address.

Existing attention logics model attention only to atomic formulas, missing higher-order beliefs and attention to others' attention
Standard event models for attention grow exponentially in the number of agents and announced literals
How to treat attention as a modality in its own right, on a par with belief or awareness
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generalized edge-conditioned event models: as expressive as standard event models, yet exponentially more succinct
Attention extended to arbitrary formulas, including other agents' beliefs and attention
Attention treated as a modality, with closure principles supporting its axiomatization