Virtual Arc Consistency for Linear Constraints in Cost Function Networks

📅 2025-09-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing soft global constraint propagators communicate with one another only through variable domains, yielding weak lower bounds, while LP-based reformulation methods suffer from scalability issues; together these limit modeling flexibility and lower-bound inference in constraint programming with soft constraints. Method: This paper embeds linear constraints as local cost functions in cost function networks and extends a soft arc consistency (SAC) algorithm to handle them, introducing a virtual propagation mechanism that achieves strong lower-bound inference without explicitly expanding the constraints. Contribution/Results: The approach bridges expressive modeling and computational efficiency by preserving the compact structure of linear constraints while enabling tighter bounds. Experimental evaluation on multiple benchmark instances demonstrates significant improvements in lower-bound quality; in several cases, total solving time is reduced by over 30%.

📝 Abstract
In Constraint Programming, solving discrete minimization problems with hard and soft constraints can be done using (i) soft global constraints, (ii) a reformulation into a linear program, or (iii) a reformulation into local cost functions. Approach (i) benefits from a vast catalog of constraints, but each soft constraint propagator communicates with other soft constraints only through the variable domains, resulting in weak lower bounds. Conversely, approach (ii) provides a global view with strong bounds, but the size of the reformulation can be problematic. We focus on approach (iii), in which soft arc consistency (SAC) algorithms produce bounds of intermediate quality. Recently, the introduction of linear constraints as local cost functions has increased their modeling expressiveness. We adapt an existing SAC algorithm to handle linear constraints. We show that our algorithm significantly improves the lower bounds compared to the original algorithm on several benchmarks, reducing solving time in some cases.
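To make the SAC idea mentioned in the abstract concrete, here is a minimal illustrative sketch (not the paper's algorithm, and the names `unary`, `c0`, and `project_unary_to_zero` are assumptions): in a cost function network, equivalence-preserving transformations shift costs between cost functions, and moving the minimum of each unary cost function into the constant cost function `c0` raises the problem's lower bound without changing the cost of any solution.

```python
# Toy cost function network: each variable has a unary cost function
# mapping domain values to non-negative costs. A SAC-style "projection"
# moves the minimum unary cost of each variable into the constant cost
# function c0, which is a valid lower bound on any complete assignment.

def project_unary_to_zero(unary, c0):
    """Shift the minimum of each unary cost function into c0,
    keeping the network equivalent (same cost for every assignment)."""
    for costs in unary.values():
        m = min(costs.values())
        for val in costs:
            costs[val] -= m
        c0 += m
    return c0

# Two variables with domains {0, 1} and example unary costs.
unary = {
    "x": {0: 2, 1: 3},
    "y": {0: 1, 1: 0},
}
c0 = project_unary_to_zero(unary, 0)
print(c0)  # lower bound after projection: 2
```

Real SAC algorithms additionally move costs between binary (or higher-arity) cost functions and the unary ones before this step; the paper's contribution is doing such projections when some cost functions are compactly represented linear constraints.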
Problem

Research questions and friction points this paper is trying to address.

Adapting soft arc consistency algorithms to handle linear constraints
Improving lower bounds for discrete minimization problems with constraints
Enhancing modeling expressiveness of local cost function networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adapted soft arc consistency algorithm for linear constraints
Used local cost functions to improve modeling expressiveness
Enhanced lower bounds and reduced solving time
Pierre Montalbano
LIFAT, UR 6300, Université de Tours, ANITI, INRAE, Tours, France
Simon de Givry
MIAT, UR 875, Université de Toulouse, ANITI, INRAE, Toulouse, France
George Katsirelos
INRAE
Artificial Intelligence · Constraint Satisfaction · Boolean Satisfiability · Combinatorial Optimization