Enforcing convex constraints in Graph Neural Networks

📅 2025-10-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Graph neural networks (GNNs) must produce outputs that satisfy input-dependent, dynamically varying convex constraints and accommodate variable-dimensional outputs—a challenge unaddressed by standard architectures. Method: This paper proposes ProjNet, a novel framework featuring a differentiable, nonsmooth constraint projection module that integrates sparse vector clipping with a GPU-accelerated component-averaged Dykstra (CAD) algorithm, coupled with a surrogate gradient technique for end-to-end training. Contribution/Results: The authors establish a convergence guarantee for CAD, and ProjNet substantially improves optimization efficiency and stability on large-scale graph data. Experiments on linear programming, two classes of nonconvex quadratic programming, and wireless power allocation demonstrate high solution accuracy, strong generalization across problem instances, and excellent scalability with graph size.

📝 Abstract
Many machine learning applications require outputs that satisfy complex, dynamic constraints. This task is particularly challenging in Graph Neural Network models due to the variable output sizes of graph-structured data. In this paper, we introduce ProjNet, a Graph Neural Network framework which satisfies input-dependent constraints. ProjNet combines a sparse vector clipping method with the Component-Averaged Dykstra (CAD) algorithm, an iterative scheme for solving the best-approximation problem. We establish a convergence result for CAD and develop a GPU-accelerated implementation capable of handling large-scale inputs efficiently. To enable end-to-end training, we introduce a surrogate gradient for CAD that is both computationally efficient and better suited for optimization than the exact gradient. We validate ProjNet on four classes of constrained optimization problems: linear programming, two classes of non-convex quadratic programs, and radio transmit power optimization, demonstrating its effectiveness across diverse problem settings.
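The best-approximation problem the abstract refers to is: given a point, find the closest point in the intersection of several convex sets. CAD is a component-averaged variant of Dykstra's algorithm; the paper's GPU-accelerated version is not reproduced here, but a minimal sketch of the classical sequential Dykstra scheme conveys the idea. The two example constraint sets (a box and a halfspace) and all function names are illustrative choices, not taken from the paper:

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box {x : lo <= x <= hi}."""
    return np.clip(x, lo, hi)

def project_halfspace(x, a, b):
    """Euclidean projection onto the halfspace {x : a @ x <= b}."""
    viol = a @ x - b
    if viol <= 0:
        return x
    return x - (viol / (a @ a)) * a

def dykstra(x0, projections, n_iter=200):
    """Classical Dykstra's algorithm: best approximation of x0 in the
    intersection of convex sets, each given by its projection operator.
    Unlike plain alternating projections, the correction terms make the
    limit the *closest* feasible point, not just any feasible point."""
    x = x0.copy()
    corrections = [np.zeros_like(x0) for _ in projections]
    for _ in range(n_iter):
        for i, proj in enumerate(projections):
            y = proj(x + corrections[i])
            corrections[i] = x + corrections[i] - y
            x = y
    return x
```

For example, projecting (2, 2) onto the intersection of the unit box and the halfspace x1 + x2 <= 1.5 converges to (0.75, 0.75), the true nearest feasible point.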
Problem

Research questions and friction points this paper is trying to address.

Enforcing convex constraints in Graph Neural Networks with variable output sizes
Developing iterative projection methods for constraint satisfaction in graph data
Solving constrained optimization problems across linear and non-convex programming domains
Innovation

Methods, ideas, or system contributions that make the work stand out.

ProjNet framework enforces input-dependent convex constraints
Combines sparse clipping with GPU-accelerated CAD algorithm
Uses surrogate gradient for efficient end-to-end training
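The surrogate-gradient idea in the last bullet can be illustrated with a clipping projection: the exact gradient of a projection is zero wherever a constraint is active, which stalls end-to-end training. A common workaround is a straight-through-style surrogate that keeps gradients flowing; this is an assumed stand-in, not the paper's actual surrogate for CAD:

```python
import numpy as np

def clip_forward(x, lo, hi):
    """Forward pass: project outputs onto the box [lo, hi]."""
    return np.clip(x, lo, hi)

def clip_backward_exact(x, lo, hi, grad_out):
    """Exact gradient of clipping: zero where the constraint is
    active, so out-of-range outputs receive no training signal."""
    mask = (x >= lo) & (x <= hi)
    return grad_out * mask

def clip_backward_surrogate(x, lo, hi, grad_out):
    """Straight-through surrogate (an illustrative assumption, not the
    paper's construction): treat the projection as the identity in the
    backward pass so gradients still reach out-of-range outputs."""
    return grad_out
```

At x = 2 with box [0, 1], the exact backward pass returns 0 while the surrogate passes the incoming gradient through, which is the kind of trade-off the abstract describes between the exact gradient and one "better suited for optimization".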