🤖 AI Summary
This work addresses the limited tightness of convex relaxations in mixed-integer nonlinear programming (MINLP) by proposing a polyhedral relaxation method grounded in computational geometry. It employs a convexification strategy that selects points whose convex hull iteratively approximates the simultaneous convex hull of factorable function graphs, and it introduces novel explicit inequalities that strengthen standard factorable relaxations. Theoretically, it proves that for multilinear functions over axis-aligned domains, the simultaneous convex hull is fully determined by the function values at the corner points. The approach further integrates voxelization with the QuickHull algorithm to efficiently approximate feasible regions. Computational experiments demonstrate that the method closes an additional 20–25% of the optimality gap on half of the random polynomial instances and outperforms existing techniques on approximately 30% of MINLPLib instances, with roughly 10% of cases achieving a gap reduction exceeding 50%.
📝 Abstract
We present a novel relaxation framework for general mixed-integer nonlinear programming (MINLP) grounded in computational geometry. Our approach constructs polyhedral relaxations by convexifying finite sets of strategically chosen points, iteratively refining the approximation to converge toward the simultaneous convex hull of factorable function graphs. The framework is underpinned by three key contributions: (i) a new class of explicit inequalities for products of functions that strictly improve upon standard factorable and composite relaxation schemes; (ii) a proof establishing that the simultaneous convex hull of multilinear functions over axis-aligned regions is fully determined by their values at corner points, thereby generalizing existing results from hypercubes to arbitrary axis-aligned domains; and (iii) the integration of computational geometry tools, specifically voxelization and QuickHull, to efficiently approximate feasible regions and function graphs. We implement this framework and evaluate it on randomly generated polynomial optimization problems and a suite of 619 instances from \texttt{MINLPLib}. Numerical results demonstrate significant improvements over state-of-the-art benchmarks: on polynomial instances, our relaxation closes an additional 20--25\% of the optimality gap relative to standard methods on half the instances. Furthermore, compared against an enhanced factorable programming baseline and Gurobi's root-node bounds, our approach yields superior dual bounds on approximately 30\% of \texttt{MINLPLib} instances, with roughly 10\% of cases exhibiting a gap reduction exceeding 50\%.
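The corner-point result in contribution (ii) can be illustrated on the simplest multilinear function, $f(x,y)=xy$. The sketch below is not the paper's implementation; it only demonstrates, with scipy's QuickHull-based `ConvexHull` and a small feasibility LP, that the convex hull of the graph of $xy$ over an axis-aligned box is spanned by its four corner points (the classical McCormick tetrahedron). The box bounds and test points are illustrative choices.

```python
import numpy as np
from scipy.spatial import ConvexHull
from scipy.optimize import linprog

# Corner points of the graph of f(x,y) = x*y over the axis-aligned box
# [xl, xu] x [yl, yu]. Per the corner-point characterization, the convex
# hull of the full graph is determined by these four points alone.
xl, xu, yl, yu = 0.5, 2.0, -1.0, 3.0
corners = np.array([(x, y, x * y) for x in (xl, xu) for y in (yl, yu)])

# QuickHull (via scipy) on the corner points: four affinely independent
# points in R^3, so the hull is a tetrahedron with all four as vertices.
hull = ConvexHull(corners)

def in_hull(p, pts):
    """Check that p is a convex combination of pts via a feasibility LP."""
    n = len(pts)
    # Variables lambda_i >= 0 with sum(lambda) = 1 and sum(lambda_i * pts_i) = p.
    A_eq = np.vstack([pts.T, np.ones(n)])
    b_eq = np.append(p, 1.0)
    res = linprog(np.zeros(n), A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * n)
    return res.success

# Arbitrary graph points (x, y, x*y) all lie inside the corner hull.
for x, y in [(1.0, 0.0), (1.5, 2.0), (0.75, -0.5)]:
    assert in_hull(np.array([x, y, x * y]), corners)
```

The feasibility LP is a generic membership test; the paper's framework instead builds explicit inequalities and refines the point set iteratively, but the geometric fact exercised here is the same.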