🤖 AI Summary
This paper addresses the computational intractability of exact directed acyclic graph (DAG) inference in causal structure learning. To overcome the superexponential number of acyclicity constraints that renders conventional integer programming infeasible, we propose a scalable global optimization framework based on dynamic constraint generation—specifically, a mixed-integer quadratic programming (MIQP) formulation that iteratively adds violated acyclicity constraints, avoiding full constraint enumeration. Our method integrates continuous relaxation, cutting-plane techniques, and flexible noise modeling (Gaussian and non-Gaussian), and recovers globally optimal DAGs on graphs with up to 50 nodes, where these are identifiable. Experiments demonstrate: (i) superior structural accuracy over leading local solvers under Gaussian noise; (ii) a 10–100× improvement in scalability compared to existing global methods; and (iii) robust performance across diverse noise distributions.
📝 Abstract
There has been growing interest in causal learning in recent years. Commonly used representations of causal structures, including Bayesian networks and structural equation models (SEMs), take the form of directed acyclic graphs (DAGs). We provide a novel mixed-integer quadratic programming formulation and an associated algorithm that identifies DAGs on up to 50 vertices, where these are identifiable. We call this method ExDAG, which stands for Exact learning of DAGs. Although a superexponential number of constraints is needed to prevent the formation of cycles, the algorithm adds only those constraints violated by incumbent solutions, rather than imposing all of them in each continuous relaxation. Our empirical results show that, under Gaussian noise, ExDAG outperforms local state-of-the-art solvers in terms of precision and outperforms global state-of-the-art solvers with respect to scaling. We also provide validation with respect to other noise distributions.
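To make the lazy-constraint idea concrete, the sketch below is a simplified, hypothetical stand-in for the cutting-plane loop: instead of solving the paper's MIQP relaxation at each step, a toy "solver" simply keeps a candidate edge set, and whenever the current selection contains a directed cycle, a cycle-elimination cut is recorded and enforced (here by dropping the weakest edge on the cycle). The function names `find_cycle` and `lazy_dag_selection` are illustrative, not from ExDAG.

```python
# Hypothetical sketch of lazy acyclicity-constraint generation for DAG
# learning. A real implementation would re-solve an MIQP relaxation after
# each added cut; here a toy step (drop the weakest edge on the cycle)
# stands in for the solver.

def find_cycle(edges, n):
    """Return one directed cycle as a vertex list, or None if acyclic."""
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
    color = {v: 0 for v in range(n)}  # 0 = unvisited, 1 = on stack, 2 = done
    parent = {}

    def dfs(u):
        color[u] = 1
        for w in adj[u]:
            if color[w] == 0:
                parent[w] = u
                cyc = dfs(w)
                if cyc:
                    return cyc
            elif color[w] == 1:           # back edge: cycle w -> ... -> u
                cyc, x = [u], u
                while x != w:
                    x = parent[x]
                    cyc.append(x)
                return cyc[::-1]
        color[u] = 2
        return None

    for v in range(n):
        if color[v] == 0:
            cyc = dfs(v)
            if cyc:
                return cyc
    return None

def lazy_dag_selection(candidate_edges, n):
    """Iteratively add cycle-elimination cuts until the selection is a DAG.

    candidate_edges: dict mapping (u, v) -> edge weight.
    Returns (surviving edges, list of generated cuts), where each cut is
    the edge set of one detected cycle ("at most |C|-1 of these edges").
    """
    edges = dict(candidate_edges)
    cuts = []
    while True:
        cyc = find_cycle(list(edges), n)
        if cyc is None:                   # no violated constraint remains
            return edges, cuts
        cycle_edges = [(cyc[i], cyc[(i + 1) % len(cyc)])
                       for i in range(len(cyc))]
        cuts.append(cycle_edges)          # record the violated constraint
        weakest = min(cycle_edges, key=lambda e: edges[e])
        del edges[weakest]                # toy enforcement of the new cut
```

Only cuts for cycles that actually appear in incumbent solutions are ever generated, which is the point: the full family of cycle constraints is superexponential, but the loop typically terminates after touching a small fraction of them.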