🤖 AI Summary
To address the challenges of high model complexity, poor interpretability, and exponential growth in network size inherent in multi-source Bayesian network fusion, this paper proposes the Greedy Min-Cut Bayesian Consensus (GMCBC) algorithm. GMCBC is the first method to integrate min-cut analysis, based on the Ford-Fulkerson algorithm, with backward equivalence search (BES) to efficiently construct a structural consensus within the space of Bayesian equivalence classes while rigorously bounding edge count and inference complexity. Compared to conventional fusion approaches, GMCBC achieves significantly higher structural accuracy and dependency fidelity on synthetic data and in federated learning simulations. Crucially, the resulting fused network remains comparable in size to the individual input networks, ensuring scalability and interpretability. By jointly optimizing structural fidelity and computational tractability, GMCBC overcomes the fundamental trade-off between complexity and accuracy that constrains existing methods.
📝 Abstract
This paper presents the Greedy Min-Cut Bayesian Consensus (GMCBC) algorithm for the structural fusion of Bayesian Networks (BNs). The method is designed to preserve essential dependencies while controlling network complexity. It addresses the limitations of traditional fusion approaches, which often lead to excessively complex models that are impractical for inference, reasoning, or real-world applications. As the number and size of input networks increase, this issue becomes even more pronounced. GMCBC integrates principles from flow network theory into BN fusion, adapting the Backward Equivalence Search (BES) phase of the Greedy Equivalence Search (GES) algorithm and applying the Ford-Fulkerson algorithm for minimum cut analysis. This approach removes non-essential edges, ensuring that the fused network retains key dependencies while minimizing unnecessary complexity. Experimental results on synthetic Bayesian Networks demonstrate that GMCBC achieves near-optimal network structures. In federated learning simulations, GMCBC produces a consensus network that improves structural accuracy and dependency preservation compared to the average of the input networks, resulting in a structure that better captures the real underlying (in)dependence relationships. This consensus network also maintains a similar size to the original networks, unlike unrestricted fusion methods, where network size grows exponentially.
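The min-cut primitive the abstract refers to can be illustrated in isolation. The sketch below (not the paper's implementation; the graph, capacities, and function name are made up for this example) computes a minimum s-t cut with the Ford-Fulkerson method, using BFS augmenting paths (the Edmonds-Karp variant). GMCBC applies this kind of analysis to decide which edges of the fused network are non-essential.

```python
# Hedged sketch: Ford-Fulkerson (Edmonds-Karp) max-flow and the induced
# minimum s-t cut on a toy capacity graph. This only demonstrates the
# min-cut primitive, not GMCBC itself.
from collections import deque

def min_cut(capacity, source, sink):
    """Return (max_flow_value, cut_edges) for a capacity dict {u: {v: cap}}."""
    # Build residual capacities, adding zero-capacity reverse edges.
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for an augmenting path in the residual graph.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            break
        # Trace the path back and push the bottleneck flow along it.
        path, v = [], sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck
    # Min cut = original edges crossing from the source-reachable side.
    reachable, queue = {source}, deque([source])
    while queue:
        u = queue.popleft()
        for v, cap in residual[u].items():
            if cap > 0 and v not in reachable:
                reachable.add(v)
                queue.append(v)
    cut = {(u, v) for u in reachable for v in capacity.get(u, {})
           if v not in reachable and capacity[u][v] > 0}
    return flow, cut

caps = {"s": {"a": 3, "b": 2}, "a": {"t": 2}, "b": {"t": 3}}
print(min_cut(caps, "s", "t"))  # max flow 4; cut edges ('s','b') and ('a','t')
```

By max-flow/min-min-cut duality, the returned cut edges carry total capacity equal to the maximum flow; in a fusion setting, edges outside such minimal separators are candidates for removal.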