🤖 AI Summary
This work studies efficient learning of $O(\log n)$-juntas under smoothed Markov Random Field (MRF) distributions, in which only the external fields are randomly perturbed; this broadly generalizes Kalai and Teng's junta-learning framework for product distributions. The proposed method has two phases: first, unsupervised MRF structure learning via statistical moment estimation; second, greedy supervised learning that leverages the recovered graphical structure. The analysis combines smoothed analysis with higher-order correlation techniques. This is the first result demonstrating that accurate MRF structure learning provably enables efficient supervised junta learning, yielding polynomial-time learnability. The work thus extends junta learning from independent (product) distributions to correlated distributions described by general MRF dependency graphs, giving the first theoretical guarantee of this kind for interpretable learning on structured, dependent data.
📝 Abstract
We give an algorithm for learning $O(\log n)$-juntas in polynomial time with respect to Markov Random Fields (MRFs) in a smoothed analysis framework where only the external field has been randomly perturbed. This is a broad generalization of the work of Kalai and Teng, who gave an algorithm that succeeded with respect to smoothed product distributions (i.e., MRFs whose dependency graph has no edges). Our algorithm has two phases: (1) an unsupervised structure-learning phase and (2) a greedy supervised-learning phase. This is the first example where algorithms for learning the structure of an undirected graphical model lead to provably efficient algorithms for supervised learning.
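To make the two-phase pipeline concrete, here is a minimal toy sketch, not the paper's actual algorithm: it uses i.i.d. $\pm 1$ samples in place of smoothed MRF samples, a trivially empty dependency graph in place of real structure learning, and simple correlation-based greedy selection in place of the paper's supervised phase. All names (`n`, `m`, `k`, `correlation`) are illustrative.

```python
import random

random.seed(0)

n, m, k = 8, 4000, 2  # n variables, m samples, k-junta (illustrative sizes)

# Toy data: i.i.d. +/-1 coordinates stand in for smoothed MRF samples.
X = [[random.choice([-1, 1]) for _ in range(n)] for _ in range(m)]
# Target 2-junta: an AND-like function of coordinates 0 and 3 only.
y = [1 if (x[0] == 1 and x[3] == 1) else -1 for x in X]

# Phase 1 (unsupervised, toy): estimate pairwise moments E[x_i * x_j] and
# connect i, j when the estimate is large. For product data the recovered
# dependency graph is (essentially) empty, matching the Kalai-Teng setting.
def pair_moment(i, j):
    return sum(x[i] * x[j] for x in X) / m

edges = [(i, j) for i in range(n) for j in range(i + 1, n)
         if abs(pair_moment(i, j)) > 0.1]

# Phase 2 (supervised, toy greedy): rank coordinates by the empirical
# moment E[x_i * y] and keep the top k as the candidate junta.
def label_moment(i):
    return sum(x[i] * yy for x, yy in zip(X, y)) / m

ranked = sorted(range(n), key=lambda i: -abs(label_moment(i)))
relevant = sorted(ranked[:k])
print(edges, relevant)  # greedy phase should recover coordinates [0, 3]
```

For this target, $\mathbb{E}[x_0 y] = \mathbb{E}[x_3 y] = 1/2$ while irrelevant coordinates have zero correlation, so with 4000 samples the greedy phase separates the junta coordinates from sampling noise by a wide margin.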