Block-Sample MAC-Bayes Generalization Bounds

📅 2026-02-13
📈 Citations: 0
Influential: 0

📝 Abstract
We present a novel family of block-sample MAC-Bayes (mean approximately correct) bounds. While PAC-Bayes (probably approximately correct) bounds typically bound the generalization error with high probability, MAC-Bayes bounds have a similar form but bound the expected generalization error instead. The family of bounds we propose can be understood as a generalization of an expectation version of known PAC-Bayes bounds. Compared to standard PAC-Bayes bounds, the new bounds contain divergence terms that depend only on subsets (or \emph{blocks}) of the training data. The proposed MAC-Bayes bounds hold the promise of significantly improving upon the tightness of traditional PAC-Bayes and MAC-Bayes bounds. This is illustrated with a simple numerical example in which the original PAC-Bayes bound is vacuous regardless of the choice of prior, while the proposed family of bounds is finite for appropriate choices of the block size. We also explore the question of whether high-probability versions of our MAC-Bayes bounds (i.e., PAC-Bayes bounds of a similar form) are possible. We answer this question in the negative with an example showing that, in general, it is not possible to establish a PAC-Bayes bound which (a) vanishes at a rate faster than $\mathcal{O}(1/\log n)$ whenever the proposed MAC-Bayes bound vanishes at rate $\mathcal{O}(n^{-1/2})$ and (b) exhibits a logarithmic dependence on the permitted error probability.
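For orientation, the contrast the abstract draws can be made concrete with standard textbook forms of the two bound types. These are well-known generic statements (a McAllester-type PAC-Bayes bound and a typical in-expectation analogue), stated here for context and not taken from this paper; $P$ is a data-independent prior, $Q$ (or $Q_S$) a posterior over hypotheses, $L$ the population risk, and $\hat{L}_S$ the empirical risk on the $n$-sample $S$:

```latex
% PAC-Bayes (McAllester-type): holds with probability at least 1 - \delta over S,
% simultaneously for all posteriors Q; note the \ln(1/\delta) dependence.
\[
  \mathbb{E}_{h\sim Q}\!\left[L(h)\right]
  \;\le\; \mathbb{E}_{h\sim Q}\!\left[\hat{L}_S(h)\right]
  \;+\; \sqrt{\frac{\mathrm{KL}(Q\,\|\,P) + \ln(n/\delta)}{2(n-1)}}
\]
% MAC-style analogue: bounds the generalization error only in expectation over S,
% trading the high-probability guarantee for the removal of the \ln(1/\delta) term.
\[
  \mathbb{E}_{S}\,\mathbb{E}_{h\sim Q_S}\!\left[L(h) - \hat{L}_S(h)\right]
  \;\le\; \mathbb{E}_{S}\,\sqrt{\frac{\mathrm{KL}(Q_S\,\|\,P)}{2n}}
\]
```

The paper's block-sample variants replace the single divergence term $\mathrm{KL}(Q_S\,\|\,P)$ over the full sample with divergence terms that depend only on blocks of the training data, which is what enables finite bounds in the example where the full-sample PAC-Bayes bound is vacuous.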
Problem

Research questions and friction points this paper is trying to address.

PAC-Bayes
generalization bounds
MAC-Bayes
block-sample
expected generalization error
Innovation

Methods, ideas, or system contributions that make the work stand out.

MAC-Bayes
block-sample
generalization bounds
PAC-Bayes
expected generalization error
Matthias Frey
Department of Electrical and Electronic Engineering, The University of Melbourne
Jingge Zhu
University of Melbourne
Information Theory · Communication Systems · Statistical Learning Theory
Michael C. Gastpar
Laboratory for Information in Networked Systems, École polytechnique fédérale de Lausanne (EPFL)