An Asymptotic Equation Linking WAIC and WBIC in Singular Models

📅 2025-05-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
In singular statistical models, such as those with latent variables or hierarchical structures, classical information criteria (AIC/BIC) fail because the normal approximations to the likelihood and posterior break down. WAIC and WBIC were proposed for Bayesian model selection in such settings, but they are defined with respect to posteriors at different temperatures and therefore generally require separate, computationally expensive sampling runs. Method: Leveraging singular learning theory, we conduct a rigorous Bayesian asymptotic analysis of temperature-scaled posteriors. Contribution/Results: We establish, for the first time, an asymptotic equation linking WAIC and WBIC in singular models. This result implies that WAIC can be estimated, asymptotically without bias, using only the single-temperature posterior samples already required for WBIC, eliminating the need for additional sampling. Our analysis not only uncovers a fundamental structural connection between WAIC and WBIC but also yields a computationally efficient, theoretically grounded approach to model selection in singular settings.
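For context, the two criteria discussed above have the following standard definitions from Watanabe's singular learning theory (not restated in this summary; the notation below is an assumption, with $\mathbb{E}_w$ and $\mathbb{V}_w$ the mean and variance over the posterior at inverse temperature $\beta$):

```latex
% WAIC uses the standard posterior (\beta = 1):
\mathrm{WAIC} = T_n + \frac{V_n}{n},
\qquad
T_n = -\frac{1}{n}\sum_{i=1}^{n} \log \mathbb{E}_w\bigl[p(x_i \mid w)\bigr],
\qquad
V_n = \sum_{i=1}^{n} \mathbb{V}_w\bigl[\log p(x_i \mid w)\bigr].

% WBIC uses a tempered posterior at \beta = 1/\log n:
\mathrm{WBIC} = \mathbb{E}_w^{\beta}\Bigl[-\sum_{i=1}^{n} \log p(x_i \mid w)\Bigr],
\qquad
\beta = \frac{1}{\log n}.
```

The different temperatures ($\beta = 1$ versus $\beta = 1/\log n$) are exactly why the two criteria normally require separate posterior sampling.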

📝 Abstract
In statistical learning, models are classified as regular or singular depending on whether the mapping from parameters to probability distributions is injective. Most models with hierarchical structures or latent variables are singular, for which conventional criteria such as the Akaike Information Criterion and the Bayesian Information Criterion are inapplicable due to the breakdown of normal approximations for the likelihood and posterior. To address this, the Widely Applicable Information Criterion (WAIC) and the Widely Applicable Bayesian Information Criterion (WBIC) have been proposed. Since WAIC and WBIC are computed using posterior distributions at different temperature settings, separate posterior sampling is generally required. In this paper, we theoretically derive an asymptotic equation that links WAIC and WBIC, despite their dependence on different posteriors. This equation yields an asymptotically unbiased expression of WAIC in terms of the posterior distribution used for WBIC. The result clarifies the structural relationship between these criteria within the framework of singular learning theory, and deepens understanding of their asymptotic behavior. This theoretical contribution provides a foundation for future developments in the computational efficiency of model selection in singular models.
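To make the abstract's point about "posterior distributions at different temperature settings" concrete, here is a minimal NumPy sketch of how the two criteria are computed from posterior samples, following Watanabe's standard definitions. The function names `waic` and `wbic` and the sample-matrix layout are illustrative assumptions, not the paper's implementation.

```python
import numpy as np


def waic(loglik):
    """WAIC from samples of the standard (beta = 1) posterior.

    loglik: array of shape (S, n), where loglik[s, i] = log p(x_i | w_s)
    for S posterior draws w_s and n data points.
    """
    S, n = loglik.shape
    # Bayes training loss T_n = -(1/n) sum_i log( (1/S) sum_s p(x_i | w_s) ),
    # computed stably in log space via a running log-add-exp.
    lppd = np.logaddexp.reduce(loglik, axis=0) - np.log(S)
    t_n = -lppd.mean()
    # Functional variance V_n = sum_i Var_s[ log p(x_i | w_s) ].
    v_n = loglik.var(axis=0).sum()
    return t_n + v_n / n


def wbic(neg_loglik):
    """WBIC from samples of the tempered posterior at beta = 1/log(n).

    neg_loglik: array of shape (S,), where neg_loglik[s] = -sum_i log p(x_i | w_s)
    and the draws w_s come from the posterior at inverse temperature 1/log(n).
    """
    return neg_loglik.mean()
```

Note that the two functions consume draws from two *different* posteriors (inverse temperature 1 versus 1/log n); that duplicated sampling cost is precisely what the paper's asymptotic equation would let one avoid.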
Problem

Research questions and friction points this paper is trying to address.

How are WAIC and WBIC related asymptotically in singular models, given that they are defined via posteriors at different temperatures?
What structural relationship connects the two criteria within singular learning theory?
Can the computational cost of model selection in singular models, namely separate posterior sampling for each criterion, be reduced?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Derives asymptotic equation linking WAIC and WBIC
Uses singular learning theory framework
Lays the groundwork for more computationally efficient model selection
Naoki Hayashi
Toyota Central R&D Labs.
Bayesian Statistics, Algebraic Statistics, Singular Learning Theory, Statistical Learning Theory
Takuro Kutsuna
TOYOTA CENTRAL R&D LABS., INC. Nagakute Campus, 41-1, Yokomichi, Nagakute, Aichi 480-1192, Japan.
Sawa Takamuku
AISIN CORPORATION Tokyo Research Center, Akihabara Dai Building 7F, 1-18-13 Sotokanda, Chiyoda-ku, Tokyo 101-0021, Japan.