🤖 AI Summary
This study addresses the context-sensitive binary hypothesis testing problem for i.i.d. observations under a multiplicative weight function, aiming to characterize the asymptotics of the optimal total risk. By embedding the weighted geometric mixture into an exponential family framework, the work proposes a weighted Chernoff information as the optimal error exponent and expresses it as the maximum of the log-normalizer of the associated exponential family. Leveraging exponential family modeling, weighted likelihood analysis, and concentration inequalities, the authors establish a logarithmic asymptotic expression for the total risk, derive explicit solutions for canonical models such as the Gaussian and Poisson, and provide concentration bounds for the tilted weighted log-likelihood. These results offer both theoretical foundations and computational tools for weighted hypothesis testing.
📝 Abstract
We consider context-sensitive (binary) hypothesis testing for i.i.d. observations under a multiplicative weight function. We establish the logarithmic asymptotics, as the sample size grows, of the optimal total loss (sum of type-I and type-II losses) and express the corresponding error exponent through a weighted Chernoff information between the competing distributions. Our approach embeds weighted geometric mixtures into an exponential family and identifies the exponent as the maximum of its log-normalizer. We also provide concentration bounds for a tilted weighted log-likelihood and derive explicit expressions for Gaussian and Poisson models, as well as further parametric examples.
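To make the error-exponent computation concrete, the following is a minimal numerical sketch, not the paper's exact construction: it *assumes* the weighted Chernoff information takes the form C_w(P, Q) = max over λ in [0, 1] of −log ∫ p(x)^λ q(x)^{1−λ} w(x) dx (the weight function `w`, the grid, and the parameter values are all illustrative choices). For the unweighted case w ≡ 1 and equal-variance Gaussians, this reduces to the classical Chernoff information μ²/(8σ²), attained at λ = 1/2, which serves as a sanity check.

```python
# Hypothetical sketch of a "weighted Chernoff information" between two
# Gaussians via grid-based maximization over the tilting parameter λ.
# The formula below is an assumed form, not the paper's definition.
import numpy as np


def weighted_chernoff(mu0, mu1, sigma, w, grid):
    """Return (lambda*, C_w) maximizing -log ∫ p^λ q^(1-λ) w over [0, 1]."""
    dx = grid[1] - grid[0]
    norm = sigma * np.sqrt(2 * np.pi)
    p = np.exp(-((grid - mu0) ** 2) / (2 * sigma**2)) / norm
    q = np.exp(-((grid - mu1) ** 2) / (2 * sigma**2)) / norm
    lambdas = np.linspace(0.0, 1.0, 501)
    # Riemann sum of the weighted geometric mixture for each λ.
    vals = np.array(
        [-np.log(np.sum(p**lam * q ** (1 - lam) * w(grid)) * dx) for lam in lambdas]
    )
    k = int(np.argmax(vals))
    return lambdas[k], vals[k]


grid = np.linspace(-10.0, 10.0, 20001)
# Sanity check with w ≡ 1: classical Chernoff information between
# N(0, 1) and N(2, 1) is 2²/8 = 0.5, attained at λ = 1/2.
lam_star, c = weighted_chernoff(0.0, 2.0, 1.0, lambda x: np.ones_like(x), grid)
print(lam_star, c)  # ≈ 0.5, ≈ 0.5
```

Swapping in a non-constant weight, e.g. `w = lambda x: np.exp(-np.abs(x))`, shifts both the optimal λ and the exponent, illustrating how context sensitivity modifies the achievable error rate.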