🤖 AI Summary
This work investigates the convexity of the normalized logarithmic moment-generating function (log-MGF) of convex functions applied to Gaussian variables, and its intrinsic connection to an “Ehrhard-like property.” Employing tools from convex analysis, Gaussian space theory, probabilistic measure transformations, and geometric concepts—including Rényi divergence and intrinsic volumes of cones—we establish, for the first time, an equivalence between the Ehrhard-like property and strict convexity of the log-MGF: distributions satisfying this property are precisely convex images of Gaussian variables; moreover, the log-MGF is strictly convex for non-Gaussian cases and affine in the Gaussian case. As a consequence, we derive optimal Rényi divergence comparisons between Gaussian and strongly log-concave variables, and generalize—while sharpening—the classical McMullen inequality.
📝 Abstract
We investigate a convexity property of the normalized log moment generating function, continuing a recent investigation of Chen into convex images of Gaussians. We show that any variable whose distribution function satisfies an ``Ehrhard-like'' property has a strictly convex normalized log moment generating function, unless the variable is Gaussian, in which case the function is affine. Moreover, we characterize the variables satisfying the Ehrhard-like property as exactly the convex images of Gaussians. As applications, we derive sharp comparisons between the Rényi divergences of a Gaussian and a strongly log-concave variable, and characterize the equality case. We also demonstrate essentially optimal concentration bounds for the sequence of conic intrinsic volumes associated to a convex cone, and we obtain a reversal of McMullen's inequality between the sum of the (Euclidean) intrinsic volumes of a convex body and the body's mean width, generalizing and sharpening a result of Alonso-Hernandez-Yepes.
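As a hedged illustration of the affine Gaussian case, here is one common normalization convention (dividing the log-MGF by its argument; the paper's exact normalization is not reproduced here and this choice is an assumption), under which the Gaussian computation is immediate:

```latex
% Assumed convention (not taken from the paper): normalize the log-MGF by t.
\Lambda_X(t) \;=\; \frac{1}{t}\,\log \mathbb{E}\, e^{tX}.
% For a Gaussian $X \sim \mathcal{N}(\mu,\sigma^2)$ the log-MGF is
%   \log \mathbb{E}\, e^{tX} \;=\; \mu t + \tfrac{1}{2}\sigma^2 t^2,
% so under this normalization
%   \Lambda_X(t) \;=\; \mu + \tfrac{1}{2}\sigma^2 t,
% which is affine in $t$, consistent with the affine Gaussian case stated above.
```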