Mixed Fractional Information: Consistency of Dissipation Measures for Stable Laws

📅 2025-04-18
🤖 AI Summary
Classical Fisher information is infinite for symmetric α-stable (SαS) distributions with α < 2 due to their heavy tails, rendering it incapable of quantifying scale differences. Method: This paper introduces the Mixed Fractional Information (MFI), a finite, computable information measure defined as the initial rate of relative entropy dissipation under scale interpolation. Contribution/Results: We rigorously prove a consistency identity for MFI, establishing its intrinsic connections to fractional score differences and an MMSE-type scoring function. We derive two equivalent analytical expressions for MFI—demonstrating its non-negativity and proving that it vanishes if and only if the scales are equal—and obtain a closed-form solution for the Cauchy case (α = 1). Numerical experiments confirm MFI’s robustness and computational feasibility. As the first self-consistent fractional information framework for heavy-tailed systems, MFI enables the development of fractional I-MMSE relations and novel functional inequalities.

📝 Abstract
Symmetric α-stable (SαS) distributions with α < 2 lack finite classical Fisher information. Building on Johnson's framework, we define Mixed Fractional Information (MFI) via the initial rate of relative entropy dissipation during interpolation between SαS laws with differing scales v and s. We demonstrate two equivalent formulations of MFI in this SαS-to-SαS setting. The first involves the derivative D'(v) of the relative entropy between the two SαS densities. The second uses an integral expectation E_{g_v}[u(x,0)(pF_v(x) − pF_s(x))] involving the difference between the Fisher scores pF_v and pF_s and a specific MMSE-related score function u(x,0) derived from the interpolation dynamics. Our central contribution is a rigorous proof of the consistency identity D'(v) = (1/(αv)) E_{g_v}[X(pF_v(X) − pF_s(X))]. This identity validates the equivalence of the two MFI formulations for SαS inputs, establishing MFI's internal coherence and directly linking entropy dissipation rates to score-function differences. We further establish MFI's non-negativity (zero if and only if v = s), derive its closed-form expression for the Cauchy case (α = 1), and numerically validate the consistency identity. MFI provides a finite, coherent, and computable information-theoretic measure for comparing SαS distributions where classical Fisher information fails, connecting entropy dynamics to score functions and estimation concepts. This work lays a foundation for exploring potential fractional I-MMSE relations and new functional inequalities tailored to heavy-tailed systems.
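The Cauchy case (α = 1) admits a quick numerical check of the consistency identity D'(v) = (1/(αv)) E_{g_v}[X(pF_v(X) − pF_s(X))]. The sketch below is an illustration, not the authors' code: it assumes the pF terms reduce to the classical score ∂_x log g_v (the abstract's score notation is not fully specified here), estimates D'(v) by a central finite difference of the numerically integrated relative entropy, and compares it with the right-hand-side expectation.

```python
import numpy as np
from scipy.integrate import quad

def g(x, v):
    # Cauchy density (SαS with α = 1) at scale v
    return v / (np.pi * (v**2 + x**2))

def fisher_score(x, v):
    # Classical score of the Cauchy density: d/dx log g_v(x) = -2x / (v^2 + x^2)
    return -2.0 * x / (v**2 + x**2)

def kl(v, s):
    # Relative entropy D(g_v || g_s), computed by quadrature
    integrand = lambda x: g(x, v) * np.log(g(x, v) / g(x, s))
    return quad(integrand, -np.inf, np.inf, epsabs=1e-10, epsrel=1e-10)[0]

def lhs_dprime(v, s, h=1e-4):
    # D'(v): central finite difference of the relative entropy in v
    return (kl(v + h, s) - kl(v - h, s)) / (2.0 * h)

def rhs(v, s, alpha=1.0):
    # (1/(alpha*v)) * E_{g_v}[ X (score_v(X) - score_s(X)) ]
    integrand = lambda x: g(x, v) * x * (fisher_score(x, v) - fisher_score(x, s))
    return quad(integrand, -np.inf, np.inf, epsabs=1e-10, epsrel=1e-10)[0] / (alpha * v)

if __name__ == "__main__":
    v, s = 1.5, 0.7
    print("D'(v)        =", lhs_dprime(v, s))
    print("identity RHS =", rhs(v, s))
```

Under these assumptions the two printed values should agree to several decimal places, and both sides vanish when v = s, consistent with the paper's non-negativity claim.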
Problem

Research questions and friction points this paper is trying to address.

Classical Fisher information is infinite for SαS distributions with α < 2
Entropy-dissipation and score-based formulations of MFI require a consistency proof
Heavy-tailed stable distributions lack a finite, computable scale-comparison measure
Innovation

Methods, ideas, or system contributions that make the work stand out.

Defines Mixed Fractional Information (MFI) via the initial rate of relative entropy dissipation
Proves the consistency identity establishing the equivalence of MFI's two formulations
Provides a finite, computable measure for SαS distributions, with a closed form in the Cauchy case