Extropy Rate: Properties and Application in Feature Selection

📅 2025-07-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
Conventional uncertainty measures for discrete random variables have limitations in quantifying complexity and intrinsic information. Method: the paper introduces the concept of *extropy rate*, an entropy-like measure of average uncertainty per variable, and builds a systematic theoretical framework around it: conditional and joint extropies are defined, the asymptotic behaviour for stationary ergodic processes is characterized, and an intrinsic connection to Simpson's diversity index is uncovered. A new feature selection criterion is proposed that prioritizes features with higher extropy rates, on the grounds that they carry richer inherent structure; robustness is analyzed via information-theoretic bounds and Lipschitz continuity. Results: on six benchmark datasets, the method outperforms several popular feature selection algorithms in classification accuracy and feature representativeness, demonstrating the value of the extropy rate in complexity quantification, chaotic system analysis, and machine learning.

📝 Abstract
Extropy, a complementary dual of entropy proposed by Lad et al. (2015), has attracted considerable interest from the research community. In this study, we focus on discrete random variables and define conditional extropy, establishing key properties of joint and conditional extropy such as bounds, uncertainty reduction due to additional information, and Lipschitz continuity. We further introduce the concept of extropy rate for a stochastic process of discrete random variables as a measure of the average uncertainty per random variable within the process. It is observed that for infinite stationary and ergodic stochastic processes, as well as for independently and identically distributed sequences, the extropy rate exhibits asymptotic equivalence. We explore the extropy rate for finite stochastic processes and numerically illustrate its effectiveness in capturing the underlying information across various distributions, quantifying complexity in time series data, and characterizing chaotic dynamics in dynamical systems. The behaviour of the estimated extropy rate is observed to align closely with Simpson's diversity index. The real-life applicability of the extropy rate is presented through a novel feature selection method based on the fact that features with higher extropy rates contain greater inherent information. Using six publicly available datasets, we show the superiority of the proposed feature selection method over some other existing popular approaches.
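To make the quantities in the abstract concrete, here is a minimal sketch of the discrete extropy of Lad et al. (2015), J(X) = -Σᵢ (1 - pᵢ) ln(1 - pᵢ), alongside Simpson's diversity index D = 1 - Σᵢ pᵢ². The function names are illustrative, and the alignment with Simpson's index is only demonstrated numerically here, not the paper's formal result.

```python
import numpy as np

def extropy(p):
    """Discrete extropy J(X) = -sum_i (1 - p_i) * ln(1 - p_i) (Lad et al., 2015)."""
    p = np.asarray(p, dtype=float)
    q = 1.0 - p
    mask = q > 0  # a cell with p_i = 1 contributes 0 in the limit q*ln(q) -> 0
    return -np.sum(q[mask] * np.log(q[mask]))

def simpson_diversity(p):
    """Simpson's diversity index D = 1 - sum_i p_i^2."""
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p ** 2)

uniform = np.full(4, 0.25)
skewed = np.array([0.7, 0.1, 0.1, 0.1])
# Both measures rank the more even distribution as more uncertain/diverse.
assert extropy(uniform) > extropy(skewed)
assert simpson_diversity(uniform) > simpson_diversity(skewed)
```

For a two-point distribution, extropy coincides with Shannon entropy, which is one reason it behaves as a complementary dual.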
Problem

Research questions and friction points this paper is trying to address.

Defining extropy rate for stochastic processes to measure uncertainty
Exploring extropy rate's effectiveness in quantifying time series complexity
Proposing extropy-based feature selection for higher information retention
Innovation

Methods, ideas, or system contributions that make the work stand out.

Defines conditional extropy for discrete variables
Introduces extropy rate for stochastic processes
Proposes feature selection using extropy rate
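The feature selection idea in the last bullet can be sketched as ranking features by their estimated extropy and keeping the top-k. This is a hypothetical illustration only: the histogram-based estimator, the equal-width binning, and the function names are assumptions for the sketch, not the paper's exact extropy-rate estimator.

```python
import numpy as np

def empirical_extropy(values, bins=8):
    """Estimate the extropy of a feature from a histogram pmf.

    Equal-width binning is an illustrative choice, not the paper's estimator.
    """
    counts, _ = np.histogram(values, bins=bins)
    p = counts / counts.sum()
    q = 1.0 - p
    mask = q > 0  # skip the degenerate cell with p_i = 1
    return -np.sum(q[mask] * np.log(q[mask]))

def select_features(X, k, bins=8):
    """Return indices of the k columns of X with the highest estimated extropy."""
    scores = np.array([empirical_extropy(X[:, j], bins) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k]

rng = np.random.default_rng(0)
# A spread-out feature carries more information than a constant one,
# so the criterion should pick column 0.
X = np.column_stack([rng.uniform(size=500), np.full(500, 3.0)])
assert select_features(X, k=1)[0] == 0
```

Selecting by higher extropy mirrors the paper's premise that such features contain greater inherent information; the paper's actual criterion uses the extropy *rate* over the process, which this per-feature sketch only approximates.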
Naveen Kumar
Department of Mathematics, Indian Institute of Technology Jodhpur, Karwar, Jodhpur, 342030, Rajasthan, India.
Vivek Vijay
IIT Jodhpur