An Evolutionary Algorithm with Probabilistic Annealing for Large-scale Sparse Multi-objective Optimization

📅 2026-03-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the exploration–exploitation imbalance arising from the coexistence of high dimensionality and sparsity in large-scale sparse multi-objective optimization. To this end, a probabilistic-annealing-based bi-vector evolutionary algorithm is proposed, which employs two probability vectors with distinct entropy characteristics to collaboratively guide the search process. The low-entropy vector ensures convergence, while the annealing vector dynamically modulates the transition from global exploration to local refinement through an entropy-controlled strategy. Furthermore, a sparsity-aware mechanism is integrated to efficiently identify critical nonzero variables. Experimental results on benchmark problems and real-world applications show that the proposed algorithm outperforms state-of-the-art methods in both convergence and solution diversity.

📝 Abstract
Large-scale sparse multi-objective optimization problems (LSMOPs) are prevalent in real-world applications, where optimal solutions typically contain only a few nonzero variables, such as in adversarial attacks, critical node detection, and sparse signal reconstruction. Since the function evaluation of LSMOPs often relies on large-scale datasets involving a large number of decision variables, the search space becomes extremely high-dimensional. The coexistence of sparsity and high dimensionality greatly intensifies the conflict between exploration and exploitation, making it difficult for existing multi-objective evolutionary algorithms (MOEAs) to identify the critical nonzero decision variables within limited function evaluations. To address this challenge, this paper proposes an evolutionary algorithm with probabilistic annealing for large-scale sparse multi-objective optimization. The algorithm is driven by two probability vectors with distinct entropy characteristics: a convergence-oriented probability vector with relatively low entropy ensures stable exploitation, whereas an annealed probability vector with gradually decreasing entropy enables an adaptive transition from global exploration to local refinement. By integrating these complementary search dynamics, the proposed algorithm achieves a dynamic equilibrium between exploration and exploitation. Experimental results on benchmark problems and real-world applications demonstrate that the proposed algorithm outperforms state-of-the-art evolutionary algorithms in terms of both convergence and diversity.
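The two-vector dynamic described in the abstract can be sketched in a few lines. The sketch below is an illustrative assumption, not the paper's actual formulation: the power-law annealing schedule, the mixing rule, and all names (`annealed_probability`, `sample_sparse_mask`, `gamma`, `mix`) are hypothetical. It only demonstrates the general idea of sampling a sparse binary mask over decision variables from a blend of a low-entropy convergence vector and an exploratory vector whose entropy is gradually reduced.

```python
import numpy as np

def annealed_probability(p_explore, generation, max_gen, gamma=2.0):
    """Sharpen a probability vector over time (hypothetical schedule).

    Early generations leave it near-uniform (high entropy, global
    exploration); later generations push entries toward 0, lowering
    entropy and focusing the search (local refinement).
    """
    t = generation / max_gen          # search progress in [0, 1]
    exponent = 1.0 + gamma * t        # grows from 1 to 1 + gamma
    return p_explore ** exponent      # entries in (0, 1) shrink over time

def sample_sparse_mask(p_converge, p_anneal, rng, mix=0.5):
    """Draw a binary mask over decision variables by mixing the
    convergence-oriented vector with the annealed exploratory vector."""
    p = mix * p_converge + (1.0 - mix) * p_anneal
    return (rng.random(p.shape) < p).astype(int)

rng = np.random.default_rng(0)
d = 10
# Low-entropy vector: concentrated on a few variables believed nonzero.
p_converge = np.where(np.arange(d) < 3, 0.9, 0.05)
# Exploratory vector starts near-uniform (maximum-entropy Bernoulli).
p_explore = np.full(d, 0.5)

early = annealed_probability(p_explore, generation=1, max_gen=100)
late = annealed_probability(p_explore, generation=99, max_gen=100)
mask = sample_sparse_mask(p_converge, late, rng)
```

Under this sketch, `late` assigns smaller inclusion probabilities than `early`, so masks sampled late in the run contain fewer nonzero entries and concentrate on the variables favored by `p_converge`, mirroring the adaptive exploration-to-refinement transition the abstract describes.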
Problem

Research questions and friction points this paper is trying to address.

large-scale sparse multi-objective optimization
sparsity
high dimensionality
exploration-exploitation trade-off
multi-objective evolutionary algorithms
Innovation

Methods, ideas, or system contributions that make the work stand out.

probabilistic annealing
large-scale sparse multi-objective optimization
evolutionary algorithm
entropy-based search
exploration-exploitation balance