Feature-based Evolutionary Diversity Optimization of Discriminating Instances for Chance-constrained Optimization Problems

📅 2025-01-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses chance-constrained optimization, specifically the stochastic maximum coverage problem with random weights characterized by their means and variances. To rigorously benchmark algorithms under uncertainty, we construct a diverse, discriminative suite of synthetic instances. Our key innovation is the first integration of feature-driven diversity optimization into chance-constrained instance generation—jointly controlling both distributional diversity across a multidimensional problem feature space and discriminability of algorithmic performance differences. We employ a feature-space-based (μ+1) evolutionary algorithm, incorporating chance-constrained modeling, stochastic instance encoding, and a tailored fitness function to generate high-discrimination, broad-coverage benchmarks. Empirical evaluation demonstrates that the generated instances clearly separate solution difficulty and performance profiles of two representative algorithms. This work establishes a novel, principled paradigm for fair, feature-aware evaluation of stochastic optimization algorithms.
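The (μ+1) evolutionary loop described above can be sketched generically. This is an illustrative sketch only: `mutate` and `diversity_score` are hypothetical placeholders standing in for the paper's instance mutation operator and its feature-space diversity measure, which are not specified on this page.

```python
import random

def mu_plus_one_ea(init_pop, mutate, diversity_score, generations=1000):
    """Generic (mu+1) EA sketch for diversity optimization: each generation
    produces one mutant, then keeps the mu-subset of parents+child that
    maximizes the population-level diversity score."""
    pop = list(init_pop)
    mu = len(pop)
    for _ in range(generations):
        parent = random.choice(pop)        # uniform parent selection
        child = mutate(parent)             # problem-specific mutation
        candidates = pop + [child]
        # Survivor selection: drop the individual whose removal
        # hurts the diversity score the least.
        pop = max(
            (candidates[:i] + candidates[i + 1:] for i in range(len(candidates))),
            key=diversity_score,
        )
    return pop
```

In the paper's setting each individual would be a whole problem instance and the diversity score would be computed over instance feature vectors, with discriminating behavior between the two algorithms enforced as an additional constraint or fitness term.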

📝 Abstract
Algorithm selection is crucial in the field of optimization, as no single algorithm performs perfectly across all types of optimization problems. Finding the best algorithm among a given set of algorithms for a given problem requires a detailed analysis of the problem's features. To do so, it is important to have a diverse set of benchmarking instances highlighting the difference in algorithms' performance. In this paper, we evolve diverse benchmarking instances for chance-constrained optimization problems that contain stochastic components characterized by their expected values and variances. These instances clearly differentiate the performance of two given algorithms, meaning they are easy to solve by one algorithm and hard to solve by the other. We introduce a $(\mu+1)$ EA for feature-based diversity optimization to evolve such differentiating instances. We study the chance-constrained maximum coverage problem with stochastic weights on the vertices as an example of chance-constrained optimization problems. The experimental results demonstrate that our method successfully generates diverse instances based on different features while effectively distinguishing the performance between a pair of algorithms.
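The chance constraints mentioned in the abstract involve stochastic weights known only through their means and variances. A common way to evaluate such a constraint, Pr(total weight ≤ budget) ≥ α, is a deterministic surrogate based on a tail bound. The sketch below uses the one-sided Chebyshev inequality under an independence assumption; it is illustrative and not necessarily the paper's exact formulation.

```python
import math

def chance_constraint_satisfied(means, variances, budget, alpha=0.95):
    """Surrogate check for Pr(sum of weights <= budget) >= alpha, assuming
    independent weights. By the one-sided Chebyshev bound,
    Pr(W >= mu + k*sigma) <= 1 / (1 + k^2), so choosing
    k = sqrt(alpha / (1 - alpha)) guarantees the chance constraint
    whenever mu + k*sigma <= budget."""
    mu = sum(means)
    sigma = math.sqrt(sum(variances))
    k = math.sqrt(alpha / (1.0 - alpha))
    return mu + k * sigma <= budget
```

Note how variance tightens the constraint: a solution whose expected weight fits the budget can still be rejected once its uncertainty is accounted for, which is exactly what makes these instances harder than their deterministic counterparts.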
Problem

Research questions and friction points this paper is trying to address.

Uncertainty Optimization
Algorithm Performance
Test Cases Design
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uncertainty Optimization
Performance Differentiation
Vertex-Weighted Max-Cover
Saba Sadeghi Ahouei
Optimisation and Logistics, School of Computer and Mathematical Sciences, The University of Adelaide, Adelaide, Australia
Denis Antipov
LIP6, Sorbonne University, Paris, France
Aneta Neumann
Researcher, The University of Adelaide, Australia
Artificial Intelligence · Bio-inspired Computation · Optimisation under Uncertainty · Quality Diversity
Frank Neumann
Optimisation and Logistics, School of Computer and Mathematical Sciences, The University of Adelaide, Adelaide, Australia