One Search Fits All: Pareto-Optimal Eco-Friendly Model Selection

📅 2025-05-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
The high energy consumption of AI model training calls for general-purpose strategies that jointly optimize performance and energy efficiency. This paper proposes GREEN, the first eco-friendly model selection framework spanning computer vision (CV), natural language processing (NLP), and recommender systems. It introduces EcoTaskSet, the first large-scale, multi-task training-dynamics dataset, comprising over 1,767 experiments, and designs a lightweight, retraining-free configuration evaluator that dynamically recommends Pareto-optimal model configurations at inference time. By combining predictive modeling with multi-objective trade-off reasoning, GREEN jointly optimizes energy efficiency and performance. Experiments show that GREEN reduces training energy consumption (GPU kWh) by 38% on average across tasks while preserving ≥99.2% of baseline accuracy; moreover, 100% of its recommended configurations lie on the empirical Pareto frontier.

📝 Abstract
The environmental impact of Artificial Intelligence (AI) is emerging as a significant global concern, particularly regarding model training. In this paper, we introduce GREEN (Guided Recommendations of Energy-Efficient Networks), a novel, inference-time approach for recommending Pareto-optimal AI model configurations that optimize validation performance and energy consumption across diverse AI domains and tasks. Our approach directly addresses the limitations of current eco-efficient neural architecture search methods, which are often restricted to specific architectures or tasks. Central to this work is EcoTaskSet, a dataset comprising training dynamics from over 1767 experiments across computer vision, natural language processing, and recommendation systems using both widely used and cutting-edge architectures. Leveraging this dataset and a prediction model, our approach demonstrates effectiveness in selecting the best model configuration based on user preferences. Experimental results show that our method successfully identifies energy-efficient configurations while ensuring competitive performance.
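The core selection criterion described above is Pareto optimality over validation performance and energy consumption: a configuration is recommended only if no other configuration is at least as good on both objectives and strictly better on one. The sketch below illustrates that dominance check in plain Python; it is not the paper's implementation, and the configuration names and numbers are hypothetical.

```python
# Illustrative sketch (not GREEN's actual code): filter candidate model
# configurations to those that are Pareto-optimal on
# (validation performance: higher is better, training energy in kWh: lower is better).

def pareto_front(configs):
    """Return the configurations not dominated by any other configuration."""
    front = []
    for name, perf, energy in configs:
        dominated = any(
            p >= perf and e <= energy and (p > perf or e < energy)
            for _, p, e in configs
        )
        if not dominated:
            front.append((name, perf, energy))
    return front

# Hypothetical candidates: (name, validation accuracy, training energy in kWh).
candidates = [
    ("resnet18-small", 0.89, 2.1),
    ("resnet50-full", 0.93, 6.4),
    ("resnet50-early-stop", 0.92, 3.0),
    ("vit-base", 0.91, 8.7),  # dominated by resnet50-full: lower accuracy, more energy
]

print(pareto_front(candidates))
```

Given a user preference (e.g. a maximum energy budget or a minimum accuracy), the recommended configuration is then chosen from this front rather than from all candidates.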
Problem

Research questions and friction points this paper is trying to address.

Optimizing AI model configurations for energy efficiency and performance
Overcoming limitations of current eco-efficient neural architecture search methods
Providing a dataset and model for cross-domain eco-friendly AI recommendations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Pareto-optimal AI model configurations recommendation
EcoTaskSet, a dataset of training dynamics from over 1,767 experiments
Energy-efficient and performance-competitive model selection
Filippo Betello
DIAG, Sapienza University of Rome, Rome, Italy
Antonio Purificato
DIAG, Sapienza University of Rome, Rome, Italy
Vittoria Vineis
DIAG, Sapienza University of Rome, Rome, Italy
Gabriele Tolomei
Associate Professor of Computer Science at Sapienza University of Rome
Machine Learning · Explainable AI · Federated Learning · Adversarial Learning · Web Search & Advertising
Fabrizio Silvestri
Sapienza University of Rome
Machine Learning · Artificial Intelligence · Natural Language Processing · RAG · Web