🤖 AI Summary
Deploying AI models on resource-constrained edge and IoT devices is hampered by limited computational capacity, energy budgets, and bandwidth. Method: This chapter proposes a "triple-frugality" co-optimization framework: (i) data-efficient learning at the input level; (ii) incremental knowledge distillation with adaptive regularization at the training level; and (iii) dynamic architecture design combined with model compression at the model level. It also presents a comprehensive taxonomy of frugal machine learning methods. The framework enables continual model updates without full retraining and supports energy-aware hardware–software co-optimization. Contribution/Results: Experiments show that, with less than 2% accuracy degradation, the framework reduces energy consumption by 40–70% and inference latency by more than 50%, markedly improving AI deployment efficiency on edge devices, especially under tight bandwidth and battery constraints.
📝 Abstract
Frugal Machine Learning (FML) refers to the practice of designing Machine Learning (ML) models that are efficient, cost-effective, and mindful of resource constraints. This field aims to achieve acceptable performance while minimizing the use of computational resources, time, energy, and data for both training and inference. FML strategies can be broadly categorized into input frugality, learning process frugality, and model frugality, each focusing on reducing resource consumption at different stages of the ML pipeline. This chapter explores recent advancements, applications, and open challenges in FML, emphasizing its importance for smart environments that incorporate edge computing and IoT devices, which often face strict limitations in bandwidth, energy, or latency. Technological enablers such as model compression, energy-efficient hardware, and data-efficient learning techniques are discussed, along with adaptive methods including parameter regularization, knowledge distillation, and dynamic architecture design that enable incremental model updates without full retraining. Furthermore, it provides a comprehensive taxonomy of frugal methods, discusses case studies across diverse domains, and identifies future research directions to drive innovation in this evolving field.
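To make one of the adaptive methods named above concrete, knowledge distillation trains a compact student model to match the softened output distribution of a larger teacher, so the student can run within edge-device budgets. A minimal NumPy sketch of the standard temperature-scaled distillation loss follows; the function names, the temperature value, and the example logits are illustrative assumptions, not taken from the chapter:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature > 1 softens the distribution, exposing the teacher's
    # relative confidence across wrong classes ("dark knowledge").
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (temperature ** 2) * kl.mean()

# A student whose logits match the teacher's incurs zero loss;
# a mismatched student is penalized.
teacher = np.array([[2.0, 1.0, 0.1]])
matched = distillation_loss(teacher.copy(), teacher)
mismatched = distillation_loss(np.array([[0.1, 1.0, 2.0]]), teacher)
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, and the incremental variant discussed in the chapter would distill from the previous model snapshot to update the student without full retraining.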