Transfer learning optimization based on evolutionary selective fine-tuning

📅 2025-08-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
Full-parameter fine-tuning suffers from overfitting and high computational overhead. To address this, we propose BioTune, a selective fine-tuning method that adaptively identifies and updates only the network layers most critical for a target domain, guided by an evolutionary algorithm. Its core innovation lies in jointly optimizing parameter efficiency and generalization via evolutionary search, integrating seamlessly into deep transfer learning without relying on manually predefined layer-selection heuristics. Evaluated across nine cross-domain image classification benchmarks, BioTune matches or exceeds the accuracy of state-of-the-art methods, including AutoRGN and LoRA, while reducing trainable parameters by an average of 72.4%. This demonstrates BioTune's dual advantage: strong parameter efficiency without sacrificing predictive performance.
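To illustrate where the reported parameter savings come from, the toy computation below counts trainable parameters under a binary layer-selection mask. The layer sizes and mask are hypothetical examples, not values from the paper:

```python
# Toy illustration (hypothetical layer sizes, not from the paper):
# a binary mask over layers determines which parameters are trainable.
layer_sizes = [1_000, 50_000, 200_000, 200_000, 10_000]  # params per layer
mask = [0, 0, 1, 0, 1]  # 1 = fine-tune this layer, 0 = keep frozen

total = sum(layer_sizes)
trainable = sum(s for s, m in zip(layer_sizes, mask) if m)
reduction = 100 * (1 - trainable / total)

print(f"trainable: {trainable}/{total} ({reduction:.1f}% reduction)")
```

Only the masked-in layers contribute gradients, so memory and compute during backpropagation scale with the selected subset rather than the full model.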

📝 Abstract
Deep learning has shown substantial progress in image analysis. However, the computational demands of large, fully trained models remain a consideration. Transfer learning offers a strategy for adapting pre-trained models to new tasks. Traditional fine-tuning often involves updating all model parameters, which can potentially lead to overfitting and higher computational costs. This paper introduces BioTune, an evolutionary adaptive fine-tuning technique that selectively fine-tunes layers to enhance transfer learning efficiency. BioTune employs an evolutionary algorithm to identify a focused set of layers for fine-tuning, aiming to optimize model performance on a given target task. Evaluation across nine image classification datasets from various domains indicates that BioTune achieves competitive or improved accuracy and efficiency compared to existing fine-tuning methods such as AutoRGN and LoRA. By concentrating the fine-tuning process on a subset of relevant layers, BioTune reduces the number of trainable parameters, potentially leading to decreased computational cost and facilitating more efficient transfer learning across diverse data characteristics and distributions.
Problem

Research questions and friction points this paper is trying to address.

Optimizing transfer learning with selective fine-tuning
Reducing computational costs in deep learning models
Preventing overfitting through evolutionary layer selection
Innovation

Methods, ideas, or system contributions that make the work stand out.

Evolutionary algorithm selects layers for fine-tuning
Selective fine-tuning reduces trainable parameters
Enhances transfer learning efficiency and accuracy