PEFT-MuTS: A Multivariate Parameter-Efficient Fine-Tuning Framework for Remaining Useful Life Prediction based on Cross-domain Time Series Representation Model

📅 2026-01-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of remaining useful life (RUL) prediction in small-sample scenarios, where existing methods typically require abundant degradation data from homogeneous equipment. To overcome this limitation, the authors propose PEFT-MuTS, a framework that leverages cross-domain pre-trained time series representation models and enables effective knowledge transfer through parameter-efficient fine-tuning. The approach introduces a dedicated feature adaptation network and a meta-variable-driven low-rank multivariate fusion mechanism to adapt univariate pre-trained models to multivariate RUL prediction, complemented by a zero-initialized regressor to stabilize fine-tuning under limited data. Demonstrating the first successful transfer from large-scale cross-domain time series pre-training to RUL estimation, PEFT-MuTS achieves superior performance on both aircraft engine and bearing datasets using less than 1% of target-device samples, significantly reducing the data requirements for high-accuracy predictions compared to current supervised and few-shot methods.

📝 Abstract
The application of data-driven remaining useful life (RUL) prediction has long been constrained by the availability of large amounts of degradation data. Mainstream solutions such as domain adaptation and meta-learning still rely on extensive historical degradation data from equipment identical or similar to the target, which severely limits their practical applicability. This study presents PEFT-MuTS, a parameter-efficient fine-tuning framework for few-shot RUL prediction built on cross-domain pre-trained time-series representation models. Contrary to the widely held view that knowledge transfer in RUL prediction can only occur between similar devices, we demonstrate that substantial benefits can be obtained by pre-training on large-scale cross-domain time-series datasets. An independent feature tuning network and a meta-variable-based low-rank multivariate fusion mechanism are developed to let the pre-trained univariate time-series backbone fully exploit the multivariate relationships in degradation data for the downstream RUL prediction task. Additionally, we introduce a zero-initialized regressor that stabilizes fine-tuning under few-shot conditions. Experiments on aero-engine and industrial bearing datasets demonstrate that our method achieves effective RUL prediction even when less than 1% of the target equipment's samples are used, while substantially outperforming conventional supervised and few-shot approaches and markedly reducing the data required for high predictive accuracy. Our code is available at https://github.com/fuen1590/PEFT-MuTS.
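The two stabilization ideas named in the abstract can be illustrated with a minimal NumPy sketch. All names and dimensions below are assumptions for illustration, not the paper's actual implementation: the low-rank fusion replaces a full variable-mixing matrix with a rank-r factorization, and the zero-initialized regressor outputs exactly zero before fine-tuning, so the first gradient steps cannot corrupt the pre-trained representations with large random predictions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (assumptions, not from the paper).
d_model, rank, n_vars = 64, 4, 14

# Low-rank multivariate fusion: mix per-variable features with a
# rank-r map W = A @ B instead of a full n_vars x n_vars matrix,
# so only (n_vars * rank * 2) parameters are trained.
A = rng.normal(0.0, 0.02, size=(n_vars, rank))
B = rng.normal(0.0, 0.02, size=(rank, n_vars))

# Zero-initialized regression head: all weights start at zero.
W_head = np.zeros((d_model, 1))
b_head = np.zeros(1)

# One pre-trained feature vector per sensor variable.
features = rng.normal(size=(n_vars, d_model))

fused = (A @ B) @ features                       # fused multivariate features
rul_pred = fused.mean(axis=0) @ W_head + b_head  # scalar RUL estimate

print(float(rul_pred))  # 0.0 before any fine-tuning step
```

Because the head is zero-initialized, the model's initial RUL estimate is a constant zero regardless of the backbone's features; training then moves the prediction away from that neutral starting point gradually, which is the stabilizing effect the abstract attributes to this design under few-shot conditions.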
Problem

Research questions and friction points this paper is trying to address.

Remaining Useful Life Prediction
Few-shot Learning
Cross-domain Time Series
Data Scarcity
Degradation Data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Parameter-Efficient Fine-Tuning
Cross-domain Time Series
Remaining Useful Life Prediction
Low-rank Multivariate Fusion
Few-shot Learning