🤖 AI Summary
This work proposes Continuous Progressive Neural Networks (cPNN) to address three intertwined challenges in non-stationary data streams: temporal dependencies, concept drift, and catastrophic forgetting. By combining recurrent neural network architectures with a streaming-oriented stochastic gradient descent strategy, cPNN uses a dynamically expandable network structure that learns new concepts efficiently while preserving historical knowledge. As the first unified framework tackling these interrelated issues simultaneously, cPNN speeds up adaptation and improves robustness to distributional shifts without losing memory of previously learned tasks. An ablation study further confirms its strong performance under concept drift, demonstrating that it maintains both stability and plasticity in dynamic environments.
📝 Abstract
Dealing with an unbounded data stream involves overcoming the assumption that data is independent and identically distributed. A data stream can, in fact, exhibit temporal dependencies (i.e., be a time series), and its distribution can change over time (concept drift). Both problems have been studied in depth, but existing solutions address them separately: a joint solution is absent. In addition, learning multiple concepts implies remembering the past (a.k.a. avoiding catastrophic forgetting, in Neural Networks' terminology). This work proposes Continuous Progressive Neural Networks (cPNN), a solution that tames concept drift, handles temporal dependencies, and bypasses catastrophic forgetting. cPNN is a continuous version of Progressive Neural Networks, a methodology for remembering old concepts and transferring past knowledge to fit new concepts quickly. We base our method on Recurrent Neural Networks and exploit Stochastic Gradient Descent applied to data streams with temporal dependencies. Results of an ablation study show that cPNN adapts quickly to new concepts and is robust to drifts.
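The core mechanism described above — freezing existing columns when a new concept arrives and growing a fresh recurrent column that receives lateral connections from the frozen ones — can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes simple Elman-style recurrent cells, a hypothetical `add_column` trigger standing in for an external drift detector, and forward passes only (no training loop).

```python
import numpy as np

class Column:
    """One recurrent column: a simple Elman cell plus lateral inputs
    from each previously frozen column (an illustrative stand-in for
    the RNN architectures used in cPNN)."""
    def __init__(self, in_dim, hid_dim, n_laterals, rng):
        self.W_in = rng.normal(0.0, 0.1, (hid_dim, in_dim))
        self.W_rec = rng.normal(0.0, 0.1, (hid_dim, hid_dim))
        # One lateral matrix per older column, enabling knowledge transfer.
        self.W_lat = [rng.normal(0.0, 0.1, (hid_dim, hid_dim))
                      for _ in range(n_laterals)]
        self.frozen = False  # frozen columns keep their weights fixed

    def step(self, x, h, lateral_hs):
        pre = self.W_in @ x + self.W_rec @ h
        for W, h_lat in zip(self.W_lat, lateral_hs):
            pre += W @ h_lat  # read the older columns' hidden states
        return np.tanh(pre)

class cPNN:
    """Sketch of the progressive idea: on concept drift, freeze all
    columns and append a new trainable one."""
    def __init__(self, in_dim, hid_dim, seed=0):
        self.in_dim, self.hid_dim = in_dim, hid_dim
        self.rng = np.random.default_rng(seed)
        self.columns = [Column(in_dim, hid_dim, 0, self.rng)]

    def add_column(self):
        """Called when a drift detector signals a new concept
        (drift detection itself is outside this sketch)."""
        for c in self.columns:
            c.frozen = True
        self.columns.append(
            Column(self.in_dim, self.hid_dim, len(self.columns), self.rng))

    def forward(self, seq):
        """Run a sequence through all columns; return the newest
        column's final hidden state."""
        hs = [np.zeros(self.hid_dim) for _ in self.columns]
        for x in seq:
            new_hs = []
            for i, col in enumerate(self.columns):
                # Column i sees the current-step hidden states of columns 0..i-1.
                new_hs.append(col.step(x, hs[i], new_hs[:i]))
            hs = new_hs
        return hs[-1]

net = cPNN(in_dim=3, hid_dim=4)
h = net.forward(np.ones((5, 3)))   # hidden state for the current concept
net.add_column()                   # drift detected: grow a new column
```

Only the newest column would be updated by gradient descent; the frozen columns preserve earlier concepts (no catastrophic forgetting) while their lateral connections let the new column reuse past representations.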