🤖 AI Summary
This paper addresses robust parameter identification for partially observed linear time-invariant (LTI) systems under heavy-tailed noise, relaxing the conventional Gaussian or sub-Gaussian assumptions to require only finite second moments. The authors introduce boosting-based ensemble estimation into the system identification framework, combining robust statistical estimation with non-asymptotic analysis to design an algorithm that is resilient to heavy-tailed disturbances. Theoretically, under mild excitation conditions requiring only a finite fourth moment of the input process, the method achieves sample complexity bounds that depend only logarithmically on the prescribed failure probability, nearly matching the bounds derived under sub-Gaussian noise. The key contribution lies in weakening the restrictive distributional assumptions on the noise while simultaneously attaining robustness and high statistical efficiency, a combination not previously achieved in system identification under heavy-tailed settings.
📝 Abstract
We consider the problem of system identification for partially observed linear time-invariant (LTI) systems. Given input-output data, we provide non-asymptotic guarantees for identifying the system parameters under general heavy-tailed noise processes. Unlike previous works that assume Gaussian or sub-Gaussian noise, we consider significantly broader noise distributions that are only required to have finite second moments. For this setting, we leverage tools from robust statistics to propose a novel system identification algorithm that exploits the idea of boosting. Despite the much weaker noise assumptions, we show that our proposed algorithm achieves sample complexity bounds that nearly match those derived under sub-Gaussian noise. In particular, we establish that our bounds retain a logarithmic dependence on the prescribed failure probability. Interestingly, we show that such bounds can be achieved by requiring only a finite fourth moment on the excitatory input process.
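To make the boosting idea concrete, here is a minimal sketch of the classic median-of-means style confidence boosting that underlies results of this kind. This is not the paper's algorithm (which handles partially observed LTI dynamics); it is a hypothetical toy regression example: a weak least-squares estimator, accurate only with constant probability under heavy-tailed noise, is run on disjoint blocks of data, and the block estimates are aggregated with a coordinate-wise median. The aggregated estimate then fails only if roughly half the blocks fail, so the failure probability decays exponentially in the number of blocks, which is exactly how a logarithmic dependence on the prescribed failure probability can arise.

```python
import numpy as np

def block_ols(X, y):
    """Weak estimator: ordinary least squares on one block of data."""
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return theta

def boosted_estimate(X, y, n_blocks):
    """Median-of-means style boosting: split the data into disjoint
    blocks, run the weak estimator on each, and aggregate the block
    estimates with a coordinate-wise median."""
    idx = np.array_split(np.arange(len(y)), n_blocks)
    estimates = np.stack([block_ols(X[i], y[i]) for i in idx])
    return np.median(estimates, axis=0)

rng = np.random.default_rng(0)
n, d = 6000, 3
theta_true = np.array([1.0, -0.5, 2.0])
X = rng.normal(size=(n, d))
# Student-t noise with 2.5 degrees of freedom: heavy-tailed, but with a
# finite second moment, mirroring the paper's noise assumption.
noise = rng.standard_t(df=2.5, size=n)
y = X @ theta_true + noise

theta_hat = boosted_estimate(X, y, n_blocks=30)
print("max coordinate error:", np.max(np.abs(theta_hat - theta_true)))
```

All names here (`block_ols`, `boosted_estimate`) are illustrative; the paper's actual estimator and its analysis for LTI systems are more involved, but the confidence-amplification mechanism sketched above is the same.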