🤖 AI Summary
To address the challenge of learning embedding representations for irregularly sampled stellar light curves, this paper proposes AstroCo, a Conformer-style self-supervised encoder that jointly models global temporal dependencies and local morphological features. Methodologically, it integrates multi-head attention, depthwise separable convolutions, and gated linear units to learn robust light-curve representations without labeled data. Evaluated on the MACHO R-band dataset, AstroCo reduces error rates by 70% and 61% relative to Astromer v1 and v2, respectively, while improving macro-F1 by roughly 7% (relative). The learned embeddings substantially outperform existing methods in few-shot classification and transfer well across datasets. This work positions AstroCo as an efficient, label-efficient foundation model for irregular time-series data in time-domain astronomy.
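To make the block structure concrete, here is a minimal PyTorch sketch of how attention, a gated linear unit, and a depthwise convolution might be combined in one Conformer-style block. The module layout, dimensions, and handling of padded (unobserved) time steps are illustrative assumptions, not AstroCo's published implementation; time-aware positional encodings for the irregular timestamps are assumed to be added to the input upstream.

```python
# Minimal sketch of a Conformer-style block (assumed layout, not AstroCo's code).
import torch
import torch.nn as nn

class ConformerBlock(nn.Module):
    def __init__(self, d_model=128, n_heads=4, conv_kernel=7, dropout=0.1):
        super().__init__()
        # Multi-head self-attention captures global temporal dependencies.
        self.attn_norm = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads,
                                          dropout=dropout, batch_first=True)
        # Convolution module: pointwise conv -> GLU gate -> depthwise conv,
        # capturing local morphological features of the light curve.
        self.conv_norm = nn.LayerNorm(d_model)
        self.pointwise_in = nn.Conv1d(d_model, 2 * d_model, kernel_size=1)
        self.glu = nn.GLU(dim=1)  # gated linear unit halves channels to d_model
        self.depthwise = nn.Conv1d(d_model, d_model, kernel_size=conv_kernel,
                                   padding=conv_kernel // 2, groups=d_model)
        self.pointwise_out = nn.Conv1d(d_model, d_model, kernel_size=1)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, key_padding_mask=None):
        # x: (batch, time, d_model); mask flags padded (unobserved) steps.
        h = self.attn_norm(x)
        h, _ = self.attn(h, h, h, key_padding_mask=key_padding_mask)
        x = x + self.dropout(h)                      # residual: global context
        h = self.conv_norm(x).transpose(1, 2)        # (batch, d_model, time)
        h = self.glu(self.pointwise_in(h))           # gated channel mixing
        h = self.pointwise_out(self.depthwise(h))    # local feature extraction
        x = x + self.dropout(h.transpose(1, 2))      # residual: local context
        return x

if __name__ == "__main__":
    x = torch.randn(2, 200, 128)          # 2 light curves, 200 observations
    print(ConformerBlock()(x).shape)      # torch.Size([2, 200, 128])
```

Placing the GLU gate inside the convolution module follows the original Conformer design from speech recognition, which the residual structure here mirrors.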
📝 Abstract
We present AstroCo, a Conformer-style encoder for irregular stellar light curves. By combining attention with depthwise convolutions and gating, AstroCo captures both global dependencies and local features. On MACHO R-band, AstroCo outperforms Astromer v1 and v2, yielding 70% and 61% lower error, respectively, and a relative macro-F1 gain of about 7%, while producing embeddings that transfer effectively to few-shot classification. These results highlight AstroCo's potential as a strong and label-efficient foundation for time-domain astronomy.
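Since the embeddings are evaluated via few-shot classification, a common protocol is to freeze the encoder and fit a linear probe on a small labeled subset. The sketch below assumes that protocol; the probe choice (logistic regression), the random stand-in embeddings, and the `few_shot_probe` helper are illustrative assumptions, as the paper's exact evaluation setup is not described here.

```python
# Hedged sketch of label-efficient evaluation with a frozen encoder:
# fit a linear probe on a handful of labeled light curves per class.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

def few_shot_probe(train_emb, train_labels, test_emb, test_labels):
    """Fit a linear classifier on frozen embeddings and report macro-F1."""
    clf = LogisticRegression(max_iter=1000)
    clf.fit(train_emb, train_labels)
    preds = clf.predict(test_emb)
    return f1_score(test_labels, preds, average="macro")  # metric used above

# Toy usage with random stand-in embeddings (replace with encoder outputs):
rng = np.random.default_rng(0)
train_emb, test_emb = rng.normal(size=(100, 128)), rng.normal(size=(400, 128))
train_y, test_y = rng.integers(0, 5, 100), rng.integers(0, 5, 400)
print(f"macro-F1: {few_shot_probe(train_emb, train_y, test_emb, test_y):.3f}")
```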