Generalization from Low- to Moderate-Resolution Spectra with Neural Networks for Stellar Parameter Estimation: A Case Study with DESI

📅 2026-02-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the challenge of estimating stellar parameters across heterogeneous spectroscopic surveys—specifically, transferring knowledge from low-resolution spectra (e.g., LAMOST) to medium-resolution data (e.g., DESI). The authors propose a transfer learning framework based on a multilayer perceptron (MLP), pre-trained on LAMOST spectra and systematically evaluated for zero-shot transfer and various fine-tuning strategies—including full-parameter fine-tuning, LoRA, and residual head adapters—on DESI data. Remarkably, the pre-trained MLP achieves strong performance without any fine-tuning, with modest improvements attainable through targeted adaptation. Comparative experiments reveal that MLPs operating directly on raw spectra excel for metal-poor stars, whereas Transformer-based embeddings yield superior iron abundance estimates for metal-rich stars, highlighting the surprising efficacy of simple MLP architectures in cross-survey spectral generalization.
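To make the framework described above concrete, the sketch below pictures a pre-trained MLP regressor together with the two lightweight adaptation mechanisms mentioned (a LoRA-style low-rank adapter and a residual head). It is purely illustrative and not the authors' implementation; the class names SpectrumMLP and LoRALinear, the layer sizes, and the three-label output (Teff, log g, [Fe/H]) are assumptions for the example.

```python
# Illustrative sketch only (not the authors' code): an MLP regressor over a
# normalized flux vector, with a LoRA-style low-rank adapter and a small
# residual head of the kind compared in the paper. Layer sizes, class names,
# and the three-label output are assumptions for this example.
import torch.nn as nn


class LoRALinear(nn.Module):
    """Frozen pre-trained linear layer plus a trainable low-rank update."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False            # keep the pre-trained weights fixed
        self.lora_a = nn.Linear(base.in_features, rank, bias=False)
        self.lora_b = nn.Linear(rank, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)     # adapter starts as a no-op
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))


class SpectrumMLP(nn.Module):
    """MLP mapping a spectrum (or an embedding vector) to Teff, logg, [Fe/H]."""

    def __init__(self, n_in: int = 3900, hidden: int = 512, n_labels: int = 3):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(n_in, hidden), nn.GELU(),
            nn.Linear(hidden, hidden), nn.GELU(),
        )
        self.head = nn.Linear(hidden, n_labels)
        # Residual-head adapter: a zero-initialized correction added to the
        # pre-trained head's output when adapting to the target survey.
        self.residual_head = nn.Linear(hidden, n_labels)
        nn.init.zeros_(self.residual_head.weight)
        nn.init.zeros_(self.residual_head.bias)

    def forward(self, flux):
        h = self.body(flux)
        return self.head(h) + self.residual_head(h)
```

In this sketch the residual head and the LoRA update are zero-initialized, so before any DESI fine-tuning the adapted model reproduces the pre-trained predictions, i.e., the zero-shot baseline.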

📝 Abstract
Cross-survey generalization is a critical challenge in stellar spectral analysis, particularly in cases such as transferring from low- to moderate-resolution surveys. We investigate this problem using pre-trained models, focusing on simple neural networks such as multilayer perceptrons (MLPs), with a case study transferring from LAMOST low-resolution spectra (LRS) to DESI medium-resolution spectra (MRS). Specifically, we pre-train MLPs on either LRS or their embeddings and fine-tune them for application to DESI stellar spectra. We compare MLPs trained directly on spectra with those trained on embeddings derived from transformer-based models (self-supervised foundation models pre-trained for multiple downstream tasks). We also evaluate different fine-tuning strategies, including residual-head adapters, LoRA, and full fine-tuning. We find that MLPs pre-trained on LAMOST LRS achieve strong performance, even without fine-tuning, and that modest fine-tuning with DESI spectra further improves the results. For iron abundance, embeddings from a transformer-based model yield advantages in the metal-rich ([Fe/H]>-1.0) regime, but underperform in the metal-poor regime compared to MLPs trained directly on LRS. We also show that the optimal fine-tuning strategy depends on the specific stellar parameter under consideration. These results highlight that simple pre-trained MLPs can provide competitive cross-survey generalization, while the role of spectral foundation models for cross-survey stellar parameter estimation requires further exploration.
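The fine-tuning strategies compared in the abstract (zero-shot, residual-head adapter, LoRA, full fine-tuning) differ mainly in which parameters are left trainable. The hedged sketch below, building on the illustrative SpectrumMLP and LoRALinear classes above, shows one way such a comparison could be wired up; the function name select_trainable and the strategy labels are hypothetical, not taken from the paper.

```python
# Hedged sketch of how the compared fine-tuning strategies could be selected
# by freezing different parameter groups; builds on the illustrative
# SpectrumMLP / LoRALinear classes above. Names are hypothetical.
import torch


def select_trainable(model, strategy: str):
    """Freeze everything, then unfreeze only the parameters the strategy trains."""
    for p in model.parameters():
        p.requires_grad = False
    if strategy == "zero_shot":
        pass                                    # evaluate the LAMOST-trained model as-is
    elif strategy == "residual_head":
        for p in model.residual_head.parameters():
            p.requires_grad = True              # train only the small additive head
    elif strategy == "lora":
        # Assumes the model's linear layers were wrapped in LoRALinear beforehand.
        for m in model.modules():
            if isinstance(m, LoRALinear):
                for p in list(m.lora_a.parameters()) + list(m.lora_b.parameters()):
                    p.requires_grad = True      # train only the low-rank updates
    elif strategy == "full":
        for p in model.parameters():
            p.requires_grad = True              # update every weight on DESI labels
    else:
        raise ValueError(f"unknown strategy: {strategy}")
    return [p for p in model.parameters() if p.requires_grad]


# Example: adapt only the residual head to DESI spectra.
model = SpectrumMLP()
optimizer = torch.optim.Adam(select_trainable(model, "residual_head"), lr=1e-4)
```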
Problem

Research questions and friction points this paper is trying to address.

cross-survey generalization
stellar parameter estimation
low-resolution spectra
moderate-resolution spectra
neural networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

cross-survey generalization
stellar parameter estimation
pre-trained MLP
spectral embeddings
fine-tuning strategies
Xiaosheng Zhao
Department of Physics & Astronomy, The Johns Hopkins University, Baltimore, MD 21218, USA
Yuan-Sen Ting
Department of Astronomy, The Ohio State University, 140 West 18th Avenue, Columbus, OH 43210, USA; Center for Cosmology and AstroParticle Physics (CCAPP), The Ohio State University, Columbus, OH 43210, USA
Rosemary F. G. Wyse
Department of Physics & Astronomy, The Johns Hopkins University, Baltimore, MD 21218, USA
Alexander S. Szalay
Department of Physics & Astronomy, The Johns Hopkins University, Baltimore, MD 21218, USA; Department of Computer Science, The Johns Hopkins University, Baltimore, MD 21218, USA
Yang Huang
School of Astronomy and Space Science, University of Chinese Academy of Sciences, Beijing 100049, People’s Republic of China; National Astronomical Observatories, Chinese Academy of Sciences, Beijing 100012, People’s Republic of China
László Dobos
Johns Hopkins University
astrophysics, scientific databases
Tamás Budavári
Dept. of Applied Mathematics and Statistics, Johns Hopkins University
applied statistics, computational science, data science, computer science, astronomy
Viska Wei
Department of Physics & Astronomy, The Johns Hopkins University, Baltimore, MD 21218, USA; Department of Computer Science, The Johns Hopkins University, Baltimore, MD 21218, USA