Machine learning revolution for exoplanet direct imaging detection: transformer architectures
Directly imaging exoplanets is a formidable challenge due to extreme contrast ratios and quasi-static speckle noise, motivating the exploration of advanced post-processing methods. While Convolutional Neural Networks (CNNs) have shown promise, their inherent limitations in capturing long-range dependencies in image sequences hinder their effectiveness. This study introduces a novel hybrid deep learning architecture that combines a CNN feature extractor with a Transformer encoder to leverage temporal information, modeling the signature of a planet's coherent motion across an observation sequence. We first validated the model on a purely synthetic dataset, where it demonstrated excellent performance. While the final metrics varied slightly between training runs, our reported trial achieved 100.0% accuracy, an F1-score of 100.0%, and a position accuracy of 0.72 pixels, performing strongly on this test case relative to traditional methods such as median subtraction and PCA-KLIP. To assess its viability on realistic data, we retrained the model on a semi-synthetic dataset created by injecting planet signals into real JWST high-contrast imaging observations of the TW Hya protoplanetary disk. The model successfully identified the injected signals with high confidence, confirming its ability to function amidst complex, correlated noise and bright disk features. This work serves as a successful proof-of-concept, demonstrating that a CNN-Transformer architecture holds significant promise as a fast, accurate, and automated method for exoplanet detection in the large datasets expected from current and future high-contrast imaging instruments.
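
As a rough illustration of the kind of hybrid architecture the abstract describes, the sketch below (PyTorch) applies a shared CNN feature extractor to each frame of an imaging sequence and passes the resulting per-frame embeddings through a Transformer encoder, with one head for planet detection and one for position regression. All layer sizes, the pooling scheme, the sequence length, and the two-head design are illustrative assumptions, not the paper's exact architecture.

# Minimal sketch of a CNN-Transformer detector for image sequences.
# Hyperparameters and layer choices are assumptions for illustration only.
import torch
import torch.nn as nn

class CNNTransformerDetector(nn.Module):
    def __init__(self, d_model=128, n_heads=4, n_layers=2, seq_len=30):
        super().__init__()
        # Per-frame CNN feature extractor: maps each frame to a d_model vector.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, d_model),
        )
        # Learned positional encoding over the temporal axis of the sequence.
        self.pos_embed = nn.Parameter(torch.zeros(1, seq_len, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Two heads: planet present/absent logit, and (x, y) position in pixels.
        self.cls_head = nn.Linear(d_model, 1)
        self.pos_head = nn.Linear(d_model, 2)

    def forward(self, frames):
        # frames: (batch, seq_len, 1, H, W)
        b, t, c, h, w = frames.shape
        feats = self.cnn(frames.reshape(b * t, c, h, w)).reshape(b, t, -1)
        feats = feats + self.pos_embed[:, :t]
        encoded = self.encoder(feats)   # temporal self-attention across frames
        pooled = encoded.mean(dim=1)    # average over the sequence
        return self.cls_head(pooled), self.pos_head(pooled)

# Example: a batch of 2 sequences of 30 frames of 64x64 pixels.
model = CNNTransformerDetector()
logits, positions = model(torch.randn(2, 30, 1, 64, 64))
print(logits.shape, positions.shape)  # torch.Size([2, 1]) torch.Size([2, 2])

The key design point is that self-attention in the encoder can relate widely separated frames, which is how such a model could pick up a planet's coherent motion signature that a purely convolutional network, with its limited receptive field over the temporal axis, handles less naturally.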
Yu-Chia Lin
Subject areas: computational techniques in astronomy; computer technology; remote sensing technology
Yu-Chia Lin. Machine learning revolution for exoplanet direct imaging detection: transformer architectures [EB/OL]. (2025-08-25) [2025-09-03]. https://arxiv.org/abs/2508.14508.