Advancing Symbolic Discovery on Unsupervised Data: A Pre-training Framework for Non-degenerate Implicit Equation Discovery

Source: arXiv

Abstract

Symbolic regression (SR) -- which learns symbolic equations to describe the underlying relation from input-output pairs -- is widely used for scientific discovery. However, a rich set of scientific data from the real world (e.g., particle trajectories and astrophysics) is typically unsupervised, devoid of explicit input-output pairs. In this paper, we focus on symbolic implicit equation discovery, which aims to discover the mathematical relation from unsupervised data that follows an implicit equation $f(\mathbf{x}) = 0$. However, due to the dense distribution of degenerate solutions (e.g., $f(\mathbf{x}) = x_i - x_i$) in the discrete search space, most existing SR approaches customized for this task fail to achieve satisfactory performance. To tackle this problem, we introduce a novel pre-training framework -- namely, Pre-trained neural symbolic model for Implicit Equation (PIE) -- to discover implicit equations from unsupervised data. The core idea is to formulate implicit equation discovery on unsupervised scientific data as a translation task and to use the prior learned from the pre-training dataset to infer non-degenerate skeletons of the underlying relation end-to-end. Extensive experiments show that, leveraging the prior from a pre-trained language model, PIE effectively tackles the problem of degenerate solutions and significantly outperforms all existing SR approaches. PIE marks an encouraging step towards general scientific discovery on unsupervised data.
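To make the degenerate-solution problem concrete, the following is a minimal, self-contained sketch (illustrative only, not from the paper's code; the unit-circle data and both candidate expressions are assumptions chosen for the example) showing why a residual-based score cannot separate a genuine implicit relation from a trivial identity such as $x_i - x_i$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unsupervised data sampled from a genuine implicit relation,
# the unit circle x1^2 + x2^2 - 1 = 0 (no input-output pairs).
theta = rng.uniform(0.0, 2.0 * np.pi, size=1000)
X = np.stack([np.cos(theta), np.sin(theta)], axis=1)

def circle(x):
    # The true, non-degenerate relation.
    return x[:, 0] ** 2 + x[:, 1] ** 2 - 1.0

def degenerate(x):
    # A degenerate candidate: identically zero on any data.
    return x[:, 0] - x[:, 0]

# Both candidates achieve (near-)zero residual on the data, so a
# naive "minimize |f(x)|" objective cannot tell them apart.
print(np.abs(circle(X)).mean())      # ~0 (floating-point error)
print(np.abs(degenerate(X)).mean())  # exactly 0
```

Because every degenerate expression attains exactly zero residual on any dataset, search-based SR methods that score candidates by $|f(\mathbf{x})|$ are densely surrounded by such trivial optima; this is the failure mode that PIE's pre-trained prior is designed to avoid.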

Subject: Astronomy

Advancing Symbolic Discovery on Unsupervised Data: A Pre-training Framework for Non-degenerate Implicit Equation Discovery [EB/OL]. (2025-05-05) [2025-05-17]. https://arxiv.org/abs/2505.03130.