
Provable In-Context Learning of Nonlinear Regression with Transformers

Source: arXiv
Abstract

The transformer architecture, which processes sequences of input tokens to produce outputs for query tokens, has revolutionized numerous areas of machine learning. A defining feature of transformers is their ability to perform previously unseen tasks using task-specific prompts without updating parameters, a phenomenon known as in-context learning (ICL). Recent research has actively explored the training dynamics behind ICL, with much of the focus on relatively simple tasks such as linear regression and binary classification. To advance the theoretical understanding of ICL, this paper investigates more complex nonlinear regression tasks, aiming to uncover how transformers acquire in-context learning capabilities in these settings. We analyze the stage-wise dynamics of attention during training: attention scores between a query token and its target features grow rapidly in the early phase, then gradually converge to one, while attention to irrelevant features decays more slowly and exhibits oscillatory behavior. Our analysis introduces new proof techniques that explicitly characterize how the nature of general non-degenerate L-Lipschitz task functions affects attention weights. Specifically, we identify the Lipschitz constant L of nonlinear function classes as a key factor governing the convergence dynamics of transformers in ICL. Leveraging these insights, for two distinct regimes depending on whether L is below or above a threshold, we derive different time bounds to guarantee near-zero prediction error. Notably, despite the convergence time depending on the underlying task functions, we prove that query tokens consistently attend to prompt tokens with highly relevant features at convergence, demonstrating the ICL capability of transformers for unseen functions.
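
The setup described in the abstract can be made concrete with a minimal sketch: a single softmax-attention layer reads a prompt of (x_i, f(x_i)) pairs for a nonlinear L-Lipschitz function f and predicts f(x_query) as an attention-weighted average of the prompt labels. This is an illustrative reconstruction under assumed names and parameterizations (`attention_predict`, `lipschitz_f`, and the weight matrix W = cI are not the paper's exact construction).

```python
# Minimal sketch of single-layer softmax-attention ICL for nonlinear
# regression. Illustrative only; not the paper's exact model.
import numpy as np

rng = np.random.default_rng(0)

def lipschitz_f(x, L=2.0):
    """An example nonlinear L-Lipschitz task function: L * sin(first coord)."""
    return L * np.sin(x[..., 0])

def attention_predict(X_prompt, y_prompt, x_query, W):
    """Predict f(x_query) with one softmax-attention layer.

    Scores compare the query against each prompt token through the matrix W;
    the prediction is the softmax-weighted average of the prompt labels.
    """
    scores = X_prompt @ W @ x_query            # shape: (n_prompt,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                   # softmax over prompt tokens
    return weights @ y_prompt

# Build one ICL task: n_prompt in-context examples of an unseen function.
d, n_prompt = 4, 32
X = rng.normal(size=(n_prompt, d))
y = lipschitz_f(X)
x_q = rng.normal(size=d)

# With W = c * I, larger c concentrates attention on the prompt tokens whose
# features align most with the query, mimicking the convergence behavior the
# abstract describes (attention to relevant features approaching one).
for c in [0.0, 1.0, 10.0]:
    pred = attention_predict(X, y, x_q, c * np.eye(d))
    print(f"c={c:5.1f}  prediction={pred:+.3f}  target={lipschitz_f(x_q):+.3f}")
```

In this toy setting, as attention concentrates on the prompt token closest to the query, the Lipschitz property bounds how far that token's label can lie from f(x_query), which loosely mirrors the abstract's claim that L governs convergence and that query tokens attend to highly relevant prompt tokens at convergence.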

Hongbo Li, Lingjie Duan, Yingbin Liang

Subject: Computing Technology, Computer Technology

Hongbo Li, Lingjie Duan, Yingbin Liang. Provable In-Context Learning of Nonlinear Regression with Transformers [EB/OL]. (2025-07-28) [2025-08-10]. https://arxiv.org/abs/2507.20443.
