Distinct Computations Emerge From Compositional Curricula in In-Context Learning
In-context learning (ICL) research often considers learning a function in context from a uniform sample of input-output pairs. Here, we investigate how presenting a compositional subtask curriculum in context can alter the computations a transformer learns. We design a compositional algorithmic task based on modular exponentiation: a double-exponentiation task composed of two single-exponentiation subtasks, and train transformer models to learn the task in context. We compare (a) models trained with an in-context curriculum consisting of single-exponentiation subtask examples and (b) models trained directly on the double-exponentiation task without such a curriculum. We show that models trained with a subtask curriculum can perform zero-shot inference on unseen compositional tasks and are more robust given the same context length. We further study how the task and subtasks are represented across the two training regimes, and find that the models employ diverse strategies modulated by the specific curriculum design.
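The abstract does not specify the exact construction of the task or the prompt format, so the sketch below is one plausible instantiation under stated assumptions: a double-exponentiation map built by composing two single-exponentiation subtasks modulo a prime, with the curriculum prompt presenting subtask demonstrations before composed-task demonstrations, and a no-curriculum control presenting only composed-task pairs. The modulus P, the example counts, and the composition order are illustrative placeholders, not the paper's specification.

```python
import random

# Illustrative assumptions: the prime modulus, bases, and example counts
# below are NOT taken from the paper; they are placeholders for a sketch.
P = 11  # small prime modulus

def curriculum_prompt(a, b, n_examples=4, p=P):
    """In-context curriculum: demonstrations of each single-exponentiation
    subtask precede demonstrations of the composed task."""
    g = lambda x: pow(b, x, p)   # inner subtask: x -> b^x mod p
    h = lambda y: pow(a, y, p)   # outer subtask: y -> a^y mod p
    f = lambda x: h(g(x))        # composed double-exponentiation task
    xs = random.sample(range(1, p), n_examples)
    prompt  = [(x, g(x)) for x in xs]   # subtask 1 input-output pairs
    prompt += [(x, h(x)) for x in xs]   # subtask 2 input-output pairs
    prompt += [(x, f(x)) for x in xs]   # composed-task input-output pairs
    return prompt

def direct_prompt(a, b, n_examples=12, p=P):
    """No-curriculum control: only composed-task input-output pairs."""
    f = lambda x: pow(a, pow(b, x, p), p)
    xs = random.sample(range(1, p), n_examples)
    return [(x, f(x)) for x in xs]

if __name__ == "__main__":
    random.seed(0)
    print(curriculum_prompt(a=3, b=2))
    print(direct_prompt(a=3, b=2))
```

In this reading, zero-shot compositional inference corresponds to querying the composed map f after seeing only the subtask demonstrations for an unseen (a, b) pair.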
Jin Hwa Lee, Andrew K. Lampinen, Aaditya K. Singh, Andrew M. Saxe
Computing Technology, Computer Technology
Jin Hwa Lee, Andrew K. Lampinen, Aaditya K. Singh, Andrew M. Saxe. Distinct Computations Emerge From Compositional Curricula in In-Context Learning [EB/OL]. (2025-06-16) [2025-06-29]. https://arxiv.org/abs/2506.13253.