国家预印本平台 (National Preprint Platform)

Learning to Search Effective Example Sequences for In-Context Learning

Source: arXiv
Abstract

Large language models (LLMs) demonstrate impressive few-shot learning capabilities, but their performance varies widely based on the sequence of in-context examples. Key factors influencing this include the sequence's length, composition, and arrangement, as well as its relation to the specific query. Existing methods often tackle these factors in isolation, overlooking their interdependencies. Moreover, the extensive search space for selecting optimal sequences complicates the development of a holistic approach. In this work, we introduce Beam Search-based Example Sequence Constructor (BESC), a novel method for learning to construct optimal example sequences. BESC addresses all key factors involved in sequence selection by considering them jointly during inference, while incrementally building the sequence. This design enables the use of beam search to significantly reduce the complexity of the search space. Experiments across various datasets and language models show notable improvements in performance.
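The abstract describes BESC as building the example sequence incrementally while using beam search to prune the search space. A minimal sketch of that idea is below; the `score` function stands in for the paper's learned scoring model (its exact form is not given in this abstract), and all names here are illustrative assumptions, not the authors' implementation.

```python
from typing import Callable, List, Sequence, Tuple

def beam_search_sequence(
    candidates: Sequence[str],
    score: Callable[[Tuple[str, ...]], float],
    beam_width: int = 3,
    max_len: int = 4,
) -> Tuple[str, ...]:
    """Incrementally build an in-context example sequence.

    At each step, every surviving partial sequence is extended by one
    unused candidate example, and only the top-`beam_width` extensions
    (by `score`) are kept. This keeps the search cost linear in
    sequence length instead of exploring all orderings.
    """
    beams: List[Tuple[Tuple[str, ...], float]] = [((), 0.0)]
    for _ in range(max_len):
        expanded: List[Tuple[Tuple[str, ...], float]] = []
        for seq, _ in beams:
            for ex in candidates:
                if ex in seq:  # do not repeat an example in one sequence
                    continue
                new_seq = seq + (ex,)
                expanded.append((new_seq, score(new_seq)))
        if not expanded:
            break
        expanded.sort(key=lambda t: t[1], reverse=True)
        beams = expanded[:beam_width]
    return beams[0][0]
```

With a pool of N examples and sequences of length L, exhaustive ordering search costs O(N!/(N-L)!) evaluations, while this beam keeps only `beam_width * N` partial sequences per step, which is what makes a joint treatment of length, composition, and order tractable.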

Kamalika Das, Ankita Sinha, Xiang Gao

Subject: Computing Technology; Computer Technology

Kamalika Das, Ankita Sinha, Xiang Gao. Learning to Search Effective Example Sequences for In-Context Learning [EB/OL]. (2025-03-11) [2025-05-17]. https://arxiv.org/abs/2503.08030
