
RAG+: Enhancing Retrieval-Augmented Generation with Application-Aware Reasoning

Source: arXiv
English Abstract

The integration of external knowledge through Retrieval-Augmented Generation (RAG) has become foundational in enhancing large language models (LLMs) for knowledge-intensive tasks. However, existing RAG paradigms often overlook the cognitive step of applying knowledge, leaving a gap between retrieved facts and task-specific reasoning. In this work, we introduce RAG+, a principled and modular extension that explicitly incorporates application-aware reasoning into the RAG pipeline. RAG+ constructs a dual corpus consisting of knowledge and aligned application examples, created either manually or automatically, and retrieves both jointly during inference. This design enables LLMs not only to access relevant information but also to apply it within structured, goal-oriented reasoning processes. Experiments across mathematical, legal, and medical domains, conducted on multiple models, demonstrate that RAG+ consistently outperforms standard RAG variants, achieving average improvements of 3-5%, and peak gains up to 7.5% in complex scenarios. By bridging retrieval with actionable application, RAG+ advances a more cognitively grounded framework for knowledge integration, representing a step toward more interpretable and capable LLMs.
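The dual-corpus design described above can be illustrated with a minimal sketch: each knowledge item is aligned with an application example, and a single retrieval step returns both so the model sees not only the fact but how to use it. The corpus contents, the TF-IDF retriever, and the prompt template below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of RAG+'s joint retrieval over a dual corpus (illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Dual corpus: knowledge entries aligned 1:1 with application examples (toy data).
knowledge = [
    "The Pythagorean theorem states that a^2 + b^2 = c^2 for right triangles.",
    "Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B).",
]
applications = [
    "Example: given legs 3 and 4, the hypotenuse is sqrt(3^2 + 4^2) = 5.",
    "Example: update a disease probability from a positive test via Bayes' rule.",
]

vectorizer = TfidfVectorizer()
knowledge_vecs = vectorizer.fit_transform(knowledge)

def retrieve_with_application(query: str, top_k: int = 1):
    """Retrieve knowledge and, through the alignment, its paired application example."""
    scores = cosine_similarity(vectorizer.transform([query]), knowledge_vecs)[0]
    top = scores.argsort()[::-1][:top_k]
    # Joint retrieval: each hit returns the knowledge item AND its aligned application.
    return [(knowledge[i], applications[i]) for i in top]

query = "What is the hypotenuse of a right triangle with legs 3 and 4?"
for fact, example in retrieve_with_application(query):
    # The prompt shows both the retrieved fact and how to apply it (assumed template).
    print(f"Knowledge: {fact}\nHow to apply it: {example}\nQuestion: {query}")
```

One design point this sketch reflects: because the application corpus is keyed to the knowledge index, no separate retrieval pass over the application examples is needed; the alignment alone brings the worked example along with the fact.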

Yu Wang, Shiwan Zhao, Zhihu Wang, Yubo Zhang, Xicheng Zhang, Zhengfan Wang, Heyuan Huang, Ming Fan, Ting Liu

Subject: Computing Technology, Computer Technology

Yu Wang, Shiwan Zhao, Zhihu Wang, Yubo Zhang, Xicheng Zhang, Zhengfan Wang, Heyuan Huang, Ming Fan, Ting Liu. RAG+: Enhancing Retrieval-Augmented Generation with Application-Aware Reasoning [EB/OL]. (2025-06-24) [2025-06-28]. https://arxiv.org/abs/2506.11555.
