
PrismRAG: Boosting RAG Factuality with Distractor Resilience and Strategized Reasoning


Source: arXiv
Abstract

Retrieval-augmented generation (RAG) often falls short when the retrieved context includes confusing semi-relevant passages, or when answering questions requires deep contextual understanding and reasoning. We propose an efficient fine-tuning framework, called PrismRAG, that (i) trains the model with distractor-aware QA pairs mixing gold evidence with subtle distractor passages, and (ii) instills reasoning-centric habits that make the LLM plan, rationalize, and synthesize without relying on extensive human-engineered instructions. Evaluated across 12 open-book RAG QA benchmarks spanning diverse application domains and scenarios, PrismRAG improves average factuality by 5.4%, outperforming state-of-the-art solutions.
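The distractor-aware training data described in (i) can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the function name, fields, and sampling scheme are assumptions; the paper's distractors are generated to be subtly misleading, whereas here they are simply drawn from a provided pool.

```python
import random

def build_distractor_aware_example(question, answer, gold_passage,
                                   distractor_pool, k=2, seed=0):
    """Hypothetical sketch: mix the gold-evidence passage with k
    semi-relevant distractor passages and shuffle their order, so a
    fine-tuned model must locate the true evidence rather than rely
    on position or on any passage being relevant."""
    rng = random.Random(seed)
    context = [gold_passage] + rng.sample(distractor_pool, k)
    rng.shuffle(context)  # gold evidence hidden among distractors
    return {"question": question, "context": context, "answer": answer}
```

A training set built this way pairs each question with a context where only one passage supports the answer, encouraging distractor resilience.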

Mohammad Kachuee, Teja Gollapudi, Minseok Kim, Yin Huang, Kai Sun, Xiao Yang, Jiaqi Wang, Nirav Shah, Yue Liu, Aaron Colak, Anuj Kumar, Wen-tau Yih, Xin Luna Dong

Subject areas: Computing Technology, Computer Technology

Mohammad Kachuee, Teja Gollapudi, Minseok Kim, Yin Huang, Kai Sun, Xiao Yang, Jiaqi Wang, Nirav Shah, Yue Liu, Aaron Colak, Anuj Kumar, Wen-tau Yih, Xin Luna Dong. PrismRAG: Boosting RAG Factuality with Distractor Resilience and Strategized Reasoning [EB/OL]. (2025-07-25) [2025-08-10]. https://arxiv.org/abs/2507.18857.
