
Decoupling Reasoning and Knowledge Injection for In-Context Knowledge Editing

Source: arXiv

Abstract

Knowledge editing aims to efficiently update Large Language Models (LLMs) by modifying specific knowledge without retraining the entire model. Among knowledge editing approaches, in-context editing (ICE) offers a lightweight solution by injecting new knowledge directly into the input context, leaving model parameters unchanged. However, existing ICE approaches do not explicitly separate the newly injected knowledge from the model's original reasoning process. This entanglement often results in conflicts between external updates and internal parametric knowledge, undermining the consistency and accuracy of the reasoning path. In this work, we conduct preliminary experiments to examine how parametric knowledge influences reasoning path planning. We find that the model's reasoning is tightly coupled with its internal knowledge, and that naively injecting new information without adapting the reasoning path often leads to performance degradation, particularly in multi-hop tasks. To address this, we propose DecKER, a novel ICE framework that decouples reasoning from knowledge editing by generating a masked reasoning path and then resolving knowledge edits via hybrid retrieval and model-based validation. Experiments on multi-hop QA benchmarks show that DecKER significantly outperforms existing ICE methods by mitigating knowledge conflicts and preserving reasoning consistency. Our code is available at: https://github.com/bebr2/DecKER.
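
To make the described pipeline concrete, below is a minimal, self-contained Python sketch of a DecKER-style loop: plan a masked reasoning path first, then fill each hop from the edit memory via retrieval plus model-based validation. All names here (generate_masked_path, hybrid_retrieve, validate_edit), the word-overlap "retrieval", and the prompt strings are illustrative assumptions, not the actual code or API of the DecKER repository.

from typing import Callable, List

# The LLM is abstracted as a plain text -> text callable (an assumption
# for this sketch; the real system wraps an actual model).
LLM = Callable[[str], str]


def generate_masked_path(llm: LLM, question: str) -> List[str]:
    # Step 1: plan the multi-hop reasoning path with intermediate
    # entities masked, so planning does not commit to stale facts.
    prompt = (
        "Break this question into reasoning hops, one per line, "
        f"replacing every intermediate entity with [MASK]:\n{question}"
    )
    return [line for line in llm(prompt).splitlines() if line.strip()]


def hybrid_retrieve(hop: str, edits: List[str], k: int = 3) -> List[str]:
    # Step 2 (toy stand-in): rank edited facts by word overlap with the
    # hop; a real system would combine dense and lexical retrieval.
    hop_words = set(hop.lower().split())
    return sorted(
        edits,
        key=lambda fact: len(hop_words & set(fact.lower().split())),
        reverse=True,
    )[:k]


def validate_edit(llm: LLM, hop: str, candidates: List[str]) -> str:
    # Step 3: model-based validation -- ask the model which candidate
    # fact correctly resolves the masked hop.
    prompt = (
        f"Hop: {hop}\nCandidate facts:\n"
        + "\n".join(f"- {c}" for c in candidates)
        + "\nAnswer the hop using only these facts:"
    )
    return llm(prompt).strip()


def answer_with_edits(llm: LLM, question: str, edits: List[str]) -> str:
    # Decoupled loop: the reasoning path is fixed first; edited
    # knowledge is injected hop by hop rather than all at once.
    answer = ""
    for hop in generate_masked_path(llm, question):
        grounded_hop = hop.replace("[MASK]", answer) if answer else hop
        facts = hybrid_retrieve(grounded_hop, edits)
        answer = validate_edit(llm, grounded_hop, facts)
    return answer  # the final hop's resolution is the answer

The key design point the sketch tries to capture is the ordering: because the path is planned before any edited fact is seen, a conflicting parametric belief cannot silently reroute the reasoning, which is the failure mode the abstract attributes to prior ICE methods.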

Changyue Wang, Weihang Su, Qingyao Ai, Yujia Zhou, Yiqun Liu

Subjects: Computing Technology, Computer Technology

Changyue Wang, Weihang Su, Qingyao Ai, Yujia Zhou, Yiqun Liu. Decoupling Reasoning and Knowledge Injection for In-Context Knowledge Editing [EB/OL]. (2025-05-31) [2025-06-23]. https://arxiv.org/abs/2506.00536.
