
Systematic Reward Gap Optimization for Mitigating VLM Hallucinations

Source: arXiv

Abstract

The success of Direct Preference Optimization (DPO) in mitigating hallucinations in Vision Language Models (VLMs) critically hinges on the true reward gaps within preference pairs. However, current methods, typically relying on ranking or rewriting strategies, often struggle to optimize these reward gaps systematically during data curation. A core difficulty lies in precisely characterizing and strategically manipulating the overall reward gap configuration, that is, the deliberate design of how to shape the reward gap within each preference pair across the data. To address this, we introduce Topic-level Preference Rewriting (TPR), a novel framework designed for the systematic optimization of reward gap configuration. By selectively replacing semantic topics within VLM responses with the model's own resampled candidates for targeted rewriting, TPR provides topic-level control over fine-grained semantic details. This precise control enables advanced data curation strategies, such as progressively adjusting the difficulty of rejected responses, thereby sculpting an effective reward gap configuration that guides the model to overcome challenging hallucinations. Comprehensive experiments demonstrate that TPR achieves state-of-the-art performance on multiple hallucination benchmarks, outperforming previous methods by an average of 20%. Notably, it reduces hallucinations by up to 93% on ObjectHal-Bench, and also exhibits superior data efficiency toward robust and cost-effective VLM alignment.
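The two ideas the abstract turns on, the implicit DPO reward gap within a preference pair and a topic-level rewrite that swaps one semantic topic for a resampled candidate, can be sketched as follows. This is a minimal illustration under stated assumptions: `topic_rewrite` and its topic-dictionary representation of a response are hypothetical, not the paper's actual implementation.

```python
import math

def dpo_loss(logp_chosen, logp_rejected, ref_logp_chosen, ref_logp_rejected, beta=0.1):
    """Per-pair DPO loss: -log sigmoid(beta * gap), where the implicit
    reward gap is the difference of policy-vs-reference log-ratios
    between the chosen and rejected responses."""
    gap = (logp_chosen - ref_logp_chosen) - (logp_rejected - ref_logp_rejected)
    return -math.log(1.0 / (1.0 + math.exp(-beta * gap)))

def topic_rewrite(response_topics, topic, candidate):
    """Hypothetical topic-level rewrite: replace one semantic topic's
    content with a resampled candidate, leaving other topics untouched.
    A rejected response built this way differs from the chosen one only
    in a controlled, fine-grained detail."""
    rewritten = dict(response_topics)
    rewritten[topic] = candidate
    return rewritten
```

A zero gap yields a loss of ln 2, and a larger gap yields a smaller loss, which is why curating pairs with a well-shaped gap matters; making the rewritten (rejected) topic closer to the original produces "harder" pairs with smaller gaps.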

Lu Sheng, Lehan He, Zeren Chen, Zhelun Shi, Tianyu Yu, Jing Shao

Subject: Computing Technology; Computer Technology

Lu Sheng, Lehan He, Zeren Chen, Zhelun Shi, Tianyu Yu, Jing Shao. Systematic Reward Gap Optimization for Mitigating VLM Hallucinations [EB/OL]. (2025-06-23) [2025-07-02]. https://arxiv.org/abs/2411.17265.
