Mitigating Object Hallucinations in Large Vision-Language Models with Assembly of Global and Local Attention
Despite great success across various multimodal tasks, Large Vision-Language Models (LVLMs) often suffer from object hallucinations, generating textual responses that are inconsistent with the actual objects in images. We examine different LVLMs and pinpoint that one root cause of object hallucinations lies in deficient attention on discriminative image features. Specifically, LVLMs often predominantly attend to prompt-irrelevant global features instead of prompt-relevant local features, undermining their visual grounding capacity and leading to object hallucinations. We propose Assembly of Global and Local Attention (AGLA), a training-free and plug-and-play approach that mitigates hallucinations by simultaneously assembling global features for response generation and local features for visual discrimination. Specifically, we introduce an image-prompt matching scheme that captures prompt-relevant local features from images, producing an augmented view of the input image in which prompt-relevant content is highlighted and irrelevant distractions are suppressed. Hallucinations can thus be mitigated with a calibrated logit distribution derived from the generative global features of the original image and the discriminative local features of the augmented image. Extensive experiments show the superiority of AGLA in LVLM hallucination mitigation, demonstrating its wide applicability across both discriminative and generative tasks. Our code is available at https://github.com/Lackel/AGLA.
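The calibration step described above can be sketched as a simple combination of two next-token logit vectors: one computed from the original image (global, generative view) and one from the prompt-augmented image (local, discriminative view). The weighting scheme and the parameter `alpha` below are illustrative assumptions, not the paper's exact formula; see the linked repository for the actual implementation.

```python
import numpy as np

def assemble_logits(global_logits: np.ndarray,
                    local_logits: np.ndarray,
                    alpha: float = 1.0) -> np.ndarray:
    """Hypothetical sketch of AGLA-style logit calibration.

    global_logits: next-token logits conditioned on the original image.
    local_logits:  next-token logits conditioned on the augmented image,
                   where prompt-relevant regions are highlighted.
    alpha:         illustrative weight on the discriminative local view.
    Returns a calibrated probability distribution over the vocabulary.
    """
    combined = global_logits + alpha * local_logits
    # Numerically stable softmax over the combined logits.
    e = np.exp(combined - combined.max())
    return e / e.sum()
```

Intuitively, tokens grounded in both the global context and the prompt-relevant local evidence are reinforced, while tokens supported only by prompt-irrelevant global features (a common source of hallucinated objects) are down-weighted.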
Sicong Leng, Jiahao Nie, Ping Chen, QianYing Wang, Xiaoqin Zhang, Wenbin An, Feng Tian, Haonan Lin, Shijian Lu
Computing Technology, Computer Technology, Remote Sensing Technology
Sicong Leng, Jiahao Nie, Ping Chen, QianYing Wang, Xiaoqin Zhang, Wenbin An, Feng Tian, Haonan Lin, Shijian Lu. Mitigating Object Hallucinations in Large Vision-Language Models with Assembly of Global and Local Attention [EB/OL]. (2024-06-18) [2025-06-27]. https://arxiv.org/abs/2406.12718.