Diffusion Augmented Retrieval: A Training-Free Approach to Interactive Text-to-Image Retrieval
Interactive text-to-image retrieval (I-TIR) is an important enabler for a wide range of state-of-the-art services in domains such as e-commerce and education. However, current methods rely on finetuned Multimodal Large Language Models (MLLMs), which are costly to train and update, and exhibit poor generalizability. This latter issue is of particular concern, as: 1) finetuning narrows the pretrained distribution of MLLMs, thereby reducing generalizability; and 2) I-TIR introduces increasing query diversity and complexity. As a result, I-TIR solutions are highly likely to encounter queries and images not well represented in any training dataset. To address this, we propose leveraging Diffusion Models (DMs) for text-to-image mapping, avoiding MLLM finetuning while preserving robust performance on complex queries. Specifically, we introduce Diffusion Augmented Retrieval (DAR), a framework that generates multiple intermediate representations via LLM-based dialogue refinements and DMs, producing a richer depiction of the user's information needs. This augmented representation facilitates more accurate identification of semantically and visually related images. Extensive experiments on four benchmarks show that for simple queries, DAR achieves results on par with finetuned I-TIR models, yet without incurring their tuning overhead. Moreover, as queries become more complex through additional conversational turns, DAR surpasses finetuned I-TIR models by up to 7.61% in Hits@10 after ten turns, illustrating its improved generalization for more intricate queries.
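As a rough illustration of the retrieval flow the abstract describes, the sketch below shows one plausible way to combine a dialogue-refined text query with diffusion-generated intermediate images when ranking a gallery. It is not the authors' implementation: the callables llm_refine, diffusion_generate, embed_text, and embed_image, the max-over-generations pooling, and the fusion weight alpha are all assumptions standing in for whatever off-the-shelf LLM, diffusion model, and frozen multimodal encoder DAR actually uses.

# A hypothetical sketch, not the authors' code: one way the DAR idea could be
# wired together. llm_refine, diffusion_generate, embed_text, embed_image, and
# alpha are assumed placeholder components, not parts of the published method.
from typing import Callable, List
import numpy as np

def _normalize(v: np.ndarray) -> np.ndarray:
    # L2-normalise the last axis so dot products become cosine similarities.
    return v / (np.linalg.norm(v, axis=-1, keepdims=True) + 1e-8)

def dar_rank(
    dialogue: List[str],                                  # user query plus prior conversational turns
    gallery_embs: np.ndarray,                             # precomputed gallery image embeddings, shape (N, d)
    llm_refine: Callable[[List[str]], List[str]],         # dialogue -> refined text prompts
    diffusion_generate: Callable[[str], object],          # prompt -> generated (intermediate) image
    embed_text: Callable[[str], np.ndarray],              # text -> (d,) embedding from a frozen encoder
    embed_image: Callable[[object], np.ndarray],          # image -> (d,) embedding from the same encoder
    alpha: float = 0.5,                                   # assumed text/visual fusion weight
) -> np.ndarray:
    """Rank gallery images by a fused text + generated-image similarity score."""
    prompts = llm_refine(dialogue)                        # LLM-based dialogue refinement
    # Intermediate visual representations of the information need, via the diffusion model.
    gen_embs = np.stack([embed_image(diffusion_generate(p)) for p in prompts])
    text_emb = embed_text(" ".join(dialogue))

    gallery = _normalize(gallery_embs)
    text_sim = gallery @ _normalize(text_emb)                     # semantic match to the dialogue
    visual_sim = (gallery @ _normalize(gen_embs).T).max(axis=1)   # best match over generated images
    score = alpha * text_sim + (1 - alpha) * visual_sim
    return np.argsort(-score)                             # gallery indices, best first

Under these assumptions, the generated images act as visual proxies for the (possibly underspecified) dialogue, so no retriever finetuning is needed; only the fusion of similarities against a frozen encoder's embedding space is required at query time.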
Richard McCreadie, Paul Henderson, Zijun Long, Kangheng Liang, Gerardo Aragon-Camarasa
Computing Technology, Computer Technology
Richard McCreadie, Paul Henderson, Zijun Long, Kangheng Liang, Gerardo Aragon-Camarasa. Diffusion Augmented Retrieval: A Training-Free Approach to Interactive Text-to-Image Retrieval [EB/OL]. (2025-07-10) [2025-07-18]. https://arxiv.org/abs/2501.15379.