
Beyond Modality Collapse: Representations Blending for Multimodal Dataset Distillation

Source: arXiv
Abstract

Multimodal Dataset Distillation (MDD) seeks to condense large-scale image-text datasets into compact surrogates while retaining their effectiveness for cross-modal learning. Despite recent progress, existing MDD approaches often suffer from Modality Collapse, characterized by over-concentrated intra-modal representations and an enlarged distributional gap across modalities. In this paper, we identify, for the first time, that this issue stems from a fundamental conflict between the over-compression behavior inherent in dataset distillation and the cross-modal supervision imposed by contrastive objectives. To alleviate modality collapse, we introduce RepBlend, a novel MDD framework that weakens over-dominant cross-modal supervision via representation blending, thereby significantly enhancing intra-modal diversity. Additionally, we observe that current MDD methods impose asymmetric supervision across modalities, resulting in biased optimization. To address this, we propose symmetric projection trajectory matching, which synchronizes the optimization dynamics using modality-specific projection heads, thereby promoting balanced supervision and enhancing cross-modal alignment. Experiments on Flickr-30K and MS-COCO show that RepBlend consistently outperforms prior state-of-the-art MDD methods, achieving significant gains in retrieval performance (e.g., +9.4 IR@10, +6.3 TR@10 under the 100-pair setting) and offering up to a 6.7× distillation speedup.
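The core idea of representation blending, as the abstract describes it, is to soften the purely cross-modal contrastive target by mixing representations across modalities. A minimal sketch of one plausible form of this operation is below; the function name, the blend ratio `alpha`, and the use of L2-normalized embeddings are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def blend_representations(img_emb, txt_emb, alpha=0.8):
    """Hypothetical sketch of representation blending.

    Interpolates L2-normalized image and text embeddings so that the
    contrastive target is no longer a purely cross-modal signal,
    which (per the abstract's motivation) leaves room for intra-modal
    diversity. `alpha` is an assumed blend-ratio hyperparameter.
    """
    # Normalize each modality's embeddings to unit length.
    img = img_emb / np.linalg.norm(img_emb, axis=-1, keepdims=True)
    txt = txt_emb / np.linalg.norm(txt_emb, axis=-1, keepdims=True)
    # Convex combination of the two modalities, re-normalized.
    mixed = alpha * img + (1.0 - alpha) * txt
    return mixed / np.linalg.norm(mixed, axis=-1, keepdims=True)
```

The blended output could then replace one side of a contrastive loss; the actual placement and schedule in RepBlend would follow the paper itself.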

Xin Zhang, Ziruo Zhang, Jiawei Du, Zuozhu Liu, Joey Tianyi Zhou

Subject: Computing technology; computer technology

Xin Zhang, Ziruo Zhang, Jiawei Du, Zuozhu Liu, Joey Tianyi Zhou. Beyond Modality Collapse: Representations Blending for Multimodal Dataset Distillation [EB/OL]. (2025-05-15) [2025-06-14]. https://arxiv.org/abs/2505.14705.
