Slot Attention with Re-Initialization and Self-Distillation
Unlike popular solutions based on dense feature maps, Object-Centric Learning (OCL) represents visual scenes as sub-symbolic object-level feature vectors, termed slots, which are highly versatile for tasks involving visual modalities. OCL typically aggregates object superpixels into slots by iteratively applying competitive cross attention, known as Slot Attention, with the slots as the query. However, once initialized, these slots are reused naively, causing redundant slots to compete with informative ones for representing objects. This often results in objects being erroneously segmented into parts. Additionally, mainstream methods derive supervision signals solely from decoding slots into the input's reconstruction, overlooking potential supervision based on internal information. To address these issues, we propose Slot Attention with re-Initialization and self-Distillation (DIAS): $\emph{i)}$ we reduce redundancy in the aggregated slots and re-initialize extra aggregation to update the remaining slots; $\emph{ii)}$ we drive the poor attention map at the first aggregation iteration to approximate the good one at the last iteration, enabling self-distillation. Experiments demonstrate that DIAS achieves state-of-the-art performance on OCL tasks such as object discovery and recognition, while also improving advanced visual prediction and reasoning. Our code is available at https://github.com/Genera1Z/DIAS.
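To make the two ideas concrete, below is a minimal, self-contained PyTorch sketch of a Slot Attention module with re-initialization and attention-map self-distillation. It is an illustration under stated assumptions, not the authors' implementation: the module name SlotAttentionDIAS, the cosine-similarity redundancy test, the zeroing-out of redundant slots, and the distillation loss weight are all placeholders chosen for brevity.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SlotAttentionDIAS(nn.Module):
    # Sketch only: hyperparameters and the redundancy criterion are assumptions.
    def __init__(self, num_slots=7, dim=64, iters=3, sim_thresh=0.9):
        super().__init__()
        self.num_slots, self.iters, self.sim_thresh = num_slots, iters, sim_thresh
        self.scale = dim ** -0.5
        self.slots_mu = nn.Parameter(torch.randn(1, num_slots, dim) * 0.02)
        self.to_q, self.to_k, self.to_v = (nn.Linear(dim, dim) for _ in range(3))
        self.gru = nn.GRUCell(dim, dim)
        self.norm_in, self.norm_slot = nn.LayerNorm(dim), nn.LayerNorm(dim)

    def step(self, slots, k, v):
        # One iteration of competitive cross attention: slots act as queries and
        # the softmax over the slot axis makes them compete for each feature.
        q = self.to_q(self.norm_slot(slots))
        attn = torch.softmax(torch.einsum('bnd,bmd->bnm', q, k) * self.scale, dim=1)
        attn_n = attn / (attn.sum(dim=-1, keepdim=True) + 1e-8)
        updates = torch.einsum('bnm,bmd->bnd', attn_n, v)
        slots = self.gru(updates.reshape(-1, updates.size(-1)),
                         slots.reshape(-1, slots.size(-1))).view_as(slots)
        return slots, attn

    def forward(self, feats):
        # feats: (batch, num_features, dim) dense features from an encoder.
        b = feats.size(0)
        feats = self.norm_in(feats)
        k, v = self.to_k(feats), self.to_v(feats)
        slots = self.slots_mu.expand(b, -1, -1)

        attn_first = None
        for i in range(self.iters):
            slots, attn = self.step(slots, k, v)
            if i == 0:
                attn_first = attn  # early ("bad") attention map, kept for distillation

        # Re-initialization (assumed criterion): mark slots that nearly duplicate
        # another slot as redundant, suppress them, then run extra iterations
        # so the remaining slots are updated without the redundant competitors.
        sim = F.cosine_similarity(slots.unsqueeze(2), slots.unsqueeze(1), dim=-1)
        redundant = (sim.triu(1) > self.sim_thresh).any(dim=1)          # (b, n)
        slots = slots * (~redundant).float().unsqueeze(-1)              # crude removal
        for _ in range(self.iters):
            slots, attn_last = self.step(slots, k, v)

        # Self-distillation: push the first-iteration attention map toward the
        # detached last-iteration one; the 0.1 weight is an arbitrary choice here.
        distill_loss = 0.1 * F.mse_loss(attn_first, attn_last.detach())
        return slots, attn_last, distill_loss

In use, the returned distill_loss would simply be added to the usual reconstruction objective, so the internal attention maps provide an extra supervision signal without any additional labels.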
Rongzhen Zhao, Yi Zhao, Juho Kannala, Joni Pajarinen
Computing Technology, Computer Technology
Rongzhen Zhao, Yi Zhao, Juho Kannala, Joni Pajarinen. Slot Attention with Re-Initialization and Self-Distillation [EB/OL]. (2025-07-31) [2025-08-07]. https://arxiv.org/abs/2507.23755.