
SuperCM: Improving Semi-Supervised Learning and Domain Adaptation through differentiable clustering

Source: arXiv
Abstract

Semi-Supervised Learning (SSL) and Unsupervised Domain Adaptation (UDA) enhance model performance by exploiting information from both labeled and unlabeled data. The clustering assumption, which states that data points belonging to the same cluster in a high-dimensional space should be assigned to the same category, has proven advantageous for learning with limited supervision. Recent works have used different training mechanisms to enforce this assumption implicitly for SSL and UDA. In this work, we take a different approach by explicitly incorporating a differentiable clustering module, extended to leverage the supervised data to compute its centroids. Through extensive experiments, we demonstrate the effectiveness of this straightforward end-to-end training strategy for SSL and UDA and highlight its benefits, especially in low-supervision regimes, both as a standalone model and as a regularizer for existing approaches.
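The core idea of the abstract, computing clustering centroids from the labeled data and then softly assigning all samples to them in a differentiable way, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual implementation: the helper names (`supervised_centroids`, `soft_assignments`), the squared-Euclidean distance, and the softmax temperature are assumptions made for illustration. In practice the assignment would run inside an autograd framework so gradients flow back to the feature extractor; NumPy is used here only to show the forward computation.

```python
import numpy as np

def supervised_centroids(features, labels, num_classes):
    # One centroid per class, computed as the mean of the labeled
    # features of that class (hypothetical helper illustrating the
    # "supervised data computes the centroids" idea).
    return np.stack([features[labels == c].mean(axis=0)
                     for c in range(num_classes)])

def soft_assignments(features, centroids, temperature=1.0):
    # Softmax over negative squared distances to each centroid.
    # In an autograd framework this assignment is differentiable,
    # so it can serve as an end-to-end clustering regularizer.
    d2 = ((features[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    logits = -d2 / temperature
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)
```

Unlabeled samples receive soft cluster memberships from the same centroids, which is one plausible way such a module could regularize an SSL or UDA model.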

Durgesh Singh、Ahcène Boubekki、Robert Jenssen、Michael Kampffmeyer

10.1016/j.patcog.2025.112117

Subject: Computing and Computer Technology

Durgesh Singh, Ahcène Boubekki, Robert Jenssen, Michael Kampffmeyer. SuperCM: Improving Semi-Supervised Learning and Domain Adaptation through differentiable clustering [EB/OL]. (2025-07-18) [2025-08-10]. https://arxiv.org/abs/2507.13779.
