
Self-Reinforced Deep Subspace Clustering

Abstract

In recent years, with the rapid development of the information age, a huge amount of data has been generated on the Internet, and cluster analysis is a highly effective means of mining the information contained in such data. The neural manifold clustering and embedding algorithm uses maximal coding rate reduction to extract features. Building on this, this paper imposes a constraint on the output of the classifier and proposes a self-reinforced deep subspace clustering algorithm. Specifically, the paper first uses maximal coding rate reduction to extract features from the data and then uses the learned features to train a classifier. The classifier's output is further sharpened to obtain a new probability distribution, and a KL-divergence constraint or a squared-loss constraint is used to reduce the distance between these two distributions, making the classifier's output more decisive; the effect of different sharpening coefficients on performance is also compared. Experiments are conducted on the CIFAR-10, CIFAR-100, and STL-10 datasets, the method is compared with other clustering algorithms, and ablation studies are performed on each module of the model, ultimately demonstrating the effectiveness of the proposed algorithm.
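The self-reinforcement step described in the abstract, sharpening the classifier output p into a target q with q_i proportional to p_i^(1/T) and then penalizing the gap between p and q, can be sketched roughly as follows. This is only a minimal illustration under assumed details: a PyTorch-style cluster-assignment head, hypothetical names such as sharpen and self_reinforce_loss, a detached sharpened target, and the particular direction of the KL term are choices of this sketch, not specifics confirmed by the abstract.

import torch
import torch.nn.functional as F

def sharpen(p, T=0.5):
    # Raise each probability to the power 1/T and renormalize;
    # a smaller temperature T gives a more peaked (more confident) distribution.
    p_sharp = p ** (1.0 / T)
    return p_sharp / p_sharp.sum(dim=1, keepdim=True)

def self_reinforce_loss(logits, T=0.5, mode="kl"):
    # Consistency loss between the classifier output p and its sharpened version q.
    # The sharpened target is detached so gradients only flow through p (an assumption of this sketch).
    p = F.softmax(logits, dim=1)
    q = sharpen(p, T).detach()
    if mode == "kl":
        # KL(q || p): pulls the prediction p towards the more confident target q.
        return F.kl_div(torch.log(p + 1e-8), q, reduction="batchmean")
    else:
        # Squared-loss alternative mentioned in the abstract.
        return ((p - q) ** 2).sum(dim=1).mean()

# Toy usage: logits from a cluster-assignment head, batch of 128 samples, 10 clusters.
logits = torch.randn(128, 10)
loss = self_reinforce_loss(logits, T=0.5, mode="kl")

The sharpening coefficient T plays the role of the sharpening factor whose settings the paper compares; smaller values produce harder targets and a stronger self-reinforcement signal.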

周英杰、李春光

Computing Technology; Computer Technology

subspace clustering; deep learning; contrastive learning; sharpening constraint

周英杰, 李春光. 自增强的深度子空间聚类 [EB/OL]. (2023-04-17) [2025-08-16]. http://www.paper.edu.cn/releasepaper/content/202304-244.
