Revisiting Sparsity Constraint Under High-Rank Property in Partial Multi-Label Learning
Partial Multi-Label Learning (PML) extends the multi-label learning paradigm to scenarios where each sample is associated with a candidate label set containing both ground-truth labels and noisy labels. Existing PML methods commonly rely on two assumptions: sparsity of the noise label matrix and low-rankness of the ground-truth label matrix. However, these assumptions are inherently conflicting and impractical for real-world scenarios, where the true label matrix is typically full-rank or close to full-rank. To address these limitations, we demonstrate that the sparsity constraint contributes to the high-rank property of the predicted label matrix. Building on this insight, we propose a novel method, Schirn, which introduces a sparsity constraint on the noise label matrix while enforcing a high-rank property on the predicted label matrix. Extensive experiments demonstrate the superior performance of Schirn compared to state-of-the-art methods, validating its effectiveness in tackling real-world PML challenges.
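To make the two constraints concrete, the following is a minimal illustrative formulation of the kind described in the abstract, not the authors' actual objective. All symbols are assumptions introduced here for illustration: $X$ denotes the feature matrix, $W$ a linear predictor, $Y$ the observed candidate label matrix, $N$ the noise label matrix, and $\lambda, \mu$ trade-off hyperparameters.

$$
\min_{W,\, N}\; \big\| XW - (Y - N) \big\|_F^2 \;+\; \lambda \, \| N \|_1 \;-\; \mu \, \| XW \|_*
\quad \text{s.t.}\quad 0 \le N \le Y,
$$

Here the $\ell_1$ penalty $\lambda\|N\|_1$ encodes the sparsity assumption on the noise labels, the constraint $0 \le N \le Y$ restricts noise to the candidate set, and the negative nuclear norm $-\mu\|XW\|_*$ encourages a high-rank predicted label matrix, in contrast to the conventional low-rank assumption on the ground-truth label matrix.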
Chongjie Si, Yidan Cui, Fuchao Yang, Xiaokang Yang, Wei Shen
Computing technology; computer technology
Chongjie Si, Yidan Cui, Fuchao Yang, Xiaokang Yang, Wei Shen. Revisiting Sparsity Constraint Under High-Rank Property in Partial Multi-Label Learning [EB/OL]. (2025-05-27) [2025-06-29]. https://arxiv.org/abs/2505.20938.