
Understanding the Essence: Delving into Annotator Prototype Learning for Multi-Class Annotation Aggregation


Source: arXiv
English Abstract

Multi-class classification annotations have significantly advanced AI applications, with truth inference serving as a critical technique for aggregating noisy and biased annotations. Existing state-of-the-art methods typically model each annotator's expertise using a confusion matrix. However, these methods suffer from two widely recognized issues: 1) when most annotators label only a few tasks, or when classes are imbalanced, the estimated confusion matrices are unreliable, and 2) a single confusion matrix is often inadequate for capturing an annotator's full expertise patterns across all tasks. To address these issues, we propose a novel confusion-matrix-based method, PTBCC (ProtoType learning-driven Bayesian Classifier Combination), which provides a more reliable and richer estimation of annotators through prototype learning. Specifically, we assume that there exists a set $S$ of prototype confusion matrices that captures the inherent expertise patterns of all annotators. Rather than using a single confusion matrix, each annotator's expertise is modeled as a Dirichlet prior distribution over these prototypes. This prototype learning-driven mechanism circumvents the data sparsity and class imbalance issues, ensuring a richer and more flexible characterization of annotators. Extensive experiments on 11 real-world datasets demonstrate that PTBCC achieves up to a 15% accuracy improvement in the best case and a 3% higher average accuracy, while reducing computational cost by over 90%.
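The abstract's key modeling idea is that each annotator is characterized by a Dirichlet-distributed weight vector over a shared set of prototype confusion matrices, rather than by a single per-annotator confusion matrix. The following is a minimal illustrative sketch of that generative structure, not the authors' implementation; all names (num_prototypes, dirichlet_alpha, effective_confusion, etc.), the parameter values, and the direct mixture-of-prototypes computation are assumptions introduced here for illustration.

```python
# Illustrative sketch of a prototype-based annotator model (assumed, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)
num_classes = 4        # K: number of label classes
num_prototypes = 3     # |S|: shared prototype confusion matrices
num_annotators = 10

# Prototype confusion matrices: prototypes[s, k, :] is the distribution over
# observed labels given true class k under prototype s (each row sums to 1).
prototypes = rng.dirichlet(np.ones(num_classes), size=(num_prototypes, num_classes))

# Each annotator's expertise is a weight vector over prototypes, drawn from a
# Dirichlet prior, instead of a separately estimated confusion matrix.
dirichlet_alpha = np.ones(num_prototypes)
annotator_weights = rng.dirichlet(dirichlet_alpha, size=num_annotators)

def effective_confusion(annotator: int) -> np.ndarray:
    """Annotator's effective confusion matrix as a mixture of the prototypes."""
    w = annotator_weights[annotator]              # shape (|S|,)
    return np.einsum("s,skc->kc", w, prototypes)  # shape (K, K)

def annotation_likelihood(annotator: int, true_class: int, observed_label: int) -> float:
    """P(observed label | true class, annotator) under the mixture model."""
    return effective_confusion(annotator)[true_class, observed_label]

print(annotation_likelihood(0, true_class=2, observed_label=2))
```

Because the prototypes are shared across all annotators, annotators who label only a few tasks still receive a usable expertise estimate through their prototype weights, which is the intuition behind the robustness to data sparsity and class imbalance claimed in the abstract.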

Ju Chen、Jun Feng、Shenyu Zhang

Subject classification: Computing Technology, Computer Technology

Ju Chen, Jun Feng, Shenyu Zhang. Understanding the Essence: Delving into Annotator Prototype Learning for Multi-Class Annotation Aggregation [EB/OL]. (2025-08-04) [2025-08-19]. https://arxiv.org/abs/2508.02123
