
DPDANet: An Improved DPCNN Model for Text Classification with Dense Connections and Self-Attention Mechanism

罗华宇¹ 王冬梅¹ 张颖晖¹ 林玮¹ 陈晨¹

Author information

  • 1. School of Data Science and Engineering, South China Normal University


Abstract

[Objective] In response to the demand for efficient sentiment analysis of large-scale review data, this study proposes the DPDANet model to enhance text classification performance. [Methods] The BERT-based DPDANet incorporates dense connections and a self-attention mechanism. By refining the inter-layer connection strategy of the DPCNN architecture, it enhances feature propagation and information reuse, thereby exploiting shallow features more efficiently and reducing computational complexity. [Results] Comparative experiments were conducted between DPDANet and eight BERT-based models: TextCNN, CNN-LSTM, DPCNN, DPCNN-BiGRU, Transformer, XLSTM, BERT, and DPDBNet. On four text classification datasets, DPDANet achieved accuracy scores of 0.6679, 0.9307, 0.9278, and 0.6242, improvements of 6.47%, 1.32%, 0.72%, and 3.52%, respectively, over the baseline DPCNN model. [Limitations] The model still shows limited generalization in scenarios involving extremely short texts and imbalanced multi-class distributions. [Conclusions] DPDANet demonstrates superior performance and efficiency across a variety of text classification tasks, indicating strong potential for practical application.
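
The abstract describes the architecture only at a high level, so the sketch below is a minimal PyTorch illustration of the general idea rather than the authors' implementation: DPCNN-style one-dimensional convolution blocks applied to BERT hidden states, dense (concatenative) connections so that each block reuses every shallower feature map, and a self-attention layer before the classifier. Class names, channel widths, and the pooling choice are illustrative assumptions, and the repeated downsampling of the original DPCNN pyramid is omitted for brevity.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DenseConvBlock(nn.Module):
    """One DPCNN-style block: two 1-D convolutions whose input is the
    concatenation of all earlier feature maps (dense connection)."""
    def __init__(self, in_channels, out_channels, kernel_size=3):
        super().__init__()
        padding = kernel_size // 2
        self.conv1 = nn.Conv1d(in_channels, out_channels, kernel_size, padding=padding)
        self.conv2 = nn.Conv1d(out_channels, out_channels, kernel_size, padding=padding)

    def forward(self, x):                               # x: (batch, channels, seq_len)
        return F.relu(self.conv2(F.relu(self.conv1(x))))

class DPDANetSketch(nn.Module):
    """Hypothetical DPDANet-style classifier: BERT hidden states ->
    densely connected convolution blocks -> self-attention -> linear head."""
    def __init__(self, hidden_size=768, channels=250, num_blocks=3, num_classes=4):
        super().__init__()
        self.region = nn.Conv1d(hidden_size, channels, kernel_size=3, padding=1)
        # block i sees the region embedding plus the outputs of all previous blocks
        self.blocks = nn.ModuleList(
            DenseConvBlock(channels * (i + 1), channels) for i in range(num_blocks)
        )
        self.attn = nn.MultiheadAttention(channels, num_heads=2, batch_first=True)
        self.classifier = nn.Linear(channels, num_classes)

    def forward(self, bert_hidden):                     # (batch, seq_len, hidden_size)
        x = bert_hidden.transpose(1, 2)                 # -> (batch, hidden_size, seq_len)
        features = [F.relu(self.region(x))]
        for block in self.blocks:
            dense_in = torch.cat(features, dim=1)       # reuse all shallower features
            features.append(block(dense_in))
        seq = features[-1].transpose(1, 2)              # (batch, seq_len, channels)
        attended, _ = self.attn(seq, seq, seq)          # self-attention over positions
        return self.classifier(attended.mean(dim=1))    # average-pool, then classify

if __name__ == "__main__":
    fake_bert_output = torch.randn(2, 128, 768)         # stand-in for BERT last hidden states
    print(DPDANetSketch()(fake_bert_output).shape)      # torch.Size([2, 4])

The torch.cat in the forward loop is what lets deeper blocks see the shallow region embedding directly, which is the feature-reuse behaviour the abstract attributes to the dense connections.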


Key words

DPDANet / Text Classification / DPCNN / Self-Attention Mechanism

Cite this article

罗华宇,王冬梅,张颖晖,林玮,陈晨.DPDANet:融合密集连接与自注意力机制的改进DPCNN文本分类模型[EB/OL].(2025-05-05)[2025-12-13].https://chinaxiv.org/abs/202505.00005.

Subject classification

Computing Technology, Computer Technology


First published: 2025-05-05