Text classification method based on GloVe model and attention mechanism Bi-LSTM
DOI:
Author:
Affiliation:

College of Mathematics and Informatics, South China Agricultural University, Guangzhou 510642, China

Author biography:

Corresponding author:

CLC number:

TP391.1

Fund project:

2019 Guangdong Graduate Education Innovation Program project (2019JGXM18)




    Abstract:

    To improve the accuracy of text classification and extend it to a wider range of classification tasks, a text classification method combining a one-dimensional convolutional neural network (1D-CNN) and a bi-directional long short-term memory (Bi-LSTM) network is proposed. First, to address the difficulty of representing synonyms and polysemous words, the GloVe model is used to represent word features, taking full advantage of global statistics and the co-occurrence window. Then, the 1D-CNN is used for feature extraction to reduce the input feature dimension of the classifier or prediction model. Finally, the Bi-LSTM classification module is optimized: its hidden layer is composed of two residual blocks, and an attention mechanism is introduced to further improve prediction accuracy. Binary classification and multi-class topic classification experiments are carried out on multiple public datasets. The experimental results show that, compared with other strong methods, the proposed method performs better in terms of accuracy, recall and F1 score, reaching a highest accuracy of 92.5% and a highest F1 score of 91.3%.
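The attention mechanism applied over the Bi-LSTM hidden states can be illustrated with a minimal, framework-free sketch. This is not the authors' implementation: the additive-attention form (score each time step, softmax the scores, take the weighted sum) and the parameter names `w`, `b`, `u` are assumptions chosen for illustration.

```python
import math
import random

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, w, b, u):
    """Additive attention pooling over a sequence of hidden vectors.

    hidden_states: list of T vectors of dim d, e.g. the concatenated
    forward/backward outputs of a Bi-LSTM at each time step.
    w (d x d), b (d), u (d): attention parameters (assumed names).
    Returns (context vector, attention weights).
    """
    T = len(hidden_states)
    d = len(hidden_states[0])
    scores = []
    for h in hidden_states:
        # score_t = u . tanh(W h_t + b)
        proj = [math.tanh(sum(w[i][j] * h[j] for j in range(d)) + b[i])
                for i in range(d)]
        scores.append(sum(u[i] * proj[i] for i in range(d)))
    alpha = softmax(scores)  # weights over time steps, sum to 1
    # context vector c = sum_t alpha_t * h_t, fed to the final classifier
    context = [sum(alpha[t] * hidden_states[t][i] for t in range(T))
               for i in range(d)]
    return context, alpha
```

The context vector replaces naive last-step or mean pooling: time steps whose hidden states score higher contribute more, which is the mechanism the abstract credits for the accuracy improvement.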

Cite this article:

ZHOU Yan. Text classification method based on GloVe model and attention mechanism Bi-LSTM[J]. Electronic Measurement Technology, 2022, 45(7): 42-47.
History
  • Online publication date: 2024-05-14