Emotion recognition from EEG based on multi-attention and multi-feature fusion

Author:

Wu Jiabin, Yang Xiaoming, Cao Taiqiang (吴家彬, 阳小明, 曹太强)

Affiliation:

School of Electrical and Electronic Information, Xihua University, Chengdu 610039, China

CLC number:

TN911


Abstract:

To fully integrate the multi-dimensional emotional information in EEG signals and improve emotion recognition performance, this paper proposes a network model based on multi-attention and multi-feature fusion. The model combines hemispheric asymmetry with the spatial, spectral, and temporal characteristics of EEG signals, extracting features through parallel dual-input pathways. A parallel attention mechanism strengthens the representation of frequency-channel and spatial information, dynamic kernel selection adjusts the convolution kernel size, and depthwise separable convolutions further extract and compress the features. Finally, a Transformer encoder layer fuses the features and captures their temporal dependencies and global associations, enabling emotion classification. In the three-class experiment on the SEED dataset, the model achieved an average accuracy of 98.53%, demonstrating the superiority of the approach. Furthermore, visualization of the attention modules enhances the interpretability of the model.
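The abstract names the building blocks (parallel dual-input pathways, parallel channel/spatial attention, dynamic kernel selection, depthwise separable convolution, and Transformer-encoder fusion) but not their concrete configuration. The PyTorch sketch below is a minimal illustration of how such a pipeline could be wired together; the module names, layer sizes, input format (five frequency-band feature maps on a 9×9 electrode grid for a spatial pathway and a hemispheric-asymmetry pathway), and all hyperparameters are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class ChannelSpatialAttention(nn.Module):
    """Parallel channel- and spatial-attention branches (assumed SE/CBAM-style design)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        # Channel attention: squeeze spatial dims, re-weight feature maps.
        self.channel_fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())
        # Spatial attention: single-channel weight map over the electrode grid.
        self.spatial_conv = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=7, padding=3), nn.Sigmoid())

    def forward(self, x):
        ca = self.channel_fc(x).unsqueeze(-1).unsqueeze(-1)   # (B, C, 1, 1)
        sa = self.spatial_conv(x)                             # (B, 1, H, W)
        return x * ca + x * sa                                # parallel fusion of both branches


class SelectiveKernelConv(nn.Module):
    """Dynamic kernel selection: softly weight 3x3 and 5x5 branches per channel."""
    def __init__(self, in_ch, out_ch, reduction=4):
        super().__init__()
        self.branch3 = nn.Conv2d(in_ch, out_ch, 3, padding=1)
        self.branch5 = nn.Conv2d(in_ch, out_ch, 5, padding=2)
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(out_ch, out_ch // reduction), nn.ReLU(),
            nn.Linear(out_ch // reduction, 2 * out_ch))

    def forward(self, x):
        u3, u5 = self.branch3(x), self.branch5(x)
        w = self.gate(u3 + u5).view(x.size(0), 2, -1, 1, 1).softmax(dim=1)
        return w[:, 0] * u3 + w[:, 1] * u5


class DepthwiseSeparableConv(nn.Module):
    """Depthwise + pointwise convolution to further extract and compress features."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))


class MultiAttentionFusionNet(nn.Module):
    """Dual input pathways -> attention -> SK conv -> DS conv -> Transformer fusion -> classifier."""
    def __init__(self, in_ch=5, n_classes=3, d_model=64):
        super().__init__()
        def pathway():
            return nn.Sequential(
                nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
                ChannelSpatialAttention(32),
                SelectiveKernelConv(32, 64),
                DepthwiseSeparableConv(64, d_model),
                nn.AdaptiveAvgPool2d((4, 4)))
        self.spatial_path = pathway()     # e.g. topographic spectral feature maps
        self.asymmetry_path = pathway()   # e.g. left-right hemisphere difference features
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True)
        self.fusion = nn.TransformerEncoder(encoder_layer, num_layers=2)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x_spatial, x_asym):
        def tokens(feat):                 # (B, C, 4, 4) -> (B, 16, C)
            return feat.flatten(2).transpose(1, 2)
        seq = torch.cat([tokens(self.spatial_path(x_spatial)),
                         tokens(self.asymmetry_path(x_asym))], dim=1)
        fused = self.fusion(seq).mean(dim=1)      # pool tokens after global attention
        return self.classifier(fused)


if __name__ == "__main__":
    # Hypothetical input: 5 frequency-band feature maps on a 9x9 electrode grid.
    model = MultiAttentionFusionNet()
    x1 = torch.randn(2, 5, 9, 9)   # spatial-spectral pathway input
    x2 = torch.randn(2, 5, 9, 9)   # hemispheric-asymmetry pathway input
    print(model(x1, x2).shape)     # torch.Size([2, 3]) -> three-class logits
```

In this sketch the two pathways share the same structure purely for brevity; in an actual asymmetry branch the input would typically be differential or rational features computed between symmetric electrode pairs, as suggested by the abstract's reference to hemispheric asymmetry.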

Cite this article:

Wu Jiabin, Yang Xiaoming, Cao Taiqiang. Emotion recognition from EEG based on multi-attention and multi-feature fusion [J]. 电子测量技术 (Electronic Measurement Technology), 2025, 48(19): 144-152.

History
  • Online publication date: 2025-12-01