Computer and Modernization ›› 2022, Vol. 0 ›› Issue (02): 120-126.

• Image Processing •

Automatic Sleep Staging Based on 3CNN-BiGRU

  1. (College of Information Engineering, Guangdong University of Technology, Guangzhou 510006, China)
  • Online: 2022-03-31  Published: 2022-03-31
  • About the authors: TANG Jie (1997—), female, born in Dong'an, Hunan, M.S. candidate; research interests: deep learning, pattern recognition; E-mail: tangjie0202@163.com. Corresponding author: WEN Yuanmei (1968—), female, born in Jingzhou, Hubei, associate professor, Ph.D.; research interests: intelligent information processing; E-mail: ym0218@gdut.edu.cn.

Abstract: To improve the efficiency and accuracy of automatic sleep staging from single-channel EEG signals, this paper proposes 3CNN-BiGRU, an automatic sleep staging model that uses a three-scale parallel convolutional neural network to extract sleep signal features and a bidirectional gated recurrent unit to learn the temporal relationships between sleep stages. The raw single-channel EEG signal is first band-pass filtered and class-balanced with the synthetic minority oversampling technique (SMOTE), then fed into the model for training and validation. Pre-training followed by fine-tuning is used to optimize the model, and 10-fold and 20-fold cross-validation are used to improve training reliability. Comparative experiments with different models on different datasets show that the 3CNN-BiGRU model achieves higher training efficiency and better staging accuracy.
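The class-balancing step described in the abstract can be illustrated with a minimal SMOTE-style sketch in plain Python. This is not the paper's implementation; the function name, the choice of k = 3 neighbours, and the toy data are illustrative only. For each synthetic sample, a random minority point is paired with one of its k nearest same-class neighbours and a new point is interpolated between them:

```python
import random
from collections import Counter

def smote_oversample(X, y, k=3, seed=0):
    """Illustrative SMOTE-style oversampler: grow every class to the
    size of the largest class by interpolating between each chosen
    minority point and one of its k nearest same-class neighbours."""
    rng = random.Random(seed)
    counts = Counter(y)
    target = max(counts.values())
    X_new, y_new = list(X), list(y)
    for label, n in counts.items():
        minority = [x for x, lab in zip(X, y) if lab == label]
        for _ in range(target - n):
            base = rng.choice(minority)
            # k nearest same-class neighbours of `base` (excluding itself),
            # ranked by squared Euclidean distance
            neighbours = sorted(
                (p for p in minority if p is not base),
                key=lambda p: sum((a - b) ** 2 for a, b in zip(base, p)),
            )[:k]
            neighbour = rng.choice(neighbours)
            gap = rng.random()  # interpolation factor in [0, 1)
            X_new.append(tuple(b + gap * (q - b) for b, q in zip(base, neighbour)))
            y_new.append(label)
    return X_new, y_new
```

In practice one would use an established implementation (e.g. imbalanced-learn's `SMOTE`) on the per-epoch EEG feature vectors; the sketch only shows the interpolation idea behind the class balancing.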

Key words: EEG signal, sleep staging, convolutional neural network, bidirectional gated recurrent unit, synthetic minority oversampling technique
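The 10- and 20-fold cross-validation mentioned in the abstract partitions the data into k disjoint folds, each serving once as the validation set. A minimal index-splitting sketch (the function name is illustrative; the paper does not specify its splitting code):

```python
def kfold_indices(n, k):
    """Yield (train_idx, val_idx) pairs for k-fold cross-validation
    over n samples; fold sizes differ by at most one."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        val = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        yield train, val
        start += size
```

Each of the k rounds trains on k-1 folds and validates on the held-out fold, so every sample is validated exactly once.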
