臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)
詳目顯示 (Detailed Record)
Researcher: 王咨晴 (Wang, Tzu-Ching)
Thesis title (Chinese): 跨受試者表徵學習於腦電波情緒分類
Thesis title (English): Learning of Subject-Independent Representation in EEG Emotion Classification
Advisor: 陳永昇 (Chen, Yong-Sheng)
Oral defense committee: 陳麗芬, 魏群樹, 郭柏志
Oral defense date: 2022-07-21
Degree: Master's
Institution: 國立陽明交通大學 (National Yang Ming Chiao Tung University)
Department: 多媒體工程研究所 (Institute of Multimedia Engineering)
Discipline: Computing
Field: Software development
Document type: Academic thesis
Year of publication: 2022
Graduation academic year: 110 (2021-2022)
Language: English
Number of pages: 54
Keywords (Chinese): 深度學習, 腦電波, 情緒分類, 跨受試者, 領域自適應
Keywords (English): deep learning, electroencephalography, emotion classification, subject-independent, domain adaptation
Usage statistics:
  • Times cited: 0
  • Views: 157
  • Downloads: 9
  • Saved to reading lists: 0
Abstract (translated from the Chinese):
In recent years, automatic emotion recognition has become a popular topic in artificial-intelligence applications. Emotion-recognition techniques based on various modalities, such as facial expressions, emotional speech, body language, physiological signals, and EEG, have been proposed one after another; among them, EEG has attracted wide attention because of its reliability. To better understand the brain mechanisms of emotion, previous studies designed many hand-crafted features for the EEG emotion classification task; brain-connectivity features, being consistent with the neuroscientific understanding of emotion, have become one of the recent research directions. One of the most challenging aspects of EEG emotion classification is the high inter-subject variability of EEG. Constrained by this, many emotion-recognition models achieve high accuracy on a small set of subjects but cannot be applied successfully to new subjects. Domain-adaptation methods have therefore been introduced into this line of research to reduce the inter-subject distributional discrepancy of EEG signals and to obtain subject-independent emotion representations.
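The brain-connectivity features referred to above are channel-pair statistics computed from the EEG signal. The abstract does not say which connectivity measure the thesis uses, so the following is a minimal sketch assuming one common choice in this literature, the phase-locking value (PLV); the function name `plv_matrix` and its shape conventions are illustrative only.

```python
# Minimal sketch of one common brain-connectivity feature, the
# phase-locking value (PLV). This is an assumed example, not the
# thesis's confirmed connectivity measure.
import numpy as np
from scipy.signal import hilbert

def plv_matrix(eeg: np.ndarray) -> np.ndarray:
    """Compute a channels x channels PLV connectivity matrix.

    eeg: array of shape (n_channels, n_samples), assumed to be
         band-pass filtered to the frequency band of interest.
    """
    analytic = hilbert(eeg, axis=1)    # analytic signal per channel
    phase = np.angle(analytic)         # instantaneous phase
    n_ch = eeg.shape[0]
    plv = np.ones((n_ch, n_ch))
    for i in range(n_ch):
        for j in range(i + 1, n_ch):
            diff = phase[i] - phase[j]  # phase difference over time
            # |mean of unit phasors|: 1 = perfectly locked, 0 = random
            plv[i, j] = plv[j, i] = np.abs(np.mean(np.exp(1j * diff)))
    return plv
```

One such matrix per frequency band and time window would then serve as the input feature for the classification network.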
In this thesis, we combine brain-connectivity features with domain-adaptation techniques and propose a new deep neural network that decomposes the connectivity features into emotion-relevant and subject-relevant representations by means of a subject classifier and mutual-information estimation, preventing the model from overfitting to the data of particular subjects and thereby improving the accuracy of cross-subject EEG emotion recognition. We evaluate the model with three-category emotion classification on the public DEAP dataset and on the HEDEC dataset that we collected ourselves. In the within-dataset cross-subject tests, our method reaches an accuracy of 92.77% on DEAP and 97.22% on HEDEC; in the unseen-subject tests, the accuracies on DEAP and HEDEC are 52.43% and 54.01%, respectively. Across all experimental settings, the final accuracy of the model exceeds the baseline by 3-7%, showing that the proposed domain-adaptation method indeed helps reduce inter-subject EEG variability and helps the model learn more general emotion representations.
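As a rough illustration of the decomposition idea described above, the PyTorch sketch below splits an encoded connectivity feature into an emotion-relevant and a subject-relevant branch, with an auxiliary subject classifier attached adversarially to the emotion branch through a gradient-reversal layer (in the style of domain-adversarial training). The layer sizes, the gradient-reversal mechanism, and all names (`DecompositionNet`, `to_emotion`, and so on) are assumptions for illustration, not the thesis's actual architecture.

```python
# Hedged sketch of a feature-decomposition network: an encoder feeds
# two branches, and a gradient-reversal layer makes the emotion branch
# adversarial to subject identity. Illustrative only.
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; flips and scales the gradient."""
    @staticmethod
    def forward(ctx, x, lamb):
        ctx.lamb = lamb
        return x.view_as(x)
    @staticmethod
    def backward(ctx, grad_out):
        return -ctx.lamb * grad_out, None

class DecompositionNet(nn.Module):
    def __init__(self, in_dim, emo_dim, sub_dim, n_emotions=3, n_subjects=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.to_emotion = nn.Linear(256, emo_dim)  # emotion-relevant branch
        self.to_subject = nn.Linear(256, sub_dim)  # subject-relevant branch
        self.emotion_clf = nn.Linear(emo_dim, n_emotions)
        self.subject_clf = nn.Linear(sub_dim, n_subjects)
        # adversarial head: tries to read subject identity out of z_emo
        self.adv_subject_clf = nn.Linear(emo_dim, n_subjects)

    def forward(self, x, lamb=1.0):
        h = self.encoder(x)
        z_emo, z_sub = self.to_emotion(h), self.to_subject(h)
        emo_logits = self.emotion_clf(z_emo)
        sub_logits = self.subject_clf(z_sub)
        # reversed gradient pushes z_emo to be uninformative about subject
        adv_logits = self.adv_subject_clf(GradReverse.apply(z_emo, lamb))
        return emo_logits, sub_logits, adv_logits, z_emo, z_sub
```

In a setup like this, the subject classifier on the subject branch keeps `z_sub` informative about subject identity, while the reversed gradient strips subject information out of `z_emo`, which is the branch used for emotion classification.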
Abstract (English):
Recently, emotion classification has become a popular topic in artificial intelligence applications. Among the various modalities for recognizing emotion, EEG has attracted significant attention because of its reliability. To further understand the brain mechanisms of emotion, various hand-crafted EEG features have been designed for emotion classification tasks. Based on findings in neuroscience, brain connectivity features have received more and more attention.
One of the most significant challenges in the EEG emotion classification task is the high inter-subject variability of the EEG. Due to this limitation, many models can achieve high accuracy only on a limited number of subjects and cannot be applied to new subjects effectively. Thus, domain adaptation methods have been introduced to emotion classification tasks to diminish the inter-subject distributional discrepancy of EEG signals.
In this thesis, we propose a deep neural network combining brain connectivity features and domain adaptation techniques, which can decompose connectivity features into emotion-relevant and subject-relevant representations through a subject classifier and mutual information estimation. To evaluate our model, we perform three-category emotion classification experiments on the public DEAP dataset and our own HEDEC dataset. In the cross-subject learning experiments, our method achieves 92.77% accuracy on DEAP and 97.22% on HEDEC. In the unseen-subject testing experiments, the accuracies on DEAP and HEDEC are 52.43% and 54.01%, respectively. Across all experiment settings, our proposed model outperforms the baseline model by 3-6%, demonstrating that the method can reduce inter-subject EEG variability and enable the model to learn more general emotion representations.
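The mutual-information estimation mentioned in both abstracts can be realized with a neural estimator in the style of MINE, which lower-bounds mutual information through the Donsker-Varadhan representation. The sketch below is one standard formulation of such an estimator and is assumed rather than taken from the thesis.

```python
# Hedged sketch of a MINE-style mutual-information estimator between
# the two representations z_emo and z_sub. Illustrative only.
import math
import torch
import torch.nn as nn

class MINE(nn.Module):
    """Lower-bounds I(X; Y) via the Donsker-Varadhan representation:
    I(X; Y) >= E_{p(x,y)}[T(x,y)] - log E_{p(x)p(y)}[exp(T(x,y))].
    """
    def __init__(self, x_dim: int, y_dim: int, hidden: int = 128):
        super().__init__()
        self.T = nn.Sequential(
            nn.Linear(x_dim + y_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # joint term: statistics network on correctly paired samples
        joint = self.T(torch.cat([x, y], dim=1)).mean()
        # marginal term: shuffle y to break pairing, approximating p(x)p(y)
        y_perm = y[torch.randperm(y.size(0))]
        t_marg = self.T(torch.cat([x, y_perm], dim=1)).squeeze(1)
        # log E[exp(T)] computed stably as logsumexp(T) - log(N)
        marg = torch.logsumexp(t_marg, dim=0) - math.log(t_marg.size(0))
        return joint - marg  # MI lower bound
```

In a decomposition model like the one sketched earlier, the statistics network would be trained to maximize this bound (tightening the estimate) while the encoder is trained to minimize it, driving the emotion-relevant and subject-relevant representations toward statistical independence.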
中文摘要 (Chinese abstract)
英文摘要 (English abstract)
誌謝 (Acknowledgements)
Table of Contents
List of Figures
List of Tables
1 Introduction
  1.1 Emotion classification
  1.2 Electroencephalography
  1.3 Deep learning based EEG emotion classification
  1.4 Thesis goals
2 Related Works
  2.1 CNN with an attention mechanism
  2.2 Cross-subject EEG emotion classification methods
  2.3 Unseen-subject EEG emotion classification methods
3 Materials and Methods
  3.1 EEG emotion database
    3.1.1 DEAP dataset
    3.1.2 HEDEC dataset
  3.2 Feature extraction
    3.2.1 Single-electrode feature
    3.2.2 Brain connectivity features
  3.3 The proposed method
    3.3.1 EEG encoder
    3.3.2 Feature decomposition
    3.3.3 Training settings
4 Experimental Results
  4.1 Experiment settings
  4.2 Comparison of different feature extraction
  4.3 Comparison of the ablations
    4.3.1 Cross-subject experiment
    4.3.2 Unseen-subject experiment
  4.4 Effect of representation dimension
  4.5 Performance of the proposed model
  4.6 Comparison with existing methods
    4.6.1 Cross-subject emotion classification methods
    4.6.2 Unseen-subject emotion classification methods
5 Conclusions and Future Works
  5.1 Conclusions
  5.2 Future works
References