National Digital Library of Theses and Dissertations in Taiwan

Detailed Record
Author: 陳映璇 (Chen, Yin-Husan)
Title: 結合短時程心電圖與小波散射變換的情緒分類系統
Title (English): Emotion Classification Based on Wavelet Scattering Transform of Short-Term ECG Signals
Advisor: 張文輝 (CHANG, WEN-WHEI)
Committee members: 陳信宏, 黃紹華, 黃敬群
Oral defense date: 2023-08-10
Degree: Master's
Institution: 國立陽明交通大學 (National Yang Ming Chiao Tung University)
Department: Department of Electrical Engineering
Discipline: Engineering
Field: Electrical and Information Engineering
Document type: Academic thesis
Year of publication: 2023
Academic year of graduation: 112
Language: Chinese
Pages: 61
Keywords (Chinese): 情緒識別, 情感運算, 短時程心電圖, 人工特徵擷取, 小波散射變換, 自動特徵擷取, 支持向量機
Keywords (English): emotion recognition, affective computing, short-term electrocardiography, manual feature extraction, wavelet scattering transform, automated feature extraction, support vector machine
Abstract (translated from the Chinese): Emotions are physiological responses of the human body to external stimuli and profoundly influence our daily lives. With the widespread adoption of human-machine interface technology, automated emotion recognition has become a research focus in affective computing. Wearable devices can measure the electrocardiogram (ECG) continuously in real time, creating an opportunity to apply them to emotion recognition. This thesis develops an emotion classification system based on short-term ECG signals for wearable-device scenarios, with the goal of monitoring the continuous evolution of human emotions. The system uses a two-stage processing pipeline: feature extraction and emotion classification. The first stage extracts emotion-related physiological features. Traditional manual extraction is time-consuming and labor-intensive, and it requires long physiological recordings for statistical analysis; this thesis instead uses the wavelet scattering transform for automated feature extraction, which captures richer emotion-related physiological features from short-term ECG signals across multiple scales and in the time-frequency domain. The second stage uses a support vector machine, a supervised machine-learning classifier, to sort ECG signals automatically into three emotions: amusing, relaxed, and scary. Finally, parameter-optimization experiments and performance evaluations were conducted on the CASE database. The results show that the proposed wavelet scattering transform outperforms traditional manual feature extraction: multi-beat ECG signals reach an accuracy of 84.4%, far above the 60.56% obtained with 100-second ECG signals and the 62.56% obtained with single-beat ECG signals. Overall, the wavelet scattering transform extracts emotion-related physiological features automatically and accurately, and the proposed system combining short-term ECG with the wavelet scattering transform can be extended to wearable devices.
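The first stage's core idea — band-pass wavelet filtering, a modulus nonlinearity, then low-pass time averaging — can be sketched in a simplified form. This is a hypothetical first-order illustration in NumPy, not the thesis's actual wavelet scattering implementation (which uses a standard toolbox with Morlet filter banks and multiple scattering orders); the Gaussian filter shapes, center spacing, and widths here are all assumptions for illustration.

```python
import numpy as np

def bandpass_filter(n, center, width):
    """Gaussian band-pass filter defined in the frequency domain
    (a crude stand-in for a Morlet wavelet)."""
    freqs = np.fft.fftfreq(n)
    return np.exp(-0.5 * ((freqs - center) / width) ** 2)

def scattering_features(x, n_filters=8, width=0.02):
    """Simplified first-order scattering coefficients of a 1-D signal:
    wavelet (band-pass) filtering -> modulus -> global time averaging.
    Filter centers and widths are illustrative assumptions."""
    n = len(x)
    X = np.fft.fft(x)
    feats = [np.abs(x).mean()]              # zeroth order: averaged signal
    for k in range(1, n_filters + 1):
        center = 0.5 * k / (n_filters + 1)  # spread centers over (0, 0.5)
        psi = bandpass_filter(n, center, width)
        band = np.fft.ifft(X * psi)         # filter via the frequency domain
        feats.append(np.abs(band).mean())   # modulus + low-pass average
    return np.array(feats)
```

The averaging step is what gives the features their translation invariance: shifting the ECG segment in time leaves the averaged modulus nearly unchanged. A full scattering transform additionally computes second-order coefficients (scattering the modulus envelopes themselves), which recover the envelope-modulation information that the averaging discards.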
Abstract (English): Recent advances in wearable devices have enabled continuous ambulatory monitoring of electrocardiography (ECG) signals. The ECG carries considerable information about emotions, and human-computer interaction would improve greatly if emotions could be recognized from a single-lead ECG. Considering the time-varying nature of emotions, this study aims to recognize emotion from short-term ECG signals. The task is essentially a pattern-recognition problem with two stages: feature extraction and classification. First, we employ the wavelet scattering transform (WST) as an automated feature-extraction method and compare it with manual feature extraction. Handcrafted features are poorly suited to short-term emotion recognition, because manual extraction is time-consuming and statistical time-frequency analysis is not meaningful on short ECG segments. The advantage of WST is that it introduces translation invariance and deformation stability into the feature representation of nonstationary ECG signals. After feature extraction, a support vector machine (SVM) classifier assigns each signal to one of three emotion types: amusing, relaxed, and scary. Experiments on the CASE database show that the proposed WST feature extraction achieves an accuracy of 84.4% for 10-second ECG segments, compared with 60.56% for 100-second ECG and 62.56% for single-beat ECG.
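The second stage can likewise be illustrated with a minimal SVM. The sketch below trains a linear SVM by a Pegasos-style hinge-loss subgradient method and combines three binary models one-vs-rest for the three emotion classes. It is a didactic sketch on assumed toy data, not the thesis's actual classifier configuration (which would typically come from a standard SVM toolbox with kernel choice and hyperparameter tuning); the regularization strength, epoch count, and synthetic class layout are assumptions.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Pegasos-style subgradient training of a binary linear SVM.

    X: (n, d) feature matrix; y: labels in {-1, +1}.  The bias term is
    absorbed by appending a constant-1 feature column."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    rng = np.random.default_rng(seed)
    n, d = Xb.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)                 # decaying step size
            if y[i] * (Xb[i] @ w) < 1:            # margin violated: hinge step
                w = (1 - eta * lam) * w + eta * y[i] * Xb[i]
            else:                                 # only the regularizer acts
                w = (1 - eta * lam) * w
    return w

def predict_one_vs_rest(models, X):
    """Assign each sample to the class whose binary SVM scores highest."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    scores = np.stack([Xb @ w for w in models], axis=1)
    return scores.argmax(axis=1)
```

In use, each of the three emotion classes gets one binary model trained with that class as +1 and the other two as -1, and prediction takes the highest-scoring model — the same one-vs-rest decomposition a multiclass SVM toolbox performs internally.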
Table of Contents
Abstract (Chinese)
Abstract (English)
Contents
List of Figures
List of Tables
Chapter 1  Introduction
  1.1 Background and Motivation
  1.2 Literature Review
  1.3 Thesis Organization
Chapter 2  Emotion Analysis of the ECG
  2.1 Principles of the ECG
  2.2 Emotion Models
    2.2.1 Categorical Models
    2.2.2 Dimensional Models
  2.3 ECG Emotion Databases
Chapter 3  ECG Feature Extraction
  3.1 Manual Feature Extraction
  3.2 Automated Feature Extraction
  3.3 Wavelet Scattering Transform
  3.4 WST Software Toolbox
Chapter 4  ECG Emotion Classification System
  4.1 System Architecture
  4.2 Feature Extraction over Different ECG Durations
    4.2.1 Single-Beat ECG Signals
    4.2.2 Multi-Beat ECG Signals
  4.3 SVM Classification Model
Chapter 5  Experimental Results and Analysis
  5.1 Experimental Environment
    5.1.1 Setup and Tools
    5.1.2 Performance Metrics
  5.2 WST Parameter Optimization for Long-Term ECG Signals
  5.3 Feature-Extraction Performance Comparison for Long-Term ECG Signals
  5.4 WST Parameter Optimization for Single-Beat ECG Signals
  5.5 WST Parameter Optimization for Multi-Beat ECG Signals
  5.6 Comparison with Previous Methods
Chapter 6  Conclusions and Future Work
References
Electronic full text (publicly available online from 2028-08-14)