
National Digital Library of Theses and Dissertations in Taiwan (臺灣博碩士論文加值系統)


Detailed Record

Author: 黃稚翔
Author (English): Huang, Xhi-Xiang
Title: 使用深度學習和機器學習技術基於面部微表情開發用於分類重度抑鬱症的投票策略
Title (English): A Voting Strategy for Classifying Major Depressive Disorder Based on Facial Micro-Expressions Using Deep Learning and Machine Learning Techniques
Advisor: 洪哲倫
Advisor (English): Hung, Che-Lun
Committee Members: 巫坤品、林俊淵
Committee Members (English): Wu, Kun-Pin; Lin, Chun-Yuan
Oral Defense Date: 2023-07-20
Degree: Master's
University: National Yang Ming Chiao Tung University
Department: Institute of Biomedical Informatics
Discipline: Life Sciences
Academic Field: Biochemistry
Thesis Type: Academic thesis
Publication Year: 2023
Graduation Academic Year: 111 (2022-2023)
Language: Chinese
Number of Pages: 52
Keywords (Chinese): 深度學習、機器學習、電腦視覺、憂鬱症、臉部表情分析、臉部微表情
Keywords (English): deep learning, machine learning, computer vision, depression, facial expression analysis, facial micro-expressions
Statistics:
  • Cited: 0
  • Views: 161
  • Downloads: 0
  • Bookmarked: 0
Major depressive disorder (MDD) is common worldwide, and in Taiwan the prevalence of depression-related disorders is estimated to be as high as 2.4%. Recent advances in deep learning and computer vision offer a new diagnostic approach: objectively assessing depression by analyzing facial micro-expressions.
The purpose of this study was to use machine learning to distinguish the facial expressions of individuals with major depressive disorder from those of non-depressed individuals. Facial videos were collected from 29 participants, 9 of whom had been diagnosed with depression. Their facial expressions were recorded while they viewed positive, neutral, and negative images from the Nencki Affective Picture System (NAPS) dataset. The OpenFace toolkit was then used to extract facial action units from the recordings, and machine learning classifiers such as CatBoost and Random Forest were combined under a voting strategy to build the classification models, which were evaluated using metrics including accuracy, precision, recall, and F1 score.
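The record contains no code, so the following is only a rough, illustrative sketch (not the author's implementation) of how such a pipeline could look in Python with scikit-learn and CatBoost. The feature matrix X (per-video action-unit statistics), the labels y, the soft-voting combination of the two classifiers, and all hyperparameters are assumptions for illustration; the study's actual feature construction and voting mechanism may differ.

```python
# Illustrative sketch only: a soft-voting ensemble of CatBoost and Random Forest
# over OpenFace action-unit (AU) features, scored with stratified 3-fold
# cross-validation. X, y, and all hyperparameters are placeholder assumptions.
import numpy as np
from catboost import CatBoostClassifier
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import StratifiedKFold, cross_validate

rng = np.random.default_rng(0)
X = rng.normal(size=(29, 35))        # placeholder: 29 participants x 35 AU-derived features
y = np.array([1] * 9 + [0] * 20)     # per the abstract: 9 depressed, 20 non-depressed

ensemble = VotingClassifier(
    estimators=[
        ("catboost", CatBoostClassifier(iterations=200, verbose=0, random_state=0)),
        ("random_forest", RandomForestClassifier(n_estimators=200, random_state=0)),
    ],
    voting="soft",                   # average predicted probabilities across classifiers
)

cv = StratifiedKFold(n_splits=3, shuffle=True, random_state=0)
scores = cross_validate(ensemble, X, y, cv=cv, scoring=["precision", "recall", "f1"])

for metric in ("precision", "recall", "f1"):
    print(f"{metric}: {scores['test_' + metric].mean():.3f}")
```

Soft voting averages the classifiers' predicted probabilities before deciding, which is one common way to realize a voting strategy over heterogeneous models; a hard vote or a vote across video segments would be equally plausible readings of the abstract.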
The results showed that the best model achieved an average recall of 78%, a precision of 100%, and an F1 score of 84.3% under stratified 3-fold cross-validation. SHAP analysis further highlighted movements such as drooping mouth corners and drooping eyebrows as important features.
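Likewise, a SHAP importance summary over action-unit features could be sketched as below, again purely as an illustration: the Random Forest model, the synthetic data, and the example column names (an assumed AU04 "brow lowerer", AU12 "lip corner puller", and AU15 "lip corner depressor") are hypothetical stand-ins rather than the study's actual features or results.

```python
# Illustrative sketch only: global SHAP importance for a tree model trained on
# hypothetical action-unit features; data, column names, and settings are made up.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
feature_names = ["AU04_brow_lowerer", "AU12_lip_corner_puller", "AU15_lip_corner_depressor"]
X = pd.DataFrame(rng.normal(size=(29, 3)), columns=feature_names)  # placeholder AU features
y = np.array([1] * 9 + [0] * 20)                                   # 9 depressed, 20 controls

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# TreeExplainer gives per-sample, per-feature contributions; the mean absolute
# SHAP value is a common global importance summary.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
if isinstance(shap_values, list):           # older shap versions: one array per class
    shap_values = shap_values[1]
elif getattr(shap_values, "ndim", 2) == 3:  # newer versions: (samples, features, classes)
    shap_values = shap_values[:, :, 1]

importance = np.abs(shap_values).mean(axis=0)
for name, score in sorted(zip(feature_names, importance), key=lambda item: -item[1]):
    print(f"{name}: {score:.4f}")
```

On real data, a SHAP summary plot over the full set of action units would be the natural way to surface the kind of finding reported above.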
This study demonstrates the feasibility of using facial expressions for diagnosing depression, and this non-invasive assessment technique has potential for clinical application. Expanding the sample size in future studies should further improve the model's accuracy and robustness in identifying depression.
Abstract (Chinese)
Abstract (English)
Table of Contents
List of Figures
List of Tables
Chapter 1 Introduction
1.1 Research Motivation
1.2 Research Objectives
Chapter 2 Literature Review
2.1 Depression
2.1.1 Introduction and Diagnosis of Depression
2.1.2 Etiology of Depression
2.1.3 Symptoms and Treatment of Depression
2.2 Micro-Expressions and Facial Emotion Recognition
2.3 Deep Learning and Computer Vision
2.4 Facial Feature Extraction Methods
2.4.1 Geometric Features
2.4.2 Local Binary Patterns (LBP)
2.4.3 Histogram of Oriented Gradients (HOG)
2.4.4 Hidden Markov Model (HMM)
2.4.5 Principal Component Analysis (PCA)
2.4.6 Convolutional Neural Network (CNN)
2.4.7 Elastic Bunch Graph Matching (EBGM)
2.5 Non-Clinical Applications of Emotion Recognition
2.5.1 Distance Learning
2.5.2 Lie Detection
2.5.3 Prediction of Unsafe Driving Behavior
2.5.4 Human-Computer Interaction Systems
2.6 Clinical Applications of Facial Expression Analysis
2.6.1 Endocrine and Metabolic Disorders
2.6.2 Neurological Disorders
2.6.3 Alzheimer's Disease
2.6.4 Down Syndrome and Autism
2.6.5 Pain Level Analysis
Chapter 3 Materials and Methods
3.1 Data Collection
3.1.1 Inclusion Criteria for the Experimental Group
3.1.2 Inclusion Criteria for the Control Group
3.1.3 Mini-International Neuropsychiatric Interview (MINI)
3.2 Expression-Eliciting Videos and Recording Environment
3.2.1 Recording Environment and Equipment
3.2.2 Nencki Affective Picture System (NAPS)
3.2.3 Facial Micro-Expression Elicitation Videos
3.3 Data Preprocessing and Feature Extraction
3.3.1 Data Preprocessing
3.3.2 OpenFace Feature Extraction
3.3.3 Facial Action Coding Features Used
3.3.4 Converting Videos to Numerical Matrices
3.3.5 Feature Selection
3.4 Model Architecture and Machine Learning Models
3.4.1 Model Architecture
3.4.2 Classifiers
3.5 Evaluation Methods
Stratified K-Fold Cross-Validation
Accuracy
Recall
Precision
F1-score
Chapter 4 Results and Discussion
Accuracy
Recall
Precision
F1-score
SHAP (SHapley Additive exPlanations)
Chapter 5 Conclusion
Chapter 6 References