National Digital Library of Theses and Dissertations in Taiwan

Detailed Record

Author: 李汶政 (Wen-Jeng Li)
Thesis title: 使用體感裝置於中風後復健手勢辨識
Thesis title (English): Hand Gesture Recognition for Post-stroke Rehabilitation Using Leap Motion
Advisor: 朱唯勤 (Woei-Chyn Chu)
Degree: Master's
Institution: National Yang-Ming University
Department: Department of Biomedical Engineering
Discipline: Engineering
Field of study: Biomedical Engineering
Thesis type: Academic thesis
Year of publication: 2017
Academic year of graduation: 105 (2016-2017)
Language: Chinese
Pages: 66
Keywords (Chinese): 中風復健, 手勢辨識, 機器學習, Leap Motion
Keywords (English): Stroke Rehabilitation, Gesture Recognition, Machine Learning, Leap Motion
Usage statistics:
  • Cited by: 0
  • Views: 364
  • Rating: (none)
  • Downloads: 37
  • Bookmarked: 1
Abstract (translated from the Chinese)
To enhance and improve rehabilitation outcomes after stroke, rehabilitation needs to start early and be monitored through continuous, recurrent, long-term intervention in both hospital and home settings. As Taiwan's population ages, the proportion of elderly people keeps growing, and because the elderly are at high risk of stroke, medical resources and clinical rehabilitation therapists are becoming increasingly scarce. This has prompted the promotion of self-directed home rehabilitation for stroke patients as a way to ease the shortage of medical resources. However, home rehabilitation lacks the professional guidance of clinical therapists, and therapists in turn cannot observe patients' rehabilitation in real time; these problems can lead to additional injury and delayed rehabilitation progress. To address the inability to observe the rehabilitation process in real time, this study developed a rehabilitation hand gesture recognition algorithm for the home rehabilitation of post-stroke patients, used to track the number of repetitions and the quality of rehabilitation gestures. The experimental subjects were seventeen healthy young individuals (fifteen males and two females). The results were evaluated with k-fold cross-validation and show that the support vector machine (SVM) and k-nearest neighbors (KNN) classifiers achieve gesture recognition accuracies of 97.29% and 97.71%, respectively. This study makes two main contributions: first, rehabilitation gestures performed at home by stroke patients are collected with a motion-sensing device and recognized accurately using machine learning methods; second, a graphical user interface is provided for clinical therapists to review the number of completed rehabilitation movements and how they were performed.
Abstract (English)
To enhance and improve recovery after stroke, rehabilitation needs to start early and be monitored through continuous, recurrent, long-term intervention in both the clinic and the home setting. The elderly are a high-risk group for stroke, and as the population ages the demands on hospitals and physiotherapists grow rapidly. Home-based rehabilitation for stroke patients could effectively relieve these shortages of medical resources. However, home-based rehabilitation lacks professional guidance, and physiotherapists cannot monitor the rehabilitation progress of stroke patients remotely. These problems may lead to additional harm or delay rehabilitation progress. To solve these problems, we developed a hand gesture recognition algorithm to monitor seven gestures used in the home rehabilitation of post-stroke patients. The gestures were performed by seventeen healthy young subjects (15 males and 2 females, 23-27 years old). The results were assessed by k-fold cross-validation and show that the proposed algorithm achieves accuracies of 97.29% with a multi-class SVM classifier and 97.71% with a k-NN classifier. This study makes two major contributions: the development of an accurate hand gesture recognition algorithm for the rehabilitation of stroke patients, and a graphical user interface that allows physiotherapists to monitor the rehabilitation progress of patients.
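The pipeline summarized in the abstracts (hand features captured by the Leap Motion, multi-class SVM and k-NN classifiers, k-fold cross-validation) can be sketched roughly as follows. This is a minimal illustrative sketch in Python with scikit-learn, not the author's implementation: the thesis used the Leap Motion Java SDK and a MATLAB/Eclipse environment, and the feature layout, sample counts, and synthetic data below are placeholder assumptions.

```python
# Illustrative sketch only (not the thesis implementation). It mirrors the described
# pipeline: feature vectors for seven gesture classes, multi-class SVM and k-NN
# classifiers, and k-fold cross-validation. Feature layout and data are placeholders.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# X: one row per recorded gesture sample; columns stand in for Leap Motion hand
# features such as fingertip positions, palm normal, and pitch/yaw/roll angles
# (hypothetical layout). y: integer labels for the seven rehabilitation gestures.
rng = np.random.default_rng(0)
X = rng.normal(size=(700, 30))
y = rng.integers(0, 7, size=700)

classifiers = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),  # multi-class via one-vs-one
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
}

# Ten-fold cross-validation, matching the evaluation scheme sketched in Figure 4-1.
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10)
    print(f"{name}: mean accuracy = {scores.mean():.4f} (+/- {scores.std():.4f})")
```

With real Leap Motion feature vectors in place of the synthetic arrays, the same loop would produce the kind of per-classifier accuracy comparison reported in Table 4-2.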
Table of Contents
Acknowledgments i
Abstract (Chinese) ii
Abstract (English) iii
Table of Contents iv
List of Figures v
List of Tables vi
Chapter 1 Introduction 1
1.1 Research Background 1
1.2 Research Motivation and Objectives 3
1.3 Thesis Organization 5
Chapter 2 Literature Review 6
2.1 Stroke Rehabilitation 6
2.1.1 Conventional Rehabilitation 6
2.1.2 Telerehabilitation 12
2.2 Hand Gesture Recognition 14
2.2.1 Hand Gesture Recognition Technologies 16
2.2.2 Dynamic Hand Gesture Recognition Algorithms 20
Chapter 3 Research Tools and Methods 24
3.1 Development Environment and Research Tools 24
3.1.1 Software: Eclipse 24
3.1.2 Hardware: Leap Motion Controller 25
3.1.3 MATLAB 29
3.2 System Architecture and Research Methods 30
3.2.1 Acquisition of Rehabilitation Gesture Data 31
3.2.2 Feature Extraction 36
3.2.3 Gesture Recognition Model Training 38
Chapter 4 Results and Discussion 41
4.1 Evaluation of the Gesture Recognition Models 41
4.2 Gesture Motion Signals 45
Chapter 5 Discussion and Conclusion 51
5.1 Discussion 51
5.2 Conclusion 55
Chapter 6 Future Work 57
References 62

List of Figures
Figure 1-1 WHO top 10 causes of death in 2015 1
Figure 2-1 Constraint-induced movement therapy 11
Figure 2-2 Hand gesture recognition technologies 15
Figure 2-3 Various gesture recognition devices 17
Figure 2-4 Schematic of a hidden Markov model 21
Figure 2-5 Schematic of SVM classification 23
Figure 3-1 Eclipse integrated development environment 25
Figure 3-2 Comparison of motion-sensing devices 26
Figure 3-3 Leap Motion sensing range 27
Figure 3-4 Leap Motion coordinate system 27
Figure 3-5 Palm normal vector and direction 28
Figure 3-6 Center and radius of the hand sphere 28
Figure 3-7 Pointable objects 29
Figure 3-8 Hand direction and fingertip positions 29
Figure 3-9 Graphical user interface 30
Figure 3-10 Functional diagram of the gesture recognition algorithm 30
Figure 3-11 Hand information available from the Leap Motion 32
Figure 3-12 Schematic of the experimental environment 33
Figure 3-13 Schematic of the gestures 34
Figure 3-14 Flexion and extension 34
Figure 3-15 Finger adduction and abduction 35
Figure 3-16 Finger tapping 35
Figure 3-17 Radial and ulnar deviation of the wrist 35
Figure 3-18 Thumb tapping each of the four fingers 35
Figure 3-19 Internal and external rotation 36
Figure 3-20 Hand opening and closing 36
Figure 3-21 Pitch, yaw, and roll angles 38
Figure 3-22 Mapping data to a higher-dimensional feature space via a kernel function 40
Figure 4-1 Schematic of ten-fold cross-validation 42
Figure 4-2 Left/right hand misidentification 44
Figure 4-3 Wrist joint angle changes during radial/ulnar deviation 46
Figure 4-4 Wrist joint angle changes during flexion and extension 47
Figure 4-5 Continuous extension and flexion 48
Figure 4-6 User interface 49
Figure 5-1 Schematic of automatic signal segmentation 54
Figure 6-1 Complete version of the user interface 58

List of Tables
Table 1-1 Studies on gesture recognition with motion-sensing devices in the past five years 4
Table 2-1 Clinical assessment scales 8
Table 4-1 Gesture recognition confusion matrix 43
Table 4-2 Prediction accuracy of the gesture recognition algorithms 45
Table 4-3 Contents of the clinical assessment form 50
References
[1] World Health Organization. (2017). The top 10 causes of death. Available: http://www.who.int/mediacentre/factsheets/fs310/en/
[2] I. Aprile, M. Rabuffetti, L. Padua, E. Di Sipio, C. Simbolotti, and M. Ferrarin, "Kinematic analysis of the upper limb motor strategies in stroke patients as a tool towards advanced neurorehabilitation strategies: a preliminary study," BioMed research international, 2014.
[3] 陳誌睿. (2016). 台灣遠距復健敘述. Available: http://mag.longgood.com.tw/2016/06/09/%E5%8F%B0%E7%81%A3%E9%81%A0%E8%B7%9D%E5%BE%A9%E5%81%A5%E6%95%98%E8%BF%B0/
[4] (2017, 06/30). Telerehabilitation. Available: http://www.physiotherapy.asn.au/DocumentsFolder/Advocacy_Background_Papers_Telerehabilitation.pdf
[5] X. Song, L. Tao, W. Li, and Q. Fang, "Information management system of remote rehabilitation for stroke patients," Orange Technologies (ICOT), 2013 International Conference on, pp. 27-30, 2013.
[6] 何正宇, 王志龍, 盧玉強, 孫淑芬, 張兆宏, and 蔡欣宜, "以 Wii™ 建構虛擬實境輔助慢性中風患者復健訓練之療效評估," 台灣復健醫學雜誌, vol. 38, pp. 11-18, 2010.
[7] 張容瑜, "結合虛擬實境與動作捕捉系統於平衡能力評估之姿勢反應研究," 元智大學工業工程與管理學系學位論文, pp. 1-56, 2011.
[8] 裴駿, 徐仲楠, 黃建華, 孫天龍, and 黃振嘉, "創新遠距居家健康促進體感遊戲系統開發與初步評估," 福祉科技與服務管理學刊, vol. 1, pp. 51-62, 2013.
[9] 吳英黛、劉千綺、鄭建興. (2008) 慢性中風病人的物理治療需求. 台灣腦中風學會第十五卷第三期. Available: http://www.stroke.org.tw/newpaper/2008Dec/paper_6.asp
[10] M. Shaughnessy, B. M. Resnick, and R. F. Macko, "Testing a model of post‐stroke exercise behavior," Rehabilitation nursing, vol. 31, pp. 15-21, 2006.
[11] Y. Chen, Z. Ding, Y.-L. Chen, and X. Wu, "Rapid recognition of dynamic hand gestures using leap motion," Information and Automation, 2015 IEEE International Conference on, pp. 1419-1424, 2015.
[12] W. Lu, Z. Tong, and J. Chu, "Dynamic Hand Gesture Recognition With Leap Motion Controller," IEEE Signal Processing Letters, vol. 23, pp. 1188-1192, 2016.
[13] G. Marin, F. Dominio, and P. Zanuttigh, "Hand gesture recognition with jointly calibrated Leap Motion and depth sensor," Multimedia Tools and Applications, vol. 75, pp. 14991-15015, 2016.
[14] R. McCartney, J. Yuan, and H.-P. Bischof, "Gesture recognition with the leap motion controller," Proceedings of the International Conference on Image Processing, Computer Vision, and Pattern Recognition (IPCV), p. 3, 2015.
[15] Q. Wang, Y. Xu, Y.-L. Chen, Y. Wang, and X. Wu, "Dynamic hand gesture early recognition based on Hidden Semi-Markov Models," Robotics and Biomimetics (ROBIO), 2014 IEEE International Conference on, pp. 654-658, 2014.
[16] A. COLGAN. (2015). Changing How People Look at Physical Therapy. Available: http://blog.leapmotion.com/changing-people-look-physical-therapy/
[17] 梁蕙雯. (2009). 腦中風之障礙與失能評估量表簡介. Available: http://www.ntuh.gov.tw/PMR/Lists/List14/Attachments/168/%E8%85%A6%E4%B8%AD%E9%A2%A8%E4%B9%8B%E9%9A%9C%E7%A4%99%E8%88%87%E5%A4%B1%E8%83%BD%E8%A9%95%E4%BC%B0%E9%87%8F%E8%A1%A8%E7%B0%A1%E4%BB%8B.pdf
[18] D. Barthel, "Functional evaluation: the barthel index, Maryland State," Med J, vol. 14, pp. 16-65, 1965.
[19] P. Langhorne, F. Coupar, and A. Pollock, "Motor recovery after stroke: a systematic review," The Lancet Neurology, vol. 8, pp. 741-754, 2009.
[20] G. Hankey, J. Spiesser, Z. Hakimi, G. Bego, P. Carita, and S. Gabriel, "Rate, degree, and predictors of recovery from disability following ischemic stroke," Neurology, vol. 68, pp. 1583-1587, 2007.
[21] M. van Eeden, C. M. van Heugten, and S. M. Evers, "The economic impact of stroke in The Netherlands: the €-Restore4Stroke study," BMC Public Health, vol. 12, p. 122, 2012.
[22] N. Shah, A. Basteris, and F. Amirabdollahian, "Design parameters in multimodal games for rehabilitation," GAMES FOR HEALTH: Research, Development, and Clinical Applications, vol. 3, pp. 13-20, 2014.
[23] A. Y. Wang, "Games for physical therapy," in Proceedings of CHI, 2012.
[24] D. M. Brennan, A. C. Georgeadis, C. R. Baron, and L. M. Barker, "The effect of videoconference-based telerehabilitation on story retelling performance by brain-injured subjects and its implications for remote speech-language therapy," Telemedicine Journal & e-Health, vol. 10, pp. 147-154, 2004.
[25] N. M. Peel, T. G. Russell, and L. C. Gray, "Feasibility of using an in-home video conferencing system in geriatric rehabilitation," Journal of rehabilitation medicine, vol. 43, pp. 364-366, 2011.
[26] J. C. Perry, J. Andureu, F. I. Cavallaro, J. Veneman, S. Carmien, and T. Keller, "Effective game use in neurorehabilitation: user-centered perspectives," Handbook of research on improving learning and motivation through educational games: Multidisciplinary approaches, pp. 683-725, 2011.
[27] G. Saposnik, M. Levin, and S. O. R. C. W. Group, "Virtual reality in stroke rehabilitation," Stroke, vol. 42, pp. 1380-1386, 2011.
[28] J.-H. Shin, H. Ryu, and S. H. Jang, "A task-specific interactive game-based virtual reality rehabilitation system for patients with stroke: a usability test and two clinical experiments," Journal of neuroengineering and rehabilitation, vol. 11, p. 32, 2014.
[29] D. Kim, O. Hilliges, S. Izadi, A. D. Butler, J. Chen, I. Oikonomidis, et al., "Digits: freehand 3D interactions anywhere using a wrist-worn gloveless sensor," Proceedings of the 25th annual ACM symposium on User interface software and technology, pp. 167-176, 2012.
[30] H. Kim, G. Albuquerque, S. Havemann, and D. W. Fellner, "Tangible 3D: Hand Gesture Interaction for Immersive 3D Modeling," IPT/EGVE, pp. 191-199, 2005.
[31] D. Datcu and S. Lukosch, "Free-hands interaction in augmented reality," Proceedings of the 1st symposium on Spatial user interaction, pp. 33-40, 2013.
[32] J. Sutton, "Air painting with corel painter freestyle and the leap motion controller: A revolutionary new way to paint!," ACM SIGGRAPH 2013 Studio Talks, p. 21, 2013.
[33] S. Vikram, L. Li, and S. Russell, "Writing and sketching in the air, recognizing and controlling on the fly," CHI'13 Extended Abstracts on Human Factors in Computing Systems, pp. 1179-1184, 2013.
[34] H. Cheng, L. Yang, and Z. Liu, "Survey on 3D Hand Gesture Recognition," IEEE Transactions on Circuits and Systems for Video Technology, vol. 26, pp. 1659-1673, 2016.
[35] H. Kaur and J. Rani, "A review: Study of various techniques of Hand gesture recognition," Power Electronics, Intelligent Control and Energy Systems (ICPEICES), IEEE International Conference on, pp. 1-5, 2016.
[36] J. Wu, J. Cheng, and W. Feng, "3D dynamic gesture recognition based on improved HMMs with entropy," Information and Automation (ICIA), 2014 IEEE International Conference on, pp. 213-218, 2014.
[37] S. S. Jambhale and A. Khaparde, "Gesture recognition using DTW & piecewise DTW," Electronics and Communication Systems (ICECS), 2014 International Conference on, pp. 1-5, 2014.
[38] Y. Xu, Q. Wang, X. Bai, Y.-L. Chen, and X. Wu, "A novel feature extracting method for dynamic gesture recognition based on support vector machine," Information and Automation (ICIA), 2014 IEEE International Conference on, pp. 437-441, 2014.
[39] S. Naidoo, C. Omlin, and M. Glaser, "Vision-based static hand gesture recognition using support vector machines," University of Western Cape, Bellville, 1998.
[40] Y.-T. Chen and K.-T. Tseng, "Multiple-angle hand gesture recognition by fusing SVM classifiers," Automation Science and Engineering, 2007. CASE 2007. IEEE International Conference on, pp. 527-530, 2007.
[41] N. H. Dardas and N. D. Georganas, "Real-time hand gesture detection and recognition using bag-of-features and support vector machine techniques," IEEE Transactions on Instrumentation and Measurement, vol. 60, pp. 3592-3607, 2011.
[42] C.-H. Chuan, E. Regina, and C. Guardino, "American sign language recognition using leap motion sensor," Machine Learning and Applications (ICMLA), 2014 13th International Conference on, pp. 541-544, 2014.
[43] K. L. MEng, "Development of finger-motion capturing device based on optical linear encoder," Journal of rehabilitation research and development, vol. 48, p. 69, 2011.
[44] K. Mitobe, T. Kaiga, T. Yukawa, T. Miura, H. Tamamoto, A. Rodgers, et al., "Development of a motion capture system for a hand using a magnetic three dimensional position sensor," SIGGRAPH Research Posters, p. 102, 2006.
[45] M. Nishiyama and K. Watanabe, "Wearable sensing glove with embedded hetero-core fiber-optic nerves for unconstrained hand motion capture," IEEE Transactions on Instrumentation and Measurement, vol. 58, pp. 3995-4000, 2009.
[46] (2017, 05/15). CyberGlove Systems. Available: http://www.cyberglovesystems.com/
[47] (2017, 04/28). Myo Gesture Control. Available: http://www.thalmic.com/
[48] L. Chen, F. Wang, H. Deng, and K. Ji, "A survey on hand gesture recognition," Computer Sciences and Applications (CSA), 2013 International Conference on, pp. 313-316, 2013.
[49] A. Shimada, T. Yamashita, and R.-i. Taniguchi, "Hand gesture based TV control system—Towards both user-& machine-friendly gesture applications," Frontiers of Computer Vision,(FCV), 2013 19th Korea-Japan Joint Workshop on, pp. 121-126, 2013.
[50] P. Garg, N. Aggarwal, and S. Sofat, "Vision based hand gesture recognition," World Academy of Science, Engineering and Technology, vol. 49, pp. 972-977, 2009.
[51] S. Muttena, R. Shiva, S. Sriram, and S. Murugavalli, "Simulation of speech by identifying and classifying dynamic gestures," Smart Technologies and Management for Computing, Communication, Controls, Energy and Materials (ICSTM), 2015 International Conference on, pp. 192-197, 2015.
[52] J. Chastine, N. Kosoris, and J. Skelton, "A study of gesture-based first person control," Computer Games: AI, Animation, Mobile, Interactive Multimedia, Educational & Serious Games (CGAMES), 2013 18th International Conference on, pp. 79-86, 2013.
[53] S. S. Rautaray and A. Agrawal, "Vision based hand gesture recognition for human computer interaction: a survey," Artificial Intelligence Review, vol. 43, pp. 1-54, 2015.
[54] CESARSOUZA. (2010, 07/03). Hidden Markov Models in C#. Available: http://crsouza.com/2010/03/23/hidden-markov-models-in-c/
[55] C. Cortes and V. Vapnik, "Support-vector networks," Machine learning, vol. 20, pp. 273-297, 1995.
[56] (2017, 06/02). Leap Motion. Available: http://zh.wikipedia.org/wiki/Leap_Motion#cite_note-aboutpage-1
[57] J. Artal-Sevil and J. Montañés, "Development of a robotic arm and implementation of a control strategy for gesture recognition through Leap Motion device," Technologies Applied to Electronics Teaching (TAEE), 2016, pp. 1-9, 2016.
[58] (2017, 04/19). API Overview — Leap Motion Java SDK v3.2 Beta documentation. Available: http://developer.leapmotion.com/documentation/java/devguide/Leap_Overview.html
[59] J. Lee and T. L. Kunii, "Model-based analysis of hand posture," IEEE Computer Graphics and applications, vol. 15, pp. 77-86, 1995.
[60] D. J. Gladstone, C. J. Danells, and S. E. Black, "The Fugl-Meyer assessment of motor recovery after stroke: a critical review of its measurement properties," Neurorehabilitation and neural repair, vol. 16, pp. 232-240, 2002.
[61] A. Bulling, U. Blanke, and B. Schiele, "A tutorial on human activity recognition using body-worn inertial sensors," ACM Computing Surveys (CSUR), vol. 46, p. 33, 2014.
[62] R. Kohavi, "A study of cross-validation and bootstrap for accuracy estimation and model selection," Ijcai, vol. 14, pp. 1137-1145, 1995.
[63] Y.-F. Chen, "Using Infrared Assistive Device to Build Wrist Range of Motion Measurement System," master, Department of Biomedical Engineering, National Yang-Ming University, Taiwan, 2015.
[64] A. E. Holland and N. S. Cox, "Telerehabilitation for COPD: Could pulmonary rehabilitation deliver on its promise?," Respirology, vol. 22, pp. 626-627, 2017.