
National Digital Library of Theses and Dissertations in Taiwan


Detailed Record

Author: 劉欽平
Author (English): Liu, Chin-Ping
Title: 基於使用者步態之意圖偵測及其於主動式行動輔具操控系統開發
Title (English): Development of a Manipulation System for Active Walking Helper Based on Gait Information
Advisor: 楊谷洋
Advisor (English): Young, Kuu-Young
Oral Examination Committee: 陳永平, 王學誠, 柯春旭
Oral Examination Committee (English): Chen, Yon-Ping; Wang, Hsueh-Cheng; Ko, Chun-Hsu
Oral Defense Date: 2019-02-25
Degree: Master's
Institution: National Chiao Tung University
Department: Institute of Electrical and Control Engineering
Discipline: Engineering
Academic Field: Electrical and Computer Engineering
Thesis Type: Academic thesis
Publication Year: 2019
Graduation Academic Year: 107 (2018-2019)
Language: Chinese
Number of Pages: 54
Keywords (Chinese): 行動輔具, 步態影像, 意圖推導, 深度學習, 主動式控制
Keywords (English): robot walking helper, gait image, intention derivation, deep learning, active control
Usage statistics:
  • Cited by: 1
  • Views: 344
  • Rating:
  • Downloads: 65
  • Bookmarked: 0
Abstract: In recent years, medical technology has steadily advanced and society has aged rapidly. As people grow older, their physical functions weaken, leading to reduced mobility and difficulty managing daily life. In addition, many patients who cannot walk on their own because of serious injury or illness need medical rehabilitation to regain mobility. Effectively restoring and maintaining the mobility of the elderly and of patients is therefore extremely important. Many intelligent walking aids have already been studied and developed. Taking the passive intelligent walking-assistance robot "i-Go" developed in our laboratory as a reference, this study develops a motor-driven, active intelligent walking-assistance robot. The system derives the user's intention angle from gait images, making the assistance less strenuous and the operation more stable. The deep learning method YOLOv2 is introduced to identify the user's foot information accurately and in real time; after analysis and processing, the foot orientation angle is obtained and taken as the user's walking intention. An assistive control strategy then lets the helper supply an appropriate assistive force. Experimental verification shows that the proposed control strategy reduces the torque the user must apply, effectively lightening the load on the user's upper limbs when pushing, and allows back-and-forth walking in confined spaces, improving the system's maneuverability and stability.
Abstract (English): In recent years, with the advancement of medical technology, the population has aged rapidly. As people grow older, their physical functions weaken, resulting in reduced mobility and difficulty managing daily life. Effectively restoring and maintaining the mobility of the elderly and of patients is therefore extremely important. In today's era of strained care, medical resources are scarce; mobile assistive technology can not only ease the burden of long-term elderly care but also support timely medical assistance such as patient rehabilitation. Based on the passive walking helper i-Go developed by our laboratory, this thesis develops a motor-driven active walking helper. The system introduces the deep learning method YOLOv2, which identifies the user's feet in gait images accurately and in real time, and derives the user's intention angle through further analysis and processing. Through the proposed control strategy, the walking helper provides the user with appropriate assistance. Experimental results verify that the proposed system effectively reduces the burden on the user's upper limbs and can walk back and forth in a small space, improving the system's maneuverability and stability.
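
The abstract sketches the pipeline only at a high level: YOLOv2 locates the user's feet in each camera frame, a foot orientation angle derived from those detections is taken as the walking intention, and a control strategy turns that intention into assistive motor commands. The record does not give the actual computations, so the following Python sketch is illustrative only: the foot pixel clouds are assumed to be already extracted from the YOLOv2 bounding boxes, PCA stands in for the thesis's unspecified foot-angle method, and the admittance-style law with its gains (m, b, k_theta, ...) is a hypothetical placeholder for the thesis's control strategy.

    import numpy as np

    def foot_orientation(pixels):
        """Estimate a foot's pointing angle (radians) from its (x, y) pixel cloud.

        The major axis of the cloud, found via PCA, is taken as the foot's
        long axis. The 180-degree eigenvector ambiguity is resolved by
        assuming the foot points roughly along +x (the camera's forward axis).
        """
        centered = pixels - pixels.mean(axis=0)
        eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))
        major = eigvecs[:, np.argmax(eigvals)]
        if major[0] < 0:  # flip so the axis points toward +x
            major = -major
        return float(np.arctan2(major[1], major[0]))

    def intention_angle(left_pixels, right_pixels):
        """Average both foot angles into one walking-intention angle.

        A plain mean suffices because both angles stay near the forward
        direction; angles near the +/-pi wraparound would need circular averaging.
        """
        return 0.5 * (foot_orientation(left_pixels) + foot_orientation(right_pixels))

    def admittance_step(v, w, f_push, theta_int, dt,
                        m=30.0, inertia=5.0, b=15.0, bw=3.0, k_theta=2.0):
        """One integration step of a generic admittance law (hypothetical gains).

        The user's forward pushing force f_push (e.g. from a force sensor)
        drives linear velocity v; the intention angle theta_int steers the
        yaw rate w so the helper turns toward where the feet point.
        """
        v += dt * (f_push - b * v) / m
        w += dt * (k_theta * theta_int - bw * w) / inertia
        return v, w

    # Demo with synthetic foot pixels scattered along a line 10 degrees off axis.
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        t = rng.uniform(0.0, 1.0, size=(200, 1))
        heading = np.deg2rad(10.0)
        axis = np.array([[np.cos(heading), np.sin(heading)]])
        left = t * axis + rng.normal(scale=0.02, size=(200, 2))
        right = left + np.array([0.3, 0.0])  # second foot, laterally offset
        theta = intention_angle(left, right)
        print(f"intention angle: {np.rad2deg(theta):.1f} deg")
        v, w = admittance_step(0.0, 0.0, f_push=20.0, theta_int=theta, dt=0.02)
        print(f"commanded v={v:.3f} m/s, w={w:.4f} rad/s")

On a real differential-drive helper like the one described, the commanded (v, w) pair would then be mapped to left and right wheel speeds through the wheel radius and track width.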
Chapter 1  Introduction  1
1-1  Overview  1
1-2  Related Work  2
1-3  Research Motivation  8
1-4  Research Objectives  9
1-5  Thesis Organization  9
Chapter 2  Walking-Helper Manipulation System Based on User Intention  10
2-1  Dynamic Model of the Walking Helper  11
2-2  Foot-Image Intention Detection System  13
2-2.1  Deep Learning Method  14
2-2.2  Foot-Angle Computation Method  20
2-3  Walking-Helper Control Strategy  22
Chapter 3  System Implementation  24
3-1  Hardware Architecture  25
3-2  Six-Axis Force Sensor  26
3-3  Camera  28
3-4  Laptop Computer  29
3-5  Microcontroller  31
3-6  Shaft Encoder  31
3-7  Brushless DC Motor  32
Chapter 4  Experiments and Analysis  34
4-1  Foot-Image Intention Detection System Test  34
4-2  Walking-Assistance Experiments  39
4-2.1  Experiment 1: Walking in an Open Area  40
4-2.2  Experiment 2: Walking in a Narrow Space  45
Chapter 5  Conclusions and Future Work  50
5-1  Conclusions  50
5-2  Future Work  51
References  52
[1] "World Population Ageing 2017," United Nations Department of Economic and Social Affairs, New York, 2017.
[2] "Weekly Bulletin of Interior Statistics, Week 15, 2018," Department of Statistics, Ministry of the Interior, Taiwan, April 14, 2018.
[3] 施銘峰 and 黃獻樑, "A Preliminary Look at Frailty in the Elderly," Health Education Center, National Taiwan University Hospital, 2018.
[4] D. Fontanelli, A. Giannitrapani, L. Palopoli, and D. Prattichizzo, "A passive guidance system for a robotic walking assistant using brakes," IEEE 54th Annual Conference on Decision and Control, 2015.
[5] 張邱涵, "Development of a Passive Walking-Helper Manipulation System Based on Gait Images and Foot Pressure," Master's thesis, National Chiao Tung University, 2014.
[6] 洪偉鐘, "Analysis of Walking Intention and Development of Walking-Helper Strategies Based on Hip Information," Master's thesis, National Chiao Tung University, 2013.
[7] Y. H. Hsieh, K. Y. Young, and C. H. Ko, "Effective Maneuver for Passive Robot Walking Helper Based on User Intention," IEEE Transactions on Industrial Electronics, vol. 62, no. 10, pp. 6404-6416, October 2015.
[8] Y. H. Hsieh, Y. C. Huang, K. Y. Young, C. H. Ko, and S. K. Agrawal, "Motion Guidance for a Passive Robot Walking Helper via User's Applied Hand Forces," IEEE Transactions on Human-Machine Systems, vol. 46, no. 6, pp. 869-881, December 2016.
[9] P. Di, J. Huang, S. Nakagawa, K. Sekiyama and T. Fukuda, "Fall Detection and Prevention in the Elderly based on the ZMP Stability Control," IEEE Workshop on Advanced Robotics and its Social Impacts, 2013.
[10] K. Wakita, J. Huang, P. Di, K. Sekiyama, and T. Fukuda, "Human-Walking-Intention-Based Motion Control of an Omnidirectional-Type Cane Robot," IEEE/ASME Transactions on Mechatronics, vol. 18, no. 1, pp. 285-296, February 2013.
[11] S. Nakagawa, Y. Hasegawa, T. Fukuda, I. Kondo, M. Tanimoto, P. Di, J. Huang, and Q. Huang, "Tandem Stance Avoidance Using Adaptive and Asymmetric Admittance Control for Fall Prevention," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 24, no. 5, pp. 542-550, May 2016.
[12] T. D. Modise, N. Steyn, and Y. Hamam, "Linear Progression Measurement and Analysis of Human Gait for the Development of a Multifunctional Robotic Walker," Pattern Recognition Association of South Africa and Robotics and Mechatronics International Conference, 2016.
[13] T. D. Modise, N. Steyn, and Y. Hamam, "Human Feet Tracking in Arranging the Navigation of a Robotic Rollator," IEEE Africon, 2017.
[14] C. A. Cifuentes, C. Rodriguez, A. Frizera-Neto, T. F. Bastos-Filho, and R. Carelli, "Multimodal Human–Robot Interaction for Walker-Assisted Gait," IEEE Systems Journal, vol. 10, no. 3, pp. 933-943, September 2016.
[15] C. D. Lim, C. Y. Cheng, C. M. Wang, Y. Chao, and L. C. Fu, "Depth Image Based Gait Tracking and Analysis via Robotic Walker," IEEE International Conference on Robotics and Automation, 2015.
[16] C. D. Lim, C. M. Wang, C. Y. Cheng, Y. Chao, S. H. Tseng, and L. C. Fu, "Sensory Cues Guided Rehabilitation Robotic Walker Realized by Depth Image-Based Gait Analysis," IEEE Transactions on Automation Science and Engineering, vol. 13, no. 1, pp. 171-180, January 2016.
[17] J. Paulo and P. Peixoto, "Classification of Reaching and Gripping Gestures for Safety on Walking Aids," The 23rd IEEE International Symposium on Robot and Human Interactive Communication, 2014.
[18] I. Caetano, J. Alves, J. Gonçalves, M. Martins, and C. P. Santos, "Development of a biofeedback approach using body tracking with Active Depth sensor in ASBGo smart walker," International Conference on Autonomous Robot Systems and Competitions, 2016.
[19] R. Girshick, J. Donahue, T. Darrell, and J. Malik, "Rich feature hierarchies for accurate object detection and semantic segmentation," IEEE Conference on Computer Vision and Pattern Recognition, 2014.
[20] R. Girshick, "Fast R-CNN," IEEE International Conference on Computer Vision, 2015.
[21] S. Ren, K. He, R. Girshick, and J. Sun, "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 39, no. 6, pp. 1137-1149, June 2017.
[22] J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, "You Only Look Once: Unified, Real-Time Object Detection," IEEE Conference on Computer Vision and Pattern Recognition, 2016.
[23] K. Aspelin, "Establishing Pedestrian Walking Speeds," Portland State University, 2005.
[24] J. Redmon, and A. Farhadi, "YOLO9000: Better, Faster, Stronger," IEEE Conference on Computer Vision and Pattern Recognition, 2017.