National Digital Library of Theses and Dissertations in Taiwan (臺灣博碩士論文加值系統)


Detailed Record

Author: 楊采霓
Author (English): Tsai-Ni Yang
Title: 基於肌電訊號和陀螺儀的深度學習手部識別系統的應用
Title (English): Application of hand recognition system based on electromyography and gyroscope using deep learning
Advisor: 施國琛
Advisor (English): Timothy K. Shih
Degree: Master's
Institution: National Central University
Department: Department of Computer Science and Information Engineering
Discipline: Engineering
Field: Electrical and Computer Engineering
Thesis type: Academic thesis
Year of publication: 2019
Graduation academic year: 107
Language: English
Pages: 83
Keywords (Chinese): 深度學習, 手勢識別, 肌電訊號, 陀螺儀, 手勢
Keywords (English): Deep learning, hand recognition, electromyography, gyroscope, gesture
Metrics:
  • Cited by: 1
  • Views: 385
  • Downloads: 31
  • Bookmarked in reading lists: 0
Abstract (Chinese, translated):
As technology advances, interaction between people and computers has become inseparable from daily life, and human-computer interaction (HCI) technology has made life more convenient. In the past, users typically operated a computer only through a mouse and keyboard. In recent years, to make HCI more intuitive and natural, human hand movements, fingerprints, and voice have been adopted as computer input, and AR/VR has been developed to connect virtual applications with the real world. Gesture-based HCI systems are among the most active research topics: a gesture is an intuitive, easy-to-learn means of interaction that uses the human hand directly as the computer's input device.

In this thesis, we use the wearable Myo armband to develop an HCI gesture recognition system applied to a virtual theater. In traditional stage plays, stage objects are usually manipulated by staff behind the scenes; this system aims to free performances from the constraints of venue, lighting, and occlusion, letting performers manipulate stage objects directly and combining art with technology. The system treats the hand's electromyography (EMG) signals as static gestures and the arm's three-axis motion data as dynamic gestures. Using deep learning, the wearer issues commands to the computer: gesture trajectory data is transmitted immediately over Bluetooth, and the current gesture is classified. For the virtual stage, we built 3D models in Maya and created different scenes with Unity's development kit; the gesture recognition results are transmitted to Unity through a TCP/IP socket, so users can easily control the scenes and objects in Unity and enrich the stage effects.
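The preprocessing implied above, where the continuous EMG and gyroscope stream is segmented into fixed-length windows before being fed to a CNN or RNN classifier, could be sketched as follows. The window length, stride, and channel count here are illustrative assumptions, not the thesis's actual hyperparameters.

```python
import numpy as np

def make_windows(stream, window_size=50, stride=25):
    """Segment a (T, C) multichannel sensor stream into overlapping
    fixed-length windows, the usual step before classifying
    EMG/IMU data with a CNN or recurrent model."""
    windows = []
    for start in range(0, stream.shape[0] - window_size + 1, stride):
        windows.append(stream[start:start + window_size])
    return np.stack(windows)  # shape: (num_windows, window_size, C)

# Assumed layout: 8 EMG channels + 3 gyroscope axes = 11 channels,
# 200 time samples of synthetic data standing in for the real stream.
stream = np.random.randn(200, 11)
X = make_windows(stream)
print(X.shape)  # (7, 50, 11)
```

Each window would then be labeled with a gesture class and used to train the CNN, LSTM, or GRU models listed in Section 3.7.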
Abstract (English):
With the advent of new technologies, the interaction between people and computers has become increasingly inseparable. Human-computer interaction (HCI) technology improves operations in people's lives. In the past, users typically needed a mouse and keyboard to operate a system. In recent years, to make this technology more intuitive and natural, human hand movements, sounds, and fingerprints have been developed as computer input, and AR/VR has been developed to connect virtual applications with the real world. Gesture recognition is a basic operation and one of the most active research topics: a gesture is an intuitive, easy-to-learn interaction method in which the user's hands serve directly as the computer's input device.

In this paper, we developed a human-computer interactive gesture recognition system for virtual theaters using the wearable Myo armband. In traditional drama, staff control the stage objects from behind the scenes. We hope this system can be free from the restrictions of space, lighting, and occlusion, allowing performers to manipulate stage objects directly and providing a new way to combine technology and art. The proposed system uses a deep learning method to classify dynamic gestures and then sends instructions to the virtual theater. For the virtual stage, we use Maya to build 3D models and create different scenes with the development kit provided by Unity, then transmit the recognition results to Unity through a TCP/IP socket. With this system, users can easily control the scenes and objects in the Unity theater and make the virtual stage more enriched during the performance.
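The abstract states that recognition results reach Unity over a TCP/IP socket. A minimal sketch of the sending side might look like the following; the host, port, and newline-delimited JSON message format are hypothetical assumptions, since the thesis record does not specify the wire format.

```python
import json
import socket

def send_gesture(label, host="127.0.0.1", port=5005):
    """Send one recognized gesture label to a listening Unity scene.

    One JSON object per line is an assumed convention: the receiver
    reads up to each newline and parses the message independently.
    """
    msg = json.dumps({"gesture": label}).encode("utf-8") + b"\n"
    with socket.create_connection((host, port)) as conn:
        conn.sendall(msg)
```

On the Unity side, a `TcpListener` reading line-delimited messages could map each label to a scene action, such as moving a stage prop or switching lighting.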
1 Introduction 1
1.1 Background 1
1.2 Motivation 2
1.3 Thesis Organization 5
2 Related work 6
2.1 Virtual Theater 6
2.2 Introduction of sensors 9
2.2.1  Myo Introduction 10
2.2.2  Hand data of Myo armband 12
2.3 Gesture recognition technology 14
2.3.1  IMU and EMG hand recognition 16
2.4 Myo with HCI 22
2.5 Artificial Neural Networks 23
2.5.1  CNN 25
2.5.2  RNN 26
3 System Architecture 28
3.1 System Introduction 28
3.2 Development equipment and setup 28
3.3 Data Acquisition 30
3.4 Target gestures 31
3.5 Data preprocessing 33
3.6 Development Framework 35
3.7 Classification 38
3.7.1  Model classification of CNN 39
3.7.2  Model classification of LSTM 42
3.7.3  Model classification of GRU 44
3.8 Interactive application design 45
3.8.1  Real-time gesture recognition 46
3.8.2  Scene design in virtual theater 48
3.8.3  Application pseudocode 51
4 Experiment Result 53
4.1 Experimental environment 53
4.2 Experimental dataset 55
4.3 Evolution of model architecture 55
4.4 The evaluation criteria of the system performance 59
4.5 Result of drama 63

5 Conclusion and Future Work 65
5.1 Conclusion 65
5.2 Future Work 65
6 Reference 66

[1] H. P. Gupta, H. S. Chudgar, S. Mukherjee, T. Dutta and K. Sharma, "A Continuous Hand Gestures Recognition Technique for Human-Machine Interaction Using Accelerometer and Gyroscope Sensors," in IEEE Sensors Journal, vol. 16, no. 16, pp. 6425-6432, Aug.15, 2016.
[2] S. Jiang, B. Lv, X. Sheng, C. Zhang, H. Wang and P. B. Shull, "Development of a real-time hand gesture recognition wristband based on sEMG and IMU sensing," 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO), Qingdao, 2016, pp. 1256-1261.
[3] J. Geigel, M. Schweppe, D. Huynh and B. Johnstone, "Adapting a Virtual World for Theatrical Performance," in Computer, vol. 44, no. 12, pp. 33-38, Dec. 2011
[4] S. Piman and A. Z. Talib, "Puppet modeling for real-time and interactive virtual shadow puppet play," 2012 Second International Conference on Digital Information and Communication Technology and it's Applications (DICTAP), Bangkok, 2012, pp. 110-114.
[5] J. L. Dorado, P. Figueroa, J. Chardonnet, F. Merienne and J. T. Hernández, "Comparing VR environments for seat selection in an opera theater," 2017 IEEE Symposium on 3D User Interfaces (3DUI), Los Angeles, CA, 2017, pp. 221-222.
[6] M. Schweppe and J. Geigel, "Live Theater on a Virtual Stage: Incorporating Soft Skills and Teamwork in Computer Graphics Education," in IEEE Computer Graphics and Applications, vol. 31, no. 1, pp. 85-89, Jan.-Feb. 2011.
[7] M. Husinsky and F. Bruckner, "Virtual Stage: Interactive Puppeteering in Mixed Reality," 2018 IEEE 1st Workshop on Animation in Virtual and Augmented Environments (ANIVAE), Reutlingen, 2018, pp. 1-7.
[8] Y. Zhang and A. Fangbemi, ""Third-Person" Augmented Reality-Based Interactive Chinese Drama," 2015 International Conference on Culture and Computing (Culture Computing), Kyoto, 2015, pp. 41-46.
[9] M. Maidi and M. Preda, "Interactive media control using natural interaction-based Kinect," 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, Vancouver, BC, 2013, pp. 1812-1815.
[10] Chung-Yan Chih, Yi-Chen Wan, Yu-Chi Hsu and Liang-Gee Chen, "Interactive sticker system with Intel RealSense," 2017 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, 2017, pp. 174-175.
[11] A. B. Jani, N. A. Kotak and A. K. Roy, "Sensor Based Hand Gesture Recognition System for English Alphabets Used in Sign Language of Deaf-Mute People," 2018 IEEE SENSORS, New Delhi, 2018, pp. 1-4.
[12] S. Shin, D. Kim and Y. Seo, "Controlling Mobile Robot Using IMU and EMG Sensor-Based Gesture Recognition," 2014 Ninth International Conference on Broadband and Wireless Computing, Communication and Applications, Guangdong, 2014, pp. 554-557.
[13] A. Samadani, "EMG Channel Selection for Improved Hand Gesture Classification," 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, 2018, pp. 4297-4300.
[14] X. Dang, W. Wang, K. Wang, M. Dong and L. Yin, "A user-independent sensor gesture interface for embedded device," SENSORS, 2011 IEEE, Limerick, 2011, pp. 1465-1468.
[15] M. Tseng, K. Liu, C. Hsieh, S. J. Hsu and C. Chan, "Gesture spotting algorithm for door opening using single wearable sensor," 2018 IEEE International Conference on Applied System Invention (ICASI), Chiba, 2018, pp. 854-856.
[17] H. J. Kim, Y. S. Lee and D. Kim, "Arm Motion Estimation Algorithm Using MYO Armband," 2017 First IEEE International Conference on Robotic Computing (IRC), Taichung, 2017, pp. 376-381.
[18] A. Rahagiyanto, A. Basuki, R. Sigit, A. Anwar and M. Zikky, "Hand Gesture Classification for Sign Language Using Artificial Neural Network," 2017 21st International Computer Science and Engineering Conference (ICSEC), Bangkok, 2017, pp. 1-5.
[19] S. P. Y. Jane and S. Sasidhar, "Sign Language Interpreter: Classification of Forearm EMG and IMU Signals for Signing Exact English *," 2018 IEEE 14th International Conference on Control and Automation (ICCA), Anchorage, AK, 2018, pp. 947-952.
[20] Y. Wang and H. Ma, "Real-Time Continuous Gesture Recognition with Wireless Wearable IMU Sensors," 2018 IEEE 20th International Conference on e-Health Networking, Applications and Services (Healthcom), Ostrava, 2018, pp. 1-6.
[21] B. Wan, R. Wu, K. Zhang and L. Liu, "A new subtle hand gestures recognition algorithm based on EMG and FSR," 2017 IEEE 21st International Conference on Computer Supported Cooperative Work in Design (CSCWD), Wellington, 2017, pp. 127-132
[22] I. Mendez et al., "Evaluation of the Myo armband for the classification of hand motions," 2017 International Conference on Rehabilitation Robotics (ICORR), London, 2017, pp. 1211-1214.
[23] E. H. El-Shazly, M. M. Abdelwahab, A. Shimada and R. Taniguchi, "Real time algorithm for efficient HCI employing features obtained from MYO sensor," 2016 IEEE 59th International Midwest Symposium on Circuits and Systems (MWSCAS), Abu Dhabi, 2016, pp. 1-4.
[24] S. He, C. Yang, M. Wang, L. Cheng and Z. Hu, "Hand gesture recognition using MYO armband," 2017 Chinese Automation Congress (CAC), Jinan, 2017, pp. 4850-4855.
[25] Z. Arief, I. A. Sulistijono and R. A. Ardiansyah, "Comparison of five time series EMG features extractions using Myo Armband," 2015 International Electronics Symposium (IES), Surabaya, 2015, pp. 11-14.
[26] F. S. Sayin, S. Ozen and U. Baspinar, "Hand Gesture Recognition by Using sEMG Signals for Human Machine Interaction Applications," 2018 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA), Poznan, 2018, pp. 27-30.
[27] W. Wei, Q. Dai, Y. Wong, Y. Hu, M. Kankanhalli and W. Geng, "Surface Electromyography-based Gesture Recognition by Multi-view Deep Learning," in IEEE Transactions on Biomedical Engineering.
[28] W. Tao, Z.-H. Lai, M. C. Leu and Z. Yin, "Worker Activity Recognition in Smart Manufacturing Using IMU and sEMG Signals with Convolutional Neural Networks," Procedia Manufacturing, vol. 26, pp. 1159-1166, 2018.
[29] F. J. Ordóñez and D. Roggen, "Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition," Sensors, vol. 16, no. 1, p. 115, 2016.
[30] D. C. Kavarthapu and K. Mitra, "Hand Gesture Sequence Recognition Using Inertial Motion Units (IMUs)," 2017 4th IAPR Asian Conference on Pattern Recognition (ACPR), Nanjing, 2017, pp. 953-957
[31] S. Xu and Y. Xue, "A Long Term Memory Recognition Framework on Multi-Complexity Motion Gestures," 2017 14th IAPR International Conference on Document Analysis and Recognition (ICDAR), Kyoto, 2017, pp. 201-205.
[32] G. Lefebvre, S. Berlemont, F. Mamalet and C. Garcia, "Inertial Gesture Recognition with BLSTM-RNN," in Artificial Neural Networks, Springer, 2015.
[33] N. P. Brillantes, H. Kim, R. Feria, M. R. Solamo and L. L. Figueroa, "Evaluation of a 3D physics classroom with Myo gesture control armband and unity," 2017 8th International Conference on Information, Intelligence, Systems & Applications (IISA), Larnaca, 2017, pp. 1-6.
[34] A. A. Hidayat, Z. Arief and H. Yuniarti, "LOVETT scalling with MYO armband for monitoring finger muscles therapy of post-stroke people," 2016 International Electronics Symposium (IES), Denpasar, 2016, pp. 66-70.
[35] S. R. Kurniawan and D. Pamungkas, "MYO Armband sensors and Neural Network Algorithm for Controlling Hand Robot," 2018 International Conference on Applied Engineering (ICAE), Batam, 2018, pp. 1-6
[36] Y. Lecun, L. Bottou, Y. Bengio and P. Haffner, "Gradient-based learning applied to document recognition," in Proceedings of the IEEE, vol. 86, no. 11, pp. 2278-2324, Nov. 1998.