
臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)


Detailed Record

Author: 張晉豪
Author (English): Chang, Jin-Hau
Title: 基於皮膚顏色及手指角度之手勢辨識
Title (English): Skin-Color and Finger-Angle Based Hand-Gesture Recognition
Advisor: 吳俊德
Advisor (English): Wu, Gin-Der
Committee members: 洪志偉, 莊家峰
Committee members (English): Hung, Jeih-Weih; Juang, Chia-Feng
Oral defense date: 2020-06-30
Degree: Master
Institution: 國立暨南國際大學 (National Chi Nan University)
Department: 電機工程學系 (Department of Electrical Engineering)
Discipline: Engineering
Field: Electrical and Computer Engineering
Thesis type: Academic thesis
Year of publication: 2020
Graduation academic year: 108 (ROC calendar, 2019-2020)
Language: English
Pages: 50
Keywords (Chinese): 人機互動, 手勢辨識, 數位影像處理
Keywords (English): human-computer interaction, gesture recognition, digital image processing
DOI: 10.6837/ncnu202000119
Usage statistics:
  • Cited by: 1
  • Views: 165
  • Rating:
  • Downloads: 0
  • Bookmarked: 0

Human-computer interaction remains an important technology today, and gesture recognition is arguably its most basic form. Most existing work identifies palms and fingers in broadly similar ways. Here, we introduce some new methods for the finger-detection stage, while relying on existing techniques for skin-color discrimination and some of the image processing.
From each captured RGB image, we first convert to the NCC color space and extract the skin color, setting skin pixels to white and everything else to black. We then apply digital image processing to remove noise and obtain the correct skin-color region. Next, we locate the palm position, which later serves as a reference point for the angle calculations. After that, we search for points on the fingers and eliminate erroneous ones. Finally, we compute the angles between the fingers to recognize hand-sign symbols.
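To make the pipeline concrete, here is a minimal Python/OpenCV sketch of the early stages as the abstract describes them: NCC conversion, skin-color thresholding to a black-and-white mask, morphological opening and closing for noise removal, and a rough projection-based palm estimate. The threshold ranges, the 5×5 kernel, the input filename hand.png, and the argmax-of-projection palm estimate are illustrative assumptions, not the thesis's exact parameters (the thesis follows Soriano's skin-locus rule [1] and the Chapter 4 procedure).

```python
import cv2
import numpy as np

# Illustrative skin bounds in NCC (r, g) chromaticity space -- placeholders,
# not the thesis's exact rule (the thesis uses Soriano's skin locus [1]).
SKIN_R_RANGE = (0.36, 0.47)
SKIN_G_RANGE = (0.26, 0.35)

def skin_mask(bgr):
    """Binary skin mask: white (255) for skin pixels, black (0) elsewhere."""
    img = bgr.astype(np.float64)
    b, g, r = img[..., 0], img[..., 1], img[..., 2]
    total = r + g + b + 1e-6                 # avoid division by zero
    rn, gn = r / total, g / total            # NCC coordinates r, g
    mask = ((SKIN_R_RANGE[0] < rn) & (rn < SKIN_R_RANGE[1]) &
            (SKIN_G_RANGE[0] < gn) & (gn < SKIN_G_RANGE[1]))
    return mask.astype(np.uint8) * 255

def clean_mask(mask, ksize=5):
    """Remove noise: opening erases small speckles, closing fills small holes."""
    kernel = np.ones((ksize, ksize), np.uint8)
    opened = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    return cv2.morphologyEx(opened, cv2.MORPH_CLOSE, kernel)

frame = cv2.imread("hand.png")               # hypothetical input image
binary = clean_mask(skin_mask(frame))

# Rough palm estimate via projections (cf. Chapter 4): the row and column
# sums of the binary mask peak around the palm region.
rows = binary.sum(axis=1)                    # horizontal projection
cols = binary.sum(axis=0)                    # vertical projection
palm_y, palm_x = int(np.argmax(rows)), int(np.argmax(cols))
```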
Using the above method, we can accurately locate finger positions and recognize the posture of the hand. This research helps computers understand human gestures, provides a basis for human-computer interaction, and could be applied to robot control in the future.
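The finger-angle step (Fig. 6-2 and Fig. 7-1) rests on the inner-product identity cos θ = (MA · MB)/(|MA||MB|), where M is the vertex near the palm and A, B are points on two fingers. Below is a small self-contained sketch of that calculation with hypothetical example coordinates; the per-digit decision rules of Fig. 7-3 are not reproduced here.

```python
import numpy as np

def angle_at_vertex(a, m, b):
    """Angle in degrees at vertex M between rays M->A and M->B,
    computed via the inner product: cos(theta) = (MA . MB) / (|MA| |MB|)."""
    ma = np.asarray(a, dtype=float) - np.asarray(m, dtype=float)
    mb = np.asarray(b, dtype=float) - np.asarray(m, dtype=float)
    cos_t = np.dot(ma, mb) / (np.linalg.norm(ma) * np.linalg.norm(mb))
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

# Hypothetical example: finger points at (0, 10) and (10, 0), palm vertex at (0, 0)
print(angle_at_vertex((0, 10), (0, 0), (10, 0)))   # prints 90.0
```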

Table of Contents
Acknowledgements i
摘要 (Chinese Abstract) ii
Abstract iii
Table of Contents iv
List of Figures vi
Chapter 1 Introduction 1
1.1 Motivation 2
1.2 Purpose 3
1.3 Construction of The Thesis 4
Chapter 2 Skin-color Detection 5
2.1 NCC Color Space 6
2.2 Rules of Detection 7
Chapter 3 Digital Image Processing 9
3.1 Erosion and Dilation 10
3.2 Opening and Closing 12
3.3 Processing and Result 14
Chapter 4 Wrist and Palm Positions 15
4.1 Why and How We Find These Positions 16
4.2 Positions in Horizontal Projection 17
4.3 Maximum in Vertical Projection and Wrist Position in Vertical Projection 18
4.4 Palm Position 21
Chapter 5 Edging Processing 22
Chapter 6 Fingers Recognition 24
6.1 Find Points on Fingers 25
6.2 Reducing Points 26
6.3 Excluding too Small Angle Points 27
6.4 Reducing the Points Still on The Same Fingers 28
6.5 Excluding Error Points 29
6.6 Results 30
Chapter 7 Number Comparison 31
7.1 Finger-Angle Calculation 32
7.2 Comparison 34
Chapter 8 Experiment and Result 36
Chapter 9 Conclusion and Future Work 46
9.1 Conclusion 47
9.2 Future Work 48
References 49

List of Figures
Fig.2-1. The skin color block proposed by Soriano 7
Fig.3-1. Erosion (A⊝B) 10
Fig.3-2. Dilation (A⊕B) 11
Fig.3-3. Opening operation 12
Fig.3-4. Closing operation 12
Fig.3-5. Opening and then Closing 14
Fig.4-1. Horizontal mapping of hand 17
Fig.4-2. Maximum value of the horizontal map (max_YL,max_XL) 18
Fig.4-3. Location of wrist (wrist_X,wrist_Y) 19
Fig.5-1. Edging Processing 22
Fig.5-2. Edging Processing (left: before; right: after) 23
Fig.6-1. Fingers recognition (the rightmost image is the experimental result) 25
Fig.6-2. Angle calculation (A and B are two points on the fingers, M is the vertex) 27
Fig.6-3. Three points obtained after quartering (p1, p2, p3) 28
Fig.6-4. Experimental results (0 to 9) 30
Fig.7-1. Inner product operation 32
Fig.7-2. Calculations for each angle 33
Fig.7-3. Judge each gesture 35
Fig.8-1. Hand-gesture recognition experiment results (hand-gesture 0) 36
Fig.8-2. Hand-gesture recognition experiment results (hand-gesture 1) 37
Fig.8-3. Hand-gesture recognition experiment results (hand-gesture 2) 38
Fig.8-4. Hand-gesture recognition experiment results (hand-gesture 3) 39
Fig.8-5. Hand-gesture recognition experiment results (hand-gesture 4) 40
Fig.8-6. Hand-gesture recognition experiment results (hand-gesture 5) 41
Fig.8-7. Hand-gesture recognition experiment results (hand-gesture 6) 42
Fig.8-8. Hand-gesture recognition experiment results (hand-gesture 7) 43
Fig.8-9. Hand-gesture recognition experiment results (hand-gesture 8) 44
Fig.8-10. Hand-gesture recognition experiment results (hand-gesture 9) 45
Fig.9-1. Hand-gesture recognition flow chart 46


[1] M. Soriano, S. Huovinen, B. Martinkauppi and M. Laaksonen, “Using the Skin Locus to Cope with Changing Illumination Conditions in Color-Based Face Tracking,” in Proc. of IEEE Nordic Signal Processing Symposium, 2000, pp. 383-386.
[2] D. Chai and K. Ngan, “Face segmentation using skin-color map in videophone applications,” in IEEE Trans. Circuits and Systems for Video Technology, vol. 9, no. 4, pp. 551-564, 1999.
[3] E. Marszalec, B. Martinkauppi, M. Soriano, and M. Pietikäinen, “Physics-based face database for color research,” in J. Electronic Imaging, vol. 9, pp. 32-38, 2000.
[4] Z. Lu, X. Chen, Q. Li, X. Zhang and P. Zhou, “A Hand Gesture Recognition Framework and Wearable Gesture-Based Interaction Prototype for Mobile Device,” in IEEE Transactions on Human-Machine Systems, vol. 44, no. 2, April 2014.
[5] M. Elmezain, A. Al-Hamadi, and B. Michaelis, “A robust method for hand gesture segmentation and recognition using forward spotting scheme in conditional random fields,” in Proceedings of the 20th International Conference on Pattern Recognition (ICPR ’10), pp. 3850-3853, August 2010.
[6] M. R. Malgireddy, J. J. Corso, S. Setlur, V. Govindaraju, and D. Mandalapu, “A framework for hand gesture recognition and spotting using sub-gesture modeling,” in Proceedings of the 20th International Conference on Pattern Recognition (ICPR ’10), pp. 3780-3783, August 2010.
[7] P. Suryanarayan, A. Subramanian, and D. Mandalapu, “Dynamic hand pose recognition using depth data,” in Proceedings of the 20th International Conference on Pattern Recognition (ICPR ’10), pp. 3105-3108, August 2010.
[8] J. L. Raheja, A. Chaudhary, and K. Singal, “Tracking of fingertips and centers of palm using KINECT,” in Proceedings of the 2nd International Conference on Computational Intelligence, Modelling and Simulation (CIMSim ’11), pp. 248-252, September 2011.
[9] J. Choi, H. Park, and J.-I. Park, “Hand shape recognition using distance transform and shape decomposition,” in Proceedings of the 18th IEEE International Conference on Image Processing (ICIP ’11), pp. 3605-3608, September 2011.
[10] J. Zeng, Y. Sun, and F. Wang, “A natural hand gesture system for intelligent human-computer interaction and medical assistance,” in Proceedings of the 3rd Global Congress on Intelligent Systems (GCIS ’12), pp. 382-385, November 2012.
[11] J. Suarez and R. R. Murphy, “Hand gesture recognition with depth images: A review,” in IEEE International Symposium on Robot and Human Interactive Communication, pp. 9-13, 2012.

Electronic full text (internet release date: 2025-07-14)