Author: 郭禹銘
Author (romanized): KUO, Yu-Ming
Title (Chinese): 使用膚色偵測和直方圖分析之手勢辨識
Title (English): Gesture Recognition Using Skin Detection and Histogram Techniques
Advisor: 陳文淵
Advisor (romanized): Chen, Wen-Yuan
Degree: Master's
Institution: National Chin-Yi University of Technology (國立勤益科技大學)
Department: Department of Electronic Engineering
Discipline: Engineering
Field: Electrical and Computer Engineering
Document type: Academic thesis
Year of publication: 2013
Graduation academic year: 101 (ROC calendar)
Language: Chinese
Pages: 94
Keywords (Chinese): 膚色偵測、直方圖分析、背景相減、邊緣偵測
Keywords (English): color detection, histogram analysis, background subtraction, edge detection
Usage statistics:
  • Cited by: 3
  • Views: 267
  • Rating: (none)
  • Downloads: 41
  • Bookmarked: 0
Abstract (Chinese original, translated):
With advances in technology, the number of interfaces through which users communicate with computers keeps growing; input devices such as microphones, touch screens, and cameras are simpler and more intuitive to use than traditional methods. With the popularity of the Kinect and the XBOX 360, users no longer need to hold a controller: standing in front of the Kinect and performing a specified action is enough to interact with a game, giving an immersive experience. This thesis proposes a gesture recognition method that combines skin color detection with histogram analysis, to serve as a gesture input control for games or machinery.
The system consists of two parts: palm extraction and finger angle calculation. For palm extraction, an object image and a background image are first input, and each is converted from the RGB color space to the YCbCr color space, which is well suited to skin color detection. The hand angle is then computed; if it exceeds a threshold, the hand is rotated upright. Finally, histogram analysis locates the boundary between the arm and the palm, and the palm region is extracted. For finger angle calculation, the Sobel operator is first applied to obtain the edges of the palm image, and histogram analysis of the edge image locates the midpoint of the wrist. Histogram features identify the fingertips and finger valleys; the finger regions are removed according to the valley positions, leaving only the palm object. The centroid of the palm object is then computed, and the line through the wrist midpoint and the palm centroid is taken as the 0-degree baseline for measuring finger angles. The finger valleys give the point where each finger joins the palm; if the distance between a fingertip and its connection point exceeds a threshold, the corresponding finger is extended. The baseline and the connection point are then used to compute the finger angle. Experiments on the left and right hands of 10 people, with the fingers in a variety of configurations, all yielded correct finger recognition results.
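The palm extraction stage described above rests on YCbCr skin color detection combined with background subtraction. The following is a minimal sketch of that front end, assuming the common ITU-R BT.601 conversion matrix; the Cb/Cr skin ranges, the difference threshold, and all function and parameter names are illustrative assumptions, not values taken from the thesis.

```python
import numpy as np

def rgb_to_ycbcr(img):
    """Convert an RGB image (H x W x 3, uint8) to YCbCr using the ITU-R BT.601 matrix."""
    img = img.astype(np.float32)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(img, cb_range=(77, 127), cr_range=(133, 173)):
    """Binary skin mask from Cb/Cr thresholds (hypothetical ranges commonly cited in the literature)."""
    ycbcr = rgb_to_ycbcr(img)
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))

def foreground_mask(obj_img, bg_img, diff_threshold=30):
    """Background subtraction: pixels whose summed RGB difference exceeds the threshold are foreground."""
    diff = np.abs(obj_img.astype(np.int16) - bg_img.astype(np.int16)).sum(axis=-1)
    return diff > diff_threshold

# Hand region = skin-colored pixels that also differ from the background:
# hand = skin_mask(obj_img) & foreground_mask(obj_img, bg_img)
```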

Abstract (English original):
With advances in technology, interfaces for communicating with computers have multiplied: input devices such as microphones, touch screens, and cameras are simpler and more intuitive to use than traditional methods. With the prevalence of the Kinect and the XBOX 360, users no longer need a hand-held joystick; they simply stand in front of the Kinect and perform specified actions to interact with a game, producing an immersive experience. This paper proposes a gesture recognition method that integrates skin color detection and histogram analysis as a gesture input control for games or machinery.
The system is divided into two parts: palm extraction and finger angle calculation. For palm extraction, an object image and a background image are first input and converted from the RGB color space to the YCbCr color space, which is better suited to skin color detection. The hand angle is then computed; if it exceeds a threshold, the hand is rotated upright. Finally, histogram analysis locates the boundary between the arm and the palm so that the palm can be extracted. For finger angle calculation, the Sobel operator first extracts the edges of the palm image, and histogram analysis of the edge image locates the midpoint of the wrist. Histogram features then identify the fingertips and finger valleys; the finger regions are removed according to the valley positions, leaving only the palm object. The centroid of the palm object is computed, and the line from the wrist midpoint to the palm centroid serves as the 0-degree baseline for finger angles. The finger valleys give the connection point between each finger and the palm; if the distance between a fingertip and its connection point exceeds a threshold, that finger is extended. The baseline and the connection point are then used to compute the finger angle. Experiments on the left and right hands of 10 people, with the fingers in various states, all yielded correct finger recognition results.
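As an illustration of the final step, the sketch below computes a single finger's angle from the wrist midpoint, palm centroid, and finger-base connection point described above. It is only a plausible reading of that description: the point-extraction steps are omitted, and the names (wrist_mid, palm_center, connection_pt, fingertip) and the length threshold are assumptions, not values from the thesis.

```python
import numpy as np

def finger_angle(wrist_mid, palm_center, connection_pt, fingertip, length_threshold=20.0):
    """
    Angle of one finger relative to the 0-degree baseline (wrist midpoint -> palm centroid).
    All points are (x, y) pixel coordinates; threshold and point extraction are assumed.
    Returns None if the finger is judged to be folded.
    """
    wrist_mid = np.asarray(wrist_mid, dtype=float)
    palm_center = np.asarray(palm_center, dtype=float)
    connection_pt = np.asarray(connection_pt, dtype=float)
    fingertip = np.asarray(fingertip, dtype=float)

    # A finger counts as extended only if the tip is far enough from its base point.
    if np.linalg.norm(fingertip - connection_pt) <= length_threshold:
        return None

    baseline = palm_center - wrist_mid       # 0-degree reference direction
    finger_dir = connection_pt - wrist_mid   # direction toward the finger's base

    # Signed angle between the two directions, in degrees, wrapped to [-180, 180).
    ang = np.degrees(np.arctan2(finger_dir[1], finger_dir[0]) -
                     np.arctan2(baseline[1], baseline[0]))
    return (ang + 180.0) % 360.0 - 180.0
```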

Table of Contents:
Acknowledgments
Chinese Abstract
Abstract
Table of Contents
List of Figures
List of Tables
Chapter 1: Introduction
1.1 Research Background
1.2 Motivation and Objectives
1.3 Literature Review
1.4 Chapter Overview
Chapter 2: Related Principles
2.1 Color Spaces
2.2 Image Binarization
2.3 Neighborhood Processing
2.3.1 Low-Pass Filter
2.3.2 High-Pass Filter
2.3.3 Median Filter
2.4 Morphology
2.5 Topology
Chapter 3: Algorithm
3.1 Palm Recognition Workflow
3.2 Palm Extraction Algorithm
3.2.1 Color Space Conversion
3.2.2 Skin Color Detection
3.2.3 Background Subtraction
3.2.4 Hand Image Extraction
3.2.5 Image Edge Detection
3.2.6 Hand Angle Calculation
3.2.7 Hand Image Rotation to Upright
3.2.8 Palm Image Extraction
3.3 Finger Angle Analysis Algorithm
3.3.1 Edge-Distance Histogram Analysis
3.3.2 Locating Finger Valleys and Fingertips
3.3.3 Drawing the Angle Baseline
3.3.4 Finger Angle Calculation
3.3.5 Average Angle Ranges
3.3.6 Combining Gestures
Chapter 4: Experimental Results
4.1 Experimental Environment
4.2 Experimental Results
4.3 Comparison with Other Methods
Chapter 5: Conclusions and Future Work
5.1 Conclusions
5.2 Future Work
References
About the Author
