
Taiwan National Digital Library of Theses and Dissertations (臺灣博碩士論文加值系統)


Detailed Record

Author: 劉冠鷹 (Guan-Ying Liou)
Title: 虛擬打字之研究 (Study on Virtual Typing)
Advisor: 蘇順豐 (Shun-Feng Su)
Oral defense committee: 蘇順豐 (Shun-Feng Su)
Oral defense date: 2014-07-04
Degree: Master's
Institution: 國立臺灣科技大學 (National Taiwan University of Science and Technology)
Department: Department of Electrical Engineering (電機工程系)
Discipline: Engineering
Field: Electrical and Computer Engineering
Thesis type: Academic thesis
Year of publication: 2014
Graduation academic year: 102 (ROC calendar)
Language: English
Pages: 68
Keywords (Chinese, translated): Kinect sensor; human-computer interaction; object labeling; fingertip detection; hand segmentation; virtual typing
Keywords (English): Kinect; HCI; CCL; fingertip detection; hand segmentation; virtual typing
Usage statistics:
  • Cited: 0
  • Views: 201
  • Score:
  • Downloads: 4
  • Bookmarked: 0
Chinese Abstract (translated): This thesis proposes two finger-related features for detecting typing motions with a 3D Kinect sensor. The first feature is the relative position of the palm and the fingertips; the second involves certain properties of the first knuckle of each finger, which can be used to distinguish a resting finger from a typing one. Applying both features, the proposed system detects human typing motions; the system comprises hand segmentation, fingertip detection, and typing-motion detection. For hand segmentation, the 3D image provided by the Kinect is first used to detect the hand, and a depth threshold then extracts the hand-region image. Fingertip detection uses a method similar to converting the hand image into polar coordinates, together with object labeling and some geometric calculations. Finally, the proposed features are applied to detect finger typing. The thesis also presents three experiments: the first two demonstrate the accuracy and effectiveness of fingertip detection and typing detection, respectively, and in the last, users familiar with the system perform a real typing task, in order to analyze whether the proposed method could replace a physical keyboard in real environments. The thesis concludes that the proposed features are useful for detecting typing motions.
Abstract: This study presents two novel features for detecting human finger typing in real time with a Kinect 3D sensor. The first feature concerns the relative positions of the palm and the fingertips; the second concerns a common property of the third knuckles, which is used to distinguish resting from typing motion. Combining these two features, the proposed system detects human typing motions well. The system comprises hand segmentation, fingertip detection, and the proposed finger-typing features. Using the 3D image from the Kinect, it detects whether a hand is present and extracts one or both hands with a depth threshold. A fingertip-detection step similar to the polar hand-image method is then applied, involving connected-component labeling (CCL) and some geometric calculations, after which the proposed features are used to detect typing. To evaluate the implemented system, this study presents three experiments. The first two are designed to show that fingertip detection and the typing method work well. Finally, a real typing task performed by trained users is presented in the last experiment, with analyses suggesting that the proposed features are useful for detecting human typing motions.
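The pipeline the abstract describes (depth-threshold hand segmentation, connected-component labeling, centroid calculation, and a palm-to-fingertip typing feature) can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the depth band, the 4-connected BFS labeling, and the distance-drop typing criterion are all illustrative assumptions.

```python
import numpy as np
from collections import deque

def segment_hand(depth_mm, near=400, far=700):
    """Keep pixels inside an assumed depth band (mm); returns a boolean mask."""
    return (depth_mm > near) & (depth_mm < far)

def label_components(mask):
    """Connected-component labeling (CCL) via 4-connected BFS flood fill.
    Returns an integer label image and the number of components found."""
    h, w = mask.shape
    labels = np.zeros((h, w), dtype=int)
    count = 0
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and labels[sy, sx] == 0:
                count += 1
                labels[sy, sx] = count
                queue = deque([(sy, sx)])
                while queue:
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = count
                            queue.append((ny, nx))
    return labels, count

def centroid(labels, k):
    """Centroid (x, y) of component k, e.g. the palm region."""
    ys, xs = np.nonzero(labels == k)
    return float(xs.mean()), float(ys.mean())

def typing_feature(fingertip, palm, rest_dist, drop=15.0):
    """Hypothetical typing test: a fingertip counts as 'typing' when its distance
    to the palm centroid falls noticeably below its resting distance."""
    d = np.hypot(fingertip[0] - palm[0], fingertip[1] - palm[1])
    return (rest_dist - d) > drop
```

A real system would feed `segment_hand` live Kinect depth frames and track per-finger resting distances over time; here the pieces are shown independently so each stage of the described pipeline is visible.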
Chinese Abstract I
Abstract II
Contents III
Figure list IV
Table list VI
Chapter 1 Introduction 1
Chapter 2 Related Work 3
Chapter 3 System Description 9
3.1 Hardware and Software Development Interface 10
3.2 Preprocessing 11
3.2.1 Hand Segmentation 11
3.2.2 Connected-Component Labeling 16
3.2.3 Centroid Calculation 19
3.3 Fingertip Detection 21
3.3.1 Average Length 21
3.3.2 Palms and Wrists Cutting 23
3.3.3 CCL 25
3.3.4 Locate Fingertips 26
3.3.5 Fingertips Tracking 27
3.4 Virtual Typing Method 31
3.5 User Interface 36
3.5.1 Text Information Window 36
3.5.2 Color Information Window 37
Chapter 4 Experimental Results 39
4.1 Experiment of Fingertip detection 41
4.2 Experiment of Typing Method 47
4.3 Experiment of Typing Tasks 57
Chapter 5 Conclusions 63
Reference 64