Taiwan National Digital Library of Theses and Dissertations (臺灣博碩士論文加值系統)

Detailed Record

Author: CHIANG, SHAO-TING (蔣劭廷)
Title: A Study of Eye Gaze Estimation Based on the Infrared Camera (基於紅外線之人眼注視點估測研究)
Advisor: LEE, DONG-LIANG (李棟良)
Committee members: CHEN, CHING-YI (陳慶逸); LUOH, LEH (駱樂)
Oral defense date: 2017-07-15
Degree: Master's
Institution: Ming Chuan University (銘傳大學)
Department: Master's Program, Department of Computer and Communication Engineering
Discipline: Engineering
Field: Electrical Engineering and Computer Science
Document type: Academic thesis
Year of publication: 2017
Graduation academic year: 105 (2016-2017)
Language: Chinese
Pages: 76
Keywords (Chinese): 注視點估測、紅外線、眼球追蹤、投影轉換
Keywords (English): Gaze estimation; Infrared; Eye tracking; Projective transformation
Usage statistics:
  • Cited by: 0
  • Views: 219
  • Downloads: 10
  • Bookmarked: 0
Abstract

Eye control technology has become increasingly popular in recent years. This thesis proposes a low-cost eye control system that achieves accurate pupil detection and gaze-point estimation with inexpensive experimental equipment, using Matlab as the development platform. The system employs two low-cost webcams. One is mounted on a headset worn by the user, with two infrared LEDs installed on its left and right sides; the other is the built-in webcam above a notebook computer's screen. The head-mounted webcam captures images of the moving eye, and the Hough transform is applied to these images to locate the pupil center. The computer screen and the eye image are each divided into four quadrants, each quadrant pair is mapped by its own projective geometric transformation, and the pupil's corresponding screen coordinates are computed from that mapping. The webcam above the notebook screen captures the user's face; a deformable shape model is applied to this image to detect 66 facial landmark points, which are used to estimate the rotation and offset of the head. A neural network learns the mapping between head movements and the resulting pupil-center offsets, and compensates for them in real time. Experimental results show that, compared with conventional methods, the proposed method noticeably improves the accuracy of eye-controlled mouse operation.
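The quadrant-wise projective mapping described above can be sketched in code. The thesis implements its system in Matlab; the following is an illustrative Python/NumPy sketch (function names are my own), estimating a 3x3 projective transformation (homography) from the four corner correspondences of one quadrant and using it to map a detected pupil center to screen coordinates:

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 projective transformation mapping four source
    points to four destination points, by solving the standard 8x8
    linear system with the bottom-right entry fixed to 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)

def map_point(H, p):
    """Apply homography H to a 2-D point p, with homogeneous division."""
    x, y, w = H @ np.array([p[0], p[1], 1.0])
    return (x / w, y / w)

# Hypothetical calibration for one quadrant: pupil-image corner
# coordinates (from the 9-point calibration) mapped to the matching
# screen-quadrant corners of a 1920x1080 display.
H = homography_from_points(
    [(0, 0), (10, 0), (10, 10), (0, 10)],       # pupil-space corners
    [(0, 0), (960, 0), (960, 540), (0, 540)])   # screen-space corners
gaze = map_point(H, (5, 5))  # pupil center -> estimated gaze point
```

In the full system one such matrix would be computed per quadrant, and the quadrant of the current pupil center selects which matrix to apply.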
Abstract (Chinese) ii
Abstract (English) iii
Acknowledgments iv
List of Figures vii
List of Tables ix
Chapter 1  Introduction 1
  1.1  Research Motivation and Objectives 1
  1.2  Literature Review 2
  1.3  Thesis Organization 3
Chapter 2  Pupil-Tracking Systems and Related Techniques 4
  2.1  Related Pupil-Tracking Systems 4
    2.1.1  Search Coil Method 4
    2.1.2  Image Processing Method 5
  2.2  Pupil-Center Coordinate Transformation Methods 5
    2.2.1  Conformal Coordinate Transformation 6
    2.2.2  Affine Coordinate Transformation 7
    2.2.3  Projective Coordinate Transformation 8
    2.2.4  Linear Grid-Point Mapping Method [10], [11] 9
    2.2.5  Cross-Ratio Transformation Method [33] 11
    2.2.6  Affine Transformation Method [13] 13
  2.3  Head Movement Correction 14
    2.3.1  RGB-Value Color-Block Matching Method [11] 14
    2.3.2  Proportional Offset Method [13] 17
    2.3.3  Head Movement Detection 18
Chapter 3  Research Method 21
  3.1  Preprocessing 24
    3.1.1  Grayscale Conversion 27
    3.1.2  Binarization 28
    3.1.3  Morphological Operations 29
    3.1.4  Labeling Algorithm 30
    3.1.5  Hough Transform Algorithm 30
  3.2  Training Phase 31
    3.2.1  Nine-Point Coordinate Calibration 33
    3.2.2  Dividing the Pupil and Screen into Four Quadrants 33
    3.2.3  Obtaining the Four Projective Transformation Matrices 34
    3.2.4  Collecting Training Data 36
    3.2.5  Neural Network Training 38
  3.3  Testing Phase 41
    3.3.1  Correcting the Pupil-Center Offset Caused by Head Movement 42
    3.3.2  Quadrant Determination 42
    3.3.3  Projective Transformation 43
Chapter 4  Experimental Results and Discussion 45
  4.1  Fixed-Head Experiments 45
    4.1.1  Eye-Control Device Stability Test 46
    4.1.2  Eye-Control Device Accuracy Test 50
  4.2  Head-Movement Experiments 53
    4.2.1  Neural Network Training and Testing 53
    4.2.2  Correction of the Pupil-Center Position 55
    4.2.3  Head Offset Comparison 57
    4.2.4  Practical Test Results 59
Chapter 5  Conclusions and Future Work 63
References 64
[1] S. Eivazi, R. Bednarik, V. Leinonen, M. v. u. z. Fraunberg, and J. E. Jääskeläinen, "Embedding an Eye Tracker Into a Surgical Microscope: Requirements, Design, and Implementation", IEEE Sensors Journal, vol. 16, 2016, pp. 2070-2077.
[2] H. Cecotti, "A Multimodal Gaze-Controlled Virtual Keyboard", IEEE Trans. Human-Mach. Syst., vol. 46, 2016, pp. 601-606.
[3] T. E. Hutchinson, K. P. White, Jr., W. N. Martin, K. C. Reichert, and L. A. Frey, "Human-Computer Interaction Using Eye-Gaze Input", IEEE Transactions on Systems, Man and Cybernetics, vol. 19, 1989, pp. 1527-1534.
[4] Q. Ji, Z. Zhu, and P. Lan, "Real-Time Nonintrusive Monitoring and Prediction of Driver Fatigue", IEEE Transactions on Vehicular Technology, vol. 53, 2004, pp. 1052-1068.
[5] C. S. Lin, H. T. Chen, T. G. Lin, M. S. Yeh, and C. L. Tien, "Development and Application of An Infrared Eye-mouse Control System", Journal of Medical and Biological Engineering, 2005, pp. 15-19.
[6] A. I. Adiba, N. Tanaka, and J. Miyake, "An Adjustable Gaze Tracking System and Its Application for Automatic Discrimination of Interest Objects", IEEE/ASME Transactions on Mechatronics, vol. 21, 2016, pp. 973-979.
[7] C. S. Kehara and M. E. Crosby, "Assessing Cognitive Load with Physiological Sensors", Proceedings of the 38th Hawaii International Conference on System Sciences, 2005, pp. 295a-295a.
[8] T. Partala, M. Jokiniemi, and V. Surakka, "Pupillary responses to emotionally provocative stimuli", Proceedings of the 2000 Symposium on Eye Tracking Research and Applications, 2000, pp. 123-129.
[9] T. Partala and V. Surakka, "Pupil size variation as an indication of affective processing", International Journal of Human-Computer Studies, vol. 59, 2003, pp. 185-198.
[10] 莊英杰, "Development of a Pupil-Tracking System for Human-Machine Interface Applications for the Disabled" (追瞳系統之研發於身障者之人機介面應用), master's thesis, Institute of Computer Science and Information Engineering, National Central University, 2004.
[11] 邱國鈞, "Development and Applications of a Pupil-Tracking System" (追瞳系統之研製及其應用), master's thesis, Institute of Computer Science and Information Engineering, National Central University, 2004.
[12] 林瑞碩, "Real-Time Eye Detection and Gaze-Point Analysis Using a Webcam" (使用網路攝影機即時人眼偵測與注視點分析), master's thesis, Institute of Computer Science and Information Engineering, National Taiwan Normal University, 2011.
[13] Y. Li, D. S. Monaghan, and N. E. Connor, "Real-Time Gaze Estimation using a Kinect and a HD Webcam", MultiMedia Modeling, Dublin, 2014, pp. 506-517.
[14] J. L. Andreassi, Psychophysiology: Human Behavior and Physiological Response, Third Edition, Hillsdale, NJ: Lawrence Erlbaum, 1995.
[15] Y. Tomita, Y. Igarashi, S. Honda, and N. Matsuo, "Electro-Oculography Mouse for Amyotrophic Lateral Sclerosis Patients", IEEE Conference on Engineering in Medicine and Biology Society, vol. 531, 1996, pp. 1708-1781.
[16] S. T. Iqbal, X. S. Zheng, and B. P. Bailey, "Task-evoked pupillary response to mental workload in human-computer interaction", Conference on Human Factors in Computing Systems, 2004, pp. 1477-1480.
[17] J. Gang and E. Sung, "Study on Eye Gaze Estimation", IEEE Transactions on Systems, Man and Cybernetics, Part B, vol. 32, 2002, pp. 332-350.
[18] Z. Zhu and Q. Ji, "Eye gaze tracking under natural head movements", IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 1, 2005, pp. 918-925.
[19] C. S. Lin, C. C. Huan, C. N. Chan, M. S. Yeh, and C. C. Chiu, "Design of a computer game using an eye-tracking device for eye's activity rehabilitation", Optics and Lasers in Engineering, 2004, pp. 91-108.
[20] 林宸生、林南州、簡志忠, "Development of an Optical Pupil-Tracking Human-Machine Interface System" (光學式瞳位追蹤器之人機介面系統研製), master's thesis, Institute of Automatic Control Engineering, Feng Chia University, 1997.
[21] 蔡金源, "An Eye-Controlled Human-Machine Interface System for the Disabled: An Infrared Oculomotor Mouse" (以眼球控制之殘障者人機介面系統:紅外線視動滑鼠), master's thesis, Institute of Electrical Engineering, National Taiwan University, 1997.
[22] 張凱傑, "Development and Integration of Eye-Controlled and Head-Controlled Human-Machine Interface Systems" (眼控與頭控之人機介面系統研發與整合), master's thesis, Institute of Automatic Control Engineering, Feng Chia University, 2001.
[23] 郭靜男, "Development and Applications of a Multifunctional Eye- and Head-Controlled PC Camera" (可眼控及頭控之多功能PC Camera之研發與應用), master's thesis, Institute of Automatic Control Engineering, Feng Chia University, 2003.
[24] 詹永舟, "Pupil Tracking for Eye-Control Systems and the Construction and Analysis of Eye-Movement Measurement Instruments" (瞳位追蹤應用於眼控系統及眼球動態量測儀器之製作與分析), master's thesis, Institute of Automatic Control Engineering, Feng Chia University, 1998.
[25] A. J. Glenstrup and T. E. Nielsen, "Eye Controlled Media: Present and Future State", bachelor's thesis in Information Psychology, Psychological Laboratory, University of Copenhagen, Denmark, 1995.
[26] R. H. S. Carpenter, Movements of the Eyes, Aug. 1988.
[27] R. V. Kenyon, "A Soft Contact Lens Search Coil for Measuring Eye Movements", Vision Research, vol. 25, 1985, pp. 1629-1633.
[28] Y. Tomita, Y. Igarashi, S. Honda, and N. Matsuo, "Electro-Oculography Mouse for Amyotrophic Lateral Sclerosis Patients", IEEE Conference on Engineering in Medicine and Biology Society, vol. 531, 1996, pp. 1708-1781.
[29] G. Norris and E. Wilson, "The Eye Mouse, An Eye Communication Device", IEEE Proceedings of Bioengineering, 1997, pp. 66-67.
[30] K. S. Park and K. T. Lee, "Eye-controlled human computer interface using the line-of-sight and the intentional blink", Computer Engineering, vol. 30, 1996, pp. 463-473.
[31] Available: http://www.med.upenn.edu/solomon/NewFiles/Frames/pictures/p_eyecage.html (accessed January 10, 2017).
[32] Y. Li, D. S. Monaghan, and N. E. Connor, "A Low-Cost Head and Eye Tracking System for Realistic Eye Movements in Virtual Avatars", MultiMedia Modeling, 2014, pp. 461-472.
[33] D. H. Yoo, J. H. Kim, B. R. Lee, and M. J. Chung, "Non-contact eye gaze tracking system by mapping of corneal reflections", Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition, 2002, pp. 94-99.
[34] M. A. Fischler and R. C. Bolles, "Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography", Communications of the ACM, vol. 24, 1981, pp. 381-395.
[35] X. Yu, J. Huang, S. Zhang, W. Yan, and D. N. Metaxas, "Pose-free Facial Landmark Fitting via Optimized Part Mixtures and Cascaded Deformable Shape Model", IEEE International Conference on Computer Vision, 2013, pp. 1944-1951.
[36] R. Hecht-Nielsen, "Theory of the Backpropagation Neural Network", IEEE International Joint Conference on Neural Networks, 1989, pp. 593-605.
[37] Available: http://www.ktnet.com.tw/OfficialProductDesc.aspx?PID=KTCCD323&CID=30,20,01,01 (accessed January 12, 2017).
[38] R. C. Gonzalez (translated by 繆紹綱), Digital Image Processing (數位影像處理技術), Princeton International Publishing, 2004.
[39] Y.-M. Cheung and Q. Peng, "Eye gaze tracking with a web camera in a desktop environment", IEEE Trans. Human-Mach. Syst., vol. 45, 2015, pp. 419-430.
