
National Digital Library of Theses and Dissertations in Taiwan


Detailed Record

Researcher: 劉定倫
Researcher (English): Ting-Lun Liu
Title: 結合慣性測量單元具頭動補償功能之可見光穿戴式眼動儀設計與實現
Title (English): Design and Implementation of Visible-light Wearable Eye Tracker joint Inertial Measurement Unit for Head-motion Compensation
Advisor: 范志鵬
Advisor (English): Chih-Peng Fan
Committee members: 高文忠, 吳俊霖
Committee members (English): Wen-Chung Kao, Jiunn-Lin Wu
Oral defense date: 2017-07-14
Degree: Master's
Institution: National Chung Hsing University
Department: Department of Electrical Engineering
Discipline: Engineering
Academic field: Electrical and Computer Engineering
Thesis type: Academic thesis
Year of publication: 2017
Graduation academic year: 105 (ROC calendar; 2016-2017)
Language: Chinese
Number of pages: 109
Keywords (Chinese): 可見光攝影機; 穿戴式眼動儀; 虹膜追蹤
Keywords (English): iris tracking; Visible-light; Wearable Eye Tracker
Cited by: 2
Views: 301
Downloads: 21
Bookmarked: 1
Abstract:
Human-computer interaction interfaces have seen increasingly broad application in recent years, with wearable systems in particular evolving toward greater intuitiveness and convenience. This thesis presents a low-cost wearable scene-view eye tracker that, unlike earlier systems, does not require the user to keep the head still during or after the calibration procedure; instead, it tolerates head movement while continuing to estimate the gaze point. By detecting the direction of the user's gaze, the system replaces manual pointing actions, intuitively indicating the region the user is looking at and analyzing the fixated area of the real scene.

To realize the iris-based wearable eye tracker, this thesis uses two cameras: an outward camera that captures the real scene and an inward camera that captures the iris, both at 640x480 pixels. The outward camera is mounted horizontally at the center of the brow, simulating the direction in which the user looks outward, while the inward camera is placed in front of the left eye, angled upward by about 5 degrees, to capture the left-eye image. Two algorithms for locating the iris center are compared: one estimates the parameters of an elliptical iris model by random sample consensus (RANSAC); the other exploits the alignment properties of gradient vectors in the gradient image and takes the location maximizing this objective as the iris center. Once the iris center is found, calibration points captured by the outward camera are used in a calibration procedure based on a perspective transform, from which the user's gaze point is estimated; information from an inertial measurement unit (IMU) is further used to correct for head movement. The experiments use a four-point calibration procedure, examining both outdoor and indoor environments and comparing head-motion compensation based on iris-center displacement [9] against compensation based on IMU data. In addition, because the eye is not a planar structure, the four fixated positions do not project onto the two-dimensional image as a square rectangle; the top, bottom, left, and right regions spread outward, preventing the expected result, so an additional nine-point calibration is tested and compared for the indoor case.
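As a rough illustration of Algorithm 1, the following minimal Python/OpenCV sketch fits an iris ellipse to sampled boundary points with RANSAC. It is not the thesis's implementation: the iteration count, inlier tolerance, and point-to-ellipse distance approximation are assumptions.

```python
import numpy as np
import cv2

def ransac_iris_ellipse(edge_points, n_iters=200, inlier_tol=2.0):
    """RANSAC ellipse fit: fit candidate ellipses on random 5-point
    subsets of the binarized iris boundary and keep the one that the
    most boundary points agree with (tolerance is an assumed value)."""
    pts = np.asarray(edge_points, dtype=np.float32)
    rng = np.random.default_rng(0)
    best_ellipse, best_inliers = None, 0
    for _ in range(n_iters):
        sample = pts[rng.choice(len(pts), size=5, replace=False)]
        try:
            (cx, cy), (w, h), angle = cv2.fitEllipse(sample)
        except cv2.error:
            continue  # degenerate (e.g. near-collinear) sample
        a, b = w / 2.0, h / 2.0
        if not np.isfinite([cx, cy, a, b]).all() or min(a, b) < 1:
            continue
        # Approximate point-to-ellipse distance in the ellipse frame.
        t = np.radians(angle)
        dx, dy = pts[:, 0] - cx, pts[:, 1] - cy
        u = dx * np.cos(t) + dy * np.sin(t)
        v = -dx * np.sin(t) + dy * np.cos(t)
        dist = np.abs(np.hypot(u / a, v / b) - 1.0) * min(a, b)
        inliers = int(np.sum(dist < inlier_tol))
        if inliers > best_inliers:
            best_inliers, best_ellipse = inliers, ((cx, cy), (w, h), angle)
    return best_ellipse  # its center approximates the iris center
```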

Experimental results, computed on a desktop PC (3.4 GHz), show the following. With Algorithm 1 (ellipse fitting), the average horizontal and vertical offsets of the iris center are about 4 pixels outdoors and about 5.5 pixels indoors. With Algorithm 2 (gradient-image dot product), the average center offset outdoors is about 1 pixel horizontally and about 5 pixels vertically, and indoors about 2.3 pixels horizontally and about 7 pixels vertically. For the angular error between the gaze point estimated through the perspective transform with IMU compensation and the actual target: with Algorithm 1, the outdoor horizontal error is about 1.6°~2.1° and the vertical error about 1.7°~1.9°, while indoors the horizontal error is about 1.5°~2.2° and the vertical error about 0.9°~2.1°; with Algorithm 2, the outdoor horizontal error is about 1.5°~2.9° and the vertical error about 1.0°~1.9°, while indoors the horizontal error is about 0.6°~2.8° and the vertical error about 2.5°~2.8°.
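For reference, angular figures such as these are conventionally obtained by converting the offset between the estimated gaze point and the target into a visual angle; assuming the usual geometry (the abstract does not state the exact formula used in Chapter 4):

```latex
\theta_{\mathrm{err}} = \arctan\!\left(\frac{d}{D}\right)
```

where d is the physical offset between the estimated gaze point and the target in the scene plane, and D is the distance from the eye to that plane.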
Abstract (English):
In recent years, human-computer interaction has been widely applied in almost every aspect of daily life, and wearable devices in particular are evolving toward more intuitive and more convenient operation. In this thesis, we develop a low-cost wearable eye tracker that does not require the head to be fixed after the user performs the calibration procedure; the proposed system allows free head movement while maintaining correct gaze estimation.
In this thesis, the eye tracker consists of a visible-light inward-facing webcam that captures the eye image and an outward-facing webcam that captures the scene image. We propose two algorithms for iris center localization: one based on ellipse fitting, and the other based on image gradients for accurate and robust eye center localization. In the calibration procedure, we compare outdoor and indoor conditions, and a head-movement compensation scheme based on IMU data is used during tracking. Moreover, because the human eye is approximately spherical rather than planar, gaze points are difficult to project from three-dimensional space onto the two-dimensional image; to address this, we also use nine calibration points to increase the gaze tracking accuracy.
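The gradient-based localizer follows the idea of Timm and Barth [17]: the iris center is the point whose displacement vectors toward strong edge pixels best align with the image gradients there. Below is a minimal, unoptimized NumPy sketch of that objective; the blur size, gradient threshold, and dark-iris weighting are assumptions, and it is intended to run on the small eye region cropped by template matching.

```python
import numpy as np
import cv2

def gradient_iris_center(eye_gray):
    """Score every candidate center by the squared dot product between
    normalized displacement vectors and normalized image gradients,
    weighted by inverted intensity (dark-iris prior); return the argmax."""
    eye = cv2.GaussianBlur(eye_gray, (5, 5), 0).astype(np.float64)
    gx = cv2.Sobel(eye, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(eye, cv2.CV_64F, 0, 1, ksize=3)
    mag = np.hypot(gx, gy)
    keep = mag > mag.mean() + 0.5 * mag.std()  # assumed threshold
    ys, xs = np.nonzero(keep)
    gxn, gyn = gx[keep] / mag[keep], gy[keep] / mag[keep]
    h, w = eye.shape
    score = np.zeros((h, w))
    for cy in range(h):          # brute force; fine for a small eye ROI
        for cx in range(w):
            dx, dy = xs - cx, ys - cy
            norm = np.hypot(dx, dy)
            norm[norm == 0] = 1.0
            dot = np.maximum((dx / norm) * gxn + (dy / norm) * gyn, 0.0)
            score[cy, cx] = (255.0 - eye[cy, cx]) * np.mean(dot ** 2)
    cy, cx = np.unravel_index(np.argmax(score), score.shape)
    return cx, cy  # iris center estimate in ROI coordinates
```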
The experimental results with Algorithm 1 show that in the outdoor condition the iris center offset is about 4 pixels in both the horizontal and vertical coordinates, and in the indoor case about 5.5 pixels in both coordinates. With Algorithm 2, the center offset in the outdoor condition is about 1 pixel horizontally and 5 pixels vertically, and in the indoor condition about 2.3 pixels horizontally and 7 pixels vertically. Through the perspective transform with IMU compensation, Algorithm 1 achieves outdoor gaze tracking accuracies between 1.6 and 2.1 degrees horizontally and between 1.7 and 1.9 degrees vertically; indoors, it achieves between 1.5 and 2.2 degrees horizontally and between 0.9 and 2.1 degrees vertically. Algorithm 2 achieves outdoor accuracies between 1.5 and 2.9 degrees horizontally and between 1.0 and 1.9 degrees vertically; indoors, it achieves between 0.6 and 2.8 degrees horizontally and between 2.5 and 2.8 degrees vertically.
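Putting the pipeline described above together, a minimal sketch of the four-point perspective-transform calibration with an IMU correction might look as follows. The coordinates, focal length, and sign conventions are placeholders and assumptions, not the thesis's values; Sections 3.5.1-3.5.2 define the actual procedure.

```python
import numpy as np
import cv2

# Four iris centers recorded while fixating four known scene targets
# (coordinates below are placeholders, not the thesis's data).
iris_pts  = np.float32([[212, 148], [268, 150], [265, 199], [210, 196]])
scene_pts = np.float32([[160, 120], [480, 120], [480, 360], [160, 360]])

H = cv2.getPerspectiveTransform(iris_pts, scene_pts)  # 4-point calibration
# With nine calibration points, the transform is over-determined and a
# least-squares homography would be used instead:
# H, _ = cv2.findHomography(iris_pts9, scene_pts9)

def gaze_point(iris_center, d_yaw_deg=0.0, d_pitch_deg=0.0, f_px=500.0):
    """Map an iris center to scene coordinates, then cancel head rotation
    reported by the IMU since calibration (small-angle pinhole model;
    f_px is an assumed scene-camera focal length in pixels)."""
    p = cv2.perspectiveTransform(np.float32([[iris_center]]), H)[0, 0]
    x = p[0] - f_px * np.tan(np.radians(d_yaw_deg))    # sign conventions
    y = p[1] + f_px * np.tan(np.radians(d_pitch_deg))  # are assumptions
    return x, y

print(gaze_point((240, 172), d_yaw_deg=3.0))
```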
Acknowledgements
Abstract (Chinese)
Abstract (English)
Table of Contents
List of Figures
List of Tables
Chapter 1 Introduction
1.1 Motivation and Objectives
1.2 Thesis Organization
Chapter 2 Preliminaries
2.1.0 Basic Structure of the Eyeball
2.1.1 Contact-type Eye Trackers
2.1.2 Image-based Eye Trackers
2.2.1 HSV Color Space
2.2.2 Random Sample Consensus (RANSAC)
2.2.3 Ellipse Fitting
2.2.4 Haar-like Features
2.2.5 Integral Image
2.2.6 Otsu's Method
2.2.7 Eye Center Estimation Methods
2.2.8 Perspective Transform
2.2.9 Quaternions
2.2.10 Euler Angles
2.2.11 3D Rotation Matrices and the Inertial Measurement Unit
Chapter 3 Algorithm Implementation
3.1 Physical Architecture of the Eye Tracker
3.2 Algorithm Flow
3.3 Algorithm 1: Ellipse Fitting
3.3.1 Image Preprocessing
3.3.2 Iris Template Matching
3.3.3 Reflection Removal and Contrast Enhancement
3.3.4 Environment Detection
3.3.5 Image Binarization (Outdoor)
3.3.6 Two-stage Binarization (Indoor)
3.3.7 Binarized Boundary Sampling
3.3.8 RANSAC Iris Ellipse Fitting
3.4 Algorithm 2: Gradient-image Method
3.4.1 Image Preprocessing
3.4.2 Iris Template Matching
3.4.3 Reflection Removal
3.4.4 Horizontal and Vertical Gradient Computation
3.4.5 Gradient-image Threshold Computation
3.4.6 Iris-size-based Cropping and Weighting
3.4.7 Candidate Center Search
3.5 Calibration Point Localization
3.5.1 Head-motion Offset Compensation Procedure
3.5.2 Inertial Measurement Unit (IMU) Motion Compensation Procedure
Chapter 4 Experimental Results and Analysis
4.1 Results of the Visible-light Iris Tracking Algorithms
4.2 Analysis of Algorithm 1 Results
4.2.1 Comparison of Fitted Iris Ellipse Center Positions
4.3.1 Gaze Point Estimation Accuracy (Head-motion Compensation)
4.3.2 Gaze Point Estimation Accuracy (IMU Compensation)
4.3.3 Iris Algorithm Execution Speed Comparison
4.4 Analysis of Algorithm 2 Results
4.4.1 Comparison of Iris Center Positions
4.4.2 Gaze Point Estimation Accuracy (Head-motion Compensation)
4.4.3 Gaze Point Estimation Accuracy (IMU Compensation)
4.4.4 Iris Algorithm Execution Speed Comparison
Chapter 5 Discussion and Future Work
5.1 Conclusions
5.2 Future Work
References
Appendix
[1] R. C. Gonzalez and R. E. Woods, Digital Image Processing, 3rd ed., Prentice Hall, 2008.
[2] A. M. Earman, "Eye Safety for Proximity Sensing Using Infrared Light-emitting Diodes," 2016.
[3] M. A. Fischler and R. C. Bolles, "Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography," Communications of the ACM, vol. 24, no. 6, pp. 381-395, 1981.
[4] Wikimedia Commons, "Schematic diagram of the human eye en," 2007.
[5] P. Viola and M. J. Jones, "Robust Real-Time Face Detection," International Journal of Computer Vision, vol. 57, no. 2, pp. 137-154, May 2004.
[6] J. B. Kuipers, Quaternions and Rotation Sequences, Princeton University Press, 1999.
[7] Y. Chen and J. Su, "Fast Eye Localization Based on a New Haar-like Feature," 10th IEEE World Congress on Intelligent Control and Automation (WCICA), pp. 4825-4830, 2012.
[8] A. Duchowski, Eye Tracking Methodology: Theory and Practice, vol. 373, 2007.
[9] T. Kocejko, A. Bujnowski, J. Ruminski, E. Bylinska, and J. Wtorek, "Head movement compensation algorithm in multi-display communication by gaze," IEEE International Conference on Human System Interactions, pp. 88-94, 2014.
[10] R. Kothari and J. L. Mitchell, "Detection of eye locations in unconstrained visual images," IEEE International Conference on Image Processing, vol. 3, pp. 519-522, 1996.
[11] K. Holmqvist, M. Nyström, R. Andersson, R. Dewhurst, H. Jarodzka, and J. Van de Weijer, Eye Tracking: A Comprehensive Guide to Methods and Measures, Oxford University Press, 2011.
[12] R. Lienhart and J. Maydt, "An Extended Set of Haar-like Features for Rapid Object Detection," IEEE International Conference on Image Processing, pp. 900-903, 2002.
[13] 范志鵬 and 吳家豪, "Dual-mode pupil and iris gaze tracking system for a wearable real-time eye tracker" (應用於穿戴式即時眼動儀的雙模式瞳孔與虹膜視線追蹤系統), Master's thesis, Department of Electrical Engineering, National Chung Hsing University, July 2016.
[14] L. Swirski, A. Bulling, and N. Dodgson, "Robust real-time pupil tracking in highly off-axis images," Proceedings of the Symposium on Eye Tracking Research and Applications, ACM, pp. 173-176, 2012.
[15] H. Drewes, "Eye Gaze Tracking for Human Computer Interaction," PhD thesis, Media Informatics Group, LMU University of Munich, 2010.
[16] "[Image Processing] Perspective Transformation" (【影像處理】透視變換), blog post.
[17] F. Timm and E. Barth, "Accurate eye centre localization by means of gradients," January 2011.
[18] P. Yang, B. Du, S. Shan, and W. Gao, "A novel pupil localization method based on gaboreye model and radial symmetry operator," IEEE International Conference on Image Processing, vol. 1, pp. 67-70, 2004.
[19] N. Wade and B. Tatler, The Moving Tablet of the Eye: The Origins of Modern Eye Movement Research, 2005.
[20] D. Young, H. Tunley, and R. Samuels, "Specialised Hough transform and active contour methods for real-time eye tracking," University of Sussex, Cognitive & Computing Science, 1995.
[21] T. D'Orazio, N. Ancona, G. Cicirelli, and M. Nitti, "A ball detection algorithm for real soccer image sequences," 16th IEEE International Conference on Pattern Recognition, vol. 1, pp. 210-213, 2002.
[22] 王坤亮, "Color space adjustment" (色彩空間調整 Color Space), https://webbuilder5.asiannet.com/ftp/2627/ColoeSpaceVol23.pdf
[23] C. Hendry, A. Farley, and E. McLafferty, "Anatomy and physiology of the senses," Nursing Standard, vol. 27, no. 5, pp. 35-42, 2012.
[24] E. P. Widmaier, H. Raff, and K. T. Strang, Vander's Human Physiology: The Mechanisms of Body Function, McGraw-Hill Higher Education, 2011.
[25] J. N. Chi, P. Y. Zhang, S. Y. Zheng, C. Zhang, and Y. Huang, "Key Techniques of Eye Gaze Tracking Based on Pupil Corneal Reflection," WRI Global Congress on Intelligent Systems (GCIS), pp. 133-188, 2009.
[26] OpenCV, http://opencv.org/
[27] D. Sliney and M. Wolbarsht, Safety with Lasers and Other Optical Sources, Plenum Press, New York and London, pp. 65-151, 1980.
[28] 張正平 and 劉立文, "The Management and Assessment of Infrared Exposure Dose in Foundry Industry" (鑄造業紅外線暴露劑量的評估與管理), Institute of Occupational Safety and Health, Council of Labor Affairs, 2006.
[29] J. W. Lee, H. Heo, and K. R. Park, "A novel gaze tracking method based on the generation of virtual calibration points," Sensors, vol. 13, no. 8, pp. 10802-10822, 2013.
[30] A. Schwaller, "Combining Eye- and Head-Tracking Signals for Improved Event Detection," Lund University, 2014.
[31] C. Garhart and V. Lakshminarayanan, "Anatomy of the eye," in Handbook of Visual Display Technology, Springer, pp. 73-83, 2012.
[32] J. D. Enderle and J. D. Bronzino, Introduction to Biomedical Engineering, Academic Press, 2012.
[33] K. N. Kim and R. S. Ramakrishna, "Vision-based eye-gaze tracking for human computer interface," IEEE International Conference on Systems, Man, and Cybernetics (SMC'99), vol. 2, pp. 324-329, 1999.
[34] R. Newman, Y. Matsumoto, S. Rougeaux, and A. Zelinsky, "Real-time stereo tracking for head pose and gaze estimation," Fourth IEEE International Conference on Automatic Face and Gesture Recognition, pp. 122-128, 2000.
[35] 胡竹生 and 鄭期元, "Ground-plane motion trajectory estimation integrating GNSS and INS measurements" (整合GNSS與INS量測資訊的地平面運動軌跡估測), Master's thesis, Institute of Electrical and Control Engineering, National Chiao Tung University, Aug. 2013.
[36] 范志鵬 and 鄭竣文, "Prototype implementation of a gaze tracking system combining template matching and ellipse fitting" (結合樣板匹配與橢圓擬合的視線追蹤系統雛型實作), Master's thesis, Department of Electrical Engineering, National Chung Hsing University, July 2015.
[37] LP-Research, http://www.lp-research.com
[38] K. Ahuja, R. Banerjee, S. Nagar, K. Dey, and F. Barbhuiya, "Eye center localization and detection using radial mapping," 2016 IEEE International Conference on Image Processing (ICIP), Sep. 2016.
[39] S.-F. Yang, "An eye-tracking study of the Elaboration Likelihood Model in online shopping," Electronic Commerce Research and Applications, vol. 14, no. 4, pp. 233-240, July-August 2015, ISSN 1567-4223.
[40] S. Heuer and B. Hallowell, "A novel eye-tracking method to assess attention allocation in individuals with and without aphasia using a dual-task paradigm," Journal of Communication Disorders, vol. 55, pp. 15-30, 2015.
[41] A. Nakazawa, C. Nitschke, and T. Nishida, "Non-calibrated and real-time human view estimation using a mobile corneal imaging camera," 2015 IEEE International Conference on Multimedia & Expo Workshops (ICMEW), July 2015.