National Digital Library of Theses and Dissertations in Taiwan


Detailed Record

Author: 曹家豪 (Cao, Jia-Hao)
Title: 即時視線追蹤技術 (A Real-Time Eye Gaze Tracking Technique)
Advisor: 陳文雄 (Chen, Wen-Shiung)
Oral Defense Committee: 謝俐麗 (Hsieh, Li-Li); 張永昌 (Chang, Yung-Chang)
Oral Defense Date: 2012-07-23
Degree: Master
Institution: 國立暨南國際大學 (National Chi Nan University)
Department: 電機工程學系 (Department of Electrical Engineering)
Discipline: Engineering
Field: Electrical and Computer Engineering
Thesis Type: Academic thesis
Year of Publication: 2013
Graduation Academic Year: 100 (ROC calendar, 2011-2012)
Language: Chinese
Pages: 51
Keywords (Chinese): 人臉偵測; 人眼偵測; 視線追蹤; 樣本比對; 投影法
Keywords (English): Face detection; Eye detection; Eye gaze tracking; Template matching; Projection function
Statistics:
  • Cited: 0
  • Views: 264
  • Downloads: 0
  • Bookmarked: 0
The main objective of this thesis is to investigate how eye gaze tracking can be accomplished with simple equipment, so as to remedy the high cost of traditional gaze tracking. The proposed system is not only inexpensive but also requires no additional wearable device; a notebook computer with a built-in camera is sufficient, which makes it highly portable. The system can estimate the direction and position of the user's gaze and can be applied in fields such as healthcare, the military, and entertainment.

Traditional gaze tracking approaches typically rely on light reflection, wearable tracking devices, or electro-oculogram (EOG) signals. These approaches are comparatively expensive, and some reflection-based devices may even be invasive and harm the eye. Under the premises of low cost and non-invasiveness, the method in this thesis operates with only a single web camera.

The proposed gaze tracking algorithm consists of three main parts. First, skin-color information is segmented in the YCbCr color space to quickly locate the face. Next, template matching is used to find the rough eye region, where an expanding-grid search is adopted so that the eyes can be located quickly. Finally, vertical and horizontal projection information is used to infer the direction of the eye gaze.
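As a rough illustration of the skin-color stage described above, the following Python sketch (using OpenCV and NumPy, which the thesis does not necessarily use) filters a frame in the YCbCr space and keeps the largest skin-colored blob as the face region. The Cb/Cr bounds and the helper name detect_face_region are illustrative assumptions, not the thesis's actual parameters.

import cv2
import numpy as np

def detect_face_region(bgr_frame):
    # Convert to YCrCb (OpenCV orders the channels Y, Cr, Cb).
    ycrcb = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2YCrCb)
    # Assumed skin-color bounds on (Y, Cr, Cb); common textbook values,
    # not the thresholds used in the thesis.
    lower = np.array([0, 133, 77], dtype=np.uint8)
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    # Remove small speckles, then keep the largest skin blob as the face.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x signature
    if not contours:
        return None
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return x, y, w, h  # rough face bounding box in pixel coordinates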
The objective of this thesis is to present a set of techniques integrated into a low-cost eye gaze tracking system. An eye gaze tracking system estimates the direction and position of the human eye gaze. It can be used in medical, military, and entertainment applications.

There are several ways of tracking the direction of the eye gaze, such as using the reflection of light, the electro-oculogram (EOG), or a contact lens. Each of them has its own advantages and disadvantages, such as high complexity, high cost, or invasiveness. Price and convenience are two important considerations; therefore, we propose a method that is not only easy to use but also requires no more than a web camera to operate.

In this thesis, a three-stage algorithm is proposed to estimate the eye gaze direction. At the first stage, an efficient face detection method based on PCA features is employed to locate the user's face. Template matching is then used to detect the eyes. Finally, horizontal and vertical projections are used to identify where the eye is looking.
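The projection stage can likewise be sketched in a few lines: summing pixel intensities over the rows and columns of an eye patch gives horizontal and vertical projections whose minima roughly mark the dark pupil, and the pupil position can then be mapped to a coarse gaze direction. The 3x3 grid labels and thresholds below are illustrative assumptions rather than the calibrated mapping used in the thesis.

import numpy as np

def estimate_gaze_direction(eye_gray):
    # eye_gray: 2-D uint8 array cropped around one eye.
    eye = eye_gray.astype(np.float32)
    # The dark pupil/iris pulls the intensity projections down, so the
    # minimum of each projection approximates the pupil centre.
    vertical_proj = eye.sum(axis=0)    # one value per image column
    horizontal_proj = eye.sum(axis=1)  # one value per image row
    cx = int(np.argmin(vertical_proj))
    cy = int(np.argmin(horizontal_proj))
    h, w = eye.shape
    # Map the pupil position onto an illustrative 3x3 grid of directions.
    col = 0 if cx < w / 3 else (2 if cx > 2 * w / 3 else 1)
    row = 0 if cy < h / 3 else (2 if cy > 2 * h / 3 else 1)
    labels = [["up-left", "up", "up-right"],
              ["left", "center", "right"],
              ["down-left", "down", "down-right"]]
    return labels[row][col]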
Acknowledgements
Abstract (Chinese)
Abstract
Table of Contents
List of Figures
List of Tables
Chapter 1  Introduction
1.1  Motivation
1.2  Research Objectives and Direction
1.3  Thesis Outline and Organization
Chapter 2  Literature Review
2.1  Search Coil Method
2.2  Electro-Oculography
2.3  Infrared Oculography
2.4  Video-Oculography
2.5  Limbus Tracking
2.6  Purkinje Image Tracking
2.7  Image-Based Methods
Chapter 3  Image Processing Techniques for Face Segmentation
3.1  Skin-Color Filtering
3.1.1  RGB Color Model
3.1.2  HSL Color Space
3.1.3  YIQ Color Space
3.1.4  HSV Color Space
3.1.5  Hue Vector Space
3.1.6  YCbCr Color Space
3.2  PCA Feature Extraction
3.3  Template Matching
Chapter 4  Eye Gaze Tracking System
4.1  System Architecture
4.2  Image Acquisition Module
4.3  Image Preprocessing Module
4.3.1  Face Detection
4.3.2  Eye Region Localization
4.3.3  Use of Eye Projection Information
4.3.4  Nine-Point Grid Calibration
Chapter 5  Experimental Results
5.1  Experimental Environment
5.2  Experimental Results
5.3  Analysis and Discussion of Results
Chapter 6  Conclusions and Suggestions
6.1  Conclusions
References
[1] L. Bour, “DMI-search scleral coil,” Dept. Neurology, Clinical Neurophysiol,
Academic Medical Cent, AZUA, Amsterdam, The Netherlands, Tech. Rep. H2-214,
1997.
[2] R. V. Kenyon, “A soft contact lens search coil for measuring eye movements,” Vision
Research, vol. 25, no. 11, pp. 1629-1633, 1985.
[3] J. Gips, P. Olivieri, and J. Tecce, “Direct control of the computer through electrodes
placed around the eyes”, in Proc. Fifth Int. Conf. Human-Computer Interaction.
Orlando, FL: Elsevier, 1993, pp. 630–635.
[4] L. Young and D. Sheena, “Survey of eye movement recording methods,” Behavior
Research Methods and Instrumentations, vol. 7, no. 5, pp. 397-429, 1975.
[5] D. Kumar and E. Poole, “Classification of EOG for human computer interface,” in the
Second Joint EMBS/BMES Conference, vol. 1, pp. 64-67, Oct. 2002.
[6] K. S. Park and K. T. Lee, “Eye-controlled human/computer interface using the line-of-sight and the intentional blink,” Computer Engineering, vol. 30, no. 3, pp. 463-473, 1996.
[7] T. E. Hutchinson, J. K. P. White, W. M. Martin, K. C. Reichert, and L. A. Frey,
“Human-computer interaction using eye-gaze input,” IEEE Trans. Systems, Man,
Cybernetics., vol. 19, pp. 1527–1534, Nov./Dec. 1989.
[8] S. Baluja and D. Pomerleau, “Non-intrusive gaze tracking using artificial neural
networks,” Sch. Comput. Sci, Carnegie Mellon Univ, Pittsburgh, PA, USA, Tech. Rep.
CMU-CS-94-102, 1994.
[9] S. I. Kim, D. K. Lee, S. Y. Kim, O. S. Kwon and J. Cho, “An algorithm to detect a center of pupil for extraction of point of gaze,” 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 1, pp. 1237-1240, Sept. 2004.
[10] S. I. Kim, J. M. Cho, T. W. Nam, J. H. Kim, S. H. Kim and J. H. Lim, “Study to find
an interrelationship with position of the pupil and an eyelid to extraction of point of
gaze,” Proceedings of 7th International Workshop on Enterprise networking and
Computing in Healthcare Industry (HEALTHCOM), pp. 426-429, June 2005.
[11] S. I. Kim, J. M. Cho, J. Y. Jung, S. H. Kim, J. H. Lim, T. W. Nam and J. H. Kim, “A
fast center of pupil detection algorithm for VOG-Based eye movement tracking,” 27th
Annual International Conference of the Engineering in Medicine and Biology Society,
pp. 3188-3191, Jan. 2006.
[12] A. Villanueva, R. Cabeza and S. Porta, “Eye tracking: Pupil orientation geometrical
modeling,” Image and Vision Computing, vol. 24, no. 7, pp. 663-679, July 2006.
[13] A. Kumar and G. Krol, “Binocular infrared oculography,” The Laryngoscope, vol.
102, pp. 367-378, Apr. 1992.
[14] A. J. Glenstrup and T. E. Nielsen, “Eye controlled media: Present and future state,”
Thesis of Bachelor in Information Psychology, University of Copenhagen, Denmark,
1995.
[15] A. Kapoor and R.W. Picard, “Real-time, fully automatic upper facial feature
tracking,” Fifth IEEE International Conference on Automatic Face and Gesture
Recognition, vol. 10, pp. 8-13, May 2002.
[16] C. S. Lin, C. C. Huan, C. N. Chan, M. S. Yeh and C. C. Chiu, “Design of a computer
game using an eye-tracking device for eye's activity rehabilitation,” Optics and Lasers
in Engineering, vol. 24, pp. 91-108, July 2004.
[17] T. Kocejko, A. Bujnowski and J. Wtorek, “Eye mouse for disabled,” Conference on
Human System Interactions, pp. 199-202, May 2008.
[18] J. H. Kim, D. W. Lee, C. G. Park, H. C. Bang, J. H. Kim, S. Y. Cho, Y. I. Kim and K.
Y. Baek, “Construction of integrated simulator for developing head/eye tracking
system,” International Conference on Control, Automation and System, pp. 2485-
2488, Oct. 2008.
[19] 張凱傑, “Development and Integration of Eye-Controlled and Head-Controlled Human-Machine Interface Systems,” Master's thesis, Institute of Automatic Control Engineering, Feng Chia University, 2001.
[20] 郭靜男, “Development and Application of a Multi-Function PC Camera Controllable by Eye and Head,” Master's thesis, Institute of Automatic Control Engineering, Feng Chia University, May 2003.
[21] 詹永舟, “Pupil-Position Tracking Applied to Eye-Controlled Systems and the Implementation and Analysis of an Eye-Movement Measurement Instrument,” Master's thesis, Institute of Automatic Control Engineering, Feng Chia University, 1998.
[22] J. P. Ivins, J. Porrill and J. P. Frisby, “A deformable model of the human iris driven by
non-linear least-squares minimization,” Sixth International Conference on Image
Processing and Its Applications, vol. 1, pp. 234-238, July 1997.
[23] J. P. Ivins, J. Porrill and J. P. Frisby, “Deformable model of the human iris for
measuring ocular torsion from video images,” IEE Proceedings-Vision, Image and
Signal Processing, vol. 145, no. 3, pp.213-220, Jun. 1998.
[24] 郭中仁, “Gaze Direction Detection Using CCD Images,” Master's thesis, Institute of Electrical Engineering, National Tsing Hua University, 1997.
[25] S. G. Cho, K. S. Jin and J. J. Hwang, “Gaze tracking based pointer: eye-click,”
Proceedings of 2004 International Symposium on Intelligent Signal Processing and
Communication Systems (ISPACS), pp. 71-74, Nov. 2004.
[26] T. Nishimura, M. Nakashige, T. Akashi, Y. Wakasa and K. Tanaka, “Eye interface for
physically impaired people by Genetic Eye Tracking,” Annual Conference SICE, pp.
828-833, Sept. 2007.
[27] B. L. Nguyen, “Eye gaze tracking,” International Conference on Computing and
Communication Technologies, pp. 1-4, June 2009.
[28] C. Garcia, G. Zikos, G. Tziritas, “Face Detection in Color Images using Wavelet
Packet Analysis,” Proceedings of the 6th IEEE International Conference on
Multimedia Computing and Systems, Florence, pp. 703-708, 1999.
[29] 謝怡竹, “An Optical-Flow Based Automatic Expression Recognition System,” Master's thesis, National Central University, 2005.
[30] Matthew A. Turk, Alex P. Pentland, “Face Recognition Using Eigenfaces,” 1991.
[31] J. R. Parker, “Algorithms for Image Processing and Computer Vision,” John Wiley &
Sons, 1996.
[32] R. C. Gonzalez and R. E. Woods, “Digital Image Processing,” 2nd Edition, Prentice-
Hall, New Jersey, 2002.
[33] V.V. Kohir and U.B. Desai, “Face recognition using a DCT-HMM approach,” in Proc.
IEEE Workshop on Applications of Computer Vision (WACV’98), Princeton, NJ,
1998, pp.226–231.
[34] S. Mallat and S. Zhong, “Characterization of signals from multiscale edge,” IEEE
Transaction on Pattern Analysis and Machine Intelligence, vol. 14, no. 7, pp. 710-732,
Jul. 1992.
[35] Chern-Sheng Lin, Wei-Zun Wu, Yun-Long Lay and Ming-Wen Chang, “A digital image-based measurement system for an LCD backlight module,” Optics & Laser Technology, vol. 33, pp. 499-505, 2001.
[36] R. Brunelli and T. Poggio, “Template Matching: Matched Spatial Filters and Beyond,” MIT AI Memo 1549, July 1995.
[37] M.S. Lew, N. Sube, T.S. Huang, “Improving visual matching,” IEEE Conference on
Computer Vision and Pattern Recognition, Vol. 2, pp. 58-65, 2000.
[38] Alejandro Backer and Eva Peral, “A Remote Video Eye Tracker,” California Institute of Technology, June 1997.
[39] P. Viola and M. Jones, “Rapid object detection using a boosted cascade of simple
features,” IEEE Computer Society Conference on Computer Vision and Pattern
Recognition, vol. 1, pp. 511-518, 2001.
[40] S. H. Yeh, “Human facial animation based on real image sequence,” URL:
http://sun.lib.nsysu.edu.tw/ETD-db/etd-0724101-163439.pdf, 2001.
[41] J. M. Park, C. G. Looney, and H. C. Chen, “Fast connected component labeling
algorithm using a divide and conquer technique,” URL:
http://cs.ua.edu/TechnicalReports/TR-2000-04.pdf.
[42] M. Xu and T. Akatsuka, “Detecting head pose from stereo image sequence for active face recognition,” IEEE International Conference on Automatic Face and Gesture Recognition (AFGR), pp. 82-87, 1998.
[43] 王國強, “A Human Eye Tracking System and Its Application to Human-Machine Interfaces,” Master's thesis, Institute of Computer Science and Information Engineering, National Central University, July 2003.