[1] W. Wang, Y. Huang, and R. Zhang, “Driver Gaze Tracker Using Deformable Template Matching,” IEEE International Conference on Vehicular Electronics and Safety, pp. 244-247, 2011.
[2] L. M. Bergasa, J. Nuevo, M. A. Sotelo, R. Barea, and M. E. Lopez, “Real-Time System For Monitoring Driver Vigilance,” IEEE Transactions on Intelligent Transportation Systems, vol. 7, no. 1, pp. 63-77, 2006.
[3] J. Jimenez, D. Gutierrez, and P. Latorre, “Gaze-Based Interaction For Virtual Environments,” Journal of Universal Computer Science, vol. 14, no. 19, pp. 3085-3098, 2008.
[4] H. Adrian, and B. Russel, “Eye Tracking And Gaze Based Interaction Within Immersive Virtual Environments,” International Conference on Computational Science, pp. 729-736, 2009.
[5] M. Ghani, S. Chaudhry, M. Sohail, and M. Geelani, “GazePointer: A Real Time Mouse Pointer Control Implementation Based On Eye Gaze Tracking,” International Multi Topic Conference, pp. 154-159, 2013.
[6] O. Dicky, H. Ito, T. Imabuchi, H. Kikuchi, and Y. Horie, “Visible-Spectrum Remote Eye Tracker For Gaze Communication,” International Conference on Graphic and Image Processing, vol. 9443, p. 944333, 2015.
[7] J. Babcock, and J. Pelz, “Building A Lightweight Eye Tracking Headgear,” ACM Symposium on Eye Tracking Research and Applications, pp. 109-114, 2004.
[8] C. Wang, F. Xia, and J. Chai, “Realtime 3D Eye Gaze Animation Using A Single RGB Camera,” ACM Transactions on Graphics, vol. 35, no. 4, p. 118, 2016.
[9] C. H. Morimoto, and M. Flickner, “Real-Time Multiple Face Detection Using Active Illumination,” IEEE International Conference on Automatic Face and Gesture Recognition, pp. 8-13, 2000.
[10] J. Bengoechea, J. Cerrolaza, A. Villanueva, and R. Cabeza, “Evaluation Of Accurate Eye Corner Detection Methods For Gaze Estimation,” Journal of Eye Movement Research, vol. 7, no. 3, 2014.
[11] J. Merchant, R. Morrissette, and J. Porterfield, “Remote Measurements Of Eye Direction Allowing Subject Motion Over One Cubic Foot Of Space,” IEEE Transactions on Biomedical Engineering, vol. 21, no. 4, pp. 309-317, 1974.
[12] D. W. Hansen, and Q. Ji, “In The Eye Of The Beholder: A Survey Of Models For Eyes And Gaze,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 3, pp. 478-500, 2010.
[13] Applied Science Laboratories Website, [Online], [http://www.as-l.com].
[14] LC Technologies Website, [Online], [http://www.eyegaze.com].
[15] Tobii Website, [Online], [http://www.tobii.com].
[16] M. U. Ghani, S. Chaudhry, and M. N. Geelani, “GazePointer: A Real Time Mouse Pointer Control Implementation Based On Eye Gaze Tracking,” IEEE International Multitopic Conference, pp. 154-159, 2013.
[17] C. A. Hennessey, and P. D. Lawrence, “Improving The Accuracy And Reliability Of Remote System-Calibration-Free Eye-Gaze Tracking,” IEEE Transactions on Biomedical Engineering, vol. 56, no. 7, pp. 1891-1900, 2009.
[18] W. Sunu, T. Supan, and P. Chuchart, “Dual-Camera Acquisition For Accurate Measurement Of Three-Dimensional Eye Movements,” IEEJ Transactions on Electrical and Electronic Engineering, pp. 238-246, 2013.
[19] Z. Zhu, and Q. Ji, “Eye And Gaze Tracking For Interactive Graphic Display,” Machine Vision and Applications, vol. 15, no. 3, pp. 139-148, 2004.
[20] D. Yoo, and M. J. Chung, “A Novel Non-Intrusive Eye Gaze Estimation Using Cross-Ratio Under Large Head Motion,” Computer Vision and Image Understanding, vol. 98, no. 1, pp. 25-52, 2005.
[21] J. Wang, G. Zhang, and J. Shi, “2D Gaze Estimation Based On Pupil-Glint Vector Using An Artificial Neural Network,” MDPI Applied Sciences, vol. 6, no. 6, p. 174, 2016.
[22] Y. Shin, K. Choi, S. Kim, C. Yoo, and S. Ko, “A Novel 2-D Mapping-Based Remote Eye Gaze Tracking Method Using Two IR Light Sources,” IEEE International Conference on Consumer Electronics, pp. 190-191, 2015.
[23] X. Zhang, Y. Sugano, M. Fritz, and A. Bulling, “MPIIGaze: Real-World Dataset And Deep Appearance-Based Gaze Estimation,” IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017.
[24] W. Zhang, T. N. Zhang, and S. J. Chang, “Eye Gaze Estimation From The Elliptical Features Of One Iris,” SPIE Optical Engineering, vol. 50, no. 4, p. 047003, 2011.
[25] T. Ishikawa, S. Baker, I. Matthews, and T. Kanade, “Passive Driver Gaze Tracking With Active Appearance Models,” World Congress on Intelligent Transportation Systems, pp. 1-12, 2004.
[26] Y. L. Wu, C. T. Yeh, W. C. Hung, and C. Y. Tang, “Gaze Direction Estimation Using Support Vector Machine With Active Appearance Model,” Multimedia Tools and Applications, vol. 70, no. 3, pp. 2037-2062, 2014.
[27] H. C. Lu, C. Wang, and Y. W. Chen, “Gaze Tracking By Binocular Vision And LBP Features,” International Conference on Pattern Recognition, pp. 8-11, 2008.
[28] K. A. Funes-Mora, and J. M. Odobez, “Gaze Estimation In The 3D Space Using RGB-D Sensors,” International Journal of Computer Vision, vol. 118, no. 2, p. 194, 2016.
[29] R. Jafari, and D. Ziou, “Gaze Estimation Using Kinect/PTZ Camera,” International Symposium on Robotic and Sensors Environments Proceedings, pp. 13-18, 2012.
[30] J. Li, and S. Li, “Eye-Model-Based Gaze Estimation By RGB-D Camera,” IEEE Conference on Computer Vision and Pattern Recognition Workshops, pp. 606-610, 2014.
[31] X. Xiong, Q. Cai, Z. Liu, and Z. Zhang, “Eye Gaze Tracking Using An RGBD Camera: A Comparison With A RGB Solution,” International Joint Conference on Pervasive and Ubiquitous Computing, pp. 1113-1121, 2014.
[32] K. Wang, and Q. Ji, “Real Time Eye Gaze Tracking With 3D Deformable Eye-Face Model,” IEEE Conference on Computer Vision and Pattern Recognition, pp. 1003-1011, 2017.
[33] P. Zanuttigh, G. Marin, C. D. Mutto, F. Dominio, L. Minto, and G. M. Cortelazzo, “Operating Principles Of Structured Light Depth Cameras,” in Time-of-Flight and Structured Light Depth Cameras, pp. 43-79, Springer, 2016.
[34] S. Barone, A. Paoli, and A. Razionale, “A Coded Structured Light System Based On Primary Color Stripe Projection And Monochrome Imaging,” Sensors, vol. 13, no. 10, pp. 13802-13819, 2013.
[35] T. Hutchinson, “Eye Movement Detection With Improved Calibration And Speed,” U.S. Patent 4950069, April 1990.
[36] R. H. Spector, “Clinical Methods: The History, Physical, And Laboratory Examinations,” Boston: Butterworth, 1990.
[37] G. J. Anders, and D. Tong, “Depth Post-Processing For Intel RealSense D400 Depth Cameras,” 2018.
[38] R. M. Haralick, S. R. Sternberg, and X. Zhuang, “Image Analysis Using Mathematical Morphology,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. PAMI-9, no. 4, pp. 532-550, 1987.
[39] J. Canny, “A Computational Approach To Edge Detection,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. PAMI-8, no. 6, pp. 679-698, 1986.
[40] J. Lewis, “Fast Template Matching,” Vision Interface 95, pp. 15-19, 1995.
[41] T. Y. Chai, M. Rizon, S. S. Woo, and C. S. Tan, “Facial Features For Template Matching Based Face Recognition,” American Journal of Applied Sciences, vol. 6, no. 11, pp. 1897-1901, 2009.
[42] C. Xiu, and X. Pan, “Tracking Algorithm Based On The Improved Template Matching,” Chinese Control and Decision Conference, pp. 483-486, 2017.
[43] C. Gao, N. Sang, and R. Huang, “Biologically Inspired Template Matching Using Scene Context,” PLOS ONE, pp. 1-13, 2014.
[44] S. M. Weinberg, M. J. Kesterke, Z. D. Raffensperger, C. L. Heike, M. L. Cunningham, J. T. Hecht, J. C. Murray, G. L. Wehby, L. M. Moreno, and M. L. Marazita, “The 3D Facial Norms Database: Part 1. A Web-Based Craniofacial Anthropometric And Image Repository For The Clinical And Research Community,” Cleft Palate Craniofacial Journal, vol. 53, no. 6, pp. e185-e197, 2016.
[45] Y. H. Kwon, and D. V. Lobo, “Age Classification From Facial Images,” IEEE Conference on Computer Vision and Pattern Recognition, pp. 762-767, 1994.
[46] F. Shih, and C. Chuang, “Automatic Extraction Of Head And Face Boundaries And Facial Features,” Journal of Information Sciences, vol. 158, pp. 117-130, 2004.
[47] B. H. Oh, and K. S. Hong, “A Study On Facial Components Detection Method For Face-Based Emotion Recognition,” International Conference on Audio, Language and Image Processing, pp. 256-259, 2014.
[48] A. Nikolaidis, and I. Pitas, “Facial Feature Extraction And Determination Of Pose,” International Conference on Pattern Recognition, vol. 33, pp. 1783-1791, 2000.
[49] R. C. Gonzalez, and R. E. Woods, Digital Image Processing, 2nd ed., Reading, MA, USA: Addison-Wesley, 1992.
[50] E. Weisstein, “Least Squares Fitting,” MathWorld, A Wolfram Web Resource, [Online], [http://mathworld.wolfram.com/LeastSquaresFitting.html].
[51] T. Hutchinson, “Eye Movement Detection With Improved Calibration And Speed,” U.S. Patent 4950069, April 1990.
[52] Q. Ji, and X. Yang, “Real-Time Eye, Gaze, And Face Pose Tracking For Monitoring Driver Vigilance,” Real-Time Imaging, vol. 8, pp. 357-377, 2002.
[53] K. Mikolajczyk, and C. Schmid, “A Performance Evaluation Of Local Descriptors,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 10, pp. 1615-1630, 2005.
[54] M. Huang, Z. Mu, H. Zeng, and H. Huang, “A Novel Approach For Interest Point Detection Via Laplacian-Of-Bilateral Filter,” Journal of Sensors, vol. 3, pp. 1-9, 2015.
[55] R. Maini, and H. Aggarwal, “Study And Comparison Of Various Image Edge Detection Techniques,” International Journal of Image Processing, vol. 3, no. 1, pp. 1-11, 2009.
[56] S. Suzuki, and K. Abe, “Topological Structural Analysis Of Digitized Binary Images By Border Following,” Computer Vision, Graphics, and Image Processing, pp. 32-46, 1985.
[57] G. Bradski, and A. Kaehler, Learning OpenCV, 1st ed., O'Reilly Media, 2008.
[58] D. Pedoe, Circles: A Mathematical View (Spectrum), 2nd ed., Washington, DC, USA: Math. Assoc. America, 1997.
[59] E. Weisstein, “Circumcircle,” MathWorld, A Wolfram Web Resource, [Online], [http://mathworld.wolfram.com/Circumcircle.html].
[60] M. Fischler, and R. Bolles, “Random Sample Consensus: A Paradigm For Model Fitting With Applications To Image Analysis And Automated Cartography,” Communications of the ACM, vol. 23, no. 6, pp. 381-395, 1981.
[61] B. S. Lin, M. J. Su, P. H. Cheng, P. J. Tseng, and S. J. Chen, “Temporal And Spatial Denoising Of Depth Maps,” Sensors, vol. 15, no. 8, pp. 18506-18525, 2015.
[62] OpenCV Website, [Online], [http://opencv.org].
[63] Intel Website, [Online], [https://communities.intel.com/docs/DOC-24012].
[64] R. Jafari, and D. Ziou, “Gaze Estimation Using Kinect/PTZ Camera,” International Symposium on Robotic and Sensors Environments Proceedings, pp. 13-18, 2012.
[65] A. George, and A. Routray, “Real-Time Eye Gaze Direction Classification Using Convolutional Neural Network,” International Conference on Signal Processing and Communications, pp. 1-5, 2016.
[66] X. Zhang, H. Kulkarni, and M. Morris, “Smartphone-Based Gaze Gesture Communication For People With Motor Disabilities,” Conference on Human Factors in Computing Systems, pp. 2878-2889, 2017.
[67] R. Hyder, S. Chowdhury, and S. Fattah, “Real-Time Non-Intrusive Eye-Gaze Tracking Based Wheelchair Control For The Physically Challenged,” International Conference on Biomedical Engineering and Sciences, pp. 784-787, 2016.