[1] Rayner, K., & Reingold, E. M. (2015). Evidence for direct cognitive control of fixation durations during reading. Current Opinion in Behavioral Sciences, 1, 107-112.
[2] Rayner, K. (2009). Eye movements and attention in reading, scene perception, and visual search. The Quarterly Journal of Experimental Psychology, 62(8), 1457-1506.
[3] Posner, M. I., & Boies, S. J. (1971). Components of attention. Psychological Review, 78(5), 391.
[4] Chen, H. C., Wang, C. C., Hung, J. C., & Hsueh, C. Y. (2022). Employing eye tracking to study visual attention to live streaming: A case study of Facebook Live. Sustainability, 14(12), 7494.
[5] García-Carrión, B., Del Barrio-García, S., Muñoz-Leiva, F., & Porcu, L. (2023). Effect of social-media message congruence and generational cohort on visual attention and information-processing in culinary tourism: An eye-tracking study. Journal of Hospitality and Tourism Management, 55, 78-90.
[6] Park, B., Knörzer, L., Plass, J. L., & Brünken, R. (2015). Emotional design and positive emotions in multimedia learning: An eyetracking study on the use of anthropomorphisms. Computers & Education, 86, 30-42.
[7] Lin, Z., Liu, Y., Wang, H., Liu, Z., Cai, S., Zheng, Z., ... & Zhang, X. (2022). An eye tracker based on webcam and its preliminary application evaluation in Chinese reading tests. Biomedical Signal Processing and Control, 74, 103521.
[8] Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., & Hays, J. (2016). WebGazer: Scalable webcam eye tracking using user interactions. In Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence.
[9] Pathirana, P., Senarath, S., Meedeniya, D., & Jayarathna, S. (2022). Eye gaze estimation: A survey on deep learning-based approaches. Expert Systems with Applications, 199, 116894.
[10] Przybyło, J., Kańtoch, E., & Augustyniak, P. (2019). Eyetracking-based assessment of affect-related decay of human performance in visual tasks. Future Generation Computer Systems, 92, 504-515.
[11] Scott, G. G., Pinkosova, Z., Jardine, E., & Hand, C. J. (2023). “Thinstagram”: Image content and observer body satisfaction influence the when and where of eye movements during Instagram image viewing. Computers in Human Behavior, 138, 107464.
[12] Yee, N. (2006). Motivations for play in online games. CyberPsychology & Behavior, 9(6), 772-775.
[13] Locke, E. A. (2012). Construct validity vs. concept validity. Human Resource Management Review, 22(2), 146-148.
[14] Hinkin, T. R. (1998). A brief tutorial on the development of measures for use in survey questionnaires. Organizational Research Methods, 1(1), 104-121.
[15] de Mesquita Neto, J. A., & Becker, K. (2018). Relating conversational topics and toxic behavior effects in a MOBA game. Entertainment Computing, 26, 10-29.
[16] Bopp, J. A., Mekler, E. D., & Opwis, K. (2016, May). Negative emotion, positive experience? Emotionally moving moments in digital games. In Proceedings of the 2016 CHI conference on human factors in computing systems (pp. 2996-3006).
[17] Gutwin, C., Vicencio-Moreira, R., & Mandryk, R. L. (2016, October). Does helping hurt? Aiming assistance and skill development in a first-person shooter game. In Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play (pp. 338-349).
[18] Heitz, R. P. (2014). The speed-accuracy tradeoff: history, physiology, methodology, and behavior. Frontiers in Neuroscience, 8, 150.
[19] Chalmers, D. J. (1995). Facing up to the problem of consciousness. Journal of Consciousness Studies, 2(3), 200-219.
[20] James, W. (2007). The principles of psychology (Vol. 1). Cosimo, Inc.
[21] Ginsburg, S., & Jablonka, E. (2019). The evolution of the sensitive soul: learning and the origins of consciousness. MIT Press.
[22] Sattin, D., Magnani, F. G., Bartesaghi, L., Caputo, M., Fittipaldo, A. V., Cacciatore, M., ... & Leonardi, M. (2021). Theoretical models of consciousness: a scoping review. Brain Sciences, 11(5), 535.
[23] Hayes, S. C., & Hofmann, S. G. (2023). A biphasic relational approach to the evolution of human consciousness: Un enfoque relacional bifásico para la evolución de la conciencia humana. International Journal of Clinical and Health Psychology, 23(4), 100380.
[24] Lieberman, J. D., Solomon, S., Greenberg, J., & McGregor, H. A. (1999). A hot new way to measure aggression: Hot sauce allocation. Aggressive Behavior: Official Journal of the International Society for Research on Aggression, 25(5), 331-348.
[25] Buss, A. H., & Perry, M. (1992). The aggression questionnaire. Journal of Personality and Social Psychology, 63(3), 452.
[26] Denson, T. F., Pedersen, W. C., & Miller, N. (2006). The displaced aggression questionnaire. Journal of Personality and Social Psychology, 90(6), 1032.
[27] Herpin, G., Gauchard, G. C., Lion, A., Collet, P., Keller, D., & Perrin, P. P. (2010). Sensorimotor specificities in balance control of expert fencers and pistol shooters. Journal of Electromyography and Kinesiology, 20(1), 162-169.
[28] Peleg, K., Rivkind, A., Aharonson-Daniel, L., & Israeli Trauma Group. (2006). Does body armor protect from firearm injuries? Journal of the American College of Surgeons, 202(4), 643-648.
[29] Miller, N., Pedersen, W. C., Earleywine, M., & Pollock, V. E. (2003). A theoretical model of triggered displaced aggression. Personality and Social Psychology Review, 7(1), 75-97.
[30] Lajunen, T., Parker, D., & Stradling, S. G. (1998). Dimensions of driver anger, aggressive and highway code violations and their mediation by safety orientation in UK drivers. Transportation Research Part F: Traffic Psychology and Behaviour, 1(2), 107-121.
[31] Chow, R. M., Tiedens, L. Z., & Govan, C. L. (2008). Excluded emotions: The role of anger in antisocial responses to ostracism. Journal of Experimental Social Psychology, 44(3), 896-903.
[32] Löw, A., Weymar, M., & Hamm, A. O. (2015). When threat is near, get out of here: Dynamics of defensive behavior during freezing and active avoidance. Psychological Science, 26(11), 1706-1716.
[33] Mobbs, D., Hagan, C. C., Dalgleish, T., Silston, B., & Prévost, C. (2015). The ecology of human fear: survival optimization and the nervous system. Frontiers in Neuroscience, 9, 55.
[34] Hendrie, C. A., Weiss, S. M., & Eilam, D. (1998). Behavioural response of wild rodents to the calls of an owl: a comparative study. Journal of Zoology, 245(4), 439-446.
[35] Wade, N., & Tatler, B. W. (2005). The moving tablet of the eye: The origins of modern eye movement research. Oxford University Press, USA.
[36] 蔡介立 (2000). Exploring the information processing of Chinese reading through eye movement control: A series of studies applying the eye-movement-contingent display technique (Unpublished doctoral dissertation). Graduate Institute of Psychology, National Chengchi University.
[37] Chen, Y., & Tsai, M. J. (2015). Eye-hand coordination strategies during active video game playing: An eye-tracking study. Computers in Human Behavior, 51, 8-14.
[38] Mikalef, P., Sharma, K., Chatterjee, S., Chaudhuri, R., Parida, V., & Gupta, S. (2023). All eyes on me: Predicting consumer intentions on social commerce platforms using eye-tracking data and ensemble learning. Decision Support Systems, 114039.
[39] Meo, M., Del Punta, J. A., Sánchez, I., de Luis García, R., Gasaneo, G., & Martin, R. (2023). A dynamical method to objectively assess infantile nystagmus based on eye tracking. A pilot study. Journal of Optometry, 16(3), 221-228.
[40] Park, S., Spurr, A., & Hilliges, O. (2018). Deep pictorial gaze estimation. In Proceedings of the European conference on computer vision (ECCV) (pp. 721-738).
[41] Schmitz, I., & Einhäuser, W. (2023). Gaze estimation in videoconferencing settings. Computers in Human Behavior, 139, 107517.
[42] de Chambrier, A. F., Pedrotti, M., Ruggeri, P., Dewi, J., Atzemian, M., Thevenot, C., ... & Terrier, P. (2023). Reading numbers is harder than reading words: An eye-tracking study. Acta Psychologica, 237, 103942.
[43] Andersson, R., Nyström, M., & Holmqvist, K. (2010). Sampling frequency and eye-tracking measures: how speed affects durations, latencies, and more. Journal of Eye Movement Research, 3(3).
[44] 吳昇儒 (2015). Design of an eye tracker system with natural light illumination (Master's thesis, National Taiwan Normal University). National Digital Library of Theses and Dissertations in Taiwan.
[45] Meißner, M., Pfeiffer, J., Pfeiffer, T., & Oppewal, H. (2019). Combining virtual reality and mobile eye tracking to provide a naturalistic experimental environment for shopper research. Journal of Business Research, 100, 445-458.
[46] Yoshimura, Y., Kizuka, T., & Ono, S. (2021). Properties of fast vergence eye movements and horizontal saccades in athletes. Physiology & Behavior, 235, 113397.
[47] Wang, H., Pi, J., Qin, T., Shen, S., & Shi, B. E. (2018, June). SLAM-based localization of 3D gaze using a mobile eye tracker. In Proceedings of the 2018 ACM symposium on eye tracking research & applications (pp. 1-5).
[48] Wan, Z. H., Xiong, C. H., Chen, W. B., & Zhang, H. Y. (2021). Robust and accurate pupil detection for head-mounted eye tracking. Computers & Electrical Engineering, 93, 107193.
[49] Al-Kassim, Z., & Memon, Q. A. (2017). Designing a low-cost eyeball tracking keyboard for paralyzed people. Computers & Electrical Engineering, 58, 20-29.
[50] Zhang, X., Sugano, Y., Fritz, M., & Bulling, A. (2017). It's written all over your face: Full-face appearance-based gaze estimation. In Proceedings of the IEEE conference on computer vision and pattern recognition workshops (pp. 51-60).
[51] Blignaut, P., & Wium, D. (2014). Eye-tracking data quality as affected by ethnicity and experimental design. Behavior Research Methods, 46, 67-80.
[52] Wang, Y. (2023). The impact of linguistic metaphor on translation unit in target text processing: An eye tracking and keylogging English-Chinese translation study. Ampersand, 100129.
[53] Jin, J., Wang, A., Wang, C., & Ma, Q. (2023). How do consumers perceive and process online overall vs. individual text-based reviews? Behavioral and eye-tracking evidence. Information & Management, 60(5), 103795.
[54] Blascheck, T., Schweizer, M., Beck, F., & Ertl, T. (2017, June). Visual comparison of eye movement patterns. In Computer Graphics Forum (Vol. 36, No. 3, pp. 87-97).
[55] O'Shea, K., & Nash, R. (2015). An introduction to convolutional neural networks. arXiv preprint arXiv:1511.08458.
[56] Lindsley, D. B. (1951). Emotion. In S. S. Stevens (Ed.), Handbook of experimental psychology. Wiley.
[57] He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep residual learning for image recognition. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 770-778).
[58] Simonyan, K., & Zisserman, A. (2014). Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556.
[59] Yarotsky, D. (2017). Error bounds for approximations with deep ReLU networks. Neural Networks, 94, 103-114.
[60] Song, L., Fan, J., Chen, D. R., & Zhou, D. X. (2023). Approximation of nonlinear functionals using deep ReLU networks. arXiv preprint arXiv:2304.04443.
[61] Kutlugün, M. A., Sirin, Y., & Karakaya, M. (2019, September). The effects of augmented training dataset on performance of convolutional neural networks in face recognition system. In 2019 Federated Conference on Computer Science and Information Systems (FedCSIS) (pp. 929-932). IEEE.
[62] Zhang, K., Zhang, Z., Li, Z., & Qiao, Y. (2016). Joint face detection and alignment using multitask cascaded convolutional networks. IEEE Signal Processing Letters, 23(10), 1499-1503.
[63] Zhang, D., Li, J., & Shan, Z. (2020, November). Implementation of Dlib deep learning face recognition technology. In 2020 International Conference on Robots & Intelligent System (ICRIS) (pp. 88-91). IEEE.
[64] Wang, C. Y., Bochkovskiy, A., & Liao, H. Y. M. (2023). YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 7464-7475).