[1]Billinghurst, M., Kato, H., & Poupyrev, I. 2001. The MagicBook: a transitional AR interface. Computers & Graphics, 25, pp. 745-753.
[2]Billinghurst, M., Kato, H., & Poupyrev, I. 2001. The MagicBook - Moving Seamlessly between Reality and Virtuality. IEEE Computer Graphics and Applications, 21(3), pp. 6-8.
[3]Zhong, Z., Hu, J., Tan, G., & Sun, C. 2009. The Application of Google Earth in Education. International Workshop on Education Technology and Computer Science (ETCS '09), 1, pp. 10-13.
[4]McIntyre, D., & Wolff, F. 1998. An experiment with WWW interactive learning in university education. Computers & Education, 31(3), pp. 255-264.
[5]Scaife, M., Rogers, Y., Aldrich, F., & Davies, M. 1997. Designing for or designing with? Informant design for interactive learning environments. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 343-350.
[6]李青蓉, 魏丕信, 施郁芬, & 邱昭彰. 1998. Human-Computer Interface Design (人機介面設計). National Open University Press.
[7]洪瑞隆. 1993. A Study on the Evaluation of Chinese Screen Design (中文畫面設計評估之研究). Master's thesis, Graduate Institute of Management, School of Business, Soochow University.
[8]Carroll, J. M. 1997. Human-Computer Interaction: Psychology as a Science of Design. Annual Review of Psychology, 48, pp. 61-83.
[9]Hakiel, S. 1997. Delivering ease of use [software development]. Computing & Control Engineering Journal, 8(2), pp. 81-87.
[10]鄧友清. 2000. A Study of a Usability-Oriented Web-Based Instruction Interface for Junior High School Mathematics (以『使用性』為訴求的國中數學網路教學介面之研究). Master's thesis, Graduate Institute of Information Management, Da-Yeh University.
[11]Preece, J., Benyon, D., Davies, G., & Keller, L. 1993. A Guide to Usability: Human Factors in Computing.
[12]Weinschenk, S., Jamar, P., & Yeo, S. C. 1997. GUI Design Essentials. Wiley.
[13]Newman, W. M., & Lamming, M. G. 1995. Interactive System Design. Prentice Hall Europe.
[14]王俊人. 1991. A Study of User Interfaces and Interactivity in Multimedia Software (多媒體軟體使用者介面與互動性之研究). Master's thesis, Graduate Institute of Information Management, National Chengchi University.
[15]張吉元. 1997. A Study of How Combinations of Text, Speech, and Display Effects in Multimedia Product Display System Interfaces Affect Consumers (多媒體商品展示系統介面文字語音及顯示特效的搭配對消費者影響之研究). Master's thesis, Graduate Institute of Information Management, Da-Yeh University.
[16]李宜珍. 1992. On Highly Effective Multimedia Interface Design (談高效應的多媒體介面設計). Proceedings of the 1992 International Conference on Audio-Visual Education.
[17]李孟軒. 2007. A Study of Interface Design Combining Augmented Reality Technology with Interactive Digital Archive Exhibition (『擴增實境』科技結合互動式數位典藏展示介面設計之研究). Master's thesis, Graduate Institute of Visual Communication Design, Kun Shan University of Science and Technology.
[18]Milgram, P., Takemura, H., Utsumi, A., & Kishino, F. 1994. Augmented Reality: A class of displays on the reality-virtuality continuum. Telemanipulator and Telepresence Technologies, Proceedings of SPIE, 2351.
[19]Azuma, R. T. 1997. A Survey of Augmented Reality. Presence: Teleoperators and Virtual Environments, 6(4), pp. 355-385.
[20]科學人雜誌 (Scientific American, Traditional Chinese Edition). 2002. June issue.
[21]高振源. 2006. A Feasibility Study of Applying Touch-Based Augmented Reality to the Operation Panel Design of Vending Machines in Unmanned Stores (觸碰式『擴增實境』應用在無人店舖自動販賣機操作面板設計的可行性之研究). Master's thesis, Graduate Institute of Visual Communication Design, Kun Shan University of Science and Technology.
[22]Vallino, J. R. 1998. Interactive Augmented Reality. PhD thesis, University of Rochester.
[23]Mazilu, G., Milgram, P., et al. 1997. STEAR 10 Telepresence: Stereoscopic Interface for Teleoperation (SITE). Final Report to the Canadian Space Agency for the CSA/SSPO/STEAR 10 contract. Canadian Space Agency.
[24]Asai, K., Kobayashi, H., & Kondo, T. 2005. Augmented instructions - a fusion of augmented reality and printed learning materials. Proceedings of the Fifth IEEE International Conference on Advanced Learning Technologies (ICALT 2005), pp. 213-215.
[25]Alamri, A., Cha, J., & El Saddik, A. 2010. AR-REHAB: An Augmented Reality Framework for Poststroke-Patient Rehabilitation. IEEE Transactions on Instrumentation and Measurement, 59(10), pp. 2554-2563.
[26]Bailey, R. W. 1996. Human Performance Engineering: Designing High Quality Professional User Interfaces for Computer Products, Applications and Systems (3rd ed.). Prentice Hall.
[27]De Paolis, L., Aloisio, G., & Pulimeno, M. 2009. A Simulation of a Billiards Game Based on Marker Detection. IEEE Conference Publications, pp. 148-151.
[28]Garcia Macias, J., Alvarez-Lozano, J., Estrada-Martinez, P., & Aviles-Lopez, E. 2011. Browsing the Internet of Things with Sentient Visors. Computer, 44(5), pp. 46-52.
[29]Gross, T. 1997. Towards flexible support for cooperation: group awareness in shared workspaces. IEEE Conference Publications, pp. 406-411.
[30]Guan, T., Duan, L., Yu, J., Chen, Y., & Zhang, X. 2011. Real-Time Camera Pose Estimation for Wide-Area Augmented Reality Applications. Computer Graphics and Applications, 31(3), pp. 56-68.
[31]Hadidi, R., & Sung, C.-H. 1998. Students' Acceptance of Web-Based Course Offerings: An Empirical Assessment. AMCIS, p. 359.
[32]Hornecker, E., & Dunser, A. 2009. Of pages and paddles: Children’s expectations and mistaken interactions. Interacting with Computers, 21(1-2), pp. 95-107.
[33]Janssen, J., Bailenson, J., IJsselsteijn, W., & Westerink, J. 2010. Intimate Heartbeats: Opportunities for Affective Communication Technology. Affective Computing, pp. 72-80.
[34]Lee, W., Park, Y., Lepetit, V., & Woo, W. 2011. Video-Based In Situ Tagging on Mobile Phones. Circuits and Systems for Video Technology, 21(10), pp. 1487-1496.
[35]Milster, T. 1997. A user-friendly diffraction modeling program. Optical Data Storage Topical Meeting (ODS 1997), Conference Digest, pp. 60-61.
[36]Minocha, S. 1999. Requirements development in user-centred system design. Making User-Centred Design Work in Software Development, pp. 7/1-7/4.
[37]Morales, C., Cory, C., & Bozell, D. 2001. A Comparative Efficiency Study Between a Live Lecture and a Web Based Live-Switched Multi-Camera Streaming Video Distance Learning Instructional Unit. Managing Information Technology in a Global Economy, p. 4.
[38]Niewiadomski, R., Hyniewska, S., & Pelachaud, C. 2011. Constraint-Based Model for Synthesis of Multimodal Sequential Expressions of Emotions. Affective Computing, 2(3), pp. 134-146.
[39]Park, H., Seo, B.-K., & Park, J.-I. 2010. Subjective Evaluation on Visual Perceptibility of Embedding Complementary Patterns for Nonintrusive Projection-Based Augmented Reality. Circuits and Systems for Video Technology, 20(5), pp. 687-696.
[40]Stapleton, C., & Rolland, J. 2010. Mixing Realities at ISMAR 2009: Scary and Wondrous. Computer Graphics and Applications, 30(3), pp. 89-95.
[41]Strosninder, M. 1999. User and task analysis for interface design. Professional Communication, 42(3), pp. 188-189.
[42]Zhang, D., Zhou, L., Briggs, R. O., & Nunamaker, J. F. 2006. Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness. Information & Management, 43(1), pp. 15-27.