[1] T. Okuma, T. Kurata, and K. Sakaue, "Real-Time Camera Parameter Estimation from Images for a Wearable Vision System," in Proc. IAPR Workshop on Machine Vision Applications, pp. 4482-4486, 2000.
[2] K. Oka, Y. Sato, and H. Koike, "Real-Time Tracking of Multiple Fingertips and Gesture Recognition for Augmented Desk Interface Systems," in Proc. IEEE International Conference on Automatic Face and Gesture Recognition, pp. 429-434, 2002.
[3] K. Hu, S. Canavan, and L. Yin, "Hand Pointing Estimation for Human Computer Interaction Based on Two Orthogonal-Views," in Proc. International Conference on Pattern Recognition, pp. 3760-3763, 2010.
[4] S. Hodges, L. Williams, E. Berry, S. Izadi, J. Srinivasan, A. Butler, G. Smyth, and N. Kapur, "SenseCam: A Retrospective Memory Aid," in Proc. International Conference on Ubiquitous Computing, pp. 177-193, 2006.
[5] M. Havlena, A. Ess, W. Moreau, A. Torii, M. Jancosek, T. Pajdla, and L. Van Gool, "AWEAR 2.0 System: Omni-directional Audio-Visual Data Acquisition and Processing," in Proc. IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, pp. 49-56, 2009.
[6] G. Balakrishnan, G. Sainarayanan, R. Nagarajan, and S. Yaacob, "Wearable Real-Time Stereo Vision for the Visually Impaired," Engineering Letters, vol. 14, no. 2, pp. 6-14, 2007.
[7] Y. Liu, X. Liu, and U. Jia, "Hand-Gesture Based Text Input for Wearable Computer," in Proc. IEEE International Conference on Computer Vision Systems, pp. 8-13, 2006.
[8] R. Grasset, A. Dunser, and M. Billinghurst, "Human-Centered Development of an AR Handheld Display," in Proc. IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 177-180, 2007.
[9] B. F. Goldiez, A. M. Ahmad, and P. A. Hancock, "Effects of Augmented Reality Display Settings on Human Wayfinding Performance," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 37, no. 5, pp. 839-845, 2007.
[10] J. Yang, W. Yang, M. Denecke, and A. Waibel, "Smart Sight: A Tourist Assistant System," in Proc. International Symposium on Wearable Computers, vol. 1, pp. 73-78, Oct. 1999.
[11] T. Brown and R. C. Thomas, "Finger Tracking for the Digital Desk," in Proc. Australasian User Interface Conference, vol. 1, pp. 11-16, 2000.
[12] A. Wu, M. Shah, and N. D. V. Lobo, "A Virtual 3D Blackboard: 3D Finger Tracking Using a Single Camera," in Proc. IEEE International Conference on Automatic Face and Gesture Recognition, pp. 536-543, 2000.
[13] T. Keaton, S. M. Dominguez, and A. H. Sayed, "Snap&Tell: A Multimodal Wearable Computer Interface for Browsing the Environment," in Proc. International Symposium on Wearable Computers, pp. 75-82, Oct. 2002.
[14] S. Dominguez, T. Keaton, and A. Sayed, "A Robust Finger Tracking Method for Multimodal Wearable Computer Interfacing," IEEE Transactions on Multimedia, vol. 8, no. 5, pp. 956-972, 2006.
[15] S. Mitra and T. Acharya, "Gesture Recognition: A Survey," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 37, no. 3, pp. 311-324, May 2007.
[16] K. Wang, W. Li, R. F. Li, and L. Zhao, "Real-Time Hand Gesture Recognition for Service Robot," in Proc. International Conference on Intelligent Computation Technology and Automation, vol. 2, pp. 976-979, 2010.
[17] S. W. Lee, "Automatic Gesture Recognition for Intelligent Human-Robot Interaction," in Proc. Seventh International Conference on Automatic Face and Gesture Recognition, pp. 645-650, 2006.
[18] A. Corradini, "Dynamic Time Warping for Off-Line Recognition of a Small Gesture Vocabulary," in Proc. IEEE ICCV Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems, pp. 82-89, 2001.
[19] R. Cutler and M. Turk, "View-Based Interpretation of Real-Time Optical Flow for Gesture Recognition," in Proc. Third IEEE International Conference on Automatic Face and Gesture Recognition, pp. 416-421, 1998.
[20] T. Darrell and A. Pentland, "Space-Time Gestures," in Proc. IEEE Conference on Computer Vision and Pattern Recognition, pp. 335-340, 1993.
[21] M. Gandy, T. Starner, J. Auxier, and D. Ashbrook, "The Gesture Pendant: A Self-Illuminating, Wearable, Infrared Computer Vision System for Home Automation Control and Medical Monitoring," in Proc. Fourth International Symposium on Wearable Computers, pp. 87-94, 2000.
[22] K. Oka, Y. Sato, and H. Koike, "Real-Time Fingertip Tracking and Gesture Recognition," IEEE Computer Graphics and Applications, vol. 22, no. 6, pp. 64-71, Dec. 2002.
[23] T. Starner, J. Weaver, and A. Pentland, "Real-Time American Sign Language Recognition Using Desk and Wearable Computer Based Video," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 12, pp. 1371-1375, Dec. 1998.
[24] M. H. Yang, N. Ahuja, and M. Tabb, "Extraction of 2D Motion Trajectories and Its Application to Hand Gesture Recognition," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 8, pp. 1061-1074, Aug. 2002.
[25] V. Pavlovic, R. Sharma, and T. Huang, "Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 7, pp. 677-695, July 1997.
[26] Y. Cui and J. Weng, "Appearance-Based Hand Sign Recognition from Intensity Image Sequences," Computer Vision and Image Understanding, vol. 78, no. 2, pp. 157-176, May 2000.
[27] E. Ong and R. Bowden, "A Boosted Classifier Tree for Hand Shape Detection," in Proc. Sixth IEEE International Conference on Automatic Face and Gesture Recognition, pp. 889-894, 2004.
[28] M. Isard and A. Blake, "CONDENSATION-Conditional Density Propagation for Visual Tracking," International Journal of Computer Vision, vol. 29, no. 1, pp. 5-28, 1998.
[29] M. Kolsch and M. Turk, "Fast 2D Hand Tracking with Flocks of Features and Multi-Cue Integration," in Proc. IEEE Workshop on Real-Time Vision for Human-Computer Interaction, pp. 158-165, 2004.
[30] N. Stefanov, A. Galata, and R. Hubbold, "Real-Time Hand Tracking with Variable-Length Markov Models of Behaviour," in Proc. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 3, pp. 73-80, 2005.
[31] B. Stenger, A. Thayananthan, P. Torr, and R. Cipolla, "Filtering Using a Tree-Based Estimator," in Proc. Ninth IEEE International Conference on Computer Vision, pp. 1063-1070, 2003.
[32] E. Sudderth, M. Mandel, W. Freeman, and A. Willsky, "Visual Hand Tracking Using Nonparametric Belief Propagation," in Proc. IEEE CVPR Workshop on Generative Model Based Vision, pp. 189-197, 2004.
[33] F. Chen, C. Fu, and C. Huang, "Hand Gesture Recognition Using a Real-Time Tracking Method and Hidden Markov Models," Image and Vision Computing, vol. 21, no. 8, pp. 745-758, Aug. 2003.
[34] J. Martin, V. Devin, and J. Crowley, "Active Hand Tracking," in Proc. Third IEEE International Conference on Automatic Face and Gesture Recognition, pp. 573-578, 1998.
[35] T. S. Caetano, S. D. Olabarriaga, and D. A. C. Barone, "Do Mixture Models in Chromaticity Space Improve Skin Detection?" Pattern Recognition, vol. 36, no. 12, pp. 3019-3021, 2003.
[36] V. Monga and R. Bala, "Algorithms for Color Look-Up-Table (LUT) Design via Joint Optimization of Node Locations and Output Values," in Proc. International Conference on Acoustics, Speech, and Signal Processing, pp. 998-1001, 2010.
[37] M. Mese and P. P. Vaidyanathan, "Look Up Table (LUT) Method for Image Halftoning," in Proc. International Conference on Image Processing, vol. 3, pp. 993-996, 2000.
[38] "BeagleBoard System Reference Manual Rev C4," http://beagleboard.org/, 2010.
[39] J. Lincoln, "The Latest Video Projection Can Fit Inside Tiny Cameras or Cellphones Yet Still Produce Big Pictures," IEEE Spectrum, vol. 47, no. 5, pp. 41-45, 2010.
[40] "FFmpeg," http://www.ffmpeg.org/, 2010.
[41] G. R. Bradski and A. Zelinsky, "Learning OpenCV - Computer Vision with the OpenCV Library," IEEE Robotics and Automation Magazine, vol. 16, no. 3, pp. 100-100, 2009.