[1]S. T. Barnard and M. A. Fischler, “Computational Stereo,” Computing Surveys, Vol. 14, pp. 553-572, 1982.
[2]D. Marr and T. Poggio, “Cooperative Computation of Stereo Disparity,” Science, Vol. 194, pp. 283-287, 1976.
[3]S. T. Barnard and W. B. Thompson, “Disparity Analysis of Images,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 2, pp. 330-340, 1980.
[4]范國清, 王元凱 and 陳炳富, “An Introduction to Tracking Algorithms,” Image and Recognition, Vol. 8, No. 4, Dec. 2002.
[5]Y. K. Jung, K. W. Lee and Y. S. Ho, “Content-Based Event Retrieval Using Semantic Scene Interpretation for Automated Traffic Surveillance,” IEEE Transactions on Intelligent Transportation Systems, Vol. 2, No. 3, pp. 151-163, Sept. 2001.
[6]D. J. Dailey et al., “An Algorithm to Estimate Mean Traffic Speed Using Uncalibrated Cameras,” IEEE Transactions on Intelligent Transportation Systems, Vol. 1, No. 2, pp. 98-107, June 2000.
[7]M. Ye and R. M. Haralick, “Optical Flow from a Least-Trimmed Squares Based Adaptive Approach,” Proceedings of the 15th International Conference on Pattern Recognition, Vol. 3, pp. 1052-1055, 2000.
[8]B. D. Lucas and T. Kanade, “An Iterative Image Registration Technique with an Application to Stereo Vision,” Proceedings of the DARPA Image Understanding Workshop, pp. 121-130, 1981.
[9]C. S. Fuh and P. Maragos, “Region-Based Optical Flow Estimation,” IEEE Conference on Computer Vision and Pattern Recognition, San Diego, CA, pp. 130-133, 1989.
[10]A. F. Bobick and J. W. Davis, “The Recognition of Human Movement Using Temporal Templates,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 23, No. 3, pp. 257-267, 2001.
[11]J. C. Burie and J. G. Postaire, “A New Edge Matching Procedure for Obstacle Detection by Linear Stereo Vision,” Proceedings of the Intelligent Vehicles '93 Symposium, pp. 414-419, July 1993.
[12]S. Birchfield, “Elliptical Head Tracking Using Intensity Gradients and Color Histograms,” IEEE Conference on Computer Vision and Pattern Recognition, Santa Barbara, California, Jun. 1998.
[13]何宜達, “Application of Visual Servoing to Three-Dimensional Target Trajectory Prediction and Interception,” Master's thesis, National Cheng Kung University, 2001.
[14]楊智凱, “Design and Implementation of a Visual Servo Controller for a Ball-on-Beam Balancing System,” Master's thesis, National Changhua University of Education, 2004.
[15]W. J. Wilson, C. C. Williams Hulls and G. S. Bell, “Relative End-Effector Control Using Cartesian Position Based Visual Servoing,” IEEE Transactions on Robotics and Automation, Vol. 12, No. 5, pp. 684-696, Oct. 1996.
[16]D. B. Westmore and W. J. Wilson, “Direct Dynamic Control of a Robot Using an End-Point Mounted Camera and Kalman Filter Position Estimation,” IEEE International Conference on Robotics and Automation, Vol. 3, pp. 2376-2384, Apr. 1991.
[17]V. Lippiello, B. Siciliano and L. Villani, “Position and Orientation Estimation Based on Kalman Filtering of Stereo Images,” IEEE International Conference on Control Applications, pp. 702-707, Sep. 2001.
[18]P. K. Allen, A. Timcenko, B. Yoshimi and P. Michelman, “Automated Tracking and Grasping of a Moving Object with a Robotic Hand-Eye System,” IEEE Transactions on Robotics and Automation, Vol. 9, No. 2, pp. 152-165, Apr. 1993.
[19]L. E. Weiss, A. C. Sanderson and C. P. Neuman, “Dynamic Sensor-Based Control of Robots with Visual Feedback,” IEEE Journal of Robotics and Automation, Vol. RA-3, No. 5, pp. 404-417, 1987.
[20]S. Hutchinson, G. D. Hager and P. I. Corke, “A Tutorial on Visual Servo Control,” IEEE Transactions on Robotics and Automation, Vol. 12, No. 5, pp. 651-670, Oct. 1996.
[21]A. Castano and S. Hutchinson, “Visual Compliance: Task-Directed Visual Servo Control,” IEEE Transactions on Robotics and Automation, Vol. 10, No. 3, pp. 334-342, Jun. 1994.
[22]K. Hashimoto, T. Kimoto, T. Ebine and H. Kimura, “Manipulator Control with Image-Based Visual Servo,” IEEE International Conference on Robotics and Automation, Vol. 3, pp. 2267-2271, Apr. 1991.
[23]P. Liang, Y. L. Chang, and S. Hackwood, “Adaptive Self-Calibration of Vision-Based Robot Systems,” IEEE Transactions on Systems, Man, and Cybernetics, Vol. 19, No. 4, pp. 811-824, 1989.
[24]W. Y. Yau and H. Wang, “Fast Relative Depth Computation for an Active Stereo Vision System,” Real-Time Imaging, Vol. 5, pp. 189-202, 1999.
[25]R. Y. Tsai and R. K. Lenz, “A New Technique for Fully Autonomous and Efficient 3D Robotics Hand/Eye Calibration,” IEEE Transactions on Robotics and Automation, Vol. 5, No. 3, pp. 345-358, Jun. 1989.
[26]R. Tsai, “A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses,” IEEE Journal of Robotics and Automation, Vol. 3, Issue 4, pp. 323-344, Aug. 1987.
[27]H. Zhuang and Y. C. Shiu, “A noise-tolerant algorithm for robotic hand-eye calibration with or without sensor orientation measurement,” IEEE Transactions on Systems, Man and Cybernetics, Vol. 23, No. 4, pp. 1168-1175, July-Aug. 1993.
[28]Y. C. Shiu and S. Ahmad, “Calibration of wrist-mounted robotic sensors by solving homogeneous transform equations of the form AX = XB,” IEEE Transactions on Robotics and Automation, pp. 16-29, Feb. 1989.
[29]A. A. Rizzi and D. E. Koditschek, “An Active Visual Estimator for Dexterous Manipulation,” IEEE Transactions on Robotics and Automation, Vol. 12, Issue 5, pp. 697-713, Oct. 1996.
[30]R. Klette, K. Schlüns and A. Koschan, “Computer Vision: Three-Dimensional Data from Images,” Springer, 1996.
[31]S. B. Niku, “Introduction to Robotics: Analysis, Systems, Applications,” Prentice Hall, 2001.
[32]R. C. Gonzalez and R. E. Woods, “Digital Image Processing,” 2nd ed., Prentice Hall, 2002.
[33]林義欽, “Remote Control of a Self-Propelled Welding Robot Manipulator,” Master's thesis, National Central University, 2001.
[34]陳彥良, “Real-Time Stereo Vision Object Tracking System,” Master's thesis, Chung Yuan Christian University, 2003.
[35]黃志鴻, “Vision-Based Grasping System for a Robot Manipulator,” Master's thesis, National Chung Cheng University, 2003.
[36]吳承柯, 戴善榮, 程湘君 and 雲立實, “Digital Image Processing,” 儒林, 1993.
[37]晉茂林, “Robotics,” 五南圖書, 1999.
[38]郭再興, “Machine Vision: A Killer Application of the 1394 Interface,” http://www.G4.com.tw.