National Digital Library of Theses and Dissertations in Taiwan

Detailed Record

Author: 張宜平
Author (English): I-Ping Chang
Title: 利用3D視覺之機械手眼控制演算法
Title (English): Robot Hand/Eye Control Algorithms Using 3D Machine Vision
Advisor: 蕭瑛星
Advisor (English): Ying-Shing Shiao
Degree: Master's
Institution: National Changhua University of Education
Department: Department of Electrical Engineering
Discipline: Engineering
Field: Electrical and Computer Engineering
Thesis type: Academic thesis
Year of publication: 2006
Graduation academic year: 94 (2005-2006)
Language: Chinese
Keywords (Chinese): stereo vision; hand-eye configuration; trajectory detection
Keywords (English): SSG; Hand-Eye Configuration; trajectory detection
Record statistics:
  • Cited by: 1
  • Views: 482
  • Downloads: 96
  • Bookmarked: 3
This thesis uses image processing methods to build a stereo machine vision system for controlling a robot arm to grasp a moving object. A pair of cameras and a robot arm are combined into a hand-eye configuration with visual feedback, and hand-eye coordinated control is used to grasp the moving object. In the control method, two cameras capture sequential images of the moving target and the background; difference, binarization, correlation, and centroid operations on adjacent image pairs yield the target's displacement, and the standard stereo geometry (SSG) of the two cameras is then used to compute the target's trajectory in space. The inverse kinematics of the robot arm gives the control command for each joint, completing the visual control. This thesis also studies a search method for corresponding feature points in the left and right images, which quickly computes the target's trajectory in 3D space; combined with shortest-path motion planning for the robot arm, this allows fast-moving objects to be grasped. Finally, an experimental system is built, and the experiments show that the proposed method is effective.
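As a rough illustration of the moving-target detection step described above (frame differencing, binarization, and centroid computation), the following is a minimal sketch using OpenCV; it is not the thesis's implementation, and the camera index and binarization threshold are assumed placeholder values.

    # Minimal sketch of the moving-target detection step: frame differencing,
    # binarization, and centroid computation via image moments.
    # Not the thesis's implementation; the threshold and camera index are
    # placeholder assumptions.
    import cv2

    def detect_moving_target(prev_gray, curr_gray, threshold=30):
        """Return the (x, y) pixel centroid of the moving region, or None."""
        diff = cv2.absdiff(curr_gray, prev_gray)                          # frame difference
        _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)  # binarization
        m = cv2.moments(mask, binaryImage=True)                           # centroid from moments
        if m["m00"] == 0:
            return None                                                   # no motion detected
        return m["m10"] / m["m00"], m["m01"] / m["m00"]

    cap = cv2.VideoCapture(0)                                             # assumed camera index
    ok, prev = cap.read()
    if not ok:
        raise SystemExit("camera not available")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        centroid = detect_moving_target(prev_gray, gray)
        if centroid is not None:
            print("target centroid (pixels):", centroid)
        prev_gray = gray
    cap.release()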
This thesis proposes a binocular machine vision system that uses image processing algorithms to control a robot grasping a moving target. The binocular cameras are combined with a robot in a cooperative hand-eye configuration, with the cameras serving as visual feedback sensors for controlling the manipulator. The cameras grab sequential images in which the moving target is detected using subtraction, binarization, projection, and cross-correlation image processing procedures. The geometric relations of standard stereo geometry (SSG) are used to calculate the trajectory of the moving target in world coordinates. The target trajectory is used to determine the inverse kinematic solutions for the robot joint commands, thereby accomplishing visual servo control. An algorithm for searching the corresponding points in the left and right images is proposed, enabling fast calculation of the 3D target trajectory. Shortest-path planning for the robot to grasp the moving object is also studied. An experimental system is implemented, and the experimental results demonstrate that the proposed vision control algorithms are feasible.
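As a rough sketch of the standard stereo geometry (SSG) relation used to recover the target position, the snippet below triangulates one matched point pair from rectified, parallel left/right images; the focal length and baseline values are illustrative placeholders, not the calibration results of this thesis.

    # Minimal sketch of standard stereo geometry (SSG) triangulation for two
    # parallel, rectified cameras. Focal length and baseline are placeholder
    # values, not the calibration results reported in the thesis.

    def triangulate_ssg(xl, yl, xr, focal_px, baseline_mm):
        """Recover (X, Y, Z) in left-camera coordinates from a matched point pair.

        xl, yl : pixel coordinates in the left image, relative to the principal point
        xr     : x pixel coordinate of the same point in the right image
        """
        disparity = xl - xr                        # horizontal disparity in pixels
        if disparity <= 0:
            raise ValueError("non-positive disparity; bad correspondence")
        Z = focal_px * baseline_mm / disparity     # depth from similar triangles
        X = xl * Z / focal_px                      # lateral position
        Y = yl * Z / focal_px                      # vertical position
        return X, Y, Z

    # Example with assumed parameters: 800 px focal length, 120 mm baseline.
    print(triangulate_ssg(xl=52.0, yl=-14.0, xr=34.0, focal_px=800.0, baseline_mm=120.0))

Triangulating the matched centroid in each frame pair yields a sequence of 3D points that forms the target trajectory passed to the robot's inverse kinematics.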
Abstract (Chinese) ........ i
Abstract (English) ........ ii
Acknowledgments ........ iii
Table of Contents ........ iv
List of Figures ........ vi
List of Tables ........ ix
Chapter 1  Introduction ........ 1
1.1 Preface ........ 1
1.2 Literature Review ........ 2
1.3 Research Motivation and Objectives ........ 6
1.4 Thesis Organization ........ 7
Chapter 2  Stereo Vision and Image Processing ........ 8
2.1 Coplanar Stereo Image Geometry ........ 8
2.2 Moving Object Detection ........ 13
Chapter 3  Robot Arm Motion Control ........ 21
3.1 Homogeneous Coordinate Transformation ........ 21
3.2 Robot Arm Kinematics ........ 23
3.3 Path Planning ........ 33
Chapter 4  Visual Servoing and Motion Trajectory Detection ........ 36
4.1 Visual Servoing Architecture ........ 37
4.2 Motion Trajectory ........ 43
4.3 RS-232 Transmission Architecture ........ 47
Chapter 5  Experimental Results ........ 54
5.1 Stereo Calibration Experiments ........ 58
5.2 Moving Object Trajectory Detection Experiments ........ 63
Chapter 6  Conclusions and Suggestions ........ 75
References ........ 77
Appendix A ........ 82