Graduate Student: 黃俊傑
Graduate Student (English): Jiun-Jie Huang
Thesis Title: 人形機器人利用雙軸相機與全景式攝影機之視覺導引抓取任務之研究
Thesis Title (English): Study on Vision-Guided Grasping Tasks by Humanoid Robots Using a Pan-Tilt Camera and a Panoramic Camera
Advisor: 蔡清元
Advisor (English): Tsing-Iuan Tsay
Degree: Master's
Institution: National Cheng Kung University
Department: Department of Mechanical Engineering (Master's and Doctoral Program)
Discipline: Engineering
Field: Mechanical Engineering
Thesis Type: Academic thesis
Year of Publication: 2012
Graduation Academic Year: 100
Language: Chinese
Number of Pages: 99
Keywords (Chinese): 人型機器人、全景式攝影機、視覺導引
Keywords (English): humanoid robot, panoramic camera, vision-guided
Advances in robot technology have extended beyond industrial automation and have proven applicable to fields such as health care, entertainment, security, and home life.
Our laboratory previously constructed a wheeled humanoid robot to assist people who cannot eat independently; it now serves as the research platform for this study of vision-guided grasping tasks. The robot consists mainly of an omni-directional wheeled mobile platform, a fixed torso mounted on the platform, two seven-degree-of-freedom robot arms, two robot hands, and a robotic head fitted with a panoramic camera and a pan-tilt camera.
The objective of this thesis is to propose a vision-guided control strategy that uses the panoramic camera and the pan-tilt camera to guide the robot arm, giving the humanoid robot the ability to grasp objects. The strategy is built on the geometric models of the two cameras and their relative positions. Finally, its performance is evaluated through experiments in which the robot's right arm grasps objects placed at different positions on a table. The experimental results show that the two cameras on the robot's head and the robot's right arm can operate in a coordinated manner to successfully locate and grasp the target object.

Progress in robot technology has extended beyond automation of the manufacturing industry and has proven applicable to such areas as health care, entertainment, security, and home life. A wheeled humanoid robot that was constructed to support people who are unable to eat independently is adopted as the research platform for vision-guided grasping tasks. The robot comprises mainly an omni-directional base, a fixed torso mounted on the base, two seven-degree-of-freedom arms, two robot hands, and a head equipped with a panoramic camera and a pan-tilt camera. The objective of this thesis is to propose a vision-guided control strategy that uses the pan-tilt camera and the panoramic camera to guide the robot arm, equipping the humanoid robot with the ability to grasp objects. The proposed control strategy is based on the geometric models of both cameras and their relative location. Finally, the grasping performance of the humanoid robot is evaluated experimentally by driving the robotic right hand to grasp a workpiece placed at various locations on a table. Experimental results demonstrate that the two cameras on the robot head and the right arm of the humanoid robot can function in a coordinated fashion to locate and grasp the target object successfully.
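The core of the strategy is recovering the 3-D position of the target from the geometric models of the two head cameras and then driving the arm to that point. As a rough illustration only, and not the thesis's actual formulation, the following Python sketch triangulates a target point from two back-projected viewing rays that are assumed to be already expressed in a common robot frame; all function names and numeric values are hypothetical.

import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Estimate a 3-D target position from two viewing rays p_i(t) = o_i + t * d_i.

    o1, o2: ray origins (the two camera centres) in a common robot frame.
    d1, d2: direction vectors of the back-projected pixel rays.
    Returns the midpoint of the shortest segment joining the two rays.
    """
    o1, d1, o2, d2 = (np.array(v, dtype=float) for v in (o1, d1, o2, d2))
    d1 /= np.linalg.norm(d1)
    d2 /= np.linalg.norm(d2)
    w0 = o1 - o2
    b = d1 @ d2                      # cosine of the angle between the rays
    denom = 1.0 - b * b
    if abs(denom) < 1e-9:            # rays (nearly) parallel: depth is unobservable
        raise ValueError("viewing rays are parallel; cannot triangulate")
    d = d1 @ w0
    e = d2 @ w0
    t1 = (b * e - d) / denom         # parameter of the closest point on ray 1
    t2 = (e - b * d) / denom         # parameter of the closest point on ray 2
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

# Illustrative values only: pan-tilt camera at the origin, panoramic camera
# 0.10 m to its right, both seeing a point about 1 m in front of the head.
print(triangulate_midpoint(o1=[0.0, 0.0, 0.0], d1=[0.0, 0.05, 1.0],
                           o2=[0.1, 0.0, 0.0], d2=[-0.1, 0.05, 1.0]))

In practice the ray directions would come from the calibrated pinhole and panoramic camera models described in Chapter 4, and the estimated point would be handed to the arm's inverse kinematics.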
Table of Contents
Chinese Abstract i
English Abstract ii
Acknowledgements iii
Table of Contents iv
List of Figures vii
List of Tables x
Nomenclature xi

Chapter 1 Introduction 1
1.1 Preface 1
1.2 Motivation and Objectives 1
1.3 Literature Review 1
1.4 Thesis Organization 2

Chapter 2 Hardware of the Humanoid Robot 4
2.1 Hardware Architecture of the Humanoid Robot 4
2.1.1 Mechanism of the Seven-Degree-of-Freedom Robot Arm 5
2.1.2 Mechanism of the Robot Hand 10
2.1.3 Omni-Directional Wheeled Platform 13
2.2 Sensing Components of the Robot 15
2.2.1 Laser Rangefinder 15
2.2.2 Gyroscope 15
2.2.3 Vision System 16
2.2.4 Image Capture Card 19
2.2.5 Force Sensor 19
2.3 Hardware Control Architecture of the Robot System 20

Chapter 3 Robot Analysis 22
3.1 Kinematic Analysis of the Omni-Directional Wheeled Base 22
3.2 Kinematic Analysis of the Robot Arm 25
3.2.1 Coordinate Systems 25
3.2.2 Coordinate Systems of the Robot Arm 26
3.2.3 Forward Kinematics of the Robot Arm 31
3.2.4 Inverse Kinematics of the Robot Arm 32
3.2.5 Velocity Kinematics of the Robot Arm 36
3.3 Trajectory Planning in the Robot Arm Workspace 42
3.3.1 Straight-Line Paths in Space 43
3.3.2 Orientation in Space 44
3.3.3 Trajectory Planning 46
3.3.4 Trajectory Planning for Grasping a Block with the Arm 49

Chapter 4 Robot Cameras and Image Processing 52
4.1 Preface 52
4.2 PTZ Camera Calibration 53
4.2.1 Geometric Model of a Conventional Camera 53
4.2.2 Camera Intrinsic Parameter Calibration 56
4.2.3 Parameter Representation 56
4.2.4 Homography Estimation 57
4.2.5 Computation of Intrinsic Parameters 57
4.2.6 Computation of Extrinsic Parameters 59
4.2.7 Intrinsic Calibration of the PTZ Camera 60
4.3 Panoramic Camera Calibration 62
4.3.1 Overview of the Panoramic Camera's Geometric Model 62
4.3.2 Panoramic Camera Calibration 66
4.3.3 Estimation of the Panoramic Camera's Intrinsic and Extrinsic Parameters 70
4.4 Calibration of the Camera Pair 71
4.4.1 Viewing-Angle Correspondence of the Panoramic Camera 71
4.4.2 Depth Computation from the Geometric Relation of the Two Cameras 73
4.4.3 Depth Tests for Target Points in Space 74
4.5 Image Processing 76
4.5.1 Image Preprocessing 76
4.5.2 Computation of Image Area and Centroid 77
4.5.3 Estimation of the Image Principal Axis 78
4.5.4 Edge Detection 78
4.5.5 Corner Feature Extraction 79
4.5.6 Image Processing for the Panoramic Camera 82

Chapter 5 Experiments 86
5.1 Experimental Setup 86
5.2 Relation Between the Robot Hand and the Object 89
5.3 Evaluation of the Robot's Positioning Performance 89
5.4 Robot Object-Grasping Tests 92

Chapter 6 Conclusions 95
6.1 Summary 95
6.2 Future Work 95

References 96

References
[1] G. Asuni, G. Teti, C. Laschi, E. Guglielmelli and P. Dario, “Extension to End-effector Position and Orientation Control of a Learning-based Neurocontroller for a Humanoid Arm,” Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 4151-4156, Beijing, China, Oct. 2006.
[2] S. Baker and S. Nayar, “A Theory of Catadioptric Image Formation,” Proceedings of the IEEE International Conference on Computer Vision, pp. 35-42, New York, January 1998.
[3] C. Charalambous, “Conjugate Gradient Algorithm for Efficient Training of Artificial Neural Networks,” Proceedings of the IEE International Conference on Circuits, Devices and Systems, vol. 139, no. 3, pp. 301-310, June 1992.
[4] J. J. Craig, Introduction to Robotics: Mechanics and Control, Addison-Wesley, 1986.
[5] G. Flandin, F. Chaumette and E. Marchand, “Eye-in-hand/Eye-to-hand Cooperation for Visual Servoing,” Proceedings of the IEEE International Conference on Robotics and Automation, vol. 3, pp. 2741-2746, France, Apr. 2000.
[6] C. Geyer and K. Daniilidis, “Structure and Motion from Uncalibrated Catadioptric Views,” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, vol. 1, pp. 279-286, Philadelphia, USA, 2001.
[7] C. Geyer and K. Daniilidis, “Catadioptric Projective Geometry,” International Journal of Computer Vision, vol. 45, pp. 223-243, 2001.
[8] M. T. Hagan and M. B. Menhaj, “Training Feedforward Networks with the Marquardt Algorithm,” IEEE Transactions on Neural Networks, vol. 5, no. 6, pp. 989-993, Nov. 1994.
[9] S. Kang, “Catadioptric Self-Calibration,” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, vol. 1, pp. 201-207, Redmond, 2000.
[10] W. Khalil and J. F. Kleinfinger, “A New Geometric Notation for Open and Closed Loop Robots,” Proceedings of the IEEE International Conference on Robotics and Automation, pp. 1174-1180, 1986.
[11] M. Koga, K. Kosuge, K. Furuta and K. Nosaki, “Coordinated Motion Control of Robot Arms based on the Virtual Internal Model,” IEEE Transactions on Robotics and Automation, vol. 8, no. 1, pp. 77-85, 1992.
[12] C. H. Lai, Design and Control of an Anthropomorphic Robot, Master's Thesis, Department of Mechanical Engineering, National Cheng Kung University, July 2003.
[13] Q. Meng and M. H. Lee, “Biologically Inspired Automatic Construction of Cross-Modal Mapping in Robotic Eye/Hand Systems,” Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 4742-4749, Wales, UK, Oct. 2006.
[14] A. Muis and K. Ohnishi, “Eye-to-Hand Approach on Eye-in-Hand Configuration Within Real-Time Visual Servoing,” IEEE/ASME Transactions on Mechatronics, vol. 10, no. 4, pp. 404-410, Aug. 2005.
[15] S. Mukherjee, E. Osuna and F. Girosi, “Nonlinear Prediction of Chaotic Time Series Using Support Vector Machines,” Proceedings of the 1997 IEEE Workshop on Neural Networks for Signal Processing, pp. 511-519, Cambridge, MA, Sep. 1997.
[16] E. D. Orin and W. W. Schrader, “Efficient Computation of the Jacobian for Robot Manipulators,” International Journal of Robotics Research, vol. 3, no. 4, pp. 66-75, 1984.
[17] D. Scaramuzza, “A Flexible Technique for Accurate Omnidirectional Camera Calibration and Structure from Motion,” Proceedings of the Fourth IEEE International Conference on Computer Vision Systems, 2006.
[18] D. Scaramuzza, “A Toolbox for Easily Calibrating Omnidirectional Cameras,” Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, Oct. 2006.
[19] W. Sepp, S. Fuchs and G. Hirzinger, “Hierarchical Featureless Tracking for Position-Based 6-DoF Visual Servoing,” Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, Oct. 9-15, 2006.
[20] J. Spletzer, A. K. Das, R. Fierro, C. J. Taylor, V. Kumar and J. P. Ostrowski, “Cooperative Localization and Control for Multi-Robot Manipulation,” Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2001.
[21] J. Su, Y. Xi, U. D. Hanebeck and G. Schmidt, “Nonlinear Visual Mapping Model for 3-D Visual Tracking With Uncalibrated Eye-in-Hand Robotic System,” IEEE Transactions on Systems, Man and Cybernetics, Part B: Cybernetics, vol. 34, no. 1, pp. 652-659, Feb. 2004.
[22] T. Yoshikawa, “Shape Recognition and Grasping by Robotic Hands with Soft Fingers and Omnidirectional Camera,” Proceedings of the 2008 IEEE International Conference on Robotics and Automation, Pasadena, CA, USA, 2008.
[23] T. P. Vogl, J. K. Mangis, A. K. Zigler, W. T. Zink and D. L. Alkon, “Accelerating the Convergence of the Backpropagation Method,” Biological Cybernetics, vol. 59, no. 4-5, pp. 256-264, Sep. 1988.
[24] J. K. Waldron, W. S. Liang and S. J. Bolin, “A Study of the Jacobian Matrix of Serial Manipulators,” Journal of Mechanisms, Transmissions, and Automation in Design, vol. 107, pp. 230-238, June 1985.
[25] Z. Zhang, “A Flexible New Technique for Camera Calibration,” Technical Report MSR-TR-98-71, Microsoft Research, Redmond, WA, USA, 1998.
[26] 劉逸明, Construction and Intelligent Control of a Feeding Humanoid Robot and Its Application to Two-Arm Coordination for Material Handling, Master's Thesis, Department of Mechanical Engineering, National Cheng Kung University, July 2010.
[27] 黃嬿慈, Study on Fusion of Omni-directional and PTZ Cameras for Moving Object Detection and Tracking, Master's Thesis, Department of Electrical Engineering, National Cheng Kung University, Aug. 2011.
[28] 黃啟彰, Localization of a Mobile Robot Using Omnidirectional Video Cameras, Master's Thesis, Department of Electrical Engineering, National Chung Cheng University, Jan. 2007.
[29] 林京燁, Motion Object Detection Using Multiple Parametric Background and Foreground Models at Omni-Directional Camera, Master's Thesis, Department of Computer Science and Information Engineering, National Cheng Kung University, Aug. 2010.
[30] 周家至, Incorporating Omni-Directional Image and the Optical Flow Technique into Movement Estimation, Master's Thesis, Department of Mechanical and Electro-Mechanical Engineering, National Sun Yat-sen University, July 2007.
[31] 許正達, A Study on Vision-Guided Material Handling by a Humanoid Robot, Master's Thesis, Department of Mechanical Engineering, National Cheng Kung University, July 2011.
