National Digital Library of Theses and Dissertations in Taiwan (臺灣博碩士論文加值系統)

Detailed Record

Author: 陳奕憲
Author (English): Yi-Sian Chen
Title: 雙眼視覺在機械手臂末端多腳位物件之辨識定位方法
Title (English): Binocular Vision in Recognition and Positioning of Multi-pin Objects in Robot Manipulators
Advisor: 詹魁元
Advisor (English): Kuei-Yuan Chan
Oral defense committee: 李志中、陳湘鳳
Oral defense committee (English): Jyh-Jone Lee, Shana Smith
Oral defense date: 2020-07-28
Degree: Master's
Institution: 國立臺灣大學 (National Taiwan University)
Department: 機械工程學研究所 (Graduate Institute of Mechanical Engineering)
Discipline: Engineering
Field: Mechanical Engineering
Document type: Academic thesis
Year of publication: 2020
Academic year of graduation: 108
Language: Chinese
Number of pages: 115
Keywords (Chinese): 機械手臂、手眼校正、相機校正、異地校正、TCP校正、雙眼視覺、最佳化
Keywords (English): Robot manipulator, Hand/Eye Calibration, Camera Calibration, Offsite Calibration, TCP Calibration, Binocular Vision, Optimization
DOI: 10.6342/NTU202002914
Metrics:
  • Cited by: 0
  • Views: 186
  • Rating: none
  • Downloads: 0
  • Bookmarked: 0
With their multi-degree-of-freedom mechanisms, vertically articulated robot manipulators can realize motion paths for many different functions and play a key role in automated assembly processes. In typical assembly tasks, however, the shape and posture of the inserted component are tightly constrained, and to this day some assembly tasks that manipulators cannot handle, such as inserting electronic components with elastoplastic pins, still rely on human labor. This thesis therefore develops a method that uses binocular vision to position and identify the component held at the manipulator's end. Because component assembly errors, machining errors, and other aspects of the experimental setup can degrade the accuracy of the transformations among the system's coordinate frames (robot base, end flange, component, calibration board, and cameras), we first perform camera calibration and hand-eye calibration by detecting the image-scale features of the calibration board and relating them to their actual physical dimensions. After this offsite calibration procedure, vision sensing not only helps the manipulator locate the tip of an unknown component but can also identify the component type, so that insertion strategies or path plans can later be devised for each kind of component. Finally, this study builds binocular-vision recognition and positioning systems in both a virtual environment and a real system for demonstration and discussion. Using the fully known information in the virtual environment to verify the feasibility and accuracy of the algorithm, we obtain a tip-positioning accuracy of 0.21 mm. When applied to the real system, despite uncertainties such as environmental noise and assembly errors, the method remains sensitive to movements of the component tip; after calibration, over 15 tests with different component postures, it still achieves an average absolute positioning accuracy of 0.37 mm.
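The hand-eye calibration step described in the abstract is an instance of the classic AX = XB problem, where A is a motion of the robot flange, B the corresponding motion seen by the camera, and X the unknown flange-to-camera transform. The sketch below is illustrative only, not the thesis's implementation: all motions are synthetic, and the helper names (`rodrigues`, `solve_hand_eye`) are hypothetical. It recovers the rotation from matched rotation axes via orthogonal Procrustes and the translation by linear least squares.

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix from an axis-angle pair (Rodrigues' formula)."""
    k = np.asarray(axis, float)
    k = k / np.linalg.norm(k)
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def rot_axis(R):
    """Rotation axis of R, valid for rotation angles strictly inside (0, pi)."""
    v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return v / np.linalg.norm(v)

def solve_hand_eye(As, Bs):
    """Solve AX = XB for X = (R_X, t_X) from motion pairs (R_A, t_A), (R_B, t_B)."""
    # Rotation: the motion axes satisfy a_i = R_X b_i, an orthogonal
    # Procrustes problem solved in closed form via SVD (Kabsch).
    M = sum(np.outer(rot_axis(RA), rot_axis(RB))
            for (RA, _), (RB, _) in zip(As, Bs))
    U, _, Vt = np.linalg.svd(M)
    R_X = U @ np.diag([1, 1, np.linalg.det(U @ Vt)]) @ Vt
    # Translation: (R_A - I) t_X = R_X t_B - t_A, stacked least squares.
    C = np.vstack([RA - np.eye(3) for RA, _ in As])
    d = np.concatenate([R_X @ tB - tA for (_, tA), (_, tB) in zip(As, Bs)])
    t_X, *_ = np.linalg.lstsq(C, d, rcond=None)
    return R_X, t_X

# Hypothetical ground-truth hand-eye transform (camera pose on the flange).
R_true = rodrigues([0.3, -0.5, 0.8], 0.9)
t_true = np.array([12.0, -3.5, 40.0])  # mm

# Three synthetic robot motions A_i and the camera motions B_i they induce,
# generated from B = X^{-1} A X so the pairs are exactly consistent.
As, Bs = [], []
for axis, ang, t in [([1, 0, 0], 0.7, [5.0, 2.0, 1.0]),
                     ([0, 1, 0], 1.1, [-3.0, 4.0, 2.0]),
                     ([1, 1, 1], 0.5, [1.0, -2.0, 6.0])]:
    RA, tA = rodrigues(axis, ang), np.array(t)
    RB = R_true.T @ RA @ R_true
    tB = R_true.T @ (RA @ t_true + tA - t_true)
    As.append((RA, tA))
    Bs.append((RB, tB))

R_est, t_est = solve_hand_eye(As, Bs)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```

With noise-free synthetic motions the transform is recovered essentially exactly; real calibration data would instead feed many noisy motion pairs into the same least-squares structure.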
With multi-degree-of-freedom mechanisms, robotic manipulators can achieve a variety of motion path plans in automated assembly processes. However, in general assembly tasks, flexibility is severely limited by the appearance and posture of the end-effector. Robotic manipulators are therefore not yet capable of assembly tasks such as peg-in-hole insertion with elastoplastic electronic components. In this research, we propose a method that uses binocular vision to position and identify the end-effector. To verify the feasibility and accuracy of the proposed method, we performed experiments in both virtual and real environments. In the virtual system, an ideal environment in which all information is known, the positioning accuracy of the end-effector's tip is 0.21 mm based on the simulation results. Even though the real environment contains uncertainties such as noise and assembly errors, an absolute accuracy level of 0.37 mm can still be reached.
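Once both cameras are calibrated, the binocular positioning of a pin tip reduces to triangulating matched image points from the two views. The following is a minimal sketch of linear (DLT) triangulation with hypothetical camera parameters; the intrinsics, baseline, and point are invented for illustration and are not the values used in the thesis.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its pixel projections in two calibrated views
    (linear DLT: stack the cross-product constraints and take the SVD null vector)."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

def project(P, X):
    """Project a 3D point into pixel coordinates through a 3x4 camera matrix."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two hypothetical pinhole cameras: identical intrinsics, 100 mm baseline.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-100.0], [0.0], [0.0]])])

X_true = np.array([25.0, -10.0, 600.0])  # a pin-tip position, in mm
x1, x2 = project(P1, X_true), project(P2, X_true)
X_est = triangulate(P1, P2, x1, x2)
print(np.round(X_est, 3))  # recovers [25, -10, 600] up to numerical error
```

In a real system the matched points come from feature detection in both images, and the projection matrices from the camera and hand-eye calibration steps, so triangulation accuracy inherits whatever error those calibrations leave behind.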
Thesis approval certificate .......... i
Acknowledgements .......... ii
Abstract (Chinese) .......... iii
Abstract .......... v
Table of Contents .......... vi
List of Figures .......... x
List of Tables .......... xiii
Chapter 1 Introduction .......... 1
1.1 Preface .......... 1
1.2 Applications of robot manipulators in factories .......... 2
1.3 Motivation and objectives .......... 5
1.4 Thesis structure .......... 7
Chapter 2 Literature Review .......... 9
2.1 Deformable one-dimensional objects .......... 9
2.1.1 Related work on deformable one-dimensional objects .......... 10
2.1.2 Summary .......... 12
2.2 Calibration methods for machining tools on robot manipulators .......... 12
2.2.1 Manual TCP calibration .......... 13
2.2.2 Automated TCP calibration .......... 14
2.2.3 Summary .......... 15
2.3 Calibration methods for industrial cameras on robot manipulators .......... 16
2.3.1 Camera calibration .......... 17
2.3.2 Hand-eye calibration .......... 18
2.3.3 Summary .......... 21
2.4 Conclusion .......... 22
Chapter 3 Model Construction .......... 23
3.1 Industrial cameras and vision models .......... 23
3.1.1 Basic camera properties .......... 23
3.1.2 Camera model and coordinate systems .......... 35
3.1.3 Camera applications .......... 42
3.2 Robot manipulator coordinate systems .......... 49
3.2.1 Hand-eye calibration .......... 50
3.2.2 Tool center point calibration .......... 52
3.3 Convolutional neural networks .......... 54
3.3.1 Components of convolutional neural networks .......... 54
3.3.2 The YOLO (You Only Look Once) family .......... 57
Chapter 4 Methodology .......... 61
4.1 Environment setup .......... 63
4.2 Offsite Calibration .......... 64
4.2.1 Align Center .......... 65
4.2.2 Focusing .......... 67
4.2.3 Auto Pose (computing photographing positions) .......... 68
4.2.4 Camera Calibration .......... 71
4.2.5 Hand/Eye Calibration .......... 74
4.3 Online Operation .......... 77
4.3.1 Feature Detection .......... 78
4.3.2 Binocular Vision .......... 81
4.3.3 TCP Calibration .......... 81
4.4 Accuracy verification .......... 82
Chapter 5 Engineering Case Study .......... 83
5.1 Hardware specifications and control environment .......... 83
5.2 Automated calibration procedure .......... 88
5.2.1 Offsite Calibration .......... 88
5.2.2 Online Operation .......... 94
5.3 Error sources .......... 98
5.3.1 Robot manipulator errors .......... 99
5.3.2 Tool errors .......... 99
5.3.3 Visual-image errors .......... 100
5.3.4 Numerical-computation errors .......... 100
Chapter 6 Conclusions and Future Work .......... 102
6.1 Conclusions .......... 102
6.2 Recommendations and future research directions .......... 103
References .......... 105