

Title (English): Study on Collision Avoidance for Industrial Manipulators Based on Computer Vision and Danger Fields
Advisor (English): Ming-Yang Cheng
Keywords (Chinese): Vision-Based Control, Kinect Camera, Danger Fields, Null-Space Redistribution, Online Trajectory Generation Library, Human-Robot Collaboration
Keywords (English): Vision Control, Kinect Camera, Danger Fields, Joint Redistribution, Online Trajectory Generation, Human-Robot Collaboration
With advances in technology, the relationship between robots and humans has become closer than ever. In pursuit of more diverse and efficient manufacturing methods, many approaches have been proposed to replace the traditional mode in which humans and machines work separately; among them, human-robot collaboration has received the most attention. However, as the interaction distance between the human operator and the robot decreases, the potential danger of operating industrial robots increases accordingly. As a result, how to control a robot so that it does not harm the human operator, without sacrificing work performance, is an issue of current concern. In view of this, the main purpose of this thesis is to develop a human-robot interaction architecture applicable to industrial robots. The robot is operated in a dynamic, complex environment using computer vision and the concept of danger fields. Methods such as joint redistribution and online trajectory generation constrain the motion of the robot arm so that it can perform cooperative tasks with humans safely. Several experiments are conducted to verify the feasibility of the proposed approach. It is expected that the architecture developed in this thesis will enable an industrial robot to move away from a dangerous location in a stable manner when encountering dynamic obstacles, while maintaining both work efficiency and safety. This new type of human-robot collaboration mode will lay the foundation for future industrial development.
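As a rough illustration of the danger-field concept described in the abstract, the sketch below assigns a scalar danger value to a point on a robot link that grows as an obstacle gets closer and as the obstacle moves toward that point. This is a minimal, hypothetical formulation for illustration only: the function `danger_value` and its parameters `k` (distance gain) and `gamma` (approach-velocity weight) are assumptions, not the thesis's actual danger-field definition.

```python
import numpy as np

def danger_value(link_pos, obs_pos, obs_vel, k=1.0, gamma=0.5):
    """Hypothetical danger metric for one link point.

    Grows inversely with the obstacle's distance, and is amplified
    when the obstacle's velocity has a component toward the link.
    """
    r = link_pos - obs_pos                 # vector from obstacle to link point
    d = np.linalg.norm(r)                  # separation distance
    if d < 1e-9:                           # treat contact as unbounded danger
        return float("inf")
    # Closing speed: positive only when the obstacle moves toward the link.
    approach = max(0.0, float(np.dot(obs_vel, r / d)))
    return (k / d) * (1.0 + gamma * approach)

# A static obstacle 1 m away yields the baseline danger k/d.
print(danger_value(np.array([1.0, 0.0, 0.0]),
                   np.zeros(3), np.zeros(3)))  # → 1.0
```

An end-effector avoidance command could then be derived from the gradient of such a field, with link-level avoidance handled in the Jacobian null space as the table of contents outlines.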
Chinese Abstract
Acknowledgments
Table of Contents
List of Tables
List of Figures
Chapter 1: Introduction
1.1 Research Motivation and Objectives
1.2 Literature Review
1.3 Thesis Organization and Contributions
Chapter 2: 3D Image Reconstruction Using an RGB-D Camera
2.1 Introduction to the Kinect Camera
2.2 Depth Image Preprocessing
2.2.1 Kinect Depth Data Stream Conversion
2.2.2 Depth Image Inpainting
2.3 3D Image Reconstruction
2.3.1 Registration of Color and Depth Data
2.3.2 Camera Intrinsic Parameters
2.3.3 Camera Extrinsic Parameters and the World Coordinate Frame
2.3.4 Hand-Eye Calibration
Chapter 3: Kinematic Model of the Robot Manipulator
3.1 Forward Kinematics Based on the DH Table
3.2 The Robot Jacobian Matrix
3.2.1 Differential Rotation
3.2.2 Geometric Derivation of the Robot Jacobian
3.3 Differential Kinematics and Jacobian Null-Space Projection
Chapter 4: Obstacle Avoidance Strategy Based on Danger Fields
4.1 Danger Fields and Danger Level
4.2 Danger Fields and End-Effector Avoidance Commands
4.3 Link Avoidance Strategy Based on Joint-Space Redistribution
4.3.1 Risk Function and Link Constraints
4.3.2 Jacobian Null-Space Saturation Redistribution Algorithm
4.4 Reflexxes Motion Library
Chapter 5: Experimental Setup and Analysis of Results
5.1 Experimental Setup
5.1.1 Experimental Equipment
5.1.2 Experimental Scenario
5.1.3 System Architecture
5.2 Experimental Methods and Analysis of Results
5.2.1 Kinect Depth Image Inpainting Results
5.2.2 Obstacle Detection and Danger-Field-Based Avoidance Command Generation Results
5.2.3 Results of RML Jerk-Constrained Trajectories and Null-Space Saturation Redistribution
Chapter 6: Conclusions and Suggestions
6.1 Conclusions
6.2 Future Work and Suggestions
References