臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)

Detailed Record

Author: 許翔詠
Author (English): Hsu, Hsiang-Yung
Title: 結合機械手臂與電腦視覺的智慧型自走車之物體分類控制設計
Title (English): Control Design of Object Classification for an Intelligent Vehicle Combining Robot Arm with Computer Vision
Advisor: 林容杉
Advisor (English): Lin, Jung-Shan
Committee: 洪志偉、黃秋杰、林容杉
Committee (English): Hung, Jeih-weih; Huang, Chiou-Jye; Lin, Jung-Shan
Oral defense date: 2015-07-29
Degree: Master's
Institution: 國立暨南國際大學 (National Chi Nan University)
Department: 電機工程學系 (Department of Electrical Engineering)
Discipline: Engineering
Field: Electrical and Computer Engineering
Document type: Academic thesis
Year of publication: 2015
Academic year of graduation: 103 (ROC calendar)
Language: English
Pages: 51
Keywords (Chinese): 物體分類、智慧型自走車、機械手臂、逆向運動學、YCbCr、像素最大面積演算法、樣板比對法
Keywords (English): Object classification; Intelligent vehicle; Robotic arm; Inverse kinematics; YCbCr; Pixel maximum area algorithm; Template matching algorithm
Usage statistics:
  • Cited by: 3
  • Views: 591
  • Rating: (none)
  • Downloads: 163
  • Bookmarked: 0
In typical object classification tasks, the work is often done entirely by hand, or the gripping is performed by a robotic arm fixed in one position. Such systems can only be used within a limited range and may consume manpower unnecessarily. If the robotic arm is given the ability to recognize objects and to move by itself, it can play a much greater role in the task and thereby save human labor.
This thesis proposes a fully automated intelligent vehicle system for object classification, enabling the robotic arm to overcome the obstacle of distance when performing pick-and-place actions. The work consists of three main parts: the mobile vehicle, the robotic arm and image processing, which are combined to accomplish the gripping and classification task. The task proceeds as follows. The camera first photographs the platform placed in front of the vehicle; image analysis yields the coordinates of the platform's center, and the vehicle is commanded to drive toward it. While the vehicle moves, the camera keeps taking pictures so that image analysis can continuously correct the heading; at the same time, an infrared sensor mounted on the vehicle monitors the distance, and when the reading falls below a preset threshold the vehicle is judged to have reached the platform. The camera is then rotated to face the platform, the target's position is determined from the image-processing result, and the object is gripped. Finally, the vehicle turns around and returns, in the same manner, to the classification platform originally behind it, determines where the object should be placed, and completes the classification by placing the object there.
For the control system design, the robotic arm uses inverse kinematics as the basis of its motion, with the coordinates extracted by image analysis supplied to the arm's computation. The image processing itself combines YCbCr color discrimination, a pixel maximum area algorithm and template matching to determine the positions of, and relative distances between, the object and the platform. The main contribution of this thesis lies in integrating these originally independent systems: even when the vehicle is not directly in front of the platform, or when the target object is not unique, the system can still recognize the scene and classify objects correctly as long as the platform is captured in the image.
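To make the image-processing chain above concrete, the following Python/OpenCV sketch strings the three steps together. It is a minimal illustration rather than the thesis implementation: the chroma thresholds, Canny parameters and the template image are placeholder assumptions.

```python
import cv2
import numpy as np

def locate_target(frame_bgr, template_gray):
    """Illustrative pipeline: YCbCr color discrimination, selection of
    the maximum-area region, then template matching on an edge image."""
    # 1) Color discrimination in YCbCr (OpenCV calls the space YCrCb).
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 140, 77), (255, 180, 127))  # placeholder bounds

    # 2) "Pixel maximum area": keep only the largest connected region,
    #    discarding small patches of similar color elsewhere in the scene.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if n < 2:
        return None  # only background was found
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    region = np.where(labels == largest, 255, 0).astype(np.uint8)

    # 3) Template matching on the edge-detected image to confirm the
    #    object and obtain a pixel coordinate for gripping.
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)
    scores = cv2.matchTemplate(edges, template_gray, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, (x, y) = cv2.minMaxLoc(scores)
    h, w = template_gray.shape
    return (x + w // 2, y + h // 2), best_score, region
```

Matching on the edge image rather than raw pixels mirrors the thesis's figures (an edge-detected image compared against the template), which makes the score less sensitive to lighting.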

In everyday classification tasks, we often rely on human labor or on fixed robotic arms; such systems either operate only within a limited range or consume human resources unnecessarily. If a robotic arm is given the abilities to move and to see, it can play a greater role in the task, achieving object classification while saving manpower.
This thesis presents an intelligent vehicle for object classification that lets the robotic arm overcome the obstacle of distance to execute pick-and-place actions. The work is divided into three main parts: the mobile vehicle, the robotic arm and image processing. Combining these three parts to accomplish the pick-and-place task for object classification is the major control objective. To this end, the computer must analyze the images from the camera and transmit control instructions to the vehicle for moving, and to the robotic arm for gripping the correct object and placing it at the desired destination. A rough sketch of this approach behaviour is given below.
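The camera, sensor and drive interfaces are passed in as functions because the thesis's actual hardware APIs are not shown here; every name and threshold is a hypothetical stand-in.

```python
def approach_platform(capture_frame, platform_center_x,
                      read_ir_distance, drive,
                      frame_center_x=160, tol_px=15, arrival_cm=10):
    """Drive toward the platform, correcting heading from each frame,
    and stop once the infrared reading drops below the threshold.

    capture_frame()      -> current camera image
    platform_center_x(f) -> x pixel of the platform center, or None
    read_ir_distance()   -> infrared distance reading (here: cm)
    drive(cmd)           -> issue 'forward', 'left', 'right' or 'stop'
    """
    while read_ir_distance() > arrival_cm:
        cx = platform_center_x(capture_frame())
        if cx is None:
            drive('stop')                    # platform lost; wait for next frame
        elif cx < frame_center_x - tol_px:
            drive('left')                    # platform appears left of center
        elif cx > frame_center_x + tol_px:
            drive('right')                   # platform appears right of center
        else:
            drive('forward')                 # heading is good; keep going
    drive('stop')                            # threshold crossed: arrived
```

The dead-band of a few pixels around the image center keeps the vehicle from oscillating between left and right corrections on every frame.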
In the control system design, inverse kinematics is used to calculate the movement of the robotic arm, and the computer analyzes and computes the coordinates of the objects to manipulate the arm. In the image-processing part, the YCbCr color space, a pixel maximum area algorithm and a template matching algorithm are employed to determine the positions of, and the relative distance between, the object and the platform. The major contribution of this thesis is the integration of these separate systems for object classification, even when the vehicle is not directly in front of the platform or the target object is not unique. As long as the system can capture images of the platform or objects, and can recognize and grip the object correctly, object classification is achieved successfully.
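As a worked example of the inverse-kinematics step: once the arm model is reduced to a planar two-link problem (Section 4.2.1 in the table of contents), the joint angles for a target point (x, y) follow from the law of cosines. The link lengths below are placeholders, not the arm's real dimensions.

```python
import math

def two_link_ik(x, y, l1=10.0, l2=10.0):
    """Planar two-link inverse kinematics (one of the two elbow
    solutions). Returns (shoulder, elbow) angles in radians, or
    None when the target lies outside the reachable annulus."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        return None                            # target unreachable
    theta2 = math.acos(c2)                     # elbow angle
    theta1 = math.atan2(y, x) - math.atan2(    # shoulder angle
        l2 * math.sin(theta2), l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

For instance, two_link_ik(14.0, 5.0) returns the joint pair that places the gripper 14 units forward and 5 units up in the arm's plane under the assumed 10-unit links; the thesis extends this idea by first collapsing the three-dimensional gripping pose into such a plane.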
Table of Contents
Chapter 1 Introduction 1
Chapter 2 System Construction 5
2.1 Vehicle . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2.2 Robotic Arm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
2.3 Computer Vision . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.4 Control Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Chapter 3 Image Processing 13
3.1 Color Space Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
3.2 Pixel Maximum Area Algorithm . . . . . . . . . . . . . . . . . . . . . 14
3.3 Template Matching . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
3.4 Recognition of Object Position . . . . . . . . . . . . . . . . . . . . . . 21
Chapter 4 Control System Design 23
4.1 Vehicle System Control . . . . . . . . . . . . . . . . . . . . . . . . . . 23
4.2 Robotic Arm Design . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
4.2.1 Model Simplification . . . . . . . . . . . . . . . . . . . . . 25
4.2.2 Inverse Kinematics . . . . . . . . . . . . . . . . . . . . . . . . 30
Chapter 5 Practical Experiments 35
5.1 Complete System and Workspace . . . . . . . . . . . . . . . . . . . . 35
5.2 Experiment Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
5.2.1 Plastic Bottle Target . . . . . . . . . . . . . . . . . . . . . . . 39
5.2.2 Foil Packet Target . . . . . . . . . . . . . . . . . . . . . . . . 39
5.2.3 Classification Task . . . . . . . . . . . . . . . . . . . . 40
Chapter 6 Conclusions 44
Bibliography 47

List of Figures
2.1 Vehicle model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.2 Voltage-distance conversion . . . . . . . . . . . . . . . . . . . . . . . 7
2.3 Robotic arm model . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.4 CM-510 controller . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.5 USB2Dynamixel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.6 Signal transformation from computer to robotic arm . . . . . . . . . . 9
2.7 Camera set on a motor for rotation . . . . . . . . . . . . . . . . . . . 10
2.8 Model of the intelligent vehicle system . . . . . . . . . . . . . . . . . 11
2.9 Environment of the classification system . . . . . . . . . . . . . . 12
2.10 Signal flow diagram of the system . . . . . . . . . . . . . . . . . . . . 12
3.1 Color space of RGB . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
3.2 Color space of YCbCr . . . . . . . . . . . . . . . . . . . . . . . . . . 15
3.3 Color of Y fixed at 50% . . . . . . . . . . . . . . . . . . . . . 15
3.4 Original image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.5 Recognition by YCbCr . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.6 Desired area and the filter . . . . . . . . . . . . . . . . . . . . 18
3.7 Process of pixel maximum area algorithm . . . . . . . . . . . . . . . . 18
3.8 Result of pixel maximum area algorithm . . . . . . . . . . . . . . . . 19
3.9 Original image of object on the platform . . . . . . . . . . . . . . . . 19
3.10 Image after edge detection and the template . . . . . . . . . . . . . . 19
3.11 Comparison of two images . . . . . . . . . . . . . . . . . . . . . . . . 20
3.12 Object with its center . . . . . . . . . . . . . . . . . . . . . . . . . . 20
3.13 Distance between robotic arm and image bottom . . . . . . . . . . . 22
3.14 View from the camera . . . . . . . . . . . . . . . . . . . . . . . . . . 22
4.1 Signal flow diagram of vehicle movement . . . . . . . . . . . . . . . . 24
4.2 Judgment of vehicle movement . . . . . . . . . . . . . . . . . . . . . . 25
4.3 Platform to the left of the vehicle . . . . . . . . . . . . . . . . 26
4.4 Turning the vehicle to face the platform . . . . . . . . . . . . . . 27
4.5 Model of robotic arm . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
4.6 Simplification of model . . . . . . . . . . . . . . . . . . . . . . 29
4.7 Three dimensional situation . . . . . . . . . . . . . . . . . . . . . . . 29
4.8 Simplification from three dimensions to two dimensions . . . . . . . 30
4.9 The fixed action of robotic arm . . . . . . . . . . . . . . . . . . 32
4.10 Inverse kinematics . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
4.11 Real situation of gripping . . . . . . . . . . . . . . . . . . . . . . . . 33
4.12 Works of each subsystem . . . . . . . . . . . . . . . . . . . . . . . . . 34
5.1 Flowchart of the system . . . . . . . . . . . . . . . . . . . . . . . . . 36
5.2 Platform . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
5.3 Boxes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
5.4 Objects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
5.5 Integration of complete system . . . . . . . . . . . . . . . . . . . . . . 38
5.6 Environment of experiment . . . . . . . . . . . . . . . . . . . . . . . 38
5.7 Experiment with plastic bottle target . . . . . . . . . . . . . . . . . . 41
5.8 Experiment with foil packet target . . . . . . . . . . . . . . . . . . . 42
5.9 Experiment of the classi cation task . . . . . . . . . . . . . . . . . . 43

List of Tables
2.1 Specification of AX-12+ motor . . . . . . . . . . . . . . . . . . . 9
2.2 Comparison of camera position . . . . . . . . . . . . . . . . . . . . . 11
4.1 Specification of robotic arm . . . . . . . . . . . . . . . . . . . . 26
4.2 Specification of robotic arm . . . . . . . . . . . . . . . . . . . . 28
Bibliography
[1] A. Hu and P. Harbor, "A survey of experiments for modeling verification and control of flexible robotic manipulators," Proceedings of the First IEEE Regional Conference on Aerospace Control Systems, pp. 344-353, 1993.
[2] R. Buchner, A. Weiss and D. Wurhofer, "User experience of industrial robots over time," Proceedings of the 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 115-116, 2012.
[3] J. Saez-Pons, H. Lehmann, D. S. Syrdal and K. Dautenhahn, "Development of the sociability of non-anthropomorphic robot home companions," Proceedings of the Joint IEEE International Conferences on Development and Learning and Epigenetic Robotics (ICDL-Epirob), pp. 111-116, 2014.
[4] W. Motooka, T. Nozaki, T. Mizoguchi and K. Sugawara, "Development of 16-DOF telesurgical forceps master/slave robot with haptics," Proceedings of the Annual Conference of the IEEE Industrial Electronics Society, pp. 2018-2086, 2010.
[5] C.-C. Chang, J.-H. Wang, C.-C. Lin and M.-D. Jeng, "The study of remotely teleoperated robotic manipulator system for underwater construction," Proceedings of the IEEE International Symposium on Underwater Technology, pp. 269-276, 2004.
[6] T. W. Yang, Z. Q. Sun, S. K. Tso and W. L. Xu, "Trajectory control of a flexible space manipulator utilizing a macro-micro architecture," Proceedings of the IEEE International Conference on Robotics and Automation, pp. 2522-2528, 2003.
[7] Y. Liu and G. Liu, "Interaction analysis and online tip-over avoidance for a reconfigurable tracked mobile modular manipulator negotiating slopes," IEEE/ASME Transactions on Mechatronics, pp. 623-635, 2009.
[8] Y. Chen, L. Lu, M. Zhang and H. Rong, "Study on coordinated control and hardware system of a mobile manipulator," Proceedings of the Sixth World Congress on Intelligent Control and Automation, pp. 9037-9041, 2006.
[9] J. W. F. Cheung and Y. S. Hung, "Modelling and control of a 2-DOF planar parallel manipulator for semiconductor packaging systems," Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics, pp. 717-722, 2005.
[10] S.-S. Lin, "Development of an autonomous recycling robot system for large working area applications," M.S. Thesis, Department of Mechanical Engineering, National Taiwan University of Science and Technology, Taipei, Taiwan, 2012.
[11] J.-C. Lu, J.-C. Pan and J.-S. Lin, "Fuzzy logic design of a robotic manipulator combined with computer vision for shape sorting motion control," Proceedings of the 2011 International Conference on Service and Interactive Robotics, pp. 507-512, 2011.
[12] H.-Y. Hsu, H.-Y. Hsu and J.-S. Lin, "Control design and implementation of intelligent vehicle with robot arm and computer vision," Proceedings of the IEEE International Conference on Advanced Robotics and Intelligent Systems (ARIS), 2015.
[13] H. Park, S. Baek and S. Lee, "IR sensor array for a mobile robot," Proceedings of the IEEE International Conference on Advanced Intelligent Mechatronics, pp. 928-933, 2005.
[14] H. Choset and J. W. Burdick, "Sensor based planning: the hierarchical generalized Voronoi graph," Proceedings of the Workshop on Algorithmic Foundations of Robotics, 1996.
[15] R. Mobus and U. Kolbe, "Multi-target multi-object tracking, sensor fusion of radar and infrared," Proceedings of the IEEE Intelligent Vehicles Symposium, pp. 732-737, 2004.
[16] C. N. Thai and M. Paulishen, "Using Robotis Bioloid systems for instructional robotics," Proceedings of the IEEE SoutheastCon, pp. 300-306, 2011.
[17] J. K. B. Garcia, A. J. B. Lazaro, J. O. Y. Lim and C. M. Oppus, "Platoon system implementation using the Robotis Bioloid platform," Proceedings of the International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment and Management (HNICEM), pp. 1-6, 2014.
[18] H. Noda, N. Takao and M. Niimi, "Colorization in YCbCr space and its application to improve quality of JPEG color images," Proceedings of the IEEE International Conference on Image Processing, pp. 385-388, 2007.
[19] H. Noda, J. Korekuni and M. Niimi, "Simple and efficient colorization in YCbCr color space," Proceedings of the 18th International Conference on Pattern Recognition, pp. 685-688, 2006.
[20] J. Huang and Y. Wang, "Compression of color facial images using feature correction two-stage vector quantization," Proceedings of the 1997 IEEE International Symposium on Circuits and Systems, pp. 1249-1252, 1997.
[21] G.-T. Wu, H.-C. Chen and J.-S. Lin, "Following control design of moving targets for an intelligent vehicle combined with computer vision," Proceedings of the 20th International Symposium on Artificial Life and Robotics (AROB), pp. 1007-1012, 2015.
[22] A. Goshtasby, "Template matching in rotated images," IEEE Transactions on Pattern Analysis and Machine Intelligence, pp. 338-344, 1985.
[23] L. Ma, Y. Sun, N. Feng and Z. Liu, "Image fast template matching algorithm based on projection and sequential similarity detecting," Proceedings of the Fifth International Conference on Intelligent Information Hiding and Multimedia Signal Processing, pp. 957-960, 2009.
[24] H. Peng, F. Long and Z. Chi, "Document image recognition based on template matching of component block projections," IEEE Transactions on Pattern Analysis and Machine Intelligence, pp. 1188-1192, 2003.
[25] M. Choi, N. P. Galatsanos and D. Schonfeld, "On the relation of image restoration and template matching: application to block-matching motion estimation," Proceedings of the 1996 IEEE International Conference on Acoustics, Speech, and Signal Processing, pp. 2112-2115, 1996.
[26] G.-S. Huang, C.-K. Tung, H.-C. Lin and S.-H. Hsiao, "Inverse kinematics analysis trajectory planning for a robot arm," Proceedings of the 8th IEEE Asian Control Conference (ASCC), pp. 965-970, 2011.
[27] S. Kucuk and Z. Bingul, "The inverse kinematics solutions of industrial robot manipulators," Proceedings of the IEEE International Conference on Mechatronics, pp. 274-279, 2004.
[28] C. Yu, M. Jin and H. Liu, "An analytical solution for inverse kinematic of 7-DOF redundant manipulators with offset-wrist," Proceedings of the IEEE International Conference on Mechatronics and Automation (ICMA), pp. 92-97, 2012.
[29] W. Shen, J. Gu and E. E. Milios, "Self-configuration fuzzy system for inverse kinematics of robot manipulators," Proceedings of the Annual Meeting of the North American Fuzzy Information Processing Society, pp. 41-45, 2006.
[30] J. Peng, W. Xu, D. Meng and Z. Wang, "Analytical inverse kinematics and trajectory planning for a 6DOF grinding robot," Proceedings of the IEEE International Conference on Information and Automation (ICIA), pp. 834-839, 2013.
Related journal articles: none