Author: 徐偉庭
Author (English): Hsu, Wei-Ting
Title (Chinese): 圖形辨識與位置控制於五軸機械手臂在智慧型手機自動測試系統之應用
Title (English): Application of Pattern Recognition and Position Control for 5-DOF Robot Arm on Smartphone Automatic Test System
Advisor: 莊季高
Advisor (English): Juang, Jih-Gau
Committee members: 魏榮宗, 王乃堅, 莊季高
Committee members (English): Wai, Rong-Jong; Wang, Nai-Jian; Juang, Jih-Gau
Date of oral defense: 2016-07-21
Degree: Master's
Institution: National Taiwan Ocean University
Department: Department of Communications, Navigation and Control Engineering
Discipline: Engineering
Field: Electrical and Information Engineering
Thesis type: Academic thesis
Year of publication: 2016
Graduation academic year: 104 (ROC calendar)
Language: English
Number of pages: 85
Keywords (Chinese): 模糊控制, 影像處理, YUV色彩空間, 光學字元辨識, 改良型類神經網路
Keywords (English): fuzzy control, image processing, YUV color space, optical character recognition, improved BP neural network
Abstract (Chinese): The applications of intelligent robots have developed to provide people with many kinds of help and convenience; household cleaning robots, industrial robot arms, and robotic arms for minimally invasive surgery all have a place in daily life. This thesis proposes a robot-arm system that can automatically recognize text and pictures on a screen: given a motion command, the robot arm performs the corresponding action to test a smartphone. In the experiments, a camera is used to observe the smartphone screen. For image processing, the RGB image is converted to the YUV color space to reduce the amount of image data; an improved back-propagation neural network (LM-HLP) then performs the coordinate transformation between the camera and the robot arm; the D-H method and fuzzy theory control the joint angles of the arm; and optical character recognition is used to recognize English letters and Arabic numerals, with the recognition results corrected by a dictionary process to raise the recognition rate. The human-machine interface is handled in LabVIEW 2010, and MATLAB code is used for the controller. The D-H method provides motor angles to the fuzzy controller so that the robot arm can move to the desired position. Experimental results show that the character recognition rate on the computer screen reaches 92.9%, rising to 99% after dictionary correction, and the proposed control system successfully drives the robot arm to carry out the various assigned smartphone test functions.
Abstract (English): Intelligent robotic techniques play an important role in people's daily lives; household cleaning robots, industrial robot arms, and robotic arms for minimally invasive surgery have all assisted human beings greatly. Here, we propose a robot system that can recognize numbers and words automatically. The robot arm performs the corresponding movements to test a smartphone according to the requested commands. We utilize a camera to inspect the screen of the smartphone. The image-processing stage transforms the RGB image into the YUV color space to reduce the quantity of image data. An improved back-propagation neural network (LM-HLP) is used to transform coordinates between the webcam and the robot arm, and desired angles are provided to the joint motors by means of the D-H model and fuzzy control. Optical character recognition (OCR) is applied to character recognition, and the recognition results are then checked by a dictionary process to increase the recognition accuracy. The human-machine interface is built in LabVIEW 2010, and MATLAB code is used for the controller. Coordinates obtained from the camera are provided to the fuzzy controller so that the robot arm can be moved to the desired position. Experimental results show that the recognition accuracy is 92.4% for images on the computer screen and rises to 99% after the dictionary process. The proposed control scheme enables the robot arm to perform different assigned smartphone tests successfully.
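The RGB-to-YUV step mentioned in the abstract can be illustrated with a minimal MATLAB sketch, assuming the standard ITU-R BT.601 luma weights (the thesis's exact conversion is described in Section 3.2.2 and may use different coefficients); the function name below is illustrative only. Keeping just the Y (luminance) channel is one way the amount of image data can be reduced before recognition:

    % Reduce a webcam frame to its luminance plane (assumed BT.601 weights).
    function Y = rgb_to_luma(img)
        R = double(img(:,:,1));                  % img: H x W x 3 uint8 RGB frame
        G = double(img(:,:,2));
        B = double(img(:,:,3));
        Y = uint8(0.299*R + 0.587*G + 0.114*B);  % luma (grayscale) channel only
    end

Likewise, the D-H model mentioned in the abstract builds the arm's forward kinematics from one homogeneous transform per joint. The sketch below shows only the standard Denavit-Hartenberg link matrix; the actual D-H parameter table of the 5-DOF arm is given in Chapter 2 and is not reproduced here:

    % Standard D-H link transform for joint angle theta and link parameters d, a, alpha.
    function A = dh_transform(theta, d, a, alpha)
        A = [cos(theta), -sin(theta)*cos(alpha),  sin(theta)*sin(alpha), a*cos(theta);
             sin(theta),  cos(theta)*cos(alpha), -cos(theta)*sin(alpha), a*sin(theta);
             0,           sin(alpha),             cos(alpha),            d;
             0,           0,                      0,                     1];
    end

Chaining one such matrix per joint (for all five joints) yields the end-effector pose that the fuzzy controller drives toward the desired position on the phone screen.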
Table of Contents:
Abstract (Chinese)
Abstract (English)
Acknowledgement (Chinese)
Contents
List of Figures
List of Tables
1 Introduction
  1.1 Preface
  1.2 Research Motivation and Goal
  1.3 Literature Review
  1.4 Organization of This Thesis
2 Robot Construction and Experimental Setup
  2.1 System Description
  2.2 Hardware Apparatus
    2.2.1 The Dynamixel MX-28 Servo Motor
    2.2.2 Arm Hardware
    2.2.3 USB2Dynamixel
    2.2.4 Camera
    2.2.5 Complete Hardware Architecture
  2.3 Robotic Kinematics
    2.3.1 Kinematic Modeling
    2.3.2 Forward Kinematic Equations
    2.3.3 Inverse Kinematics
3 Image Recognition
  3.1 Image Description
  3.2 Image Processing Module
    3.2.1 Image Preprocessing
    3.2.2 Conversion between RGB and YUV Color Space
  3.3 Image Recognition Technique
    3.3.1 Match Pattern
    3.3.2 Optical Character Recognition
    3.3.3 NI Vision Assistant
    3.3.4 Check the Pressed Button
4 Intelligent Control Scheme
  4.1 Control Description
  4.2 Control System
    4.2.1 LM-HLP
    4.2.2 Fuzzy Control Scheme
    4.2.3 Controller Design of Position Control
    4.2.4 Path Scheme
  4.3 Discussion of Position Control
5 Experimental Results
  5.1 Performance
  5.2 Searching Pending Icon
  5.3 Send a Message
  5.4 Make a Telephone Call
  5.5 Experimental Discussion
6 Conclusions and Future Works
  6.1 Conclusions
  6.2 Future Works
References
