Researcher: 陳順益
Researcher (English): Shun-Yi Chen
Thesis title: 自走式機器人之影像定位
Thesis title (English): Image self-localization of a mobile robot
Advisor: 吳佳儒
Advisor (English): Chia-Ju Wu
Degree: Master
Institution: National Yunlin University of Science and Technology
Department: Master's Program, Department of Electrical Engineering
Discipline: Engineering
Field: Electrical and computer engineering
Document type: Academic thesis
Year of publication: 2005
Graduation academic year: 93 (2004-2005)
Language: Chinese
Number of pages: 88
Keywords (Chinese): 影像定位
Keywords (English): image localization
Usage statistics:
  • Cited by: 1
  • Views: 151
  • Downloads: 0
The purpose of this thesis is to design a mobile robot with an image-based self-localization capability. The main hardware components include a control board, stepper motors, motor driver circuits, a CCD, and an electronic compass. An H-bridge driver circuit is designed to control the forward and reverse rotation of the motors, and the motor speed is controlled with pulse-width modulation (PWM). For localization, four landmarks are placed in advance at specified positions in the working area; any three landmarks visible in the image are sufficient to determine the robot's position, while its orientation is obtained from the electronic compass. The user interface is built with Borland C++ Builder: after the user issues a command on the host computer, a string is transmitted through an RS-232 wireless module to the microcontroller on the robot, and the heading data returned by the electronic compass is received over the same link. Experiments on control and self-localization show that the proposed hardware architecture and image technique can be implemented cost-effectively.
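The command exchange described above can be illustrated with a short host-side sketch. This is a minimal example assuming a Win32 serial connection at 9600 baud and a hypothetical "HEADING?" command string; the actual Borland C++ Builder interface and the command protocol used in the thesis are not reproduced here.

// Hedged sketch: send a command string to the robot's microcontroller over an
// RS-232 (wireless) serial link and read back the compass heading, using the
// Win32 serial API. Port name, baud rate, and the "HEADING?" command are
// illustrative assumptions, not the thesis's actual protocol.
#include <windows.h>
#include <cstdio>

int main()
{
    // Open the serial port attached to the RS-232 wireless module (assumed COM1).
    HANDLE h = CreateFileA("COM1", GENERIC_READ | GENERIC_WRITE,
                           0, NULL, OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        std::printf("cannot open COM1\n");
        return 1;
    }

    // Configure 9600 baud, 8 data bits, no parity, 1 stop bit (assumed settings).
    DCB dcb = { 0 };
    dcb.DCBlength = sizeof(dcb);
    GetCommState(h, &dcb);
    dcb.BaudRate = CBR_9600;
    dcb.ByteSize = 8;
    dcb.Parity   = NOPARITY;
    dcb.StopBits = ONESTOPBIT;
    SetCommState(h, &dcb);

    // Give reads a timeout so the program does not block forever.
    COMMTIMEOUTS to = { 0 };
    to.ReadTotalTimeoutConstant = 500;   // milliseconds
    SetCommTimeouts(h, &to);

    // Send a (hypothetical) command string asking for the compass heading.
    const char cmd[] = "HEADING?\r\n";
    DWORD written = 0;
    WriteFile(h, cmd, sizeof(cmd) - 1, &written, NULL);

    // Read the reply returned by the microcontroller, e.g. "123.5\r\n".
    char reply[32] = { 0 };
    DWORD bytesRead = 0;
    ReadFile(h, reply, sizeof(reply) - 1, &bytesRead, NULL);
    std::printf("compass heading: %s\n", reply);

    CloseHandle(h);
    return 0;
}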
The aim of this thesis is to study an image self-localization technique for a mobile robot system. The major hardware components of the robot system include a control board, stepper motors and their drivers, a CCD, and an electronic compass. The motor drivers are of the H-bridge type and are used to control the rotation direction of the motors, while the motor speeds are controlled with a PWM technique. For self-localization, four landmarks are placed at specified positions in the robot's working space. The position of the robot is then determined from the image information of any three landmarks, and its orientation is obtained from the reading of the electronic compass. The interface program between the user and the robot system is written in Borland C++ Builder, and an RS-232 wireless module is used to transmit control commands and sensory information between the host computer and the robot. With this hardware architecture and the proposed image technique, experiments show that the control and self-localization of the robot system can be implemented cost-effectively.
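For the localization step, one way to combine the landmark bearings seen by the camera with the compass heading is a least-squares intersection of bearing lines: each landmark whose world position is known gives one linear constraint on the robot position. The sketch below assumes the global bearing to each landmark is the compass heading plus the bearing measured in the image; the formula actually derived in Section 3.2.2 of the thesis may differ, so treat this as an illustration of the idea rather than the thesis's method.

// Hedged sketch: estimate the robot position from the bearings of three
// landmarks with known world coordinates, given an absolute heading from the
// electronic compass. All names and sample values are illustrative.
#include <cmath>
#include <cstdio>

struct Point { double x, y; };

// bearingInImage[i]: angle of landmark i relative to the camera axis (rad),
// obtained from its pixel column and the camera's horizontal field of view.
// heading: absolute robot heading from the compass (rad).
Point estimatePosition(const Point landmark[3],
                       const double bearingInImage[3],
                       double heading)
{
    // Each landmark gives one line constraint:
    //   sin(a_i) * x - cos(a_i) * y = sin(a_i) * x_i - cos(a_i) * y_i
    // where a_i = heading + bearingInImage[i] is the global bearing.
    // Solve the three constraints in least squares via 2x2 normal equations.
    double A11 = 0, A12 = 0, A22 = 0, b1 = 0, b2 = 0;
    for (int i = 0; i < 3; ++i) {
        double a = heading + bearingInImage[i];
        double s = std::sin(a), c = std::cos(a);
        double rhs = s * landmark[i].x - c * landmark[i].y;
        A11 += s * s;
        A12 += -s * c;
        A22 += c * c;
        b1  += s * rhs;
        b2  += -c * rhs;
    }
    // Cramer's rule on the symmetric 2x2 normal-equation system.
    double det = A11 * A22 - A12 * A12;
    Point p = { (b1 * A22 - A12 * b2) / det,
                (A11 * b2 - A12 * b1) / det };
    return p;
}

int main()
{
    // Three landmarks and bearings consistent with a robot at (1.0, 0.5)
    // facing along the x-axis (heading 0).
    Point landmarks[3] = { {3.0, 0.5}, {3.0, 2.5}, {3.0, -1.5} };
    double bearings[3] = { 0.0, 0.7854, -0.7854 };  // radians, from the image
    double heading = 0.0;                            // compass reading, radians
    Point p = estimatePosition(landmarks, bearings, heading);
    std::printf("estimated position: (%.2f, %.2f)\n", p.x, p.y);
    return 0;
}

With the sample values above the program prints an estimate of approximately (1.00, 0.50); with noisy bearings the least-squares solution spreads the measurement error over the three constraints instead of relying on any single landmark.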
Chinese abstract
English abstract
Acknowledgments
Table of contents
List of tables
List of figures
Chapter 1 Introduction
1.1 Preface
1.2 Literature review
1.3 Research motivation and objectives
1.4 Thesis organization
Chapter 2 Mobile robot system architecture
2.1 System planning
2.2 Localization system
2.2.1 Electronic compass
2.2.2 Image localization
2.2.2.1 Webcam system
2.3 Wireless communication system
2.4 Hardware circuit design
2.4.1 Stepper motor system
2.4.2 Circuit board architecture
Chapter 3 Mathematical foundations of image localization
3.1 Image processing
3.1.1 Relationship between color and light sources
3.1.1.1 RGB color coordinate system
3.1.1.2 YIQ color coordinate system
3.1.1.3 HSI color coordinate system
3.1.1.4 HSV color coordinate system
3.1.2 Image binarization
3.1.3 Region growing/segmentation
3.1.4 Morphology
3.1.5 Labeling
3.1.6 Summary of image processing
3.2 Localization
3.2.1 Coordinate transformation between the image and the camera
3.2.2 Derivation of the localization formulas
3.2.3 Summary of localization
Chapter 4 System program design
4.1 Microcontroller design
4.1.1 Main program design
4.1.2 Electronic compass decoding
4.1.3 Forward and reverse rotation of the stepper motors
4.2 Communication interface
4.2.1 Serial communication interface
4.2.2 Calling method
4.3 Image localization program design
Chapter 5 Image localization error analysis
5.1 Experiment 1
5.2 Experiment 2
5.3 Experiment 3
5.4 Summary of experiments
Chapter 6 Conclusions and future work
References
Biography