臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)

詳目顯示 (Detailed Record)

Author: 林忠運
Author (English): Chung-Yun Lin
Title: 移動型機器人於智慧型空間之導航與控制
Title (English): Navigation and Control of a Mobile Robot in an Intelligent Space
Advisor: 張文中
Committee members: 張耀仁, 王銀添, 蔡清元, 林錫寬
Oral defense date: 2007-07-25
Degree: Master's
Institution: 國立臺北科技大學 (National Taipei University of Technology)
Department: 電機工程系所 (Department of Electrical Engineering)
Discipline: Engineering
Field: Electrical and Information Engineering
Document type: Academic thesis
Publication year: 2007
Graduation academic year: 95 (2006-2007)
Language: Chinese
Pages: 76
Keywords (Chinese): 智慧型空間, 移動型機器人, 機器人導航, 感測器網路, 視覺伺服
Keywords (English): Intelligent Space, Mobile Robot, Robot Navigation, Sensor Network, Visual Servoing
Usage statistics:
  • Cited by: 1
  • Views: 188
  • Rating: (none)
  • Downloads: 0
  • Bookmarked: 3
Abstract (Chinese, translated):
In most existing research, the absolute positions of the cameras mounted in the space are assumed to be known, which facilitates the navigation and control of a mobile robot in the intelligent space. To enable robot navigation and control when the absolute positions of the space cameras are unknown, this thesis proposes a vision-based robot control system that operates without knowledge of those positions. The system uses the robot's built-in monocular camera to locate the color-coded cameras in the space and switches among the space cameras according to the proposed selection rule in order to navigate the robot to a target point. Two controllers are proposed to accomplish the positioning task: a fixed-gain controller and a fuzzy controller. The fixed-gain controller is designed to drive the heading-angle and distance errors to zero, while the fuzzy controller is designed from the distance, the angular error with respect to the image horizontal axis, and the angular error between the robot and the target. Three network cameras were installed in an indoor space to construct the intelligent environment, and a custom-built mobile robot carrying a monocular webcam was used to verify the two controllers, the camera selection rule, and the control of the robot as it navigates to a target point in the intelligent space.
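The exact fixed-gain control law is not reproduced in this record. As a rough illustration of the kind of design described above (driving the distance and heading-angle errors to zero), the sketch below implements a standard fixed-gain go-to-goal controller for a wheeled, unicycle-type robot; the gain values, function name, and state variables are assumptions for illustration, not the thesis's actual parameters.

```python
import math

def fixed_gain_control(x, y, theta, x_goal, y_goal, k_rho=0.5, k_alpha=1.5):
    """Illustrative fixed-gain go-to-goal law for a unicycle-type robot.

    Drives the distance error rho and the heading error alpha toward zero.
    Gains and names are assumed for illustration, not taken from the thesis.
    """
    dx, dy = x_goal - x, y_goal - y
    rho = math.hypot(dx, dy)                              # distance error to the target
    alpha = math.atan2(dy, dx) - theta                    # heading error
    alpha = math.atan2(math.sin(alpha), math.cos(alpha))  # wrap to [-pi, pi]
    v = k_rho * rho          # forward velocity proportional to the distance error
    omega = k_alpha * alpha  # angular velocity proportional to the heading error
    return v, omega
```

For example, fixed_gain_control(0.0, 0.0, 0.0, 1.0, 1.0) returns a positive forward velocity (about 0.71) and a positive turn rate (about 1.18 rad/s), steering the robot toward a target at (1, 1).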
Abstract (English):
In most existing studies, the absolute positions of the space cameras are assumed to be known so as to facilitate navigation and control of a mobile robot in an intelligent space. To navigate and control a robot when these positions are unknown, this thesis proposes a vision-based control system that employs space cameras whose positions are not known in advance. The system uses a single on-board camera to search for and identify the color-coded space cameras in the intelligent space, and relies on the proposed selection rule to switch the space cameras in the feedback loop so as to control and navigate the robot to a target. The thesis further designs a fixed-gain controller and a fuzzy controller for this purpose. Three IP cameras are installed in an indoor laboratory environment to build up the intelligent space, and a custom wheeled mobile robot with a single on-board camera is used to verify the effectiveness of the two controllers, the viability of the proposed selection rule, and the performance of controlling and navigating the robot to the target in the intelligent space.
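The fuzzy controller's membership functions and rule base are detailed only in Chapter 5 of the thesis and are not shown on this page. The sketch below is a minimal, simplified illustration of a fuzzy navigation controller of the kind the abstract describes, using two of the three inputs (distance and heading error), assumed membership functions, and weighted-average defuzzification; all breakpoints, rule consequents, and names are hypothetical.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_control(rho, alpha):
    """Toy fuzzy navigation controller (fuzzify -> infer -> defuzzify).

    Inputs:  rho   distance error to the target (m)
             alpha heading error of the robot (rad)
    Returns: (v, omega) forward and angular velocity commands.
    Membership breakpoints and rule consequents are assumed for illustration.
    """
    # Fuzzify the distance error (assumed 1 m scale).
    far = min(max(rho / 1.0, 0.0), 1.0)
    near = 1.0 - far
    # Fuzzify the heading error: negative / zero / positive.
    neg = min(max(-alpha / 1.0, 0.0), 1.0)
    pos = min(max(alpha / 1.0, 0.0), 1.0)
    zero = tri(alpha, -1.0, 0.0, 1.0)

    # Rule base: (firing strength, v consequent, omega consequent).
    rules = [
        (min(far, zero), 0.40, 0.0),   # far and aligned      -> drive fast, no turn
        (min(far, pos), 0.20, 0.8),    # far, target to left  -> slow down, turn left
        (min(far, neg), 0.20, -0.8),   # far, target to right -> slow down, turn right
        (min(near, zero), 0.10, 0.0),  # near and aligned     -> creep forward
        (min(near, pos), 0.05, 0.5),
        (min(near, neg), 0.05, -0.5),
    ]

    # Weighted-average (singleton) defuzzification.
    w = sum(strength for strength, _, _ in rules) or 1e-9
    v = sum(strength * cv for strength, cv, _ in rules) / w
    omega = sum(strength * cw for strength, _, cw in rules) / w
    return v, omega
```

In the thesis the fuzzy controller additionally takes the angular error with respect to the image horizontal axis as a third input; the two-input version above is only meant to show the fuzzify-infer-defuzzify structure that Sections 5.2.1 to 5.2.3 of the outline refer to.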
Table of Contents:
Abstract (Chinese) ... i
Abstract (English) ... ii
Acknowledgements ... iii
Contents ... iv
List of Figures ... vii
1 Introduction ... 1
  1.1 Research Motivation and Objectives ... 1
  1.2 Contributions of the Thesis ... 2
  1.3 Thesis Organization ... 3
2 System Overview ... 4
  2.1 Literature Review ... 4
  2.2 Notation ... 6
  2.3 System Architecture ... 7
  2.4 System Flow ... 9
3 Mobile Robot Navigation System in an Intelligent Space ... 13
  3.1 Camera Network Construction ... 13
    3.1.1 IP Camera Image Acquisition ... 14
    3.1.2 Webcam Image Acquisition ... 14
  3.2 Image Processing ... 15
    3.2.1 Color Filtering ... 17
    3.2.2 Connected Components ... 18
    3.2.3 Tracking Window ... 19
  3.3 Space Camera Selection Rule ... 20
4 Navigation and Control of the Mobile Robot in the Intelligent Space with a Fixed-Gain Controller ... 23
  4.1 Task Description ... 23
  4.2 Fixed-Gain Controller Design ... 24
5 Navigation and Control of the Mobile Robot in the Intelligent Space with a Fuzzy Controller ... 29
  5.1 Task Description ... 29
  5.2 Fuzzy Controller Design ... 31
    5.2.1 Membership Function Design ... 32
    5.2.2 Fuzzy Rules ... 32
    5.2.3 Defuzzification ... 34
6 Experiments ... 36
  6.1 Navigation Control with the Fixed-Gain Controller ... 37
    6.1.1 Experimental Setup ... 37
    6.1.2 Navigation Control via Cameras 1-2-3 ... 41
    6.1.3 Navigation Control via Cameras 3-2-1 ... 41
    6.1.4 Navigation Control via Cameras 2-3-1 ... 46
  6.2 Navigation Control with the Fuzzy Controller ... 55
    6.2.1 Experimental Setup ... 55
    6.2.2 Navigation Control via Cameras 1-2-3 ... 56
    6.2.3 Navigation Control via Cameras 3-2-1 ... 56
    6.2.4 Navigation Control via Cameras 2-3-1 ... 61
  6.3 Discussion of Experimental Results ... 70
7 Conclusions ... 71