臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)

詳目顯示 (Detailed Record)
Author: 陶冠廷 (Guan-Ting Tao)
Title: 微型飛行器於室內環境之視覺導航與控制 (Visual Navigation and Servoing of MAV in the Indoor Environment)
Advisors: 曹大鵬 (Ta-Peng Tsao), 黃正民 (Cheng-Ming Huang)
Oral examination committee: 簡忠漢, 練光祐
Oral defense date: 2012-07-19
Degree: Master's
Institution: 國立臺北科技大學 (National Taipei University of Technology)
Department: 電機工程系研究所 (Graduate Institute of Electrical Engineering)
Discipline: Engineering
Field: Electrical and Information Engineering
Document type: Academic thesis
Publication year: 2012
Graduation academic year: 100 (2011–2012)
Language: Chinese
Pages: 65
Keywords (Chinese): 光流法, 消失點, 微型飛行器, 導航, 牆面偵測
Keywords (English): Optical Flow, Vanishing Point, MAV, Navigation, Wall Detection
Usage statistics:
  • Cited by: 3
  • Views: 655
  • Downloads: 0
  • Bookmarked: 0
This thesis uses a micro aerial vehicle (MAV) for autonomous navigation in indoor environments. A single camera serves as the main sensor: the captured images are processed to extract information about the surroundings, which is then used for flight control. Optical flow and the vanishing point are employed to carry out indoor corridor navigation: optical flow yields a local navigation point, while the vanishing point yields a global navigation point. The local navigation point allows the MAV to avoid static and dynamic obstacles, and the global navigation point locates the center of the corridor. Because optical-flow navigation breaks down whenever the feature points must be re-seeded, leaving the flow momentarily unestimable, a dual-thread optical-flow method is proposed to overcome this problem. The local and global navigation points are then combined to plan the MAV's corridor-traversal flight. Since indoor navigation often fails when a wall goes undetected, this thesis further proposes a wall-detection method based on a statistical analysis of the optical-flow directions in the image, which triggers a wall-avoidance maneuver and enables a wandering-flight strategy. A visual servo controller is then designed to accomplish the various flight navigation tasks of the MAV. Finally, experiments confirm that the dual-thread optical-flow method combined with vanishing-point navigation and wall detection can be applied effectively in indoor environments.
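The global navigation point above is derived from the corridor's vanishing point. As a rough illustration of the underlying idea only (the thesis's actual estimator additionally removes complex texture edges before estimation), the following NumPy sketch represents candidate edge lines in homogeneous coordinates, intersects them pairwise, and keeps the intersection best supported by all lines; every name in it is hypothetical, not from the thesis.

```python
import numpy as np

def line_params(p1, p2):
    # Homogeneous line through two image points: cross product of the
    # points written as (x, y, 1).
    x1, y1 = p1
    x2, y2 = p2
    return np.cross([x1, y1, 1.0], [x2, y2, 1.0])

def vanishing_point(segments):
    """Estimate a vanishing point as the pairwise line intersection with
    the smallest summed perpendicular distance to all candidate lines."""
    lines = [line_params(a, b) for a, b in segments]
    best, best_err = None, np.inf
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            p = np.cross(lines[i], lines[j])
            if abs(p[2]) < 1e-9:          # parallel lines meet at infinity
                continue
            pt = p[:2] / p[2]             # back to pixel coordinates
            err = sum(abs(l @ np.array([pt[0], pt[1], 1.0]))
                      / np.hypot(l[0], l[1]) for l in lines)
            if err < best_err:
                best, best_err = pt, err
    return best

# Synthetic corridor edges that all pass through (320, 240).
segs = [((0, 0), (320, 240)), ((640, 0), (320, 240)), ((0, 480), (320, 240))]
vp = vanishing_point(segs)
print(vp)  # → [320. 240.]
```

On real imagery the segments would come from an edge detector, and a robust voting scheme would replace the exhaustive pairwise search.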

This thesis uses an MAV (Micro Aerial Vehicle) to navigate in indoor environments. From the images captured by the monocular camera on the MAV, information about the MAV's surroundings is extracted and used to navigate the vehicle. Here, the optical flow and the vanishing point are estimated and integrated to guide navigation through a corridor. The optical flow, obtained with the pyramidal Lucas-Kanade algorithm, is used to construct a local navigation point for avoiding static or moving obstacles, and the vanishing point is estimated as the global navigation point indicating the direction of the corridor. Since the feature points tracked by the Lucas-Kanade algorithm must be re-seeded every several image frames to keep the tracking correct, the optical flow cannot serve as the local navigation point at the moment the feature points are re-seeded. A dual-thread pyramidal Lucas-Kanade scheme is proposed to overcome this problem. In addition, wandering navigation in a room is achieved with a wall-detection method that analyzes the directional distribution of the optical flow. A visual servoing law is also developed to accomplish the corridor-passing and room-wandering navigation tasks. The experimental results demonstrate the effectiveness of these approaches for autonomous MAV flight in indoor environments.
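To illustrate how an optical-flow field can yield a local navigation point, here is a minimal sketch assuming a precomputed per-pixel flow-magnitude map (the thesis's actual construction builds an optical-flow map and a passable-region map from tracked feature points; the names below are hypothetical): divide the image into a grid, total the flow magnitude in each cell, and steer toward the cell with the least flow, since under forward motion small flow suggests distant, passable space.

```python
import numpy as np

def local_navigation_point(flow_mag, grid=(4, 4)):
    """Return (x, y) of the centre of the grid cell with the least total
    optical-flow magnitude, taken as the local navigation target."""
    h, w = flow_mag.shape
    gh, gw = grid
    ch, cw = h // gh, w // gw
    # Sum the flow magnitude inside each cell of the gh x gw grid.
    sums = flow_mag[:gh * ch, :gw * cw].reshape(gh, ch, gw, cw).sum(axis=(1, 3))
    r, c = np.unravel_index(np.argmin(sums), sums.shape)
    return (c * cw + cw // 2, r * ch + ch // 2)

# Synthetic flow field: large flow (near obstacle) on the left half,
# small flow (free space) on the right half of a 640x480 image.
mag = np.ones((480, 640))
mag[:, :320] = 5.0
nav = local_navigation_point(mag)
print(nav)  # → (400, 60)
```

The navigation point lands in the low-flow right half, so a controller regulating the image-plane error between this point and the image centre would steer the vehicle away from the obstacle.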

Table of Contents
Abstract (Chinese) i
Abstract (English) ii
Acknowledgements iv
Table of Contents v
List of Tables vii
List of Figures viii
Chapter 1 Introduction - 1 -
1.1 Preface - 1 -
1.2 Motivation - 2 -
1.3 Related Work - 4 -
1.4 Thesis Organization - 5 -
Chapter 2 Flight Navigation Planning - 6 -
2.1 Local Navigation Planning for Obstacle Avoidance - 6 -
2.1.1 Feature-Point Seeding Scheme - 7 -
2.1.2 Re-seeding Conditions - 8 -
2.1.3 Dual-Thread Optical Flow - 10 -
2.1.4 Construction of the Local Navigation Point - 14 -
2.2 Global Navigation Planning for Corridor Traversal - 18 -
2.2.1 Vanishing-Point Estimation with Complex Texture Edges Removed - 20 -
2.2.2 Construction of the Global Navigation Point - 23 -
2.3 Indoor Wandering Navigation Planning - 23 -
2.3.1 Wall Detection - 24 -
2.3.2 Wandering Navigation - 31 -
2.4 Strategies for Combining the Navigation Modes - 33 -
2.4.1 Navigation Strategy for Indoor Corridors - 33 -
2.4.2 Navigation Strategy for Enclosed Indoor Scenes - 34 -
2.4.3 Relations among the Navigation Cues - 35 -
Chapter 3 Control Strategy - 36 -
3.1 Visual Servo Controller Design for Indoor Corridor Traversal - 39 -
3.2 Visual Servo Controller Design for Indoor Wandering - 40 -
Chapter 4 Experimental Results - 41 -
4.1 Experimental Equipment - 41 -
4.2 Experimental Scenes - 42 -
4.3 Obstacle Avoidance - 43 -
4.3.1 Static Obstacles - 43 -
4.3.2 Dynamic Obstacles - 44 -
4.4 Wall Detection - 45 -
4.4.1 Grayish-White Walls - 45 -
4.4.2 Non-Grayish-White Walls - 46 -
4.4.3 Wall-Detection Missions of the MAV - 47 -
4.5 Indoor Navigation - 49 -
4.5.1 Indoor Corridor Navigation - 49 -
4.5.2 Indoor Wandering Navigation - 51 -
4.5.3 Wandering Navigation from an Enclosed Indoor Scene to a Corridor - 53 -
4.5.4 Wandering Navigation from a Corridor to an Enclosed Indoor Scene - 55 -
Chapter 5 Conclusions and Future Work - 57 -
References - 59 -
Appendix - 63 -
A Hardware Architecture of the MAV - 63 -
B Optical Flow - 64 -

List of Tables
Table 2.1 Decision Table for Dual-Thread Optical Flow - 12 -
Table 3.1 Control Commands of the MAV - 38 -
Table 4.1 Experimental Equipment - 41 -


List of Figures
Figure 1.1 System Architecture - 3 -
Figure 1.2 The MAV - 3 -
Figure 2.1 Harris Corners - 7 -
Figure 2.2 Uniform Seeding of Feature Points - 7 -
Figure 2.3 Re-seeding Triggered by Point Count - 8 -
Figure 2.4 Over-Concentrated Feature Points (Blue Dots) - 8 -
Figure 2.5 Re-seeding Triggered by Frame Count - 9 -
Figure 2.6 Single-Thread Optical Flow Before and After Re-seeding - 10 -
Figure 2.7 GPU Architecture - 10 -
Figure 2.8 Dual-Thread Optical Flow - 11 -
Figure 2.9 Optical-Flow Map and Passable-Region Map - 15 -
Figure 2.10 Flowchart of the Local Navigation Point - 16 -
Figure 2.11 Images for Constructing the Local Navigation Point - 17 -
Figure 2.12 Avoiding a Pedestrian via the Local Navigation Point - 17 -
Figure 2.13 Global Navigation Point (Gray) and Vanishing Point (Blue) - 18 -
Figure 2.14 The Vanishing Point - 19 -
Figure 2.15 Generation of the Navigation Point and Vanishing Point - 19 -
Figure 2.16 Planes of Each Color Channel - 20 -
Figure 2.17 Comparison of Sobel and Enhanced Edge Maps - 20 -
Figure 2.18 Noise-Removal Procedure - 22 -
Figure 2.19 Vanishing-Point Estimation with Complex Edges Removed - 22 -
Figure 2.20 Flowchart of the Global Navigation Point - 23 -
Figure 2.21 Indoor Wandering - 24 -
Figure 2.22 Grayish-White Wall Case - 25 -
Figure 2.23 Whiteboard Wall Case - 25 -
Figure 2.24 Black Metal Cabinet Wall Case - 26 -
Figure 2.25 Fabric Partition Wall Case - 26 -
Figure 2.26 Glass Wall Case - 26 -
Figure 2.27 Patterns of the Wall Templates - 27 -
Figure 2.28 Image Grid - 27 -
Figure 2.29 Flowchart of the Symmetry Axis - 28 -
Figure 2.30 Histogram of Point Counts per Angle - 28 -
Figure 2.31 Unit Polar Coordinates - 29 -
Figure 2.32 Pattern-Classification Procedure - 29 -
Figure 2.33 Flowchart of Wall Classification - 31 -
Figure 2.34 Navigation Decision When a Wall Is Encountered - 32 -
Figure 2.35 Navigation Strategy for Corridors - 33 -
Figure 2.36 Navigation Switching Combining the Local and Global Navigation Points - 34 -
Figure 2.37 Navigation Strategy for Enclosed Indoor Scenes - 34 -
Figure 2.38 Scene Relations of the Navigation Cues - 35 -
Figure 3.1 Definition of the Position Error on the Image - 36 -
Figure 3.2 Motion Attitudes of the MAV - 37 -
Figure 3.3 A General Servo Control System - 37 -
Figure 3.4 Visual Servo Control Flow for Corridors - 39 -
Figure 3.5 Visual Servo Control Flow for Indoor Wandering - 40 -
Figure 4.1 The MAV Flying in a Corridor - 41 -
Figure 4.2 A Cluttered Indoor Corridor Scene - 42 -
Figure 4.3 An Enclosed Indoor Scene - 42 -
Figure 4.4 Avoiding a Static Obstacle - 43 -
Figure 4.5 Avoiding a Dynamic Obstacle - 44 -
Figure 4.6 Detection Results for Grayish-White Walls - 45 -
Figure 4.7 Detection Results for Non-Grayish-White Walls - 46 -
Figure 4.8 MAV Detection of a White-and-Orange Wall - 47 -
Figure 4.9 MAV Detection of a Glass Wall - 47 -
Figure 4.10 MAV Detection of a Gray Wall - 48 -
Figure 4.11 MAV Detection of a Grayish-White Wall - 48 -
Figure 4.12 Global Navigation Point (Gray) and Vanishing Point (Blue) in Corridor Navigation - 49 -
Figure 4.13 Local Navigation Point (Blue Circle) in Corridor Navigation - 49 -
Figure 4.14 Optical-Flow Map in Corridor Navigation - 50 -
Figure 4.15 MAV Wandering in an Enclosed Indoor Scene (Blue Circle: Local Navigation Point) - 51 -
Figure 4.16 MAV Wandering in an Enclosed Indoor Scene (Third-Person View) - 52 -
Figure 4.17 Navigation Switching from an Enclosed Indoor Scene to a Corridor - 53 -
Figure 4.18 Navigation Switching from an Enclosed Indoor Scene to a Corridor (Third-Person View) - 54 -
Figure 4.19 Navigation Switching from a Corridor to an Enclosed Indoor Scene - 55 -
Figure 4.20 Navigation Switching from a Corridor to an Enclosed Indoor Scene (Third-Person View) - 55 -



