National Digital Library of Theses and Dissertations in Taiwan

Detailed Record

Researcher: 沈士隆
Researcher (English): SHEN, SHIH-LUNG
Thesis Title: 基於視覺偵測之模糊PID控制系統於無人機自主追蹤與降落任務
Thesis Title (English): Vision-based Fuzzy PID Control Systems for Tracking and Landing of Autonomous UAV
Advisor: 林正堅
Advisor (English): LIN, CHENG-JIAN
Committee Members: 李慶鴻、陳政宏
Committee Members (English): LEE, CHING-HUNG; CHEN, CHENG-HUNG
Defense Date: 2024-07-22
Degree: Master's
Institution: National Chin-Yi University of Technology (國立勤益科技大學)
Department: Computer Science and Information Engineering (資訊工程系)
Discipline: Engineering
Field: Electrical and Computer Engineering
Document Type: Academic thesis
Publication Year: 2024
Graduation Academic Year: 112 (2023-2024)
Language: Chinese
Pages: 69
Keywords (Chinese): 無人機自主追蹤與降落系統、目標偵測、模糊神經網路、PD控制器、模糊自適應P控制器
Keywords (English): unmanned aerial and ground vehicle cooperation system, target detection, fuzzy neural network, PD controller, fuzzy adaptive P controller
Usage Statistics:
  • Cited: 0
  • Views: 23
  • Rating:
  • Downloads: 0
  • Bookmarked: 0
In recent years, the use of drones has expanded rapidly, yet autonomous tracking and landing remains a challenging task, particularly in complex real-world environments where environmental interference can compromise sensor accuracy and make precise landings on unmanned ground vehicles (UGVs) difficult. The drone's payload must also be considered: excessively heavy sensors drain the battery faster and destabilize the drone in flight. In addition, the onboard edge computing device has limited resources, so the computational load cannot be overly complex, and because the UGV is not stationary, its motion must be factored into the design of the autonomous landing controller. To tackle these challenges, this study introduces a vision-based system for the autonomous landing of a drone on a UGV. The system uses YOLO-tiny (You Only Look Once-tiny) and ArUco markers to detect a custom-designed landing platform. A vision-based fuzzy neural network height estimator is trained to take the landing-platform area detected by YOLO-tiny and output the vertical distance between the drone and the UGV. Transformations among the drone, camera, and world coordinate frames allow the drone's position to be regulated relative to the UGV from image coordinates, and a PD controller and a fuzzy adaptive P controller are designed to execute the tracking and landing tasks. In the experiments, the proposed fuzzy neural network achieves a mean absolute error of 0.0424, and in landing trials on a moving UGV, the proposed fuzzy adaptive P controller achieves a landing success rate of 95%.
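Since the abstract only summarizes the pipeline, the following minimal sketch illustrates two of the steps it names: ArUco-based detection of the landing platform and PD tracking combined with an adaptive proportional descent gain. It is written in Python against OpenCV's class-based ArUco API (OpenCV >= 4.7) and is not the thesis's implementation: every gain, threshold, and the simple two-gain blend standing in for the fuzzy inference system (and for the YOLO-tiny/FNN height stage, which is omitted) are hypothetical placeholders.

# Minimal sketch, assuming a downward-facing camera and OpenCV >= 4.7.
# Detect an ArUco marker, convert the pixel offset of its centre into PD
# tracking commands, and scale the descent rate with a crude two-gain blend
# standing in for the thesis's fuzzy adaptive P controller.
import cv2
import numpy as np


class PDController:
    """Discrete PD controller acting on a scalar image-plane error."""

    def __init__(self, kp: float, kd: float):
        self.kp, self.kd = kp, kd
        self.prev_err = 0.0

    def step(self, err: float, dt: float) -> float:
        d_err = (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.kd * d_err


def fuzzy_adaptive_p_gain(abs_err: float,
                          k_small: float = 0.4, k_large: float = 1.2,
                          e_lo: float = 0.05, e_hi: float = 0.30) -> float:
    """Blend two P gains by error magnitude -- a linear stand-in for a fuzzy
    inference system: gentle gain when centred, aggressive when far off."""
    w = min(max((abs_err - e_lo) / (e_hi - e_lo), 0.0), 1.0)
    return (1.0 - w) * k_small + w * k_large


def track_marker(frame: np.ndarray, dt: float,
                 pd_x: PDController, pd_y: PDController):
    """Return (vx, vy, vz) velocity commands from one camera frame, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary,
                                       cv2.aruco.DetectorParameters())
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return None  # marker lost: caller should hover or re-acquire

    # Normalised offset of the first marker's centre from the image centre.
    h, w = gray.shape
    centre = corners[0][0].mean(axis=0)     # marker centre (u, v), pixels
    ex = (centre[0] - w / 2.0) / (w / 2.0)  # lateral error in [-1, 1]
    ey = (centre[1] - h / 2.0) / (h / 2.0)  # longitudinal error in [-1, 1]

    vx = pd_x.step(ex, dt)                  # PD tracking commands
    vy = pd_y.step(ey, dt)

    # Descend only as fast as the centring error allows (negative = down).
    radial = float(np.hypot(ex, ey))
    vz = -fuzzy_adaptive_p_gain(radial) * (1.0 - radial)
    return vx, vy, vz

In a real system these velocity commands would be forwarded to the flight controller's offboard interface (for example, MAVLink velocity setpoints), and a height estimate such as the thesis's FNN output would gate the final touchdown.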


Abstract (Chinese) I
Abstract (English) III
Acknowledgments V
Table of Contents VI
List of Figures IX
List of Tables XII
Chapter 1 Introduction 1
1.1 Research Motivation 1
1.2 Research Objectives 2
1.3 Thesis Organization 3
Chapter 2 Literature Review 4
Chapter 3 Vision-Based Fuzzy Adaptive P Control System 10
3.1 System Architecture 11
3.1.1 Hardware and Communication Architecture 12
3.1.2 Software Architecture 13
3.1.3 Coordinate System Calibration 15
3.2 Visual Detection and Height Estimation Methods 17
3.2.1 Landing Platform Detection with YOLO-tiny 18
3.2.2 FNN Height Estimator 19
3.2.3 ArUco Marker Detection 21
3.3 Design of the PD Tracking and Fuzzy Adaptive P Landing Controllers 23
3.3.1 Yaw Correction Controller 23
3.3.2 PD Tracking Controller 26
3.3.3 Fuzzy Adaptive P Landing Controller 29
Chapter 4 Experimental Results 39
4.1 Data Collection and Preprocessing 39
4.2 Evaluation Metrics 41
4.3 Target Detection Results 42
4.4 FNN Height Estimator Results 44
4.5 ArUco Marker Detection Results 46
4.6 Control Parameter Settings for the Tracking and Landing Controllers 48
4.7 Autonomous Return, Tracking, and Landing Experiments on the UGV 51
Chapter 5 Conclusions and Future Work 62
References 64

Electronic Full Text (publicly available online from 2029-08-06)