
National Digital Library of Theses and Dissertations in Taiwan (臺灣博碩士論文加值系統)

Detailed Record

Author: Huang, Tzu-Cheng (黃自誠)
Title (Chinese): 具視覺慣性里程術的模型預測控制於無人履帶車之軌跡跟蹤
Title (English): Trajectory Tracking of Unmanned Tracked Vehicle by Model Predictive Control with Visual-Inertial Odometry
Advisor: Chen, Shean-Jen (陳顯禎)
Committee members: Li, Tzuu-Hseng; Juang, Jyh-Chin; Hwang, Kao-Shing; Peng, Chao-Wei
Oral defense date: 2020-07-08
Degree: Master's
Institution: National Chiao Tung University (國立交通大學)
Department: Institute of Imaging and Biomedical Photonics (影像與生醫光電研究所)
Discipline: Engineering
Academic field: Biomedical Engineering
Thesis type: Academic thesis
Year of publication: 2020
Graduation academic year: 108 (2019/20)
Language: Chinese
Number of pages: 47
Keywords: unmanned tracked vehicle, visual-inertial odometry, sensor fusion, instantaneous center of rotation, slip parameter, trajectory tracking, model predictive control
Abstract: In recent years, unmanned ground vehicles (UGVs) have been applied in increasingly diverse fields. This thesis applies a UGV to agricultural land and performs centimeter-level trajectory tracking, so that the vehicle can later serve as a platform for laser-based pest control or blossom thinning. To cope with rapidly changing field conditions and the required maneuverability, a skid-steer electric tracked vehicle is adopted, and all of its mechatronic and autonomous-control components are integrated through the Robot Operating System (ROS). A kinematic model of the vehicle is first established, and the instantaneous centers of rotation (ICR), assisted by visual-inertial odometry (VIO) from an Intel D435i depth camera, are used to estimate the slip parameters in real time, reducing the influence of slippage on the control of the vehicle's linear and angular velocities. Experiments show that the velocity-control error on different ground surfaces can be kept within 10%. For trajectory tracking in unknown environments, RTAB-Map (Real-Time Appearance-Based Mapping) is chosen as the simultaneous localization and mapping (SLAM) method. The Intel D435i depth camera, rather than LiDAR (light detection and ranging), provides the visual odometry (VO), mainly because the D435i costs less and farmland offers relatively few reference feature points. To cope with the non-rigid, non-constant-illumination, and non-static farm environment, measurements from the inertial measurement unit (IMU) built into the D435i are fused with the VO by an extended Kalman filter (EKF) to provide VIO, which keeps the dynamic positioning error below 5% and improves the robustness of localization. Finally, model predictive control (MPC) is applied in Matlab simulations and field experiments to achieve accurate trajectory tracking in complex agricultural environments.
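The slip-compensated velocity control mentioned above is commonly built on an ICR-based kinematic model for skid-steer and tracked vehicles, in which the lateral offsets of the tread ICRs serve as the slip parameters. The short Python sketch below illustrates that standard model under a body frame with x forward and y to the left; the function names, frame convention, and numeric values are illustrative assumptions, not code from the thesis.

# Illustrative sketch (not from the thesis): ICR-based kinematic model of a
# skid-steer tracked vehicle. Body frame: x forward, y to the left, yaw rate
# omega_z about z. The tread ICR offsets (y_icr_l, y_icr_r, x_icr_v) act as
# slip parameters; in the thesis they are estimated online with VIO.

def icr_forward_kinematics(v_l, v_r, y_icr_l, y_icr_r, x_icr_v):
    """Track rolling speeds (v_l, v_r) -> body velocity (v_x, v_y, omega_z).

    Without slip, y_icr_l = +b/2 and y_icr_r = -b/2 (b = track spacing) and
    x_icr_v = 0, which recovers the ideal differential-drive model
    v_x = (v_l + v_r)/2, omega_z = (v_r - v_l)/b. Slip pushes the tread ICRs
    outward (|y_icr| > b/2) and can shift the vehicle ICR along x.
    """
    d = y_icr_l - y_icr_r                      # effective, slip-expanded track spacing
    v_x = (v_r * y_icr_l - v_l * y_icr_r) / d  # longitudinal velocity
    v_y = x_icr_v * (v_l - v_r) / d            # lateral (skid-induced) velocity
    omega_z = (v_r - v_l) / d                  # yaw rate
    return v_x, v_y, omega_z


def icr_inverse_kinematics(v_x_cmd, omega_cmd, y_icr_l, y_icr_r):
    """Slip-compensated track-speed commands for a desired (v_x, omega_z)."""
    v_l = v_x_cmd - omega_cmd * y_icr_l
    v_r = v_x_cmd - omega_cmd * y_icr_r
    return v_l, v_r


if __name__ == "__main__":
    # Assumed ICR estimates for soft ground (nominal half spacing about 0.25 m).
    y_icr_l, y_icr_r, x_icr_v = 0.35, -0.33, 0.02
    v_l, v_r = icr_inverse_kinematics(0.5, 0.3, y_icr_l, y_icr_r)
    # Feeding the commands back through the forward model recovers the
    # requested v_x = 0.5 m/s and omega_z = 0.3 rad/s.
    print(icr_forward_kinematics(v_l, v_r, y_icr_l, y_icr_r, x_icr_v))

In a velocity loop like the one described in the abstract, such an inverse mapping would take the place of the ideal differential-drive conversion, so the commanded track speeds already account for the ICR offsets estimated online.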
Abstract (Chinese) I
Abstract (English) III
Acknowledgements IV
Table of Contents VI
List of Figures VIII
List of Tables IX
Chapter 1 Introduction 1
1-1 Preface 1
1-2 Research Motivation and Approach 1
1-3 Thesis Organization 3
Chapter 2 Mechatronic System and Kinematic Model of the Tracked Vehicle 4
2-1 Mechatronic System 4
2-2 Kinematic Model of the Tracked Vehicle 5
2-3 Calculation of Instantaneous Centers of Rotation and Slip Parameters 7
Chapter 3 Simultaneous Localization and Mapping 11
3-1 Simultaneous Localization and Mapping 11
3-1-1 Feature Point Extraction 12
3-1-2 Mapping 15
3-1-3 Loop Closure Detection 16
3-2 Visual-Inertial Odometry 17
3-2-1 Kalman Filter 18
3-2-2 Extended Kalman Filter 20
Chapter 4 Model Predictive Control 22
4-1 Model Predictive Control 22
4-1-1 Prediction Model 23
4-1-2 Receding-Horizon Optimization 24
4-1-3 Feedback Correction 25
4-2 Nonlinear Model Predictive Control 25
4-3 Adaptive Model Predictive Control 26
Chapter 5 Experimental Results and Discussion 27
5-1 Positioning Accuracy of Visual-Inertial Odometry 27
5-2 Slip Parameters and Instantaneous Centers of Rotation 29
5-3 Trajectory Tracking with Pure Pursuit Control 39
5-4 Trajectory Tracking with Model Predictive Control 40
Chapter 6 Conclusions and Future Work 43
References 45
[1] M. Labbé and F. Michaud, “RTAB-Map as an open-source lidar and visual simultaneous localization and mapping library for large-scale and long-term online operation,” Journal of Field Robotics 36 (2019) 416–446.
[2] M. Labbé and F. Michaud, “Appearance-based loop closure detection for online large-scale and long-term operation,” IEEE Transactions on Robotics 29 (2013) 734–745.
[3] J. Shi and C. Tomasi, “Good features to track,” Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (1994) 593–600.
[4] M. Calonder et al., “BRIEF: Binary robust independent elementary features,” Computer Vision – ECCV (2010) 778–792.