National Digital Library of Theses and Dissertations in Taiwan

Detailed Record

Author: 楊榮程
Author (English): Yang, Jung-Cheng
Title (Chinese): 光達與慣性感測器融合之定位與建圖於無人機自主導航之應用
Title (English): Tightly-Coupled Lidar-Inertial SLAM with Autonomous Navigation for Quadrotors
Advisor: 程登湖
Advisor (English): Cheng, Teng-Hu
Committee: 程登湖、陳宗麟、王傑智
Committee (English): Cheng, Teng-Hu; Chen, Tsung-Lin; Wang, Chieh-Chih
Oral Defense Date: 2020-08-06
Degree: Master's
Institution: National Chiao Tung University (國立交通大學)
Department: Master's Degree Program of Robotics, College of Engineering
Discipline: Engineering
Academic Field: Other Engineering
Thesis Type: Academic thesis
Publication Year: 2020
Graduation Academic Year: 109
Language: English
Pages: 63
Keywords (Chinese): 同步定位與建圖、自動駕駛、四軸無人機
Keywords (English): SLAM; Autonomous Navigation; Quadrotors
In recent years, robotics has attracted growing attention. SLAM has been studied for quite some time and is a foundation that robots rely on to accomplish demanding tasks. The importance of sensor fusion, however, has only begun to be investigated in recent years.
This thesis proposes a SLAM system that fuses lidar and inertial measurements, including loop-closure detection and global pose optimization. The lidar and the inertial sensor compensate for each other's weaknesses, yielding more accurate position and attitude estimates. The pose and map estimated by the SLAM system supply the robot with information about its environment and enable search-based motion planning. By accounting for both the obstacles in the environment and the vehicle's own motion constraints, the search-based planner generates a smooth, minimum-time trajectory.
Finally, the lidar-inertial system is tested on the KITTI dataset to evaluate the accuracy of the method. Search-based motion planning is also implemented on a quadrotor in an indoor environment, generating a locally optimal path in real time.
In recent years, the field of robotics has attracted considerable attention. For robots to perform high-level tasks, SLAM is a fundamental technique to build on. The SLAM problem has been well studied for some time, but the importance of fusing sensors that complement one another has only been investigated in the past decade.
In this work, a tightly-coupled lidar-inertial SLAM system is developed with loop closure and pose graph optimization. The estimated pose and map from the SLAM system provide knowledge of the environment to the search-based motion planning method. By taking the surrounding obstacles and the motion constraints into consideration, a smooth and minimum-time trajectory is generated.
Finally, the lidar-inertial SLAM system is evaluated on the KITTI dataset, and an indoor flight experiment demonstrates the capability of generating a locally optimal trajectory in real time.
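Since the abstract only outlines the approach, the following is a minimal, illustrative sketch of the kind of search-based motion planner described above (in the spirit of the motion-primitive planners this line of work builds on): A* over constant-acceleration primitives for a double-integrator model, with a stage cost trading control effort against flight time and a sampled collision check against an occupancy grid. The toy map, primitive set, cost weights, and all names below are assumptions for illustration, not the thesis's implementation.

```python
# Minimal, illustrative sketch of search-based motion planning over
# constant-acceleration motion primitives (NOT the thesis's code).
# State = (position, velocity) of a double integrator; each primitive applies
# one discretized acceleration for TAU seconds; A* minimizes control effort
# plus a time penalty, rejecting primitives that cross occupied grid cells.
import heapq
import itertools
import numpy as np

TAU = 0.5                    # primitive duration [s]          (assumed)
RHO = 1.0                    # time-vs-effort trade-off weight (assumed)
V_MAX = 2.0                  # per-axis velocity bound [m/s]   (assumed)
ACCELS = [-1.0, 0.0, 1.0]    # discretized accelerations [m/s^2]
RES = 0.1                    # occupancy-grid resolution [m]

def propagate(state, accel):
    """Double-integrator dynamics over one primitive."""
    pos, vel = state
    a = np.asarray(accel)
    return (pos + vel * TAU + 0.5 * a * TAU ** 2, vel + a * TAU)

def collision_free(state, accel, occupied):
    """Sample along the primitive; reject it if any sample lies in an occupied cell."""
    pos, vel = state
    a = np.asarray(accel)
    for t in np.linspace(0.0, TAU, 6):
        p = pos + vel * t + 0.5 * a * t ** 2
        if tuple(np.round(p / RES).astype(int)) in occupied:
            return False
    return True

def state_key(state):
    """Discretize (position, velocity) so visited states can be pruned."""
    pos, vel = state
    return (tuple(np.round(pos / RES).astype(int)), tuple(np.round(vel, 1)))

def plan(start, goal, occupied, max_expansions=200000):
    """A* over motion primitives; returns the visited state sequence or None."""
    goal = np.asarray(goal, float)
    start_state = (np.asarray(start, float), np.zeros(2))
    open_set = [(0.0, 0.0, 0, start_state, [start_state])]
    closed, counter = set(), itertools.count(1)
    for _ in range(max_expansions):
        if not open_set:
            break
        _, g, _, state, path = heapq.heappop(open_set)
        if np.linalg.norm(state[0] - goal) < 0.3:           # close enough to the goal
            return path
        k = state_key(state)
        if k in closed:
            continue
        closed.add(k)
        for accel in itertools.product(ACCELS, repeat=2):   # 2-D for readability
            nxt = propagate(state, accel)
            if np.any(np.abs(nxt[1]) > V_MAX) or not collision_free(state, accel, occupied):
                continue
            g_new = g + (np.dot(accel, accel) + RHO) * TAU  # effort + time penalty
            # lower bound on remaining flight time -> admissible heuristic
            h = np.linalg.norm(nxt[0] - goal) / (V_MAX * np.sqrt(2))
            heapq.heappush(open_set,
                           (g_new + RHO * h, g_new, next(counter), nxt, path + [nxt]))
    return None

if __name__ == "__main__":
    # Toy map: a wall of occupied cells between the start and the goal.
    occupied = {(x, y) for x in range(8, 12) for y in range(-10, 11)}
    trajectory = plan(start=(0.0, 0.0), goal=(3.0, 0.0), occupied=occupied)
    if trajectory is not None:
        for pos, vel in trajectory:
            print("p =", np.round(pos, 2), " v =", np.round(vel, 2))
```

In the thesis itself, the planner operates on the pose and map maintained by the lidar-inertial SLAM front-end and re-plans online as previously unknown obstacles are observed (see Sections 6.3.1 and 6.3.2 of the table of contents below).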
Abstract i
摘要 (Chinese Abstract) ii
Acknowledgements iii
Contents iv
List of Figures vii
List of Tables viii
Notations ix
Chapter 1 Introduction 1
1.1 Motivation 1
1.2 Problem Statement 2
1.3 Contribution 2
Chapter 2 Background and Related Works 3
2.1 Lidar Ego Motion Estimation 3
2.2 Loop-Closure and Pose-Graph Optimization 4
2.3 Motion Planning 5
2.4 System Overview 6
Chapter 3 Front-End: Lidar-Inertial SLAM 7
3.1 Estimator Initialization 7
3.1.1 IMU Pre-Integration 8
3.1.2 Lidar Feature Extraction and Frame-to-Frame Odometry 11
3.1.3 Gyroscope Bias Estimation 11
3.1.4 Gravity and Velocity Estimation 12
3.1.5 Gravity Estimation with High Speed Motion 13
3.1.6 Completing Initialization 13
3.2 Tightly-Coupled Lidar-Inertial SLAM 14
3.2.1 Formulation 14
3.2.2 IMU Measurement Residual 17
3.2.3 Lidar Measurement Residual 17
3.2.4 Marginalization 19
3.2.5 Mapping for The Lidar Point Cloud 21
Chapter 4 Back-End: Loop-Closure and Pose Graph Optimization 22
4.1 Loop-Closure and Relocalization 23
4.1.1 Keyframe Selection 23
4.1.2 Ground Segmentation 24
4.1.3 Loop-Closure 25
4.1.4 Tightly-Coupled Relocalization 27
4.2 Global Pose Graph Optimization 28
4.2.1 Adding Keyframes into Pose Graph 28
4.2.2 4-DOF Pose Graph Optimization 30
4.2.3 Global Consistency Maintenance 31
Chapter 5 Search-Based Motion Planning 32
5.1 Optimal Trajectory Planning 32
5.2 Real-Time Heuristic Search 35
5.3 Collision Checking for Trajectory 37
Chapter 6 Experiments 39
6.1 Front-End Performance 39
6.2 Full Closed Loop Performance 40
6.2.1 Evaluate with KITTI Evaluation Tool 40
6.2.2 Evaluate with EVO 46
6.3 Indoor Planning Test 48
6.3.1 Comparison with Global Planning 48
6.3.2 Re-planning with Unknown Environment 50
6.4 Time Consumption 53
Chapter 7 Conclusion and Future Work 54
7.1 Conclusion 54
7.2 Future Work 55
Reference 56
Appendix 58