
臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)

Detailed Record

Author: 傅兆陽
Author (English): Fu, Chao-Yang
Title: 基於視差變化之視覺里程器計算測繪車前拍雙相機定位定向之精度
Title (English): Precision of Orientation Data of Dual Forwards Photographing Cameras on MMS Determined by Visual Odometry Based on Disparity Changing
Advisor: 蔡展榮
Advisor (English): Tsay, Jaan-Rong
Committee members: 趙鍵哲、邱式鴻
Committee members (English): Jen-Jer Jaw; Shih-Hong Chio
Oral defense date: 2017-07-21
Degree: Master's
Institution: National Cheng Kung University (國立成功大學)
Department: Department of Geomatics (測量及空間資訊學系)
Discipline: Engineering
Field: Surveying Engineering
Thesis type: Academic thesis
Year of publication: 2017
Graduation academic year: 105
Language: English
Number of pages: 68
Keywords (Chinese): 視覺里程器、六自由度、自運動、定位定向、視差
Keywords (English): Visual Odometry; 6-DOF; Ego-motion; Pose Estimation; Disparity
This thesis uses Disparity Based Visual Odometry (DBVO) to estimate the pose of dual forwards photographing cameras on an MMS in an outdoor environment. The input data of DBVO are a sequence of forwards photographing stereo pairs, the Interior Orientation (IO) of the dual cameras, and their Relative Orientation (RO); the IO and RO are assumed to be invariant while the images are taken. In the experiment, the results of DBVO are transformed into the mapping frame by means of the Exterior Orientation (EO) of the first exposure station. DBVO consists of four steps: (1) keypoint detection and matching; (2) calculating the coordinates of object points by parallax equations; (3) determining the ego-motion by a 3D conformal transformation; (4) estimating the pose of the sequential images in the mapping frame. The test area is a closed block with an additional U-turn; the MMS drives about 1.5 km and takes 447 stereo pairs. Regarding the quality of ego-motion determination, the RMSD of the translations is 0.19 m along and 0.05 m across the depth direction, and the RMSD of the rotation angles ω, φ and κ is 0.0416°, 0.0837° and 0.0664°, respectively; ego-motion determination performs equally well along straight and turning paths. Regarding the quality of pose estimation in the mapping frame, the Absolute Difference (AD) of the horizontal position depends on the scenes along the paths, whereas the Relative Difference (RD) decreases stepwise as the moving distance increases and falls below 1% once the MMS has driven more than 650 m. The AD of the elevation grows gradually with the moving distance, while its RD decreases slightly and is about 0.20% after the MMS has driven 1.2 km. Regarding the quality of object point determination, the horizontal and vertical coordinate differences are 0.88 m and -0.04 m near the starting point; after the MMS has driven about 550 m, they reach 138.73 m and 10.79 m, respectively.
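To make the DBVO pipeline described in the abstract more concrete, the following Python sketch illustrates steps (2) to (4): computing object point coordinates from the parallax equations of an idealized, rectified stereo pair, recovering the rigid core of the 3D conformal transformation between two epochs with an SVD-based Horn/Procrustes fit, and chaining the relative ego-motions into mapping frame poses starting from the exterior orientation of the first exposure station. This is a minimal sketch under those assumptions; all function and variable names are hypothetical, the scale parameter of the conformal transformation is omitted (calibrated stereo fixes the scale), and it is not the implementation used in the thesis.

    import numpy as np

    def object_point_from_parallax(xl, yl, xr, base, f):
        # Parallax equations for an idealized, rectified (normal-case) stereo
        # pair: base = stereo baseline, f = focal length, (xl, yl) and xr are
        # image coordinates of one matched keypoint in the left/right photo.
        p = xl - xr              # x-parallax (disparity)
        Z = base * f / p         # depth along the viewing direction
        X = xl * Z / f           # lateral offset from the left camera
        Y = yl * Z / f           # vertical offset from the left camera
        return np.array([X, Y, Z])

    def rigid_fit(src, dst):
        # Least-squares rotation R and translation t with dst ~ R @ src + t
        # (Horn/Procrustes solution via SVD); src and dst are N x 3 arrays of
        # the same object points observed at two consecutive epochs.
        c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
        H = (src - c_src).T @ (dst - c_dst)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = c_dst - R @ c_src
        return R, t

    def chain_poses(R0, t0, motions):
        # Accumulate relative ego-motions into mapping-frame poses, starting
        # from the exterior orientation (R0, t0) of the first exposure station.
        # Each (R, t) in motions maps coordinates of the current camera frame
        # to the previous one, e.g. rigid_fit(points_curr, points_prev).
        poses = [(R0, t0)]
        for R, t in motions:
            R_prev, t_prev = poses[-1]
            poses.append((R_prev @ R, R_prev @ t + t_prev))
        return poses

In practice, the matched keypoints from step (1) would be screened for outliers before the rigid fit, since a few bad matches can dominate the least-squares solution.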
Chinese Abstract I
Abstract II
Acknowledgements III
Contents IV
List of Tables VI
List of Figures VII
List of Abbreviations IX
Chapter 1. Introduction 1
1.1 Background 1
1.1.1 Mobile Mapping System (MMS) 1
1.1.2 Visual Odometry (VO) 2
1.2 Motivation 4
1.3 Contribution 5
Chapter 2. Methodology 7
2.1 Equipment of MMS 7
2.2 Control surveying 11
2.3 Photo Triangulation (PT) 12
2.4 Disparity Based Visual Odometry (DBVO) 14
2.4.1 Keypoint Detection & Matching 15
2.4.2 Calculating Coordinates of Object Points 20
2.4.3 Determining 6-DOF Ego-motion 24
2.4.4 Estimating The Pose of Images in Mapping Frame 25
Chapter 3. Results 26
3.1 Test Data 26
3.2 Object points measured by e-GPS 27
3.3 Reference Data Derived by PT 30
3.3.1 Report of Minimal Constraint Adjustment 30
3.3.2 Report of General Constraint Adjustment 34
3.4 The Result of DBVO 35
3.4.1 The Influence of Lens Distortion 35
3.4.2 Keypoints under Epipolar Geometry 37
3.4.3 Object Points determined by Parallax Equations 39
3.4.4 Ego-motion Determined by DBVO 40
3.4.5 Pose of Each Exposure Station Estimated by DBVO 41
Chapter 4. Discussion 43
4.1 Quality of Ego-motion Determination 43
4.1.1 Compare Parallax Equations and Intersection 43
4.1.2 Evaluated by Posteriori STD 44
4.1.3 Evaluated by Reference Data 47
4.2 Quality of Pose Estimation in Mapping Frame 49
4.2.1 Evaluated by Reference Trajectory 49
4.2.2 Evaluated by Check Points 58
Chapter 5. Conclusion 61
References 65
Appendix A. 67