
National Digital Library of Theses and Dissertations in Taiwan


Detailed Record

Author: 劉存皓 (LIU, CHUN-HAO)
Title: 基於影像單應性的影像輔助導航系統研究
Title (English): A Research of Homography-Based Vision-Aided Inertial Navigation System
Advisor: 林繼耀 (LUM, KAI-YEW)
Committee Members: 田豐 (TYAN, FENG); 李佩君 (LEE, PEI-JUN)
Oral Defense Date: 2017-03-10
Degree: Master's
Institution: National Chi Nan University (國立暨南國際大學)
Department: Electrical Engineering
Discipline: Engineering
Field: Electrical and Computer Engineering
Thesis Type: Academic thesis
Publication Year: 2017
Graduation Academic Year: 105
Language: English
Pages: 135
Keywords: image-based visual servoing; homography; Euler angles; scale-invariant feature transform (SIFT); speeded-up robust features (SURF); inertial navigation system (INS); Kalman filter
Statistics:
  • Cited: 0
  • Views: 252
  • Downloads: 47
  • Bookmarked: 0
Barely three or four decades have passed since the first artificial satellite was launched into orbit and the theory of the Global Positioning System (GPS) was proposed a few years later, yet GPS has since matured, and navigation of unmanned aerial vehicles (UAVs) is one of its applications. Today's UAVs can be operated at lower cost, with less pollution, and with greater flexibility than ever before; multirotor UAVs, for example, can take off and land in almost any environment. GPS aiding still has limitations, however: indoors, the GPS electromagnetic signal is heavily attenuated, making UAV operation under such conditions difficult and dangerous. This thesis therefore discusses a computer-vision method, known as the homography algorithm, for aiding an inertial navigation system.

In this thesis, we use a Kalman filter to estimate the velocity errors and angular-velocity errors, and we use these two estimates to compensate the inertial navigation system.
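As a rough illustration of this compensation loop, the following is a minimal sketch of a discrete Kalman filter over a six-dimensional error state; the state layout, error dynamics, and noise covariances below are illustrative assumptions, not the filter design used in the thesis.

```python
import numpy as np

# Minimal error-state Kalman filter sketch; all model choices here are
# assumptions for illustration, not the thesis's actual filter design.
# State x = [velocity error (3), angular-velocity error (3)].
F = np.eye(6)            # assumed random-walk error dynamics
Q = np.eye(6) * 1e-4     # assumed process-noise covariance
H = np.eye(6)            # assumed: the vision measurement observes both errors
R = np.eye(6) * 1e-2     # assumed measurement-noise covariance

def kf_step(x, P, z):
    """One predict/update cycle; z is the vision-derived error measurement."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ (z - H @ x)
    P = (np.eye(6) - K @ H) @ P
    return x, P

def compensate(v_ins, omega_ins, x):
    """Subtract the estimated errors from the raw INS velocity and gyro output."""
    return v_ins - x[:3], omega_ins - x[3:]
```

The compensated velocity and angular velocity then drive the next INS mechanisation step, closing the feedback loop described above.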
Global Positioning System (GPS) products have matured and are applied in many ways; one application is the unmanned aerial vehicle (UAV). UAVs offer low cost and low pollution, great freedom to swap instruments for a given mission, and high maneuverability in the field, all of which ease navigation. GPS aiding still has limitations, however: for example, the GPS signal is ineffective indoors because of the attenuation of electromagnetic waves. We therefore discuss a vision approach, called the homography algorithm, to aid a low-cost inertial navigation system (INS).
In this thesis, using a Kalman filter, we estimate the velocity errors and angular-velocity errors from the images to compensate the low-cost INS.
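For concreteness, the homography between two overlapping frames can be estimated from matched feature points. The following is a minimal sketch using OpenCV's SIFT detector and RANSAC; the file names and threshold values are placeholders, and this is not the code used in the thesis.

```python
import cv2
import numpy as np

# Two consecutive downward-looking frames; file names are placeholders.
img1 = cv2.imread("frame_prev.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_curr.png", cv2.IMREAD_GRAYSCALE)

# Detect and describe feature points with SIFT (SURF is the faster
# alternative compared in Table 3.1).
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Match descriptors and keep unambiguous matches via Lowe's ratio test.
matcher = cv2.BFMatcher()
good = []
for pair in matcher.knnMatch(des1, des2, k=2):
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])

# Estimate the 3x3 homography H with RANSAC to reject outlier matches;
# the 5.0-pixel reprojection threshold is an example value.
src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
print("Estimated homography:\n", H)
```

Given the camera intrinsics, decomposing H yields relative rotation and translation information, from which the error measurements fed to the Kalman filter can be formed.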
TABLE OF CONTENTS
致謝辭. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .i
摘要. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .ii
ABSTRACT. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .iii
TABLE OF CONTENTS. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .iv
LIST OF TABLES. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .vii
LIST OF FIGURES. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .viii
LIST OF SYMBOLS. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .xi

CHAPTER

PART I. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .1
1. Introduction. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .2
1.1 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .2
1.2 Literature Review . . . . . . . . . . . . . . . . . . . . . . . . . . .4
1.2.1 Navigation Review . . . . . . . . . . . . . . . . . . . . .4
1.2.2 Computer Vision Review . . . . . . . . . . . . . . . . . .8
1.2.3 Homography Matrix . . . . . . . . . . . . . . . . . . . . .10
1.2.4 Kalman Filter Review . . . . . . . . . . . . . . . . . . . .12
1.3 Organization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .13
PART II. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .16
2. Mechanisation Equations of Inertial Navigation Systems. . . . . .17
2.1 Problem Description: Inertial Navigation System . . . . . . . . . .17
2.2 Euler Angles: Description of Rotation . . . . . . . . . . . . . . .18
2.2.1 Yaw Axis Rotation . . . . . . . . . . . . . . . . . . . . .19
2.2.2 Pitch Axis Rotation . . . . . . . . . . . . . . . . . . . . .20
2.2.3 Roll Axis Rotation . . . . . . . . . . . . . . . . . . . . .21
2.2.4 3-2-1 Rotation Matrix . . . . . . . . . . . . . . . . . . . .22
2.2.5 Angular Velocities . . . . . . . . . . . . . . . . . . . . . .25
2.3 INS Mechanisation Equations . . . . . . . . . . . . . . . . . . . . .26
2.4 GPS Error Equations . . . . . . . . . . . . . . . . . . . . . . . . .28
3. 2D Homography. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .30
3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .30
3.1.1 2D Projective Transformation . . . . . . . . . . . . . . .30
3.1.2 The Direct Linear Transformation (DLT) Algorithm . . .32
3.2 2D Projective Coordinates . . . . . . . . . . . . . . . . . . . . . . .32
3.2.1 Downward-Looking Projective Coordinates . . . . . . . .32
3.2.2 Side-Looking Projective Coordinates . . . . . . . . . . . .34
3.3 Homography between Two Images of a 2D Scene . . . . . . . . . .35
3.4 Computation of the Homography Matrix . . . . . . . . . . . . . . .37
3.4.1 Feature Extraction and Matching . . . . . . . . . . . . .38
3.4.2 The Normalized Direct Linear Transform (DLT) Algorithm . . 44
4. Homography-Based Vision-Aided Inertial Navigation Model. . .46
4.1 Navigation-Error Process Model . . . . . . . . . . . . . . . . . . .46
4.2 Homography-Based Measure Model . . . . . . . . . . . . . . . . . .49
4.3 Extended Kalman Filter (EKF) Based Estimation of Navigation Error . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .51
PART III. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .53
5. Implementation on UAV. . . . . . . . . . . . . . . . . . . . . . . . . .54
5.1 Inertial Measurement Sensor Bias Correction . . . . . . . . . . . .54
5.2 Camera Calibration . . . . . . . . . . . . . . . . . . . . . . . . . .56
5.2.1 Calibration Step . . . . . . . . . . . . . . . . . . . . . . .56
5.2.2 Calibration Result . . . . . . . . . . . . . . . . . . . . . .58
5.3 MAVLink Communication . . . . . . . . . . . . . . . . . . . . . . .64
5.3.1 Communication between Ground Station and Raspberry Pi 3 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .64
5.3.2 Communication between Pixhawk and Raspberry Pi 3 . .68
5.4 Automatic Control by Sending MAVLink Messages . . . . . . .69
5.5 Flight Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . .72
5.5.1 Image Timing Problem Leading to Data Confusion . . . . .72
5.5.2 Timing Implementation of Homography-Based Vision-Aided INS Using Extended Kalman Filter . . . . . . . . . . . .73
5.5.3 INS Mechanisation with Compensation . . . . . . . . . .77
6. Flight Experiment Result. . . . . . . . . . . . . . . . . . . . . . . . .88
6.1 Experiment Result . . . . . . . . . . . . . . . . . . . . . . . . . . .91
6.1.1 INS Mechanisation with No Aiding . . . . . . . . . . . .91
6.1.2 INS Mechanisation with Homography Estimation and Compensation of δZ_KF and δψ_KF . . . . . . . . . . . . . . . .98
6.1.3 INS Mechanisation with Estimation and Compensation of δV_KF, δω_KF, δZ_KF, and δψ_KF . . . . . . . . . . . . . . . 107
6.2 Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
PART IV. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .120
7. Conclusion and Future Work. . . . . . . . . . . . . . . . . . . . . . . 121
7.1 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
7.2 Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
BIBLIOGRAPHY. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .123
APPENDICES. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .130

LIST OF TABLES
Table
3.1 Comparison of SIFT and SURF. . . . . . . . . . . . . . . . . . . . . . . .38
5.1 MAVLink package frame description. . . . . . . . . . . . . . . . . . . . .65

LIST OF FIGURES
Figure
2.1 3-2-1 Euler angle transformation, showing the transformation from the Earth frame to the body frame. . . . . . . . . . . . . . . . . . . . . . . .18
2.2 First step: rotate about the yaw axis by the ψ angle, following the right-hand rule. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .19
2.3 Second step: rotate about the pitch axis by the θ angle, following the right-hand rule. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .20
2.4 Third step: rotate about the roll axis by the φ angle, following the right-hand rule. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .21
2.5 Simple model of INS mechanization. . . . . . . . . . . . . . . . . . . . . .28
2.6 A lever arm with distance l_b between the GPS antenna and the IMU frame. . .29
3.1 A cuboid structure in 3D space projected onto a 2D photo. . . . . . . . . .31
3.2 The projective points p_A and p_B of the 3D-space point P, as applied to image capture from an aircraft. . . . . . . . . . . . . . . . . . . . . . . .33
3.3 Following Figure 3.2, the projective frame models are transformed to apply to the ground vehicle. . . . . . . . . . . . . . . . . . . . . . . . . . . . .34
3.4 The camera lens mounted on the ground vehicle points in the y-axis direction. The axes x′ and y′ represent the image axes; they are transformed to u′ and v′ to align with the vehicle body frame. . . . . . . . . . . . . . . .35
3.5 The difference of Gaussians: each octave of the initial image is repeatedly convolved with a Gaussian to produce the scale-space images shown on the right. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .40
3.6 The image gradient vectors are accumulated into a histogram; the gradients from the histogram are then combined into the larger region shown on the right. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .41
3.7 Matching feature points computed from the images by the SIFT algorithm. (a) Two original images captured by the front and rear cameras at the same time. (b) The result of (a) after SIFT computation: 395 matching points were found, and the descriptors were stored in the front and rear matrices [128×2621] and [128×1809], respectively. . . . . . . . . . . . . . . . . .42
3.8 Matching interest points computed by the SURF algorithm. SURF finds fewer matching points than SIFT, but its computing time is much shorter. . .43
5.1 Accelerometer measurement: from top to bottom, the x-, y-, and z-axis measurements, with accelerometer bias ε_a and noise w_a. (Unit: m/s²) . . .55
5.2 Gyroscope measurement: from top to bottom, the x-, y-, and z-axis measurements, with gyroscope bias ε_g and noise w_g. (Unit: rad/s) . . . . . .55
5.3 MATLAB Toolbox initial window. . . . . . . . . . . . . . . . . . . . . . .56
5.4 Image names. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .57
5.5 Image names result. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .58
5.6 Input the designated extract window size. . . . . . . . . . . . . . . . . . .58
5.7 Input the size of one grid square as measured on the real grid; the measured size is 97×94 millimeters. . . . . . . . . . . . . . . . . . . . . . . . . .59
5.8 Grid extraction: (a) the four boundary points of the grid, circled by the green line; (b) the red "+" symbols mark the cross points of the grid. . . .59
5.9 Calibration result. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .60
5.10 The extracted grids from image (a) to image (i) denote capture positions from the top left to the bottom right of the image. Each capture position includes 15 different rotation angles and small shifts to increase the diversity of the database. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .62
5.11 3D camera-centered view of the result of capturing 135 images of the grid in MATLAB. The symbol O on the left denotes the camera location; the captured grid is located on the right. . . . . . . . . . . . . . . . . . . . .63
5.12 3D world-centered view, in which the fixed grid is captured by a movable camera. This transforms Figure 5.11 from the camera-centered to the world-centered view. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .63
5.13 MAVLink package frame. . . . . . . . . . . . . . . . . . . . . . . . . . . .64
5.14 Communication map. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .65
5.15 CERIO operating mode. . . . . . . . . . . . . . . . . . . . . . . . . . . .66
5.16 Setting the ID of the repeater AP; in the figure, the ID is "RepeaterAP". . .67
5.17 The system connects successfully. The Pi 3's IP, 192.168.2.13, is assigned by the CERIO. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .67
5.18 Raspberry Pi 3 pinout. . . . . . . . . . . . . . . . . . . . . . . . . . . . .68
5.19 Raspberry Pi 3 controls the roll, pitch, and yaw axes, which correspond to the x, y, and z axes of the UAV. . . . . . . . . . . . . . . . . . . . . . .69
5.20 D represents the ground projection of the pixel resolution: D = (6×10⁻⁶ / 6×10⁻³) × 100 = 0.1. . . . . . . . . . . . . . . . . . . . . . . . . . . .72
5.21 The homography time ∆T_H is extended to a longer interval to remove the effect of errors. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .73
5.22 Using δ̂_H to compensate the INS model. . . . . . . . . . . . . . . . . .74
5.23 The homography time ∆T_H is 0.33 seconds, but the Kalman filter time ∆T_KF is 0.01 seconds. . . . . . . . . . . . . . . . . . . . . . . . . . . .75
5.24 Simple INS model with Kalman filter. . . . . . . . . . . . . . . . . . . . .77
5.25 δV compensator. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .81
5.26 δZ compensator. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .82
5.27 Calculate DCM and Euler angles. . . . . . . . . . . . . . . . . . . . . . .83
5.28 INS mechanisation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .84
5.29 INS mechanisation with Kalman filter. . . . . . . . . . . . . . . . . . . .85
5.30 Raw angular velocity records from the gyroscope of Pixhawk. . . . . . . .86
5.31 Inertial Accelerometer. . . . . . . . . . . . . . . . . . . . . . . . . . . . .87
6.1 Road test trajectory displayed on Google Maps. . . . . . . . . . . . . . .88
6.2 Flight experiment trajectory displayed in Mission Planner using GPS. . . .90
6.3 Position data imported from the Pixhawk, displayed in MATLAB. . . . .91
6.4 Estimation of the Euler angles. . . . . . . . . . . . . . . . . . . . . . . .94
6.5 Altitude. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .95
6.6 Velocity error. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .95
6.7 Angular velocity error. . . . . . . . . . . . . . . . . . . . . . . . . . . .96
6.8 Height error. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .96
6.9 ψ error. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .97
6.10 The trajectory of pure INS mechanisation. . . . . . . . . . . . . . . . . .97
6.11 Estimation of the Euler angles. . . . . . . . . . . . . . . . . . . . . . . . 101
6.12 Altitude. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102
6.13 The errors of the homography estimates. . . . . . . . . . . . . . . . . . . 102
6.14 The compensations δV, δω, δZ, and δψ estimated by the Kalman filter. . . 103
6.15 Velocity error. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
6.16 Angular velocity error. . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
6.17 Height error. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
6.18 ψ error. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
6.19 The trajectory of INS with homography estimation and compensation of δZ and δψ. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106
6.20 Estimation of the Euler angles. . . . . . . . . . . . . . . . . . . . . . . . 109
6.21 Altitude. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110
6.22 The errors of the homography estimates. . . . . . . . . . . . . . . . . . . 110
6.23 The compensations δV, δω, δZ, and δψ estimated by the Kalman filter. . . 111
6.24 Velocity error. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
6.25 Angular velocity error. . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
6.26 Height error. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
6.27 ψ error. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
6.28 The trajectory of INS with homography estimation and all compensations: δV, δω, δZ, and δψ. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 114
6.29 The trajectories of Sections 6.1.1, 6.1.2, and 6.1.3, shown in blue, green, and red, respectively. . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
6.30 Example of images captured by the binocular camera. . . . . . . . . . . . 118
6.31 Histogram of match points between reference images and current images. 119
6.32 Histogram of ∆T_H. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119