[1] M. S. Grewal, L. R. Weill, and A. P. Andrews, Global Positioning Systems, Inertial Navigation, and Integration. John Wiley & Sons, Inc., 2001.
[2] J. L. Weston and D. H. Titterton, "Modern inertial navigation technology and its application," Electronics & Communication Engineering Journal, vol. 12, no. 2, pp. 49–64, Apr. 2000.
[3] ADXL345 Digital Accelerometer Product Specification, Analog Devices, Inc., 2009.
[4] ITG-3200 Gyroscope Product Specification, Revision 1.4, InvenSense Inc., 1197 Borregas Ave., Sunnyvale, CA 94089, U.S.A., Mar. 2010.
[5] K. Y. Lum, X. Dong, K. Z. Y. Ang, and F. Lin, "Simulation study of homography-based vision-aided inertial navigation for aerial vehicles," in 11th IEEE International Conference on Control & Automation (ICCA), June 2014, pp. 1357–1362.
[6] I. Q. Whishaw, "Dead reckoning (path integration) requires the hippocampal formation: evidence from spontaneous exploration and spatial learning tasks in light (allothetic) and dark (idiothetic) tests," Behavioural Brain Research, vol. 127, pp. 49–69, 2001.
[7] J. S. Barlow, "Inertial navigation as a basis for animal navigation," Journal of Theoretical Biology, vol. 6, pp. 76–117, Jan. 1964.
[8] W. H. Pickering, "Missiles, rockets, and space flight," Electrical Engineering, vol. 78, no. 5, pp. 449–459, May 1959.
[9] R. L. Greenspan, Inertial Navigation Technology from 1970–1995. John Wiley & Sons, Inc., Mar. 1995.
[10] D. T. Knight, "Achieving modularity with tightly-coupled GPS/INS," in IEEE PLANS '92 Position Location and Navigation Symposium Record, Mar. 1992, pp. 426–432.
[11] A. K. Brown, "GPS/INS uses low-cost MEMS IMU," IEEE Aerospace and Electronic Systems Magazine, vol. 20, no. 9, pp. 3–10, Sept. 2005.
[12] D. Liao, J. Q. Yang, and Y. Zhu, "INS computer design basing on subdivision technology," in 2010 2nd International Conference on Computer Engineering and Technology (ICCET), vol. 4, Apr. 2010, pp. V4-46–V4-49.
[13] A. Soloviev, S. Gunawardena, and F. van Graas, "Deeply integrated GPS/low-cost IMU for low CNR signal processing: flight test results and real time implementation," in Proceedings of the 17th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS 2004), Sept. 2004, pp. 1598–1608.
[14] J. S. Randle and M. A. Horton, "Low cost navigation using micro-machined technology," in Proceedings of Conference on Intelligent Transportation Systems, Nov. 1997, pp. 1064–1067.
[15] D. Gebre-Egziabher, R. C. Hayward, and J. D. Powell, "A low-cost GPS/inertial attitude heading reference system (AHRS) for general aviation applications," in IEEE 1998 Position Location and Navigation Symposium (Cat. No. 98CH36153), Apr. 1998, pp. 518–525.
[16] S. Hong, M. H. Lee, H.-H. Chun, S.-H. Kwon, and J. L. Speyer, "Observability of error states in GPS/INS integration," IEEE Transactions on Vehicular Technology, vol. 54, no. 2, pp. 731–743, Mar. 2005.
[17] S. Hong, M. H. Lee, S. H. Kwon, and H. H. Chun, "A car test for the estimation of GPS/INS alignment errors," IEEE Transactions on Intelligent Transportation Systems, vol. 5, no. 3, pp. 208–218, Sept. 2004.
[18] S. Hong, M. H. Lee, J. A. Rios, and J. L. Speyer, "Observability analysis of INS with a GPS multi-antenna system," KSME International Journal, vol. 16, pp. 1367–1378, Nov. 2002.
[19] D. H. Ballard and C. M. Brown, Computer Vision. Prentice-Hall, Inc., 1982.
[20] F. Chaumette and S. Hutchinson, "Visual servo control. I. Basic approaches," IEEE Robotics & Automation Magazine, vol. 13, no. 4, pp. 82–90, Dec. 2006.
[21] N. Owens, C. Harris, and C. Stennett, "Hawk-Eye tennis system," in 2003 International Conference on Visual Information Engineering (VIE 2003), July 2003, pp. 182–185.
[22] W. J. Wilson, C. C. W. Hulls, and G. S. Bell, "Relative end-effector control using Cartesian position based visual servoing," IEEE Transactions on Robotics and Automation, vol. 12, no. 5, pp. 684–696, Oct. 1996.
[23] B. Thuilot, P. Martinet, L. Cordesses, and J. Gallice, "Position based visual servoing: keeping the object in the field of vision," in Proceedings 2002 IEEE International Conference on Robotics and Automation (Cat. No. 02CH37292), vol. 2, 2002, pp. 1624–1629.
[24] R. Basri, E. Rivlin, and I. Shimshoni, "Visual homing: surfing on the epipoles," in Sixth International Conference on Computer Vision (IEEE Cat. No. 98CH36271), Jan. 1998, pp. 863–869.
[25] F. Chaumette and S. Hutchinson, "Visual servo control. II. Advanced approaches [Tutorial]," IEEE Robotics & Automation Magazine, vol. 14, no. 1, pp. 109–118, Mar. 2007.
[26] A. J. Davison, "Real-time simultaneous localisation and mapping with a single camera," in Proceedings of the Ninth IEEE International Conference on Computer Vision, vol. 2, Oct. 2003, p. 1403.
[27] P. Pinies, T. Lupton, S. Sukkarieh, and J. D. Tardos, "Inertial aiding of inverse depth SLAM using a monocular camera," in Proceedings 2007 IEEE International Conference on Robotics and Automation, Apr. 2007, pp. 2797–2802.
[28] C. N. Taylor, M. J. Veth, J. F. Raquet, and M. M. Miller, "Comparison of two image and inertial sensor fusion techniques for navigation in unmapped environments," IEEE Transactions on Aerospace and Electronic Systems, vol. 47, no. 2, pp. 946–958, Apr. 2011.
[29] D. Zachariah and M. Jansson, "Camera-aided inertial navigation using epipolar points," in IEEE/ION Position, Location and Navigation Symposium, May 2010, pp. 303–309.
[30] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, 2nd ed. Cambridge University Press, 2003.
[31] The MathWorks, Inc., "What is camera calibration?" [Online]. Available: http://www.mathworks.com/help/vision/ug/camera-calibration.html?requestedDomain=www.mathworks.com
[32] J.-Y. Bouguet, "Camera calibration toolbox for Matlab," http://www.vision.caltech.edu/bouguetj/calibdoc/.
[33] M. Brown and D. Lowe, "Invariant features from interest point groups," in British Machine Vision Conference, Sept. 2002, pp. 656–665.
[34] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, 2004. [Online]. Available: http://dx.doi.org/10.1023/B:VISI.0000029664.99615.94
[35] ——, "Object recognition from local scale-invariant features," in Proceedings of the Seventh IEEE International Conference on Computer Vision, vol. 2, 1999, pp. 1150–1157.
[36] H. Bay, A. Ess, T. Tuytelaars, and L. Van Gool, "Speeded-up robust features (SURF)," Computer Vision and Image Understanding, vol. 110, no. 3, pp. 346–359, 2008.
[37] L. Juan and O. Gwun, "A comparison of SIFT, PCA-SIFT and SURF," International Journal of Image Processing, vol. 3, no. 4, pp. 143–152, 2009.
[38] S. Benhimane and E. Malis, "Homography-based 2D visual servoing," in Proceedings 2006 IEEE International Conference on Robotics and Automation (ICRA 2006), May 2006, pp. 2397–2402.
[39] B. Espiau, F. Chaumette, and P. Rives, "A new approach to visual servoing in robotics," IEEE Transactions on Robotics and Automation, vol. 8, no. 3, pp. 313–326, June 1992.
[40] F. Chaumette, "Image moments: a general and useful set of features for visual servoing," IEEE Transactions on Robotics, vol. 20, no. 4, pp. 713–723, Aug. 2004.
[41] E. Malis, F. Chaumette, and S. Boudet, "2-1/2-D visual servoing," IEEE Transactions on Robotics and Automation, vol. 15, no. 2, pp. 238–250, Apr. 1999.
[42] J. Gao, S. J. Kim, and M. S. Brown, "Constructing image panoramas using dual-homography warping," in CVPR 2011, June 2011, pp. 49–56.
[43] S. Zhao, X. Dong, J. Cui, Z. Y. Ang, F. Lin, K. Peng, B. M. Chen, and T. H. Lee, "Design and implementation of homography-based vision-aided inertial navigation of UAVs," in Proceedings of the 32nd Chinese Control Conference, July 2013, pp. 5101–5106.
[44] M. Zuliani, C. S. Kenney, and B. S. Manjunath, "The multiRANSAC algorithm and its application to detect planar homographies," in IEEE International Conference on Image Processing 2005, vol. 3, Sept. 2005, pp. III-153–III-156.
[45] A. Soloviev and A. J. Rutkowski, "Fusion of inertial, optical flow, and airspeed measurements for UAV navigation in GPS-denied environments," Proc. SPIE 7332, Unmanned Systems Technology XI, 733202, Apr. 30, 2009, doi: 10.1117/12.820177.
[46] H. A. Ardakani and T. Bridges, "Review of the 3-2-1 Euler angles: a yaw-pitch-roll sequence," Department of Mathematics, University of Surrey, Guildford GU2 7XH, UK, 2010.
[47] M. Zuliani, RANSAC for Dummies, July 2014.
[48] Erle Robotics S.L., "Erle Robotics: Erle-Brain, a Linux brain for drones," https://erlerobotics.gitbooks.io/erle-robotics-erle-brain-a-linux-brain-for-drones/content/en/mavlink/mavlink.html.
[49] Fritzing, "Raspberry Pi 2 & 3 pin mappings," https://developer.microsoft.com/en-us/windows/iot/docs/pinmappingsrpi.
[50] L. Meier, "ninnux/testmavlink," 2012. [Online]. Available: https://github.com/ninnux/testmavlink/tree/master/mavlink/include/mavlink/v1.0/common
[51] R. G. Brown and P. Y. C. Hwang, Introduction to Random Signals and Applied Kalman Filtering, 3rd ed. New York: John Wiley & Sons, Inc., 1997.