Author: 黃世豪
Author (English): Shi-Hao Huang
Title: 廠內自動搬運車之微特徵視覺里程計定位系統
Title (English): Visual Odometry Using Tiny Features for AGV Localization in Factory Environment
Advisor: 林巍聳
Advisor (English): Wei-Song Lin
Committee members: 鍾鴻源, 施慶隆, 陳柏全
Committee members (English): Hung-Yuan Chung, Ching-Long Shih, Bo-Chiuan Chen
Date of oral defense: 2014-07-16
Degree: Master's
University: National Taiwan University (國立臺灣大學)
Department: Graduate Institute of Electrical Engineering (電機工程學研究所)
Discipline: Engineering
Academic field: Electrical and Information Engineering
Thesis type: Academic thesis
Year of publication: 2014
Academic year of graduation: 102
Language: Chinese
Number of pages: 96
Keywords (Chinese): 自動搬運車, 視覺里程計, 定位, 導引
Keywords (English): Automated guided vehicle, visual odometry, localization, navigation
The roof and walls of a factory building block satellite positioning signals, and the machinery inside strongly interferes with radio waves and magnetic fields. This thesis proposes a tiny-feature visual odometry localization system that enables automated guided vehicles (AGVs) to localize and navigate smoothly in factory environments. The camera of the tiny-feature visual odometry uses an active light source and points toward the floor, so it can capture tiny features and depth information even on monotonous floor surfaces; displacement and heading are estimated by matching feature points between two consecutive images. The system extracts feature points with the scale-invariant feature transform and derives the heading rotation angle from the instantaneous center of rotation. Variations in camera field of view and scale caused by bumpy floors do not affect feature matching, so the displacement and heading estimates are accurate and reliable, and the active light source lets the AGV operate day or night. The tiny-feature visual odometry localization system was installed on an experimental AGV whose routes can be planned or modified on a host computer. Tests in a variety of scenes show that within a travel distance of eight meters, the localization and heading accuracy meet the requirements of factory AGVs; over longer distances, a fiducial-marker system can eliminate the accumulated error, allowing the AGV to work in the factory over long periods.

Navigation of automated guided vehicles (AGVs) transporting components inside a factory requires accurate estimates of the vehicle's position and orientation. The global positioning system fails to provide the vehicle's position indoors because the building structure blocks the electromagnetic waves from the satellites. Electronic compasses and radio positioning equipment suffer severe interference from the electromagnetic signals generated by electrical machines operating in the factory. This thesis presents a visual odometry system that extracts tiny image features of the floor to estimate the displacement and orientation of the vehicle. The video camera, equipped with an illuminating source, points toward the floor to avoid disturbances caused by reflections or shadows. We employ the scale-invariant feature transform algorithm to extract tiny features from each captured image. Feature matching between two consecutive images enables the triangulation method to calculate the displacement of the vehicle. Integrating the displacement data and using the rotating center of the vehicle yield estimates of the vehicle's position and orientation. The accuracy of the estimates satisfies the requirements of AGVs for factory use, and the result is not influenced by wheel slip or rough floors. Experiments under several scenarios show that path planning and modification can be performed at the central control station. The position error is less than twenty centimeters for paths shorter than eight meters. For continuous operation, the visual odometry requires a position-calibration system to eliminate accumulated errors.
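The abstract above describes recovering the vehicle's displacement and heading by matching floor features between two consecutive frames. As an illustrative sketch only (not the thesis's implementation, which uses SIFT matching and the instantaneous center of rotation), the following shows the core geometric step: given pairs of matched feature points from two frames, a least-squares rigid fit (a 2D Kabsch/Procrustes solve) recovers the frame-to-frame rotation angle and translation. The function name and the synthetic test points are assumptions introduced here for illustration.

```python
import numpy as np

def estimate_rigid_motion(prev_pts, curr_pts):
    """Least-squares 2D rigid fit: find rotation angle theta and
    translation t mapping prev_pts onto their matches curr_pts."""
    prev_pts = np.asarray(prev_pts, dtype=float)
    curr_pts = np.asarray(curr_pts, dtype=float)
    # Center both point sets on their centroids.
    pc, cc = prev_pts.mean(axis=0), curr_pts.mean(axis=0)
    P, C = prev_pts - pc, curr_pts - cc
    # SVD of the cross-covariance matrix gives the optimal rotation.
    U, _, Vt = np.linalg.svd(P.T @ C)
    R = (U @ Vt).T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = (U @ Vt).T
    t = cc - R @ pc                   # translation completing the transform
    theta = np.arctan2(R[1, 0], R[0, 0])
    return theta, t

# Synthetic check: rotate a set of "floor features" by 5 degrees and shift them.
rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 100.0, size=(30, 2))
ang = np.deg2rad(5.0)
Rot = np.array([[np.cos(ang), -np.sin(ang)],
                [np.sin(ang),  np.cos(ang)]])
moved = pts @ Rot.T + np.array([2.0, -1.5])
theta, t = estimate_rigid_motion(pts, moved)
print(np.rad2deg(theta), t)   # recovers the 5-degree turn and the shift
```

In a full visual odometry pipeline these pairs would come from SIFT descriptor matching between consecutive floor images, with outlier rejection before the fit; integrating the per-frame (theta, t) estimates yields the vehicle's pose over time.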

National Taiwan University Master's Thesis Oral Examination Committee Approval Form i
Acknowledgments ii
Chinese Abstract iii
ABSTRACT iv
List of Figures viii
List of Tables xii
Chapter 1 Introduction 1
1.1 Research Background and Motivation 1
1.2 Research Objectives and Contributions 5
1.3 Thesis Organization 7
Chapter 2 Overview of AGV Guidance Technologies 8
2.1 AGV Guidance Methods 8
2.1.1 External Guidance Technologies 10
2.1.2 Internal Guidance Technologies 13
2.1.3 Visual Odometry 17
2.1.4 Summary of Guidance Technologies 19
2.2 Visual Odometry Algorithms 20
2.2.1 Feature Detection 20
2.2.2 Feature Descriptors 22
2.2.3 Feature Matching and Tracking 23
2.2.4 Motion Estimation 25
Chapter 3 Tiny-Feature Visual Odometry Localization System 27
3.1 Image Preprocessing 28
3.2 Visual Odometry Algorithm 34
3.2.1 Scale-Invariant Feature Transform 34
3.2.2 Instantaneous Center of Rotation 40
3.3 Camera Calibration 43
3.4 Visual Odometry Algorithm Flow 51
Chapter 4 Intelligent Localization and Guidance System 55
4.1 System Architecture 55
4.1.1 AGV Specifications 58
4.1.2 Camera Specifications 60
4.1.3 Laser Rangefinder Specifications 63
4.2 Human-Machine Interface Electronic Map 64
4.3 Driving System 66
4.4 Experimental Results 72
Chapter 5 Conclusions and Future Work 90
References 91



