Author: Chao-Hsiang Chang (張兆翔)
Title: An Efficient Augmented Reality Architecture by Binary Tree and Optical Flow
Title (Chinese): 一個有效率的擴充實境架構使用二元樹和光流法
Advisor: Wen-Chen Huang (黃文楨)
Degree: Master's
Institution: National Kaohsiung First University of Science and Technology
Department: Graduate Institute of Information Management
Discipline: Computer Science
Subfield: General Computer Science
Document Type: Academic thesis
Year of Publication: 2009
Graduation Academic Year: 97 (2008-2009)
Language: Chinese
Number of Pages: 72
Keywords: Augmented Reality; Binary Search Tree; Optical Flow
Usage counts:
  • Cited by: 0
  • Views: 381
  • Downloads: 0
  • Bookmarked: 0
Augmented reality has gained increasing attention in recent years and is widely applied in product marketing, advertising, medical imaging, industrial design, education, games, and other fields. Most current augmented reality systems are still constrained by two limitations: the natural environment and processing time. This thesis uses robust local invariant feature descriptors and a stable tracking mechanism to overcome these two limitations, so that users can take any photograph or picture they like, in any environment, and interact in real time with 3D information or objects in the live video, achieving the augmented reality effect.
This thesis gives a broad introduction to the types of augmented reality systems and their tracking technologies, with the main focus on computer-vision-based markerless tracking. We examine six related augmented reality systems built on this tracking technique between 2003 and 2008 and distill a common augmented reality architecture from them. First, we propose a flexible binary search tree that improves the binary-code clustering mechanism proposed by Bastos, greatly reducing the number of feature-template comparisons. Next, we add a tracking mechanism to the common architecture and adopt a robust optical flow algorithm to track feature points, speeding up the whole system. In static-image experiments, the proposed binary search tree needs only about 500 ms to match 700 feature points. In real-time video experiments, the proposed optical flow tracking architecture is applied to two different local feature descriptors, SIFT and SURF; both configurations prove very efficient, and the best result reaches 31 frames per second.
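The pipeline sketched below is one possible way to realize the architecture the abstract describes, assuming OpenCV's Python bindings. SIFT stands in for "SIFT or SURF", the thesis's flexible binary-search-tree matcher is approximated here by a plain brute-force matcher, and the file name "reference.jpg" plus all thresholds are illustrative assumptions rather than values taken from the thesis: local invariant features on a user-chosen picture are matched to the first webcam frame, the matched points are then tracked frame-to-frame with pyramidal Lucas-Kanade optical flow, and a RANSAC-filtered homography is re-estimated every frame.

# Minimal sketch of the matching-then-tracking loop described in the abstract.
import cv2
import numpy as np

detector = cv2.SIFT_create()              # local invariant feature descriptor
matcher = cv2.BFMatcher(cv2.NORM_L2)      # placeholder for the BST-based matcher

ref = cv2.imread("reference.jpg", cv2.IMREAD_GRAYSCALE)   # any picture the user likes
ref_kp, ref_desc = detector.detectAndCompute(ref, None)

cap = cv2.VideoCapture(0)
prev_gray, prev_pts, ref_pts = None, None, None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    if prev_pts is None or len(prev_pts) < 8:
        # Initialization (or re-initialization after tracking is lost):
        # full descriptor matching against the reference picture.
        kp, desc = detector.detectAndCompute(gray, None)
        if desc is None:
            prev_gray = gray
            continue
        matches = [m for m, n in matcher.knnMatch(desc, ref_desc, k=2)
                   if m.distance < 0.7 * n.distance]          # Lowe's ratio test
        if len(matches) >= 8:
            prev_pts = np.float32([kp[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
            ref_pts = np.float32([ref_kp[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    else:
        # Tracking stage: pyramidal Lucas-Kanade optical flow between frames,
        # which avoids re-running the expensive descriptor matching every frame.
        next_pts, status, _ = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, prev_pts, None, winSize=(21, 21), maxLevel=3)
        good = status.ravel() == 1
        prev_pts, ref_pts = next_pts[good], ref_pts[good]

    if prev_pts is not None and len(prev_pts) >= 8:
        # RANSAC-filtered homography from the reference picture to this frame;
        # in a full AR system its decomposition would drive the 3D overlay.
        H, inliers = cv2.findHomography(ref_pts, prev_pts, cv2.RANSAC, 5.0)
        for x, y in prev_pts.reshape(-1, 2):
            cv2.circle(frame, (int(x), int(y)), 3, (0, 255, 0), -1)

    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) == 27:              # Esc quits
        break
    prev_gray = gray

cap.release()
cv2.destroyAllWindows()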
Abstract (Chinese) i
Abstract (English) ii
Acknowledgements iv
Table of Contents v
List of Figures vii
List of Tables ix
Chapter 1 Introduction 1
1.1 Research Background and Motivation 1
1.2 Research Objectives 2
1.3 Thesis Organization 2
Chapter 2 Literature Review of Augmented Reality 4
2.1 Overview of Augmented Reality 4
2.2 Computer-Vision-Based Tracking Techniques 5
2.2.1 Marker-Based Tracking 6
2.2.2 Markerless Tracking 7
2.3 Augmented Reality System Architectures 8
Chapter 3 Theory and Methods 16
3.1 Camera Calibration 16
3.1.1 Pinhole Camera Model 16
3.1.2 Camera Parameter Matrix and Lens Distortion Parameters 18
3.1.3 Projective Transformation between Two Planes (Homography) 21
3.2 Local Invariant Feature Descriptors 22
3.2.1 Distinctive Image Features from Scale-Invariant Keypoints (SIFT) 22
3.2.2 Automatic Camera Pose Initialization using Scale, Rotation and Luminance Invariant Natural Feature Tracking (ACPI) 30
3.3 Pyramidal Lucas–Kanade Optical Flow Algorithm 36
3.4 RANdom SAmple Consensus (RANSAC) Algorithm 39
3.5 Binary Search Tree 40
3.6 Optical Flow Tracking Architecture for Augmented Reality 44
Chapter 4 Experimental Results 51
4.1 System Description 51
4.2 Camera Calibration 53
4.3 Experimental Results on Static Images 54
4.4 Experimental Results on Live Video 60
Chapter 5 Conclusions 63
5.1 Conclusions 63
5.2 Future Work 64
References 66
[1]Zhang, Z. “Flexible camera calibration by viewing a plane from unknown orientations”, Proceedings of the 7th International Conference on Computer Vision, 666–673, 1999.
[2]Bradski, G. and Kaehler, A. “Learning OpenCV”, chap.11. September 2008.
[3]Azuma, R.T. “A survey of augmented reality”, Presence: Teleoperators and Virtual Environments, 355-385, 1997.
[4]Azuma, R.T., Baillot, Y., Behringer R., Feiner S., Julier S., and MacIntyre B. “Recent advances in augmented reality”, IEEE Computer Graphics & Applications, 21:6, 34-47, 2001.
[5]Zhou, F., Duh, H. B.-L., Billinghurst, M. “Trends in Augmented Reality Tracking, Interaction and Display: A Review of Ten Years of ISMAR”, IEEE International Symposium on Mixed and Augmented Reality, 193-202, 2008.
[6]Rolland, J. P., Baillot, Y., and Goon, A. A. “A survey of tracking technology for virtual environments”, Fundamentals of Wearable Computers and Augmented Reality, 67-112, 2001.
[7]State, A., Hirota, G., Chen, D. T., et al., “Superior augmented reality registration by integrating landmark tracking and magnetic tracking”, Proceedings of SIGGRAPH, 29-438, 1996.
[8]Jiang, B., Neumann, U., You, S. “A robust hybrid tracking system for outdoor augmented reality”, Proceedings of IEEE Virtual Reality, 3-10, 2004.
[9]Klein, G. and Drummond, T.W. “A Single-frame Visual Gyroscope”, In the proceedings of the British Machine Vision Conference (BMVC), 5-8, 2005.
[10]Lepetit, V. and Fua, P. “Monocular Model-Based 3D Tracking of Rigid Objects: A Survey”, Foundations and Trends in Computer Graphics and Vision, 1-89, 2005.
[11]Klein, G. “Visual Tracking for Augmented Reality”. Ph.D. thesis, University of Cambridge, UK, 1-193, 2006.
[12]http://www.hitl.washington.edu/artoolkit/
[13]http://studierstube.icg.tu-graz.ac.at/handheld_ar/markerbased.php
[14]http://www.artoolworks.com/Home.html
[15]http://www.openscenegraph.org/projects/osg
[16]http://www.artoolworks.com/community/osgart/index.html
[17]Looser, J., Grasset, R., Seichter, H., Billinghurst, M. “OSGART - A Pragmatic Approach to MR”, 5th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 06): Industrial Workshop, 22-25, 2006.
[18]Henrysson, A., Billinghurst, M., Ollila, M. “AR Tennis”, ACM SIGGRAPH 2006 Sketches: International Conference on Computer Graphics and Interactive Techniques, 2006.
[19]Chen, L.-H., Yu, C.-J., Hsu, S.-C. “A Remote Chinese Chess Game using Mobile Phone Augmented Reality”, AR/MR game: ACM International Conference Proceeding Series: Proceedings of the 2008 International Conference on Advances in Computer Entertainment Technology, 284-287, 2008.
[20]Andersen, T. L., Kristensen, S., Nielsen, B. W., and Gronbak, K. “Designing an Augmented Reality Board Game with Children: The BattleBoard 3D Experience”, Interaction Design and Children, Proceedings of the 2004 conference on Interaction design and children: building a community, 137-138, 2004.
[21]Oda, O., Lister, L. J., White, S., Feiner, S. “Developing an Augmented Reality Racing Game”, Games: Proceedings of the 2nd international conference on INtelligent TEchnologies for interactive entertainment, 2008.
[22]Mathews, M., Challa, M., Chu, C.-T., Jian, G., Seichter, H., Grasset, R. “Evaluation of Spatial Abilities through Tabletop AR”, Proceedings of the 7th ACM SIGCHI New Zealand chapter's international conference on Computer-human interaction: design centered HCI, 17-24, 2007.
[23]Henrysson, A., Billinghurst, M., Ollila, M. “Face to Face Collaborative AR on Mobile Phones”, Symposium on Mixed and Augmented Reality: Proceedings of the 4th IEEE/ACM International Symposium on Mixed and Augmented Reality, 80-89, 2005.
[24]Nilsen, T., Looser, J. “Tankwar: Tabletop war gaming in augmented reality”, In Proceedings of 2nd International Workshop on Pervasive Gaming Applications, 2005.
[25]Wang, Y., Langlotz, T., Billinghurst, M., Bell, T. “An Authoring Tool for Mobile Phone AR Environments”, Proceedings of New Zealand Computer Science Research Student Conference 09, 2009.
[26]Grasset, R., Dünser, A., Billinghurst, M. “The Design of a Mixed-Reality Book: Is It Still a Real Book?”, 7th IEEE/ACM International Symposium on Mixed and Augmented Reality (ISMAR 2008), 99-102, 2008.
[27]Wagner, D., Schmalstieg, D. “Making Augmented Reality Practical on Mobile Phones Part 1”, IEEE Computer Graphics and Applications, 12-15, 2009.
[28]Klein, G., Murray, D. “Parallel Tracking and Mapping for Small AR Workspaces”, Mixed and Augmented Reality(ISMAR 07): 6th IEEE and ACM International Symposium, 225-234, 2007.
[29]Fischler, M. A., Bolles, R. C. “Random Sample Consensus: A paradigm for model fitting with applications to image analysis and automated cartography”, Communications of the Assoc. Comp. Mach., 1981.
[30]Rousseeuw, P. J. “Least median of squares regression”, Journal of the American Statistical Association, 871–880, 1984.
[31]Inui, K., Kaneko, S., Igarashi, S. “Robust line fitting using LmedS clustering”, Systems and Computers in Japan, 92–100, 2003.
[32]Chum, O. and Matas, J. “Matching with prosac – progressive sample consensus”, in IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’2005), 220–226, 2005.
[33]Lowe, D.G. “Distinctive image features from scale-invariant keypoints”, International Journal of Computer Vision, 91-110, 2004.
[34]Lowe, D.G. “Object recognition from local scale-invariant features”, In International Conference on Computer Vision, Corfu, Greece, 1150-1157, 1999.
[35]Koenderink, J.J. “The structure of images”, Biological Cybernetics, 363-396, 1984.
[36]Lindeberg, T. “Scale-space theory: A basic tool for analysing structures at different scales”, Journal of Applied Statistics, 224-270, 1994.
[37]Harris, C., Stephens, M., “A combined corner and edge detector”, Proc. of the 4th Alvey Vision Conference,147–152, 1988.
[38]Bastos, R., Dias, M. S. “Automatic Camera Pose Initialization, using Scale, Rotation and Luminance Invariant Natural Feature Tracking”, Proceedings of WSCG'2008 - The 16th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision 2008, 97-104, 2008.
[39]Shi, J., Tomasi, C. “Good Features to Track”, in IEEE Conference on CVPR, 593-600, 1994.
[40]Bastos, R., Dias, J.M.S. “Fully Automated Texture Tracking Based on Natural Features Extraction and Template Matching”, in ACE Technology, 180-183, 2005.
[41]Comport, A., Marchand, E., Chaumette, F. “A real-time tracker for markerless augmented reality”, In ISMAR ‘03, 36-45, 2003.
[42]Wuest, H., Vial, F., Stricker, D. “Adaptive line tracking with multiple hypotheses for augmented reality”. In ISMAR ‘05, 62-69, 2005.
[43]Reitmayr, G. and Drummond, T. “Going out: robust model-based tracking for outdoor augmented reality”, In ISMAR ‘06, 109-118, 2006.
[44]Chen, L.-H., Yu, C.-J., Hsu, S.-C. “A remote Chinese chess game using mobile phone augmented reality”, Proceedings of the 2008 International Conference on Advances in Computer Entertainment Technology, 284-287, 2008.
[45]Kato, H., Tachibana, K., Billinghurst, M., Grafe, M. “A Registration Method based on Texture Tracking using ARToolKit”, in The Second IEEE International Augmented Reality Toolkit Workshop, 77-85, 2003.
[46]Bastos, R. “Tracking of Planar Objects in Natural Scenes using Textures”, Master of Science Thesis in Computers and Telecommunications Engineering, 2005.
[47]Yuan, C. “Markerless Pose Tracking for Augmented Reality”, in Proceedings of ISVC, 721-730, 2006.
[48]Klein, G. and Murray, D. “Parallel Tracking and Mapping for Small AR Workspaces”, in International Symposium on Mixed and Augmented Reality - ISMAR '07, 1-10, 2007.
[49]Xu, K., Chia, K. W., Cheok, A. D. “Real-time camera tracking for marker-less and unprepared augmented reality environments”, in Image and Vision Computing, 673-689, 2008.
[50]Hartley, R. I. and Zisserman, A. “Multiple View Geometry in computer vision”, 2nd Edition of Cambridge University Press, 2004.
[51]Nister, D. “An efficient solution to the five-point relative pose problem”, in PAMI, 756-770, 2004.
[52]Rosten, E., Drummond, T. “Machine learning for high-speed corner detection”, In Proc. 9th European Conference on Computer Vision (ECCV'06), 430-443, 2006.
[53]Kalman, R. E. “A new approach to linear filtering and prediction problems”, in Transactions of ASME – Journal of Basic Engineering, 35-45, 1960.
[54]Abdel-Aziz, Y.I. and Karara, H.M., “Direct Linear Transformation into Object Space Coordinates in Close-Range Photogrammetry”, in Procedures of Symposium of Close-Range Photogrammetry, 1-18, 1971.
[55]Bouguet, J.-Y. “Pyramidal Implementation of the Lucas Kanade Feature Tracker Description of the Algorithm”, Intel Corporation, Microprocessor Research Labs., 1074-1082, 1999.
[56]http://www.robots.ox.ac.uk/~vgg/research/affine/
[57]Bay, H., Tuytelaars, T., Gool, L. V. “SURF: Speeded up robust features”, In ECCV, 346-359, 2006.