
National Digital Library of Theses and Dissertations in Taiwan


Detailed Record

Author: 謝孟原
Author (English): Hsieh, Meng-Yuan
Title (Chinese): 以俯視式環場電腦視覺及行動裝置作擴增實境式室內導覽
Title (English): A Study on Indoor Navigation by Augmented Reality and Down-looking Omni-vision Techniques Using Mobile Devices
Advisor: 蔡文祥
Advisor (English): Tsai, Wen-Hsiang
Degree: Master's
Institution: National Chiao Tung University
Department: Institute of Multimedia Engineering
Discipline: Computer science
Subfield: Software development
Thesis type: Academic thesis
Year of publication: 2012
Graduation academic year: 100
Language: English
Number of pages: 130
Keywords (Chinese): 室內導航, 擴增實境, 人物定位
Keywords (English): Indoor navigation, Augmented Reality, Human localization
Usage statistics:
  • Cited: 0
  • Views: 568
  • Rating:
  • Downloads: 195
  • Bookmarked in reading lists: 2
Abstract (Chinese): This thesis proposes an indoor navigation system for mobile devices that combines computer vision and augmented reality techniques. The system is built on a hardware infrastructure of fisheye cameras installed on the ceilings of the indoor environment. For human localization, a computer vision-based method is proposed that detects the user's activity information by analyzing the fisheye images. To obtain the real-space position of a person seen in an image, a space-mapping method is also proposed to convert between image coordinates and real-space coordinates. In addition, three techniques are integrated to detect the person's orientation: (1) analyzing the user's movement path; (2) using the orientation sensor on the mobile device; and (3) analyzing, in the fisheye image, a long color strip mark attached to the mobile device. A path planning method suited to indoor paths is also proposed, which derives obstacle regions from the building floor plan and, based on them, obtains obstacle avoidance directions for path planning. The server transmits navigation information to the client on the mobile device, including the localization result, nearby environment locations, and the navigation path. The navigation information received by the client is overlaid on the corresponding real objects in the mobile-device image to provide an augmented reality navigation interface. Furthermore, a method is proposed to estimate the field of view of the camera on the mobile device, from which a transformation matrix is constructed to transform points in real space onto the image plane. Finally, experimental results demonstrate the feasibility of the proposed methods, and precision measurements of the localization results show the system's ability to provide accurate navigation information.
When people visit new or complicated indoor environments, a navigation system is usually needed to guide them to their desired destinations. In this study, an indoor navigation system based on augmented reality (AR) and computer vision techniques, running on a mobile device such as an HTC Flyer or an iPad, is proposed.
At first, an indoor vision infrastructure is set up by attaching fisheye cameras to the ceiling of the navigation environment. The user's location and orientation are detected by a server-side system, and the analysis results are sent to the client-side system. The server-side system also sends the surrounding environment information and the navigation path to the client-side system, which runs on the user's mobile device. The client-side system then displays this information in an AR fashion, providing clear guidance for the user during navigation.
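The thesis does not give the exact message format exchanged between the two sides; the following is a minimal sketch, assuming a length-prefixed JSON payload over TCP, of how the server-side system might push one navigation update (location, heading, nearby targets, and the planned path) to the client. All field names here (x_cm, heading_deg, targets, path) are illustrative assumptions, not the system's actual protocol.

```python
# Minimal sketch of a server-to-client navigation update (assumed format).
import json
import socket

def send_navigation_update(conn, location, heading_deg, targets, path):
    """Send one navigation update to the client-side system on the mobile device."""
    message = {
        "location": {"x_cm": location[0], "y_cm": location[1]},  # user position in world coordinates
        "heading_deg": heading_deg,                               # detected user orientation
        "targets": targets,                                       # nearby visiting targets [{name, x_cm, y_cm}, ...]
        "path": path,                                             # planned path as a list of (x_cm, y_cm) waypoints
    }
    data = json.dumps(message).encode("utf-8")
    conn.sendall(len(data).to_bytes(4, "big") + data)             # length-prefixed frame

# Example usage, assuming an established TCP connection `conn`:
# send_navigation_update(conn, (1250.0, 340.0), 92.5,
#                        [{"name": "Lab 521", "x_cm": 1400.0, "y_cm": 300.0}],
#                        [(1250.0, 340.0), (1300.0, 340.0), (1400.0, 300.0)])
```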
For human localization, a vision-based localization technique is proposed, which analyzes the images captured by the fisheye cameras and detects human activities in the environment. To transform the coordinates of image points into real-world space, a space-mapping technique is proposed. Furthermore, three techniques are integrated to conduct human orientation detection effectively: the first is analysis of human motions in consecutive images; the second is utilization of the orientation sensor on the user's mobile device; the last is localization of a color edge mark attached to the top of the mobile device, detected in the omni-images. Together, these techniques provide reliable human orientation detection.
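As an illustration of the localization pipeline described above, the sketch below performs a simplified foreground segmentation and foot-point detection on a fisheye frame, then looks up a precomputed image-to-world mapping. The thesis's own space-mapping technique is calibration-based (Chapter 3); the OpenCV background subtractor and the mapping_grid dictionary used here are assumptions made only for illustration.

```python
# Simplified sketch: foreground extraction, foot-point detection in a
# down-looking omni-image, and mapping of the foot point to world coordinates.
import cv2
import numpy as np

bg_subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)

def detect_foot_point(frame):
    """Return the image coordinates of the detected person's foot point, or None."""
    fg_mask = bg_subtractor.apply(frame)
    fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    person = max(contours, key=cv2.contourArea)       # take the largest foreground blob
    if cv2.contourArea(person) < 500:                 # ignore small noise blobs
        return None
    # In a down-looking omni-image, the foot point is taken here as the blob
    # point closest to the image center (directly below the ceiling camera).
    center = np.array([frame.shape[1] / 2, frame.shape[0] / 2])
    pts = person.reshape(-1, 2).astype(float)
    return tuple(pts[np.argmin(np.linalg.norm(pts - center, axis=1))])

def image_to_world(foot_uv, mapping_grid):
    """Map an image foot point to world coordinates via a calibrated lookup grid.
    mapping_grid[(u, v)] -> (x_cm, y_cm), assumed to be built in a learning phase."""
    u, v = int(round(foot_uv[0])), int(round(foot_uv[1]))
    return mapping_grid.get((u, v))
```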
A path planning technique is also proposed to generate a path from the user's current spot to a selected destination using an environment map. The environment map is constructed from a floor plan drawing of the indoor environment. An obstacle avoidance map is also created from the floor plan drawing and is used to determine the avoidance direction when a path collides with an obstacle in the environment.
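The sketch below illustrates the general idea of planning on an environment map derived from a floor plan drawing: dark pixels are treated as obstacle regions and a path is searched over the remaining walkable cells. Note that the thesis's actual technique computes obstacle avoidance directions (Chapter 5); the plain breadth-first search used here is a stand-in chosen only to keep the example short.

```python
# Illustrative sketch: derive a walkable grid from a floor plan image and
# search a path from a start cell to a goal cell (not the thesis's algorithm).
from collections import deque
import cv2
import numpy as np

def build_walkable_map(floor_plan_path, threshold=200):
    """Treat dark pixels in the floor plan drawing as obstacles; returns a bool grid."""
    plan = cv2.imread(floor_plan_path, cv2.IMREAD_GRAYSCALE)  # assumed to load successfully
    return plan >= threshold                                  # True = walkable cell

def find_path(walkable, start, goal):
    """Breadth-first search returning a list of (row, col) cells, or None if unreachable."""
    rows, cols = walkable.shape
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:                           # walk parents back to the start
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and walkable[nr, nc] and (nr, nc) not in parent:
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```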
Finally, the navigation information is overlaid onto the image shown on the mobile device to provide an AR navigation interface. A method for estimating the field of view of the camera on the mobile device is proposed. The field of view is used to construct a transformation matrix by which real-world points can be transformed onto the screen plane, so that the navigation information can be overlaid onto the corresponding real-world objects in the images to accomplish the AR function of the system.
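This projection step can be illustrated with a standard pinhole model: the estimated field of view yields a focal length in pixels, and a world point is rotated into the device camera's frame and projected onto the screen. The coordinate conventions below (heading measured clockwise from the +y axis, horizontal field of view) are assumptions for this sketch and may differ from the transformation matrix actually derived in Chapter 6.

```python
# Sketch of the AR overlay projection: map a real-world point to mobile-device
# screen coordinates using the estimated camera field of view.
import numpy as np

def project_to_screen(world_pt, cam_pos, heading_deg, fov_deg, img_w, img_h):
    """Return (u, v) pixel coordinates of a world point, or None if it lies behind the camera."""
    theta = np.radians(heading_deg)
    # Rotate the world offset into the camera frame, where the camera looks along +y,
    # +x points to the right, and +z points up (assumed conventions).
    rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                    [np.sin(theta),  np.cos(theta), 0.0],
                    [0.0,            0.0,           1.0]])
    p = rot @ (np.asarray(world_pt, float) - np.asarray(cam_pos, float))
    right, forward, up = p[0], p[1], p[2]
    if forward <= 0:
        return None                                        # point is behind the device camera
    f = (img_w / 2) / np.tan(np.radians(fov_deg) / 2)      # focal length in pixels from the FOV
    u = img_w / 2 + f * right / forward                    # screen x grows to the right
    v = img_h / 2 - f * up / forward                       # screen y grows downward
    return (u, v)

# Example: a target 3 m ahead and 0.5 m to the right of the user, at camera height.
# print(project_to_screen((50, 300, 0), (0, 0, 0), 0, 60, 800, 480))
```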
Good experimental results are presented to show the feasibility of the proposed methods for real applications. Precision measurements and statistics are also included, showing the system's effectiveness in producing accurate data for visiting-target display and environment navigation.
ABSTRACT (in Chinese) i
ABSTRACT (in English) ii
ACKNOWLEDGEMENTS iv
CONTENTS v
LIST OF FIGURES viii
LIST OF TABLES xiii

Chapter 1 Introduction 1
1.1 Background and Motivation 1
1.2 Review of Related Works 2
1.2.1 Review of Related Indoor Navigation Works 3
1.2.2 Review of Related Augmented Reality Works 4
1.2.3 Review of Related Human Localization Works 4
1.2.4 Review of Related Path Planning Works 5
1.3 Overview of Proposed Methods 6
1.4 Contributions 7
1.5 Thesis Organization 8
Chapter 2 Ideas of Proposed Methods and System Design 9
2.1 Ideas of Proposed Method 9
2.2 Ideas of System Design 11
2.2.1 Server-side System 11
2.2.2 Client-side System 12
2.2.3 Cooperation between Client and Server Sides 13
2.3 System Configuration 14
2.3.1 Hardware Configuration 14
2.3.2 Network Configuration 16
2.3.3 Software Configuration 16
2.4 System Processes 17
2.4.1 Learning Process 17
2.4.2 Navigation Process 19
Chapter 3 Learning of Environments 23
3.1 Ideas of Proposed Environment Learning Techniques 23
3.2 Coordinate Systems Used in This Study 24
3.3 Construction of Environment Map 25
3.3.1 Information of Environment Map 26
3.3.2 Finding Walkable Regions in Environment Floor Plan 27
3.3.3 Obstacle Orientation Analysis 29
3.3.4 Learning of Magnetic Field Information 32
3.3.5 Algorithm of Environment Construction 33
3.4 Camera Calibration 34
3.4.1 Fisheye Camera Calibration and Ground Point Location Mapping 34
3.4.2 Calibration of Camera on Mobile Device 40
3.5 Experimental Results 44
Chapter 4 Human Localization in Indoor Environments by Computer Vision Techniques 46
4.1 Idea of Proposed Human Localization Techniques 46
4.2 Human Location Detection 47
4.2.1 Background/Foreground Separation 47
4.2.2 Human Foot Point Detection and Computation 49
4.3 Human Orientation Detection 50
4.3.1 Orientation Detection by Human Motions 50
4.3.2 Orientation Detection by Orientation Sensor on Client Device 54
4.3.3 Orientation Detection by Color Edge Mark on Top of Client Device 56
4.3.4 Algorithm of Orientation Detection 60
4.4 Human Tracking 61
4.4.1 Idea of Human Tracking 61
4.4.2 Camera Hand-off 65
4.5 Algorithm of Human Localization and Tracking 67
4.6 Experimental Results 68
Chapter 5 Path Planning for Navigation 71
5.1 Ideas of Proposed Techniques 71
5.2 Obstacle Avoidance 72
5.3 Path Finding 77
5.4 Path Simplification 79
5.5 Path Update 86
5.6 Algorithm for Path Planning 88
5.7 Experimental Results 89
Chapter 6 Augmented Reality for Navigation 92
6.1 Ideas of Proposed Techniques 92
6.2 View Mapping between Real World and Client Device 93
6.2.1 Information for Use in Mapping between Real World and Client Device 93
6.2.2 Transformation from Real World Spot to Client Device Screen 94
6.3 Rendering for Visiting Targets and Navigation Paths 99
6.3.1 Visiting Target Rendering 99
6.3.2 Rendering and Geometry Creation of Navigation Paths 104
6.4 Algorithm of Indoor Navigation by Augmented Reality 107
6.5 Experimental Results 108
Chapter 7 Experimental Results and Discussions 111
7.1 Experimental Results 111
7.1.1 Result of Real Navigations 112
7.1.2 Result of Precision Measurement 118
7.2 Discussions 123
Chapter 8 Conclusions and Suggestions for Future Works 125
8.1 Conclusions 125
8.2 Suggestions for Future Works 126
References 128
[1] C. Lukianto, C. Honniger, and H. Sternberg, "Pedestrian Smartphone-Based Indoor Navigation Using Ultra Portable Sensory Equipment," in Proceedings of International Conference on Indoor Positioning and Indoor Navigation (IPIN), Zurich, Switzerland, 2010, pp. 1-5.
[2] B. Ozdenizci, K. Ok, V. Coskun, and M. N. Aydin, "Development of an Indoor Navigation System Using NFC Technology," in Proceedings of Fourth International Conference on Information and Computing (ICIC), Phuket Island, Thailand, 2011, pp. 11-14.
[3] L. C. Huey, P. Sebastian, and M. Drieberg, "Augmented Reality Based Indoor Positioning Navigation Tool," in Proceedings of IEEE Conference on Open Systems (ICOS), Langkawi, Malaysia, 2011, pp. 256 - 260.
[4] A. Mulloni, D. Wagner, D. Schmalstieg, and I. Barakonyi, "Indoor Positioning and Navigation with Camera Phones," IEEE Pervasive Computing, vol. 8, pp. 22-31, 2009.
[5] M. Werner, M. Kessel, and C. Marouane, "Indoor positioning using smartphone camera," in Proceedings of International Conference on Indoor Positioning and Indoor Navigation (IPIN), Guimaraes, Portugal, 2011, pp. 1-6.
[6] H. Hile and G. Borriello, "Positioning and Orientation in Indoor Environments Using Camera Phones," IEEE Computer Graphics and Applications, vol. 28, pp. 32-39, 2008.
[7] S. Henderson and S. Feiner, "Exploring the Benefits of Augmented Reality Documentation for Maintenance and Repair," IEEE Transactions on Visualization and Computer Graphics, vol. 17, pp. 1355 - 1368, 2011.
[8] M. C. Juan, C. Botella, M. Alcaniz, R. Banos, C. Carrion, M. Melero, and J. A. Lozano, "An Augmented Reality System for treating psychological disorders: Application to phobia to cockroaches," in Proceedings of the Third IEEE and ACM International Symposium on Mixed and Augmented Reality, Arlington, USA, 2004, pp. 256 - 257.
[9] D. Kalkofen, E. Mendez, and D. Schmalstieg, "Interactive Focus and Context Visualization for Augmented Reality," in Proceedings of IEEE and ACM International Symposium on Mixed and Augmented Reality, Nara, Japan, 2007, pp. 191-201.
[10] K. Jongbae and J. Heesung, "Vision-Based Location Positioning using Augmented Reality for Indoor Navigation," IEEE Transactions on Consumer Electronics, vol. 54, pp. 954-962, 2008.
[11] T. Miyashita, P. Meier, T. Tachikawa, S. Orlic, T. Eble, V. Scholz, A. Gapel, O. Gerl, S. Arnaudov, and S. Lieberknecht, "An Augmented Reality Museum Guide," in Proceedings of IEEE International Symposium on Mixed and Augmented Reality, Cambridge, United Kingdom, 2008, pp. 103-106.
[12] H. C. Chen and W. H. Tsai, "Optimal security patrolling by multiple vision-based autonomous vehicles with omni-monitoring from the ceiling," in Proceedings of 2008 International Computer Symposium, Taipei, Taiwan, Republic of China, 2008, pp. 196-201.
[13] J. Borenstein and Y. Koren, "The Vector Field Histogram-Fast Obstacle Avoidance for Mobile Robots," IEEE Transactions on Robotics and Automation, vol. 7, pp. 278-288, 1991.
[14] J. Y. Hwang, J. S. Kim, S. S. Lim, and K. H. Park, "A Fast Path Planning by Path Graph Optimization," IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 33, pp. 121-129, 2003.
[15] J. Bruce and M. Veloso, "Real-Time Randomized Path Planning for Robot Navigation," in IEEE/RSJ International Conference on Intelligent Robots and Systems, Lausanne, Switzerland, 2002, pp. 2383 - 2388.
[16] T. Akenine-Moller, E. Haines, and N. Hoffman, "Perspective Projection," in Real-Time Rendering, Third Edition, T. Akenine-Moller, ed., 2008, pp. 92-97.
[17] A. Senior, A. Hampapur, Y.-l. Tian, L. Brown, S. Pankanti, and R. Bolle, "Appearance Models for Occlusion Handling," in Proceedings of 2nd IEEE Workshop on Performance Evaluation of Tracking and Surveillance, Hawaii, USA, 2001.