Author: 王柏凱
Author (English): WANG, PO-KAI
Title (Chinese): 基於光流場景分析室內定位方法與小角近似簡化投影轉換方法之室內擴增實境導航技術及其穿戴式裝置實作
Title (English): Indoor Augmented Reality Navigation Based on Optical-Flow-Scene Indoor Positioning and Small-Angle-Approximation Projective Transformation and Its Wearable Device Implementation
Advisor: 何前程
Advisor (English): HO, CHIAN-CHENG
Committee Members: 郭鐘榮、黃崇豪、方楨文、陳國益
Committee Members (English): KUO, CHUNG-JUNG; HUANG, CHUANG-HAO; FANG, CHEN-WEN; CHEN, KUO-YI
Oral Defense Date: 2016-07-01
Degree: Master's
Institution: National Yunlin University of Science and Technology
Department: Electrical Engineering
Discipline: Engineering
Field: Electrical and Computer Engineering
Document Type: Academic thesis
Publication Year: 2016
Graduation Academic Year: 104 (ROC calendar)
Language: Chinese
Pages: 53
Keywords (Chinese): 擴增實境導航、室內定位、路徑規劃、影像對位、視角估測、投影轉換、穿戴式裝置
Keywords (English): augmented reality navigation, indoor positioning, path planning, image registration, pose estimation, projective transformation, wearable device
Usage statistics:
  • Cited: 1
  • Views: 229
  • Downloads: 0
  • Bookmarked: 0
“Location Awareness” and “Destination Navigation” are fundamental to versatile ubiquitous wearable-device applications and services, so this thesis develops an infrastructure-free indoor Augmented Reality Navigation (ARN) wearable device. It can overlay 3D virtual navigation directions on what users actually see in front of them in the real world, without deploying any infrastructure or special markers. To make the system more precise, reliable, and responsive, this work proposes “optical-flow-scene indoor positioning,” based on dead reckoning, to improve on “feature-optical-flow indoor positioning” and “optical-flow-field indoor positioning”; “adjacency-list-coordinate path planning,” based on Dijkstra's algorithm, to improve on “adjacency-matrix-node path planning” and “adjacency-list-node path planning”; “wall-floor-boundary image registration,” based on floor segmentation, to improve on “end-dot-edge-line image registration” and “panoramic-image-segmentation image registration”; “optical-flow-inertial pose estimation,” based on markerless tracking, to improve on “specific-marker-tracking pose estimation” and “natural-feature-tracking pose estimation”; and “small-angle-approximation projective transformation,” based on the homography matrix, to improve on “homogeneous-homography-matrix projective transformation” and “decomposed-homography-matrix projective transformation.” Implementation results show that the system featuring these five proposed methods achieves higher accuracy and lower latency than conventional, well-known ARN techniques. Moreover, it has been implemented seamlessly and smoothly on an Android wearable device. This work suits versatile outdoor and indoor wearable navigation services, such as site directions, event guidance, merchandise seeking, and social searching.
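The optical-flow-based indoor positioning described in the abstract rests on dead reckoning: frame-to-frame optical flow is condensed into a per-frame camera displacement and accumulated into a position estimate. The sketch below illustrates only that accumulation idea on a synthetic flow field; the array shapes, scale factor, and helper names are illustrative assumptions, not the thesis's actual pipeline, and a real implementation would obtain the flow field from a dense optical-flow routine such as OpenCV's `calcOpticalFlowFarneback`:

```python
import numpy as np

def step_from_flow(flow, metres_per_pixel):
    """One dead-reckoning step: average the dense flow field of a frame pair
    into a single pixel displacement and scale it into metres."""
    mean_flow = flow.reshape(-1, 2).mean(axis=0)
    # A camera stepping right makes the scene appear to flow left,
    # hence the sign flip from scene flow to camera motion.
    return -mean_flow * metres_per_pixel

# Synthetic flow fields for three consecutive frame pairs: the whole scene
# shifts by (-10, 0) pixels per frame, i.e. the camera steps right.
flow = np.full((48, 64, 2), [-10.0, 0.0])

position = np.zeros(2)
for _ in range(3):
    position += step_from_flow(flow, metres_per_pixel=0.01)
# Accumulated estimate: 3 steps of 10 px at 0.01 m/px, all to the right.
```

Accumulating raw steps like this drifts over time, which is why the thesis pairs the flow analysis with scene analysis rather than relying on integration alone.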
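The proposed path planning builds on Dijkstra's algorithm over an adjacency list whose nodes carry floor-plan coordinates, so edge costs can be real walking distances rather than abstract hop counts. A minimal sketch of that combination, with a waypoint graph and names invented for illustration rather than taken from the thesis:

```python
import heapq
import math

def dijkstra(adj, start, goal):
    """Shortest path over an adjacency list {node: [(neighbour, cost), ...]}."""
    dist = {start: 0.0}
    prev = {}
    done = set()
    queue = [(0.0, start)]
    while queue:
        d, u = heapq.heappop(queue)
        if u in done:
            continue
        done.add(u)
        if u == goal:
            break
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, math.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(queue, (nd, v))
    # Walk the predecessor chain back from the goal to recover the route.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# Hypothetical floor-plan waypoints in metres; edges connect adjacent
# waypoints and are weighted by the Euclidean distance between coordinates.
coords = {"A": (0, 0), "B": (4, 0), "C": (4, 3), "D": (0, 3)}

def edge(p, q):
    return (q, math.dist(coords[p], coords[q]))

adj = {
    "A": [edge("A", "B"), edge("A", "D")],
    "B": [edge("B", "A"), edge("B", "C")],
    "C": [edge("C", "B"), edge("C", "D")],
    "D": [edge("D", "A"), edge("D", "C")],
}
route, cost = dijkstra(adj, "A", "C")  # e.g. A -> D -> C, 7.0 m
```

One appeal of coordinate-carrying nodes is that the returned route is already a sequence of world positions, ready to be handed to the registration and rendering stages.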
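The small-angle-approximation idea exploits the fact that for small viewing-angle changes sin θ ≈ θ and cos θ ≈ 1, which strips the trigonometric terms out of the projection update. The thesis applies this to the full 3×3 homography; the sketch below demonstrates only the approximation and its error behaviour on a plane rotation, with the angle and point chosen for illustration:

```python
import numpy as np

def rotation_exact(theta):
    """Exact 2-D rotation matrix."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

def rotation_small_angle(theta):
    """First-order approximation: cos(theta) ~ 1, sin(theta) ~ theta,
    which removes both trigonometric evaluations."""
    return np.array([[1.0,  -theta],
                     [theta, 1.0]])

theta = np.deg2rad(2.0)        # a small 2-degree viewing-angle change
p = np.array([1.0, 0.0])       # a point on the image plane
error = np.linalg.norm(rotation_exact(theta) @ p
                       - rotation_small_angle(theta) @ p)
# The leading error term is theta**2 / 2, i.e. roughly 6e-4 for 2 degrees.
```

Because the error grows quadratically with the angle, the approximation is cheap and accurate for the small per-frame pose changes of a head-worn camera, but not for large rotations.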
Abstract (Chinese)
Abstract (English)
Table of Contents
List of Tables
List of Figures
Chapter 1: Introduction
1.1 Research Background
1.2 Research Motivation and Objectives
1.3 Augmented Reality Navigation Workflow
1.4 Thesis Organization
Chapter 2: Conventional Indoor Augmented Reality Navigation Techniques
2.1 Orientation and Positioning Stage
2.2 Path Planning Stage
2.3 Image Registration Stage
2.4 Pose Estimation Stage
2.5 Image Rendering Stage
Chapter 3: Infrastructure-Free Indoor Augmented Reality Navigation Techniques
3.1 Optical-Flow-Scene Indoor Positioning Method
3.2 Adjacency-List-Coordinate Path Planning Method
3.3 Wall-Floor-Boundary Image Registration Method
3.4 Optical-Flow-Inertial Pose Estimation Method
3.5 Small-Angle-Approximation Projective Transformation Method
Chapter 4: Experimental Analysis
4.1 Experimental Platform
4.2 Experimental Results
4.2.1 Optical-Flow-Scene Indoor Positioning Method
4.2.2 Adjacency-List-Coordinate Path Planning Method
4.2.3 Wall-Floor-Boundary Image Registration Method
4.2.4 Optical-Flow-Inertial Pose Estimation Method
4.2.5 Small-Angle-Approximation Projective Transformation Method
Chapter 5: Wearable Device Implementation of Infrastructure-Free Indoor Augmented Reality Navigation
5.1 Implementation Motivation
5.2 Implementation Platform
5.3 Software Architecture of the Android Embedded System Platform
5.4 Open-Source Libraries
5.4.1 Computer Vision Library: OpenCV
5.4.2 Open Graphics Library: OpenGL
5.4.2.1 OpenGL Core Library
5.4.2.2 OpenGL Utility Library (GLU)
5.4.2.3 OpenGL Auxiliary Library (AUX)
5.4.2.4 OpenGL Utility Toolkit (GLUT)
5.5 System Architecture of the Wearable Device Implementation
5.6 Demonstration of the Wearable Device Implementation
Chapter 6: Conclusions and Future Work
References

[1] Wikipedia, “South-pointing chariot,” [Online]. Available:
https://en.wikipedia.org/wiki/South-pointing_chariot
[2] YouTube, “Sun stone,” [Online]. Available:
https://www.youtube.com/watch?v=OtLcvb5WyhU&feature=youtu.be
[3] Hong Kong Astronomical Society, “Astrolabe,” [Online]. Available:
http://forum.hkas.org.hk/thread-5186-1-1.html
[4] Mobey Proximity, “Indoor Positioning System,” [Online]. Available:
http://obeyproximity.com/2015/12/14/indoor-positioning-system/
[5] Megaleecher.Net, “IndoorAtlas – Smart indoor navigation system uses earth’s magnetic field,” [Online]. Available:
http://www.megaleecher.net/IndoorAtlas#axzz4BYdQmQrI
[6] CTIMES, “Analysis of feasible indoor positioning technologies,” [Online]. Available:
https://www.ctimes.com.tw/DispCols/tw/GPS/DecaWave/BLE/Apple/802.15.4a/1312171639X6.shtml
[7] Business Next, “Carrefour France uses ‘light bulbs’ to track merchandise locations,” [Online]. Available:
http://www.bnext.com.tw/article/view/id/36373
[8] INSIDE, “HyXen Technology to deepen its positioning technology,” [Online]. Available:
http://www.inside.com.tw/2015/04/02/foxconn-invests-hyxen-over-million
[9] iThome, “Aruba enters the beacon positioning market,” [Online]. Available:
http://www.ithome.com.tw/article/93105
[10] CTIMES, “Dissecting the five major indoor positioning technologies,” [Online]. Available:
http://www.hope.com.tw/DispNews-tw.asp?O=HJWC6A09T98SAA0MEP
[11] IEEE Spectrum, “Navigating the Great Indoors,” [Online]. Available:
http://spectrum.ieee.org/consumer-electronics/portable-devices/navigating-the-great-indoors
[12] dako-pr, “Mobile Navigation,” [Online]. Available:
http://www.dako-pr.de/service/pressemitteilungen/news-details/article/mobile-gebaeudenavigation-ganz-ohne-app.html
[13] YouTube, “Unfamiliar with the road, trusting the navigation! Tour bus hits scooter; grandmother and grandchild killed,” [Online]. Available: https://www.youtube.com/watch?v=_GFzce8J7lk
[14] Wikipedia, “Augmented Reality,” [Online]. Available: http://en.wikipedia.org/wiki/Augmented_reality
[15] Wikipedia, “Indoor positioning system,” [Online]. Available:
https://en.wikipedia.org/wiki/Indoor_positioning_system
[16]A. R. J. Ruis, F. S. Granja, J. C. P. Honorato, and J. I. G. Rosas, “Accurate pedestrian indoor navigation by tightly coupling foot-mounted IMU and RFID measurements,” IEEE Transactions on Instrumentation and Measurement, vol. 61, no. 1, pp. 178–189, Jan. 2012.
[17] F. Forno, G. Malnati, and G. Portelli, “Design and implementation of a Bluetooth ad hoc network for indoor positioning,” IEE Proceedings – Software, vol. 152, no. 5, Oct. 2005.
[18]Chung-Hao Huang, Lun-Hui Lee, Chian C. Ho, Lang-Long Wu, and Zu-Hao Lai, “Real-time RFID indoor positioning system based on Kalman-filter drift removal and Heron-bilateration location estimation,” IEEE Transactions on Instrumentation and Measurement, vol. 64, no. 3, pp. 728–739, Mar. 2015.
[19] Cristiano di Flora and Marion Hermersdorf, “A practical implementation of indoor location-based services using simple WiFi positioning,” Journal of Location Based Services, vol. 2, no. 2, pp. 87–111, June 2008.
[20] Shu Liu, Yingxin Jiang, and Aaron Striegel, “Face-to-Face Proximity Estimation Using Bluetooth on Smartphones,” IEEE Transactions on Mobile Computing, vol. 13, no. 4, Apr. 2014.
[21] Teresa Garcia-Valverde, Alberto Garcia-Sola, Hani Hagras, James A. Dooley, Victor Callaghan, and Juan A. Botia, “A Fuzzy Logic-Based System for Indoor Localization Using WiFi in Ambient Intelligent Environments,” IEEE Transactions on Fuzzy Systems, vol. 21, no. 4, Aug. 2013.
[22]S. DiVerdi and T. Hollerer, “Heads up and camera down: a vision-based tracking modality for mobile mixed reality,” IEEE Transactions on Visualization and Computer Graphics, vol. 14, no. 3, pp. 500–512, May–June. 2008.
[23]Maria E. Angelopoulou and Christos-Savvas Bouganis, “Vision-Based Egomotion Estimation on FPGA for Unmanned Aerial Vehicle Navigation,” IEEE Transactions on Circuits and Systems for Video Technology, vol. 24, no. 6, June 2014
[24] César Silva and José Santos-Victor, “Robust Egomotion Estimation From the Normal Flow Using Search Subspaces,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 9, Sep. 1997.
[25] Kahlouche Souhila and Achour Karim, “Optical Flow based robot obstacle avoidance,” International Journal of Advanced Robotic Systems, vol. 4, no. 1, pp. 13–16, 2007, ISSN 1729-8806.
[26]Chris McCarthy, Nick Barnes, and Robert Mahony, “A Robust Docking Strategy for a Mobile Robot Using Flow Field Divergence,” IEEE Transactions on Robotics, vol. 24, no. 4, August 2008
[27]Andrea Giachetti, Marco Campani, and Vincent Torre, “The Use of Optical Flow for Road Navigation,” IEEE Transactions on Robotics and Automation, vol. 14, no. 1, February 1998
[28]Antonio Robles-Kelly and Edwin R. Hancock, “Graph edit distance from spectral seriation,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, pp.365-378, Mar. 2005.
[29]Chen Wang and Kai-Kuang Ma, “Common visual pattern discovery via directed graph model,” Proceedings of IEEE International Conference on Image Processing (ICIP), Brussels, Belgium, pp.2957-2960, Sep. 2011.
[30]Joon-Sang Park, Michael Penner and Viktor K. Prasanna, “Optimizing graph algorithms for improved cache performance,” IEEE Transactions on Parallel and Distributed Systems, vol. 15, pp.769-782, Sep. 2004.
[31]Marcel Hlawatsch, Michael Burch and Daniel Weiskopf, “Visual Adjacency Lists for Dynamic Graphs,” IEEE Transactions on Visualization and Computer Graphics, vol. 20, pp.1590-1603, Nov. 2014.
[32]DongKai Fan and Ping Shi, “Improvement of Dijkstra's algorithm and its application in route planning,” Proceedings of IEEE Seventh International Conference on Fuzzy Systems and Knowledge Discovery (FSKD), vol. 4, Yantai, Shandong, pp.1901-1904, Aug. 2010.
[33]Yinxiao Li and Stanley T. Birchfield, “Image-Based Segmentation of Indoor Corridor Floors for a Mobile Robot,” Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 837-843,18-22 Oct. 2010
[34] Guillem Casas Barceló, Ghazaleh Panahandeh, and Magnus Jansson, “Image-Based Floor Segmentation in Visual Inertial Navigation,” Proceedings of the IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Minneapolis, May 6–9, 2013.
[35]Luis Felipe Posada, Krishna Kumar Narayanan, Frank Hoffmann and Torsten Bertram, “Floor Segmentation of Omnidirectional Images for Mobile Robot Visual Navigation,” Proceedings of the IEEE International Conference on Intelligent Robots and Systems (IROS), Taipei, Taiwan, 18-22 Oct. 2010.
[36]Jongbae Kim and Heesung Jun, “Vision-based location positioning using augmented reality for indoor navigation,” IEEE Transactions on Consumer Electronics, vol. 54, no. 3, pp. 954–962, Aug. 2008.
[37]O. Mohareri and A. B. Rad, “Autonomous humanoid robot navigation using augmented reality technique,” Proceedings of 2011 IEEE International Conference on Mechatronics (ICM), pp. 463–468, Apr. 2011.
[38]H. Hile and G. Borriello, “Positioning and orientation in indoor environments using camera phones,” IEEE Computer Graphics and Applications, vol. 28, no. 4, pp. 32–39, Jul.–Aug. 2008.
[39]A. D. Cheok and Li Yue, “A novel light-sensor-based information transmission system for indoor positioning and navigation,” IEEE Transactions on Instrumentation and Measurement, vol. 60, no. 1, pp. 290–299, Jan. 2011.
[40]R. Hervas, J. Bravo, and J. Fontecha, “An assistive navigation system based on augmented reality and context awareness for people with mild cognitive impairments,” IEEE Journal of Biomedical and Health Informatics, vol. 18, no. 1, pp. 368–374, Jan. 2014.
[41]T. Oskiper, M. Sizintsev, V. Branzoi, S. Samarasekera, and R. Kumar, “Augmented reality binoculars,” IEEE Transactions on Visualization and Computer Graphics, vol. 21, no. 5, pp. 611–623, May 2015.
[42] Wikipedia, “Camera resectioning,” [Online]. Available:
https://en.wikipedia.org/wiki/Camera_resectioning#Intrinsic_parameters
[43] Wikipedia, “Matrix decomposition,” [Online]. Available:
https://en.wikipedia.org/wiki/Matrix_decomposition
[44] D. W. F. van Krevelen and R. Poelman, “A Survey of Augmented Reality Technologies, Applications and Limitations,” The International Journal of Virtual Reality, vol. 9, no. 2, pp. 1–20, 2010.
[45] Ezio Malis and Manuel Vargas, “Deeper understanding of the homography decomposition for vision-based control,” HAL archives-ouvertes, inria-00174036, submitted Sep. 25, 2007.
[46] HTC, “HTC One (M8),” [Online]. Available:
http://www.htc.com/tw/smartphones/htc-one-m8/
[47] Business Next, “[2015 in review] Is the wearable device market still riding the wave, or heading into winter?,” [Online]. Available: http://www.bnext.com.tw/article/view/id/38304
[48] Wikipedia, “Google Glass,” [Online]. Available:
https://en.wikipedia.org/wiki/Google_Glass
[49] EPSON, “EPSON BT-200,” [Online]. Available:
http://www.epson.com.tw/Projectors/V11H560054/Overview
[50] Wikipedia, “Microsoft HoloLens,” [Online]. Available:
https://en.wikipedia.org/wiki/Microsoft_HoloLens
[51] Wikipedia, “Android,” [Online]. Available: http://zh.wikipedia.org/wiki/Android
[52] Wikipedia, “OpenCV,” [Online]. Available: http://zh.wikipedia.org/wiki/OpenCV
[53] Wikipedia, “OpenGL,” [Online]. Available: http://zh.wikipedia.org/wiki/OpenGL

Electronic full text (publicly available online from 2021-12-31)