
臺灣博碩士論文加值系統 (National Digital Library of Theses and Dissertations in Taiwan)


Detailed Record

Author: 李至偉
Author (English): Chih-Wei Lee
Title (Chinese): 基於語音標籤辨識之穿戴式視障者導引系統
Title (English): Wearable Guidance System Based on Voice-Marker Recognition for People with Visual Impairment
Advisor: 阮聖彰
Advisor (English): Shanq-Jang Ruan
Committee Members: 林昌鴻、陳維美
Committee Members (English): Chang-Hong Lin, Wei-Mei Chen
Oral Defense Date: 2017-07-18
Degree: Master's
Institution: 國立臺灣科技大學 (National Taiwan University of Science and Technology)
Department: Department of Electronic Engineering
Discipline: Engineering
Field: Electrical and Computer Engineering
Thesis Type: Academic thesis
Year of Publication: 2017
Academic Year of Graduation: 105
Language: English
Number of Pages: 55
Keywords (Chinese): 室內定位、穿戴式裝置、影像標籤、視障者
Keywords (English): indoor locating, wearable device, visual marker, people with visual impairment
Usage Statistics:
  • Cited by: 0
  • Views: 245
  • Rating:
  • Downloads: 21
  • Bookmarked: 0
Chinese Abstract:
There are about 285 million people with visual impairment in the world, and helping them move freely in familiar and unfamiliar places has become an important issue. This thesis proposes a wearable guidance system designed for people with visual impairment, consisting of a pair of wearable glasses paired with a mobile phone. The system is based on recognizing a specially designed visual marker. Because of their visual impairment, it is difficult for these users to find ordinary 2-D barcodes in the environment, so this thesis proposes a special color marker that embeds voice information; when the marker is 400 cm² in size, it can be detected from as far as 15 meters away. Through this system, users can receive a voice prompt for their current location without any network connection or database. The marker recognition runs in real time on most quad-core mobile phones. In addition, to ease the difficulties people with visual impairment face when walking, multiple ultrasonic sensors are used to implement low-power obstacle avoidance; the positioning and obstacle-avoidance functions can run simultaneously or be enabled individually. To realize the system with a low-power architecture and avoid computational overload, the glasses are responsible only for capturing and transmitting data, including images and ultrasonic sensor values, while the mobile phone processes these data. The proposed system can help people with visual impairment have a better experience in many everyday places, such as bus stops and department stores, and reach their destinations safely without assistance from others.
English Abstract:
There are 285 million people with visual impairment in the world, and helping them walk freely in familiar and unfamiliar places is an important issue nowadays. This thesis presents a wearable guidance system, built on a pair of wearable glasses paired with a mobile phone, for people with visual impairment. The proposed system is based on recognition of a specially designed voice-marker. Because people with visual impairment have difficulty finding ordinary 2-D barcode markers in their environment, this thesis proposes a custom color marker that contains voice information and is detectable from up to 15 m away when its size is 400 cm². Users can obtain an audio message describing their current location through the system without any network connection or database. The marker recognition runs in real time on most quad-core mobile phones. In addition, to ensure a safe walking experience for people with visual impairment, multiple micro ultrasonic sensors are used to achieve low-power obstacle detection. The positioning and obstacle-detection functions can work independently or in parallel. To achieve a low-power system architecture and avoid computational overload, the glasses are responsible only for capturing and transmitting data, including the image sequence and ultrasonic sensor values, while the mobile phone processes these data to determine the location and detect obstacles. The proposed system could help people with visual impairment have a better experience in many environments, such as bus stations or department stores, and arrive at their destinations safely without others' assistance.
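The abstract describes a split architecture in which the glasses only capture and forward data (camera frames plus ultrasonic readings) while the phone performs marker recognition and obstacle detection. The Python sketch below illustrates one way the phone-side loop could be organized; it is a minimal illustration only, and every name in it (SensorPacket, detect_voice_marker, OBSTACLE_THRESHOLD_CM) as well as the 100 cm alert threshold are assumptions for the sake of the example, not identifiers or parameters taken from the thesis.

# Minimal sketch of the glasses-to-phone split described in the abstract.
# All names and thresholds here are hypothetical, not from the thesis.
from dataclasses import dataclass
from typing import List, Optional

OBSTACLE_THRESHOLD_CM = 100  # assumed alert distance; the thesis does not state one

@dataclass
class SensorPacket:
    """One update sent from the glasses: a camera frame plus ultrasonic readings."""
    frame: bytes                 # compressed camera frame
    ultrasonic_cm: List[float]   # one distance reading per ultrasonic sensor

def detect_voice_marker(frame: bytes) -> Optional[str]:
    """Placeholder for the color voice-marker recognizer that runs on the phone.

    The real system decodes the voice/location message embedded in the marker;
    this stub only marks where that step would plug in.
    """
    return None  # no marker found in this stub

def process_packet(packet: SensorPacket) -> List[str]:
    """Phone-side processing: marker recognition and obstacle detection.

    As the abstract notes, either function can run alone or both in parallel.
    """
    messages = []
    location = detect_voice_marker(packet.frame)
    if location is not None:
        messages.append(f"Location: {location}")
    if any(d < OBSTACLE_THRESHOLD_CM for d in packet.ultrasonic_cm):
        messages.append("Obstacle ahead")
    return messages

if __name__ == "__main__":
    demo = SensorPacket(frame=b"", ultrasonic_cm=[250.0, 80.0, 300.0])
    print(process_packet(demo))   # -> ['Obstacle ahead']

Chapter 4 of the thesis lists RenderScript and the Android NDK under the mobile phone application, which is where the real recognizer would run; the stub above only stands in for that step.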
Table of Contents:
Recommendation Form
Committee Form
Chinese Abstract
English Abstract
Acknowledgements
Table of Contents
List of Figures
List of Tables
Chapter 1 Introduction
1.1 Background
1.2 Feature of This Work
1.3 Organization of This Thesis
Chapter 2 Related Works
2.1 A Review of Visual Markers
2.1.1 AprilTag
2.1.2 Quick Response Code (QR code)
2.1.3 VoiceCode
2.1.4 Target Marker
2.2 A Review of Aids for People with Visual Impairment
Chapter 3 Proposed Voice-Marker
3.1 Marker Design
3.2 Marker Detection
3.3 Marker Recognition
Chapter 4 Proposed System
4.1 Proposed System Architecture
4.2 Wearable Device
4.2.1 The Intel Edison Module
4.3 Mobile Phone Application
4.3.1 RenderScript
4.3.2 Android NDK
Chapter 5 Experimental Results
5.1 Run Time of the Proposed Marker Processing
5.2 Detectable Distance and Detection Rate of the Proposed Marker
5.3 Detection Rate of the Obstacles Detection Function
5.4 Power Consumption
5.4.1 Mobile Device Power Monitor
Chapter 6 Conclusions
References