
National Digital Library of Theses and Dissertations in Taiwan


Detailed Record

Author: 王冠傑
Author (English): Kuan-Chieh Wang
Title (Chinese): 極低複雜度之車道偏移警示系統演算法
Title (English): Ultra Low Complexity Algorithm for Lane Departure Warning System
Advisor: 吳崇賓
Committee Members: 范志鵬, 陳春僥
Oral Defense Date: 2015-07-22
Degree: Master's
Institution: 國立中興大學 (National Chung Hsing University)
Department: Department of Electrical Engineering
Discipline: Engineering
Field: Electrical and Computer Engineering
Thesis Type: Academic thesis
Publication Year: 2015
Graduation Academic Year: 103
Language: Chinese
Pages: 69
Keywords (Chinese): 車道線, 車道線偵測, 車道偏移, 車道偏移警示
Keywords (English): lane, lane detection, lane departure, lane departure warning
Statistics:
  • Cited by: 0
  • Views: 149
  • Rating:
  • Downloads: 0
  • Bookmarks: 0
This thesis proposes an ultra-low-complexity lane departure warning algorithm for dash-cam video, together with a fast variant. Because lane markings at near range appear as straight lines in dash-cam images, each lane marking is represented by a first-degree linear equation, and this property is also used to determine the region of interest (ROI). Within the ROI, lane markings are detected from their color, brightness, and angle features, and lane departure is decided from the angles of the detected lane markings.
The proposed simplified lane departure algorithm (SLDA) uses color extraction and edge detection to correctly separate lane markings from the road surface inside the ROI. The ROI is partitioned into non-overlapping angle blocks according to the edge-gradient information, and the angle blocks are classified into six groups. An adaptive standard deviation of the intercept is then proposed to filter out angle blocks that do not lie on a lane marking. The remaining blocks are further classified as belonging to the left, middle, or right lane marking, and the two lane markings to be detected are chosen from the block counts of these classes. The mean point of the blocks on each lane marking, together with the vanishing point, fully specifies the lane-line equation, and its angle is used to decide lane departure.
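As a rough illustration of the block-angle step, the following minimal Python sketch (assuming NumPy; the block size, the finite-difference gradients standing in for Sobel, and the binning rule are placeholder assumptions, not the thesis's implementation) quantizes per-block edge angles into six groups:

    import numpy as np

    def classify_blocks(gray_roi, block=8, n_groups=6):
        # Toy sketch: per-block edge angle from finite-difference gradients,
        # quantized into n_groups bins over [0, 180) degrees.
        gy, gx = np.gradient(gray_roi.astype(float))        # d/dy, d/dx
        h, w = gray_roi.shape
        labels = {}
        for y in range(0, h - block + 1, block):
            for x in range(0, w - block + 1, block):
                bx = gx[y:y + block, x:x + block].sum()      # net horizontal edge response
                by = gy[y:y + block, x:x + block].sum()      # net vertical edge response
                if abs(bx) + abs(by) < 1e-6:                 # flat block: no edge, skip
                    continue
                angle = np.degrees(np.arctan2(by, bx)) % 180.0
                labels[(y, x)] = int(angle // (180.0 / n_groups))  # group index 0..5
        return labels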
However, SLDA extracts lane markings from color features and therefore neglects scenes with drastic illumination changes, and it reuses the block information repeatedly during lane detection. To address these problems, a fast lane departure algorithm (FLDA) is further proposed on the basis of SLDA. FLDA first applies weight-adjusted RGB values inside the ROI to enlarge the brightness difference between the lane markings and the road surface, and then extracts the lane-marking information with adaptive weights. The proposed gradient operator obtains the block angle directly, reducing the complexity of the block-angle computation in SLDA. The lane markings are then detected directly from the angles and intercepts of these blocks, and lane departure is decided in the same way as in SLDA.
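The weighted-RGB idea can be illustrated with a short sketch; the channel weights and the mean-plus-std cut below are placeholder assumptions, since the abstract does not give the thesis's adaptive weights:

    import numpy as np

    def lane_brightness(rgb_roi, weights=(0.5, 0.4, 0.1)):
        # Weighted-RGB brightness: emphasize the channels where white/yellow
        # markings differ most from the gray road. Weights are placeholders.
        r = rgb_roi[..., 0].astype(float)
        g = rgb_roi[..., 1].astype(float)
        b = rgb_roi[..., 2].astype(float)
        return weights[0] * r + weights[1] * g + weights[2] * b

    def lane_mask(brightness, k=1.0):
        # Stand-in for the adaptive weighting: keep pixels brighter than mean + k*std.
        return brightness > brightness.mean() + k * brightness.std()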
Experimental results show that SLDA achieves an average lane-detection accuracy of 91.6% with an average processing time of 40.19 ms, while the improved FLDA raises the accuracy to 96.41% with an average processing time of 3.71 ms and adapts better than SLDA to a wide range of driving environments.


This thesis proposes a fast, linear, block-based lane detection and departure warning system. Based on an analysis of the distribution of lane markings, lane markings at near range are extracted as straight lines. A region of interest (ROI) is determined, and the lane markings within it are enhanced so that detection remains reliable under different weather conditions.
For SLDA, we extract the colors of the lane markings in the determined ROI, and the result is partitioned into non-overlapping blocks. A Sobel edge filter is applied to compute the angle of each block. According to their angles, the blocks are classified into six groups, each assigned a fixed gradient. An adaptive standard deviation of the intercept is proposed to filter out redundant blocks. After filtering, the number of blocks in each group is used to determine the left and right candidate lane markings. The mean point of each of the two chosen groups is calculated, and the vanishing points of the two lanes are also determined. From the mean points and the vanishing points, the linear functions describing the left and right lane markings are obtained. The gradients of these linear functions are used for the lane departure decision: if a gradient is greater than or equal to the preset threshold, the lane departure warning signal is issued.
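To make the line-fitting and warning step concrete, here is a minimal sketch (hypothetical helper names and threshold value, image coordinates assumed) of describing a lane by the line through its mean point and vanishing point and comparing the slope with a preset threshold:

    def lane_line(mean_pt, vanish_pt):
        # Line y = m*x + b through the blocks' mean point and the vanishing point.
        (x1, y1), (x2, y2) = mean_pt, vanish_pt
        if x1 == x2:                      # vertical line: slope undefined
            return None
        m = (y2 - y1) / (x2 - x1)
        return m, y1 - m * x1

    def departure_warning(left, right, slope_threshold=2.5):
        # Hypothetical rule: warn when either lane line's |slope| reaches the
        # preset threshold (the threshold value here is an assumption).
        return any(line is not None and abs(line[0]) >= slope_threshold
                   for line in (left, right))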
As for FLDA, to reduce computation, we propose two arrays that compute the horizontal and vertical gradients of a block, from which the block angle and block gradient are obtained. Based on the driving conditions, the blocks are classified into six groups by their angles. The two groups with the largest block counts are chosen as the left and right lane markings. To describe each lane marking with a linear function, the average gradient and intercept of each group are computed. According to the gradients of the lane markings and the preset thresholds, the warning decision is made.
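One possible reading of the two-array gradient idea, offered only as an assumption since the abstract does not define the operator, is to project each block onto its column and row sums and difference those 1-D profiles:

    import numpy as np

    def block_gradient(block):
        # Two 1-D projections of the block: column sums (horizontal profile)
        # and row sums (vertical profile); their end-to-end changes give cheap
        # horizontal/vertical gradients and hence a block angle.
        col_profile = block.sum(axis=0).astype(float)
        row_profile = block.sum(axis=1).astype(float)
        gx = col_profile[-1] - col_profile[0]            # net horizontal change
        gy = row_profile[-1] - row_profile[0]            # net vertical change
        if gx == 0 and gy == 0:
            return gx, gy, None                          # flat block: no angle
        angle = np.degrees(np.arctan2(gy, gx)) % 180.0
        return gx, gy, angle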
The experimental results show that the average detection rate of SLDA reaches 91.6% with an average processing time of 40.19 ms, while FLDA raises the average detection rate to 96.41% with an average processing time of 3.71 ms. Furthermore, FLDA is more adaptive than SLDA to various environments.


Table of Contents
List of Figures
List of Tables
Chapter 1 Introduction
1.1 Motivation
1.2 Objectives
1.3 Thesis Organization
Chapter 2 Literature Review
2.1 Lane Feature Extraction
2.2 Lane Detection
2.3 Lane Departure Warning
Chapter 3 SLDA: Simplified Lane Departure Warning Algorithm
3.1 Algorithm Architecture
3.2 Lane Feature Extraction
3.2.1 Region of Interest
3.2.2 Color Extraction
3.3 Block Classification
3.3.1 Block Partitioning
3.3.2 Edge Detection
3.3.3 Block Group Determination
3.4 Lane Detection
3.4.1 Non-Lane Block Elimination
3.4.2 Lane Mode Decision
3.4.3 Mean Point Calculation
3.4.4 Vanishing Point Searching
3.4.5 Lane Detection
3.5 Lane Departure Warning
3.5.1 Lane Departure Range Setting
3.5.2 Departure Determination
Chapter 4 FLDA: Fast Lane Departure Warning Algorithm
4.1 Fast Algorithm Architecture
4.2 Lane Brightness Extraction
4.3 Block Angle Calculation and Classification
4.3.1 Block Gradient
4.3.2 Bounded Block Classification
4.3.3 Lane Detection
Chapter 5 Experimental Results and Discussion
5.1 Experimental Results and Discussion of the Algorithms
5.1.1 Lane Feature Extraction
5.1.2 Block Classification Results
5.1.3 Lane Detection Results
5.2 Environment and Performance Analysis
5.2.1 Computational Complexity Analysis
5.2.2 Accuracy and Performance Analysis
Chapter 6 Conclusions and Future Work
6.1 Conclusions
6.2 Future Work
References



