National Digital Library of Theses and Dissertations in Taiwan (臺灣博碩士論文加值系統)

Detailed Record

Graduate Student: 陳振昌
Graduate Student (English): Zhen-Chang Chen
Thesis Title (Chinese): 抗部分遮蔽之採用交互多模型粒子濾波器的行人追蹤
Thesis Title (English): Pedestrian Tracking Using Interacting Multiple Model Particle Filter for Partial Occlusion
Advisor: 唐之瑋
Advisor (English): Chih-Wei Tang
Degree: Master's
Institution: National Central University
Department: Department of Communication Engineering
Discipline: Engineering
Field: Electrical Engineering and Computer Science
Thesis Type: Academic thesis
Year of Publication: 2017
Graduation Academic Year: 105 (ROC calendar)
Language: Chinese
Number of Pages: 87
Keywords (Chinese): 行人追蹤、交互多模型粒子濾波器、特徵匹配、遮蔽
Keywords (English): pedestrian tracking, interacting multiple model, particle filter, feature matching, occlusion
Usage statistics:
  • Cited: 0
  • Views: 242
  • Downloads: 20
  • Added to personal bibliographies: 0
Chinese Abstract (translated):
Object tracking is widely applied in computer vision, and pedestrian tracking is particularly important in visual surveillance systems. However, other static or dynamic objects in the environment often disturb a pedestrian's original motion behavior and may also cause occlusion, while approaching or moving away from the camera makes the pedestrian grow or shrink in the image. This thesis therefore proposes an object tracking algorithm that combines an interacting multiple model particle filter (IMMPF) with SURF-feature-matching-based target detection. In the correction stage of the particle filter, the state corresponding to the maximum color weight of each motion model is computed and used to update the mode probability, improving the accuracy of the mode probability. The IMMPF then uses the mode probabilities of the previous time instant to update the mixing probabilities in its interaction stage, so that the mixed state distribution of each motion model better approximates the prior distribution of the pedestrian's current state, which in turn improves prediction accuracy. In addition, the proposed method computes the color similarity with the target template for the state of each motion model estimated by its particle filter, for the overall state estimated by the IMMPF, and for candidate boxes of different sizes centered on the matched SURF keypoints in the current frame, which raises tracking accuracy. Finally, an appearance-similarity test decides whether to update the object's appearance model, preventing changes in target size and illumination from corrupting the color-similarity judgment. Experimental results show that, under changes in the pedestrian's motion behavior, occlusion, illumination changes, and scaling, the proposed scheme tracks better than the color-based IMMPF algorithm.
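For reference, the interaction-stage and mode-probability updates described above follow the standard IMM recursion (cf. [9], [16]). The equations below are only a sketch of that standard recursion, with r motion models, Markov transition probabilities p_ij, and the likelihood Λ_j(k) standing in for the maximum color weight that the thesis computes from model j's particles; the thesis's exact formulation may differ.

\mu_{i|j}(k-1 \mid k-1) = \frac{p_{ij}\,\mu_i(k-1)}{\bar{c}_j}, \qquad \bar{c}_j = \sum_{i=1}^{r} p_{ij}\,\mu_i(k-1)

\mu_j(k) = \frac{\Lambda_j(k)\,\bar{c}_j}{\sum_{l=1}^{r} \Lambda_l(k)\,\bar{c}_l}

Here \mu_i(k-1) is the mode probability at the previous time instant, \mu_{i|j} is the mixing probability used to form the mixed prior of model j, and \mu_j(k) is the updated mode probability after the correction stage.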
English Abstract:
Object tracking is widely used in computer vision applications, and pedestrian tracking is particularly important in visual surveillance systems. However, static or dynamic objects in the environment may frequently affect the motion model of the pedestrian or lead to occlusions. Moreover, scaling is inevitable as the pedestrian approaches or moves away from the camera. Therefore, this thesis proposes to combine an interacting multiple model particle filter (IMMPF) with SURF-feature-matching-based target detection for pedestrian tracking. The mode probability of each motion model is updated using the state with the maximum color weight of that model, computed in the correction stage of the particle filter, which improves the accuracy of the mode probability. The mixing probability is updated with the previous mode probabilities in the interaction stage of the IMMPF; this yields a closer approximation of the mixed a priori probability distribution of the pedestrian's state and thus improves prediction accuracy. To improve tracking accuracy, the proposed scheme computes the color similarity with the target template for the estimated state of each motion model, for the overall estimated state of the IMMPF, and for neighborhoods of varying size centered on the SURF-matched keypoints. Finally, the target appearance model is updated only when the appearance similarity warrants it. Experimental results show that the proposed scheme outperforms the color-based IMMPF algorithm.
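As an illustration of the SURF-matching and color-similarity step described in the abstracts, the following minimal Python/OpenCV sketch scores candidate boxes of several sizes centered on matched keypoints by the Bhattacharyya distance between HSV color histograms. It is not the thesis's code: the hessianThreshold of 400, the Lowe ratio of 0.7, the histogram bins, and the box sizes are illustrative assumptions, and SURF requires the non-free opencv-contrib module.

# Minimal sketch (not the thesis code): generate candidate boxes centered on
# SURF keypoints matched against the target template and keep the box whose
# HSV color histogram is closest to the template's (Bhattacharyya distance).
# Requires opencv-contrib-python built with the non-free SURF module.
import cv2
import numpy as np

def hsv_hist(patch_bgr, bins=(8, 8, 8)):
    """Normalized HSV color histogram of a BGR image patch."""
    hsv = cv2.cvtColor(patch_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1, 2], None, list(bins),
                        [0, 180, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()

def surf_matched_centers(template_bgr, frame_bgr, ratio=0.7):
    """Frame locations whose SURF descriptors match the template (Lowe ratio test)."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)   # illustrative threshold
    gray_t = cv2.cvtColor(template_bgr, cv2.COLOR_BGR2GRAY)
    gray_f = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    kp_t, des_t = surf.detectAndCompute(gray_t, None)
    kp_f, des_f = surf.detectAndCompute(gray_f, None)
    if des_t is None or des_f is None:
        return []
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_t, des_f, k=2)
    return [kp_f[m[0].trainIdx].pt for m in matches
            if len(m) == 2 and m[0].distance < ratio * m[1].distance]

def best_candidate(frame_bgr, template_bgr, centers,
                   box_sizes=((48, 96), (64, 128))):
    """Candidate box (x, y, w, h) with the smallest Bhattacharyya distance to the template."""
    ref = hsv_hist(template_bgr)
    img_h, img_w = frame_bgr.shape[:2]
    best, best_d = None, np.inf
    for cx, cy in centers:
        for w, h in box_sizes:                 # several sizes cope with scaling
            x0 = max(int(cx - w / 2), 0)
            y0 = max(int(cy - h / 2), 0)
            x1 = min(x0 + w, img_w)
            y1 = min(y0 + h, img_h)
            patch = frame_bgr[y0:y1, x0:x1]
            if patch.size == 0:
                continue
            d = cv2.compareHist(ref, hsv_hist(patch), cv2.HISTCMP_BHATTACHARYYA)
            if d < best_d:
                best, best_d = (x0, y0, x1 - x0, y1 - y0), d
    return best, best_d

In the proposed scheme, such candidate scores would be weighed together with the per-model particle filter estimates and the overall IMMPF estimate before the final state is chosen and the template is optionally updated.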
Abstract (Chinese) I
Abstract (English) II
Acknowledgements IV
Table of Contents V
List of Figures VII
List of Tables XI
Chapter 1 Introduction 1
1.1 Preface 1
1.2 Research Motivation 1
1.3 Research Method 2
1.4 Thesis Organization 3
Chapter 2 Current Object Tracking Techniques Based on the Interacting Multiple Model 4
2.1 Bayesian Filter 4
2.2 Current Techniques in the Interacting Multiple Model Kalman Filter 6
2.2.1 Kalman Filter (KF) 6
2.2.2 Interacting Multiple Model Kalman Filter 7
2.3 Current Techniques in the Interacting Multiple Model Particle Filter 9
2.3.1 Particle Filter (PF) 9
2.3.2 Interacting Multiple Model Particle Filter 13
2.4 Summary 15
Chapter 3 Current Object Tracking Techniques for Partial Occlusion 16
3.1 Visual Tracking Using the Interacting Multiple Model 16
3.2 Visual Tracking Using Feature Matching 18
3.3 Occlusion Handling 20
3.4 Summary 22
Chapter 4 The Proposed Pedestrian Tracking Scheme 24
4.1 System Architecture 25
4.2 Visual Tracking Using the Interacting Multiple Model Particle Filter 26
4.3 Similarity Judgment 30
4.4 State Estimation 32
4.5 Template Update 43
4.6 Summary 44
Chapter 5 Experimental Results and Discussion 45
5.1 Experimental Parameters and Test Sequence Specifications 45
5.2 Experimental Results of the Tracking System 47
5.2.1 Tracking Accuracy in Terms of Root Mean Square Error 48
5.2.2 Tracking Accuracy in Terms of Overlap Ratio 56
5.2.3 Computational Complexity of the System 63
5.3 Summary 65
Chapter 6 Conclusion and Future Work 66
References 67
[1] Y. Hua, K. Alahari, C. Schmid, "Occlusion and motion reasoning for long-term tracking," in Proc. European Conference on Computer Vision, Vol. 8694, pp. 172-187, Sep. 2014.
[2] S. Kwak, W. Nam, B. Han, J. H. Han, "Learning occlusion with likelihoods for visual tracking," in Proc. IEEE International Conference on Computer Vision, pp. 1551-1558, Nov. 2011.
[3] C. Ma, X. Yang, C. Zhang, M. Yang, "Long-term correlation tracking," in Proc. IEEE International Conference on Computer Vision and Pattern Recognition, pp. 5388-5396, Jun. 2015.
[4] Y. Wu, T. Yu, G. Hua, "Tracking appearances with occlusions," in Proc. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Vol. 1, pp. 789-795, Jun. 2003.
[5] W. Bouachir, G. Bilodeau, "Structure-aware keypoint tracking for partial occlusion handling," in Proc. IEEE International Winter Conference on Applications of Computer Vision, pp. 877-884, Mar. 2014.
[6] W. Kloihofer, M. Kampel, "Interest Point Based Tracking," in Proc. IEEE International Conference on Pattern Recognition, pp. 3549-3552, Aug. 2010.
[7] S. Gao, Z. Han, D. Doermann, J. Jiao, "Depth Structure Association for RGB-D Multi-Target Tracking," in Proc. IEEE International Conference on Pattern Recognition, pp. 4152-4157, Aug. 2014.
[8] H. Bay, A. Ess, T. Tuytelaars, L. Van Gool, "SURF: speeded up robust features," Computer Vision and Image Understanding, Vol. 110, No. 3, pp. 346-359, Jun. 2008.
[9] Y. Boers, J. N. Driessen, "Interacting multiple model particle filter," IEE Proceedings - Radar, Sonar and Navigation, Vol. 150, No. 8, pp. 344-349, Oct. 2003.
[10] A. Yilmaz, O. Javed, M. Shah, "Object tracking: A survey," ACM Computing Surveys, Vol. 38, No. 4, Article 13, Dec. 2006.
[11] N. J. Gordon, D. J. Salmond, A. F. M. Smith, "Novel approach to nonlinear/non-Gaussian Bayesian state estimation," IEE Proceedings F - Radar and Signal Processing, Vol. 140, pp. 107-113, Apr. 1993.
[12] G. Welch, G. Bishop, "An introduction to the Kalman filter," Technical Report TR 95-041, University of North Carolina, Department of Computer Science, 1995.
[13] Z. Jiang, D. Q. Huynh, W. Moran, S. Challa, "Tracking pedestrians using smoothed colour histograms in an interacting multiple model framework," in Proc. IEEE International Conference on Image Processing, pp. 2313-2316, Sep. 2011.
[14] M. S. Arulampalam, S. Maskell, N. Gordon, T. Clapp, "A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking," IEEE Trans. Signal Processing, Vol. 50, No. 2, pp. 174-188, Feb. 2002.
[15] K. Nummiaro, E. Koller-Meier, L. Van Gool, "An adaptive color-based particle filter," Image and Vision Computing, Vol. 21, No. 1, pp. 99-110, Jan. 2003.
[16] H. A. P. Blom, Y. Bar-Shalom, "The Interacting Multiple Model Algorithm for Systems with Markovian Switching Coefficients," IEEE Trans. Automatic Control, Vol. 33, No. 8, pp. 780-783, Aug. 1988.
[17] J. Wang, D. Zhao, W. Gao, S. Shan, "Interacting multiple model particle filter to adaptive visual tracking," in Proc. IEEE International Conference on Image and Graphics, pp. 568-571, Dec. 2004.
[18] D. Ta, W. Chen, N. Gelfand, K. Pulli, "SURFTrac: Efficient tracking and continuous object recognition using local feature descriptors," in Proc. IEEE International Conference on Computer Vision and Pattern Recognition, pp. 2937-2944, Jun. 2009.
[19] Z. Qi, R. Ting, F. Husheng, Z. Jinlin, "Particle Filter Object Tracking Based on Harris-SIFT Feature Matching," in Proc. International Workshop on Information and Electronics Engineering, Vol. 29, pp. 924-929, Jan. 2012.
[20] X. Lu, J. Zhang, L. Song, R. Lei, H. Lu, N. Ling, "Particle filter vehicle tracking based on SURF feature matching," IEEJ Journal of Industry Applications, Vol. 3, No. 2, pp. 182-191, Mar. 2014.
[21] K. Ratnayake, M. Amer, "Object tracking with adaptive motion modeling of particle filter and support vector machines," in Proc. IEEE International Conference on Image Processing, pp. 1140-1144, Sep. 2015.
[22] M. Yazdian-Dehkordi, Z. Azimifar, "Adaptive visual target detection and tracking using incremental appearance learning," in Proc. IEEE International Conference on Image Processing, pp. 1041-1045, Sep. 2015.
[23] S. Shantaiya, K. Verma, K. Mehta, "Multiple object tracking using Kalman filter and optical flow," European Journal of Advances in Engineering and Technology, 2015.
[24] X. Lu, J. Zhang, L. Song, R. Lei, H. Lu, N. Ling, "Person Tracking with Partial Occlusion Handling," in Proc. IEEE International Workshop on Signal Processing Systems, pp. 14-16, Oct. 2015.
[25] J.-Y. Lu, Y.-C. Wei, C.-W. Tang, "Visual tracking using compensated motion model for mobile cameras," in Proc. IEEE International Conference on Image Processing, pp. 489-492, Sep. 2011.
[26] J. Ferryman, A. Shahrokni, "PETS2009: Dataset and challenge," in Proc. IEEE International Workshop on Performance Evaluation of Tracking and Surveillance, pp. 1-6, Dec. 2009.
[27] A. Milan, L. Leal-Taixé, I. Reid, S. Roth, K. Schindler, "MOT16: A Benchmark for Multi-Object Tracking," in Proc. IEEE International Conference on Computer Vision and Pattern Recognition, May 2016.
[28] CAVIAR dataset: http://homepages.inf.ed.ac.uk/rbf/CAVIARDATA1/
[29] Y. Wu, J. Lim, M. Yang, "Online object tracking: A benchmark," in Proc. IEEE International Conference on Computer Vision and Pattern Recognition, pp. 2411-2418, Jun. 2013.
[30] C. Bao, Y. Wu, H. Ling, H. Ji, "Real Time Robust L1 Tracker Using Accelerated Proximal Gradient Approach," in Proc. IEEE International Conference on Computer Vision and Pattern Recognition, pp. 1830-1837, Jun. 2012.
[31] H. Ling, L1_APG (MATLAB, ~40M with data), code implementing the L1-APG tracker of [30]: http://www.dabi.temple.edu/~hbling/code_data.htm