Author: 謝宗良
Author (English): Tzon-Liang Shieh
Title: 整合多重線索的視訊追蹤法
Title (English): Visual Tracking Using Multiple Cues
Advisor: 藍呂興
Advisor (English): Leu-Shing Lan
Degree: Master's
Institution: National Yunlin University of Science and Technology
Department: Graduate School of Electronic and Information Engineering
Discipline: Engineering
Field: Electrical and Computer Engineering
Document type: Academic thesis
Year of publication: 2009
Graduation academic year: 97 (2008-2009)
Language: Chinese
Number of pages: 120
Keywords (Chinese): 均值移動 (Mean Shift), 視訊追蹤 (Video Tracking), 光流 (Optical Flow)
Keywords (English): Video Tracking, Mean Shift, Optical Flow
In the field of computer vision, research on video object tracking has long received considerable attention, and it is widely applied in security surveillance systems, video compression, automated factory production, and related areas. Among video object tracking methods, Mean Shift is one of the fastest and most effective; it usually tracks an object by its color features. However, when the color distributions of the object and the background are similar, color-based tracking lacks robustness. This thesis proposes several improvements to the Mean Shift tracking algorithms of Oshima [27], which combines color features with optical flow, and of She [28], which combines color with edge features: (1) merging the weight distributions of multiple features in proportion to a separability parameter, and (2) combining the similarity functions using feature density distribution weights. Finally, the feasibility of the improved methods is evaluated on standard test video sequences.
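For context, the kernel-based mean shift tracker of Comaniciu et al. [26] measures the similarity between the target model q and a candidate p(y) with the Bhattacharyya coefficient and derives per-pixel weights from it. A minimal sketch of how a separability-weighted fusion of per-cue weights could look (the exact fusion rule is the thesis's contribution; the convex combination below is only an illustration) is:

\[
\rho(y) = \sum_{u=1}^{m} \sqrt{p_u(y)\, q_u}, \qquad
w_i^{(c)} = \sum_{u=1}^{m} \sqrt{\frac{q_u^{(c)}}{p_u^{(c)}(y)}}\,\delta\!\left[b^{(c)}(x_i) - u\right],
\]
\[
w_i = \sum_{c \in \{\text{color},\,\text{flow},\,\text{edge}\}} \alpha_c\, w_i^{(c)},
\qquad \sum_c \alpha_c = 1,
\]

where b^(c)(x_i) maps pixel x_i to its histogram bin for cue c, and each alpha_c would be chosen in proportion to how well that cue separates the target from the background.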
Recently mean shift (MS) has become a prevailing technique for video object tracking. In its original form, the color feature is usually employed as the primary visual cue for tracking. This thesis presents a new mean shift scheme that utilizes multiple visual cues, namely color, optical flow, and edge orientation histograms. Two different methods for fusing the visual cues are proposed. A variety of experiments were conducted to evaluate the performance of the new MS scheme.
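As a rough illustration of cue fusion in a mean shift tracker (not the algorithm proposed in the thesis), the sketch below blends an HSV back-projection map with a dense optical-flow magnitude map before handing the result to OpenCV's built-in mean shift; the blending weight alpha stands in for the separability parameter described above and is assumed fixed here.

```python
# Illustrative sketch only: fuse a color back-projection with an optical-flow
# magnitude map and run OpenCV's mean shift on the fused probability image.
import cv2
import numpy as np

def fused_mean_shift_step(prev_frame, frame, track_window, roi_hist, alpha=0.6):
    """One tracking step on `frame`, given the previous frame, the current
    search window (x, y, w, h), and a hue histogram of the target.
    `alpha` is an assumed, fixed stand-in for the separability parameter."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Color cue: hue back-projection against the target histogram.
    color_map = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1).astype(np.float32)

    # Motion cue: dense optical-flow magnitude between consecutive frames.
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    flow_map = cv2.magnitude(flow[..., 0], flow[..., 1])

    # Normalize both cues to a common range and blend them with alpha.
    color_map = cv2.normalize(color_map, None, 0, 255, cv2.NORM_MINMAX)
    flow_map = cv2.normalize(flow_map, None, 0, 255, cv2.NORM_MINMAX)
    fused = cv2.convertScaleAbs(alpha * color_map + (1.0 - alpha) * flow_map)

    # Shift the search window to the mode of the fused probability image.
    term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
    _, track_window = cv2.meanShift(fused, track_window, term_crit)
    return track_window
```

A fuller version would re-estimate alpha each frame from how well each cue separates the target window from its surrounding background, which is the role the separability parameter plays in the abstract.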
Table of Contents

Abstract (in Chinese)
Abstract (in English)
Table of Contents
List of Tables
List of Figures

Chapter 1  Introduction
1.1  Preface
1.2  Research Motivation and Objectives
1.3  Features and Contributions of This Study
1.4  Thesis Organization

Chapter 2  Literature Review and Related Work
2.1  Mean Shift
2.2  Kalman Filter
2.3  Optical Flow
2.4  Active Contour
2.5  Simulated Annealing Algorithm
2.6  Particle Filter (PF)

Chapter 3  Mean Shift for Video Tracking
3.1  Framework
3.1.1  Assumptions of Mean Shift
3.2  Feature Density Distribution of the Image Template
3.2.1  Image Feature Quantization
3.2.2  Back Projection
3.2.3  Target Model
3.2.4  Target Candidate
3.2.5  Kernel Function
3.3  Bhattacharyya Coefficient (Similarity Measure)
3.4  Feature Density Distribution Weights
3.5  Mean Shift System Flow
3.5.1  Advantages and Disadvantages of Mean Shift
3.5.2  Mean Shift System Flowchart

Chapter 4  Multi-Feature Extraction
4.1  Color Features
4.1.1  Color Spaces
4.1.2  Selecting a Suitable Color Space
4.1.3  Color Feature Selection and Quantization
4.2  Optical Flow Feature Extraction and Quantization
4.2.1  Lucas-Kanade Method
4.2.2  Aperture Problem
4.2.3  Optical Flow Feature Quantization
4.2.4  Optical Flow Template Region Expansion
4.2.5  Correction of the Optical Flow Target Model Probability Density
4.2.6  Combining Color and Optical Flow Features
4.2.7  Proposed Improvement
4.3  Edge Detection
4.3.1  Sobel Operator
4.3.2  Edge Feature Extraction and Quantization
4.3.3  Bhattacharyya Coefficient (B.C.) Similarity Combination Method
4.3.4  Proposed Improvement

Chapter 5  Experimental Results and Discussion
5.1  Experimental Results of the Improved Probability Distribution Weight Fusion Method
5.2  Experimental Results of the Improved Feature Similarity Function Fusion Method

Chapter 6  Conclusions and Future Work
Conclusions
Future Work
References
[1] K. Fukunaga and L. D. Hostetler, "The estimation of the gradient of a density function, with applications in pattern recognition," IEEE Trans. Information Theory, vol. 21, pp. 32-40, 1975.
[2] B. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision," in IJCAI81, pp. 674-679, 1981.
[3] J. Y. Bouguet, "Pyramidal Implementation of the Lucas Kanade Feature Tracker: Description of the Algorithm," Technical Report, Intel Corporation Microprocessor Research Labs.
[4] J.L. Barron, Fleet, D.J., and Beauchemin, S. (1994) Performance of optical flow techniques. International Journal of Computer Vision (IJCV1994), 12(1):43-77. A copy of Technical Report, TR299 is also available.
[5] S.S. Beauchemin and J.L. Barron (1996) The Computation of Optical Flow. ACM Computing Surveys (ACMCS1995) 27(3):433-467
[6] Horn B.K.P. and Schunck B.G. (1981) Determining optical flow. AI 17, pp.185-204
[7] Kelson R. T. Aires , Andre M. Santana , Adelardo A. D. Medeiros, Optical flow using color information: preliminary results, Proceedings of the 2008 ACM symposium on Applied computing, March 16-20, 2008, Fortaleza, Ceara, Brazil
[8] Sobel, I. and Feldman, G., "A 3x3 Isotropic Gradient Operator for Image Processing," presented at a talk at the Stanford Artificial Intelligence Project in 1968, unpublished but often cited; orig. in Duda, R. and Hart, P., Pattern Classification and Scene Analysis, John Wiley and Sons, 1973, pp. 271-272.
[10] G. Bradski and A. Kaehler, "Learning OpenCV," O'Reilly Media, Inc., September 2008.
[11] Dorin Comaniciu and Peter Meer, "Robust Analysis of Feature Spaces: Color Image Segmentation," IEEE, Los Alamitos, USA, 1997, pp. 750-755.
[12] Bradski, G. R., "Real time face and object tracking as a component of a perceptual user interface," Applications of Computer Vision, Proceedings, Fourth IEEE Workshop on, Oct. 1998, pp. 214-219.
[13] Canny, J., A Computational Approach To Edge Detection, IEEE Trans. Pattern Analysis and Machine Intelligence, 8:679-714, 1986.
[14] L. Roberts Machine Perception of 3-D Solids, Optical and Electro-optical Information Processing, MIT Press 1965.
[15] John G. Allen, Richard Y. D. Xu, Jesse S. Jin: Object Tracking Using CamShift Algorithm and Multiple Quantized Feature Spaces.
[16] Erik Cuevas, Daniel Zaldivar, and Raul Rojas, "Kalman filter for vision tracking," 10th August 2005.
[17] Kalman, R.E. (1960). "A new approach to linear filtering and prediction problems". Journal of Basic Engineering 82 (1): 35–45. Retrieved on 2008-05-03.
[18] M. Yamamoto, "A General Aperture Problem for Direct Estimation of 3-D Motion Parameters," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 11, no. 5, May 1989.
[19] C. Harris and M. Stephens (1988). "A combined corner and edge detector".Proceedings of the 4th Alvey Vision Conference: pp 147--151.
[20] J. Shi and C. Tomasi, "Good Features to Track," 9th IEEE Conference on Computer Vision and Pattern Recognition, Springer, June 1994.
[21] H. Moravec (1980). "Obstacle Avoidance and Navigation in the Real World by a Seeing Robot Rover". Tech Report CMU-RI-TR-3 Carnegie-Mellon University, Robotics Institute.
[22] C. Harris and M. Stephens (1988). "A combined corner and edge detector" Proceedings of the 4th Alvey Vision Conference: pp 147--151.
[23] D. Comaniciu, V. Ramesh, and P. Meer, "Real-Time Tracking of Non-Rigid Objects using Mean Shift," Best Paper Award, IEEE Conf. Computer Vision and Pattern Recognition (CVPR'00), Hilton Head Island, South Carolina, Vol. 2, pp. 142-149, 2000.
[24] Swain, M. J. and Ballard, D. H. (1991), “Color indexing”, Int. J. Comput. Vis., Vol. 7,no. 1, pp 11-32.
[25] Bernt Schiele and James L. Crowley. Recognition without Correspondence Using Multidimensional Receptive Field Histograms. In International Journal of Computer Vision 36 (1), pp. 31-50, January 2000.
[26] Comaniciu, D., Ramesh, V., and Meer, P., "Kernel-based object tracking," IEEE Trans. on Pattern Analysis and Machine Intelligence, 2003, 25(5): 564-577.
[27] N. Oshima, T. Saitoh, and R. Konishi, "Real Time Mean Shift Tracking using Optical Flow Distribution," SICE-ICASE International Joint Conference, 2006, pp. 4316-4320.
[28] Kai She, George Bebis, Haisong Gu, and Ronald Miller, "Vehicle Tracking Using On-Line Fusion of Color and Shape Features," 2004 IEEE Intelligent Transportation Systems Conference, Washington, D.C., USA, October 3-6, 2004.
[29] Metropolis, N., Rosenbluth, A., Rosenbluth, M. N., Teller, A., and Teller, E., Journal of Chemical Physics, Vol. 21, pp. 1087-1092, 1953.
[30] Kirkpatrick, S., Gelatt, C. D., Jr., and Vecchi, M. P., "Optimization by Simulated Annealing," Science, Vol. 220, No. 4598, pp. 671-680, 1983.
[31] Kirkpatrick, S., "Optimization by Simulated Annealing: Quantitative Studies," Journal of Statistical Physics, Vol. 34(5/6), 1984.
[32] Cerny, V., "Thermodynamic Approach to the Traveling Salesman Problem: An Efficient Simulation Algorithm," Journal of Optimization Theory and Applications, Vol. 45, pp. 41-51, 1985.
[33] D.Jepsen and C. Gelatt, Jr., “Macro Placement by Monte Carlo Annealing,” Proc. Int Conf. on Computer Design, pp.495-498, Nov 1983.
[34] Weller, S.W., “Simulated Annealing: What good is it,” SPIE Current Development in Opt. Eng., Vol.818, pp.265-274, 1987.
[35] C. E. Erdem, A. M. Tekalp, and B. Sankur, "Video object tracking with feedback of performance measures," IEEE Transactions on Circuits and Systems for Video Technology, 13(4), 2003.
[36] Y. Fu, A. T. Erdem, and A. M. Tekalp, "Tracking visible boundary of objects using occlusion adaptive motion snake," IEEE Trans. Image Processing, vol. 9, pp. 2051-2060, Dec. 2000.
[37] S. Sun, D. R. Haynor, and Y. Kim, "Semiautomatic video object segmentation using VSnake," IEEE Trans. Circuits Syst. Video Technol., vol. 13, no. 1, pp. 75-82, Jan. 2003.
[38] Hee-Gu Kang, Daijin Kim “Real-time multiple people tracking using competitive condensation.” Department of Computer Science and Engineering, Pohang University of Science and Technology, San 31, Hyoja-Dong, Nam-Gu, Pohang, 790-784, Korea, accepted 17 December 2004.
[39] Michael Isard, Andrew Blake “CONDENSATION—Conditional Density Propagation for Visual Tracking” Department of Engineering Science, University of Oxford, Oxford OX1 3PJ, UK, accepted March 3, 1997.
[40] Freeman, W.T., Anderson, D.B., Beardsley, P.A., Dodge, C.N., Roth, M., Weissman, C.D., Yerazunis, W.S., Kage, H., Kyuma, K., Miyake, Y. and Tanaka, K., “Computer Vision for Interactive Computer Graphics”, IEEE Computer Graphics and Applications, Vol. 18, No. 3, pp. 42-53, May-June 1998