|
[1]C. S. Regazzoni, G. Fabri, and G. Vernazza, ed., Advanced Video-based Surveillance Systems, Kluwer Academic Publishers, 1999. [2]G. L. Foresti,, P. Mahonen, and C. S. Regazzoni ed., Multimedia Video-based Surveillance Systems: Requirements, Issues and Solutions, Kluwer Academic Publishers, 2000. [3]C. S. Regazzoni, V. Ramesh, and G. L. Foresti, “Scanning the issue/technology- Special issue on video communication, processing and understanding for third generation video surveillance systems,” Proc. IEEE, vol. 89, no. 10, pp. 1355-1367, Oct. 2001. [4]W. Hu, T. Tan, L. Wang, and S. Maybank, “A survey on visual surveillance of object motion and behaviors,” IEEE Trans. Systems, Man, and Cybernetics- Part C: Applications and Reviews, vol. 34, no. 3, pp. 334-352, Aug. 2004. [5]D. Comaniciu, V. Ramesh, and P. Meer, “Kernel-based object tracking,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 25, no. 5, pp. 564-577 May 2003. [6]D. Comaniciu, V. Ramesh, and P. Meer, “Real-time tracking of non-rigid objects using mean shift,” in Proc. IEEE Int. Conf. Computer Vision & Pattern Recognition, vol.2, pp.142-149, June 2000 [7]D. Comaniciu and V. Ramesh, “Mean shift and optimal prediction for efficient object tracking,” in Proc. IEEE Int. Conf. Image Processing, vol. 3, pp. 70-73, Sept. 2000, Vancouver, Canada. [8]K. Nummiaro, E. Koller-Meier, and L. Van Gool, “An adaptive color-based particle filter,” Image and Vision Computing, vol. 21, pp. 99-110, 2003. [9]M. Isard and A. Blake, “Conditional density propagation for visual tracking,” Int. J. Computer Vision vol. 29, no. 1, pp. 5–28, 1998. [10]C. Chang and R. Ansari, “Kernel particle filter for visual tracking,” IEEE Signal Processing Letters, vol. 12, no. 3, pp.242-245, Mar. 2005. [11]E. Maggio and A. Cavallaro “Hybrid particle filter and mean shift tracker with adaptive transition model” in Proc. IEEE Int. Conf. Acoustics, Speech, and Signal Processing, pp. 3698-3701, Mar. 2005, Philadelphia, USA. [12]K. Deguchi, O. Kawanaka, and T. Okatani “Object tracking by the mean shift of regional color distribution combined with the particle-filter algorithm,” in Proc. IEEE Int. Conf. Pattern Recognition, vol. 3, pp. 506-509, Aug. 2004, Cambridge, UK. [13]C. Shan, Y. Wei, T. Tan and F. Ojardias, F “Real time hand tracking by combining particle filtering and mean shift,” in Proc. IEEE Int. Conf. Automatic Face and Gesture Recognition, pp. 669-674, May 2004, Seoul, Korea. [14]T.-L. Liu and H.-T. Chen, “Real-time tracking using trust-region methods,” IEEE Trans. Pattern Anal. Machine Intell., vol. 26, no. 3, pp. 397-402, Mar. 2004. [15]S.-C. Park, S.-H. Lim, B.-K. Sin, and S.-W. Lee “Tracking non-rigid objects using probabilistic Hausdorff distance matching,” Pattern Recognition, In press, available online April 2005. [16]J. Gao, A. Kosaka, and A.-C. Kak “A multi-Kalman filtering approach for video tracking of human-delineated objects in cluttered environments,” Computer Vision and Image Understanding, vol. 99, no. 1, pp. 1-57, July 2005. [17]D. Freedman and T. Zhang “Active contours for tracking distributions” IEEE Trans. Image Processing, vol. 13, no. 4, pp. 518-526, Apr. 2004. [18]P. KaewTrakulPong and R. Bowden “A real time adaptive visual surveillance system for tracking low-resolution colour targets in dynamically changing scenes” Image and Vision Computing, vol. 21, pp.913-929, March 2003 [19]A. Cavallaro, O. Steiger, and T. Ebrahimi “Tracking video objects in cluttered background,” IEEE Trans. Circuits Syst. Video Technol., vol. 15, no. 4, Apr. 2005. [20]T. Zhao and R. Nevatia “Tracking multiple humans in complex situations” IEEE Trans. Pattern Anal. Machine Intell., vol. 26, no. 9, pp.1208-1221, Sept. 2004. [21]F. Porikli and O. Tuzel, Human body tracking by adaptive background models and mean-shift analysis, Technical Report, Mitsubishi Electric Research Laboratory, July 2003. [22]T. Tamura et al., “An ambulatory fall monitor for the elderly,” in Proc. IEEE Int. Conf. EMBS, pp. 2608-2610, July 2000., Chicago, IL. [23]G. Williams et al., “A smart fall & activity monitor for telecare applications,” in Proc. IEEE Int. Conf. EMBS, vol. 20, no. 3, pp. 1151-1154, 1998. [24]N. Noury, T. Herve, V. Rialle, G. Virone, E. Mercier,G. Morey, A. Moro, and T. Porcheron, “Monitoring behavior in home using a smart fall sensor and position sensors,” in Proc. IEEE Int. Conf. Microtechnologies in Medicine and Biology, pp. 607-610, Oct. 2000, Lyon, France. [25]N. Noury, “A smart sensor for the remote follow up of activity and fall detection of the elderly,” in Proc. IEEE Int. Conf. Microtechnologies in Medicine and Biology, pp. 314-317, May 2002, Lyon, France. [26]C. Kim and J.-N. Hwang, “Object-based video abstraction for video surveillance systems,” IEEE Trans. Circuits Syst. Video Technol., vol. 12, no. 12, pp. 1128-1138, Dec. 2002. [27]I. Haritaoglu, D. Harwood, and L. S. Davis, “W4: Real-time surveillance of people and their activities,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 22, no. 8, pp. 809-830, Aug. 2000. [28]K. Yoon, D.F. Dementhon, and D. Doermann, “Event detection from MPEG video in the compressed domain,” in Proc. IEEE Int. Conf. Pattern Recognition, pp. 1819-1822, Barcelona, Spain, 2000. [29]H. Nait-Charif and S. J. McKenna, “Activity summarisation and fall detection in a supportive home environment,” in Proc. IEEE Int. Conf. Pattern Rcognition, vol. 4, pp. 23-26, Aug. 2004, Cambridge UK. [30]C.-W. Lin, Z.-H. Ling, Y.-C. Chang, and C.-J. Kuo, “Compressed-domain fall incident detection for intelligent home surveillance” in Proc. IEEE Int. Symp. Circuits Syst., pp. 3781-3784, May 2005, Kobe, Japan. [31]D.J. Bullock and J.S. Zelek, “Real-time tracking for visual interface applications in cluttered and occluding situations” Image and Vision Computing, vol. 22, pp. 1083-1091, Mar. 2004 [32]C. Schell, S.P. Linder, and A.R. Zeidler, “Tracking highly maneuverable targets with unknown behavior” in Proc. IEEE, vol. 92, no. 3, pp. 558-574, March 2004 [33]Y. Su, M.-T. Sun, and V. Hsu, “Global motion estimation from coarsely sampled motion vector field and the applications,” in Proc. IEEE Int. Symp. Circuits Syst., vol. 2, pp. 628-631, Mar. 2003, Bangkok, Thailand. [34]H. Zen, T. Hasegawa, and S. Ozawa, “Moving object detection from MPEG coded picture,” in Proc. IEEE Int. Conf. on Image Processing, Vol. IV, pp.25-29, Oct. 1999, Kobe, Japan. [35]H.-L. Eng and K. –K. Ma, “Spatiotemporal segmentation of moving video objects over MPEG compressed domain,” in Proc. IEEE Int. Conf. Multimedia and Expo., vol.3, pp.1531 –1534, 2000, New York. [36]M.-L. Jamrozik and M.-H. Hayes, “A compressed domain video object segmentation system,” in Proc. IEEE Int. Conf. Image Processing, vol.1, pp.113-116, Sep. 2002, New York, USA [37]T. Kailath, “The divergence and Bhattacharyya distance measures in signal selection,” IEEE Trans. Comm. Technology, vol. 15, pp. 52-60, 1967. [38]M. Wollborn and R. Mech, “Refined procedure for objective evaluation of video object generation algorithms,” Doc. ISO/IEC JTC1/SC29/WG11 M3448, Mar. 1998.
|