Ramezani, R., Angelov, P. and Zhou, X. (2008) A fast approach to novelty detection in video streams using recursive density estimation. In: 4th International IEEE Conference on Intelligent Systems (IS '08). IEEE, Bulgaria, pp. 14-2 - 14-7. ISBN 978-1-4244-1739-1
Abstract
Video-based surveillance and security have become extremely important in the 21st century for human safety, counter-terrorism, traffic control, etc. Visual novelty detection and tracking are key elements of such activities. Current state-of-the-art approaches often suffer from high computational and memory-storage costs and from not being fully automated (they usually require a human operator in the loop). This paper introduces a new approach to the problem of novelty detection in video streams that is based on recursive, and therefore computationally efficient, density estimation with a Cauchy type of kernel (as opposed to the usually used Gaussian one). The idea of the proposed approach stems from the recently introduced evolving clustering approach, eClustering, and is suitable for on-line and real-time applications in fully autonomous and unsupervised systems, either as a stand-alone novelty detector or for priming a tracking algorithm. The proposed approach has an evolving property: it can gradually update the background model and the criteria for detecting novelty by unsupervised on-line learning. It is faster by an order of magnitude than the well-known kernel density estimation (KDE) method for background subtraction, while having adaptive characteristics and not requiring any threshold to be pre-specified. Recursive expressions similar to those proposed in this paper can also be applied to image segmentation and to landmark recognition used for self-localization in robotics. If combined with real-time prediction using a Kalman filter or evolving Takagi-Sugeno fuzzy models, a fast and fully autonomous tracking system can be realized, with potential applications in surveillance and robotic systems. (c) IEEE Press.
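The recursive density estimation (RDE) idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the Cauchy-kernel density form used in Angelov's RDE work, with a recursively updated sample mean and mean scalar product so that no frame history is stored, and it uses the running mean density as an adaptive (non-pre-specified) novelty criterion, which is a simplified stand-in for the paper's actual test.

```python
import numpy as np

def rde_novelty(stream):
    """Flag novel samples in a data stream via recursive density estimation.

    Density of sample x_k uses a Cauchy-type kernel:
        D_k = 1 / (1 + ||x_k - mu_k||^2 + X_k - ||mu_k||^2)
    where mu_k (mean) and X_k (mean squared norm) are updated recursively,
    so memory cost is O(1) per stream, unlike windowed KDE.
    A sample is flagged novel when its density drops below the recursively
    tracked mean density (an illustrative adaptive criterion, not the
    paper's exact rule).
    """
    flags = []
    mu = None       # running mean of samples
    X = 0.0         # running mean of squared norms
    mean_D = 1.0    # running mean of densities (adaptive threshold)
    for k, x in enumerate(stream, start=1):
        x = np.asarray(x, dtype=float)
        if mu is None:
            # First sample initializes the background model.
            mu = x.copy()
            X = float(x @ x)
            flags.append(False)
            continue
        # Recursive updates: no past samples are stored.
        mu = ((k - 1) * mu + x) / k
        X = ((k - 1) * X + float(x @ x)) / k
        d2 = float((x - mu) @ (x - mu))
        D = 1.0 / (1.0 + d2 + X - float(mu @ mu))
        flags.append(D < mean_D)            # less dense than average => novel
        mean_D = ((k - 1) * mean_D + D) / k  # model evolves with every frame
    return flags

# Example: a stable "background" stream, then one outlying sample.
stream = [[0.0, 0.0]] * 20 + [[10.0, 10.0]]
print(rde_novelty(stream)[-1])  # the outlier is flagged as novel
```

Note how the background model keeps evolving even after a novelty is flagged, which mirrors the gradual, unsupervised on-line updating the abstract emphasizes.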