Efficient mean-shift tracking via a new similarity measure

Title: Efficient mean-shift tracking via a new similarity measure
Publication Type: Conference Papers
Year of Publication: 2005
Authors: Yang C, Duraiswami R, Davis LS
Conference Name: Computer Vision and Pattern Recognition, 2005. CVPR 2005. IEEE Computer Society Conference on
Date Published: 2005/06
Keywords: Bhattacharyya coefficient; colour histograms; fast Gauss transform; feature extraction; frame-rate tracking; Gaussian processes; image colour analysis; image matching; image sequences; kernel density estimates; Kullback-Leibler divergence; mean-shift tracking algorithm; nonparametric tracking; sample-based similarity measures; similarity measures; spatial-feature spaces
Abstract:

The mean shift algorithm has achieved considerable success in object tracking due to its simplicity and robustness. It finds local maxima of a similarity measure between the color histograms or kernel density estimates of the model and target image. The most commonly used similarity measures are the Bhattacharyya coefficient and the Kullback-Leibler divergence. In practice, these approaches face three difficulties. First, the spatial information of the target is lost when the color histogram is employed, which precludes the application of more elaborate motion models. Second, the classical similarity measures are not very discriminative. Third, the sample-based classical similarity measures require a calculation that is quadratic in the number of samples, making real-time performance difficult. To deal with these difficulties we propose a new, simple-to-compute and more discriminative similarity measure in spatial-feature spaces. The new similarity measure allows the mean shift algorithm to track more general motion models in an integrated way. To reduce the computational complexity to linear order, we employ the recently proposed improved fast Gauss transform. This leads to a very efficient and robust nonparametric spatial-feature tracking algorithm. The algorithm is tested on several image sequences and shown to achieve robust and reliable frame-rate tracking.

DOI: 10.1109/CVPR.2005.139
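
The abstract describes a sample-based similarity measure defined directly in a joint spatial-feature space, whose naive evaluation is quadratic in the number of samples and which the paper accelerates to linear time with the improved fast Gauss transform. The Python sketch below illustrates only the naive quadratic form of such a measure, using a product of Gaussian kernels over pixel positions and color features; the kernel form, bandwidths, and function names are illustrative assumptions, not the paper's exact definition, and the fast Gauss transform acceleration is not reproduced.

```python
import numpy as np

def sample_similarity(px, pf, qx, qf, sigma_x=10.0, sigma_f=20.0):
    """Naive O(m*n) sample-based similarity in a joint spatial-feature space.

    px, qx : (m, 2) and (n, 2) pixel coordinates of model and candidate samples
    pf, qf : (m, d) and (n, d) feature vectors (e.g. RGB colors) of the samples
    sigma_x, sigma_f : spatial and feature bandwidths (illustrative values only)
    """
    # Pairwise squared distances in the spatial domain, shape (m, n)
    dx = ((px[:, None, :] - qx[None, :, :]) ** 2).sum(axis=-1)
    # Pairwise squared distances in the feature domain, shape (m, n)
    df = ((pf[:, None, :] - qf[None, :, :]) ** 2).sum(axis=-1)
    # Product of Gaussian kernels, averaged over all m*n sample pairs
    k = np.exp(-dx / (2.0 * sigma_x ** 2)) * np.exp(-df / (2.0 * sigma_f ** 2))
    return k.mean()

# Hypothetical usage: a slightly shifted, slightly recolored candidate should
# score higher than an unrelated patch of the same size.
rng = np.random.default_rng(0)
model_xy = rng.uniform(0, 32, (200, 2))
model_rgb = rng.uniform(0, 255, (200, 3))
near_xy, near_rgb = model_xy + 1.0, model_rgb + 5.0
far_xy = rng.uniform(0, 32, (200, 2))
far_rgb = rng.uniform(0, 255, (200, 3))

print(sample_similarity(model_xy, model_rgb, near_xy, near_rgb))
print(sample_similarity(model_xy, model_rgb, far_xy, far_rgb))
```

The double sum over all sample pairs is what makes direct evaluation quadratic in the number of samples; the paper's approach replaces this with an improved fast Gauss transform evaluation that scales linearly, which is what enables frame-rate tracking.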