Probabilistic fusion-based parameter estimation for visual tracking

Title: Probabilistic fusion-based parameter estimation for visual tracking
Publication Type: Journal Article
Year of Publication: 2009
Authors: Han B, Davis LS
Journal: Computer Vision and Image Understanding
Pagination: 435-445
Date Published: 2009/04
ISSN: 1077-3142
Keywords: Component-based tracking, Density-based fusion, Mean-shift, visual tracking

In object tracking, visual features may not be discriminative enough to estimate high-dimensional motion parameters accurately, and complex motion estimation is computationally expensive due to its large search space. To tackle these problems, a reasonable strategy is to track small components within the target independently in lower-dimensional motion parameter spaces (e.g., translation only) and then estimate the overall high-dimensional motion (e.g., translation, scale, and rotation) by statistically integrating the individual tracking results. Although tracking each component in a lower-dimensional space is more reliable and faster, it is not trivial to combine the local motion information and estimate the global parameters robustly, because the individual component motions are frequently inconsistent. We propose a robust fusion algorithm that estimates the complex motion parameters using variable-bandwidth mean-shift. By employing correlation-based uncertainty modeling and fusion of the individual components, motion parameters that are robust to outliers can be detected with the variable-bandwidth density-based fusion (VBDF) algorithm. In addition, we describe a method to update the target appearance model of each component adaptively based on the component motion consistency. We present various tracking results and compare the performance of our algorithm with others on real video sequences.
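To illustrate the fusion step described above, the sketch below implements variable-bandwidth mean-shift mode seeking on a Gaussian mixture built from per-component motion estimates (each with its own mean and covariance), in the spirit of VBDF. This is a minimal illustration, not the paper's implementation: the function name `vbdf_mode`, the equal component priors, and the toy 2-D translation estimates are all assumptions made here for the example.

```python
import numpy as np

def vbdf_mode(means, covs, x0, iters=100, tol=1e-8):
    """Seek a mode of the Gaussian mixture (1/N) * sum_i N(x; means[i], covs[i])
    with variable-bandwidth mean-shift: each component contributes with its
    own covariance (bandwidth), so uncertain components pull less."""
    x = np.asarray(x0, dtype=float)
    invs = [np.linalg.inv(C) for C in covs]
    norms = [1.0 / np.sqrt(np.linalg.det(2.0 * np.pi * C)) for C in covs]
    for _ in range(iters):
        # Posterior weight of each component at the current point x.
        w = np.array([n * np.exp(-0.5 * (x - m) @ Ci @ (x - m))
                      for m, Ci, n in zip(means, invs, norms)])
        w /= w.sum()
        # Data-weighted harmonic bandwidth and mean-shift update:
        # x_new = (sum_i w_i C_i^{-1})^{-1} sum_i w_i C_i^{-1} m_i
        Hinv = sum(wi * Ci for wi, Ci in zip(w, invs))
        rhs = sum(wi * Ci @ m for wi, Ci, m in zip(w, invs, means))
        x_new = np.linalg.solve(Hinv, rhs)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```

Started near the cluster of consistent component estimates, the iteration converges to the dominant mode and effectively ignores an outlying component, since the outlier's Gaussian weight at the mode is negligible.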