%0 Conference Paper
%B Computer Vision and Pattern Recognition, 2009. CVPR 2009. IEEE Conference on
%D 2009
%T Locally time-invariant models of human activities using trajectories on the Grassmannian
%A Turaga, Pavan
%A Chellappa, Rama
%K Grassmann manifold
%K time-varying linear dynamic system
%K local time-invariant model
%K parameter estimation
%K statistical analysis
%K statistical method
%K distance metrics
%K human activity analysis
%K human action recognition
%K activity-based summarization
%K image recognition
%K indexing
%K consumer content indexing
%K computer vision
%K video surveillance
%K surveillance
%K Estimation
%X Human activity analysis is an important problem in computer vision, with applications in surveillance and in the summarization and indexing of consumer content. Complex human activities are characterized by non-linear dynamics that make learning, inference, and recognition hard. In this paper, we consider the problem of modeling and recognizing complex activities that exhibit time-varying dynamics. To this end, we describe activities as outputs of linear dynamic systems (LDS) whose parameters vary with time, i.e., a time-varying linear dynamic system (TV-LDS). We discuss parameter estimation methods for this class of models under the assumption that the parameters are locally time-invariant. We then represent the space of LDS models as a Grassmann manifold, so that a TV-LDS model is defined as a trajectory on the Grassmann manifold. We show how trajectories on the Grassmannian can be characterized using distance metrics and statistical methods that reflect the underlying geometry of the manifold, resulting in more expressive and powerful models of complex human activities. We demonstrate the strength of the framework for activity-based summarization of long videos and recognition of complex human actions on two datasets.
%P 2435-2441
%8 2009/06
%G eng
%R 10.1109/CVPR.2009.5206710