%0 Journal Article
%J Image Processing, IEEE Transactions on
%D 2008
%T Activity Modeling Using Event Probability Sequences
%A Cuntoor, N. P.
%A Yegnanarayana, B.
%A Chellappa, Rama
%K Carnegie Mellon University Credo Intelligence, Inc., Motion Capture dataset; Transportation Security Administration; University Central Florida; airport tarmac surveillance dataset; anomaly detection; event probability sequence; event representation; hidden Markov model; human action dataset; human activity recognition; Models, Biological; Models, Computer-Assisted; Models, Statistical; Motor Activity; Movement; Pattern Recognition, Automated; Reproducibility of Results; Sensitivity and Specificity; Video Recording
%X Changes in motion properties of trajectories provide useful cues for modeling and recognizing human activities. We associate an event with significant changes that are localized in time and space, and represent activities as a sequence of such events. The localized nature of events allows for detection of subtle changes or anomalies in activities. In this paper, we present a probabilistic approach for representing events using the hidden Markov model (HMM) framework. Using trained HMMs for activities, an event probability sequence is computed for every motion trajectory in the training set. It reflects the probability of an event occurring at every time instant. Though the parameters of the trained HMMs depend on viewing direction, the event probability sequences are robust to changes in viewing direction. We describe sufficient conditions for the existence of view invariance. The usefulness of the proposed event representation is illustrated using activity recognition and anomaly detection. Experiments using the indoor University of Central Florida human action dataset, the Carnegie Mellon University Credo Intelligence, Inc., Motion Capture dataset, and the outdoor Transportation Security Administration airport tarmac surveillance dataset show encouraging results.
%B Image Processing, IEEE Transactions on
%V 17
%P 594 - 607
%8 2008/04//
%@ 1057-7149
%G eng
%N 4
%R 10.1109/TIP.2008.916991