ROS package for the Active Monitoring Process Version 1.0




A snapshot of a robot system running this package


Abstract

Here a technique is developed to monitor an object under manipulation. At the heart of the technique lies a novel active tracking and segmentation method that integrates fixation-based segmentation and stochastic tracking into a dynamic loop. An example implementation of the algorithm under ROS (Robot Operating System) with real-time performance is provided here. Please note that, for generality and real-time performance, depth information and optical flow are not included in this version of the ROS package.
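To make the dynamic loop concrete, the minimal sketch below shows its structure: a tracker predicts where the fixated object has moved, a segmentation seeded around that prediction extracts the object there, and the extracted segment refreshes the tracker's appearance model. The sketch uses OpenCV's meanShift tracking and grabCut segmentation purely as stand-ins for the package's color-based particle filter and fixation-based active segmentation; it illustrates only the loop, not the package's actual code.

    #include <opencv2/opencv.hpp>

    int main()
    {
      cv::VideoCapture cap(0);                       // any live camera
      if (!cap.isOpened()) return 1;

      cv::Mat frame, hsv, hist;
      cap >> frame;

      // initial "fixation": a small window around the image centre
      cv::Rect window(frame.cols / 2 - 40, frame.rows / 2 - 40, 80, 80);

      // appearance model: hue histogram of the fixated region
      cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
      int channels = 0, histSize = 30;
      float hueRange[] = {0, 180};
      const float* ranges = hueRange;
      cv::Mat roi = hsv(window);
      cv::calcHist(&roi, 1, &channels, cv::Mat(), hist, 1, &histSize, &ranges);
      cv::normalize(hist, hist, 0, 255, cv::NORM_MINMAX);

      while (true)
      {
        cap >> frame;
        if (frame.empty()) break;
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);

        // 1) tracking: predict where the fixated object moved
        cv::Mat backproj;
        cv::calcBackProject(&hsv, 1, &channels, hist, backproj, &ranges);
        cv::meanShift(backproj, window,
                      cv::TermCriteria(cv::TermCriteria::EPS | cv::TermCriteria::COUNT, 10, 1));

        // 2) segmentation: extract the object inside the tracked window
        cv::Mat mask, bgModel, fgModel;
        cv::grabCut(frame, mask, window, bgModel, fgModel, 1, cv::GC_INIT_WITH_RECT);
        cv::Mat segment = (mask == cv::GC_PR_FGD) | (mask == cv::GC_FGD);

        // 3) model update: the fresh segment refines the appearance model,
        //    closing the loop between tracking and segmentation
        if (cv::countNonZero(segment) > 0)
        {
          cv::calcHist(&hsv, 1, &channels, segment, hist, 1, &histSize, &ranges);
          cv::normalize(hist, hist, 0, 255, cv::NORM_MINMAX);
        }

        cv::imshow("segment", segment);
        if (cv::waitKey(30) == 27) break;            // ESC quits
      }
      return 0;
    }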

The ROS (Robot Operating System) package

Relevant publication: Yezhou Yang, Yiannis Aloimonos, and Cornelia Fermuller. "Detection of Manipulation Action Consequences (MAC)." IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Portland, Oregon, 2013.

ZIP file: ROS PACKAGE V1.0


Special thanks to the authors below for sharing the active segmentation and particle-filter-based tracking code:

Mishra, Ajay, Yiannis Aloimonos, and Cornelia Fermuller. "Active segmentation for robotics." IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2009.

Nummiaro, Katja, Esther Koller-Meier, and Luc Van Gool. "A color-based particle filter." First International Workshop on Generative-Model-Based Vision, Copenhagen, Denmark, 2002.


How to use it

1) First, install ROS (this package has only been tested on Diamondback). For more information about ROS, please refer HERE.

2) Install the ROS package rosTrackSeg from the .zip file, check its dependencies, and then run "rosmake".

3) Publish a live camera or Kinect input on the topic /camera/rgb/image_color, or play back a recorded .oni file and publish its live stream on the same topic (see the publisher sketch after this list).

4) run "rosrun rosTrackSeg trackSeg". There will be a window with name "select a fixation point", jump out. Select a fixation point within the object under manipulation and then press ENTER.

5) The active monitoring process now starts, and you can see the result in the label window. You are welcome to modify our program to save the segmentation result (see the saving sketch after this list).
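
As a concrete example for step 3, the following sketch publishes a live webcam stream on /camera/rgb/image_color using image_transport and cv_bridge, along the lines of the standard image_transport publisher tutorial. The node name image_publisher and the use of the newer cv_bridge::CvImage API are assumptions; on older ROS releases (the package was tested on Diamondback) the cv_bridge API differs, so treat this as a starting point rather than as part of the package.

    #include <ros/ros.h>
    #include <image_transport/image_transport.h>
    #include <cv_bridge/cv_bridge.h>
    #include <sensor_msgs/Image.h>
    #include <std_msgs/Header.h>
    #include <opencv2/highgui/highgui.hpp>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "image_publisher");
      ros::NodeHandle nh;
      image_transport::ImageTransport it(nh);
      image_transport::Publisher pub = it.advertise("/camera/rgb/image_color", 1);

      cv::VideoCapture cap(0);                 // default webcam
      if (!cap.isOpened()) return 1;

      ros::Rate rate(30);                      // roughly 30 fps
      while (nh.ok())
      {
        cv::Mat frame;
        cap >> frame;
        if (!frame.empty())
        {
          // wrap the OpenCV image in a sensor_msgs/Image with BGR8 encoding
          sensor_msgs::ImagePtr msg =
              cv_bridge::CvImage(std_msgs::Header(), "bgr8", frame).toImageMsg();
          pub.publish(msg);
        }
        ros::spinOnce();
        rate.sleep();
      }
      return 0;
    }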

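For step 5, saving the result typically amounts to one extra call in the code that already displays the label window. A hypothetical helper (the function name and file pattern are illustrative, not part of the package) might look like:

    #include <sstream>
    #include <iomanip>
    #include <opencv2/highgui/highgui.hpp>

    // Write one segmentation/label image to disk; PNG keeps the labels lossless.
    void saveLabelImage(const cv::Mat& label, int frameIdx)
    {
      std::ostringstream name;
      name << "segmentation_" << std::setw(5) << std::setfill('0') << frameIdx << ".png";
      cv::imwrite(name.str(), label);
    }
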
Questions? Please contact yzyang "at" cs dot umd dot edu