Robust ego-motion estimation and 3D model refinement using depth based parallax model

Title: Robust ego-motion estimation and 3D model refinement using depth based parallax model
Publication Type: Conference Papers
Year of Publication: 2004
Authors: Agrawala AK, Chellappa R
Conference Name: Image Processing, 2004. ICIP '04. 2004 International Conference on
Date Published: 2004/10
Keywords: 3D model refinement; camera ego-motion estimation; coarse partial depth map refining; depth based parallax model; digital elevation map (DEM); eigenfunctions; eigenvalues; epipolar field; feature extraction; generalized eigen-value analysis; iterative methods; motion compensation; motion estimation; range-finding; surface parallax field
Abstract

We present an iterative algorithm for robustly estimating ego-motion and for refining and updating a coarse, noisy, and partial depth map, using a depth based parallax model and brightness derivatives extracted from an image pair. Given a coarse, noisy, and partial depth map acquired by a range-finder or obtained from a Digital Elevation Map (DEM), we first estimate the ego-motion by combining a global ego-motion constraint with a local brightness constancy constraint. Using the estimated camera motion and the available depth map estimate, the motion of the 3D points is compensated. The resulting surface parallax field is an epipolar field; since its direction is known from the previous motion estimates, we estimate its magnitude and use it to refine the depth map estimate. Instead of assuming a smooth parallax field or locally smooth depth models, we model the parallax magnitude locally using the depth map, formulate the problem as a generalized eigen-value analysis, and obtain better results. In addition, confidence measures for the depth estimates are provided, which can be used to remove regions with potentially incorrect depth estimates (and outliers) before robustly estimating the ego-motion in the next iteration. Results on both synthetic and real examples are presented.
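For intuition, the parallax-magnitude step described above can be sketched as follows: after motion compensation, brightness constancy along the known epipolar direction constrains a single scalar magnitude. The sketch below uses a plain least-squares solve over the whole image; the paper itself models the magnitude locally with the depth map and solves a generalized eigen-value problem, so this is a deliberately simplified illustration, and all function and variable names are our own.

```python
import numpy as np

def estimate_parallax_magnitude(I1, I2, ex, ey):
    """Estimate a scalar parallax magnitude gamma along a known epipolar
    direction (ex, ey) from brightness derivatives of an image pair.

    Brightness constancy with flow (u, v) = gamma * (ex, ey) gives
        g * gamma + It = 0,   where g = Ix*ex + Iy*ey,
    which we solve in least squares over the interior pixels.
    (Illustrative simplification of the paper's local, depth-based,
    generalized eigen-value formulation.)
    """
    # Central-difference spatial derivatives; np.roll wraps at the border,
    # so the border rows/columns are discarded below.
    Ix = (np.roll(I1, -1, axis=1) - np.roll(I1, 1, axis=1)) / 2.0
    Iy = (np.roll(I1, -1, axis=0) - np.roll(I1, 1, axis=0)) / 2.0
    It = I2 - I1

    g = Ix * ex + Iy * ey            # gradient projected on the epipolar line
    g, It = g[1:-1, 1:-1], It[1:-1, 1:-1]

    conf = np.sum(g * g)             # confidence: epipolar gradient energy
    gamma = -np.sum(g * It) / conf
    return gamma, conf
```

Low `conf` flags regions where the image gradient is nearly orthogonal to the epipolar direction, mirroring the abstract's use of confidence measures to discard unreliable depth estimates before the next ego-motion iteration.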

DOI: 10.1109/ICIP.2004.1421606