next up previous
Next: Mean Shift Analysis and Up: Research Summary Previous: Research Summary

Improved Fast Gauss Transform

The fast multipole method has been called one of the ten most significant numerical algorithms discovered in the 20th century (alongside algorithms such as the fast Fourier transform). It won its inventors, V. Rokhlin and L. Greengard, the 2001 Steele Prize, and also earned Greengard an ACM dissertation award. The algorithm allows the product of certain dense matrices with a vector to be evaluated approximately (to a specified precision) in $ O(N \log N)$ operations, where direct multiplication requires $ O(N^2)$ operations. For very large problems the savings in time and memory are substantial, and they make practical more powerful modeling approaches that may have been discarded as computationally infeasible in the past.

The fast Gauss transform (FGT) introduced by Greengard and Strain is an important variant of the more general fast multipole method. While the fast multipole method has been successfully applied in many areas of mathematics and physics, the fast Gauss transform is widely used in computer vision and pattern recognition. The original FGT has been successfully applied to problems in low dimensional spaces, but it suffers from two serious defects in higher dimensional spaces: the number of terms in its Hermite expansion grows exponentially with the dimensionality, and its uniform box subdivision of space becomes inefficient as the dimensionality increases. To overcome these difficulties, we proposed a new multivariate Taylor expansion whose number of terms grows asymptotically only polynomially with the dimensionality. To adaptively fit the density of the points, we subdivide the space with the $ k$-center algorithm (a.k.a. the farthest-point algorithm), whose complexity is $ O(n \log k)$. The result of the $ k$-center algorithm is shown in the Java demonstration below. We also give a relatively simple error bound for our improved fast Gauss transform.

The improved fast Gauss transform has been successfully applied to many problems, including the mean shift algorithm, real-time object tracking, efficient kernel density estimation, kernel machines, and image segmentation.

The C++ code of the improved fast Gauss transform is available without fee for any non-commercial, academic purpose. Please read the license first if you want to download the code (a download link appears at the end of the license). Please send email to yangcj@cs.umd.edu, ramani@umiacs.umd.edu, and lsd@umiacs.umd.edu if you have any questions.



The Java implementation of the k-center algorithm (1.1)



References

  1. C. Yang, R. Duraiswami and L. Davis. Efficient Kernel Machines Using the Improved Fast Gauss Transform. In Advances in Neural Information Processing Systems 16, 2004.
  2. C. Yang, R. Duraiswami, N. Gumerov and L. Davis. Improved Fast Gauss Transform and Efficient Kernel Density Estimation, In IEEE International Conference on Computer Vision, pages 464-471, 2003.
  3. C. Yang, R. Duraiswami, A. Elgammal and L. Davis. Real-Time Kernel-Based Tracking in Joint Feature-Spatial Spaces. Technical Report CS-TR-4567, Dept. of Computer Science, University of Maryland, College Park, 2004.
  4. C. Yang, R. Duraiswami and N. Gumerov. Improved Fast Gauss Transform. Technical Report CS-TR-4495, Dept. of Computer Science, University of Maryland, College Park, also submitted to SIAM J. Sci. Comput. for publication, 2003.
Changjiang Yang 2005-03-01