%0 Conference Paper %B IEEE International Conference on Acoustics, Speech and Signal Processing, 2008. ICASSP 2008 %D 2008 %T Imaging concert hall acoustics using visual and audio cameras %A O'Donovan, A. %A Duraiswami, Ramani %A Zotkin, Dmitry N. %K Acoustic imaging %K acoustic intensity images %K acoustic measurement %K Acoustic measurements %K Acoustic scattering %K acoustic signal processing %K acoustical camera %K acoustical scene analysis %K acquired audio registration %K audio cameras %K audio signal processing %K CAMERAS %K central projection %K Computer vision %K Educational institutions %K HUMANS %K image registration %K Image segmentation %K imaging concert hall acoustics %K Layout %K microphone arrays %K panoramic mosaiced visual image %K Raman scattering %K reverberation %K room acoustics %K spherical microphone array beamformer %K spherical microphone arrays %K video image registration %K visual cameras %X Using a real-time audio camera that employs the output of a spherical microphone array beamformer, steered in all directions, to create central-projection acoustic intensity images, we present a technique to measure the acoustics of rooms and halls. A panoramic mosaiced visual image of the space is also created. Since both the visual and the audio camera images are central projections, registration of the acquired audio and video images can be performed using standard computer vision techniques. We describe the technique and apply it to examine the relation between acoustical features and architectural details of the Dekelbaum concert hall at the Clarice Smith Performing Arts Center in College Park, MD. %B IEEE International Conference on Acoustics, Speech and Signal Processing, 2008. ICASSP 2008 %I IEEE %P 5284 - 5287 %8 2008/// %@ 978-1-4244-1483-3 %G eng %R 10.1109/ICASSP.2008.4518852 %0 Conference Paper %B IEEE International Conference on Acoustics, Speech and Signal Processing, 2007. 
ICASSP 2007 %D 2007 %T Fast Multipole Accelerated Boundary Elements for Numerical Computation of the Head Related Transfer Function %A Gumerov, Nail A. %A Duraiswami, Ramani %A Zotkin, Dmitry N. %K Acceleration %K Acoustic measurements %K Acoustic scattering %K audio signal processing %K boundary element formulation %K Boundary element method %K Boundary element methods %K boundary-elements methods %K Costs %K Ear %K Fast Multipole Method %K Frequency %K Head related transfer function %K HUMANS %K Irrigation %K iterative methods %K multipole accelerated boundary elements %K multipole based iterative preconditioned Krylov solution %K numerical computation %K Reciprocity %K Transfer functions %X The numerical computation of head related transfer functions has been attempted by a number of researchers. However, the cost of the computations has meant that usually only low frequencies can be computed, and even then the computations take inordinately long. Because of this, comparisons of the computations with measurements are also difficult. We present a fast-multipole-based, iterative, preconditioned Krylov solution of a boundary element formulation of the problem, and use a new formulation that enables the reciprocity technique to be employed accurately. This allows the calculation to proceed to higher frequencies and larger discretizations. Preliminary results of the computations and of comparisons with measured HRTFs are presented. %B IEEE International Conference on Acoustics, Speech and Signal Processing, 2007. ICASSP 2007 %I IEEE %V 1 %P I-165 - I-168 %8 2007/04// %@ 1-4244-0727-3 %G eng %R 10.1109/ICASSP.2007.366642 %0 Journal Article %J IEEE Multimedia %D 2005 %T Interactive sonification of choropleth maps %A Zhao, Haixia %A Smith, B. K. %A Norman, K. 
%A Plaisant, Catherine %A Shneiderman, Ben %K audio signal processing %K audio user interfaces %K Auditory (non-speech) feedback %K auditory information %K cartography %K choropleth maps %K data collections %K decision making %K Evaluation %K Feedback %K georeferenced data %K Guidelines %K handicapped aids %K Hardware %K HUMANS %K information resources %K interaction style %K Interactive sonification %K interactive systems %K Navigation %K nonspeech audio %K problem solving %K Problem-solving %K sound %K universal usability %K US Government %K User interfaces %K vision impairments %K World Wide Web %X Auditory information is an important channel for the visually impaired. Effective sonification (the use of non-speech audio to convey information) promotes equal working opportunities for people with vision impairments by helping them explore data collections for problem solving and decision making. Interactive sonification systems can make georeferenced data accessible to people with vision impairments. The authors compare methods for using sound to encode georeferenced data patterns and for navigating maps. %B IEEE Multimedia %V 12 %P 26 - 35 %8 2005/06// %@ 1070-986X %G eng %N 2 %R 10.1109/MMUL.2005.28 %0 Journal Article %J IEEE Transactions on Multimedia %D 2004 %T Rendering localized spatial audio in a virtual auditory space %A Zotkin, Dmitry N. %A Duraiswami, Ramani %A Davis, Larry S. 
%K 3-D audio processing %K Audio databases %K audio signal processing %K audio user interfaces %K augmented reality %K data sonification %K Digital signal processing %K head related transfer functions %K head-related transfer function %K Interpolation %K Layout %K perceptual user interfaces %K Real time systems %K Rendering (computer graphics) %K Scattering %K spatial audio %K Transfer functions %K User interfaces %K virtual audio scene rendering %K virtual auditory spaces %K virtual environments %K Virtual reality %K virtual reality environments %X High-quality virtual audio scene rendering is required for emerging virtual and augmented reality applications, perceptual user interfaces, and sonification of data. We describe algorithms for creation of virtual auditory spaces by rendering cues that arise from anatomical scattering, environmental scattering, and dynamical effects. We use a novel way of personalizing the head related transfer functions (HRTFs) from a database, based on anatomical measurements. Details of algorithms for HRTF interpolation, room impulse response creation, HRTF selection from a database, and audio scene presentation are presented. Our system runs in real time on an office PC without specialized DSP hardware. %B IEEE Transactions on Multimedia %V 6 %P 553 - 564 %8 2004/08// %@ 1520-9210 %G eng %N 4 %R 10.1109/TMM.2004.827516 %0 Conference Paper %B Applications of Signal Processing to Audio and Acoustics, 2001 IEEE Workshop on the %D 2001 %T Efficient evaluation of reverberant sound fields %A Duraiswami, Ramani %A Gumerov, Nail A. %A Zotkin, Dmitry N. %A Davis, Larry S. 
%K AB algorithm %K acoustic signal processing %K architectural acoustics %K audio signal processing %K Computational modeling %K Computer interfaces %K Computer simulation %K Educational institutions %K image method %K image sources %K Impedance %K Laboratories %K microphone arrays %K multipole expansions %K Nails %K performance evaluation %K reverberant sound fields %K reverberation %K room reverberation %K simulation %K Simulations %K speedup %K virtual audio %X The image method due to Allen and Berkley (1979) is often used to simulate the effect of reverberation in rooms. This method is relatively expensive computationally. We present a fast method for conducting such simulations using multipole expansions. For M real and image sources and N evaluation points, the image method requires O(MN) operations, while our method achieves the calculation in O(M + N) operations, resulting in a substantial speedup. Applications of our technique are also expected in the simulation of virtual audio. %B Applications of Signal Processing to Audio and Acoustics, 2001 IEEE Workshop on the %I IEEE %P 203 - 206 %8 2001/// %@ 0-7803-7126-7 %G eng %R 10.1109/ASPAA.2001.969578 %0 Conference Paper %B 2001 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2001. Proceedings. (ICASSP '01) %D 2001 %T Modeling the effect of a nearby boundary on the HRTF %A Gumerov, Nail A. 
%A Duraiswami, Ramani %K Acoustic scattering %K acoustic signal processing %K acoustic wave reflection %K acoustic wave scattering %K architectural acoustics %K audio signal processing %K Biological system modeling %K boundary effect modeling %K Computer interfaces %K Ear %K Educational institutions %K Frequency %K Head related transfer function %K HRTF %K HUMANS %K infinite plane %K Laboratories %K Nails %K Raman scattering %K rigid surface %K room environment %K sound pressure level %K sound scattering %K spatial audio %K sphere %K spherical model %K Transfer functions %K wall influence %X Understanding and simplified modeling of the head related transfer function (HRTF) holds the key to many applications in spatial audio. We develop an analytical solution to the problem of scattering of sound from a sphere in the vicinity of an infinite plane. Using this solution, we study the influence of a nearby rigid scattering surface on a spherical model for the HRTF. %B 2001 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2001. Proceedings. (ICASSP '01) %I IEEE %V 5 %P 3337 - 3340 %8 2001/// %@ 0-7803-7041-4 %G eng %R 10.1109/ICASSP.2001.940373 %0 Conference Paper %B IEEE Workshop on Detection and Recognition of Events in Video, 2001. Proceedings %D 2001 %T Multimodal 3-D tracking and event detection via the particle filter %A Zotkin, Dmitry N. %A Duraiswami, Ramani %A Davis, Larry S. 
%K algorithms %K APPROACH %K audio data collection %K audio signal processing %K Bayesian inference %K Bayesian methods %K belief networks %K CAMERAS %K capture %K conversation %K echo %K Educational institutions %K Event detection %K event occurrence %K filtering theory %K flying echo locating bat behaviour %K Image motion analysis %K inference mechanisms %K Laboratories %K microphone arrays %K moving object tracking %K moving participants %K moving prey %K multimodal 3D tracking %K multiple cameras %K Object detection %K particle filter %K Particle filters %K Particle tracking %K Robustness %K search %K smart video conferencing setup %K target tracking %K Teleconferencing %K tracking filters %K turn-taking detection %K video data collection %K video signal processing %X Determining the occurrence of events is fundamental to developing systems that can observe and react to them. Often, this determination is based on collecting video and/or audio data and determining the state or location of a tracked object. We use Bayesian inference and the particle filter for tracking moving objects, using both video data obtained from multiple cameras and audio data obtained using arrays of microphones. The algorithms developed are applied to determining events arising in two fields of application. In the first, the behavior of a flying echolocating bat as it approaches moving prey is studied, and the events of search, approach, and capture are detected. In the second application, we describe detection of turn-taking in a conversation between possibly moving participants recorded using a smart video conferencing setup. %B IEEE Workshop on Detection and Recognition of Events in Video, 2001. Proceedings %I IEEE %P 20 - 27 %8 2001/// %@ 0-7695-1293-3 %G eng %R 10.1109/EVENT.2001.938862
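The last record above tracks moving objects with Bayesian inference and the particle filter. As a generic illustration of that machinery (a textbook bootstrap particle filter for a one-dimensional constant-velocity target, not the authors' multimodal audio/video implementation; all state models, noise levels, and particle counts here are illustrative assumptions):

```python
# Minimal bootstrap particle filter sketch for a 1-D constant-velocity target.
# NOT the paper's implementation; motion model, noise levels, and particle
# count are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

N = 2000          # number of particles
dt = 0.1          # time step
q, r = 0.5, 0.2   # process / measurement noise standard deviations

# State per particle: [position, velocity]; start from a diffuse prior.
particles = rng.normal(0.0, 1.0, size=(N, 2))
weights = np.full(N, 1.0 / N)

def step(particles, weights, z):
    """One predict-update-resample cycle given a noisy position measurement z."""
    # Predict: propagate each particle through the motion model plus noise.
    particles[:, 0] += particles[:, 1] * dt + rng.normal(0.0, q * dt, N)
    particles[:, 1] += rng.normal(0.0, q, N)
    # Update: reweight by the Gaussian measurement likelihood, then normalize.
    weights = weights * np.exp(-0.5 * ((z - particles[:, 0]) / r) ** 2)
    weights = weights / weights.sum()
    # Resample (multinomial) when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)
    return particles, weights

# Track a target moving at constant velocity 1.0 from noisy position readings.
true_pos = 0.0
for _ in range(50):
    true_pos += 1.0 * dt
    z = true_pos + rng.normal(0.0, r)
    particles, weights = step(particles, weights, z)

# Posterior mean position should land close to the true position.
estimate = np.average(particles[:, 0], weights=weights)
```

The papers' event-detection step (search/approach/capture, or turn-taking) would then be layered on top of such tracked-state estimates.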