TY - CONF
T1 - Imaging concert hall acoustics using visual and audio cameras
T2 - IEEE International Conference on Acoustics, Speech and Signal Processing, 2008. ICASSP 2008
Y1 - 2008
A1 - O'Donovan, A.
A1 - Duraiswami, Ramani
A1 - Zotkin, Dmitry N.
KW - Acoustic imaging
KW - acoustic intensity images
KW - acoustic measurement
KW - Acoustic measurements
KW - Acoustic scattering
KW - acoustic signal processing
KW - acoustical camera
KW - acoustical scene analysis
KW - acquired audio registration
KW - audio cameras
KW - audio signal processing
KW - CAMERAS
KW - central projection
KW - Computer vision
KW - Educational institutions
KW - HUMANS
KW - image registration
KW - Image segmentation
KW - imaging concert hall acoustics
KW - Layout
KW - microphone arrays
KW - panoramic mosaiced visual image
KW - Raman scattering
KW - reverberation
KW - room acoustics
KW - spherical microphone array beamformer
KW - spherical microphone arrays
KW - video image registration
KW - visual cameras
AB - Using a real-time audio camera, which uses the output of a spherical microphone array beamformer steered in all directions to create central-projection acoustic intensity images, we present a technique to measure the acoustics of rooms and halls. A panoramic mosaiced visual image of the space is also created. Since both the visual and the audio camera images are central projections, registration of the acquired audio and video images can be performed using standard computer vision techniques. We describe the technique and apply it to examine the relation between acoustical features and architectural details of the Dekelbaum concert hall at the Clarice Smith Performing Arts Center in College Park, MD.
JA - IEEE International Conference on Acoustics, Speech and Signal Processing, 2008. ICASSP 2008
PB - IEEE
SN - 978-1-4244-1483-3
M3 - 10.1109/ICASSP.2008.4518852
ER -
TY - CONF
T1 - Fast Multipole Accelerated Boundary Elements for Numerical Computation of the Head Related Transfer Function
T2 - IEEE International Conference on Acoustics, Speech and Signal Processing, 2007. ICASSP 2007
Y1 - 2007
A1 - Gumerov, Nail A.
A1 - Duraiswami, Ramani
A1 - Zotkin, Dmitry N.
KW - Acceleration
KW - Acoustic measurements
KW - Acoustic scattering
KW - audio signal processing
KW - boundary element formulation
KW - Boundary element method
KW - Boundary element methods
KW - boundary-elements methods
KW - Costs
KW - Ear
KW - Fast Multipole Method
KW - Frequency
KW - Head related transfer function
KW - HUMANS
KW - Irrigation
KW - iterative methods
KW - multipole accelerated boundary elements
KW - multipole based iterative preconditioned Krylov solution
KW - numerical computation
KW - Reciprocity
KW - Transfer functions
AB - The numerical computation of head-related transfer functions (HRTFs) has been attempted by a number of researchers. However, the cost of the computations has meant that usually only low frequencies can be computed, and the computations take inordinately long. Because of this, comparisons of the computations with measurements are also difficult. We present a fast multipole based, preconditioned iterative Krylov solution of a boundary element formulation of the problem, and use a new formulation that enables the reciprocity technique to be accurately employed. This allows the calculation to proceed to higher frequencies and larger discretizations. Preliminary results of the computations and of comparisons with measured HRTFs are presented.
JA - IEEE International Conference on Acoustics, Speech and Signal Processing, 2007. ICASSP 2007
PB - IEEE
VL - 1
SN - 1-4244-0727-3
M3 - 10.1109/ICASSP.2007.366642
ER -
TY - JOUR
T1 - Interactive sonification of choropleth maps
JF - IEEE Multimedia
Y1 - 2005
A1 - Zhao, Haixia
A1 - Smith, B. K.
A1 - Norman, K.
A1 - Plaisant, Catherine
A1 - Shneiderman, Ben
KW - audio signal processing
KW - audio user interfaces
KW - Auditory (non-speech) feedback
KW - auditory information
KW - cartography
KW - choropleth maps
KW - data collections
KW - decision making
KW - Evaluation
KW - Feedback
KW - georeferenced data
KW - Guidelines
KW - handicapped aids
KW - Hardware
KW - HUMANS
KW - information resources
KW - interaction style
KW - Interactive sonification
KW - interactive systems
KW - Navigation
KW - nonspeech audio
KW - problem solving
KW - Problem-solving
KW - sound
KW - universal usability
KW - US Government
KW - User interfaces
KW - vision impairments
KW - World Wide Web
AB - Auditory information is an important channel for the visually impaired. Effective sonification (the use of non-speech audio to convey information) promotes equal working opportunities for people with vision impairments by helping them explore data collections for problem solving and decision making. Interactive sonification systems can make georeferenced data accessible to people with vision impairments. The authors compare methods for using sound to encode georeferenced data patterns and for navigating maps.
VL - 12
SN - 1070-986X
CP - 2
M3 - 10.1109/MMUL.2005.28
ER -
TY - JOUR
T1 - Rendering localized spatial audio in a virtual auditory space
JF - IEEE Transactions on Multimedia
Y1 - 2004
A1 - Zotkin, Dmitry N.
A1 - Duraiswami, Ramani
A1 - Davis, Larry S.
KW - 3-D audio processing
KW - Audio databases
KW - audio signal processing
KW - audio user interfaces
KW - augmented reality
KW - data sonification
KW - Digital signal processing
KW - head related transfer functions
KW - head-related transfer function
KW - Interpolation
KW - Layout
KW - perceptual user interfaces
KW - Real time systems
KW - Rendering (computer graphics)
KW - Scattering
KW - spatial audio
KW - Transfer functions
KW - User interfaces
KW - virtual audio scene rendering
KW - virtual auditory spaces
KW - virtual environments
KW - Virtual reality
KW - virtual reality environments
AB - High-quality virtual audio scene rendering is required for emerging virtual and augmented reality applications, perceptual user interfaces, and sonification of data. We describe algorithms for the creation of virtual auditory spaces by rendering cues that arise from anatomical scattering, environmental scattering, and dynamical effects. We use a novel way of personalizing the head-related transfer functions (HRTFs) from a database, based on anatomical measurements. Details of algorithms for HRTF interpolation, room impulse response creation, HRTF selection from a database, and audio scene presentation are presented. Our system runs in real time on an office PC without specialized DSP hardware.
VL - 6
SN - 1520-9210
CP - 4
M3 - 10.1109/TMM.2004.827516
ER -
TY - CONF
T1 - Efficient evaluation of reverberant sound fields
T2 - 2001 IEEE Workshop on the Applications of Signal Processing to Audio and Acoustics
Y1 - 2001
A1 - Duraiswami, Ramani
A1 - Gumerov, Nail A.
A1 - Zotkin, Dmitry N.
A1 - Davis, Larry S.
KW - AB algorithm
KW - acoustic signal processing
KW - architectural acoustics
KW - audio signal processing
KW - Computational modeling
KW - Computer interfaces
KW - Computer simulation
KW - Educational institutions
KW - image method
KW - image sources
KW - Impedance
KW - Laboratories
KW - microphone arrays
KW - multipole expansions
KW - Nails
KW - performance evaluation
KW - reverberant sound fields
KW - reverberation
KW - room reverberation
KW - simulation
KW - Simulations
KW - speedup
KW - virtual audio
AB - An image method due to Allen and Berkley (1979) is often used to simulate the effect of reverberation in rooms. This method is relatively expensive computationally. We present a fast method for conducting such simulations using multipole expansions. For M real and image sources and N evaluation points, the image method requires O(MN) operations, while our method achieves the calculation in O(M + N) operations, resulting in a substantial speedup. Applications of our technique are also expected in the simulation of virtual audio.
JA - 2001 IEEE Workshop on the Applications of Signal Processing to Audio and Acoustics
PB - IEEE
SN - 0-7803-7126-7
M3 - 10.1109/ASPAA.2001.969578
ER -
TY - CONF
T1 - Modeling the effect of a nearby boundary on the HRTF
T2 - 2001 IEEE International Conference on Acoustics, Speech, and Signal Processing. Proceedings (ICASSP '01)
Y1 - 2001
A1 - Gumerov, Nail A.
A1 - Duraiswami, Ramani
KW - Acoustic scattering
KW - acoustic signal processing
KW - acoustic wave reflection
KW - acoustic wave scattering
KW - architectural acoustics
KW - audio signal processing
KW - Biological system modeling
KW - boundary effect modeling
KW - Computer interfaces
KW - Ear
KW - Educational institutions
KW - Frequency
KW - Head related transfer function
KW - HRTF
KW - HUMANS
KW - infinite plane
KW - Laboratories
KW - Nails
KW - Raman scattering
KW - rigid surface
KW - room environment
KW - sound pressure level
KW - sound scattering
KW - spatial audio
KW - sphere
KW - spherical model
KW - Transfer functions
KW - wall influence
AB - Understanding and simplified modeling of the head-related transfer function (HRTF) holds the key to many applications in spatial audio. We develop an analytical solution to the problem of scattering of sound from a sphere in the vicinity of an infinite plane. Using this solution, we study the influence of a nearby rigid scattering surface on a spherical model for the HRTF.
JA - 2001 IEEE International Conference on Acoustics, Speech, and Signal Processing. Proceedings (ICASSP '01)
PB - IEEE
VL - 5
SN - 0-7803-7041-4
M3 - 10.1109/ICASSP.2001.940373
ER -
TY - CONF
T1 - Multimodal 3-D tracking and event detection via the particle filter
T2 - IEEE Workshop on Detection and Recognition of Events in Video, 2001. Proceedings
Y1 - 2001
A1 - Zotkin, Dmitry N.
A1 - Duraiswami, Ramani
A1 - Davis, Larry S.
KW - algorithms
KW - APPROACH
KW - audio data collection
KW - audio signal processing
KW - Bayesian inference
KW - Bayesian methods
KW - belief networks
KW - CAMERAS
KW - capture
KW - conversation
KW - echo
KW - Educational institutions
KW - Event detection
KW - event occurrence
KW - filtering theory
KW - flying echo locating bat behaviour
KW - Image motion analysis
KW - inference mechanisms
KW - Laboratories
KW - microphone arrays
KW - moving object tracking
KW - moving participants
KW - moving prey
KW - multimodal 3D tracking
KW - multiple cameras
KW - Object detection
KW - particle filter
KW - Particle filters
KW - Particle tracking
KW - Robustness
KW - search
KW - smart video conferencing setup
KW - target tracking
KW - Teleconferencing
KW - tracking filters
KW - turn-taking detection
KW - video data collection
KW - video signal processing
AB - Determining the occurrence of an event is fundamental to developing systems that can observe and react to events. Often, this determination is based on collecting video and/or audio data and determining the state or location of a tracked object. We use Bayesian inference and the particle filter to track moving objects, using both video data obtained from multiple cameras and audio data obtained from arrays of microphones. The algorithms developed are applied to detecting events arising in two fields of application. In the first, the behavior of a flying echolocating bat as it approaches moving prey is studied, and the events of search, approach, and capture are detected. In the second application, we describe detection of turn-taking in a conversation between possibly moving participants recorded using a smart video conferencing setup.
JA - IEEE Workshop on Detection and Recognition of Events in Video, 2001. Proceedings
PB - IEEE
SN - 0-7695-1293-3
M3 - 10.1109/EVENT.2001.938862
ER -