Rendering localized spatial audio in a virtual auditory space

Title: Rendering localized spatial audio in a virtual auditory space
Publication Type: Journal Article
Year of Publication: 2004
Authors: Zotkin DN, Duraiswami R, Davis LS
Journal: IEEE Transactions on Multimedia
Volume: 6
Issue: 4
Pagination: 553–564
Date Published: 2004/08
ISSN: 1520-9210
Keywords: 3-D audio processing, Audio databases, audio signal processing, audio user interfaces, augmented reality, data sonification, Digital signal processing, head related transfer functions, head-related transfer function, Interpolation, Layout, perceptual user interfaces, Real time systems, Rendering (computer graphics), Scattering, spatial audio, Transfer functions, User interfaces, virtual audio scene rendering, virtual auditory spaces, virtual environments, Virtual reality, virtual reality environments
Abstract

High-quality virtual audio scene rendering is required for emerging virtual and augmented reality applications, perceptual user interfaces, and sonification of data. We describe algorithms for the creation of virtual auditory spaces by rendering cues that arise from anatomical scattering, environmental scattering, and dynamical effects. We use a novel way of personalizing the head-related transfer functions (HRTFs) from a database, based on anatomical measurements. We present details of the algorithms for HRTF interpolation, room impulse response creation, HRTF selection from a database, and audio scene presentation. Our system runs in real time on an office PC without specialized DSP hardware.

DOI: 10.1109/TMM.2004.827516
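As an illustrative aside, the HRTF interpolation mentioned in the abstract can be sketched in a few lines. This is not the paper's algorithm (those details are in the full text); it is a minimal toy example that assumes an HRTF magnitude set sampled at a handful of azimuths and blends the two nearest measured responses with inverse-distance weights, a common simple interpolation scheme.

```python
import numpy as np

# Hypothetical toy HRTF set: magnitude responses measured at a few azimuths.
# Real HRTF databases sample many directions over the full sphere; this
# sketch uses a 1-D azimuth grid only.
measured_az = np.array([0.0, 30.0, 60.0, 90.0])        # degrees
rng = np.random.default_rng(0)
hrtf_mag = rng.uniform(0.5, 1.5, size=(4, 128))        # 128 frequency bins

def interpolate_hrtf(target_az, az_grid, hrtfs):
    """Blend the two measured directions nearest to target_az,
    weighting each by the inverse of its angular distance."""
    idx = np.argsort(np.abs(az_grid - target_az))[:2]  # two nearest directions
    d = np.abs(az_grid[idx] - target_az)
    if d[0] == 0.0:                                    # exact match: no blending
        return hrtfs[idx[0]]
    w = (1.0 / d) / np.sum(1.0 / d)                    # weights sum to 1
    return w[0] * hrtfs[idx[0]] + w[1] * hrtfs[idx[1]]

# Midway between the 30- and 60-degree measurements, the two weights
# are equal, so the result is the average of those two responses.
h = interpolate_hrtf(45.0, measured_az, hrtf_mag)
```

A real system would interpolate over both azimuth and elevation (and handle phase, not just magnitude), but the weighting idea carries over directly.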