Rendering localized spatial audio in a virtual auditory space

Title: Rendering localized spatial audio in a virtual auditory space
Publication Type: Journal Articles
Year of Publication: 2004
Authors: Zotkin DN, Duraiswami R, Davis LS
Journal: IEEE Transactions on Multimedia
Volume: 6
Issue: 4
Pagination: 553-564
Date Published: 2004/08
ISSN Number: 1520-9210
Keywords: 3-D audio; audio signal processing; augmented reality; auditory scene rendering; data sonification; head related transfer functions; perceptual user interfaces; rendering (computer graphics); spatial audio; virtual auditory spaces; virtual environments; virtual reality
Abstract

High-quality virtual audio scene rendering is required for emerging virtual and augmented reality applications, perceptual user interfaces, and sonification of data. We describe algorithms for the creation of virtual auditory spaces by rendering cues that arise from anatomical scattering, environmental scattering, and dynamical effects. We use a novel way of personalizing head-related transfer functions (HRTFs), selecting them from a database on the basis of anatomical measurements. We present details of the algorithms for HRTF interpolation, room impulse response creation, HRTF selection from a database, and audio scene presentation. Our system runs in real time on an office PC without specialized DSP hardware.

DOI: 10.1109/TMM.2004.827516
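
As a rough illustration of the kind of HRTF-based binaural rendering the abstract describes, the sketch below selects the measured head-related impulse response (HRIR) pair closest to a requested source azimuth and convolves a mono signal with it to produce a stereo output. It is a minimal sketch only: the HRIR grid is synthetic placeholder data, and the function names (nearest_hrir, render_source) and nearest-neighbor selection are assumptions for illustration; the paper's actual system additionally performs HRTF personalization from anatomical measurements, HRTF interpolation, room impulse response synthesis, and dynamic cue rendering, none of which are reproduced here.

import numpy as np
from scipy.signal import fftconvolve

FS = 44100        # sample rate in Hz (assumed)
HRIR_LEN = 256    # impulse-response length in samples (assumed)

# Placeholder HRIR "database": one left/right pair per measured azimuth.
# A real system would load measured (or personalized) HRIRs from files.
rng = np.random.default_rng(0)
hrir_db = {
    az: (0.01 * rng.standard_normal(HRIR_LEN),   # left-ear impulse response
         0.01 * rng.standard_normal(HRIR_LEN))   # right-ear impulse response
    for az in range(0, 360, 5)                   # 5-degree measurement grid
}

def nearest_hrir(azimuth_deg):
    """Pick the stored HRIR pair closest to the requested azimuth (circular distance)."""
    def circ_dist(az):
        d = abs(az - azimuth_deg) % 360.0
        return min(d, 360.0 - d)
    return hrir_db[min(hrir_db, key=circ_dist)]

def render_source(mono, azimuth_deg):
    """Convolve a mono signal with the selected HRIR pair to get a stereo signal."""
    h_left, h_right = nearest_hrir(azimuth_deg)
    return np.stack([fftconvolve(mono, h_left),
                     fftconvolve(mono, h_right)], axis=1)

# Usage: place a 1-second 440 Hz tone 30 degrees to the listener's right.
t = np.arange(FS) / FS
tone = 0.2 * np.sin(2 * np.pi * 440.0 * t)
binaural = render_source(tone, azimuth_deg=30.0)
print(binaural.shape)   # (FS + HRIR_LEN - 1, 2)

Nearest-neighbor lookup is used here only to keep the sketch short; the paper instead interpolates between neighboring HRTF measurements to avoid audible switching artifacts when the source or listener moves.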