Capture and rendering of spatial sound over headphones

Title: Capture and rendering of spatial sound over headphones
Publication Type: Journal Article
Year of Publication: 2006
Authors: Duraiswami R, Zotkin DN, O'Donovan A
Journal: The Journal of the Acoustical Society of America
Volume: 120
Issue: 5
Pagination: 3094
Date Published: 2006
Abstract

A theory for capturing an audio scene and then rendering it remotely over headphones is developed. The method relies on capture of a sound field up to a certain order in terms of spherical wave functions. The captured sound field is then convolved with the head‐related transfer function and rendered to provide presence in the auditory scene. The sound‐field representation can be transmitted to a remote location for immediate rendering or stored for later use. A system that implements the capture using a spherical array is developed and tested. Head‐related transfer functions are measured using the system described in [D.N. Zotkin et al., J. Acoust. Soc. Am. (to appear)]. The sound renderer, coupled with the head tracker, reconstructs the acoustic field using individualized head‐related transfer functions to preserve the perceptual spatial structure of the audio scene. [Work partially supported by VA.]

URL: http://link.aip.org/link/?JAS/120/3094/4
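
The abstract outlines a capture-and-render pipeline: decompose the spherical-array recording into spherical wave functions up to some order, then render the field binaurally through head-related transfer functions with head tracking. The sketch below is a hypothetical illustration of that kind of pipeline, not the authors' implementation: it encodes a multichannel capture into spherical-harmonic coefficients by least squares, steers plane-wave signals toward a grid of directions, and convolves each direction with a stand-in HRIR pair. The frequency-dependent mode-strength (radial) equalization of a rigid spherical array and the head-tracker rotation of the HRTF set are omitted for brevity; all array geometries, signals, and HRIRs here are placeholders.

# Hypothetical sketch of spherical-harmonic capture and binaural rendering.
# Not the authors' code; radial (mode-strength) equalization is omitted.
import numpy as np
from scipy.special import sph_harm        # sph_harm(m, n, azimuth, colatitude)
from scipy.signal import fftconvolve


def sh_matrix(order, azim, colat):
    """Complex spherical-harmonic matrix, shape (num_points, (order+1)**2)."""
    cols = []
    for n in range(order + 1):
        for m in range(-n, n + 1):
            cols.append(sph_harm(m, n, azim, colat))
    return np.stack(cols, axis=1)


def encode(mic_signals, mic_azim, mic_colat, order):
    """Least-squares SH coefficients a_nm(t) from the array capture.

    mic_signals: (num_mics, num_samples) time-domain recordings.
    """
    Y = sh_matrix(order, mic_azim, mic_colat)      # (num_mics, num_coeffs)
    return np.linalg.pinv(Y) @ mic_signals         # (num_coeffs, num_samples)


def render_binaural(coeffs, order, pw_azim, pw_colat, hrir_left, hrir_right):
    """Steer plane-wave signals to a direction grid and convolve with HRIRs.

    hrir_left/right: (num_directions, hrir_length), assumed to correspond to
    (pw_azim, pw_colat); in practice these would be the individualized,
    head-tracker-rotated HRTFs the abstract describes.
    """
    Y = sh_matrix(order, pw_azim, pw_colat)        # (num_dirs, num_coeffs)
    plane_waves = np.real(Y.conj() @ coeffs)       # (num_dirs, num_samples)
    left = sum(fftconvolve(pw, h) for pw, h in zip(plane_waves, hrir_left))
    right = sum(fftconvolve(pw, h) for pw, h in zip(plane_waves, hrir_right))
    return left, right


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    order, num_mics, num_dirs, nsamp = 3, 32, 16, 4800
    mic_azim = rng.uniform(0, 2 * np.pi, num_mics)
    mic_colat = np.arccos(rng.uniform(-1, 1, num_mics))
    pw_azim = rng.uniform(0, 2 * np.pi, num_dirs)
    pw_colat = np.arccos(rng.uniform(-1, 1, num_dirs))
    capture = rng.standard_normal((num_mics, nsamp))        # stand-in recording
    hrirs = rng.standard_normal((2, num_dirs, 256)) * 0.01  # stand-in HRIR pairs
    a_nm = encode(capture, mic_azim, mic_colat, order)
    left, right = render_binaural(a_nm, order, pw_azim, pw_colat,
                                  hrirs[0], hrirs[1])
    print(left.shape, right.shape)

In a full system, the plane-wave decode would additionally apply per-order radial filters that compensate the rigid-sphere scattering (which depend on frequency and array radius), and the HRIR set would be selected or rotated according to the head-tracker orientation before convolution, as the abstract indicates.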