Journal of the Audio Engineering Society

The Journal of the Audio Engineering Society — the official publication of the AES — is the only peer-reviewed journal devoted exclusively to audio technology. Published 10 times each year, it is available to all AES members and subscribers.

The Journal contains state-of-the-art technical papers and engineering reports; feature articles covering timely topics; pre- and post-reports of AES conventions and other society activities; news from AES sections around the world; Standards and Education Committee work; membership news; new products; and newsworthy developments in the field of audio.

2023 June - Volume 71 Number 6

Papers


Evaluation of Metaverse Music Performance With BBC Maida Vale Recording Studios

Authors: Cairns, Patrick; Hunt, Anthony; Johnston, Daniel; Cooper, Jacob; Lee, Ben; Daffern, Helena; Kearney, Gavin

This paper details a case study evaluation of a recording experience in a networked XR simulation of the renowned BBC Maida Vale Recording Studios. The system allows multiple remote musicians to connect over a network, providing a shared virtual acoustic space with interactive immersive audio, XR display, and low-latency throughput. A four-piece rock band used this system in a live recording session, performing under different latency and audio conditions. The technical setup and case study protocol are detailed. Evaluation is provided in the form of Quality of Experience ratings, tempo analysis, and a semi-structured exit interview.

Auralization of Measured Room Transitions in Virtual Reality

Authors: McKenzie, Thomas; Meyer-Kahlen, Nils; Hold, Christoph; Schlecht, Sebastian J.; Pulkki, Ville

To auralize a room's acoustics in six-degrees-of-freedom virtual reality (VR), a dense set of spatial room impulse response (SRIR) measurements is required, so interpolating between a sparse set is desirable. This paper studies the auralization of room transitions by proposing a baseline interpolation method for higher-order Ambisonic SRIRs and evaluating it in VR. The presented method is simple yet applicable to coupled rooms and room transitions. It is based on linear interpolation with RMS compensation, with direct sound, early reflections, and late reverberation processed separately: the input direct sounds are first steered to the relative direction of arrival before summation, and the interpolated early reflections are directionally equalized. The proposed method is first evaluated numerically, which demonstrates its improvements over basic linear interpolation. A listening test is then conducted in six-degrees-of-freedom VR to assess the density of SRIR measurements needed to plausibly auralize a room transition using the presented interpolation method. The results suggest that, for the tested scenario, a 50-cm to 1-m inter-measurement distance can be perceptually sufficient.
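
As a rough illustration of the baseline named in the abstract, the sketch below linearly interpolates between two measured SRIRs and applies RMS compensation to the blend. It is a minimal sketch under stated assumptions, not the authors' implementation: the paper's separate handling of direct sound, early reflections, and late reverberation is omitted, and all names are illustrative.

```python
# Minimal sketch: linear SRIR interpolation with RMS compensation.
# Not the authors' code; omits the split into direct sound, early
# reflections, and late reverberation described in the paper.
import numpy as np

def interpolate_srir(srir_a, srir_b, alpha):
    """Blend two SRIRs of shape (samples, ambisonic_channels).

    alpha = 0 returns srir_a; alpha = 1 returns srir_b.
    """
    blended = (1.0 - alpha) * srir_a + alpha * srir_b

    # RMS compensation: linearly blending partly incoherent responses
    # lowers the overall energy, so rescale the blend toward the
    # interpolated RMS of the two inputs.
    def rms(x):
        return np.sqrt(np.mean(x ** 2))

    target = (1.0 - alpha) * rms(srir_a) + alpha * rms(srir_b)
    return blended * target / max(rms(blended), 1e-12)
```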

AR/VR applications commonly face difficulties binaurally spatializing many sound sources at once because of computational constraints. Existing techniques for efficient binaural rendering, such as Ambisonics, Vector-Based Amplitude Panning, or Principal Component Analysis, alleviate this issue by approximating Head-Related Transfer Function (HRTF) datasets with a linear combination of basis filters. This paper proposes a novel binaural renderer that convolves each basis filter with a layer of low-order finite impulse response filters applied in the time domain and derives both the spatial functions and the filter coefficients through minimization of a perceptually motivated error function. In a MUSHRA test, expert listeners had more difficulty differentiating the proposed method from the HRTF dataset it approximates than they did with existing methods configured with an equivalent number of Fast Fourier Transforms and identical HRTF preprocessing. This was consistent across both an internal Microsoft HRTF dataset and an individual head from the SADIE database.
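
The existing techniques cited above share a common structure: per-source spatial gains feed a small set of shared basis filters, so convolution cost scales with the number of basis filters rather than the number of sources. The sketch below illustrates only that generic structure, not the proposed renderer; the gain matrix and basis filters are placeholder inputs.

```python
# Sketch of the generic basis-filter rendering structure, with
# placeholder gains and filters (not the paper's method).
import numpy as np
from scipy.signal import fftconvolve

def render_binaural(sources, gains, basis_filters):
    """sources: list of N mono signals (equal length)
    gains: array (N, K) of per-source spatial weights g_k(theta_n)
    basis_filters: array (K, 2, L) of stereo basis FIRs b_k
    Returns a (samples + L - 1, 2) binaural mix."""
    n_samples = len(sources[0])
    K, _, L = basis_filters.shape

    # Mix all sources into K intermediate channels (cheap: gains only).
    mix = np.zeros((K, n_samples))
    for sig, g in zip(sources, gains):
        mix += np.outer(g, sig)

    # Convolve each intermediate channel with its stereo basis filter:
    # the expensive part, but done K times regardless of source count.
    out = np.zeros((n_samples + L - 1, 2))
    for k in range(K):
        for ear in range(2):
            out[:, ear] += fftconvolve(mix[k], basis_filters[k, ear])
    return out
```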

Spatial Integration of Dynamic Auditory Feedback in Electric Vehicle Interior

Authors: Dupré, Théophile; Denjean, Sébastien; Aramaki, Mitsuko; Kronland-Martinet, Richard

With the development of electric motor vehicles, the domain of automotive sound design addresses new issues and is now concerned with creating suitable and pleasant soundscapes inside the vehicle. For instance, the absence of a predominant engine sound changes the driver's perception of the car's dynamics. Previous studies proposed relevant sonification strategies to augment the interior sound environment by bringing back vehicle dynamics with synthetic auditory cues. Yet users report a lack of blending with the existing soundscape. In this study, the authors analyze the acoustical and perceptual spatial characteristics of the car soundscape and show that the spatial attributes of sound sources are fundamental to improving the perceptual coherency of the global environment.

The Sonic Interactions in Virtual Environments (SIVE) Toolkit

Authors: Willemsen, Silvin; Nuijens, Helmer; Lasickas, Titas; Serafin, Stefania

In this paper, the Sonic Interactions in Virtual Environments (SIVE) toolkit, a virtual reality (VR) environment for building musical instruments using physical models, is presented. The audio engine of the toolkit is based on finite-difference time-domain (FDTD) methods and works in a modular fashion. The authors show how the toolkit is built and how it can be imported into Unity to create VR musical instruments; future developments and possible applications are also discussed.
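
For readers unfamiliar with FDTD physical models, the sketch below shows the textbook explicit update for an ideal 1-D string with fixed ends, run at the stability-limit grid spacing. It illustrates the kind of scheme such audio engines are built on; it is not the SIVE implementation, and all parameter values are illustrative.

```python
# Textbook FDTD scheme for an ideal 1-D string (not SIVE's code).
import numpy as np

fs = 44100.0          # audio sample rate [Hz]
k = 1.0 / fs          # time step [s]
c = 300.0             # wave speed [m/s]
length = 1.0          # string length [m]
h = c * k             # grid spacing at the stability limit
N = int(length / h)   # number of grid intervals
lam2 = (c * k / h) ** 2

u_prev = np.zeros(N + 1)   # displacement at time step n - 1
u = np.zeros(N + 1)        # displacement at time step n
u[N // 2] = 1e-3           # simple pluck-like initial condition

out = np.zeros(2000)
for n in range(out.size):
    u_next = np.zeros_like(u)
    # Interior update: u^{n+1} = 2 u^n - u^{n-1} + lambda^2 * d_xx u^n,
    # with fixed (Dirichlet) boundaries at both ends.
    u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                    + lam2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
    u_prev, u = u, u_next
    out[n] = u[int(0.3 * N)]  # read displacement near one end as output
```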

Virtual-Reality-Based Research in Hearing Science: A Platforming Approach

Authors: Pedersen, Rasmus Lundby; Picinali, Lorenzo; Kajs, Nynne; Patou, François

The lack of ecological validity in clinical assessment and the difficulty of investigating multimodal sensory processing remain key challenges in hearing science. Virtual Reality (VR) can support hearing research in these domains by combining experimental control with situational realism. However, the development of VR-based experiments is traditionally highly resource demanding, which presents a significant barrier to entry for basic and clinical researchers looking to embrace VR as the research tool of choice. The Oticon Medical Virtual Reality (OMVR) experiment platform fast-tracks the creation or adaptation of hearing research experiment templates, which can be used to explore areas such as binaural spatial hearing, multimodal sensory integration, cognitive hearing behavioral strategies, and auditory-visual training. In this paper, the OMVR's functionalities, architecture, and key elements of implementation are presented, important performance indicators are characterized, and a use-case perceptual evaluation is reported.

Engineering Reports


Measuring Motion-to-Sound Latency in Virtual Acoustic Rendering Systems

Authors: Meyer-Kahlen, Nils; Kastemaa, Miranda; Schlecht, Sebastian J.; Lokki, Tapio

Few studies that employ virtual acoustic rendering systems accurately specify motion-to-sound latency. To make such assessments more common, we present two methods for latency measurement using either impulsive or periodic movements. The methods require only hardware available in every acoustics lab: a small microphone and a loudspeaker. We provide open-source tools that implement analysis according to the methods. The methods are evaluated on a high-quality optical tracking system; in addition, three small trackers based on inertial measurement units were tested. The results show the reliability of the methods for the optical system and the difficulty of defining the latency of inertial-measurement-unit-based trackers.
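
As a hedged illustration of the impulsive-movement idea, the sketch below assumes a single microphone recording that contains two onsets: the physical impact of the moving tracker and the rendering system's acoustic reaction to that motion, so their separation estimates the motion-to-sound latency. The naive threshold-based onset detector and all names are illustrative, not the authors' open-source tools.

```python
# Illustrative latency estimate from two onsets in one recording:
# the physical tap and the rendered response (not the paper's tools).
import numpy as np

def onset_times(recording, fs, threshold):
    """Return times (s) where the envelope first crosses threshold,
    with successive onsets separated by at least 10 ms."""
    env = np.abs(recording)
    onsets, last = [], -np.inf
    for i in np.nonzero(env > threshold)[0]:
        if i - last > 0.01 * fs:
            onsets.append(i / fs)
        last = i
    return onsets

def motion_to_sound_latency(recording, fs, threshold=0.1):
    t = onset_times(recording, fs, threshold)
    if len(t) < 2:
        raise ValueError("expected a tap onset and a rendered-sound onset")
    return t[1] - t[0]   # seconds between physical tap and system response
```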

Standards and Information Documents


AES Standards Committee News

Features


Call for Papers: Special Issue on Sonification

Call for Papers: Special Issue on Spatial and Immersive Audio

Guest Editors' Note


Guest Editors' Note -- Special Issue on Audio for Virtual and Augmented Reality, Part II: Applications

Departments


Extras


Table of Contents

Cover & Sustaining Members List

AES Officers, Committees, Offices & Journal Staff
