AES E-Library

Adaptive Synthesis of Immersive Audio Rendering Filters

One of the key limitations in spatial audio rendering over loudspeakers is the degradation that occurs as the listener's head moves away from the intended sweet spot. In this paper, we propose a method for designing immersive audio rendering filters using adaptive synthesis methods that can update the filter coefficients in real time. These methods can be combined with a head-tracking system to compensate for changes in the listener's head position. The rendering filter's weight vectors are synthesized in the frequency domain using magnitude and phase interpolation in frequency sub-bands.
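The full paper is not reproduced on this page, so the exact synthesis procedure is unavailable here. The sketch below only illustrates the general idea stated in the abstract: blending frequency-domain rendering filter weights by interpolating magnitude and unwrapped phase separately within frequency sub-bands, with the interpolation weight driven by the tracked head position. The function and parameter names (interpolate_filter, band_edges, alpha) and the linear two-point interpolation between pre-designed filters are assumptions for illustration, not the authors' published method.

```python
import numpy as np

def interpolate_filter(H_a, H_b, alpha, band_edges):
    """Blend two frequency-domain rendering filters H_a and H_b (complex
    arrays, one value per FFT bin) by interpolating magnitude and unwrapped
    phase separately within each sub-band.

    alpha in [0, 1] is the interpolation weight, e.g. derived from the
    tracked head position between the two positions the filters were
    designed for.  (Hypothetical sketch, not the paper's exact method.)
    """
    H_out = np.empty_like(H_a)
    for lo, hi in band_edges:  # sub-band given as a range of FFT bins
        # Interpolate magnitude linearly within the sub-band.
        mag = (1 - alpha) * np.abs(H_a[lo:hi]) + alpha * np.abs(H_b[lo:hi])
        # Interpolate phase on the unwrapped phase curves to avoid 2*pi jumps.
        pha = (1 - alpha) * np.unwrap(np.angle(H_a[lo:hi])) \
              + alpha * np.unwrap(np.angle(H_b[lo:hi]))
        H_out[lo:hi] = mag * np.exp(1j * pha)
    return H_out
```

As a usage example under the same assumptions: with filters H_a and H_b designed offline for head positions x_a and x_b, a head tracker could set alpha = (x - x_a) / (x_b - x_a) for the current position x on each update, and the interpolated filter could then be applied to the loudspeaker feeds by fast convolution.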

 

Permalink: https://aes2.org/publications/elibrary-page/?id=9896





E-Library location: 16938