Spatial auditory masking between real sound signals and virtual sound images

In an augmented reality environment, real-world and virtual audio signals are presented to a listener simultaneously, and the virtual sound content and real sound sources should not interfere with each other. To make this possible, we examined spatial auditory masking between maskers and maskees, where the maskers were real sound signals emitted from loudspeakers and the maskees were virtual sound images generated with head-related transfer functions (HRTFs) and presented over headphones. The experiment was conducted with open-ear headphones, which allow the listener to hear the environment while listening to the audio content. The results are similar to those of a previous experiment in which the masker and maskee were both real signals emitted from loudspeakers: for a given masker location, the masking threshold level as a function of maskee location is symmetric with respect to the subject's frontal plane. However, the masking threshold level is lower than in the previous experiment, perhaps because HRTFs provide only limited ability to localize sound images. The results indicate that spatial auditory masking in human hearing occurs with virtually localized sound images just as it does with real sound signals.
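For context on how such a maskee can be rendered, the following is a minimal sketch (in Python, not taken from the paper) of HRTF-based binaural synthesis: a mono signal is convolved with a left/right head-related impulse response (HRIR) pair measured for the desired direction, giving a two-channel signal for headphone playback. The HRIR arrays, signal parameters, and function names below are illustrative placeholders, not the stimuli or HRTF set used in the experiment.

    import numpy as np
    from scipy.signal import fftconvolve

    def binauralize(mono, hrir_left, hrir_right):
        """Convolve a mono signal with a per-ear HRIR pair so the sound
        is perceived at the direction the HRIRs were measured for.
        Returns an (N, 2) stereo array for headphone presentation."""
        left = fftconvolve(mono, hrir_left)
        right = fftconvolve(mono, hrir_right)
        return np.stack([left, right], axis=-1)

    # Hypothetical usage: a 1 kHz maskee tone rendered at one HRTF direction.
    fs = 48000
    t = np.arange(fs) / fs
    maskee = 0.1 * np.sin(2 * np.pi * 1000 * t)

    # In practice hrir_left / hrir_right would come from a measured HRTF set
    # (e.g. loaded from a SOFA file); unit-impulse placeholders are used here
    # only so the sketch runs stand-alone.
    hrir_left = np.zeros(256); hrir_left[0] = 1.0
    hrir_right = np.zeros(256); hrir_right[0] = 0.8

    stereo = binauralize(maskee, hrir_left, hrir_right)

In the experiment described above, the maskee rendered this way is played over open-ear headphones while the masker is presented from a loudspeaker, and the maskee level at which it becomes inaudible defines the masking threshold for that maskee direction.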

 

Permalink: https://aes2.org/publications/elibrary-page/?id=21488
