AES E-Library

Real-World Environment Simulation for Validation of AI Sound Detection and Localization

This paper presents an experimental framework for evaluating the performance of Deep Neural Networks (DNNs) in detecting and localizing audio signals in a controlled laboratory setting. Departing from conventional validation methods, our methodology emphasizes a precisely configured laboratory setup to ensure accurate and reliable assessment of DNN capabilities. Central to our approach is the use of Wave-Field Synthesis (WFS) technology, which enables the recreation of realistic acoustic environments in the laboratory. By leveraging this technology, we can simulate a wide range of acoustic scenarios, allowing comprehensive testing of DNN performance under varying conditions. Additionally, our methodology incorporates diverse datasets carefully selected to represent real-world audio stimuli. Furthermore, as an example application, we propose the development of an AI-based sound detection and localization system tailored to emergency sounds in vehicular environments. This example aims to assess the performance of AI systems trained on data from the carefully constructed validation environment described in this study.
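
To make the evaluation step concrete, the sketch below shows one way the DNN's output could be scored against the ground-truth positions of virtual sources rendered by a WFS array: a detection rate plus an azimuth error per simulated scenario. This is a minimal illustration only; the function names, the azimuth-only error metric, and the 10-degree threshold are assumptions for this sketch and are not taken from the paper.

import numpy as np

def angular_error_deg(true_az, pred_az):
    # Smallest absolute difference between two azimuths, in degrees (wrap-around aware).
    diff = np.abs(np.asarray(true_az, dtype=float) - np.asarray(pred_az, dtype=float)) % 360.0
    return np.minimum(diff, 360.0 - diff)

def evaluate_localization(true_az, pred_az, detected, threshold_deg=10.0):
    # true_az:  azimuths (deg) of the virtual sources rendered by the WFS array (hypothetical input)
    # pred_az:  azimuths (deg) reported by the DNN under test
    # detected: boolean flag per scenario, whether the DNN reported an emergency sound at all
    errors = angular_error_deg(true_az, pred_az)
    detected = np.asarray(detected, dtype=bool)
    localized = detected & (errors <= threshold_deg)
    return {
        "detection_rate": float(np.mean(detected)),
        "mean_angular_error_deg": float(np.mean(errors[detected])) if detected.any() else float("nan"),
        "localization_accuracy": float(np.mean(localized)),
    }

# Example: three simulated scenarios with a siren rendered from three directions.
print(evaluate_localization([30.0, 120.0, 250.0], [28.0, 131.0, 247.0], [True, True, True]))

In practice such a metric would be computed over the full set of WFS-rendered scenarios (varying source distance, motion, and background noise) rather than the three toy cases shown here.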

 

Permalink: https://aes2.org/publications/elibrary-page/?id=22544












