AES E-Library

Neural Proxies for Sound Synthesizers: Learning Perceptually Informed Preset Representations

Deep learning is an appealing solution for automatic synthesizer programming (ASP), which aims to assist musicians and sound designers in programming sound synthesizers. However, integrating software synthesizers into training pipelines is challenging because they may not be differentiable. This work tackles this challenge by introducing a method to approximate arbitrary synthesizers. Specifically, a neural network is trained to map synthesizer presets onto an audio embedding space derived from a pretrained model. This facilitates the definition of a neural proxy that produces compact yet effective representations, thereby enabling the integration of an audio embedding loss into neural-based ASP systems for black-box synthesizers. The authors evaluate the representations derived from various pretrained audio models in the context of neural-based methods for ASP and assess the effectiveness of several neural network architectures, including feedforward, recurrent, and transformer-based models, in defining neural proxies. The proposed method is evaluated using both synthetic and handcrafted presets from three popular software synthesizers, and its performance is assessed in a synthesizer sound-matching downstream task. Although the benefits of the learned representations are tempered by resource requirements, encouraging results were obtained for all synthesizers, paving the way for future research into the application of synthesizer proxies for neural-based ASP systems.
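As a rough illustration of the training setup the abstract implies, the sketch below (not the authors' code) regresses a preset-to-embedding network onto audio embeddings precomputed by a pretrained audio model on the rendered sounds, so the possibly non-differentiable synthesizer never enters the gradient path. All names, dimensions, the feedforward architecture, and the MSE objective are illustrative assumptions; the paper also considers recurrent and transformer-based proxies.

```python
import torch
import torch.nn as nn


class PresetProxy(nn.Module):
    """Maps a flattened synthesizer preset (parameter vector) to a vector in
    the audio embedding space of a pretrained model. A simple feedforward
    stand-in for the architectures compared in the paper (hypothetical)."""

    def __init__(self, num_params: int, embed_dim: int, hidden: int = 512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_params, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, embed_dim),
        )

    def forward(self, presets: torch.Tensor) -> torch.Tensor:
        return self.net(presets)


def train_step(proxy, optimizer, presets, target_embeddings):
    """One training step: regress the proxy's output onto embeddings that a
    pretrained audio model produced for the rendered presets. The targets are
    precomputed offline, so the black-box synthesizer stays outside autograd."""
    optimizer.zero_grad()
    pred = proxy(presets)
    # Placeholder embedding loss; the actual objective is an assumption here.
    loss = nn.functional.mse_loss(pred, target_embeddings)
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    # Toy dimensions: a real synthesizer fixes num_params, and the chosen
    # pretrained audio model fixes embed_dim.
    proxy = PresetProxy(num_params=32, embed_dim=128)
    optimizer = torch.optim.Adam(proxy.parameters(), lr=1e-3)
    presets = torch.rand(16, 32)    # batch of normalized preset parameters
    targets = torch.randn(16, 128)  # embeddings of the corresponding rendered audio
    print(train_step(proxy, optimizer, presets, targets))
```

Once trained, such a proxy can stand in for the synthesizer when optimizing an ASP model with an audio-embedding loss, since gradients flow through the proxy rather than the synthesizer itself.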

 

Permalink: https://aes2.org/publications/elibrary-page/?id=22956




