AES E-Library

Advances in Group Delay Modeling and Optimization

This paper introduces a stochastic optimization algorithm for modeling and equalizing group delay functions with all-pass filters. The method employs two techniques commonly used to optimize the hyperparameters of deep neural networks and other machine-learning models. Various examples demonstrate that the presented stochastic search algorithm, employing either of the two techniques, consistently outperforms baseline and state-of-the-art (SOTA) methods. Moreover, the resulting all-pass filters are guaranteed stable because the pole magnitudes are constrained to lie within the unit circle during optimization. Initializing the poles from closed-form expressions for the group delay of second-order all-pass filters further leads to an excellent approximation of the desired group delay function.
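For context, a second-order all-pass section with complex-conjugate poles at $re^{\pm j\theta}$ ($0 \le r < 1$) has the well-known closed-form group delay

$$\tau(\omega) = \frac{1 - r^2}{1 - 2r\cos(\omega - \theta) + r^2} + \frac{1 - r^2}{1 - 2r\cos(\omega + \theta) + r^2},$$

which peaks near $\omega = \theta$ and sharpens as $r \to 1$. This is the standard textbook expression; whether it is the exact closed form the paper inverts for its pole initialization is an assumption here.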

 

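To make the approach concrete, below is a minimal Python sketch of a stochastic search over the pole radii and angles of a cascade of second-order all-pass sections, clipping the radii inside the unit circle so every candidate filter stays stable. The section count, target curve, perturbation step, and greedy acceptance rule are illustrative assumptions; the paper's actual method uses hyperparameter-optimization techniques borrowed from machine learning rather than this plain random-perturbation loop.

```python
# Minimal sketch: random-perturbation search over second-order all-pass poles
# to match a target group delay. All constants below are assumed for
# illustration, not taken from the paper.
import numpy as np
from scipy.signal import group_delay

N_SECTIONS = 4    # number of second-order all-pass sections (assumed)
N_FREQS = 256     # frequency-grid resolution (assumed)

def cascade_group_delay(radii, angles, w):
    """Total group delay (in samples) of a cascade of 2nd-order all-pass sections."""
    gd = np.zeros_like(w)
    for r, th in zip(radii, angles):
        # Denominator 1 - 2r*cos(th) z^-1 + r^2 z^-2; the numerator is its
        # mirror image, which is exactly what makes the section all-pass.
        a = np.array([1.0, -2.0 * r * np.cos(th), r * r])
        b = a[::-1]
        _, gd_sec = group_delay((b, a), w=w)
        gd += gd_sec
    return gd

def stochastic_search(target_gd, w, n_iters=2000, step=0.02, seed=None):
    """Greedy random search over pole radii/angles, keeping |pole| < 1 (stability)."""
    rng = np.random.default_rng(seed)
    radii = rng.uniform(0.3, 0.8, N_SECTIONS)   # start strictly inside the unit circle
    angles = rng.uniform(0.0, np.pi, N_SECTIONS)
    best = np.mean((cascade_group_delay(radii, angles, w) - target_gd) ** 2)
    for _ in range(n_iters):
        # Perturb, then clip radii below 1 so every candidate filter is stable.
        cand_r = np.clip(radii + rng.normal(0.0, step, N_SECTIONS), 0.0, 0.999)
        cand_th = np.mod(angles + rng.normal(0.0, step, N_SECTIONS), np.pi)
        err = np.mean((cascade_group_delay(cand_r, cand_th, w) - target_gd) ** 2)
        if err < best:   # greedy: keep the candidate only if it improves the fit
            radii, angles, best = cand_r, cand_th, err
    return radii, angles, best

if __name__ == "__main__":
    w = np.linspace(1e-3, np.pi - 1e-3, N_FREQS)  # rad/sample, endpoints avoided
    # Illustrative target: a smooth group-delay bump centered at w = 1 rad/sample.
    target = 2.0 + 6.0 * np.exp(-((w - 1.0) ** 2) / 0.1)
    r, th, mse = stochastic_search(target, w, seed=0)
    print(f"final group-delay MSE: {mse:.4f}")
```

Seeding the initial radii and angles from the closed-form expression above, rather than drawing them uniformly at random, would correspond to the initialization strategy the abstract describes.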
Permalink: https://aes2.org/publications/elibrary-page/?id=22507


