This paper introduces a stochastic optimization algorithm for modeling and equalizing group delay functions with all-pass filters. The method employs two techniques commonly used to optimize the hyperparameters of deep neural networks and machine-learning models. Several examples demonstrate that the presented stochastic search algorithm, with either technique, consistently outperforms baseline and state-of-the-art (SOTA) approaches. Moreover, the resulting all-pass filters are stable because the pole magnitudes are constrained to lie within the unit circle during optimization. Initializing with poles obtained from closed-form expressions for second-order all-pass group delay functions further yields an excellent approximation of the desired group delay.
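The abstract only summarizes the method, but the core idea can be illustrated with a minimal sketch: search stochastically over the pole radii and angles of a cascade of second-order all-pass sections, clipping each radius below one so every candidate filter is stable, and keep the parameters that best match a target group delay. Plain random search stands in here for the paper's two (unnamed) hyperparameter-optimization techniques, and the target curve, section count, step size, and random initialization are all illustrative assumptions, not the paper's closed-form initialization.

```python
import numpy as np
from scipy.signal import group_delay

def allpass_coeffs(r, theta):
    """Second-order all-pass section with poles at r*exp(+/-j*theta).
    The numerator is the reversed denominator, which keeps the
    magnitude response exactly flat."""
    a = np.array([1.0, -2.0 * r * np.cos(theta), r * r])
    return a[::-1], a  # (b, a)

def cascade_group_delay(params, w):
    """Total group delay (in samples) of a cascade of all-pass sections."""
    gd = np.zeros_like(w)
    for r, theta in params:
        b, a = allpass_coeffs(r, theta)
        _, gd_k = group_delay((b, a), w=w)
        gd += gd_k
    return gd

def random_search(target_gd, w, n_sections=4, iters=2000, step=0.02, seed=0):
    """Plain random search over (r, theta) per section; r is clipped to
    stay inside the unit circle, so every candidate is stable."""
    rng = np.random.default_rng(seed)
    # Illustrative initialization: random stable poles
    params = np.column_stack([rng.uniform(0.3, 0.9, n_sections),
                              rng.uniform(0.1, np.pi - 0.1, n_sections)])
    best_err = np.mean((cascade_group_delay(params, w) - target_gd) ** 2)
    for _ in range(iters):
        cand = params + step * rng.standard_normal(params.shape)
        cand[:, 0] = np.clip(cand[:, 0], 0.0, 0.999)  # stability constraint
        cand[:, 1] = np.clip(cand[:, 1], 0.0, np.pi)
        err = np.mean((cascade_group_delay(cand, w) - target_gd) ** 2)
        if err < best_err:
            params, best_err = cand, err
    return params, best_err

w = np.linspace(0.05, np.pi - 0.05, 256)  # frequency grid (rad/sample)
target = 6.0 + 2.0 * np.cos(w)            # toy target group delay
params, err = random_search(target, w)
print(f"final MSE: {err:.4f}")
```

Accepting a candidate only when it lowers the mean-squared group delay error, while projecting the radius back inside the unit circle, mirrors the stability-constrained search described in the abstract.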
Author(s): Bharitkar, Sunil
Affiliation: Samsung Research America
AES Convention: 156
Paper Number: 10694
Publication Date: 2024-06-06
Permalink: https://aes2.org/publications/elibrary-page/?id=22507
Bharitkar, Sunil; 2024; Advances in Group Delay Modeling and Optimization; Samsung Research America; AES Convention 156, Paper 10694; Available from: https://aes2.org/publications/elibrary-page/?id=22507