AES E-Library

A Model of Loudness Applicable to Time-Varying Sounds

Previously we described a model for calculating the loudness of steady sounds from their spectrum. Here a new version of the model is presented, which uses a waveform as its input. The stages of the model are as follows. (a) A finite impulse response filter representing transfer through the outer and middle ear. (b) Calculation of the short-term spectrum using the fast Fourier transform (FFT). To give adequate spectral resolution at low frequencies, combined with adequate temporal resolution at high frequencies, six FFTs are calculated in parallel, using longer signal segments for low frequencies and shorter segments for higher frequencies. (c) Calculation of an excitation pattern from the physical spectrum. (d) Transformation of the excitation pattern to a specific loudness pattern. (e) Determination of the area under the specific loudness pattern. This gives a value for the "instantaneous" loudness. The short-term perceived loudness is calculated from the instantaneous loudness using an averaging mechanism similar to an automatic gain control system, with attack and release times. Finally, the overall loudness impression is calculated from the short-term loudness using a similar averaging mechanism, but with longer attack and release times. The new model gives predictions very similar to those of our earlier model for steady sounds. In addition, it can predict the loudness of brief sounds as a function of duration and the overall loudness of sounds that are amplitude modulated at various rates.
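The two averaging stages described above can be sketched as a one-pole smoother with separate attack and release coefficients, applied first to the instantaneous loudness and then again (with slower constants) to the short-term loudness. This is only an illustrative sketch of the general AGC-style mechanism; the coefficient values below are assumptions for demonstration, not the time constants published with the model.

```python
def smooth(inst, alpha_attack, alpha_release):
    """AGC-style averager: a one-pole smoother whose coefficient
    depends on whether the input is rising (attack) or falling (release)."""
    out = []
    prev = 0.0
    for x in inst:
        alpha = alpha_attack if x > prev else alpha_release
        prev = alpha * x + (1.0 - alpha) * prev
        out.append(prev)
    return out

# Illustrative coefficients (assumed, per-sample): attack faster than
# release, and slower constants for the overall (long-term) loudness.
ATTACK_ST, RELEASE_ST = 0.045, 0.02
ATTACK_LT, RELEASE_LT = 0.01, 0.0005

# Instantaneous loudness (in sones) for a brief burst followed by silence.
inst = [5.0] * 50 + [0.0] * 100
short_term = smooth(inst, ATTACK_ST, RELEASE_ST)   # rises during the burst, decays after
long_term = smooth(short_term, ATTACK_LT, RELEASE_LT)  # overall loudness impression
```

Because the attack is faster than the release, a brief sound reaches a lower short-term peak than a steady sound of the same level, which is how the model captures the growth of loudness with duration.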

 

Permalink: https://aes2.org/publications/elibrary-page/?id=11081

