The growth in demand for immersive audio content production and consumption, particularly in VR, is driving the need for tools to facilitate its creation. Immersive productions place additional demands on sound design teams, specifically the increased complexity of scenes, the greater number of sound-producing objects, and the need to spatialise sound in 360°. This paper presents an initial feasibility study of a methodology that uses visual object detection to detect, track, and match content for sound-generating objects, in this case based on a simple 2D visual scene. Results show that while the approach is successful for a single moving object, there are limitations in the current computer vision system used which cause complications for scenes with multiple objects. Results also show that the recommendation of candidate sound effect files is heavily dependent on the accuracy of the visual object detection system and on the labelling of the audio repository used.
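The paper itself details the method; purely as an illustration of the general idea summarised above (detect and label objects in a 2D frame, then recommend sound effect files whose repository labels match), a minimal sketch might look as follows. It assumes a pretrained torchvision detector, a placeholder frame file, and a toy labelled repository, none of which reflect the authors' actual implementation.

import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn,
    FasterRCNN_ResNet50_FPN_Weights,
)

# A pretrained COCO detector stands in for the (unspecified) vision system.
weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
categories = weights.meta["categories"]
preprocess = weights.transforms()

# Hypothetical frame from a simple 2D visual scene.
frame = read_image("scene_frame.png")
with torch.no_grad():
    prediction = model([preprocess(frame)])[0]

# Keep confident detections and map class indices back to text labels.
detected_labels = {
    categories[int(label)]
    for label, score in zip(prediction["labels"], prediction["scores"])
    if float(score) > 0.7
}

# Toy labelled sound-effect repository (file names are placeholders).
sfx_repository = {
    "car": ["car_pass_by.wav", "engine_idle.wav"],
    "dog": ["dog_bark_single.wav"],
    "person": ["footsteps_concrete.wav"],
}

# Recommend candidate files whose repository labels match detected objects;
# as the abstract notes, recommendation quality hinges on both detection
# accuracy and how the audio repository is labelled.
candidates = {obj: sfx_repository.get(obj, []) for obj in detected_labels}
print(candidates)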
Author(s): Turner, Daniel; Pike, Chris; Murphy, Damian
Affiliation: University of York; BBC R&D; University of York
AES Convention: 148
Paper Number: 10375
Publication Date: 2020-05-06
Session subject: Applications
Permalink: https://aes2.org/publications/elibrary-page/?id=20792
Citation: Turner, Daniel; Pike, Chris; Murphy, Damian (2020). "Content matching for sound generating objects within a visual scene using a computer vision approach," AES Convention 148, Paper 10375. Available from: https://aes2.org/publications/elibrary-page/?id=20792
@conference{turner2020content,
  author    = {Turner, Daniel and Pike, Chris and Murphy, Damian},
  title     = {Content matching for sound generating objects within a visual scene using a computer vision approach},
  booktitle = {Audio Engineering Society Convention 148},
  number    = {10375},
  month     = {May},
  year      = {2020},
}