e-space
Manchester Metropolitan University's Research Repository

    A plugin for neural audio synthesis of impact sound effects

    Yang, Zih-Syuan and Hockman, Jason (ORCID: https://orcid.org/0000-0002-2911-6993) (2023) A plugin for neural audio synthesis of impact sound effects. In: AM '23: Audio Mostly 2023, 30 August 2023 - 01 September 2023, Edinburgh, United Kingdom.

    Accepted Version, available under license: In Copyright.


    Abstract

    The term impact sound, as used in this paper, refers broadly to a sudden burst of short-lived impulsive noise generated by the collision of two objects. Sound effects of this type are prevalent in multimedia productions, yet conventional methods of sourcing them are often costly in time and resources. This paper explores the potential of neural audio synthesis for generating realistic impact sound effects for use in multimedia such as films, games, and AR/VR. The designed system uses a Realtime Audio Variational autoEncoder (RAVE) [2] model, trained on a dataset of over 3,000 impact sound samples, for inference in a Digital Audio Workstation (DAW), with latent representations exposed as user controls. The performance of the trained model is assessed using several objective evaluation metrics, revealing both the prospects and limitations of this approach. The results and contributions of the paper are discussed, and audio examples and source code are made available.
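To illustrate the idea of exposing a small set of latent controls for impact-sound generation, the following is a minimal, self-contained sketch. It is not the paper's RAVE model: the `decay` and `brightness` parameters here are hypothetical stand-ins for learned latent dimensions, and the synthesis is a simple enveloped, filtered noise burst chosen only to mimic the impulsive character of an impact sound.

```python
import numpy as np

def synthesize_impact(decay, brightness, sr=16000, dur=0.25, seed=0):
    """Toy impact-sound generator: low-pass filtered noise shaped by an
    exponential decay envelope. The two scalar inputs stand in for the
    kind of latent controls a RAVE-style decoder would expose."""
    rng = np.random.default_rng(seed)
    n = int(sr * dur)
    noise = rng.standard_normal(n)

    # One-pole low-pass filter: 'brightness' in (0, 1] sets how much
    # high-frequency content survives the collision "body".
    filtered = np.empty(n)
    prev = 0.0
    for i in range(n):
        prev = brightness * noise[i] + (1.0 - brightness) * prev
        filtered[i] = prev

    # Exponential decay envelope: larger 'decay' -> faster die-off,
    # giving the short-lived impulsive character of an impact.
    env = np.exp(-decay * np.arange(n) / sr)
    return filtered * env

dull_thud = synthesize_impact(decay=40.0, brightness=0.1)
bright_click = synthesize_impact(decay=120.0, brightness=0.9)
```

In the actual plugin, the analogous controls are the latent dimensions learned by the autoencoder rather than hand-designed parameters like these.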
