e-space
Manchester Metropolitan University's Research Repository

    Using a NEAT approach with curriculums for dynamic content generation in video games

    Hind, D and Harvey, C (ORCID: https://orcid.org/0000-0002-4809-1592) (2024) Using a NEAT approach with curriculums for dynamic content generation in video games. Personal and Ubiquitous Computing, 28 (3-4). pp. 629-641. ISSN 1617-4909

    Published Version available under a Creative Commons Attribution licence.

    Abstract

    This paper presents a novel exploration of an evolving neural network approach to generating dynamic content for video games, specifically for a tower defence game. The NeuroEvolution of Augmenting Topologies (NEAT) technique is used to train a neural network as a wave manager that generates enemy waves to challenge the player’s defences. The approach is extended with NEAT-generated curriculums of tower deployments that gradually raise the difficulty faced by the generated waves, allowing the neural network to learn incrementally. The approach dynamically adapts to changes in the player’s skill level, providing a more personalised and engaging gaming experience. The quality of the machine-generated waves is evaluated through a blind A/B test using the Game Experience Questionnaire (GEQ), with results compared against waves designed manually by a human designer. The study finds no discernible difference in reported player experience between the AI-generated and human-designed waves. The approach can significantly reduce the time and resources required to design game content while maintaining the quality of the player experience, and it has the potential to be applied across a range of video game genres and throughout the design and development process.
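
    To make the core idea more concrete, the sketch below illustrates how a NEAT wave manager might be evolved against a curriculum of tower deployments using the neat-python library. It is a minimal illustration under stated assumptions: the curriculum stages, the simulate_wave placeholder, the fitness signal, and the wave_neat.cfg configuration file are all hypothetical and do not reflect the authors' implementation or game.

    # Minimal sketch (not the authors' code): evolving a NEAT wave manager
    # against a curriculum of tower deployments with the neat-python library.
    # TOWER_CURRICULUM, simulate_wave, and wave_neat.cfg are illustrative
    # assumptions; the paper uses its own game simulation and fitness measure.
    import neat

    # Hypothetical curriculum: each stage describes a stronger tower deployment.
    TOWER_CURRICULUM = [
        {"towers": 2, "upgrade_level": 0},
        {"towers": 4, "upgrade_level": 1},
        {"towers": 6, "upgrade_level": 2},
    ]

    def simulate_wave(net, stage):
        # Placeholder for the tower-defence simulation: the network maps the
        # current deployment to a wave composition, the wave is played out,
        # and the damage it deals becomes the fitness signal.
        wave = net.activate([stage["towers"], stage["upgrade_level"]])
        return sum(max(0.0, unit) for unit in wave)  # stand-in fitness

    def eval_genomes(genomes, config):
        # Score every candidate wave manager against the active curriculum stage.
        stage = TOWER_CURRICULUM[eval_genomes.stage]
        for _, genome in genomes:
            net = neat.nn.FeedForwardNetwork.create(genome, config)
            genome.fitness = simulate_wave(net, stage)

    eval_genomes.stage = 0

    config = neat.Config(neat.DefaultGenome, neat.DefaultReproduction,
                         neat.DefaultSpeciesSet, neat.DefaultStagnation,
                         "wave_neat.cfg")  # assumed neat-python config file
    population = neat.Population(config)
    population.add_reporter(neat.StdOutReporter(True))

    # Advance the curriculum every few generations so the wave manager learns
    # incrementally against progressively harder tower deployments.
    for stage_index in range(len(TOWER_CURRICULUM)):
        eval_genomes.stage = stage_index
        best_genome = population.run(eval_genomes, 10)

    In the paper's setting, the curriculum of tower deployments is itself NEAT-generated rather than hand-written as above, and the fitness would come from the full game simulation rather than a placeholder.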
