e-space
Manchester Metropolitan University's Research Repository

    Assessing Type Agreeability in the Unified Model of Personality and Play Styles

    Brooke, Alexander (ORCID: https://orcid.org/0009-0009-5907-9044), Crossley, Matthew (ORCID: https://orcid.org/0000-0001-5965-8147), Lloyd, Huw (ORCID: https://orcid.org/0000-0001-6537-4036) and Cunningham, Stuart (2024) Assessing Type Agreeability in the Unified Model of Personality and Play Styles. In: IEEE Symposium on Computational Intelligence and Games, CIG. Presented at 2024 IEEE Conference on Games (CoG), 5 August 2024 - 8 August 2024.

    Accepted Version
    Available under License Creative Commons Attribution.


    Abstract

    Classifying players into well-defined groups can be useful when designing games and gamified systems, and many models describe player or personality 'type'. The Unified Model of Personality and Play Styles groups together many player and personality taxonomies, but while similarities have been noted in previous work, the overlap between models has not been analysed ahead of its use. This study provides evidence both for and against aspects of the Unified Model, assessing model agreeability by comparing participant classifications. Results show that representations of types related by the Unified Model do correlate significantly more strongly than types the model treats as unrelated, but only with weak-to-moderate correlation coefficients. Ranking classifications produces results that map better to the Unified Model, but also reduces the overall strength of correlations between types. The Unified Model is therefore considered fit for purpose as an explanatory tool, but without additional study it should be used with caution in other applications.
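    The abstract's "ranking classifications" step can be illustrated with a minimal sketch: replacing raw scores with their ranks before computing a Pearson correlation (a Spearman-style correlation) measures monotone rather than linear agreement between two type classifications, which can change the resulting coefficient. The function names and all data below are invented for illustration and are not taken from the paper.

    ```python
    import numpy as np

    def pearson(a, b):
        # Pearson correlation coefficient between two 1-D arrays.
        return float(np.corrcoef(a, b)[0, 1])

    def to_ranks(a):
        # Replace scores with their ranks (0 = lowest); ties not handled.
        return np.argsort(np.argsort(a))

    # Hypothetical per-participant scores for two related player types
    # (illustrative data only, not from the study).
    type_a = np.array([1.0, 2.0, 3.0, 4.0, 10.0])
    type_b = np.array([1.0, 2.0, 3.0, 4.0, 100.0])

    raw_r = pearson(type_a, type_b)                        # raw-score correlation
    rank_r = pearson(to_ranks(type_a), to_ranks(type_b))   # rank-based correlation
    ```

    Here the two score sets agree perfectly in order but not linearly, so the rank-based coefficient is 1.0 while the raw-score coefficient is lower; in the study, the same transformation reduced the overall strength of correlations between types.
    
    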
