Manchester Metropolitan University's Research Repository

    Methods for Affective Content Analysis and Recognition in Film

    Roberts, Shaun Andrew (2023) Methods for Affective Content Analysis and Recognition in Film. Doctoral thesis (PhD), Manchester Metropolitan University.


    Available under License Creative Commons Attribution Non-commercial No Derivatives.



    The research presented in this thesis was motivated by growing attention to the effects of emotion on users, and by the question of how those effects might be applied in computational systems. It investigates the best methods for determining affective scores for video content, specifically film, and resulted in the development of the affective video system (AVS) framework, the AVS dataset and the AVS systems, yielding several contributions to knowledge about affective methods and systems. The thesis first presents the theory needed to understand the subject area, building as the work matures and laying a pathway, in the form of a methodological framework, for viewing affective problems and systems. A subsequent study reviews well-recognised affective methods such as the International Affective Picture System (IAPS) and examines how its well-defined processes and procedures could be adapted into a more modern approach using video content. The research then identifies the perceivable features of video clips that matter most to users, analysed using the repertory grid technique. These contributions are combined to create the AVS system and database, a unique database comprising affective scores for a range of film clips. The research concludes by presenting the best-performing regression methods arising from this work and its datasets, summarising their performance, and discussing the results in relation to other research in the area.
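    To illustrate the kind of regression the abstract refers to (without reproducing the thesis's actual methods or data), the sketch below fits a simple least-squares model mapping one perceptual feature to a valence rating on a 1–9 IAPS-style scale. The feature (clip brightness), the ratings, and the single-feature setup are all invented for illustration only.

    ```python
    # Hypothetical sketch: simple linear regression from one perceptual
    # feature of a film clip (invented "brightness" values) to a valence
    # rating on a 1-9 scale. All numbers below are made up.

    def fit_linear(xs, ys):
        """Closed-form ordinary least squares; returns (slope, intercept)."""
        n = len(xs)
        mx = sum(xs) / n
        my = sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        var = sum((x - mx) ** 2 for x in xs)
        slope = cov / var
        return slope, my - slope * mx

    # Invented ratings: brighter clips rated slightly more positive.
    brightness = [0.1, 0.3, 0.5, 0.7, 0.9]
    valence = [3.0, 4.1, 5.2, 6.0, 7.1]

    slope, intercept = fit_linear(brightness, valence)

    def predict(x):
        return slope * x + intercept
    ```

    A real system would of course use many features per clip (visual, auditory, editing-related) and stronger regressors; this only shows the shape of the feature-to-score mapping problem.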
