e-space
Manchester Metropolitan University's Research Repository

    Real-time emotional health detection using fine-tuned transfer networks with multimodal fusion

    Sharma, A, Sharma, K and Kumar, A (ORCID: https://orcid.org/0000-0003-4263-7168) (2022) Real-time emotional health detection using fine-tuned transfer networks with multimodal fusion. Neural Computing and Applications. ISSN 0941-0643

    Accepted Version. Available under License In Copyright. Download (992 kB).

    Abstract

    Recognizing and regulating human emotion, or riding a wave of emotions, is a vital life skill, as it plays an important role in how a person thinks, behaves and acts. Accurate real-time emotion detection could revolutionize the human–computer interaction industry and has the potential to provide a proactive approach to mental health care. Several untapped sources of data, including social media data (psycholinguistic markers) and multimodal data (audio and video signals), combined with sensor-based psychophysiological and brain signals, help in comprehending affective states and emotional experiences. In this work, we propose a model that uses three modalities, i.e., visual (facial expressions and body gestures), audio (speech) and text (spoken content), to classify emotion into discrete categories based on Ekman's model, with an additional category for the 'neutral' state. Transfer learning with multistage fine-tuning is used for each modality, rather than training on a single dataset, to make the model generalizable. The use of multiple modalities allows heterogeneous data from different sources to be integrated effectively. The results of the three modalities are combined at the decision level using a weighted fusion technique. The proposed EmoHD model compares favorably with the state of the art on two benchmark datasets, MELD and IEMOCAP.
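
    As a rough illustration of the decision-level weighted fusion described in the abstract, the Python sketch below combines per-modality class probabilities over Ekman's six emotions plus 'neutral'. The function name, modality weights and probability vectors here are illustrative assumptions, not values taken from the paper.

        import numpy as np

        # Ekman's six basic emotions plus a 'neutral' class, as in the abstract.
        CLASSES = ["anger", "disgust", "fear", "joy", "sadness", "surprise", "neutral"]

        def weighted_decision_fusion(p_visual, p_audio, p_text, weights=(0.4, 0.3, 0.3)):
            # Each p_* is a length-7 probability vector produced by that
            # modality's fine-tuned classifier; `weights` are hypothetical
            # modality weights (the paper's actual values may differ).
            stacked = np.stack([p_visual, p_audio, p_text])  # shape (3, 7)
            w = np.asarray(weights)[:, None]                 # shape (3, 1)
            fused = (w * stacked).sum(axis=0)                # weighted sum per class
            fused /= fused.sum()                             # renormalize to a distribution
            return CLASSES[int(np.argmax(fused))], fused

        # Example with dummy probability vectors for a single utterance.
        rng = np.random.default_rng(0)
        probs = [rng.dirichlet(np.ones(len(CLASSES))) for _ in range(3)]
        label, fused = weighted_decision_fusion(*probs)
        print(label, fused.round(3))

    A benefit of fusing at the decision level, as opposed to feature-level fusion, is that each modality's classifier stays independent, so a modality can be dropped or reweighted without retraining the others.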

    Impact and Reach

    Statistics

    Activity overview (6 month trend): 758 downloads, 131 hits.

    Additional statistics for this dataset are available via IRStats2.
