e-space
Manchester Metropolitan University's Research Repository

    DVC-Net: a new dual-view context-aware network for emotion recognition in the wild

    Qing, Linbo (ORCID: https://orcid.org/0000-0003-3555-0005), Wen, Hongqian, Chen, Honggang, Jin, Rulong, Cheng, Yongqiang (ORCID: https://orcid.org/0000-0001-7282-7638) and Peng, Yonghong (ORCID: https://orcid.org/0000-0002-5508-1819) (2024) DVC-Net: a new dual-view context-aware network for emotion recognition in the wild. Neural Computing and Applications, 36 (2). pp. 653-665. ISSN 0941-0643

    Accepted Version
    File will be available on: 4 October 2024.
    Available under License In Copyright.

    Abstract

    Emotion recognition in the wild (ERW) is a challenging task due to the unknown and unconstrained scenes of wild environments. Unlike previous approaches that rely on facial expression or posture for ERW, a growing body of research is beginning to exploit contextual information to improve emotion recognition performance. In this paper, we propose a new dual-view context-aware network (DVC-Net) that fully exploits contextual information from global and local views, and balances individual features against context features through an attention mechanism. The proposed DVC-Net consists of three parallel modules: (1) a body-aware stream that suppresses uncertainty in body-gesture feature representations, (2) a global context-aware stream based on salient context that captures effective context at the global level, and (3) a local context-aware stream based on a graph convolutional network that finds local discriminative features carrying emotional cues. Quantitative evaluations were carried out on two in-the-wild emotion recognition datasets, and the experimental results demonstrate that the proposed DVC-Net outperforms state-of-the-art methods.
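
    To illustrate the three-parallel-stream-plus-attention design described in the abstract, here is a minimal, hypothetical PyTorch sketch. Everything in it is an assumption made for illustration: the class and stream names, the feature dimensions, the 26-class output (as in common in-the-wild emotion datasets such as EMOTIC), and especially the local stream, where a plain MLP stands in for the paper's graph convolutional network. It is not the authors' implementation.

    import torch
    import torch.nn as nn

    class DVCNetSketch(nn.Module):
        """Hypothetical sketch of a dual-view, three-stream context-aware
        network. All names and dimensions are illustrative assumptions."""

        def __init__(self, feat_dim=256, num_emotions=26):
            super().__init__()
            # (1) Body-aware stream: encodes the cropped person/body region.
            self.body_stream = nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, feat_dim))
            # (2) Global context-aware stream: encodes the whole scene
            # (the paper additionally uses salient-context selection).
            self.global_stream = nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, feat_dim))
            # (3) Local context-aware stream: a simple MLP stand-in for the
            # paper's graph convolutional network over local regions.
            self.local_stream = nn.Sequential(
                nn.Linear(feat_dim, feat_dim), nn.ReLU())
            # Attention weights that balance the individual (body) features
            # against the two kinds of context features.
            self.attn = nn.Sequential(
                nn.Linear(3 * feat_dim, 3), nn.Softmax(dim=-1))
            self.classifier = nn.Linear(feat_dim, num_emotions)

        def forward(self, body_img, scene_img, region_feats):
            fb = self.body_stream(body_img)          # (B, feat_dim)
            fg = self.global_stream(scene_img)       # (B, feat_dim)
            fl = self.local_stream(region_feats.mean(dim=1))  # pool N regions
            # Softmax-normalized weights over the three streams.
            w = self.attn(torch.cat([fb, fg, fl], dim=-1))    # (B, 3)
            fused = w[:, 0:1] * fb + w[:, 1:2] * fg + w[:, 2:3] * fl
            return self.classifier(fused)             # (B, num_emotions)

    # Example usage with random inputs (batch of 2, 8 local regions):
    net = DVCNetSketch()
    logits = net(torch.randn(2, 3, 128, 64),   # body crops
                 torch.randn(2, 3, 224, 224),  # full scenes
                 torch.randn(2, 8, 256))       # per-region features
    print(logits.shape)  # torch.Size([2, 26])

    The soft attention over the three streams reflects the abstract's stated goal of balancing individual features and context features; the actual fusion scheme in the paper may differ.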
