e-space
Manchester Metropolitan University's Research Repository

    Multi-source attention for Unsupervised Domain Adaptation

    Cui, Xia ORCID logoORCID: https://orcid.org/0000-0002-1726-3814 and Bollegala, Danushka (2020) Multi-source attention for Unsupervised Domain Adaptation. In: The 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing (AACL-IJCNLP) 2020, 04 December 2020 - 07 December 2020, Suzhou, China/Online.

    Published Version
    Available under License Creative Commons Attribution.


    Abstract

    We model source selection in multi-source Unsupervised Domain Adaptation (UDA) as an attention-learning problem, where we learn attention over the sources for each given target instance. We first independently learn source-specific classification models, and a relatedness map between source and target domains using pseudo-labelled target domain instances. Next, we learn domain-attention scores over the sources for aggregating the predictions of the source-specific models. Experimental results on two cross-domain sentiment classification datasets show that the proposed method achieves consistently good performance across domains, at times outperforming more complex prior proposals. Moreover, the computed domain-attention scores enable us to find explanations for the predictions made by the proposed method.
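    The aggregation step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each source-specific classifier outputs a class-probability distribution for a target instance, and that per-instance source-target relatedness scores are available (the paper estimates these from pseudo-labelled target data). The function and variable names are hypothetical.

    ```python
    import numpy as np

    def softmax(x):
        """Numerically stable softmax over a 1-D array of scores."""
        e = np.exp(x - np.max(x))
        return e / e.sum()

    def aggregate_predictions(source_probs, relatedness):
        """Attention-weighted combination of source-specific predictions.

        source_probs: (n_sources, n_classes) class distributions predicted
          by the independently trained source-specific classifiers.
        relatedness: (n_sources,) relatedness scores between each source
          domain and the current target instance (assumed given here).
        Returns the (n_classes,) aggregated class distribution.
        """
        attention = softmax(relatedness)   # attention over the sources
        return attention @ source_probs   # weighted sum of predictions

    # Toy example: three source domains, binary sentiment classification.
    probs = np.array([[0.9, 0.1],
                      [0.2, 0.8],
                      [0.6, 0.4]])
    scores = np.array([2.0, 0.5, 1.0])
    aggregated = aggregate_predictions(probs, scores)
    ```

    Because the attention weights sum to one, the aggregated output remains a valid probability distribution, and the weights themselves indicate which source domains drove the prediction — the basis for the explanations the abstract mentions.
    
    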
