e-space
Manchester Metropolitan University's Research Repository

    Self-adaptation for unsupervised domain adaptation

    Cui, Xia (ORCID: https://orcid.org/0000-0002-1726-3814) and Bollegala, Danushka (2019) Self-adaptation for unsupervised domain adaptation. In: Recent Advances in Natural Language Processing (RANLP) 2019, 02 September 2019 - 04 September 2019, Varna, Bulgaria.

    Published Version, available under a Creative Commons Attribution License.


    Abstract

    Lack of labelled training data in the target domain is a common problem in domain adaptation. To overcome this problem, we propose a novel unsupervised domain adaptation method that combines projection-based and self-training-based approaches. Using the labelled data from the source domain, we first learn a projection that maximises the distance between nearest neighbours with opposite labels in the source domain. Next, we project the source domain labelled data using the learnt projection and train a classifier to predict the target class. We then use the trained classifier to predict pseudo labels for the unlabelled target domain data. Finally, we learn a projection for the target domain, as we did for the source domain, using the pseudo-labelled target domain data, maximising the distance between nearest neighbours with opposite pseudo labels. Experiments on a standard benchmark dataset for domain adaptation show that the proposed method consistently outperforms numerous baselines and returns results competitive with state-of-the-art (SOTA) methods, including self-training, tri-training, and neural adaptation approaches.
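
    The pipeline described in the abstract can be sketched in a few lines of Python. The sketch below is illustrative only and is not the authors' implementation: the function names (learn_projection, self_adapt), the use of the top eigenvectors of a scatter matrix built from opposite-label nearest-neighbour differences as the projection, the choice of logistic regression as the classifier, and the n_components parameter are all assumptions introduced here for illustration; the paper's actual objective and optimiser may differ.

# Hypothetical sketch of the described pipeline (not the authors' code).
# Assumption: the projection is taken as the top eigenvectors of the scatter
# matrix of opposite-label nearest-neighbour difference vectors.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.linear_model import LogisticRegression


def learn_projection(X, y, n_components=50):
    """Learn a projection that spreads apart each point and its nearest
    neighbour carrying the opposite label."""
    diffs = []
    for label in np.unique(y):
        same = X[y == label]
        other = X[y != label]
        # Nearest opposite-label neighbour for every point with this label.
        nn = NearestNeighbors(n_neighbors=1).fit(other)
        _, idx = nn.kneighbors(same)
        diffs.append(same - other[idx[:, 0]])
    D = np.vstack(diffs)
    # Scatter matrix of the difference vectors; its leading eigenvectors
    # maximise the summed squared projected distance between the paired points
    # under an orthonormality constraint on the projection.
    S = D.T @ D
    eigvals, eigvecs = np.linalg.eigh(S)          # ascending eigenvalues
    return eigvecs[:, -n_components:]             # top eigenvectors as projection


def self_adapt(X_src, y_src, X_tgt, n_components=50):
    # 1) Learn a projection from the source labels and
    # 2) train a classifier on the projected source data.
    W_src = learn_projection(X_src, y_src, n_components)
    clf = LogisticRegression(max_iter=1000).fit(X_src @ W_src, y_src)
    # 3) Predict pseudo labels for the unlabelled target data.
    y_pseudo = clf.predict(X_tgt @ W_src)
    # 4) Re-learn the projection on the target domain using the pseudo labels,
    #    then retrain the classifier in the target projection space.
    W_tgt = learn_projection(X_tgt, y_pseudo, n_components)
    clf_tgt = LogisticRegression(max_iter=1000).fit(X_tgt @ W_tgt, y_pseudo)
    return W_tgt, clf_tgt

    In the paper's setting, X_src and X_tgt would be feature representations of source and target domain instances; the returned W_tgt and clf_tgt would then be used to classify target domain test data in the adapted space.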
