e-space
Manchester Metropolitan University's Research Repository

    Modelling Multiple Language Learning in a Developmental Cognitive Architecture

    Giorgi, Ioanna, Golosio, Bruno, Esposito, Massimo, Cangelosi, Angelo and Masala, Giovanni L (2021) Modelling Multiple Language Learning in a Developmental Cognitive Architecture. IEEE Transactions on Cognitive and Developmental Systems, 13 (4). pp. 922-933. ISSN 2379-8920

    Accepted Version
    Available under License: In Copyright.


    Abstract

    In this work, we model multiple natural language learning in a developmental neuroscience-inspired architecture. The ANNABELL model (Artificial Neural Network with Adaptive Behaviour Exploited for Language Learning) is a large-scale neural network; however, unlike most deep learning methods that solve natural language processing (NLP) tasks, it does not represent an empirical engineering solution to specific NLP problems. Rather, its organisation complies with findings from cognitive neuroscience, particularly multi-compartment working memory models. The system is trained on a corpus of text-based exchanges of developmental complexity to assess the level of cognitive development required for language acquisition and the robustness achieved in learning four languages simultaneously. The selected languages (Greek, Italian, and Albanian, in addition to English) differ significantly in structure and complexity. The system was first validated on each language alone and then compared with open-ended cumulative training, in which the languages are learned jointly before the system is queried in a random language and in random order. We aimed to assess whether the model could learn the languages together to the same degree of skill as learning each one separately. Moreover, we explored its generalisation skill on multilingual context questions and its ability to elaborate a short text of preschool literature. We verified whether the system could follow a dialogue coherently and cohesively, keeping track of its previous answers and recalling them in subsequent queries. The results show that the architecture developed broad language processing functionalities, with satisfactory performance on each language trained individually, and maintained high accuracy when the languages were acquired cumulatively.
