e-space
Manchester Metropolitan University's Research Repository

    Eye gaze and production accuracy predict English L2 speakers' morphosyntactic learning

    McDonough, Kim, Trofimovich, Pavel, Dao, Phung (ORCID: https://orcid.org/0000-0002-8612-5589) and Dion, Alexandre (2016) Eye gaze and production accuracy predict English L2 speakers' morphosyntactic learning. Studies in Second Language Acquisition, 39 (4). pp. 851-868. ISSN 0272-2631

    Accepted Version. Available under License: In Copyright. Download (291kB)

    Abstract

    This study investigated the relationship between second language (L2) speakers' success in learning a new morphosyntactic pattern and characteristics of one-on-one learning activities, including opportunities to comprehend and produce the target pattern, receive feedback from an interlocutor, and attend to the meaning of the pattern through self- and interlocutor-initiated eye-gaze behaviors. L2 English students (N = 48) were exposed to the transitive construction in Esperanto (e.g., filino mordas pomon [SVO] or pomon mordas filino [OVS] "girl bites apple") through comprehension and production activities with an interlocutor, receiving feedback in the form of recasts for their Esperanto errors. The L2 speakers' interpretation and production of Esperanto transitives were then tested using known and novel lexical items. The results indicated that OVS test performance was predicted by the duration of self-initiated eye gaze to images illustrating the OVS pattern during the comprehension learning activity and by accurate production of OVS sentences during the production learning activity. The findings suggest important roles for eye-gaze behavior and production opportunities in L2 pattern learning.

    Impact and Reach

    Statistics

    Activity Overview (6-month trend): 475 downloads, 258 hits.

    Additional statistics for this record are available via IRStats2.
