e-space
Manchester Metropolitan University's Research Repository

    Towards Socially Intelligent Automated Tutors: Predicting Learning Style Dimensions from Conversational Dialogue

    Adel, N (ORCID: https://orcid.org/0000-0003-4449-7410), Latham, AM (ORCID: https://orcid.org/0000-0002-8410-7950) and Crockett, K (ORCID: https://orcid.org/0000-0003-1941-6201) (2017) Towards Socially Intelligent Automated Tutors: Predicting Learning Style Dimensions from Conversational Dialogue. In: 13th IEEE International Conference on Ubiquitous Intelligence and Computing (UIC 2016), 18 July 2016 - 21 July 2016, Toulouse, France.

    Accepted Version available for download (256kB).

    Abstract

    Conversational Intelligent Tutoring Systems (CITS) that automatically adapt to learning styles (LS) can improve learning; however, current modelling of LS has ignored Neutral learners. This paper presents research examining the ability of data mining algorithms to predict LS dimensions from behaviour captured during natural language tutorials with Oscar CITS. Two datasets, 2ClassBDS and 3ClassBDS, were cleaned and prepared for the data mining task of predicting student LS. Each dataset comprised four sub-datasets representing the four Felder-Silverman LS dimensions. 3ClassBDS included a third Neutral class describing individuals with a balance of LS preferences. Naïve Bayes, Decision Tree, Lazy Learning and Neural Network algorithms were applied to each dataset, and parameters were adjusted to improve prediction accuracies. The 2ClassBDS results show good prediction, with decision trees (Simple CART) achieving accuracies of 81.33-86.66%. For 3ClassBDS, results were mixed, with the J48 algorithm achieving 56-73% accuracy, indicating that further work and data are needed.
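    The abstract frames LS prediction as a supervised classification task over behaviour features. As a minimal illustration of the two-class case, the sketch below implements a Gaussian Naïve Bayes classifier (one of the algorithm families named above) from scratch on synthetic data; the two features and class labels are invented placeholders, not the actual Oscar CITS behaviour variables, and the paper's reported experiments were not produced with this code.

    ```python
    import math
    import random

    def fit_gnb(X, y):
        """Fit Gaussian Naive Bayes: per-class prior, feature means and variances."""
        model = {}
        for c in set(y):
            rows = [x for x, label in zip(X, y) if label == c]
            prior = len(rows) / len(X)
            cols = list(zip(*rows))
            means = [sum(col) / len(rows) for col in cols]
            variances = [max(sum((v - m) ** 2 for v in col) / len(rows), 1e-9)
                         for col, m in zip(cols, means)]
            model[c] = (prior, means, variances)
        return model

    def log_gaussian(x, mean, var):
        # Log-density of a univariate Gaussian at x.
        return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

    def predict(model, x):
        # Pick the class maximising log prior + sum of per-feature log-likelihoods.
        def score(c):
            prior, means, variances = model[c]
            return math.log(prior) + sum(log_gaussian(v, m, s)
                                         for v, m, s in zip(x, means, variances))
        return max(model, key=score)

    # Synthetic two-class data standing in for one Felder-Silverman dimension:
    # "Visual" learners cluster near (2.0, 1.0), "Verbal" learners near (5.0, 3.0).
    random.seed(0)
    X = ([[random.gauss(2.0, 0.5), random.gauss(1.0, 0.3)] for _ in range(50)] +
         [[random.gauss(5.0, 0.5), random.gauss(3.0, 0.3)] for _ in range(50)])
    y = ["Visual"] * 50 + ["Verbal"] * 50

    model = fit_gnb(X, y)
    accuracy = sum(predict(model, x) == label for x, label in zip(X, y)) / len(X)
    print(f"training accuracy: {accuracy:.2f}")
    ```

    The paper's Simple CART and J48 runs use decision trees rather than Naïve Bayes, but the overall pipeline is the same: extract per-student behaviour features, train a classifier per LS dimension, and evaluate prediction accuracy.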
