e-space
Manchester Metropolitan University's Research Repository

    Quantification of explainability in black box models using complexity measures

    Nneke, Ngozi, Lloyd, Huw and Crockett, Keeley (ORCID: https://orcid.org/0000-0003-1941-6201) (2023) Quantification of explainability in black box models using complexity measures. In: 2023 15th International Conference on Innovations in Information Technology (IIT), 14 November 2023 - 15 November 2023, College of Information Technology (CIT), United Arab Emirates University.

    Accepted Version
    Available under License In Copyright.


    Abstract

    As a result of the rapid growth of explainability methods, there is significant interest, driven by industry, in developing methods for the quantitative evaluation of such explanations. The availability of standard explainability evaluation methods would make it possible to develop models suited to different stakeholders in different use cases. To address this issue, we propose three measures of the complexity of explanations, based on linear correlation, monotonicity, and ϕK. We evaluate these measures on three tabular datasets (Ames House Price, Auto Price, and Wind) and investigate how they vary with model accuracy. Our results show that model accuracy varies with the complexity measures across the datasets. These variations indicate that models with the same accuracy but less complex explanations can be obtained by varying the hyperparameters. We observe a trade-off between the complexity measures and model accuracy, evidenced by Pareto fronts. We suggest that our metrics could be used to develop multi-objective optimization methods for machine learning models with tunable accuracy and simplicity of explanation.
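The abstract does not give the exact formulations of the three measures, but the first two are based on standard statistics. As an illustrative sketch only (not the authors' method), the snippet below computes a linear-correlation score (Pearson's r) and a monotonicity score (Spearman's ρ, i.e. Pearson's r on ranks) between hypothetical feature values and hypothetical explanation attributions; a ϕK-based measure would require the phik package and is omitted here.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def spearman(xs, ys):
    """Spearman rank correlation: Pearson's r applied to the ranks."""
    def ranks(vs):
        order = sorted(range(len(vs)), key=vs.__getitem__)
        r = [0.0] * len(vs)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    return pearson(ranks(xs), ranks(ys))

# Hypothetical data: feature values and explanation attributions that
# follow the feature roughly linearly, with small alternating noise.
x = [0.1 * i for i in range(50)]
attr = [2.0 * v + ((-1) ** i) * 0.3 for i, v in enumerate(x)]

linear_score = abs(pearson(x, attr))    # near 1 => close to linear
mono_score = abs(spearman(x, attr))     # near 1 => close to monotonic
print(round(linear_score, 3), round(mono_score, 3))
```

A nearly linear, nearly monotonic attribution pattern like this one scores close to 1 on both measures; a complex, non-monotonic explanation would score lower, which matches the abstract's framing of these quantities as complexity (simplicity) measures.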
