Manchester Metropolitan University's Research Repository

    Non-invasive health prediction from visually observable features

    Khong, Fan Yi, Connie, Tee, Goh, Michael Kah Ong, Wong, Li Pei, Teh, Pin Shen (ORCID: https://orcid.org/0000-0002-0607-2617) and Choo, Ai Ling (2022) Non-invasive health prediction from visually observable features. F1000Research, 10. p. 918. ISSN 2046-1402

    Published Version
    Available under License Creative Commons Attribution.



    Background: The unprecedented development of artificial intelligence has revolutionised the healthcare industry. In the next generation of healthcare systems, self-diagnosis will be pivotal to personalised healthcare services. During the COVID-19 pandemic, new screening and diagnostic approaches such as mobile health are well positioned to reduce disease spread and overcome geographical barriers. This paper presents a non-invasive screening approach that predicts a person's health from visually observable features using machine learning techniques. Images such as the face and skin surface of patients are acquired with cameras or mobile devices and analysed to derive clinical reasoning and a prediction of the person's health.

    Methods: Specifically, a two-level classification approach is presented. The proposed hierarchical model chooses a class by training a binary classifier at each node of the hierarchy. Prediction is then made using a class-specific reduced feature set.

    Results: Testing accuracies of 86.87% and 76.84% are reported for the first- and second-level classification, respectively. Empirical results demonstrate that the proposed approach yields favourable prediction results while greatly reducing computational time.

    Conclusions: The study suggests that it is possible to predict a person's health condition from his or her facial appearance using cost-effective machine learning approaches.
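    The two-level scheme described in the Methods can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the nearest-centroid classifiers, the synthetic feature vectors, and the hypothetical feature-index subsets below are all assumptions standing in for the paper's unspecified models and face/skin features.

    ```python
    # Sketch of a two-level hierarchical classifier (illustrative assumptions only).
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for visually observable feature vectors.
    n, d = 200, 20
    X = rng.normal(size=(n, d))
    y1 = rng.integers(0, 2, size=n)   # level-1 label, e.g. a binary health split
    y2 = rng.integers(0, 3, size=n)   # level-2 label, e.g. a specific condition

    # Level 1: a binary classifier at the root node of the hierarchy
    # (here, nearest centroid over the full feature set).
    centroids1 = np.stack([X[y1 == c].mean(axis=0) for c in (0, 1)])

    # Hypothetical class-specific reduced feature subsets (index lists).
    reduced = {0: [0, 2, 4, 6], 1: [1, 3, 5, 7]}

    # Level 2: one classifier per branch, trained only on that branch's
    # samples restricted to its reduced feature set.
    centroids2 = {
        c: np.stack([
            X[(y1 == c) & (y2 == k)][:, reduced[c]].mean(axis=0)
            for k in (0, 1, 2)
        ])
        for c in (0, 1)
    }

    def predict(x):
        """Route a sample down the hierarchy: level-1 branch, then level-2 class."""
        branch = int(np.argmin(np.linalg.norm(centroids1 - x, axis=1)))
        xr = x[reduced[branch]]
        condition = int(np.argmin(np.linalg.norm(centroids2[branch] - xr, axis=1)))
        return branch, condition

    branch, condition = predict(X[0])
    ```

    The point of the structure is that the second-level model never sees the full feature vector: each branch keeps only its own reduced subset, which is what yields the computational savings the abstract reports.
    
    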


