e-space
Manchester Metropolitan University's Research Repository

    Objective falls risk assessment using markerless motion capture and representational machine learning

    Maudsley-Barton, Sean (ORCID: https://orcid.org/0000-0003-0289-0783) and Yap, Moi Hoon (ORCID: https://orcid.org/0000-0001-7681-4287) (2024) Objective falls risk assessment using markerless motion capture and representational machine learning. Sensors, 24 (14). 4593.

    Published Version, available under a Creative Commons Attribution License. Download (2MB).

    Abstract

    Falls are a major issue for those over the age of 65 worldwide. Objective assessment of falls risk is rare in clinical practice; the most common assessment methods are time-consuming observational tests (clinical tests), so computer-aided diagnosis could be of considerable help. A popular clinical test for falls risk is the five-times sit-to-stand. The time taken to complete the test is the most commonly used metric for identifying the most at-risk patients, but tracking the movement of skeletal joints can provide much richer insights. We use markerless motion capture, allied with a representational model, to identify those at risk of falls. Our method uses an LSTM autoencoder to derive a distance measure and, using this measure, introduces a new scoring system that places individuals with differing falls risks on a continuous scale. Evaluating our method on the KINECAL dataset, we achieved an accuracy of 0.84 in identifying those at elevated falls risk. In addition to identifying potential fallers, our method could find applications in rehabilitation, which aligns with the goals of the KINECAL dataset: it contains recordings of 90 individuals performing 11 movements used in clinical assessments and is labelled to disambiguate age-related decline from falls risk.
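
    The pipeline the abstract describes can be sketched as follows: an LSTM autoencoder is trained to reconstruct skeletal joint sequences, its reconstruction error serves as the distance measure, and that distance is mapped onto a continuous risk scale. The PyTorch sketch below illustrates the idea only; the class names, dimensions, hyperparameters, and the percentile-based scoring are assumptions, not the authors' published implementation.

    import torch
    import torch.nn as nn

    class LSTMAutoencoder(nn.Module):
        """Sequence autoencoder for joint-coordinate recordings (illustrative)."""

        def __init__(self, n_features: int = 75, hidden_size: int = 64):
            # n_features assumes a Kinect-style skeleton: 25 joints x 3 coordinates.
            super().__init__()
            self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
            self.decoder = nn.LSTM(hidden_size, hidden_size, batch_first=True)
            self.output = nn.Linear(hidden_size, n_features)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, time, n_features) sequence of joint coordinates.
            _, (hidden, _) = self.encoder(x)          # compress to a latent state
            latent = hidden[-1]                       # (batch, hidden_size)
            repeated = latent.unsqueeze(1).repeat(1, x.size(1), 1)
            decoded, _ = self.decoder(repeated)       # unroll the latent over time
            return self.output(decoded)               # reconstructed sequence

    def movement_distance(model: nn.Module, seq: torch.Tensor) -> float:
        # Reconstruction error as a distance measure: an autoencoder trained on
        # low-risk movement should reconstruct atypical movement poorly.
        model.eval()
        with torch.no_grad():
            return nn.functional.mse_loss(model(seq), seq).item()

    def continuous_risk_score(distance: float, reference: list[float]) -> float:
        # One plausible mapping (an assumption, not the paper's definition) from
        # distance to a continuous scale: the percentile of an individual's
        # distance within a reference population of recordings.
        return sum(d <= distance for d in reference) / len(reference)

    Under this reading, a higher score means the recorded movement lies further from the movement distribution the autoencoder was trained on, which is what allows individuals with differing falls risks to be ordered on a single scale.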

    Impact and Reach

    Statistics (6-month trend): 67 downloads, 20 hits. Additional statistics for this record are available via IRStats2.
