Cunningham, R., Harding, P. and Loram, I. (2017) Deep residual networks for quantification of muscle fiber orientation and curvature from ultrasound images. In: 21st Annual Conference, Medical Image Understanding and Analysis (MIUA 2017), 11 July 2017 - 13 July 2017, Edinburgh, UK.
Accepted Version
Available under License In Copyright.
Abstract
This paper concerns fully automatic and objective measurement of human skeletal muscle fiber orientation directly from standard B-mode ultrasound images using deep residual networks (ResNets) and convolutional neural networks (CNNs). Fiber orientation and length are related to the active and passive states of force production within muscle. There is currently no non-invasive way to measure force directly from muscle. Measurement of forces and other contractile parameters such as muscle length change, thickness, and tendon length is not only important for understanding healthy muscle; such information has also contributed to the understanding, diagnosis, monitoring, targeting and treatment of diseases ranging from myositis to stroke and motor neurone disease (MND). We applied well-established deep learning methods to ultrasound data recorded from 19 healthy participants (5 female, ages 30 ± 7.7) and achieved state-of-the-art accuracy in predicting fiber orientation directly from ultrasound images of the calf muscles. First, we used a previously developed segmentation technique to extract a region of interest within the gastrocnemius muscle. Then we asked an expert to annotate the main line of fiber orientation in 4 × 4 partitions of 400 normalized images. A linear model was then applied to the annotations to regulate and recover the orientation field for each image. Finally, we applied a CNN and a ResNet to predict the fiber orientation in each image. With leave-one-participant-out cross-validation and dropout as a regulariser, we demonstrated state-of-the-art performance, recovering the fiber orientation with an average error of just 2°.
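To make the regression-and-validation step described above concrete, the following is a minimal sketch in PyTorch: a ResNet-18 backbone (an assumption; the paper's exact architecture is not specified here) with a dropout-regularised linear head predicting one orientation angle per cell of the 4 × 4 grid, evaluated with leave-one-participant-out cross-validation. The loss, optimiser, and all hyperparameters are illustrative assumptions, not the authors' configuration.

```python
# Sketch of the setup described in the abstract: a ResNet regressor
# mapping a normalized ultrasound ROI to one fiber-orientation angle
# per cell of a 4 x 4 grid, with dropout as a regulariser and
# leave-one-participant-out cross-validation. Details are assumptions.
import torch
import torch.nn as nn
from torchvision.models import resnet18


class FiberOrientationNet(nn.Module):
    def __init__(self, grid_cells: int = 16, p_drop: float = 0.5):
        super().__init__()
        backbone = resnet18(weights=None)
        # Ultrasound ROIs are single-channel; adapt the first conv layer.
        backbone.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2,
                                   padding=3, bias=False)
        n_features = backbone.fc.in_features
        # Dropout, then one angle (in degrees) per 4 x 4 grid cell.
        backbone.fc = nn.Sequential(
            nn.Dropout(p_drop),
            nn.Linear(n_features, grid_cells),
        )
        self.backbone = backbone

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.backbone(x)


def train_leave_one_out(images, angles, participant_ids, epochs=10):
    """Hold out each participant in turn; return mean angular error.

    images: list of (1, H, W) float tensors; angles: list of 16-dim
    tensors of per-cell angles in degrees; participant_ids: list of ids.
    """
    fold_errors = []
    for held_out in sorted(set(participant_ids)):
        train_idx = [i for i, p in enumerate(participant_ids) if p != held_out]
        test_idx = [i for i, p in enumerate(participant_ids) if p == held_out]
        model = FiberOrientationNet()
        opt = torch.optim.Adam(model.parameters(), lr=1e-4)
        loss_fn = nn.L1Loss()  # mean absolute angular error, in degrees
        model.train()
        for _ in range(epochs):
            for i in train_idx:
                opt.zero_grad()
                pred = model(images[i].unsqueeze(0))
                loss = loss_fn(pred, angles[i].unsqueeze(0))
                loss.backward()
                opt.step()
        model.eval()
        with torch.no_grad():
            err = torch.stack([
                (model(images[i].unsqueeze(0)) - angles[i]).abs().mean()
                for i in test_idx
            ]).mean()
        fold_errors.append(err.item())
    return sum(fold_errors) / len(fold_errors)  # average error across folds
```

A mean absolute error in degrees is used as the fold score here because the abstract reports accuracy as an average angular error (2°); the paper's actual training loss may differ.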