Manchester Metropolitan University's Research Repository

    Analyzing fibrous tissue pattern in fibrous dysplasia bone images using deep R-CNN networks for segmentation

    Saranya, A, Kottursamy, Kottilingam, AlZubi, Ahmad Ali and Bashir, Ali Kashif (ORCID: https://orcid.org/0000-0001-7595-2522) (2022) Analyzing fibrous tissue pattern in fibrous dysplasia bone images using deep R-CNN networks for segmentation. Soft Computing, 26 (16). pp. 7519-7533. ISSN 1432-7643

    Accepted Version


    Predictive health-monitoring systems help to detect threats to human health at an early stage, and evolving deep learning techniques enable fast, reliable feedback in medical image analysis. Fibrous dysplasia (FD) is a genetic disorder triggered by a mutation in the guanine-nucleotide-binding protein with alpha-stimulatory activity during human bone genesis. It gradually occupies the bone marrow and converts bone cells into fibrous tissue, weakening the bone structure and potentially leading to permanent disability. This paper studies techniques for analyzing FD bone images with deep networks. In addition, a linear regression model is fitted to predict bone abnormality levels from the observed coefficients. Modern image processing begins with various image filters, which describe the edges, shades, and texture values of the receptive field. Different segmentation and edge-detection mechanisms are applied to locate tumors, lesions, and fibrous tissue in the bone image, and the fibrous region is extracted using a region-based convolutional neural network (R-CNN). The segmented results are compared on their accuracy metrics, and the segmentation loss decreases with each iteration. The overall loss is 0.24% and the accuracy is 99%; segmenting the masked region achieves 98% accuracy, and building the bounding boxes achieves 99% accuracy.
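    The abstract mentions fitting a linear regression model to predict bone abnormality levels from observed coefficients. As a minimal sketch of that idea, the snippet below fits a one-variable ordinary-least-squares line in pure Python; the feature values and abnormality scores are hypothetical placeholders, not data from the paper, and the exact predictors the authors use are not specified here.

    ```python
    # Minimal sketch: ordinary least squares for y = a + b*x, illustrating
    # the kind of linear regression the paper uses to predict abnormality
    # levels. All data below is hypothetical, for illustration only.

    def fit_ols(xs, ys):
        """Return intercept a and slope b minimizing squared error."""
        n = len(xs)
        mean_x = sum(xs) / n
        mean_y = sum(ys) / n
        # Closed-form OLS estimates for a single predictor.
        b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
        a = mean_y - b * mean_x
        return a, b

    # Hypothetical image-derived feature (e.g. fibrous-tissue area fraction)
    # paired with hypothetical abnormality scores.
    features = [0.1, 0.2, 0.3, 0.4, 0.5]
    scores = [1.1, 1.9, 3.1, 3.9, 5.0]

    a, b = fit_ols(features, scores)
    prediction = a + b * 0.35  # predicted abnormality level for a new image
    ```

    With coefficients estimated this way, each new segmented image yields a feature value that maps directly to a predicted abnormality level.
    
    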
