Islam, Md Robiul, Nahiduzzaman, Md, Goni, Md Omaer Faruq, Sayeed, Abu, Anower, Md Shamim, Ahsan, Mominul and Haider, Julfikar ORCID: https://orcid.org/0000-0001-7010-8285 (2022) Explainable Transformer-Based Deep Learning Model for the Detection of Malaria Parasites from Blood Cell Images. Sensors, 22 (12). 4358.
Published Version. Available under License Creative Commons Attribution.
Abstract
Malaria is a life-threatening disease caused by Plasmodium parasites transmitted through the bites of infected female Anopheles mosquitoes. The parasites multiply in the victim’s blood cells and, if the infection is not treated at an early stage, it can be fatal. Microscopy is the standard diagnostic procedure: blood samples are collected from the patient, and the parasites and red blood cells are counted. However, microscopy is time-consuming and can produce erroneous results in some cases. Given the recent success of machine learning and deep learning in medical diagnosis, it is possible to reduce diagnosis costs and improve overall detection accuracy compared with traditional microscopy. This paper proposes a multi-headed attention-based transformer model to detect the malaria parasite from blood cell images. To demonstrate the effectiveness of the proposed model, the gradient-weighted class activation map (Grad-CAM) technique was applied to generate heatmaps highlighting the regions of an image to which the model paid the most attention. The proposed model achieved a testing accuracy, precision, recall, F1-score, and AUC of 96.41%, 96.99%, 95.88%, 96.44%, and 99.11%, respectively, on the original malaria parasite dataset, and 99.25%, 99.08%, 99.42%, 99.25%, and 99.99%, respectively, on the modified dataset. Various hyperparameters were fine-tuned to obtain optimum results, and the proposed method was compared with state-of-the-art (SOTA) methods for malaria parasite detection, outperforming the existing approaches.
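The abstract gives no implementation details, but its two core ideas, a multi-head-attention classifier over blood-cell image patches and Grad-CAM heatmaps for explanation, can be illustrated with a short, hedged PyTorch sketch. All sizes below (image size, patch size, embedding dimension, number of heads and encoder layers) and the choice of the patch-embedding layer as the Grad-CAM target are illustrative assumptions, not the configuration reported in the paper.

# Minimal PyTorch sketch (illustrative only): a multi-head-attention
# transformer classifier for blood-cell images plus a Grad-CAM-style heatmap.
# Hyperparameters below are assumptions, not the paper's values.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MalariaTransformer(nn.Module):
    def __init__(self, img_size=64, patch=8, dim=128, heads=4, depth=2, classes=2):
        super().__init__()
        n_patches = (img_size // patch) ** 2
        # Split the image into non-overlapping patches and embed each one.
        self.patch_embed = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        self.cls_token = nn.Parameter(torch.zeros(1, 1, dim))
        self.pos_embed = nn.Parameter(torch.zeros(1, n_patches + 1, dim))
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           dim_feedforward=4 * dim, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.head = nn.Linear(dim, classes)  # parasitized vs. uninfected

    def forward(self, x):
        feats = self.patch_embed(x)                      # (B, dim, H', W')
        self.last_feats = feats                          # kept for Grad-CAM
        tokens = feats.flatten(2).transpose(1, 2)        # (B, N, dim)
        cls = self.cls_token.expand(x.size(0), -1, -1)
        tokens = torch.cat([cls, tokens], dim=1) + self.pos_embed
        return self.head(self.encoder(tokens)[:, 0])     # classify from [CLS]

def grad_cam(model, image, target_class):
    # Grad-CAM over the patch-embedding feature map (assumed target layer):
    # weight each channel by its average gradient, sum, rectify, normalize.
    logits = model(image)
    model.last_feats.retain_grad()
    model.zero_grad()
    logits[0, target_class].backward()
    act, grad = model.last_feats, model.last_feats.grad
    weights = grad.mean(dim=(2, 3), keepdim=True)
    cam = F.relu((weights * act).sum(dim=1, keepdim=True))
    cam = cam / (cam.max() + 1e-8)
    return F.interpolate(cam, size=image.shape[-2:], mode="bilinear",
                         align_corners=False)[0, 0]      # heatmap in [0, 1]

model = MalariaTransformer()
img = torch.randn(1, 3, 64, 64)                          # stand-in cell image
heatmap = grad_cam(model, img, target_class=1)

In practice such a heatmap would be overlaid on the original cell image to visualize which regions drove the parasitized/uninfected decision, as described in the abstract.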