Hussein, Aziza I, Hassan, Mennatall Essam, Ekpo, Sunday ORCID: https://orcid.org/0000-0001-9219-3759, Alyami, Ghadah S, Elias, Fanuel, Salah, Ibrahim and Mabrook, M Mourad (2024) Cloud Detection and Removal from RGB Images using U-Net Semantic Segmentation and CloudGAN Models. In: International Adaptive and Sustainable Science, Engineering and Technology (ASSET) Conference 2024, 16 July 2024 - 18 July 2024, Manchester. (In Press)
Accepted Version
File not available for download. Available under License: In Copyright.
Abstract
Satellite-based imagery provides an indispensable tool for many applications, ranging from environmental surveillance to urban development and managing natural disasters. Nevertheless, the presence of clouds can often impede the usefulness of these images by veiling significant details. In the current study, we propose an innovative strategy for identifying and eliminating clouds within RGB satellite images employing deep learning techniques. This involves using a Cloud Generative Adversarial Network (CloudGAN) to carry out image inpainting tasks and U-Net for semantic segmentation. The proposed methodology yields encouraging outcomes, showcasing its ability to discern and eradicate clouds effectively, thereby enhancing the clarity and practicality of satellite imagery. The proposed approach demonstrates superior cloud removal compared to traditional methods, achieving a remarkable overall accuracy of 95% in both cloud detection and removal. This underscores its effectiveness in enhancing image quality and utility. The qualitative assessment confirms the models' ability to produce high-quality, cloud-free images, preserving essential features and details faithfully. Additionally, the inpainted images closely resemble the ground truth, affirming the accuracy of the models in cloud removal.
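To make the described two-stage pipeline concrete, the sketch below illustrates one plausible way to chain a U-Net-style segmenter (cloud detection) with a GAN generator (inpainting). This is not the authors' implementation: the network sizes, layer choices, and the `remove_clouds` composition step are illustrative assumptions only.

```python
# Hypothetical sketch of a segment-then-inpaint pipeline like the one described
# in the abstract. All architectures and names here are illustrative stand-ins.
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Minimal encoder-decoder stand-in for the U-Net cloud segmenter."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                                 nn.MaxPool2d(2))
        self.dec = nn.Sequential(nn.Upsample(scale_factor=2),
                                 nn.Conv2d(16, 1, 3, padding=1))

    def forward(self, x):
        # Per-pixel cloud probability in [0, 1]
        return torch.sigmoid(self.dec(self.enc(x)))

class TinyGenerator(nn.Module):
    """Minimal stand-in for a CloudGAN-style inpainting generator."""
    def __init__(self):
        super().__init__()
        # Input: RGB image with cloudy pixels blanked out + binary cloud mask (4 channels)
        self.net = nn.Sequential(nn.Conv2d(4, 32, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(32, 3, 3, padding=1))

    def forward(self, x):
        return torch.sigmoid(self.net(x))

def remove_clouds(rgb, segmenter, generator, threshold=0.5):
    """Detect clouds with the segmenter, then inpaint only the masked regions."""
    mask = (segmenter(rgb) > threshold).float()       # 1 where a cloud is detected
    masked_rgb = rgb * (1.0 - mask)                    # blank out cloudy pixels
    inpainted = generator(torch.cat([masked_rgb, mask], dim=1))
    return rgb * (1.0 - mask) + inpainted * mask       # keep clear pixels, fill clouds

# Usage with a random image batch of shape (N, 3, H, W)
rgb = torch.rand(1, 3, 64, 64)
cloud_free = remove_clouds(rgb, TinyUNet(), TinyGenerator())
```

In a real system the segmenter and generator would each be trained separately (the generator adversarially against a discriminator), and compositing only the masked regions back into the original image preserves the cloud-free pixels untouched.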