Sultan, Saba, Javed, Ali, Irtaza, Aun, Dawood, Hassan, Dawood, Hussain and Bashir, Ali Kashif ORCID: https://orcid.org/0000-0001-7595-2522 (2019) A hybrid egocentric video summarization method to improve the healthcare for Alzheimer patients. Journal of Ambient Intelligence and Humanized Computing, 10 (10). pp. 4197-4206. ISSN 1868-5137
Accepted Version
Available under License In Copyright.
Abstract
Alzheimer's patients have difficulty remembering the identity of people and performing daily activities. This paper presents a hybrid method that generates an egocentric video summary of important people, objects, and medicines to help Alzheimer's patients recall their faded memories. Lifelogging video data analysis is used to aid memory recall; however, the massive volume of lifelogging data makes it challenging to select the content most relevant for reminding the patient. To address this challenge, a static video summarization approach is applied to select the key-frames that are most relevant for recalling the faded memories of Alzheimer's patients. The proposed method consists of three main modules: face, object, and medicine recognition. Histogram of oriented gradients (HOG) features are used to train a multi-class SVM for face recognition. SURF descriptors are extracted from the input video frames and used to find corresponding points between objects in the input video and reference objects stored in a database. Morphological operators are applied, followed by optical character recognition, to recognize and tag medicines for Alzheimer's patients. The performance of the proposed system is evaluated on 18 real-world homemade videos. Experimental results demonstrate the effectiveness of the proposed system in providing the most relevant content to enhance the memory of Alzheimer's patients.
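To illustrate the face-recognition module described in the abstract, the following is a minimal Python sketch of HOG features feeding a multi-class SVM. It assumes pre-cropped grayscale face images and uses scikit-image and scikit-learn as stand-in libraries; the paper does not specify a toolchain, and the function and variable names (hog_features, train_face_classifier, train_faces, train_labels) are illustrative, not taken from the authors' implementation.

# Minimal sketch (not the authors' code): HOG features + multi-class SVM for
# face recognition, assuming face crops are already extracted and resized.
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def hog_features(face_gray):
    # face_gray: 2-D grayscale face crop, e.g. 128x128 pixels
    return hog(face_gray, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm='L2-Hys')

def train_face_classifier(train_faces, train_labels):
    # train_faces: list of grayscale face crops; train_labels: person identities
    # (hypothetical names standing in for the lifelog training data)
    X = np.array([hog_features(f) for f in train_faces])
    clf = SVC(kernel='linear', decision_function_shape='ovr')  # one-vs-rest multi-class SVM
    clf.fit(X, train_labels)
    return clf

def recognize_face(clf, face_gray):
    # Predict the identity label for a new face crop
    return clf.predict([hog_features(face_gray)])[0]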