Duan, Lian, Cheng, Jianquan (ORCID: https://orcid.org/0000-0001-9778-9009), Huang, Shiqi, Long, Xuancheng, Li, Ling and Liu, Wenchao
(2025)
Drone-based Gait Spatial Analysis Across Age Groups in an Urban Park Using a Deep Learning Approach.
In: 2025 IEEE Conference on Serious Games and Applications for Health (SeGAH), pp. 1-8. Presented at the IEEE International Conference on Serious Games and Applications for Health (SeGAH), 6-8 August 2025, Manchester, UK.
Accepted Version. Available under License: Creative Commons Attribution.
Abstract
This study introduces a pioneering drone-based spatial gait analysis approach for scalable and ecologically valid public health monitoring. Leveraging a processing pipeline integrating Drone-YOLO for object detection, YOLOv11-pose for pose estimation, ByteTrack for multi-object tracking, and geographic coordinate mapping, we achieved high-fidelity extraction of pedestrian keypoints and trajectories from drone imagery. Rigorous testing on 1,679 pedestrians across age groups in Qingxiu Mountain Park, Nanning, China, revealed significant inter-group differences in gait speed, trunk inclination, and stride-length variability. Spatial analysis highlighted elevated gait speeds in the open, flat areas of the Main Gate Plaza, contrasting with significantly reduced speeds in the congested, topographically complex Flower Plaza. "Cold spots" of gait speed, coinciding with high pedestrian density and uneven terrain, were identified in the Flower Plaza, while "hot spots" were observed in the open pathways of the Main Gate Plaza. These findings demonstrate the transformative potential of drone-based spatial gait analysis for public health surveillance, evidence-based urban planning, and understanding the complex interplay between human movement, health proxies, and urban environments. Future research should enhance pipeline robustness in occluded urban settings and explore multi-modal sensor fusion to refine dynamic human behavior analysis from aerial perspectives.
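To make the pose-estimation and tracking stage of the abstract concrete, the minimal sketch below uses the public Ultralytics API, which bundles pose models and a ByteTrack tracker. This is not the authors' pipeline: the paper pairs Drone-YOLO detections with YOLOv11-pose, whereas here a single pretrained pose model stands in for that combination, and the weights file, video path, and frame rate are illustrative placeholders rather than values from the study.

```python
# Sketch: pose estimation + ByteTrack tracking on drone video, then a crude
# per-track speed estimate. Placeholder inputs; not the paper's exact setup.
from collections import defaultdict

from ultralytics import YOLO

FPS = 30.0                        # assumed drone video frame rate (placeholder)
model = YOLO("yolo11n-pose.pt")   # pretrained pose model (placeholder weights)

tracks = defaultdict(list)        # track_id -> list of (frame_idx, x, y) hip midpoints

results = model.track(
    source="drone_footage.mp4",   # placeholder input video
    tracker="bytetrack.yaml",     # ByteTrack config shipped with Ultralytics
    persist=True,
    stream=True,                  # yield one Results object per frame
)

for frame_idx, r in enumerate(results):
    if r.boxes.id is None:        # no confirmed tracks in this frame
        continue
    ids = r.boxes.id.int().tolist()
    kpts = r.keypoints.xy         # (n_people, 17, 2) COCO keypoints in pixels
    for tid, kp in zip(ids, kpts):
        left_hip, right_hip = kp[11], kp[12]   # COCO hip indices 11 and 12
        mid = (left_hip + right_hip) / 2.0
        tracks[tid].append((frame_idx, float(mid[0]), float(mid[1])))

# Crude per-track speed in pixels/second; the study instead maps pixel
# coordinates to geographic coordinates before computing gait speed.
for tid, pts in tracks.items():
    if len(pts) < 2:
        continue
    (f0, x0, y0), (f1, x1, y1) = pts[0], pts[-1]
    dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    speed = dist / ((f1 - f0) / FPS)
    print(f"track {tid}: {speed:.1f} px/s over {len(pts)} frames")
```

In the study's full pipeline, the pixel trajectories would additionally be georeferenced (e.g. via the drone's camera pose) so that speeds come out in metres per second, and gait metrics such as trunk inclination and stride-length variability would be derived from the per-frame keypoints rather than the hip midpoint alone.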