Hewitt, BM, Yap, Moi, Hodson-Tole, EF, Kennerley, AJ, Sharp, PS and Grant, Robyn (2017) A novel automated rodent tracker (ART), demonstrated in a mouse model of amyotrophic lateral sclerosis. Journal of Neuroscience Methods, 300. pp. 147-156. ISSN 0165-0270
Accepted Version
Available under License Creative Commons Attribution Non-commercial No Derivatives.
Abstract
Background: Generating quantitative metrics of rodent locomotion and general behaviours from video footage is important in behavioural neuroscience studies. However, there is not yet a free software system that can process large amounts of video data with minimal user intervention.

New method: Here we propose a new, automated rodent tracker (ART) that uses a simple rule-based system to quickly and robustly track rodent nose and body points with minimal user input. Tracked points can then be used to identify behaviours, approximate body size and provide locomotion metrics such as speed and distance.

Results: ART was demonstrated here on video recordings of a SOD1 mouse model of amyotrophic lateral sclerosis, aged 30, 60, 90 and 120 days. Results showed a robust decline in locomotion speeds, as well as a reduction in object exploration and forward movement, with an increase in the time spent still. Body size approximations (centroid width) showed a significant decrease from P30.

Comparison with existing method(s): ART performed to very similar accuracy as manual tracking and Ethovision (a commercially available alternative), with average differences in coordinate points of 0.6 and 0.8 mm, respectively. However, it required much less user intervention than Ethovision (6 as opposed to 30 mouse clicks) and worked robustly over more videos.

Conclusions: ART provides an open-source option for behavioural analysis of rodents, performing to the same standards as commercially available software. It can be considered a validated and accessible alternative for researchers for whom non-invasive quantification of natural rodent behaviour is desirable.
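The abstract notes that tracked points can provide locomotion metrics such as speed and distance. A minimal sketch of how such metrics could be derived from per-frame body-centroid coordinates is shown below; the function name, input format and frame-rate handling are illustrative assumptions, not ART's actual implementation:

```python
import math

def locomotion_metrics(points, fps):
    """Hypothetical sketch: total distance travelled and mean speed
    from a sequence of per-frame (x, y) centroid coordinates (e.g. in mm).
    `points` and `fps` are assumed names, not part of ART's API."""
    # Sum the Euclidean step lengths between consecutive frames.
    distance = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        distance += math.hypot(x1 - x0, y1 - y0)
    # Elapsed time spans (n - 1) inter-frame intervals.
    duration = (len(points) - 1) / fps
    mean_speed = distance / duration if duration > 0 else 0.0
    return distance, mean_speed

# Example: a mouse moves 5 mm in the first interval, then stays still.
dist, speed = locomotion_metrics([(0.0, 0.0), (3.0, 4.0), (3.0, 4.0)], fps=2.0)
```

The same per-frame coordinates could also feed the accuracy comparison described above, by averaging Euclidean differences between ART's points and those from manual tracking or Ethovision.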