Explaining the unique nature of individual gait patterns with deep learning
- Authors
- Horst, Fabian; Lapuschkin, Sebastian; Samek, Wojciech; Mueller, Klaus-Robert; Schoellhorn, Wolfgang I.
- Issue Date
- 20-Feb-2019
- Publisher
- NATURE PUBLISHING GROUP
- Citation
- SCIENTIFIC REPORTS, v.9
- Indexed
- SCIE
SCOPUS
- Journal Title
- SCIENTIFIC REPORTS
- Volume
- 9
- URI
- https://scholar.korea.ac.kr/handle/2021.sw.korea/67607
- DOI
- 10.1038/s41598-019-38748-8
- ISSN
- 2045-2322
- Abstract
- Machine learning (ML) techniques such as (deep) artificial neural networks (DNN) are solving a plethora of tasks very successfully and provide new predictive models for complex physical, chemical, biological and social systems. However, in most cases this comes with the disadvantage of acting as a black box, rarely providing information about what made them arrive at a particular prediction. This black-box aspect of ML techniques can be problematic, especially in medical diagnoses, and has so far hampered clinical acceptance. The present paper studies the uniqueness of individual gait patterns in clinical biomechanics using DNNs. By attributing portions of the model predictions back to the input variables (ground reaction forces and full-body joint angles), the Layer-Wise Relevance Propagation (LRP) technique reliably demonstrates which variables at what time windows of the gait cycle are most relevant for the characterisation of gait patterns from a certain individual. By measuring the time-resolved contribution of each input variable to the prediction of ML techniques such as DNNs, our method provides the first general framework for understanding and interpreting non-linear ML methods in (biomechanical) gait analysis and thereby supplies a powerful tool for the analysis, diagnosis and treatment of human gait.
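The core idea described in the abstract, propagating a network's prediction backwards layer by layer so that each input variable receives a relevance score, can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: it uses a tiny random two-layer ReLU network as a stand-in for the gait classifier and applies the LRP-epsilon rule; all weights, sizes and the input signal are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a gait classifier: 2-layer ReLU net on a
# flattened input signal (random weights; illustration only).
W1 = rng.normal(size=(8, 16)); b1 = np.zeros(8)
W2 = rng.normal(size=(3, 8));  b2 = np.zeros(3)

x = rng.normal(size=16)            # hypothetical input time series
a1 = np.maximum(0, W1 @ x + b1)    # hidden activations
z2 = W2 @ a1 + b2                  # class scores

def lrp_eps(a, W, b, R, eps=1e-6):
    """Propagate relevance R one layer back (LRP-epsilon rule)."""
    z = W @ a + b
    z = z + eps * np.where(z >= 0, 1.0, -1.0)  # numerical stabiliser
    s = R / z                                  # per-neuron scaling
    return a * (W.T @ s)                       # relevance of inputs

# Start from the predicted class score and walk back to the input.
k = np.argmax(z2)
R2 = np.zeros_like(z2)
R2[k] = z2[k]
R1 = lrp_eps(a1, W2, b2, R2)   # hidden-layer relevances
R0 = lrp_eps(x, W1, b1, R1)    # input-variable relevances

# Relevance is (approximately) conserved from output to input,
# so R0 decomposes the prediction over the input variables.
print(R2.sum(), R1.sum(), R0.sum())
```

In the time-resolved setting of the paper, each entry of `R0` would correspond to one input variable at one point of the gait cycle, which is what allows statements about which variables in which time windows drive the identification of an individual.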
- Files in This Item
- There are no files associated with this item.
- Appears in
Collections - Graduate School > Department of Artificial Intelligence > 1. Journal Articles
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.