Wearable-based estimation of continuous 3D knee moments during running using a convolutional neural network
Lucas Höschler, Christina Halmich, Christoph Schranz, Julian Fritz, Saša Čigoja, Martin Ullrich, Anne D. Koelewijn & Hermann Schwameder (): Wearable-based estimation of continuous 3D knee moments during running using a convolutional neural network. In: Sports Biomechanics.
This study aimed to develop and validate a machine learning method for estimating continuous 3D knee moments during running from wearable sensor data. Reference knee moments were calculated for 19 recreational runners during treadmill running at varying slopes (0 ± 5% incline), speeds (self-selected ± 1 km/h), and in three types of footwear. A convolutional neural network was trained on data from seven inertial measurement units (feet, shanks, thighs, sacrum) and a pair of pressure insoles. We assessed performance over continuous time windows (CONT) and during stance phases (PHSS) using the intraclass correlation coefficient (ICC), normalised root mean squared error (nRMSE), and statistical parametric mapping. Agreement in the sagittal plane was good to excellent (ICC: 0.84–0.98), with low errors (nRMSE: 0.05–0.11). However, accuracy was lower for non-sagittal moment estimates (frontal ICC: 0.19–0.90, nRMSE: 0.08–0.23; transverse ICC: 0.72–0.94, nRMSE: 0.07–0.17). Accuracy decreased across all planes during PHSS. The proposed approach achieves accuracy similar to or better than previous work while requiring less preprocessing, providing a viable method for wearable-based assessment of running kinetics in near real-time. Additional data and methods addressing inter-individual variability could improve its precision in assessing frontal-plane injury risk factors.
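As a rough illustration of the estimation approach described in the abstract, the sketch below shows a minimal 1D fully convolutional network in PyTorch that maps windows of wearable-sensor channels (IMU and insole signals) to continuous 3D knee moment time series. The channel count, layer sizes, and kernel widths are illustrative assumptions, not the architecture reported in the paper.

```python
# Minimal sketch, NOT the authors' architecture: a 1D CNN that maps
# multichannel wearable signals to per-sample 3D knee moments.
# Assumed inputs: 7 IMUs x 6 channels (acc + gyro) + 32 insole channels = 74.
import torch
import torch.nn as nn


class KneeMomentCNN(nn.Module):
    def __init__(self, in_channels: int = 74, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            # padding keeps the temporal length, so the output is continuous
            nn.Conv1d(in_channels, hidden, kernel_size=9, padding=4),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=9, padding=4),
            nn.ReLU(),
            # one output channel per moment component (sagittal, frontal, transverse)
            nn.Conv1d(hidden, 3, kernel_size=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) -> (batch, 3, time) knee moment estimate per sample
        return self.net(x)


if __name__ == "__main__":
    model = KneeMomentCNN()
    window = torch.randn(8, 74, 200)   # 8 windows, 74 sensor channels, 200 time samples
    moments = model(window)            # (8, 3, 200) continuous 3D knee moments
    print(moments.shape)
```

Because the network is fully convolutional along time, it can be applied to windows of arbitrary length, which matches the continuous (CONT) evaluation setting; per-stance (PHSS) evaluation would simply restrict the comparison to samples within detected stance phases.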