A combination of wearable sensor data and Machine Learning (ML) techniques has been used in many studies to predict specific joint angles and moments. The aim of this study was to compare the performance of four non-linear regression ML models in estimating lower-limb joint kinematics, kinetics, and muscle forces from Inertial Measurement Unit (IMU) and electromyography (EMG) data. Seventeen healthy volunteers (9F, 28 ± 5 years) were asked to walk over-ground for a minimum of 16 trials. For each trial, marker trajectories and data from three force plates were recorded to calculate pelvis, hip, knee, and ankle kinematics and kinetics, and muscle forces (the targets), along with data from 7 IMUs and 16 EMG sensors. Features were extracted from the sensor data using the Tsfresh Python package and fed into four ML models for target prediction: Convolutional Neural Network (CNN), Random Forest (RF), Support Vector Machine (SVM), and Multivariate Adaptive Regression Spline (MARS). The RF and CNN models outperformed the other ML models, providing lower prediction errors for all intended targets at a lower computational cost. This study suggests that combining wearable sensor data with an RF or a CNN model is a promising tool to overcome the limitations of traditional optical motion capture for 3D gait analysis.
This dataset includes lower limb joint angles, joint moments, and muscle forces during gait for 17 healthy volunteers (9F, 28 ± 5 yrs), along with raw IMU and EMG data. Joint angles and moments cover pelvis tilt, pelvis obliquity, pelvis rotation, hip flexion/extension, hip adduction/abduction, hip rotation, knee flexion/extension, ankle dorsi/plantar flexion, and ankle inversion/eversion. Surface EMG signals were recorded from lower limb muscles on both legs (gluteus maximus, rectus femoris, vastus lateralis, biceps femoris, semimembranosus, medial gastrocnemius, soleus, and tibialis anterior). Three-dimensional acceleration (m/s^2) and angular velocity (rad/s) were recorded from 7 IMUs attached to the lower-limb segments (one on the pelvis and one on each foot, shank, and thigh). The dataset can be used to build regression-based machine learning models that predict the intended targets (joint angles, joint moments, and muscle forces) from wearable sensor data. We built a multi-output random forest (RF) model to predict all targets simultaneously from wearable sensor data, with the aid of the Tsfresh Python package for automatic feature extraction. The RF model predicts all targets with a reasonable accuracy that is comparable to the literature.
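The multi-output pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' exact code: the feature matrix here is synthetic random data standing in for the Tsfresh-extracted sensor features, and the dimensions (200 trials, 50 features, 10 targets) are arbitrary placeholders. It shows the key design choice, namely that scikit-learn's RandomForestRegressor accepts a multi-column target array natively, so one forest predicts all joint angles, moments, and muscle forces at once.

```python
# Sketch of a multi-output random forest predicting gait targets
# (joint angles, moments, muscle forces) from wearable-sensor features.
# Synthetic stand-in data; in the real pipeline X would come from
# tsfresh.extract_features() applied to the IMU/EMG time series.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials, n_features, n_targets = 200, 50, 10   # hypothetical sizes
X = rng.normal(size=(n_trials, n_features))     # stand-in tsfresh features
W = rng.normal(size=(n_features, n_targets))
Y = X @ W + 0.1 * rng.normal(size=(n_trials, n_targets))  # stand-in targets

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)

# One forest fits the whole target matrix: each tree's leaves store
# a vector of target means, so all outputs are predicted jointly.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_tr, Y_tr)

Y_hat = model.predict(X_te)          # shape: (n_test_trials, n_targets)
mae = mean_absolute_error(Y_te, Y_hat)
print(Y_hat.shape, round(mae, 3))
```

A per-target error breakdown (e.g. `mean_absolute_error(Y_te, Y_hat, multioutput="raw_values")`) is usually more informative than the single averaged score, since kinematic and kinetic targets live on very different scales.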