
Inertial sensing and computer vision are promising alternatives to traditional optical motion tracking, but until now these data sources have been explored either in isolation or fused without incorporating equations of motion. By adding physiological plausibility and dynamical robustness to a proposed solution, biomechanical modeling may enable better fusion. To test this hypothesis, we fused RGB video and inertial sensing data with analytical kinematics and dynamics equations of motion using a nine degree-of-freedom model, and investigated whether adding these equations of motion enables fusion methods to outperform video-only and inertial-sensing-only methods on data of varying quality.


This project links to a repository containing MATLAB code for running four classes of simulations with a nine-DOF biomechanical model to estimate full-body kinematics (and, when using direct collocation, dynamics and contact forces):

(1) IMU and vision data fusion (tracking) via direct collocation (kinematics + dynamics equations of motion); a minimal collocation sketch follows this list

(2) IMU-only data tracking (and denoising) via direct collocation (kinematics + dynamics equations of motion)

(3) Unconstrained IMU and vision data fusion via inverse kinematics (kinematics equations of motion only); a minimal inverse-kinematics sketch follows this list

(4) Unconstrained kinematics calculations using computer-vision keypoints only (kinematics equations of motion only)
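To give a rough sense of the direct-collocation tracking idea in (1) and (2), the self-contained MATLAB script below tracks noisy IMU-like angular-velocity data on a single-pendulum model, enforcing trapezoidal collocation of the dynamics equations of motion. This is only an illustrative sketch, not the project code: the pendulum model, parameter values, noise level, and cost weights are assumptions, and it requires the Optimization Toolbox (fmincon).

% Minimal sketch (not the project code): direct collocation that tracks
% noisy IMU-like angular-velocity data on a single-pendulum model, with
% trapezoidal collocation of the dynamics equations of motion. The model,
% parameters, noise level, and weights are illustrative assumptions only.
N = 41; T = 2; t = linspace(0, T, N)'; h = t(2) - t(1);
m = 5; l = 0.9; g = 9.81; I = m*l^2;            % point-mass pendulum (assumed)

thRef  = 0.5*sin(2*pi*t/T);                     % reference joint angle
omMeas = 0.5*(2*pi/T)*cos(2*pi*t/T) + 0.05*randn(N,1);  % noisy "gyro" data

% Decision vector z = [theta; omega; tau], each N x 1
unpack = @(z) deal(z(1:N), z(N+1:2*N), z(2*N+1:3*N));
obj = @(z) objFun(z, unpack, omMeas);           % track gyro, penalize torque
nlc = @(z) colloc(z, unpack, h, m, g, l, I);    % enforce dynamics at the nodes

z0   = [thRef; omMeas; zeros(N,1)];
opts = optimoptions('fmincon', 'Algorithm', 'sqp', 'MaxFunctionEvaluations', 2e5);
z    = fmincon(obj, z0, [], [], [], [], [], [], nlc, opts);
[thHat, omHat, tauHat] = unpack(z);             % denoised states and joint torque

function f = objFun(z, unpack, omMeas)
[~, om, tau] = unpack(z);
f = sum((om - omMeas).^2) + 1e-4*sum(tau.^2);
end

function [c, ceq] = colloc(z, unpack, h, m, g, l, I)
[th, om, tau] = unpack(z);
thd = om;                                       % kinematics equations of motion
omd = (tau - m*g*l*sin(th))/I;                  % dynamics equations of motion
ceq = [th(2:end) - th(1:end-1) - h/2*(thd(2:end) + thd(1:end-1));
       om(2:end) - om(1:end-1) - h/2*(omd(2:end) + omd(1:end-1))];
c = [];
end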
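The unconstrained inverse-kinematics fusion in (3) and (4) can likewise be illustrated with a small least-squares problem: forward kinematics predicts vision keypoints and IMU-derived segment orientations, and the joint angles that best explain both measurement streams are found with lsqnonlin. Again, the planar two-link model, segment lengths, noise levels, and weights below are assumptions made for the example, not the project's nine-DOF model.

% Minimal sketch (not the project code): IMU + vision fusion via inverse
% kinematics on a hypothetical planar two-link "thigh-shank" model.
L = [0.45; 0.40];                 % segment lengths [m] (assumed)
qTrue = [0.6; -0.4];              % true joint angles [rad]

% Kinematics equations: hip at the origin, keypoints at the knee and ankle
fk = @(q) [L(1)*[sin(q(1)); -cos(q(1))], ...
           L(1)*[sin(q(1)); -cos(q(1))] + L(2)*[sin(q(1)+q(2)); -cos(q(1)+q(2))]];

% Synthetic measurements: noisy 2-D keypoints (vision) and segment angles (IMU)
pMeas   = fk(qTrue) + 0.02*randn(2,2);                       % ~2 cm keypoint noise
imuMeas = [qTrue(1); qTrue(1)+qTrue(2)] + 0.03*randn(2,1);   % orientation noise

% Stack weighted vision and IMU residuals and solve for the joint angles
wVis = 1/0.02;  wImu = 1/0.03;
res = @(q) [wVis*reshape(fk(q) - pMeas, [], 1); ...
            wImu*([q(1); q(1)+q(2)] - imuMeas)];
qHat = lsqnonlin(res, [0; 0]);    % fused joint-angle estimate
disp([qTrue qHat])

Dropping the IMU residual term from res recovers the vision-only case in (4).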
