Setting up the OpenCap pipeline locally

OpenCap is a new software package that estimates 3D human movement dynamics from smartphone videos. OpenCap relies heavily on OpenSim.
Matt Petrucci
Posts: 216
Joined: Fri Feb 24, 2012 11:49 am

Post by Matt Petrucci » Mon Nov 11, 2024 1:29 pm

We have received many questions about setting up a local OpenCap pipeline on your own computer or server.

Reproducing the entire pipeline

It is not currently possible to reproduce the whole pipeline locally, because it requires many infrastructure components working together. In addition, settings must be changed in the iOS app in order to obtain accurate camera intrinsics.

Using your own videos with opencap-core

Having correct intrinsics (the camera's internal hardware properties) and extrinsics (each camera's pose relative to the other cameras in the environment) is essential for the OpenCap pipeline to work. Please note that even if you use the native camera app on a supported iPhone, you will not be able to use the intrinsics available in our repository: we fix several parameters (such as autofocus) that cannot be set within the native camera app.
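
For context, intrinsics for a fixed-focus camera are typically estimated once from a set of checkerboard images. Below is a minimal sketch using OpenCV; the checkerboard dimensions, square size, and image directory are placeholders, not the exact calibration procedure used by OpenCap.

Code:

import glob

import cv2
import numpy as np

# Checkerboard geometry (inner corners) and square size in meters.
# Placeholders -- adjust to match your calibration target.
ROWS, COLS, SQUARE_M = 6, 9, 0.025

# 3D coordinates of the checkerboard corners in the board's own frame.
obj_pts = np.zeros((ROWS * COLS, 3), np.float32)
obj_pts[:, :2] = np.mgrid[0:COLS, 0:ROWS].T.reshape(-1, 2) * SQUARE_M

object_points, image_points = [], []
image_size = None
for path in glob.glob("calibration_images/*.jpg"):  # placeholder path
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    image_size = gray.shape[::-1]  # (width, height)
    found, corners = cv2.findChessboardCorners(gray, (COLS, ROWS))
    if found:
        object_points.append(obj_pts)
        image_points.append(corners)

# Intrinsic matrix and distortion coefficients for this camera.
ret, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    object_points, image_points, image_size, None, None
)
print("RMS reprojection error:", ret)
print("Intrinsic matrix:\n", camera_matrix)
print("Distortion coefficients:", dist_coeffs.ravel())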

If you have calibrated, manually focused RGB cameras, you can map the camera parameters (intrinsic and extrinsic) to the format we use (link) and then adapt the code we provide to run the opencap-core pipeline.
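
As an illustration of the mapping step, the sketch below writes one camera's parameters to a pickle. The dictionary keys (intrinsicMat, distortion, rotation, translation, imageSize) and the output file name are assumptions about the opencap-core camera-parameter format; please verify them against the repository before relying on this.

Code:

import pickle

import numpy as np

def save_opencap_style_params(camera_matrix, dist_coeffs, R, t,
                              image_size, out_path):
    """Write one camera's parameters to a pickle.

    The key names below are assumptions about opencap-core's
    camera-parameter format; check the repository for the exact schema.
    """
    params = {
        "intrinsicMat": np.asarray(camera_matrix, dtype=float),   # 3x3
        "distortion": np.asarray(dist_coeffs, dtype=float).ravel(),
        "rotation": np.asarray(R, dtype=float),                   # 3x3
        "translation": np.asarray(t, dtype=float).reshape(3, 1),  # meters
        "imageSize": np.asarray(image_size, dtype=float).reshape(2, 1),
    }
    with open(out_path, "wb") as f:
        pickle.dump(params, f)

# Example with placeholder values; substitute your own calibration output.
save_opencap_style_params(
    camera_matrix=np.eye(3),
    dist_coeffs=np.zeros(5),
    R=np.eye(3),
    t=np.zeros(3),
    image_size=(1080, 1920),
    out_path="cameraIntrinsicsExtrinsics.pickle",  # assumed file name
)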

Note that to run pose detection, you will need a computer that meets our recommended hardware requirements.
