
This site disseminates upper body motion data from stroke patients during rehabilitation. The data were captured using wearable inertial measurement units and cameras and are fully labeled by trained annotators. A link to the code for the deep learning model is also provided.


We present PrimSeq, a pipeline to classify and count the functional motions trained in stroke rehabilitation. PrimSeq comprises three main steps: (1) capture of upper body motion during rehabilitation with wearable inertial measurement units (IMUs) and video, (2) generation of primitive sequences from the IMU data with a trained deep learning model, and (3) tallying of the primitives with a counting algorithm.
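
The tallying step in (3) can be illustrated with a minimal sketch. Assuming the trained model emits one primitive label per frame, consecutive identical labels can be collapsed into primitive instances and counted per class. The label strings and the choice not to count idle frames are assumptions for illustration, not the exact algorithm used in PrimSeq.

from itertools import groupby

# Primitive classes from the functional motion taxonomy; the string
# encoding here is illustrative, not the released label format.
PRIMITIVES = ["reach", "reposition", "transport", "stabilize", "idle"]

def count_primitives(frame_labels):
    """Tally primitive counts from a per-frame label sequence.

    Consecutive identical labels are collapsed into a single primitive
    instance; 'idle' frames are not counted.
    """
    counts = {name: 0 for name in PRIMITIVES if name != "idle"}
    for label, _run in groupby(frame_labels):
        if label in counts:
            counts[label] += 1
    return counts

# Toy example; an actual per-frame prediction at 100 Hz would be much longer.
predicted = ["idle", "reach", "reach", "transport", "transport", "idle", "reach"]
print(count_primitives(predicted))
# -> {'reach': 2, 'reposition': 0, 'transport': 1, 'stabilize': 0}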

To build this approach, we collected a large, realistic dataset of stroke patients and healthy controls performing rehabilitation training activities. We labeled the dataset and developed a new deep learning method to process it.

Using a previously established functional motion taxonomy (Schambra et al., 2019), we identified five classes of functional primitives, which are elemental units of functional motion. These classes are reach, reposition, transport, stabilize, and idle. A reach is an upper extremity (UE) motion to move into contact with a target object; a reposition is a UE motion to move proximate to a target object; a transport is a UE motion to convey a target object; a stabilize is a minimal-motion primitive to keep a target object still; and an idle is a minimal-motion primitive to stand at the ready near a target object.

Data include:
- Subject demographic and clinical characteristics
- Kinematic data from 9 IMUs affixed to the upper body: a 77-dimensional feature vector sampled every 10 ms, consisting of 27 acceleration dimensions (9 IMUs × 3D acceleration per IMU), 27 quaternion dimensions (9 IMUs × 3 per IMU), 22 joint angles, and the side of the patient's affected upper extremity (left or right); see the sketch after this list.
- Video features from 2 orthogonal cameras
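
For orientation, the following minimal sketch shows how a single 77-dimensional IMU feature frame could be assembled from the components listed above. The array names, ordering, and the 0/1 encoding of the affected side are assumptions for illustration and do not describe the released file format.

import numpy as np

n_imus = 9

# Per-frame components, following the breakdown described above.
accelerations = np.zeros((n_imus, 3))   # 3D acceleration per IMU -> 27 values
quaternions   = np.zeros((n_imus, 3))   # 3 quaternion dimensions per IMU -> 27 values
joint_angles  = np.zeros(22)            # 22 upper body joint angles
affected_side = np.array([0.0])         # assumed encoding: 0 = left, 1 = right

# One 77-dimensional sample, recorded every 10 ms (100 Hz).
frame = np.concatenate([
    accelerations.ravel(),
    quaternions.ravel(),
    joint_angles,
    affected_side,
])
assert frame.shape == (77,)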

We invite you to download the data and code.

References:
- Schambra HM, Parnandi A, Pandit NG, Uddin J, Wirtanen A, Nilsen DM. A Taxonomy of Functional Upper Extremity Motion. Front Neurol. 2019 Aug 20;10:857. doi: 10.3389/fneur.2019.00857. PMID: 31481922; PMCID: PMC6710387.

- Parnandi, A., Kaku, A., Venkatesan, A., Pandit, N., Wirtanen, A., Rajamohan, H., Venkataramanan, K., Nilsen, D., Fernandez-Granda, C. and Schambra, H., 2021. PrimSeq: a deep learning-based pipeline to quantitate rehabilitation training. arXiv preprint arXiv:2112.11330. (https://arxiv.org/abs/2112.11330)

- Kaku, A., Liu, K., Parnandi, A., Rajamohan, H.R., Venkataramanan, K., Venkatesan, A., Wirtanen, A., Pandit, N., Schambra, H. and Fernandez-Granda, C., 2021. Sequence-to-Sequence Modeling for Action Identification at High Temporal Resolution. arXiv preprint arXiv:2111.02521. (https://arxiv.org/abs/2111.02521)

Acknowledgement:
This work was funded by the American Heart Association/Amazon Web Services postdoctoral fellowship 19AMTG35210398 (A.P.), NIH R01 LM013316 (C.F.G., H.S.), NIH K02 NS104207 (H.S.), NIH NCATS UL1TR001445 (H.S.), and NSF NRT-HDR 1922658 (A.K., C.F.G.).
