
dc.contributor.author: Geissinger, Jack H.
dc.contributor.author: Asbeck, Alan T.
dc.identifier.citation: Geissinger, J.H.; Asbeck, A.T. Motion Inference Using Sparse Inertial Sensors, Self-Supervised Learning, and a New Dataset of Unscripted Human Motion. Sensors 2020, 20, 6330.
dc.description.abstract: In recent years, wearable sensors have become common, with possible applications in biomechanical monitoring, sports and fitness training, rehabilitation, assistive devices, and human-computer interaction. Our goal was to achieve accurate kinematics estimates using a small number of sensors. To accomplish this, we introduced a new dataset (the Virginia Tech Natural Motion Dataset) of full-body human motion capture using XSens MVN Link that contains more than 40 h of unscripted daily life motion in the open world. Using this dataset, we conducted self-supervised machine learning to perform kinematics inference: we predicted the complete kinematics of the upper body or full body using a reduced set of sensors (3 or 4 for the upper body, 5 or 6 for the full body). We used several sequence-to-sequence (Seq2Seq) and Transformer models for motion inference, and compared the results across four different machine learning models and four different configurations of sensor placements. Our models produced mean angular errors of 10–15 degrees for both the upper body and full body, as well as worst-case errors of less than 30 degrees. The dataset and our machine learning code are freely available.
dc.rights: Creative Commons Attribution 4.0 International
dc.title: Motion Inference Using Sparse Inertial Sensors, Self-Supervised Learning, and a New Dataset of Unscripted Human Motion
dc.type: Article - Refereed
dc.contributor.department: Electrical & Computer Engineering
dc.contributor.department: Mechanical Engineering
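
The abstract reports mean angular errors of 10–15 degrees between predicted and ground-truth joint angles. As an illustrative sketch only (not the authors' released code, and the function name and input layout are assumptions), such a metric over per-frame joint angles could be computed like this:

```python
def mean_angular_error(pred_deg, true_deg):
    """Mean absolute angular error in degrees over all frames and joints.

    pred_deg, true_deg: lists of per-frame lists of joint angles in degrees
    (a hypothetical layout chosen for illustration). Differences are wrapped
    into [-180, 180) so that, e.g., 350 vs. 10 counts as a 20-degree error,
    not 340.
    """
    diffs = [(p - t + 180.0) % 360.0 - 180.0
             for pred_frame, true_frame in zip(pred_deg, true_deg)
             for p, t in zip(pred_frame, true_frame)]
    return sum(abs(d) for d in diffs) / len(diffs)

# Two frames, two joints each: wrapped errors are 10, 20, 10, and 5 degrees.
err = mean_angular_error([[10.0, 350.0], [90.0, 180.0]],
                         [[20.0,  10.0], [80.0, 175.0]])
# err == 11.25
```

The wrap-around step matters for angular data: a naive absolute difference would report a 340-degree error for two nearly identical orientations on either side of the 0/360 boundary.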

