On Natural Motion Processing using Inertial Motion Capture and Deep Learning

Date
2020-05-21
Publisher
Virginia Tech
Abstract

Human motion collected in real-world environments without instruction from researchers, or natural motion, is an understudied area of motion capture whose study could improve the efficacy of assistive devices such as exoskeletons and prosthetics, as well as robotics. With this goal in mind, this thesis presents a natural motion dataset alongside algorithms for analyzing human motion. The dataset contains more than 36 hours of inertial motion capture data collected while 16 participants went about their lives. The participants were not instructed on what actions to perform and interacted freely with real-world environments such as a home improvement store and a college campus. We apply our dataset in two experiments. The first is a study of how manual material handlers lift and bend at work, which postures they tend to use, and why. Workers rarely used the symmetric squats and infrequently used the symmetric stoops typically studied in laboratory settings. Instead, they used a variety of postures that have not been well characterized, such as one-legged and split-legged lifting. The second experiment is a study of how to infer human motion from limited information. We present methods for inferring human motion from sparse sensors using Transformer and Seq2Seq models. We found that Transformers perform better than Seq2Seq models at producing upper-body and full-body motion, but that each model can accurately infer human motion for a variety of postures, such as sitting, standing, kneeling, and bending, given sparse sensor data.
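
As a rough illustration of the sparse-sensor inference task described above, the sketch below maps a short window of per-frame inertial sensor features to full-body pose parameters with a Transformer encoder. It is not the thesis implementation; PyTorch, the sensor count, the per-frame feature size, and the joint parameterization are all illustrative assumptions.

# Minimal sketch (not the thesis code): sparse IMU features -> full-body pose.
# Assumed dimensions: 3 IMUs x 12 features per frame in, 22 joints x 3
# rotation parameters out.
import torch
import torch.nn as nn

class SparseToFullBody(nn.Module):
    def __init__(self, n_sensor_feats=3 * 12, n_pose_params=22 * 3,
                 d_model=128, n_heads=4, n_layers=4):
        super().__init__()
        self.embed = nn.Linear(n_sensor_feats, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_pose_params)

    def forward(self, sparse_seq):
        # sparse_seq: (batch, frames, sensor features) from a few worn IMUs
        h = self.encoder(self.embed(sparse_seq))
        return self.head(h)  # (batch, frames, full-body pose parameters)

model = SparseToFullBody()
imu_window = torch.randn(8, 120, 36)   # e.g., 8 clips of 120 frames each
pose_pred = model(imu_window)          # -> (8, 120, 66)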

Keywords
inertial motion capture, deep learning, ergonomics, manual material handlers