Human Motion Sensing
We are trying to understand how people move, which is useful for exoskeleton control as well as for biofeedback and rehabilitation applications.
We conducted motion capture of several manual material handlers in a retail environment using an XSens system, which allowed the workers to complete their jobs as they normally would, without interruption. Some examples of the results are shown below.
We found that workers rarely used the standard "Squat" posture and relied on a variety of lift/bend types beyond the "Stoop." The full writeup of our findings is in our paper "Quantification of Postures for Low-Height Object Manipulation Conducted by Manual Material Handlers in a Retail Environment" (Link). The data captured in these experiments is part of our Natural Motion Dataset (see below).
In addition to capturing data about workers in a retail environment, we also conducted full-body motion capture of individuals performing normal activities of daily living, such as going to lunch, walking around campus, going to a store, and exercising. The resulting dataset supports both biomechanics and machine learning research; it is detailed in our paper "Motion Inference Using Sparse Inertial Sensors, Self-Supervised Learning, and a New Dataset of Unscripted Human Motion" (Link) and is hosted at the Virginia Tech Libraries (Link).
In that paper, we also present new methods for Motion Inference: predicting full-body kinematics from only a small number of sensors, for example at the waist, wrists, and ankles, using machine learning. Some results from that study are shown below. We are currently applying this technique to stroke rehabilitation, using just a few sensors to learn more about how people are able to move during daily life.
Sensors are assumed to be at the locations of the white and blue circles on the mannequin. "Predicted" shows the output of our algorithm, compared with the person's actual pose ("Ground Truth").
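To make the idea of motion inference concrete, here is a minimal sketch in Python (PyTorch) of a sequence model that maps sparse inertial-sensor readings to full-body joint rotations. This is not the architecture from our paper; the sensor count, channel layout, joint count, and layer sizes below are illustrative assumptions only.

```python
import torch
import torch.nn as nn

class SparseToFullBody(nn.Module):
    """Toy motion-inference model: a time series of readings from a few IMUs
    (e.g., waist, wrists, ankles) is mapped to full-body joint rotations.
    All dimensions are hypothetical, not the values used in the paper."""

    def __init__(self, n_sensors=5, per_sensor_dim=9, n_joints=22, hidden=256):
        super().__init__()
        # Bidirectional LSTM encodes the sparse-sensor sequence.
        self.encoder = nn.LSTM(
            input_size=n_sensors * per_sensor_dim,
            hidden_size=hidden,
            num_layers=2,
            batch_first=True,
            bidirectional=True,
        )
        # Linear decoder predicts a rotation (here, 3-D axis-angle) per joint per frame.
        self.decoder = nn.Linear(2 * hidden, n_joints * 3)

    def forward(self, imu_seq):
        # imu_seq: (batch, frames, n_sensors * per_sensor_dim)
        features, _ = self.encoder(imu_seq)
        return self.decoder(features)  # (batch, frames, n_joints * 3)

# Example with random data standing in for real sensor streams:
model = SparseToFullBody()
dummy = torch.randn(1, 120, 5 * 9)  # 1 clip, 120 frames, 5 IMUs x 9 channels
pose = model(dummy)
print(pose.shape)  # torch.Size([1, 120, 66])
```

In practice, a model like this would be trained on synchronized pairs of sparse-sensor inputs and full-body motion capture, such as the recordings in our Natural Motion Dataset.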