The trend toward soft wearable robotic systems creates a compelling need for new and reliable sensor systems that do not require a rigid mounting frame. Despite the growing use of inertial measurement units (IMUs) in motion tracking applications, sensor drift and IMU-to-segment misalignment remain major problems in applications requiring high accuracy. This paper proposes a novel two-step calibration method that exploits the periodic nature of human locomotion to improve the accuracy of wearable inertial sensors in measuring lower-limb joint angles. Specifically, the method was applied to the determination of hip joint angles during walking tasks. The accuracy and precision of the calibration method were assessed in a group of N = 8 subjects who walked with a custom-designed inertial motion capture system at 85% and 115% of their comfortable pace, using an optical motion capture system as reference. In light of its low computational complexity and good accuracy, the proposed approach shows promise for embedded applications, including closed-loop control of soft wearable robotic systems.
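The abstract does not detail the two-step method itself, but the underlying idea that locomotion periodicity can expose slow sensor drift admits a minimal sketch. Everything here is hypothetical (the function name, the constant stride period, the purely linear drift model); it only illustrates that a signal which should repeat every stride lets stride-to-stride offsets be attributed to drift and removed.

```python
import numpy as np

def remove_stride_drift(angle, stride_len):
    """Estimate a linear drift rate from stride-to-stride offsets and subtract it.

    angle      : 1-D array of joint angles (deg), uniformly sampled
    stride_len : stride period in samples (assumed constant here)
    """
    # Mean change over exactly one stride: for a strictly periodic signal
    # this is ~0, so any systematic offset is interpreted as drift per stride.
    per_stride = np.mean(angle[stride_len:] - angle[:-stride_len])
    drift_rate = per_stride / stride_len          # drift per sample
    t = np.arange(angle.size)
    return angle - drift_rate * t

# Synthetic example: periodic hip angle plus a slow linear drift.
t = np.arange(2000)
true = 20.0 * np.sin(2 * np.pi * t / 100)         # 100-sample stride period
drifted = true + 0.01 * t                         # 0.01 deg/sample drift
corrected = remove_stride_drift(drifted, 100)
```

On this idealized signal the linear drift is recovered exactly; real gait has stride-to-stride variability, which is presumably why the paper needs a more elaborate two-step procedure.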
Dynamic Joint Motions in Occupational Environments as Indicators of Potential Musculoskeletal Injury Risk
The objective of this study was to test the feasibility of using a pair of wearable inertial measurement unit (IMU) sensors to accurately capture dynamic joint motion data during simulated occupational conditions. Eleven subjects (5 males and 6 females) performed repetitive neck, low-back, and shoulder motions simulating low- and high-difficulty occupational tasks in a laboratory setting. Kinematics for each of the 3 joints were measured via IMU sensors in addition to a “gold standard” passive-marker optical motion capture system. The IMU accuracy was benchmarked relative to the optical motion capture system, and IMU sensitivity to low- and high-difficulty tasks was evaluated. The accuracy of the IMU sensors was found to be very good on average, but significant positional drift was observed in some trials. In addition, IMU measurements were shown to be sensitive to differences in task difficulty in all 3 joints (P < .05). These results demonstrate the feasibility of using wearable IMU sensors to capture kinematic exposures as potential indicators of occupational injury risk. Velocities and accelerations show the most potential for developing risk metrics, since they are sensitive to task difficulty and less sensitive to drift than rotational position measurements.
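The closing claim, that velocity is less drift-sensitive than integrated position, can be illustrated with a short sketch. The numbers (sample rate, bias magnitude, motion profile) are hypothetical, not from the study: a constant gyroscope bias adds a fixed offset to the rate signal but accumulates linearly once integrated to position.

```python
import numpy as np

fs = 100.0                                       # sample rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)                     # one minute of data
bias = 0.5                                       # gyro bias, deg/s (hypothetical)

true_vel = 30.0 * np.cos(2 * np.pi * 0.5 * t)    # repetitive joint motion
meas_vel = true_vel + bias                       # biased gyro reading

true_pos = np.cumsum(true_vel) / fs              # integrate rate to position
meas_pos = np.cumsum(meas_vel) / fs

vel_err = np.abs(meas_vel - true_vel)            # constant at the bias level
pos_err = np.abs(meas_pos - true_pos)            # grows roughly as bias * t
```

After 60 s the position error has grown to roughly bias × 60 s = 30 deg, while the velocity error never exceeds the 0.5 deg/s bias, which is the sense in which velocity-based exposure metrics are more robust.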
- Award ID(s):
- 1822124
- PAR ID:
- 10300918
- Date Published:
- Journal Name:
- Journal of Applied Biomechanics
- Volume:
- 37
- ISSN:
- 1065-8483
- Page Range / eLocation ID:
- 196-203
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
The purpose of this study was to use 3D motion capture and stretchable soft robotic sensors (SRS) to collect foot-ankle movement data from participants performing walking gait cycles on flat and sloped surfaces. The primary aim was to assess differences between 3D motion capture and a new SRS-based wearable solution. Given the complex nature of using a linear solution to accurately quantify the movement of triaxial joints during a dynamic gait movement, 20 participants performing multiple walking trials were measured. The participant gait data were then upscaled (for the SRS), time-aligned (based on right heel strikes), and smoothed using filtering methods. A multivariate linear model was developed to assess goodness of fit based on mean absolute error (MAE; 1.54), root mean square error (RMSE; 1.96), and absolute R2 (R2; 0.854). Two- and three-SRS combinations were evaluated to determine if similar fit scores could be achieved using fewer sensors. Inversion (based on MAE and RMSE) and plantar flexion (based on R2) sensor removal provided the second-best fit scores. Given that the scores indicate a high level of fit, with further development, an SRS-based wearable solution has the potential to measure motion during gait-based tasks with the accuracy of a 3D motion capture system.
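The three goodness-of-fit metrics reported above (MAE, RMSE, R2) have standard definitions that a short sketch makes concrete. The data and function name here are illustrative, not the study's: a synthetic reference trace stands in for motion capture and a noisy copy stands in for the SRS estimate.

```python
import numpy as np

def fit_metrics(y_ref, y_pred):
    """Return (MAE, RMSE, R^2) of a prediction against a reference signal."""
    err = y_pred - y_ref
    mae = np.mean(np.abs(err))                    # mean absolute error
    rmse = np.sqrt(np.mean(err ** 2))             # root mean square error
    ss_res = np.sum(err ** 2)                     # residual sum of squares
    ss_tot = np.sum((y_ref - y_ref.mean()) ** 2)  # total sum of squares
    r2 = 1.0 - ss_res / ss_tot                    # coefficient of determination
    return mae, rmse, r2

rng = np.random.default_rng(0)
y_ref = 15.0 * np.sin(np.linspace(0, 4 * np.pi, 500))   # mocap-like reference
y_pred = y_ref + rng.normal(0.0, 2.0, y_ref.size)       # SRS-like estimate
mae, rmse, r2 = fit_metrics(y_ref, y_pred)
```

Because RMSE squares the residuals before averaging, it penalizes occasional large errors more than MAE does, which is why the study usefully reports both alongside R2.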
-
Sensing for wearable robots is an ongoing challenge, especially given the recent trend of soft and compliant robots. Recently, a wearable origami exoshell has been designed to sense the user’s torso motion and provide mobility assistance. The materials of the exoshell contribute to a lightweight design with compliant joints, which are ideal characteristics for a wearable device. Common sensors are not ideal for the exoshell as they compromise these design characteristics. Rotary encoders are often rigid metal devices that add considerable weight and compromise the flexibility of the joints. Inertial measurement unit sensors are affected by environments with variable electromagnetic fields and are therefore not ideal for wearable applications. Hall effect sensors and gyroscopes are utilized as alternative compatible sensors, which introduce their own set of challenges: noisy measurements and drift due to sensor bias. To mitigate this, we designed the Kinematically Constrained Kalman filter for sensor fusion of gyroscopes and Hall effect sensors, with the goal of estimating the human’s torso and robot joint angles. We augmented the states to consider bias related to the torso angle in order to compensate for drift. The forward kinematics of the robot is incorporated into the Kalman filter as state constraints to address the unobservability of the torso angle and its related bias. The proposed algorithm improved the estimation performance of the torso angle and its bias, compared to the individual sensors and the standard Kalman filter, as demonstrated through bench tests and experiments with a human user.
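The bias-augmentation idea described above can be sketched in one dimension. This is not the paper's Kinematically Constrained Kalman filter (the kinematic state constraints and multi-joint model are omitted, and all parameters and signals are hypothetical); it only shows a Kalman filter whose state is [angle, gyro bias], where a drift-free angle measurement such as a Hall effect sensor lets the filter estimate and cancel the gyro's bias.

```python
import numpy as np

def kf_angle_bias(gyro, angle_meas, dt, q=1e-4, r=0.25):
    """1-D Kalman filter with state x = [angle, gyro_bias]."""
    x = np.zeros(2)
    P = np.eye(2)
    F = np.array([[1.0, -dt], [0.0, 1.0]])   # bias is subtracted from the rate
    B = np.array([dt, 0.0])                  # gyro rate enters as control input
    H = np.array([[1.0, 0.0]])               # only the angle is measured
    Q = q * np.eye(2)
    R = np.array([[r]])
    est = []
    for w, z in zip(gyro, angle_meas):
        x = F @ x + B * w                    # predict: integrate bias-free rate
        P = F @ P @ F.T + Q
        y = z - H @ x                        # innovation
        S = H @ P @ H.T + R
        K = P @ H.T / S                      # Kalman gain (S is 1x1)
        x = x + (K * y).ravel()
        P = (np.eye(2) - K @ H) @ P
        est.append(x.copy())
    return np.array(est)

dt = 0.01
t = np.arange(0, 60, dt)
true_angle = 10.0 * np.sin(t)
true_rate = 10.0 * np.cos(t)
rng = np.random.default_rng(1)
gyro = true_rate + 0.8 + rng.normal(0, 0.1, t.size)   # injected 0.8 deg/s bias
meas = true_angle + rng.normal(0, 0.5, t.size)        # noisy angle measurement
est = kf_angle_bias(gyro, meas, dt)
```

Because the bias state is coupled to the angle through F, the filter's cross-covariance steers innovation energy into the bias estimate, which converges toward the injected 0.8 deg/s; integrating the raw gyro alone would instead drift without bound.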
-
In safety-critical environments, robots need to reliably recognize human activity to be effective and trustworthy partners. Since most human activity recognition (HAR) approaches rely on unimodal sensor data (e.g. motion capture or wearable sensors), it is unclear how the relationship between the sensor modality and motion granularity (e.g. gross or fine) of the activities impacts classification accuracy. To our knowledge, we are the first to investigate the efficacy of using motion capture as compared to wearable sensor data for recognizing human motion in manufacturing settings. We introduce the UCSD-MIT Human Motion dataset, composed of two assembly tasks that entail either gross or fine-grained motion. For both tasks, we compared the accuracy of a Vicon motion capture system to a Myo armband using three widely used HAR algorithms. We found that motion capture yielded higher accuracy than the wearable sensor for gross motion recognition (up to 36.95%), while the wearable sensor yielded higher accuracy for fine-grained motion (up to 28.06%). These results suggest that these sensor modalities are complementary, and that robots may benefit from systems that utilize multiple modalities to simultaneously, but independently, detect gross and fine-grained motion. Our findings will help guide researchers in numerous fields of robotics, including learning from demonstration and grasping, to effectively choose sensor modalities that are most suitable for their applications.
-
Multi-modal bio-sensing has recently been used as an effective research tool in affective computing, autism, clinical disorders, and virtual reality, among other areas. However, none of the existing bio-sensing systems support multi-modality in a wearable manner outside well-controlled laboratory environments with research-grade measurements. This work attempts to bridge this gap by developing a wearable multi-modal bio-sensing system capable of collecting, synchronizing, recording, and transmitting data from multiple bio-sensors (PPG, EEG, eye-gaze headset, body motion capture, GSR, etc.) while also providing task-modulation features, including visual-stimulus tagging. This study describes the development and integration of the various components of our system. We evaluate the developed sensors by comparing their measurements to those obtained by standard research-grade bio-sensors. We first evaluate the different sensor modalities of our headset, namely an earlobe-based PPG module with motion-noise canceling, benchmarked against ECG during heart-beat calculation. We also compare the steady-state visually evoked potentials (SSVEP) measured by our shielded dry EEG sensors with the potentials obtained by commercially available dry EEG sensors. We also investigate the effect of head movements on the accuracy and precision of our wearable eye-gaze system. Furthermore, we carry out two practical tasks to demonstrate the applications of using multiple sensor modalities for exploring previously unanswerable questions in bio-sensing. Specifically, utilizing bio-sensing we show which strategy works best for playing the Where is Waldo? visual-search game, changes in EEG corresponding to true versus false target fixations in this game, and prediction of the loss/draw/win states through bio-sensing modalities while learning their limitations in a Rock-Paper-Scissors game.