
Title: Can Wearable Sensors Provide Accurate and Reliable 3D Tibiofemoral Angle Estimates during Dynamic Actions?
The ability to accurately measure tibiofemoral angles during various dynamic activities is of clinical interest. The purpose of this study was to determine whether inertial measurement units (IMUs) can provide accurate and reliable angle estimates during dynamic actions. A tuned quaternion conversion (TQC) method tuned to dynamic actions was used to calculate Euler angles from IMU data, and these calculated angles were compared to a motion capture system (our “gold” standard) and a commercially available sensor fusion algorithm. Nine healthy athletes were instrumented with APDM Opal IMUs and asked to perform nine dynamic actions; five participants were used to train the parameters of the TQC method, and the remaining four were used to test validity. Accuracy was based on the root mean square error (RMSE) and reliability was based on the Bland–Altman limits of agreement (LoA). Improvement across all three orthogonal angles was observed, as the TQC method estimated the angles more accurately (lower RMSE) and more reliably (smaller LoA) than the commercially available algorithm. No significant difference was observed between the TQC method and the motion capture system in any of the three angles at the 0.05 significance level. It may be feasible to use this method to track tibiofemoral angles with higher accuracy and reliability than the commercially available sensor fusion algorithm.
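The two evaluation criteria named in the abstract, RMSE for accuracy and Bland–Altman limits of agreement for reliability, follow standard definitions. A minimal sketch of both, written from those textbook definitions rather than from the study's code:

```python
import numpy as np

def rmse(est, ref):
    """Root mean square error between estimated and reference angle series."""
    est, ref = np.asarray(est, float), np.asarray(ref, float)
    return float(np.sqrt(np.mean((est - ref) ** 2)))

def bland_altman_loa(est, ref):
    """Bland-Altman bias and 95% limits of agreement (bias +/- 1.96 SD of
    the paired differences)."""
    diff = np.asarray(est, float) - np.asarray(ref, float)
    bias = float(diff.mean())
    sd = float(diff.std(ddof=1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

A narrower LoA interval indicates closer agreement between the IMU-derived angles and the motion capture reference.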
Award ID(s):
2003434
PAR ID:
10515499
Author(s) / Creator(s):
Publisher / Repository:
MDPI
Date Published:
Journal Name:
Sensors
Volume:
23
Issue:
14
ISSN:
1424-8220
Page Range / eLocation ID:
6627
Subject(s) / Keyword(s):
wearable sensors; knee kinematics
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract Lower limb joint kinematics have been measured in laboratory settings using fixed camera-based motion capture systems; recently, however, inertial measurement units (IMUs) have been developed as an alternative. The purpose of this study was to test a quaternion conversion (QC) method for calculating the three orthogonal knee angles during the high velocities associated with a jump landing using commercially available IMUs. Nine cadaveric knee specimens were instrumented with APDM Opal IMUs to measure knee kinematics in one-legged 3–4× bodyweight simulated jump landings; four were used in establishing the parameters (training) for the new method and five for validation (testing). We compared the angles obtained from the QC method and from a commercially available sensor and algorithm (APDM Opal) to those calculated from an active marker motion capture system. Results showed a significant difference between both IMU methods and the motion capture data in the majority of orthogonal angles (p < 0.01), though the differences between the QC method and the Certus system in the testing set for flexion and rotation angles were smaller than for the APDM Opal algorithm, indicating an improvement. Additionally, in all three directions, both the limits of agreement and the root-mean-square error between the QC method and the motion capture system were smaller than between the commercial algorithm and the motion capture system.
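The abstract does not disclose the details of the QC method itself, but the standard quaternion-to-Euler conversion it builds on is well known. A sketch using the Z-Y-X (aerospace) rotation sequence, which is an assumption here; the actual angle sequence and tuning used in the study are not given:

```python
import numpy as np

def quat_to_euler_zyx(q):
    """Convert a unit quaternion (w, x, y, z) to intrinsic Z-Y-X
    (yaw, pitch, roll) Euler angles in degrees."""
    w, x, y, z = q
    yaw = np.arctan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    # Clip guards against |arg| > 1 from floating-point round-off.
    pitch = np.arcsin(np.clip(2 * (w * y - z * x), -1.0, 1.0))
    roll = np.arctan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    return tuple(float(np.degrees(a)) for a in (yaw, pitch, roll))
```

For knee angles, a conversion like this would be applied to the relative orientation quaternion between the thigh- and shank-mounted IMUs.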
  2. Injuries are often associated with rapid body segment movements. We compared Certus motion capture and APDM inertial measurement unit (IMU) measurements of tibiofemoral angle and angular velocity changes during simulated pivot landings (i.e., ~70 ms peak) of nine cadaver knees dissected free of skin, subcutaneous fat, and muscle. Data from a total of 852 trials were compared using the Bland–Altman limits of agreement (LoAs): the Certus system was considered the gold standard measure for the angle change measurements, whereas the IMU was considered the gold standard for angular velocity changes. The results show that, although the mean peak IMU knee joint angle changes were slightly underestimated (2.1° for flexion, 0.2° for internal rotation, and 3.0° for valgus), the LoAs were large, ranging from 35.9% to 49.8%. In the case of the angular velocity changes, Certus had acceptable accuracy in the sagittal plane, with LoAs of ±54.9°/s and ±32.5°/s for the tibia and femur. For these rapid motions, we conclude that, even in the absence of soft tissues, the IMUs could not reliably measure these peak 3D knee angle changes; Certus measurements of peak tibiofemoral angular velocity changes depended on both the magnitude of the velocity and the plane of measurement. 
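The two quantities compared in this study, peak angle change and peak angular-velocity change, can be extracted from a single gyroscope axis. A hypothetical sketch (not the study's pipeline), assuming the rate signal is in deg/s and the angle change is obtained by trapezoidal integration:

```python
import numpy as np

def peak_changes(gyro_dps, fs):
    """Given one axis of gyroscope data (deg/s) sampled at fs (Hz), return
    the peak angle change (trapezoidal integration of the rate signal) and
    the peak angular-velocity change (range of the rate signal)."""
    gyro = np.asarray(gyro_dps, float)
    # Integrate the angular rate to an angle trajectory starting at 0 deg.
    angle = np.concatenate(([0.0], np.cumsum((gyro[1:] + gyro[:-1]) / 2.0) / fs))
    angle_change = float(angle.max() - angle.min())
    vel_change = float(gyro.max() - gyro.min())
    return angle_change, vel_change
```

Over the ~70 ms events described above, drift from integrating the gyroscope is small, which is one reason short-window angle changes are more tractable for IMUs than absolute angles.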
  3. The trend toward soft wearable robotic systems creates a compelling need for new and reliable sensor systems that do not require a rigid mounting frame. Despite the growing use of inertial measurement units (IMUs) in motion tracking applications, sensor drift and IMU-to-segment misalignment still represent major problems in applications requiring high accuracy. This paper proposes a novel 2-step calibration method which takes advantage of the periodic nature of human locomotion to improve the accuracy of wearable inertial sensors in measuring lower-limb joint angles. Specifically, the method was applied to the determination of the hip joint angles during walking tasks. The accuracy and precision of the calibration method were assessed in a group of N = 8 subjects who walked with a custom-designed inertial motion capture system at 85% and 115% of their comfortable pace, using an optical motion capture system as reference. In light of its low computational complexity and good accuracy, the proposed approach shows promise for embedded applications, including closed-loop control of soft wearable robotic systems. 
  4. The purpose of this study was to use 3D motion capture and stretchable soft robotic sensors (SRS) to collect foot-ankle movement on participants performing walking gait cycles on flat and sloped surfaces. The primary aim was to assess differences between 3D motion capture and a new SRS-based wearable solution. Given the complex nature of using a linear solution to accurately quantify the movement of triaxial joints during a dynamic gait movement, 20 participants performing multiple walking trials were measured. The participant gait data was then upscaled (for the SRS), time-aligned (based on right heel strikes), and smoothed using filtering methods. A multivariate linear model was developed to assess goodness-of-fit based on mean absolute error (MAE; 1.54), root mean square error (RMSE; 1.96), and absolute R2 (R2; 0.854). Two and three SRS combinations were evaluated to determine if similar fit scores could be achieved using fewer sensors. Inversion (based on MAE and RMSE) and plantar flexion (based on R2) sensor removal provided second-best fit scores. Given that the scores indicate a high level of fit, with further development, an SRS-based wearable solution has the potential to measure motion during gait-based tasks with the accuracy of a 3D motion capture system. 
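The three goodness-of-fit scores reported above (MAE, RMSE, and R²) share one residual computation. A generic sketch from the standard definitions, not the study's model code:

```python
import numpy as np

def fit_scores(pred, ref):
    """MAE, RMSE, and coefficient of determination (R^2) between a model's
    prediction and the reference (motion capture) signal."""
    pred, ref = np.asarray(pred, float), np.asarray(ref, float)
    err = pred - ref
    mae = float(np.abs(err).mean())
    rmse = float(np.sqrt((err ** 2).mean()))
    ss_res = float((err ** 2).sum())                 # residual sum of squares
    ss_tot = float(((ref - ref.mean()) ** 2).sum())  # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    return mae, rmse, r2
```

Evaluating these scores while dropping one sensor at a time is a simple way to test the two- and three-sensor combinations described above.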
  5. Production innovations are occurring faster than ever. Manufacturing workers thus need to frequently learn new methods and skills. In fast-changing, largely uncertain production systems, manufacturers with the ability to comprehend workers' behavior and assess their operation performance in near real-time will achieve better performance than their peers. Action recognition can serve this purpose. Although human action recognition has been an active field of study in machine learning, limited work has been done on recognizing worker actions in manufacturing tasks that involve complex, intricate operations. Using data captured by one sensor, or a single type of sensor, to recognize those actions lacks reliability. This limitation can be overcome by sensor fusion at the data, feature, and decision levels. This paper presents a study that developed a multimodal sensor system and used sensor fusion methods to enhance the reliability of action recognition. One step in assembling a Bukito 3D printer, which is composed of a sequence of 7 actions, was used to illustrate and assess the proposed method. Two wearable sensors, namely Myo armbands, captured both inertial measurement unit (IMU) and electromyography (EMG) signals from assembly workers. Microsoft Kinect, a vision-based sensor, simultaneously tracked their predefined skeleton joints. The collected IMU, EMG, and skeleton data were used to train five individual Convolutional Neural Network (CNN) models. Then, various fusion methods were implemented to integrate the predictions of the independent models to yield the final prediction. Reasons for achieving better performance using sensor fusion were identified from this study. 
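Of the fusion levels mentioned above, decision-level fusion is the simplest to illustrate: each modality's model emits a class-probability distribution, and the distributions are combined. A hypothetical sketch using a weighted average (the abstract does not specify which fusion rules were used):

```python
import numpy as np

def fuse_decisions(prob_list, weights=None):
    """Decision-level fusion: weighted average of per-model class
    probabilities. Returns the fused distribution and the predicted class."""
    probs = np.asarray(prob_list, float)  # shape (n_models, n_classes)
    if weights is None:
        weights = np.ones(len(probs))     # default: equal trust per model
    w = np.asarray(weights, float) / np.sum(weights)
    fused = (w[:, None] * probs).sum(axis=0)
    return fused, int(fused.argmax())
```

With per-model weights reflecting each modality's validation accuracy, a rule like this can outperform any single IMU, EMG, or skeleton model on its own.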