Title: Acoustic information about upper limb movement in voicing
We show that the human voice has complex acoustic qualities that are directly coupled to peripheral musculoskeletal tensioning of the body, such as subtle wrist movements. In this study, human vocalizers produced a steady-state vocalization while rhythmically moving the wrist or the arm at different tempos. Although listeners could only hear and not see the vocalizer, they were able to completely synchronize their own rhythmic wrist or arm movement with the movement of the vocalizer which they perceived in the voice acoustics. This study corroborates recent evidence suggesting that the human voice is constrained by bodily tensioning affecting the respiratory–vocal system. The current results show that the human voice contains a bodily imprint that is directly informative for the interpersonal perception of another’s dynamic physical states.
Award ID(s): 1735225
PAR ID: 10281436
Author(s) / Creator(s):
Date Published:
Journal Name: Proceedings of the National Academy of Sciences
Volume: 117
Issue: 21
ISSN: 0027-8424
Page Range / eLocation ID: 11364 to 11367
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Myoelectric control of prostheses is a long-established technique that uses surface electromyography (sEMG) to detect user intention and perform subsequent mechanical actions. Most machine learning models used in such control systems are trained on isolated movements that do not reflect the natural movements occurring during daily activities. Moreover, movements are often affected by arm posture, the duration of activities, and personal habits. It is therefore crucial that a control system for multi-degree-of-freedom (DoF) prosthetic arms be trained on sEMG data collected during activities of daily living (ADL) tasks. This work focuses on two major functional wrist movements, pronation-supination and the dart-throwing movement (DTM), and introduces a new wrist control system that directly maps sEMG signals to the joint velocities of the multi-DoF wrist. Additionally, a specific training strategy (Quick training) is proposed that enables the controller to be applied to new subjects and to handle situations where sensors shift during daily living, muscles become fatigued, or sensors become contaminated (e.g., due to sweat). The prosthetic wrist controller is designed based on data from 24 participants, and its performance is evaluated using root mean square error (RMSE) and Pearson correlation. The results depend on the characteristics of the tasks: tasks involving the dart-throwing motion show smaller RMSE values (Hammer: 6.68 deg/s and Cup: 7.92 deg/s) than tasks involving pronation-supination (Bulb: 43.98 deg/s and Screw: 53.64 deg/s). The proposed control technique with Quick training decreases the average RMSE by 35% and increases the average Pearson correlation by 40% across all four ADL tasks.
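The evaluation metrics mentioned above (RMSE in deg/s and Pearson correlation between predicted and measured joint velocities) can be illustrated with a minimal Python sketch. The velocity values and function names below are hypothetical and not taken from the study; the sketch only shows how such metrics are typically computed.

import numpy as np

def rmse(predicted, actual):
    # Root mean square error between predicted and measured joint velocities (deg/s).
    predicted, actual = np.asarray(predicted, float), np.asarray(actual, float)
    return float(np.sqrt(np.mean((predicted - actual) ** 2)))

def pearson_r(predicted, actual):
    # Pearson correlation between predicted and measured joint velocities.
    predicted, actual = np.asarray(predicted, float), np.asarray(actual, float)
    return float(np.corrcoef(predicted, actual)[0, 1])

# Hypothetical velocity traces (deg/s) for a single pronation-supination trial.
measured = np.array([0.0, 12.5, 30.1, 44.7, 38.2, 20.0, 5.3])
predicted = np.array([1.2, 10.8, 27.9, 48.0, 35.5, 22.4, 7.1])

print(f"RMSE: {rmse(predicted, measured):.2f} deg/s")
print(f"Pearson r: {pearson_r(predicted, measured):.2f}")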
  2. Effective physical human-robot interaction (pHRI) depends on how humans communicate their movement intentions to others. While small interaction forces are thought to carry significant information about specific movement intentions during physical human-human interaction (pHHI), the mechanism by which humans infer intention from such small forces is largely unknown. The hypothesis in this work is that the sensitivity to a small interaction force applied at the hand is affected by the movement of the arm, which in turn is affected by arm stiffness. A haptic robot was used to apply endpoint interaction forces to the arm of seated human participants, who were asked to determine which of four directions the robot interaction force was applied in, without visual feedback. Different levels of interaction force and arm muscle contraction were applied. The results imply that the participants' ability to identify and respond to the correct direction of small interaction forces was lower when their arm movement was more closely aligned with the force direction. In addition, sensitivity to the direction of the small interaction force was high when arm stiffness was low. It is also speculated that humans lower their arm stiffness to become more sensitive to smaller interaction forces. These results will help develop human-like pHRI systems for various applications.
  3. Monitoring human gait is essential for quantifying gait issues in fall-prone individuals as well as in other gait-related movement disorders. Being portable and cost-effective, ambulatory gait analysis using inertial sensors is considered a promising alternative to the traditional laboratory-based approach. The current study aimed to provide a method for predicting spatio-temporal gait parameters using wrist-worn inertial sensors. Eight young adults were involved in a laboratory study. An optical motion analysis system and force plates were used for the assessment of baseline gait parameters. Spatio-temporal features of an inertial measurement unit (IMU) worn on the wrist were analyzed. Multivariate correlation analyses were performed to develop gait parameter prediction models. The results indicated that gait stride time was strongly correlated with the peak-to-peak duration of the wrist gyroscope signal in the anterior-posterior direction. Meanwhile, gait stride length was successfully predicted using a combination model of peak resultant wrist acceleration and peak sagittal wrist angle. In conclusion, the current study provided evidence that wrist-worn inertial sensors are capable of estimating spatio-temporal gait parameters. This finding lays the foundation for developing a wrist-worn gait monitor with high user compliance.
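A minimal Python sketch of the two predictions described above: stride time estimated from the peak-to-peak interval of the anterior-posterior wrist gyroscope signal, and stride length from a linear combination of peak resultant acceleration and peak sagittal wrist angle. The sampling rate, peak-detection settings, and regression coefficients below are placeholders, not the values fitted in the study.

import numpy as np
from scipy.signal import find_peaks

FS = 100.0  # assumed IMU sampling rate (Hz); not specified here

def stride_time_from_gyro(gyro_ap):
    # Mean stride time (s) estimated as the mean peak-to-peak interval of the
    # anterior-posterior wrist gyroscope signal.
    peaks, _ = find_peaks(gyro_ap, distance=int(0.5 * FS))  # assume strides >= 0.5 s apart
    return float(np.mean(np.diff(peaks)) / FS)

def stride_length_model(peak_resultant_acc, peak_sagittal_angle, coeffs=(0.35, 0.004, 0.01)):
    # Hypothetical linear combination model:
    #   stride_length = b0 + b1 * peak resultant acceleration + b2 * peak sagittal wrist angle
    # Placeholder coefficients, not those fitted in the study.
    b0, b1, b2 = coeffs
    return b0 + b1 * peak_resultant_acc + b2 * peak_sagittal_angle

# Synthetic arm-swing-like gyroscope trace at roughly a 0.9 Hz stride frequency.
t = np.arange(0, 10, 1 / FS)
gyro_ap = 80 * np.sin(2 * np.pi * 0.9 * t)

print(f"Estimated stride time: {stride_time_from_gyro(gyro_ap):.2f} s")
print(f"Estimated stride length: {stride_length_model(9.5, 40.0):.2f} m")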
  4. When visual and proprioceptive estimates of hand position disagree (e.g., viewing the hand underwater), the brain realigns them to reduce mismatch. This perceptual change is reflected in primary motor cortex (M1) excitability, suggesting potential relevance for hand movement. Here, we asked whether fingertip visuo-proprioceptive misalignment affects only the brain’s representation of that finger (somatotopically focal), or extends to other parts of the limb that would be needed to move the misaligned finger (somatotopically broad). In Experiments 1 and 2, before and after misaligned or veridical visuo-proprioceptive training at the index finger, we used transcranial magnetic stimulation to assess M1 representation of five hand and arm muscles. The index finger representation showed an association between M1 excitability and visuo-proprioceptive realignment, as did the pinkie finger representation to a lesser extent. Forearm flexors, forearm extensors, and biceps did not show any such relationship. In Experiment 3, participants indicated their proprioceptive estimate of the fingertip, knuckle, wrist, and elbow, before and after misalignment at the fingertip. Proprioceptive realignment at the knuckle, but not the wrist or elbow, was correlated with realignment at the fingertip. These results suggest the effects of visuo-proprioceptive mismatch are somatotopically focal in both sensory and motor domains.
  5. Full-body motion capture is essential for the study of body movement. Video-based, markerless mocap systems are, in some cases, replacing marker-based systems, but hybrid systems are less explored. We develop methods for coregistration between 2D video and 3D marker positions when precise spatial relationships are not known a priori. We illustrate these methods on three-ball cascade juggling, in which marker-based tracking of the balls was not possible and no tracking of the hands was possible due to occlusion. Using recorded video and motion capture, we aimed to transform 2D ball coordinates into 3D body space as well as recover details of hand motion. We proposed four linear coregistration methods that differ in how they optimize ball-motion constraints during hold and flight phases, using an initial estimate of hand position based on arm and wrist markers. We found that minimizing the error between the ball and the hand estimate was globally suboptimal, distorting ball flight trajectories. The best-performing method used gravitational constraints to transform vertical coordinates and ball-hold constraints to transform lateral coordinates. This method enabled an accurate description of ball flight as well as a reconstruction of wrist movements. We discuss these findings in the broader context of video/motion capture coregistration.
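A minimal Python sketch of two building blocks consistent with the coregistration approach described above: an affine least-squares map from 2D ball pixel coordinates to 3D marker space using hold-phase correspondences, and a gravity-based scale for the vertical axis estimated from a flight phase. Array shapes, variable names, and the simulated data are assumptions for illustration, not the paper's implementation.

import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def fit_hold_phase_affine(ball_uv, hand_xyz):
    # Least-squares affine map from 2D ball pixel coordinates to 3D marker space,
    # using hold-phase frames where the ball should coincide with the wrist/arm-based
    # hand estimate. A baseline sketch, not the paper's best-performing method.
    uv1 = np.hstack([ball_uv, np.ones((len(ball_uv), 1))])  # (N, 3) homogeneous pixels
    A, *_ = np.linalg.lstsq(uv1, hand_xyz, rcond=None)      # (3, 3): maps [u, v, 1] -> [x, y, z]
    return A

def vertical_scale_from_gravity(ball_v_px, t):
    # Pixels-to-metres scale for the vertical axis from one flight phase: fit a
    # parabola to the vertical pixel trajectory and match its curvature to g.
    coeffs = np.polyfit(t, ball_v_px, 2)  # v_px(t) ~ a*t^2 + b*t + c
    accel_px = 2.0 * coeffs[0]            # pixel-space vertical acceleration
    return G / abs(accel_px)              # metres per pixel

# Simulated flight phase sampled at 120 fps, with roughly 200 px per metre.
t = np.linspace(0.0, 0.5, 60)
ball_v_px = 400.0 - 300.0 * t + 0.5 * (200.0 * G) * t**2
print(f"Vertical scale: {vertical_scale_from_gravity(ball_v_px, t):.4f} m/px")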