Physical human–robot collaboration is becoming more common, both in industrial and service robotics. Cooperative execution of a task requires intuitive and efficient interaction between both actors. For humans, this means being able to predict and adapt to robot movements. Given that natural human movement exhibits several robust features, we examined whether human–robot physical interaction is facilitated when these features are considered in robot control. The present study investigated how humans adapt to biological and nonbiological velocity patterns in robot movements. Participants held the end-effector of a robot that traced an elliptic path with either biological (two-thirds power law) or nonbiological velocity profiles. Participants were instructed to minimize the force applied on the robot end-effector. Results showed that the applied force was significantly lower when the robot moved with a biological velocity pattern. With extensive practice and enhanced feedback, participants were able to decrease their force when following a nonbiological velocity pattern, but never reached forces below those obtained with the two-thirds power law profile. These results suggest that some robust features observed in natural human movements are also strongly preferred in guided movements. Therefore, such features should be considered in human–robot physical collaboration.
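For concreteness, the biological profile above follows the two-thirds power law, which ties tangential speed to path curvature as v ∝ κ^(−1/3). Below is a minimal sketch (not the study's code; the semi-axes, lap period, and function name are illustrative) of how such a velocity profile can be generated for an elliptic path:

```python
import numpy as np

def ellipse_two_thirds_profile(a=0.20, b=0.12, n=2000, period=2.0):
    """Time stamps and points on an ellipse, re-timed so that tangential
    speed follows the two-thirds power law (v proportional to kappa**(-1/3))."""
    phi = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    x, y = a * np.cos(phi), b * np.sin(phi)

    # Curvature of an ellipse parameterized by phi.
    kappa = (a * b) / (a**2 * np.sin(phi)**2 + b**2 * np.cos(phi)**2) ** 1.5

    # Biological speed profile: slow down where curvature is high.
    speed = kappa ** (-1.0 / 3.0)

    # Convert speeds to time stamps along the path: dt = ds / v.
    ds = np.hypot(np.diff(x, append=x[0]), np.diff(y, append=y[0]))
    t = np.cumsum(ds / speed)
    t *= period / t[-1]          # rescale so one lap takes `period` seconds

    return t, np.column_stack([x, y])

t, path = ellipse_two_thirds_profile()   # feed (t, path) to a trajectory controller
```

A nonbiological profile for the same paradigm could be obtained by replacing `speed` with a constant, so the end-effector traverses the ellipse at uniform speed.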
Biomimetic learning of hand gestures in a humanoid robot
Hand gestures are a natural and intuitive form of communication, and integrating this communication method into robotic systems presents significant potential to improve human-robot collaboration. Recent advances in motor neuroscience have focused on replicating human hand movements from synergies, also known as movement primitives. Synergies, the fundamental building blocks of movement, are a potential strategy adopted by the central nervous system to generate and control movements. Identifying how synergies contribute to movement can help in the dexterous control of robots, exoskeletons, and prosthetics, and can extend their applications to rehabilitation. In this paper, 33 static hand gestures were recorded through a single RGB camera and identified in real time through the MediaPipe framework as participants made various postures with their dominant hand. Assuming an open palm as the initial posture, uniform joint angular velocities were obtained for all of these gestures. By applying a dimensionality reduction method, kinematic synergies were obtained from these joint angular velocities. Kinematic synergies that explain 98% of the variance of the movements were used to reconstruct new hand gestures using convex optimization. Reconstructed hand gestures and selected kinematic synergies were translated onto a humanoid robot, Mitra, in real time, as the participants demonstrated various hand gestures. The results showed that by using only a few kinematic synergies it is possible to generate various hand gestures with 95.7% accuracy. Furthermore, utilizing low-dimensional synergies to control high-dimensional end effectors holds promise for enabling near-natural human-robot collaboration.
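As a rough illustration of the pipeline described in the abstract (joint angular velocities → variance-thresholded kinematic synergies → reconstruction of new gestures), here is a minimal sketch. It assumes PCA as the dimensionality reduction method and a bounded least-squares fit as the convex reconstruction step; the array shapes and placeholder data are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.optimize import lsq_linear
from sklearn.decomposition import PCA

# velocities: one row per gesture of uniform joint angular velocities that move
# the hand from an open palm to that gesture (placeholder data, 33 x 21 here).
rng = np.random.default_rng(0)
velocities = rng.standard_normal((33, 21))

# Keep the smallest set of synergies explaining 98% of the variance.
pca = PCA(n_components=0.98).fit(velocities)
synergies = pca.components_                      # (n_synergies, n_joints)

# Reconstruct a new gesture as a combination of synergies by solving a convex,
# bound-constrained least-squares problem: min ||S.T @ w - v||^2, lb <= w <= ub.
new_gesture = rng.standard_normal(21)            # target joint velocity vector
fit = lsq_linear(synergies.T, new_gesture, bounds=(-5.0, 5.0))
reconstructed = synergies.T @ fit.x              # drives the robot hand joints
```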
- Award ID(s): 2053498
- PAR ID: 10538619
- Publisher / Repository: Frontiers
- Date Published:
- Journal Name: Frontiers in Human Neuroscience
- Volume: 18
- ISSN: 1662-5161
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- 
Brain-machine interfaces (BMIs) have become increasingly popular for restoring lost motor function in individuals with disabilities. Several studies suggest that rather than controlling each degree of freedom independently, the CNS may employ synergies, or movement primitives, as a control mechanism that simplifies the generation of complex movements. Our group has previously demonstrated neural decoding of synergy-based hand movements and used synergies effectively to drive hand exoskeletons. In this study, ten healthy right-handed participants performed six types of hand grasps representative of activities of daily living while their neural activity was recorded using electroencephalography (EEG). From half of the participants, hand kinematic synergies were derived and a neural decoder was developed, based on the correlation between hand synergies and the corresponding cortical activity, using multivariate linear regression. Using the synergies and the neural decoder derived from the first half of the participants, and only the cortical activities of the remaining half, their hand kinematics were reconstructed with an average accuracy above 70%. Potential applications of synergy-based BMIs for controlling assistive devices in individuals with upper limb motor deficits, implications of the results for individuals with stroke, and the limitations of the study are discussed. (A minimal sketch of this synergy-decoding step appears after this list.)
- 
The hypothesis that the central nervous system (CNS) makes use of synergies or movement primitives to achieve simple to complex movements has inspired the investigation of different types of synergies. Kinematic and muscle synergies have been extensively studied in the literature, but only a few studies have compared and combined both types of synergies during the control and coordination of the human hand. In this paper, synergies were extracted first independently (called kinematic and muscle synergies) and then combined through data fusion (called musculoskeletal synergies) from 26 activities of daily living in 22 individuals, using principal component analysis (PCA) and independent component analysis (ICA). The recorded kinematics and muscle activities were then reconstructed as weighted linear combinations of the musculoskeletal synergies. The performance of the musculoskeletal synergies in reconstructing the movements was compared to that of the synergies reported previously in the literature by us and others. The results indicate that the musculoskeletal synergies performed better than the synergies extracted without fusion, an improvement we attribute to the musculoskeletal synergies being generated from the cross-information between muscle and kinematic activities. Moreover, the synergies extracted using ICA performed better than those extracted using PCA. These musculoskeletal synergies could improve the capabilities of current methodologies for controlling high-dimensional prosthetics and exoskeletons. (A minimal sketch of the extraction and fusion steps appears after this list.)
- 
Converging evidence in human and animal models suggests that exogenous stimulation of the motor cortex (M1) elicits responses in the hand with a modular structure similar to that found during voluntary grasping movements. The aim of this study was to establish the extent to which modularity in muscle responses to transcranial magnetic stimulation (TMS) of M1 resembles modularity in muscle activation during voluntary hand movements involving finger fractionation. EMG was recorded from eight hand and forearm muscles in nine healthy individuals. Modularity was defined using non-negative matrix factorization to identify low-rank approximations (spatial muscle synergies) of the complex activation patterns of EMG data recorded during high-density TMS mapping of M1 and voluntary formation of gestures in the American Sign Language alphabet. Analysis of the synergies, as a set and individually, revealed greater-than-chance similarity between those derived from TMS and those derived from voluntary movement. Both datasets included synergies dominated by single intrinsic hand muscles, presumably to meet the demand for highly fractionated finger movement. These results suggest a cortical role in combining corticospinal connectivity to individual intrinsic hand muscles with modular multi-muscle activation via synergies. (A minimal sketch of the synergy extraction and comparison appears after this list.)
- 
As artificial intelligence and industrial automation are developing, human–robot collaboration (HRC) with advanced interaction capabilities has become an increasingly significant area of research. In this paper, we design and develop a real-time, multimodal HRC system using speech and gestures. A set of 16 dynamic gestures is designed for communication from a human to an industrial robot. A data set of dynamic gestures is designed and constructed, and it will be shared with the community. A convolutional neural network is developed to recognize the dynamic gestures in real time using the motion history image and deep learning methods. An improved open-source speech recognizer is used for real-time speech recognition of the human worker. An integration strategy is proposed to integrate the gesture and speech recognition results, and a software interface is designed for system visualization. A multi-threading architecture is constructed for simultaneously operating multiple tasks, including gesture and speech data collection and recognition, data integration, robot control, and software interface operation. The various methods and algorithms are integrated to develop the HRC system, with a platform constructed to demonstrate the system performance. The experimental results validate the feasibility and effectiveness of the proposed algorithms and the HRC system. (A minimal sketch of the motion-history-image gesture channel appears after this list.)
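For the synergy-based EEG decoder described in the first related work above, the following is a minimal sketch, assuming ordinary multivariate linear regression and placeholder data (channel counts, synergy counts, and variable names are illustrative, not from the study):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
eeg_features = rng.standard_normal((600, 64))      # trials x cortical features
synergy_weights = rng.standard_normal((600, 4))    # trials x synergy activations

# Fit the decoder on data from one group of participants ...
decoder = LinearRegression().fit(eeg_features[:300], synergy_weights[:300])

# ... then reconstruct hand kinematics for held-out data from EEG alone.
predicted_weights = decoder.predict(eeg_features[300:])
kinematic_synergies = rng.standard_normal((4, 20))  # synergies x joint DoFs
reconstructed_kinematics = predicted_weights @ kinematic_synergies
```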
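For the musculoskeletal synergies in the second related work above, here is a minimal sketch of extracting kinematic and muscle synergies with PCA and ICA, and of one plausible fusion step (concatenating the standardized recordings before decomposition); the fusion detail and all shapes are assumptions, not the authors' procedure:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(2)
kinematics = rng.standard_normal((1000, 18))     # samples x joint angles
emg = np.abs(rng.standard_normal((1000, 8)))     # samples x muscle envelopes

# Synergies extracted independently from each modality.
kinematic_synergies = PCA(n_components=5).fit(kinematics).components_
muscle_synergies = FastICA(n_components=4, random_state=0).fit(emg).components_

# Fused (musculoskeletal) synergies from the combined, z-scored recordings.
combined = np.hstack([
    (kinematics - kinematics.mean(0)) / kinematics.std(0),
    (emg - emg.mean(0)) / emg.std(0),
])
musculoskeletal_synergies = PCA(n_components=6).fit(combined).components_

# Reconstruct both data streams as a weighted linear combination of the fused set.
weights = combined @ np.linalg.pinv(musculoskeletal_synergies)
reconstruction = weights @ musculoskeletal_synergies
```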
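For the TMS study in the third related work above, a minimal sketch of the synergy analysis: non-negative matrix factorization of rectified EMG into spatial muscle synergies, followed by a simple cosine-similarity comparison between the TMS-evoked and voluntary synergy sets (placeholder data; the synergy count is an assumption):

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(3)
emg_tms = np.abs(rng.standard_normal((500, 8)))        # TMS-evoked responses, 8 muscles
emg_voluntary = np.abs(rng.standard_normal((500, 8)))  # ASL gesture formation

def spatial_synergies(emg, n=4):
    """Low-rank approximation EMG ~ W @ H; the rows of H are spatial synergies."""
    return NMF(n_components=n, init="nndsvda", max_iter=500).fit(emg).components_

syn_tms = spatial_synergies(emg_tms)
syn_vol = spatial_synergies(emg_voluntary)

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Pairwise similarity between the two synergy sets, to be compared against chance.
similarity = np.array([[cosine(s1, s2) for s2 in syn_vol] for s1 in syn_tms])
```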
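For the gesture channel of the multimodal HRC system in the last related work above, a minimal sketch of building a motion history image (MHI) from a frame sequence and scoring it with a small convolutional network; the decay constant, layer sizes, and 16-class output are assumptions for illustration, not the authors' network:

```python
import numpy as np
import torch
import torch.nn as nn

def motion_history_image(frames, tau=0.9, threshold=0.05):
    """Accumulate frame-to-frame motion into one decaying-intensity image."""
    mhi = np.zeros_like(frames[0], dtype=np.float32)
    for prev, curr in zip(frames[:-1], frames[1:]):
        moving = np.abs(curr.astype(np.float32) - prev.astype(np.float32)) > threshold
        mhi = np.where(moving, 1.0, tau * mhi)
    return mhi

classifier = nn.Sequential(                     # maps one MHI to 16 gesture classes
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.LazyLinear(16),
)

frames = [np.random.rand(64, 64) for _ in range(30)]        # placeholder grayscale clip
logits = classifier(torch.from_numpy(motion_history_image(frames)).float()[None, None])
```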