An important component of effective human–robot collaboration is the compatibility of their movements, especially when humans physically collaborate with a robot partner. Following previous findings that humans interact more seamlessly with a robot that moves with human-like or biological velocity profiles, this study examined whether humans can adapt to a robot that violates these human movement signatures. The specific focus was on the role of extensive practice and real-time augmented feedback. Six groups of participants physically tracked a robot tracing an ellipse with profiles in which velocity scaled with the curvature of the path in biological and non-biological ways, while instructed to minimize the interaction force with the robot. Three of the six groups received real-time visual feedback about their force error. Results showed that with three daily practice sessions, when given feedback about their force errors, humans could decrease their interaction forces when the robot's trajectory violated human-like velocity patterns. Conversely, when augmented feedback was not provided, there were no improvements despite this extensive practice. The biological profile showed no improvements, even with feedback, indicating that the (non-zero) interaction force had already reached a floor level. These findings highlight the importance of biological robot trajectories and of augmented feedback for guiding humans to adapt to non-biological movements in physical human–robot interaction. These results have implications for various fields of robotics, such as surgical applications and collaborative robots for industry.
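The abstract above contrasts biological and non-biological velocity profiles along an ellipse but does not spell out the exact profiles used. The standard biological velocity–curvature relation in the motor-control literature is the two-thirds power law, under which tangential speed scales as curvature to the power -1/3, so movement slows in high-curvature segments. The sketch below is a minimal illustration under that assumption; the ellipse dimensions, gain, and the choice of constant speed as the non-biological profile are illustrative, not taken from the study.

```python
# Minimal sketch (not the study's code): biological vs. non-biological speed
# profiles along an ellipse, assuming the biological profile follows the
# two-thirds power law, v = k * curvature**(-1/3).
import numpy as np

a, b = 0.20, 0.10                       # ellipse semi-axes in meters (assumed)
theta = np.linspace(0.0, 2.0 * np.pi, 500)

# Curvature of the ellipse (a*cos(theta), b*sin(theta))
kappa = (a * b) / (a**2 * np.sin(theta)**2 + b**2 * np.cos(theta)**2) ** 1.5

k = 0.05                                # speed gain, chosen arbitrarily
v_biological = k * kappa ** (-1.0 / 3.0)                     # slows at high curvature
v_nonbiological = np.full_like(theta, v_biological.mean())   # constant speed

print(f"biological speed range: {v_biological.min():.3f}-{v_biological.max():.3f} m/s")
print(f"non-biological (constant) speed: {v_nonbiological[0]:.3f} m/s")
```

A robot tracing the ellipse with the first profile would decelerate into the tight ends of the ellipse the way a human hand does; a constant-speed profile is one simple way to violate that signature.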
Physically Assistive Robots: A Systematic Review of Mobile and Manipulator Robots That Physically Assist People with Disabilities
                        
                    
    
More than 1 billion people in the world are estimated to experience significant disability. These disabilities can impact people's ability to independently conduct activities of daily living, including ambulating, eating, dressing, taking care of personal hygiene, and more. Mobile and manipulator robots, which can move about human environments and physically interact with objects and people, have the potential to assist people with disabilities in activities of daily living. Although the vision of physically assistive robots has motivated research across subfields of robotics for decades, such robots have only recently become feasible in terms of capabilities, safety, and price. More and more research involves end-to-end robotic systems that interact with people with disabilities in real-world settings. In this article, we survey papers about physically assistive robots intended for people with disabilities from top conferences and journals in robotics, human–computer interaction, and accessible technology to identify general trends and research methodologies. We then dive into three specific research themes—interaction interfaces, levels of autonomy, and adaptation—and present frameworks for how these themes manifest across physically assistive robot research. We conclude with directions for future research.
- Award ID(s): 1924435
- PAR ID: 10492877
- Publisher / Repository: Annual Reviews
- Date Published:
- Journal Name: Annual Review of Control, Robotics, and Autonomous Systems
- Volume: 7
- Issue: 1
- ISSN: 2573-5144
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Indoor robots hold the promise of automatically handling mundane daily tasks, helping to improve access for people with disabilities, and providing on-demand access to remote physical environments. Unfortunately, the ability to understand never-before-seen objects in scenes where new items may be added (e.g., purchased) or altered (e.g., damaged) on a regular basis remains an open challenge for robotics. In this paper, we introduce EURECA, a mixed-initiative system that leverages online crowds of human contributors to help robots robustly identify 3D point cloud segments corresponding to user-referenced objects in near real-time. EURECA allows robots to understand multi-object 3D scenes on-the-fly (in ∼40 seconds) by providing groups of non-expert crowd workers with intelligent tools that can segment objects more quickly (∼70% faster) and more accurately than individuals. More broadly, EURECA introduces the first real-time crowdsourcing tool that addresses the challenge of learning about new objects in real-world settings, creating a new source of data for training robots online, as well as a platform for studying mixed-initiative crowdsourcing workflows for understanding 3D scenes.
- Assistive mobile robots can play an important role in supporting individuals with disabilities. While the field of robot control interfaces for individuals with disabilities is growing, there is little work on such systems for child end users specifically. Accordingly, we pursued the design of an adapted robot control interface for use in pediatric occupational therapy (OT). Our target end user, a nine-year-old child with cerebral palsy, leveraged the interface to perform instrumental activities of daily living (e.g., play) with a modern mobile manipulator. We used an iterative design process to adjust and improve the interface via input from the participant's caregivers and occupational therapist, as well as objective participant performance data. Furthermore, we tested the participant's ability to utilize our interface by creating two testing cases: a control case (in which our participant performed standard ADL/IADL tasks) and an experimental case (in which our participant performed ADL/IADL practice activities more tailored toward the child). Key insights during the process included the need for sensitivity to taking up space on the child user's existing power wheelchair, the advantages of integrating technologies familiar to the child (e.g., gaming controls, iPads) in our system design, and the potential value of integrating playful mischief (including playful interactions between the child, their caregivers, and their clinicians) as part of the playbook for pediatric OT. This work can serve to inform and augment new OT strategies for the marginalized population of young children with disabilities.
- BACKGROUND: Although a number of research studies on sensor technology for smart home environments have been conducted, there is still a lack of consideration of human factors in implementing sensor technology in the homes of older adults with visual disabilities. OBJECTIVE: This paper aims to advance knowledge of how sensor technology (e.g., Microsoft Kinect) should be implemented in the homes of those with visual disabilities. METHODS: A convenience sample of 20 older adults with visual disabilities allowed us to observe their home environments and interview them about their activities of daily living, which were analyzed via inductive content analysis. RESULTS: Sensor technology should be integrated into the living environments of those with visual disabilities by considering various contexts, including people, tasks, tools, and environments (i.e., level-1 categories), which were further broken down into 22 level-2 categories and 28 level-3 categories. Each sub-category included adequate guidelines, which were also sorted by sensor location, sensor type, and data analysis. CONCLUSIONS: The guidelines will be helpful for researchers and professionals in implementing sensor technology in the homes of older adults with visual disabilities.
- In this work, we analyze video data and interviews from a public deployment of two trash barrel robots in a large public space to better understand the sensemaking activities people perform when they encounter robots in public spaces. Based on an analysis of 274 human–robot interactions and interviews with N = 65 individuals or groups, we discovered that people were responding not only to the robots or their behavior, but also to the general idea of deploying robots as trashcans, and the larger social implications of that idea. They wanted to understand details about the deployment because having that knowledge would change how they interact with the robot. Based on our data and analysis, we have provided implications for design that may be topics for future human–robot design researchers who are exploring robots for public space deployment. Furthermore, our work offers a practical example of analyzing field data to make sense of robots in public spaces.