Considerations for End-User Development in the Caregiving Domain
As service robots become more capable of autonomous behaviors, it becomes increasingly important to consider how people will be able to communicate with a robot about what task it should perform and how to do the task. There has been a rise in attention to end-user development (EUD), where researchers create interfaces that enable non-roboticist end users to script tasks for autonomous robots to perform. Currently, state-of-the-art interfaces are largely constrained, often through simplified domains or restrictive end-user interaction. Motivated by our past qualitative design work exploring how to integrate a care robot in an assisted living community, we discuss challenges of EUD in this complex domain. One set of challenges stems from different user-facing representations, e.g., certain tasks may lend themselves better to a rule-based trigger-action representation, whereas other tasks may be easier to specify via a sequence of actions. The other stems from considering the needs of multiple stakeholders, e.g., caregivers and residents of the facility may all create tasks for the robot, but the robot may not be able to share information about all tasks with all residents due to privacy concerns. We present scenarios that illustrate these challenges and also discuss possible solutions.
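The two user-facing representations contrasted above can be sketched roughly in code. This is an illustrative Python sketch only; the class names (`TriggerActionRule`, `ActionSequence`) and the example tasks are hypothetical, not part of any system described in these abstracts:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class TriggerActionRule:
    """Rule-based representation: 'when <trigger> holds, do <action>'."""
    trigger: Callable[[dict], bool]   # predicate over the robot's world state
    action: Callable[[], None]

    def tick(self, world_state: dict) -> bool:
        # Fire the action only when the trigger predicate is satisfied.
        if self.trigger(world_state):
            self.action()
            return True
        return False

@dataclass
class ActionSequence:
    """Sequential representation: a fixed, ordered list of actions."""
    steps: List[Callable[[], None]] = field(default_factory=list)

    def run(self) -> None:
        for step in self.steps:
            step()

# A recurring reminder fits the rule form; a one-off fetch fits the sequence form.
log: List[str] = []
reminder = TriggerActionRule(
    trigger=lambda s: s.get("time") == "09:00",
    action=lambda: log.append("remind resident: medication"),
)
fetch = ActionSequence(steps=[
    lambda: log.append("navigate to kitchen"),
    lambda: log.append("pick up water bottle"),
    lambda: log.append("deliver to resident"),
])

reminder.tick({"time": "09:00"})
fetch.run()
```

The point of the contrast: the rule form reacts to state and can fire repeatedly, while the sequence form encodes a fixed order the user walks through once, which is why different caregiving tasks may favor one over the other.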
- Award ID(s):
- 1925043
- PAR ID:
- 10536755
- Publisher / Repository:
- Vol. 2 No. 1: Proceedings of the 2023 AAAI Fall Symposia
- Date Published:
- Journal Name:
- Proceedings of the AAAI Symposium Series
- Volume:
- 2
- Issue:
- 1
- ISSN:
- 2994-4317
- Page Range / eLocation ID:
- 532 to 536
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
How can we enable users to create effective, perception-driven task plans for collaborative robots? We conducted a 35-person user study with the Behavior Tree-based CoSTAR system to determine which strategies for end-user creation of generalizable robot task plans are most usable and effective. CoSTAR allows domain experts to author complex, perceptually grounded task plans for collaborative robots. As a part of CoSTAR's wide range of capabilities, it allows users to specify SmartMoves: abstract goals such as "pick up component A from the right side of the table." Users were asked to perform pick-and-place assembly tasks with either SmartMoves or one of three simpler baseline versions of CoSTAR. Overall, participants found CoSTAR to be highly usable, with an average System Usability Scale score of 73.4 out of 100. SmartMove also helped users perform tasks faster and more effectively; all SmartMove users completed the first two tasks, while not all users completed the tasks using the other strategies. SmartMove users showed better performance than baseline users at incorporating perception across all three tasks.
-
Collaborative robots promise to transform work across many industries and promote "human-robot teaming" as a novel paradigm. However, realizing this promise requires an understanding of how existing tasks, developed for and performed by humans, can be effectively translated into tasks that robots can perform singularly or that human-robot teams can perform collaboratively. In the interest of developing tools that facilitate this process, we present Authr, an end-to-end task authoring environment that assists engineers at manufacturing facilities in translating existing manual tasks into plans applicable for human-robot teams and simulates these plans as they would be performed by the human and robot. We evaluated Authr with two user studies, which demonstrate the usability and effectiveness of Authr as an interface and the benefits of assistive task allocation methods for designing complex tasks for human-robot teams. We discuss the implications of these findings for the design of software tools for authoring human-robot collaborative plans.
-
For collaborative robots to become useful, end users who are not robotics experts must be able to instruct them to perform a variety of tasks. With this goal in mind, we developed a system for end-user creation of robust task plans with a broad range of capabilities. CoSTAR, the Collaborative System for Task Automation and Recognition, is our winning entry in the 2016 KUKA Innovation Award competition at the Hannover Messe trade show, which this year focused on Flexible Manufacturing. CoSTAR is unique in how it creates natural abstractions that use perception to represent the world in a way users can both understand and utilize to author capable and robust task plans. Our Behavior Tree-based task editor integrates high-level information from known object segmentation and pose estimation with spatial reasoning and robot actions to create robust task plans. We describe the cross-platform design and implementation of this system on multiple industrial robots and evaluate its suitability for a wide variety of use cases.
-
Adebisi, John (Ed.) Non-expert users can now program robots using various end-user robot programming methods, which have widened the use of robots and lowered barriers preventing robot use by laypeople. Kinesthetic teaching is a common form of end-user robot programming, allowing users to forgo writing code by physically guiding the robot to demonstrate behaviors. Although it can be more accessible than writing code, kinesthetic teaching is difficult in practice because of users' unfamiliarity with kinematics or limitations of robots and programming interfaces. Developing good kinesthetic demonstrations requires physical and cognitive skills, such as the ability to plan effective grasps for different task objects and constraints, to overcome programming difficulties. How to help users learn these skills remains a largely unexplored question, with users conventionally learning through self-guided practice. Our study compares self-guided practice with curriculum-based training in building users' programming proficiency. While we found no significant differences between participants who learned through practice and those who learned through our curriculum, our study reveals insights into factors contributing to end-user robot programmers' confidence and success during programming and how learning interventions may contribute to such factors. Our work paves the way for further research on how to best structure training interventions for end-user robot programmers.
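The Behavior Tree-based task plans and perception-grounded SmartMove goals described in the CoSTAR abstracts above can be sketched, very roughly, as a tick-based tree whose condition nodes query perception before motion actions run. This is a minimal illustrative sketch; the class names, the world model, and the SmartMove-style condition are assumptions for exposition, not CoSTAR's actual implementation:

```python
from typing import Callable, List

SUCCESS, FAILURE = "success", "failure"

class Node:
    """Base behavior-tree node: tick() returns a status string."""
    def tick(self) -> str:
        raise NotImplementedError

class Condition(Node):
    """Leaf that checks a predicate (e.g., a perception query)."""
    def __init__(self, predicate: Callable[[], bool]):
        self.predicate = predicate
    def tick(self) -> str:
        return SUCCESS if self.predicate() else FAILURE

class Action(Node):
    """Leaf that performs a side effect (e.g., a robot motion)."""
    def __init__(self, effect: Callable[[], None]):
        self.effect = effect
    def tick(self) -> str:
        self.effect()
        return SUCCESS

class Sequence(Node):
    """Composite: ticks children in order, failing fast on the first failure."""
    def __init__(self, children: List[Node]):
        self.children = children
    def tick(self) -> str:
        for child in self.children:
            if child.tick() == FAILURE:
                return FAILURE
        return SUCCESS

# Perception-grounded pick task: a SmartMove-style abstract goal checks the
# detected object before any motion, so the motions run only when perception
# confirms the target is present.
world = {"component_A_detected": True}   # stand-in for a perception pipeline
trace: List[str] = []

pick_task = Sequence([
    Condition(lambda: world["component_A_detected"]),
    Action(lambda: trace.append("move to component A")),
    Action(lambda: trace.append("close gripper")),
])

result = pick_task.tick()
```

Flipping `world["component_A_detected"]` to `False` makes the sequence fail before any motion runs, which is the fail-fast property that makes perception-gated task plans robust.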