Reading fluency is a vital building block for developing literacy, yet the best way to practice fluency—reading aloud—can cause anxiety severe enough to inhibit literacy development, with adverse effects that can persist into adulthood. One promising intervention to mitigate oral reading anxiety is to have children read aloud to a robot. Although observations in prior work suggest that people feel more comfortable in the presence of a robot than a human, few studies have empirically demonstrated that people feel less anxious performing in front of a robot than in front of a human, or have used objective physiological indicators to identify decreased anxiety. To investigate whether a robotic reading companion could reduce the reading anxiety children feel, we conducted a within-subjects study in which children aged 8 to 11 years (n = 52) read aloud individually to a human and to a robot while being monitored for physiological responses associated with anxiety. We found that children exhibited fewer physiological indicators of anxiety, specifically vocal jitter and heart rate variability, when reading to the robot than when reading to a person. This paper provides strong evidence that a robot's presence affects the anxiety a person experiences while performing a task, offering justification for using robots in a wide array of social interactions that may be anxiety inducing.
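To make the two indicators above concrete, here is a minimal, hypothetical sketch of how they are commonly computed; the data, function names, and signal-processing choices are illustrative assumptions, not the study's actual analysis pipeline.

```python
# Hypothetical illustration of the two anxiety indicators named above;
# input data and definitions are assumptions, not the study's pipeline.
import numpy as np

def local_jitter(pitch_periods_s: np.ndarray) -> float:
    """Local vocal jitter: mean absolute difference between consecutive
    glottal pitch periods, normalized by the mean period."""
    return float(np.abs(np.diff(pitch_periods_s)).mean() / pitch_periods_s.mean())

def rmssd(rr_intervals_ms: np.ndarray) -> float:
    """RMSSD, a common time-domain heart rate variability statistic:
    root mean square of successive differences between heartbeat intervals."""
    return float(np.sqrt(np.mean(np.diff(rr_intervals_ms) ** 2)))

# Toy data: a ~200 Hz voice with small period perturbations, and ~75 bpm RR intervals.
rng = np.random.default_rng(0)
periods = 0.005 + 0.0001 * rng.standard_normal(100)
rr = 800 + 25 * rng.standard_normal(60)
print(f"local jitter: {local_jitter(periods):.4f}")  # higher -> more vocal instability
print(f"RMSSD: {rmssd(rr):.1f} ms")                  # lower -> less HR variability
```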
A Social Robot for Anxiety Reduction via Deep Breathing
In this paper, we introduce Ommie, a novel robot that supports deep breathing practices for the purposes of anxiety reduction. The robot’s primary function is to guide users through a series of extended inhales, exhales, and holds by way of haptic interactions and audio cues. We present core design decisions during development, such as robot morphology and tactility, as well as the results of a usability study in collaboration with a local wellness center. Interacting with Ommie resulted in a significant reduction in STAI-6 anxiety measures, and participants found the robot intuitive, approachable, and engaging. Participants also reported feelings of focus and companionship when using the robot, often elicited by the haptic interaction. These results show promise in the robot’s capacity for supporting mental health.
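As a rough sketch of the interaction pattern described above, the loop below paces a user through timed inhale, hold, and exhale phases; the phase durations and the cue stub are assumptions, not Ommie's actual timings or API.

```python
# Hypothetical paced-breathing loop in the spirit of Ommie's guidance;
# durations and cue mechanisms are assumed, not taken from the paper.
import time

BREATH_CYCLE = [("inhale", 4.0), ("hold", 2.0), ("exhale", 6.0)]  # seconds per phase

def cue(phase: str) -> None:
    # Stand-in for the robot's haptic expansion/contraction and audio cues.
    print(f"[robot] {phase}...")

def run_session(cycles: int) -> None:
    for _ in range(cycles):
        for phase, duration in BREATH_CYCLE:
            cue(phase)
            time.sleep(duration)

if __name__ == "__main__":
    run_session(cycles=3)  # a short practice session
```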
- PAR ID: 10354175
- Date Published:
- Journal Name: 2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)
- Sponsoring Org: National Science Foundation
More Like this
-
For robots to seamlessly interact with humans, we first need to make sure that humans and robots understand one another. Diverse algorithms have been developed to enable robots to learn from humans (i.e., transferring information from humans to robots). In parallel, visual, haptic, and auditory communication interfaces have been designed to convey the robot’s internal state to the human (i.e., transferring information from robots to humans). Prior research often separates these two directions of information transfer, and focuses primarily on either learning algorithms or communication interfaces. By contrast, in this survey we take an interdisciplinary approach to identify common themes and emerging trends that close the loop between learning and communication. Specifically, we survey state-of-the-art methods and outcomes for communicating a robot’s learning back to the human teacher during human-robot interaction. This discussion connects human-in-the-loop learning methods and explainable robot learning with multimodal feedback systems and measures of human-robot interaction. We find that—when learning and communication are developed together—the resulting closed-loop system can lead to improved human teaching, increased human trust, and human-robot co-adaptation. The paper includes a perspective on several of the interdisciplinary research themes and open questions that could advance how future robots communicate their learning to everyday operators. Finally, we implement a selection of the reviewed methods in a case study where participants kinesthetically teach a robot arm. This case study documents and tests an integrated approach for learning in ways that can be communicated, conveying this learning across multimodal interfaces, and measuring the resulting changes in human and robot behavior.
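To illustrate the closed loop the survey describes, here is a deliberately simple sketch (not any surveyed system): a robot estimates a goal from kinesthetic demonstrations and communicates its remaining uncertainty back to the teacher after each one, so the human can decide where more teaching is needed.

```python
# Minimal closed-loop sketch (an assumption, not a surveyed system): the robot
# learns a goal position from demonstrations and reports its uncertainty back.
import statistics

class GoalLearner:
    def __init__(self) -> None:
        self.demos: list[float] = []

    def add_demonstration(self, goal: float) -> None:
        self.demos.append(goal)

    def communicate(self) -> str:
        mean = statistics.fmean(self.demos)
        spread = statistics.stdev(self.demos) if len(self.demos) > 1 else float("inf")
        # Stand-in for a visual/haptic/auditory interface conveying confidence.
        return f"estimate={mean:.2f}, uncertainty={spread:.2f}"

learner = GoalLearner()
for demo in [0.52, 0.48, 0.50]:  # kinesthetic demonstrations of the same goal
    learner.add_demonstration(demo)
    print(learner.communicate())  # uncertainty shrinks as demonstrations accumulate
```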
-
Physical interaction between humans and robots can help robots learn to perform complex tasks. The robot arm gains information by observing how the human kinesthetically guides it throughout the task. While prior works focus on how the robot learns, it is equally important that this learning is transparent to the human teacher. Visual displays that show the robot’s uncertainty can potentially communicate this information; however, we hypothesize that visual feedback mechanisms miss out on the physical connection between the human and robot. In this work we present a soft haptic display that wraps around and conforms to the surface of a robot arm, adding a haptic signal at an existing point of contact without significantly affecting the interaction. We demonstrate how soft actuation creates a salient haptic signal while still allowing flexibility in device mounting. Using a psychophysics experiment, we show that users can accurately distinguish inflation levels of the wrapped display with an average Weber fraction of 11.4%. When we place the wrapped display around the arm of a robotic manipulator, users are able to interpret and leverage the haptic signal in sample robot learning tasks, improving identification of areas where the robot needs more training and enabling the user to provide better demonstrations. See videos of our device and user studies here: https://youtu.be/tX-2Tqeb9Nw
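For readers unfamiliar with the psychophysics measure cited above: the Weber fraction is the just-noticeable change in a stimulus divided by its reference intensity. The sketch below shows the arithmetic with made-up pressure values; only the 11.4% average comes from the abstract.

```python
# Weber fraction: just-noticeable difference (JND) / reference intensity.
# The pressure values below are invented for illustration.
def weber_fraction(jnd: float, reference: float) -> float:
    return jnd / reference

# e.g., needing ~1.14 kPa of inflation change to notice at a 10 kPa baseline:
print(f"{weber_fraction(1.14, 10.0):.1%}")  # -> 11.4%
```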
-
This paper presents our preliminary study on enabling individuals who are legally blind to safely operate mobile robots and vehicles. To achieve this, we developed a teleoperation system with accessibility at its core. The system incorporates features that enhance usability and situational awareness, including assistive control based on artificial potential fields to prevent collisions and ensure smooth navigation. It also provides multimodal feedback through (a) haptic vibrations on the gamepad controller, which convey the proximity of nearby objects detected by the robot's laser sensor, and (b) color-coded overlays that differentiate paths, obstacles, and people through semantic segmentation performed by a deep neural network on the robot’s camera feed. To evaluate its effectiveness, we partnered with the Austin Lighthouse to conduct experiments in which legally blind participants used the system to successfully guide the robot through a testing area with obstacles.
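A minimal sketch of the kind of artificial-potential-field assist mentioned above (not the authors' implementation, and all gains are assumed): the operator's command acts as an attractive force, and laser-detected obstacles within an influence radius exert repulsive forces that bend the commanded velocity away from collisions.

```python
# Hedged sketch of artificial-potential-field assistive control;
# gains, radii, and the interface to the real robot are assumptions.
import numpy as np

K_ATT, K_REP, INFLUENCE = 1.0, 0.5, 1.5  # gains and obstacle influence radius (m)

def assisted_velocity(cmd: np.ndarray, obstacles: list) -> np.ndarray:
    force = K_ATT * cmd                  # attractive term: follow the operator's command
    for obs in obstacles:                # obs: obstacle position relative to the robot
        d = float(np.linalg.norm(obs))
        if 0.0 < d < INFLUENCE:
            # Repulsion grows as the obstacle gets closer, pointing away from it.
            force += K_REP * (1.0 / d - 1.0 / INFLUENCE) * (-obs / d) / d**2
        # (obstacles beyond INFLUENCE exert no force)
    return force

cmd = np.array([1.0, 0.0])               # operator pushes straight ahead
obstacles = [np.array([0.8, 0.1])]       # an obstacle just ahead, slightly to the right
print(assisted_velocity(cmd, obstacles)) # resulting velocity bends away from the obstacle
```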
-
In this experiment, we investigated how a robot’s violation of several social norms influences human engagement with and perception of that robot. Each participant in our study (n = 80) played 30 rounds of rock-paper-scissors with a robot. In the three experimental conditions, the robot violated a social norm by cheating, cursing, or insulting the participant during gameplay. In the control condition, the robot conducted a non-norm violating behavior by stretching its hand. During the game, we found that participants had strong emotional reactions to all three social norm violations. However, participants spoke more words to the robot only after it cheated. After the game, participants were more likely to describe the robot as an agent only if they were in the cheating condition. These results imply that while social norm violations do elicit strong immediate reactions, only cheating elicits a significantly stronger prolonged perception of agency.