Soft robotics enriches robotic functionalities by engineering soft materials and electronics toward enhanced compliance, adaptivity, and friendly human-machine interaction. This decade has witnessed extraordinary progress in scaling soft robotics down to small sizes for a wide range of promising applications, including medical and surgical soft robots, wearable and rehabilitation robots, and exploration of unstructured environments. This perspective highlights recent research efforts in miniature soft robotics in a brief yet comprehensive way in terms of actuation, powering, design, fabrication, control, and applications across four sections. Section 2 discusses the key aspects of materials selection and structural design for small-scale tethered and untethered actuation and powering, including fluidic actuation, stimuli-responsive actuation, and soft living biohybrid materials, as well as structural forms from 1D to 3D. Section 3 discusses advanced manufacturing techniques for fabricating miniature soft robots at small scales, including lithography, mechanical self-assembly, additive manufacturing, tissue engineering, and other fabrication methods. Section 4 discusses the control systems used in miniature robots, including off-board/onboard controls and artificial intelligence-based controls. Section 5 discusses their potential broad applications in healthcare, small-scale object manipulation and processing, and environmental monitoring. Finally, outlooks on the challenges and opportunities are discussed.
This content will become publicly available on November 26, 2025
Robots and Dance: A Promising Young Alchemy
Research at the intersection of robots and dance promises to create vehicles for expression that enable new creative pursuits and allow robots to function better, especially in human-facing scenarios. Moving this research beyond fringe spectacle and establishing it as a serious, systematic field—a proper subdiscipline of both robotics and dance—will require answering a key question: How does dance advance the fundamentals of robotics, and vice versa? Focusing on the former, this article offers glimpses of this new field with examples of meaningful contributions to control, robotics, and autonomous systems, such as novel actuator designs, improved sensing systems, salient motion profiles for robots, reproducible experiment designs, and new theories of motion derived from the study of dance. It also poses two grand challenges for the emerging field of choreobotics: developing a robust symbolic system for representing bodily action and establishing rich, repeatable testing environments for human–robot interaction.
- Award ID(s): 2234195
- PAR ID: 10574256
- Publisher / Repository: Annual Reviews
- Date Published:
- Journal Name: Annual Review of Control, Robotics, and Autonomous Systems
- Volume: 8
- ISSN: 2573-5144
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Effective human-robot interaction is increasingly vital across various domains, including assistive robotics, emotional communication, entertainment, and industrial automation. Visual feedback, a common feature of current interfaces, may not be suitable for all environments. Audio feedback serves as a critical supplementary communication layer in settings where visibility is low or where robotic operations generate extensive data. Sonification, which transforms a robot's trajectory, motion, and environmental signals into sound, enhances users' comprehension of robot behavior. This improvement in understanding fosters more effective, safe, and reliable Human-Robot Interaction (HRI). Demonstrations of auditory data sonification's benefits are evident in real-world applications such as industrial assembly, robot-assisted rehabilitation, and interactive robotic exhibitions, where it promotes cooperation, boosts performance, and heightens engagement. Beyond conventional HRI environments, auditory data sonification shows substantial potential in managing complex robotic systems and intricate structures, such as hyper-redundant robots and robotic teams. These systems often challenge operators with complex joint monitoring, mathematical kinematic modeling, and visual behavior verification. This dissertation explores the sonification of motion in hyper-redundant robots and teams of industrial robots. It delves into the Wave Space Sonification (WSS) framework developed by Hermann, applying it to the motion datasets of protein molecules modeled as hyper-redundant mechanisms with numerous rigid nano-linkages. This research leverages the WSS framework to develop a sonification methodology for protein molecules' dihedral angle folding trajectories. Furthermore, it introduces a novel approach for the systematic sonification of robotic motion across varying configurations. By employing localized wave fields oriented within the robots' configuration space, this methodology generates auditory outputs with specific timbral qualities as robots move through predefined configurations or along certain trajectories. Additionally, the dissertation examines a team of wheeled industrial/service robots whose motion patterns are sonified using sinusoidal vibratory sounds, demonstrating the practical applications and benefits of this innovative approach.
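As a minimal illustration of the configuration-space idea described above (not the actual WSS implementation), the sketch below places a hypothetical localized Gaussian field in a robot's joint space and uses it to modulate the amplitude of a sine oscillator as a trajectory passes nearby; all function names, parameters, and the field shape are invented for this example.

```python
import numpy as np

def sonify_trajectory(q_traj, q_center, sigma=0.3, carrier_hz=440.0,
                      sample_rate=44100, seconds_per_step=0.01):
    """Toy configuration-space sonification (illustrative only).

    q_traj   : (T, n) array of joint configurations over time
    q_center : (n,) center of a localized Gaussian field in joint space
    The oscillator gets louder as the trajectory approaches q_center.
    """
    samples_per_step = int(sample_rate * seconds_per_step)
    chunks = []
    sample_index = 0
    for q in q_traj:
        # Amplitude from the Gaussian field centered at q_center.
        dist2 = float(np.sum((np.asarray(q) - q_center) ** 2))
        amp = np.exp(-dist2 / (2.0 * sigma ** 2))
        t = (sample_index + np.arange(samples_per_step)) / sample_rate
        chunks.append(amp * np.sin(2.0 * np.pi * carrier_hz * t))
        sample_index += samples_per_step
    return np.concatenate(chunks)

# Example: a 2-joint trajectory sweeping past the field center at (0.5, -0.2).
traj = np.stack([np.linspace(0.0, 1.0, 500), np.linspace(-1.0, 1.0, 500)], axis=1)
signal = sonify_trajectory(traj, q_center=np.array([0.5, -0.2]))
print(signal.shape)   # audio samples, ready to be written to a WAV file
```

Richer timbral qualities would come from shaping the field and the carrier per configuration, which is where the actual WSS framework goes well beyond this toy amplitude mapping.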
As technology advances, Human-Robot Interaction (HRI) is boosting overall system efficiency and productivity. However, allowing robots to operate in close proximity to humans inevitably places higher demands on precise human motion tracking and prediction. Datasets that contain both humans and robots operating in a shared space are receiving growing attention, as they may facilitate a variety of robotics and human-systems research. Datasets that track HRI with rich information beyond video images during daily activities are rare. In this paper, we introduce a novel dataset that focuses on social navigation between humans and robots in a future-oriented Wholesale and Retail Trade (WRT) environment (https://uf-retail-cobot-dataset.github.io/). Eight participants performed tasks commonly undertaken by consumers and retail workers. More than 260 minutes of data were collected, including robot and human trajectories, human full-body motion capture, eye gaze directions, and other contextual information. Comprehensive descriptions of each category of data stream, as well as potential use cases, are included. Furthermore, analysis with multiple data sources and future directions are discussed.
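One practical use of such a multi-stream dataset is aligning streams recorded at different rates onto a common timeline. The sketch below shows one way to do this with pandas; the file names and column names are invented for illustration and do not reflect the dataset's actual format.

```python
import pandas as pd

# Hypothetical exports of two streams from a dataset like the one above;
# file and column names are stand-ins for this sketch.
robot = pd.read_csv("robot_trajectory.csv")   # timestamp, x, y, yaw
gaze = pd.read_csv("eye_gaze.csv")            # timestamp, gaze_x, gaze_y

for df in (robot, gaze):
    df["timestamp"] = pd.to_datetime(df["timestamp"])
    df.sort_values("timestamp", inplace=True)

# Align the gaze stream to the nearest robot pose within 50 ms, producing one
# table suitable for joint analysis of robot motion and human attention.
aligned = pd.merge_asof(robot, gaze, on="timestamp",
                        direction="nearest",
                        tolerance=pd.Timedelta("50ms"))
print(aligned.head())
```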
In the field of soft robotics, flexibility, adaptability, and functionality define a new era of robotic systems that can safely deform, reach, and grasp. To optimize the design of soft robotic systems, it is critical to understand their configuration space and full range of motion across a wide variety of design parameters. Here we integrate extreme mechanics and soft robotics to provide quantitative insights into the design of bio-inspired soft slender manipulators using the concept of reachability clouds. For a minimal three-actuator design inspired by the elephant trunk, we establish an efficient and robust reduced-order method to generate reachability clouds of almost half a million points each to visualize the accessible workspace of a wide variety of manipulator designs. We generate an atlas of 256 reachability clouds by systematically varying the key design parameters including the fiber count, revolution, tapering angle, and activation magnitude. Our results demonstrate that reachability clouds not only offer an immediately clear perspective into the inverse problem of control, but also introduce powerful metrics to characterize reachable volumes, unreachable regions, and actuator redundancy to quantify the performance of soft slender robots.
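Under strong simplifying assumptions, a reachability cloud can be sketched by sampling actuation inputs of a toy constant-curvature manipulator and recording the resulting tip positions. The planar kinematics, parameter ranges, and occupancy-grid metric below are illustrative stand-ins, not the authors' reduced-order method.

```python
import numpy as np

def tip_position(curvatures, seg_len=0.1):
    """Planar tip position of a multi-segment constant-curvature arm."""
    x, y, theta = 0.0, 0.0, 0.0
    for k in curvatures:
        if abs(k) < 1e-9:
            # Straight segment: advance along the current heading.
            x += seg_len * np.cos(theta)
            y += seg_len * np.sin(theta)
        else:
            # Circular-arc segment: closed-form end point, then update heading.
            x += (np.sin(theta + k * seg_len) - np.sin(theta)) / k
            y += (np.cos(theta) - np.cos(theta + k * seg_len)) / k
            theta += k * seg_len
    return np.array([x, y])

rng = np.random.default_rng(0)
n_samples, n_segments = 50_000, 3
# Sample one curvature ("activation") per segment across an assumed range.
curvatures = rng.uniform(-8.0, 8.0, size=(n_samples, n_segments))
cloud = np.array([tip_position(c) for c in curvatures])

# Crude reachability metric: area covered by occupied cells of a coarse grid.
hist, xedges, yedges = np.histogram2d(cloud[:, 0], cloud[:, 1], bins=100)
cell_area = (xedges[1] - xedges[0]) * (yedges[1] - yedges[0])
print("approx. reachable area:", (hist > 0).sum() * cell_area)
```

The same sampling loop extends to 3D kinematics and to sweeps over design parameters such as tapering angle, which is how an atlas of clouds like the one described above could be assembled and compared.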
The robotics community continually strives to create robots that are deployable in real-world environments. Often, robots are expected to interact with human groups. To achieve this goal, we introduce a new method, the Robot-Centric Group Estimation Model (RoboGEM), which enables robots to detect groups of people. Much of the work reported in the literature focuses on dyadic interactions, leaving a gap in our understanding of how to build robots that can effectively team with larger groups of people. Moreover, many current methods rely on exocentric vision, where cameras and sensors are placed externally in the environment, rather than onboard the robot. Consequently, these methods are impractical for robots in unstructured, human-centric environments, which are novel and unpredictable. Furthermore, the majority of work on group perception is supervised, which can inhibit performance in real-world settings. RoboGEM addresses these gaps by being able to predict social groups solely from an egocentric perspective using color and depth (RGB-D) data. To achieve group predictions, RoboGEM leverages joint motion and proximity estimations. We evaluated RoboGEM against a challenging, egocentric, real-world dataset where both pedestrians and the robot are in motion simultaneously, and show RoboGEM outperformed two state-of-the-art supervised methods in detection accuracy by up to 30%, with a lower miss rate. Our work will be helpful to the robotics community, and serve as a milestone to building unsupervised systems that will enable robots to work with human groups in real-world environments.
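The joint motion-and-proximity cue behind RoboGEM can be illustrated with a minimal sketch that merges tracked pedestrians whose positions are close and whose velocities are similar; the thresholds, the union-find grouping, and the toy inputs are invented for illustration and are not the published method.

```python
import numpy as np

def estimate_groups(positions, velocities, d_max=1.5, v_max=0.5):
    """Group pedestrians whose spatial distance and velocity difference are small.

    positions  : (N, 2) planar positions in metres (e.g., from egocentric RGB-D)
    velocities : (N, 2) planar velocities in m/s
    Returns a list of groups, each a list of pedestrian indices.
    """
    n = len(positions)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            close = np.linalg.norm(positions[i] - positions[j]) < d_max
            similar = np.linalg.norm(velocities[i] - velocities[j]) < v_max
            if close and similar:
                parent[find(i)] = find(j)   # merge the two clusters

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Example: pedestrians 0 and 1 walk together; pedestrian 2 heads the other way.
pos = np.array([[0.0, 0.0], [0.8, 0.1], [4.0, 2.0]])
vel = np.array([[1.0, 0.0], [0.9, 0.1], [-1.0, 0.0]])
print(estimate_groups(pos, vel))   # -> [[0, 1], [2]]
```

A fixed distance threshold is the simplest possible proxy; the appeal of an unsupervised formulation like the one described above is that such cues can be combined without hand-labeled group annotations.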
