In this work we address the flexible physical docking-and-release and recharging needs of a marsupial system comprising an autonomous tiltrotor hybrid Micro Aerial Vehicle and a high-end legged locomotion robot. In persistent monitoring and emergency response situations, such aerial/ground robot teams can offer rapid situational awareness by taking off from the mobile ground robot and scouting a wide area from the sky. For this type of operational profile to retain its long-term effectiveness, regrouping via landing and docking of the aerial robot onboard the ground one is a key requirement. Moreover, onboard recharging is a necessity for performing systematic missions. We present a framework comprising: a novel landing mechanism with recharging capabilities embedded into its design, an external battery-based recharging extension for our previously developed power-harvesting Micro Aerial Vehicle module, and a strategy for reliable landing and docking-and-release between the two robots. We specifically address the need for this system to be ferried by a quadruped ground system while remaining reliable during aggressive legged locomotion over harsh terrain. We present conclusive experimental validation studies by deploying our solution on a marsupial system comprising the MiniHawk micro tiltrotor and the Boston Dynamics Spot legged robot.
Dynamic Placement of Rapidly Deployable Mobile Sensor Robots Using Machine Learning and Expected Value of Information
The Industrial Internet of Things has increased the number of sensors permanently installed in industrial plants. Yet gaps in coverage remain due to broken sensors or sparse density in very large plants, such as those in the petrochemical industry. Modern emergency response operations are beginning to use Small Unmanned Aerial Systems (sUAS) as remote sensors to provide rapid improved situational awareness. Ground-based sensors are an integral component of overall situational awareness platforms, as they can provide the longer-term persistent monitoring that aerial drones are unable to provide. Squishy Robotics and the Berkeley Emergent Space Tensegrities Laboratory have developed hardware and a framework for rapidly deploying sensor robots for integrated ground-aerial disaster response. The semi-autonomous delivery of sensors using tensegrity (tension-integrity) robotics uses structures that are flexible, lightweight, and have high stiffness-to-weight ratios, making them ideal candidates for robust high-altitude deployments. Squishy Robotics has developed a tensegrity robot for commercial use in Hazardous Materials (HazMat) scenarios that is capable of being deployed from commercial drones or other aircraft. Squishy Robots, carrying a delicate sensing and communication payload, have been successfully deployed from heights of up to 1,000 ft. This paper describes the framework for optimizing the deployment of emergency sensors spatially over time. AI techniques (e.g., Long Short-Term Memory neural networks) identify regions where sensors would be most valued, without requiring humans to enter the potentially dangerous area. The cost function for optimization considers the costs of false-positive and false-negative errors. Decisions on mitigation include shutting down the plant or evacuating the local community.
The Expected Value of Information (EVI) is used to identify the most valuable type and location of physical sensors to be deployed to increase the decision-analytic value of a sensor network. A case study is provided using data from the Tennessee Eastman process dataset of a chemical plant, displayed in OSIsoft.
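To make the decision-analytic framing concrete, the following is a minimal sketch of a single-sensor EVI calculation. All numbers (prior leak probability, sensor miss and false-alarm rates, mitigation costs) are illustrative assumptions, not values from the paper; the paper's LSTM-based spatial optimization is not modeled here.

```python
# Hedged sketch: Expected Value of Information (EVI) for one
# candidate sensor. EVI = (expected cost acting on the prior
# alone) - (expected cost after observing the sensor reading).
# All probabilities and costs below are invented for illustration.

def expected_cost(p_leak, costs):
    """Cost of the best action under belief p_leak."""
    return min(
        p_leak * c_leak + (1 - p_leak) * c_safe
        for c_leak, c_safe in costs.values()
    )

# Mitigation costs: (cost if leak is real, cost if no leak).
costs = {
    "continue": (500.0, 0.0),   # a false negative is very expensive
    "shutdown": (20.0, 50.0),   # a false positive is cheaper, not free
}

p_leak = 0.10            # assumed prior probability of a release
p_fn, p_fp = 0.05, 0.02  # assumed sensor miss / false-alarm rates

# Without the sensor: act on the prior alone.
cost_prior = expected_cost(p_leak, costs)

# With the sensor: marginalize over alarm / no-alarm readings,
# updating the belief with Bayes' rule before acting on each.
p_alarm = (1 - p_fn) * p_leak + p_fp * (1 - p_leak)
post_alarm = (1 - p_fn) * p_leak / p_alarm
post_quiet = p_fn * p_leak / (1 - p_alarm)

cost_with = (p_alarm * expected_cost(post_alarm, costs)
             + (1 - p_alarm) * expected_cost(post_quiet, costs))

evi = cost_prior - cost_with
print(f"EVI = {evi:.2f}")
```

Ranking candidate sensor types and locations by this quantity, as the abstract describes, amounts to repeating the calculation for each candidate's error rates and deploying where EVI is highest.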
- Award ID(s):
- 1927010
- PAR ID:
- 10317188
- Date Published:
- Journal Name:
- ASME 2021 International Mechanical Engineering Congress & Exposition (IMECE 2021)
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
- Unmanned aerial vehicles (UAVs), equipped with a variety of sensors, are being used to provide actionable information to augment first responders' situational awareness in disaster areas for urban search and rescue (SaR) operations. However, existing aerial robots are unable to sense the occluded spaces in collapsed structures and voids buried in disaster rubble that may contain victims. In this study, we developed a framework, AiRobSim, to simulate an aerial robot that acquires both aboveground and underground information for post-disaster SaR. The integration of UAV, ground-penetrating radar (GPR), and other sensors, such as a global navigation satellite system (GNSS), inertial measurement unit (IMU), and cameras, enables the aerial robot to provide a holistic view of complex urban disaster areas. The robot-collected data can help locate critical spaces under the rubble to save trapped victims. The simulation framework can serve as a virtual training platform for novice users to control and operate the robot before actual deployment. Data streams provided by the platform, which include maneuver commands, robot states, and environmental information, have the potential to facilitate the understanding of the decision-making process in urban SaR and the training of future intelligent SaR robots.
- Presented at the Workshop on Heterogeneous Multi-Robot Task Allocation and Coordination. The authors recently developed a distributed algorithm to enable a team of homogeneous robots to search for and track an unknown and time-varying number of dynamic targets. This algorithm combined a distributed version of the PHD filter (for multi-target tracking) with Lloyd's algorithm to drive the motion of the robots. In this paper we extend this previous work to allow a heterogeneous team of ground and aerial robots to perform the search and tracking tasks in a coordinated manner. Both types of robots are equipped with sensors that have a finite field of view and which may receive both false positive and false negative detections. The aerial robots may vary the size of their sensor field of view (FoV) by changing elevation. This increase in the FoV coincides with a decrease in the accuracy and reliability of the sensor. The ground robots maintain the target tracking information while the aerial robots provide additional sensor coverage. We develop two new distributed algorithms to provide filter updates and to make control decisions in this heterogeneous team. Both algorithms only require robots to communicate with nearby robots and use minimal bandwidth. We demonstrate the efficacy of our approach through a series of simulated experiments which show that the heterogeneous teams are able to achieve more accurate tracking in less time than our previous work.
- Emergency response, navigation, and evacuation are key essentials for effective rescue and safety management. Situational awareness is a key ingredient when fire responders or emergency response personnel respond to an emergency: they have to quickly assess the layout of a building or a campus upon entry. Moreover, the occupants of a building or campus also need situational awareness for navigation and emergency response. We have developed an integrated situational awareness mobile augmented reality (AR) application for smart campus planning, management, and emergency response. Through the visualization of integrated geographic information systems and real-time data analysis, our mobile application provides insights into operational implications and offers information to support effective decision-making. Using existing building features, the authors demonstrate how the mobile AR application provides contextualized 3D visualizations that promote and support spatial knowledge acquisition and cognitive mapping, thereby enhancing situational awareness. A limited user study was conducted to test the effectiveness of the proposed mobile AR application using the mobile phone usability questionnaire (MPUQ) framework. The results show that the mobile AR application was relatively easy to use and can be considered a useful application for navigation and evacuation.
- Ishigami G., Yoshida K. (Eds.) This paper develops an autonomous tethered aerial visual assistant for robot operations in unstructured or confined environments. Robotic tele-operation in remote environments is difficult due to the lack of sufficient situational awareness, mostly caused by the stationary and limited field of view and the lack of depth perception of the robot's onboard camera. The emerging state of the practice is to use two robots: a primary and a secondary that acts as a visual assistant, overcoming the perceptual limitations of the onboard sensors by providing an external viewpoint. However, problems exist when using a tele-operated visual assistant: extra manpower, a manually chosen suboptimal viewpoint, and extra teamwork demand between primary and secondary operators. In this work, we use an autonomous tethered aerial visual assistant to replace the secondary robot and operator, reducing the human-robot ratio from 2:2 to 1:2. This visual assistant is able to autonomously navigate through unstructured or confined spaces in a risk-aware manner, while continuously maintaining good viewpoint quality to increase the primary operator's situational awareness. With the proposed co-robot team, tele-operation missions in nuclear operations, bomb squads, disaster robotics, and other domains with novel tasks or highly occluded environments could benefit from reduced manpower and teamwork demand, along with improved visual assistance quality based on trustworthy risk-aware motion in cluttered environments.