The 2021 Champlain Towers South condominium collapse in Surfside, Florida, resulted in 98 deaths. Nine people are thought to have survived the initial collapse and might have been rescued had rescue workers been able to locate them. If responders had been able to use robots to search the interior of the rubble pile, outcomes might have been better. An improved understanding of the environment in which a robot would have to operate in order to search the interior of a rubble pile would help roboticists develop better-suited robotic platforms and control strategies. To this end, this work offers an approach to characterize and visualize the interior of a rubble pile and conducts a preliminary analysis of the occurrence of voids. Specifically, the analysis makes opportunistic use of four days of aerial imagery collected by responders at Surfside to create a 3D volumetric aggregated model of the collapse and to identify and characterize void spaces in the interior of the rubble. The preliminary results confirm expectations that these interior voids are few in number and small in scale. The results can inform better selection and control of existing robots for disaster response, aid in determining design specifications (specifically scale and form factor) for future robotic platforms developed for search operations in rubble, and improve their control.
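The abstract does not give implementation details, but one plausible way to flag candidate interior voids in such a volumetric model is to voxelize the reconstruction into an occupancy grid and label the empty regions that are not connected to the exterior. The sketch below illustrates that idea only; the function name, the `occupancy` array, and the `min_voxels` cutoff are illustrative assumptions, not part of the original analysis.

```python
import numpy as np
from scipy import ndimage

def find_interior_voids(occupancy, voxel_volume_m3, min_voxels=8):
    """Label empty regions fully enclosed by material in a 3D occupancy grid.

    occupancy:       3D boolean array, True where the reconstruction contains rubble.
    voxel_volume_m3: volume of one voxel, used to report void sizes.
    min_voxels:      ignore pockets smaller than this many voxels (illustrative cutoff).
    """
    # Connected components of empty space (default 6-connectivity).
    labels, n_components = ndimage.label(~occupancy)

    # Empty components touching any face of the grid are open to the outside,
    # so they are not interior voids.
    faces = [labels[0], labels[-1], labels[:, 0], labels[:, -1],
             labels[:, :, 0], labels[:, :, -1]]
    exterior = {int(v) for face in faces for v in np.unique(face)}

    voids = []
    for lab in range(1, n_components + 1):
        if lab in exterior:
            continue
        size = int(np.count_nonzero(labels == lab))
        if size >= min_voxels:
            voids.append({"label": lab, "volume_m3": size * voxel_volume_m3})
    return voids
```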
AiRobSim: Simulating a Multisensor Aerial Robot for Urban Search and Rescue Operation and Training
Unmanned aerial vehicles (UAVs), equipped with a variety of sensors, are being used to provide actionable information that augments first responders' situational awareness in disaster areas during urban search and rescue (SaR) operations. However, existing aerial robots are unable to sense occluded spaces in collapsed structures or voids buried in disaster rubble that may contain victims. In this study, we developed a framework, AiRobSim, to simulate an aerial robot that acquires both aboveground and underground information for post-disaster SaR. The integration of the UAV, ground-penetrating radar (GPR), and other sensors, such as a global navigation satellite system (GNSS) receiver, an inertial measurement unit (IMU), and cameras, enables the aerial robot to provide a holistic view of complex urban disaster areas. The robot-collected data can help locate critical spaces under the rubble to save trapped victims. The simulation framework can serve as a virtual training platform for novice users to learn to control and operate the robot before actual deployment. The data streams provided by the platform, which include maneuver commands, robot states, and environmental information, have the potential to facilitate both the understanding of the decision-making process in urban SaR and the training of future intelligent SaR robots.
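As a rough illustration of the kind of data stream described above, a single timestamped record combining a maneuver command, robot state, and sensor observations might be structured as follows. The field names, types, and example values are assumptions for illustration and are not taken from AiRobSim itself.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SimRecord:
    """One timestamped record from a simulated multisensor aerial robot."""
    t: float                              # simulation time, seconds
    command: List[float]                  # operator maneuver command, e.g. [vx, vy, vz, yaw_rate]
    gnss_position: List[float]            # simulated GNSS fix [lat, lon, alt]
    imu_attitude: List[float]             # simulated IMU attitude [roll, pitch, yaw], radians
    camera_frame: Optional[str] = None    # reference to an aboveground camera image, if any
    gpr_trace: List[float] = field(default_factory=list)  # simulated GPR A-scan amplitudes

# A replayable training log is then simply an ordered list of such records.
log: List[SimRecord] = []
log.append(SimRecord(t=0.1, command=[0.0, 0.0, 1.0, 0.0],
                     gnss_position=[25.78, -80.12, 30.0],
                     imu_attitude=[0.0, 0.0, 1.57]))
```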
- Award ID(s): 1850008
- PAR ID: 10221533
- Date Published:
- Journal Name: Sensors
- Volume: 20
- Issue: 18
- ISSN: 1424-8220
- Page Range / eLocation ID: 5223
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Flood events have become more intense and more frequent due to heavy rainfall and hurricanes caused by global warming. Accurate floodwater extent maps are essential information sources for emergency management agencies and flood relief programs to direct their resources to the most affected areas. Synthetic Aperture Radar (SAR) data are superior to optical data for floodwater mapping, especially in vegetated areas and in forests adjacent to urban areas and critical infrastructure. Investigating floodwater mapping with the various available SAR sensors and comparing their performance allows the identification of sensors suited to mapping inundated areas in different land covers, such as forests and vegetated areas. In this study, we investigated the performance of polarization configurations for flood boundary delineation in vegetated and open areas using Sentinel-1B C-band and Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) L-band data collected during the flood events resulting from Hurricane Florence in eastern North Carolina. The datasets from both sensors, acquired on the same day over the same study area, were processed and classified into five land-cover classes using a machine learning method, the Random Forest classification algorithm. We compared the classification results of the linear, dual, and full polarizations of the SAR datasets. The fully polarized L-band data achieved the highest flood-mapping accuracy, as the decomposition of fully polarized SAR data allows land-cover features to be identified based on their scattering mechanisms.
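The per-pixel classification step described above could, for example, be sketched with scikit-learn's Random Forest as shown below. The file names, feature layout, and hyperparameters are placeholder assumptions rather than the study's actual settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# X: per-pixel SAR features (e.g., backscatter in the available polarizations,
#    VV/VH for Sentinel-1 or HH/HV/VH/VV for quad-pol UAVSAR), shape (n_pixels, n_bands).
# y: reference labels for the five land-cover classes. Both files are placeholders.
X = np.load("sar_features.npy")
y = np.load("landcover_labels.npy")

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

clf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=42)
clf.fit(X_train, y_train)

# Per-class precision/recall gives a view of flood-mapping accuracy by land cover.
print(classification_report(y_test, clf.predict(X_test)))
```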
The Industrial Internet of Things has increased the number of sensors permanently installed in industrial plants. Yet there will be gaps in coverage due to broken sensors or sparse density in very large plants, such as in the petrochemical industry. Modern emergency response operations are beginning to use Small Unmanned Aerial Systems (sUAS) as remote sensors to provide rapid, improved situational awareness. Ground-based sensors are an integral component of overall situational awareness platforms, as they can provide the longer-term persistent monitoring that aerial drones cannot. Squishy Robotics and the Berkeley Emergent Space Tensegrities Laboratory have developed hardware and a framework for rapidly deploying sensor robots for integrated ground-aerial disaster response. The semi-autonomous delivery of sensors using tensegrity (tension-integrity) robotics relies on structures that are flexible, lightweight, and have high stiffness-to-weight ratios, making them ideal candidates for robust high-altitude deployments. Squishy Robotics has developed a tensegrity robot for commercial use in Hazardous Materials (HazMat) scenarios that can be deployed from commercial drones or other aircraft. Squishy Robots have been successfully deployed from up to 1,000 ft while carrying a delicate sensing and communication payload. This paper describes a framework for optimizing the deployment of emergency sensors spatially over time. AI techniques (e.g., Long Short-Term Memory neural networks) identify regions where sensors would be most valued without requiring humans to enter the potentially dangerous area. The cost function for optimization considers the costs of false-positive and false-negative errors. Mitigation decisions include shutting down the plant or evacuating the local community. The Expected Value of Information (EVI) is used to identify the most valuable type and location of physical sensors to be deployed to increase the decision-analytic value of the sensor network. A case study using data from the Tennessee Eastman process dataset of a chemical plant, displayed in OSI Soft, is provided.
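The abstract does not spell out the EVI calculation. For a single binary hazard with a mitigate-or-ignore decision, a common textbook formulation compares the best expected cost before and after observing a sensor; the sketch below follows that generic formulation, with all probabilities and costs as illustrative placeholders rather than values from the case study.

```python
def expected_value_of_information(p_hazard, sensitivity, false_alarm_rate,
                                  cost_miss, cost_mitigate):
    """EVI of one candidate sensor for a binary mitigate/ignore decision.

    p_hazard:         prior probability that a hazardous release is occurring.
    sensitivity:      P(alarm | hazard); false_alarm_rate: P(alarm | no hazard).
    cost_miss:        cost of not mitigating when a hazard exists (false negative).
    cost_mitigate:    cost of shutting down or evacuating (paid whenever we mitigate).
    """
    # Best expected cost with no sensor: mitigate, or ignore and risk a miss.
    cost_without = min(cost_mitigate, p_hazard * cost_miss)

    # Sensor outcome probabilities and posteriors via Bayes' rule.
    p_alarm = p_hazard * sensitivity + (1 - p_hazard) * false_alarm_rate
    p_quiet = 1 - p_alarm
    p_hazard_given_alarm = (p_hazard * sensitivity / p_alarm) if p_alarm > 0 else 0.0
    p_hazard_given_quiet = (p_hazard * (1 - sensitivity) / p_quiet) if p_quiet > 0 else 0.0

    # Best expected cost after observing the sensor, averaged over its outcomes.
    cost_with = (p_alarm * min(cost_mitigate, p_hazard_given_alarm * cost_miss)
                 + p_quiet * min(cost_mitigate, p_hazard_given_quiet * cost_miss))

    return cost_without - cost_with

# Example: rank two candidate sensor placements by EVI (all numbers illustrative).
print(expected_value_of_information(0.02, 0.95, 0.05, cost_miss=5e6, cost_mitigate=1e5))
print(expected_value_of_information(0.02, 0.70, 0.02, cost_miss=5e6, cost_mitigate=1e5))
```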
Search and rescue (SAR), performed to locate and save victims in disasters and other scenarios, primarily involves collaborative sensemaking and planning. To become a SAR responder, students learn to search within and navigate the environment, make sense of situations, and collaboratively plan operations. In this study, we synthesize data from four sources: (1) semi-structured interviews with experienced SAR professionals; (2) online surveys of SAR professionals; (3) analysis of documentation and artifacts from SAR operations during the 2017 hurricanes Harvey and Maria; and (4) first-person experience undertaking SAR training. Drawing on activity theory, we develop an understanding of current SAR sensemaking and planning activities, which helps surface unforeseen factors relevant to the design of training systems. We derive initial design implications for systems that teach SAR responders to deal with mapping in the outdoors, collecting data, sharing information, and collaboratively planning activities.
Mobile robots can access regions and collect data in structural locations not easily reached by humans. These include confined spaces, such as inside walls and underground pipes, and remote spaces, such as the underside of bridge decks. Robot access provides the opportunity to sense in these difficult-to-access spaces both with robot-mounted sensors (e.g., cameras and lidars) and by having the robot place and service standalone sensors. Teams of robots, sensors, and AR-equipped humans have the potential to provide rapid and more comprehensive structural assessments. This paper presents results of studies using small robots to explore and collect structural condition data from remote and confined spaces, including inside walls, culverts, and bridge deck undersides. The presentation also covers system and network architecture, methods for automating data processing with localized and edge-based processors, and the use of augmented reality (AR) human interfaces, and discusses key technical challenges and possible solutions.