Learning multi-agent dynamics is a core AI problem with broad applications in robotics and autonomous driving. While most existing works focus on deterministic prediction, producing probabilistic forecasts to quantify uncertainty and assess risk is critical for downstream decision-making tasks such as motion planning and collision avoidance. Multi-agent dynamics often exhibit internal symmetry. By leveraging symmetry, specifically rotation equivariance, we can improve not only prediction accuracy but also uncertainty calibration. We introduce the Energy Score, a proper scoring rule, to evaluate probabilistic predictions. We propose a novel deep dynamics model, Probabilistic Equivariant Continuous COnvolution (PECCO), for probabilistic prediction of multi-agent trajectories. PECCO extends equivariant continuous convolution to model the joint velocity distribution of multiple agents and uses dynamics integration to propagate uncertainty from velocity to position. On both synthetic and real-world datasets, PECCO shows significant improvements in accuracy and calibration over non-equivariant baselines.
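The Energy Score mentioned above has a simple Monte Carlo form, ES(P, y) = E‖X − y‖ − ½ E‖X − X′‖ with X, X′ drawn from the forecast distribution P. A minimal NumPy sketch (illustrative only, not the PECCO implementation):

```python
import numpy as np

def energy_score(samples, y):
    """Monte Carlo estimate of the energy score
    ES(P, y) = E||X - y|| - 0.5 * E||X - X'||, with X, X' ~ P.
    Lower is better; it is a proper scoring rule for multivariate
    probabilistic forecasts."""
    samples = np.asarray(samples, dtype=float)  # (m, d) forecast draws
    y = np.asarray(y, dtype=float)              # (d,) observation
    term1 = np.linalg.norm(samples - y, axis=1).mean()
    pairwise = samples[:, None, :] - samples[None, :, :]
    term2 = np.linalg.norm(pairwise, axis=-1).mean()
    return term1 - 0.5 * term2

# A sharp, well-centered forecast scores lower than a diffuse one.
rng = np.random.default_rng(0)
obs = np.array([1.0, 2.0])
sharp = obs + 0.01 * rng.standard_normal((500, 2))
diffuse = obs + 5.0 * rng.standard_normal((500, 2))
```

A degenerate forecast whose samples all equal the observation scores exactly zero, which is the sense in which the score rewards both accuracy and calibration.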
                            Geomorphic risk maps for river migration using probabilistic modeling – a framework
                        
                    
    
Abstract. Lateral migration of meandering rivers poses erosional risks to human settlements, roads, and infrastructure in alluvial floodplains. While there is a large body of scientific literature on the dominant mechanisms driving river migration, it is still not possible to accurately predict river meander evolution over multiple years. This is in part because we do not fully understand the relative contribution of each mechanism and because deterministic mathematical models are not equipped to account for stochasticity in the system. In addition, uncertainty due to model structure deficits and unknown parameter values remains. For a more reliable assessment of risks, we therefore need probabilistic forecasts. Here, we present a workflow to generate geomorphic risk maps for river migration using probabilistic modeling. We start with a simple geometric model for river migration, in which nominal migration rates increase with local and upstream curvature. We then account for model structure deficits using smooth random functions. Probabilistic forecasts of river channel position over time are generated by Monte Carlo runs using a distribution of model parameter values inferred from satellite data. We provide a recipe for parameter inference within the Bayesian framework. We demonstrate that such risk maps are more informative in avoiding false negatives, which can be both detrimental and costly, when assessing erosional hazards due to river migration. Our results show that with longer prediction time horizons, the spatial uncertainty of erosional hazard within the entire channel belt increases, with more geographical area falling within 25 % < probability < 75 %. However, forecasts also become more confident about erosion in regions immediately adjacent to the river, especially on its cut-bank side. Probabilistic modeling thus allows us to quantify our degree of confidence, which is spatially and temporally variable, in river migration forecasts. We also note that to increase the reliability of these risk maps, the model must describe the first-order dynamics to a reasonable degree of accuracy, and simple geometric models do not always possess such accuracy.
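As a cartoon of the Monte Carlo workflow (a toy stand-in, not the paper's curvature model), one can draw a migration-rate parameter per ensemble member, add random noise standing in for model structure deficits, and read an erosion probability off the ensemble:

```python
import numpy as np

rng = np.random.default_rng(42)

def bank_position(x0, rate_mean, rate_sd, n_years):
    """One Monte Carlo realization: a nominal migration rate drawn from
    an inferred parameter distribution, plus year-to-year noise that
    stands in for model structure deficits."""
    rate = rng.normal(rate_mean, rate_sd)      # m/yr, drawn once per run
    noise = rng.normal(0.0, 0.5, n_years)      # m/yr, per-year perturbation
    return x0 + np.sum(rate + noise)

# Ensemble of bank positions after 10 years; the erosion hazard at a
# location 15 m from the present bank is the fraction of runs reaching it.
ensemble = np.array([bank_position(0.0, 1.5, 0.3, 10) for _ in range(2000)])
p_erosion = float((ensemble >= 15.0).mean())
```

A risk map is obtained by repeating this exceedance count for every grid cell; cells with 25 % < probability < 75 % form the uncertain zone discussed above.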
- PAR ID: 10563677
- Publisher / Repository: Copernicus Publications
- Date Published:
- Journal Name: Earth Surface Dynamics
- Volume: 12
- Issue: 3
- ISSN: 2196-632X
- Page Range / eLocation ID: 691 to 708
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Probabilistic hazard assessments for overland pyroclastic flows or atmospheric ash clouds, produced under the short timelines of an evolving crisis, require using the best science available, unhampered by complicated and slow manual workflows. Although deterministic mathematical models are available, in most cases the parameters and initial conditions for the equations are known only within a prescribed range of uncertainty. To construct probabilistic hazard assessments, accurate outputs and propagation of the inherent input uncertainty to quantities of interest are needed to estimate the necessary probabilities from numerous runs of the underlying deterministic model. Characterizing the uncertainty in system states due to parametric and input uncertainty simultaneously requires ensemble-based methods that explore the full parameter and input spaces. Complex tasks, such as running thousands of instances of a deterministic model under parameter and input uncertainty, require a high-performance computing infrastructure and skilled personnel that may not be readily available to the policy makers responsible for making informed risk-mitigation decisions. For efficiency, the programming tasks required for executing ensemble simulations need to run in parallel, leading to the twin computational challenges of managing large amounts of data and performing CPU-intensive processing. The resulting flow of work involves complex sequences of tasks, interactions, and exchanges of data, so automatic management of these workflows is essential. Here we discuss a computer infrastructure, methodology, and tools that enable scientists and other members of the volcanology research community to develop workflows for constructing probabilistic hazard maps using remotely accessed computing through a web portal.
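The ensemble pattern described here, many independent runs of a deterministic model with sampled parameters and inputs, parallelizes naturally. A toy sketch with a hypothetical stand-in model (the mobility relation and numbers are invented, not a real flow solver):

```python
from concurrent.futures import ThreadPoolExecutor
import random

def runout_distance(friction, volume):
    """Stand-in deterministic model: a toy mobility relation giving a
    runout distance in metres. A real hazard model would be a PDE solver."""
    return volume ** (1.0 / 3.0) / friction

def one_member(seed):
    """One ensemble member: sample uncertain inputs, run the model."""
    rng = random.Random(seed)         # per-member RNG keeps runs reproducible
    friction = rng.uniform(0.1, 0.3)  # parametric uncertainty
    volume = rng.uniform(1e6, 1e7)    # input (initial-condition) uncertainty
    return runout_distance(friction, volume)

# Embarrassingly parallel: each member is independent, so a thread or
# process pool (or an HPC job array) can execute them concurrently.
with ThreadPoolExecutor(max_workers=8) as pool:
    runouts = list(pool.map(one_member, range(1000)))

# Exceedance probability at 1000 m is the ensemble fraction beyond it.
p_exceed = sum(r > 1000.0 for r in runouts) / len(runouts)
```

On a real cluster the pool would be replaced by a workflow manager dispatching model instances to compute nodes, which is exactly the automation the abstract argues for.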
- Abstract. Bedrock river width is an essential geometric parameter relevant to understanding flood hazards and gauging-station rating curves, and it is critical to stream power incision models and many other landscape evolution models. Obtaining bedrock river width measurements, however, typically requires extensive field campaigns in rugged and steep topography where river access is often physically challenging. Although prior work has turned to measuring channel width from satellite imagery, these data present a snapshot in time, are typically limited to rivers ≥ 10–30 m wide due to image resolution, and are restricted to areas devoid of vegetation. For these reasons, we are generally data limited, and the factors impacting bedrock channel width remain poorly understood. Due to these limitations, researchers often turn to assumed width-scaling relationships with drainage area or discharge to estimate bedrock channel width. Here we present a new method of obtaining bedrock channel width at a desired river discharge by incorporating a high-resolution bare-earth digital elevation model (DEM) using MATLAB TopoToolbox and the HEC-RAS river analysis system. We validate this method by comparing modeled results to US Geological Survey (USGS) field measurements at existing gauging stations, as well as to field channel measurements. We show that this method can capture the general characteristics of discharge rating curves and predict field-measured channel widths within uncertainty. As high-resolution DEMs become more available across the United States through the USGS three-dimensional elevation program (3DEP), the future utility of this method is notable. Through developing and validating a streamlined, open-source, and freely available workflow for channel width extraction, we hope this method can be applied in future research to increase the quantity of channel width measurements and improve our understanding of bedrock channels.
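The width-scaling assumption this abstract refers to is typically a power law, w = k·A^b in drainage area A. As a hypothetical illustration (coefficients invented, not from this study), the exponent is recoverable by log-log regression:

```python
import numpy as np

# Hypothetical hydraulic-geometry scaling w = k * A**b: generate noisy
# synthetic widths, then recover the coefficients by log-log regression.
rng = np.random.default_rng(1)
area = np.logspace(0, 4, 50)                             # drainage area, km^2
width = 3.0 * area ** 0.4 * rng.lognormal(0.0, 0.1, 50)  # channel width, m

b, log_k = np.polyfit(np.log(area), np.log(width), 1)    # slope = exponent b
k = np.exp(log_k)
```

The DEM-based method in the abstract replaces exactly this kind of assumed regional relationship with direct, discharge-specific width estimates.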
- Abstract. Engineers design for an inherently uncertain world. In the early stages of design processes, they commonly account for such uncertainty either by manually choosing a specific worst case and multiplying uncertain parameters by safety factors, or by using Monte Carlo simulations to estimate the probabilistic boundaries within which their design is feasible. The safety factors of the first practice are determined by industry and organizational standards, providing a limited account of uncertainty; the second practice is time intensive, requiring the development of separate testing infrastructure. In theory, robust optimization provides an alternative, allowing set-based conceptualizations of uncertainty to be represented during model development as optimizable design parameters. How these theoretical benefits translate to design practice has not previously been studied. In this work, we analyzed the present use of geometric programs as design models in the aerospace industry to determine the current state of the art, then conducted a human-subjects experiment to investigate how various mathematical representations of uncertainty affect design space exploration. We found that robust optimization led to far more efficient exploration of possible designs, with only small differences in participants' understanding of their model. Specifically, the Pareto frontier of a typical participant using robust optimization left less performance "on the table" across various levels of risk than the very best frontiers of participants using industry-standard practices.
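The contrast between a blanket safety factor and a set-based robust constraint can be made concrete with a toy sizing problem (the stress model and all numbers are illustrative, not from the study):

```python
# Size a plate thickness t so that stress F / (w * t) <= sigma_allow,
# where the load F is uncertain. Illustrative values:
F0 = 10_000.0          # nominal load, N
w = 0.1                # plate width, m
sigma_allow = 2.5e8    # allowable stress, Pa

# Industry practice: nominal load times a fixed safety factor.
sf = 1.5
t_safety = sf * F0 / (w * sigma_allow)

# Robust counterpart: satisfy the constraint for every load in the
# set-based uncertainty description F in [F0*(1-delta), F0*(1+delta)].
delta = 0.2
t_robust = F0 * (1 + delta) / (w * sigma_allow)
```

With 20 % load uncertainty the robust design is thinner (hence lighter) than the blanket 1.5x safety factor yet remains feasible for every admissible load, which is one way the robust Pareto frontiers in the study could leave less performance "on the table".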