
Title: Probabilistic forecasting of plausible debris flows from Nevado de Colima (Mexico) using data from the Atenquique debris flow, 1955

Abstract. We detail a new prediction-oriented procedure aimed at volcanic hazard assessment based on geophysical mass flow models constrained with heterogeneous and poorly defined data. Our method relies on an itemized application of the empirical falsification principle over an arbitrarily wide envelope of possible input conditions. We thus provide a first step towards an objective and partially automated experimental design construction. In particular, instead of fully calibrating model inputs on past observations, we create and explore more general requirements of consistency, and then we separately use each piece of empirical data to remove those input values that are not compatible with it. Hence, partial solutions to the inverse problem are defined. This has several advantages compared to a traditionally posed inverse problem: (i) the potentially nonempty inverse images of partial solutions of multiple possible forward models characterize the solutions to the inverse problem; (ii) the partial solutions can provide hazard estimates under weaker constraints, potentially including extreme cases that are important for hazard analysis; (iii) if multiple models are applicable, specific performance scores against each piece of empirical information can be calculated. We apply our procedure to the case study of the Atenquique volcaniclastic debris flow, which occurred on the flanks of Nevado de Colima volcano (Mexico) in 1955. We adopt and compare three depth-averaged models currently implemented in the TITAN2D solver, version 4.0.0 (last access: 23 June 2016). The associated inverse problem is not well-posed if approached in a traditional way. We show that our procedure can extract valuable information for hazard assessment, allowing the exploration of the impact of synthetic flows that are similar to those that occurred in the past but different in plausible ways.
The implementation of multiple models is thus a crucial aspect of our approach, as they can cover other plausible flows. We also observe that model selection is inherently linked to the inversion problem.
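The itemized falsification described above can be sketched as a rejection procedure over an input envelope. The forward model, observable names, and acceptance bounds below are all hypothetical stand-ins (the actual study runs TITAN2D); the sketch only illustrates how each piece of data separately removes incompatible inputs, and how the partial solutions intersect.

```python
import random

# Toy forward model standing in for TITAN2D (hypothetical): maps an input
# (basal friction angle in degrees, volume in m^3) to two observables.
def forward_model(friction_deg, volume_m3):
    runout = 50.0 * volume_m3 ** (1.0 / 3.0) / friction_deg  # hypothetical scaling
    depth = 0.01 * volume_m3 ** (1.0 / 3.0)
    return {"runout": runout, "depth": depth}

# Each piece of empirical data is used separately: an input survives a test
# if the simulated observable falls inside that datum's uncertainty bounds.
observations = {
    "runout": (1000.0, 3000.0),  # m, hypothetical bounds
    "depth": (1.0, 5.0),         # m, hypothetical bounds
}

def falsify(inputs, key):
    """Remove inputs whose simulated observable is incompatible with one datum."""
    lo, hi = observations[key]
    return [x for x in inputs if lo <= forward_model(*x)[key] <= hi]

random.seed(0)
envelope = [(random.uniform(5, 35), random.uniform(1e5, 1e8)) for _ in range(5000)]

# Partial solutions: the subset of the envelope consistent with each datum alone.
partial = {key: falsify(envelope, key) for key in observations}
# Inputs consistent with every datum (intersection of the partial solutions).
consistent = falsify(partial["runout"], "depth")

print(len(partial["runout"]), len(partial["depth"]), len(consistent))
```

Note that each partial solution is useful on its own for hazard estimation: it retains extreme inputs that a full joint calibration would discard.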

Award ID(s): 1821311, 1621853, 1521855
Journal Name: Natural Hazards and Earth System Sciences
Page Range or eLocation-ID: 791 to 820
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract This project is funded by the US National Science Foundation (NSF) through their NSF RAPID program under the title “Modeling Corona Spread Using Big Data Analytics.” The project is a joint effort between the Department of Computer & Electrical Engineering and Computer Science at FAU and a research group from LexisNexis Risk Solutions. The novel coronavirus Covid-19 originated in China in early December 2019 and has rapidly spread to many countries around the globe, with the number of confirmed cases increasing every day. Covid-19 is officially a pandemic. It is a novel infection with serious clinical manifestations, including death, and it has reached at least 124 countries and territories. Although the ultimate course and impact of Covid-19 are uncertain, it is not merely possible but likely that the disease will produce enough severe illness to overwhelm the worldwide health care infrastructure. Emerging viral pandemics can place extraordinary and sustained demands on public health and health systems and on providers of essential community services. Modeling the Covid-19 pandemic spread is challenging, but there are data that can be used to project resource demands. Estimates of the reproductive number (R) of SARS-CoV-2 show that at the beginning of the epidemic, each infected person spreads the virus to at least two others, on average (Emanuel et al. in N Engl J Med. 2020, Livingston and Bucher in JAMA 323(14):1335, 2020). A conservatively low estimate is that 5 % of the population could become infected within 3 months. Preliminary data from China and Italy regarding the distribution of case severity and fatality vary widely (Wu and McGoogan in JAMA 323(13):1239–42, 2020). A recent large-scale analysis from China suggests that 80 % of those infected either are asymptomatic or have mild symptoms, a finding that implies that demand for advanced medical services might apply to only 20 % of the total infected.
Of patients infected with Covid-19, about 15 % have severe illness and 5 % have critical illness (Emanuel et al. in N Engl J Med. 2020). Overall, mortality ranges from 0.25 % to as high as 3.0 % (Emanuel et al. in N Engl J Med. 2020, Wilson et al. in Emerg Infect Dis 26(6):1339, 2020). Case fatality rates are much higher for vulnerable populations, such as persons over the age of 80 years (> 14 %) and those with coexisting conditions (10 % for those with cardiovascular disease and 7 % for those with diabetes) (Emanuel et al. in N Engl J Med. 2020). Overall, Covid-19 is substantially deadlier than seasonal influenza, which has a mortality of roughly 0.1 %. Public health efforts depend heavily on predicting how diseases such as those caused by Covid-19 spread across the globe. During the early days of a new outbreak, when reliable data are still scarce, researchers turn to mathematical models that can predict where people who could be infected are going and how likely they are to bring the disease with them. These computational methods use known statistical equations that calculate the probability of individuals transmitting the illness. Modern computational power allows these models to quickly incorporate multiple inputs, such as a given disease’s ability to pass from person to person and the movement patterns of potentially infected people traveling by air and land. This process sometimes involves making assumptions about unknown factors, such as an individual’s exact travel pattern. By plugging in different possible versions of each input, however, researchers can update the models as new information becomes available and compare their results to observed patterns for the illness. In this paper we describe the development of a model of Corona spread by using innovative big data analytics techniques and tools. We leveraged our experience from research in modeling Ebola spread (Shaw et al. Modeling Ebola Spread and Using HPCC/KEL System.
In: Big Data Technologies and Applications 2016 (pp. 347-385). Springer, Cham) to successfully model Corona spread, obtain new results, and help reduce the number of Corona patients. We closely collaborated with LexisNexis, a leading US data analytics company and a member of our NSF I/UCRC for Advanced Knowledge Enablement. The lack of a comprehensive view and informative analysis of the status of the pandemic can also cause panic and instability within society. Our work proposes the HPCC Systems Covid-19 tracker, which provides a multi-level view of the pandemic with informative virus spreading indicators in a timely manner. The system embeds a classical epidemiological model known as SIR and spreading indicators based on a causal model. The data solution of the tracker is built on top of the Big Data processing platform HPCC Systems, from ingesting and tracking of various data sources to fast delivery of the data to the public. The HPCC Systems Covid-19 tracker presents the Covid-19 data on a daily, weekly, and cumulative basis, from the global level down to the county level. It also provides statistical analysis for each level, such as new cases per 100,000 population. The primary analyses, such as Contagion Risk and Infection State, are based on a causal model with a seven-day sliding window. Our work has been released as a publicly available website and has attracted a great volume of traffic. The project is open-sourced and available on GitHub. The system was developed on the LexisNexis HPCC Systems platform, which is briefly described in the paper.
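The classical SIR model that the tracker embeds can be written in a few lines. A minimal discrete-time sketch follows; the parameters are illustrative (chosen so that R0 = β/γ = 2, matching the early reproductive-number estimate cited above), not the tracker's fitted values.

```python
def sir_step(s, i, r, beta, gamma, n):
    """One day of the discrete SIR model:
    dS = -beta*S*I/N,  dI = beta*S*I/N - gamma*I,  dR = gamma*I."""
    new_inf = beta * s * i / n
    new_rec = gamma * i
    return s - new_inf, i + new_inf - new_rec, r + new_rec

def simulate(n=1_000_000, i0=100, beta=0.4, gamma=0.2, days=120):
    # beta/gamma = 2 gives R0 = 2; values are illustrative, not fitted.
    s, i, r = n - i0, i0, 0
    history = [(s, i, r)]
    for _ in range(days):
        s, i, r = sir_step(s, i, r, beta, gamma, n)
        history.append((s, i, r))
    return history

hist = simulate()
peak_day = max(range(len(hist)), key=lambda d: hist[d][1])
print(peak_day, round(hist[-1][2] / 1_000_000, 3))
```

For R0 = 2 the final attack fraction from the standard final-size relation is roughly 0.8 of the population, which the simulation approaches once the epidemic has run its course.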

  2. Iceland represents one of the most well-known examples of hotspot volcanism, but the details of how surface volcanism connects to geodynamic processes in the deep mantle remain poorly understood. Recent work has identified evidence for an ultra-low velocity zone in the lowermost mantle beneath Iceland and argued for a cylindrically symmetric upwelling at the base of a deep mantle plume. This scenario makes a specific prediction about flow and deformation in the lowermost mantle, which can potentially be tested with observations of seismic anisotropy. Here we present an investigation of seismic anisotropy in the lowermost mantle beneath Iceland, using differential shear wave splitting measurements of S–ScS and SKS–SKKS phases. We apply our techniques to waves propagating at multiple azimuths, with the goal of gaining good geographical and azimuthal coverage of the region. Practical limitations imposed by the suboptimal distribution of global seismicity at the relevant distance ranges resulted in a relatively small data set, particularly for S–ScS. Despite this, however, our measurements of ScS splitting due to lowermost mantle anisotropy clearly show a rotation of the fast splitting direction from nearly horizontal for two sets of paths that sample away from the low velocity region (implying VSH > VSV) to nearly vertical for a set of paths that sample directly beneath Iceland (implying VSV > VSH). We also find evidence for sporadic SKS–SKKS discrepancies beneath our study region; while the geographic distribution of discrepant pairs is scattered, those pairs that sample closest to the base of the Iceland plume tend to be discrepant. Our measurements do not uniquely constrain the pattern of mantle flow.
However, we carried out simple ray-theoretical forward modelling for a suite of plausible anisotropy mechanisms, including those based on single-crystal elastic tensors, those obtained via effective medium modelling for partial melt scenarios, and those derived from global or regional models of flow and texture development in the deep mantle. These simplified models do not take into account details such as possible transitions in anisotropy mechanism or deformation regime, and test a simplified flow field (vertical flow beneath the plume and horizontal flow outside it) rather than more detailed flow scenarios. Nevertheless, our modelling results demonstrate that our ScS splitting observations are generally consistent with a flow scenario that invokes nearly vertical flow directly beneath the Iceland hotspot, with horizontal flow just outside this region.
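The VSH-versus-VSV reasoning above can be made concrete with the standard radial anisotropy parameter ξ = (VSH/VSV)². The velocities below are hypothetical illustrations, not measured values from this study, and the horizontal/vertical-flow reading is the simplified interpretation used in the forward modelling described above.

```python
def radial_anisotropy(vsh, vsv):
    """xi = (VSH/VSV)^2, the standard radial anisotropy parameter.

    xi > 1 (VSH > VSV) is commonly read as horizontally sheared material,
    producing nearly horizontal fast splitting directions for ScS;
    xi < 1 (VSV > VSH) suggests vertical flow, e.g. in a plume conduit.
    """
    xi = (vsh / vsv) ** 2
    regime = ("horizontal flow (VSH > VSV)" if xi > 1
              else "vertical flow (VSV > VSH)" if xi < 1
              else "isotropic")
    return xi, regime

# Hypothetical shear velocities (km/s) for the two path families above.
print(radial_anisotropy(7.30, 7.10))  # paths sampling away from the plume
print(radial_anisotropy(7.05, 7.25))  # paths sampling beneath Iceland
```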

  3. Abstract

    Zircon (U-Th)/He (ZHe) dates are presented from eight samples (n=55) collected from three ranges including the Carrizo and Franklin Mountains in western Texas and the Cookes Range in southern New Mexico. ZHe dates from Proterozoic crystalline rocks range from 6 to 731 Ma in the Carrizo Mountains, 19 to 401 Ma in the Franklin Mountains, and 63 to 446 Ma in the Cookes Range, and there is a negative correlation with eU values. These locations have experienced a complex tectonic history involving multiple periods of uplift and reburial, and we use a combination of forward and inverse modeling approaches to constrain plausible thermal histories. Our final inverse models span hundreds of millions of years and multiple tectonic events and lead to the following conclusions: (1) Proterozoic exhumation occurred from 800 to 500 Ma, coinciding with the break-up of Rodinia; (2) elevated temperatures at approximately 100 Ma occurred during final development of the Bisbee basin and are a likely result of elevated heat flow in the upper crust during continental rifting; (3) a pulse of cooling associated with Laramide shortening is observed from 70 to 45 Ma in the Cookes Range and 80 to 50 Ma in the Franklin Mountains, whereas the Carrizo Mountains were largely unaffected by this event; and (4) final cooling to near-surface temperatures began 30–25 Ma at all three locations and was likely a result of Rio Grande rift extension. These data help to bridge the gap between higher and lower temperature isotopic systems to constrain complex thermal histories in tectonically mature regions.
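Forward modelling of (U-Th)/He dates starts from the standard He ingrowth equation, which has no closed-form solution for t and is solved numerically. A minimal raw-date sketch, assuming illustrative parent amounts; the alpha-ejection (Ft) correction and the diffusive He loss that the thermal-history models above account for are deliberately omitted.

```python
import math

# Decay constants (1/yr) for 238U, 235U, 232Th (standard values).
L238, L235, L232 = 1.55125e-10, 9.8485e-10, 4.9475e-11

def he_produced(t, u238, u235, th232):
    """4He produced (same units as the parent amounts) after t years."""
    return (8 * u238 * (math.exp(L238 * t) - 1)
            + 7 * u235 * (math.exp(L235 * t) - 1)
            + 6 * th232 * (math.exp(L232 * t) - 1))

def zhe_date(he, u238, u235, th232, lo=0.0, hi=4.6e9):
    """Solve the He ingrowth equation for the raw (U-Th)/He date by bisection.

    he_produced is monotonically increasing in t, so bisection on [lo, hi]
    converges; no Ft correction or diffusive-loss modelling is applied.
    """
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if he_produced(mid, u238, u235, th232) < he:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative parent amounts with the natural 238U/235U ratio (~137.8).
u238, u235, th232 = 1.0, 1.0 / 137.818, 0.5
t_true = 4.0e8  # 400 Ma, within the range of dates reported above
he = he_produced(t_true, u238, u235, th232)
print(round(zhe_date(he, u238, u235, th232) / 1e6, 1))  # → 400.0 (Ma)
```

The date-eU negative correlation reported above reflects radiation-damage control on He retention, which sits on top of this ingrowth calculation in the full forward models.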

  4. The West Antarctic Ice Sheet (WAIS) is largely marine based and thus highly sensitive to both climatic and oceanographic changes. Therefore, the WAIS has likely had a very dynamic history over the last several million years. A complete collapse of the WAIS would result in a global sea level rise of 3.3–4.3 m, yet the world’s scientific community is not able to predict its future behavior. Moreover, knowledge about past behavior of the WAIS is poor, in particular during geological times with climatic conditions similar to those expected for the near and distant future. Reconstructions and quantifications of partial or complete WAIS collapses in the past are urgently needed for constraining and testing ice sheet models that aim to predict future WAIS behavior and the potential contribution of the WAIS to global sea level rise. Large uncertainties exist regarding the chronology, extent, rates, and spatial and temporal variability of past advances and retreats of the WAIS across the continental shelves. These uncertainties largely result from the fundamental lack of data from drill cores recovered proximal to the WAIS. The continental shelf and rise of the Amundsen Sea are prime targets for drilling because the records are expected to yield archives of pure WAIS dynamics unaffected by other ice sheets, and because the WAIS sector draining into the Amundsen Sea Embayment (ASE) currently experiences the largest ice loss in Antarctica (Paolo et al., 2015). We propose a series of drill sites for the ASE shelf where seismic data reveal seaward-dipping sedimentary sequences that span from the preglacial depositional phase to the most recent glacial periods. Our strategy is to drill a transect from the oldest sequences close to the bedrock/basin boundary at the middle–inner shelf transition to the youngest sequences on the outer shelf in the eastern ASE. If the eastern ASE is inaccessible due to sea ice cover, a similar transect of sites can be drilled on the western ASE.
The core transect will provide a detailed history of the glacial cycles in the Amundsen Sea region and allow comparison to the glacial history from the Ross Sea sector. In addition, deep-water sites on the continental rise of the Amundsen Sea are selected for recovering continuous records of glacially transported sediments and detailed archives of climatic and oceanographic changes throughout glacial–interglacial cycles. We will apply a broad suite of analytical techniques, including multiproxy analyses, to address our objectives of reconstructing the onset of glaciation in the greenhouse to icehouse transition, processes of dynamic ice sheet behavior during the Neogene and Quaternary, and ocean conditions associated with the glacial cycles. The five principal objectives of Expedition 379 are as follows: 1. To reconstruct the glacial history of West Antarctica from the Paleogene to recent times and the dynamic behavior of the WAIS during the Neogene and Quaternary, especially possible partial or full WAIS collapses, and the WAIS contribution to past sea level changes. Emphasis is placed in particular on studying the response of the WAIS at times when the pCO2 in Earth’s atmosphere exceeded 400 ppm and atmospheric and oceanic temperatures were higher than at present. 2. To correlate the WAIS-proximal records of ice sheet dynamics in the Amundsen Sea with global records of ice volume changes and proxy records for air and seawater temperatures. 3. To study the relationship between incursions of warm Circumpolar Deep Water (CDW) onto the continental shelf of the Amundsen Sea Embayment and the stability of marine-based ice sheet margins under warm water conditions. 4. To reconstruct the processes of major WAIS advances onto the middle and outer shelf that are likely to have occurred since the middle Miocene and compare their timing and processes to those of other Antarctic continental shelves. 5. 
To identify the timing of the first ice sheet expansion onto the continental shelf of the ASE and its possible relationship to the uplift of Marie Byrd Land.
  5. Abstract. Plume-SPH provides the first particle-based simulation of volcanic plumes. Smoothed particle hydrodynamics (SPH) has several advantages over currently used mesh-based methods in modeling multiphase free boundary flows like volcanic plumes. This tool will provide more accurate eruption source terms to users of volcanic ash transport and dispersion models (VATDs), greatly improving volcanic ash forecasts. The accuracy of these terms is crucial for forecasts from VATDs, and the 3-D SPH model presented here will provide better numerical accuracy. As an initial effort to explore the feasibility and advantages of SPH in volcanic plume modeling, we adopt a relatively simple physics model (a 3-D dusty-gas dynamic model assuming well-mixed eruption material, dynamic equilibrium and thermodynamic equilibrium between erupted material and the air entrained into the plume, and minimal effect of winds) targeted at capturing the salient features of a volcanic plume. The documented open-source code is easily obtained and extended to incorporate other models of physics of interest to the large community of researchers investigating multiphase free boundary flows of volcanic or other origins.

    The Plume-SPH code also incorporates several newly developed SPH techniques needed to address numerical challenges in simulating multiphase compressible turbulent flow. The code should thus also be of general interest to the much larger community of researchers using and developing SPH-based tools. In particular, the SPHε turbulence model is used to capture mixing at unresolved scales. Heat exchange due to turbulence is calculated by a Reynolds analogy, and a corrected SPH is used to handle tensile instability and deficiency of particle distribution near the boundaries. We also developed methodology to impose velocity inlet and pressure outlet boundary conditions, both of which are scarce in traditional implementations of SPH.

    The core solver of our model is parallelized with the message passing interface (MPI), obtaining good weak and strong scalability using novel techniques for data management based on space-filling curves (SFCs), object creation time-based indexing, and hash-table-based storage schemes. These techniques are of interest to researchers engaged in developing particle-in-cell-type methods. The code is first verified by 1-D shock tube tests, then by comparing velocity and concentration distributions along the central axis and on the transverse cross-section with experimental results of JPUE (a jet or plume that is ejected from a nozzle into a uniform environment). Profiles of several integrated variables are compared with those calculated by existing 3-D plume models for an eruption with the same mass eruption rate (MER) estimated for the Mt. Pinatubo eruption of 15 June 1991. Our results are consistent with existing 3-D plume models. Analysis of the plume evolution process demonstrates that this model is able to reproduce the physics of plume development.
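The SPH density summation at the heart of any such solver can be sketched compactly. This is a generic illustration with Monaghan's cubic spline kernel, not code from Plume-SPH; the particle spacing, smoothing length, and lattice test below are arbitrary choices.

```python
import math

def cubic_spline_w(r, h):
    """Monaghan's cubic spline kernel in 3-D; compact support of radius 2h."""
    q = r / h
    sigma = 1.0 / (math.pi * h ** 3)  # 3-D normalization constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q * q + 0.75 * q ** 3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0

def sph_density(positions, masses, h):
    """Summation density: rho_i = sum_j m_j W(|x_i - x_j|, h)."""
    rho = []
    for xi in positions:
        s = 0.0
        for xj, mj in zip(positions, masses):
            s += mj * cubic_spline_w(math.dist(xi, xj), h)
        rho.append(s)
    return rho

# Uniform cubic lattice of unit-mass particles; for an interior particle the
# summation density should approach m / dx^3 = 1 once the kernel support
# covers enough neighbours.
dx, h = 1.0, 1.3
pts = [(i * dx, j * dx, k * dx)
       for i in range(9) for j in range(9) for k in range(9)]
m = [1.0] * len(pts)
rho = sph_density(pts, m, h)
center = rho[len(rho) // 2]  # the particle at the lattice centre
print(round(center, 3))
```

Production SPH codes replace the O(N²) double loop with neighbour search, which is where the space-filling-curve data management mentioned above pays off.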
