Modern-day reservoir management and monitoring of geologic carbon storage increasingly call for costly time-lapse seismic data collection. We demonstrate how techniques from graph theory can be used to optimize acquisition geometries for low-cost sparse 4D seismic data. Based on midpoint-offset-domain connectivity arguments, our algorithm automatically produces sparse nonreplicated time-lapse acquisition geometries that favor wavefield recovery.
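The abstract's midpoint-offset connectivity argument can be illustrated with a toy sketch. The functions below (`midpoint_offset_bins`, `n_components`) are hypothetical simplifications, not the paper's algorithm: they bin 1-D source-receiver pairs into the midpoint-offset plane and count connected components with union-find, joining traces that share a midpoint row or offset column; fewer components loosely stands in for better-connected coverage that favors wavefield recovery.

```python
from itertools import combinations

def midpoint_offset_bins(sources, receivers, bin_size=1.0):
    """Map every source-receiver pair to a (midpoint, offset) bin."""
    bins = []
    for s in sources:
        for r in receivers:
            m = (s + r) / 2.0                 # common midpoint
            h = abs(s - r)                    # source-receiver offset
            bins.append((round(m / bin_size), round(h / bin_size)))
    return bins

def n_components(bins):
    """Count connected components, joining traces that share a midpoint
    row or an offset column in the midpoint-offset plane (union-find)."""
    parent = list(range(len(bins)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]     # path compression
            i = parent[i]
        return i
    for i, j in combinations(range(len(bins)), 2):
        if bins[i][0] == bins[j][0] or bins[i][1] == bins[j][1]:
            parent[find(i)] = find(j)         # union
    return len({find(i) for i in range(len(bins))})
```

A candidate sparse geometry whose bins collapse to a single component is, under this toy criterion, preferred over one that fragments the midpoint-offset plane.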
Derisking geologic carbon storage from high-resolution time-lapse seismic to explainable leakage detection
Geologic carbon storage represents one of the few truly scalable technologies capable of reducing the CO2 concentration in the atmosphere. While this technology has the potential to scale, its success hinges on our ability to mitigate its risks. An important aspect of risk mitigation concerns assurances that the injected CO2 remains within the storage complex. Among the different monitoring modalities, seismic imaging stands out due to its ability to attain high-resolution and high-fidelity images. However, these superior features come at prohibitive costs and time-intensive efforts that potentially render extensive seismic monitoring undesirable. To overcome this shortcoming, we present a methodology in which time-lapse images are created by inverting nonreplicated time-lapse monitoring data jointly. By no longer insisting on replication of the surveys to obtain high-fidelity time-lapse images and differences, extreme costs and time-consuming labor are averted. To demonstrate our approach, hundreds of realistic synthetic noisy time-lapse seismic data sets are simulated that contain imprints of regular CO2 plumes and irregular plumes that leak. These time-lapse data sets are subsequently inverted to produce time-lapse difference images that are used to train a deep neural classifier. The testing results show that the classifier is capable of detecting CO2 leakage automatically on unseen data with reasonable accuracy. We consider the use of this classifier as a first step in the development of an automatic workflow designed to handle the large number of continuously monitored CO2 injection sites needed to help combat climate change.
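The leak-versus-no-leak classification step can be sketched with a toy stand-in. The paper trains a deep neural classifier on inverted time-lapse difference images; the sketch below substitutes a linear logistic classifier trained by gradient descent on synthetic 16x16 "difference images", where the plume and leakage signatures (`difference_image` and its anomaly placements) are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def difference_image(leak, n=16):
    """Toy stand-in for an inverted time-lapse difference image:
    a regular plume anomaly, plus an off-plume anomaly when leaking."""
    img = rng.normal(0.0, 0.1, (n, n))
    img[6:10, 6:10] += 1.0            # regular CO2 plume signature
    if leak:
        img[2:5, 11:14] += 0.8        # hypothetical leakage signature
    return img

# Labeled synthetic examples: even indices = no leak, odd = leak.
X = np.stack([difference_image(i % 2 == 1).ravel() for i in range(200)])
y = (np.arange(200) % 2).astype(float)

# Logistic classifier trained by gradient descent -- a minimal linear
# stand-in for the deep neural classifier described in the abstract.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(300):
    z = np.clip(X @ w + b, -30.0, 30.0)
    p = 1.0 / (1.0 + np.exp(-z))      # predicted leak probability
    g = p - y                         # logistic-loss gradient signal
    w -= 0.1 * X.T @ g / len(y)
    b -= 0.1 * g.mean()

acc = ((np.clip(X @ w + b, -30.0, 30.0) > 0.0) == (y == 1.0)).mean()
```

Because the synthetic leakage signature sits well above the noise floor, even this linear model separates the two classes; the point of the paper's deep classifier is to do so on far subtler, realistic imprints.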
- Award ID(s):
- 2203821
- PAR ID:
- 10436334
- Date Published:
- Journal Name:
- The Leading Edge
- Volume:
- 42
- Issue:
- 1
- ISSN:
- 1070-485X
- Page Range / eLocation ID:
- 69 to 76
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
The energy transition to meet net-zero emissions by 2050 has created demand for underground caverns needed to safely store CO2, hydrocarbon, hydrogen, and wastewater. Salt domes are ideal for underground storage needs because of their low permeability and affordable costs, making them the preferred choice for large-scale storage projects like the US Strategic Petroleum Reserves. However, the uneven upward movement of salt spines can create drilling problems and breach cavern integrity, releasing harmful gases into overlying aquifers and endangering nearby communities. Here, we present a novel application of data-driven geophysical methods combined with machine learning that improves salt dome characterization during feasibility studies for site selection and potentially advances the effectiveness of current early-warning systems. We utilize long-term, non-invasive seismic monitoring to investigate deformation processes at the Sorrento salt dome in Louisiana. We developed a hybrid autoencoder model and applied it to an 8-month dataset from a nodal array deployed in 2020 to produce a high-fidelity microearthquake catalog. Our hybrid model outperformed traditional event detection techniques and other neural network detectors. Seismic signals from storms, rock bursts, trains, aircraft, and other anthropogenic sources were identified. Clusters of microearthquakes were observed along two N-S trends referred to as Boundary Shear Zones (BSZ), along which we infer that salt spines are moving differentially. Time-lapse sonar surveys were used to confirm variations in propagation rates within salt spines and assess deformation within individual caverns. Seismicity along one BSZ is linked with a well failure incident that created a 30-ft-wide crater at the surface in 2021.
This study introduces a novel method for mapping spatial and temporal variations in salt shear zones and provides insights into the subsurface processes that can compromise the safety and lifetime of underground storage sites.
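Autoencoder-based event detection of the kind this abstract describes rests on a simple idea: a model trained only on background noise reconstructs noise well and anomalous (event-bearing) windows poorly. The sketch below is a linear stand-in for the paper's hybrid autoencoder: a linear autoencoder is equivalent to PCA, so it projects windows onto the top principal components of event-free training data and thresholds the reconstruction error. The synthetic traces (`seismic_window`) and the pulse shape are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def seismic_window(event):
    """64-sample toy trace: Gaussian noise, optionally with a
    band-limited microearthquake-like pulse added mid-window."""
    w = rng.normal(0.0, 1.0, 64)
    if event:
        t = np.arange(64.0)
        w += 6.0 * np.exp(-0.5 * ((t - 32.0) / 3.0) ** 2) * np.sin(t)
    return w

# Train on event-free (background noise) windows only.
train = np.stack([seismic_window(False) for _ in range(300)])

# PCA basis of the background = weights of a linear autoencoder.
mean = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
basis = Vt[:8]                        # 8-dimensional bottleneck

def recon_error(w):
    code = (w - mean) @ basis.T                       # encode
    return np.linalg.norm((w - mean) - code @ basis)  # decode, compare

# Flag windows whose error exceeds the background's 99th percentile.
threshold = np.quantile([recon_error(w) for w in train], 0.99)
```

Windows containing the pulse carry energy outside the noise subspace, so their reconstruction error exceeds the threshold and they are flagged as candidate events for the catalog.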
SUMMARY Repeatedly recording seismic data over a period of months or years is one way to identify trapped oil and gas and to monitor CO2 injection in underground storage reservoirs and saline aquifers. This process of recording data over time and then differencing the images assumes the recording of the data over a particular subsurface region is repeatable. In other words, the hope is that one can recover changes in the Earth when the survey parameters are held fixed between data collection times. Unfortunately, perfect experimental repeatability almost never occurs. Acquisition inconsistencies such as changes in weather (currents, wind) for marine seismic data are inevitable, resulting in source and receiver location differences between surveys at the very least. Thus, data processing aimed at improving repeatability between baseline and monitor surveys is extremely useful. One such processing tool is regularization (or binning) that aligns multiple surveys with different source or receiver configurations onto a common grid. Data binned onto a regular grid can be stored in a high-dimensional data structure called a tensor with, for example, x and y receiver coordinates and time as indices of the tensor. Such a higher-order data structure describing a subsection of the Earth often exhibits redundancies which one can exploit to fill in gaps caused by sampling the surveys onto the common grid. In fact, since data gaps and noise increase the rank of the tensor, seeking to recover the original data by reducing the rank (low-rank tensor-based completion) successfully fills in gaps caused by binning. The tensor nuclear norm (TNN) is defined by the tensor singular value decomposition (tSVD) which generalizes the matrix SVD. In this work we complete missing time-lapse data caused by binning using the alternating direction method of multipliers (or ADMM) to minimize the TNN. 
For a synthetic experiment with three parabolic events in which the time-lapse difference involves an amplitude increase in one of these events between baseline and monitor data sets, the binning and reconstruction algorithm (TNN-ADMM) correctly recovers this time-lapse change. We also apply this workflow of binning and TNN-ADMM reconstruction to a real marine survey from offshore Western Australia in which the binning onto a regular grid results in significant data gaps. The data after reconstruction varies continuously without the large gaps caused by the binning process.
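The low-rank completion machinery described above can be sketched in its matrix (order-2) special case: minimize the nuclear norm subject to agreement with the observed entries, solved by ADMM with singular-value thresholding as the proximal step. This is not the paper's TNN-ADMM, which replaces the SVD with the tensor SVD (tSVD) for higher-order data; the rank-2 test matrix and 60% sampling mask are assumptions for illustration.

```python
import numpy as np

def svt(X, tau):
    """Singular-value thresholding: the proximal operator of the
    nuclear norm (the sum of singular values)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ (np.maximum(s - tau, 0.0)[:, None] * Vt)

def complete(M, mask, rho=1.0, iters=300):
    """ADMM for  min ||X||_*  s.t.  X agrees with M on observed
    entries -- the matrix special case of TNN-ADMM completion."""
    X = np.zeros_like(M)
    Z = np.zeros_like(M)
    U = np.zeros_like(M)
    for _ in range(iters):
        X = svt(Z - U, 1.0 / rho)     # nuclear-norm proximal step
        Z = X + U
        Z[mask] = M[mask]             # project onto the data constraint
        U += X - Z                    # dual update
    return X

rng = np.random.default_rng(0)
A = rng.normal(size=(30, 2)) @ rng.normal(size=(2, 30))  # rank-2 "data"
mask = rng.random(A.shape) < 0.6                         # 60% observed
rec = complete(A, mask)
```

Because data gaps and noise raise the (tensor) rank, minimizing the nuclear norm pulls the completed array back toward the low-rank structure of the underlying surveys, filling the bins left empty by regularization.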
For farmers, policymakers, and government agencies, it is critical to accurately define agricultural crop phenology and its spatial-temporal variability. At the moment, two approaches are utilized to report crop phenology. On one hand, land surface phenology provides information about the overall trend, whereas weekly reports from USDA-NASS provide information about the development of particular crops at the regional level. High-cadence earth observations might help to improve the accuracy of these estimations and bring more precise crop phenology classifications closer to what farmers demand. The second component of the proposed solution requires the use of robust classifiers (e.g., random forest, RF) capable of successfully managing large data sets. To evaluate this solution, this study compared the output of an RF classifier model using weather, two different satellite sources (Planet Fusion, PF, and Sentinel-2, S-2), and ground truth data to improve maize (Zea mays L.) crop phenology classification using two regions of Kansas (Southwest and Central) as a testbed during the 2017 growing season. Our findings suggest that high temporal resolution (PF) data can significantly improve crop classification metrics (f1-score = 0.94) relative to S-2 (f1-score = 0.86). Additionally, a decline in the f1-score between 0.74 and 0.60 was obtained when we assessed the ability of S-2 to extend the temporal forecast for crop phenology. This research highlights the critical nature of very high temporal resolution (daily) earth observation data for crop monitoring and decision making in agriculture.
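The comparison above is reported in terms of per-class f1-scores (0.94 for PF vs. 0.86 for S-2). As a minimal sketch of that metric, the function below computes the F1 for one phenology class as the harmonic mean of precision and recall; the maize stage labels in the usage example ("V6", "R1") are illustrative, not the study's actual class set.

```python
def f1_score(y_true, y_pred, positive):
    """Per-class F1: harmonic mean of precision and recall
    for the class given by `positive`."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2.0 * precision * recall / (precision + recall)
```

For example, with true stages `["V6", "V6", "R1", "R1"]` and predictions `["V6", "R1", "R1", "R1"]`, the "R1" class has precision 2/3 and recall 1, giving an F1 of 0.8; averaging per-class F1 over all stages yields the macro f1-scores quoted in the abstract.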
Seismology provides important constraints on the structure and dynamics of the deep mantle. Computational and methodological advances in the past two decades improved tomographic imaging of the mantle and revealed the fine-scale structure of plumes ascending from the core-mantle boundary region and slabs of oceanic lithosphere sinking into the lower mantle. We discuss the modeling aspects of global tomography including theoretical approximations, data selection, and model fidelity and resolution. Using spectral, principal component, and cluster analyses, we highlight the robust patterns of seismic heterogeneity, which inform us of flow in the mantle, the history of plate motions, and potential compositionally distinct reservoirs. In closing, we emphasize that data mining of vast collections of seismic waveforms and new data from distributed acoustic sensing, autonomous hydrophones, ocean-bottom seismometers, and correlation-based techniques will boost the development of the next generation of global models of density, seismic velocity, and attenuation. ▪ Seismic tomography reveals the 100-km to 1,000-km scale variation of seismic velocity heterogeneity in the mantle. ▪ Tomographic images are the most important geophysical constraints on mantle circulation and evolution.