Modern-day reservoir management and monitoring of geologic carbon storage increasingly call for costly time-lapse seismic data collection. We demonstrate how techniques from graph theory can be used to optimize acquisition geometries for low-cost sparse 4D seismic data. Based on midpoint-offset-domain connectivity arguments, our algorithm automatically produces sparse nonreplicated time-lapse acquisition geometries that favor wavefield recovery.
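As a rough illustration of the midpoint-offset-domain connectivity argument mentioned above (the paper's actual graph algorithm is not reproduced here), the sketch below bins a randomly subsampled 2D source grid into midpoint-offset cells and uses networkx to check whether the occupied cells form a single connected component. The grid spacings, the 20% subsampling rate, and the 8-neighbor adjacency rule are illustrative assumptions.

```python
# Illustrative sketch only: scores a candidate sparse 2D geometry by the
# connectivity of its occupied midpoint-offset bins (assumed criterion).
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)

# Dense candidate geometry: sources and receivers on a 10 m grid.
sources = np.arange(0, 1000, 10.0)
receivers = np.arange(0, 1000, 10.0)

# Randomly keep 20% of the sources to emulate a low-cost sparse survey.
kept = rng.choice(sources, size=len(sources) // 5, replace=False)

# Map every kept source-receiver pair to a midpoint-offset bin (25 m cells).
s, r = np.meshgrid(kept, receivers, indexing="ij")
midpoint = (s + r) / 2.0
offset = np.abs(s - r)
bins = set(zip((midpoint // 25).astype(int).ravel(),
               (offset // 25).astype(int).ravel()))

# Build a graph on occupied bins; neighboring bins (8-connectivity) share an edge.
G = nx.Graph()
G.add_nodes_from(bins)
for (m, h) in bins:
    for dm in (-1, 0, 1):
        for dh in (-1, 0, 1):
            if (dm, dh) != (0, 0) and (m + dm, h + dh) in bins:
                G.add_edge((m, h), (m + dm, h + dh))

# A geometry whose occupied bins form one connected component is assumed to
# favor wavefield recovery; fragmented coverage would be rejected or repaired.
print("occupied bins:", G.number_of_nodes())
print("connected:", nx.is_connected(G))
print("components:", nx.number_connected_components(G))
```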
Derisking geologic carbon storage from high-resolution time-lapse seismic to explainable leakage detection
Geologic carbon storage represents one of the few truly scalable technologies capable of reducing the CO2 concentration in the atmosphere. While this technology has the potential to scale, its success hinges on our ability to mitigate its risks. An important aspect of risk mitigation concerns assurances that the injected CO2 remains within the storage complex. Among the different monitoring modalities, seismic imaging stands out due to its ability to attain high-resolution and high-fidelity images. However, these superior features come at prohibitive costs and time-intensive efforts that potentially render extensive seismic monitoring undesirable. To overcome this shortcoming, we present a methodology in which time-lapse images are created by inverting nonreplicated time-lapse monitoring data jointly. By no longer insisting on replication of the surveys to obtain high-fidelity time-lapse images and differences, extreme costs and time-consuming labor are averted. To demonstrate our approach, hundreds of realistic synthetic noisy time-lapse seismic data sets are simulated that contain imprints of regular CO2 plumes and irregular plumes that leak. These time-lapse data sets are subsequently inverted to produce time-lapse difference images that are used to train a deep neural classifier. The testing results show that the classifier is capable of detecting CO2 leakage automatically on unseen data with reasonable accuracy. We consider the use of this classifier as a first step in the development of an automatic workflow designed to handle the large number of continuously monitored CO2 injection sites needed to help combat climate change.
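The final step described in the abstract, training a deep neural classifier on inverted time-lapse difference images, can be pictured with a short PyTorch sketch. The architecture, image size, and training loop below are assumptions for illustration only; they are not the network or data used in the paper.

```python
# Minimal sketch (assumed architecture): a small CNN that labels time-lapse
# difference images as "regular plume" (0) or "leakage" (1).
import torch
import torch.nn as nn

class LeakClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 2),  # two classes: regular plume vs. leakage
        )

    def forward(self, x):
        return self.head(self.features(x))

# Stand-in data: random 64x64 "difference images" with random labels, in place
# of the hundreds of simulated noisy time-lapse data sets described above.
images = torch.randn(128, 1, 64, 64)
labels = torch.randint(0, 2, (128,))

model = LeakClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```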
- Award ID(s): 2203821
- PAR ID: 10436334
- Date Published:
- Journal Name: The Leading Edge
- Volume: 42
- Issue: 1
- ISSN: 1070-485X
- Page Range / eLocation ID: 69 to 76
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- The energy transition to meet net-zero emissions by 2050 has created demand for underground caverns needed to safely store CO2, hydrocarbon, hydrogen, and wastewater. Salt domes are ideal for underground storage needs because of their low permeability and affordable costs, which make them the preferred choice for large-scale storage projects like the US Strategic Petroleum Reserves. However, the uneven upward movement of salt spines can create drilling problems and breach cavern integrity, releasing harmful gases into overlying aquifers and endangering nearby communities. Here, we present a novel application of data-driven geophysical methods combined with machine learning that improves salt dome characterization during feasibility studies for site selection and potentially advances the effectiveness of current early-warning systems. We utilize long-term, non-invasive seismic monitoring to investigate deformation processes at the Sorrento salt dome in Louisiana. We developed a hybrid autoencoder model and applied it to an 8-month dataset from a nodal array deployed in 2020 to produce a high-fidelity microearthquake catalog. Our hybrid model outperformed traditional event detection techniques and other neural network detectors. Seismic signals from storms, rock bursts, trains, aircraft, and other anthropogenic sources were identified. Clusters of microearthquakes were observed along two N-S trends referred to as Boundary Shear Zones (BSZ), along which we infer that salt spines are moving differentially. Time-lapse sonar surveys were used to confirm variations in propagation rates within salt spines and assess deformation within individual caverns. Seismicity along one BSZ is linked with a well failure incident that created a 30-ft wide crater at the surface in 2021. This study introduces a novel method for mapping spatial and temporal variations in salt shear zones and provides insights into the subsurface processes that can compromise the safety and lifetime of underground storage sites.
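The hybrid autoencoder used in the study above is not reproduced here; the sketch below only illustrates the general idea behind autoencoder-based event detection, in which a network trained to reconstruct background-noise windows reconstructs microearthquake windows poorly, so a threshold on the reconstruction error flags candidate events. The window length, layer sizes, and 3-sigma threshold are assumptions.

```python
# Illustrative sketch: flag candidate microearthquakes by the reconstruction
# error of an autoencoder trained on background-noise windows (assumed setup).
import torch
import torch.nn as nn

window = 256  # samples per waveform window (assumption)

autoencoder = nn.Sequential(
    nn.Linear(window, 64), nn.ReLU(),
    nn.Linear(64, 16), nn.ReLU(),      # compressed representation
    nn.Linear(16, 64), nn.ReLU(),
    nn.Linear(64, window),
)

# Stand-in training data: windows assumed to contain only background noise.
noise_windows = torch.randn(512, window)

optimizer = torch.optim.Adam(autoencoder.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for _ in range(200):
    optimizer.zero_grad()
    loss = loss_fn(autoencoder(noise_windows), noise_windows)
    loss.backward()
    optimizer.step()

# Detection: windows the model reconstructs poorly are candidate events.
with torch.no_grad():
    test_windows = torch.randn(32, window)
    errors = ((autoencoder(test_windows) - test_windows) ** 2).mean(dim=1)
    threshold = errors.mean() + 3 * errors.std()  # simple 3-sigma rule
    candidates = torch.nonzero(errors > threshold).flatten()
print("candidate event windows:", candidates.tolist())
```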
- Geological Carbon Storage (GCS) is one of the most viable climate-change mitigating net-negative CO2-emission technologies for large-scale CO2 sequestration. However, subsurface complexities and reservoir heterogeneity demand a systematic approach to uncertainty quantification to ensure both containment and conformance, as well as to optimize operations. As a step toward a digital twin for monitoring and control of underground storage, we introduce a new machine-learning-based data-assimilation framework validated on realistic numerical simulations. The proposed digital shadow combines simulation-based inference (SBI) with a novel neural adaptation of a recently developed nonlinear ensemble filtering technique. To characterize the posterior distribution of CO2 plume states (saturation and pressure) conditioned on multimodal time-lapse data, consisting of imaged surface seismic and well-log data, a generic recursive scheme is employed, where neural networks are trained on simulated ensembles for the time-advanced state and observations. Once trained, the digital shadow infers the state as time-lapse field data become available. Unlike ensemble Kalman filtering, corrections to predicted states are computed via a learned nonlinear prior-to-posterior mapping that supports non-Gaussian statistics and nonlinear models for the dynamics and observations. Training and inference are facilitated by the combined use of conditional invertible neural networks and bespoke physics-based summary statistics. Starting with a probabilistic permeability model derived from a baseline seismic survey, the digital shadow is validated against unseen simulated ground-truth time-lapse data. Results show that injection-site-specific uncertainty in permeability can be incorporated into state uncertainty, and the highest reconstruction quality is achieved when conditioning on both seismic and wellbore data. Despite incomplete permeability knowledge, the digital shadow accurately tracks the subsurface state throughout a realistic CO2 injection project. This work establishes the first proof-of-concept for an uncertainty-aware, scalable digital shadow, laying the foundation for a digital twin to optimize underground storage operations.
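The recursive scheme described above (simulate an ensemble forward, train on it, then condition on field data as they arrive) can be sketched in a few lines if the paper's learned nonlinear prior-to-posterior mapping, built from conditional invertible neural networks, is replaced by a plain linear ensemble regression. Everything below, including the toy dynamics, observation operator, and ensemble size, is an illustrative stand-in rather than the authors' implementation.

```python
# Toy sketch of the recursive digital-shadow loop: advance an ensemble,
# simulate its observations, correct toward the "field" data, and repeat as
# new time-lapse data arrive. The linear ensemble regression below is only a
# compact stand-in for the paper's learned nonlinear mapping.
import numpy as np

rng = np.random.default_rng(0)
n_ens, n_state, n_steps = 200, 5, 4

def dynamics(x):            # stand-in for the reservoir simulator
    return 0.9 * x + 0.1 * np.tanh(x)

def observe(x):             # stand-in for the seismic/well observation operator
    return x[..., :2] + 0.1 * rng.standard_normal(x[..., :2].shape)

truth = rng.standard_normal(n_state)               # unknown "field" state
ensemble = rng.standard_normal((n_ens, n_state))   # samples of the uncertain prior

for step in range(n_steps):
    # Advance the unknown truth and every ensemble member with the same dynamics.
    truth = dynamics(truth)
    ensemble = dynamics(ensemble)

    # Simulate noisy observations for the ensemble and record the field data.
    obs_ens = observe(ensemble)
    obs_field = observe(truth[None, :])

    # Linear stand-in for the learned correction: regress state deviations on
    # observation deviations and nudge each member toward the field data.
    X = ensemble - ensemble.mean(axis=0)
    Y = obs_ens - obs_ens.mean(axis=0)
    gain = (X.T @ Y) @ np.linalg.inv(Y.T @ Y + 1e-6 * np.eye(Y.shape[1]))
    ensemble = ensemble + (obs_field - obs_ens) @ gain.T

    err = np.linalg.norm(ensemble.mean(axis=0) - truth)
    print(f"step {step}: error of posterior mean {err:.3f}")
```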
- Repeatedly recording seismic data over a period of months or years is one way to identify trapped oil and gas and to monitor CO2 injection in underground storage reservoirs and saline aquifers. This process of recording data over time and then differencing the images assumes the recording of the data over a particular subsurface region is repeatable. In other words, the hope is that one can recover changes in the Earth when the survey parameters are held fixed between data collection times. Unfortunately, perfect experimental repeatability almost never occurs. Acquisition inconsistencies such as changes in weather (currents, wind) for marine seismic data are inevitable, resulting in source and receiver location differences between surveys at the very least. Thus, data processing aimed at improving repeatability between baseline and monitor surveys is extremely useful. One such processing tool is regularization (or binning) that aligns multiple surveys with different source or receiver configurations onto a common grid. Data binned onto a regular grid can be stored in a high-dimensional data structure called a tensor with, for example, x and y receiver coordinates and time as indices of the tensor. Such a higher-order data structure describing a subsection of the Earth often exhibits redundancies which one can exploit to fill in gaps caused by sampling the surveys onto the common grid. In fact, since data gaps and noise increase the rank of the tensor, seeking to recover the original data by reducing the rank (low-rank tensor-based completion) successfully fills in gaps caused by binning. The tensor nuclear norm (TNN) is defined by the tensor singular value decomposition (tSVD) which generalizes the matrix SVD. In this work we complete missing time-lapse data caused by binning using the alternating direction method of multipliers (or ADMM) to minimize the TNN. For a synthetic experiment with three parabolic events in which the time-lapse difference involves an amplitude increase in one of these events between baseline and monitor data sets, the binning and reconstruction algorithm (TNN-ADMM) correctly recovers this time-lapse change. We also apply this workflow of binning and TNN-ADMM reconstruction to a real marine survey from offshore Western Australia in which the binning onto a regular grid results in significant data gaps. The data after reconstruction varies continuously without the large gaps caused by the binning process.
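The TNN-ADMM workflow summarized above lends itself to a compact numerical sketch: the proximal operator of the tensor nuclear norm is applied slice by slice in the Fourier domain (the tSVD construction), inside an ADMM loop that keeps the reconstruction consistent with the binned traces on the sampled entries. The tensor size, sampling mask, penalty parameter, and iteration count below are illustrative choices, not those of the paper.

```python
# Compact numerical sketch of tensor completion by minimizing the tensor
# nuclear norm (TNN) with ADMM, filling gaps left by binning onto a grid.
import numpy as np

def tsvt(T, tau):
    """Soft-threshold the singular values of each frontal slice in the Fourier
    domain along the third axis: the proximal operator of the TNN (tSVD)."""
    F = np.fft.fft(T, axis=2)
    out = np.zeros_like(F)
    for k in range(T.shape[2]):
        U, s, Vh = np.linalg.svd(F[:, :, k], full_matrices=False)
        out[:, :, k] = (U * np.maximum(s - tau, 0.0)) @ Vh
    return np.real(np.fft.ifft(out, axis=2))

rng = np.random.default_rng(0)

# Synthetic low-rank data cube (e.g. x-receiver, y-receiver, time) with gaps.
n1, n2, n3, rank = 30, 30, 20, 3
data = np.einsum("ir,jr,kr->ijk",
                 rng.standard_normal((n1, rank)),
                 rng.standard_normal((n2, rank)),
                 rng.standard_normal((n3, rank)))
mask = rng.random((n1, n2, n3)) < 0.5          # entries kept after binning

# ADMM for:  min ||X||_TNN  subject to  X agreeing with the data on the mask.
rho = 1.0
X = np.zeros_like(data)
Z = np.zeros_like(data)
Y = np.zeros_like(data)
for it in range(100):
    X = tsvt(Z - Y / rho, 1.0 / rho)           # TNN proximal step
    Z = X + Y / rho
    Z[mask] = data[mask]                        # enforce observed samples
    Y = Y + rho * (X - Z)

rel_err = np.linalg.norm(X - data) / np.linalg.norm(data)
print(f"relative reconstruction error: {rel_err:.3e}")
```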
- For farmers, policymakers, and government agencies, it is critical to accurately define agricultural crop phenology and its spatial-temporal variability. At the moment, two approaches are utilized to report crop phenology. On one hand, land surface phenology provides information about the overall trend, whereas weekly reports from USDA-NASS provide information about the development of particular crops at the regional level. High-cadence earth observations might help to improve the accuracy of these estimations and bring more precise crop phenology classifications closer to what farmers demand. The second component of the proposed solution requires the use of robust classifiers (e.g., random forest, RF) capable of successfully managing large data sets. To evaluate this solution, this study compared the output of a RF classifier model using weather, two different satellite sources (Planet Fusion; PF and Sentinel-2; S-2), and ground truth data to improve maize (Zea mays L.) crop phenology classification using two regions of Kansas (Southwest and Central) as a testbed during the 2017 growing season. Our findings suggest that high temporal resolution (PF) data can significantly improve crop classification metrics (f1-score = 0.94) relative to S-2 (f1-score = 0.86). Additionally, a decline in the f1-score between 0.74 and 0.60 was obtained when we assessed the ability of S-2 to extend the temporal forecast for crop phenology. This research highlights the critical nature of very high temporal resolution (daily) earth observation data for crop monitoring and decision making in agriculture.
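The classification step described above comes down to fitting a random forest on per-date image and weather features and scoring it with the f1 metric. Since the Planet Fusion/Sentinel-2 time series and ground-truth phenology labels are not available here, the sketch below uses synthetic features and assumed class codes purely to show the workflow.

```python
# Minimal sketch: a random forest classifier scored with the f1 metric, as a
# stand-in for the maize phenology classification described above. The
# synthetic features imitate per-date vegetation-index and weather predictors.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_features = 2000, 12

X = rng.standard_normal((n_samples, n_features))        # NDVI/weather stand-ins
stage = X[:, 0] + 0.5 * X[:, 1] + 0.2 * rng.standard_normal(n_samples)
y = np.digitize(stage, np.quantile(stage, [0.2, 0.4, 0.6, 0.8]))  # 5 stages, 0..4

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Weighted f1, comparable in spirit to the scores quoted in the abstract.
pred = model.predict(X_test)
print("f1-score:", round(f1_score(y_test, pred, average="weighted"), 3))
```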