Geologic carbon storage represents one of the few truly scalable technologies capable of reducing the CO2 concentration in the atmosphere. While this technology has the potential to scale, its success hinges on our ability to mitigate its risks. An important aspect of risk mitigation concerns assurances that the injected CO2 remains within the storage complex. Among the different monitoring modalities, seismic imaging stands out due to its ability to attain high-resolution and high-fidelity images. However, these superior features come at prohibitive costs and time-intensive efforts that potentially render extensive seismic monitoring undesirable. To overcome this shortcoming, we present a methodology in which time-lapse images are created by inverting nonreplicated time-lapse monitoring data jointly. By no longer insisting on replication of the surveys to obtain high-fidelity time-lapse images and differences, extreme costs and time-consuming labor are averted. To demonstrate our approach, hundreds of realistic synthetic noisy time-lapse seismic data sets are simulated that contain imprints of regular CO2 plumes and irregular plumes that leak. These time-lapse data sets are subsequently inverted to produce time-lapse difference images that are used to train a deep neural classifier. The testing results show that the classifier is capable of detecting CO2 leakage automatically on unseen data with reasonable accuracy. We consider the use of this classifier as a first step in the development of an automatic workflow designed to handle the large number of continuously monitored CO2 injection sites needed to help combat climate change.
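The workflow described above ends with a deep neural classifier that labels inverted time-lapse difference images as leaking or not. The sketch below is not the authors' network; it is a minimal PyTorch binary classifier of the kind such a pipeline could plug in. The class name, layer widths, image size, and the random stand-in data are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

class LeakageClassifier(nn.Module):
    """Toy CNN mapping a single-channel time-lapse difference image to a
    leak / no-leak logit. Layer widths and depth are arbitrary choices."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 1)

    def forward(self, x):                       # x: (batch, 1, depth, lateral)
        return self.head(self.features(x).flatten(1))

# Hypothetical training step on (difference image, label) pairs; the random
# tensors stand in for inverted time-lapse difference images and leak labels.
model = LeakageClassifier()
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
images = torch.randn(8, 1, 128, 256)
labels = torch.randint(0, 2, (8, 1)).float()    # 1 = leaking plume, 0 = regular plume
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
print(float(loss))
```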
Optimized time-lapse acquisition design via spectral gap ratio minimization
Modern-day reservoir management and monitoring of geologic carbon storage increasingly call for costly time-lapse seismic data collection. We demonstrate how techniques from graph theory can be used to optimize acquisition geometries for low-cost sparse 4D seismic data. Based on midpoint-offset-domain connectivity arguments, our algorithm automatically produces sparse nonreplicated time-lapse acquisition geometries that favor wavefield recovery.
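A minimal sketch of the general idea follows, under two stated assumptions: the spectral gap ratio is taken as the ratio of the two leading singular values of the binary sampling mask rearranged into midpoint-offset coordinates, and a plain random search over source positions stands in for whatever optimizer the paper actually uses. Grid sizes and the dense receiver line are illustrative choices.

```python
import numpy as np

def midpoint_offset_mask(src_idx, rec_idx, n):
    """Binary sampling mask of a 2-D line rearranged into midpoint-offset
    coordinates (midpoint = (s + r) / 2, offset = r - s, both binned)."""
    mask = np.zeros((n, 2 * n - 1))
    for s in src_idx:
        for r in rec_idx:
            mask[(s + r) // 2, (r - s) + (n - 1)] = 1.0
    return mask

def spectral_gap_ratio(mask):
    """Ratio of the second-largest to the largest singular value of the mask;
    smaller values are taken here to indicate geometries that favor recovery."""
    sv = np.linalg.svd(mask, compute_uv=False)
    return sv[1] / sv[0]

# Plain random search over subsampled source positions with a dense receiver
# line; all sizes are illustrative and the optimizer is a stand-in.
rng = np.random.default_rng(seed=0)
n_grid, n_src = 100, 20
rec_idx = np.arange(n_grid)
best_ratio, best_src = np.inf, None
for _ in range(200):
    src_idx = np.sort(rng.choice(n_grid, size=n_src, replace=False))
    ratio = spectral_gap_ratio(midpoint_offset_mask(src_idx, rec_idx, n_grid))
    if ratio < best_ratio:
        best_ratio, best_src = ratio, src_idx
print(best_ratio, best_src)
```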
- Award ID(s): 2203821
- PAR ID: 10436338
- Date Published:
- Journal Name: GEOPHYSICS
- Volume: 88
- Issue: 4
- ISSN: 0016-8033
- Page Range / eLocation ID: A19 to A23
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Reconstruction of sparsely sampled seismic data is critical for maintaining the quality of seismic images when significant numbers of shots and receivers are missing. We present a reconstruction method in the shot-receiver-time (SRT) domain based on a residual U-Net machine learning architecture for seismic data acquired in a sparse 2-D acquisition, and name it SRT2D-ResU-Net. The SRT domain retains a high level of seismic signal connectivity, which is likely the main data feature that the reconstructing algorithms rely on. We develop an “in situ training and prediction” workflow by dividing the acquisition area into two nonoverlapping subareas: a training subarea for establishing the network model using regularly sampled data and a testing subarea for reconstructing the sparsely sampled data using the trained model. To establish a reference base for analyzing the changes in data features over the study area, and for quantifying the reconstructed seismic data, we devise a baseline reference using a tiny portion of the field data. The baselines are properly spaced and excluded from the training and reconstruction processes. The results on a field marine data set show that the SRT2D-ResU-Net can effectively learn the features of seismic data in the training process, and the average correlation between the reconstructed missing traces and the true answers is over 85%. (A minimal residual U-Net sketch appears after this list.)
- Repeatedly recording seismic data over a period of months or years is one way to identify trapped oil and gas and to monitor CO2 injection in underground storage reservoirs and saline aquifers. This process of recording data over time and then differencing the images assumes the recording of the data over a particular subsurface region is repeatable. In other words, the hope is that one can recover changes in the Earth when the survey parameters are held fixed between data collection times. Unfortunately, perfect experimental repeatability almost never occurs. Acquisition inconsistencies such as changes in weather (currents, wind) for marine seismic data are inevitable, resulting in source and receiver location differences between surveys at the very least. Thus, data processing aimed at improving repeatability between baseline and monitor surveys is extremely useful. One such processing tool is regularization (or binning) that aligns multiple surveys with different source or receiver configurations onto a common grid. Data binned onto a regular grid can be stored in a high-dimensional data structure called a tensor with, for example, x and y receiver coordinates and time as indices of the tensor. Such a higher-order data structure describing a subsection of the Earth often exhibits redundancies which one can exploit to fill in gaps caused by sampling the surveys onto the common grid. In fact, since data gaps and noise increase the rank of the tensor, seeking to recover the original data by reducing the rank (low-rank tensor-based completion) successfully fills in gaps caused by binning. The tensor nuclear norm (TNN) is defined by the tensor singular value decomposition (tSVD), which generalizes the matrix SVD. In this work we complete missing time-lapse data caused by binning using the alternating direction method of multipliers (or ADMM) to minimize the TNN. For a synthetic experiment with three parabolic events in which the time-lapse difference involves an amplitude increase in one of these events between baseline and monitor data sets, the binning and reconstruction algorithm (TNN-ADMM) correctly recovers this time-lapse change. We also apply this workflow of binning and TNN-ADMM reconstruction to a real marine survey from offshore Western Australia in which the binning onto a regular grid results in significant data gaps. The data after reconstruction varies continuously without the large gaps caused by the binning process. (A minimal t-SVD soft-thresholding/ADMM sketch appears after this list.)
- We present the Seismic Laboratory for Imaging and Modeling/Monitoring open-source software framework for computational geophysics and, more generally, inverse problems involving the wave equation (e.g., seismic and medical ultrasound), regularization with learned priors, and learned neural surrogates for multiphase flow simulations. By integrating multiple layers of abstraction, the software is designed to be both readable and scalable, allowing researchers to easily formulate problems in an abstract fashion while exploiting the latest developments in high-performance computing. The design principles and their benefits are illustrated and demonstrated by means of building a scalable prototype for permeability inversion from time-lapse crosswell seismic data, which, aside from coupling of wave physics and multiphase flow, involves machine learning.
- Surface-based 2D electrical resistivity tomography (ERT) surveys were used to characterize permafrost distribution at wetland sites on the alluvial plain north of the Tanana River, 20 km southwest of Fairbanks, Alaska, in June and September 2014. The sites were part of an ecologically sensitive research area characterizing the biogeochemical response of this region to warming and permafrost thaw, and the site contained landscape features characteristic of interior Alaska, including thermokarst bog, forested permafrost plateau, and a rich fen. The results show how vegetation reflects shallow (0–10 m depth) permafrost distribution. Additionally, we saw shallow (0–3 m depth) low-resistivity areas in the forested permafrost plateau, potentially indicating the presence of increased unfrozen water content as a precursor to ground instability and thaw. A time-lapse study from June to September suggested a depth of seasonal influence extending several meters below the active layer, potentially as a result of changes in unfrozen water content. A comparison of several electrode geometries (dipole-dipole, extended dipole-dipole, Wenner-Schlumberger) showed that for the depths of interest to our study (0–10 m) the results were similar, but data acquisition time with dipole-dipole was the shortest, making it our preferred geometry. The results show the utility of ERT surveys for characterizing permafrost distribution at these sites and how vegetation reflects shallow permafrost distribution. This information is valuable for ecologically sensitive areas where ground-truthing can cause excessive disturbance. ERT data can be used to characterize the subsurface geometry of permafrost so that changing permafrost conditions can be tracked in detail over time. Characterizing the depth of thaw and thermal influence from the surface in these areas also provides an important indication of the depth to which carbon storage and microbially mediated carbon processing may be affected. (A short apparent-resistivity sketch appears after this list.)
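The sketches below illustrate, in minimal form, three of the techniques described in the related records above. None of them is the original authors' code; every class name, numeric setting, and stand-in data set is an assumption made for illustration.

First, in the spirit of the SRT2D-ResU-Net item: a toy residual U-Net with a single encoder/decoder scale that maps a zero-filled gather to a reconstructed one. The layer widths and the final additive skip are illustrative choices, not the published architecture.

```python
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    """Two 3x3 convolutions with an identity skip (the residual part)."""
    def __init__(self, ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, ch, 3, padding=1),
        )

    def forward(self, x):
        return torch.relu(x + self.body(x))

class TinyResUNet(nn.Module):
    """One encoder/decoder scale; input is a zero-filled gather (1 channel),
    output is the reconstructed gather."""
    def __init__(self, ch=32):
        super().__init__()
        self.stem = nn.Conv2d(1, ch, 3, padding=1)
        self.enc = ResBlock(ch)
        self.down = nn.Conv2d(ch, 2 * ch, 3, stride=2, padding=1)
        self.mid = ResBlock(2 * ch)
        self.up = nn.ConvTranspose2d(2 * ch, ch, 2, stride=2)
        self.dec = ResBlock(ch)
        self.out = nn.Conv2d(ch, 1, 3, padding=1)

    def forward(self, x):
        e = self.enc(self.stem(x))
        m = self.mid(self.down(e))
        d = self.dec(self.up(m) + e)    # skip connection across scales
        return self.out(d) + x          # learn a correction to the input

net = TinyResUNet()
gather = torch.randn(4, 1, 128, 128)    # stand-in for zero-filled gathers
print(net(gather).shape)                # torch.Size([4, 1, 128, 128])
```

Second, tensor completion by tensor-nuclear-norm minimization, assuming the standard t-SVD construction (FFT along the third axis, per-slice singular-value soft-thresholding) inside a basic ADMM splitting. The step size, iteration count, and the rank-1 test tensor are arbitrary.

```python
import numpy as np

def t_svt(T, tau):
    """Singular-value soft-thresholding in the t-SVD sense: FFT along the
    third axis, shrink each frontal slice's singular values, inverse FFT."""
    F = np.fft.fft(T, axis=2)
    out = np.zeros_like(F)
    for k in range(T.shape[2]):
        U, s, Vh = np.linalg.svd(F[:, :, k], full_matrices=False)
        out[:, :, k] = (U * np.maximum(s - tau, 0.0)) @ Vh
    return np.real(np.fft.ifft(out, axis=2))

def tnn_admm_complete(data, mask, rho=1.0, n_iter=100):
    """Fill the entries where mask == 0 by minimizing the tensor nuclear norm
    with a basic ADMM splitting; rho and n_iter are arbitrary settings."""
    X = data * mask
    Z = X.copy()
    Y = np.zeros_like(X)
    for _ in range(n_iter):
        Z = t_svt(X + Y / rho, 1.0 / rho)   # low-rank (TNN) proximal step
        X = Z - Y / rho
        X[mask == 1] = data[mask == 1]      # keep the observed samples
        Y = Y + rho * (X - Z)               # dual update
    return X

# Tiny usage: a tubal-rank-1 tensor with 40% of its entries removed.
rng = np.random.default_rng(seed=0)
truth = np.outer(rng.standard_normal(30), rng.standard_normal(30))[:, :, None] * \
        rng.standard_normal(10)[None, None, :]
mask = (rng.random(truth.shape) > 0.4).astype(float)
recon = tnn_admm_complete(truth, mask)
print(np.linalg.norm(recon - truth) / np.linalg.norm(truth))
```

Third, for the ERT item: apparent resistivity from a four-electrode surface measurement uses the textbook geometric factor K = 2π / (1/AM − 1/BM − 1/AN + 1/BN), with ρ_a = K ΔV / I. The snippet compares K for the dipole-dipole and Wenner-Schlumberger layouts mentioned above; the electrode spacing is an arbitrary value.

```python
import numpy as np

def geometric_factor(A, B, M, N):
    """Geometric factor K for a surface four-electrode array (positions in m):
    rho_apparent = K * (Delta V) / I, K = 2*pi / (1/AM - 1/BM - 1/AN + 1/BN)."""
    AM, BM = abs(M - A), abs(M - B)
    AN, BN = abs(N - A), abs(N - B)
    return 2 * np.pi / (1 / AM - 1 / BM - 1 / AN + 1 / BN)

a = 5.0                                    # electrode spacing (m), arbitrary
for n in range(1, 7):                      # dipole separation factor
    # dipole-dipole: B A ... M N
    k_dd = geometric_factor(A=a, B=0.0, M=a * (1 + n), N=a * (2 + n))
    # Wenner-Schlumberger: A ... M N ... B
    k_ws = geometric_factor(A=0.0, B=a * (2 * n + 1), M=a * n, N=a * (n + 1))
    print(f"n={n}: K_dipole-dipole={k_dd:9.1f}  K_Wenner-Schlumberger={k_ws:9.1f}")
# A larger K means a weaker voltage signal for the same injected current, so
# array choice trades signal level against depth sensitivity and, as noted in
# the abstract above, acquisition time.
```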