Mud slowness is an important parameter controlling wave propagation in a fluid-filled borehole. However, there is no direct measurement of this quantity in the hole at sonic frequencies. The most common approaches are to measure the slowness of a mud sample at the surface or to use empirical laws relating mud slowness to mud density and type. However, surface measurements do not reproduce real well conditions, leading to erroneous mud slowness estimates, while empirical equations tend to provide reasonable results but can yield wrong estimates if their assumptions are not valid or the uncertainties in some parameters are too large. An alternative approach is to analyze the dispersive modes contained in the recorded data and derive an estimate of the mud slowness from this analysis. The drawback of this non-automatic method is that it relies heavily on the skill of the person who analyzes the dispersion plots. To overcome these limitations, a probabilistic approach combining the high-frequency monopole data and outputs from the monopole radial profiling has been developed to estimate the mud slowness. In addition, this algorithm provides a procedure based on the Scholte wave slowness to automatically set the parameters (center and standard deviation) of the a priori probability distribution function of the mud slowness, a point that is usually critical when using a Bayesian approach. The robustness and efficiency of this algorithm are illustrated on real field data.
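To make the idea concrete, the following is a minimal grid-based sketch of such a Bayesian estimate, with the prior centered on the Scholte-wave slowness; the function name, the 5% prior spread, and the precomputed dispersion misfit are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def estimate_mud_slowness(scholte_slowness, candidate_slowness, data_misfit, sigma_data=1.0):
    """Grid-based posterior for mud slowness (illustrative sketch).

    scholte_slowness : float
        Scholte-wave slowness used to center the Gaussian prior (us/ft).
    candidate_slowness : (N,) array
        Trial mud-slowness values (us/ft).
    data_misfit : (N,) array
        Misfit between modelled and observed high-frequency monopole
        dispersion for each trial slowness (assumed precomputed).
    """
    # Prior center and spread set automatically from the Scholte slowness;
    # the 5% spread is an illustrative choice, not a published value.
    mu_prior = scholte_slowness
    sigma_prior = 0.05 * scholte_slowness

    log_prior = -0.5 * ((candidate_slowness - mu_prior) / sigma_prior) ** 2
    log_like = -0.5 * (data_misfit / sigma_data) ** 2
    log_post = log_prior + log_like
    post = np.exp(log_post - log_post.max())
    post /= post.sum()

    # Posterior mean and standard deviation of the mud slowness.
    mean = np.sum(candidate_slowness * post)
    std = np.sqrt(np.sum((candidate_slowness - mean) ** 2 * post))
    return mean, std
```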
We have considered the problem of using microseismic data to characterize the flow of injected fluid during hydraulic fracturing. We have developed a simple probabilistic physical model that directly ties the fluid pressure in the subsurface during the injection to observations of induced microseismicity. This tractable model includes key physical parameters that affect fluid pressure, rock failure, and seismic wave propagation. It is also amenable to a rigorous uncertainty quantification analysis of the forward model and the inversion. We have used this probabilistic rock failure model to invert for fluid pressure during injection from synthetically generated microseismicity and to quantify the uncertainty of this inversion. The results of our analysis can be used to assess the effectiveness of microseismic monitoring in a given experiment and even to suggest ways to improve the quality and value of monitoring.
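As an illustration of the kind of model meant here, the sketch below maps pore pressure to a failure probability through an effective-stress Coulomb criterion with a randomized rock strength; the names and parameter values are assumptions for illustration and do not reproduce the authors' specific formulation.

```python
import numpy as np
from scipy.stats import norm

def failure_probability(pore_pressure, shear_stress, normal_stress,
                        friction=0.6, cohesion=0.0, sigma_strength=1.0e6):
    """Probability that a point fails given the local pore pressure (Pa).

    Uses an effective-stress Coulomb criterion; a Gaussian spread on rock
    strength converts the deterministic criterion into a probability,
    which can then drive the expected rate of induced microseismicity.
    """
    coulomb = shear_stress - friction * (normal_stress - pore_pressure) - cohesion
    # Probability that the (random) strength perturbation is exceeded.
    return norm.cdf(coulomb / sigma_strength)
```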
We investigate the applicability of an array-conditioned deconvolution technique, developed for analysing borehole seismic exploration data, to teleseismic receiver functions and to data pre-processing steps for scattered-wavefield imaging. This multichannel deconvolution technique constructs an approximate inverse filter to the estimated source signature by solving an overdetermined set of deconvolution equations, using an array of receivers detecting a common source. We find that this technique improves the efficiency and automation of the receiver function calculation and data pre-processing workflow. We apply this technique to synthetic experiments and to teleseismic data recorded by a dense array in northern Canada. Our results show that this optimal deconvolution automatically determines and subsequently attenuates the noise in the data, enhancing P-to-S converted phases in seismograms with various noise levels. In this context, the array-conditioned deconvolution presents a new, effective and automatic means of processing large amounts of array data, as it does not require any ad hoc regularization; the regularization is achieved naturally by using the noise present in the array itself.
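A minimal frequency-domain sketch of the idea is given below: the approximate inverse filter divides the conjugate of the estimated source spectrum by the total power summed over the array, so the noise recorded across the array regularizes the deconvolution; the simple stack used as the source estimate and all names are illustrative assumptions, not the exact published workflow.

```python
import numpy as np

def array_conditioned_deconvolution(traces, eps=1e-12):
    """traces : (n_receivers, n_samples) array of aligned records of a common source.

    Returns the deconvolved traces. The denominator uses the total power
    summed over the array, so noise in the array itself regularizes the
    inverse filter without any ad hoc damping parameter.
    """
    D = np.fft.rfft(traces, axis=1)            # spectra of all receivers
    S = D.mean(axis=0)                         # source estimate: simple array stack
    power = np.sum(np.abs(D) ** 2, axis=0)     # array power (signal + noise)
    F = np.conj(S) / (power + eps)             # approximate inverse filter
    return np.fft.irfft(F[None, :] * D, n=traces.shape[1], axis=1)
```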
Estimation of the elastic properties of the crust from surface seismic recordings is of great importance for understanding lithology and for detecting mineral resources. Although in marine reflection experiments only P-waves are recorded, information on the shear properties of the medium is contained in multioffset reflection seismograms. Being able to retrieve both dilatational and shear properties gives stronger constraints on the lithology. It is therefore desirable to recover isotropic elastic parameters from multioffset seismograms. Unfortunately, most classical waveform-fitting methods used for extracting shear properties of the subsurface are based on a 1-D earth model assumption and on linear approximations of the wave equations. In this paper, a 2.5-D elastic waveform inversion method is used to extract the variations of acoustic impedance and Poisson's ratio from marine multioffset reflection seismograms collected in the Gulf of Mexico area. A complete seismic profile is interpreted, including complex physical phenomena apparent in the data, such as unconsolidated-sediment reflections and seismic refraction events. When minimizing the least-absolute-value misfit between observed and synthetic seismograms, the amplitude of the reflections cannot be explained by a single parameter related to the dilatational properties (P-impedance) alone. When an additional parameter related to shear properties (Poisson's ratio) is added, the fit between observed and synthetic seismograms improves. The resulting 2-D models of P-impedance and Poisson's ratio contrasts are anticorrelated almost everywhere in depth, except where hydrocarbons are present. The estimation of physical P-impedance and Poisson's ratio models by full waveform fitting allows lithology characterization and, therefore, the delineation of a shale-over-gas-sand reservoir.
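For reference, the two inverted parameter classes relate to the standard elastic parameters through the textbook conversion sketched below, shown only to make the parameterization concrete; the function name is illustrative.

```python
import numpy as np

def impedance_and_poisson(vp, vs, rho):
    """Convert velocities and density to P-impedance and Poisson's ratio.

    vp, vs in m/s, rho in kg/m^3 (scalars or numpy arrays).
    """
    ip = rho * vp                                           # P-impedance
    nu = (vp**2 - 2.0 * vs**2) / (2.0 * (vp**2 - vs**2))    # Poisson's ratio
    return ip, nu
```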
The locations of seismic events are used to infer reservoir properties and to guide future production activity, as well as to determine and understand the stress field. Thus, locating seismic events with uncertainty quantification remains an important problem. Using Bayesian analysis, a joint probability density function of all event locations was constructed from prior information about picking errors in kinematic data and explicitly quantified velocity model uncertainty. Simultaneous location of all seismic events captured the absolute event locations and the relative locations of some events with respect to others, along with their associated uncertainties. We found that the influence of an uncertain velocity model on location uncertainty under many realistic scenarios can be significantly reduced by jointly locating events. Many quantities of interest that are estimated from multiple event locations, such as fault sizes and fracture spacing or orientation, can be better estimated in practice using the proposed approach.
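A minimal sketch of a joint posterior of this kind is shown below for two events, a 1-D receiver line, and a single shared uncertain slowness; the geometry, names and prior values are illustrative assumptions, but they show why a common velocity error affects relative locations less than absolute ones.

```python
import numpy as np

def joint_log_posterior(x1, x2, slowness, picks1, picks2, receivers,
                        sigma_pick, slowness_prior=(0.25e-3, 0.02e-3)):
    """Joint log-posterior of two event positions and a shared slowness.

    x1, x2 : trial event positions (m); slowness : trial slowness (s/m).
    picks1, picks2 : observed arrival times minus (known) origin times at
    each receiver (s); receivers : receiver positions along the line (m).
    """
    t1 = slowness * np.abs(receivers - x1)   # predicted traveltimes, event 1
    t2 = slowness * np.abs(receivers - x2)   # predicted traveltimes, event 2
    # A shared slowness error shifts t1 and t2 together, so the distance
    # between the events is better resolved than each absolute location.
    log_like = -0.5 * np.sum(((picks1 - t1) ** 2 + (picks2 - t2) ** 2) / sigma_pick ** 2)
    mu_s, sig_s = slowness_prior
    log_prior = -0.5 * ((slowness - mu_s) / sig_s) ** 2
    return log_like + log_prior
```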
Sedimentary formations are essentially anisotropic due to their ubiquitous stratified structure. This anisotropy seriously complicates formation imaging and data acquisition, most saliently for deep-water subsalt reservoirs. Traditionally, point scatterers with isotropic radiation patterns are used in migration imaging, but in the survey design problem these might lead to design errors caused by receivers being placed in poor locations with respect to the radiation pattern of the scattering structure. Here, we extend a framework that accounts for anisotropy in the scattered radiation for optimal geophysical survey design purposes. The propagation medium is assumed to be attenuative. The locally dipping interfaces are modeled as a discrete set of finite-size planar scattering elements. The general elastodynamic expressions for the sensitivity kernels, i.e., the vectors that mathematically represent the candidate observations, in the presence of the scattering elements are provided. The size of each element controls the width of its radiation pattern, which may in turn be used to characterize the uncertainty on the dip angle, thus complementing the information provided by the model-parameter uncertainties and ultimately leading to better geophysical survey designs.
We describe a Bayesian methodology for designing seismic experiments that optimally maximize model-parameter resolution for imaging purposes. The proposed optimal experiment design algorithm finds the measurements that are likely to optimally reduce the expected uncertainty on the model parameters. This Bayesian D-optimality-based algorithm minimizes the volume of the expected confidence ellipsoid and leads to the maximization of the expected resolution of the model parameters. Computational efficiency is achieved by a greedy algorithm in which the design is sequentially improved. In contrast to minimizing the uncertainty volume over the entire subsurface simultaneously, a refinement of the algorithm minimizes the marginal uncertainties in a region of interest. Minimizing marginal uncertainties simultaneously accounts for quantitative prior model uncertainties while honoring a qualitative focus on particular regions of interest. The benefits of the proposed method over traditional non-Bayesian ones are demonstrated with several geophysical examples. These include reducing large seismic data volumes for real-time imaging and solving the problem of designing seismic surveys that account for source bandwidth, signal-to-noise ratio, and attenuation.
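A minimal greedy sketch of such a sequential D-optimal design loop is given below for a linearized, Gaussian setting; the log-determinant update via the matrix determinant lemma is standard, while the names and the i.i.d. noise assumption are illustrative, not the authors' implementation.

```python
import numpy as np

def greedy_d_optimal(G, prior_cov, sigma2, n_select):
    """Greedily select n_select rows of G (candidate observations) that most
    reduce the volume (log-determinant) of the posterior covariance.

    G : (n_candidates, n_params) sensitivity matrix; prior_cov : prior
    model covariance; sigma2 : i.i.d. data-noise variance.
    """
    info = np.linalg.inv(prior_cov)            # prior information matrix
    chosen, remaining = [], list(range(G.shape[0]))
    for _ in range(n_select):
        gains = []
        for k in remaining:
            g = G[k][:, None]
            # log-det increase from adding observation k (matrix determinant lemma)
            gains.append(np.log(1.0 + (g.T @ np.linalg.solve(info, g))[0, 0] / sigma2))
        best = remaining[int(np.argmax(gains))]
        g = G[best][:, None]
        info = info + (g @ g.T) / sigma2       # update posterior information
        chosen.append(best)
        remaining.remove(best)
    return chosen
```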
With high-permeability hydrocarbon reservoirs exhausting their potential, developing low-permeability reservoirs is becoming of increasing importance. In order to be produced economically, these reservoirs need to be stimulated to increase their permeability. Hydraulic fracturing is a technique used to do this. A mixture of water, additives, and proppants is injected under high pressure into the subsurface; this fluid fractures the rock, creating additional pathways for the oil or gas. Understanding the nature of the resulting fracture system, including the geometry, size, and orientation of individual fractures, as well as the distance from one fracture to the next, is key to answering important practical questions such as: What is the affected reservoir volume? Where should we fracture next? What are the optimal locations for future production wells?
We show how interferometric methods can be used to improve the location of microseismic events when those events come from several different fractures and are observed from a single well. This is the standard setup for a multi-stage hydraulic fracturing experiment. Traditionally, in such experiments each event is located separately. Here, we adapt the interferometric approach to the problem of locating events relative to one another and show that this reduces the uncertainty in location estimates. Completely recovering the Green's function between two events with interferometry requires a 2-D array of receivers. When only a single observation well is available, we do not attempt to recover the full Green's function, but instead perform a partial redatuming of the data that allows us to reduce the uncertainty in two of the three components of the event location.
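As a minimal illustration of the interferometric ingredient, the sketch below cross-correlates the records of two events at a common receiver array to obtain inter-event delay estimates; the names and conventions are illustrative, and this is not the full partial-redatuming procedure.

```python
import numpy as np

def interevent_delays(traces_a, traces_b, dt):
    """Delay between two events' arrivals at each receiver of a common array.

    traces_a, traces_b : (n_receivers, n_samples) records of events A and B;
    dt : sample interval (s). Returns one delay per receiver; a positive
    value means event A's arrival lags event B's at that receiver.
    """
    n = traces_a.shape[1]
    delays = np.empty(traces_a.shape[0])
    for i in range(traces_a.shape[0]):
        xcorr = np.correlate(traces_a[i], traces_b[i], mode="full")
        lag = np.argmax(xcorr) - (n - 1)       # lag of maximum correlation (samples)
        delays[i] = lag * dt
    return delays
```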
A special session was held at the 2014 Society of Exploration Geophysicists (SEG) Annual Meeting in Denver to brainstorm on the topic of what microseismic data can say about fluid flow. The special session was sponsored by the Research Committee of SEG and co-organized by Oleg Poliannikov (Massachusetts Institute of Technology) and Hugues Djikpesse (Schlumberger, now CHRYSOS Technologies L.L.C.). It consisted of eight presentations with a broad range of speakers from academia and industry and was attended by about 150 participants. The findings of this special session are reported here, and possible directions for the future of flow-guided microseismic research are then discussed.