Imaging and inversion
Chris Leader and Biondo Biondi
Separating simultaneously acquired seismic data is the link between more efficient acquisition and conventional imaging techniques. Successful methods for separating these data rely strongly on random source timings and positions; loosening this acquisition restriction would make survey design and implementation more flexible. By performing a series of transformations, it is possible to isolate and remove the overlapping artifacts that are ubiquitous when imaging simultaneously acquired data with constant time delays. Initially, extended reverse time migration is applied, generating a roughly focused image in subsurface offset. The image is transformed to the angle domain, and a hyperbolic Radon transform is then applied, isolating certain events and allowing a separation to be performed as a function of curvature. Basic tests have shown that events become well separated after a few iterations of this transform. The reverse transforms are then applied and the data demigrated, giving the equivalent unblended dataset without requiring accurate velocity control.
Sjoerd de Ridder and Biondo Biondi
We propose a new technique for passive seismic imaging via the direct application of operators to noise recordings. We propose a time-domain 2D scalar wave equation to describe the propagation of surface waves within a narrow frequency range. Outside the source region, this scalar wave equation relates the second-order spatial and second-order temporal derivatives of the wavefield through the local velocity. Unlike seismic interferometry, this technique does not rely on cross-correlations to reveal the statistical coherence of a chaotic wavefield at two locations. Rather, it relies on local measurements of velocity obtained directly from the ratio between temporal and spatial derivatives of the wavefield. The new method allows us to perform passive imaging with much shorter passive recordings. Numerical data examples show that this theory can yield reliable images if the wavefield is sampled sufficiently densely in space and time.
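The core idea, that the local velocity follows from the ratio of temporal to spatial second derivatives outside the source region, can be sketched on a 1D plane wave. All parameters below are illustrative, not taken from the paper's examples:

```python
import numpy as np

# For a plane wave u(x,t) = sin(kx - wt), the scalar wave equation gives
# u_tt = c^2 u_xx, so the local velocity can be read off as
# c = sqrt(u_tt / u_xx) wherever u_xx is nonzero.
c_true = 2000.0            # m/s, assumed medium velocity
k = 2 * np.pi / 100.0      # wavenumber (100 m wavelength)
w = c_true * k             # angular frequency

dx, dt = 1.0, 1e-4
x = np.arange(0.0, 300.0, dx)
t = np.arange(0.0, 0.05, dt)
X, T = np.meshgrid(x, t, indexing="ij")
u = np.sin(k * X - w * T)

# Second-order finite differences in x and t (interior points only)
u_xx = (u[2:, 1:-1] - 2 * u[1:-1, 1:-1] + u[:-2, 1:-1]) / dx**2
u_tt = (u[1:-1, 2:] - 2 * u[1:-1, 1:-1] + u[1:-1, :-2]) / dt**2

# Estimate c only where u_xx is safely away from zero
mask = np.abs(u_xx) > 1e-6
c_est = np.sqrt(np.median(u_tt[mask] / u_xx[mask]))
print(round(float(c_est)))  # 2000
```

The ratio is taken pointwise and summarized with a median; with field noise one would need smoothing or a local fit, but the sketch shows why adequate sampling in both space and time is essential: the finite-difference derivatives must resolve the wavelength and period.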
Musa Maharramov and Biondo Biondi
We propose a formulation of full-waveﬁeld inversion (FWI) as a constrained optimization problem, and describe a computationally efficient technique for solving constrained full-waveﬁeld inversion (CFWI). The technique is based on using a total-variation regularization method, with the regularization weighted in favor of constraining deeper subsurface model sections. The method helps to promote “edge-preserving” blocky model inversion where ﬁtting the seismic data alone fails to adequately constrain the model. The method is demonstrated on synthetic datasets with added noise, and is shown to enhance the sharpness of the inverted model and correctly reposition mispositioned reﬂectors by better constraining the velocity model at depth.
Guillaume Barnier and Ali Almomin
This paper is a tutorial on linearized two-way wave equation modeling and inversion operators. We provide a detailed derivation for the special case of an acoustic, isotropic, constant-density medium. We analyze the Born, tomographic, and WEMVA forward modeling operators, their adjoints, and we extend the analysis to the subsurface offset domain.
Bob is right (per usual) (pdf)
Stewart A. Levin
Bob who? Bob Clapp. About what? Table lookups. When reusable calculations are expensive, it can be advantageous to precalculate results over relevant parameter ranges and subsequently use table lookups to speed up programs that use such calculations repeatedly. Classically, sin/cos tables for fast Fourier transforms have been used to good effect. During recent development of a 3D downward continuation code, I found a range covering over an order of magnitude in the speed of sin/cos computations, with only nearest-neighbor table-lookup approximation outperforming Intel's optimized vector routines.
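The nearest-neighbor lookup idea can be sketched as follows; the table size and test range are illustrative, not those of the downward-continuation code:

```python
import numpy as np

# Precompute one period of sine at N evenly spaced phases.
N = 1 << 16                     # table resolution (illustrative)
table = np.sin(2 * np.pi * np.arange(N) / N)

def fast_sin(theta):
    """Nearest-neighbor lookup: round the phase to the closest table entry."""
    idx = np.round(theta * (N / (2 * np.pi))).astype(np.int64) % N
    return table[idx]

theta = np.linspace(0.0, 10.0, 1000)
err = np.max(np.abs(fast_sin(theta) - np.sin(theta)))
# Worst-case error is about pi/N: half a table step, times |sin'| <= 1.
print(err < np.pi / N)  # True
```

The trade-off is a round and an index instead of a polynomial evaluation; interpolated lookups would be more accurate but add arithmetic, which is consistent with the observation that only the nearest-neighbor variant beat the vectorized library routines.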
I present an overview of a 2D ocean bottom node survey acquired in the North Sea and made available to the Stanford Exploration Project (SEP) by Seabed Geosolutions. I describe the arrangement of the nodes, followed by the shot arrays and the data sets. The main features of this data set are four-component data with offsets up to 100 kilometers, trace lengths up to 40 seconds, and three different node arrangements: regular spread, long offset, and microspread.
Sjoerd de Ridder
The Moere Vest Ocean Bottom Node Survey was acquired for the purpose of exploration with a controlled seismic source. However, the nodes recorded continuously. This paper explores the use of the ambient seismic field for the purpose of creating low-frequency virtual sources by seismic interferometry. I find that this OBN dataset contains abundant energy at microseism frequencies that yields virtual seismic sources by cross-correlation. The cross-correlation gathers contain dispersive interface waves and events with a hyperbolic moveout.
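A minimal sketch of the cross-correlation step that turns one node into a virtual source, using synthetic noise with an assumed inter-node delay; the geometry and sampling here are illustrative, not those of the survey:

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.004      # sampling interval, s (illustrative)
nt = 4000       # number of samples
lag = 75        # assumed propagation delay between two nodes, in samples

# Both nodes record the same far-field ambient noise; node B sees it with
# a fixed travel-time delay relative to node A.
noise = rng.standard_normal(nt + lag)
rec_a = noise[lag:]        # node A
rec_b = noise[:nt]         # node B, a delayed copy of A

# Cross-correlating the two recordings concentrates the coherent energy
# at the inter-node travel time: node A acts as a virtual source at B.
xcorr = np.correlate(rec_b, rec_a, mode="full")
peak = np.argmax(xcorr) - (nt - 1)
print(round(peak * dt, 3))  # 0.3, i.e. the assumed 75-sample delay
```

Real microseism noise is band-limited and directional rather than white, so the correlation peak is a wavelet rather than a spike, but the mechanics of extracting a virtual-source travel time are the same.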
SEG 2014 benchmark data (pdf)
Ali Almomin, Xukai Shen, Carlo Fortini, Guillaume Barnier and Biondo Biondi
A synthetic benchmark dataset was created by Chevron for the SEG 2014 workshop. We first give an overview of the data and the potential challenges that it presents for velocity estimation methods. Then, we process the data by reducing the noise and the sea-bed surface-related multiples. Finally, we estimate the P-wave velocity model with a sequence of full waveform inversion methods that first analyze the early arrivals and then the reflection data. The angle-domain common-image gathers of the final velocity model show significant improvement compared to the initial velocity model and indicate that most of the kinematics of the data were successfully estimated.
Signal processing and L1/L2 optimization
Of three methods for dealing with nonstationary signals, the most appealing is to interpolate filters from a coarse mesh. While operators conveniently carry scattered data values to a regular mesh, scattered data signals invite techniques more akin to matrix inversion.
Iterative migration using sparseness constraints with the hyperbolic penalty function: Application on two 2-D ﬁeld datasets (pdf)
Mandy Wong and Antoine Guitton
Sparse reflectivities are obtained with iterative migration thanks to the hyperbolic penalty function for both the data-fitting and model-styling goals. Sparseness is achieved by letting parts of the model be treated in an "l1-norm" sense. Compared with a more classical least-squares approach without regularization, the sparse-reflectivity images have fewer artifacts and better-defined reflectors. However, these reflectors are often less continuous, and the parameterization of the hyperbolic penalty function remains cumbersome. The main advantages of the hyperbolic penalty function, as opposed to other "l1-type" norms, are that it is convex, can behave like the l1 or l2 norm as needed, and can be minimized very efficiently with a fast nonlinear conjugate-direction method.
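The limiting behavior of the hyperbolic penalty can be verified directly; the function below is the standard form, while the epsilon value is an illustrative tuning parameter:

```python
import numpy as np

def hyperbolic(r, eps):
    """Hyperbolic penalty h(r) = sqrt(r^2 + eps^2) - eps: convex,
    ~ r^2 / (2*eps) for |r| << eps (l2-like, smooth at zero),
    ~ |r|           for |r| >> eps (l1-like, robust to outliers)."""
    return np.sqrt(r**2 + eps**2) - eps

eps = 1.0  # illustrative transition scale between l2 and l1 behavior
print(np.isclose(hyperbolic(0.01, eps), 0.01**2 / (2 * eps), rtol=1e-3))  # True
print(np.isclose(hyperbolic(1e4, eps), 1e4, rtol=1e-3))                   # True
```

Its gradient, r / sqrt(r^2 + eps^2), is bounded and smooth everywhere, which is what makes minimization with a nonlinear conjugate-direction method well behaved compared with a true l1 norm; the price is the extra eps parameter noted as cumbersome above.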
Application of the up-down separation using PZ calibration ﬁlter based on critically refracted waves (pdf)
Ettore Biondi and Stewart A. Levin
On marine multicomponent data we may apply PZ summation to separate the up-going and down-going waveﬁelds. We show an example of this procedure for an ocean bottom nodal (OBN) dataset, where we perform an acoustic decomposition that gives the up- and down-going pressure ﬁelds in the water. To design the necessary calibration ﬁlters, we target long-offset refracted waves that contain purely up-going energy. After the acoustic separation, we adaptively subtract the down-going waveﬁeld from the original pressure component. The quality of the separation achieved demonstrates that the calibration ﬁlters obtained from the long-offset waveforms were effective in decomposing almost all the events present in each gather.
Yinbin Ma, Musa Maharramov, and Biondo Biondi
Hyperbolic penalty function methods and hybrid L1/L2 methods often generate better results than conventional least-squares solutions for inverse problems in geophysics. We apply several hyperbolic and hybrid L1/L2 methods to 2-D Kirchhoff migration inversion and target-oriented linearized waveform inversion. The results demonstrate that we can recover a sparse/blocky model using hyperbolic and hybrid L1/L2 methods at acceptable computational cost.
Modeling and anisotropy
Robert G. Clapp
Generating a realistic synthetic model is a challenging problem in a geophysical research environment. Achieving the right balance between a model complex enough to be realistic yet simple enough that a new algorithm can be debugged is difficult. I propose a different way to generate synthetic models, by allowing the user to specify a series of geologic events. The result of each event is approximated on the current model. This approach allows complex models to be built, is easily extendable to multiple model parameters, and lets the user "turn off" events, allowing the construction of simpler models by stages. A geologic event-based modeling strategy proves useful for building models ranging from simple to quite complex.
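A toy version of the event-based idea: each "geologic event" is a function mapping the current model to a new one, and events can be removed from the list to build simpler models. The event names and parameters below are hypothetical illustrations, not SEP code:

```python
import numpy as np

def deposit(m, velocity, thickness):
    """Event: add a flat layer of given velocity on top of the model."""
    layer = np.full((thickness, m.shape[1]), velocity)
    return np.vstack([layer, m])

def fault(m, throw):
    """Event: vertically shift the right half of the model by `throw` samples."""
    out = m.copy()
    half = m.shape[1] // 2
    out[:, half:] = np.roll(m[:, half:], throw, axis=0)
    return out

# The model is the running result of applying each event in order.
events = [
    lambda m: deposit(m, 2000.0, 10),
    lambda m: deposit(m, 2500.0, 10),
    lambda m: fault(m, 3),          # "turn off" by deleting this entry
]

model = np.full((5, 40), 3000.0)    # basement (depth x lateral samples)
for event in events:
    model = event(model)
print(model.shape)  # (25, 40): basement plus two deposited layers
```

Because each event only sees the current model, the same event list extends naturally to multiple parameter grids (e.g. velocity and density updated side by side), which is the extensibility claimed above.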
I implement a free-surface boundary condition for the generation of surface waves using a finite-difference staggered-grid scheme that is 10th order in space and 2nd order in time. I show an example of a field seismic section and recreate its main features using the proposed scheme. The synthetic data show Rayleigh waves, backscattered waves, and mode conversions, and fit the kinematics of the field data.
Ohad Barak, Robert Brune, Paul Milligan, and Shuki Ronen
In the Moere Vest acquisition a group of ocean-bottom nodes were deployed with a nominal spacing of two meters. We preprocessed the data of one shot line that traverses directly above these nodes. We then generated rotational and pressure-gradient data by differencing the geophone and hydrophone data of the adjacent nodes. We discuss the possibility of reverse-time propagation of such multicomponent seismic data.
Pseudo-acoustic vertical transverse isotropic migration velocity analysis using two-way waveﬁeld propagation (pdf)
Wave-Equation Migration Velocity Analysis (WEMVA) is widely used as a tool to reconstruct a model of the subsurface such that certain features of the migrated image are satisfied. I show an anisotropic WEMVA workflow based on the Vertical Transverse Isotropic (VTI) approximation for the velocity model and a pseudo-acoustic anisotropic two-way wave-equation modeling engine. I derive the theory of WEMVA starting from the gradients of anisotropic Full Waveform Inversion (FWI), which provide the input images for the velocity analysis. In doing so, I introduce the concept of generalized images, which defines the FWI gradients computed with respect to the anisotropic parameters as different images of the subsurface. The results of some preliminary tests on the use of the generalized images as input for WEMVA suggest that this approach could help improve the accuracy and rate of convergence of WEMVA.
Musa Maharramov and Biondo Biondi
We propose a multi-model formulation of full-waveform inversion that is similar to the decomposition of an image into a "cartoon" and a "texture" used in image processing. The inversion problem is formulated as an unconstrained multi-norm optimization that can be solved using conventional iterative solvers. We demonstrate the proposed model-decomposition approach by recovering a blocky subsurface seismic model from noisy data in time-lapse and single-model full-waveform inversion problems.
Reformulating TFWI (pdf)
Tomographic full waveform inversion (TFWI) provides a robust but expensive method to invert seismic data. Scale separation of the model greatly reduces the cost but adds complexity to both the theory and the implementation of the inversion. In this paper, I provide two approaches that reduce the complexity of TFWI. First, I rederive TFWI with one model only, in an abstract formulation that is applicable to any form of the wave equation. Then, I provide a new approximation to the inversion that can potentially provide more accurate results.
Robust joint full-waveform inversion of time-lapse seismic datasets with total-variation regularization (pdf)
Musa Maharramov and Biondo Biondi
We present a technique for reconstructing subsurface velocity model changes from time-lapse seismic data using full-waveform inversion (FWI). The technique is based on simultaneously inverting multiple survey vintages, with model-difference regularization using the total variation (TV) seminorm. We compare the new TV-regularized time-lapse FWI with the L2-regularized joint inversion proposed in our earlier work, using synthetic datasets that exhibit survey repeatability challenges. The results demonstrate clear advantages of the proposed TV-regularized joint inversion over alternative methods for recovering production-induced model changes that are due to both fluid substitution and geomechanical effects.
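The TV seminorm used for the model-difference regularization can be sketched on a toy 2-D model difference; the discretization and the anomaly below are illustrative:

```python
import numpy as np

def tv_seminorm(m):
    """Isotropic TV of a 2D model: sum over the grid of the gradient
    magnitude sqrt(dx^2 + dz^2), using first differences."""
    dx = np.diff(m, axis=0)[:, :-1]
    dz = np.diff(m, axis=1)[:-1, :]
    return np.sum(np.sqrt(dx**2 + dz**2))

n = 64
# A blocky, production-like model change: zero except in one compartment.
blocky = np.zeros((n, n))
blocky[20:40, 20:40] = 1.0
# The same change contaminated by oscillatory noise (repeatability error).
noisy = blocky + 0.1 * np.random.default_rng(1).standard_normal((n, n))

# TV charges only for the compartment's perimeter, so the blocky
# difference is far cheaper than the noisy one.
print(tv_seminorm(blocky) < tv_seminorm(noisy))  # True
```

This is why TV suits time-lapse differences: an L2 penalty on the model difference smears sharp production-induced edges, whereas TV penalizes oscillation while leaving piecewise-constant changes essentially untouched.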
Taylor Dahlke, Biondo Biondi and Robert Clapp
Level set methods can provide a sharp interpretation of the salt body by defining the boundary as an isocontour of a higher-dimensional implicit representation, and then evolving that surface to minimize the Full Waveform Inversion (FWI) objective function. Because the implicit surface update gradient is based on the tomographic update gradient, there is potential to use it to update the background velocity concurrently with the salt boundary. Using a shape-optimization approach on synthetic examples, we achieve reasonable convergence both in the residual L2 norm and in the evolution of the salt boundary and background velocity toward the true model, demonstrating the feasibility of this approach. Various factors in processing the gradients and calculating the step size influence this convergence, which we analyze and address. Ultimately, this method can be integrated into the processing workflow as a tool that improves the building and refining of the velocity models used for imaging.
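The implicit representation itself can be sketched in a few lines: the salt boundary is the zero isocontour of a function phi, and an evolution step moves the boundary by updating phi. The uniform update below is an illustrative stand-in for the tomographic gradient that drives the real evolution:

```python
import numpy as np

n = 100
x = np.linspace(-1.0, 1.0, n)
X, Z = np.meshgrid(x, x, indexing="ij")

# Signed distance to a circle of radius 0.5: phi < 0 inside (salt),
# phi > 0 outside (sediment), phi = 0 on the boundary.
phi = np.sqrt(X**2 + Z**2) - 0.5
area_before = int((phi < 0).sum())

# One evolution step: adding a positive field to phi moves the zero
# contour inward, shrinking the salt body without any explicit
# bookkeeping of boundary points or topology.
step = 0.1
phi_new = phi + step
area_after = int((phi_new < 0).sum())
print(area_after < area_before)  # True: the boundary moved inward
```

The appeal for FWI is exactly this implicitness: the boundary can split, merge, or disappear as phi evolves, while the velocity model used for wavefield simulation is recovered at any iteration simply by thresholding phi into salt and background values.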