G009 Multi-dimensional Coherency Driven Denoising of Irregular Data

G. Poole* (CGGVeritas Services (UK) Ltd)

73rd EAGE Conference & Exhibition incorporating SPE EUROPEC 2011, Vienna, Austria, 23-26 May 2011

SUMMARY

Many land and ocean bottom datasets suffer from high levels of noise which make the task of processing and interpretation difficult. With legacy land data, high noise levels are generally due to low CMP fold. High fold modern acquisition can also be noisy due to poor geophone coupling, ground or mud roll, or because single sensors rather than arrays are used. As these data often exhibit irregular sampling, denoising them can be difficult because the majority of random noise attenuation algorithms require regularly sampled data. We introduce a semblance driven denoising algorithm in the high resolution tau-p domain that offers strong denoising capabilities and works directly with irregularly sampled data. The algorithm can be applied in all five recording dimensions (inline, crossline, offset-x, offset-y, time) to avoid working on subsets of data, which increases the ability to uncover weak signals from below high levels of noise. Application of the algorithm to irregularly sampled synthetic and real datasets demonstrates the power of the method, greatly reducing the noise content whilst accurately preserving the signal.


Introduction

Many land and ocean bottom datasets suffer from high levels of noise which make the task of processing and interpretation difficult. For low fold datasets, regularization algorithms working simultaneously across all five recording axes (inline, crossline, offset-x, offset-y and time) have been shown to increase the sampling density, thus improving the signal-to-noise ratio of the stack section (Trad, 2009; Poole, 2010). These methods often succeed where lower dimensional algorithms fail because 2D and 3D algorithms only work on a subset of the data. By working in 5D it is possible to uncover weak signal hidden below high amplitude noise. The simultaneous use of all recording directions avoids processing across discontinuities in the data, for example jitter within an offset volume relating to variations in offset and azimuth. Modern single sensor high fold datasets can also exhibit high noise levels due to poor coupling and ground or mud roll. For such datasets it can be more pragmatic to reduce the noise level than to interpolate even more densely.

Denoising algorithms are generally split into two categories: those designed to remove random noise and those designed to remove coherent noise. The removal of random noise normally relies on the fact that, while signal is predictable, incoherent noise is not. This principle is the basis for fx prediction filtering (Canales, 1984), fx projection filtering (Soubaras, 1994) and many coherency driven techniques (for example Gulunay et al., 2007). Other denoising algorithms attempt to mitigate coherent noise by characteristics that distinguish it from primary energy. For example, Radon demultiple exploits the distinction that, on normal moveout corrected CMP gathers, primary energy is flat while multiple energy curves downwards (Hampson, 1986). Other coherent energy can be distinguished through modelling and subtraction (Le Meur et al., 2008).
Random noise attenuation algorithms that require regularly sampled data force irregular datasets to be regularized prior to denoising. The simplest way of achieving this is flex binning, which duplicates traces from neighbouring bins to fill holes in coverage. While this method ensures one trace per bin, the flex-binned traces will often not be a good representation of what would have been recorded in those bins, particularly for data with significant dip. In addition, jitter can be apparent in the data due to irregular sampling within the bins. The application of traditional methods (such as fx prediction filtering) in such circumstances will be sub-optimal, as the irregularity of the sampling makes the primary energy disjointed; the primary energy will therefore be smeared and detail will be lost. Just as with data regularization, the success of denoising techniques can be greatly improved by applying them in 5D. Tau-p based coherency enhancement can be extended to work in 5D, using the inline, crossline, offset-x and offset-y directions simultaneously to enhance the signal. In this paper we introduce a method based on coherency enhancement for the suppression of random noise on irregularly sampled data.
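To make the flex binning limitation concrete, a minimal one-dimensional sketch is given below; the function name, interface and parameters are illustrative rather than taken from any processing system.

```python
import numpy as np

def flex_bin(traces, midpoints, bin_centres, max_reach):
    """Fill empty bins by duplicating the nearest recorded trace.

    traces      : (ntrace, nt) array of recorded data
    midpoints   : (ntrace,) midpoint coordinate of each trace (1D for brevity)
    bin_centres : (nbin,) regular bin-centre coordinates
    max_reach   : furthest distance a trace may be borrowed from
    Returns an (nbin, nt) array; bins with no trace within max_reach stay zero.
    """
    nt = traces.shape[1]
    out = np.zeros((len(bin_centres), nt))
    for i, centre in enumerate(bin_centres):
        dist = np.abs(midpoints - centre)
        nearest = np.argmin(dist)
        if dist[nearest] <= max_reach:   # borrow the closest recorded trace
            out[i] = traces[nearest]
    return out
```

For a flat event the borrowed trace is an adequate stand-in, but for a dipping event it carries the wrong arrival time for its new bin, which is exactly the jitter and smearing described above.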

Algorithm

The first step of the algorithm transforms the irregular input data into the slant stack domain. For the algorithm to be amplitude preserving and to model energy beyond aliasing, it is essential to use a high resolution transform; either high resolution Radon transforms (Herrmann et al., 2000) or the slant stack equivalent of the anti-leakage Fourier transform (Xu et al., 2005; Ng and Perz, 2004) fulfil this requirement. The next step distinguishes regions of noise from regions of signal. Noise regions are scaled down, and finally the data are transformed back into the x-t domain.

Scaling in the tau-p domain is similar to applying fx prediction filtering to the data, as the response of prediction filters in the FK domain is strong for the main signal energy (which is predictable) and weak in noise areas. This property extends to the tau-p domain, as utilised by the pyramid transform (Hung et al., 2004). One strength of this method over fx prediction/projection methods is that it can be applied to irregular data. The algorithm can output the data either on the original irregular coordinates or on other specified coordinates. This allows the dataset to be denoised, regularized, or mapped on to the coordinates of a secondary dataset; for example, another vintage of a time-lapse study. The proposed method also offers more flexibility than traditional methods in controlling the level of denoising. The following examples demonstrate the power of the proposed method on synthetic and real datasets.
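The flow described above can be sketched in two dimensions (t, x). As a stand-in for the high resolution transform, the sketch below uses a damped least-squares linear Radon computed per frequency directly on the irregular x coordinates, and a simple amplitude threshold in the tau-p panel stands in for the semblance driven scaling; all names, parameters and design choices here are illustrative assumptions, not details from the paper.

```python
import numpy as np

def taup_denoise(d, x, t, slowness, damp=1e-2, keep=0.2):
    """Sketch of coherency driven denoising on irregular traces.

    d        : (nt, nx) data, one trace per irregular coordinate in x
    x        : (nx,) irregular trace coordinates (metres)
    t        : (nt,) regular time axis (seconds)
    slowness : (np_,) trial slownesses p (s/m)
    keep     : fraction of the peak tau-p amplitude below which
               coefficients are treated as noise and zeroed
    """
    nt, nx = d.shape
    dt = t[1] - t[0]
    freqs = np.fft.rfftfreq(nt, dt)
    D = np.fft.rfft(d, axis=0)                               # (nf, nx)
    np_ = len(slowness)
    M = np.zeros((len(freqs), np_), dtype=complex)
    for i, f in enumerate(freqs):
        # Radon basis evaluated at the irregular x positions
        A = np.exp(-2j * np.pi * f * np.outer(x, slowness))  # (nx, np_)
        AhA = A.conj().T @ A
        reg = damp * nx * np.eye(np_)        # mild damping (diag of AhA is nx)
        M[i] = np.linalg.solve(AhA + reg, A.conj().T @ D[i])
    m = np.fft.irfft(M, n=nt, axis=0)        # tau-p panel
    # crude amplitude mask standing in for the semblance driven scaling
    mask = np.abs(m) >= keep * np.abs(m).max()
    M2 = np.fft.rfft(m * mask, axis=0)
    for i, f in enumerate(freqs):
        A = np.exp(-2j * np.pi * f * np.outer(x, slowness))
        D[i] = A @ M2[i]                     # forward model back to irregular x
    return np.fft.irfft(D, n=nt, axis=0)
```

Because the forward modelling step evaluates the Radon basis at arbitrary coordinates, the final loop could equally output the denoised data on new, regular coordinates, mirroring the denoise/regularize/map option mentioned in the text.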

Synthetic data example

A synthetic dataset was generated using shot and receiver coordinates from a real land dataset with irregular spacing and poor sampling (~15 fold). The model consisted of a constant velocity medium (2000 m/s) containing a single dipping horizon (30° dip). An inline from the dataset for the offset range 1000 m to 1100 m is shown in Figure 1a, where significant jitter is observed due to holes in coverage and variation in azimuth. After adding random noise to the dataset (Figure 1b), the data were denoised using the algorithm described above in both 3D and 5D. The 3D application used the inline and crossline directions; the output is displayed in Figure 1c, with the difference in Figure 1e. While the algorithm has removed much of the noise, significant signal damage can be observed because the azimuth-related jitter is not modelled by the transform. The results of the 5D application (inline, crossline, offset-x, offset-y, time), along with the difference, are given in Figures 1d and 1f respectively. We observe a similar level of denoising to the 3D application, but with excellent preservation of the primary energy. By operating in 5D, the algorithm can model the variation of reflected energy with all spatial coordinates and preserve the clarity of the event.
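One plausible way to build such a synthetic on real, irregular shot and receiver coordinates is to use Levin's (1971) dip- and azimuth-dependent moveout approximation for a plane dipping reflector, which reproduces the azimuth related jitter discussed above. This choice of formula and all parameter names are assumptions of the sketch, not details given in the paper.

```python
import numpy as np

def dipping_event_traces(src, rec, v=2000.0, dip_deg=30.0, t0_ref=0.4,
                         dt=0.004, nt=256, fdom=25.0):
    """Traces for a single plane reflector dipping in the x direction.

    src, rec : (n, 2) arrays of shot and receiver (x, y) coordinates
    Returns an (nt, n) array with one Ricker arrival per trace.
    """
    theta = np.radians(dip_deg)
    mid = 0.5 * (src + rec)                            # midpoints
    h = 0.5 * np.linalg.norm(rec - src, axis=1)        # half offsets
    az = np.arctan2(rec[:, 1] - src[:, 1], rec[:, 0] - src[:, 0])
    # zero-offset two-way time varies along the dip (x) direction
    t0 = t0_ref + 2.0 * np.sin(theta) * mid[:, 0] / v
    # Levin (1971): NMO velocity depends on dip and source-receiver azimuth
    t_ev = np.sqrt(t0 ** 2 + (2.0 * h / v) ** 2
                   * (1.0 - np.sin(theta) ** 2 * np.cos(az) ** 2))
    t = np.arange(nt) * dt
    arg = (np.pi * fdom * (t[:, None] - t_ev[None, :])) ** 2
    return (1.0 - 2.0 * arg) * np.exp(-arg)            # Ricker wavelet
```

Plotting a narrow offset slice of the output against midpoint reproduces the jitter of Figure 1a: the arrival time jumps from trace to trace because offset and azimuth vary within the slice.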

Figure 1 a) Synthetic data based on coordinates from a real dataset, b) input with added random noise, c) results after 3D denoise, d) results after 5D denoise, e) difference b) – c), f) difference b) – d)

Ocean bottom cable example

This ocean bottom cable (OBC) acquisition utilised 10 cables with 500 m spacing. The receiver stations, each comprising a hydrophone and three geophones, were spaced at 75 m intervals. Shots were fired on a 50 m × 50 m grid with a maximum offset of 5 km. The dataset was processed on a common-offset-vector (COV) grid (150 m inline-offset × 1000 m crossline-offset) with a 25 m × 25 m bin size.

In OBC processing, the upgoing wavefield can be estimated by summing the pressure (P) and vertical geophone (Z) components: multiple energy is suppressed due to the difference in polarity between the P and Z components. One drawback of this process is that the Z component is inherently noisy, which degrades the summed result. Although PZ summation can be modified to reduce the amount of noise contamination (Zabihi Naeini et al., 2011), further denoising is often required. As this dataset is irregularly sampled, it would have to be regularised before the application of traditional methods, and even then the result would be sub-optimal. For this reason coherency enhancement was applied after PZ summation but before imaging.

Figure 2 shows a common midpoint (CMP) gather and stack section from the dataset before and after denoising. The high level of noise is easily seen on the input CMP, but after denoising the signal becomes much more coherent. As the transform fully respects the irregular recording coordinates, the algorithm is ideally suited to irregular data such as this. Although the stack section exhibits a higher signal-to-noise ratio than the pre-stack data, we can also observe a significant improvement in the coherency of events.
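The polarity argument behind PZ summation can be shown schematically. In this sketch a single scalar stands in for the full, typically frequency- and angle-dependent, Z calibration (cf. Zabihi Naeini et al., 2011); the function and its interface are illustrative only.

```python
import numpy as np

def pz_sum(p, z, scalar=1.0):
    """Schematic up/down wavefield separation at the seabed.

    Up-going energy has the same polarity on the hydrophone (P) and
    calibrated vertical geophone (Z), so it adds in P + Z, while the
    down-going receiver-side ghost/multiple has opposite polarity on Z
    and cancels. Noise on Z leaks into both estimates, which is the
    contamination the text describes.
    """
    z_cal = scalar * z          # stand-in for the real Z calibration
    up = 0.5 * (p + z_cal)
    down = 0.5 * (p - z_cal)
    return up, down
```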

Figure 2 Pre-imaging CMP and stack section before (top) and after (bottom) denoising: a) input CMP gather, b) input stack section, c) denoised CMP gather, d) denoised stack section

While the migration process itself cancels a lot of incoherent energy, the advantages of the denoising technique are still apparent. Figure 3 shows a migrated common image point (CIP) gather and stack section before and after denoising. The displays demonstrate an uplift in the coherency of the reflection energy and a suppression of noise. For this dataset the denoising algorithm was particularly useful in aiding velocity model building through reflection tomography.

Conclusions

We have introduced a new coherency driven random noise attenuation method in the high resolution tau-p domain. The method has benefits over traditional denoising algorithms: it offers high flexibility in the level of denoising, works directly with irregular data, and can be applied in up to five dimensions (inline, crossline, offset-x, offset-y and time). The large number of dimensions avoids cascaded applications of lower dimensional algorithms, which only work on small subsets of the available data and are less effective at enhancing very weak signals hidden under high amplitude noise. We have demonstrated the power of the technique on synthetic data as well as a real ocean bottom cable dataset. The resulting pre- and post-stack data exhibit much improved continuity whilst preserving the weak reflected energy.



Figure 3 Post-imaging CIP and stack section before (top) and after (bottom) denoising: a) migrated CIP, b) migrated stack section, c) denoised migrated CIP, d) denoised migrated stack section

Acknowledgements

We would like to thank BP for permission to show the real data example, and CGGVeritas for allowing the submission of this paper. We would also like to thank the CGGVeritas team who processed the data, including Ewan Hillier, Sharon Howe and Sandrine David.

References

Canales, L.L. [1984] Random noise reduction. 54th SEG Annual International Meeting, Expanded Abstracts, 3(1), 525-529.

Gulunay, N., Holden, J. and Connor, J. [2007] Coherency enhancement on 3D seismic data by dip detection and dip selection. 77th SEG Annual International Meeting, Expanded Abstracts.

Hampson, D. [1986] Inverse velocity stacking for multiple elimination. Canadian Journal of Exploration Geophysics, 22, 44-55.

Herrmann, P., Mojesky, T., Magesan, M. and Hugonnet, P. [2000] De-aliased, high-resolution Radon transforms. 70th SEG Annual International Meeting, Expanded Abstracts, 1953-1956.

Hung, B., Notfors, C. and Ronen, S. [2004] Robust prediction filtering using the pyramid transform. 66th EAGE Conference & Exhibition, Expanded Abstracts, Z-99.

Le Meur, D., Benjamin, N., Cole, R. and Al Harthy, M. [2008] Adaptive groundroll filtering. 70th EAGE Conference & Exhibition, Expanded Abstracts.

Ng, M. and Perz, M. [2004] High resolution Radon transform in the t-x domain using "intelligent" prioritization of the Gauss-Seidel estimation sequence. 74th SEG Annual International Meeting, Expanded Abstracts.

Poole, G. [2010] 5D data reconstruction using the anti-leakage Fourier transform. 72nd EAGE Conference & Exhibition, Expanded Abstracts, B046.



Soubaras, R. [1994] Signal-preserving random noise attenuation by the F-X projection. 64th SEG Annual International Meeting, Expanded Abstracts, 13(1), 1576-1579.

Trad, D. [2009] Five-dimensional interpolation: Recovering from acquisition constraints. Geophysics, 74, V123.

Xu, S., Zhang, Y., Pham, D. and Lambare, G. [2005] Anti-leakage Fourier transform for seismic data regularization. Geophysics, 70, 87-95.

Zabihi Naeini, E., Baboulaz, L. and Grion, S. [2011] Enhanced wavefield separation for OBC/OBS data processing. Submitted to the 73rd EAGE Conference & Exhibition, Expanded Abstracts.