Processing consists of the application of a series of computer routines to the acquired data, guided by the hand of the processing geophysicist. Seismic processing facilitates better interpretation because subsurface structures and reflection geometries become more apparent. There is no single "correct" processing sequence for a given volume of data, and practical concerns, such as working with data files (SEG-Y, for example) that are too large to fit in system memory, also shape the workflow. We shall use a 2-D seismic line from the Caspian Sea to demonstrate the basic processing sequence, which is described below to give an overall understanding of each step.

Deconvolution acts along the time axis. It often improves temporal resolution by collapsing the seismic wavelet to approximately a spike and suppressing reverberations on some field data (Figure I-7). A long time-window deconvolution with appropriate parameters can also be applied to compress the embedded wavelet and thereby enhance the frequency content of the data. Stacking assumes hyperbolic moveout, while migration is based on a zero-offset (primaries only) wavefield assumption. Velocity analysis, which is an essential step for stacking, is improved by multiple attenuation and residual statics corrections. Of the many processes applied to seismic data, seismic migration is the one most directly associated with the notion of imaging. Wide band-pass filtering also may be needed to remove very low- and high-frequency noise.

For prestack data analysis, such as the extraction of amplitude-versus-offset (AVO) attributes (intercept/gradient analysis) or simultaneous impedance inversion, the input seismic data must be preconditioned in an amplitude-preserving manner. Noise reduction techniques have been developed for poststack and prestack seismic data and are implemented wherever appropriate to enhance the signal-to-noise ratio and achieve the goals set for reservoir characterization. More recently, however, it has been found that such procedures might not be enough for data acquired over unconventional resource plays or subsalt reservoirs. Usually, event focusing and reduced background noise after structure-oriented filtering are clearly evident. More generally, a reversible transform for seismic data processing offers a useful set of quantitatively valid domains in which to work.
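Because deconvolution is described above as collapsing the embedded wavelet toward a spike, a small sketch may help make the mechanics concrete. The routine below is a minimal, illustrative Wiener spiking-deconvolution filter designed from the trace autocorrelation; it is not the algorithm used on any of the data discussed here, and the filter length and prewhitening percentage are assumed values chosen only for the example.

```python
import numpy as np
from scipy.linalg import toeplitz

def spiking_decon(trace, filt_len=80, prewhitening=0.01):
    """Minimal Wiener spiking-deconvolution sketch (assumed parameters).

    A least-squares filter is designed from the trace autocorrelation so that
    its output approximates a zero-lag spike, compressing the embedded wavelet
    and suppressing short-period reverberations.
    """
    trace = np.asarray(trace, dtype=float)
    # autocorrelation for lags 0 .. filt_len-1
    full = np.correlate(trace, trace, mode="full")
    r = full[len(trace) - 1:len(trace) - 1 + filt_len].copy()
    # prewhitening: damp the zero lag to stabilise the inversion
    r[0] *= 1.0 + prewhitening
    # normal equations R f = d with a spike at zero lag as the desired output
    # (the filter is only defined up to an overall scale factor)
    d = np.zeros(filt_len)
    d[0] = 1.0
    f = np.linalg.solve(toeplitz(r), d)
    return np.convolve(trace, f)[:len(trace)]

# e.g. a synthetic trace: white reflectivity convolved with a decaying wavelet
reflectivity = np.random.randn(1000)
wavelet = np.exp(-np.arange(40) / 8.0)
trace = np.convolve(reflectivity, wavelet)[:1000]
spiky = spiking_decon(trace)
```

In practice the filter would be designed and applied in time gates, and, as noted above, its output would be checked against well data wherever possible.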
Processing steps typically include analysis of velocities and frequencies, static corrections, deconvolution, normal moveout, dip moveout, stacking, and migration, which can be performed before or after stacking. The steps can, however, be grouped by function so that the basic processing flow can be illustrated as follows:
• Data initialization
• Database building: the myriad of numbers on field tape must each be uniquely related to sh…
• Amplitude processing
• Deconvolution
• Noise attenuation
• …
Applying adaptive deghosting at the start of the processing workflow results in a simpler deghosted wavelet that improves the results of the subsequent processing steps.

There are three primary steps in processing seismic data — deconvolution, stacking, and migration, in their usual order of application. These three processes are robust, and their performance is not very sensitive to the underlying assumptions in their theoretical development; when applied to field data, they provide results that are close to the true subsurface image. All other processing techniques may be considered secondary in that they help improve the effectiveness of the primary processes, and many of the secondary processes are designed to make the data compatible with the assumptions of the three primary ones. The problem with deconvolution is that the accuracy of its output may not always be self-evident unless it can be compared with well data; the main reason for this is that our model for deconvolution is nondeterministic in character. Stacking also is a process of compression (velocity analysis and statics corrections). Figure 1.5-1 represents the seismic data volume in processing coordinates — midpoint, offset, and time. In essence, seismic processing is a series of data processing steps that produce seismic images of the Earth's interior in terms of variations in seismic velocity and density.

After the prestack data have undergone an amplitude-friendly processing flow up to prestack migration and normal-moveout (NMO) application, there are still some simple preconditioning steps that are generally adopted to get the data ready for the next stage. Much of this work and the associated procedures are handled on poststack seismic data, and some of these poststack processing steps can be applied as preconditioning to the near-, mid- and far-stacks used in simultaneous impedance inversion. Figures 1 and 2 illustrate the advantage of following through on this processing sequence: the near- and far-angle stacks shown there have been subjected to many of the processing steps discussed in this article and are compared with the conventional processing application. Notice that the overall data quality seems enhanced (as indicated by the pink arrows), which is expected to lead to a more accurate interpretation. To ensure that these processing steps have preserved true-amplitude information, gradient analysis was carried out on various reflection events selected at random from the near-, mid1-, mid2- and far-angle stack traces, and one such comparison is shown in figure 3.
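The gradient analysis mentioned above can be sketched in a few lines. The example below fits the two-term approximation R(θ) ≈ A + B sin²θ to amplitudes picked from the angle stacks; the picked amplitudes and the centre angles assigned to each stack are hypothetical values used only for illustration.

```python
import numpy as np

def intercept_gradient(amplitudes, angles_deg):
    """Least-squares fit of R(theta) ~ A + B*sin^2(theta) for one reflection
    event picked across several angle stacks; returns intercept A and gradient B."""
    x = np.sin(np.deg2rad(np.asarray(angles_deg, dtype=float))) ** 2
    G = np.column_stack([np.ones_like(x), x])
    (A, B), *_ = np.linalg.lstsq(G, np.asarray(amplitudes, dtype=float), rcond=None)
    return A, B

# hypothetical picks from near, mid1, mid2 and far stacks, with assumed
# centre angles of 4, 12, 20 and 28 degrees for the four angle ranges
A, B = intercept_gradient([0.12, 0.10, 0.07, 0.03], [4.0, 12.0, 20.0, 28.0])
print(A, B)
```

If the preconditioning has preserved relative amplitudes, the intercept and gradient estimated this way should show a consistent trend before and after the additional poststack steps, which is essentially the check illustrated in figure 3.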
The number of steps, the order in which they are applied, and the parameters used for each program vary from area to area, from dataset to dataset, and from processor to processor. The preprocessing steps are demultiplexing, data loading, preparation and use of the single-trace and brute-stack sections, definition of the survey geometry, band-pass and time-varying filtering, different types of gain recovery, editing of bad traces, top and surgical muting, and f-k dip filtering. At several stages, judgements or interpretations have to be made, which are often … Keep in mind that the success of a process depends not only on the proper choice of parameters pertinent to that particular process, but also on the effectiveness of the previous processing steps.

Migration is a process that collapses diffractions and maps dipping events on a stacked section to their supposedly true subsurface locations, and with the development of prestack time migration, prestack seismic data denoising has become an important step in seismic processing. Small-scale geologic features, such as thin channels or subtle faults, might not be seen clearly in the presence of noise, and such noise, if not tackled appropriately, prevents their accurate imaging. Similarly, seismic attributes generated on noise-contaminated data are compromised in quality, and hence so is their interpretation.

A typical poststack processing sequence that can be used on prestack time-migrated stacked seismic data might include various steps, beginning with FX deconvolution and followed by multiband CDP-consistent scaling, Q-compensation, deconvolution, bandpass filtering, and some further noise removal using a nonlinear adaptive process. Beginning with the attenuation of random noise using FX deconvolution: in the frequency-offset domain, seismic signals are represented as complex sinusoids in the x-direction and are predictable; random noise, on the other hand, is unpredictable and thus can be rejected. One of the accompanying figures shows a seismic trace together with its phase and amplitude spectra before (in red, Q-compensated data) and after (in blue, Q-compensated data plus zero-phase deconvolution) zero-phase deconvolution.

Four angle stacks were created for a seismic data volume from the Delaware Basin by dividing the complete angle-of-incidence range from 0 to 32 degrees into a near-angle stack (0-8 degrees), a mid1-angle stack (8-16 degrees), a mid2-angle stack (16-24 degrees), and a far-angle stack (24-32 degrees). A careful consideration of the different steps in the above preconditioning sequence prompted us to apply some of them to the near-, mid- and far-stack data going into simultaneous impedance inversion and to compare the results with those obtained the conventional way. Similar reflection-quality enhancement is seen on the mid1 and mid2 angle stacks, but it is not shown here owing to space constraints.
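To make the FX-deconvolution idea described above more tangible, here is a much-simplified prediction-filtering sketch. It is not a production algorithm: a single short complex prediction filter is designed per frequency for the whole section, whereas practical implementations work in overlapping spatial and temporal windows. The filter length and damping are assumed values.

```python
import numpy as np
from scipy.linalg import toeplitz

def fx_decon(section, filt_len=4, damping=0.01):
    """Simplified FX prediction filtering for random-noise attenuation.

    section: 2-D array (ntraces, nsamples). Each trace is Fourier transformed
    along time; at every frequency the complex samples along the trace (x) axis
    are treated as predictable, so a short prediction filter is designed and
    the predicted (signal) part is kept, rejecting the unpredictable noise.
    """
    ntr, ns = section.shape
    spec = np.fft.rfft(section, axis=1)            # (ntraces, nfreq)
    out = np.zeros_like(spec)
    for k in range(spec.shape[1]):
        x = spec[:, k]
        # complex autocorrelation for lags 0 .. filt_len
        r = np.array([np.vdot(x[:ntr - lag], x[lag:]) for lag in range(filt_len + 1)])
        if abs(r[0]) == 0.0:
            continue                               # dead frequency slice
        R = toeplitz(r[:filt_len])                 # Hermitian Toeplitz matrix
        R = R + damping * r[0].real * np.eye(filt_len)
        a = np.linalg.solve(R, r[1:filt_len + 1])  # one-step prediction filter
        pred = np.zeros(ntr, dtype=complex)
        cnt = np.zeros(ntr)
        for n in range(filt_len, ntr):             # forward prediction
            pred[n] += np.dot(a, x[n - 1::-1][:filt_len])
            cnt[n] += 1
        for n in range(ntr - filt_len):            # backward prediction
            pred[n] += np.dot(np.conj(a), x[n + 1:n + 1 + filt_len])
            cnt[n] += 1
        cnt[cnt == 0] = 1
        out[:, k] = pred / cnt
    return np.fft.irfft(out, n=ns, axis=1)
```

Whatever noise such prediction filtering leaves behind is what the nonlinear adaptive modeling step described later in this article is meant to address.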
Since the introduction of digital recording, a routine sequence in seismic data processing has evolved, and these procedures have been carried out over the last two decades for most projects from different basins of the world. Seismic data processing can be characterized by the application of a sequence of processes, where for each of these processes there are a number of different approaches. Processing seismic data to interpret subsurface features is both computationally and data intensive; from the field to the final volume, the data go through many processes and workflows. Seismic data processing steps are naturally useful for separating signal from noise, so they offer familiar, exploitable organizations of data. A basic flow, for example, consists of amplitude correction, muting, domain transforms, velocity analysis, and normal-moveout (NMO) correction… For the Caspian Sea line mentioned earlier, the water depth at one end is approximately 750 m and decreases along the line traverse to approximately 200 m at the other end; Table 1-14 provides the processing parameters for the line.

While coherent noise is usually handled during the processing of seismic data, mean and median filters are commonly used for random-noise suppression on poststack seismic data, but they tend to smear the discontinuities in the data. A more desirable application is that of structure-oriented filters, which enhance laterally continuous events by reducing randomly distributed noise without suppressing details in the reflection events that are consistent with the structure.

Proper quality checks need to be run as individual steps are applied, to ensure that no amplitude distortions take place at any stage of the preconditioning sequence. The amplitude trend after the proposed preconditioning shows a variation similar to that obtained using the conventional processing flow, and in figures 4 and 5 we show a similar comparison of P-impedance and VP/VS sections using the proposed workflow and the conventional one. The overall signal-to-noise ratio is enhanced and stronger reflections come through after application of the proposed poststack processing steps. In conclusion, the poststack processing steps usually applied to prestack-migrated stacked data yield volumes that exhibit better quality in terms of reflection strength, signal-to-noise ratio, and frequency content compared with data passed through true-amplitude processing alone.

Q-compensation is a process adopted to correct for the inelastic attenuation of the seismic wavefield in the subsurface. The values of the inelastic attenuation are quantified in terms of the quality factor, Q, which can be determined from the seismic data or from VSP data; in case such a computation proves to be cumbersome or challenging, a constant Q value that is considered appropriate for the interval of interest is applied. An amplitude-only Q-compensation is usually applied.
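As a rough illustration of the amplitude-only Q-compensation just described, the sketch below boosts the spectrum of short overlapping windows by exp(πf t/Q) and recombines them by overlap-add. The constant Q value, window length and gain cap are assumptions chosen for the example; a production implementation would use a measured, possibly interval-dependent Q and offer phase compensation as well.

```python
import numpy as np

def q_compensate_amplitude(trace, dt, q=80.0, win_len=0.2, max_gain_db=30.0):
    """Amplitude-only constant-Q compensation sketch (assumed parameters).

    The trace is split into 50%-overlapping Hann windows; each window's spectrum
    is multiplied by exp(pi * f * t_centre / Q), capped at max_gain_db, and the
    compensated windows are overlap-added back together.
    """
    trace = np.asarray(trace, dtype=float)
    n = len(trace)
    nwin = int(round(win_len / dt))
    nwin += nwin % 2                      # make the window length even
    hop = nwin // 2
    win = np.hanning(nwin)
    freqs = np.fft.rfftfreq(nwin, dt)
    max_gain = 10.0 ** (max_gain_db / 20.0)
    out = np.zeros(n)
    norm = np.zeros(n)
    for start in range(0, n - nwin + 1, hop):
        t_centre = (start + nwin / 2) * dt
        seg = trace[start:start + nwin] * win
        gain = np.minimum(np.exp(np.pi * freqs * t_centre / q), max_gain)
        comp = np.fft.irfft(np.fft.rfft(seg) * gain, n=nwin)
        out[start:start + nwin] += comp * win
        norm[start:start + nwin] += win ** 2
    norm[norm == 0.0] = 1.0
    return out / norm

# e.g. compensate a 4 s trace sampled at 2 ms with an assumed constant Q of 80
trace = np.random.randn(2000)
compensated = q_compensate_amplitude(trace, dt=0.002)
```

Because the gain grows with both time and frequency, the cap keeps high-frequency noise at late times from being amplified excessively, which is the usual practical compromise in amplitude-only Q-compensation.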
The processing sequence designed to achieve the interpretable image will likely consist of several individual steps. Usually, the simple preconditioning steps mentioned earlier include generating partial stacks (which tone down the random noise), bandpass filtering (which removes unwanted high and low frequencies from the data), further random-noise removal (using algorithms such as tau-p or FXY, or workflows based on structure-oriented filtering), trim statics (to flatten the NMO-corrected reflection events in the gathers more perfectly), and muting (which zeroes the amplitudes of reflections beyond a certain offset or angle chosen as the limit of the useful reflection signal).

Deconvolution assumes a stationary, vertically incident, minimum-phase source wavelet and a white reflectivity series that is free of noise. A pessimist could claim that none of these assumptions is valid, yet, as noted earlier, the primary processes still give useful results on field data. For example, dip filtering may need to be applied before deconvolution to remove coherent noise so that the autocorrelation estimate is based on reflection energy that is free from such noise. Finally, migration commonly is applied to stacked data; just as deconvolution improves temporal resolution, migration is a spatial deconvolution process that improves spatial resolution.

Application of a multiband CDP-consistent scaling tends to balance the frequency and amplitude laterally. The remnant noise can be handled with a different approach, wherein both the signal and the noise are modeled in different ways, depending on the nature of the noise, and the latter is then attenuated in a nonlinear adaptive fashion; such a workflow can be more effective than a singular FX deconvolution process. We have illustrated the application of such a workflow by way of data examples from the Delaware Basin, and the results look very convincing in terms of the value-addition seen on the P-impedance and VP/VS data.
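Of the preconditioning steps listed above, trim statics is perhaps the easiest to sketch. The routine below aligns each NMO-corrected trace in a gather to the pilot (mean) trace by cross-correlation over a small range of lags; the maximum allowed shift is an assumed value, and a real implementation would work within a picked time window and apply fractional, surface-consistent shifts.

```python
import numpy as np

def trim_statics(gather, max_shift=8):
    """Trim-statics sketch: align each NMO-corrected trace to the pilot trace.

    gather: 2-D array (ntraces, nsamples). For every trace the lag (in samples,
    limited to +/- max_shift) that maximises the cross-correlation with the
    pilot is found and the trace is shifted by that lag. np.roll wraps samples
    around the trace ends, which is ignored here for simplicity.
    """
    gather = np.asarray(gather, dtype=float)
    pilot = gather.mean(axis=0)
    lags = np.arange(-max_shift, max_shift + 1)
    aligned = np.empty_like(gather)
    for i, tr in enumerate(gather):
        cc = [np.dot(np.roll(tr, lag), pilot) for lag in lags]
        best = lags[int(np.argmax(cc))]
        aligned[i] = np.roll(tr, best)
    return aligned

# e.g. flatten a 40-trace gather of 1000 samples before partial stacking
gather = np.random.randn(40, 1000)
flattened = trim_statics(gather)
stack = flattened.mean(axis=0)
```

Flattening the gathers in this way is what allows the subsequent partial stacking and AVO work to see consistent amplitudes from trace to trace.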
The purpose of seismic processing is to manipulate the acquired data into an image that can be used to infer the sub-surface structure, and only minimal processing would be required if we had a perfect acquisition system. At one time, seismic processing required sending information to a distant computer lab for analysis. The main basic steps of a processing sequence, commonly used to obtain a seismic image and common to seismic data gathered on land (onshore) as well as at sea (offshore), are CMP sorting, velocity analysis and NMO correction, stacking, (zero-offset) migration, and time-to-depth conversion. These different processes are applied with specific objectives in mind; deconvolution, for example, achieves its goal of improved temporal resolution by compressing the wavelet.

Seismic data are usually contaminated with two common types of noise, namely random and coherent. Quite often it is observed that the P-reflectivity or S-reflectivity data extracted from AVO analysis appear noisier than the final migrated data obtained with the conventional processing stream, which might consist of processes that are not all amplitude-friendly. This observation suggests exploring whether one or more poststack processing steps could be used for preconditioning prestack seismic data prior to putting it through simultaneous impedance inversion, for example. The success of AVO attribute extraction or simultaneous impedance inversion depends on how well the preconditioning processes have conditioned the prestack seismic data.

In the Delaware Basin, the deeper stratigraphic members are overlain by evaporites and thin red beds comprising the Castile (anhydrite), Salado (halite), Rustler (dolomite), and Dewey Lake (continental red bed) formations; such high-velocity near-surface formations have a significant effect on the quality of the seismic data acquired there. Besides the lack of continuity of reflection events, one of the problems seen on seismic data from this basin is that the near traces are very noisy and, even after application of the above-mentioned processes, are not acceptable. In such cases, newer and fresher ideas need to be implemented to enhance the signal-to-noise ratio of the prestack seismic data before they are put through the subsequent attribute analysis. A way out of such a situation is to replace the near-stack data with the intercept stack, which may exhibit a higher signal-to-noise ratio.

In multiband CDP-consistent scaling, the stacked seismic data are decomposed into two or more frequency bands, scalars are computed from the RMS amplitudes of each of the individual frequency bands, and the scaled bands are then summed to give the balanced output.
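The frequency-band decomposition just described can be sketched as follows. Each stacked trace is split into bands with zero-phase band-pass filters, each band is scaled from its RMS amplitude, and the bands are summed back. The band edges and target RMS are assumptions, and for brevity the scalars here are computed trace by trace rather than smoothed in a CDP-consistent manner as the text describes.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def multiband_balance(traces, dt, bands=((5, 15), (15, 30), (30, 60)), target_rms=1.0):
    """Multiband amplitude-balancing sketch (assumed band edges and target RMS).

    traces: 2-D array (ntraces, nsamples) of stacked data. Each band-passed
    copy is scaled so that its RMS matches target_rms, and the scaled bands
    are summed to give the balanced output.
    """
    traces = np.asarray(traces, dtype=float)
    fs = 1.0 / dt
    out = np.zeros_like(traces)
    for lo, hi in bands:
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, traces, axis=-1)
        rms = np.sqrt(np.mean(band ** 2, axis=-1, keepdims=True))
        rms[rms == 0.0] = 1.0
        out += band * (target_rms / rms)
    return out

# e.g. balance a stacked section of 200 traces, 1500 samples at 2 ms
section = np.random.randn(200, 1500)
balanced = multiband_balance(section, dt=0.002)
```

In a CDP-consistent application the per-band scalars would be derived from many CDPs and smoothed laterally before being applied, which is what balances frequency and amplitude across the line.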
Seismic data processing involves the compilation, organization, and conversion of wave signals into a visual map of the areas below the surface of the earth; the objective is to transform redundant reflection seismic records in the time domain into an interpretable depth image. Before deconvolution, correction for geometric spreading is necessary to compensate for the loss of amplitude caused by wavefront divergence, and the deconvolution step is usually followed by bandpass filtering, applied to remove any unwanted frequencies that might have been generated in the deconvolution application.
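A geometric-spreading (divergence) correction of the kind mentioned above can be as simple as a time- and velocity-dependent gain. The sketch below scales each sample by t·v²(t), normalised to a reference time and velocity; the default constant velocity and reference values are assumptions, and in practice an RMS-velocity function from velocity analysis would be used.

```python
import numpy as np

def spreading_correction(trace, dt, v_rms=None, v_ref=1500.0, t_ref=None):
    """Geometric-spreading correction sketch: gain ~ t * v_rms(t)**2.

    trace: one seismic trace; dt: sample interval in seconds; v_rms: optional
    RMS-velocity function sampled like the trace (a constant v_ref is assumed
    if it is not supplied). The gain is normalised by t_ref * v_ref**2 so that
    the earliest samples are left roughly unchanged.
    """
    trace = np.asarray(trace, dtype=float)
    n = len(trace)
    t = np.arange(n) * dt
    t[0] = dt                         # avoid a zero gain at the first sample
    if v_rms is None:
        v_rms = np.full(n, v_ref)
    if t_ref is None:
        t_ref = dt
    gain = (t * np.asarray(v_rms, dtype=float) ** 2) / (t_ref * v_ref ** 2)
    return trace * gain

# e.g. apply a divergence correction before deconvolution, with an assumed
# constant velocity (the gain then reduces to being proportional to t)
trace = np.random.randn(2000)
corrected = spreading_correction(trace, dt=0.002)
```

Simple gains like this, applied before deconvolution, are part of keeping the flow amplitude-friendly, which is the thread running through the preconditioning discussion above.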