Sunday, 19 May 2013

Marine Processing - Part 1 | Resample & Minimum Phase Conversion

These blog posts will build up into a complete description of a 2D marine processing sequence.  They are based on our tutorial datasets, which in turn came from the New Zealand Government’s Ministry of Economic Development under the “Open File” System.

Marine seismic data is typically recorded at a 2 ms (0.002 s) sample interval, giving a theoretical maximum frequency of 250 Hz that can be accurately recorded;

Sampling frequency (F) = 1/Period (T) = 1/0.002 = 500 Hz
Nyquist Frequency (fN) = Sampling frequency (F)/2 = 500/2 = 250 Hz

We don’t do this because we actually have those frequencies – in practice anything over 90-100 Hz is pretty optimistic – but instead to avoid issues with aliased noise.
If we record data such that the maximum frequency is 250 Hz, but then resample to a 4 ms sample interval and filter out energy above around 120 Hz, we can be confident that there are no aliased frequencies in our raw data.
As a result, this is typically the very first processing step; in fact we may do this before we really take much of a look at the data.
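As a quick illustration of why this matters, here is a small numpy sketch (my own, not part of the original workflow) showing a 130 Hz tone folding back to 120 Hz when the data are decimated to 4 ms with no anti-alias filter:

```python
import numpy as np

# A 130 Hz sine sampled at 2 ms (fs = 500 Hz, Nyquist = 250 Hz) is
# recorded cleanly. Dropping every other sample (4 ms, fs = 250 Hz,
# Nyquist = 125 Hz) without an anti-alias filter folds it back to
# 250 - 130 = 120 Hz.
fs = 500.0
t = np.arange(0, 1.0, 1.0 / fs)
tone = np.sin(2 * np.pi * 130.0 * t)

decimated = tone[::2]                   # naive 2 ms -> 4 ms resample
spectrum = np.abs(np.fft.rfft(decimated))
freqs = np.fft.rfftfreq(len(decimated), d=2.0 / fs)

peak_hz = freqs[np.argmax(spectrum)]
print(peak_hz)  # the 130 Hz tone now masquerades as 120 Hz
```

Once the tone has wrapped into the pass-band there is no way to tell it apart from genuine 120 Hz signal, which is exactly why the filter must come first.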

Processing software tends to have a resample routine with a built-in anti-alias filter. The main concern with using this is that you have no idea how good (or bad) it is. Some "built-in" filters are very old in software terms, using as few as 51 input points. In general, fewer input points leaves more noise (ringing) in the data after the filter is applied.

Variation in the number of input points and resulting filtered output (Image from Gaussian Waves)
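To see the effect of filter length, one can compare two linear-phase high-cut filters; a scipy sketch with illustrative parameters (a 100 Hz cut-off at a 2 ms sample interval, neither of which comes from the post):

```python
import numpy as np
from scipy import signal

fs = 500.0  # 2 ms sample interval

# Two windowed-sinc high-cut (low-pass) filters with the same 100 Hz
# cut-off: one short (51 points) and one longer (201 points).
short_taps = signal.firwin(51, 100.0, fs=fs)
long_taps = signal.firwin(201, 100.0, fs=fs)

def leakage_db(taps):
    """Worst-case amplitude left in the 110-125 Hz band, which the
    filter is supposed to remove, in dB relative to unity."""
    w, h = signal.freqz(taps, worN=4096, fs=fs)
    band = (w > 110.0) & (w < 125.0)
    return 20 * np.log10(np.abs(h[band]).max())

print(leakage_db(short_taps))   # the short filter leaks far more...
print(leakage_db(long_taps))    # ...than the longer one just above cut-off
```

The short filter's transition band is several times wider, so energy just above the cut-off survives; in the time domain the same truncation shows up as ringing.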

You also may not know what phase the filter has, and even if you do, your raw data is likely to be mixed phase.

So I prefer not to use the built in filters. I tend to create my own resample filter each time, and save it (along with a screen grab of the displays) for my report so that I know what has been applied.

I also combine this stage with the minimum-phase conversion. You are likely to have been supplied with a source signature at the original sample interval, so we can combine the high-cut filter we want with the minimum-phase conversion filter derived from the source signature.
Typically, the starting point for this is a far-field source-signature from the acquisition company, which will probably look something like this:

A typical marine far-field source signature in a “wavelet calculator”; the time samples are on the left, and the frequency content on the right.  Note the frequency “notch”, identified by the cursor at 125 Hz.  The signature looks pretty close to a minimum phase wavelet, as you might expect.

When loading up the source signature it is important to look out for any “time shift” applied; in this case the time of the first sample was given as -64 ms in the header of the source signature file.

The key thing to notice with this signature is the “notch” in the frequency spectrum. This arises because of the “ghost” effect (and is thus referred to as the “ghost notch”). Energy travels up from the source and is reflected off the sea surface. When this energy combines with the original down-traveling energy, the waveshape is changed and the “ghost” is created.

You can calculate the dominant frequency of the “ghost notch” easily enough, by considering the travel time from the source to the sea surface and back again.

In this case, the source is being towed at 6 m, so the distance traveled is 2 x 6 m = 12 m.

The speed of sound in sea water is about 1500 m/s, which gives a ‘delay time’ of 0.008 seconds (8 ms).

This time period (T) corresponds to a frequency of 1/0.008 or 125 Hz.

Effect of towing at 6 m depth i.e. 8 ms delay (Image from Mohamed in 'Ghost Reflections' forum discussion)
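The arithmetic above generalises to any tow depth; a small helper function (my own, not from the post):

```python
# Ghost-notch frequency from tow depth: the ghost lags the direct
# arrival by the two-way path to the sea surface.
def ghost_notch_hz(depth_m, velocity_ms=1500.0):
    delay_s = 2.0 * depth_m / velocity_ms   # e.g. 6 m -> 0.008 s
    return 1.0 / delay_s

print(ghost_notch_hz(6.0))   # 125.0 Hz, the notch in the signature above
print(ghost_notch_hz(7.0))   # ~107 Hz for a 7 m streamer
```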

This of course is one reason why 6 m is a common tow depth – the ghost effect is used as a natural anti-alias filter to suppress energy around the Nyquist frequency (fN) and make aliasing less likely.

Over time, however, it has become common to tow the receivers slightly deeper than the source; the main reason being that the bubbles created by an airgun array can interfere with the sonic ranging systems used to position the cables.

A 3D marine vessel from the air during a crew change (photo courtesy of Tim Mason);  the two source arrays  can be seen as they have three closely spaced “wakes”;  the starboard source has just fired (black arrow), but the bubbles from the previous three shots (orange arrows) can still be seen on the sea surface. This indicates that the source is towed shallower than the streamers.
The signature I have shown includes the source ghost, but not the receiver ghost, where a delayed signal that has reflected off the sea-surface will also be recorded.

Ray paths showing the possible routes for ghost reflectors (Image from Mohamed in 'Ghost Reflections' forum discussion)

This is pretty common, and you need to check carefully. If this is the case, you’ll need to model the receiver ghost as well.

To do this, simply create a spike (of amplitude +1) at time zero and a second spike (of amplitude -1) at the appropriate time for the receiver ghost to arrive (the two-way travel time: twice the receiver depth divided by the water velocity).

For example, with a 7 m tow depth and 1500 m/s seawater velocity this would be 14/1500 = 0.00933 s. The result will look something like this:

The original signature plotted (above) with a modeled receiver ghost for a streamer depth of 7 m, created by making a spike at time zero and a negative spike at the arrival time of the ghost. Note the ghost notch is now at 112.5 Hz.

This illustrates another key issue with the ghost – it has a low frequency effect as well.
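The spike-pair model can be built and checked numerically; a numpy sketch (my own; note that on a 2 ms grid the 9.33 ms delay rounds to the nearest sample, 10 ms, so the modeled notch lands at 100 Hz rather than the exact ~107 Hz):

```python
import numpy as np

dt = 0.002                        # 2 ms sample interval
delay_s = 2.0 * 7.0 / 1500.0      # receiver ghost delay for 7 m tow depth

# +1 spike at time zero, -1 spike at the ghost arrival time.
ghost = np.zeros(64)
ghost[0] = 1.0
ghost[int(round(delay_s / dt))] = -1.0

# The operator's amplitude spectrum notches at multiples of 1/delay,
# including 0 Hz - the "low frequency effect" mentioned above.
spec = np.abs(np.fft.rfft(ghost, n=4096))
freqs = np.fft.rfftfreq(4096, d=dt)
band = (freqs > 50.0) & (freqs < 200.0)
notch_hz = freqs[band][np.argmin(spec[band])]
print(notch_hz)   # ~100 Hz on this sample grid
print(spec[0])    # ~0: the ghost also suppresses the lowest frequencies
```

The zero at 0 Hz is the low-frequency cost of the ghost; the mid-band zero is the familiar "ghost notch".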

To create the combined signature, you need to convolve these two together, but we also need to apply a time shift to the modeled signature. The supplied information tells me that the time of the first sample is -64 ms, so I have to apply this "delay" to correct the signature.

The modeled signature including the receiver ghost effect.  The notch has broadened, and the wavelet shape is now far from the “minimum phase” ideal.  The start delay of 64 ms has also been removed from the data.

As I mentioned before, this additional modelling of the receiver ghost is a critical step that is often left out; as you can see, the result can look a little like a zero-phase wavelet. I have seen situations where no phase correction was applied, but in looking at the seabed reflection the interpreters assumed the data had been zero-phase corrected.

We can calculate the minimum phase equivalent of this, as a starting point for our idealised signature.

The minimum phase equivalent of the model source signature, including the ghost and start time shift.
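Most wavelet calculators do this conversion internally; the usual recipe is the real-cepstrum (homomorphic) method, which keeps the amplitude spectrum and swaps in the minimum-phase phase. A hand-rolled sketch (the function name, FFT length and test wavelet are mine):

```python
import numpy as np

def minimum_phase_equivalent(w, n_fft=4096):
    """Return a wavelet with the same amplitude spectrum as w but
    minimum phase, via the real-cepstrum (homomorphic) method."""
    spec = np.abs(np.fft.fft(w, n_fft))
    spec = np.maximum(spec, 1e-8 * spec.max())   # guard log(0) at notches
    ceps = np.fft.ifft(np.log(spec)).real
    # Fold the cepstrum: double positive quefrencies, zero the negatives.
    fold = np.zeros(n_fft)
    fold[0] = ceps[0]
    fold[1:n_fft // 2] = 2.0 * ceps[1:n_fft // 2]
    fold[n_fft // 2] = ceps[n_fft // 2]
    return np.fft.ifft(np.exp(np.fft.fft(fold))).real[:len(w)]

# Sanity check: the minimum-phase twin of [0.5, 1.0] is [1.0, 0.5]
# (same amplitude spectrum, energy moved to the front of the wavelet).
mp = minimum_phase_equivalent(np.array([0.5, 1.0]))
print(np.round(mp, 3))
```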

While this represents our idealised source wavelet, we have one more thing to do before creating a conversion filter.  If we are going to include the anti-alias high-cut filter in here as well, then we need to create the minimum-phase equivalent of the signature with the appropriate minimum-phase high-cut filter applied.  I typically use an 80 Hz-100 Hz high cut.

A couple of points on the design of this filter: I tend to use the minimum-phase equivalent of a Butterworth filter, with at least 200 points. While Butterworth filters are pretty stable, the minimum-phase formulations are not terribly minimum phase at higher orders, hence the additional step of making a minimum-phase equivalent.
As always, you need to check the filter you have created carefully; look for "ringing" in the frequency domain, which appears as "ripples" on top of the pass-band of the filter. If you see these, you may need to adjust the slope of the high-cut or lengthen the filter.

The minimum phase equivalent of an 80 Hz-90 Hz high cut filter, which will be used to create the anti-alias effect.
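The pass-band check can be scripted; a scipy sketch with illustrative parameters (a 6th-order Butterworth at 90 Hz, truncated to a 250-point operator; a Butterworth of this order is already close to minimum phase, so the sketch skips the extra conversion step):

```python
import numpy as np
from scipy import signal

fs = 500.0  # 2 ms sample interval

# 6th-order Butterworth high-cut (low-pass) at 90 Hz.
b, a = signal.butter(6, 90.0, btype="low", fs=fs)

# Capture its impulse response as a 250-point convolutional operator.
spike = np.zeros(250)
spike[0] = 1.0
operator = signal.lfilter(b, a, spike)

# Measure pass-band ripple below 70 Hz: max-to-min amplitude in dB.
w, h = signal.freqz(operator, worN=4096, fs=fs)
passband = np.abs(h[w < 70.0])
ripple_db = 20 * np.log10(passband.max() / passband.min())
print(ripple_db)  # a clean filter shows only a fraction of a dB
```

If the ripple figure grows, that is the "ringing" to watch for: lengthen the operator or soften the high-cut slope.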

Finally, we can convolve this filter with our minimum-phase equivalent of the time-shifted signature, that now includes the receiver ghost; this is our idealised output signature:

The idealised signature, created by including the receiver ghost and time shift, computing the minimum phase equivalent, and convolving this with a minimum phase high cut anti-alias filter.

The final stage is to design a matching filter to go from the time-shifted signature with the receiver ghost, to our new idealised version – and to apply and test that this works! There is usually a function in a wavelet calculator that allows you to make a “matching” or “conversion” filter.

The final combined minimum-phase conversion and anti-alias filter (top) and the test application of this to the modeled source signature, including time shift and modeled receiver ghost.
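Where the calculator's matching-filter function is a black box, the underlying least-squares (Wiener) design is straightforward to sketch (the function, filter length, pre-whitening level and test wavelets here are mine):

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def matching_filter(inp, desired, nf=32, prewhiten=0.001):
    """Least-squares filter f such that inp convolved with f is
    as close as possible to desired, via the normal equations."""
    r = np.zeros(nf)                    # autocorrelation of the input
    ac = np.correlate(inp, inp, mode="full")[len(inp) - 1:]
    r[:min(nf, len(ac))] = ac[:nf]
    r[0] *= 1.0 + prewhiten             # pre-whitening stabilises the solve
    g = np.zeros(nf)                    # crosscorrelation: desired with input
    cc = np.correlate(desired, inp, mode="full")[len(inp) - 1:]
    g[:min(nf, len(cc))] = cc[:nf]
    return solve_toeplitz(r, g)         # Toeplitz normal equations

# Sanity check: convert the wavelet [1, 0.5] into [1, -0.5].
inp = np.array([1.0, 0.5])
desired = np.array([1.0, -0.5])
f = matching_filter(inp, desired)
out = np.convolve(inp, f)[:len(desired)]
print(np.round(out, 3))   # close to the desired wavelet
```

As the post says, always apply the filter back to the input signature and compare with the desired output before trusting it.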

The last step is to save this filter so that you can apply it (convolve it) with the seismic traces.  One final trap for the unwary here is to save the wrong thing: one of the intermediate results you have been working through, or the test result instead of the filter.  If you do this and it's not spotted, you may have to re-run the whole project.

If you wanted to go to zero-phase the process would be much the same – instead of a minimum phase equivalent of the source signature, you would calculate the zero-phase version and work with that – the anti-alias filter would also need to be zero phase, of course.

Now you can successfully resample the data from a 2 ms to a 4 ms sample interval, and pass it on, with controlled phase, to the rest of the processing sequence.
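Mechanically, that application step is just a convolution followed by dropping alternate samples; a schematic numpy example (the trace is random noise and the filter a simple placeholder, standing in for the conversion filter designed above):

```python
import numpy as np

rng = np.random.default_rng(0)
trace_2ms = rng.standard_normal(1000)     # dummy 2 s trace at 2 ms
conversion_filter = np.ones(8) / 8.0      # placeholder operator only

# Convolve the saved filter with the trace, then keep every other
# sample to move from a 2 ms to a 4 ms sample interval.
filtered = np.convolve(trace_2ms, conversion_filter)[:len(trace_2ms)]
trace_4ms = filtered[::2]

print(len(trace_2ms), len(trace_4ms))   # 1000 500
```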

By: Guy Maslen


  1. What happens if these filters are not applied before resampling from 2 ms to 4 ms? Would I get the aliasing effect, and could I remove it later? What about the ghost, can I remove that later as well? Should I use these filters even if I do not resample the data?

  2. Hi Fabio -

    Any frequencies that you have above 125 Hz in the dataset will "wrap back" into the data; so if you had some data at 130 Hz-140 Hz, this would appear after the resampling to be data in the 110-120 Hz range, and so on.

    Once it is aliased, that's it: you can't then remove the aliased information from the signal. This is why it's important to address this upfront.

    It is very unlikely on this type of survey that you will have any real reflection data over 125 Hz, which makes it even worse: anything that does alias is noise, wrapped into the "signal band"...

    Removing the "ghost" is a different matter; broad-band seismic techniques use a combination of acquisition methods (different tow depths and so on) coupled with processing to address the ghost issues, mainly at the low end. There are processing methods to address the ghost too, but these still estimate the missing parts of the spectrum in various ways.

    You don't have to resample the data from 2 ms to 4 ms; however, unless your seismic experiment has been designed to capture higher frequencies, it's a good idea. A key factor in the design is the spacing of the receivers: the higher the frequency you want to record, the closer the spacing needs to be (to avoid spatial aliasing). Plus, of course, all the processing takes twice as long at a 2 ms sample interval.

    There's a long discussion on the pros and cons of 2ms and 4ms sampling (mainly for land data) on LinkedIn in the Seismic Processing group.


  3. I think there is a typo. Related to the receiver ghost: if you tow the receiver at 7 m, you need to divide 14/1500?

    1. You are correct Ayaz, thank you for pointing it out.

  4. I have some older data for which I do not have a source signature. What is the best way to create one?

    1. You can estimate the signature from the direct arrival - if the water is deep enough that this is not confused by the seafloor or refracted arrivals - by applying a linear moveout correction and stacking; in general this will exclude the source and receiver ghosts, which you will need to model separately.

      Of course, if you have an acquisition report you may be able to find a signature (that can be digitised), or even contact the acquisition company, who may have the signature on record; there's also software that can simulate an array (Gandelf, Nucleus) if you know the air gun sizes, pressures and their arrangement.

      It is also probably worth thinking about whether you need a signature or not - for example whether you can run through with just a deconvolution and then address the phase post-processing via well ties and so on.


  5. Hi Guy,

    Thanks a lot for writing this blog! It's a really useful guide.

    I have an old vibroseis dataset that has been cross-correlated to produce raw gathers that are mixed phase. I would like to convert to minimum phase before proceeding with the processing, but I have no additional information for creating a source wavelet.
    I thought I could estimate the source wavelet by taking the first arrival from the first few traces, flattening and stacking, then creating a minimum-phase matching filter. However, the near trace is at an offset of 200 m.... Do you have any advice for this situation?

    1. Ah - Vibroseis and phase is a complicated animal; one group that has done a lot of work on this is the CREWES consortium at the University of Calgary. As I recall, they suggested that the correlated Vibroseis wavelet was actually a reasonable approximation to minimum phase (as opposed to zero or mixed); it's certainly close enough for deconvolution and so on.

      So - I would suggest pressing on with the data you have and then worrying about phase-matching after you have cleaned up the shots, using the appropriate well control (or known-phase seismic); without well control of some sort you can only approximate phase control, not ground-truth it.

  6. Hi Guy,

    Thanks for the knowledge that you offer within this blog. It is very helpful and objective.
    Well, I followed your steps and some questions popped up in my mind that I would like to discuss with you.
    1- Is the gun delay that you mention related to the trigger time of the shot? Are there other reasons?

    2- I am wondering what the physical meaning is of the source signature convolved with the modeled spikes. Is it the signature without the ghost effect? Why does that happen?

    Really thanks for comments

    1. The "gun delay" is the difference between the recording system starting and the shot being fired. With some systems that could be the "trigger time" - I think the steam-based Vaporchoc tended to be a bit unpredictable (and so the signature was recorded shot-by-shot) but for the most part it was related to the recording system technology, not the source.

      You often get the "far field signature" supplied without taking the receiver ghost into consideration; historically that was related to how the far-field signatures were recorded, but more recently it's become about the use of varying tow depths and so on.

      The general advice I'd give is to *check* what has and has not been done to your signature; you might find that the company also modeled the tow depths and so on. Of course, broadband type deghosting is a whole other issue that I haven't addressed here...

