Abdullatif Al-Shuhail
Executive Summary
This lab manual supports the laboratory sessions of two courses: GEOP320 (Seismic Data Processing) and GEOP510 (Seismic Data Analysis). These courses include sessions that require the use of the Seismic UNIX (SU) processing software, developed and maintained by the Center for Wave Phenomena at Colorado School of Mines, under the Linux OS. The Linux OS, freely available, is nevertheless unfamiliar to most KFUPM students, who are mainly familiar with the Microsoft Windows environment; helping students set up and use Linux and SU is therefore an essential part of the course. The goal of this lab manual is to fulfill this need by providing step-by-step instructions for installing the Linux OS and the SU software for the purpose of establishing a stable seismic data processing environment. The manual consists of a CDROM that includes the manual (this document), copies of the latest release of SU and related software, as well as tutorials on conventional seismic data processing.
Table of Contents
Topic Page
1. Introduction 3
2. Installation 8
2.1 Linux OS 8
2.2 SU Installation 10
3. Tutorials 13
3.1.1 Objective 13
3.1.2 Introduction 13
3.1.4 Exercises 14
3.2 Frequency Filtering 18
3.2.1 Objective 18
3.2.2 Introduction 18
3.2.3 Exercises 18
3.3 Deconvolution 21
3.3.1 Objective 21
3.3.2 Introduction 21
3.3.3 Exercises 22
3.4 Velocity Analysis 25
3.4.1 Objective 25
3.4.2 Introduction 25
3.4.3 Exercises 25
3.5 NMO Correction and Stacking 29
3.5.1 Objective 29
3.5.2 Introduction 29
3.5.3 Exercises 29
3.6 Static Corrections 33
3.6.1 Objective 33
3.6.2 Introduction 33
3.6.3 Exercises 33
3.7 Migration 37
3.7.1 Objective 37
3.7.2 Introduction 37
3.7.3 Exercises 37
4. Further Readings 41
5. CDROM Contents 42
6. Acknowledgements 43
1. Introduction
• There are three primary stages in processing seismic data. In their usual order of application, they are deconvolution, stacking, and migration.
• Secondary processes are implemented at certain stages to condition the data and improve the performance of the primary processes.
• Preprocessing includes the following steps:
o Demultiplexing: The data is transposed from the recording mode (each record contains the same time sample from all traces) to the trace mode (each record contains all time samples of one trace).
o Trace editing: Bad traces, or parts of traces, are muted (zeroed) or killed (deleted).
o Gain application: Corrections are applied to account for amplitude loss due to geometrical spreading and attenuation.
o Setup of field geometry: The geometry of the field is written into the data (trace headers) in order to associate each trace with its respective shot, offset, channel, and CDP.
o Application of field statics: In land surveys, elevation statics are applied to bring all sources and receivers to a common datum.
(1) Deconvolution is performed along the time axis to increase vertical resolution by compressing the source wavelet and to attenuate reverberations and short-period multiples.
(2) CDP sorting transforms the data from shot-receiver (shot gather) to depth point-offset (CDP gather) coordinates.
(3) Velocity analysis is performed on selected CDP gathers to assign stacking, RMS, or NMO velocities to each reflector. Velocities are interpolated between the analyzed CDPs.
5. Residual statics correction is usually needed for most land data. It corrects for time shifts caused by lateral variations in the thickness and velocity of the near-surface (weathering) layer.
6. NMO correction and muting: The stacking velocities are used to flatten the
reflections in each CDP gather (NMO correction). Muting zeros out the parts of
NMO-corrected traces that have been excessively stretched due to NMO correction.
7. Stacking: The NMO-corrected and muted traces in each CDP gather are summed over the offset (stacked) to produce a single trace. Stacking M traces in a CDP gather improves the signal-to-noise ratio by a factor of about √M.
8. Poststack processing includes time-variant band-pass filtering, dip filtering, and other processes that enhance the stacked section.
9. Migration: Dipping reflections are moved to their true subsurface positions and diffractions are collapsed.
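The √M signal-to-noise improvement from stacking can be verified numerically. The following Python sketch (illustrative only; the trace, noise level, and fold are made up and have nothing to do with the tutorial dataset) stacks M noisy copies of a common signal and compares the residual noise before and after:

```python
import random
import math

random.seed(0)

M = 64            # number of traces in the CDP gather (the fold)
N = 500           # samples per trace
signal = [math.sin(0.05 * i) for i in range(N)]  # common reflection signal

# Each trace = signal + independent random noise of unit RMS
traces = [[s + random.gauss(0.0, 1.0) for s in signal] for _ in range(M)]

# Stack: average the M traces sample by sample
stack = [sum(tr[i] for tr in traces) / M for i in range(N)]

def noise_rms(trace):
    """RMS of the residual after subtracting the known signal."""
    return math.sqrt(sum((t - s) ** 2 for t, s in zip(trace, signal)) / N)

single = noise_rms(traces[0])   # noise on one raw trace (about 1.0)
stacked = noise_rms(stack)      # noise after stacking (about 1/sqrt(64))
print(single / stacked)         # close to sqrt(M) = 8
```

The ratio printed at the end is close to √64 = 8, which is the improvement the stacking step above relies on.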
Linux is a free operating system whose kernel was originally developed by a young student, Linus Torvalds, at the University of Helsinki in Finland. He began his
work in 1991 when he released version 0.02 and worked steadily until 1994 when version
1.0 of the Linux Kernel was released. The kernel, at the heart of all Linux systems, is
developed and released under the GNU General Public License and its source code is
freely available to everyone. It is this kernel that forms the base around which a Linux
operating system is developed. There are now literally hundreds of companies and
organizations and an equal number of individuals that have released their own versions of
operating systems based on the Linux kernel. More information on the kernel can be found on the official Linux kernel website.
Apart from the fact that it is freely distributed, Linux's functionality, adaptability, and robustness have made it the main alternative to proprietary Unix and Microsoft
operating systems. IBM, Hewlett-Packard and other giants of the computing world have
embraced Linux and support its ongoing development. Well into its second decade of
existence, Linux has been adopted worldwide primarily as a server platform. Its use as a
home and office desktop operating system is also on the rise. The operating system can no longer be dismissed as a hobbyist project, unsuitable for the general public's computing needs. Through the efforts of
developers of desktop management systems such as KDE and GNOME, office suite
project OpenOffice.org and the Mozilla web browser project, to name only a few, there
are now a wide range of applications that run on Linux, and it can be used by anyone for everyday computing tasks.
Those who would like to try Linux can download a live CD version called Knoppix. It comes with everything you might need to carry out day-to-day tasks on the computer and it needs no installation. It will run from a CD in a computer capable of booting from the CD drive. Those choosing to continue using Linux can find a variety of versions or "distributions" of Linux that are freely available for download.
The Seismic Unix (SU) package is free software developed and maintained by the Center for Wave Phenomena (CWP) at Colorado School of Mines. The package is maintained and expanded periodically, with each new release appearing at 3 to 6 month intervals, depending on changes that accumulate in the official version at CWP. The
package is distributed with the full source code, so that users can alter and extend its
capabilities. The philosophy behind the package is to provide both a free processing and
broad suite of wave-related processing can be done with SU, making it a somewhat more
general package than the word ``seismic'' implies. SU is intended as an extension of the
Unix operating system, and therefore shares many characteristics of Unix, including
Unix flexibility and expandability. The fundamental Unix philosophy is that all operating
system commands are programs run under that operating system. The idea is that
individual tasks be identified, and that small programs be written to do those tasks, and
those tasks alone. The commands may have options that permit variations on those tasks. In the Unix operating system, multiple processes may be strung together in a cascade via ``pipes'' (|). This style of processing is quite different from that of monolithic, menu-driven PC applications, or of some commercial seismic utilities, for example. Unix has the added advantage of running on a wide variety of hardware platforms.
Seismic Unix benefits from all of these attributes as well. In combination with standard
Unix programs, Seismic Unix programs may be used in shell scripts to extend the
functionality of the package. Of course, it may be that no Unix or Seismic Unix program
will fulfill a specific task. This means that new code has to be written. The availability of
a standard set of source code, designed to be readable by human beings, can expedite the
process of extending the package through the addition of new source code
(www.cwp.mines.edu, 2007).
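The pipe mechanism described above is easy to demonstrate. The sketch below (a generic illustration using standard Unix tools rather than SU programs) chains two small programs from Python exactly the way SU modules are chained on the command line:

```python
import subprocess

# Equivalent of the shell pipeline:  printf 'cdp\ntrace\ncdp\n' | sort
p1 = subprocess.Popen(["printf", r"cdp\ntrace\ncdp\n"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["sort"], stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()                  # let p1 see a broken pipe if p2 exits early
out = p2.communicate()[0].decode()
print(out.splitlines())            # ['cdp', 'cdp', 'trace']
```

Each program does one small job (generate lines, sort lines); the pipe glues them together. SU pipelines such as filtering into gaining into display work the same way.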
2. Installation
2.1 Linux OS
• Installing the Linux OS on a Windows PC involves two main steps:
1. Hard-disk partitioning
2. Linux OS installation
• This step is important because the Linux OS will not install on a Windows-formatted partition.
• The most widely used software for partitioning is Partition Magic by PowerQuest (a copy of which is included in this CD).
3. Create the following logical Linux partitions in the newly-created free space:
o One Linux Swap partition with 1 GB size (or equal to twice the size of your PC RAM).
• Follow these steps for installing the Redhat Linux OS on your PC hard disk:
1. Insert CD 1 into your CDROM drive and reboot your system from the CDROM.
2. Follow instructions until you reach the installation type step where you should
3. Follow instructions until you reach the partitioning step, where you should do the
following:
o Do not re-partition the hard disk using the Linux partition utility.
4. Follow instructions until you reach the network setup step, where you should skip
it.
5. Follow instructions until you reach the Linux loader step, where you should do
the following:
6. Follow instructions until you reach the root password step, where you should
enter the root password of your choice (to be used for Linux OS administration).
7. Follow instructions until you reach the Add Users step, where you should create a user account (e.g., user seismic, whose home directory is used throughout this manual).
8. Follow instructions until installation ends, remove any disks, and restart the PC.
9. When the PC restarts and prompts you to select the OS, select Linux.
• More details on Redhat Linux installation can be found in the file Redhat-Linux-
2.2 SU Installation
• Follow these steps for installing SU under the Linux OS of your PC:
o Edit the “.bash_profile” file of the user to set up the SU environment, after saving a copy of the original file as “.bash_profile_original”.
o Re-login as user.
o Open Konqueror and download the current version (version 40 in this case) of SU from:
ftp://ftp.cwp.mines.edu/pub/cwpcodes/
o Save the downloaded files into the “/home/seismic/su” directory.
o Open a terminal emulator such as “Konsole”.
cd /home/seismic/su
cd /home/seismic/su/src
o Rename the file “Makefile.config” to “Makefile.config.original”.
o Place the new “Makefile.config” (included in this CD) in the directory “/home/seismic/su/src”.
o Compile the codes using the following command sequence (making sure that you
finish executing a command before you try the one after it!):
make install
make xtinstall
make finstall
make utils
make xminstall
make mglinstall
o To verify the installation, type the name of any SU program in a terminal, e.g.:
suascii
3. Tutorials
• In this section, we will process a real seismic line called data.sgy (found in this CD)
from raw SEGY shot gathers to a fully processed and stacked line.
cd /home/seismic/tutorials
4. Copy the seismic data file data.sgy into the directory “/home/seismic/tutorials”.
• You must do all the tutorials while you are in a terminal and within the directory
“/home/seismic/tutorials” because the input data file and all outputs will be saved in
this directory.
3.1.1 Objective
The objective of this tutorial is to get acquainted with the processing software and with the dataset.
3.1.2 Introduction
Preprocessing of the data includes the following steps:
(1) Demultiplexing.
(2) Reformatting.
(3) Editing.
The dataset has the following acquisition parameters:
• Sampling rate = 2 ms
• Traces/shot (record) = 33
3.1.4 Exercises
(2) The input data file data.sgy is in SEGY format and we want to reformat it to SU.
o Use the segyread command to convert the data from SEGY to SU format.
o Successful execution of this command should produce three files: data.su that contains the seismic traces, binary that contains the binary reel header, and header that contains the EBCDIC reel header.
o Use the suxwigb command to view the first 6 shot records of the data (Figure 1).
o The xwigb window shows the traces with trace numbers along the horizontal axis
o Kill the xwigb window by clicking anywhere in it and pressing the letter “q” on
the keyboard.
o Automatic Gain Control (AGC): Use the sugain command with its AGC option to gain the data using a sliding window.
o t2: Use the sugain command with its tpow option to gain the data using the t2 method.
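The two gain methods can be sketched conceptually in Python (an illustration of the underlying arithmetic on a made-up trace, not the sugain source code). AGC divides each sample by the RMS amplitude in a sliding window around it; the t2 method multiplies each sample by the square of its travel time:

```python
import math

dt = 0.002                       # sampling interval (2 ms, as in the dataset)
n = 200
# Synthetic trace whose amplitude decays with time
trace = [math.exp(-5 * i * dt) * math.cos(0.3 * i) for i in range(n)]

def agc(tr, half=25):
    """Divide each sample by the RMS amplitude in a window around it."""
    out = []
    for i in range(len(tr)):
        w = tr[max(0, i - half): i + half + 1]
        rms = math.sqrt(sum(s * s for s in w) / len(w))
        out.append(tr[i] / rms if rms > 0 else 0.0)
    return out

def t2_gain(tr):
    """Multiply each sample by t^2 to compensate amplitude decay with time."""
    return [tr[i] * (i * dt) ** 2 for i in range(len(tr))]

balanced = agc(trace)      # late arrivals boosted to the level of early ones
boosted = t2_gain(trace)   # deterministic time-power gain
```

After AGC the late part of the trace has roughly the same amplitude as the early part, which is what makes deep reflections visible in the display.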
3.2.1 Objective
The objective of this tutorial is to use frequency filtering to filter out the ground roll from the dataset.
3.2.2 Introduction
(1) Taking the Fourier Transform (FT) of the data and displaying the amplitude spectrum.
(2) Filtering out the noisy frequency band and transforming the data back to the time domain.
3.2.3 Exercises
(1) Take the FT of the data and save the amplitude spectrum using the suspecfx command.
o To zoom within the xwigb panel, left-click and drag on your zoom area (Figure
3B).
(2) Use the sufilter command to band-pass filter and save the data.
Figure 3A: Amplitude spectra of first 6 shot records, where the vertical axis indicates
frequency (Hz).
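The two steps above can be sketched numerically. In the following Python example (the synthetic trace and cutoff frequencies are made up; SU would do this with its spectral and filtering programs) we compute an amplitude spectrum, zero the low-frequency band, and transform back:

```python
import numpy as np

dt = 0.002                         # 2 ms sampling, so Nyquist = 250 Hz
n = 512
t = np.arange(n) * dt
# Synthetic trace: 30 Hz reflection energy plus strong 10 Hz "ground roll"
trace = np.sin(2 * np.pi * 30 * t) + 2.0 * np.sin(2 * np.pi * 10 * t)

freqs = np.fft.rfftfreq(n, dt)
spec = np.fft.rfft(trace)
amp = np.abs(spec)                 # the amplitude spectrum that is displayed

# Zero-phase band-pass: keep roughly 20-60 Hz, reject the low-frequency noise
keep = (freqs >= 20) & (freqs <= 60)
filtered = np.fft.irfft(spec * keep, n)

famp = np.abs(np.fft.rfft(filtered))  # spectrum after filtering
```

The amplitude spectrum is what tells you where the noise lives; the band-pass then removes the 10 Hz energy while leaving the 30 Hz reflections intact.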
3.3 Deconvolution
3.3.1 Objective
The objective of this tutorial is to deconvolve the dataset in order to spike the wavelet and attenuate reverberations and short-period multiples.
3.3.2 Introduction
These two processes can be done by selecting appropriate values for the deconvolution parameters. We will deconvolve the dataset using the following two-step sequence:
(1) Autocorrelation, which allows us to select the deconvolution type and related parameters.
(2) Deconvolution.
3.3.3 Exercises
(1) Autocorrelation:
o Use the suacor command to generate and save the trace autocorrelations.
(2) Deconvolution:
minlag=0.002 s, which sets the prediction lag parameter (α) for spiking deconvolution.
maxlag=0.2 s, which sets the operator length parameter (n) to the first transient zone.
(3) Gain:
o Balance the trace amplitudes of the deconvolved data using the sugain command.
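The autocorrelation-then-deconvolution sequence can be illustrated on a toy example. The sketch below (synthetic wavelet and reflectivity, a simplified stand-in for the SU programs) computes a one-sided autocorrelation and solves the Toeplitz normal equations for a spiking filter:

```python
import numpy as np

# Minimum-phase wavelet convolved with a sparse reflectivity series
wavelet = np.array([1.0, -0.5, 0.25, -0.125])
refl = np.zeros(50)
refl[5], refl[20], refl[35] = 1.0, -0.7, 0.4
trace = np.convolve(refl, wavelet)[:50]

# Step 1 - autocorrelation (what the autocorrelation display shows)
n = 10
acf = np.array([np.dot(trace[:len(trace) - k], trace[k:]) for k in range(n)])

# Step 2 - spiking deconvolution: solve the Toeplitz normal equations R f = e0,
# with ~1% white noise added to the zero lag for numerical stability
R = np.array([[acf[abs(i - j)] for j in range(n)] for i in range(n)])
R += 0.01 * acf[0] * np.eye(n)
rhs = np.zeros(n)
rhs[0] = 1.0
filt = np.linalg.solve(R, rhs)
decon = np.convolve(trace, filt)[:50]   # deconvolved trace

def ncorr(a, b):
    """Normalized correlation between two traces."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
```

The deconvolved trace matches the sparse reflectivity better than the raw trace does, which is the "spiking" effect the tutorial is after; the autocorrelation's first transient zone is what guides the choice of operator length.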
3.4.1 Objective
The objective of this tutorial is to determine the stacking velocities in the data
area.
3.4.2 Introduction
Velocity analysis is used to determine the stacking velocity function along the
seismic line. The stacking velocities are then used in various seismic processing and
interpretation stages. Our main objective for determining the stacking velocities is to use
them for NMO correction. The stacking velocities can be determined using the constant-
velocity stack (CVS) or velocity spectrum methods. In this tutorial, we will use the velocity spectrum method. Velocity analysis is performed on common depth point (CDP) gathers; therefore, we must sort the data from shot to CDP gathers before velocity
analysis.
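The velocity spectrum idea can be sketched numerically: NMO-correct a gather with a range of trial velocities and measure the stacked energy of each. In the Python sketch below (a synthetic single-reflection gather with made-up geometry, not the Velan script itself), the stack energy peaks at the true stacking velocity:

```python
import numpy as np

# Synthetic CDP gather: one reflection at t0 = 1.0 s, true velocity 8000 ft/s
dt, nt = 0.002, 1000
v_true, t0 = 8000.0, 1.0
offsets = np.arange(330, 3300, 330, dtype=float)   # offsets in ft
t = np.arange(nt) * dt

def wavelet(ts):
    return np.exp(-(ts / 0.02) ** 2)               # Gaussian pulse

# Reflection arrives at t(x) = sqrt(t0^2 + x^2 / v^2) on each trace
gather = np.array([wavelet(t - np.sqrt(t0**2 + (x / v_true)**2))
                   for x in offsets])

def stack_power(v):
    """NMO-correct the gather with trial velocity v and measure stack energy."""
    stack = np.zeros(nt)
    for x, tr in zip(offsets, gather):
        shifted_t = np.sqrt(t**2 + (x / v)**2)     # input time for each output time
        stack += np.interp(shifted_t, t, tr, left=0.0, right=0.0)
    return float(np.sum(stack**2))

trial_v = np.arange(5000, 12001, 250, dtype=float)
powers = [stack_power(v) for v in trial_v]
best = trial_v[int(np.argmax(powers))]             # peaks at the true velocity
```

The dark maxima on the displayed velocity spectrum are exactly such coherency peaks; picking them is what builds the stacking velocity function.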
3.4.3 Exercises
o Use Konqueror to copy the example shell script Velan from the directory
velpanel=/home/seismic/tutorials/data-tm-flt-dec-bal-cdp.su
velpicks=/home/seismic/tutorials/stkvel.p1
o Save the modified Velan script, exit KWrite, and open a terminal.
./Velan
o The velocity spectrum of CDP 225 will be displayed (Figure 9). Note that dark areas indicate high coherency (semblance) values, which correspond to probable stacking velocities.
o Make your picks by pointing the mouse to your selected pick position and typing “s” on the keyboard (remember that velocities for this dataset increase with time).
o Type “q” on the keyboard to end picking for this CDP.
o The velocity function for this CDP will be displayed for your approval.
o If you approve the velocity function, press Enter on the keyboard and the next selected CDP will be displayed.
o When you are done picking all the selected CDPs, your picks will be saved in the file stkvel.p1.
3.5.1 Objective
The objective of this tutorial is to NMO-correct the dataset and stack it into a stacked section.
3.5.2 Introduction
In this tutorial, we will use the sunmo and sustack commands to NMO-correct
the data and stack the traces in every CDP to produce the stacked section. The stacked
section gives an image of the subsurface in the T0-CDP domain. It is used for later processing and interpretation stages.
3.5.3 Exercises
o Use the following command to NMO-correct the dataset (note that we used the
velocity function that we got from the velocity analysis by copying it from the file
stkvel.p1):
cdp=225,230,235,240,245,250
tnmo=0.0783034,0.337684,0.924959,1.31158,1.64927,1.75204,1.96248,2.60848,2.8385
vnmo=5325.58,5813.95,7046.51,7790.7,8488.37,8976.74,9953.49,12000,12790.7
tnmo=0.0440457,0.303426,0.636215,1.25775,1.92333,2.30995,2.79445
vnmo=5255.81,5674.42,6418.6,7558.14,8069.77,9465.12,11558.1
tnmo=0.0880914,0.411093,0.734095,1.42414,1.9429,2.36378,2.71126,2.81892
vnmo=5232.56,5883.72,6232.56,7139.53,8720.93,10558.1,12116.3,12767.4
tnmo=0.0978793,0.601958,1.67374,1.93801,2.28548,2.53997,2.78467
vnmo=5348.84,6441.86,8000,8674.42,9860.47,11209.3,12372.1
tnmo=0.0636215,0.415987,0.817292,1.15008,1.58564,1.68842,1.9478,2.1925,2.45188,2.62806,2.87765
vnmo=5232.56,6186.05,7046.51,7511.63,8093.02,8279.07,8744.19,9395.35,10418.6,11186,12209.3
tnmo=0.0734095,0.293638,0.626427,0.969005,1.32137,1.69821,1.98206,2.32953,2.64274,2.83361
vnmo=5232.56,5744.19,6534.88,7232.56,8000,8651.16,9558.14,10744.2,11534.9,12930.2
o Use this command to view CDPs 231-240 after NMO correction (Figure 10):
o Display the NMO-corrected CDPs in T-X domain and take a look at them. If
your reflections are not horizontally aligned, try another velocity function until
the reflections are horizontally aligned for all CDPs and times (Hint: Our velocity
function is fine!).
(2) Stacking:
o Use the sustack command to stack the NMO-corrected CDP gathers and save the result in the file stack.su.
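The effect of the NMO-correct-then-stack sequence can be sketched on a synthetic gather (all numbers below are made up for illustration, not taken from the tutorial dataset): NMO correction maps each output time τ to the input time sqrt(τ² + x²/v²), which flattens the hyperbolic reflection, and stacking then sums the flattened traces:

```python
import numpy as np

dt, nt = 0.002, 800
t = np.arange(nt) * dt
t0, v = 0.8, 9000.0                       # reflection time (s) and velocity (ft/s)
offsets = np.arange(330, 3000, 330, dtype=float)

def ricker(ts, f=25.0):
    return (1 - 2 * (np.pi * f * ts) ** 2) * np.exp(-(np.pi * f * ts) ** 2)

# CDP gather with a hyperbolic reflection t(x) = sqrt(t0^2 + x^2 / v^2)
gather = np.array([ricker(t - np.sqrt(t0**2 + (x / v)**2)) for x in offsets])

# NMO correction: resample each trace so the event moves to tau = t0 everywhere
nmo = np.array([np.interp(np.sqrt(t**2 + (x / v)**2), t, tr, left=0.0, right=0.0)
                for x, tr in zip(offsets, gather)])

# Stack: sum the flattened traces and normalize by the fold
stack = nmo.sum(axis=0) / len(offsets)

peak = t[int(np.argmax(stack))]           # the stacked event sits at t0
```

With the correct velocity the event is horizontal after correction, so the stack reinforces it; with a wrong velocity the event stays curved and stacks destructively, which is exactly the test suggested in the exercise above.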
3.6.1 Objective
The objective of this tutorial is to apply field and residual static corrections to the
dataset.
3.6.2 Introduction
The field static correction accounts for variable source and receiver elevations and puts them all on a flat datum (reference elevation). The residual static correction accounts for the lateral thickness and velocity variations in the weathering layer.
3.6.3 Exercises
o The command sustatic is used to apply any type of static shifts to the traces. The field static correction requires the following information:
(i) Source and receiver elevations
(ii) Datum elevation
(iii) Weathering-layer thicknesses and velocities under the source and the receiver
(iv) Sub-weathering layer velocities under the source and the receiver
o Information (i) and (ii) are usually available for every survey and are available in our dataset.
o Information (iii) and (iv) have to be estimated from uphole or refraction data. The
theory and application of these methods is beyond the scope of this manual.
o We will not apply the field static correction to the data because the required velocities and thicknesses are not available for our dataset.
o The command suresstat is used to calculate the residual static shift for every source, receiver, and trace. It requires that the data be NMO-corrected and sorted into shot gathers.
o Use the susort command to sort the NMO-corrected data into shot gathers using the header keys:
fldr offset
o Use the suxwigb command to view the first 6 NMO-corrected shot records (Figure 12).
o Use the following command to calculate the residual static shift for every source
and receiver:
cfold=18
o Execution of the above suresstat command should produce two binary files containing the source and receiver residual statics.
o The command sustatic is used then to apply these residual static shifts to the
traces.
o Use the sustatic command to apply the source and receiver residual statics to the traces.
o Use the suxwigb command to view CDPs 231-240 after NMO correction and residual static correction (Figure 13).
o We can see that the NMO-corrected reflections within these CDP gathers are better aligned horizontally after residual static correction (compare with Figure 10).
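Residual statics estimation is essentially a time-shift search. The Python sketch below (a toy example with a made-up pilot trace; suresstat works with a stack-power criterion on real gathers) estimates a trace's static by cross-correlating it with a pilot trace and then shifting it back:

```python
import numpy as np

dt, nt = 0.002, 400
t = np.arange(nt) * dt
pilot = np.exp(-((t - 0.4) / 0.02) ** 2)          # reference (pilot) trace

# A trace with an unknown residual static shift of +8 ms (4 samples)
true_shift = 4
trace = np.roll(pilot, true_shift)

# Estimate the shift from the peak of the cross-correlation with the pilot
max_lag = 20
lags = list(range(-max_lag, max_lag + 1))
xc = [float(np.dot(np.roll(trace, -lag), pilot)) for lag in lags]
est = lags[int(np.argmax(xc))]                    # estimated static in samples

corrected = np.roll(trace, -est)                  # apply the static shift
```

After the estimated shift is removed, the trace lines up with the pilot again, which is why the reflections in Figure 13 look better aligned than in Figure 10.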
3.7 Migration
3.7.1 Objective
The objective of this tutorial is to migrate the stacked section.
3.7.2 Introduction
Migration is the process of collapsing diffractions and moving dipping reflectors to their true subsurface locations. There are several types and algorithms of migration. We will use the Stolt (FK) migration algorithm on our 2D stacked section.
3.7.3 Exercises
o Use the sustolt command to migrate the stacked time section (data-tm-flt-dec-bal-
dxcdp=110 smig=0.6
tmig=0.0734095,0.293638,0.626427,0.969005,1.32137,1.69821,1.98206,2.32953,2.64274,2.83361
vmig=5232.56,5744.19,6534.88,7232.56,8000,8651.16,9558.14,10744.2,11534.9,12930.2
o Use this command to view the stacked migrated section (Figure 14):
o We can see that migration did not change the data considerably because of the flat, gently-dipping nature of the reflections in this area.
Figure 14: Stacked section after Stolt migration. Horizontal axis indicates CDP numbers.
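Migration's effect is easiest to see on a point diffractor. The sketch below (a synthetic constant-velocity zero-offset section, migrated with a simple diffraction-summation scheme rather than SU's Stolt implementation) shows migration collapsing a diffraction hyperbola back to its apex; flat events, as in our section, are left essentially unchanged:

```python
import numpy as np

# Zero-offset section containing a single point diffractor; constant velocity
v = 8000.0                  # ft/s
dt, nt = 0.004, 500
dx, nx = 110.0, 64          # CDP spacing, as in dxcdp=110
t = np.arange(nt) * dt
xs = np.arange(nx) * dx
x0, t0 = xs[nx // 2], 1.0   # diffractor position and two-way time

# The diffractor appears as a hyperbola t(x) = sqrt(t0^2 + 4 (x - x0)^2 / v^2)
section = np.array(
    [np.exp(-((t - np.sqrt(t0**2 + 4 * (x - x0)**2 / v**2)) / 0.02) ** 2)
     for x in xs])

# Diffraction summation: for each image point, sum the data along its hyperbola
image = np.zeros((nx, nt))
for ix, xi in enumerate(xs):
    hyp = np.sqrt(t[None, :]**2 + 4 * (xs[:, None] - xi)**2 / v**2)
    for jx in range(nx):
        image[ix] += np.interp(hyp[jx], t, section[jx], left=0.0, right=0.0)

# The migrated energy focuses back at the diffractor location (x0, t0)
ix_max, it_max = np.unravel_index(int(np.argmax(image)), image.shape)
```

Because our stacked section contains flat reflections rather than steep dips or diffractions, this focusing step moves very little energy, which is why Figure 14 looks so similar to the unmigrated stack.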
4. Further Readings
More material on these tutorials and the associated courses can be found at:
http://faculty.kfupm.edu.sa/ES/ashuhail/GEOP320.htm.
5. CDROM Contents
1. This document.
Redhat Linux is not included in the CDROM due to its excessively large size, but it can be downloaded from:
https://www.redhat.com/apps/download/.
5. Tutorials directory that includes all files required for the tutorials explained in this manual.
6. Acknowledgements
I would like to thank KFUPM for supporting this work through the 2007 Summer
Special Assignment program. I also thank the Center for Wave Phenomena at Colorado
School of Mines and its sponsors for creating and maintaining the SU package.