Professional Documents
Culture Documents
June 1999
Copyright 1999 Landmark Graphics Corporation All Rights Reserved Worldwide This publication has been provided pursuant to an agreement containing restrictions on its use. The publication is also protected by Federal copyright law. No part of this publication may be copied or distributed, transmitted, transcribed, stored in a retrieval system, or translated into any human or computer language, in any form or by any means, electronic, magnetic, manual, or otherwise, or disclosed to third parties without the express written permission of: Landmark Graphics Corporation 15150 Memorial Drive, Houston, TX 77079, U.S.A. Phone: 281-560-1000 FAX: 281-560-1410
Trademark Notices Landmark, OpenWorks, SeisWorks, ZAP!, PetroWorks, and StratWorks are registered trademarks of Landmark Graphics Corporation. Pointing Dispatcher, Log Edit, Fast Track, SynTool, Contouring Assistant, TDQ, RAVE, 3DVI, SurfCube, SeisCube, VoxCube, Z-MAP Plus, ProMAX, ProMAX Prospector, ProMAX VSP, MicroMAX, and Landmark Geo-dataWorks are trademarks of Landmark Graphics Corporation. ORACLE is a registered trademark of Oracle Corporation. IBM is a registered trademark of International Business Machines, Inc. AIMS is a trademark of GX Technology. Motif, OSF, and OSF/Motif are trademarks of Open Software Corporation. UNIX is a registered trademark in the United States and other countries, licensed exclusively through X/Open Company, Ltd. SPARC and SPARCstation are registered trademarks of SPARC International. Solaris, Sun, and NFS are trademarks of SUN Microsystems. X Window System is a registered trademark of X/Open Company, Ltd. SGI is a trademark of Silicon Graphics Incorporated. All other brand or product names are trademarks or registered trademarks of their respective companies or organizations.
Note The information contained in this document is subject to change without notice and should not be construed as a commitment by Landmark Graphics Corporation. Landmark Graphics Corporation assumes no responsibility for any error that may appear in this manual. Some states or jurisdictions do not allow disclaimer of expressed or implied warranties in certain transactions; therefore, this statement may not apply to you.
Contents
Agenda .................................................................. i
Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . v
About The Manual . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . v How To Use The Manual . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . vi Conventions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . vii
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-1 ProMAX Menu Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-2 Getting Started . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-3 Building and Executing a Flow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-7
Topics to be covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-1 Trace Display . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-2 Create and Apply a Parameter Table . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-9
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-1 Parameter Test. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-2 IF/ENDIF Conditional Processing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-5 Interactive Spectral Analysis. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-8
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-1 VSP Real Dataset Geometry . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-2 Geometry Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-3
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-1 Display the Input Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-2 Write Dataset To Disk in Your Area . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-3
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-1 True Amplitude Recovery . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-2 Compute an RMS Velocity Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-3 Test TAR Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-7
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-1 Flatten the Downgoing with F-B Picks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-2 Flatten with F-B Picks and Event Alignment . . . . . . . . . . . . . . . . . . . . . . . . . . 12-4 Compare Flattening Iterations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-7 Wavefield Separation with Median Filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-9 F-K Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-13 Wavefield Separation with F-K Filter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-16 Wavefield Separation with Eigenvector (K-L) Filter . . . . . . . . . . . . . . . . . . 12-18 Wavefield Separation Comparison Test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-22 Save the Upgoing Energy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-24 Wavefield Separation to Keep Downgoing . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-26 Save the Downgoing Energy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-28 QC Plot of Separated Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12-30
VSP Migration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17-1
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17-1 VSP Migration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17-2 Display the VSP Migration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17-3
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-1 Assign VSP Geometry . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-2 Quality Control Plots from the database . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18-8
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-1 VSP Prevertical Stack Dataset Geometry . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19-2
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20-1 Plot the Traces . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20-2 VSP Level Statics Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20-5 Compute and Apply the Level Statics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20-7
Vertically Stack Shots by Common Header Entry . . . . . . . . . . . . . . . . . . . . . 20-10 Compare Stacks With and Without Level Statics . . . . . . . . . . . . . . . . . . . . . 20-11
Level Statics and Vertical Stack for Multi Component / Multi Level . . . . . . . . . . . . 22-1
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22-1 Plot the Traces . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22-2 Determine Level Statics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22-3 Vertically Stack Shots for Common Depth Levels . . . . . . . . . . . . . . . . . . . . . 22-7 Examine Headers for Common Header Entry . . . . . . . . . . . . . . . . . . . . . . . . . 22-9 Vertically Stack Shots by Common Header Entry . . . . . . . . . . . . . . . . . . . . . 22-10 Compare Stacks With and Without Level Statics . . . . . . . . . . . . . . . . . . . . . 22-11
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24-1 3 Component Hodogram Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24-2 Example Hodogram Analysis Plot . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24-4 Example of Hodogram Output Trace Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24-7
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25-1 Preparing the Input Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25-2
Archival Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-1
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-1 SEG-Y Output . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-2 Tape Data Output . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-4 UNIX tar. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-7 Archive to Tape . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26-8
Topics covered in this chapter: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27-1 Text Editors in ProMAX . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27-2 UNIX Commands . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27-5 Examples of UNIX Commands . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27-15
Agenda
Agenda - Day 1
Introductions, Course Agenda
ProMAX User Interface Overview
Trace Display Functionality
Keep Vertical Traces
First Break Picking
Velocity Function Generation
Velocity Function Manipulation
True Amplitude Recovery
Testing True Amplitude Recovery
Wavefield Separation Testing
Agenda - Day 2
Isolate the Upgoing Energy
After choosing the desired wavefield separation technique, we will isolate the upgoing energy.
Isolate the Downgoing Energy
After choosing the desired wavefield separation technique, we will isolate the downgoing energy.
Deconvolution
Preprocessing exercise for vertically stacking multiple shots at the same receiver locations
Look at Synthetic Data
Level Statics
Level Summing (vertical stack)
Agenda - Day 3
3-Component Transforms and First Break Picking
3-Component Hodogram Analysis
Dataset Preparation
VSP Modelling
Cross Well Tomography Demonstration
Archive Methods
Generation of CGM Plots
Preface
About The Manual
This manual is intended to accompany the instruction given during the standard ProMAX VSP User Training course. Because of the power and flexibility of ProMAX VSP, it is unreasonable to attempt to cover all possible features and applications in this manual. Instead, we try to provide key examples and descriptions, using exercises directed toward common uses of the system. The manual is designed to be flexible for both you and the trainer. Trainers can choose which topics to present, and in what order, to best meet your needs. You will find it easy to use the manual as a reference document for identifying a topic of interest and moving directly into the associated exercise or reference.
This format allows you to glance at the topic description to either quickly reference an implementation, or simply as a means of refreshing your memory on a previously covered topic. If you need more information, see the Exercise sections of each topic.
Conventions
Mouse Button Help
This manual does not refer to using mouse buttons unless they are specific to an operation; MB1 is used for most selections. The mouse buttons are numbered from left to right:

MB1 refers to an operation using the left mouse button.
MB2 is the middle mouse button.
MB3 is the right mouse button.

Actions that can be applied to any mouse button include:

Click: Briefly depress the mouse button.
Double Click: Quickly depress the mouse button twice.
Shift-Click: Hold the Shift key while depressing the mouse button.
Drag: Hold down the mouse button while moving the mouse.
Mouse buttons will not work properly if either Caps Lock or Num Lock is on.
Exercise Organization
Each exercise consists of a series of steps that build a flow, help with parameter selection, execute the flow, and analyze the results. Many of the steps give a detailed explanation of how to correctly pick parameters or use the functionality of interactive processes. The editing flow examples list key parameters for each process of the exercise. As you progress through the exercises, familiar parameters will not always be listed in the flow example. The exercises are organized such that your dataset is used throughout the training session. Carefully follow the instructor's directions when assigning geometry and checking the results of your flow. An improperly generated dataset or database may cause a subsequent exercise to fail.
Chapter 1
Directory Structure
/advance (or $PROMAX_HOME)

The directory structure begins at a subdirectory set by the $PROMAX_HOME environment variable. This variable defaults to /advance, which is used in all the following examples. Set $PROMAX_HOME to /my_disk/my_world/advance to have your Advance directory tree begin below the /my_disk/my_world subdirectory.
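As a sketch, the variable can be set before starting ProMAX. The /my_disk/my_world path is purely illustrative; substitute your own site's disk names.

```shell
# Illustrative only -- substitute your site's actual disk and path names.
# Bourne/Korn/bash syntax:
PROMAX_HOME=/my_disk/my_world/advance
export PROMAX_HOME

# C-shell users would instead run:  setenv PROMAX_HOME /my_disk/my_world/advance

echo "Advance tree rooted at: $PROMAX_HOME"
```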
/advance/sys

/advance/sys is actually a symbolic link to a subdirectory unique to a given hardware platform, such as:

/advance/rs6000 for IBM RS6000 workstations
/advance/sparc for Sun Microsystems SPARCstations running SunOS
/advance/solaris for Sun Microsystems SPARCstations and Cray 6400 workstations running Sun Solaris
/advance/sgimips for Silicon Graphics Indigo workstations using the 32-bit operating system
/advance/sgimips4 for Silicon Graphics Indigo and Power Challenge workstations using the 64-bit operating system

This link allows a single file server to contain executable programs and libraries for all machine types owned by a client. Machine-specific executables invoked from the UNIX command line are located in /advance/sys/bin. Operating-system-specific executables and libraries, called from ProMAX, are located under /advance/sys/exe. These machine-dependent directories are named after machine type, not manufacturer, to accommodate different architectures from the same vendor; supporting future hardware architectures simply involves adding new subdirectories. Unlike menus, help, and miscellaneous files, a single set of executables is capable of running all Advance products, provided the proper product-specific license identification number is in place.
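The platform link can be illustrated with standard UNIX commands. This is only a sketch: it is demonstrated under /tmp so it is safe to run, whereas a real installation would link /advance/sys itself.

```shell
# Sketch of how the platform link might be set up on a Solaris file server.
# /tmp/advance_demo stands in for /advance; the real path is site-specific.
rm -rf /tmp/advance_demo
mkdir -p /tmp/advance_demo/solaris        # platform-specific tree
ln -s solaris /tmp/advance_demo/sys       # sys -> solaris, as /advance/sys -> /advance/solaris
ls -l /tmp/advance_demo/sys               # shows the link and its target
```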
/advance
    /sys (link to the platform-specific directory)
        /exe     exec.exe  super_exec.exe  *.exe
        /bin     promax  promax3d  promaxvsp
        /lib     lib*.a
        /plot
    /port
        /help
            /promax      *.lok (Frame help)  *.help (ASCII help)
            /promax3d
            /promaxvsp
        /menu
            /promax      *.menu (Processes)
            /promax3d
            /promaxvsp
        /misc    *_stat_math  *.rgb-colormaps  ProMAX_defaults
    /etc     config_file  product  install.doc  pvmhosts  qconfig  license.dat
    /area
        /line
Third-party software distributed by Advance is distributed in a subdirectory of /advance/sys/exe named after the company, thus avoiding conflicts where two vendors use identical file names. For example, SDI's CGM Viewer software would be in /advance/sys/exe/sdi and Frame Technology's FrameViewer would be in /advance/sys/exe/frame.
/advance/port

Software that is portable across all platforms is grouped under a single subdirectory, /advance/port. This includes menus and Processes (/advance/port/menu), help files (/advance/port/help), and miscellaneous files (/advance/port/misc). Under the menu and help subdirectories are additional subdirectories for each ProMAX software product. For instance, under /advance/port/menu you will find subdirectories for ProMAX 2D (promax), ProMAX 3D (promax3d), and ProMAX VSP (promaxvsp). Menus for additional products are added as new subdirectories under /advance/port/menu.
/advance/etc Files unique to a particular machine are located in the /advance/etc subdirectory. Examples of such files are the config_file, which contains peripheral setup information for all products running on a particular machine, and the product file, which assigns unique pathnames for various products located on the machine.
/advance/scratch

The scratch area defaults to /advance/scratch. This location can be overridden with the environment variable PROMAX_SCRATCH_HOME. All ProMAX development tools are included within the following subdirectories: /advance/sys/lib, /advance/sys/obj, /advance/port/src, /advance/port/bin, /advance/port/include, and /advance/port/man.
/advance/data (or $PROMAX_DATA_HOME)

The primary data partition defaults to /advance/data, with new areas added as subdirectories beneath it. This default location is specified in /advance/etc/config_file using the entry:

    primary disk storage partition: /advance/data 20

This location can also be set with the environment variable $PROMAX_DATA_HOME.
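Putting the data and scratch overrides together, a site's login script might contain something like the following sketch, assuming the defaults described above (adjust both paths for your site):

```shell
# Sketch, assuming the default locations from the text; paths are site-specific.
PROMAX_DATA_HOME=/advance/data;       export PROMAX_DATA_HOME
PROMAX_SCRATCH_HOME=/advance/scratch; export PROMAX_SCRATCH_HOME

# The matching config_file entry would read:
#   primary disk storage partition: /advance/data 20
echo "data=$PROMAX_DATA_HOME scratch=$PROMAX_SCRATCH_HOME"
```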
/Line
    DescName
    17968042TVEL
    31790267TGAT
    36247238TMUT
    12345678CIND
    12345678CMAP
    /12345678
        HDR1  HDR2  TRC1  TRC2
    /Flow1
        DescName  TypeName  job.output  packet.job
    /OPF.SIN     (database subdirectory and a non-spanned file)
        OPF60_SIN.GEOMETRY.ELEV
    /OPF.SRF     (database subdirectory and a span file)
        #s0_OPF60_SRF.GEOMETRY.ELEV
Each region identifies a collection of files and directories which can be summarized as the Area, Line, Parameter Tables, Flow, Trace Headers, and Ordered Parameter Files database.
Program Execution
User Interface ($PROMAX_HOME/sys/bin/promax)

Interaction with ProMAX is handled through the User Interface. As you categorize your data into Areas and Lines, the User Interface automatically creates the necessary UNIX subdirectories and provides an easy means of traversing this data structure. However, the primary function of the User Interface is to create, modify, and execute processing flows. A flow is a sequence of processes that you perform on seismic data. Flows are built by selecting processes from a list and then selecting parameters for each process. A typical flow contains an input process, one or more data manipulation processes, and a display and/or output process. All information needed to execute a flow is held within a Packet File (packet.job) within each Flow subdirectory. This Packet File provides the primary means of communication between the User Interface and the Super Executive program (see the next section, Super Executive Program). In addition, the User Interface provides utility functions for copying, deleting, and archiving Areas, Lines, Flows, and seismic datasets; accessing and manipulating ordered database files and parameter tables; displaying processing histories for your flows; and providing information about currently running jobs.
Super Executive Program (super_exec.exe)

Execution of a flow is handled by the Super Executive, which is launched as a separate task by the User Interface. The Super Executive is a high-level driver program that reads packet.job, examines the processes in your flow, and determines which executables to use. The majority of the processes are subroutines linked together to form the Executive. Since this is the processing kernel for ProMAX, many of your processing flows, although they contain several processes, are handled by a single execution of the Executive. Several of the processes
are stand-alone programs. These processes cannot operate under the control of the Executive, and handle their own data input and output by directly accessing external datasets. In these instances, the Super Executive is responsible for invoking the stand-alone programs and, if necessary, making multiple calls to the Executive in the proper sequence. The Packet File, packet.job, defines the processes and their type for execution. The Super Executive concerns itself with only two types of processes:

Executive processes
Stand-alone processes
Executive processes are actually subroutines operating in a pipeline, meaning they accept input data and write output data at the driver level. However, stand-alone processes cannot be executed within a pipeline, but rather must obtain input and/or produce output by directly accessing external datasets. The Super Executive sequentially gathers all Executive-type processes until a stand-alone is encountered. At that point, the Packet File information for the Executive processes is passed to the Executive routine (exec.exe) for processing. Once this is completed, the Super Executive invokes the stand-alone program for processing, and then another group of Executive processes, or another stand-alone process. This continues until all processes in the flow have been completed.
Executive Program (exec.exe) The Executive program is the primary processing executable for ProMAX. The majority of the processes available under ProMAX are contained in this one executable program. The Executive features a pipeline architecture that allows multiple seismic processes to operate on the data before it is displayed or written to a dataset. Special processes, known as input and output tools, handle the tasks of reading and writing the seismic data, removing this burdensome task from the individual processes. This results in processes that are easier to develop and maintain.
The basic flow of data through the Executive pipeline is shown below:
Each individual process will not operate until it has accumulated the necessary traces. Single-trace processes run on each trace as the traces come down the pipe. Multichannel processes wait until an entire ensemble is available. For example, in the example flow, the F-K
filter will not run until one ensemble of traces has passed through the DDI and AGC. If we specify Trace Display to display two ensembles, it will not make a display until two shots have been processed through the DDI, AGC, and F-K filter. No additional traces will be processed until Trace Display is instructed to release the traces it has displayed and is holding in memory, either by clicking on the traffic light icon or by terminating its execution (while continuing the flow). Note: all the processes shown are Executive processes and thus operate in the pipeline. An intermediate dataset and an additional input tool process would be needed if a stand-alone process were included in this flow. A pipeline process must accept seismic traces from the Executive, process them, and return the processed data to the Executive.
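The buffering behavior described above is loosely analogous to a UNIX shell pipeline, where a streaming stage emits output trace by trace but a sorting stage must buffer the entire stream first. The commands below are generic stand-ins, not ProMAX tools:

```shell
# Loose analogy to the Executive pipeline, using ordinary UNIX commands:
#   seq      -> "input tool": produces a stream of 5 "traces" (numbers)
#   awk      -> "single-trace process": transforms each trace as it streams past
#   sort -rn -> "ensemble process": must buffer the whole stream before emitting
seq 1 5 | awk '{print $1 * 2}' | sort -rn
```

Just as with the Executive, nothing emerges from the final stage until the buffering stage has accumulated everything it needs.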
[Figure: the Executive pipeline. An example pipe runs Disk Data Input -> CDP Stack -> Bandpass Filter -> Disk Data Output. Disk Data Input, Tape Data Input, and stand-alone tools always start new pipes within a single flow; one pipe must complete successfully before a new pipe will start processing.]
Types of Executive Processes

The table below describes the four types of processes defined for use in the Executive.

Table 1: ProMAX Executive Process Types

    simple tools      Accepts and returns a single seismic trace.
    ensemble tools    Accepts and returns a gather of seismic traces.
    complex tools     Accepts and returns a variable number of seismic traces, such as stack. This type of process actually controls the flow of seismic data.
    panel tools       Accepts and returns overlapping panels of traces to accommodate a group of traces too large to fit into memory. Overlapping panels are processed and then merged along their seams.
This section discusses the following issues relating to the Ordered Parameter Files database: Organization Database Structure File Naming Conventions
The Ordered Parameter Files database serves as a central repository of information that you or the various tools can rapidly access. Collectively, the ordered database files store large classes of data, including acquisition parameters, geometry, statics and other surface consistent information, and pointers between the source, receiver and CDP domains. The design of the Orders is tailored for seismic data, and provides a compact format without duplication of information. The Ordered Parameter Files database is primarily used to obtain a list of traces to process, such as traces for a shot or CDP. This list of traces is then used to locate the index to actual trace data and headers in the MAP file of the dataset. Once determined, the index is used to extract the trace and trace header data from their files.
Organization
The Ordered Parameter Files contain information applying to a line and its datasets. For this reason, there can be many datasets for a single set of Ordered Database Files. Ordered Parameter Files, unique to a line, reside in the Area/Line subdirectory. The Ordered Parameter Files database stores information in structured categories, known as Orders, representing unique sets of information. In each Order, there are N slots available for storage of information, where N is the number of elements in the order, such as the number of sources, number of surface locations, or number of CDPs. Each slot contains various attributes in various formats for one
particular element of the Order. The Orders are organized as shown in the table below.

Table 2: Organization of Ordered Parameter Files

    LIN (Line)        Contains constant line information, such as final datum, type of units, source type, and total number of shots.
    TRC (Trace)       Contains information varying by trace, such as FB picks, trim statics, and source-receiver offsets.
    SRF (Surface)     Contains information varying by surface receiver location, such as surface location x,y coordinates, elevations, statics, number of traces received at each surface location, and receiver fold.
    SIN (Source)      Contains information varying by source point, such as source x,y coordinates, elevations, uphole times, nearest surface location to source, and source statics.
    CDP               Contains information varying by CDP location, such as CDP x,y coordinates, elevation, fold, and nearest surface location.
    CHN (Channel)     Contains information varying by channel number, such as channel gain constants and channel statics.
    OFB (Offset Bin)  Contains information varying by offset bin number, such as surface consistent amplitude analysis. OFB is created when certain processes are run, such as surface consistent amplitude analysis.
    PAT (Pattern)     Contains information describing the recording patterns.
    XLN (Crossline)   Contains crossline information.
OPF Matrices

The OPF database files can be considered to be matrices. Each OPF is indexed against the OPF counter, and there are various single numbers per index. Note the relative size of the TRC OPF compared to the other OPF files: the TRC is by far the largest contributor to the size of the database on disk.
Database Structure

The ProMAX database was restructured for the 6.0 release to handle large 3D land and marine surveys. The features of the new database structure are listed below:

Each Order is contained within a subdirectory under Area and Line. For example, the TRC is in the subdirectory OPF.TRC.

There are two types of files contained in the OPF subdirectories:

Parameter: Contains attribute values. There may be any number of attribute files associated with an OPF.
Index: Holds the list of parameters and their formats. There is only one index file in each OPF subdirectory. The exception to this is the LIN OPF; the LIN information is managed by just two files, one index and one parameter, named LIN.NDX and LIN.REC.
OPF files are of two types:

Span: These files are denoted by the prefix #s; non-span files lack this prefix. The TRC, CDP, SIN, and SRF OPF parameters are span files. The first span for each parameter is always written to primary storage. New span files are created in the secondary storage partitions listed in the config_file, as denoted with the OPF keyword, and span files may be moved to any disk partition within the secondary storage list for read purposes. All subsequent spans are written to the OPF-denoted secondary storage partitions in a round-robin fashion until the secondary storage is full; then, subsequent spans are created in primary storage. Span file size is currently fixed at 10 megabytes, or approximately 2.5 million 4-byte values per span file.

Non-span: All other OPFs are non-span.
Given the fact that each parameter is managed by a file, it may be necessary to increase the maximum number of open files on some systems, specifically SUN, Solaris, and SGI. From the csh, the following command increases the limit to 255 open files:

    limit descriptors 255

The geometry spreadsheet is a ProMAX database editor. Modifying information within a spreadsheet editor and saving the changes will automatically update the database.
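For users of Bourne-family shells rather than csh, a rough equivalent (assuming your shell provides the standard ulimit built-in) is sketched below:

```shell
# csh:           limit descriptors 255
# sh/ksh/bash:   adjust the soft limit on open file descriptors instead
ulimit -n 255 2>/dev/null || echo "could not change limit (check the hard limit)"
ulimit -n    # report the limit now in effect
```

Note that a process can only raise its soft limit up to the hard limit set by the system administrator.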
There is no longer an import or export from the geometry database to the ProMAX database files, as was required prior to the 6.0 release.

Database append is allowed. Data can be added to the database via the OPF Extract tool or the geometry spreadsheet. This allows the database to be constructed incrementally as the data arrives.

There is improved network access to the database. Database I/O across the network is optimized to an NFS default packet size of 4K; all database reads and writes are in 4K pages.

Existing and restored 5.X databases are automatically converted to the 6.0 (and later) database format.
File Naming Conventions

Parameter file names consist of information type and parameter name, preceded by a prefix denoting the Order of the parameter. For example, the x coordinate for a shot in the SIN has the following name: #s0_OPF60_SIN.GEOMETRY.X_COORD, where #s0_OPF60 indicates a first span file for the parameter, _SIN denotes the Order, GEOMETRY describes the information type of the parameter, and X_COORD is the parameter name. Index file names contain the three-letter Order name. For example, the index file for the TRC is called OPF60_TRC.
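The naming convention above can be checked mechanically; this sketch decomposes the example file name from the text using standard UNIX tools:

```shell
# Decompose the example parameter file name from the text.
name='#s0_OPF60_SIN.GEOMETRY.X_COORD'
#   #s0      -> first span file         _SIN    -> Order
#   GEOMETRY -> information type        X_COORD -> parameter name
echo "$name" | awk -F'.' '{print "info type = " $2 ", parameter = " $3}'
```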
NOTE:
The index file for each Order must remain in the primary storage partition. Span parameter files may be moved and distributed anywhere within primary and secondary storage.
Within each Order, there are often multiple attributes, with each attribute being given a unique name.
Parameter Tables
Parameter Tables are files used to store lists of information in a very generalized structure. To increase access speed and reduce storage requirements, parameter tables are stored in binary format. They are stored in the Area/Line subdirectory along with seismic datasets, the Ordered Parameter Files database files (those not in separate directories), and Flow subdirectories. Parameter tables are often referred to as part of the database, but they differ from the OPF database: an OPF attribute stores exactly one number per database member, while a parameter table can store several numbers per member. For example, a velocity function contains multiple velocity-time pairs at a single CDP.
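The one-value-per-member versus many-values-per-member distinction can be pictured with a small sketch. The data values below are invented for illustration; they are not ProMAX structures.

```python
# Illustrative contrast (hypothetical data, not ProMAX structures):
# an OPF attribute holds one value per database member, while a
# parameter table can hold a list of values per member.

# OPF-style: exactly one elevation per source index (SIN).
sin_elevation = {1: 351.2, 2: 350.8, 3: 349.9}

# Parameter-table-style: a stacking-velocity function is several
# (time, velocity) pairs at each analysis CDP.
vel_table = {
    101: [(0, 1500.0), (800, 1850.0), (2200, 2400.0)],
    201: [(0, 1510.0), (900, 1900.0), (2300, 2450.0)],
}

# One number per SIN, versus several time-velocity pairs per CDP:
pairs_at_101 = len(vel_table[101])  # 3 pairs at CDP 101
```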
Creating a Parameter Table

Parameter tables are typically created in three ways:

1. Processes store parameters to a table for later use by other processes.
2. Parameter tables can be imported from ASCII files that were created by other software packages or hand-edited by you.
3. Parameter tables can be created by hand using the Parameter Table Editor, which is opened by the Create option on the parameter table list screen.
An example is the interactive picking of time gates within the Trace Display process. After seismic data is displayed on the screen, you pull down the Picking Menu and choose the type of table to create. The end result of your work is a parameter table. If you were to pick a top mute, you would generate a parameter table ending in TMUT. If you were picking a time horizon, you would generate a table ending in THOR. These picks are stored in tabular format, where they can be edited, used by other processes in later processing, or exported to ASCII files for use by other software packages. Remember, you name and store the parameter tables in their specific Area/Line subdirectory. Therefore, you can inadvertently overwrite an existing parameter table by editing a parameter table in a different processing flow.
Exercise
1. In a flow-building window, add the Access Parameter Tables process to a flow and view the parameter menu with MB2. Find the line VEL: RMS (stacking) velocity and click on Invalid. The list of parameter tables for RMS velocity appears.

2. Click on Edit and select the name of the file to export. A Parameter Table spreadsheet appears with CDP, TIME, and SEMB_VEL columns.

3. Click on File and select Export. An ASCII File Export window appears with export information for quality control before actually creating the ASCII file.

4. Click on File. A new window appears with the path to your working directory.

5. Enter a filename after the last / and click OK. The window disappears and a dashed line appears in the ASCII File Export window.

6. Click on Format. An Export Definition Selection window appears.

7. Type in a selection name and click on OK. The Column Export Definition window appears.

8. Fill the Column Export Definition with starting and ending column numbers, then click on Save. When you fill in the start and end columns for a particular column definition, the contents of the column appear in the ASCII File
Export window. Be sure the column definitions are wide enough to accommodate all the significant figures, as well as complete column titles. If they are not, edit the Column Export Definition window until the information is correct.

9. When the ASCII File Export window is correct, click on Apply. An Apply Export window appears. You may choose to overwrite or append new information to the ASCII file. You may also add a single-line description of your work that will be internal to the file.

10. Click on OK. This creates the ASCII file in the directory you specified. You may now Quit the Column Export Definition window, Cancel the ASCII File Export window, pull down the File menu in the Parameter Table window to exit that window, and continue working.
Exercise
1. In a flow-building window, add the Access Parameter Tables process and view the parameter menu with MB2. Find the line VEL: RMS (stacking) velocity and click on Invalid. The list of parameter tables for RMS velocity appears.

2. Click on Create. The cursor moves to the top of the table name column; enter a new velocity file name. After typing a name, press Return. A Parameter Table spreadsheet appears with CDP, TIME, and VEL_RMS columns.

3. Click on File and choose Import. Two new windows appear: ASCII/EBCDIC File Import and File Import Selection. In the File Import Selection window, choose the path to the file containing the velocity information to import and click on OK. The import information appears in the ASCII/EBCDIC File Import window.
4. Click on Format in the ASCII/EBCDIC File Import window. The Import Definition Selection window appears.

5. Type in a selection name and click on OK. The Column Import Definition window appears.

6. Blank the rows that will not be imported into the new velocity file. To blank the rows, click MB1 in the first row to ignore, click MB2 in the last row to ignore, and press Ctrl-d; the rows to ignore are labeled Ignore Record for Import.

7. Fill in the Column Import Definition window. Begin by choosing a definition parameter by clicking on the parameter name; the parameter box is highlighted in white. Next, move the cursor into the ASCII/EBCDIC File Import window, to the values defining the definition parameter. Hold down MB1 as you drag it from left to right across the import parameter values. The chosen columns should highlight in black in the ASCII/EBCDIC File Import window, and the Start Col and End Col boxes in the Column Import Definition window should contain the appropriate column numbers. Repeat this process with the other two parameters and save the definition.

8. When the Column Import Definition window is correct, click on Apply in the ASCII/EBCDIC File Import window. The Apply Import window appears. You may choose to overwrite or append new information to the spreadsheet.

9. Click OK. This fills in the spreadsheet with the selected numbers, and the Import windows disappear from the screen. You may now continue working and apply these velocities to your data.
Disk Datasets
ProMAX uses a proprietary disk dataset format that is tailored for interactive processing and random disk access. Disk dataset files can span multiple filesystems, allowing datasets of effectively unlimited size. A typical set of files might look like this:

/advance/data/usertutorials/landexample/12345678CIND
/advance/data/usertutorials/landexample/12345678CMAP
/advance/data/usertutorials/landexample/12345678/TRC1
/advance/data/usertutorials/landexample/12345678/HDR1

These files are described in more detail below. Table 4: Composition of a Seismic Dataset
Trace (...TRCx): File containing the actual sample values for the data traces.

Trace Header (...HDRx): File containing the trace header entries corresponding to the data samples for the traces in the trace file. This file may vary in length, growing as new header entries are added. Trace headers are kept in a separate file so that they can be sorted without needing to skip past the seismic data samples.

Map (...CMAP): File that keeps track of trace locations. Given a particular trace number, it finds the sequential trace number within the dataset, which allows rapid access to traces during processing. The map file is a separate file, as it may grow during processing.

Index (...CIND): File with free-form format information relating to the entire dataset, including the sample interval, number of samples per trace, processing history, and names of trace header entries. This file may grow during processing.
[Figure: relationship of the CIND, CMAP, TRCx, and HDRx files that make up a disk dataset]
Secondary Storage

In a default ProMAX configuration, all seismic dataset files reside on a single disk partition. The location of this disk partition is set in $PROMAX_HOME/etc/config_file with the entry:

primary disk storage partition: /advance/promax/data 20

In addition to the actual trace data files, the primary storage partition always contains your flow subdirectories, parameter tables, ordered parameter files, and various miscellaneous files. The ...CIND and ...CMAP files, which are an integral part of any seismic dataset, are always written to primary storage. Since the primary storage filesystem is of finite size, ProMAX provides the capability to have some of the disk dataset files, such as the ...TRCx and ...HDRx files, and some of the ordered parameter files span multiple disk partitions. Disk partitions other than the primary disk storage partition are referred to as secondary storage. All secondary storage disk partitions must be declared in the appropriate $PROMAX_HOME/etc/config_file. Sample entries are:
Landmark
1-25
secondary disk storage partition: /advance/promax/data2 20 TRC OPF secondary disk storage partition: /advance/promax/data3 20 TRC secondary disk storage partition: /advance/promax/data4 20 OPF secondary disk storage partition: /advance/promax/data5 20 Refer to the ProMAX System Administration guide for a complete description of the config_file entries for primary and secondary disk storage. Under the default configuration, the initial TRC1 and HDR1 files are written to the primary storage partition. It is possible to override this behavior by setting the appropriate parameter in Disk Data Output. If the parameter Skip primary disk partition? is set to Yes, then no TRC or HDR files will be written to the primary disk partition. This can be useful as a means of maintaining space on the primary storage partition. (To make this the default situation for all users, have your ProMAX system administrator edit the diskwrite.menu file, setting the value for Alstore to t instead of nil). A typical set of data files might look like this: /advance/data/usertutorials/landexample/12345678CIND /advance/data/usertutorials/landexample/12345678CMAP /advance/data/usertutorials/landexample/12345678/TRC1 /advance/data/usertutorials/landexample/12345678/HDR1 /advance/data/usertutorials/landexample/12345678/TRC4 /advance/data/usertutorials/landexample/12345678/HDR4 /advance/data/usertutorials/landexample/12345678/TRC7 /advance/data/usertutorials/landexample/12345678/HDR7 /advance/data2/usertutorials/landexample/12345678/TRC2 /advance/data2/usertutorials/landexample/12345678/HDR2 /advance/data2/usertutorials/landexample/12345678/TRC5 /advance/data2/usertutorials/landexample/12345678/HDR5 /advance/data2/usertutorials/landexample/12345678/TRC8 /advance/data2/usertutorials/landexample/12345678/HDR8 /advance/data3/usertutorials/landexample/12345678/TRC3 /advance/data3/usertutorials/landexample/12345678/HDR3 /advance/data3/usertutorials/landexample/12345678/TRC6 
/advance/data3/usertutorials/landexample/12345678/HDR6
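The numbering in the listing above is consistent with a simple round-robin cycle over the available partitions. The sketch below is an assumed illustration of that pattern, not ProMAX code.

```python
# Sketch (assumed, for illustration) of how the numbered TRC/HDR files
# in the listing above cycle through the storage partitions: TRC1/HDR1
# on the first partition, TRC2/HDR2 on the second, and so on, wrapping
# around when the partition list is exhausted.

def partition_for(file_number, partitions):
    """Map TRC<n>/HDR<n> (1-based n) onto a partition, round-robin."""
    return partitions[(file_number - 1) % len(partitions)]

partitions = ["/advance/data", "/advance/data2", "/advance/data3"]
layout = {n: partition_for(n, partitions) for n in range(1, 9)}
# TRC1, TRC4, TRC7 land on /advance/data; TRC2, TRC5, TRC8 on
# /advance/data2; TRC3, TRC6 on /advance/data3, as in the listing.
```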
Secondary storage is used on an as-listed, as-available basis. To minimize the risk of data loss due to disk hardware failure, ProMAX tries to write a dataset to as few physical disks as possible. If the primary storage partition is skipped by setting the appropriate parameter in Disk Data Output, the CIND and CMAP files are still written to the primary storage partition, but no TRCx or HDRx files will be found there.
Tape Datasets
Tape datasets are stored in a proprietary format, similar to the disk dataset format, but incorporating required structures for tape input and output. Tape input/output operates either in conjunction with a tape catalog system, or without reference to the tape catalog. The tape devices used for the Tape Data Input, Tape Data Insert, and Tape Data Output processes are declared in the ProMAX device configuration window. This allows access to tape drives anywhere on a network. The machines that the tape drives are attached to do not need to be licensed for ProMAX, but the fclient.exe program must be installed.
Tape Trace Datasets A ProMAX tape dataset is similar to a disk dataset in that the index file (...CIND) and map file (...CMAP) still reside on disk in the Line/survey database. Refer to the documentation in the Disk Datasets portion of this helpfile for a discussion of these files. Having the index and map files available on disk gives you immediate access to information about the dataset, without needing to access any tapes. It also provides all the information necessary to access traces in a non-sequential manner. Although the index and map files still reside on disk, copies of them are also placed on tape, so that the tapes can serve as self-contained units. If the index and map files are removed from disk, or never existed, as in the case where a dataset is shipped to another site, the tapes can be read without them. However, access to datasets through the index and map files residing solely on tape must be purely sequential. Tape datasets are written by the Tape Data Output process, and can be read using the Tape Data Input or Tape Data Insert processes. These input processes include the capability to input tapes by reel, ensemble number, or trace number. Refer to the relevant helpfile for a complete description of the parameters used in these processes. The use or non-use of the tape catalog in conjunction with the tape I/O processes is determined by the tape catalog type entry in the appropriate $PROMAX_HOME/etc/config_file. Setting this variable to full activates catalog access, while an entry of none deactivates catalog access. An entry of external is used to indicate that an external tape catalog, such as the Cray Reel Librarian, will be used. You can override the setting provided in the config_file by setting the environment
variable BYPASS_CATALOG to t, in which case the catalog will not be used. The actual tape devices to use for tape I/O must also appear as entries in the config_file, under the tape device: stanza.
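The catalog-selection rule just described can be summarized in a short sketch. This is a hypothetical restatement of the rule, not ProMAX source: the config_file's tape catalog type entry chooses the catalog, and the BYPASS_CATALOG environment variable overrides it.

```python
import os

# Hypothetical summary (not ProMAX source) of the catalog-selection
# rule described above: the config_file "tape catalog type" entry
# (full, none, or external) chooses the catalog, and the environment
# variable BYPASS_CATALOG=t disables catalog use regardless.

def catalog_mode(config_value, env=os.environ):
    if env.get("BYPASS_CATALOG") == "t":
        return "none"  # environment override: catalog is not used
    if config_value in ("full", "none", "external"):
        return config_value
    raise ValueError("unknown tape catalog type: %r" % config_value)
```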
Getting Started The first step in using the Advance tape catalog is to create some labeled tapes. The program $PROMAX_HOME/sys/bin/tcat is used for tape labelling, catalog creation and maintenance, and listing current catalog information. The program is run from the UNIX command line. The following steps are required to successfully access the tape catalog:

1. Label tapes.
2. Read and display tape labels.
3. Add the labeled tapes to a totally new catalog.

Before adding the tapes to a new catalog, it is a good idea to visually inspect the contents of the label information file for duplicate or missing entries. The contents typically look like:

0 AAAAAA 0 1 4
1 AAAAAB 0 1 4
2 AAAAAC 0 1 4
3 AAAAAD 0 1 4
4 AAAAAE 0 1 4

The fields are: volume serial number (digital form), volume serial number (character form), tape rack slot number, site number, and media type, respectively. You can edit these fields manually.

4. Write a label information file from the existing catalog.
5. Add labeled tapes (and datasets) to the existing catalog.
6. Merge an additional catalog into the existing catalog.
7. Delete a dataset from the catalog.
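A reader for the label information file is easy to sketch from the field list above. The helper below is hypothetical, intended only to make the five whitespace-separated fields concrete.

```python
# Hypothetical reader (for illustration) of the whitespace-separated
# label information file shown above. Each line carries five fields:
# volume serial number (digital form), volume serial number (character
# form), tape rack slot number, site number, and media type.

def parse_label_info(text):
    entries = []
    for line in text.strip().splitlines():
        vsn_digital, vsn_char, slot, site, media = line.split()
        entries.append({
            "vsn_digital": int(vsn_digital),
            "vsn_char": vsn_char,
            "rack_slot": int(slot),
            "site": int(site),
            "media_type": int(media),
        })
    return entries

sample = """\
0 AAAAAA 0 1 4
1 AAAAAB 0 1 4
2 AAAAAC 0 1 4
"""
labels = parse_label_info(sample)
```

Inspecting the parsed entries (for example, checking for duplicate or missing volume serial numbers) corresponds to the visual inspection recommended above.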
Chapter 2
Getting Started
ProMAX is built upon a three level organizational model referred to as Area/Line/Flow. When entering ProMAX for the first time, you will build your own Area/Line/Flow workspace. As you add your own Area, you may want to name it with reference to a geographic area that indicates where the data were collected, such as Onshore Texas, or use your name, such as daves area. Line is a subdirectory of Area which contains a list of 2D lines from an area or a 3D survey name. After choosing a line from the Line menu or adding a new line, the Flow window will appear. Name your flows according to the processing taking place, such as brute stack. Look at the Menu Map figure on the previous page. This figure refers to other menus you can use to access your datasets, database entries and parameter tables. These features will be discussed later.
Exercise
In this exercise, you will build a workspace and look at some of the available options. Initiating a ProMAX session can be done in a variety of ways. Typically your system administrator will create a start-up script or a UNIX alias, and set certain variables within your shell start-up script to make this easy. This topic is discussed in the system overview chapter.

1. Type promax. A product name window should pop up, followed by the Area window. The window, as shown below, displays a list of all available Areas. Other information is listed as well, such as the owner, date, and UNIX name.
[Figure: Area menu, with its global commands and configuration options]
The black horizontal band below the menu is called the mouse button helps. Mouse button helps describe the possible actions at the current location of the cursor. Below the mouse button helps are options to exit ProMAX, configure the queues and user interface, and check on the status of jobs. These options will be discussed at length later. The list of options running across the top of this menu (Select, Add, Delete, Rename, and Permission) are called global options. To use these, you must first click on the option and then click on the line on your screen with your Area name. The Copy option works differently, by providing popup menus to choose Areas not displayed in this window.
2. Click on Add in the Area Menu with MB1. At this point you are building your workspace. Adding an Area creates a UNIX directory.

3. Before moving the mouse, enter an Area name. You can choose any area name.

4. Press Return, or move the mouse, to register your selection. The Line Menu appears, with the same global options to choose from as the Area Menu. (Whether pressing Return or moving the mouse registers a selection depends on whether the Popups remain after mouse leaves option is toggled on or off. This option is listed under the Configuration Options.)
[Figure: Line menu, with its global commands, configuration options, processing queues window, Exit ProMAX, and job notification and control options]
5. Add a Line using the same steps as you did for adding an Area. The Flow window appears with the following new global options:

Datasets: Lists all of your datasets for that particular line.
Database: Allows you to view your Ordered Parameter Files.
Product: Changes from ProMAX 2D to ProMAX 3D or VSP.
[Figure: Flows menu, with global commands for the available flows, the active command, and access to datasets, products, and the database]
Exercise
Upon completion of the previous exercise, you are in the ProMAX flow building menu (see below). From here, you will construct your flows by ordering processes and selecting the necessary parameter information. Once the flow is ready, you will execute it and look at the results.

1. Look at the flow building menu.
The screen is split into two sides: a list of processes on the right, and a blank tablet below the global options on the left. You select from the processes on the right and add them to the left. The list of available processes is very long; it is ordered from top to bottom into a general processing sequence, with I/O processes at the top and poststack migration tools further down the list. There is a scroll bar to help you look at the list. There are also options available to hide processes in the secondary or More list (use the mouse button helps). You can customize the list so that only the processes you use most often are displayed.

2. Move your cursor into different areas of the display, such as the processes list, the blank tablet, and the various options. The mouse button helps are sensitive to the current cursor location.

3. The global options for flow editing are as follows.

Add: This is the default. When highlighted in blue, a process can be selected from either the list of processes or a text search menu.

Delete: When selected with MB1, the highlighted process is removed from the flow. The deleted process is actually stored in a buffer, which can be accessed by selecting Delete with MB3. Selecting Delete with MB2 appends a newly deleted process to the existing delete buffer. MB3 is also used to paste the contents of this buffer into the current flow. The buffer is maintained even after exiting a flow menu, so its contents may be pasted into another flow.

Execute: When selected, the job is executed. There are two methods available to execute a flow using the Trace Display process: MB1 and MB2 will Execute suppressing pause for display. These options allow the display to immediately take over the monitor when the job has finished running. MB3 indicates Execute via Queue. This option enables the use of the two types of queues. When using MB3, a new menu pops up allowing the use of either the general batch queues or the
small job batch queues. In order for this option to work, your system administrator should have enabled the queues when ProMAX VSP was installed. Note: When using Screen Display, the mouse button helps are correct, and MB1 will Execute With Normal Wait on display. When this option is used, the Notification window first shows that the job has started and is then waiting for display. By clicking on the Notification window, a new Processing Jobs window appears, where it waits for your response. Clicking on Wait for Display prompts the display to come to the foreground of the monitor. This option is useful if you want to work on something else and do not want to be interrupted by the display taking over the monitor.

View: Accesses the view (job.output) file. This file includes important job information such as error statements.

Exit: Brings you back to the menu listing of all your flows.
4. Move your cursor into the Data Input/Output portion of the processes list, and select the process SEG-Y Input with MB1. You have just added your first process to a flow.

5. Move your cursor back into the processes list (but not onto a category heading), type trace d, and press Return. This acts as a text search. Click on Trace Display to add it to the flow.

6. Parameterize the flow.

Editing Flow: 00 - Display data
Add  Delete  Execute  View  Exit

Trace Display
Number of ENSEMBLES/screen ------------------------- 10
---- Default all remaining parameters for this process ----

7. Select the SEG-Y Input parameters.
Click on SEG-Y Input with MB2 to bring up the parameter selection window. Now you can select the parameters for this process. To get a helpfile for a process, click on the red highlighted question mark.

8. In the SEG-Y Input menu, select the dataset as directed by your instructor. There are 3 traces per shot ensemble in this SEG-Y dataset. All of the remaining parameters may be defaulted.

9. Select the Trace Display parameters. For now, do not change any of the values, except that we want to display 10 ensembles. We will discuss many of the other options in the next chapter.

10. Run the flow by clicking on the global command Execute. Execution results in a trace display on the screen. Eight icons appear in a column to the left of the traces, and pulldown menus appear above the traces.

11. Click on the page forward icon a few times and watch as the display moves from one group of shots to the next.

12. You may elect to change the primary annotation from Source to FFID using the VIEW/Trace Annotation pulldown menu.

13. Click on File, and then Exit/Stop Flow in the pulldown menu. This interrupts the job and brings you back to the flow builder.
Trace Display
Primary trace LABELING header entry-----------------FFID
6. Toggle the Trace Display active and the Disk Data Output inactive using MB3.

7. Select new Disk Data Input parameters. Your first look at the executed job was all of the shots with all channels. After clicking the Page Forward icon, you saw the next set of shots. What if you wanted to look at every other shot? What if you only wanted to look at a single channel for each shot? These options, and many more, are available in Disk Data Input.

8. Click on Get All for the Trace Read Option. This toggles to Sort, and the menu automatically adds three new options:

Select Primary trace header entry: Allows you to resort to another domain, such as CDP, or remain in the same sort order, which sets you up for trace limiting.
Select Secondary trace header entry: Same as above.
Sort order for dataset: Allows you to restrict the amount of data brought into the flow, such as channels 1-60.
Let's try one.

9. Set the Primary trace header entry to FFID (Field file ID number).

10. Click on Sort order for dataset. An Emacs Widget Window appears for specifying input traces. A format and an example are given at the bottom of this window. Emacs help is discussed later in the training class.

11. In the Widget Window, delete the existing values and type 1-80 (2) /.

12. Move your cursor out of the Widget Window.

13. Click on Execute. You will see FFIDs 1-19 by 2.

14. You may want to change the primary trace annotation again, to FFID instead of SOURCE, using the pulldown menu.
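The pattern typed into the Widget Window asks for FFIDs 1 through 80 in increments of 2. The sketch below is a hypothetical expander for just this first-last(increment) form, ignoring the trailing / that closes the list; it is for illustration only and does not cover the full ProMAX selection syntax.

```python
import re

# Hypothetical expansion (illustration only) of a simple range pattern
# such as "1-80 (2)": first value, last value, and an optional
# increment in parentheses. The trailing "/" that ends the list in the
# Widget Window is not handled here.

def expand_range(pattern):
    m = re.match(r"^\s*(\d+)\s*-\s*(\d+)\s*(?:\(\s*(\d+)\s*\))?\s*$", pattern)
    if m is None:
        raise ValueError("unsupported pattern: %r" % pattern)
    start, stop, step = int(m.group(1)), int(m.group(2)), int(m.group(3) or 1)
    return list(range(start, stop + 1, step))

ffids = expand_range("1-80 (2)")
# -> [1, 3, 5, ..., 79]
```

The first screen of ten ensembles therefore shows FFIDs 1 through 19, and paging forward shows 21 through 39, as the exercise describes.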
15. Click on the Page Forward icon. This will be live source numbers 21-39 by 2. When the last available data is displayed, the Page Forward triangle becomes grayed out and is inactive. To exit this display, click on File and choose Exit/Stop Flow. Let's make the exercise a little bit more complicated and try to display all the shots, but only with channel 1.

16. Select the parameters for Disk Data Input.

Editing Flow: 00 - Display data
Add  Delete  Execute  View  Exit

Trace Display
Primary trace LABELING header entry -------------- CHAN
Secondary trace LABELING header entry ------------- FFID
>Disk Data Output<

Choose CHAN from the popup menu for the primary trace header entry and FFID for the secondary.

17. Change the Sort order for dataset to 1:*. This format specifies building ensembles by recording channel number, with the traces within each ensemble ordered by FFID. Check the formats and examples for hints.

18. Execute the flow. You will see only the trace from channel 1, for all the shots, displayed as a single ensemble.
In this case you may elect to set the primary annotation to CHAN and the secondary to FFID. This is a typical sort type for VSP data.

19. Select Exit/Stop Flow to exit the flow.
Chapter 3
Trace Display
When you execute your job, the following display appears:

[Figure: Trace Display window, showing the icon bar, menu bar, trace header plot, active icon, mouse button help, and data display areas]
Icon Bar
The following is a brief description of the Trace Display icons, located along the side border:

Next ensemble: Shows the next ensemble. When there is no more data in the flow, the icon turns gray and becomes inactive. In ProMAX, an ensemble is a collection of traces, such as a shot record or CDP gather. Each ensemble is flagged with an end-of-ensemble mark in the trace header (END_ENS).

Previous ensemble: Shows the previous ensemble. When at the beginning of the flow, this icon is gray and inactive.

Rewind: Returns to the first ensemble.

Save Image: Saves the current screen image. Annotation and picked events are saved with the trace data.
Animation: Brings up the Animation dialog box to review the saved images. This button is active only when there are at least two saved screen images. You have the option to cycle through the selected screens at a chosen rate. These are just screen images; you cannot edit parameter files using a saved image.

Paintbrush: Allows you to "paint" trace kills, reversals, and mutes interactively on the screen after they have been picked.
Zoom Tool: Click and drag using MB1 to select an area to zoom. If you release MB1 outside the window, the zoom operation is canceled. If you just click MB1 without dragging, this tool will unzoom. You can use the zoom tool in the axis area to zoom in one direction only.

Velocity Tool: Displays linear or hyperbolic velocities. For a linear velocity, click MB1 at one end of a waveform and drag the red vector out along the event. A velocity is displayed at the bottom of the screen. Use MB2 to display a hyperbolic velocity by anchoring the cursor at the approximate zero-offset position of the displayed shot or CDP. Drag the red line along the event and read the velocity at the bottom. New events can be measured with either velocity option by re-clicking the mouse on a new reflector and re-anchoring
ProMAX VSP User Training Manual 3-3
the starting point. Velocities can be labeled by using MB3 on the current velocity. Geometry must be assigned to successfully use this icon.

Header Tool: Displays detailed information about trace headers and their values for each individual trace. Click MB1 on any trace to call up the header template. If the header template is in the way of the traces being viewed, you can move the template by dragging the window. To remove the template, click on the header icon or on any other icon.

Annotation Tool: When active, you can add, change, and delete text annotation in the trace and header plot areas. The pointer changes to a circle when it is over text annotation. You can move an annotation by clicking and dragging with MB1, or add a new annotation by clicking MB1 when the pointer is not over an existing annotation. When the pointer is over an existing annotation, click MB2 to delete the text, or MB3 to edit the text or change its color.
Menu bar
File has five options available in a pulldown menu. You can save your picks, move to the next screen, make a hardcopy plot, or exit Trace Display. You have two choices when you exit: you can exit and stop the flow, or you can exit and let the flow continue without Trace Display.

Note: Use caution with the stop option. For example, suppose you use Disk Data Input to read ten ensembles in a flow with Disk Data Output and Trace Display. If you execute this flow and use the Exit/Stop Flow option after clicking through the first five ensembles, the output dataset will actually contain only five ensembles instead of ten.

View has five options in a pulldown menu. You can control the trace display, the trace scaling, and the trace annotation parameters. You can also choose to plot a trace header above the trace display and edit the color map used for color displays.
Animation saves screens, or displays previously saved screens in any order and at different swap speeds.
For example, to create a parameter table file with a list of traces to kill, click on Picking and a menu of parameter table choices appears. Click on Kill traces. Another window appears for selecting a previous kill parameter file or creating a new file.
When you create a new file, another window appears listing trace headers to choose from for a secondary key.
In this case, an appropriate key for killing traces would be CHAN, allowing selection of each individual trace within each shot record. Depending upon the parameter table you are using, the most appropriate secondary header should appear at the top of the list. At this time a Picking Tool icon appears on the side of the Trace Display screen, below the other icons.

Picking Tool: This appears when one or more pick objects from the Picking menu are selected. A small window with the file name will appear on the right-hand side of the screen. This means the file is open and ready to be filled with the primary and secondary key values of the killed traces. When the tool is active, click MB1 to pick a point on a trace, or click and drag to pick a range of traces. When the mouse is over a picked point, the pointer shape changes into a circle. Click and drag using MB1 to move a picked point. Use MB2 to click on a single point to delete it, or click and drag over a range of points to delete them. Click MB3 for additional picking options. Holding MB1 down and dragging it across several traces allows a consecutive number of traces to be added. To select traces from the next shot, use the Traffic light icon. The created Kill traces file remains open, waiting for more traces to be added to the file.
To create a new parameter table such as a reverse traces file, use the Pick icon again and select Reverse traces from the menu. After creating a new file with a new name, choose a secondary key of CHAN. The new file name appears in the small window on the right hand side of the screen below the kill traces file name. The kill traces file is no longer highlighted, meaning that it is inactive, and the reverse traces file is highlighted. If you have chosen traces to kill and reverse on the screen, the active parameter file will have the chosen traces overplotted with a red line. The traces chosen for the inactive table(s) will be overplotted in blue. This helps you distinguish which file is active and which file is inactive. Traces are only added to the active file. Select or delete traces in the same manner, using the mouse button help at the bottom of the screen. To go back to adding to the kill traces file, click on the kill file and use MB1 to toggle that file to active. The reverse traces file table is no longer highlighted in black, and any reverse traces picked on the screen are overplotted in blue. Some parameters require a top and a bottom pick, such as a surgical mute. Once you have picked the top of the mute zone, click MB3 anywhere inside the trace portion of Trace Display. A new menu appears allowing you to pick an associated layer (New Layer). You can also snap your pick to the nearest amplitude peak, trough, or zero crossing.
Miscellaneous time gates are parameter tables used for such procedures as picking a window for a deconvolution operator design gate, or windows for time variant filtering or scaling. For this exercise, pick a decon design gate with a secondary key of AOFFSET. Picking a miscellaneous time gate is also done in two steps. First, pick the top of the gate by selecting points to be connected with MB1. Because AOFFSET is the secondary key, the picks at the corresponding offset on the opposite side of the shot will be displayed if you click MB3 in the display field and choose Project from the popup menu. Then use MB3 to select an associated layer for the bottom half of the gate. To pick another time gate, below or overlapping the previous one, continue to use MB3 to pick tops and bottoms. Time gates must always be picked in pairs; otherwise your job may fail. Each time gate pair is also shown in the legend box.
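Because picks are stored against the secondary key (AOFFSET here), gate times at unpicked header values come from interpolation between the picked points. A rough sketch of that idea, assuming linear interpolation with flat extrapolation beyond the end picks (the actual interpolation rules may differ):

```python
def interp_picks(picks, x):
    """Linearly interpolate a picked time at header value x.
    picks: list of (header_value, time_ms) pairs; extrapolation is
    flat beyond the first and last picks (assumed behavior)."""
    pts = sorted(picks)
    if x <= pts[0][0]:
        return pts[0][1]
    if x >= pts[-1][0]:
        return pts[-1][1]
    for (x0, t0), (x1, t1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return t0 + (t1 - t0) * (x - x0) / (x1 - x0)

# Top-of-gate picks keyed by AOFFSET (hypothetical values)
top_gate = [(100.0, 40.0), (500.0, 120.0)]
t_mid = interp_picks(top_gate, 300.0)   # halfway between the two picks
```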
Exercise
This exercise describes how to pick a top mute. Other parameter tables may be picked in the same fashion. Trace kills, trace reversals, and miscellaneous time gates were discussed in the previous section. 1. Build this flow: Editing Flow: 01- Pick Parameter Tables Add Delete Execute View Exit
2. Read the file we created in the last exercise. This file should exist in your own line.
3. Insert an Automatic Gain Control process for cosmetics.
4. Parameterize Trace Display to display 80 ensembles per screen. This VSP data has 3 traces per shot and there are a total of 80 shots in this project.
5. Set the primary annotation to be FFID instead of SOURCE.
6. Click on Execute.
The interactive Trace Display window appears.
7. Click the Picking pulldown menu and choose Pick Top Mute. Since you have not previously created a top mute table, enter a new table name called Top Mute. A Select Secondary Key window appears.
8. For this dataset, select FFID for the trace header entry. The mute times that you pick will be interpolated as a function of FFID. This is a relatively unique relationship for VSP data that differs from surface seismic.
9. Pick a mute. Turn on the Picking tool icon and pick a top mute to remove the energy above the first arrivals. Select only a few traces on the record because points will be connected and interpolated as well as extrapolated. Click MB3 in the display field and choose Project from the popup menu to display the picks at the intermediate FFIDs that were not explicitly picked. NOTE: all of the traces at the same FFID will get "X"ed as the project interpolates the points. You may also elect to press the "Paintbrush" icon and interactively apply the mute on the display.
10. Exit and Stop the flow. To exit, click on the File pulldown menu and select Exit/Stop Flow. If you choose to exit, you are prompted to save or not save the work you have completed. Save this mute so that we can re-apply it via the Trace Muting process.
11. Edit your previous flow by inserting Trace Muting. Editing Flow: Display Gathers Add Delete Execute View Exit
Chapter 4
Parameter Test
The Parameter Test process provides a mechanism for testing simple numeric parameters by creating multiple copies of the input traces and replacing a key parameter in the next process in the flow with specified test values. It automatically expands the processing flow, creating IF conditional branches for each test value. The output consists of copies of the input data with a different test value applied to each copy. Parameter Test creates two header words. The first is called REPEAT. This is the data copy number and is used to distinguish each of the identical copies of input data. The second is called PARMTEST, an ASCII string uniquely interpreted by the Trace Display process as a label for the traces.
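The expansion can be pictured as generating one branch per test value, each tagged with REPEAT and PARMTEST, plus a final untouched control copy. A simplified sketch of that idea (the real macro builds actual IF/ELSEIF processes in the flow; the dictionary layout here is illustrative):

```python
def expand_parameter_test(values, label_fmt="AGC {} ms"):
    """Sketch of the Parameter Test expansion: one branch per test value,
    each tagged with a REPEAT copy number and a PARMTEST display label,
    plus a final untouched control copy (assumed behavior)."""
    branches = [{"REPEAT": i, "PARMTEST": label_fmt.format(v), "param": v}
                for i, v in enumerate(values, start=1)]
    branches.append({"REPEAT": len(values) + 1,
                     "PARMTEST": "original", "param": None})
    return branches

branches = expand_parameter_test([250, 500, 1000])   # 4 copies in total
```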
Exercise
In this exercise, you will use Parameter Test to compare shot gathers with different AGC operator lengths. 1. Build the following flow: Editing Flow: 02- Parameter Test Example Add Delete Execute View Exit
Parameter Test
Enter Parameter Values ------------------------- 250|500|1000
Trace Grouping to Reproduce ----------------------- Ensemble
Trace Display
2. Read the file that we wrote to your line after reading the SEGY file. Sort the input to have a primary sort order of CHAN and a secondary of FFID. Get channel 1 only for all FFIDs.
3. Specify Parameter Test test values. Type in a list of parameter values for AGC operator lengths, each separated by a vertical bar ( | ). To determine the format (real, integer, sequence) and a realistic range of test values, look at the default value in the AGC process, in this example the AGC operator length. (Try values of 250, 500 and 1000 ms.) We will reproduce by ensembles.
4. Replace the AGC operator length default value with five nines (99999). 99999 is a flag telling Parameter Test which parameter you are testing.
5. Use Trace Display to present the results from the test to the screen. We will have 3 original ensembles, each copied 4 times. This gives a total of 12 ensembles.
6. Execute the flow. After the Trace Display appears, you can use the zooming and scrolling capabilities to move through the ensembles.
7. Exit and Stop the flow.
8. Select View from the flow builder menu to look at the processes that were executed in your flow. Near the bottom of the job.output file is a listing of the executed processes as shown below. There are some additional processes in the flow, and Parameter Test is absent, because Parameter Test is a macro built from other processes.
DISKREAD2
REPEAT
FLOW_IF
AGC
THDRMATH
FLOW_ELSEIF
AGC
THDRMATH
FLOW_ELSEIF
AGC
THDRMATH
FLOW_ELSEIF
THDRMATH
FLOW_ENDIF
ST_TRACE_DISPLAY
In the next exercise we will build a flow similar to this manually, to see how these components communicate with one another.
One method of generating multiple data copies is to use the Reproduce Traces process. This process is included in the Parameter Test macro. Reproduce Traces generates a specified total number of copies and appends a header word to each trace, allowing you to distinguish between the multiple versions of the data. This header word is known as Repeated Data Copy Number, or REPEAT for short. It is a numeric value from 1-N, where N is the total number of generated copies. You should consider placing Reproduce Traces after any processing which is common to all copies of the data, but prior to the processes you wish to compare. Splitting or branching the flow is a conceptual term for controlling the processes your dataset utilizes. In other words, you do not actually break up any single flow into separate flows; rather, you utilize the capability of the IF, ELSEIF, and ENDIF processes to select and direct traces for processing. This is handled automatically by the Parameter Test process, as you saw if you looked at the View information when you executed the previous flow. More specifically, each copy of the data is passed to a different process, or the same process with different parameter selection, using a series of IF, ELSEIF and ELSE processes in the flow. For example, if the data copy number (REPEAT) is 1, then pass that copy of the data to the next process. If the data copy number is 2, pass that copy to a different process, and so on until all copies of the data have been passed to unique processes. The series of conditions is ended with ENDIF.
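In pseudocode form, the routing described above is just a conditional on the REPEAT header word; the branch names here are illustrative placeholders, not actual process names:

```python
def route_by_repeat(repeat):
    """Illustrative IF/ELSEIF/ELSE routing on the REPEAT header word:
    each copy number selects a different processing branch, and the
    ELSE branch catches all remaining copies."""
    if repeat == 1:
        return "branch 1"      # e.g. first process/parameter choice
    elif repeat == 2:
        return "branch 2"      # e.g. second process/parameter choice
    else:
        return "control"       # unprocessed control copy

routes = [route_by_repeat(r) for r in (1, 2, 3)]
```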
Landmark ProMAX VSP User Training Manual 4-5
Finally, you may use a process called Trace Display Label to generate a header word for posting a label on the display.
Exercise
Incorporate Reproduce Traces with IF and ENDIF to compare processed and unprocessed data. In this exercise, we will compare the first shot of the AGC dataset to a version with true amplitude recovery. It is always a good idea to have a control copy, the original input, for further comparison. This flow illustrates how to compare these three copies. 1. Build the following flow: Editing Flow: 03 - IF/ELSEIF Conditional Add Delete Execute View Exit
Reproduce Traces
Trace grouping to reproduce ----------------------- Ensembles
Total Number of datasets --------------------------------------- 3
IF
SELECT Primary trace header word -------------------- Repeat
SPECIFY trace list ------------------------------------------------- 1
Trace Equalization
Trace Display Label --------------------------------------------- EQ
ELSE
Trace Display Label ---------------------------- Original Input
ENDIF
Trace Display
2. Read the file that we wrote to your line after reading the SEGY file.
Sort the input to have a primary sort order of CHAN and a secondary of FFID. Get channel 1 only for all FFIDs.
3. In Reproduce Traces, enter 3 for the total number of datasets. You will generate two additional copies, one ensemble (record) at a time.
4. Select Repeat for Select Primary trace header word in IF and ELSEIF. IF acts as the gate keeper, providing the mechanism for selecting or restricting traces which will be passed into a particular branch of the flow. Header words are used (just as in Disk Data Input) to uniquely identify the traces to include or exclude in a particular branch. In the first IF conditional, select REPEAT as the primary trace header and 1 (copy number) as the trace list entry. Data copy 1 is passed to AGC in this example. The ELSEIF condition passes the second data copy (REPEAT=2) to Trace Equalization. The ELSE process selects all traces not previously selected with IF or ELSEIF. In our case, having selected two of the three copies of data for filtering leaves only the third data copy (REPEAT=3) for the ELSE branch. In this example, no additional processing is applied to this copy. It is the control copy.
5. Use Trace Display Label to create labels for each copy. Label the copies according to their unique processing. For example, label the first copy with AGC, the second with EQ, and the final copy with Original Input.
6. Select to use a hand input design gate for the Trace Equalization and use the default parameters.
7. Modify Trace Display to do each of the following in two different executions: each copy on a different screen, using screen swapping; all records on the same screen.
Exercise
In this exercise you will run Interactive Spectral Analysis in the simple mode.
1. Build this flow. Editing Flow: interactive spectral analysis Add Delete Execute View Exit
There are many different displays that you can interactively turn on and off. Remember that you have control of your display when you are selecting parameters.
6. Select Options/PreFFT Time Window, and turn on the Boxcar. You have a lot of control from within the interactive session to modify your analysis.
7. Activate the Zoom icon to enlarge the trace data. In this case, your F-X spectrum is zoomed as well.
8. From the File pulldown, select to Exit and Stop the flow.
Exercise
1. Rerun the flow after changing to Single Subset mode. Editing Flow: interactive spectral analysis Add Delete Execute View Exit
2. Click on the Select Rectangular Region icon to window the data on the leftmost (large) shot display. 3. Select a range of data from the left hand window over which to do the analysis. Use MB1 to start the rectangle and MB1 again to end the window.
Now the trace data in the top middle of the screen is the subset of data you just defined, with the corresponding spectra also displayed.
4. Click on the Select Rectangular Region again.
5. Click MB2 inside the zoom window on the left data display window to drag the box to another location, and click MB2 again to redisplay the zoom window.
6. Try resizing the selection window with the other mouse button options.
7. From the File pulldown, select to Exit and Stop the flow.
Exercise
1. Rerun the flow after changing to the Multiple Subset mode. Editing Flow: interactive spectral analysis Add Delete Execute View Exit
Chapter 5
Number of recording levels: 80
Depth of first record: 12100 ft
Depth of last record: 8150 ft
Depth increment: 50 ft
Source offset from hole: 500 ft
The borehole is vertical with no deviation
Source elevation: 0 ft
Datum elevation: 0 ft
Assume the Kelly Bushing is also at 0 ft for simplicity
Source is at station 1
Receivers are at stations 2-81
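These numbers are internally consistent, and the straight-ray source-to-receiver distances follow directly from them. A quick arithmetic check, using only the figures listed above (vertical hole, 500 ft source offset):

```python
import math

# Receiver depths: 80 levels from 12100 ft up to 8150 ft in 50 ft steps.
depths = [12100.0 - 50.0 * i for i in range(80)]

# Straight-ray source-to-receiver distance for a vertical hole with the
# source offset 500 ft horizontally from the wellhead at 0 ft elevation.
offset = 500.0
dists = [math.hypot(offset, z) for z in depths]
```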
Geometry Diagram (figure: vertical borehole with receivers from 8150 ft to 12100 ft)
Chapter 6
SEGY Input
Type of storage to use -------------------------------- Disk Image
Enter DISK file path name ----------- /misc_files/vsp/vsp_segy
MAX traces per ensemble ---------------------------------------- 3
Remap SEGY header values ------------------------------------ No
Chapter 7
VSP Geometry
VSP Geometry Assignment takes advantage of the simplicity of the spatial relationship between the source and receiver positions in VSP data. This helps to minimize the input required to describe the geometry. Some VSP data is very complex and incorporates a lot of varied information to describe the geometry. Some holes are deviated (crooked) and you may have inclination and azimuth information at all recorded depth levels. In these cases you may also have two sets of depth information: log depth and vertical depth. The Spreadsheets have been written to handle all such information. Our case is very simple, using a non-deviated hole.
Exercise
1. Build a flow to Assign VSP Geometry. Editing Flow: Spreadsheet / Geometry Add Delete Execute View Exit
Fill in each of the Borehole, Patterns, and Sources spreadsheets in this order. The Borehole spreadsheet describes the X, Y and Z information of the borehole. The Patterns spreadsheet describes how many channels were recorded and the orientation of these channels. The Sources spreadsheet describes the X, Y and Z information for all of the source locations and relates the recorded FFID information with a given source and spread reference position.
3. Open the Borehole spreadsheet by clicking on Borehole on the main menu. In this case we have a straight, vertical borehole. The log depths are the same as the elevations, except that they are all positive numbers. All X,Y values will be defined at 0.0 and 0.0.
4. Define the borehole with two sets of X, Y, and Z coordinates.
5. Exit from the Borehole Spreadsheet.
6. Open the Patterns Spreadsheet by clicking on Patterns on the main menu. There is only one pattern for this geometry. The Grp Int column specifies the separation between the specified recording channels in the borehole. The Offset column specifies a shift to apply to the "chan from" channel relative to the depth listed in the Sources spreadsheet. In this case we have three channels all at the same depth. You will define the exact depth for the receivers for each shot.
8. Open the Sources Spreadsheet by clicking on Sources on the main window.
9. We have a total of 80 shots in this VSP, so the first thing to do is expand the Sources spreadsheet to 80 rows. Mark the last card as a block with MB1 and MB2 and then use the Edit pulldown to insert the required number of cards.
10. Number the Sources and FFIDs starting at 1 and incrementing by 1.
11. All shots are at shot station number 1 and at an elevation of 0.0 ft.
12. X,Y values are defined at 500.0 and 0.0 respectively.
13. All shots use the same pattern (1) and have 3 channels.
14. The pattern reference depths start at 12100 and decrement by 50 ft. for each shot.
NOTE: For documentation purposes, the columns have been re-ordered slightly. All additional columns are filled with 0.0.
15. Exit from the Sources Spreadsheet.
The next steps in the geometry definition process are to define the pseudo CDP binning and to finalize the database.
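The Sources spreadsheet entries described in steps 9-14 follow a simple arithmetic pattern; a sketch of the resulting 80 rows, with hypothetical column names standing in for the actual spreadsheet headings:

```python
# Sketch of the Sources spreadsheet contents (hypothetical column names):
# 80 shots, all at station 1, X=500.0, Y=0.0, elevation 0.0, pattern 1 with
# 3 channels, reference depth starting at 12100 ft, decreasing 50 ft/shot.
rows = [
    {"SOURCE": i, "FFID": i, "station": 1, "x": 500.0, "y": 0.0,
     "elev": 0.0, "pattern": 1, "channels": 3,
     "ref_depth": 12100.0 - 50.0 * (i - 1)}
    for i in range(1, 81)
]
```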
This is a 3 step process. 16. Open the Bin menu and select to Assign trace geometry by pattern information. 17.
18. With the Assign option selected, click on the OK button. You should see several windows related to Assigning VSP geometry based on patterns flash by fairly quickly. The last window will say that the geometry has been successfully assigned.
19. Dismiss the Status window by clicking on OK.
20. Compute the Pseudo Common Depth Points. Bin starting at CDP 1, starting at 0.0 ft. and ending at 12100 ft., incrementing by 50 ft. per bin.
21. Click on the OK button. Again you should see several windows flash by, ending with a window indicating that the binning was completed successfully.
22. Dismiss this window by clicking on the OK button.
23. Finalize the database.
This step completes building the look up tables and other database finalization functions.
24. Select the Finalize Database option and click on the OK button. You should see a window indicating that the VSP geometry finalization has completed successfully.
25. Dismiss the Status window by clicking on OK.
26. Click on the Cancel button in the binning dialog box to dismiss this window.
4. Create the GEO_COMP trace header word. In Trace Header Math, create a trace header word called GEO_COMP, which is equivalent to recording channel number. For multi-component VSP processing we need to be able to distinguish between the vertical and two horizontal components by a geophone component header word. Component 1 is the vertical trace. Component 2 is the primary horizontal, and component 3 is the other horizontal. By convention, horizontal 2 is 90 degrees clockwise from horizontal 1 looking from the top.
5. In Disk Data Output, output a new file. Since there are no valid trace numbers, we cannot do trace header only processing in an overwrite mode.
Exercise
This exercise QCs the headers. 1. Build a new flow to re-read the data and plot it to check the new values in the trace headers. Editing Flow: qc geometry load Add Delete Execute View Exit
Trace Display
Number of ENSEMBLES per screen ----------------------- 80
Primary trace LABELING ------------------------------------ FFID
Secondary trace LABELING ----------------------- REC_ELEV
INCREMENT for Secondary annotation ------------------- 12

2. Input the traces with the new geometry and check the headers with the Header Dump capabilities in Trace Display. Plot 80 ensembles and annotate each FFID and every 12th receiver elevation.
You should see the correct shot X value and receiver elevation values.
NOTE: The receiver depths go into receiver elevation, not receiver depth.
Chapter 8
Trace Length
New trace length ----------------------------------------------- 2000
Plot 1 ensemble. You may also want to change the annotation to be CHAN and then Receiver Elevation.
If you are successful, the Trace Display plot should look as follows:
Trace Length
New trace length ----------------------------------------------- 2000
>Trace Display<
Chapter 9
Trace Display
Number of ENSEMBLES per screen -------------------------- 1
Primary trace LABELING ---------------------------------- CHAN
Secondary trace LABELING ----------------------- REC_ELEV
INCREMENT for Secondary annotation ------------------- 12

2. In Disk Data Input, input the previously created file containing the vertical trace. This file is one ensemble of all traces from channel 1.
3. In Trace Display, plot 1 ensemble. You may also want to set the annotation heading to be CHAN on the first line and then plot every 12th receiver elevation on the second.
4. Execute the Flow.
5. Select the Picking pulldown menu, and choose to edit the first arrivals in the database. You will be prompted to select a type of attribute. You will write these first break times to an attribute of type GEOMETRY in the TRC database called FB_PICK.
6. The Pick editing icon on the left side of the plot will automatically be selected for you.
7. Pick the arrivals with the rubber-band and then snap to the desired phase with MB3. It is suggested to pick the first strong, continuous peak.
8. Edit any picks as you see fit.
9. Exit the program to save the picks to the database.
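Snapping a pick to the desired phase can be pictured as a local search for the nearest amplitude peak. This sketch assumes a simple sample-domain search window; it is not the actual ProMAX snapping algorithm:

```python
def snap_to_peak(trace, pick_idx, search=5):
    """Move a pick to the nearest local amplitude peak within +/- search
    samples (assumed behavior for illustration). Returns the original
    index if no peak is found in the window."""
    lo = max(1, pick_idx - search)
    hi = min(len(trace) - 2, pick_idx + search)
    peaks = [i for i in range(lo, hi + 1)
             if trace[i - 1] <= trace[i] >= trace[i + 1]]
    return min(peaks, key=lambda i: abs(i - pick_idx)) if peaks else pick_idx

trace = [0.0, 0.1, 0.3, 1.0, 0.4, -0.2, 0.0]
snapped = snap_to_peak(trace, 5)   # a rough pick near the first break
```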
Chapter 10
Exercise
1. Build the following flow to compute the average velocity: Editing Flow: generate avg.velocity function Add Delete Execute View Exit
Using a reference datum of 0 ft., generate an average velocity vs. depth velocity table. Do not impose any limits. Input the set of first breaks that was picked from the vertical traces and then edited from the database.
3. View the output function using Velocity Viewer/Point Editor. Select parameters to input the average velocity vs. depth table created from the first arrivals, and output to a new velocity table that is generated by smoothing the computed function over a depth range of 250 ft. (or 5 receiver levels). In the interactive smoothing parameters, set to output a function every 1000 CDPs to ensure that only one function is output. Also set the depth sampling interval to 50 ft. to match the original input sampling interval. The CDP smoothing value can be defaulted; set the depth smoothing level to 250 ft.
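The average velocity at each receiver level follows from its first-break time and the source-to-receiver distance. A hedged sketch of that computation, assuming a straight-ray slant distance in a vertical hole with the 500 ft source offset from this project (ProMAX may handle datums and deviation differently):

```python
import math

def avg_velocity(depth_ft, fb_pick_ms, source_offset_ft=500.0):
    """Average velocity to a receiver at depth_ft from its first-break
    time, using the straight-ray slant distance for a vertical hole
    (assumed form of the computation)."""
    dist = math.hypot(source_offset_ft, depth_ft)
    return dist / (fb_pick_ms / 1000.0)   # ft/s

v = avg_velocity(10000.0, 1000.0)   # hypothetical pick: 1000 ms at 10000 ft
```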
The following diagram shows the difference between the original, or raw average velocity vs. the smoothed version.
Chapter 11
Velocity Manipulation*
Type of velocity table to input --------- Average Vel in Depth
Get velocity table from database entry ------------------- Yes
Select input velocity database entry ---- from raw first break pick times
Combine a second velocity table ---------------------------- No
Resample the input velocity table? ------------------------- No
Shift or stretch the input velocity table -------------------- No
Type of parameter table to output ---- Stacking (RMS) Velocity
Select output velocity database entry ---- from raw average
Spatially resample the velocity table ----------------------- No
Output a single average velocity table -------------------- No
Smooth velocity field -------------------------------------------- No
Vertically resample the output velocity table ------------ No
Adjust Output velocity by percentage --------------------- No
2. Input the average velocity function that was computed from the first arrival times before smoothing, and convert it to an RMS function. You might want to name the output table from raw average.
3. Display the output function using the point editor.
4. Rerun the same flow using the smoothed average function that you created earlier. Convert it to an RMS function using the option: from smoothed average. Editing Flow: 06- compute RMS from AVG vel Add Delete Execute View Exit
Velocity Manipulation*
Select input velocity database entry ------ smoothed version
Select output velocity database entry ---- from smoothed average
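The average-to-RMS conversion rests on the standard relation between average, interval, and RMS velocities. A sketch of that textbook computation (ProMAX's internal method may differ in detail):

```python
import math

def avg_to_rms(times_s, v_avg):
    """Textbook average-to-RMS conversion: recover interval velocities
    from the average-velocity function, then form the RMS velocity at
    each time knee (assumes constant velocity above the first knee)."""
    v_rms = [v_avg[0]]
    sum_v2dt = v_avg[0] ** 2 * times_s[0]
    for t0, t1, va0, va1 in zip(times_s, times_s[1:], v_avg, v_avg[1:]):
        v_int = (va1 * t1 - va0 * t0) / (t1 - t0)   # interval velocity
        sum_v2dt += v_int ** 2 * (t1 - t0)
        v_rms.append(math.sqrt(sum_v2dt / t1))
    return v_rms

rms = avg_to_rms([1.0, 2.0], [2000.0, 2500.0])   # hypothetical knees
```

As expected, the RMS values sit at or above the average values, since RMS averaging weights the faster intervals more heavily.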
If you zoom in around a single output point on either plot, you will see that there are actually two points at each time knee separated by only a couple of ms.
Comparison of RMS Velocity Functions: From Raw Average vs. From Smoothed Average (figure)
6. Edit the Velocity Manipulation* menu to vertically resample the output RMS from the smoothed average at a new sample interval of 48 ms. Editing Flow: 06- compute RMS from AVG vel Add Delete Execute View Exit
Velocity Manipulation*
Select input velocity database entry ------ smoothed version
Select output velocity database entry ---- from smoothed average
Spatially resample the velocity table ----------------------- No
Output a single average velocity table -------------------- No
Smooth velocity field -------------------------------------------- No
Vertically resample the output table --------------------- Yes
Time step sizes for the output table ---------------------- 48
Adjust Output velocity by percentage --------------------- No
Database/Header Transfer
Direction of transfer -------- Load TO trace headers FROM db
Number of parameters -------------------------------------------- 1
First database parameter --------- TRC GEOMETRY FB_PICK
First header entry ----------- (FB_PICK) First break pick time
Parameter Test
Enter parameter VALUES ---------------------- 2|4|6|8|10|12
Trace grouping to reproduce ----------------------- Ensembles
Trace Display
Number of ENSEMBLES per screen -------------------------- 7

2. Input the file with only the vertical traces and process all traces.
3. Transfer the first break times to the trace headers.
4. Produce a comparison of 2, 4, 6, 8, 10 and 12 dB/sec combined with a 1/dist spherical divergence correction. Use the RMS velocity function that you generated from the smoothed average and then resampled to every 48 ms.
5. Parameterize Trace Display for the test panels. We are generating 6 panels plus the control panel, so we will have a total of 7 ensembles. We may also elect to set the minimum time of the display to 500 msec instead of 0 for the comparison, to avoid a lot of dead samples at the top of the display. Since we are looking for relative amplitude on these traces, we may find that using entire screen scaling will be a better choice than individual trace scaling.
6. Produce a second set of test panels varying the time power value from 1.4 to 2.2 by .2 and turning off the SPHDIV and dB/sec corrections. You must reset the dB/sec correction back to a single number other than 99999, and don't forget to reset the number of ensembles to display in Trace Display if you are using this option.
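The gain curves being tested have simple analytic forms; the versions below are assumed forms for illustration, and ProMAX's exact implementation may differ:

```python
def db_per_sec_gain(t_s, db_per_sec):
    """Gain ramp of db_per_sec decibels per second (assumed form):
    amplitude scale = 10**(dB/20)."""
    return 10.0 ** (db_per_sec * t_s / 20.0)

def time_power_gain(t_s, power):
    """Time-power correction t**power (assumed form)."""
    return t_s ** power

g = db_per_sec_gain(1.0, 6.0)   # 6 dB/sec roughly doubles amplitude per second
```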
7. After selecting a set of TAR parameters (suggested SPHDIV and 6 dB/sec to 2000 ms), process the traces and output a new data file with TAR applied. Editing Flow: 07 - true amp recovery (test) Add Delete Execute View Exit
Database/Header Transfer
Direction of transfer -------- Load TO trace headers FROM db
Number of parameters -------------------------------------------- 1
First database parameter --------- TRC GEOMETRY FB_PICK
First header entry ----------- (FB_PICK) First break pick time
Chapter 12
Exercise
1. Build the following flow to apply the first break pick times as a static to flatten the downgoing energy. Editing Flow: 08- wavefield separation Add Delete Execute View Exit
Header Statics
Bulk shift Static -------------------------------------------------- 100
What about previous statics -------- Add to previous statics
Apply how many static header entries ---------------------- 1
First header word to apply ---------------------------- FB_PICK
How to apply header statics -------------------------- Subtract
4. In Apply Fractional Statics, apply the non-sample-period portion of the static.
5. Plot the output traces on the screen and check to see that the first arrivals are approximately flat at about 100 ms. Set the maximum time of the display to 500 msec.
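Header Statics and Apply Fractional Statics together apply one total shift: a whole-sample part and a sub-sample remainder. A sketch of that split (the rounding convention here is an assumption for illustration), using the 100 ms bulk shift minus FB_PICK static from step 1 and a hypothetical pick time:

```python
def split_static(static_ms, sample_ms=4.0):
    """Split a total static into a whole-sample part (Header Statics)
    and a sub-sample remainder (Apply Fractional Statics)."""
    whole = round(static_ms / sample_ms) * sample_ms
    return whole, static_ms - whole

# Flattening a first break at 853 ms (hypothetical pick) to 100 ms:
# total static = 100 ms bulk shift minus FB_PICK.
whole, frac = split_static(100.0 - 853.0)
```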
Disk Data Input
Header Statics
Apply Fractional Statics
----------- Event Alignment in Window
Maximum allowable static shift ----------------------------- 10
Allowable percentage of hard zeros ------------------------ 55
Method of building model trace ------------ Selective Stack
Ignore end of ensembles? ----------------------------------- Yes
Seek and report reversed traces ----------------------------- No
Accumulate statics in TOT_ALIN ----------------------------- No
Get analysis window parms from Database? ------------- No
SELECT Primary header word --------------------------- FFID
SELECT secondary header word ----------------------- NONE
SPECIFY window analysis parameters -------- 1:50-150/
Header Statics
Bulk shift Static ------------------------------------------------------ 0
What about previous statics -------- Add to previous statics
Apply how many static header entries ---------------------- 1
First header word to apply ------------------------------ alinstat
How to apply header statics -------------------------------- Add
Use a 55 trace Selective Stack model, ignoring end of ensemble issues, to estimate static shifts up to 10 ms on a hand input window 100 ms wide centered on the first breaks [1:50-150/]. Use a primary header word of FFID with no secondary header.
3. Read the Event Alignment helpfile to find the name of the attribute to apply in Header Statics and also how to set the yes/no switch for Accumulate Statics in TOT_ALIN. Set to No for this flow.
4. In Header Statics, ADD a user defined attribute called ALINSTAT to any previous statics, and apply any remaining fractional statics.
5. Plot the output traces on the screen and check to see that the first arrivals are flatter than those from the previous exercise.
6. Expand the previous flow to add in a second iteration of event alignment. Editing Flow: wavefield separation Add Delete Execute View Exit
Disk Data Input
Header Statics
Apply Fractional Statics
Event Alignment in Window
Header Statics
Apply Fractional Statics
------------- Event Alignment in Window
Allowable percentage of hard zeros ------------------------ 30
Accumulate statics in TOT_ALIN --------------------------- Yes
Header Statics
Bulk shift Static ------------------------------------------------------ 0
What about previous statics -------- Add to previous statics
Apply how many static header entries ---------------------- 1
First header word to apply ------------------------------ alinstat
How to apply header statics -------------------------------- Add
Disk Data Input
Header Statics
Apply Fractional Statics
Reproduce Traces
IF <REPEAT=1>
Trace Display Label
ELSEIF <REPEAT=2>
Event Alignment in Window
Header Statics
Apply Fractional Statics
Trace Display Label
ELSEIF <REPEAT=3>
Event Alignment in Window
Header Statics
Apply Fractional Statics
Event Alignment in Window
Header Statics
Apply Fractional Statics
Trace Display Label
ENDIF
Trace Display
Specify Display END time ------------------------------------ 500

2. Using flow editing techniques, rearrange and expand the existing flow to generate the comparison displays of:
First Arrivals only
1 loop of Event Alignment
2 loops of Event Alignment
3. Display the results using Trace Display. The three comparison displays should resemble the following examples:
You may find that setting the trace display to display 3 vertical panels will help you do this comparison.
Exercise
1. Expand the previous flow to do 2D spatial filtering to estimate and subtract the downgoing energy.
Editing Flow: wavefield separation Add Delete Execute View Exit
Disk Data Input Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Statics --------------Parameter Test
Enter parameter VALUES ------ 3|5|7|9|11|13|15|19 Trace grouping to reproduce ---------------------- Ensembles
Bandpass Filter
Default all parameters EXCEPT Ormsby filter frequency values ------------- 8-12-100-125
Test values of 3, 5, 7, 9, 11, 13, 15, and 19 for the number of traces in the filter.
3. In 2D Spatial Filtering, apply a Single Sample, Simple 2D Median Filter to Subtract the downgoing energy from the total flattened wavefield. In the Minimum Number of traces for Subtraction parameter, use a minimum of 3 traces in the filter and fold live traces back over the edge to make sure that there are always enough traces for the filter.
4. Apply a fairly wide open zero-phase Ormsby Band Pass filter to suppress any adverse side effects of the median filter. For this data at a 4 ms sample rate, apply a filter of 8-12-100-125.
5. Display the results using Trace Display. You may find that setting the maximum time to display to 700 ms prior to display may save you some time in the zooming process. You may also find that setting the display to plot 5 horizontal panels will be helpful. You may also want to reset the Trace Display to do one vertical panel with 1 ensemble per screen and use the screen swapping capabilities within Trace Display to compare the different results.
6. After selecting the length of filter that works best, rerun the flow to QC the output section. Toggle the Parameter Test inactive and input the proper filter length (11) in the 2D Spatial Filter process instead of the 99999 for the parm test.
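The median separation in steps 2-3 can be sketched as follows. This is an illustrative reimplementation under simplifying assumptions (the helper name, the edge-folding detail, and the plain trace-by-trace median are mine, not ProMAX code): on data flattened to the first breaks, a median across neighboring traces at each time sample estimates the flat downgoing wave, and subtracting it leaves the upgoing energy.

```python
import statistics

# Hedged sketch of single-sample 2D median wavefield separation on a
# flattened panel. Trace indices are folded back over the edges so the
# filter always has its full complement of traces, as in step 3 above.

def median_separate(panel, half_width):
    """panel: list of traces (lists of samples). Returns (downgoing, upgoing)."""
    ntr = len(panel)
    ns = len(panel[0])
    down = [[0.0] * ns for _ in range(ntr)]
    up = [[0.0] * ns for _ in range(ntr)]
    for i in range(ntr):
        # reflect out-of-range trace indices back into the panel
        idx = [min(max(j, -j), 2 * (ntr - 1) - j)
               for j in range(i - half_width, i + half_width + 1)]
        for t in range(ns):
            down[i][t] = statistics.median(panel[j][t] for j in idx)
            up[i][t] = panel[i][t] - down[i][t]
    return down, up
```

A flat (trace-consistent) event survives the median and lands in the downgoing estimate, while an anomaly on a single trace is rejected by the median and lands in the upgoing residual.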
7. Add a Trace Display Label after the Median Filter to annotate these data for future reference.
F-K Analysis
Using an F-K filter to separate the input data into various dip components is another very effective means of separating the flattened downgoing energy from the dipping upgoing energy. We can plot the flattened data in the F-K plane and estimate various fan filters and/or polygonal filters to isolate one of the dip components. Using the Interactive F-K Analysis process, you can interactively test various reject and accept F-K polygons to keep either the upgoing or the downgoing energy.
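The fan-filter logic can be illustrated with a small sketch. Everything below is an assumed, simplified model (the function name and sign conventions are mine, not the ProMAX implementation): an event with apparent velocity v lies along the line f = v·k in the F-K plane, so a reject fan defined by two velocity cuts zeroes every sample whose apparent velocity is faster than the positive cut or than the negative cut, including the flat (infinite-velocity) energy at k = 0.

```python
# Illustrative reject-fan mask in the f-k plane. With cuts of -4000 and
# +20000 ft/s in reject mode, flat downgoing energy (apparent velocity
# near infinity) is zeroed, while upgoing energy at ~6700 ft/s is kept.

def fan_reject_mask(freqs, wavenums, v_neg, v_pos):
    """Return mask[f][k] = 0.0 inside the reject fan, 1.0 outside."""
    mask = []
    for f in freqs:
        row = []
        for k in wavenums:
            if k == 0:
                inside = f != 0  # infinite apparent velocity: flat energy
            else:
                v = f / k
                inside = v >= v_pos or v <= v_neg
            row.append(0.0 if inside else 1.0)
        mask.append(row)
    return mask
```

In practice the mask would be multiplied into the 2D Fourier transform of the panel (usually with tapered fan edges) before inverse transforming; the sketch only shows the accept/reject decision itself.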
Exercise
1. Expand the previous flow to add an F-K Analysis to pick the fan filter or polygon filters to apply.
Editing Flow: wavefield separation Add Delete Execute View Exit
Disk Data Input Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Static >Parameter Test< >2-D Spatial Filtering< >Bandpass Filter< --------------F-K Analysis
DEFAULT all parameters EXCEPT Panel width in traces -------------------------------------------- 80 Distance between input traces ------------------------------- 50 Select mute polygon table -- reject poly to keep upgoing Mode of F-K filter windowing ------------------------- REJECT
-------------->Trace Display<
Note: Toggle the median filter, bandpass filter, and Trace Display steps inactive.
2. Select F-K Analysis parameters. There are 80 traces per panel and the traces are separated by 50 ft. Add a Parameter Table name for the FK-Polygon. We may elect to use polygon editing or we may just measure velocities to use a fan function in the F-K filter process.
3. Use the dx/dt tool to measure the apparent velocity of the upgoing energy in flattened space on the F-K Analysis section. The velocity should be about 6700 ft/sec.
4. Pick a positive and negative velocity cut to apply as a fan filter in F-K Filter. Numbers like -4000 and +20000 are good choices for a reject filter to keep the upgoing. You may choose numbers like -20000 and +20000 as an accept filter to keep the downgoing.
Note: If you are working with polygons, be careful about how you set the Accept and Reject options.
5. Generate the Filtered Output panel to QC the polygon and parameters.
Disk Data Input Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Static >Parameter Test< >2-D Spatial Filtering< >Bandpass Filter< --------------F-K Filter
Type of F-K filter ----------------------------- Arbitrary Polygon Distance between input traces ------------------------------- 50 Panel width in traces ------------------------------------------ 80 Select mute parameter file - reject poly to keep upgoing Mode of F-K filter operation --------------------------- REJECT
Suggested parameters are to use a fan filter of -4000 and +20000 ft/sec in reject mode. With this velocity the K-space wrap parameter should be set to No. QC the output with F-K Analysis.
In general the application and subtraction gates are the entire time range of the data. The design gates should be restricted to a good data zone. For VSP data, this is the area near the first arrivals. When operating on data that has been flattened on the first arrivals, the low percentage eigenvectors are the flattened downgoing energy and the high percentages are the dipping upgoing. In this exercise you will design the eigenvectors over a time window around the first arrivals using a fairly short spatial window and then subtract the low percentage values from the input to extract the upgoing energy.
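The eigenimage idea above can be sketched with a rank-1 approximation. The sketch below is my assumption of the underlying technique, not ProMAX's algorithm (the function name and the power-iteration method are illustrative): on flattened data the downgoing energy is nearly identical from trace to trace, so it is captured by the largest eigenimage of the trace panel; subtracting that eigenimage leaves the dipping upgoing energy. ProMAX's "low percentage" of the eigenimage range corresponds to these largest terms.

```python
# Hedged illustration of eigenimage (SVD-based) wavefield separation.
# A few power iterations estimate the dominant left singular vector
# (per-trace weights); the rank-1 eigenimage built from it is subtracted.

def subtract_first_eigenimage(panel, iterations=50):
    ntr, ns = len(panel), len(panel[0])
    w = [1.0] * ntr
    for _ in range(iterations):
        # v = panel^T w (a weighted stacked trace), then w = panel v
        v = [sum(w[i] * panel[i][t] for i in range(ntr)) for t in range(ns)]
        w = [sum(panel[i][t] * v[t] for t in range(ns)) for i in range(ntr)]
        norm = sum(x * x for x in w) ** 0.5
        w = [x / norm for x in w]
    v = [sum(w[i] * panel[i][t] for i in range(ntr)) for t in range(ns)]
    # the rank-1 eigenimage is w v^T; subtract it from the panel
    return [[panel[i][t] - w[i] * v[t] for t in range(ns)] for i in range(ntr)]
```

A perfectly trace-consistent (flattened) event is rank 1, so subtracting the first eigenimage removes it entirely; real data needs the design gates and percentage ranges discussed above.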
Exercise
1. Alter the existing flow to use the Eigenvector Filter to separate the wavefields.
Editing Flow: wavefield separation Add Delete Execute View Exit
Disk Data Input Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Static >Parameter Test< >2D Spatial Filtering< >Bandpass Filter< >F-K Analysis< >F-K Filter< --------------Parameter Test Eigenvector Filter --------------Trace Display
Parameters for Parameter Test and Eigenvector Filter are on the next page.
--------------Parameter Test
Enter parameter VALUES ------------------- 3|7|11|15|19 Trace grouping to reproduce ---------------------- Ensembles
Eigenvector Filter
Mode ----------------------------- Subtract Eigenimage of Zone Get matrix design gates from DATABASE --------------- No SELECT Primary header word ---------------------------- FFID SPECIFY design time gate ---------------------------- 1:0-500/ Get application gates from DATABASE ------------------- No SELECT Primary header word ---------------------------- FFID SPECIFY application gate -------------------------- 1:0-2000/ Get Subtraction gate from DATABASE -------------------- No SELECT Primary header word ---------------------------- FFID SPECIFY subtraction gate -------------------------- 1:0-2000/ Type of Computation ------------------------------------------ Real Horizontal window width -------------------------------- 99999 Start percent of eigenimage range ---------------------------- 0 End percent of eigenimage range -------------------------- 10 Re-apply trace mutes after filter --------------------------- Yes
--------------Trace Display
Note: Toggle the F-K Filter and F-K Analysis inactive in the flow.
1. Design a test of the Eigenvector Filter over the first arrivals. Use a constant design window for all FFIDs from 0-500 ms and apply a filter over the entire time range (0-2000 ms). Also, subtract over the entire time range from 0-2000 ms. Test values of 3, 7, 11, 15, and 19 for the trace window width and subtract the first 10 percent of the eigenimages.
2. You may want to test various panel widths, design gates, and eigenimage percentage ranges. Note that you cannot use the Parameter Test sequence to test the percentage ranges.
3. Try various Trace Display configurations: 1) each output ensemble individually, then swap the screens; 2) all ensembles on the same screen.
Note that the Eigenvector Filter is very difficult to test because the percentage-to-keep range varies as a function of the length of the filter.
Disk Data Input Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Statics >Parameter Test< Reproduce Traces IF 2D Spatial Filtering Trace Display Label Bandpass Filter ELSEIF F-K Filter >F-K Analysis< Trace Display label ELSEIF Eigenvector Filter Trace Display Label ENDIF >Trace Length< Trace Display
2. Based on the value of the Repeat header word, apply all three types of separation possibilities and compare the results using Trace Display.
3. If desired, an AGC or other type of gain function may be applied.
4. Experiment with various display options to compare the results from the different separation techniques.
Display each 80 trace ensemble on the screen independently and scroll through them.
Display all three 80 trace ensembles on the screen at the same time.
Display all three 80 trace ensembles on the screen in 3 vertical and then 3 horizontal display panels.
Disk Data Input Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Statics >Reproduce Traces< >Parameter Test< >IF< >2D Spatial Filtering< >Trace Display Label< >Bandpass Filter< >ELSEIF< >F-K Analysis< F-K Filter Trace Display label >ELSEIF< >Eigenvector Filter< >Trace Display Label< >ENDIF< -------------Header Statics Disk Data Output ------------->Trace Display<
2. Suppose that the F-K Filter was selected as the best option to isolate the upgoing energy.
3. Comment out all other processes and Add in a Header Statics to Remove the previous statics. Set the number of header statics to apply to 0.
4. Add in a Disk Data Output to save the upgoing energy in a file for later processing.
Editing Flow: wavefield separation Add Delete Execute View Exit
-------------Header Statics
Bulk shift static ------------------------------------------------------ 0 What about previous statics -- Remove previous statics Apply how many static header entries --------------------- 0 HOW to apply header statics ------------------------------- Add
-------------
Note: You may want to toggle the Trace Display inactive for this exercise to ensure that all traces get processed. If you leave the Trace Display turned on, you will find that the display is not very good because we have returned the data to original recorded time but the display is set for the first 700 msec only.
Disk Data Input Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Statics Reproduce Traces >Parameter Test< IF 2D Spatial Filtering Trace Display Label Bandpass Filter ELSEIF F-K Filter >F-K Analysis< Trace Display label ELSEIF Eigenvector Filter Trace Display Label ENDIF >Header Statics< >Disk Data Output< Trace Display
2. In 2D Spatial Filtering, select to run in Normal mode, the Eigenvector Filter to Output the eigenvector filtered zone, and the F-K Filter to run in an Accept mode.
Note: You may want to change the fan filter velocities for this exercise. Values of -20000 to 20000 ft/sec in an accept mode are reasonable.
3. Repeat the various comparison displays and select the method which gives the desired results.
Display 3 vertical panels limiting the time on each panel to 1100 ms.
Display 3 ensembles on one screen to 2000 ms.
Disk Data Input Database/Header Transfer Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Statics Event Alignment in Window Header Statics Apply Fractional Statics >Reproduce Traces< >IF< >2-D Spatial Filtering< >Trace Display Label< >Bandpass Filter< >ELSEIF< >F-K Analysis< >F-K Filter< >Trace Display label< >ELSEIF< Eigenvector Filter Trace Display Label >ENDIF< Disk Data Output ------------->Trace Display<
2. Comment out all other processes.
3. Change the dataset name in Disk Data Output to save the downgoing energy for later processing.
4. In this case, also make sure that the Header Statics process is toggled inactive. Why do we leave the statics applied to the downgoing data?
Trace Display
Number of ENSEMBLES per screen -------------------------- 3
In the Disk Data Input and Insert processes, get three input files: the original input, the separated upgoing with statics removed, and the separated downgoing with the statics still applied.
2. In Trace Display, select to plot three ensembles.
3. Plot the first break picks on the traces. They should plot at about the start of the reflection data on the upgoing.
Note: This is meaningless on the downgoing.
Chapter 13
VSP Deconvolution
Deconvolution of VSP data involves the generation of an inverse filter designed to compress an input wavelet to a zero-phase wavelet. The input wavelet is commonly extracted from the separated downgoing energy. A filter is designed to compress this energy into a zero-phase wavelet centered on the first arrival time. This filter is then applied to the upgoing data to remove the source signature from the reflection energy and output a zero-phase wavelet at the actual time of the reflection-generating interface. Some design gate determination is commonly performed to isolate the wavelet from which the inverse filter is designed. This design gate generally starts at zero time, envelops the first arrivals, and progresses in time for a couple of hundred milliseconds. The maximum time of the gate typically comes immediately after the last consistent reverberation of the first arrival.
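The inverse-filter design described above can be sketched with a few lines of frequency-domain arithmetic. This is a hedged, simplified model (naive DFT, circular convolution, a spike at zero lag rather than at the first-arrival time; the function names and prewhitening form are my assumptions, not the Filter Generation algorithm): the filter spectrum is the conjugate of the wavelet spectrum over its power plus a white-noise term, so filter-times-wavelet compresses the wavelet toward a spike.

```python
import cmath

# Naive DFT (O(n^2)); sign=-1 gives the forward transform, +1 the inverse
# kernel (scaling by 1/n is done by the caller).
def dft(x, sign=-1):
    n = len(x)
    return [sum(x[t] * cmath.exp(sign * 2j * cmath.pi * f * t / n)
                for t in range(n)) for f in range(n)]

def inverse_filter(wavelet, white_noise=0.03):
    """Spiking inverse filter via stabilized spectral division."""
    n = len(wavelet)
    spec = dft(wavelet)
    eps = white_noise * max(abs(s) ** 2 for s in spec)  # prewhitening term
    fspec = [s.conjugate() / (abs(s) ** 2 + eps) for s in spec]
    return [f.real / n for f in dft(fspec, sign=+1)]

def circular_convolve(a, b):
    n = len(a)
    return [sum(a[j] * b[(i - j) % n] for j in range(n)) for i in range(n)]
```

Applying the designed filter back to the wavelet should produce an approximate spike, which is the QC check the later exercises perform on the downgoing data.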
Trace Display
2. Input the separated, flattened downgoing data.
3. All of the Trace Display parameters may be defaulted.
4. Using the Pick pulldown menu, select to pick a Bottom Mute to be applied prior to inverse filter design. When prompted for a header entry to use for the mute function, select FFID as the header entry over which to vary the mute start times. Set the bottom mute to start at about 400 ms.
5. Exit the program to save the mute parameter table.
Trace Muting
Reapply previous mutes --------------------------------------- NO Mute time reference ---------------------------------------- Time 0 Type of mute -------------------------------------------------- Bottom Ending ramp --------------------------------------------------------- 30 EXTRAPOLATE mute times --------------------------------- YES Get mute file from the database ---------------------------- Yes Select mute parameter file -- decon design bottom mute
Trace Display
2. Apply the mute that was just picked as a Bottom Mute. 3. Display the result.
Trace Display
2. Input the separated, flattened downgoing data and apply the bottom mute to limit the design gate.
3. Select Filter Generation parameters. After applying a Hanning Window taper over 100% of the input wavelets (zero percent flat), design and output to disk 1000 ms inverse filters where time zero on the input trace is 100 ms, and use 3% white noise.
4. Plot the output from Filter Generation. The plotted traces are the actual filters to be applied.
5. In Filter Generation, output the filter traces to a disk file.
Where did the 100 ms in the filter generation come from?
Deconvolution Filter QC
Exercise
1. Expand the previous flow to apply the filters and QC the results on the downgoing data.
Editing Flow: 12 - VSP decon Add Delete Execute View Exit
Trace Display
2. Input the separated, flattened downgoing data.
3. Select VSP Deconvolution parameters. Apply filters that have been mixed over 5 FFIDs and exclude 1 filter trace on each end. Make sure that the zero reference time of the filter is correct. This should be set to 500 ms, which is the center time of the filter traces.
4. Add a label for display.
Is the peak of the zero-phase wavelet at the correct time?
Exercise
1. Build a flow to apply the decon filters to the upgoing data.
Editing Flow: 12 - VSP decon Add Delete Execute View Exit
Trace Display
Number of ENSEMBLES per screen -------------------------- 2
2. Input the separated upgoing data at original recorded time.
Landmark ProMAX VSP User Training Manual 13-7
3. In VSP Deconvolution, apply the filters that were previously generated. Mix the filters over 5 FFIDs and exclude 1 filter trace on each end. Make sure that the zero reference time of the filter is correct. This should be set to 500 ms, or the center of the filter traces.
4. In Trace Display Label, label this data as being upgoing energy with decon applied.
5. In Disk Data Output, write the deconvolved data to disk.
6. Read the before and after decon files in a Disk Data Input and compare them with Trace Display.
Exercise
1. Expand the previous flow to read two files from disk and then do a spectral analysis on each.
Editing Flow: 13 - spectral analysis Add Delete Execute View Exit
Chapter 14
Exercise
1. Build a flow to pick the top and bottom mutes that define the corridor to stack.
Editing Flow: corridor stack Add Delete Execute View Exit
Trace Display
2. In Disk Data Input, input the deconvolved upgoing data file.
3. Use Trace Display to plot the traces. You may find that adjusting the minimum and maximum display time will help you position your mutes.
4. From the picking pulldown menu, select to define a top mute. Define the mute to set the Top of the corridor. When prompted, select FFID as the header entry over which to vary the mute start times.
Note: This mute should be about the same time as the first arrivals.
5. From the picking pulldown menu, select to define a bottom mute. Define the mute to set the Bottom of the corridor. It is normal to make the corridor about 100 ms wide.
Exercise
1. Expand the existing flow to add in two Trace Muting processes.
Editing Flow: corridor stack Add Delete Execute View Exit
Trace Muting
Re-apply previous mutes-----------------------------------------No Mute time reference------------------------------------------Time 0 TYPE of mute---------------------------------------------------Bottom Starting ramp--------------------------------------------30. EXTRAPOLATE mute times?-----------------------Yes Get mute file from the DATABASE?-------------------------Yes SELECT mute parameter file--------------------------------------------------------------------corridor stack bottom mute
--------Trace Display
2. In Disk Data Input, input the deconvolved upgoing data file.
3. In Trace Muting, apply the Top and Bottom mutes.
Do not forget that one is a Top mute and the other is a Bottom mute.
4. Display the result with Trace Display.
Exercise
1. Expand the existing flow to add in the processes associated with VSP Corridor Stack and optional enhancement programs.
Editing Flow: 14 - corridor stack Add Delete Execute View Exit
Disk Data Input <GET ALL> >Trace Muting< >Trace Muting< >Trace Display< --------One Way Normal Moveout Correction VSP Corridor Stack Trace Display Label Disk Data Output Automatic Gain Control Bandpass Filter Trace Display
Parameters for One Way NMO and VSP Corridor Stack are on the next page.
3. Apply the One Way NMO correction using the RMS velocity function that was generated earlier for the Spherical Divergence Correction. Use the resampled RMS from the smoothed average.
4. In VSP Corridor Stack, apply the Top and Bottom mutes and add the first arrival times from the header as a static. Make 5 copies of a mean stack trace. For display purposes, apply a bulk shift static correction of -900 ms.
5. Write the Corridor Stack traces to a disk dataset.
6. If desired, add in the AGC and/or Bandpass Filter before and/or after stack to help with the cosmetic appearance of the stack traces.
7. Add a new Trace Display to plot the corridor stack.
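The stacking step itself can be sketched simply. This is an assumed, simplified model of what a corridor stack does (function name is mine, not ProMAX code): after the one-way NMO, first-arrival static, and the top/bottom corridor mutes, each output time sample is the mean of the live (unmuted) samples across all traces.

```python
# Simplified corridor-stack sketch: muted samples are represented as None
# and excluded from the mean; a sample with no live traces stacks to zero.

def corridor_stack(panel):
    """panel: list of traces; muted samples are None. Returns one mean trace."""
    ns = len(panel[0])
    stack = []
    for t in range(ns):
        live = [tr[t] for tr in panel if tr[t] is not None]
        stack.append(sum(live) / len(live) if live else 0.0)
    return stack
```

Because each trace's corridor sits at a different time (it follows the first arrivals), the live samples sweep down the section, and the stack accumulates contributions from whichever traces are live at each time.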
Exercise
1. Build the following flow:
Editing Flow: splice corr stk into stack Add Delete Execute View Exit
Bandpass Filter
Default all parameters
Trace Display
2. In Disk Data Input, input the Final Stack file.
3. In Trace Label, add a label called Stack.
4. In Splice Datasets, splice in the Corridor Stack at CDP Bin Number 820 and pad with 3 dead traces.
5. Apply a bandpass filter and amplitude scaler (AGC) for cosmetic purposes.
6. Plot the combined display with Trace Display.
Note: The stack and VSP are from completely different areas. When the corridor stack was generated, a time shift was applied to approximately tie the stack and the corridor stack.
Chapter 15
1. Build the following flow:
Editing Flow: 16 - generate intv-dpth function Add Delete Execute View Exit
Velocity Manipulation*
Type of velocity table to input ----- Average Vel in Depth Get velocity table from database entry ------------------ Yes Select input velocity database entry --------------------------------------------------from raw first break pick times Combine a second velocity table ---------------------------- No Resample the input velocity table? ------------------------- No Shift or stretch the input velocity table -------------------- No Type of parameter table to output --------------------------------------------------------------------- Interval Vel in Depth Select output velocity database entry ------------------------------------------------------------------- from raw average Spatially resample the velocity table ---------------------- No Output a single average velocity table -------------------- No Smooth velocity field --------------------------------------------- No Vertically resample the output velocity table ----------- No Adjust Output velocity by percentage --------------------- No
We will not do any editing, so you can output to the same table as you are reading from. Are there any problems with this interval velocity function?
Exercise
1. Expand the flow to generate a new interval velocity vs. depth function from the smoothed average velocity vs. depth function.
Editing Flow: 16 - generate intv-depth function Add Delete Execute View Exit
------- from raw avg ------- from smoothed avg -----
Note: There are two points very close together on both functions, so you can elect to resample the function in Velocity Manipulation prior to output.
Exercise
One of the requirements for the VSP migration is that the velocity field span the entire range of the output image area. Since we may want to image events recorded below the bottom of the well, we must expand the velocity field in depth to cover the proposed image area. We will also resample the output intv-depth function to the original sample period of 50 ft.
1. Edit the existing flow.
Editing Flow: 16 - generate intv-depth function Add Delete Execute View Exit
5. Remember to go into edit mode; you may elect to edit the velocity function in preparation for migration. Edit the smoothed version and output a Velocity Function for VSPCDP transform and Migration.
Chapter 16
VSP/CDP Transform
Horizontal binning interval -------------------------------------- 5 CDP at which to extract vel function --------------------- 100 Specify trace length of output trace in msec -------- 3000 Select how velocity is to be specified ------------ Database Select a velocity file ---------------- from smoothed average Ray trace interval ------------------------------------------------- 20 Datum elevation ----------------------------------------------------- 0 Allowable percentage of moveout stretch ---------------- 50
Trace Display
Primary trace LABELING header ----------------------- NONE Secondary trace LABELING header ---------------- RBIN_X
2. In Disk Data Input, input the upgoing data with decon applied.
3. Select the VSP/CDP Transform parameters.
Use the interval velocity function that was created from the smoothed average function and edited. Build a trace every 5 ft to 3 sec, and ray trace every 20 ft.
4. Use Trace Label to label the traces as the VSP-CDP transform. In Disk Data Output, output the file.
5. Plot the output traces using Trace Display. Plot 1 ensemble. You will probably want to make the display window smaller in order to see the traces more clearly.
6. Look at the headers of the traces and find the new header word that you can use to best annotate above the traces.
Exercise
1. Expand the existing flow to redisplay the VSP-CDP transform.
Editing Flow: VSP-CDP transform Add Delete Execute View Exit
>Disk Data Input< >VSP/CDP Transform< >Trace Display Label< >Disk Data Output< Disk Data Input
Select dataset----------------------------------shots - input data Trace Read Option--------------------------------------------Get All
Bandpass Filter
Default all parameters
Trace Display
2. In Disk Data Input, input the VSP-CDP transform.
3. Apply a bandpass filter and AGC for cosmetic appearance.
4. Display the traces using Trace Display.
Plot the traces by annotating the RBIN_X header word above the traces. This will plot a value representing the distance from the borehole above the traces.
Note: This is a user-defined attribute.
You may want to enhance the appearance of the transform by applying a trace mix and/or adjusting the scaling and/or bandpass filter parameters.
Chapter 17
VSP Migration
For VSP surveys where the source is offset from the well location, it is possible to migrate the recorded data. The migration produces a high spatial resolution seismic section that allows you to image reflection events in the vicinity of the borehole, looking in the plane defined by the well bore and the shot location. Unlike the VSP-CDP transform, the migration can look on the opposite side of the borehole. This may help identify faults and/or the attitude of dipping reflected events. The migration differs from the VSP-CDP transform in that the transform is a simple mapping function that takes a point on a shot-to-receiver trace and maps that point to a single reflection point in the subsurface. The migration operation is similar to that for surface seismic data, where it attempts to place a data point at all locations from which it could have originated. The migration can be a time-consuming process depending on the size of the output image area, the selected algorithm, and the size of the dataset.
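The summation idea behind Kirchhoff migration can be illustrated with a toy sketch. This is not the VSP Kirchhoff Migration algorithm; it assumes a constant velocity, straight rays, a single source, and 2D geometry, and omits the weighting and anti-alias filtering a real implementation needs:

```python
# Toy constant-velocity Kirchhoff summation: each trace contributes its
# amplitude at the source-to-image-point-to-receiver travel time to every
# candidate image point, so energy stacks up where reflectors exist.

def kirchhoff_migrate(traces, rec_xz, src_xz, image_points, velocity, dt):
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    image = []
    for p in image_points:
        total = 0.0
        for trace, rec in zip(traces, rec_xz):
            t = (dist(src_xz, p) + dist(p, rec)) / velocity
            i = int(round(t / dt))
            if 0 <= i < len(trace):
                total += trace[i]
        image.append(total)
    return image
```

With many traces, only image points consistent with the recorded travel times accumulate energy constructively, which is the "place a data point at all locations from which it could have originated" behavior described above.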
VSP Migration
Exercise
1. Build the following flow to migrate the VSP data:
Editing Flow: VSP migration Add Delete Execute View Exit
Disk Data Input <GET ALL> VSP Kirchhoff Migration Trace Display Label Disk Data Output
2. In Disk Data Input, input the upgoing data with decon applied. 3. Select the following VSP Kirchhoff Mig. parameters:
4. In Trace Label, label the traces as the migration. In Disk Data Output, output the file to disk.
>Disk Data Input< <GET ALL> >Trace Header Math< >VSP Kirchhoff Migration< >Trace Display Label< >Disk Data Output< ------------------Disk Data Input Automatic Gain Control ------------------Trace Display
2. In Disk Data Input, input the migration file.
3. Scale the data to improve its cosmetic appearance. Use a value of about 2000 ft for the AGC gate length.
4. In Trace Display, plot the migrated data and annotate CDP number above the traces.
Chapter 18
Exercise
1. Build a flow to Assign VSP Geometry.
Editing Flow: Spreadsheet / Geometry Add Delete Execute View Exit
Fill in each of the Borehole, Patterns, and Sources spreadsheets in this order. The Borehole spreadsheet describes the X, Y and Z information of the borehole. The Patterns spreadsheet describes how many channels were recorded and the orientation of these channels. The Sources spreadsheet describes the X, Y and Z information for all of the source locations and relates the recorded FFID information with a given source and spread reference position.
[Spreadsheet illustration values: 1100, 100, 1050,1050, 929.92, 929.92, 1000, 1100, 1070.71]
3. Open the Borehole spreadsheet by clicking on Borehole on the main menu. In this case we have a curved borehole. We have 8 control points. The log depths differ from the elevations.
4. Define the borehole with six sets of X, Y, and Z coordinates.
5. Exit from the Borehole Spreadsheet.
6. Open the Patterns Spreadsheet by clicking on Patterns on the main menu. There is only one pattern for this geometry. The Grp Int column specifies the separation between the specified recording channels in the borehole. The Offset column specifies a shift to apply to the "chan from" channel relative to the depth listed in the Sources spreadsheet. In this case we have fifteen channels with a set of three at the same depth. We will simulate a 5 level multicomponent tool where the individual levels are 50 ft apart. You will define the exact depth for the first receiver for each shot.
7. Exit from the Patterns Spreadsheet.
8. Open the Sources Spreadsheet by clicking on Sources on the main window.
9. We have a total of 28 shots in this VSP, so the first thing to do is expand the sources spreadsheet to 28 rows. Mark the last card as a block with MB1 and MB2 and then use the Edit pulldown to insert the required number of cards.
10. Number the Sources and FFIDs starting at 1 and incrementing by 1.
11. All shots are at shot station number 1 and at an elevation of 0.0 ft.
12. X,Y values are defined as 1050.0 and 1050.0 respectively.
13. All shots use the same pattern (1) and each has 15 channels.
14. The pattern reference depths start at 6800 and decrement by 250 ft for each shot.
Note: For documentation purposes, the columns have been reordered slightly. All additional columns are filled with 0.0.
15. Exit from the Sources Spreadsheet.
The next steps in the geometry definition process are to define the pseudo CDP binning and to finalize the database. This is a 3 step process.
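The Sources spreadsheet values described in steps 10-14 follow a simple pattern, which can be generated programmatically. The helper below is purely illustrative (the function and field names are mine; ProMAX fills this in through the spreadsheet editor): 28 shots numbered from 1, a constant shot position at (1050, 1050) and elevation 0, pattern 1 with 15 channels, and reference depths starting at 6800 ft and decreasing 250 ft per shot.

```python
# Illustrative generator for the Sources spreadsheet rows described above.

def build_sources(n_shots=28, x=1050.0, y=1050.0,
                  first_depth=6800.0, depth_step=250.0):
    rows = []
    for i in range(n_shots):
        rows.append({
            "source": i + 1, "ffid": i + 1, "station": 1,
            "x": x, "y": y, "elev": 0.0,
            "pattern": 1, "channels": 15,
            "ref_depth": first_depth - i * depth_step,  # 6800, 6550, ...
        })
    return rows
```

The last shot's reference depth works out to 6800 - 27 × 250 = 50 ft, a quick consistency check on the spreadsheet entries.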
16. Open the Bin menu and select to Assign trace geometry by pattern information.
17. With the Assign option selected, click on the OK button. You should see several windows related to Assigning VSP geometry based on patterns flash by fairly quickly. The last window will say that the geometry has been successfully assigned.
18. Dismiss the Status window by clicking on OK.
19. Compute the Pseudo Common Depth Points. Bin starting at CDP 1, starting at 0.0 ft and ending at 7000 ft, incrementing by 50 ft per bin.
20. Click on the OK button. Again you should see several windows flash by, ending with a window indicating that the binning was completed successfully.
21. Dismiss this window by clicking on the OK button.
22. Finalize the database. This step completes building the lookup tables and other database finalization functions.
23. Select the Finalize Database option and click on the OK button. You should see a window indicating that the VSP geometry finalization has completed successfully.
24. Dismiss the Status window by clicking on OK.
25. Click on the Cancel button in the binning dialog box to dismiss this window.
2D plot of SRF vs. elevation used to check depth assigned to each receiver station
2D plot of TRC vs. various other values used to check additional information for each trace
Chapter 19
Number of recording levels: 80 Depth of first record: 12100 ft. Depth of last record: 8150 ft. Depth increment: 50 Source offset from hole: N/A The bore hole is vertical with no deviation Source elevation: 0 ft. Datum elevation: 0 ft. Assume the Kelly Bushing is also at 0 ft. for simplicity Source is at station 1 Receivers are at stations 2-81
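The geometry listed above can be sanity-checked with a few lines (illustrative only): 80 levels from 12100 ft up to 8150 ft should span exactly 79 increments of 50 ft, with the receivers occupying stations 2 through 81.

```python
# Quick consistency check of the recording geometry listed above.
first_depth, last_depth, increment, n_levels = 12100.0, 8150.0, 50.0, 80

# receiver depths, shallowing by one increment per level
depths = [first_depth - i * increment for i in range(n_levels)]

# receivers occupy stations 2 through 81 (station 1 is the source)
stations = list(range(2, 2 + n_levels))
```

Checks like this are worth a moment before typing values into the geometry spreadsheets, since binning and finalization will silently accept inconsistent numbers.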
Chapter 20
Exercise
1. Build a flow to plot the input data.
Editing Flow: level statics - vertical stack
If these header words did not already exist, how could you build them?
Exercise
In this exercise, we will pick the level statics correlation time gate.
1. Edit the flow to toggle the VSP Level Statics process inactive.
Editing Flow: level statics - vertical stack
2. In Disk Data Input, sort the input with a primary sort key of CHAN and a secondary of REC_ELEV. This will combine all traces into one ensemble with the traces ordered as a function of the receiver elevation.
3. Pick a miscellaneous time gate with a secondary key of REC_ELEV and select times on the first trace and last trace about 50 ms before the first arrivals.
4. Using MB3, project the pick times to all of the other traces.
You should see that all traces recorded at the same receiver elevation have the same time.
5. Using MB3, add a New Layer to this table. Pick the bottom time of the correlation gate about 100 ms below the top time.
6. Use MB3 to project the times to the other traces. Exit the Trace Display program and save the table to disk.
Exercise
1. Expand the flow to add the VSP Level Statics process:
Editing Flow: level statics - vertical stack
Disk Data Input <GET ALL>
--------------------
VSP Level Statics
--------------------
Trace Display
2. Shots will be identified by their FFIDs; that is, all traces with the same FFID belong to the same shot. There are two methods for identifying groups of shots to be operated on as groups:
Hand listing the respective FFIDs
Reading all traces with a common header word
In our case we have two header words to choose from, the Receiver Elevation and the SHT_GRP. We will use the SHT_GRP header word for this exercise. There is a maximum of 5 shots in a group.
3. The maximum separation between groups of SHT_GRP must be set to a value less than 1.
4. Analyze the vertical Recording Channel Number from each shot [channel 1].
5. You can expect a maximum static shift of about 5 ms.
Exercise
1. Expand the flow to add the Header Statics, Apply Fractional Statics, and Trace Display processes:
Editing Flow: level statics - vertical stack
Disk Data Input <GET ALL>
--------------------
VSP Level Statics
Header Statics
Apply Fractional Statics
Trace Display Label
--------------------
Trace Display
2. In VSP Level Statics, select the time gate that was previously picked.
3. In Header Statics, add the value in trace header word LVL_SHFT as a static.
4. Complete the static shift using the Apply Fractional Statics process.
5. Add a label to the headers and display the results.
6. You may want to produce a Header Plot of the LVL_SHFT values.
Exercise
With a little rearranging we can produce a comparison plot to look at the data before and after the level statics application.
1. Expand the flow to compare the traces before and after level statics application.
Editing Flow: level statics - vertical stack
Disk Data Input <GET ALL>
Reproduce Traces <2 copies ALL DATA>
IF <REPEAT=1>
VSP Level Statics
Header Statics
Apply Fractional Statics
Trace Display Label <input>
ELSEIF <REPEAT=2>
Trace Display Label
ENDIF
Inline Sort <REPEAT:FFID>
Trace Display
2. Add the Reproduce Traces and IF-ELSEIF-ENDIF processes.
3. Add the Inline Sort to resort the data by REPEAT number and FFID for display. There are 400 traces per ensemble and a total of 800 traces in the sort buffer.
4. In the Trace Display, select to plot 1 ensemble per screen and plot 2 vertical panels. You may also select to generate a header plot of the LVL_SHFT header values.
Comparison of the data with and without level statics, including a header plot of the level statics values for a subset of the data.
Disk Data Input <GET ALL>
>Reproduce Traces<
>IF<
VSP Level Statics
Header Statics
Apply Fractional Statics
VSP Level Summing
Trace Display Label
>ELSEIF<
>Trace Display Label<
>ENDIF<
>In-line Sort<
Trace Display
2. Toggle the comparison processes inactive.
3. Add VSP Level Summing after the statics application. In VSP Level Summing, select to identify shot groups by header word SHT_GRP. There will be a maximum of 5 shots in a group.
4. Plot the results. You should have 80 traces (80 ensembles). You can do an Inline Sort prior to the Trace Display with a primary ensemble of CHAN and a secondary sort of FFID, and then you will have a single ensemble for the Trace Display.
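The level summing in step 3 amounts to grouping traces by their SHT_GRP header value and stacking each group into one output trace per level. A hypothetical sketch (the trace dictionaries with "sht_grp" and "data" keys are our own illustration, not ProMAX structures):

```python
import numpy as np

# Average all traces that share a SHT_GRP value; one stacked
# (vertically summed) trace comes out per shot group.
def level_sum(traces):
    groups = {}
    for trc in traces:
        groups.setdefault(trc["sht_grp"], []).append(trc["data"])
    # return stacked traces keyed by shot group, in group order
    return {g: np.mean(d, axis=0) for g, d in sorted(groups.items())}
```

With 28 shots grouped a few at a time, the number of output traces equals the number of distinct SHT_GRP values.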
Disk Data Input <GET ALL>
Reproduce Traces
IF
VSP Level Statics
Header Statics
Apply Fractional Statics
VSP Level Summing
Trace Display Label <level stat>
ELSEIF
VSP Level Summing
Trace Display Label
ENDIF
Inline Sort <REPEAT:FFID>
Trace Display
2. Using the screen swapping in Trace Display, compare the results with and without the level summing. Display 1 ensemble per screen and then set the window size and zoom parameters. Save one screen and then go to the next. Save it and compare the two plots. The differences in this example will be minimal. 3. You may also try to use two vertical (or horizontal) panels and plot both results simultaneously.
Chapter 21
Number of recording levels: 4
Depth of first record: 1200 - 1100 ft
Depth of last record: 1000 - 900 ft
Depth increment: 100 ft
Source offset from hole: N/A
The borehole is vertical with no deviation
Source elevation: 0 ft
Datum elevation: 0 ft
Assume the Kelly Bushing is also at 0 ft for simplicity
Source is at station 1
Receivers are at stations 2-5
Chapter 22
Level Stat and Vertical Stack for Multi Component / Multi Level
When collecting VSP data, it is common to acquire multiple records with the sources and receivers at the same location. This helps attenuate random noise and build up the signal-to-noise ratio of the data. Each time the source and recording system are activated, there can be very small time differences in the records relative to one another. In order to optimize the vertical stack of these records, these time differences can be measured, normalized, and applied prior to vertically stacking the records. In this set of exercises we will use a synthetic dataset simulating the Multi Component - Multi Level situation.
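The time-shift measurement described above can be pictured as a cross-correlation between repeated records. This is an illustrative sketch of the idea, not the ProMAX implementation:

```python
import numpy as np

def static_shift(ref, trc, dt):
    """Lag (in ms) that best aligns trc with ref; dt is the sample
    interval in ms. Positive means trc is delayed relative to ref."""
    xc = np.correlate(trc, ref, mode="full")
    lag = np.argmax(xc) - (len(ref) - 1)   # lag in samples
    return lag * dt
```

In practice the measured shifts for a group of repeated records would be normalized (for example, relative to their mean) and removed before the vertical stack.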
Exercise
1. Build the following flow to plot the input data:
Editing Flow: level statics
Trace Display
Specify display END time-------------------------------400
Number of ensembles(line segments)/screen------------10
2. In Disk Data Input, input the synthetic shot record dataset. This dataset can be found in the VSP tutorials area.
3. In Trace Display, plot 10 ensembles.
4. Estimate the time of the first arrivals for each set of shots. In the next exercise we will need some time gate information. At approximately what time are the first arrivals on this dataset for each set of 5 shots?
Shots 1-5 __________ Shots 6-10 _________
Exercise
1. Expand the flow to compute and apply level statics.
Editing Flow: level statics
Header Statics
First header word to apply:--------------------------LVL_SHFT
of the first arrivals. This analysis window will be constant for the first 5 FFIDs and change to a new constant for the second 5.
1:100-200/5:100-200/6:50-150/10:50-150
3. Read the VSP Level Statics helpfile to determine the name of the header attribute to apply as a static in Header Statics.
4. After applying the LVL_SHFT statics using the headers, apply the fractional remainder with Apply Fractional Statics.
Exercise
With a little rearranging we can produce a comparison plot to look at the data before and after the level statics application.
1. Modify the flow to compare the traces before and after level statics application.
Editing Flow: level statics
Reproduce Traces
Total number of datasets------------------------------------------2
IF
SELECT Primary trace header word:------------REPEATED
SPECIFY trace list:----------------------------------------------------1
ELSEIF
Trace selection MODE:-------------------------------------Include
SELECT Primary trace header word:-----------REPEATED
SPECIFY trace list:----------------------------------------------------2
VSP Level Statics
Header Statics
Apply Fractional Statics
Trace Display Label
Trace label------------------------------------------------w/hdr stat
Ensemble Redefine
Mode of application:-------------------------------------Sequence
Max traces per output ensemble:-------------------------------6
Trace Display
Number of ensembles(line segments)/screen------------10
2. Add Reproduce Traces and IF-ELSEIF-ENDIF.
3. In Inline Sort, resort the data by REPEAT number and FFID for display. We have a total of 60 traces per ensemble and a total of 120 traces in the sort buffer.
4. Split the REPEAT ensembles back into individual shot ensembles using Ensemble Redefine. We will take each sequence of 6 consecutive traces as one output ensemble.
5. In Trace Display, plot 10 ensembles per screen and use the screen swap functionality to compare the data before and after level static adjustment.
Exercise
1. Modify the previous flow to vertically stack shots by hand-input shot groups for common receiver depth levels.
Editing Flow: vertical stack
Disk Data Input <GET ALL>
>Reproduce Traces<
>IF<
>Trace Display Label<
>ELSEIF<
VSP Level Statics
Header Statics
Apply Fractional Statics
>Trace Display Label<
>ENDIF<
>In-line Sort<
VSP Level Summing
Shot header name:----------------------------------------------FFID
Header name for secondary key:-----------------------CHAN
How will shot groups be identified?:-----------Hand input
Shot grouping:---------------------------------------------1,5/6,10/
4. In Trace Display, plot the result. You should now have only 12 traces, 3 traces for each depth level. Use the Header Dump icon to look at a few trace headers.
Exercise
1. Rearrange the flow to input the data and plot it via Trace Display.
Editing Flow: vertical stack
Disk Data Input <GET ALL>
>Reproduce Traces<
>IF<
>Trace Display Label<
>ELSEIF<
>VSP Level Statics<
>Header Statics<
>Apply Fractional Statics<
>Trace Display Label<
>ENDIF<
>In-line Sort<
>VSP Level Summing<
>Trace Display Label<
Trace Display
2. Plot the traces using Trace Display. Examine the headers to see if there is a header word that is common to all traces in a group of shots that should be vertically stacked together. In this case there is a header entry called SHT_GRP. We can use this header entry in VSP Level Summing as an alternative to hand inputting the shot groups.
Disk Data Input <GET ALL>
>Reproduce Traces<
>IF<
>Trace Display Label<
>ELSEIF<
VSP Level Statics
Header Statics
Apply Fractional Statics
>Trace Display Label<
>ENDIF<
>In-line Sort<
VSP Level Summing
Trace Display Label <summed by header>
Trace Display
2. Toggle the level statics, static application and level summing processes back to active. 3. Review the parameters of VSP level summing. In the VSP Level Summing process, select to identify shot groups by header word SHT_GRP. Plot the results. Again you should have 12 traces, 3 from each depth level.
Disk Data Input <GET ALL>
Reproduce Traces
IF
VSP Level Summing
Trace Display Label <No level stat>
ELSEIF
VSP Level Statics
Header Statics
Apply Fractional Statics
VSP Level Summing
Trace Display Label <level stat>
ENDIF
Inline Sort <REPEAT:FFID>
Trace Display
2. Using the screen swapping in Trace Display, compare the results with and without the level summing. Display 1 ensemble per screen and then set the window size and zoom parameters. Save one screen and then go to the next. Save it and compare the two plots. There are some very subtle differences.
Chapter 23
Exercise
1. Build a flow to construct an RMS trace and display the results.
Editing Flow: three component transform
3-Component Transforms
Header word for selecting replacement trace:----Geophone component (x,y,z)
Value of replacement trace header---------------------------2
Select 3-component transform to apply:-----------------Sum Squares Stack
Maximum time to calculate transform (ms):----------1500
In Line Sort
Select new PRIMARY sort key:-----------------------Geophone component (x,y,z)
Select new SECONDARY sort key:------------------------FFID
Maximum traces per output ensemble:--------------------80
Number of traces in buffer:----------------------------240
Trace Display
Number of ENSEMBLES(line segments)/screen:---------1
Number of display panels:--------------------------------3
Trace Orientation:---------------------------------------Horizontal
2. In Disk Data Input, read the real data with the correct geometry in the headers. This file still has 3 traces per shot and has a primary sort order of FFID.
3. Select 3-Component Transform parameters.
Replace header entry Geophone component (x,y,z) number 2 and process to 1500 ms using a Sum Squares Stack.
4. Sort the data with a primary sort of Geophone Component (x,y,z) and a secondary of FFID. There are 80 traces per ensemble and a total of 240 traces in the sort buffer.
5. In Trace Display, display the three component traces. Use 1 ensemble per screen and 3 horizontal panels. You may also want to try 3 vertical panels.
6. Identify the first arrivals on the display of the RMS trace.
7. Create a new First Break entry of the type GEOMETRY in the database using the Picking pulldown menu. Select to edit database values (first breaks) and give these FB picks a name that describes them as being picks from the RMS trace.
8. After rubber-banding the first arrivals on one of the panels, snap them to the nearest peak. Notice that each panel is picked completely independently from the others. In this case only pick the one panel that contains the RMS trace.
9. Compare the picks by plotting them from the database. We should have two sets of first break picks in the TRC database: the picks from the vertical traces that we picked earlier and these new picks from the RMS traces.
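The RMS trace built by the Sum Squares Stack can be pictured sample by sample as follows. This is our sketch of the idea, not the ProMAX code: combining the three geophone components this way makes the first arrival stand out regardless of its polarization direction.

```python
import numpy as np

def rms_trace(x, y, z):
    """Square root of the summed squared components, per sample."""
    return np.sqrt(x**2 + y**2 + z**2)
```

The resulting trace is non-negative, which is why picks made on it are snapped to the nearest peak rather than to a trough.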
Exercise
1. Build a flow to copy the time pick from one component to the other components.
Editing Flow: copy first break picks
Disk Data Input <GET ALL>
Database/Header Transfer
Assign Common Ensemble Value
Database/Header Transfer
Disk Data Output
2. In the first Disk Data Input, read the file with all three traces per shot with the geometry installed in the headers.
3. In Database/Header Transfer, move the database-resident first break pick that was picked from the single vertical trace to the fb_pick word in the trace header. This is the first break pick that was picked earlier on the vertical traces only and then edited in the database.
4. In Assign Common Ensemble Value, copy the first break pick time from channel 1 to the other 2 channels of each shot.
5. Transfer the copied first break times from the trace header back to the database. Each trace has a first arrival time in the trace header, but there is no attribute in the database that has a correct first break time for all traces. For future reference it would be advisable to make a copy of the copied arrival times in the database.
6. Write the output data to a new file.
Exercise
1. Display the picks in the database. Exit from the flow. Click on the global Database button. Use the Database display tool to graph the various picks and compare the results.
2. Expand the flow to reread the new data file and plot the first breaks.
Editing Flow: copy first break picks
>Disk Data Input<
>Database/Header Transfer<
>Assign Common Ensemble Value<
>Database/Header Transfer<
>Disk Data Output<
---------------------
Disk Data Input
Trace Display
3. Toggle all of the previous processes inactive.
4. In Disk Data Input, read in the file that was written in the previous exercise that has the copied picks in the header.
5. In Trace Display, plot the traces. Plot 80 ensembles.
6. Execute the flow.
7. Overlay the picks from the headers and/or the database on the traces.
Use the Picking pulldown menu to select the first breaks from the trace headers, or the database. All three traces per FFID should have the same pick time. Check the values by using the header dump facility.
Chapter 24
Disk Data Input <GET ALL>
Bandpass Filter
Hodogram Analysis
Disk Data Output
2. In Disk Data Input, read the traces with constant first arrival times for each trace at each receiver level in the headers.
3. Apply a bandpass filter. Default values are OK. In general you would not want to apply any trace-by-trace amplitude corrections for this process.
4. Select Hodogram Analysis parameters.
Plot the first arrival times, and use the arrival times as a basis for the analysis window. Do not output the analysis window to a time gate file. Write the orientation values to the trace headers.
5. Write the output data to disk.
6. Execute the flow. You should see a display similar to the following after zooming in around the first arrivals.
1. Click on the Hodogram editing icon. This enables us to alter the orientation angle that the program computed automatically, if desired. Normally we will only want to watch the polarity of the oriented traces, and we may need to rotate a trace by 180 degrees to get the proper polarity.
2. Look at the second and third trace of the middle trace display. The second trace should be maximized at the same polarity as the first trace, and the third trace should be minimized.
3. Press MB2 in the top hodogram window to rotate the oriented traces by 180 degrees to change the polarity. Change it back again with another MB2 click. This trace has the correct polarity.
4. Press MB2 in the bottom hodogram window to rotate the oriented vertical trace by 180 degrees. After rotation this trace now has the proper orientation.
5. Fine-tune the orientation to minimize the third trace on the second set of traces and the second trace on the third set of traces by using MB1 and rotating the orientation axes. In general you will find that the fine-tuning is not required.
6. Press the Next Screen icon to go to the next set of three traces for the next depth level. Repeat the orientation procedures, where the goal is to: 1) maximize the second trace on the second panel of traces at the same polarity as the original vertical trace; 2) maximize the first trace on the third panel of traces at the same polarity as the original vertical trace.
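The automatically computed orientation angle that the editor lets you adjust is, in essence, a polarization measurement. One common approach, shown here as a sketch under our own assumptions rather than as the Hodogram Analysis algorithm, takes the principal eigenvector of the two-component covariance matrix over the first-arrival window:

```python
import numpy as np

def orientation_angle(h1, h2):
    """Angle (degrees) of the dominant particle-motion direction
    for two component windows h1, h2 around the first arrival."""
    cov = np.cov(np.vstack([h1, h2]))   # 2x2 covariance of the window
    vals, vecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    principal = vecs[:, -1]             # direction of maximum energy
    return np.degrees(np.arctan2(principal[1], principal[0]))
```

Note the eigenvector only defines the axis of motion, not its sense: the result carries a 180-degree ambiguity, which is exactly the polarity flip resolved interactively with MB2 in the steps above.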
8. Expand the flow to display the output data.
Editing Flow: 3 comp hodogram analysis
>Disk Data Input<
>Bandpass Filter<
>Hodogram Analysis<
>Disk Data Output<
Disk Data Input <hodo_typ,ffid>
Select primary trace header entry------------------hodo_typ
Select secondary trace header entry---------------------FFID
Trace Display
9. Read the file that was just created. You will want to sort the input with a primary ensemble sort order of hodo_typ and sort the traces within these ensembles to increase by FFID.
10. You may want to experiment with different display options. A best first guess would be to use Trace Display and plot 5 ensembles. You may also want to try 1 ensemble per screen and 5 horizontal panels.
Chapter 25
Exercise
1. Build a flow to look at the trace headers.
Editing Flow: prepare input data
3. In Trace Display, plot the data and view the trace headers to identify the header values that may need to be altered prior to the start of processing. Since we have no idea how this data is organized, use all defaults for Trace Display except specify to plot 100 ensembles. This will help you identify what an ensemble is and then how to deal with the data. 4. Derive an equation to use to assign the FFIDs from 1 to 80. Also note that the Geophone (x,y,z) header word does not exist and must be set equal to the channel number.
Exercise
1. Expand the previous flow to rebuild the trace headers and write the file to your own line directory.
Editing Flow:
Disk Data Input <Get All>
Trace Header Math
Trace Header Math
Trace Header Math
Trace Header Math
Trace Header Math
Trace Header Math
Trace Header Math
In-line Sort
>Disk Data Output<
Trace Display
2. In the first Trace Header Math, compute FFID=(12100-CDP)/50+1.
3. In the second Trace Header Math, compute GEO_COMP=chan.
4. In the remaining Trace Header Math processes, set SOU_X=0.0, REC_ELEV=0.0, CDP=0, TR_FOLD=0.0, and LINE_NO=0, one at a time. Note: Some are integer, others are floating point.
5. Sort the data back to FFID/CHAN. There are 3 traces per FFID ensemble and a total of 240 traces in the dataset.
6. Check the output headers using Trace Display.
7. In Disk Data Output, write the data to disk when satisfied that the data is OK.
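The header assignments in steps 2-4 can be written out as a single per-trace function. This is a hypothetical illustration (plain dictionaries, lowercase keys of our own choosing, not ProMAX internals): depths stored in CDP run from 12100 down to 8150 ft in 50-ft steps, so FFID = (12100 - CDP)/50 + 1 maps them onto 1..80.

```python
def rebuild_headers(hdr):
    hdr["ffid"] = int((12100 - hdr["cdp"]) / 50 + 1)
    hdr["geo_comp"] = hdr["chan"]   # geophone component = channel number
    hdr["sou_x"] = 0.0              # floating point
    hdr["rec_elev"] = 0.0           # floating point
    hdr["cdp"] = 0                  # integer, zeroed after use
    hdr["tr_fold"] = 0.0            # floating point
    hdr["line_no"] = 0              # integer
    return hdr
```

Note that FFID must be computed before CDP is zeroed, mirroring the order of the Trace Header Math processes in the flow.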
Chapter 26
Archival Methods
Archiving your data protects your work from system failure and may allow you to bring data into other software packages. The archiving methods can be run from both inside and outside the ProMAX User Interface. In this chapter, we will discuss options for archiving your data.
SEG-Y Output
ProMAX offers a variety of industry-standard and individual company output formats. Of these, SEG-Y is the most common. This process can write out industry-standard SEG-Y tapes as well as frequently requested non-standard variations of SEG-Y and IEEE format. SEG-Y Output is a good choice for archiving a dataset that will later be loaded into a third-party software package. This process will successfully archive data spanning multiple disks. One drawback to this archival method is that it will not automatically map all the ProMAX trace headers. However, SEG-Y Output provides you the capability of mapping these non-standard trace headers.
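At the byte level, "mapping a non-standard trace header" means packing an extra value into the optional portion of the 240-byte SEG-Y trace header (standard SEG-Y leaves bytes 181-240 unassigned). A hedged sketch of the idea, with hypothetical helper names, using big-endian 4-byte integers as classic SEG-Y does:

```python
import struct

def set_optional_header(trace_header: bytearray, byte_pos: int, value: int):
    """Pack value at byte_pos, using the 1-based SEG-Y byte convention
    (byte_pos in 181..237 for the optional area)."""
    struct.pack_into(">i", trace_header, byte_pos - 1, value)

def get_optional_header(trace_header: bytes, byte_pos: int) -> int:
    """Read back a 4-byte big-endian integer from byte_pos."""
    return struct.unpack_from(">i", trace_header, byte_pos - 1)[0]
```

Both the writing and reading systems must agree on these byte positions, which is why the exercise below remaps the same headers in SEG-Y Input that were mapped in SEG-Y Output.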
Exercise
In this exercise, you will write a SEG-Y formatted tape, mapping some non-standard SEG-Y headers. We will check to make sure the headers were mapped correctly by using SEG-Y Input and Screen Display. Depending on the availability of a tape drive on the system, this exercise may be modified to write a SEG-Y disk image.
1. Build the following flow:
Editing Flow: SEG-Y Output
Enter the tape drive device name. Select Yes to Remap SEG-Y headers. Map the defaulted header values, sou_sloc, rec_sloc, and cdp_sloc. The SEG-Y format reserves bytes 181-240 for optional use. The *_sloc trace headers are important to ProMAX, so we typically write them to the extended headers. These header values must be present in order to automatically rebuild the database files with the Extract Database Files process.
4. Put a tape in the tape drive.
5. Execute the flow.
6. Once the job is completed, build the following flow to QC the headers.
Editing Flow: SEG-Y Out
Trace Display
7. Select SEG-Y Input parameters. Make sure the formats are consistent with those specified in SEG-Y Output.
8. Select Yes to Remap SEGY headers. This loads the extended headers that you mapped with SEG-Y Output.
9. Execute the flow.
10. Click on the Header icon in Trace Display to QC the headers. The extended header values should be preserved (rec_sloc and sou_sloc).
Exercise
In this exercise, you will view trace headers in the dataset, write a ProMAX formatted tape, and read the tape back in to make sure the headers are preserved.
1. Exit out of ProMAX by selecting Exit at the bottom of the User Interface.
2. Set the environment variable BYPASS_CATALOG = t in your ProMAX start-up script or your .cshrc file, by including the line setenv BYPASS_CATALOG t (for the C shell). This will deactivate the tape cataloging system. Information about this system is located in the helpfile index under seismic datasets and tape datasets.
3. If you set the environment variable in your .cshrc file, type source .cshrc. This will reinitialize your .cshrc file.
4. Type promax.
5. Build the following flow:
Editing Flow: Tape Data Output
Disk Data Input
>Tape Data Output<
>Tape Data Input<
Trace Display
6. Select Disk Data Input parameters. Select two shots from your Raw Shots with Geometry dataset. Limit the dataset size for efficiency.
7. Execute the flow.
8. Click on the Header icon in Trace Display to view the trace headers.
9. Exit out of Trace Display.
10. Toggle off Trace Display and toggle on Tape Data Output using MB3.
Editing Flow: Tape Data Output
Disk Data Input
Tape Data Output
>Tape Data Input<
>Trace Display<
11. Select Tape Data Output parameters. Enter an output file name and tape drive device path name. The Pre-geometry Database Initialization option is the same one found in Disk Data Output. This initializes the database, creating the TRC, SIN, and CHN ordered database files. Since we already applied our geometry, leave the question defaulted to No.
12. Put a tape in the tape drive.
13. Execute the flow. Choose to continue when the popup menu appears.
14. Enter your Datasets menu and click MB2 on your tape dataset. You can view your tape dataset filename under the same menu as your disk dataset. Click MB2 to see information about your dataset. Your new tape dataset will have a Media type of Tape.
15. Build the following flow to QC the headers:
Editing Flow: Tape Data Output
>Disk Data Input<
>Tape Data Output<
Tape Data Input
Trace Display
16. Select Tape Data Input parameters. Select your tape dataset created in Tape Data Output, and specify the tape device path name.
17. Execute the flow. Choose to continue when the popup menu appears.
18. Click on the Header icon in Trace Display to QC the headers. You should see that all of your header values are preserved.
UNIX tar
The UNIX tar command is handy for archiving files, such as datasets, flows, and OPFs, residing on a single disk such as your primary disk data storage.
Exercise
1. Put a tape in the tape drive.
2. In an X-window, change directories to your line directory using the cd command.
3. Type ls. This lists all the files in your line directory.
4. Select the flow that you want to archive.
5. Type tar -cvf /dev/(tape drive device name;rmt0) ./(flowname). This command copies your flow directory and the files contained underneath that directory to tape.
6. When the files are copied, type tar -tvf /dev/(tape drive device name) at the prompt. This command lists the files contained on your tape to the screen. This step should always be done when you are using tar to archive files, to make sure the archive worked. You can also redirect the output to a file by typing:
tar -tvf /dev/(tape drive device name) > (file name with tape list)
If you wanted to place archived files back on disk, you would type the following command:
tar -xvf /dev/(tape drive device name) ./(flowname).
Archive to Tape
The UNIX tar command was discussed in the previous section. Although this works fine in many situations, ProMAX also includes an inline archive program, Archive to Tape (sometimes referred to as ctar), designed specifically for seismic datasets. The ctar program has some advantages over the UNIX tar command, such as the ability to span tape volumes on all platforms, flexible use of ProMAX's secondary storage for seismic trace datasets, and checking for available disk space before writing files during restore operations. Also, you may use this functionality in conjunction with the Advance Tape Catalog. The related process, List/Restore from Tape, reads ProMAX archive tapes and restores the data to disk.
Exercise
In this exercise, you will archive your ProMAX Area to tape, list the tape contents, and restore your Area back to disk.
1. Add an Area/Line called archive/archive with permissions of 775 or 777. You may not need to do this in the classroom or, for that matter, at your workplace if this Area/Line has already been created. The purpose of creating this new Area/Line is to prevent you from archiving a line by executing a flow from within the line to be archived.
2. Build the following flow:
Editing Flow: ARCHIVE
5. Click on Invalid to select a tape drive device path.
6. Execute the flow. Choose to continue when the popup menu appears.
7. Build the following flow:
Editing Flow: ARCHIVE
Chapter 27
The Emacs Editor is a general-purpose, full-function editor. It can be operated outside of ProMAX or in the processes Config File Edit and Emacs Editor. To start the Emacs editor outside of ProMAX, exit ProMAX and type emacs filename at the UNIX prompt.

The Emacs Editor Widget is a subset of the full-function editor and is used within ProMAX when a single-line editor is insufficient but a full-function editor is unnecessary. It supports cursor movement commands and a small set of editing commands.

The Emacs View Widget is similar to the Emacs Widget in cursor movement, but does not allow any modification of text. The Emacs View Widget only displays text. It is used by ProMAX to view help files and the flow execution output listings (view job.output).

Since all the editors listed above are variations on the Emacs Editor, they operate similarly. Of course, the View Widget, which does not actually modify text, has no need for editing commands. Since the Editor Widget is a subset of the full Emacs Editor, it does not have all the commands in the Emacs Editor (Search and Replace, for example).

Note: The implementation of the editors is slightly different for each of the ProMAX supported hardware platforms. One reason for the differences is the fact that the keyboards are not the same on each platform. The main difference is the designation of the Meta key. This is the diamond key on either side of the space bar on the keyboard of SUN SPARCstations, the Compose Character key on DECstations, and the Alt key on IBM RS/6000 workstations. In the following instructions, replace the Meta key with the equivalent key stroke depending on your platform.
Cursor movement:
Use the 4 cursor arrow keys
Point the mouse cursor and click button 1
Ctrl-A          Move the cursor to the beginning of the current line
Ctrl-E          Move the cursor to the end of the current line
Ctrl-V          Scroll the screen forward (down) one screen
Meta-V          Scroll the screen backward (up) one screen
Meta-Shift-<    Jump to the beginning of the file
Meta-Shift->    Jump to the end of the file
Ctrl-S          Search forward for a string (start entering string)
Ctrl-R          Search backward for a string (start entering string)
Editing:
All keyboard entry is in insert mode
Delete key      Delete one character to the left of the cursor (Backspace for DEC)
Ctrl-D          Delete one character to the right of the cursor
Ctrl-K          Kill to the end of the line (from the cursor)
Ctrl-Y          Yank back the contents of the kill buffer (created by Ctrl-K or Ctrl-W); cut and paste (can move the cursor first)
Meta-X, then type repl s    Search and replace (follow prompts)
Ctrl-X, Ctrl-W  Write new Emacs file (enter path & filename)
Ctrl-X, Ctrl-S  Save current Emacs file
Ctrl-X, Ctrl-F  Find another Emacs file
Ctrl-X, I       Insert a file at current cursor location
Ctrl-X, Ctrl-C  Exit Emacs
Exiting Emacs:
Ctrl-X, Ctrl-C; (then respond Y or N to saving)
UNIX Commands
Alphabetical summary of general purpose UNIX commands used in conjunction with ProMAX.
cat
Concatenate and Display Files
UNIX$ cat [options] [files]
cd
Change Directory
$ cd [directory]
chmod
Change Access Modes
$ chmod [options] mode names
Option:
-R    recursively change the directory tree
Mode can be numeric or symbolic. The symbolic case is of the form [agou][+-=][rstwx] where:
a    all: group, other, and user access permissions
g    group access permissions
o    other access permissions
u    user access permissions
+    add the permission to the current status of files
-    remove the permission from the current status of files
Landmark ProMAX VSP User Training Manual 27-5
=    set the permission of files to the specified value
r    read permission
s    set user-ID or group-ID on execution (usable only with g or u)
t    save text mode
w    write permission
x    execute permission
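As a quick, self-contained illustration (the file name report.txt is invented for this sketch), the numeric and symbolic forms can be combined to reach the same result:

```shell
# Create a throwaway file so the example stands alone
touch report.txt
chmod 600 report.txt      # numeric mode: rw------- (owner read/write only)
chmod go+r report.txt     # symbolic mode: add read permission for group and other
ls -l report.txt          # first column now shows -rw-r--r--
```

chmod 644 report.txt would have produced the same permissions in a single numeric step.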
cp
Copy Files
$ cp [options] file1 file2
Make copies of the specified files in a directory.
Options:
-i    prompt the user before overwriting a file
-p    copies have the same modification times and modes as the source files
-r    recursive copy of a directory (with subdirectories)
df
Report Free Block Count
$ df [options][filesys][file]
Options:
-i    print the number of inodes free and in use
files    df reports on the file system containing each file
filesys    a list of device names or mounted directory names to report on (default = all mounted file systems)
du
Summarize Disk Usage
$ du [options][names]
Options:
-a    generate an entry for each file
-s    display only a grand total summary (default is an entry for each directory)
names    directory names or filenames
grep
Search File for Pattern
$ grep [options] expr [files]
stdin is read if no files are specified.
Options:
-b    precede each line with its block number
-c    print only a count of matching lines
-e expr    useful if expr starts with a -
-i    ignore case of letters in the search
-l    print only the names of files with matching lines
-n    print line numbers
-s    print error messages only
-v    print non-matching lines
-w    search for the expression as a word
expr    expression or pattern
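For example, using a small invented sample file (sample.txt and its contents are made up for this sketch), -c and -i can be combined:

```shell
# Build a throwaway file to search
printf 'CDP 100\nstat value\nSTAT final\n' > sample.txt
grep -c STAT sample.txt    # exact-case count of matching lines: prints 1
grep -ic stat sample.txt   # -i ignores case, so stat and STAT both match: prints 2
```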
kill
Terminate Process
$ kill [-signal] process-ids
Options:
-signal    send the named signal instead of terminate
A process-id of 0 implies all processes resulting from the current login.
ln
Make Links to File
$ ln [option] file1 file2
Makes a link with the same name in the current directory.
Option:
-s    make a symbolic link (hard link is the default)
login
Sign On to System
$ login [option][user]
ls
List Contents of Directories
$ ls [options][names]
names can be files or directories; the current working directory is used if no name is specified.
Options:
-1    print a listing of one entry per line
-a    list all entries (including ones starting with .)
-d    list only the name (not the contents) of a directory
-l    long list (mode, links, owner, size, modification time)
-r    reverse the sort order
-R    recursively print subdirectories
-s    print file size in kilobytes
man
Print Manual Entries
$ man [options] [section] commands
$ man -k keywords
Print manual sections for each command specified.
Options:
-    pipe output through more (default on terminals)
-M path    path to search for entries (default /usr/man)
-t    troff output to a raster device
path    list of directories to search, separated by colons
section    Arabic section number, followed by an optional letter signifying the type of command
mkdir
Create Specified Directories
$ mkdir directories
more
View File by Screenful or by Line
$ more [options][files]
Options:
-c    redraw the page one line at a time
-d    prompt after each screenful
-f    count by newlines instead of screen lines
-l    treat formfeed (^L) as an ordinary character
-n    window size of n lines (default set with stty)
+n    start viewing the file at line n
-s    reduce multiple blank lines to one
-u    suppress terminal underlining or enhancing
+/pat    start two lines before the line containing pat
Enter h when more pauses for a list of interactive options.
mv
Move Files (See CP)
$ mv [options] file1 file2
Options:
-    following arguments are filenames
-f    force overwriting of existing files
-i    interactive mode
ps
Report Process Status
$ ps [keys][-t list][process-id]
Keys:
a    print all processes involving terminals
c    print the internally stored command name
e    print both environment and arguments
g    print all processes
k    use /vmcore in place of /dev/kmem and /dev/mem for debugging
l    long listing
n    process number (must be the last key)
s    add the size of the kernel stack of each process to the output
tn    list processes associated with terminal n (must be the last key)
u    include fields of interest to the user
U    update the namelist database (for speed)
v    print virtual memory statistics
w    132-column output format
ww    arbitrarily wide output
x    include processes with no terminal
pwd
Print Working Directory Name
$ pwd
rcp
Copy Files Between Machines
$ rcp [option] file1 file2
Copy files to the specified directory.
Options:
-p    copies have the same modification times and modes as the source files
-r    recursive copy of directories
rlogin
Login on Remote Terminal
$ [rlogin] remote [options]
Options:
-8    allow an 8-bit data path
-ec    specify a new escape character c
-l user    user is the login name on the remote system
-L    run the remote session in litout mode
remote    the remote host system
The rlogin command name is optional if /usr/hosts is in the search path.
rm
Remove Files
$ rm [options] files
Options:
-    treat all following arguments as filenames
-i    ask for confirmation before each delete
-r    recursively delete directories
rmdir
Remove Empty Directories (See RM)
$ rmdir directories
su
Become Another User (Set User)
$ su [options][user]
user defaults to root.
Options:
-    act like a full login
-f    if csh, don't execute .cshrc
tar
Tape file Archiver
$ tar [key][option][files]
stdin is read if no files are specified.
Keys: format: letter [modifiers]
Function Letters:
c    create a new tape and record the files
t    tell when files are found; all entries if no files are given
x    extract files; the entire tape if no files are given
Function Modifiers:
0...9    specify which tape drive to use (0 default)
b    next argument is the blocking factor (20 default, 20 max)
B    force I/O blocking at 20 blocks per record
f arch    arch is the file to be used for input/output to archives (if -, stdin is read)
h    follow symbolic links
l    complain if all file links are not found
m    update file modification times
v    verbose mode
w    wait for confirmation after reporting each filename (y causes the action to be performed)
Option:
-C dir    change directory to dir
who
Who is on the System
$ who [file][am i]
Arguments:
file    read this file instead of /etc/utmp for login information
am i    output who you are logged in as
whoami
Print Effective User-Id
$ whoami
alias promax /advance/sys/bin/promax& The alias command is used to substitute a short, convenient command in place of a longer command. In this case, promax is the new (alias) command. From this point on, typing promax will be equivalent to typing the full /advance/sys/bin/promax&. Note: This alias will only be effective until you log out. If you want it to be available each time you log in, place this line in your .cshrc file. This is a C shell command.
cp -r /advance/data/offshore . cp is the copy command. The -r tells the system that you want to copy recursively (useful for copying directory trees). The directory from which you are copying in this case is /advance/data/offshore. Note the final ., which denotes the target directory. The single . means the current directory. Be careful about how you specify the target directory. If you told the system to copy the files to a directory offshore and this directory already exists, then the files will end up in offshore/offshore.
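The nesting caveat above can be demonstrated with throwaway names (offshore_src and existing_target are invented here, not ProMAX data areas):

```shell
# Build a small directory tree to copy
mkdir -p offshore_src/sub
echo seismic > offshore_src/sub/shot.dat
mkdir existing_target
cp -r offshore_src existing_target   # target already exists, so the copy nests inside it
ls existing_target/offshore_src/sub  # shot.dat ends up one level deeper than you might expect
```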
df df shows the amount of free space on all the currently mounted file systems, including remotely mounted file systems. The listing will show you which of the file systems are remotely mounted. It is possible to specify one file system and see the amount of free space in only that file system. If you do not specify a file system, then df will default to showing all the mounted file systems. There are many other options for df which you may find useful.
du -s offshore The du command summarizes disk usage. It can show disk usage file by file. When the -s option is given, only a grand total summary of disk
usage is produced. Specifying offshore requests a disk usage report for that directory.
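A quick sketch using a scratch directory (usage_demo and its contents are invented for this example):

```shell
mkdir -p usage_demo
# Write an 8 KB file so the directory has measurable usage
dd if=/dev/zero of=usage_demo/block.dat bs=1024 count=8 2>/dev/null
du -s usage_demo    # grand-total usage (in blocks) for the directory
df .                # free space on the file system holding the current directory
```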
grep -i STAT elev_stat_math | grep -i CDP grep is the search command. This command will search for the lines within the file elev_stat_math which contain the string STAT. The -i causes the search to ignore upper or lower case differences. Without this option, it would look for STAT exactly, in upper case. The | or pipe redirects the output from the search into another grep command. This again performs a case-insensitive search for CDP. Because the output from the first search contains only lines with the string STAT, the result of the piped search will contain only lines with both STAT and CDP.
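A self-contained version of the piped search, with stand-in contents for elev_stat_math (the three lines below are invented for this sketch):

```shell
# Create stand-in contents to search
printf 'Apply STAT to CDP 5\nSTAT only\ncdp only\n' > elev_stat_math
grep -i STAT elev_stat_math | grep -i CDP   # prints: Apply STAT to CDP 5
```

Only the line containing both strings survives the second grep in the pipeline.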
grep STAT header.list > static_hdrs This grep will search the file header.list for lines containing the string STAT in upper case letters only, and the > will direct the output of the search to a file called static_hdrs.
kill -9 2367 The kill command stops a running process by sending it a signal. The process number in this case is 2367, which was found by using the ps command. There are many modifiers for this command, but one you should know is -9. It makes it impossible for the process to ignore the signal. You might use this when a process is locked up and there is no other way to stop it.
ln -s /advance/data2/oswork offshore The ln command means link. The -s denotes a symbolic link. This can be used to link files on different file systems. A normal link, sometimes known as a hard link, specified as ln without the -s, cannot link between file systems. This symbolic link will cause the directory /advance/data2/oswork to appear in the current directory under the name offshore. It is not a new directory, or a copy of the oswork directory in /advance/data2. When you access a file in your directory called offshore, you are actually accessing the original file in the directory /advance/data2/oswork.
Therefore any changes made in offshore will be made in /advance/data2/oswork. You should be aware that certain commands act differently when applied to a linked file. For example, if you delete the linked file using rm applied to the link in your directory, only the link is removed; the original file is intact. But if you copy the linked file with cp applied to the file in your directory, the system will make a copy of the original file.
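The link behavior described above can be verified with throwaway names (lnk_demo/oswork stands in for the real /advance/data2/oswork):

```shell
mkdir -p lnk_demo/oswork                      # stand-in for the real data directory
ln -s "$(pwd)/lnk_demo/oswork" lnk_demo/offshore
echo picks > lnk_demo/offshore/picks.dat      # write through the link...
ls lnk_demo/oswork                            # ...the file really lives here
rm lnk_demo/offshore                          # rm on the link removes only the link
ls lnk_demo/oswork                            # picks.dat is still intact
```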
ps -ax The ps (process status) command shows all of the processes currently running on the system. The -a tells the system to display all processes except process group leaders and processes not started from terminals. The x shows processes without control terminals. If you do not specify the x, then you may not see the process you are looking for. The -ax on Berkeley UNIX becomes -elf on System V UNIX: -l provides a long form of the listing, -f provides a full listing of the processes, and -e asks for every process on the system.
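The usual sequence of finding a process with ps and then stopping it with kill -9 can be sketched with a throwaway background job (the sleep here stands in for a locked-up process):

```shell
sleep 300 &                     # hypothetical runaway job to practice on
pid=$!
ps -e | grep sleep              # locate its process number (ps -ax on Berkeley systems)
kill -9 "$pid"                  # -9 cannot be caught or ignored by the process
wait "$pid" 2>/dev/null || true
kill -0 "$pid" 2>/dev/null || echo gone > kill_demo.status
```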
rcp -r neptune:/usr/disk2/offshore . rcp is the remote copy command. The -r, as with the cp command, is the recursive form of copy. It will copy the /usr/disk2/offshore directory and its subdirectories from the named server. The destination directory is ., the current working directory.
rmdir offshore The rmdir command removes directories. In this example the rmdir command will remove the directory offshore. rmdir will only remove an empty directory. If you still have entries in the directory, this command will fail. You can check the contents of the directory, to see if it contains files you meant to keep. Or you can use the rm -r command, at your own risk.
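A short illustration with invented directory names, showing when rmdir succeeds, when it refuses, and the rm -r fallback:

```shell
mkdir -p rmdemo/empty_dir
rmdir rmdemo/empty_dir                         # succeeds: the directory is empty
mkdir -p rmdemo/full_dir
touch rmdemo/full_dir/keep.txt
rmdir rmdemo/full_dir 2>/dev/null || echo "rmdir refused: directory not empty"
rm -r rmdemo/full_dir                          # rm -r removes it, contents and all
```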
tar c /advance/data/offshore The tar command (tape archive) is used for moving files to or from tape. The c means create, so a new tape will be created. The directory to be copied to tape is /advance/data/offshore. To copy more directories to tape, just list them after the first directory, separated by spaces. x in place of the c will extract files from the tape and copy them to the disk.
tar x with no files listed will read everything off the tape. tar x followed by a file name, directory or path will only read the data if it exists on the tape. This is a safe way to get back a specific dataset from the tape. The v option is verbose, so that you can see what the process is doing. Otherwise, like most UNIX processes, it is silent. You may wish to investigate cpio as a more versatile alternative to tar.
tar c ./offshore This tar command copies to tape the directory offshore and the files which belong to the directory offshore. The ./ preceding offshore indicates that offshore is a subdirectory of the current working directory. It is generally best to use relative path names (rather than full path names) when you are using tar.
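Because a tape drive may not be handy, the same key letters can be exercised against a disk archive via the f modifier (tardemo and its contents are invented for this sketch):

```shell
mkdir -p tardemo/offshore
echo "trace data" > tardemo/offshore/line1.dat
cd tardemo
tar cf offshore.tar ./offshore   # c creates; f writes to a disk file instead of tape
tar tf offshore.tar              # t lists the archive's table of contents
mkdir -p restore
cd restore
tar xf ../offshore.tar           # x extracts; the relative path ./offshore is recreated here
cd ../..
```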