
Specifying Your Own Parallel Stages

DataStage provides many built-in stages for job development. But if we want a stage that can be reused within our project, or that will remain useful over the longer life of a project, DataStage lets us build such stages efficiently, and they can then be used in our parallel jobs.

DataStage Manager provides an interface for developing new DataStage parallel job stage types.
Three different types of stage can be defined in DataStage. The explanation of each stage type is followed by examples and discussion. The first is the Custom stage type.

1. CUSTOM STAGE TYPE This stage type allows us to define an Orchestrate operator as a DataStage stage for use in parallel jobs. Once developed, the Custom stage is available to all jobs in the project in which it is defined. It can be made available to other projects using the DataStage Manager Export/Import facility, which we will discuss after developing a Custom stage type. To define a Custom stage type from DataStage Manager, choose File -> New Parallel Stage -> Custom.

Alternatively, we can build a Custom stage from the Stage Types folder in the Manager.

After clicking Custom, a stage properties page appears where all the information needed to build the Custom stage is entered. Custom stages can be created for Server and Parallel jobs, but not for Mainframe jobs.

Build Stage Macros:

These macros form an important part of writing the Pre-Loop, Post-Loop, and Per-Record logic of a Build stage. They are categorized as: Informational, Flow Control, Input and Output, and Transfer.

Informational Macros: These macros return the number of inputs, outputs, and transfers defined for the stage:

inputs(): Returns the number of inputs to the stage.
outputs(): Returns the number of outputs from the stage.
transfers(): Returns the number of transfers in the stage.
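As a sketch (the logic itself is illustrative, not taken from the DataStage documentation), the informational macros might be used in the Pre-Loop section of a Build stage to validate its link configuration:

```
// Pre-Loop logic: a C++ fragment entered in the Build stage editor.
// It cannot compile outside the code DataStage generates around it.
// Abort unless the stage was placed with exactly one input and one output.
if (inputs() != 1 || outputs() != 1)
{
    failedStep();   // flow-control macro: return fail status and abort the job
}
```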


Flow Control Macros: These macros override the default behavior of the Per-Record loop in the stage:

endLoop(): Causes the operator to stop looping, after completing the current loop and any subsequent output.
nextLoop(): Causes the operator to go immediately to the next loop, without writing any output.
failedStep(): Causes the operator to return a failed status and abort the job.

Input and Output Macros: These macros control the read, write, and transfer operations on records. They take arguments, described as follows:

input is the index of the input (0 to n). If an input port name has been defined, the index can be replaced with portname.portid.

output is the index of the output (0 to n). If an output port name has been defined, the index can be replaced with portname.portid.

index is the index of the transfer (0 to n).


The following macros each take one of the above arguments:

readRecord(input): Immediately reads the next record from input, if one is present. If no record is present, the next call to inputDone() will return true.
writeRecord(output): Immediately writes a record to output.
inputDone(input): Returns true if the last call to readRecord() for the specified input failed to read a new record.
holdRecord(input): Causes auto input to be suspended for the current record, so that the operator does not automatically read a new record at the start of the next loop. If auto is not set for the input, holdRecord() has no effect.
discardRecord(output): Causes auto output to be suspended for the current record, so that the operator does not output the record at the end of the current loop. If auto is not set for the output, discardRecord() has no effect.
discardTransfer(index): Causes auto transfer to be suspended, so that the operator does not perform the transfer at the end of the current loop. If auto is not set for the transfer, discardTransfer() has no effect.

Transfer Macros: The following macros are available:

doTransfer(index): Performs the transfer specified by index.
doTransfersFrom(input): Performs all transfers from input.
doTransfersTo(output): Performs all transfers to output.
transferAndWriteRecord(output): Performs all transfers and writes a record to the specified output. It is equivalent to calling doTransfersTo() followed by writeRecord().
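Putting the macro groups together, here is a minimal pass-through Per-Record sketch (assuming a single input and a single output with auto read disabled, so that records are read explicitly; the logic is illustrative, not from the DataStage documentation):

```
// Per-Record logic: a C++ fragment that runs once per loop iteration.
readRecord(0);                  // read the next record from input 0
if (inputDone(0))
{
    endLoop();                  // no record was read: stop looping
}
else
{
    transferAndWriteRecord(0);  // transfer the record and write it to output 0
}
```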

The UNIX command used in a Wrapped stage should be pipe-safe (i.e., it should read its input sequentially, from beginning to end).
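For illustration (these are ordinary UNIX tools, not DataStage commands): a filter such as tr is pipe-safe because it consumes standard input strictly from beginning to end, so it could be wrapped, whereas a command that needs to seek around in its input could not.

```shell
# tr reads stdin sequentially from start to finish -- pipe-safe.
printf 'header\ndata\n' | tr 'a-z' 'A-Z'
# prints:
# HEADER
# DATA
```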
