
Logistic Cockpit (LC) is a technique used to extract logistics transaction data from R/3.


All the DataSources belonging to logistics can be found in the LO Cockpit
(Transaction LBWE) grouped by their respective application areas.

The DataSources for logistics are delivered by SAP as part of its standard business content in the SAP ECC 6.0 system and follow the naming convention below. A logistics transaction DataSource is named as follows:
2LIS_<Application>_<Event><Suffix>
where,

Every LO DataSource starts with 2LIS.

Application is specified by a two-digit number that identifies the application relating to a set of events in a process, e.g. application 11 refers to SD sales.

Event specifies the transaction that provides the data for the application, and is optional in the naming convention, e.g. event VA refers to creating, changing or deleting sales orders (Verkaufsauftrag is German for sales order).

Suffix specifies the details of the information that is extracted, e.g. ITM refers to item data, HDR to header data, and SCL to schedule lines.

Upon activation of the business content DataSources, all related components, such as the extract structure and the extractor program, are also activated in the system.

The extract structure can be customized at a later point in time to meet specific reporting requirements, and user exits can also be used to achieve this.
A generated extract structure follows the naming convention MC<Application><Event>0<Suffix>, where the suffix is optional. Thus, for example, 2LIS_11_VAITM (sales order item) has the extract structure MC11VA0ITM.
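As an illustration of this naming convention, the sketch below splits a DataSource name into its parts and derives the extract structure name. The helper functions are invented for this example, and the assumption that the event is always two characters is taken only from the VA example above:

```python
def parse_datasource(name):
    """Split an LO DataSource name 2LIS_<Application>_<Event><Suffix> into parts."""
    prefix, app, rest = name.split("_", 2)   # e.g. "2LIS", "11", "VAITM"
    assert prefix == "2LIS", "every LO DataSource starts with 2LIS"
    event, suffix = rest[:2], rest[2:]       # event is two characters, e.g. "VA"
    return app, event, suffix

def extract_structure(name):
    """Build the extract structure name MC<Application><Event>0<Suffix>."""
    app, event, suffix = parse_datasource(name)
    return f"MC{app}{event}0{suffix}"

print(extract_structure("2LIS_11_VAITM"))   # MC11VA0ITM
```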
Delta Initialization:

LO DataSources use the concept of setup tables to carry out the initial
data extraction process.

The presence of restructuring/setup tables prevents the BI extractors from directly accessing the frequently updated, large logistics application tables; the setup tables are used only for the initialization of data to BI.

To load data into the BI system for the first time, the setup tables have to be filled.

Delta Extraction:

Once the initialization of the logistics transaction data DataSource is successfully carried out, all subsequent new and changed records are extracted to the BI system using the delta mechanism supported by the DataSource.

The LO DataSources support the ABR delta mechanism, which is compatible with both DSOs and InfoCubes. The ABR delta creates deltas with after, before and reverse images that are updated directly to the delta queue, which is generated automatically after a successful delta initialization.

The after image provides the status after the change, the before image gives the status before the change with a minus sign, and the reverse image sends the record with a minus sign for deleted records.
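The additive behaviour of these images can be sketched with plain numbers. The record-mode letters and values below are invented for the example; the point is that before, after and reverse images sum to the correct net value when aggregated additively in an InfoCube:

```python
# A sales order item is created with quantity 10, then changed to 15,
# then deleted. The ABR delta produces these records (key figure only):
delta_records = [
    ("N", +10),   # new image: document created
    ("X", -10),   # before image: old value with a minus sign
    ("",  +15),   # after image: status after the change
    ("R", -15),   # reverse image: deleted record with a minus sign
]

# An InfoCube aggregates key figures additively, so the net quantity is:
net = sum(value for _, value in delta_records)
print(net)   # 0 -> the created-then-deleted item contributes nothing
```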

The type of delta provided by the LO DataSources is a push delta, i.e. the delta data records from the respective application are pushed to the delta queue before they are extracted to BI as part of the delta update. Whether a delta is generated for a document change is determined by the LO application. This is a very important aspect of the logistics DataSources, because the very program that updates the application tables for a transaction also triggers/pushes the data for the information systems, by means of an update type, which can be a V1 or a V2 update.

The delta queue for an LO DataSource is generated automatically after successful initialization and can be viewed in transaction RSA7, or in transaction SMQ1 under the name MCEX<Application>.

Update Method
The following three update methods are available:
1. Synchronous update (V1 update)
2. Asynchronous update (V2 update)
3. Collective update (V3 update)

Synchronous update (V1 update)

The statistics update is carried out at the same time as the document update in the application table; that is, whenever we create a transaction in R/3, the entries are written to the R/3 tables, and this takes place in the V1 update.

Asynchronous update (V2 update)

The document update and the statistics update take place in different tasks. The V2 update starts a few seconds after the V1 update, and in this update the values are written to the statistical tables, from where we extract the data into BW.

V1 and V2 updates do not require any scheduling activity.


Collective update (V3 update)

The V3 update uses delta queue technology similar to the V2 update. The main difference is that V2 updates are always triggered by the application, while the V3 update may be scheduled independently.

Update modes
1. Direct Delta
2. Queued Delta
3. Unserialized V3 Update

Direct Delta

With this update mode, extraction data is transferred directly to the BW delta queues with each document posting.

Each document posted with delta extraction is converted to exactly one LUW in the related BW delta queues.

In this update mode there is no need to schedule a job at regular intervals (through LBWE job control) in order to transfer the data to the BW delta queues. Thus, additional monitoring of update data or an extraction queue is not required.

This update method is recommended only for customers with a low occurrence of documents (a maximum of 10,000 document changes - creating, changing or deleting - between two delta extractions) for the relevant application.

Queued Delta

With the queued delta update mode, the extraction data is written to an extraction queue and then transferred to the BW delta queues by an extraction collective run.

If we use this method, it is necessary to schedule a job (the extraction collective run) to regularly transfer the data to the BW delta queues.

SAP recommends scheduling this job hourly during normal operation after successful delta initialization, but there is no fixed rule: it depends on the peculiarities of each specific situation (business volume, reporting needs, and so on).

Unserialized V3 Update

With the unserialized V3 update, the extraction data is written to an update table and then transferred to the BW delta queues by the V3 collective run.
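The difference between the three update modes can be sketched as a toy queue simulation. The class and method names are invented for this illustration; only the flow (document posting to an intermediate queue or table, then a collective run into the delta queue) follows the descriptions above:

```python
class DeltaQueues:
    """Toy model of how document postings reach the BW delta queue (RSA7)."""
    def __init__(self):
        self.extraction_queue = []   # used by queued delta (MCEX<Application>)
        self.update_table = []       # used by unserialized V3 update
        self.delta_queue = []        # what BW extracts (RSA7)

    def post_direct(self, doc):
        # Direct delta: each posting goes straight to the delta queue.
        self.delta_queue.append(doc)

    def post_queued(self, doc):
        # Queued delta: the posting lands in the extraction queue first.
        self.extraction_queue.append(doc)

    def post_v3(self, doc):
        # Unserialized V3: the posting lands in an update table first.
        self.update_table.append(doc)

    def collective_run(self):
        # The scheduled job (LBWE job control) moves waiting records on.
        self.delta_queue.extend(self.extraction_queue)
        self.delta_queue.extend(self.update_table)
        self.extraction_queue.clear()
        self.update_table.clear()

q = DeltaQueues()
q.post_direct("order 1")   # visible in the delta queue immediately
q.post_queued("order 2")   # waits for the extraction collective run
q.post_v3("order 3")       # waits for the V3 collective run
q.collective_run()
print(q.delta_queue)       # ['order 1', 'order 2', 'order 3']
```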

Setup Table

A setup table is a cluster table that is used to extract data from the R/3 tables of the same application.

Setup tables store the data before it is updated to the target system. Once you fill the setup tables, you need not read the application tables again and again, which in turn improves system performance.

The LO extractor reads data from the setup tables during initialization and full upload.

As the setup tables are required only for full and init loads, we can delete their data after loading in order to avoid duplicate data.

We fill the setup tables in LO using the OLI*BW transactions, or via SBIW → Settings for Application-Specific DataSources → Logistics → Managing Extract Structures → Initialization → Filling in the Setup Table → Application-Specific Setup of Statistical Data.

We can also delete the setup tables using transaction code LBWG, or application-wise via SBIW → Settings for Application-Specific DataSources → Logistics → Managing Extract Structures → Initialization → Delete the Contents of the Setup Tables.

The technical name of a setup table is <ExtractStructure>SETUP. For example, for the DataSource 2LIS_11_VAHDR the extract structure name is MC11VA0HDR, so the setup table name will be MC11VA0HDRSETUP.
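This naming rule is plain concatenation, which the source's own example confirms:

```python
# Setup table name = extract structure name + "SETUP"
extract_structure = "MC11VA0HDR"          # for DataSource 2LIS_11_VAHDR
setup_table = extract_structure + "SETUP"
print(setup_table)                        # MC11VA0HDRSETUP
```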

LUW

LUW stands for Logical Unit of Work. When we create a new document, it forms a new image (N); whenever an existing document is changed, it forms a before image (X) and an after image, and these after and before images together constitute one LUW.

Delta Queue (RSA7)

The delta queue stores records that have been generated since the last delta upload and have not yet been sent to BW.

Depending on the update mode selected, generated records will arrive in this delta queue either directly or via the extraction queue.

The delta queue (RSA7) maintains two images: the delta image and the repeat delta. When we run a delta load in the BW system, it sends the delta image; when a delta load fails and we request a repeat delta, it sends the repeat delta records.

Statistical Setup

The statistical setup is a program that is specific to an application component. Whenever we run this program, it extracts all the data from the database tables and puts it into the setup tables.
