Navigating in SAP
Toolbar
Screen Icons
SAP Log on
OLTP: Online Transaction Processing (SAP SD, MM, FICO, ABAP, HR)
Basics:
The key figures are divided into two types. They are:
a. Cumulative key figures
b. Non-cumulative key figures

a. Cumulative key figures are used when the values in the key figure field need to be added up:

   Material | Amount
   E621     | 100
   E622     | 200
   E623     | 300
   Total    | 600

b. Non-cumulative key figures hold snapshot values; records in the 'Stock Value' field are not added:

   Plant | Material | Stock Value
   4002  | Pencil   | 500
   4002  | Pencil   | 600
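The two aggregation behaviours can be sketched in plain Python (an illustrative sketch only, not SAP code — the field names are taken from the tables above):

```python
# Illustrative sketch (not SAP code): how a query aggregates a
# cumulative key figure (sum) versus a non-cumulative one (snapshot).
records = [
    {"date": "28/04/2012", "amount": 100, "stock_value": 500},
    {"date": "29/04/2012", "amount": 200, "stock_value": 600},
]

def aggregate_cumulative(rows, field):
    # Cumulative key figures are simply added up across records.
    return sum(r[field] for r in rows)

def aggregate_noncumulative(rows, field):
    # Non-cumulative key figures are snapshots: the latest value is
    # reported, never the sum of the snapshots.
    return rows[-1][field]

print(aggregate_cumulative(records, "amount"))         # 300
print(aggregate_noncumulative(records, "stock_value")) # 600
```

Summing the stock snapshots (500 + 600) would report nonsense; the last snapshot (600) is the meaningful value.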
Part 1:
1. Go to RSA1
2. Go to 'Info Object' selection
3. Right click on the context menu > Select 'Create Info Area'
4. Give the technical name (always unique)
5. Give description
6. Click on Continue
Part 2:
1. Right click on Info Area > Select create 'Info Object Catalog'
2. Give technical name
3. Give description
4. Select Info object type 'Character'
5. Click on Activate button
Part 3:
1. Right click on Info Area > Select create 'Info Object Catalog'
2. Give technical name and description
3. Select info object type 'Key Figure'
4. Click on Activate button
Note: Non-cumulative key figures are used in MM- and HR-related reports; the stock values recorded on 28/04/2012 and 29/04/2012, for example, are snapshots and are not added together.

Part 4:
1. Click on Continue
2. Give mandatory options in the 'General' tab page (like Data type, length, ...)
3. Click on Activate button
Part 5:
1. Right click on the Info Object Catalog for key figures
2. Select create Info Object
3. Give technical name (length between 3 and 8)
4. Give description
5. Click on Continue
6. For key figures of type 'Amount' and 'Quantity' we have to give a 'Unit Characteristic' (0Currency / 0Unit)
7. Click on Activate button
Note: Master Data is always assigned to a characteristic. A characteristic is called a master data characteristic if it has attributes, texts and hierarchies.
i. Attributes: These are InfoObjects which explain a characteristic in detail. They are divided into two types:
a. Navigational attributes
b. Display attributes
Part 1:
1. Go to the Info Object of type characteristic
2. Go to 'Display/Change'
3. In the 'Master data text' tab page, check the 'With Master Data' checkbox
4. Go to the 'Attribute' tab page
5. Give the technical name of the attribute
6. Click Enter
7. Give description
8. Give data type, length
9. Click on Continue
10. Activate the info object
Part 2:
1. If the info object is already in the system, copy the technical name of the info object
2. Go to the 'Attribute' tab page of the characteristic
3. Paste the technical name of the info object
4. Click on Activate button
Note: A key figure can be an attribute of a characteristic, and it can only be a display attribute.
Steps to enable Texts:
1. Right click on the info object, select Change, go to the 'Master Data Text' tab page, and select the 'Text' checkbox
Report by Company Code:

Company Code | Amount
India        | 2000
USA          | 2500

Drilldown by Sales Org:

Company Code | Sales Org       | Amount
India        | Hyderabad       | 2000
India        | Bangalore       | 2000
USA          | New York        | 2500
USA          | Washington D.C  | 2500

Drilldown by Division:

Company Code | Sales Org       | Division        | Amount
India        | Hyderabad       | Ameerpet        | 1000
India        | Hyderabad       | Begumpet        | 1000
India        | Bangalore       | Electronic City | 1000
India        | Bangalore       | Silk Board      | 1000
USA          | New York        | 7th Street      | 1250
USA          | New York        | 9th lane        | 1250
USA          | Washington D.C  | 8th street      | 1250
USA          | Washington D.C  | 10th street     | 1250
To switch an attribute to navigational, go to the 'Attribute' tab page and, in the 'Navigation On/Off' column, click the pencil icon. When changing a display attribute to a navigational attribute, give a description and click on the Activate button.
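The drilldown that a navigational attribute enables can be simulated with a simple group-by. This is a hypothetical sketch: the data is simplified sample data in the spirit of the tables above (sales-org amounts chosen so they sum to the company totals), not SAP code:

```python
# Hypothetical data: (company code, sales org, amount) fact rows.
from collections import defaultdict

rows = [
    ("India", "Hyderabad", 1000), ("India", "Bangalore", 1000),
    ("USA", "New York", 1250), ("USA", "Washington D.C", 1250),
]

def drilldown(rows, level):
    # Grouping by company alone or by (company, sales org) mimics
    # turning the navigational attribute drilldown on in a query.
    totals = defaultdict(int)
    for company, sales_org, amount in rows:
        key = company if level == "company" else (company, sales_org)
        totals[key] += amount
    return dict(totals)

print(drilldown(rows, "company"))    # {'India': 2000, 'USA': 2500}
print(drilldown(rows, "sales_org"))  # per (company, sales org) totals
```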
In the classical star schema, the characteristic records are stored directly in the dimension (DIM) tables.
For every dimension table, a DIM ID is generated and stored in the fact table.
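The DIM ID mechanism can be sketched with plain dictionaries (an illustrative sketch with made-up sample values, not SAP code):

```python
# Minimal sketch of a classical star schema: each dimension row gets a
# DIM ID, and the fact table stores only DIM IDs plus key figures.
dim_material = {1: {"material": "Pencil", "plant": "4002"}}
dim_time     = {1: {"date": "28/04/2012"}, 2: {"date": "29/04/2012"}}

fact_table = [
    {"dim_material": 1, "dim_time": 1, "stock_value": 500},
    {"dim_material": 1, "dim_time": 2, "stock_value": 600},
]

# Resolving a fact row back to readable values joins via the DIM IDs.
row = fact_table[0]
resolved = {**dim_material[row["dim_material"]],
            **dim_time[row["dim_time"]],
            "stock_value": row["stock_value"]}
print(resolved["material"], resolved["stock_value"])  # Pencil 500
```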
Creating an InfoCube
In BW, Customer ID, Material Number, Sales Representative ID, Unit of Measure, and Transaction Date are called characteristics. Customer Name and Customer Address are attributes of Customer ID, although they are characteristics as well. Per Unit Sales Price, Quantity Sold, and Sales Revenue are referred to as key figures. Characteristics and key figures are collectively termed InfoObjects.
A key figure can be an attribute of a characteristic. For instance, Per Unit Sales Price can be an attribute of Material Number. In our examples, Per Unit Sales Price is a fact table key figure. In the real world, such decisions are made during the data warehouse design phase. InfoCube Design provides some guidelines for making such decisions.
InfoObjects are analogous to bricks. We use these objects to build InfoCubes. An InfoCube comprises the fact table and its associated dimension tables in a star schema.
In this chapter, we will demonstrate how to create an InfoCube that implements the star schema from the figure. We start by creating an InfoArea. An InfoArea is analogous to a construction site, on which we build InfoCubes.
Creating an InfoArea
In BW, InfoAreas are the branches and nodes of a tree structure. InfoCubes are listed under the branches and nodes. The relationship of InfoAreas to InfoCubes in BW resembles the relationship of directories to files in an operating system. Let's create an InfoArea first, before constructing the InfoCube.
Work Instructions
Step 1. After logging on to the BW system, run transaction RSA1, or double-click Administrator Workbench.
Step 2. In the new window, click Data targets under Modelling in the left panel. In the right panel, right-click InfoObjects and select Create InfoArea.
Note: In BW, InfoCubes and ODS objects are collectively called data targets.
Step 3. Enter a name and a description for the InfoArea, and then click the Continue (checkmark) icon.
Result
The InfoArea has been created.
Creating InfoObject Catalogs
Before we can create an InfoCube, we must have InfoObjects. Before we can create InfoObjects, however, we must have InfoObject Catalogs. Because characteristics and key figures are different types of objects, we organize them within their own separate folders, which are called InfoObject Catalogs. Like InfoCubes, InfoObject Catalogs are listed under InfoAreas.
Having created an InfoArea, let's now create InfoObject Catalogs to hold characteristics and key figures.
Work Instructions
Step 1. Click InfoObjects under Modelling in the left panel. In the right panel, right-click InfoArea demo, and select Create InfoObject catalog.
Step 2. Enter a name and a description for the InfoObject Catalog, select the option Char., and then click the Create icon to create the InfoObject Catalog.
Step 3. In the new window, click the Check icon to check the InfoObject Catalog. If it is valid, click the Activate icon to activate the InfoObject Catalog. Once the activation process is finished, the status message "InfoObject catalog IOC_DEMO_CH activated" appears at the bottom of the screen.
Result
Click the Back icon to return to the previous screen. The newly created InfoObject Catalog will be displayed, as shown in the screen.
Following the same procedure, we create an InfoObject Catalog to hold key figures. This time, make sure that the option Key figure is selected.
Creating InfoObjects - Characteristics
Now we are ready to create characteristics.
Work Instructions
Step 1. Right-click InfoObject Catalog "demo: characteristics", and then select Create InfoObject.
Step 2. Enter a name and a description, and then click the Continue icon.
Step 3. Select CHAR as the Data Type, enter 15 for the field Length, and then click the tab Attributes.
Step 4. Enter the attribute name IO_MATNM, and then click the Create icon.
Note: Notice that IO_MATNM is underlined. In BW, the underline works like a hyperlink. After IO_MATNM is created, when you click IO_MATNM, the hyperlink will lead you to IO_MATNM's detail definition window.
Step 5. Select the option Create attribute as characteristic, and then click the Continue icon.
Step 6. Select CHAR as the Data Type, and then enter 30 for the field Length. Notice that the option Exclusively attribute is selected by default. Click the Continue icon.
Note: If Exclusively attribute is selected, the attribute IO_MATNM can be used only as a display attribute, not as a navigational attribute. "InfoCube Design Alternative I - Time-Dependent Navigational Attributes" discusses an example of navigational attributes.
Selecting Exclusively attribute allows you to select Lowercase letters. If the option Lowercase letters is selected, the attribute can accept lowercase letters in the data to be loaded.
If the option Lowercase letters is selected, no master data tables, text tables, or another level of attributes underneath are allowed. "BW Star Schema" describes master data tables and text tables, and explains how they relate to a characteristic.
Step 7. Click the Activate icon.
Step 8. A window is displayed asking whether you want to activate dependent InfoObjects. In our example, the dependent InfoObject is IO_MATNM. Click the Continue icon.
Result
You have now created the characteristic IO_MAT and its attribute IO_MATNM.
Note: Saving an InfoObject means saving its properties, or meta-data. You have not yet created its database tables.
The column "Assigned to" specifies the characteristic to which an attribute is assigned. For example, IO_MATNM is an attribute of IO_MAT.
The Material Description in the table will be treated as IO_MAT's text, as shown in "Creating InfoPackages to Load Characteristic Data." We do not need to create a characteristic for it.
IO_SREG and IO_SOFF are created as independent characteristics, instead of IO_SREP's attributes. Section 3.6, "Entering the Master Data, Text, and Hierarchy Manually," explains how to link IO_SOFF and IO_SREG to IO_SREP via a sales organization hierarchy. "InfoCube Design Alternative I - Time-Dependent Navigational Attributes" discusses a new InfoCube design in which IO_SOFF and IO_SREG are IO_SREP's attributes.
BW provides characteristics for units of measure and time. We do not need to create them. From Administrator Workbench, we can verify that the characteristics in the table have been created by clicking InfoArea demo, and then clicking InfoObject Catalog "demo: characteristics".
Creating InfoObjects - Key Figures
Following a similar procedure, create the key figure IO_PRC in the key figure catalog and click the Activate icon.
Result
You have created the key figure IO_PRC. A status message "All InfoObject(s) activated" will appear at the bottom of the screen.
Repeat the preceding steps to create the other key figures listed.
KEY FIGURES
From Administrator Workbench, we can verify that the key figures in the table have been created.
Note: An InfoCube can be a basic cube, a multi-cube, an SAP remote cube, or a general remote cube. A basic cube has a fact table and associated dimension tables, and it contains data. We are building a basic cube.
A multi-cube is a union of multiple basic cubes and/or remote cubes to allow cross-subject analysis. It does not contain data. See "Aggregates and Multi-Cubes" for an example.
A remote cube does not contain data; instead, the data reside in the source system. A remote cube is analogous to a channel, allowing users to access the data using BEx. As a consequence, querying the data leads to poor performance.
If the source system is an SAP system, we need to select the option SAP RemoteCube. Otherwise, we need to select the option Gen. Remote Cube. This book will not discuss remote cubes.
Step 3. Select IO_CUST, IO_MAT, and IO_SREP from the Template table, and move them to the Structure table by clicking the arrow icon. Next, click the Dimensions button to create dimensions and assign these characteristics to the dimensions.
Step 4. Click the Create icon to create a dimension.
Note: BW automatically assigns technical names to each dimension with the format <InfoCube name><Number starting from 1>.
The fixed dimensions <InfoCube name><P|T|U> are reserved for Data Packet, Time, and Unit. The section "Data Load Requests" discusses the Data Packet dimension.
A dimension uses a key column in the fact table. In most databases, a table can have a maximum of 16 key columns. Therefore, BW mandates that an InfoCube can have a maximum of 16 dimensions: three are reserved for Data Packet, Time, and Unit; the remaining 13 are left for us to use.
Repeat the same procedure to create two other dimensions. Next, click the Assign tab to assign the characteristics to the dimensions.
Step 5. Select a characteristic in the "Characteristics and assigned dimension" block, select the dimension to which the characteristic will be assigned in the Dimensions block, and then click the Assign icon to assign the characteristic to the dimension.
Result:
You have created the InfoCube IC_DEMOBC. A status message "InfoCube IC_DEMOBC activated" will appear at the bottom of the screen.
Summary
In this chapter, we created an InfoCube. To display its data model, you can right-click InfoCube "demo: Basic Cube", then select Display data model. The data model appears in the right panel of the screen.
Note:
IO_SREG and IO_SOFF are not listed under IO_SREP as attributes; rather, they have been created as independent characteristics. "Entering the Master Data, Text, and Hierarchy Manually" describes how to link IO_SOFF and IO_SREG to IO_SREP via a sales organization hierarchy. "InfoCube Design Alternative I - Time-Dependent Navigational Attributes" discusses a new InfoCube design in which IO_SOFF and IO_SREG are IO_SREP's attributes.
InfoCube
An InfoCube is structured as an (extended) star schema, where a fact table is surrounded by different dimension tables that are linked to it via DIM IDs. Data-wise, the cubes hold aggregated data.
An InfoCube contains a maximum of 16 dimensions (3 SAP-defined and 13 customer-defined) and a minimum of 4 dimensions (3 SAP-defined and 1 customer-defined), with a maximum of 233 key figures and 248 characteristics per dimension.
The following InfoCube types exist in BI:
- InfoCubes
- VirtualProviders
There are two subtypes of InfoCubes: Standard and Real-Time. Although both have an extended star schema design, Real-Time InfoCubes (previously called Transactional InfoCubes) are optimized for direct update and do not need to use the ETL process. Real-Time InfoCubes are almost exclusively used in the BI Integrated Planning tool set. All BI InfoCubes consist of a quantity of relational tables arranged together in a star schema.
Star Schema
In the star schema model, the fact table is surrounded by dimension tables. The fact table is usually very large: it contains millions to billions of records. The dimension tables, on the other hand, are comparatively small, containing a few thousand to a few million records. In practice, the fact table holds transaction data and the dimension tables hold master data.
The dimension tables are specific to a fact table; they are not shared across other fact tables. When another fact table needs the same product dimension data, another dimension table that is specific to the new fact table is needed.
This situation creates data management problems such as master data redundancy, because the very same product is duplicated in several dimension tables instead of being shared from one single master data table. This problem is solved in the extended star schema.
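The fix can be sketched in a few lines of Python (an illustrative sketch with made-up names and SIDs, not SAP code): one shared master-data table, referenced by SID from the dimension tables of several cubes.

```python
# Extended-star-schema idea: master data lives once, keyed by SID,
# and every cube's dimension table only stores the SID reference.
master_product = {10: {"product": "Pencil", "group": "Stationery"}}

cube_sales_dim = [{"dim_id": 1, "product_sid": 10}]
cube_stock_dim = [{"dim_id": 1, "product_sid": 10}]

def product_for(dim_row):
    # Both cubes resolve the same SID against the single master table,
    # so the product record is never duplicated per cube.
    return master_product[dim_row["product_sid"]]

# The two cubes see the very same master record, not copies of it.
print(product_for(cube_sales_dim[0]) is product_for(cube_stock_dim[0]))
```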
InfoArea
In BW, InfoAreas are the branches and nodes of a tree structure. InfoProviders are listed under the branches and nodes. The relationship of InfoAreas to InfoProviders in BW is similar to the relationship of directories to files in an operating system.
Steps to create an InfoArea:
Step 1: After logging on to the BW system, run transaction RSA1.
Step 2: In the new window, click the InfoProvider tab under Modeling in the left panel. In the right panel, right-click InfoProvider and select Create InfoArea.
Step 3: Enter a name and a description for the InfoArea, and then click Continue.
InfoObjects
InfoObjects are the smallest pieces in the SAP BW puzzle. They are used to describe business information and processes. Typical examples of InfoObjects are: Customer Name, Region, Currency, Revenue, Fiscal Year.
There are five types of SAP BW InfoObjects: Key figures, Characteristics, Unit characteristics, Time characteristics, and Technical characteristics.
Key figures
Key figures describe numeric information that is reported on in a query. The most common types of key figures are:
- Amount;
- Quantity;
- Number;
- Integer.
Characteristics
Characteristics describe business objects in BW, like products, customers, employees, and attributes like color, material, company code. They enable us to set selection criteria by which we display the required data.
Unit characteristics
Unit characteristics give meaning to key figure values; they store currencies or units of measure (e.g., currency unit, value unit).
Time characteristics
Time characteristics describe the time reference of business events. They build the time dimension, an obligatory part of the InfoCube. The complete time characteristics (clearly assigned to a point in time) provided by SAP are: calendar day (0CALDAY), calendar week (0CALWEEK), calendar month (0CALMONTH), calendar quarter (0CALQUARTER), calendar year (0CALYEAR), and fiscal year (0FISCYEAR).
Technical characteristics
Technical characteristics have administrative purposes (e.g., storing the request ID or change ID).
InfoObject catalogs
SAP BW InfoObjects are stored in InfoObject catalogs, with Key figures and Characteristics (all types) kept separately. Usually there are two InfoObject catalogs (one for Key figures, one for Characteristics) defined for every business context in an SAP BW implementation.
Detailed information on a particular InfoObject can be found in the Modeling area of the Data Warehousing Workbench (TCode: RSA1 -> Modeling -> InfoObjects).
MultiProvider
A MultiProvider is a special InfoProvider that combines data from several InfoProviders, providing it for reporting. The MultiProvider itself (like InfoSets and VirtualProviders) does not contain any data; its data comes exclusively from the InfoProviders on which it is based. A MultiProvider can be made up of various combinations of the following InfoProviders:
- InfoCubes
- DataStore objects
- InfoObjects
- InfoSets
- Aggregation levels (slices of an InfoCube to support BI Integrated Planning)
Use
A BEx query can only be written against a single InfoProvider. A MultiProvider is a single InfoProvider to a query, but through it, multiple providers can be indirectly accessed.
DataStore object
Since a DataStore object is designed like a table, it contains key fields (document number and item, for example) and data fields. Data fields can be not only key figures but also character fields (order status, customer, or time, for example). You can use a delta update to update DataStore object data into connected InfoCubes or into additional DataStore objects or master data tables (attributes or texts) in the same system or in different systems. In contrast to the multidimensional data stores of InfoCubes, data in DataStore objects is stored in flat, transparent database tables. Fact and dimension tables are not created.
With DataStore objects, you can not only update key figures cumulatively, as with InfoCubes, but also overwrite data fields. This is especially important for transaction-level documents that change in the source system. Here, document changes involve not only numerical fields, such as order quantities, but also non-numerical ones such as ship-to party, delivery date, and status. Since the OLTP system overwrites these records when changes occur, DataStore objects must often be modeled to overwrite the corresponding fields and update to the current value in BI.
DataStore Object Types
SAP BI distinguishes between three DataStore object types: Standard, Write-Optimized, and Direct Update. These three flavors of DataStore objects are shown in the following figure.
1. The Standard DataStore object consists of three tables (activation queue, active data table, and change log). It is completely integrated in the staging process. In other words, data can be loaded into and out of DataStore objects during the staging process. Using a change log means that all changes are also written and are available as delta uploads for connected data targets.
Architecture and Functions of Standard DataStore Objects
Standard DataStore objects consist of three tables:
Active Data table
This is where the current status of the data is stored. This table contains a semantic (business-related)
key that can be defined by the modeler (order number, item, or schedule line, for example). It is very
important that the key be correctly defined by the modeler, as a match on the key initiates special delta
processing during the activation phase (discussed later). Also, reporting via the BEx uses this table.
Change Log table
During the activation run, changes are stored in the change log. Here, you can find the complete history
of the changes, since the content of the change log is not automatically deleted. The connected targets
are updated from the change log if they are supplied with data from the DataStore object in the delta
method. The change log is a PSA table and can also be maintained in the PSA tree of the Data
Warehousing Workbench. The change log has a technical key consisting of a request, data package, and
data record number.
Activation Queue table
During the DTP, the records are first written to this table. This step is necessary due to the complex logic
that is then required by the activation process.
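The interplay of the three tables can be sketched as a simplified model in Python. This is a hypothetical sketch of the activation logic only (the field names and `recordmode` flag are simplified stand-ins, not the real SAP table layout):

```python
# Simplified model of standard-DSO activation: records from the
# activation queue overwrite the active table by semantic key, and
# before/after images are written to the change log for delta updates.
def activate(active, queue, change_log):
    for rec in queue:
        key = rec["order"]              # semantic (business) key
        old = active.get(key)
        if old is not None:
            # Before image: lets delta targets reverse the old values.
            change_log.append({**old, "recordmode": "X"})
        change_log.append({**rec, "recordmode": ""})   # new/after image
        active[key] = rec               # overwrite, not accumulate
    queue.clear()

active, change_log = {}, []
activate(active, [{"order": 4711, "status": "open"}], change_log)
activate(active, [{"order": 4711, "status": "shipped"}], change_log)
print(active[4711]["status"])   # shipped
print(len(change_log))          # 3: new image, then before + after image
```

Note how the second load overwrites the status instead of adding to it — the behaviour that distinguishes a DSO from an InfoCube.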
Attributes
Attributes are InfoObjects that already exist and that are logically assigned to the new characteristic.
Navigational Attributes
A navigational attribute is an attribute of a characteristic that is treated very much like a characteristic in the Query Designer: one can perform drilldowns, filters, etc. on it while designing a query.
Important notes:
- While creating the InfoObject, on the Attributes tab page, the navigational attribute has to be switched on.
- While designing the cube, we need to check-mark the navigational attributes in order to make use of them.
Features / Advantages
- A navigational attribute acts like a characteristic during reporting; all navigation functions in the OLAP processor are also possible.
- The data is fetched from the master data tables, not from the InfoCube.
Disadvantages:
- In the enhanced star schema of an InfoCube, navigational attributes lie one join further out than characteristics. This means that a query with a navigational attribute has to run an additional join.
- If a navigational attribute is used in an aggregate, the aggregate has to be adjusted using a change run as soon as new values are loaded for the navigational attribute.
(See http://help.sap.com/saphelp_nw04s/helpdata/EN/80/1a63e7e07211d2acb80000e829fbfe/frameset.htm)
Transitive Attributes
A navigational attribute of a navigational attribute is called a transitive attribute. In other words, if a navigational attribute itself has further navigational attributes (as its attributes), those are called transitive attributes.
For example, consider a characteristic Material. It has Plant as its navigational attribute, and Plant in turn has a navigational attribute Material Group. Material Group is thus a transitive attribute, and a drilldown is possible on both Plant and Material Group.
Note that we need both Material and Plant in the InfoCube to drill down. (To fetch the data through navigational attributes we need the master data tables; hence we need to check-mark/select both of them in the cube.)
(See http://help.sap.com/saphelp_nw04s/helpdata/EN/6f/c7553bb1c0b562e10000000a11402f/frameset.htm)
If the cube contains both Material and Plant, the dimension table containing them will have the DIM ID, the SID of Material, and the SID of Plant. Since both SIDs exist, the reference to each navigational attribute is made correctly.
Attribute Only
A characteristic flagged as attribute-only can still be used in DSOs, InfoSets, and characteristics as InfoProviders, but in these InfoProviders the characteristic is not visible during read access (at run time). This means it is not available in the query, and if the InfoProvider is used as the source of a transformation or DTP, the characteristic is not visible. It is only for display in a query and cannot be used for drilldowns while reporting.
Exclusively attribute:
If you choose "exclusively attribute", then the created key figure can only be used as an attribute for another characteristic; it cannot be used as a dedicated key figure in the InfoCube.
While creating a key figure: on the tab page Additional Properties, check the box "Attributes Only".
(See http://help.sap.com/saphelp_nw04s/helpdata/en/a0/eddc370be9d977e10000009b38f8cf/frameset.htm)
Outer Join:
With an outer join you can join tables even when there is no entry in all of the tables used in the view.
Inner join between Table 1 and Table 2, where column D in both tables is set as the join condition:

Table 1              Table 2
A  B  C  D           D  E  F  G  H
a1 b1 c1 1           1  e1 f1 g1 h1
a2 b2 c2 1           3  e2 f2 g2 h2
a3 b3 c3 2           4  e3 f3 g3 h3
a4 b4 c4 3

Inner Join result:
A  B  C  D  D  E  F  G  H
a1 b1 c1 1  1  e1 f1 g1 h1
a2 b2 c2 1  1  e1 f1 g1 h1
a4 b4 c4 3  3  e2 f2 g2 h2

Left outer join between Table 1 and Table 2, where column D in both tables is set as the join condition:

Left Outer Join result:
A  B  C  D  D     E     F     G     H
a1 b1 c1 1  1     e1    f1    g1    h1
a2 b2 c2 1  1     e1    f1    g1    h1
a3 b3 c3 2  NULL  NULL  NULL  NULL  NULL
a4 b4 c4 3  3     e2    f2    g2    h2
What makes the difference between an Inner Join and a Left Outer Join?
An inner join returns only the matching records from both tables. A left outer join returns all records of the left table: those matching the right table as well as the non-matching ones.
The data that can be selected with a view depends primarily on whether the view implements an inner join or an outer join. With an inner join, you only get the records of the cross-product for which there is an entry in all tables used in the view. With an outer join, records are also selected for which there is no entry in some of the tables used in the view.
The set of hits determined by an inner join can therefore be a subset of the hits determined with an outer join.
Database views implement an inner join: the database only provides those records for which there is an entry in all the tables used in the view. Help views and maintenance views, however, implement an outer join.
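The two join semantics can be reproduced with the sample tables above in a few lines of Python (an illustrative sketch, not ABAP):

```python
# Table 1 rows are (A, B, C, D); Table 2 is keyed by D -> (E, F, G, H).
table1 = [("a1", "b1", "c1", 1), ("a2", "b2", "c2", 1),
          ("a3", "b3", "c3", 2), ("a4", "b4", "c4", 3)]
table2 = {1: ("e1", "f1", "g1", "h1"),
          3: ("e2", "f2", "g2", "h2"),
          4: ("e3", "f3", "g3", "h3")}

def inner_join(t1, t2):
    # Only rows whose D value exists in both tables survive.
    return [row + t2[row[3]] for row in t1 if row[3] in t2]

def left_outer_join(t1, t2):
    # Every left row survives; missing matches are padded with None (NULL).
    return [row + t2.get(row[3], (None,) * 4) for row in t1]

print(len(inner_join(table1, table2)))       # 3 rows (a3 is dropped)
print(len(left_outer_join(table1, table2)))  # 4 rows (a3 padded with NULLs)
```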
Temporal Join
A temporal join is a join containing at least one time-dependent characteristic. For example, a join contains the following time-dependent InfoObjects (in addition to other objects that are not time-dependent):

InfoObjects in the join       Valid from    Valid to
Cost center (0COSTCENTER)     01.01.2009    31.05.2009
Profit center (0PROFIT_CTR)   01.03.2009    30.09.2009

Where the two time intervals overlap, i.e. the validity area that the InfoObjects have in common, is known as the valid time-interval of the temporal join:

Temporal join    Valid from    Valid to
Valid interval   01.03.2009    31.05.2009
You define an InfoSet via the characteristic Profit Center, which contains the responsible person (RESP) as a time-dependent attribute, and the characteristic Cost Center, which also contains a time-dependent responsible person.

Profit Center    Responsible Person    DATEFROM*     DATETO*
BI               A                     01.01.2009    30.06.2009
BI               B                     01.07.2009    31.12.9999

The temporal join slices the validity intervals, so cost center 4711 appears once per valid combination:

Cost Center    Profit Center    RESP    Responsible Person (validity)
4711           BI               A       X (01.01.2009-31.05.2009)
4711           BI               A       Y (01.06.2009-30.06.2009)
4711           BI               B       Y (01.07.2009-31.12.2009)
4711           BI               B       Z (01.01.2009-31.12.9999)
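Computing the valid time-interval of a temporal join is just an interval intersection: latest "valid from", earliest "valid to". A small sketch (illustrative only; dates taken from the cost center / profit center example above):

```python
# Valid time-interval of a temporal join = overlap of all validity intervals.
from datetime import date

def valid_interval(intervals):
    # The join is valid from the latest 'valid from' to the earliest
    # 'valid to'; an empty overlap means there is no common validity.
    start = max(f for f, _ in intervals)
    end = min(t for _, t in intervals)
    return (start, end) if start <= end else None

cost_center = (date(2009, 1, 1), date(2009, 5, 31))
profit_center = (date(2009, 3, 1), date(2009, 9, 30))
print(valid_interval([cost_center, profit_center]))
# -> 01.03.2009 to 31.05.2009, matching the table above
```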
Equal Join
A join condition determines the combination of records from the individual objects that are included in the resulting set. Before an InfoSet can be activated, the join conditions have to be defined in such a way (as equal join conditions) that all the available objects are connected to one another either directly or indirectly.
An equal join is possible only between equal values; the technical requirement is that both values have the same data type and length.
Application Component
Application Components are used to organize DataSources. They are analogous to InfoAreas.
DataSource
A DataSource is not only a structure in which source system fields are logically grouped together, but also an object that contains ETTL-related information.
Four types of DataSources exist:
- DataSources for transaction data
- DataSources for characteristic attributes
- DataSources for characteristic texts
- DataSources for characteristic hierarchies
If the source system is R/3, replicating DataSources from a source system will create identical DataSource structures in the BI/BW system.
InfoPackage:
An InfoPackage specifies when and how to load data from a given source system. BW generates a 30-digit code starting with ZPAK as an InfoPackage's technical name.
PSA
The Persistent Staging Area is a data staging area in BW. It allows us to check data in an intermediate location before the data is sent on to its destinations in BW.
The PSA stores data in its original source system format, giving us a chance to examine and analyse the data before we process it on to further destinations. It is typically a temporary storage area, based on the client's data specifications and settings.
SID
A SID (Surrogate ID) translates a potentially long key for an InfoObject into a short four-byte integer, which saves I/O and memory during OLAP.
Star schema
A star schema is a technique used in data warehouse database design to help data retrieval for online analytical processing (OLAP).
Business Content
Business Content is a complete set of BI/BW objects developed by SAP to support the OLAP tasks. It contains roles, workbooks, queries, InfoCubes, key figures, characteristics, transformations, and extractors for SAP R/3 and other mySAP solutions.
Compound attribute
A compound attribute differentiates a characteristic to make the characteristic uniquely identifiable. For example, if the same characteristic data from different source systems mean different things, then we can add the compound attribute 0SOURSYSTEM (source system ID) to the characteristic; 0SOURSYSTEM is provided with the Business Content.
Data Warehouse
A Data Warehouse is a dedicated reporting and analysis environment based on the star schema database design technique, requiring special attention to the data ETTL process.
Delta update
The Delta update option in the InfoPackage definition requests BI/BW to load only the data that have accumulated since the last update. Before a delta update occurs, the delta process must be initialized.
A SID is a surrogate ID generated by the system. The SID tables are created when we create a master data InfoObject.
In the SAP BI extended star schema, a distinction is made between two self-contained areas: the InfoCube on one side, and the master data tables with their connecting SID tables on the other.
The master data does not reside in the extended star schema itself but in separate tables, which are shared across all the star schemas in SAP BI.
A unique numeric ID, the SID, is generated; it connects the dimension tables of the InfoCube to the master data tables.
The dimension tables contain the DIM IDs and the SIDs of a particular characteristic InfoObject. Using the SID table, the master data (attributes and texts of the InfoObject) is accessed.
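SID assignment itself is simple to picture: each new characteristic value gets the next integer, and queries then work with the short SIDs only. A sketch (illustrative, with a made-up material key format — not the real number-range mechanism SAP uses):

```python
# Sketch of SID generation: a characteristic value seen for the first
# time gets the next integer SID; later lookups reuse it.
sid_table = {}  # plays the role of an S table, e.g. /BI0/SMATERIAL

def get_sid(value):
    # Assign a new surrogate ID on first sight, reuse it afterwards.
    if value not in sid_table:
        sid_table[value] = len(sid_table) + 1
    return sid_table[value]

print(get_sid("MATERIAL-0000004711"))  # 1
print(get_sid("MATERIAL-0000004712"))  # 2
print(get_sid("MATERIAL-0000004711"))  # 1 (reused, not reassigned)
```

The long character keys stay in the master data area; the fact and dimension tables only ever carry the compact integers.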
List of Technical Tables
F - Fact Table - uncompressed fact table - contains data for the cube request-wise (B-tree index)
E - Fact Table - compressed cube - contains compressed data without request IDs (request ID = 0) (bitmap index)
M - View of Master Data Table - /BI0/MMATERIAL
P - Time Independent Master Data Table - /BI0/PMATERIAL
Q - Time Dependent Master Data Table - /BI0/QMATERIAL
H - Hierarchy table - /BI0/HMATERIAL
J - Hierarchy interval table - /BI0/JMATERIAL
K - Hierarchy SID table - /BI0/KMATERIAL
I - SID Hierarchy structure - /BI0/IMATERIAL
S - SID table - /BI0/SMATERIAL
X - Time Independent SID table for Attr - /BI0/XMATERIAL
Y - Time Dependent SID table for Attr - /BI0/YMATERIAL
T - Text Table - /BI0/TMATERIAL
SID -- Master Data Tables
Surrogate keys (SIDs) are automatically generated uniform keys that uniquely identify specific real-world key values.
SIDs are the connecting link between the DIM IDs and the master data tables.
As an example, consider the material master data tables and their various connections to the SID table.
Compounding InfoObject
In compounding, a field or another object is attached to an InfoObject. A characteristic is a
compounding characteristic when its definition is incomplete without the definition of
another characteristic InfoObject.
For example, the InfoObject Location (0PP_LOCAT) has to be
compounded with the InfoObject Plant (0PLANT).
The InfoObject 0PLANT has to be installed/created/activated first, followed by
Location (0PP_LOCAT).
While creating the InfoObject we assign the superior object on the
Compounding tab page of the InfoObject.
The compounding InfoObject acts as part of the primary key of the master data
table.
When a compounded InfoObject is included in an InfoCube, all corresponding InfoObjects
are added to the InfoCube.
When a compounded InfoObject is included in a DSO, all corresponding objects are
added to the DSO key fields/data fields.
The total length of the compounded InfoObjects cannot exceed 60 characters.
An InfoObject defined with the 'attribute only' setting cannot be included in
compounding.
In BEx report output the compounded InfoObject appears as 0PLANT/0PP_LOCAT.
SAP BI Terminology
Info Area
Info Area is like Folder in Windows. InfoArea is used to organize InfoCubes, InfoObjects,
MultiProviders, and InfoSets in SAP BW.
InfoObject Catalog
Similar to InfoArea, InfoObject Catalog is used to organize the InfoObject based on their type. So
we will have InfoObjects Catalogs of type Characteristics & KeyFigures.
Info Objects
It is the basic unit or object in SAP BI, used to create any structure in SAP BI.
Transfer Structure indicates what fields and in what sequence are they being transferred from the
source system.
We use Source system connection to connect different OLTP applications to SAP BI.
All properties what we see in the InfoPackage depends on the properties of the DataSource.
BI/BW Tips
BI Tip # 1
Struggling because there is no sample data for your newly developed InfoCube? Why not try this?
Use the ABAP program CUBE_SAMPLE_CREATE. It allows you to enter sample data directly
into your cube without using any flat files or source system configuration. Records are added to the
cube in one request using the APO interface, without a monitor log.
Try exploring even further with all the options available there.
Needless to say try it on Sandbox, development system before attempting on production environment.
BI Tip # 2
To check whether your query has already been migrated to the SAP BI7 version:
Check the table RSZCOMPDIR: enter your query's technical name in the field COMPID and
execute.
If the field VERSION in the table RSZCOMPDIR has a value less than 100, the query is still in the 3.x
version. If it is more than 100, it has already been migrated.
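The check above can also be scripted as a small ABAP report. This is a sketch following the rule stated in the tip: the VERSION threshold of 100 comes from the text, while the report name is hypothetical.

```abap
REPORT z_check_query_version.        " hypothetical report name

PARAMETERS p_query TYPE rszcompdir-compid.

DATA lv_version TYPE rszcompdir-version.

SELECT SINGLE version FROM rszcompdir
  INTO lv_version
  WHERE compid = p_query.

IF sy-subrc <> 0.
  WRITE: / 'Query not found in RSZCOMPDIR'.
ELSEIF lv_version < 100.
  WRITE: / 'Query is still in 3.x version'.
ELSE.
  WRITE: / 'Query is already migrated to BI7'.
ENDIF.
```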
BI Tip # 3
A couple of interesting tricks. RSDG_MPRO_ACTIVATE is a program to activate MultiProviders directly in the production system. If
there are any inactive MultiProviders due to a transport or any other reason, this program activates the
MultiProvider without affecting reporting.
Needless to say try it on Sandbox, development system before attempting on production environment.
BI Tip # 4
Worried about data loss while changing an extract structure? Why not try this?
Run the Report RMCSBWCC before doing changes to the LO extract structure or while importing any
changes done to the extract structure.
This report checks whether any of the clients in the system contains data in the V3 update (Extraction
queue) for that application (specific to extract structure provided as input). If there is data in V3 updates
then you need to start the update for that application. Without doing this you will not be able to change the
extract structure and if you are importing the changes then you may end up losing data.
BI Tip # 5
Facing a problem deleting a request from a DSO or a cube while loading data? Try this:
RSICCONT is the table used to delete the request entry of a data target (DSO or cube).
Needless to say try it on Sandbox, development system before attempting on production environment.
BI Tip # 6
Most of you are aware of the RSZDELETE tcode, but many face the issue of how to delete queries
en masse on one InfoProvider. The answer is in the same tcode.
For example: you need to delete several queries based on one InfoCube. Normally a developer tends to
delete them one by one, but you can also delete multiple queries at once.
Infocube : ZSD_C03
Total Queries : 25
To Delete : 15
In RSZDelete,
Type = REP
Infocube = ZSD_C03
Execute.
You get a list of all queries; select the ones that need to be deleted.
PS: This is an extremely DANGEROUS transaction. Please use it responsibly.
BI Tip #7
Replicate Single DataSource
- Use the function module RSAOS_METADATA_UPLOAD to replicate a single DataSource: put the logical system in field I_LOGSYS, put the OLTP DataSource name in field I_SOURC, and execute.
Trust you shall check this in Sandbox / Development System first.
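A sketch of calling the function module from a small report: the function module name and the parameter fields I_LOGSYS and I_SOURC are taken from the tip itself, while the example values are hypothetical. As with the other tips, try it in a sandbox first.

```abap
* Sketch: replicate a single DataSource, per the tip above.
DATA: lv_logsys TYPE logsys     VALUE 'ECCCLNT100',   " hypothetical source system
      lv_source TYPE roosourcer VALUE '2LIS_03_BF'.   " DataSource to replicate

CALL FUNCTION 'RSAOS_METADATA_UPLOAD'
  EXPORTING
    i_logsys = lv_logsys
    i_sourc  = lv_source.
```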
BI Tip # 8
Real-time Data Acquisition (RDA)
RDA is stream oriented: data is available for reporting almost immediately (in less than a minute).
RDA is used in tactical decision making.
Using a Web service push:
A Web service push can write the data directly from the source to the
PSA. The data transfer is not controlled by BI. An InfoPackage (for full
upload) is required only to specify request-related settings for RDA; it
is never executed, as the data is pushed into the BI PSA by a Web service.
Using the BI Service API: If the source data is based on a source in an
SAP Source system, the BI Service API is used. Many of the steps are
the same as with normal delta extractions, such as the requirement for
an infopackage to initialize delta.
With RDA, it is these delta loads that are special. If the DataSource allows RDA (a checkbox in
RSA2), we can choose to use it in this way. This involves creating a specific RDA data transfer process.
The RDA process focuses on obtaining data very frequently from the
source system. Due to the limitations discussed above, you often only get to decide
whether the feed to your targets will be a normal, periodically
scheduled InfoPackage, or RDA.
InfoProviders exist for plan and actual data of cost center transactions.
This separate plan vs. actual design supports BI Integrated Planning with
one dedicated cube, and supports the loading of actual data from the
SAP source system. Your users now have requirements for plan and actual
comparison reports; we want to investigate a MultiProvider to solve
this need.
Aggregates
In an aggregate, the dataset of an InfoCube is saved redundantly and persistently in a consolidated form
in the database.
USE: The objective of aggregates is to improve reporting performance.
Aggregates make it possible to access InfoCube data quickly in reporting. Aggregates serve a similar
purpose to database indexes in improving performance.
The BW OLAP processor selects an appropriate aggregate during a query navigation
step. If no appropriate aggregate exists, the OLAP processor retrieves data from the InfoCube.
Aggregates are multidimensional data structures, similar to InfoCubes, containing an aggregated subset of
the information in summarized form. An aggregate is also called a 'baby cube' of an InfoCube. The
aggregates of a system are registered in the directory table RSDDAGGRDIR.
Aggregates are used primarily for one reason: to improve reporting performance. When
queries run faster, they take less processing time and fewer resources, and the end user gets
the information back, i.e. the response, more quickly.
Life cycle of aggregates:
1. Aggregates are defined by the administrator against an existing InfoCube.
2. Aggregates are updated when data is loaded into the InfoCube, using the same update
rules as the basic InfoCube.
3. During data loading, data is aggregated to the specified level of InfoCube dimensions (characteristics).
4. During querying, the OLAP processor dynamically determines whether an aggregate exists that can satisfy the
query.
Aggregates have 3 names:
1. A system-defined 25-digit unique name.
2. A 6-digit integer number.
The fact table of an aggregate contains an additional key figure, 0FACTCOUNT (a counter for the
number of summarized records).
/BI*/Q<characteristic_name>
The fields DATETO and DATEFROM are included in the time-dependent attribute table and are
stored with the characteristic values.
Fact tables:
In SAP BW, there are two fact tables holding transaction data for basic
InfoCubes: the F and the E fact tables.
The fact table is the central table of the InfoCube. Here the key figures (e.g. sales volume) and
pointers to the dimension tables are stored (the dimension tables, in turn, point to the SID tables).
If you upload data into an InfoCube, it is always written into the F fact table.
If you compress the data, the data is shifted from the F fact table to the E fact table.
The F fact tables of aggregates are always empty, since aggregates are
compressed automatically.
After a change run, the F fact table can have entries, as well as when you use the
functionality 'do not compress requests for aggregates'.
/BI0/P<char_name>, /BIC/M<object_name>: master data of the object.
Master data tables are independent of any InfoCube.
Master data and master data details (attributes, texts and hierarchies) are stored there.
The master data (P) table stores all time-independent attributes (display and navigational
attributes).
Navigational attribute tables:
The attributes are not stored as characteristic values but as SIDs (master data IDs).
Summary:
S - SID table
Y - Time Dependent SID table
T - Text Table
F - Fact Table - direct data for the cube (B-tree index)
E - Fact Table - compressed cube (bitmap index)
RSRV
Analysis and Repair of BW Objects:
This transaction contains a collection of reports to check the consistency of the metadata and the data
in the system, and offers repair options for most inconsistencies.
These reports should be run periodically as a preventive maintenance measure to detect any data
corruption.
The RSRV transaction is used as a testing tool.
If a user creates characteristics type info object ZPRODUCT and activates it, information will be
stored in following:
Data element: /BIC/OIZPRODUCT
SID table: /BIC/SZPRODUCT
Master data table: /BIC/PZPRODUCT
Text table: /BIC/TZPRODUCT
View: /BIC/MZPRODUCT
When an InfoCube ZSALES is created and activated, information will be stored in the following:
Fact table view: /BIC/VZSALESF
Transparent fact table: /BIC/FZSALES
Dimension tables: /BIC/DZSALES1 to /BIC/DZSALESN, where N is the number of dimensions;
/BIC/DZSALESP, /BIC/DZSALEST, /BIC/DZSALESU for Data Packet, Time and Unit (maximum
16 dimensions)
Classes of Data
There are 3 classes of data in SAP BW:
1. Master Data: it describes a business entity (e.g. customer, material).
2. Transaction Data: it describes a business event.
3. Configuration Data: it describes the business rules (customizing).
Master data is further classified into 3 types:
1. Attribute Data: it describes a characteristic in detail.
2. Text Data: it provides descriptions of characteristic values.
3. Hierarchical Data: it describes relationships between characteristic values.
Transaction data is further divided into 2 types:
1. Document Data
   1. Header Data
   2. Item Data
   3. Schedule line Data
2. Summary Level Data
Routine Lesson 1
Scenario: The DataSource does not have Division and we need
to derive it from Material, which exists in the DataSource.
Populate the cube with the Division.
Solution:
Division is not delivered by the DataSource, so it needs to be
derived from Material using the /BI0/PMATERIAL table.
WA_TH_MATERIAL is an internal table typed on the structure T_MATERIAL,
and WA_MATERIAL is a work area of the same structure.
T_MATERIAL has Material and Division as its two fields. In the end
routine of the transformation, the internal table is read into the work area
WA_MATERIAL using the material of the current result record as the key.
Start routine: use a SELECT statement to load the internal table.
CODE SNIPPET:
IF WA_TH_MATERIAL[] IS INITIAL.
* LOAD DIVISION BY MATERIAL
  SELECT MATERIAL DIVISION
    INTO TABLE WA_TH_MATERIAL
    FROM /BI0/PMATERIAL
    WHERE OBJVERS = 'A'.
ENDIF.
END ROUTINE: use a READ statement to read the internal table
populated in the start routine into a work area using a key. If
data is found, assign it to the end routine field.
CODE SNIPPET:
READ TABLE WA_TH_MATERIAL
  INTO WA_MATERIAL
  WITH TABLE KEY MATERIAL = <RESULT_FIELDS>-MATERIAL.
IF SY-SUBRC = 0.
  <RESULT_FIELDS>-DIVISION = WA_MATERIAL-DIVISION.
ENDIF.
DATA DEFINITION:
TYPES:
  BEGIN OF T_MATERIAL,
    MATERIAL TYPE /BI0/OIMATERIAL,
    DIVISION TYPE /BI0/OIDIVISION,
  END OF T_MATERIAL.
DATA:
  WA_TH_MATERIAL TYPE HASHED TABLE OF T_MATERIAL
                 WITH UNIQUE KEY MATERIAL,
  WA_MATERIAL    TYPE T_MATERIAL.
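Putting the pieces of Lesson 1 together, a complete sketch of the transformation routines might look as follows. The LOOP over RESULT_PACKAGE, inside which an end routine typically processes records, is added here for context; the field and table names are those used in the lesson.

```abap
* Global declarations of the transformation:
TYPES: BEGIN OF t_material,
         material TYPE /bi0/oimaterial,
         division TYPE /bi0/oidivision,
       END OF t_material.
DATA: wa_th_material TYPE HASHED TABLE OF t_material
                     WITH UNIQUE KEY material,
      wa_material    TYPE t_material.

* Start routine: buffer the material master once per data package.
IF wa_th_material[] IS INITIAL.
  SELECT material division
    INTO TABLE wa_th_material
    FROM /bi0/pmaterial
    WHERE objvers = 'A'.
ENDIF.

* End routine: derive Division for every result record.
LOOP AT result_package ASSIGNING <result_fields>.
  READ TABLE wa_th_material INTO wa_material
    WITH TABLE KEY material = <result_fields>-material.
  IF sy-subrc = 0.
    <result_fields>-division = wa_material-division.
  ENDIF.
ENDLOOP.
```

Using a hashed table keyed on MATERIAL keeps the per-record lookup at constant cost, which matters when the data package is large.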
Routine Lesson 2
Scenario: cube needs a customer number and the datasource does not provide the customer
number. The datasource however contains the country code such as DE,FR etc. based on the
country code a particular customer number is assigned for eg: for DE it is DE01J45 and for
FR it is FR023J4. This customer number needs to be populated in the cube.
SOLUTION:
In this scenario the transformation from the DSO to the cube is worked on where the start
routine is coded to load a data element from a standard table with a field in the standard
table as a reference. This is then used in the END routine with a CASE statement and the
RESULT_FIELDS are loaded accordingly.
START ROUTINE:
Code snippet:
select single low from ZBW_CONSTANT_TAB
  into g_de_billto
  where vnam = 'JV_DE_BILLTO'.
END ROUTINE:
Code snippet:
loop at RESULT_PACKAGE assigning <RESULT_FIELDS>.
  case <RESULT_FIELDS>-/bic/zjvsource.
    when 'DE'.
      <RESULT_FIELDS>-ship_to    = g_de_billto.
      <RESULT_FIELDS>-sold_to    = g_de_billto.
      <RESULT_FIELDS>-billtoprty = g_de_billto.
      <RESULT_FIELDS>-payer      = g_de_billto.
  endcase.
endloop.
Routine lesson 3
Scenario: An info object in the cube has to be updated with a constant value and this info
object does not come from the datasource. Update the info object in the cube with a
constant value.
Solution: go to the DSO and add the info object that is not being sourced from the
datasource. In the transformation, right-click on the info object and click on RULE
DETAILS; in the rule details dialog choose rule type 'Constant' and enter the
value.
InfoProvider Identification
InfoProvider Name: ______
Standard/Custom: Standard / Std. Business Content / Business Content w/ Modifications / Custom
Module(s): CO / FI / HR / MM / PM / PP / SD / PS / Other (specify): ______
Document History
Created By: Rodrick Gary          Date Created: 04/03/2006
Approved by: ______               Date Approved: ______
Change History (to track changes to the request spec. after the specifications have been approved):
Date Modified | Modified by | Approved by | Date Approved | Description
DataSource     InfoSource
2LIS_03_BX     2LIS_03_BX (Material Stocks)
2LIS_03_BF     2LIS_03_BF
2LIS_03_UM     2LIS_03_UM
The InfoSource 2LIS_03_BX allows you to transfer material stocks from an SAP R/3 system to SAP BW.
The InfoSource allows you to set up stocks for stock InfoCubes.
The InfoSource 2LIS_03_BF delivers the data for material movements from MM Inventory Management
(MM-IM).
The InfoSource 2LIS_03_UM delivers the data for revaluations from MM Inventory Management (MM-IM).
Stock Overview
Stock In Transit
Inventory Aging
RICE Number
Description
<insert report names here >
2. Data Flow
Classification:
Standard Business Content
3.1.2.
__
__
Note: This assignment has been activated per standard business content. No modifications have
been made.
Classification:
Standard Business Content
Standard Business Content w/ Modifications
Custom
3.2.2.
Modeling: Source Systems: Replicate the DataSources of the following Application Components:
Application Component(s): Inventory Management
3.2.3.
In each of the designated source systems, assign the following DataSource - InfoSource relationships:
DataSource
InfoSource
2LIS_03_BX
2LIS_03_BX
2LIS_03_BF
2LIS_03_BF
2LIS_03_UM
2LIS_03_UM
Source System:
3.2.4.
Classification:
Standard Business Content
3.3.2.
3.3.2.1.
__
__
__
3.3.2.2.
3.3.2.3.
__
__
Object: 0IC_C03
3.3.2.4.
Install
Note: This object has been activated per standard business content. No modifications have been
made.
4. Dimensional Model (Include InfoProvider, master data, related ODS, and related aggregated
cubes)
4.1. Material Stocks/Movements (as of 3.0B) ( 0IC_C03 )
Note: This InfoCube is set to be activated per standard business content.
Compound Key
Navigational Attributes
InfoObject (NAV)            Name                  NAV turned on for NAV (yes=x)
0PLANT__0COUNTRY            Country of Plant
0MATERIAL__0DIVISION        Division
0MATERIAL__0MATL_CAT        Material Category
0MATERIAL__0MATL_GROUP      Material Group
0MATERIAL__0MATL_TYPE       Material Type
Note: This DataSource assignment has been activated per standard business content. No
modifications have been made.
InfoObject (InfoSource)   Description            Data element in R/3   Field in transfer structure   Transfer routine
0BASE_UOM                 Base Unit              MEINS                 BASME                         N/A
0BATCH                    Batch                  CHARG_D               CHARG                         N/A
0BWAPPLNM                 Application comp.      RSAPPLNM              BWAPPLNM                      N/A
0CPPVLC                   BW: Purchase Value     MCBW_GEO              BWGEO                         N/A
0CPQUABU                  BW: Amount in BUnitM   MCBW_MNG              BWMNG                         N/A
0CPSTLC                   Sales Val. Loc Curr.   MCBW_GVP              BWGVP                         N/A
0CPSVLC                   BW: Sales Value LC     MCBW_GVO              BWGVO                         N/A
0INDSPECSTK               Valn of Spec. Stock    KZBWS                 KZBWS                         N/A
0LOC_CURRCY               Local currency         HWAER                 HWAER                         N/A
0MATERIAL                 Material               MATNR                 MATNR                         N/A
0PLANT                    Plant                  WERKS_D               WERKS                         N/A
0PSTNG_DATE               Posting date           BUDAT                 BUDAT                         N/A
0SOLD_TO                  Sold-to party          WEMPF                 WEMPF                         N/A
0STOCKCAT                 Stock Category         BSTTYP                BSTTYP                        N/A
0STOCKTYPE                Stock type             MCBW_BAUS             BSTAUS                        N/A
0STOR_LOC                 Storage location       LGORT_D               LGORT                         N/A
0VAL_TYPE                 Valuation type         BWTAR_D               BWTAR                         N/A
0VENDOR                   Vendor                 LIFNR                 ELIFN                         N/A
(All fields: Standard Business Content)
Note: This DataSource assignment has been activated per standard business content. No
modifications have been made.
InfoObject (InfoSource)   Description            Data element in R/3   Field in transfer structure   Transfer routine
0STORNO                   Reversal indicator     STORNO                STORNO                        N/A
0RT_PROMO                 Promotion              WAKTION               AKTNR                         N/A
0VAL_CLASS                Valuation class        BKLAS                 BKLAS                         N/A
0DOC_DATE                 Document Date          BLDAT                 BLDAT                         N/A
0STOCKTYPE                Stock type             BSTAUS                BSTAUS                        N/A
0STOCKCAT                 Stock Category         BSTTYP                BSTTYP                        N/A
0PSTNG_DATE               Posting date           BUDAT                 BUDAT                         N/A
0COMP_CODE                Company code           BUKRS                 BUKRS                         N/A
0BWAPPLNM                 Application comp.      BWAPPLNM              RSAPPLNM                      N/A
0MOVETYPE                 Movement Type          BWART                 BWART                         N/A
0STOCKRELEV               BW: Stock Relevance    BWBREL                MCBW_BREL                     N/A
0CPPVLC                   BW: Purchase Value     BWGEO                 MCBW_GEO                      N/A
0CPSVLC                   BW: Sales Value LC     BWGVO                 MCBW_GVO                      N/A
0CPSTLC                   Sales Val. Loc Curr.   BWGVP                 MCBW_GVP                      N/A
0CPQUABU                  BW: Amount in BUnitM   BWMNG                 MCBW_MNG                      N/A
0VAL_TYPE                 Valuation type         BWTAR                 BWTAR_D                       N/A
0PROCESSKEY               BW: Transaction Key    BWVORG                MCW_BWVORG                    N/A
0BATCH                    Batch                  CHARG                 CHARG_D                       N/A
0MATMREA                                         GRUND                 MB_GRUND                      N/A
0BUS_AREA                 Business area          GSBER                 GSBER                         N/A
0COSTCENTER               Cost Center            KOSTL                 KOSTL                         N/A
0SOLD_TO                  Sold-to party          WEMPF                 WEMPF                         N/A
0WHSE_NUM                 Warehouse number       LGNUM                 LGNUM                         N/A
0STOR_LOC                 Storage location       LGORT                 LGORT_D                       N/A
0STRGE_BIN                Storage bin            LGPLA                 LGPLA                         N/A
0STRGE_TYPE               Storage type           LGTYP                 LGTYP                         N/A
0VENDOR                   Vendor                 LIFNR                 ELIFN                         N/A
0MATERIAL                 Material               MATNR                 MATNR                         N/A
0DOC_NUM                  BW: Document Number    KDAUF                 KDAUF                         N/A
0BASE_UOM                 Base Unit              MEINS                 MEINS                         N/A
0DOC_YEAR                 BW: Document Year      MJAHR                 MJAHR                         N/A
0PROFIT_CTR               Profit Center          PRCTR                 PRCTR                         N/A
0DCINDIC                  Debit/Credit           SHKZG                 SHKZG                         N/A
0LOC_CURRCY               Local currency         WAERS                 HWAER                         N/A
0PLANT                    Plant                  WERKS                 WERKS_D                       N/A
0FISCVARNT                Fiscal Year Variant    PERIV                 PERIV                         N/A
0CPNOITEMS                BW: Number             NOPOS                 MC_NOPOS                      N/A
0CO_AREA                  Controlling area       KOKRS                 KOKRS                         N/A
0DOC_ITEM                 BW: Document Line No   ZEILE                 MBLPO                         N/A
0VALUE_LC                 Amt. in local curr.    DMBTR                 MC_DMBTR                      N/A
0COORDER                  Order                  AUFNR                 AUFNR                         N/A
0QUANT_B                  Qty in Base UoM        MENGE                 MC_MENG                       N/A
0MOVE_PLANT               Receiving Plant        UMWRK                 UMWRK                         N/A
0RECORDMODE               Update Mode            ROCANCEL              ROCANCEL                      N/A
0RT_RMAPIDA               RMA Phys.Invent.Date                                                      N/A
0BWCOUNTER                Counter                BWCOUNTER             MCBW_COUNTER                  N/A
0INDSPECSTK               Valn of Spec. Stock    KZBWS                 KZBWS                         N/A
(All fields: Standard Business Content)
Note: This DataSource assignment has been activated per standard business content. No
modifications have been made.
InfoObject (InfoSource)   Description            Data element in R/3   Field in transfer structure   Transfer routine
0STORNO                   Reversal indicator     STORNO                STORNO                        N/A
0RT_PROMO                 Promotion              WAKTION               AKTNR                         N/A
0VAL_CLASS                Valuation class        BKLAS                 BKLAS                         N/A
0DOC_DATE                 Document Date          BLDAT                 BLDAT                         N/A
0COMP_CODE                Company code           BUKRS                 BUKRS                         N/A
0BWAPPLNM                 Application comp.      BWAPPLNM              RSAPPLNM                      N/A
0MOVETYPE                 Movement Type          BWART                 BWART                         N/A
0CPPVLC                   BW: Purchase Value     BWGEO                 MCBW_GEO                      N/A
0CPQUABU                  BW: Amount in BUnitM   BWMNG                 MCBW_MNG                      N/A
0PROCESSKEY               BW: Transaction Key    BWVORG                MCW_BWVORG                    N/A
0FISCYEAR                 Fiscal year                                                               N/A
0BUS_AREA                 Business area          GSBER                 GSBER                         N/A
0COSTCENTER               Cost Center            KOSTL                 KOSTL                         N/A
0CO_AREA                  Controlling area       KOKRS                 KOKRS                         N/A
0BASE_UOM                 Base Unit              MEINS                 MEINS                         N/A
0DOC_NUM                  BW: Document Number    KDAUF                 KDAUF                         N/A
0MATERIAL                 Material               MATNR                 MATNR                         N/A
0PSTNG_DATE               Posting date           BUDAT                 BUDAT                         N/A
0SOLD_TO                  Sold-to party          WEMPF                 WEMPF                         N/A
0DCINDIC                  Debit/Credit           SHKZG                 SHKZG                         N/A
0FISCVARNT                Fiscal Year Variant    PERIV                 PERIV                         N/A
0CPNOITEMS                BW: Number             NOPOS                 MC_NOPOS                      N/A
0LOC_CURRCY               Local currency         WAERS                 HWAER                         N/A
0VENDOR                   Vendor                 LIFNR                 ELIFN                         N/A
0PLANT                    Plant                  WERKS                 WERKS_D                       N/A
0QUANT_B                  Qty in Base UoM        MENGE                 MC_MENG                       N/A
0VALUE_LC                 Amt. in local curr.    DMBTR                 MC_DMBTR                      N/A
0RECORDMODE               Update Mode            ROCANCEL              ROCANCEL                      N/A
0STOCKCAT                 Stock Category                                                            N/A
0STOCKTYPE                Stock type             BSTAUS                BSTAUS                        N/A
0INDSPECSTK               Valn of Spec. Stock    KZBWS                 KZBWS                         N/A
(All fields: Standard Business Content)
Note: This Data Target mapping has been activated per standard business content. No
modifications have been made.
A. Appendix
The following sections will explain in detail any customization that needs to be performed in the
respective areas
I. Infoobjects
This section contains configuration settings required for custom / customized info objects.
II. DataSources
This section contains configuration settings required for custom / customized data sources.
III. InfoSources
This section contains configuration settings required for custom / customized info sources.
a. Transfer Rules
This section contains configuration settings required for custom / customized transfer rules.
b. Communication Structures
This section contains configuration settings required for custom / customized communication structures.
IV. ODS
This section contains configuration settings required for custom / customized ODS.
Options
BEx Reporting
ODS Object Type Standard
Unique Data Records
Check table for InfoObject
Set quality status to 'OK' automatically
Activate ODS object data automatically
Update data targets from ODS object automatically
InfoObject (Data Target)   Description   InfoObject (InfoSource)   Update Routine   Active?   Key Field   Standard Business Content / Custom Reference Section
N/A                        N/A           N/A                       N/A              N/A       N/A         N/A
V. InfoCube
This section contains configuration settings required for custom / customized Info Cube.
a. Into ODS
This section contains configuration settings required for custom / customized update rules into ODS.
b. Into InfoCube
INFOSET
Defining an InfoSet:
An InfoSet is a semantic layer over the data sources; it is not
itself a data target. It describes the data sources, which are
usually defined as joins of ODS objects, InfoCubes, or
characteristics with master data.
What is a temporal join?
A time-dependent join, or temporal join, is a join that contains
an InfoObject that is a time-dependent characteristic.
InfoSets are two-dimensional queries that we build upon
ODS objects/InfoCubes.
Use of InfoSets:
InfoSets allow you to report on several InfoProviders by using
combinations of master-data-bearing characteristics and ODS
objects.
InfoSets are good for simple reporting needs with low data
volumes and conservative performance expectations.
InfoSets are best suited for snapshot-type reporting.
FEW QUESTIONS:
What is Inner join & Left outer join in InfoSet?
What are classic InfoSets?
What are InfoSets?
Differences between Classic InfoSet and InfoSet?
FEW QUESTIONS:
What are aggregates?
What are MultiCubes?
Differences between Aggregates & MultiCubes
Can we create queries on aggregates or MultiCubes, how?
What are aggregation levels?
Aggregates have 3 names. What are they?
In what situations you use or recommend aggregates?
Why SAP recommends MultiCube?
5. INFOCUBE UTILITIES
5.1. PARTITIONING
Partitioning is the method of dividing a table into multiple smaller, independent or
related segments (either column-wise or row-wise) based on the available fields, which enables
quick access to the intended field values in the table.
To partition a dataset, at least one of the two partitioning criteria, 0CALMONTH or 0FISCPER, must be
present in the InfoCube.
5.2. ADVANTAGES OF PARTITIONING
Partitioning allows you to perform parallel data reads of
multiple partitions, speeding up the query execution process.
By partitioning an InfoCube, reporting performance is enhanced because it is easier to search in
smaller tables, and maintenance becomes much easier.
Old data can be quickly removed by dropping a partition.
You can set up partitioning in InfoCube maintenance under Extras > Partitioning.
5.3. CLASSIFICATION OR TYPES OF PARTITIONING
5.3.1. PHYSICAL PARTITIONING (TABLE/LOW LEVEL)
Physical partitioning, also called table or low-level partitioning, is restricted to time
characteristics and is done at the database level, only if the underlying
database supports it.
Ex: Oracle, Informix, IBM DB2/390
A common way of partitioning is to create ranges. An InfoCube can be partitioned on a time slice using
time characteristics such as:
FISCAL YEAR (0FISCYEAR)
FISCAL YEAR VARIANT (0FISCVARNT)
FISCAL YEAR/PERIOD (0FISCPER)
POSTING PERIOD (0FISCPER3)
With physical partitioning, old data can be quickly removed by dropping a partition.
Note: no partitioning in BI 7.0, except on DB2 (as it supports it).
5.3.2. LOGICAL PARTITIONING (HIGH-LEVEL PARTITIONING)
Logical partitioning is done at the MultiCube (several InfoCubes joined into a MultiCube) or
MultiProvider level, i.e. at the data target level. In this case related data are separated into
several InfoCubes and joined through a MultiCube.
Here time characteristics are not the only restriction; you can also partition on plan vs. actual data,
regions, business areas, etc.
Advantages:
A MultiCube uses parallel sub-queries, which ultimately improves query performance.
Logical partitioning does not consume any additional database space.
When a sub-query hits a constituent InfoProvider, a reduced set of data is read from the smaller InfoCube
instead of one large InfoCube, even in the absence of a MultiProvider.
5.3.3. EXAMPLES OF PARTITIONING USING 0CALMONTH AND 0FISCPER
There are two partitioning criteria:
calendar month (0CALMONTH)
fiscal year/period (0FISCPER)
At any one time we can partition a dataset using only one of the above two criteria.
To create a partition, at least one of the two InfoObjects must be contained in the InfoCube.
If you want to partition an InfoCube using the fiscal year/period (0FISCPER) characteristic, you have to
set the fiscal year variant characteristic to a constant.
After activating the InfoCube, the fact table is created on the database with the chosen number of
partitions. The activity can be monitored in the process overview (transaction SM50), the system log
(transaction SM21) and the job overview (transaction SM37).
Although the application considers the data in the InfoCube correct, the data of the affected requests or
partitions is not displayed in reporting, because they do not have a corresponding entry in the package
dimension.
Solution: use the report SAP_DROP_FPARTITIONS to remove the orphaned or empty partitions
from the affected F fact tables, as described in note 1306747, to ensure that the database limit of 255
partitions per database table is not reached unnecessarily.
5.3.6. REPARTITIONING
Repartitioning is partitioning applied to a cube that is already partitioned and already
contains loaded data (both actual and plan data versions come into play here). If we repartition, there
may be little data left to move, because data is archived out of the cube over a period of time.
You can access repartitioning in the Data Warehousing Workbench via Administration > context menu
of your InfoCube.
5.3.6.1. REPARTITIONING - 3 TYPES:
A) Complete repartitioning,
B) Adding partitions to an E fact table that is already partitioned, and
C) Merging empty or almost empty partitions of an E fact table that is already partitioned.
5.3.7. REPARTITIONING - LIMITATIONS/ERRORS
SQL Server 2005 partitioning limit issue: an error appears in SM21 every minute once the limit for the
number of partitions per table in SQL Server 2005 (1000) is reached.
5.4. COMPRESSION OR COLLAPSE
Compression reduces the number of records by combining records with the same key that have been
loaded in separate requests.
Compression is critical: compressed data can no longer be deleted from the InfoCube using its
request IDs, so you must be certain that the data loaded into the InfoCube is correct.
A user-defined partition affects only the compressed E fact table.
By default, the F fact table contains the data.
By default SAP allocates a request ID to each posting made; by using the request ID, we can
delete/select the data.
The E fact table is compressed and the F fact table is uncompressed.
On compression, data is transferred from the F fact table to the E fact table, and all the request IDs
are lost/deleted/set to zero.
After compression, the space used by the E fact table is comparably less than that of the F fact table.
The F fact table (uncompressed) uses B-tree indexes; the E fact table (compressed) uses bitmap indexes.
5.5. INDEXES
PRIMARY INDEX: the primary index is created automatically when the table is created in the database.
SECONDARY INDEX: both bitmap and B-tree indexes are secondary indexes.
Bitmap indexes are created by default on each dimension column of a fact table;
B-tree indexes are used on ABAP tables.
5.6. RECONSTRUCTION
Reconstruction is the process by which you load data into the same cube/ODS
or a different cube/ODS from the PSA. The main purpose is that if requests have been deleted (for
example after compression/collapse) and we want them back, we do not need to go to the source
system or flat files to collect them; we get them from the PSA.
Reconstruction of a cube is a more common requirement and is needed when there is a change to the
structure of a cube: deletion of characteristics/key figures, or new characteristics/key figures.
ERROR 1: Data left over from previous reconstruction runs (for example, tests). If this data is loaded
into BW, you will usually see multiple values in the queries (exception: key figures in an ODS object
whose update is set to overwrite).
ERROR 2: Incorrect data in BW for individual documents from the period of the reconstruction run. Why?
Solution: Documents were posted during the reconstruction.
Documents created during the reconstruction run then exist both in the reconstruction tables and in the
update queues. This results in duplicate data in BW.
Example: Document 4711, quantity 15
Data in the PSA:
ROCANCEL  DOCUMENT  QUANTITY
          4711      15        (delta, new record)
          4711      15        (reconstruction)
Query result: 4711 -> 30
Documents that are changed during the reconstruction run display incorrect values in BW because the
logic of the before and after images no longer matches.
Example: Document 4712, quantity 10, is changed to 12.
Data in the PSA:
ROCANCEL  DOCUMENT  QUANTITY
X         4712      10-       (delta, before image)
          4712      12        (delta, after image)
          4712      12        (reconstruction)
Query result: 4712 -> 14
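The wrong query result in this example is plain arithmetic over the three PSA records:

```python
# Worked arithmetic for ERROR 2: document 4712 is changed from quantity 10 to 12
# during the reconstruction run, so the reconstruction record is counted on top
# of the correct delta pair.
delta_before_image = -10  # ROCANCEL = 'X' reverses the old value
delta_after_image = +12   # new document value
reconstruction = +12      # same document picked up again by the reconstruction

query_result = delta_before_image + delta_after_image + reconstruction
print(query_result)  # 14, although the correct quantity is 12
```

Without the stray reconstruction record, the before/after pair alone would yield the correct value of 12.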
ERROR 3: After you perform the reconstruction and restart the update, you find duplicate documents in
BW.
Solution: The reconstruction ignores the data in the update queues. A newly created document sits in the
update queue awaiting transfer into the delta queue; however, the reconstruction also processes
this document because its data is already in the document tables. As a result, the same document is
loaded into BW once by the reconstruction (via delta initialization or full upload) and again with the first
delta after the reconstruction.
After you perform the reconstruction and restart the update, you find duplicate documents in BW.
Solution: The same as the previous point, except that there the document was in the update queue,
whereas here it is in the delta queue. The reconstruction also ignores data in the delta queues. An
updated document sits in the delta queue awaiting transfer into BW; however, the reconstruction
processes this document because its data is already contained in the document tables. As a result, the
same document is loaded into BW once by the reconstruction (via delta initialization or full upload) and
again with the first delta after the reconstruction.
ERROR 4: Document data from the time of the delta initialization request is missing from BW.
Solution: The update report RMBWV3nn was not deactivated. As a result, data from the update queue
(LBWQ or SM13) could be read while the data of the initialization request was being uploaded. However,
since no delta queue yet existed in RSA7, there was no target for this data and it was lost.
5.7. ROLLUP: Rollup loads newly added requests into the existing aggregates of an InfoCube whenever
new data is loaded.
5.8. LINE ITEM DIMENSION / DEGENERATE DIMENSION: If the size of a dimension of a cube is more
than 20% of the size of the fact table, we define that dimension as a Line Item Dimension.
Ex: Sales Document Number in its own dimension of a Sales Cube.
Since almost every fact row has its own sales document number, the dimension table grows to roughly
the same size as the fact table, and the added overhead of DIMID/SID lookups makes performance very
slow.
By flagging it as a Line Item Dimension, the system stores the SID of the Sales Document Number
directly in the Fact Table instead of a DIMID.
This avoids one lookup into a dimension table; in fact, no dimension table is created at all. The
advantage is that you not only save space (no dimension table) but the join now involves only two tables,
Fact and SID (diagram 3), instead of three: Fact, Dimension and SID (diagram 2).
(The image below, an Extended Star Schema (ESS) diagram, is for illustration purposes only.)
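The lookup that a line item dimension eliminates can be sketched like this. Table contents are invented for illustration; real BW tables hold generated integer DIMIDs and SIDs:

```python
# Normal dimension vs. line item dimension: how many lookups it takes to get
# from a fact row back to the sales document number.
fact_normal = [{"dimid": 7, "revenue": 500}]
dim_table = {7: 11}            # DIMID -> SID of the sales document number
sid_table = {11: "90001234"}   # SID   -> sales document number

# Normal dimension: fact -> dimension table -> SID table (a three-table join).
doc = sid_table[dim_table[fact_normal[0]["dimid"]]]

# Line item dimension: the SID is stored directly in the fact table, so the
# dimension table is never created and one lookup disappears (two-table join).
fact_line_item = [{"sid": 11, "revenue": 500}]
doc2 = sid_table[fact_line_item[0]["sid"]]

print(doc, doc2)  # 90001234 90001234
```

Both paths reach the same document number; the line item variant simply skips the intermediate dimension table.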
1. Characteristic values
a. Selecting Single Value Variable
b. Selecting Single Value Variable as Variable Value Range Limit
c. (Combination of several a. or b.) Selecting Variables with Several Single Values or Value
Ranges
2. Text
- Format: Technical name enclosed in ampersands (&)
3. Hierarchies
4. Hierarchy nodes
a. Variable hierarchy node with a fixed hierarchy
b. Variable hierarchy node with a variable hierarchy
5. Formula Elements
Variable Processing Types
1. User Entry/Default value
2. Replacement Path
a. Text variables and formula variables with the replacement path processing type are
replaced by a characteristic value.
b. Characteristic value variables with the replacement path processing type are replaced
by a query result.
3. Customer Exit
Values for variables are determined in the function module exit EXIT_SAPLRRS0_001.
Create a project in transaction CMOD, select SAP enhancement RSR00001, assign it to
the enhancement project, and activate the project.
For documentation, call transaction SMOD, enter the enhancement name (RSR00001),
choose Documentation, then Edit > Display/Change.
4. SAP Exit
Delivered within SAP BW Business Content.
5. Authorization
Data selection is carried out according to the user's authorizations.
What we get is basically a set of different information in one place: the BW back-end server version and its
highest Support Package level, HTTP(S) prefixes, whether an SAP Portal is connected, whether Information
Broadcasting is available, the server code page, the BEx web runtime, RFC destinations (Portal), URL prefixes
(web reporting, Java-based BW-IP Modeler) and ports, workbooks, the system category, etc. In addition, there is
information on BW's front-end requirements from the parameters of table RSFRONTENDINIT.
InfoCube
RSDCUBE         Directory of InfoCubes
RSDCUBET        Texts on InfoCubes
RSDCUBEIOBJ     Objects per InfoCube (where-used list)
RSDDIME         Directory of Dimensions
RSDDIMET        Texts on Dimensions
RSDDIMEIOBJ     InfoObjects for each Dimension (where-used list)
RSDCUBEMULTI    InfoCubes involved in a MultiCube
RSDICMULTIIOBJ  Selection/identification of InfoObjects in a MultiProvider
RSDICHAPRO      Characteristic properties specific to an InfoCube
RSDIKYFPRO      Key figure properties specific to an InfoCube
RSDICVALIOBJ    InfoObjects of the stock validity table for the InfoCube

Aggregates
RSDDAGGRDIR     Directory of Aggregates
RSDDAGGRCOMP    Description of Aggregates
RSDDAGGRT       Texts on Aggregates
RSDDAGGLT       Directory of the aggregates, texts

ODS Object
RSDODSO         Directory of all ODS Objects
RSDODSOT        Texts of all ODS Objects
RSDODSOIOBJ     InfoObjects of ODS Objects
RSDODSOATRNAV   Navigation Attributes for ODS Objects
RSDODSOTABL     Directory of all ODS Object Tables
After this, you will get a 1ROWCOUNT (number of records) column in your list output. Select that column
and choose summation; at the end you will get the total number of records, which will also help you find
duplicate records in your MultiProvider. Please follow the screenshot.
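The idea behind summing the 1ROWCOUNT column can be sketched as follows. The record keys here are invented for illustration:

```python
# Sketch: 1ROWCOUNT adds a constant 1 per underlying record, so its per-key sum
# tells you how many records each result row aggregates; a sum greater than 1
# for a supposedly unique key exposes duplicates delivered by a MultiProvider.
from collections import Counter

records = ["DOC1", "DOC2", "DOC2", "DOC3"]  # key of each record in the output
rowcount = Counter(records)                 # per-key sum of the 1ROWCOUNT column

total = sum(rowcount.values())              # overall summation of the column
duplicates = [key for key, n in rowcount.items() if n > 1]
print(total, duplicates)  # 4 ['DOC2']
```

The overall total matches the record count from the list output, and any key whose count exceeds 1 is a duplicate worth investigating.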