- Support:
----------
1. Load monitoring of process chains using ST13.
2. Incident handling.
3. Handling CRs (Change Requests): when a system is already running and changes are needed to an existing query or an existing cube, I am able to handle those CRs as well.
MODELLING
----------
What is the difference between reference and template?
When you create an InfoObject with reference, you cannot change the data type or length, and the new object does not get its own table.
When you create with template, you can modify the length, the type, and the other InfoObject properties, and it gets its own table.
What is the difference between semantic group and filter in a DTP?
Semantic group: data records with the same keys are grouped into a single data package; these are also the key fields for the error stack. If there are no duplicates or error records, there is no need to define a semantic group.
With the help of filters you can restrict the data on the basis of characteristics. You can restrict a characteristic to a single value or a range of values, or exclude some values from the result.
For example, if you want data to be loaded only for a certain period, you can put a restriction on 0CALDAY and give a range such as 20 Jan to 10 Feb.
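The two DTP settings above can be pictured with a small conceptual sketch (plain Python, not BW code; the field names and dates are illustrative):

```python
# Conceptual sketch: a DTP filter restricts by a characteristic
# (here a 0CALDAY-style date range), and a semantic group bundles
# all records with the same key into one data package.
from itertools import groupby

records = [
    {"doc": "A", "calday": "20240120", "qty": 5},
    {"doc": "B", "calday": "20240125", "qty": 3},
    {"doc": "A", "calday": "20240201", "qty": 2},
    {"doc": "C", "calday": "20240315", "qty": 7},
]

# Filter: keep only records in the 20 Jan .. 10 Feb range.
filtered = [r for r in records if "20240120" <= r["calday"] <= "20240210"]

# Semantic group on "doc": sort by the key, then group so that all
# records sharing a key land in the same package.
filtered.sort(key=lambda r: r["doc"])
packages = {k: list(g) for k, g in groupby(filtered, key=lambda r: r["doc"])}

print(sorted(packages))    # ['A', 'B']  -- 'C' was filtered out
print(len(packages["A"]))  # 2  -- both 'A' records in one package
```

Grouping on the key before packaging is what guarantees duplicates of the same key are handled together, which is also why the semantic key doubles as the error-stack key.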
What is the difference between ODS and cube?
Ans: ODS is overwrite, cube is additive; ODS is two-dimensional, cube is multi-dimensional; reporting performance on a cube is much faster compared to an ODS.
Generally, when we want detailed-level reporting we prefer the ODS, and for aggregated reports we do it on the cube.
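The overwrite-versus-additive distinction can be sketched in a few lines of plain Python (a conceptual analogue, not BW code):

```python
# Conceptual sketch: an ODS/DSO overwrites per key, a cube adds up.
deltas = [("doc1", 10), ("doc1", 15), ("doc2", 4)]

dso = {}   # overwrite: the last value per key wins
cube = {}  # additive: values accumulate per key
for key, qty in deltas:
    dso[key] = qty
    cube[key] = cube.get(key, 0) + qty

print(dso["doc1"])   # 15  (overwritten)
print(cube["doc1"])  # 25  (added up)
```

This is exactly why a changed document must not be loaded twice into an additive target without the proper delta images.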
How do you decide whether to load to a cube or a DSO?
Ans: based on the delta process of the extractor (check the ROOSOURCE table; for a detailed explanation of each delta method you can go to RODELTAM). If the delta process is AIE, it always gives only an after image for a changed record, so I need to overwrite the data by loading into a DSO first.
But if the delta process is ABR (after, before, and reverse images), whenever the document changes it gives me the after and before images, so I can simply add those records up against the existing records.
So in that case I can load it directly to the cube, or via the DSO to the cube as well.
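The reasoning above can be made concrete with a small sketch (plain Python; the quantities are illustrative). A document changes from quantity 10 to 15:

```python
# Conceptual sketch: why an ABR delta can go straight to an additive
# cube, while an AIE delta needs a DSO to overwrite first.

# ABR sends a before image (-10) and an after image (+15); simply
# adding both to the existing cube record gives the correct result.
cube_total = 10            # already loaded
for delta in (-10, +15):   # before image, after image
    cube_total += delta
print(cube_total)          # 15

# AIE sends only the after image (15); adding it to the cube would
# double-count, so a DSO must overwrite the old value instead.
dso = {"doc1": 10}
dso["doc1"] = 15           # overwrite with the after image
print(dso["doc1"])         # 15
```

Both targets end at 15, but only because each delta method was paired with the matching update behaviour (additive vs. overwrite).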
Can I set my ODS to additive? Yes, you can.
What is the difference between MultiProvider and InfoSet?
Ans: MultiProvider is a union, InfoSet is a join.
Can I build a MultiProvider on a single cube? Ans: yes.
How many cubes can I use in an InfoSet? Ans: 2.
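The union-versus-join distinction can be sketched with Python sets (a conceptual analogue, not BW code; the materials and figures are made up):

```python
# Conceptual sketch: MultiProvider = union, InfoSet = (inner) join.
sales = {"M01": 100, "M02": 200}   # keyed by material
plan  = {"M02": 180, "M03": 150}

# Union: every key from either provider appears in the result.
union_keys = set(sales) | set(plan)

# Inner join: only keys present in BOTH providers survive.
join_keys = set(sales) & set(plan)

print(sorted(union_keys))  # ['M01', 'M02', 'M03']
print(sorted(join_keys))   # ['M02']
```

A MultiProvider report therefore shows rows even where only one provider has data, while an InfoSet drops every key that does not exist on both sides.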
Have you worked with InfoSets?
Ans: no, due to performance. I had a requirement, but we handled it differently: we loaded the content into the master data, and then read from the master data while loading the transaction data, instead of joining through an InfoSet.
When you join master data to transaction data this way, does it support a temporal join?
No, by default it is an inner join; if you want to work with a temporal join you can go with an InfoSet.
If you have 5 cubes in a MultiProvider but want to report from only one, what do you do?
Restrict the InfoProvider characteristic (0INFOPROV) in the query.
Did you work with InfoPackages (IP)? Ans: yes.
If somebody deleted an InfoPackage request, how can you handle it?
Ans: basically I can restrict my data in the selections; if selections are not possible, then I will do a full load into the ODS so that the data gets overwritten, and take only the changes from the DSO to the cube.
Did you work with DTPs? Ans: yes.
Did you work with transformations?
Yes, for validation checks.
Do you know how to transport objects?
Collect and bundle them in RSA1, then release the request in STMS.
Is there a specific sequence you have to follow?
Yes: InfoObjects, then targets, then DataSources, then transformations, InfoPackages, DTPs, and finally MultiProviders.
Do you know ABAP, or are you comfortable writing routines? Yes.
What is a start routine? Ans: a start routine is executed packet by packet, before the field-level rules. We have a table called SOURCE_PACKAGE, and we loop through that table and do all the manipulations, so we can implement all the data-preparation logic that has to execute before the field-level rules. The SOURCE_PACKAGE structure is similar to the source structure.
What is an end routine? An end routine is also executed packet by packet, but after the field-level rules. Any logic you want to execute after the transformation rules and before the target is written goes into the end routine.
The end routine works on a table called RESULT_PACKAGE, and the structure of RESULT_PACKAGE is similar to your target structure.
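The two routines can be pictured with a conceptual Python analogue (the real routines are ABAP; the field names and the doubling logic here are purely illustrative):

```python
# Conceptual analogue of start and end routines: the start routine
# loops over SOURCE_PACKAGE before the field-level rules; the end
# routine loops over RESULT_PACKAGE after them.
def start_routine(source_package):
    # data preparation before field-level rules: drop invalid records
    return [r for r in source_package if r["qty"] > 0]

def field_rules(record):
    # stand-in for the field-level mappings of the transformation
    return {"material": record["mat"], "quantity": record["qty"]}

def end_routine(result_package):
    # post-processing after the rules, before the target is written
    for r in result_package:
        r["quantity"] *= 2   # e.g. an illustrative unit conversion
    return result_package

source_package = [{"mat": "M1", "qty": 5}, {"mat": "M2", "qty": -1}]
result_package = end_routine(
    [field_rules(r) for r in start_routine(source_package)]
)
print(result_package)  # [{'material': 'M1', 'quantity': 10}]
```

Note the structural point from the text: the start routine sees source-shaped records, the end routine sees target-shaped records.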
Did you work with an expert routine? Yes. Basically, while extracting demand to the source, I wanted to split one record into multiple records, so I implemented an expert routine where I was able to read from the SOURCE_PACKAGE table, manipulate the records, and write them back to RESULT_PACKAGE; the records updated in RESULT_PACKAGE then get updated to the target.
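A conceptual Python sketch of that record-splitting case (the real routine is ABAP; splitting a weekly demand into days is an illustrative assumption, not the original project logic):

```python
# Conceptual sketch of an expert routine: read SOURCE_PACKAGE,
# split one record into several, write them to RESULT_PACKAGE.
def expert_routine(source_package):
    result_package = []
    for rec in source_package:
        per_day = rec["week_demand"] / 7
        for day in range(1, 8):
            result_package.append(
                {"doc": rec["doc"], "day": day, "demand": per_day}
            )
    return result_package

out = expert_routine([{"doc": "D1", "week_demand": 70}])
print(len(out))          # 7 records produced from 1
print(out[0]["demand"])  # 10.0
```

An expert routine is the right tool here because it replaces the field-level rules entirely, giving full control over how many target records each source record produces.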
Do you know about rule groups? Yes. In a single transformation I can set up multiple rule groups, and when we load, the data gets processed once for every rule group.
Did you work with the different types of DSOs? Yes.
What are they? Standard, direct update, and write-optimized.
In a write-optimized DSO, do we have an activation process? No.
Does it have a change log? No.
When I load from a write-optimized DSO to a next-level target, does it support delta? Yes. How? Ans: based on the request number. The active table of the write-optimized DSO stores the request number, and from there it takes the delta to the next-level target.
What about a direct-update DSO? Does it support delta in a data mart scenario? Ans: no, only full loads, because it has no change log and no request number.
Does a direct-update DSO support the ETL process? No.
When do you use a direct-update DSO? APD.
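Request-based delta from a write-optimized DSO can be sketched as follows (conceptual Python, not BW code; the table contents are made up):

```python
# Conceptual sketch: delta from a write-optimized DSO is selected
# by the request number stored in its active table.
active_table = [
    {"request": 1, "doc": "A", "qty": 5},
    {"request": 2, "doc": "B", "qty": 3},
    {"request": 3, "doc": "C", "qty": 7},
]

last_loaded_request = 1  # the target already has request 1

# Delta = only records from requests newer than the last one loaded.
delta = [r for r in active_table if r["request"] > last_loaded_request]
print([r["doc"] for r in delta])  # ['B', 'C']
```

This is also why a direct-update DSO cannot supply a data-mart delta: with no request number in its table, there is no marker to select against.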
What is APD?
APD (Analysis Process Designer) is a data-mining process where you can extract data from cubes and load it into your direct-update DSO, performing all the transformations in the middle.
When you used APD, what was the requirement?
Ans: I had an inventory cube which gives me the stocks, and a demand cube which gives me the demands. I used a BEx query to get the stock, used the demand cube directly as a source, and joined them.
I wrote ABAP code to calculate the demand aggregation ratio and put the result into the direct-update DSO; from the direct-update DSO I load into a cube, and on top of this cube the reporting is done, which gives stock, demand, and the demand aggregation ratio.
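The core of that APD calculation can be sketched in plain Python (the ratio formula demand/stock is an illustrative assumption, not the original project's exact logic, and the figures are made up):

```python
# Conceptual sketch: join stock (from the inventory query) with
# demand (from the demand cube) and compute a ratio per material.
stock  = {"M1": 100, "M2": 50}
demand = {"M1": 80,  "M2": 200}

result = []
for mat in stock.keys() & demand.keys():  # inner join on material
    result.append({
        "material": mat,
        "stock": stock[mat],
        "demand": demand[mat],
        "ratio": demand[mat] / stock[mat],
    })

for row in sorted(result, key=lambda r: r["material"]):
    print(row["material"], row["ratio"])  # M1 0.8 / M2 4.0
```

The result set is what would land in the direct-update DSO, with the ratio available for the reporting cube on top.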
ADVANCED MODELLING
------------------
Did you work with compression?
Can I delete my request once compression is complete?
No, there is no request ID anymore.
If the cube is compressed, can I load delta to the next data mart?
No. Once compression is done there are no requests left, and the data-mart delta is based on the request number; hence, with no requests, delta is not possible. We should do compression only after loading to the data mart.
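Why compression kills request-based delta can be sketched conceptually (plain Python; in BW, compression moves data into the E fact table under request ID 0):

```python
# Conceptual sketch: compression collapses all requests per key into
# one record with request 0, so request-based delta finds nothing.
fact_table = [
    {"request": 1, "mat": "M1", "qty": 5},
    {"request": 2, "mat": "M1", "qty": 3},
]

# Compression: aggregate by key; the request ID disappears (becomes 0).
compressed = {}
for row in fact_table:
    compressed[row["mat"]] = compressed.get(row["mat"], 0) + row["qty"]
fact_table = [{"request": 0, "mat": k, "qty": v} for k, v in compressed.items()]

# Request-based delta after compression has nothing left to select.
delta = [r for r in fact_table if r["request"] > 0]
print(fact_table[0]["qty"])  # 8  (the data itself is preserved)
print(len(delta))            # 0  (no request-based delta possible)
```

The totals survive compression; only the request-level granularity that the data-mart delta depends on is gone.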
Did you work on performance tuning?
Compression, rollup, OLAP cache, aggregates, line-item dimensions.
I have a cube; I have loaded 5 years of data into it and compressed it, and the data is also rolled up to my aggregates. Now I see some wrong data in my cube. How do I correct it?
Ans: deactivate the aggregates, do a selective deletion, reload the corrected data, and then reactivate the aggregates.
If you don't deactivate the aggregates, queries hitting them will still return the wrong data.