
I have 3.8 years of experience in BI/BW. In those 3.8 years I have worked on 3 different client engagements; two client engagements were implementation projects and two were support.

As part of the implementation engagements:
1. I was involved in requirement gathering from the users, designing the functional specs and the technical specs, doing the build and development, performing unit tests and the integration test, transporting objects from development to quality, performing the UAT and fixing defects raised during UAT. I was then involved in the cutover activities for going live to production, initializing all the extractors and scheduling the process chains from the next day onwards, and also in hypercare support after the implementation.
2. I was involved in giving KT to the support team.
3. I was involved in end-user training on how to use the reports.
4. On the functional side I worked with the application areas SD (sales orders, contracts, nominations), MM (inventory management, purchasing), FI (FI-GL, AR, AP, AA), CO, indirect procurement and direct procurement.
5. Involved in creating BEx reports with variables (replacement path, customer exit), etc.
6. As part of the build, involved in installing Business Content objects and customizing them by creating copy objects.
7. As part of unit testing, involved in preparing test cases, test scripts and test documents.
8. Implemented LSA by building data flows with DSOs (standard and write-optimized), InfoCubes and MultiProviders, and complex transformations using start routines, end routines and expert routines.
9. Built process chains for batch scheduling of data loads.
10. Extracted data from SAP and legacy systems using SAP connections, DB Connect, flat-file interfaces and external-system BAPIs.
11. Configured Information Broadcasting settings to distribute BEx queries and workbooks to e-mail and the portal.
12. Involved in handling query performance strategies using compression, aggregates, DB statistics and indices.
13. As part of ECC extraction, worked with the LO Cockpit and generic extractors based on tables, views and function modules, and also customized the standard extractors using enhancement RSAP0001.

- Support:
----------
1. Doing the load monitoring of process chains using ST13.
2. Incident handling.
3. Handling CRs (change requests): the client already has a running system and wants some changes made to an existing query or an existing cube; I am able to handle those CRs as well.
MODELLING
----------
What is the difference between reference and template?
When you create an InfoObject with reference, you cannot change the data type and length, and it does not get its own tables. When you create it with a template, you can modify the length, type and other InfoObject properties, and it does get its own tables.
What is the difference between a semantic group and a filter in a DTP?
With a semantic group, data with the same keys is grouped into a single data package; the semantic key is also the key for the error stack. If there are no duplicates or error records, there is no need to define a semantic group.
With filters you can restrict the data on the basis of characteristics: you can restrict a characteristic to a single value or a range of values, or exclude some values from the result. For example, if you want data to be loaded only for a certain period, you can put a restriction on 0CALDAY and give a range such as 20 Jan to 10 Feb.
What is the difference between an ODS and a cube?
Ans: An ODS overwrites, a cube is additive; an ODS is two-dimensional, a cube is multi-dimensional; reporting performance on a cube is much faster compared to an ODS. Generally, when we want detailed-level reporting we prefer an ODS, and for aggregated reporting we do it on a cube.
How do you decide whether to load to a cube or a DSO?
Ans: Based on the delta process of the extractor (visible in table ROOSOURCE; the detailed explanation of each delta process is in RODELTAM). If it is AIE, it always gives the after image for a changed record, so I need to overwrite the data by loading into a DSO. But if the delta process is ABR (after, before and reverse image), then whenever a document changes it gives me the after and before images; in that case I can add those records to the existing records, so I can load it straight to a cube, or to an ODS and from there to a cube.
Can I set my ODS to additive? Yes, you can.
What is the difference between a MultiProvider and an InfoSet?
Ans: A MultiProvider is a union and an InfoSet is a join.
Can I build a MultiProvider on a single cube? Ans: Yes.
How many cubes can I use in an InfoSet? Ans: 2.
Have you worked with InfoSets?
Ans: No, I haven't, due to performance. I had a requirement, but we met it by bringing the needed content into the master data and then joining from the master data to the transaction data.
When you join master data to transaction data like that, does it support a temporal join?
No, by default it is an inner join; if you want to work with temporal joins you can go with an InfoSet.
If you have 5 cubes in a MultiProvider but want the report to read only one of them, what do you do?
Restrict on 0INFOPROV (the InfoProvider) in the query.
Did you work with InfoPackages? Ans: Yes.
If somebody deleted an InfoPackage, how can you handle it?
Ans: Basically I can restrict my data in the selections; if no selection is possible, then I will do a full load into an ODS so the data gets overwritten, and take only the changes from the DSO to the cube.
Did you work with DTPs? Ans: Yes.
Did you work with transformations?
Yes, for validation checks.
Do you know how to transport objects?
Collect and bundle the objects into a request in RSA1, then release the request in STMS.
Is there a specific sequence you have to follow?
Yes: InfoObjects, then targets, then DataSources, then transformations, InfoPackages, DTPs, MultiProviders.
Do you know ABAP, or are you comfortable writing routines? Yes.
What is a start routine? Ans: A start routine is executed packet by packet, and it is executed before the field-level rules. We have a table called SOURCE_PACKAGE; we loop through this table and do all the manipulations, so we can implement all the data-preparation logic that has to execute before the field-level rules. The structure of SOURCE_PACKAGE is similar to the source structure.
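A minimal sketch of what a start routine body can look like in a BW 7.x transformation (illustrative only; the generated line type _ty_s_SC_1 depends on the transformation, and the field names recordmode and doc_number are assumptions):

    FIELD-SYMBOLS: <ls_src> TYPE _ty_s_sc_1.     " line type generated for the source

    " drop records that should never reach the field-level rules
    DELETE SOURCE_PACKAGE WHERE recordmode = 'D'.

    " package-wide data preparation before the field-level rules run
    LOOP AT SOURCE_PACKAGE ASSIGNING <ls_src>.
      TRANSLATE <ls_src>-doc_number TO UPPER CASE.
    ENDLOOP.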
What is an end routine? An end routine is executed packet by packet, and it is executed after the field-level rules. Any logic you want executed after the transformation rules and before the target goes into the end routine. The end routine has a table called RESULT_PACKAGE, and the structure of RESULT_PACKAGE is similar to your target structure.
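A comparable minimal sketch of an end routine body, working on RESULT_PACKAGE after the field-level rules (again illustrative; _ty_s_TG_1 is the generated target line type and the fields net_value and return_flag are assumptions):

    FIELD-SYMBOLS: <ls_res> TYPE _ty_s_tg_1.     " line type generated for the target

    LOOP AT RESULT_PACKAGE ASSIGNING <ls_res>.
      " derive a flag only after all field mappings have been applied
      IF <ls_res>-net_value < 0.
        <ls_res>-return_flag = 'X'.
      ENDIF.
    ENDLOOP.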
Did you work with an expert routine? Yes. Basically, when extracting demand from the source, I wanted to split one record into multiple records, so I implemented an expert routine where I read from the SOURCE_PACKAGE table, manipulated the records and wrote them back to RESULT_PACKAGE; the records updated in RESULT_PACKAGE are what get updated to the target.
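A rough sketch of such a record-splitting expert routine body (in an expert routine no field-level rules run, so the routine fills RESULT_PACKAGE itself; the split logic and the field names no_of_periods, total_qty and demand_qty are made up for illustration):

    DATA: ls_result TYPE _ty_s_tg_1.
    FIELD-SYMBOLS: <ls_src> TYPE _ty_s_sc_1.

    LOOP AT SOURCE_PACKAGE ASSIGNING <ls_src>.
      " split one demand record into one target record per period
      DO <ls_src>-no_of_periods TIMES.
        CLEAR ls_result.
        MOVE-CORRESPONDING <ls_src> TO ls_result.
        ls_result-demand_qty = <ls_src>-total_qty / <ls_src>-no_of_periods.
        APPEND ls_result TO RESULT_PACKAGE.
      ENDDO.
    ENDLOOP.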
Do you know about rule groups? Yes. In a single transformation I can set up multiple sets of rules (rule groups), and while loading, the data gets processed once for every rule group.
Did you work with the different types of DSOs? Yes.
What are they? Standard, direct update, write-optimized.
In a write-optimized DSO, do we have an activation process? No.
Does it have a change log? No.
When I load from a write-optimized DSO to the next-level target, does it support delta? Yes. How? Ans: Based on the request number; the active table contains the request number, and from there it takes the delta to the next-level target.
What about a direct-update DSO — does it support delta in a data-mart scenario? Ans: No, only full, because it has no change log and no request number.
Does a direct-update DSO support the ETL process? No.
When do you use a direct-update DSO? In an APD.
What is an APD?
An APD (Analysis Process Designer) is a data-mining process where you can extract data from cubes and load it into a direct-update DSO, performing all the transformations in the middle.
When you used an APD, what was the requirement?
Ans: I had an inventory cube which gives me stocks and a demand cube which gives me the demands. I used a BEx query to get the stock, used the demand cube directly as a source, and joined them. I wrote ABAP code to calculate the demand aggregation ratio and put the result into a direct-update DSO; from the direct-update DSO I loaded into a cube, and on top of this cube the reporting was done, which gives me stock, demand and demand aggregation.

What is the importance of the change log table in a DSO?
Ans: It is used for delta to further targets, and also for selective deletion; all the change records are captured in this table.
Can I delete the data in the change log table?
Yes, you can, to save space. After the data has been loaded to the next-level target we can keep deleting it; generally we delete change log requests older than 7 days.
What is the difference between a display and a navigational attribute?
A display attribute is completely dependent on its characteristic in the reporting, whereas a navigational attribute acts like a characteristic in the reporting.
Why did you make it a navigational attribute — why not make it a characteristic in the cube?
A characteristic in the cube gives the fact as posted, but a navigational attribute gives you the present truth (the current master data) and still acts as a characteristic in the report.
What is the use of the attribute change run?
Navigational attributes can be used in aggregates. When I load master data and it changes, the changed navigational attribute values must be reflected in all aggregates wherever they are used; so we run the attribute change run, which adjusts all the aggregates where navigational attributes have been used.
What is the maximum number of tables we can create for master data with all options? Ans: 10 — /S (SID), /P (P table), /Q (time-dependent), /X (nav), /Y (time-dependent nav), /T (text), and /H, /I, /J, /K (hierarchy tables).
Do you know how to load hierarchies from a flat file?
Ans: Yes. Create a flat-file DataSource, set my InfoObject as InfoProvider and connect them with a transformation; when you create the transformation it generates multiple segments, each segment with its own set of mappings (multiple rule groups). Activate it, run the InfoPackage and run the DTP. If it is R/3 and there is a Business Content hierarchy DataSource, we install it, replicate it, connect the objects and start loading with the InfoPackage and the DTP.
How do you put an InfoPackage into a process chain?
Only if the InfoPackage supports background processing can we put it into a process chain.
Did you work with process chains? What are the t-codes for process chains?
Ans: RSPC, RSPCM, ST13, RSPC1.
If a process chain is running and fails, how do you fix it?
Ans: If a process like index deletion or index creation fails and I can solve it by repeating it — right-click and repeat — then that process is repeated and it automatically triggers its subsequent processes as well. But if I fix a process like a DTP or InfoPackage manually, it does not trigger the processes below it; so I run the function module RSPC_PROCESS_FINISH (for example from a small SE38 report), passing parameters such as the log ID, process chain name, process type, variant name, instance name and status "G" (successful). Executing it triggers the processes below from there. That is how I handle it.
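For reference, a call to RSPC_PROCESS_FINISH typically looks like the sketch below; the values are placeholders and the exact parameter list should be verified in your own system before use:

    REPORT z_finish_pc_step.

    CALL FUNCTION 'RSPC_PROCESS_FINISH'
      EXPORTING
        i_logid    = 'LOG_ID_OF_FAILED_RUN'    " log id of the chain run
        i_chain    = 'ZPC_SALES_DAILY'         " process chain name (placeholder)
        i_type     = 'DTP_LOAD'                " process type of the failed step
        i_variant  = 'DTP_VARIANT_NAME'        " variant of the failed step
        i_instance = 'INSTANCE_ID'             " instance id from the chain log
        i_state    = 'G'.                      " 'G' = green / successful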
When you were doing load monitoring, what kinds of errors did you see?
We could see some invalid-character errors; while loading master data we had some duplicate records; and we also had problems with locks (deadlocks). When multiple processes run at the same time they might run into locks, so wait for some time and repeat the process. For example, an attribute change run in one chain and a rollup in another chain: every Thursday they coincided and caused deadlocks, because the ACR and the rollup are conflicting processes — in both cases we are trying to write data to the aggregates — so wait for some time, repeat, and it goes through.
We also saw index failures, "log not set" errors, short dumps and runtime errors; those we raise to the Basis team, and if they also fail to solve it, we can raise it to SAP through an OSS message via services.sap.com, providing our licence details and our problem.
EXTRACTION
-----------
Do you know LO extraction? Steps?
Ans: Yes, I have worked with LO. Install the DataSource in RSA5, then go to LBWE, deactivate the DataSource and maintain the extract structure, generate the DataSource, and specify the update mode as direct delta, queued delta or unserialized V3 update; activate it, replicate it and start connecting it to the target.
What are the update modes and their differences?
Ans: Direct delta, queued delta, unserialized V3 update.
Direct delta: when records are posted they go directly to the delta queue; when we run a delta load they come into BW.
Queued delta: when postings are done, all the LUWs are collected into the extraction queue; when we schedule the V3 job it collects all the LUWs from the extraction queue into the delta queue; when you run the delta load from the delta queue it gets into BW.
Unserialized V3 update: when documents are posted they are collected, without serialization, into the update tables; when we schedule the V3 job it collects them from the update tables into the delta queue; when we run the delta load they get loaded into BW.
In direct delta there is no V3 job; in queued delta the postings are processed serially; in unserialized V3 update they are processed without serialization.
When do we use which one?
When the number of postings is small and infrequent, we use direct delta.
When the number of postings is huge, you want serialization, you accept the V3 job and you are loading to an ODS, we go for queued delta.
When the number of postings is huge and serialization is not required, we go for unserialized V3 update.
Why are setup tables needed for LO extraction, when in FI there are no setup tables?
In LO extraction there are DataSources at different levels of the document (header, item and schedule line). If you don't use setup tables and extract the header first, then the items, then the schedule lines, the document numbers are not in sync across all the levels. But if I use setup tables, I fill all the levels of information into the setup tables, and from there each level's DataSource extracts its data, so all the documents are in sync. In LO extraction I want one DataSource for each document level; the setup tables implement the concept of cluster tables — in the database it is a single table holding the entire application, but logically each level is seen separately, so there is one DataSource per level. To get a synchronized initialization across all levels of a certain application, we use setup tables.
Do you know CO-PA? (Generally they don't go deep into CO-PA.) Yes.
Ans: Go to KEB0, provide the DataSource name, choose create, provide the operating concern name, select the fields we want, generate the field catalogue, generate the DataSource, then replicate and start loading.
Why did SAP not give a ready-made DataSource for CO-PA?
A CO-PA DataSource is based on an operating concern, and the structure of the operating concern depends on the business requirements; that is why there is no fixed DataSource. SAP just provides the environment where we can generate a CO-PA DataSource based on the existing operating concern as configured by the FI-CO consultant.
Tables: CE1xxxx, CE2xxxx, CE3xxxx and CE4xxxx.
What is the difference between costing-based and account-based CO-PA?
In costing-based CO-PA each value field has its own key figure; in account-based CO-PA there is a single key figure, differentiated by G/L account.
What is the delta mechanism of CO-PA?
Timestamp. You can see the timestamp in t-code KEB2.
GENERIC
-------
Did you work with generic extractors?
Ans: Yes — based on a table, a view, an InfoSet query and a function module.
Did you work with a function-module extractor? Do you know how to write an FM extractor?
Which table did you use?
Ans: Z_X_PH_OM (whatever name you want, as opposed to standard tables like VBAK or VBUK).
What was the purpose of the table?
Ans: Nominations did not have any standard table — for contracts SAP has a standard table, and for orders there is a standard table, but nominations were our client's own requirement, so a custom table was built for nominations. We had to extract data from that "Z" table, so we used a generic extractor on the table.
Ans: Yes, I know how to write an FM extractor.
How do you control the packet size?
We use the open cursor, fetch cursor and close cursor concepts: when the extractor is called for packet ID 0, I open the cursor (data initialization); on each extraction call I fetch from the cursor, fetching each time up to the maximum package size; and at the end I close the cursor. That is how we control the packet size.
How do you handle the data selections?
When you work with an FM extractor there is an internal table called I_T_SELECT which holds all the selections; I collect all the selections into S_S_IF-T_SELECT, loop through this table and fill the values into range variables, and while opening the cursor I use these ranges in the WHERE clause. That is how the selections are enabled (see the sketch a little further below).
(OR)
What was the scenario that made you go for an FM extractor?
Ans: I have a custom "Z" table with two columns, created date and changed date, and I want the delta mechanism enabled on both columns. The normal standard delta functionality on a table can be enabled on only a single column, but here I need two columns, where current date = created date OR changed date. So I pass the current date into the created-date and changed-date data selections in the InfoPackage — I have written code to pass the current date to them — and in the WHERE clause of the open cursor I read these selections and say WHERE created date = current date OR changed date = current date. (The default combination of selections in an InfoPackage is AND, but here we needed OR.) Since I wanted to work with OR, I used an FM extractor to enable delta on the two columns, created date and changed date.
There was also a requirement for delivery open items: deliveries are available in LIPS and billing in VBRP.
What is the importance of I_T_FIELDS?
Ans: It is nothing but the list of fields that need to be extracted. That is why, when opening the cursor for the select, instead of listing all the fields we use the content of I_T_FIELDS.
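To tie the last few answers together, here is a trimmed sketch of a generic FM extractor modelled on the SAP template RSAX_BIW_GET_DATA_SIMPLE. The function name Z_BW_GET_NOMINATIONS, the Z table ZNOMIN and its field CREATED_ON are invented for illustration; the open/fetch/close cursor pattern, the use of S_S_IF-T_SELECT for selections and of T_FIELDS in the SELECT are the points described above:

    FUNCTION z_bw_get_nominations.
    * importing: i_dsource, i_initflag, i_maxsize, i_t_select, i_t_fields ...
    * exporting: e_t_data (table of the extract structure)

      STATICS: s_s_if              TYPE srsc_s_if_simple,  " remembered interface
               s_counter_datapakid TYPE sy-tabix,          " packet counter
               s_cursor            TYPE cursor.

      DATA: l_s_select TYPE srsc_s_select.
      RANGES: r_created FOR sy-datum.

      IF i_initflag = 'X'.
        " initialization call: remember max packet size, selections and field list
        s_s_if-maxsize = i_maxsize.
        APPEND LINES OF i_t_select TO s_s_if-t_select.
        APPEND LINES OF i_t_fields TO s_s_if-t_fields.
      ELSE.
        IF s_counter_datapakid = 0.
          " packet 0: turn the selections into ranges and open the cursor once
          LOOP AT s_s_if-t_select INTO l_s_select WHERE fieldnm = 'CREATED_ON'.
            MOVE-CORRESPONDING l_s_select TO r_created.
            APPEND r_created.
          ENDLOOP.
          OPEN CURSOR WITH HOLD s_cursor FOR
            SELECT (s_s_if-t_fields) FROM znomin
              WHERE created_on IN r_created.
        ENDIF.
        " every call: fetch at most one packet of i_maxsize records
        FETCH NEXT CURSOR s_cursor
          APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
          PACKAGE SIZE s_s_if-maxsize.
        IF sy-subrc <> 0.
          CLOSE CURSOR s_cursor.
          RAISE no_more_data.
        ENDIF.
        s_counter_datapakid = s_counter_datapakid + 1.
      ENDIF.
    ENDFUNCTION.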
Did you extract with DB Connect?
Ans: Yes. Right-click, specify the DB connection name, specify the table or view name, and start the extraction.
Did you work with DataSource enhancement? What was the scenario and how did you do it?
Ans: To implement contract management, my standard VBAP table was enhanced with almost 20 fields required by the business — less than full truck load, LMC, nomination min value, nomination max value, nomination date and other custom fields. These fields were appended to the communication structure (MCVBAP); in the maintenance of the extract structure we moved them into the extract structure, and we used enhancement RSAP0001. Within its exit EXIT_SAPLRSAP_001 we have a table called C_T_DATA; in a CASE on I_DATASOURCE, for my DataSource I loop through C_T_DATA, perform a lookup select on VBAP using the sales order number and item number, read those 20 enhanced fields and fill them into the extract structure, modifying the record by SY-TABIX (or assigning a field symbol, so that the change writes back to the table).
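A hedged sketch of the kind of code that goes into that exit (include ZXRSAU01 of EXIT_SAPLRSAP_001); the DataSource 2LIS_11_VAITM, its extract structure MC11VA0ITM and the appended ZZ field names are used here only to illustrate the pattern:

      DATA: l_s_itm TYPE mc11va0itm,    " extract structure of the item DataSource
            ls_vbap TYPE vbap,
            l_tabix TYPE sy-tabix.

      CASE i_datasource.
        WHEN '2LIS_11_VAITM'.
          LOOP AT c_t_data INTO l_s_itm.
            l_tabix = sy-tabix.
            " look up the enhanced fields on the sales order item
            SELECT SINGLE * FROM vbap INTO ls_vbap
              WHERE vbeln = l_s_itm-vbeln
                AND posnr = l_s_itm-posnr.
            IF sy-subrc = 0.
              l_s_itm-zznom_min = ls_vbap-zznom_min.   " appended fields (assumed names)
              l_s_itm-zznom_max = ls_vbap-zznom_max.
              MODIFY c_t_data FROM l_s_itm INDEX l_tabix.
            ENDIF.
          ENDLOOP.
      ENDCASE.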
Have you worked with the inventory DataSources?
Yes. The DataSources are 2LIS_03_BX, 2LIS_03_BF and 2LIS_03_UM, loading to cube 0IC_C03. BX is for the initial stock, BF is for material movements and UM is for revaluations.
How did you load?
We load BX only once, to generate the initial stock, and compress it with marker update so the marker values are updated. When I load my historical movements with BF (the init load bringing the historical movements), I compress without marker update, because I don't want them to update the marker (if I compressed them with marker update the stock would be double-counted). Then all the regular delta loads are compressed with marker update, so the marker always reflects the current stock value and the current stock can be read directly.
What is the importance of the validity table?
The validity table tells you from which date to which date the stock is valid, but in the validity table the OLAP engine refers only to the record where the request number is -1; the date range of the record with request number -1 is what is used in the reporting.
What exactly is revaluation?
Suppose you do purchasing and your purchase order is for a thousand. The BF DataSource gives the stock and also the valuation of the stock. But when the purchase invoice is billed and you pay, say you get a discount of 10 rupees; that was not recorded earlier. Those changes, revaluated in the finance department, have to be loaded again into my cube — that is what comes through revaluation (UM).
REPORTING
---------
What is the difference between a condition and a restriction (filter)?
Restrictions are on characteristic values and conditions are on key-figure values.
What is the difference between an RKF and a new selection?
An RKF (restricted key figure) is global and a new selection is local to the query.
What is the importance of a condition?
We restrict the result based on key-figure values.
Can you have multiple conditions?
Ans: Different conditions are combined with AND; rows within the same condition are combined with OR.
What is the difference between a CKF and a new formula?
A CKF (calculated key figure) is global and a new formula is local. A CKF can use all the key figures in the InfoProvider, but a new formula can use only elements that exist in the query structures.
What is the maximum number of structures we can use?
We can create any number of structures, but a query can use only 2 of them.
When do you use 2 structures?
To enable cell definitions. Using cell definitions we can define the calculation for each individual cell.
What is a formula collision?
When I have a cell where two formulas collide, we can specify which formula is given preference for that cell.
Did you work with variables? Types?
Characteristic value, text, formula, hierarchy and hierarchy-node variables.
Processing types: default/manual entry, customer exit, replacement path, SAP exit, authorization.
Did you work with formula variables with replacement path?
Yes. I have actual goods issue date and planned goods issue date; both are characteristics. To calculate the number of days I cannot use characteristics in the calculation directly, so I used one formula variable with replacement path replacing the actual goods issue date and another replacing the planned goods issue date, and using those variables in the calculation I bring out the number of days.
Did you use this scenario anywhere else?
Yes. We have a characteristic 0MATERIAL which has a key-figure attribute, standard price. I wanted to multiply this standard price with the quantity from the cube to calculate the sales amount. I cannot use a key figure that is an attribute directly in a calculation, so I used a formula variable with replacement path, replacing the standard price attribute of 0MATERIAL, and used it in the calculation — a formula variable with replacement path in this case as well.
Did you use text variables with replacement path?
Yes. I used text variables with replacement path when designing trending reports, where I wanted the column headings to change based on the year or month.
(Or)
In a trending report where you want current month, next month and previous month, we use a text variable with replacement path on calendar year/month.
What are variable offsets?
Ans: With characteristic variable offsets we use +1, -1, etc., so that instead of the user entering another variable we calculate current year, next year and so on. (For a text variable, the offset specifies which part of the string you want.)
When do you go with a characteristic variable with replacement path?
Ans: Say you are running a report and want a variable which takes as its input the output of another query. The system first runs query one, and the output of its result becomes the input to my variable — then we go with a characteristic variable with replacement path (replacement from query).
I have negative values in my report and I want to show those negative values in parentheses. Where do you control it?
Ans: In the query properties, under display, set how negative values are shown (e.g. "value-" or "(value)").
Exception aggregation?
Ans: I have billed amount and I would like to know the number of orders: exception aggregation (count) with reference characteristic order number.
Did you work with customer-exit variables?
Yes, I worked with customer exits at both I_STEP 1 and I_STEP 2. I wanted a variable on calendar quarter that is filled with the current quarter, and I created it (I_STEP 1). I also had an I_STEP 2 case: my user fills in the year, and based on the year I have to fill the name of the cube into an InfoProvider variable on the MultiProvider. Besides the year variable on calendar year, I had another variable on the InfoProvider which I had to fill with the cube name; it was created as a mandatory variable with "ready for input" deselected. The user enters the year at query runtime; I read the value from I_T_VAR_RANGE, take the year the user entered, determine the provider name from it, fill the range line (LOW = my provider name, together with SIGN and OPT) and append it to E_T_RANGE, which goes to my variable. That is how I worked with I_STEP 2.
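A sketch of what that I_STEP 2 logic can look like in the variable exit (enhancement RSR00001, include ZXRSRU01 of EXIT_SAPLRRS0_001); the variable names ZV_PROVIDER and ZV_YEAR and the cube-derivation rule are assumptions made for illustration:

      DATA: ls_var_range TYPE rrrangeexit,     " line of i_t_var_range
            ls_range     TYPE rsr_s_rangesid.  " line of e_t_range

      CASE i_vnam.
        WHEN 'ZV_PROVIDER'.                " variable on the InfoProvider, not ready for input
          IF i_step = 2.                   " processed after the user entries
            READ TABLE i_t_var_range INTO ls_var_range
                 WITH KEY vnam = 'ZV_YEAR'.          " user-entry year variable
            IF sy-subrc = 0.
              ls_range-sign = 'I'.
              ls_range-opt  = 'EQ'.
              " derive the cube of the MultiProvider from the year (assumed rule)
              IF ls_var_range-low < '2015'.
                ls_range-low = 'ZSD_C01_HIST'.
              ELSE.
                ls_range-low = 'ZSD_C01'.
              ENDIF.
              APPEND ls_range TO e_t_range.
            ENDIF.
          ENDIF.
      ENDCASE.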
I want a report on the turnaround time from purchase order to goods issue.
Ans: Goods issue date minus purchase order date gives the number of turnaround days.
How did you calculate the number of days?
Ans: With formula variables with replacement path, replacing the dates: one formula variable replacing the actual goods issue date and another with replacement path on the purchase order date; subtracting one from the other gives the number of days.
What types of reports did you build?
Ans: 1. On-time delivery report (which sales were delivered on time and which were not): I have planned goods issue date and actual goods issue date; if planned >= actual then I say it was delivered on time, otherwise not.
2. We delivered a position-of-sale report (the complete position of a sale: what is my open order value, closed order value, open delivery value, net sales).
3. We delivered a balance sheet report, profit and loss reports, and contract compliance reports (how many contracts, how many are compliant or non-compliant, nomination volume compliance, date-level compliance).
How many reports did you work on?
25
How many queries?
7

ADVANCED MODELLING
-----------------
Did you work with compression?
Can I delete my request once the compression is complete?
No — after compression there is no request ID.
If the cube is compressed, can I load delta to the next data mart?
No. Once compression is done there are no requests left, and the data-mart delta is based on the request number; hence, with no requests, delta is not possible. We should do the compression only after loading to the data mart.
Did you work on performance tuning?
Compression, rollup, OLAP cache, aggregates, line-item dimensions.
I have a cube, I have loaded 5 years of data into it and compressed it, and the data is also rolled up to my aggregates. Now I see some wrong data in my cube. How do I correct it?
Ans: Deactivate the aggregates, do a selective deletion, reload the corrected data and reactivate (refill) the aggregates.
If you don't deactivate the aggregates, the cube's aggregates will still contain the wrong data.

What is the difference between an APD and an OHD?

Which one is better for query performance, an RKF or a new selection?
In which scenario do we use a write-optimized DSO?
When do you use an InfoObject as an InfoProvider?
What is a technical key?
In which tab do we have "exception aggregation", and what are the other options on that tab?
In which tab in BEx do we have "external access"?
What is the difference between LO and FI extraction?
