
SQL Intro, SQL Syntax, SQL Select, SQL Distinct, SQL Where, SQL And & Or, SQL Order By, SQL Insert, SQL Update, SQL Delete, SQL Top, SQL Like, SQL Wildcards, SQL In, SQL Between, SQL Alias, SQL Joins, SQL Inner Join, SQL Left Join, SQL Right Join, SQL Full Join, SQL Union, SQL Select Into, SQL Create DB, SQL Create Table, SQL Constraints, SQL Not Null, SQL Unique, SQL Primary Key, SQL Foreign Key, SQL Check, SQL Default, SQL Create Index, SQL Drop, SQL Alter, SQL Increment, SQL Views, SQL Dates, SQL Nulls, SQL isnull(), SQL Data Types
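As a quick refresher on the constraint and join topics listed above, here is a minimal sketch; the table and column names (dept, emp) are illustrative assumptions, not taken from this document:

    -- Constraints: NOT NULL, UNIQUE, PRIMARY KEY, FOREIGN KEY, CHECK, DEFAULT
    CREATE TABLE dept (
        deptno  NUMBER(2)    PRIMARY KEY,
        dname   VARCHAR2(30) NOT NULL UNIQUE
    );

    CREATE TABLE emp (
        empno   NUMBER(4)    PRIMARY KEY,
        ename   VARCHAR2(30) NOT NULL,
        sal     NUMBER(8,2)  DEFAULT 0 CHECK (sal >= 0),
        deptno  NUMBER(2)    REFERENCES dept(deptno)   -- foreign key
    );

    -- INNER JOIN with aliases and ORDER BY
    SELECT e.ename, d.dname
    FROM   emp e
    INNER JOIN dept d ON e.deptno = d.deptno
    ORDER BY e.ename;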

SQL Functions: SQL avg(), SQL count(), SQL first(), SQL last(), SQL max(), SQL min(), SQL sum(), SQL Group By, SQL Having, SQL ucase(), SQL lcase(), SQL mid(), SQL len(), SQL round(), SQL now(), SQL format()

Describe the difference between a procedure, a function and an anonymous PL/SQL block.
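A minimal PL/SQL sketch of the three block types asked about above (names are illustrative): an anonymous block is unnamed and not stored, a procedure is a named stored block invoked for its action, and a function is a named stored block that must return a value.

    -- Anonymous block: compiled and run on the spot, not stored in the database
    BEGIN
        DBMS_OUTPUT.PUT_LINE('Hello from an anonymous block');
    END;
    /

    -- Procedure: named, stored, called for its side effect
    CREATE OR REPLACE PROCEDURE log_msg (p_msg IN VARCHAR2) IS
    BEGIN
        DBMS_OUTPUT.PUT_LINE(p_msg);
    END log_msg;
    /

    -- Function: named, stored, must RETURN a value (usable inside SQL)
    CREATE OR REPLACE FUNCTION net_salary (p_gross IN NUMBER) RETURN NUMBER IS
    BEGIN
        RETURN p_gross * 0.9;
    END net_salary;
    /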

What is a mutating table error and how can you get around it? Describe the use of %ROWTYPE and %TYPE in PL/SQL What packages (if any) has Oracle provided for use by developers? Describe the use of PL/SQL tables How can you generate debugging output from PL/SQL? What are the types of triggers?
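For the %TYPE/%ROWTYPE and debugging-output questions above, a small sketch assuming a typical emp(empno, ename, sal) table; DBMS_OUTPUT is the standard package for generating debugging output (run with SET SERVEROUTPUT ON in SQL*Plus to see it).

    DECLARE
        v_sal  emp.sal%TYPE;     -- inherits the column's datatype
        v_row  emp%ROWTYPE;      -- a record with one field per table column
    BEGIN
        SELECT sal INTO v_sal FROM emp WHERE empno = 7369;
        SELECT *   INTO v_row FROM emp WHERE empno = 7369;
        DBMS_OUTPUT.PUT_LINE('salary = ' || v_sal || ', name = ' || v_row.ename);
    END;
    /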
What are the components of the physical database structure of an Oracle database? What are the components of the logical database structure of an Oracle database? What is a tablespace? What is the SYSTEM tablespace and when is it created? Explain the relationship among database, tablespace and data file? What is a schema? What are schema objects? Can objects of the same schema reside in different tablespaces? What is an Oracle table? What is an Oracle view? What is a full backup? What is the use of the control file? What are clusters? What is an integrity constraint? What is an index? What is a table? What are the advantages of views? What is an Oracle sequence? What is a synonym? What is a rollback segment? How do you define the data block size? What does ROLLBACK do?
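A hedged sketch for several of the structure questions above (tablespace, data file, view, sequence, synonym); file names and sizes are made up for illustration.

    -- A tablespace is a logical storage unit backed by one or more physical data files
    CREATE TABLESPACE app_data
        DATAFILE 'app_data01.dbf' SIZE 100M;

    -- A view is a stored query; a sequence generates numbers; a synonym is an alias
    CREATE VIEW high_paid AS
        SELECT empno, ename, sal FROM emp WHERE sal > 5000;

    CREATE SEQUENCE emp_seq START WITH 1 INCREMENT BY 1;

    CREATE SYNONYM staff FOR emp;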

What does COMMIT do? Define a transaction? What is a schema? What is Parallel Server? What are roles? How can we implement roles? What is the use of transactional triggers? What is FORMS_DDL? Differentiate between TRUNCATE and DELETE?
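A short sketch for the COMMIT/ROLLBACK and TRUNCATE-vs-DELETE questions: DELETE is DML (row by row, logged, can be rolled back), while TRUNCATE is DDL (deallocates the segment, commits implicitly, cannot be rolled back). Table names are illustrative.

    DELETE FROM emp WHERE deptno = 40;   -- DML: changes are still undoable
    ROLLBACK;                            -- undoes the uncommitted DELETE

    DELETE FROM emp WHERE deptno = 40;
    COMMIT;                              -- makes the change permanent

    TRUNCATE TABLE emp_stage;            -- DDL: implicit commit, no rollback, resets the high-water mark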

Oracle cursors: implicit and explicit cursors
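A minimal sketch contrasting the two cursor styles named above, assuming the usual emp table: an implicit cursor is what Oracle opens for a single SELECT INTO or DML statement (inspected via SQL%ROWCOUNT, SQL%FOUND), while an explicit cursor is declared, opened and fetched by the programmer.

    DECLARE
        v_name emp.ename%TYPE;
        CURSOR c_emp IS SELECT ename FROM emp WHERE deptno = 10;  -- explicit cursor
    BEGIN
        -- implicit cursor: opened and closed by Oracle itself
        UPDATE emp SET sal = sal * 1.1 WHERE deptno = 10;
        DBMS_OUTPUT.PUT_LINE(SQL%ROWCOUNT || ' rows updated');

        -- explicit cursor: the programmer controls open / fetch / close
        OPEN c_emp;
        LOOP
            FETCH c_emp INTO v_name;
            EXIT WHEN c_emp%NOTFOUND;
            DBMS_OUTPUT.PUT_LINE(v_name);
        END LOOP;
        CLOSE c_emp;
    END;
    /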


What is another way to replace a NULL value in a query result with text? What are the more common pseudo-columns? What is the output of the SIGN function? What is the maximum number of triggers that can be applied to a single table? What are PL/SQL cursor exceptions? Any three PL/SQL exceptions? Which date function returns a number value? Display odd/even numbered records. Display a number value in words. How to view installed Oracle version information. Find out the nth highest salary from the emp table. Implicit cursor attributes. Explicit cursor attributes. Oracle cursors: implicit and explicit cursors. Display the records between two ranges. How do I display a row number with records? How do I eliminate duplicate rows? Switch to the DOS prompt. Change the SQL prompt name. See the current user name.
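Possible answers to a few of the classic exercises above (nth highest salary, duplicate elimination, displaying row numbers), written against an assumed emp table; these are common Oracle idioms, not the only solutions.

    -- Nth highest salary (here N = 3) using DENSE_RANK
    SELECT sal
    FROM  (SELECT sal, DENSE_RANK() OVER (ORDER BY sal DESC) rnk FROM emp)
    WHERE rnk = 3;

    -- Eliminate duplicate rows, keeping the row with the lowest ROWID per empno
    DELETE FROM emp e
    WHERE  ROWID > (SELECT MIN(ROWID) FROM emp WHERE empno = e.empno);

    -- Display a row number with each record
    SELECT ROWNUM, ename, sal FROM emp;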

Basic: What is a database? What is a DBMS? What is a database system? Advantages of a DBMS? Disadvantages of a file processing system? Describe the three levels of data abstraction? Define the "integrity rules"? What is extension and intension? What is System R? What are its two major subsystems? How is the data structure of System R different from the relational structure? What is data independence?

What is a view? How is it related to data independence? What is a data model? What is the E-R model? What is the object-oriented model? What is an entity? What is an entity type? What is an entity set? What is an attribute? What is a relation schema and a relation? What is the degree of a relation? What is a relationship? What is DDL (Data Definition Language)? What is VDL (View Definition Language)? What is SDL (Storage Definition Language)? What is Data Storage Definition Language? What is DML (Data Manipulation Language)? What is a DML compiler? What is a query evaluation engine? What is a DDL interpreter? What is record-at-a-time? What is set-at-a-time or set-oriented? What is relational algebra? What is relational calculus? What is normalization? What is functional dependency? What is multivalued dependency? What is 1NF (Normal Form)? What is 2NF? What is 3NF? What is BCNF (Boyce-Codd Normal Form)? What is 4NF? What is 5NF? Codd's rules
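A tiny worked illustration of the normalization questions above: the unnormalized order table below mixes facts about the order with facts about the customer; moving the customer attribute into its own table removes the transitive dependency (order_id -> customer_id -> customer_city) and brings the design to 3NF. Table names are invented for the example.

    -- Before: customer_city depends on customer_id, not on the key order_id (transitive dependency)
    CREATE TABLE orders_unnormalized (
        order_id      NUMBER PRIMARY KEY,
        customer_id   NUMBER,
        customer_city VARCHAR2(40),
        order_total   NUMBER(10,2)
    );

    -- After (3NF): every non-key column depends on the key, the whole key, and nothing but the key
    CREATE TABLE customer (
        customer_id   NUMBER PRIMARY KEY,
        customer_city VARCHAR2(40)
    );
    CREATE TABLE orders (
        order_id      NUMBER PRIMARY KEY,
        customer_id   NUMBER REFERENCES customer(customer_id),
        order_total   NUMBER(10,2)
    );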

What is Functional Dependency? What is a hybrid slowly changing dimension? What is an ER diagram? Can a dimension table contain numeric values? How to create a surrogate key using Ab Initio? Differences between star and snowflake schemas. What is a degenerate dimension table? What is a CUBE in the data warehousing concept? What is VLDB? What are the various ETL tools in the market? What is the main difference between a schema in an RDBMS and schemas in a data warehouse? What is meant by metadata in the context of a data warehouse and why is it important? What is dimensional modelling? What is a linked cube? What is a surrogate key? Where do we use it? Explain with examples. What are the data types present in BO? And what happens if we implement a view in the designer and report? What are data validation strategies for data mart validation after the loading process? What is a data warehousing hierarchy? What is a BUS schema? What is a conformed fact? What is the difference between E-R modeling and dimensional modeling? Why is the fact table in normal form? What are the definitions of normalized and denormalized views and what are the differences between them? What is a junk dimension? What is the difference between a junk dimension and a degenerate dimension? What is the main difference between the Inmon and Kimball philosophies of data warehousing? What is the difference between a view and a materialized view? What are the advantages of data mining over traditional approaches? What are the different architectures of a data warehouse? What are the steps to build a data warehouse? What is a source qualifier? What is the difference between ODS and OLTP? What is the difference between a data warehouse and BI? What is the difference between OLAP and a data warehouse? What are aggregate tables and aggregate fact tables? Any examples of both? What are the steps to be taken to schedule …? Kindly explain in detail about Type 1, Type 2 (SCD) and Type 3. What is active data warehousing? What is a snapshot? Why do we override the execute method in Struts? Please give the details. What is the difference between data warehousing and business intelligence? What is a BO repository?
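A small DDL sketch for several of the dimensional-modelling questions above (star schema, surrogate key, degenerate dimension); all object names are illustrative.

    -- Dimension table with a surrogate key (meaningless integer, not the business key)
    CREATE TABLE dim_customer (
        customer_key  NUMBER PRIMARY KEY,      -- surrogate key
        customer_id   VARCHAR2(20),            -- natural / business key
        customer_name VARCHAR2(60),
        city          VARCHAR2(40)
    );

    -- Fact table at the centre of the star: foreign keys to dimensions plus measures;
    -- order_number is kept on the fact with no dimension of its own (a degenerate dimension)
    CREATE TABLE fact_sales (
        customer_key  NUMBER REFERENCES dim_customer(customer_key),
        date_key      NUMBER,                  -- would reference dim_date in a fuller model
        order_number  VARCHAR2(20),            -- degenerate dimension
        quantity      NUMBER,
        sales_amount  NUMBER(12,2)
    );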

What are the processes involved in the data warehousing project cycle?

What is requirement gathering? What is JAD? What is the approach that we use to decide which business intelligence tool to use? What does the choice between buy vs. build for selecting the business intelligence tool depend on? Explain the factors that are necessary for the database/hardware selection?
RDBMS/hardware combination: because the RDBMS physically sits on the hardware, the two have to be evaluated together. What are the popular relational databases? Which are the most commonly used business intelligence tools? Which business intelligence tool provides flexibility in terms of each user being able to create, schedule and run their own reports?

Which tool is used by the users to look at the data from multiple dimensions?
What is the modeling technique used in data warehousing? What are the different levels of abstraction for a data model? What is OLAP, and also discuss its types?

What is a dimensional data model used for? What are the names of the tools in which people put the data modeling information? What are the criteria for evaluating OLAP vendors?
Modeling Types Modeling Methods Business Process Business Process Tools Business Process Management(BPM) Advantages of BPM Business Process Re-engineering Business Process Modeling Business Process Modeling Tools Business Process Modeling Example Process Flow Modeling Data Flow Modeling Workflows Activity Monitoring Data Modeling Tools Tools: What to Learn? DM Tools - Erwin DM Tools - Xcase Development Cycle DM Standards Create a Data Model Data Modeler Role Modeling Reports Conceptual DM Enterprise DM Logical DM Physical DM Logical vs Physical Relational(OLTP) DM Dimensional DM Relational vs Dimensional Dimensions

Slowly Changing Dimensions DW Concepts DW & Data Mart Star Schema Snowflake Schema Fact Table ETL Tools ETL Concepts Informatica Informatica - Transformations Metadata Tools Business Metadata Technical Metadata Metadata & ETL Metadata Reports BI Overview BI Tools OLAP & Hybrids OLAP Analysis OLAP Database - Multidimensional Key Performance Indicators BI Dashboards BI Scorecards What is Data Mining?


SERVER JOBS: UniVerse stages, UniData 6 stages, UniData stages, Sequential File stages, Transformer stages, Aggregator stages, Folder stages, Inter-process stage, Link Collector, Link Partitioner, DataStage debugger, Complex Flat File stage, Command stage, Merge, Pivot, Sort


PARALLEL JOBS: Complex Flat File, Data Set, File Set, Lookup File Set, External Source, External Target; processing stages: Aggregator, Transformer, Change Apply, Change Capture, Compare, Compress, Copy, Decode, Difference, Encode, Expand, External Filter, Filter, Funnel, Generic, Join, Lookup, Merge, Modify, Pivot, Remove Duplicates, Surrogate Key Generator


Switch; parallel restructure stages: Column Export, Column Import, Combine Records, Make Subrecord, Promote Subrecord, Split Subrecord, Split Vector; Mainframe stages: Delimited Flat File, External Source, Fixed-Width Flat File, Multi-Format Flat File, Aggregator, Business Rule, External Routine, FTP, Join, Lookup, Sort; development/debug stages: Column Generator, Head, Peek, Row Generator, Sample, Tail, Write Range Map


SEQUENCE JOBS: End Loop activity, Exception Handler, Execute Command, Job activity, Notification activity, Routine activity, Sequencer, Start Loop activity, Terminator activity, User Variables activity, Wait For File activity; routine, variable


Parameter, container, repository, Designer, Director, Administrator, Manager. What are other performance tunings? Tell me one situation from your last project: how did you solve the problem? Eliminate duplicate rows. Implement complex jobs in parallel for better performance. Difference between Informatica and DataStage. Hashing algorithm/backup. DataStage debugging. Create SCD Type 1, 2 and 3. How can you call a procedure? How do you remove duplicates without using the Remove Duplicates stage? Job control/job sequence. How can we test jobs in DataStage (unit/integration testing)? How can you find out the process ID? Explain a specific scenario where we would use range partitioning. How to take a backup of a project?


1. What is the flow of loading data into fact & dimensional tables? 2. Orchestrate vs. DataStage Parallel Extender? 3. Differentiate Primary Key and Partition Key? 4. How do you execute a DataStage job from the command line prompt? What are Stage Variables, Derivations and Constants? 5. What is the default cache size? How do you change the cache size if needed? 6. Containers: usage and types? 7. Compare and contrast ODBC and Plug-In stages? 8. How do you run a shell script within the scope of a DataStage job? 9. Types of parallel processing? 10. What does a config file in Parallel Extender consist of? 11. Functionality of Link Partitioner and Link Collector? 12. What is Modulus and Splitting in a Dynamic Hashed File?

Types of views in DataStage Director? Differentiate database data and data warehouse data? What are the difficulties faced in using DataStage? Or what are the constraints in using DataStage? 18. What are XML files, how do you read data from XML files, and what stage is to be used? 19. How do you catch bad rows from an OCI stage? Suppose there are a million records, did you use OCI? If not, then what stage do you prefer?

22. How do you pass the parameter to the job sequence if the job is running at night? What happens if the job fails at night? 26. How do you track performance statistics and enhance them? 27. What is the order of execution done internally in the transformer, with the stage editor having input links on the left hand side and output links? 28. What are the often-used stages or stages you worked with in your last project? 29. How many jobs have you created in your last project? 30. Tell me about the environment in your last projects. 31. Did you parameterize the job or hard-code the values in the jobs? 32. Have you ever been involved in updating the DS versions like DS 5.X? If so, tell us some of the steps you have taken in doing so? 33. What is the Hash File stage and what is it used for? 34. What are Static Hash files and Dynamic Hash files?

What versions of DS have you worked with?

Did you work in a UNIX environment? How good are you with your PL/SQL? Explain the differences between Oracle 8i/9i? Do you know about the INTEGRITY/QUALITY stage? Do you know about METASTAGE? How did you connect to DB2 in your last project? What are OConv() and IConv() functions and where are they used? What are Routines, where/how are they written, and have you written any routines before? If you worked with DS 6.0 and later versions, what are Link Partitioner and Link Collector used for? How did you handle reject data?

49. What are other performance tunings you have done in your last project to increase the performance of slowly running jobs?

50. How did you handle an 'Aborted' sequencer? What are Sequencers? How did you connect with DB2 in your last project? Read the string functions in DS. What will you do in a situation where somebody wants to send you a file and use that file as an input? How would you call an external Java function which is not supported by DataStage? What is the utility you use to schedule the jobs on a UNIX server other than using Ascential Director? What are the command line functions that import and export the DS jobs? How will you determine the sequence of jobs to load into the data warehouse? Tell me one situation from your last project where you faced a problem and how did you solve it? Does the selection of 'Clear the table and Insert rows' in the ODBC stage send a TRUNCATE statement to the database? How do you rename all of the jobs to support your new file-naming conventions?


Difference between Hash File and Sequential File? How can we join one Oracle source and a Sequential File? How can we implement Slowly Changing Dimensions in DataStage? How can we implement a Lookup in DataStage Server jobs? What are all the third-party tools used in DataStage? What is the difference between a routine, a transform and a function? What are the job parameters?

Performance and tuning of DS jobs. How can we create Containers? When should we use ODS? What's the difference between an operational data store (ODS) and a data warehouse? How to handle date conversions in DataStage? Convert a mm/dd/yyyy format to yyyy-dd-mm? How do you pass a filename as the parameter for a job?



Performance tuning in Informatica? What are the output files that the Informatica server creates during the session run? What is the target load order? What are the new features of the Server Manager in Informatica 5.0? Can you start a batch within a batch? What are the differences between the Joiner transformation and the Source Qualifier transformation? What are the new features in Informatica 5.0? In a sequential batch, can you run a session if the previous session fails? Can you use the mapping parameters or variables created in one mapping in any other reusable transformation? Can you generate reports in Informatica? Can you copy a session to a different folder or repository? Describe the two levels at which the Update Strategy transformation sets? Why do you use repository connectivity? What is the Source Qualifier transformation? What is incremental aggregation? Explain about perform recovery? How does the Informatica server sort string values in the Rank transformation? In which circumstances does the Informatica server create reject files? What are the joiner caches? What are the different options used to configure sequential batches? What are the methods for creating reusable transformations? What is tracing level and what are the types of tracing level? How can you work with a remote database in Informatica? Did you work directly by using remote connections? Why use the Lookup transformation? What are the types of data that pass between the Informatica server and a stored procedure? What is the Update Strategy transformation? What are the circumstances in which the Informatica server results in an unrecoverable session? What is the PowerCenter repository? What is the Load Manager? What is data driven? What is meant by lookup caches? Define the Informatica repository?

What are the types of maping wizards that are to be provided in Informatica? What is the Rankindex in Ranktransformation? What are two modes of data movement in Informatica Server? What is aggregate cache in aggregator transforamtion? What are Target Types on the Server? What are Target Options on the Servers? What is the status code? Difference between static cache and dynamic cache? While importing the relational source defintion from database, what are the meta data of source you import? What are Dimensions and various types of Dimensions? What are the session parameters? What is Session and Batches? What is Router transformation? What are the reusable transforamtions? How can you access the remote source into your session? Where should you place the flat file to import the flat file defintion to the designer? Why we use partitioning the session in informatica? What is parameter file? Which transformation should we use to normalize the COBOL and relational sources? Which transformation should you need while using the cobol sources as source defintions? What are the join types in joiner transformation? After draging the ports of three sources(sql server,oracle,informix) to a single source qualifier, can u map these three ports directly to target? If i done any modifications for my table in back end does it reflect in informatca warehouse or maping desginer or source analyzer? Can you copy the batches? Can you use the maping parameters or variables created in one maping into another maping?

What is target load order? How do you configure a mapping in Informatica? When conventional data warehousing is able to handle and answer complex queries, then what is the need for OLTP and OLAP? How can we join 3 databases like Flat File, Oracle and DB2 in Informatica? How do you transfer the data from a data warehouse to a flat file?

How can you work with a remote database in Informatica? Did you work directly by using remote connections? What is the Rank index in the Rank transformation? What is aggregate cache in the Aggregator transformation? There are no target options for the ERP target type. Difference between summary filter and details filter? What is the difference between STOP and ABORT in Informatica at the session level? What is the difference between partitioning of relational targets and partitioning of file targets? What are partition points? What is a parameter file? Which transformation should we use to normalize COBOL and relational sources? Which transformation do you need while using COBOL sources as source definitions? What are the join types in the Joiner transformation? What is the logic you will implement to load the data into one fact table from 'n' number of dimensions?

Can you copy the batches? What are the real-time problems that generally come up while doing or running a mapping or any transformation? What is MicroStrategy? Why is it used? Can anyone explain in detail about it? What are the main advantages and purpose of using the Normalizer transformation in Informatica? Can you use the mapping parameters or variables created in one mapping in another mapping? Can anyone explain real-time complex mappings or complex transformations in Informatica?

What are the differences …? What are the different threads in the DTM process? What is the difference between a mapplet and a reusable transformation? Define mapping and sessions? What are the mapping parameters and mapping variables? When you use a mapping parameter, you declare and use the … What are the tasks that the Load Manager process will do? When you start a session, the Load Manager fetches the session information from the repository to perform the validations and verifications prior to starting the DTM process. What is tracing level? Can we run a group of sessions without using Workflow Manager? How do you create a single Lookup transformation using multiple tables? How can we join tables if the tables have no primary and foreign key relation and no matching port to join? Describe the two levels at which the Update Strategy transformation sets? How do you handle decimal places while importing a flat file into Informatica? What is a parameter file? What is aggregate cache in the Aggregator transformation? Why do you use repository connectivity? When you create a version of a folder referenced by shortcuts, all … What is the Source Qualifier transformation? What is a source qualifier? What is incremental aggregation? What are the steps you follow? What is the exact use of the 'Online' and 'Offline' server connect options while defining a workflow in Workflow Monitor? How does the Informatica server sort string values in the Rank transformation? In which circumstances does the Informatica server create reject files? What are the joiner caches? What are variable ports, and list two situations when they can be used? How to load the time dimension? We can use an MQSeries Source Qualifier when we have an MQ messaging system as source (queue). In a sequential batch, how can we stop a single session?

Can you start a session inside a batch individually? What is a view? How is it related to data independence? And what are the different types of views, and what is a materialized view? What are the various types of aggregation? What is a mystery dimension? How do you create a mapping using multiple Lookup transformations? How can we eliminate duplicate rows from a flat file? What is the Lookup transformation? How to get two targets, T1 containing distinct values and T2 …? Why did you use Update Strategy in your application? What are the options in the target session of the Update Strategy transformation? How do we do unit testing in Informatica? How can you complete unrecoverable sessions? What is the difference between connected and unconnected stored procedures? Can you start a batch within a batch? How can you create or import a flat file definition into the Warehouse Designer? What are the different types of Type 2 dimension mapping? What are the mappings that we use for slowly changing dimension tables? How many ways can you create ports? How many ways can you update a relational source definition, and what are they? What are the basic needs to join two sources in a source qualifier? What are the different options used to configure sequential batches? What are the methods for creating reusable transformations? What are the active and passive transformations? What is tracing level and what are the types of tracing level? How can you work with a remote database in Informatica? Did you work directly by using remote connections? Why use the Lookup transformation? How can I detect an Aggregator transformation causing low performance? How do you load the time dimension? What are the types of data that pass between the Informatica server and a stored procedure? What is the Update Strategy transformation? Why is the Sorter transformation an active transformation? How to append the records in a flat file (Informatica), whereas in DataStage we have the options? In which conditions can we not use the Joiner transformation (limitations of the Joiner transformation)? If you have four lookup tables in the workflow, how do you troubleshoot to improve performance? How many types of dimensions are available in Informatica? Where should you place the flat file to import the flat file definition into the Designer? What is the difference between Informatica 7.1 and Ab Initio? How many types of facts are there, and what are they?

What is the filename which you need to configure in UNIX while installing Informatica?

What happens if you increase or decrease commit intervals? When you decrease your commit interval to ~10,000 rows, your session will be slower compared to before, but if the session fails at the 24,000th record you will lose only 4,000 records.

What is hash partitioning? What is the approximate size of the data warehouse? A view is nothing but a stored SQL query; it does not store data in tables. But in the case of materialized views, the changes made to the table cannot be seen in the materialized view until it is refreshed; materialized views have separate memory allocation, and if we delete the table the materialized view will still be there. A materialized view is a read-only view; DDL operations are not possible on it. How are data models used in practice? What is an active and a passive transformation? Sorter is an active transformation: it mainly changes only the order of rows, but because it can also change the number of output records (for example, when duplicates are dropped with the distinct option), it is classified as active. Why do we use a DSS database for OLAP tools? What is Update Strategy and what are the options for update? What is data merging, data cleansing and sampling? What is a staging area? What is a query panel? How can you define a transformation? What are the different types? What is hash partitioning? Which kind of index is preferred in a DWH? What is the PowerPlay plug-in? What is the difference between macros and prompts? What is an IQD file? What is the difference between PowerPlay Transformer and PowerPlay reports? What is the capacity of a PowerCube? What is a conformed dimension? What is a factless fact schema? What are metadata and the system catalog? What are the advantages of denormalized data?
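To make the view vs. materialized view point above concrete, a hedged Oracle sketch (object names invented): the plain view stores only the query, while the materialized view stores the result set and must be refreshed to pick up base-table changes.

    -- Plain view: no data stored, always reflects the base table
    CREATE OR REPLACE VIEW v_sales_by_region AS
        SELECT region, SUM(amount) AS total_amount
        FROM   sales
        GROUP  BY region;

    -- Materialized view: result set is stored; changes show up only after a refresh
    CREATE MATERIALIZED VIEW mv_sales_by_region
        BUILD IMMEDIATE
        REFRESH COMPLETE ON DEMAND
    AS
        SELECT region, SUM(amount) AS total_amount
        FROM   sales
        GROUP  BY region;

    -- Later, to pick up base-table changes:
    -- EXEC DBMS_MVIEW.REFRESH('MV_SALES_BY_REGION');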

If I have done any modifications to my table in the back end, does it reflect in the Informatica warehouse or Mapping Designer or Source Analyzer? What are the circumstances in which the Informatica server results in an unrecoverable session? How can you recover a session in sequential batches? How to recover a standalone session? If a session fails after loading 10,000 records into the target, how can you load the records from the 10,001st record when you run the session next time? Explain about recovering sessions? What is the difference between the Stored Procedure transformation and the External Procedure transformation? What are the scheduling options to run a session? What is incremental aggregation? What are the new features in Informatica 5.0?

How can u work with remote database in informatica?did you work directly by using remote connections? What is power center repository? What is Performance tuning in Informatica? what are the transformations that restricts the partitioning of sessions? What is parameter file? How can you access the remote source into your session? What is difference between partioning of relatonal target and partitioning of file targets? What are the session parameters? How can u stop a batch? Can you start a session inside a batch individually? Can you start a batches with in a batch? In a sequential batch can u run the session if previous session fails? What is a command that used to run a batch? When the informatica server marks that a batch is failed? How many number of sessions that u can create in a batch? Can you copy the batches? What is batch and describe about types of batches? What is polling? In which circumstances that informatica server creates Reject files? What aer the out put files that the informatica server creates during the session running? What are the different threads in DTM process? What is DTM process? Why you use repository connectivity? How the informatica server increases the session performance through partitioning the source? To achieve the session partition what r the necessary tasks Why we use partitioning the session in informatica? Which tool yoU use to create and manage sessions and batches and to monitor and stop the informaticaserver? Define maping and sessions? What is metadata reporter? What are two types of processes that informatica runs the session? How can you recognise whether or not the newly added rows in the source r gets insert in the target ? What are the mapings that we use for slowly changing dimension table? What are the types of maping in Getting Started Wizard? What are the types of maping wizards that r to be provided in Informatica? What are the types of maping wizards that r to be provided in Informatica? What are the options in the target session of update strategy transsformatioin? What is Datadriven? What is the default source option for update stratgey transformation? what is update strategy transformation ? What are the basic needs to join two sources in a source qualifier? What is the default join that source qualifier provides? What is the target load order? What are the tasks that source qualifier performs? What is source qualifier transformation? What is the status code?

Why we use stored procedure transformation? What are the types of groups in Router transformation? What is the Router transformation? What is the Rankindex in Ranktransformation? What are the rank caches? How the informatica server sorts the string values in Rank transformation? Which transformation should we use to normalize the COBOL What are the types of lookup caches? what is the size of persistance cache? how we can identifiey the previous persistance cache? can we use the persistance cache on another mapping? What are the Differences between connected and unconnected lookup? What are the joiner caches What are the join types in joiner transformation? what are the settiings that u use to cofigure the joiner transformation? What are the diffrences between joiner transformation and source qualifier transformation? How can yoU improve session performance in aggregator transformation? What are the unsupported repository objects for a mapplet? What are the methods for creating reusable transforamtions? What are the reusable transforamtions? How many ways you create ports? What are the connected or unconnected transforamations? What are the active and passive transforamtions What are the designer tools for creating tranformations? what is a transforamation How can yoU create or import flat file definition in to the warehouse designer To provide support for Mainframes source data,which files rused as a COBOL file Howmany ways yoU can update a relational source defintion and what are they? While importing the relational source defintion from database,what are the meta data of source U import?

What are parallel queries and query hints? Reference cursors have two types. What is aggregate awareness? What is the difference between a mapplet and a reusable transformation? How many repositories can we create in Informatica? Explain grouped cross tab. What is the difference between DSS and OLTP? What is a source qualifier? What are mapping parameters and variables in Informatica? What are pre-session, post-session success and post-session failure commands? How to identify bottlenecks in …? How to enter the same record twice in the target table, explain? What are the different threads in the DTM process? What are the different types of schemas? If a session fails after loading 10,000 records in the target, how can we load the 10,001st record when we run the session the next time? What is fact table granularity? What are reusable transformations, and in how many ways can we create them?

What is confirmed dimension and fact? What are the two modes of datamovement in informatica sever? What is status code? What is the difference between OLTP and ODS? What are slowly changing dimensions? AT the max how many transformations and mapplets can we use in a mapping ? After dragging the ports of 3 sources we map these ports directly to target and how? How can we eliminate duplicate rows from flatfile,explain? If we have lookup table in workflow how do you trouble shhot to increase performance? can we generate reports in informatica ? How? What are limitations of joiner transformation? How can we join the tables if they don't have primary and foreign key relationship and no matching port?How abt using a full outer join as the type of join in the Name 4 output files that informatica server creates during session running? What is the functionality of update strategy? What are teh different tasks that can be created in workflow manager? What are the new features of informatica 7.1? Explain the flow of data in Iinformatica? Explain one complecated mapping? what are the real time problems generally come up while doing or running mapping or any transformation? What is exact use of 'Online' and 'Offline' server connect Options while defining Work flow in Work flow ? what is the logic will you implement to laod the data in to one factv from 'n' number of dimension? what is the difference between Informatica 7.1 and Abinitio? What is Micro Strategy? Why is it used for? what is the difference between stop and abort Two relational tables are connected to SQ Trans,what are the possible errors it will be thrown? what are cost based and rule based approaches and what is the difference? Explain about the concept of mapping parameters and variables ? What are the different types of Type2 dimension maping? What are the types of maping in Getting Started Wizard? What are the out put files that the informatica server creates during the session running? What are the data movement modes in informatcia? What is difference between maplet and reusable How to recover the standalone session? How to recover the standalone session? If you done any modifications for a table in back end does it reflect in informatca warehouse or maping? How to recover sessions in concurrent batches If a session fails after loading of 10,000 records in to the target.How can you load the records from 10001? Why we use lookup transformations? What is Code Page Compatibility? What is tracing level and what are the types of tracing What are Target Options on the Servers? What are the types of metadata that stores in repository? Define informatica repository? In a sequential batch can you run the session if previous session fails? When the informatica server marks that a batch is failed? Can you copy the session to a different folder or repository? In which circumstances that informatica server creates Reject files? What is aggregate cache in aggregator transforamtion? What are the reusable transforamtions?

what is a time dimension? give an example? Discuss the advantages & Disadvantages of star & snowflake shema What is the difference between Normal load and Bulk load? what is a junk dimension ? What is the procedure to load the fact table.Give in detail? What is the use of incremental aggregation? why dimenstion tables are denormalized in nature ? What is the difference between Power Centre and Power Mart? what are the enhancements made to Informatica 7.1.1 version when compared to 6.2.2 version? what is the exact meaning of domain? How do you handle decimal places while importing a flatfile into informatica? What is IQD file? What is data merging,data cleansing,sampling? How to import oracle sequence into Informatica? what is worklet and what use of worklet and in which situation we can use it? what happens if you try to create a shortcut to a non-shared folder? If you want to create indexes after the load process which transformation you choose? Where is the cache stored in informatica what is surrogatekey ? In ur project in which situation u has used ? explain with example ? what is Partitioning ? where we can use Partition? what are the different types of transformation available in informatica and what are the mostly used ? How to recover sessions in concurrent batches? what is the gap analysis? How to view and Generate Metadata Reports? How to call stored Procedure from Workflow monitor in Informatica 7.1 version? What is critical mapping? How to improve the performance of Aggregate transformation? How do you create single lookup transformation using multiple tables? In update strategy target table or flat file which gives more performance ? why? what is the flow? what is the architecture of any Data warehousing project? what is lookup ?

What is SQL override? Can we have multiple conditions in a Lookup? Can we have multiple conditions in a Filter? How are the flags called in Update Strategy? Is it possible to run the session other than from Server Manager? If so, how? What are the different things you can do using PMCMD? What is the use of PowerPlug? What kind of test plan? What kind of validation do you do?

What is the usage of unconnected/connected look up?

What is the difference between Connected and Unconnected Lookups ?

If you have data coming from different sources, what transformation will you use in your Designer? What are the different ports in Informatica? What is a variable port? Why is it used?

Difference between active and passive transformations?

What are Mapplet?

What is the Aggregator transformation?

What is Router Transformation? How is it different from Filter transformation?

What are connected and unconnected transformations?

What is Normalizer transformation?

How to use a sequence created in Oracle in Informatica? What are source qualifier transformations? What are cache and their types in Informatica?
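For the question above on using a sequence created in Oracle from Informatica, the Oracle side is just a sequence plus NEXTVAL; a common approach (an assumption here, not something stated in this document) is to call it through a SQL override or a Stored Procedure transformation rather than a Sequence Generator. Object names are invented.

    CREATE SEQUENCE order_key_seq START WITH 1 INCREMENT BY 1 CACHE 100;

    -- NEXTVAL hands out the next surrogate value; CURRVAL re-reads it in the same session
    SELECT order_key_seq.NEXTVAL FROM dual;

    -- Wrapping it in a function makes it callable from a Stored Procedure transformation
    CREATE OR REPLACE FUNCTION next_order_key RETURN NUMBER IS
        v_key NUMBER;
    BEGIN
        SELECT order_key_seq.NEXTVAL INTO v_key FROM dual;
        RETURN v_key;
    END;
    /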

What is an incremental aggregation?

What is Reject loading?

What is a session and what are batches?

Significance of Source Qualifier Transformation

What are 2 modes of data movement in Informatica Server?

Why we use lookup transformations?

What are conformed dimensions?

What is Data warehousing

What is a reusable transformation? What is a mapplet? Explain the difference between them.

What happens when you use the delete or update or reject or insert statement in your Update Strategy?

Where do you design your mappings? Where do you define users and privileges in Informatica? How do you debug the data in the Informatica tool? When you run the session, does the debugger load the data to the target? Can you use a flat file and a table (relational) as sources together? Suppose I need to separate the data for delete and insert to the target depending on a condition; which transformation do you use? What is the difference between the lookup data cache and index cache?

What is an indicator file and how it can be used.

Different Tools in Designer

Components of Informatica

Different Tools in Workflow Manager

What is the overview window? Why is it used? While using the Debugger, how will you find out which transformation is currently running? How do you load the data using Informatica? What is a Filter transformation? Or what options do you have in a Filter transformation?

What happens to the discarded rows in a Filter transformation? What are the two programs that communicate with the Informatica Server?

What can you do with Designer?

What are the different types of tracing levels you have in transformations?

What is a mapplet and how do you create a mapplet?

If the data source is in the form of an Excel spreadsheet, then how do you use it?

When do you use a connected lookup and when do you use an unconnected lookup?

How many values does it (the Informatica server) return when it passes through a Connected Lookup and an Unconnected Lookup?

What kind of modifications can you do/perform with each transformation?

Expressions in transformations: explain briefly how you use them?

In case a flat file (which comes through FTP as source) has not arrived, then what happens? What does a Load Manager do?

What is a cache

What is an Expression transformation?

I have two sources, S1 having 100 records and S2 having 10,000 records, and I want to join them using a Joiner transformation. Which of these two sources (S1, S2) should be the master to improve my performance? Why? I have a source and I want to generate sequence numbers using mappings in Informatica, but I don't want to use the Sequence Generator transformation. Is there any other way to do it? What is a bad file? What is the first column of the bad file?

What are the contents of the cache directory on the server? Is Lookup an active transformation or a passive transformation? What is a mapping? What are the types of transformations? If a Sequence Generator (with an increment of 1) is connected to (say) 3 targets and each target uses the NEXTVAL port, what value will each target get? Have you used the Abort and Decode functions?

What do you know about the Informatica server architecture? Load Manager, DTM, Reader, Writer, Transformer

What are the default values for variables? How many ways you can filter the records?

How do you identify the bottlenecks in Mappings?

How to improve the Session performance?

What is Business components ? Where it exists ?

What are Short cuts ? Where it is used ?

While importing the relational source definition from database, what are the meta data of source U import?

. How many ways U can update a relational source definition and what r they?

How many ways u create ports?

.What r the unsupported repository objects for a mapplet?

What r the mapping parameters and mapping variables?

Can U use the mapping parameters or variables created in one mapping into another mapping?

Can u use the mapping parameters or variables created in one mapping into any other reusable transformation? How can U improve session performance in aggregator transformation?

.What r the difference between joiner transformation and source qualifier transformation?

In which conditions we can not use joiner transformation(Limitations of joiner transformation)?

What r the settings that u use to configure the joiner transformation?

What r the join types in joiner transformation?

How the informatica server sorts the string values in Rank transformation?

What is the Rank index in Rank transformation?

What is the Router transformation?

What r the types of groups in Router transformation?

Why we use stored procedure transformation?

What r the types of data that passes between informatica server and stored procedure?

What is the status code?

What r the tasks that source qualifier performs?

What is the default join that source qualifier provides?

. What r the basic needs to join two sources in a source qualifier?

what is update strategy transformation ?

Describe two levels in which update strategy transformation sets?

What is the default source option for update strategy transformation? What is Data driven?

What r the options in the target session of update strategy transformation?

What r the types of mapping wizards that r to be provided in Informatica?

What r the types of mapping in Getting Started Wizard?

What r the mappings that we use for slowly changing dimension table?

What r the different types of Type2 dimension mapping?

How can u recognize whether or not the newly added rows in the source r gets insert in the target ?

What r two types of processes that informatica runs the session?

What r the features of the server manager in the informatica 5.0?

Can u generate reports in Informatica?

What is metadata reporter?

Define mapping and sessions?

Which tool U use to create and manage sessions and batches and to monitor and stop the informatica server? Why we use partitioning the session in informatica?

To achieve the session partition what r the necessary tasks u have to do?

How the informatica server increases the session performance through partitioning the source?

. Why u use repository connectivity?

What r the tasks that Load manger process will do?

What is DTM process?

What r the different threads in DTM process?

What r the data movement modes in informatica?

What r the out put files that the informatica server creates during the session running?

In which circumstances that informatica server creates Reject files?

What is polling?

Can u copy the session to a different folder or repository?

. What is batch and describe about types of batches?

Can u copy the batches? .How many number of sessions that u can create in a batch?

When the informatica server marks that a batch is failed? . What is a command that used to run a batch What r the different options used to configure the sequential batches?

In a sequential batch, can you run a session if the previous session fails?

Can you start batches within a batch?

Can you start a session inside a batch individually?

How can you stop a batch?

What are the session parameters?

What is a parameter file?

How can you access a remote source in your session?

What is the difference between partitioning of relational targets and partitioning of file targets?

What are the transformations that restrict the partitioning of sessions?

How is performance tuning done in Informatica?

Define the Informatica repository.

What are the types of metadata stored in the repository?

What is the PowerCenter repository?

How can you work with a remote database in Informatica? Did you work directly by using remote connections?

What is incremental aggregation?

What are the scheduling options to run a session?

What is tracing level, and what are the types of tracing levels?

What is the difference between the Stored Procedure transformation and the External Procedure transformation?

Explain recovering sessions.

If a session fails after loading 10,000 records into the target, how can you load the records from the 10,001st record when you run the session next time?

Explain performing recovery.

How do you recover a standalone session?

How can you recover a session in sequential batches?

How do you recover sessions in concurrent batches?

How can you complete unrecoverable sessions?

What are the circumstances in which the Informatica server results in an unrecoverable session?

If I have made any modifications to my table in the back end, do they reflect in the Informatica warehouse, mapping designer or source analyzer?

After dragging the ports of three sources (SQL Server, Oracle, Informix) to a single source qualifier, can you map these three ports directly to a target?
Server Variables

Folders

Multiple Servers


What is the need of OLTP and OLAP?


What are views, and what is a materialized view?


Overriding the SQL in a source qualifier or lookup for additional logic.
Yes.
Yes.
0 - DD_INSERT, 1 - DD_UPDATE, 2 - DD_DELETE, 3 - DD_REJECT.
Yes, using pmcmd.
Start, stop and abort the session.
For 3rd-party connectors to SAP, mainframe and PeopleSoft.
In Informatica we create some test SQL to compare the number of records, and validation scripts to check whether the data in the warehouse is loaded according to the logic incorporated.
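As a rough sketch of the kind of test SQL mentioned above, a reconciliation query might compare row counts (and a business measure) between a source table and its warehouse target; the table and column names src_orders, dw_orders and order_amt here are hypothetical.

-- compare row counts between an assumed source table and its target
SELECT (SELECT COUNT(*) FROM src_orders) AS source_rows,
       (SELECT COUNT(*) FROM dw_orders)  AS target_rows
FROM dual;
-- optionally compare a business measure as well
SELECT SUM(order_amt) AS source_amt FROM src_orders;
SELECT SUM(order_amt) AS target_amt FROM dw_orders;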

We use a lookup for connecting to a table in the source or a target. There are two ways in which a lookup can be configured: connected or unconnected.
Connected Lookup: receives input values directly from the pipeline. Unconnected Lookup: receives input values from the result of a :LKP expression in another transformation.
Connected Lookup: you can use a dynamic or static cache. Unconnected Lookup: you can use a static cache only.
Connected Lookup: the cache includes all lookup columns used in the mapping (that is, lookup table columns included in the lookup condition and lookup table columns linked as output ports to other transformations). Unconnected Lookup: the cache includes all lookup/output ports in the lookup condition and the lookup/return port.
Connected Lookup: can return multiple columns from the same row or insert into the dynamic lookup cache. Unconnected Lookup: you designate one return port (R) and it returns one column from each row; it cannot use a dynamic lookup cache.
Connected Lookup: if there is no match for the lookup condition, the Informatica Server returns the default value for all output ports; if you configure dynamic caching, the Informatica Server inserts rows into the cache. Unconnected Lookup: if there is no match for the lookup condition, the Informatica Server returns NULL.
Connected Lookup: passes multiple output values to another transformation by linking lookup/output ports to that transformation. Unconnected Lookup: passes one output value to another transformation; the lookup/output/return port passes the value to the transformation calling the :LKP expression.
Connected Lookup: supports user-defined default values. Unconnected Lookup: does not support user-defined default values.
Joiner transformation.
Port types: Input, Output, Variable, Return/Rank, Lookup and Master. A variable port is used to store intermediate results; variable ports can reference input ports and variable ports, but not output ports.
A transformation can be active or passive. An active transformation can change the number of records passed through it; a passive transformation can never change the record count. Active transformations that might change the record count are Advanced External Procedure, Aggregator, Filter, Joiner, Normalizer, Rank, Update Strategy and Source Qualifier; if you use PowerConnect to access ERP sources, the ERP Source Qualifier is also an active transformation. Passive transformations are Lookup, Expression, External Procedure, Sequence Generator and Stored Procedure. You can connect only one active transformation to the same transformation or target, but you can connect any number of passive transformations.
A mapplet is a reusable object that represents a set of transformations. It allows you to reuse transformation logic and can contain as many transformations as you need.

An Aggregator transformation allows you to perform aggregate calculations, such as averages and sums. The Aggregator transformation is unlike the Expression transformation in that you can use the Aggregator transformation to perform calculations on groups.
A Router transformation is similar to a Filter transformation because both transformations allow you to use a condition to test data. A Filter transformation tests data for one condition and drops the rows of data that do not meet the condition. However, a Router transformation tests data for one or more conditions and gives you the option to route rows of data that do not meet any of the conditions to a default output group.
Connected transformations are transformations which are in the data flow, whereas an unconnected transformation will not be in the data flow. These are dealt with in the Lookup and Stored Procedure transformations.
The Normalizer transformation normalizes records from COBOL and relational sources, allowing you to organize the data according to your needs. A Normalizer transformation can appear anywhere in a data flow when you normalize a relational source.
By using the Stored Procedure transformation.
The source qualifier represents the records that the Informatica Server reads when it runs a session.
The Informatica server creates index and data caches for the Aggregator, Rank, Joiner and Lookup transformations in a mapping. The Informatica server stores key values in the index cache and output values in the data cache.
In incremental aggregation, you apply captured changes in the source to aggregate calculations in a session. If the source changes only incrementally and you can capture changes, you can configure the session to process only those changes. This allows the Informatica server to update your target incrementally, rather than forcing it to process the entire source and recalculate the same calculation each time you run the session.
During a session, the Informatica server creates a reject file for each target instance in the mapping. If the writer or the target rejects data, the Informatica server writes the rejected row into the reject file. The reject file and session log contain information that helps you determine the cause of the reject. You can correct reject files and load them to relational targets using the Informatica reject load utility. The reject loader also creates another reject file for the data that the writer or target rejects during the reject loading.

SESSION - A session is a set of instructions that tells the Informatica Server how and when to move data from sources to targets. After creating the session, we can use either the Server Manager or the command line program pmcmd to start or stop the session.
BATCHES - Provide a way to group sessions for either serial or parallel execution by the Informatica Server. There are two types of batches: 1. SEQUENTIAL - runs sessions one after the other. 2. CONCURRENT - runs sessions at the same time.
When you add a relational or a flat file source definition to a mapping, you need to connect it to a Source Qualifier transformation. The Source Qualifier represents the records that the Informatica Server reads when it runs a session. It is used to: join data originating from the same database; filter records in the source itself; specify an outer join instead of the default inner join; specify sorted ports; select distinct values from the source; and create a custom query to issue a special SELECT statement for the Informatica server to read source data. For example, we might use a custom query to perform aggregate calculations or execute a stored procedure.
The data movement mode depends on whether the Informatica Server should process single-byte or multi-byte character data. This mode selection can affect the enforcement of code page relationships and code page validation in the Informatica Client and Server. a) Unicode - the server allows 2 bytes for each character and uses an additional byte for each non-ASCII character (such as Japanese characters). b) ASCII - the server holds all data in a single byte.
Lookup transformations can access data from relational tables that are not sources in the mapping. With a Lookup transformation, we can accomplish the following tasks: a) get a related value - get the employee name from the employee table based on the employee ID; b) perform a calculation; c) update slowly changing dimension tables - we can use an unconnected lookup transformation to determine whether a record already exists in the target or not.
Conformed dimensions are linked to multiple fact tables.
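To illustrate the custom-query idea, a SQL override in a Source Qualifier is ordinary SQL issued against the source database. The sketch below, using the classic EMP/DEPT tables, is only an assumed example of filtering, outer-joining, sorting and selecting distinct values at the source.

-- assumed SQL override: filter, outer join and sort at the source
SELECT DISTINCT e.empno, e.ename, e.sal, e.deptno, d.dname
FROM   emp e, dept d
WHERE  e.deptno = d.deptno (+)   -- Oracle-style outer join to dept
AND    e.sal > 1000              -- filter rows in the source itself
ORDER BY e.deptno;               -- sorted output (sorted ports)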

A DW is a database used for querying, analysis and reporting. By definition a DW is subject oriented, integrated, non volatile and time variant. Subject Oriented: represents a subject area such as sales or marketing. Integrated: data collected from multiple source systems is integrated into a single, user-readable format, e.g. male/female, 0/1, M/F, T/F. Non Volatile: a DW stores historical data. Time Variant: stores data time-wise, i.e. weekly, monthly, quarterly, yearly.

Reusable transformation: create one if you want a transformation that performs a common task, such as average salary in a department. Mapplet: a reusable object that represents a set of transformations.
Inserts: treats all records as inserts; while inserting, if a record violates a primary or foreign key constraint in the database, the record is rejected.
Designer. Repository Manager.
Use the Debugger in the Designer.
If you select the option to discard target data, then it will not load to the target.
Yes.
Router or Filter.
Index cache: contains the columns used in the condition. Data cache: contains the output columns other than the condition columns.
An indicator file is used for event-based scheduling when you do not know when the source data is available. A shell command, script or batch file creates and sends this indicator file to the directory local to the Informatica Server; the server waits for the indicator file to appear before running the session.
Designer tools: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer. Client tools: Designer, Workflow Manager, Workflow Monitor. Workflow Manager tools: Task Developer, Worklet Designer, Workflow Designer.

It is a window in which you can see all the transformations that are used for a mapping.
The left-hand corner of the transformation has an arrow that looks like it is moving.
Using the Workflow Manager.
The Filter transformation provides the means for filtering records in a mapping. You pass all the rows from a source transformation through the Filter transformation, then enter a filter condition for the transformation. All ports in a Filter transformation are input/output, and only records that meet the condition pass through the Filter transformation. Discarded rows do not appear in the session log or reject files.
Informatica provides the Server Manager and pmcmd programs to communicate with the Informatica Server. Server Manager: a client application used to create and manage sessions and batches, and to monitor and stop the Informatica Server; you can use information provided through the Server Manager to troubleshoot sessions and improve session performance. pmcmd: a command-line program that allows you to start and stop sessions and batches, stop the Informatica Server, and verify if the Informatica Server is running.
The Designer client application provides five tools to help you create mappings. Source Analyzer: use to import or create source definitions for flat file, COBOL, ERP, and relational sources. Warehouse Designer: use to import or create target definitions. Transformation Developer: use to create reusable transformations. Mapplet Designer: use to create mapplets. Mapping Designer: use to create mappings.

Tracing levels in transformations:
Terse - indicates when the Informatica Server initializes the session and its components; summarizes session results, but not at the level of individual records.
Normal - includes initialization information as well as error messages and notification of rejected data.
Verbose initialization - includes all information provided with the Normal setting plus more extensive information about initializing transformations in the session.
Verbose data - includes all information provided with the Verbose initialization setting.
Note: by default, the tracing level for every transformation is Normal. To add a slight performance boost, you can also set the tracing level to Terse, writing the minimum of detail to the session log when running a session containing the transformation.
A mapplet is a reusable object that represents a set of transformations. It allows you to reuse transformation logic and can contain as many transformations as you need. Create a mapplet when you want to use a standardized set of transformation logic in several mappings. For example, if you have several fact tables that require a series of dimension keys, you can create a mapplet containing a series of Lookup transformations to find each dimension key. You can then use the mapplet in each fact table mapping, rather than recreate the same lookup logic in each mapping.

PowerMart and PowerCenter treat a Microsoft Excel source as a relational database, not a flat file. Like relational sources, the Designer uses ODBC to import a Microsoft Excel source. You do not need database permissions to import Microsoft Excel sources. To import an Excel source definition, you need to complete the following tasks: Install the Microsoft Excel ODBC driver on your system. Create a Microsoft Excel ODBC data source for each source file in the ODBC 32-bit Administrator. Prepare Microsoft Excel spreadsheets by defining ranges and formatting columns of numeric data. Import the source definitions in the Designer. Once you define ranges and format cells, you can import the ranges in the Designer. Ranges display as source definitions when you import the source.

A connected Lookup transformation is part of the mapping data flow. With connected lookups, you can have multiple return values. That is, you can pass multiple values from the same row in the lookup table out of the Lookup transformation. Common uses for connected lookups include: => Finding a name based on a number ex. Finding a Dname based on deptno => Finding a value based on a range of dates => Finding a value based on multiple conditions Unconnected Lookups : An unconnected Lookup transformation exists separate from the data flow in the mapping. You write an expression using the :LKP reference qualifier to call the lookup within another transformation. Some common uses for unconnected lookups include: => Testing the results of a lookup in an expression => Filtering records based on the lookup results => Marking records for update based on the result of a lookup (for example, updating slowly changing dimension tables) => Calling the same lookup multiple times in one mapping

A connected Lookup can return multiple values, whereas an unconnected Lookup will return only one value, the return value.
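In SQL terms, a connected or unconnected lookup behaves roughly like fetching a related value with a join or a scalar subquery. The EMP/DEPT sketch below is only an analogy, not Informatica syntax.

-- connected-style lookup: can return several columns per row
SELECT e.empno, d.dname, d.loc
FROM   emp e LEFT OUTER JOIN dept d ON e.deptno = d.deptno;
-- unconnected-style lookup: one return value per call
SELECT e.empno,
       (SELECT d.dname FROM dept d WHERE d.deptno = e.deptno) AS dname
FROM   emp e;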

Using transformations, you can modify data in the following ways (task - transformation):
Calculate a value - Expression
Perform aggregate calculations - Aggregator
Modify text - Expression
Filter records - Filter, Source Qualifier
Order records queried by the Informatica Server - Source Qualifier
Call a stored procedure - Stored Procedure
Call a procedure in a shared library or in the COM layer of Windows NT - External Procedure
Generate primary keys - Sequence Generator
Limit records to a top or bottom range - Rank
Normalize records, including those read from COBOL sources - Normalizer
Look up values - Lookup
Determine whether to insert, delete, update, or reject records - Update Strategy
Join records from different databases or flat file systems - Joiner

Expressions in transformations: to transform data passing through a transformation, you can write an expression. The most obvious examples are the Expression and Aggregator transformations, which perform calculations on either single values or an entire range of values within a port. Transformations that use expressions include the following (transformation - how it uses expressions):
Expression - calculates the result of an expression for each row passing through the transformation, using values from one or more ports.
Aggregator - calculates the result of an aggregate expression, such as a sum or average, based on all data passing through a port or on groups within that data.
Filter - filters records based on a condition you enter using an expression.
Rank - filters the top or bottom range of records, based on a condition you enter using an expression.
Update Strategy - assigns a numeric code to each record based on an expression, indicating whether the Informatica Server should use the information in the record to insert, delete, or update the target.
In each transformation, you use the Expression Editor to enter the expression. The Expression Editor supports the transformation language for building expressions. The transformation language uses SQL-like functions, operators, and other components to build the expression; for example, as in SQL, the transformation language includes functions.
It causes the server to fail/stop the session; you get a fatal error.
The Load Manager is the primary PowerCenter Server process. It accepts requests from the PowerCenter Client and from pmcmd. The Load Manager runs and monitors the workflow. It performs the following tasks: starts the session, creates the DTM process and sends pre- and post-session emails. 1. Manages the session and batch scheduling. 2. Locks the session and reads the session properties. 3. Expands the session and server variables and parameters. 4. Validates the source and target code pages. 5. Verifies the permissions and privileges. 6. Creates the session log file. 7. Creates the DTM process, which executes the session.

Temporary memory area used to store intermediate results

Expression transformation is used to calculate expressions on a row by row basis. Total_sal = Com * sal

S1 should be the master as it contains fewer records, so that cache usage can be reduced; S2 should be the detail.
Yes - write a stored procedure and use a Stored Procedure transformation.
The bad file is the file which contains the data rejected by the writer or target. Record/row indicators 0, 1, 2, 3: 0 - insert, rejected by writer/target; 1 - update, rejected by writer/target; 2 - delete, rejected by writer/target; 3 - reject, rejected by the writer because the update strategy has marked it for reject.

Index cache files and data cache files.
Passive.
A mapping represents the data flow between source and target.
Passive and active.
Each target will get the value in multiples of 3.
ABORT can be used to abort/stop the session on an error condition. If the primary key column contains NULL and you need to stop the session from continuing, you may use the ABORT function in the default value for the port. It can be used with the IIF and DECODE functions to abort the session.
The Load Manager is the first process started when the session runs. It checks the validity of mappings and locks sessions and other objects. The DTM process is started once the Load Manager has completed its job; it starts a thread for each pipeline. The reader scans data from the specified sources. The writer manages the target/output data. The transformer performs the tasks specified in the mapping.

String = NULL, Number = 0, Date = 1/1/1753.
1. Source Qualifier 2. Filter transformation 3. Router transformation 4. Rank 5. Update Strategy

Bottlenecks can occur in:
1. Targets. The most common performance bottleneck occurs when the Informatica server writes to a target database. You can identify a target bottleneck by configuring the session to write to a flat file target; if the session performance increases significantly when you write to a flat file, you have a target bottleneck. Solutions: drop or disable indexes or constraints; perform a bulk load (ignores the database log); increase the commit interval (recovery is compromised); tune the database for rollback segments, dynamic extension etc.
2. Sources. Set a filter transformation after each Source Qualifier that lets no records through; if the time taken is the same, there is a source problem. You can also identify a source problem with a read test session, where you copy the mapping with the sources and Source Qualifiers, remove all transformations and connect to a file target; if the performance is the same, there is a source bottleneck. Using a database query: copy the read query directly from the log and execute it against the source database with a query tool. If the time it takes to execute the query and the time to fetch the first row are significantly different, the query can be modified using optimizer hints. Solutions: optimize queries using hints; use indexes wherever possible.
3. Mapping.
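For the "optimize queries using hints" step, an Oracle hint is simply embedded in the read query. The sketch below is an assumed example; the index name emp_deptno_idx is hypothetical.

-- assumed example of an optimizer hint on the source read query
SELECT /*+ INDEX(e emp_deptno_idx) */ e.empno, e.ename, e.sal
FROM   emp e
WHERE  e.deptno = 20;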

1. Run concurrent sessions. 2. Partition the session (PowerCenter). 3. Tune parameters: DTM buffer pool, buffer block size, index cache size, data cache size, commit interval, tracing level (Normal, Terse, Verbose Init, Verbose Data). The session has memory to hold 83 sources and targets; if there are more, the DTM buffer can be increased. The Informatica server uses the index and data caches for the Aggregator, Rank, Lookup and Joiner transformations; the server stores the transformed data from these transformations in the data cache before returning it to the data flow, and it stores group information for them in the index cache. If the allocated data or index cache is not large enough to store the data, the server stores the data in a temporary disk file as it processes the session data, and each time the server pages to disk the performance slows; this can be seen from the counters. Since the data cache is generally larger than the index cache, it has to be allocated more memory than the index cache. 4. Remove the staging area. 5. Turn off session recovery. 6. Reduce error tracing.

It is available in navigator inside the folder.

Shortcuts allow you to use metadata across folders without making copies, ensuring uniform metadata. A shortcut inherits all properties of the object to which it points. Once you create a shortcut, you can configure the shortcut name and description. When the object the shortcut references changes, the shortcut inherits those changes. By using a shortcut instead of a copy, you ensure each use of the shortcut matches the original object. For example, if you have a shortcut to a target definition, and you add a column to the definition, the shortcut inherits the additional column.
Scenario 1: here is a table with a single row; in a target table the same row should be populated 10 times. Using the Normalizer, we can do it. Hint: set the Normalizer Occurs to 10 and have 10 inputs and one output. You will get 10 rows.

Source name, database location, column names, data types, key constraints.

Two ways 1. Edit the definition 2. Re-import the definition

Two ways 1. Drag the port from another transformation 2. Click the add button on the ports tab.

COBOL source definitions, Joiner transformations, Normalizer transformations, non-reusable Sequence Generator transformations, pre- or post-session stored procedures, target definitions, PowerMart 3.5-style LOOKUP functions, XML source definitions, and IBM MQ source definitions.
A mapping parameter represents a constant value that you can define before running a session. A mapping parameter retains the same value throughout the entire session. When you use a mapping parameter, you declare and use the parameter in a mapping or mapplet, then define the value of the parameter in a parameter file for the session. Unlike a mapping parameter, a mapping variable represents a value that can change throughout the session. The Informatica server saves the value of a mapping variable to the repository at the end of the session run and uses that value the next time you run the session.

No. We can use mapping parameters or variables in any transformation of the same mapping or mapplet in which you have created the mapping parameters or variables.

Yes, because a reusable transformation is not contained within any mapplet or mapping.
Use sorted input.

You can join heterogeneous data sources in a Joiner transformation, which we cannot achieve in a Source Qualifier transformation. You need matching keys to join two relational sources in a Source Qualifier transformation, whereas you do not need matching keys to join two sources in a Joiner. Two relational sources should come from the same data source in a Source Qualifier; with a Joiner you can join relational sources that come from different data sources as well.

You cannot use a Joiner transformation in the following conditions: both pipelines begin with the same original data source; both input pipelines originate from the same Source Qualifier transformation; both input pipelines originate from the same Normalizer transformation; both input pipelines originate from the same Joiner transformation; either input pipeline contains an Update Strategy transformation; either input pipeline contains a connected or unconnected Sequence Generator transformation.
Settings: master and detail source, type of join, and condition of the join.

Normal (default), master outer, detail outer, full outer.
When the Informatica server runs in the ASCII data movement mode it sorts session data using a binary sort order. If you configure the session to use a binary sort order, the Informatica server calculates the binary value of each string and returns the specified number of rows with the highest binary values for the string.

The Designer automatically creates a RANKINDEX port for each Rank transformation. The Informatica Server uses the rank index port to store the ranking position for each record in a group. For example, if you create a Rank transformation that ranks the top 5 salespersons for each quarter, the rank index numbers the salespeople from 1 to 5.
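The ranking the Rank transformation produces is roughly what the SQL below computes with an analytic function. It is only an analogy, assuming a hypothetical sales table with quarter, salesperson and sales_amt columns.

-- top 5 salespersons per quarter, analogous to RANKINDEX 1..5
SELECT quarter, salesperson, sales_amt, rank_index
FROM  (SELECT quarter, salesperson, sales_amt,
              RANK() OVER (PARTITION BY quarter
                           ORDER BY sales_amt DESC) AS rank_index
       FROM sales)
WHERE rank_index <= 5;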

Input groups and output groups.
A Router transformation is similar to a Filter transformation because both transformations allow you to use a condition to test data. However, a Filter transformation tests data for one condition and drops the rows of data that do not meet the condition, while a Router transformation tests data for one or more conditions and gives you the option to route rows of data that do not meet any of the conditions to a default output group. If you need to test the same input data based on multiple conditions, use a Router transformation in a mapping instead of creating multiple Filter transformations to perform the same task.

The Designer copies property information from the input ports of the input group to create a set of output ports for each output group. There are two types of output groups: user-defined groups and the default group. You cannot modify or delete default groups.
For populating and maintaining databases.

Three types of data: input/output parameters, return values, and status codes.

The status code provides error handling for the Informatica server during the session. The stored procedure issues a status code that notifies whether or not the stored procedure completed successfully. This value cannot be seen by the user; it is only used by the Informatica server to determine whether to continue running the session or to stop.

Join data originating from the same source database; filter records when the Informatica server reads source data; specify an outer join rather than the default inner join; specify sorted ports; select only distinct values from the source; create a custom query to issue a special SELECT statement for the Informatica server to read source data.

An inner equi-join.

The two sources should have a primary key / foreign key relationship, and the columns being joined should have matching data types.
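The default join generated by the Source Qualifier is an ordinary inner equi-join on the key columns, along the lines of the assumed EMP/DEPT example below.

-- default inner equi-join on the primary/foreign key columns
SELECT e.empno, e.ename, d.deptno, d.dname
FROM   emp e, dept d
WHERE  e.deptno = d.deptno;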

This transformation is used to maintain history data, or just the most recent changes, in a target table. Within a session: when you configure a session, you can instruct the Informatica Server to either treat all records in the same way (for example, treat all records as inserts) or use instructions coded into the session mapping to flag records for different database operations. Within a mapping: within a mapping, you use the Update Strategy transformation to flag records for insert, delete, update, or reject.
Data driven: the Informatica server follows the instructions coded into the Update Strategy transformations within the session mapping to determine how to flag records for insert, update, delete or reject. If you do not choose the data driven option, the Informatica server ignores all Update Strategy transformations in the mapping.

Insert, Delete, Update (Update as update, Update as insert, Update else insert), Truncate table.
The Designer provides two mapping wizards to help you create mappings quickly and easily. Both wizards are designed to create mappings for loading and maintaining star schemas, a series of dimensions related to a central fact table. Getting Started Wizard: creates mappings to load static fact and dimension tables, as well as slowly growing dimension tables. Slowly Changing Dimensions Wizard: creates mappings to load slowly changing dimension tables based on the amount of historical dimension data you want to keep and the method you choose to handle historical dimension data.

Simple Pass through mapping : Loads a static fact or dimension table by inserting all rows. Use this mapping when you want to drop all existing data from your table before loading new data. Slowly Growing target : Loads a slowly growing fact or dimension table by inserting new rows. Use this mapping to load new data when existing data does not require updates.

Type 1: rows containing changes to existing dimensions are updated in the target by overwriting the existing dimension. In the Type 1 Dimension mapping, all rows contain current dimension data. Use the Type 1 Dimension mapping to update a slowly changing dimension table when you do not need to keep any previous versions of dimensions in the table.
Type 2: the Type 2 Dimension/Version Data mapping inserts both new and changed dimensions into the target. Changes are tracked in the target table by versioning the primary key and creating a version number for each dimension in the table. Use the Type 2 Dimension/Version Data mapping to update a slowly changing dimension table when you want to keep a full history of dimension data in the table. Version numbers and versioned primary keys track the order of changes to each dimension.
Type 3: the Type 3 Dimension mapping filters source rows based on user-defined comparisons and inserts only those found to be new dimensions into the target. Rows containing changes to existing dimensions are updated in the target. When updating an existing dimension, the Informatica Server saves existing data in different columns of the same row and replaces the existing data with the updates.
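As a minimal sketch of Type 2 versioning (assuming a hypothetical customer_dim table; the column names pm_primarykey and pm_version_number are also assumptions), a change to a dimension attribute is handled by inserting a new, versioned row rather than updating the old one.

-- existing row keeps the old address (version 0, assumed column names)
INSERT INTO customer_dim (pm_primarykey, customer_id, address, pm_version_number)
VALUES (1001, 57, 'Old Street 1', 0);
-- the change arrives: a new version of the same customer is inserted
INSERT INTO customer_dim (pm_primarykey, customer_id, address, pm_version_number)
VALUES (1002, 57, 'New Avenue 9', 1);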

Type 2 Dimension/Version Data mapping: in this mapping an updated dimension in the source gets inserted into the target along with a new version number, and a newly added dimension in the source is inserted into the target with a primary key.
Type 2 Dimension/Flag Current mapping: this mapping is also used for slowly changing dimensions. In addition it creates a flag value for a changed or new dimension; the flag indicates whether the dimension is new or newly updated. Recent dimensions are saved with a current flag value of 1, and updated dimensions are saved with the value 0.
Type 2 Dimension/Effective Date Range mapping: this is another flavor of Type 2 mapping used for slowly changing dimensions. This mapping also inserts both new and changed dimensions into the target, and changes are tracked by the effective date range for each version of each dimension.

In the Type 2 mapping we have three options to recognize the newly added rows: version number, flag value, and effective date range.

Load Manager process: starts the session, creates the DTM process, and sends post-session email when the session completes. DTM process: creates threads to initialize the session, read, write and transform data, and handle pre- and post-session operations.
You can use command line arguments for a session or batch. This allows you to change the values of session parameters, mapping parameters and mapping variables.
Parallel data processing: this feature is available for PowerCenter only. If you use the Informatica server on an SMP system, you can use multiple CPUs to process a session concurrently.
Process session data using threads: the Informatica server runs the session in two processes, as explained in the previous question.

Yes. By using Metadata reporter we can generate reports in informatica.

It is a web-based application that enables you to run reports against repository metadata. With the Metadata Reporter, you can access information about your repository without having knowledge of SQL, the transformation language or the underlying tables in the repository.

Mapping: It is a set of source and target definitions linked by transformation objects that define the rules for transformation. Session : It is a set of instructions that describe how and when to move data from source to targets.

The Informatica Server Manager.
Partitioning improves session performance by reducing the time taken to read the source and load the data into the target.

Configure the session to partition source data. Install the informatica server on a machine with multiple CPU's.

For relational sources the Informatica server creates multiple connections, one for each partition of a single source, and extracts a separate range of data for each connection; it reads multiple partitions of a single source concurrently. Similarly, for loading, the Informatica server creates multiple connections to the target and loads partitions of data concurrently. For XML and file sources, the Informatica server reads multiple files concurrently. For loading the data, the Informatica server creates a separate file for each partition of a source file; you can choose to merge the targets.

Each time you edit or schedule a session, the Informatica server communicates directly with the repository to check whether or not the session and users are valid. All the metadata of sessions and mappings is stored in the repository.

Manages the session and batch scheduling: when you start the Informatica server, the Load Manager launches and queries the repository for a list of sessions configured to run on the Informatica server. When you configure a session, the Load Manager maintains the list of sessions and session start times. When you start a session, the Load Manager fetches the session information from the repository to perform the validations and verifications prior to starting the DTM process. Locking and reading the session: when the Informatica server starts a session, the Load Manager locks the session in the repository; locking prevents you from starting the session again and again. Reading the parameter file: if the session uses a parameter file, the Load Manager reads the parameter file and verifies that the session-level parameters are declared in the file. Verifying permissions and privileges: when the session starts, the Load Manager checks whether or not the user has privileges to run the session.

After the Load Manager performs validations for the session, it creates the DTM process. The DTM's job is to create and manage the threads that carry out the session tasks. It creates the master thread, and the master thread creates and manages all the other threads.

Master thread: creates and manages all other threads. Mapping thread: one mapping thread is created for each session; it fetches session and mapping information. Pre- and post-session threads: created to perform pre- and post-session operations. Reader thread: one thread is created for each partition of a source; it reads data from the source. Writer thread: created to load data to the target. Transformation thread: created to transform data.

Data movement modes determine how the Informatica server handles character data. You choose the data movement mode in the Informatica server configuration settings. Two data movement modes are available in Informatica: ASCII mode and Unicode mode.

Informatica server log: the Informatica server (on UNIX) creates a log for all status and error messages (default name: pm.server.log). It also creates an error log for error messages. These files are created in the Informatica home directory.
Session log file: the Informatica server creates a session log file for each session. It writes information about the session into the log file, such as the initialization process, the creation of SQL commands for reader and writer threads, errors encountered and the load summary. The amount of detail in the session log file depends on the tracing level that you set.
Session detail file: this file contains load statistics for each target in the mapping. Session details include information such as the table name and the number of rows written or rejected. You can view this file by double-clicking on the session in the monitor window.
Performance detail file: this file contains information known as session performance details, which helps you see where performance can be improved. To generate this file, select the performance detail option in the session property sheet.
Reject file: this file contains the rows of data that the writer does not write to targets.
Control file: the Informatica server creates a control file and a target file when you run a session that uses the external loader. The control file contains information about the target flat file, such as the data format and loading instructions for the external loader.
Post-session email: post-session email allows you to automatically communicate information about a session run to designated recipients. You can create two different messages, one for when the session completes successfully and one for when it fails.
The Informatica server creates reject files when it encounters DD_Reject in an Update Strategy transformation, when a row violates a database constraint, or when a field in a row is truncated or overflows.
Polling displays the updated information about the session in the monitor window. The monitor window displays the status of each session when you poll the Informatica server.

Yes. By using the copy session wizard you can copy a session to a different folder or repository, but that target folder or repository should contain the mapping of that session. If the target folder or repository does not have the mapping of the session being copied, you have to copy that mapping first, before you copy the session.
A grouping of sessions is known as a batch. Batches are of two types: sequential (runs sessions one after the other) and concurrent (runs sessions at the same time). If you have sessions with source-target dependencies, you have to use a sequential batch to start the sessions one after another; if you have several independent sessions, you can use a concurrent batch, which runs all the sessions at the same time.

No.
Any number of sessions.

When one of the sessions is configured to "run if previous completes" and that previous session fails.
pmcmd is used to run a batch.
Two options: run the session only if the previous session completes successfully, or always run the session.

Yes, by setting the option "Always runs the session".

You cannot. If you want to start a batch that resides in a batch, create a new independent batch and copy the necessary sessions into the new batch.
We can start a required session individually only in the case of a sequential batch; in the case of a concurrent batch we cannot do this.
By using the Server Manager or pmcmd.

Session parameters are like mapping parameters; they represent values you might want to change between sessions, such as database connections or source files. The Server Manager also allows you to create user-defined session parameters. The following are user-defined session parameters: database connections; source file names (use this parameter when you want to change the name or location of a session source file between session runs); target file names (use this parameter when you want to change the name or location of a session target file between session runs); reject file names (use this parameter when you want to change the name or location of session reject files between session runs).

A parameter file is used to define the values for parameters and variables used in a session. A parameter file is a file created with a text editor such as WordPad or Notepad. You can define the following values in a parameter file: mapping parameters, mapping variables and session parameters.

Relational source: to access a relational source that is situated in a remote place, you need to configure a database connection to the data source. File source: to access a remote source file you must configure the FTP connection to the host machine before you create the session. Heterogeneous: when your mapping contains more than one source type, the Server Manager creates a heterogeneous session that displays source options for all types.

If you partition a session with a relational target, the Informatica server creates multiple connections to the target database to write target data concurrently. If you partition a session with a file target, the Informatica server creates one target file for each partition; you can configure session properties to merge these target files.

Advanced External Procedure transformation and External Procedure transformation: these transformations contain a check box on the Properties tab to allow partitioning. Aggregator transformation: if you use sorted ports you cannot partition the associated source. Joiner transformation: you cannot partition the master source for a Joiner transformation. Normalizer transformation. XML targets.

The goal of performance tuning is to optimize session performance so sessions run during the available load window for the Informatica Server. Increase session performance as follows. The performance of the Informatica Server is related to network connections: data generally moves across a network at less than 1 MB per second, whereas a local disk moves data five to twenty times faster, so network connections often affect session performance; avoid unnecessary network connections. Flat files: if your flat files are stored on a machine other than the Informatica server, move those files to the machine that hosts the Informatica server. Relational data sources: minimize the connections to sources, targets and the Informatica server to improve session performance; moving the target database onto the server system may improve session performance. Staging areas: if you use staging areas you force the Informatica server to perform multiple data passes; removing staging areas may improve session performance. You can run multiple Informatica servers against the same repository; distributing the session load to multiple Informatica servers may improve session performance. Running the Informatica server in ASCII data movement mode improves session performance, because ASCII data movement mode stores a character value in one byte while Unicode mode takes 2 bytes to store a character.

The Informatica repository is a relational database that stores information, or metadata, used by the Informatica Server and Client tools. Metadata can include information such as mappings describing how to transform source data, sessions indicating when you want the Informatica Server to perform the transformations, and connect strings for sources and targets. The repository also stores administrative information such as usernames and passwords, permissions and privileges, and product version. Use the Repository Manager to create the repository. The Repository Manager connects to the repository database and runs the code needed to create the repository tables. These tables store metadata in a specific format that the Informatica server and client tools use.

The following are the types of metadata stored in the repository: database connections, global objects, mappings, mapplets, multidimensional metadata, reusable transformations, sessions and batches, shortcuts, source definitions, target definitions, and transformations.

The PowerCenter repository allows you to share metadata across repositories to create a data mart domain. In a data mart domain, you can create a single global repository to store metadata used across an enterprise, and a number of local repositories to share the global metadata as needed.
To work with a remote data source you need to connect to it with remote connections. However, it is not preferable to work with that remote source directly by using remote connections; instead, bring that source to the local machine where the Informatica server resides. If you work directly with a remote source, session performance decreases because only a limited amount of data can be passed across the network in a given time.

When using incremental aggregation, you apply captured changes in the source to aggregate calculations in a session. If the source changes only incrementally and you can capture changes, you can configure the session to process only those changes. This allows the Informatica Server to update your target incrementally, rather than forcing it to process the entire source and recalculate the same calculations each time you run the session.
You can schedule a session to run at a given time or interval, or you can run the session manually. The different scheduling options are: run only on demand (the Informatica server runs the session only when the user starts the session explicitly); run once (the Informatica server runs the session only once at a specified date and time); run every (the Informatica server runs the session at regular intervals, as configured); customized repeat (the Informatica server runs the session at the dates and times specified in the repeat dialog box).

The tracing level represents the amount of information that the Informatica server writes to a log file. The types of tracing level are Normal, Terse, Verbose Initialization and Verbose Data.
In the case of a Stored Procedure transformation, the procedure is compiled and executed in a relational data source; you need a database connection to import the stored procedure into your mapping. In an External Procedure transformation, the procedure or function is executed outside the data source, i.e. you need to build it as a DLL to access it in your mapping; no database connection is needed in the case of the External Procedure transformation.

If you stop a session or if an error causes a session to stop, refer to the session and error logs to determine the cause of failure. Correct the errors, and then complete the session. The method you use to complete the session depends on the properties of the mapping, session, and Informatica Server configuration. Use one of the following methods to complete the session: Run the session again if the Informatica Server has not issued a commit. Truncate the target tables and run the session again if the session is not recoverable. Consider performing recovery if the Informatica Server has issued at least one commit. As explained above informatica server has 3 methods to recovering the sessions. Use performing recovery to load the records from where the session fails.

When the Informatica Server starts a recovery session, it reads the OPB_SRVR_RECOVERY table and notes the row ID of the last row committed to the target database. The Informatica Server then reads all sources again and starts processing from the next row ID. For example, if the Informatica Server commits 10,000 rows before the session fails, when you run recovery, the Informatica Server bypasses the rows up to 10,000 and starts loading with row 10,001. By default, Perform Recovery is disabled in the Informatica Server setup. You must enable Recovery in the Informatica Server setup before you run a session so the Informatica Server can create and/or write entries in the OPB_SRVR_RECOVERY table.

A standalone session is a session that is not nested in a batch. If a standalone session fails, you can run recovery using a menu command or pmcmd. These options are not available for batched sessions. To recover sessions using the menu: 1. In the Server Manager, highlight the session you want to recover. 2. Select Server Requests-Stop from the menu. 3. With the failed session highlighted, select Server Requests-Start Session in Recovery Mode from the menu. To recover sessions using pmcmd: 1. From the command line, stop the session. 2. From the command line, start recovery.

If you configure a session in a sequential batch to stop on failure, you can run recovery starting with the failed session. The Informatica Server completes the session and then runs the rest of the batch. Use the Perform Recovery session property To recover sessions in sequential batches configured to stop on failure: 1.In the Server Manager, open the session property sheet. 2.On the Log Files tab, select Perform Recovery, and click OK. 3.Run the session. 4.After the batch completes, open the session property sheet. 5.Clear Perform Recovery, and click OK. If you do not clear Perform Recovery, the next time you run the session, the Informatica Server attempts to recover the previous session. If you do not configure a session in a sequential batch to stop on failure, and the remaining sessions in the batch complete, recover the failed session as a standalone session.

If multiple sessions in a concurrent batch fail, you might want to truncate all targets and run the batch again. However, if a session in a concurrent batch fails and the rest of the sessions complete successfully, you can recover the session as a standalone session. To recover a session in a concurrent batch: 1.Copy the failed session using Operations-Copy Session. 2.Drag the copied session outside the batch to be a standalone session. 3.Follow the steps to recover a standalone session. 4.Delete the standalone copy.

Under certain circumstances, when a session does not complete, you need to truncate the target tables and run the session from the beginning. Run the session from the beginning when the Informatica Server cannot run recovery or when running recovery might result in inconsistent data.

The Source Qualifier transformation does not use sorted ports; you change the partition information after the initial session fails; perform recovery is disabled in the Informatica server configuration; the sources or targets change after the initial session fails; the mapping consists of a Sequence Generator or Normalizer transformation; a concurrent batch contains multiple failed sessions.

No. Informatica is not at all concerned with the back-end database; it displays all the information that is stored in the repository. If you want back-end changes to be reflected in the Informatica screens, you have to import from the back end into Informatica again using a valid connection, and you have to replace the existing definitions with the imported ones.

No. Unless and until you join those three ports in a source qualifier, you cannot map them directly.
Server variables: 1. $PMRootDir 2. $PMSessionLogDir 3. $PMBadFileDir 4. $PMCacheDir 5. $PMTargetFileDir 6. $PMSourceFileDir 7. $PMExtProcDir 8. $PMTempDir 9. $PMSuccessEmailUser 10. $PMFailureEmailUser 11. $PMSessionLogCount 12. $PMSessionErrorThreshold 13. $PMWorkflowLogDir 14. $PMWorkflowLogCount 15. $PMLookupFileDir

Folders provide a way to organize and store all metadata in the repository, including mappings, schemas, and sessions. Folders are designed to be flexible, to help you logically organize the repository. Each folder has a set of configurable properties that help you define how users access the folder. For example, you can create a folder that allows all repository users to see objects within the folder, but not to edit them. Or, you can create a folder that allows users to share objects within the folder. You can create shared and non-shared folders. Note: You must create a folder in a repository before you can connect to the repository using the Designer or Workflow Manager. You create and maintain folders in the Repository Manager. In the Repository Manager, you can view the users accessing folders and the objects within each folder. You can also copy and replace folders within the repository or across repositories.

Register multiple PowerCenter Servers. When you register multiple PowerCenter Servers, choose the PowerCenter Server for a workflow in the workflow properties.

