
Skill Set Inventory
Data Quality 9.x: Developer, Specialist Certification

About the Informatica Certified Professional (ICP) Program
Informatica certification is a two-stage structure that correlates professional certification with the what, why, and how of Informatica implementations:

Specialist
To achieve Specialist recognition, a candidate must pass a written exam, proctored in person or via webcam. The Specialist exam validates that the individual understands the product and can apply the skills and capabilities required to contribute to a project as a full team member.

Expert
To reach the Expert level, an Informatica Certified Specialist must pass the Informatica Velocity Best Practices and Implementation Methodologies Certification. This certification validates that you are able to lead a project implementation team following our best practices.

Our new exams have been developed to define product competency by job role, where demonstrated performance and outcomes validate Informatica implementation skills.

About the ICS Data Quality Developer Exam and the Skill Set Inventory
This exam measures your competency as a member of a project implementation team. This involves having an in-depth knowledge of each of the Data Quality processes, from Profiling to Standardization, Matching, and Consolidation. You need to be able to select and configure the appropriate Data Quality transformation for your requirements; build, debug, and execute Data Quality mappings; and integrate mappings into PowerCenter.

The skill set inventory is used to guide your preparation before taking the exam. It is an outline of the technical topics and subject areas that are covered in each exam. The skill set inventory includes test domain weighting, test objectives, and topical content. The topics and concepts are included to clarify the test objectives.

Informatica University, Copyright Informatica Corporation, February 12, 2013

Test takers will be tested on:
- Be able to navigate through the Developer Tool
- Use project collaboration techniques: shared Tags, Comments, Profiles, and reference tables to share information with Analysts
- Use a variety of profiling methods to profile data
- Use Analyst-built profiles to develop mappings
- Build mappings and mapplets to cleanse and standardize data
- Perform Address Validation
- Identify duplicate records in a dataset
- Automatically and manually consolidate duplicate records to create a master record
- Execute mappings/mapplets in PowerCenter
- Use Data Quality mappings in an Excel spreadsheet (connect to Web Services)

Training Prerequisites
The skills and knowledge areas measured by this exam focus on core product functionality within the scope of a standard project implementation. Training materials, supporting documentation, and practical experience may serve as sources of question development. The suggested training prerequisite for this certification level is the completion of the following Informatica course(s):

Informatica Data Quality 9.1 Developer & Analyst training.

Exam Test Domains

The test domains, and the extent to which each is represented as an estimated percentage of the exam, are as follows:

Test Domain                     % of Exam
Informatica Overview            10%
Analyst Collaboration           10%
Profiling                       15%
Standardization/Mapplets        10%
Address Validation               5%
Matching                        10%
Consolidation and the DQA       10%
Integration with PowerCenter     5%
Object Import and Export         5%
DQ for Excel                     5%
Parameters                       5%
Content                         10%


Question Format
You may select from one or more response offerings to answer a question. A question is scored as correct if your response accurately completes the statement or answers the question. Incorrect options (distractors) are included as plausible answers, so that candidates without the required skills and experience may select them.

A passing grade of 70% is needed to achieve recognition as an Informatica Certified Specialist (ICS) Data Quality Developer v9.x. You are given 90 minutes to complete the exam.

Test formats used in this exam are:
- Multiple Choice: Select the one option that best answers the question or completes the statement
- Multiple Response: Select all options that apply to best answer the question or complete the statement
- True/False: After reading the statement or question, select the best answer

Exam Policy
- You are eligible for one attempt per exam registration
- If you do not pass on your first attempt, you must wait 2 weeks before taking the exam again
- You may take the exam up to three times in the year from the date of your first exam attempt
- You must pay the full exam fee each time you take the exam

Test Topics
The exam contains 70 questions covering the sections listed below. To ensure that you are prepared for the test, review the subtopics associated with each section.

Informatica Overview
- Be able to describe the Informatica 9 architecture and setup, including the repositories required for installation
- Be able to provide information on the Data Quality process and the dimensions of data quality

Analyst Collaboration
- Be able to use and describe the functionality of the Analyst Tool, including Scorecarding, Reference Table Management, Tags, Filters, Profiles, and Comments
- Be able to describe how Analysts and Developers can collaborate on projects
- Describe the benefits of project collaboration to a team


Profiling
- Be able to perform and interpret column, rule, comparative, and mid-stream profiling
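The idea behind column profiling can be illustrated outside the product. The sketch below is plain Python, not Informatica's implementation; the function name and pattern symbols (9 for digit, X for letter) are illustrative assumptions, but they mirror the statistics a column profile typically reports: null percentage, distinct count, value frequencies, and character patterns.

```python
from collections import Counter

def profile_column(values):
    """Compute basic column-profile statistics: null percentage, distinct
    count, value frequencies, and character patterns (9=digit, X=letter)."""
    total = len(values)
    nulls = sum(1 for v in values if v is None or v == "")
    non_null = [v for v in values if v not in (None, "")]

    def pattern(v):
        # Map each character to a pattern symbol, as column profilers do.
        return "".join("9" if c.isdigit() else "X" if c.isalpha() else c
                       for c in v)

    return {
        "null_pct": 100.0 * nulls / total if total else 0.0,
        "distinct": len(set(non_null)),
        "frequencies": Counter(non_null),
        "patterns": Counter(pattern(v) for v in non_null),
    }

stats = profile_column(["60614", "60614", "6061A", "", None])
print(stats["null_pct"])   # 40.0
print(stats["patterns"])   # Counter({'99999': 2, '9999X': 1})
```

A pattern count like the one above is what lets a profiler flag `6061A` as a likely data error: its pattern differs from the dominant one in the column.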

Standardization/Mapplets
- Be aware of where data standardization fits in the DQ process
- Apply, configure, and troubleshoot data standardization transformations
- Be able to differentiate between the various parsing techniques available
- Recognize how reference tables are used in the standardization process
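Reference-table-driven standardization can be sketched generically. The snippet below is plain Python, not an Informatica transformation; the table contents and function name are illustrative. It shows the core mechanism: each raw variant is looked up in a reference table and replaced with its standard form, while unmatched tokens pass through unchanged.

```python
# Illustrative reference table mapping raw variants to a standard form,
# analogous to the reference tables a standardization transformation uses.
STREET_TYPES = {
    "st": "Street", "st.": "Street", "street": "Street",
    "ave": "Avenue", "ave.": "Avenue", "av": "Avenue",
    "rd": "Road", "rd.": "Road",
}

def standardize_address_line(line):
    """Token-by-token lookup against the reference table; tokens with no
    entry in the table pass through unchanged."""
    return " ".join(STREET_TYPES.get(token.lower(), token)
                    for token in line.split())

print(standardize_address_line("123 Main st."))   # 123 Main Street
print(standardize_address_line("9 Elm Ave"))      # 9 Elm Avenue
```

Keeping the variants-to-standard mapping in a shared reference table (rather than in code) is what allows Analysts to maintain it while Developers reuse it across mappings.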

Address Validation
- Recognize the importance of Address Validation
- Configure the Address Validation transformation, including explaining each mode that is available and what custom templates can be used for
- Interpret the AV outputs generated

Matching
- Know how to develop and build match plans to identify duplicate or related data
- Be able to differentiate between the matching algorithms available and explain how matching scores are calculated
- Know how to configure the Identity Matching option for the Match transformation
- Explain populations, the identity match strategies that are available, and how they are used in Identity Matching
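The role of grouping in field matching can be illustrated with a generic sketch. This is plain Python using a simple sequence-similarity score, not Informatica's matching algorithms, and the function names are illustrative. The key behavior it demonstrates: record pairs are scored only within a group, so records whose group key differs are never compared — which is why the group key must be complete and accurate.

```python
from difflib import SequenceMatcher
from itertools import combinations

def match_score(a, b):
    """Similarity in [0, 1]; a stand-in for a real matching algorithm."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_within_groups(records, group_key, field, threshold=0.85):
    """Group records on a key, then score record pairs only inside each
    group -- pairs that fall in different groups are never compared."""
    groups = {}
    for rec in records:
        groups.setdefault(rec[group_key], []).append(rec)
    pairs = []
    for members in groups.values():
        for a, b in combinations(members, 2):
            score = match_score(a[field], b[field])
            if score >= threshold:
                pairs.append((a["id"], b["id"], round(score, 2)))
    return pairs

records = [
    {"id": 1, "zip": "60614", "name": "John Smith"},
    {"id": 2, "zip": "60614", "name": "Jon Smith"},
    {"id": 3, "zip": "90210", "name": "John Smith"},  # different group: never compared
]
print(match_within_groups(records, "zip", "name"))
```

Records 1 and 2 share a group and score above the threshold, so they form a match pair; record 3 has an identical name but a different group key, so it is never scored — grouping on an inaccurate or incomplete field would hide real duplicates the same way.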

Consolidation and the DQA

- Explain Automatic and Manual consolidation techniques and configure the transformations used for automatic consolidation
- Use the Exception transformation to generate and populate the Bad Records and Duplicate Records tables
- Troubleshoot any problems that may occur
- Be able to generate a survivor record in the DQA
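Automatic consolidation can be sketched generically. The survivorship rule below (keep the most frequent non-empty value per field) is one common strategy chosen for illustration, not Informatica's exact implementation, and the function name is an assumption.

```python
from collections import Counter

def build_survivor(duplicates):
    """Build a master (survivor) record from a cluster of duplicates:
    for each field, keep the most frequent non-empty value."""
    survivor = {}
    for field in duplicates[0].keys():
        values = [rec[field] for rec in duplicates if rec[field]]
        survivor[field] = Counter(values).most_common(1)[0][0] if values else ""
    return survivor

cluster = [
    {"name": "John Smith", "phone": "",         "city": "Chicago"},
    {"name": "J. Smith",   "phone": "555-0100", "city": "Chicago"},
    {"name": "John Smith", "phone": "555-0100", "city": ""},
]
print(build_survivor(cluster))
# {'name': 'John Smith', 'phone': '555-0100', 'city': 'Chicago'}
```

Note how the survivor is more complete than any single input record: each field is filled from whichever duplicates supplied a value, which is the point of automatic consolidation.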


- Remove duplicates using the DQA

Integration with PowerCenter

- Explain how to integrate IDQ mappings and mapplets into PowerCenter Workflows, including how content is handled

Object Import and Export

- Describe the difference between the Basic and Advanced Import options available
- Explain dependency conflict resolution and how it is handled
- Describe how to export a Project

DQ for Excel
- Be able to describe the requirements for integration with Excel
- Explain the techniques for creating mappings for DQ for Excel
- Explain the Web service capabilities required for DQ for Excel
- Explain how to use the Informatica ribbon in Excel

Parameters
- Explain what parameters are and why they are used, including which parameter functions are supported
- Identify which transformations can use parameters
- Describe the process of exporting mappings with parameters built in

Content
- Explain what Content is, what is contained in the Core Accelerator, and why it is used


Sample Test Questions

Scorecarding is performed on which of the following?
A. Per record basis
B. Per column/attribute basis
C. Per table basis
D. Per database basis

Select all correct statements for IDQ grouping and matching:

A. IDQ field level matching does not utilize grouping
B. When field level matching is performed, the records within each group will be compared against each other
C. When field level matching is performed, matching will be performed across multiple groups in a single match transformation
D. When field level matching is performed, matching will not be performed across groups, therefore it is imperative grouping is performed on a complete and accurate field(s)

What types of profiling can be performed in the Developer Tool in Data Quality?
A. Column Profiling, Primary Key Inference, Dependency Inference
B. Column and Join Profiling only
C. Column, Join Analysis, Mid Stream, Comparative Profiling
D. Column, Join Analysis, Mid Stream, Comparative, Primary and Foreign Key and Overlap Profiling

Data Consolidation manages the following processes (select all that apply):
A. Identifies duplicates within a data set
B. Merging or linking duplicate or related records
C. Removing duplicates from a dataset
D. Replacing inaccurate data


Select the architecture description that best describes IDQ 9.1

A. The architecture is completely server based; there is no client. Both Developer and Analyst users access IDQ through a web based interface
B. The architecture is both client and server based. The Data Integration, Analyst, and Model Repository services all run on the server. Developer is the client application
C. IDQ has both a client and server repository. Production ready data quality mappings are transferred from the client repository to the server repository
D. IDQ does not have a repository. All rule and mapping work is stored as XML files in the server directory structure
