
Informatica (Version 9.5.0)

Release Guide

Informatica Release Guide
Version 9.5.0
June 2012

Copyright (c) 1998-2012 Informatica. All rights reserved. This software and documentation contain proprietary information of Informatica Corporation and are provided under a license agreement containing restrictions on use and disclosure and are also protected by copyright law. Reverse engineering of the software is prohibited. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica Corporation. This Software may be protected by U.S. and/or international Patents and other Patents Pending. Use, duplication, or disclosure of the Software by the U.S. Government is subject to the restrictions set forth in the applicable software license agreement and as provided in DFARS 227.7202-1(a) and 227.7202-3(a) (1995), DFARS 252.227-7013(c)(1)(ii) (OCT 1988), FAR 12.212(a) (1995), FAR 52.227-19, or FAR 52.227-14 (ALT III), as applicable. The information in this product or documentation is subject to change without notice. If you find any problems in this product or documentation, please report them to us in writing. Informatica, Informatica Platform, Informatica Data Services, PowerCenter, PowerCenterRT, PowerCenter Connect, PowerCenter Data Analyzer, PowerExchange, PowerMart, Metadata Manager, Informatica Data Quality, Informatica Data Explorer, Informatica B2B Data Transformation, Informatica B2B Data Exchange, Informatica On Demand, Informatica Identity Resolution, Informatica Application Information Lifecycle Management, Informatica Complex Event Processing, Ultra Messaging, and Informatica Master Data Management are trademarks or registered trademarks of Informatica Corporation in the United States and in jurisdictions throughout the world. All other company and product names may be trade names or trademarks of their respective owners.
Portions of this software and/or documentation are subject to copyright held by third parties, including without limitation: Copyright DataDirect Technologies. All rights reserved. Copyright Sun Microsystems. All rights reserved. Copyright RSA Security Inc. All Rights Reserved. Copyright Ordinal Technology Corp. All rights reserved. Copyright Aandacht c.v. All rights reserved. Copyright Genivia, Inc. All rights reserved. Copyright Isomorphic Software. All rights reserved. Copyright Meta Integration Technology, Inc. All rights reserved. Copyright Intalio. All rights reserved. Copyright Oracle. All rights reserved. Copyright Adobe Systems Incorporated. All rights reserved. Copyright DataArt, Inc. All rights reserved. Copyright ComponentSource. All rights reserved. Copyright Microsoft Corporation. All rights reserved. Copyright Rogue Wave Software, Inc. All rights reserved. Copyright Teradata Corporation. All rights reserved. Copyright Yahoo! Inc. All rights reserved. Copyright Glyph & Cog, LLC. All rights reserved. Copyright Thinkmap, Inc. All rights reserved. Copyright Clearpace Software Limited. All rights reserved. Copyright Information Builders, Inc. All rights reserved. Copyright OSS Nokalva, Inc. All rights reserved. Copyright Edifecs, Inc. All rights reserved. Copyright Cleo Communications, Inc. All rights reserved. Copyright International Organization for Standardization 1986. All rights reserved. Copyright ej-technologies GmbH. All rights reserved. Copyright Jaspersoft Corporation. All rights reserved. Copyright International Business Machines Corporation. All rights reserved. Copyright yWorks GmbH. All rights reserved. Copyright Lucent Technologies 1997. All rights reserved. Copyright (c) 1986 by University of Toronto. All rights reserved. Copyright 1998-2003 Daniel Veillard. All rights reserved. Copyright 2001-2004 Unicode, Inc. Copyright 1994-1999 IBM Corp. All rights reserved. Copyright MicroQuill Software Publishing, Inc. All rights reserved.
Copyright PassMark Software Pty Ltd. All rights reserved. This product includes software developed by the Apache Software Foundation (http://www.apache.org/), and other software which is licensed under the Apache License, Version 2.0 (the "License"). You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0. Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. This product includes software which was developed by Mozilla (http://www.mozilla.org/), software copyright The JBoss Group, LLC, all rights reserved; software copyright 1999-2006 by Bruno Lowagie and Paulo Soares and other software which is licensed under the GNU Lesser General Public License Agreement, which may be found at http://www.gnu.org/licenses/lgpl.html. The materials are provided free of charge by Informatica, "as-is", without warranty of any kind, either express or implied, including but not limited to the implied warranties of merchantability and fitness for a particular purpose. The product includes ACE(TM) and TAO(TM) software copyrighted by Douglas C. Schmidt and his research group at Washington University, University of California, Irvine, and Vanderbilt University, Copyright (c) 1993-2006, all rights reserved. This product includes software developed by the OpenSSL Project for use in the OpenSSL Toolkit (copyright The OpenSSL Project. All Rights Reserved) and redistribution of this software is subject to terms available at http://www.openssl.org and http://www.openssl.org/source/license.html. This product includes Curl software which is Copyright 1996-2007, Daniel Stenberg, <daniel@haxx.se>. All Rights Reserved.
Permissions and limitations regarding this software are subject to terms available at http://curl.haxx.se/docs/copyright.html. Permission to use, copy, modify, and distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. The product includes software copyright (c) 2001-2005 MetaStuff, Ltd. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://www.dom4j.org/license.html. The product includes software copyright 2004-2007, The Dojo Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://dojotoolkit.org/license. This product includes ICU software which is copyright International Business Machines Corporation and others. All rights reserved. Permissions and limitations regarding this software are subject to terms available at http://source.icu-project.org/repos/icu/icu/trunk/license.html. This product includes software copyright 1996-2006 Per Bothner. All rights reserved. Your right to use such materials is set forth in the license which may be found at http://www.gnu.org/software/kawa/Software-License.html. This product includes OSSP UUID software which is Copyright 2002 Ralf S. Engelschall, Copyright 2002 The OSSP Project, Copyright 2002 Cable & Wireless Deutschland. Permissions and limitations regarding this software are subject to terms available at http://www.opensource.org/licenses/mit-license.php. This product includes software developed by Boost (http://www.boost.org/) or under the Boost software license. Permissions and limitations regarding this software are subject to terms available at http://www.boost.org/LICENSE_1_0.txt. This product includes software copyright 1997-2007 University of Cambridge. Permissions and limitations regarding this software are subject to terms available at http://www.pcre.org/license.txt.
This product includes software copyright 2007 The Eclipse Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://www.eclipse.org/org/documents/epl-v10.php. This product includes software licensed under the terms at http://www.tcl.tk/software/tcltk/license.html; http://www.bosrup.com/web/overlib/?License; http://www.stlport.org/doc/license.html; http://www.asm.ow2.org/license.html; http://www.cryptix.org/LICENSE.TXT; http://hsqldb.org/web/hsqlLicense.html; http://httpunit.sourceforge.net/doc/license.html; http://jung.sourceforge.net/license.txt; http://www.gzip.org/zlib/zlib_license.html; http://www.openldap.org/software/release/license.html; http://www.libssh2.org; http://slf4j.org/license.html; http://www.sente.ch/software/OpenSourceLicense.html; http://fusesource.com/downloads/license-agreements/fuse-message-broker-v-5-3-license-agreement; http://antlr.org/license.html; http://aopalliance.sourceforge.net/; http://www.bouncycastle.org/licence.html; http://www.jgraph.com/jgraphdownload.html; http://www.jcraft.com/jsch/LICENSE.txt; http://jotm.objectweb.org/bsd_license.html; http://www.w3.org/Consortium/Legal/2002/copyright-software-20021231; http://developer.apple.com/library/mac/#samplecode/HelpHook/Listings/HelpHook_java.html; http://nanoxml.sourceforge.net/orig/copyright.html; http://www.json.org/license.html; http://forge.ow2.org/projects/javaservice/; http://www.postgresql.org/about/licence.html; http://www.sqlite.org/copyright.html; http://www.jaxen.org/faq.html; http://www.jdom.org/docs/faq.html; http://www.iodbc.org/dataspace/iodbc/wiki/iODBC/License; http://www.keplerproject.org/md5/license.html; http://www.toedter.com/en/jcalendar/license.html; http://www.edankert.com/bounce/index.html; http://www.net-snmp.org/about/license.html; http://www.openmdx.org/#FAQ; http://www.php.net/license/3_01.txt; http://srp.stanford.edu/license.txt; http://www.schneier.com/blowfish.html; http://www.jmock.org/license.html; and http://xsom.java.net/. This product includes software licensed under the Academic Free License (http://www.opensource.org/licenses/afl-3.0.php), the Common Development and Distribution License (http://www.opensource.org/licenses/cddl1.php), the Common Public License (http://www.opensource.org/licenses/cpl1.0.php), the Sun Binary Code License Agreement Supplemental License Terms, the BSD License (http://www.opensource.org/licenses/bsd-license.php), the MIT License (http://www.opensource.org/licenses/mit-license.php), and the Artistic License (http://www.opensource.org/licenses/artistic-license-1.0). This product includes software copyright 2003-2006 Joe Walnes, 2006-2007 XStream Committers. All rights reserved. Permissions and limitations regarding this software are subject to terms available at http://xstream.codehaus.org/license.html. This product includes software developed by the Indiana University Extreme! Lab. For further information please visit http://www.extreme.indiana.edu/. This Software is protected by U.S. Patent Numbers 5,794,246; 6,014,670; 6,016,501; 6,029,178; 6,032,158; 6,035,307; 6,044,374; 6,092,086; 6,208,990; 6,339,775; 6,640,226; 6,789,096; 6,820,077; 6,823,373; 6,850,947; 6,895,471; 7,117,215; 7,162,643; 7,243,110; 7,254,590; 7,281,001; 7,421,458; 7,496,588; 7,523,121; 7,584,422; 7,676,516; 7,720,842; 7,721,270; and 7,774,791, international Patents and other Patents Pending.
DISCLAIMER: Informatica Corporation provides this documentation "as is" without warranty of any kind, either express or implied, including, but not limited to, the implied warranties of noninfringement, merchantability, or use for a particular purpose. Informatica Corporation does not warrant that this software or documentation is error free. The information provided in this software or documentation may include technical inaccuracies or typographical errors. The information in this software and documentation is subject to change at any time without notice.

NOTICES

This Informatica product (the "Software") includes certain drivers (the "DataDirect Drivers") from DataDirect Technologies, an operating company of Progress Software Corporation ("DataDirect") which are subject to the following terms and conditions:

1. THE DATADIRECT DRIVERS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.

2. IN NO EVENT WILL DATADIRECT OR ITS THIRD PARTY SUPPLIERS BE LIABLE TO THE END-USER CUSTOMER FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF THE USE OF THE ODBC DRIVERS, WHETHER OR NOT INFORMED OF THE POSSIBILITIES OF DAMAGES IN ADVANCE. THESE LIMITATIONS APPLY TO ALL CAUSES OF ACTION, INCLUDING, WITHOUT LIMITATION, BREACH OF CONTRACT, BREACH OF WARRANTY, NEGLIGENCE, STRICT LIABILITY, MISREPRESENTATION AND OTHER TORTS.

Part Number: IN-REL-95000-0001

Table of Contents
Preface

Informatica Resources
    Informatica Customer Portal
    Informatica Documentation
    Informatica Web Site
    Informatica How-To Library
    Informatica Knowledge Base
    Informatica Multimedia Knowledge Base
    Informatica Global Customer Support

Part I: Version 9.5.0

Chapter 1: New Features and Enhancements (9.5.0)


Version 9.5.0
    Informatica Installer
    Informatica Data Explorer
    Informatica Data Quality
    Informatica Data Services
    Informatica Data Transformation
    Informatica Domain
    Command Line Programs
    PowerCenter
    Metadata Manager
    PowerExchange Adapters
    Documentation

Chapter 2: Informatica Data Explorer (9.5.0)


Edit Profile Action Menus
Foreign Key Discovery
Projects
Scorecards

Chapter 3: Informatica Data Quality (9.5.0)


Address Validator Transformation
Edit Profile Action Menus
Export to PowerCenter
File Directory Fields for Mapping Objects
Exception Transformation


Mapping and Mapplet Editors
Match Transformation
Projects
Reference Tables
Scorecards

Chapter 4: Informatica Data Services (9.5.0)


Edit Profile Action Menus
Export to PowerCenter
File Directory Fields for Mapping Objects
Flat File Data Object
Mapping and Mapplet Editors
Projects
Scorecards
Web Service Consumer Transformation
Web Services
    Fault Transformation
    Fault Terminology
    Manual Web Services
    SOAP 1.2

Chapter 5: Informatica Data Transformation (9.5.0)


Data Transformation Platform
Deprecated Script Components
IntelliScript Editor
Model Repository
Obsolete Script Components and Options
Script Objects
Transformation
Views
XML Validation

Chapter 6: Informatica Domain (9.5.0)


Connection Management
Content Management Service
Data Integration Service
Pass-through Security
Web Services

Chapter 7: PowerCenter (9.5.0)


Pushdown Optimization
Exporting Metadata to Excel



Chapter 8: Metadata Manager (9.5.0)


Data Modeling and Business Intelligence Resources
Incremental Metadata Load
mmcmd Command Line Program
Resource Types
Metadata Manager Service
    Convert Metadata Manager Resources
    Reload Metadata Manager Resources
    Update the Metadata Manager Properties File

Chapter 9: Adapters for PowerCenter (9.5.0)


PowerCenter Dual Load Option for Teradata
PowerExchange for HP Neoview Transporter
PowerExchange for JD Edwards EnterpriseOne (JD Edwards OneWorld)
PowerExchange for Microsoft Dynamics CRM
PowerExchange for Salesforce
PowerExchange for Teradata Parallel Transporter API
PowerExchange for Ultra Messaging

Part II: Version 9.1.0

Chapter 10: New Features and Enhancements (9.1.0)
Version 9.1.0 HotFix 4
    Informatica Data Services
    PowerCenter
Version 9.1.0 HotFix 3
    Informatica Data Explorer
    Informatica Data Quality
    Informatica Data Services
    Informatica Domain
    Adapters for PowerCenter
    Metadata Manager
    PowerCenter
Version 9.1.0 HotFix 2
    Informatica Data Explorer
    Informatica Data Quality
    Informatica Data Services
    Adapters for PowerCenter
    Metadata Manager
Version 9.1.0 HotFix 1
    Informatica Data Quality
    Informatica Data Services



    Informatica Domain
    Command Line Programs
    Adapters for PowerCenter
    Metadata Manager
Version 9.1.0
    Informatica Data Explorer
    Informatica Data Quality
    Informatica Data Services
    Informatica Documentation
    Informatica Domain
    Command Line Programs
    Metadata Manager
    PowerCenter
    Data Analyzer
    Adapters for PowerCenter

Chapter 11: Informatica Data Explorer (9.1.0)


Oracle Database Requirements
Profiling Warehouse

Chapter 12: Informatica Data Quality (9.1.0)


Address Validator Transformation
Association Transformation
Data Quality Content Installer
Data Quality for Siebel
Decision Transformation
Exception Transformation
Export to PowerCenter
Match Transformation
Reference Data Configuration
Web Service Operations

Chapter 13: Informatica Data Services (9.1.0)


Application Redeployment
Informatica Data Integration Analyst Action Menus
Deployment Menus
Export to PowerCenter
Logical Data Object Model Import
Web Services
    Generate Requests in the Data Viewer View
    Deleted WSDL Data Object
    Create a Web Service Wizard
    Ports Tab



    Create Web Service from a WSDL Data Object Wizard
    Deployment
    Ports Tab Options
    Cache Property in the Lookup Transformation
Web Service Consumer Transformation
    Deleted WSDL Data Object
    Ports Tab
    Ports Tab Options

Chapter 14: Informatica Domain (9.1.0). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82


Analyst Service. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82 Data Integration Service. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82 Address Validation Reference Data Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82 HTTP Proxy Server Authentication. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83 Result Set Cache Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83 Domain Management. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84 infacmd Control Files. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84 Export Control Files for Domain Objects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84 Import Control Files for Domain Objects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84 Model Repository Service. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84 Backup File Directory. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84 Search Index Backup. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85 Blocking Other Operations During a Backup. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85 Permissions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85 Reports. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85 Privileges. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86

Chapter 15: Metadata Manager (9.1.0). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87


Incremental Metadata Load. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87 backupCmdLine Command Line Program. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87 Business Glossary Custom Models. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88 Class Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88 Hide or Display Empty and Read-only Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88 Link Reports. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89 Search Results Configuration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89 Metadata Manager Agent Validation Level. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89 Resources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89 Embarcadero ERStudio . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89 SAP . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90

Chapter 16: PowerCenter (9.1.0). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91


Session Recovery. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91 Informatica Data Integration Analyst Action Menus. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91


Chapter 17: Informatica Development Platform (9.1.0). . . . . . . . . . . . . . . . . . . . . . . . . 92


Relational Data Adapter API. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92

Chapter 18: Adapters for PowerCenter (9.1.0). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93


PowerExchange for HP Neoview Transporter. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93 PowerExchange for Hadoop. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93

Part III: Version 9.0.1. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94 Chapter 19: New Features and Enhancements (9.0.1). . . . . . . . . . . . . . . . . . . . . . . . . 95
Version 9.0.1 HotFix 2. . . . 95 Informatica Data Services. . . . 95 Informatica Data Quality and Informatica Data Explorer Advanced Edition. . . . 95 Version 9.0.1 HotFix 1. . . . 96 Informatica Domain. . . . 96 Informatica Data Quality and Informatica Data Explorer Advanced Edition. . . . 96 Informatica Data Services. . . . 97 Metadata Manager. . . . 97 Version 9.0.1. . . . 97 Informatica Data Quality and Informatica Data Explorer Advanced Edition. . . . 98 Informatica Data Services. . . . 100 Informatica Domain. . . . 102 Metadata Manager. . . . 103 PowerCenter. . . . 104 PowerExchange. . . . 105 Adapters for Data Quality and Data Services. . . . 106 Adapters for PowerCenter. . . . 107

Chapter 20: Informatica Data Quality and Informatica Data Explorer Advanced Edition (9.0.1). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
Transformations. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108

Chapter 21: Informatica Data Services (9.0.1). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109


Relational Physical Data Objects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109 Relational Physical Data Object Source and Target Transformations. . . . . . . . . . . . . . . . . . . . . . . 110 Relational Physical Data Object Keys. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110 Flat File Physical Data Object File Path. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110

Chapter 22: Informatica Domain (9.0.1). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111


8.6.1 Features in 9.0.1. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111 Connections. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112 IBM DB2 Connections. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112


Connection Permissions. . . . 112 Command Line Programs. . . . 113 infacmd Changed Commands. . . . 113 infacmd New Commands. . . . 113 New Environment Variables. . . . 116 Data Integration Service. . . . 116 Data Integration Service Privilege. . . . 116 Maximum # of Concurrent Connections Property. . . . 116 Maximum Execution Pool Size Property. . . . 116 LDAP. . . . 117 User Import. . . . 117 Logs. . . . 117 View Log Events from the Previous Informatica Version. . . . 117 Maximum Heap Size. . . . 117 Node Diagnostics. . . . 118 Reports. . . . 118 User Activity Report. . . . 118 License Management Report. . . . 118

Chapter 23: PowerCenter (9.0.1). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119


Mapping Analyst for Excel. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119 Excel Add-in. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119 Export Option. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119 Standard Mapping Specification Template. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120 Web Services Hub. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120 Sources and Targets. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120

Chapter 24: Metadata Manager (9.0.1). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121


backupCmdLine Command Line Program. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121 Business Glossary. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121 Data Lineage for Custom Objects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122 Data Lineage for SQL Inline Views. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122 Email. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123 Impact Summary. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123 mmcmd Command Line Program. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123 Searching for PowerCenter Metadata Extensions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124

Part IV: Version 9.0. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 125 Chapter 25: New Features and Enhancements (9.0). . . . . . . . . . . . . . . . . . . . . . . . . . 126
Informatica Data Quality and Informatica Data Services. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 126 Data Quality. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127 Data Services. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128


Informatica Analyst. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129 Informatica Domain. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130 Command Line Interface. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130 Services. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130 Management. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 131 PowerCenter. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 131 Metadata Manager. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 132 PowerExchange. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 134 Adapters for PowerCenter. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 136 PowerExchange for HP Neoview Transporter. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 136 PowerExchange for Netezza. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137 PowerExchange for Oracle E-Business Suite. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138 PowerExchange for SAP NetWeaver. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138 PowerExchange for Teradata Parallel Transporter API. . . . . . . . . . . . . . . . . . . . . . . . . . 138 PowerExchange for Web Services. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138 PowerExchange for webMethods. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138

Chapter 26: Informatica Domain (9.0). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139


PowerCenter Domain. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139 PowerCenter Domain Name Change. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139 Administration Console Name Change. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139 Informatica Administrator URL Change. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139 Domain Ports. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139 Object Name Length. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140 Shared Object Names. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140 Domain Configuration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140 Command Line Programs. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 141 New Commands. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 141 New Environment Variable. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145 Metadata Manager Privileges. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145

Chapter 27: PowerCenter (9.0). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 146


Reference Table Manager. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 146

Chapter 28: Metadata Manager (9.0). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147


Business Glossary. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147 mmcmd Command Line Program. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147 Data Lineage. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148 Logging. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148 Resources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149 Removed Resource Types. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149 Deprecated Metadata Source Versions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149 Linking Objects in Connected Resources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150


Connection Assignments for Purged Resources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150 Automatic Connection Assignment. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150 PowerCenter Source Increment Extract Window. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150 PowerCenter Parameter Files. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151


Preface
The Informatica Release Guide is written for administrators who are responsible for installing and configuring the Informatica platform, and for developers and software engineers who implement Informatica. This guide assumes that you have knowledge of the features for which you are responsible. The Informatica Release Guide lists new features and enhancements, behavior changes between versions, and tasks you might need to perform after you upgrade from a previous version.

Informatica Resources
Informatica Customer Portal
As an Informatica customer, you can access the Informatica Customer Portal site at http://mysupport.informatica.com. The site contains product information, user group information, newsletters, access to the Informatica customer support case management system (ATLAS), the Informatica How-To Library, the Informatica Knowledge Base, the Informatica Multimedia Knowledge Base, Informatica Product Documentation, and access to the Informatica user community.

Informatica Documentation
The Informatica Documentation team makes every effort to create accurate, usable documentation. If you have questions, comments, or ideas about this documentation, contact the Informatica Documentation team through email at infa_documentation@informatica.com. We will use your feedback to improve our documentation. Let us know if we can contact you regarding your comments. The Documentation team updates documentation as needed. To get the latest documentation for your product, navigate to Product Documentation from http://mysupport.informatica.com.

Informatica Web Site


You can access the Informatica corporate web site at http://www.informatica.com. The site contains information about Informatica, its background, upcoming events, and sales offices. You will also find product and partner information. The services area of the site includes important information about technical support, training and education, and implementation services.

Informatica How-To Library


As an Informatica customer, you can access the Informatica How-To Library at http://mysupport.informatica.com. The How-To Library is a collection of resources to help you learn more about Informatica products and features. It includes articles and interactive demonstrations that provide solutions to common problems, compare features and behaviors, and guide you through performing specific real-world tasks.

Informatica Knowledge Base


As an Informatica customer, you can access the Informatica Knowledge Base at http://mysupport.informatica.com. Use the Knowledge Base to search for documented solutions to known technical issues about Informatica products. You can also find answers to frequently asked questions, technical white papers, and technical tips. If you have questions, comments, or ideas about the Knowledge Base, contact the Informatica Knowledge Base team through email at KB_Feedback@informatica.com.

Informatica Multimedia Knowledge Base


As an Informatica customer, you can access the Informatica Multimedia Knowledge Base at http://mysupport.informatica.com. The Multimedia Knowledge Base is a collection of instructional multimedia files that help you learn about common concepts and guide you through performing specific tasks. If you have questions, comments, or ideas about the Multimedia Knowledge Base, contact the Informatica Knowledge Base team through email at KB_Feedback@informatica.com.

Informatica Global Customer Support


You can contact a Customer Support Center by telephone or through the Online Support. Online Support requires a user name and password. You can request a user name and password at http://mysupport.informatica.com. Use the following telephone numbers to contact Informatica Global Customer Support:
North America / South America
Toll Free:
Brazil: 0800 891 0202
Mexico: 001 888 209 8853
North America: +1 877 463 2435

Europe / Middle East / Africa
Toll Free:
France: 0805 804632
Germany: 0800 5891281
Italy: 800 915 985
Netherlands: 0800 2300001
Portugal: 800 208 360
Spain: 900 813 166
Switzerland: 0800 463 200
United Kingdom: 0800 023 4632
Standard Rate:
Belgium: +31 30 6022 797
France: +33 1 4138 9226
Germany: +49 1805 702 702
Netherlands: +31 306 022 797
United Kingdom: +44 1628 511445

Asia / Australia
Toll Free:
Australia: 1 800 151 830
New Zealand: 09 9 128 901
Standard Rate:
India: +91 80 4112 5738


Part I: Version 9.5.0


This part contains the following chapters:
New Features and Enhancements (9.5.0), 2
Informatica Data Explorer (9.5.0), 20
Informatica Data Quality (9.5.0), 22
Informatica Data Services (9.5.0), 26
Informatica Data Transformation (9.5.0), 31
Informatica Domain (9.5.0), 36
PowerCenter (9.5), 38
Metadata Manager (9.5.0), 39
Adapters for PowerCenter (9.5.0), 42

CHAPTER 1

New Features and Enhancements (9.5.0)


This chapter includes the following topic:
Version 9.5.0, 2

Version 9.5.0
This section describes new features and enhancements in version 9.5.0.

Informatica Installer
This section describes new features and enhancements to the Informatica platform installer.

Install Application Client Components


You can specify the Informatica application client components that you want to install. For example, you can install all of the application clients or a subset of application clients.

Pre-Installation (i9Pi) System Check Tool


Before you install or upgrade the Informatica services, you can run the Pre-installation (i9Pi) System Check Tool to verify that the machine meets the minimum system and database requirements for the installation.

Uninstall Application Client Components


You can specify the Informatica application client components that you want to uninstall.

Informatica Data Explorer


This section describes new features and enhancements to Informatica Data Explorer.

Connections
Version 9.5.0 includes the following enhancements for connections:
You can rename connections.
You can configure advanced properties of a database connection when you create a database connection in the Analyst tool.
You can edit database connections in the Analyst tool.

Data Domain Discovery


You can identify critical data characteristics within the enterprise so that you can apply further data management policies, such as data masking or data quality, to the data. Run a profile to identify all the data domains for a column based either on its values or its name. A data domain is the logical datatype of a column or a set of allowed values that the column can contain. The name of the data domain helps you find the functional meaning of the column data. You can perform data domain discovery in both the Analyst tool and the Developer tool.
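The value-based side of this discovery can be sketched as pattern conformance checking: a column belongs to a data domain when most of its values match the domain's rule. The domain names, regular expressions, and 80% conformance threshold below are illustrative assumptions, not Informatica's shipped inference rules.

```python
import re

# Hypothetical data domain rules: each domain is a regex that values must match.
DOMAIN_PATTERNS = {
    "us_zip_code": re.compile(r"^\d{5}(-\d{4})?$"),
    "email_address": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "phone_number": re.compile(r"^\+?[\d\s()-]{7,15}$"),
}

def discover_domains(column_values, min_conformance=0.8):
    """Return the data domains whose pattern matches at least
    min_conformance of the non-empty values in the column."""
    values = [v for v in column_values if v]
    if not values:
        return {}
    matches = {}
    for name, pattern in DOMAIN_PATTERNS.items():
        hit_rate = sum(bool(pattern.match(v)) for v in values) / len(values)
        if hit_rate >= min_conformance:
            matches[name] = hit_rate
    return matches

# All non-empty values conform to the ZIP pattern, so only that domain is inferred.
domains = discover_domains(["10001", "94105-1111", "60601", "30303", ""])
```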

Enterprise Discovery
You can run multiple data discovery tasks on a large number of data sources across multiple connections and generate a consolidated results summary of the profile results. This data discovery method includes running a column profile, data domain discovery, and discovering primary key and foreign key relationships. You can view the results in both graphical and tabular formats. You can run enterprise discovery from a profile model in the Developer tool.
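The key-discovery part of such a run reduces to two classic checks: a primary key candidate is a column whose values are unique and non-null, and a foreign key candidate is a column whose values are all contained in another table's key column (an inclusion dependency). The function names and sample data in this sketch are illustrative.

```python
def candidate_primary_keys(rows, columns):
    """Flag each column whose values are unique and non-null
    across all rows as a primary key candidate."""
    candidates = []
    for i, name in enumerate(columns):
        values = [row[i] for row in rows]
        if None not in values and len(set(values)) == len(values):
            candidates.append(name)
    return candidates

def is_foreign_key_candidate(child_values, parent_key_values):
    """A child column is a foreign key candidate when every one of
    its values appears in the parent table's key column."""
    return set(child_values) <= set(parent_key_values)

rows = [(1, "US", "a"), (2, "US", "b"), (3, "DE", "a")]
pks = candidate_primary_keys(rows, ["id", "country", "code"])  # only "id" is unique
```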

Find in Editor
In the Developer tool, you can search for attributes, columns, expressions, groups, ports, or transformations in any type of mapping editor, in a logical data object editor, in a mapplet editor, or in a workflow editor.

Project Permissions
You can assign read, write, and grant permissions to users and groups when you create a project and when you edit project details.

Scorecards
You can configure a third-party application to get the scorecard results and run reports. The profiling warehouse stores the scorecard metrics and configuration information.
You can attach a read-only view of the scorecard metrics to a web application or portal. Copy the scorecard URL from the Developer tool and add it to the source code of external applications or portals. You can also drill down into source rows and view trend charts from external applications.

Informatica Data Quality


This section describes new features in Informatica Data Quality.

Address Validator Transformation


The Address Validator transformation can perform consumer marketing and segmentation analysis on address data. Select the CAMEO options in the transformation to perform consumer marketing and segmentation analysis. The Address Validator transformation can add Enhanced Line of Travel (eLOT) data to a United States address. Mail carriers use eLOT data to sort mail items in the order in which they are likely to be delivered on a mail route. The Address Validator transformation runs in Certified mode when it creates eLOT output.

Connections
Version 9.5.0 includes the following enhancements for connections:
You can rename connections.
You can configure advanced properties of a database connection when you create a database connection in the Analyst tool.
You can edit database connections in the Analyst tool.


Content Management Service


The Content Management Service has the following features:
The Content Management Service identifies the location of files that store probabilistic model data. You set the path to the probabilistic model files on each Content Management Service.
You can configure a master Content Management Service for an Informatica domain or grid. You specify a master Content Management Service when you want to run a mapping that reads probabilistic model data on multiple nodes. When you use a master Content Management Service, any probabilistic model file that you create or update on the master service host machine is copied to the locations specified by the other Content Management Services on the domain or grid.
The Content Management Service enables dynamic configuration updates for the Address Validator transformation and the Match transformation. The Content Management Service updates the input port list in the Address Validator transformation each time you open the transformation. You can install an address validation engine update from Informatica without performing a product reinstall. The Content Management Service updates the list of identity population files in the Match transformation each time you open the transformation.

Data Masking Transformation


The Data Masking transformation contains the following data masking techniques:
Expression masking. Applies an expression to a source column to create or mask data.
Substitution masking. Replaces source data with repeatable values. The Data Masking transformation produces deterministic results for the same source data, masking rules, and seed value.
Dependent masking. Replaces the values of one source column based on the values of another source column.
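The repeatable behavior of substitution masking can be illustrated with a seeded hash: hashing the same source value with the same seed always selects the same substitute, so masked output is deterministic across runs. The substitute list and function below are an illustrative sketch, not the product's implementation.

```python
import hashlib

# Hypothetical substitution dictionary; real substitution data comes from
# reference tables, not a hard-coded list.
SUBSTITUTES = ["Alex", "Blake", "Casey", "Drew", "Emery"]

def substitution_mask(value, seed, substitutes=SUBSTITUTES):
    """Deterministically replace a source value: the seed and value together
    pick a stable index into the substitution set."""
    digest = hashlib.sha256(f"{seed}:{value}".encode()).hexdigest()
    return substitutes[int(digest, 16) % len(substitutes)]

# Same value, rules, and seed -> same masked result on every run.
masked_a = substitution_mask("John", seed=190)
masked_b = substitution_mask("John", seed=190)
```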

Find in Editor
In the Developer tool, you can search for attributes, columns, expressions, groups, ports, or transformations in any type of mapping editor, in a logical data object editor, in a mapplet editor, or in a workflow editor.

Import from PowerCenter


You can import objects from a PowerCenter repository to a Model repository. You can connect to a PowerCenter repository from the Developer tool and select objects to import into a target location in the Model repository. The import process validates and converts the PowerCenter objects to Model repository objects based on compatibility. You can check the feasibility of the import before the final import. The Developer tool creates a summary report with the results of the import.

Decision Transformation
The Decision transformation handles integer values in IF/ELSE statements in addition to boolean values. The transformation processes a 0 value as False and other integer values as True.
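The rule can be expressed compactly; this sketch assumes only the mapping described above (0 is treated as False, any other integer as True, and boolean values pass through unchanged):

```python
def decision_condition(value):
    """Evaluate an IF/ELSE condition value the way the text describes:
    0 -> False, any other integer -> True, booleans unchanged."""
    if isinstance(value, bool):
        return value
    return value != 0

results = [decision_condition(v) for v in (0, 1, -5, True, False)]
```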

Informatica Data Director for Data Quality


Informatica Data Director for Data Quality is a web-based application that you use to review the bad record and duplicate record output from an Exception transformation. You can edit bad records, and you can consolidate duplicate records into a single master record. You use Informatica Data Director for Data Quality to complete a Human task in a workflow. When you log on to the application, Informatica Data Director for Data Quality connects to the database tables specified in the workflow and displays the tasks to perform.


Mapping and Mapplet Editors


The Developer tool contains the following options for mapping and mapplet editors:
Shift + resize an object: After you resize the object, the Developer tool arranges all objects so that no objects are overlapping.
Align All to Grid: The Developer tool aligns all objects in the editor based on data flow.
Restore All: When an editor contains iconized objects, the Developer tool restores the objects to their original sizes without overlapping them.

Project Permissions
You can assign read, write, and grant permissions to users and groups when you create a project and when you edit project details.

Probabilistic Models
A probabilistic model is a content set that you can use to identify data values on input ports that contain one or more values in a delimited string. A probabilistic model uses probabilistic matching logic to identify data values by the types of information the values contain. You can use a probabilistic model in Labeler and Parser transformations. You create a probabilistic model in the Developer tool. You select the model from a project folder in the Model repository. The Developer tool writes probabilistic model data to a file you specify in the Content Management Service.
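The core idea, assigning each token in a delimited string the label it most probably carries, can be sketched with token-label counts. The training pairs and label names below are toy assumptions; real probabilistic models are trained and stored through the Developer tool and Content Management Service.

```python
from collections import Counter, defaultdict

# Toy training data: observed (token, label) pairs.
training = [("maria", "FIRST_NAME"), ("berlin", "CITY"),
            ("maria", "FIRST_NAME"), ("berlin", "CITY")]

counts = defaultdict(Counter)
for token, label in training:
    counts[token][label] += 1

def label_tokens(text, default="UNKNOWN"):
    """Split a delimited string and assign each token its most
    frequently observed label, or a default for unseen tokens."""
    labels = []
    for token in text.lower().split():
        if token in counts:
            labels.append(counts[token].most_common(1)[0][0])
        else:
            labels.append(default)
    return labels

labels = label_tokens("Maria Berlin Quux")
```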

Scorecards
You can configure a third-party application to get the scorecard results and run reports. The profiling warehouse stores the scorecard metrics and configuration information.

You can attach a read-only view of the scorecard metrics to a web application or portal. Copy the scorecard URL from the Developer tool and add it to the source code of external applications or portals. You can also drill down into source rows and view trend charts from external applications.

System Mapping Parameters


System mapping parameters are constant values that define the directories where the Data Integration Service stores cache files, reject files, source files, target files, and temporary files. You define the values of the system parameters on a Data Integration Service process in the Administrator tool. By default, the system parameters are assigned to flat file directory, cache file directory, and temporary file directory fields.

Workflows
A workflow is a graphical representation of a set of events, tasks, and decisions that define a business process. You use the Developer tool to add objects to a workflow and to connect the objects with sequence flows. The Workflow Service Module is the component in the Data Integration Service that uses the instructions configured in the workflow to run the objects. A workflow can contain the following objects:
- Start event that represents the beginning of the workflow.
- End event that represents the end of the workflow.
- Mapping task that runs a mapping.
- Command task that runs a single shell command.
- Human task that involves user interaction with an application. For example, you view bad or duplicate records in Informatica Data Director for Data Quality in a Human task.
- Notification task that sends an email notification to specified recipients. Before you configure a Notification task to send emails, you must use the Administrator tool to configure the email server properties for the Data Integration Service.
- Assignment task that assigns a value to a user-defined workflow variable.
- Exclusive gateway that makes a decision to split and merge paths in the workflow.

A sequence flow connects workflow objects to specify the order in which the Data Integration Service runs the objects. You can create a conditional sequence flow to determine whether the Data Integration Service runs the next object.

You can define workflow variables and parameters to make workflows more flexible. A workflow variable represents a value that records run-time information and can change during a workflow run. A workflow parameter represents a constant value that you define in a parameter file before you run the workflow.

After you validate a workflow to identify errors, you add the workflow to an application and deploy the application to a Data Integration Service. You run an instance of the workflow from the deployed application with the infacmd wfs command line program. You monitor the workflow instance run in the Monitoring tool.
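After deployment, you can start a workflow instance from the command line. The invocation might look like the following sketch; the domain, service, application, and workflow names are placeholders, and the option abbreviations shown follow the common infacmd pattern and should be verified against the Command Reference:

```
infacmd wfs startWorkflow -dn MyDomain -sn MyDataIntegrationService
    -un admin -pd password -a MyApplication -wf MyWorkflow -pf workflow_params.xml
```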

Informatica Data Services


This section describes new features and enhancements to Informatica Data Services.

Business Intelligence Tools


You can query published data services with the following business intelligence tools: OBIEE 11.1.1.5 or 11.1.1.3, Toad for Data Analysts, and Microsoft SQL Server Reporting Services.

Connections
Version 9.5.0 includes the following enhancements for connections:
- You can rename connections.
- You can configure advanced properties of a database connection when you create a database connection in the Analyst tool.
- You can edit database connections in the Analyst tool.

Data Processor Transformation


You can configure a Data Transformation service in the Developer tool by configuring it in a Data Processor transformation. Create a script in the IntelliScript editor or configure an XMap to map input XML to output XML in the transformation. Add a Data Processor transformation to a mapping or export the transformation as a service to a Data Transformation repository.

Find in Editor
In the Developer tool, you can search for attributes, columns, expressions, groups, ports, or transformations in any type of mapping editor, in a logical data object editor, in a mapplet editor, or in a workflow editor.

Import from PowerCenter


You can import objects from a PowerCenter repository to a Model repository. You can connect to a PowerCenter repository from the Developer tool and select objects to import into a target location in the Model repository. The import process validates and converts the PowerCenter objects to Model repository objects based on compatibility. You can test the feasibility of the import before you run the final import. The Developer tool creates a summary report with the results of the import.


Mapping and Mapplet Editors


The Developer tool contains the following options for mapping and mapplet editors:
- Shift + resize an object. After you resize the object, the Developer tool arranges all objects so that no objects overlap.
- Align All to Grid. The Developer tool aligns all objects in the editor based on data flow.
- Restore All. When an editor contains iconized objects, the Developer tool restores the objects to their original sizes without overlapping them.

Mapping Specifications
Version 9.5.0 includes the following enhancements for mapping specifications in the Analyst tool:
- You can select multiple source columns and drag them to insert between target columns in a mapping specification.
- When you edit a mapping specification, all objects appear in a single tabbed dialog. Analysts can select sources, joins, lookups, reusable rules, expressions, filters, aggregators, and a target from a tab and edit these objects.
- You can run a profile on a source, source columns, or target columns in a mapping specification to better understand the data in the mapping specification.
- You can add SQL queries to source and target columns in a mapping specification and run the query to view the query results as a data preview in the mapping specification.
- The data preview for a mapping specification appears on tabs for mapping specification objects such as lookups, filters, joins, and aggregators.
- You can create join types or use a join object as a source object in a mapping specification. You can also create a join between two join objects.
- When you create a join between two sources, the Analyst tool recommends join conditions for the join.
- The Analyst tool updates a mapping specification when you open the mapping specification again after you delete a column or modify the mapping specification.
- In the Analyst tool, you can map source and target columns based on naming conventions and column positions.

Performance
Version 9.5.0 includes the following performance enhancements:
- You can configure early selection and push-into optimization with the Java transformation, Web Service Consumer transformation, and SQL transformation.
- You can add hints to a source SQL query to pass instructions to a database optimizer. The optimizer uses the hints to choose a query run plan to access the source. The source database must be Oracle, Sybase, IBM DB2, or Microsoft SQL Server.
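As an illustration, an Oracle-style optimizer hint embedded in a source query might look like the following. The table and index names are hypothetical, and hint syntax varies by database:

```sql
SELECT /*+ INDEX(orders orders_cust_idx) */ order_id, amount
FROM orders
WHERE customer_id = 101
```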

Project Permissions
You can assign read, write, and grant permissions to users and groups when you create a project and when you edit project details.


Row Level Security


Administrators can assign security predicates on virtual tables to restrict access to rows of data when users query the tables.

Scorecards
You can configure a third-party application to get the scorecard results and run reports. The profiling warehouse stores the scorecard metrics and configuration information.

You can attach a read-only view of the scorecard metrics to a web application or portal. Copy the scorecard URL from the Developer tool and add it to the source code of external applications or portals. You can also drill down into source rows and view trend charts from external applications.

SQL Data Services


Version 9.5.0 includes the following enhancements for SQL data services:
- You can issue a correlated subquery in a query against an SQL data service if the correlated subquery meets a specific set of criteria. You can submit correlated subqueries from an ODBC or JDBC client or from the query plan window in the Developer tool.
- You can connect to an SQL data service through a default ODBC or JDBC connection specified in the SQL data service and then create and drop local temporary tables in a relational database.
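The two query shapes can be illustrated with standard SQL. The sketch below runs them against an in-memory SQLite database as a stand-in for an SQL data service; the table names and data are hypothetical:

```python
# Illustrative only: the two SQL shapes described above, run against an
# in-memory SQLite database as a stand-in for an SQL data service.
# Table names and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "A", 10.0), (2, "A", 30.0), (3, "B", 5.0)])

# Correlated subquery: the inner query references a column of the outer row.
rows = conn.execute("""
    SELECT o.id, o.amount
    FROM orders o
    WHERE o.amount > (SELECT AVG(o2.amount)
                      FROM orders o2
                      WHERE o2.customer = o.customer)
""").fetchall()
print(rows)  # orders above their customer's average amount

# Local temporary table: created, queried, and dropped in the same session.
conn.execute("CREATE TEMP TABLE staging AS SELECT * FROM orders WHERE amount > 6")
staging_count = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
print(staging_count)
conn.execute("DROP TABLE staging")
```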

System Mapping Parameters


System mapping parameters are constant values that define the directories where the Data Integration Service stores cache files, reject files, source files, target files, and temporary files. You define the values of the system parameters on a Data Integration Service process in the Administrator tool. By default, the system parameters are assigned to flat file directory, cache file directory, and temporary file directory fields.

Web Services
Web Service Consumer Transformation

Version 9.5.0 includes the following enhancements for the Web Service Consumer transformation:
- You can enable the Web Service Consumer transformation to create multiple concurrent connections to a web service so that it can send multiple web service requests in parallel. When you enable concurrent connections, you can set the memory consumption limit and the number of concurrent connection limits.
- The Web Service Consumer transformation can process SOAP 1.2 messages with document/literal encoding. You can create a Web Service Consumer transformation with a SOAP 1.2 binding. The fault output ports for SOAP 1.2 are code, reason, node, role, and detail.

Generic Fault

You can define a generic fault to return an error message to a web service client when an error is not defined by a fault element in the WSDL. Create a Fault transformation to return a generic error message.

Schema Objects

Version 9.5.0 includes the following enhancements for schema objects:
- You can add multiple root .xsd files to a schema object. You can also remove .xsd files from a schema object.
- You can update a schema object when elements, attributes, types, or other schema components change. When you update a schema object, the Developer tool updates objects that use the schema.


You can use the following methods to update a schema object:
- Synchronize the schema. Synchronize a schema object when you update the schema files outside the Developer tool. The Developer tool re-imports all of the schema .xsd files that contain changes.
- Edit a schema file. Edit a schema file when you want to update a file from within the Developer tool. The Developer tool opens the file in your XSD file editor or in an editor that you select.

Hierarchy Level of Elements

You can change the hierarchy of the elements in an operation mapping.

Operations

You can create and configure operations in the web service Overview view. After you manually create a web service, you can create an operation from a reusable object.

SOAP 1.2

The Data Integration Service can process SOAP 1.2 messages with document/literal encoding. Each web service can have an operation that uses a SOAP 1.2 binding. When you create a fault using SOAP 1.2, the wizard creates the code, reason, node, and role elements.

WSDL Synchronization

You can synchronize a WSDL data object when the WSDL files change. When you synchronize a WSDL data object, the Developer tool re-imports the object metadata from the WSDL files. The Developer tool also updates objects that reference the WSDL or marks them as changed when you open them.

Informatica Data Transformation


Effective in version 9.5.0, Data Transformation moved to the Informatica platform. You can now create and test a Data Transformation service in the Developer tool. Create a Data Processor transformation that includes Script objects or XMap objects to transform data.

Create a script in the Data Processor transformation Script editor. A script can contain Parser, Serializer, Mapper, Transformer, and Streamer components. Define an XMap in the transformation XMap editor. An XMap maps input XML to output XML.

You can add a Data Processor transformation to a mapping or export the transformation as a service to a Data Transformation repository. You can import a Data Transformation project into a Data Processor transformation to upgrade a script from Data Transformation versions 8.6.1 through 9.1.0. You can also deploy a Data Transformation project as a service and then import the service to a Data Processor transformation.

Informatica Domain
This section describes new features and enhancements to the Informatica domain.

Connection Management
You can rename connections.

Data Director Service


The Informatica Data Director Service is an application service that runs Informatica Data Director for Data Quality in the Informatica domain. Create and enable an Informatica Data Director Service on the Domain tab of Informatica Administrator. When you enable the Informatica Data Director Service, the Service Manager starts Informatica Data Director for Data Quality. You can open Informatica Data Director for Data Quality in a web browser.

Data Integration Service


Directories for Data Integration Service Files

You can configure the following Data Integration Service process properties that define where the service stores files:
- Home Directory. Root directory accessible by the node. This is the root directory for other service process variables. Default is <Informatica Services Installation Directory>/tomcat/bin.
- Log Directory. Directory for log files. Default is <home directory>/disLogs.
- Cache Directory. Directory for index and data cache files for transformations. Default is <home directory>/Cache.
- Source Directory. Directory for source flat files used in a mapping. Default is <home directory>/source.
- Target Directory. Default directory for target flat files used in a mapping. Default is <home directory>/target.
- Rejected Files Directory. Directory for reject files. Reject files contain rows that were rejected when running a mapping. Default is <home directory>/reject.

Out of Process Execution

You can run each Data Integration Service job as a separate operating system process. Each job can run separately without affecting other jobs running on the Data Integration Service. For optimal performance, run batch jobs and long jobs, such as preview, profile, scorecard, and mapping jobs, out of process.

Email Server Properties

You can configure email server properties for the Data Integration Service. The email server properties configure the SMTP server that the Data Integration Service uses to send email notifications from a workflow.

Grid

You can run the Data Integration Service on a grid. When you run an object on a grid, you improve scalability and performance by distributing the work across multiple DTM processes running on nodes in the grid.

Human Task Service Module

The Human Task Service Module is the component in the Data Integration Service that manages requests to run a Human task in a workflow.

Logical Data Object Properties

If you want to manage the data object cache through the database, you can specify a cache table name for each logical data object. When you specify a cache table name, the database user or a third-party tool that you configure populates and refreshes the cache.


SQL Properties

You can configure the following SQL properties for the Data Integration Service:
- DTM Keep Alive Time. Number of milliseconds that the DTM process stays open after it completes the last request. Identical SQL queries can reuse the open process. You can set this property globally or for each SQL data service that is deployed to the Data Integration Service.
- Table Storage Connection. Relational database connection that stores temporary tables for SQL data services.
- Skip Log Files. Prevents the Data Integration Service from generating log files when the SQL data service request completes successfully and the tracing level is set to INFO or higher.

Virtual Table Properties

If you want to manage the data object cache through the database, you can specify a cache table name for each virtual table. When you specify a cache table name, the database user or a third-party tool that you configure populates and refreshes the cache.

Web Service Properties

You can configure the following web service properties for the Data Integration Service:
- DTM Keep Alive Time. Number of milliseconds that the DTM process stays open after it completes the last request. Web service requests that are issued against the same operation can reuse the open process. You can set this property globally or for each web service that is deployed to the Data Integration Service.
- Logical URL. Prefix for the WSDL URL if you use an external HTTP load balancer.
- Skip Log Files. Prevents the Data Integration Service from generating log files when the web service request completes successfully and the tracing level is set to INFO or higher.

Workflow Service Module

The Workflow Service Module is the component in the Data Integration Service that manages requests to run workflows.

Monitoring
You can monitor a workflow instance run in the Monitoring tab of the Administrator tool. You can view the status of running workflow and workflow object instances. You can abort or cancel a running workflow instance. You can also view workflow reports, workflow logs, and mapping logs for mappings run by Mapping tasks in the workflow.

PowerExchange Listener Service


You can configure PowerExchange so that Data Integration Service workflows connect to a PowerExchange Listener through a PowerExchange Listener Service. If the NODE statement in the DBMOVER configuration file on a Data Integration Service node includes the service_name parameter, the Data Integration Service ignores the host_name parameter on the NODE statement and uses the service_name and port parameters to connect to the Listener Service that manages the PowerExchange Listener process.


The function of the NODE statement did not change for PowerCenter Integration Service workflows.

Profile Privilege
Assign the Manage Data Domains Model Repository Service privilege to enable a user to create, edit, and delete data domains in the data domain glossary.

Security
- The Model Repository Service includes the Show Security Details privilege. When you disable this privilege, error and warning message details do not display the names of projects for which users do not have read permission.
- The Informatica domain locks out a user who exceeds the maximum number of failed logins. The administrator can configure the maximum number of failed logins. The administrator can also unlock an account.

Command Line Programs


This section describes new commands and options for the Informatica command line programs.

infacmd cms Commands


Version 9.5.0 includes the following new infacmd cms commands:
- CreateAuditTables. Creates audit trail tables that record any change made to probabilistic model content sets.
- DeleteAuditTables. Deletes audit trail tables that record any change made to probabilistic model content sets.
- ResyncData. Synchronizes the probabilistic model content set files on a Content Management Service machine with the files on the master Content Management Service machine.
- Upgrade. Updates a Content Management Service to version 9.5.0. When you run infacmd cms upgrade, the command updates the following properties on the service: Master CMS, Model Repository Service, and Reference Data Location.

The CreateService command is updated. It contains the following new options:
- -RepositoryService (-rs). Specifies a Model Repository Service to associate with the Content Management Service.
- -ReferenceDataLocation (-rdl). Connection name of the database that stores data values for the reference tables defined in the Model repository.

Note: -RepositoryService and -ReferenceDataLocation are required options. Update scripts that use the CreateService command before you run them in an Informatica 9.5.0 environment.


infacmd dis Commands


The following infacmd dis commands are updated:
- CreateService. Contains new option -GridName (-gn). This option specifies the grid on which the Data Integration Service runs.
- PurgeDataObjectCache. Deletes all cache for a logical data object, including the latest cache run if the latest cache run has exceeded the cache refresh period. Previously, this command deleted all cache for a logical data object except the latest cache run. Contains new option -PurgeAll (-pa). This option deletes all cache for a logical data object.
- UpdateDataObjectOptions. Contains new data object option DataObjectOptions.RefreshDisabled. This option specifies the name of the table that the Data Integration Service uses to cache the logical data object.
- UpdateServiceOptions. Contains the following new options:
  - -NodeName (-nn). The name of the node on which the Data Integration Service runs.
  - -GridName (-gn). The name of the grid on which the Data Integration Service runs.
  Contains the following changed option:
  - -Option (-o). This argument is optional. Previously, this argument was required.
  Contains the following new Data Integration Service option:
  - WSServiceOptions.DTMKeepAliveTime. Sets the keepalive time for all web services that are deployed to the Data Integration Service.
  Contains the following changed Data Integration Service options:
  - WSServiceOptions.<option name>. Specifies the web service options. Previously, the web service options were named "WebServiceOptions.<option name>."
  - WebServiceOptions.RequestResourceBufferSize. This option is removed.

If you created scripts that use the changed Data Integration Service options, you must update the scripts.

infacmd ipc Commands


The new ImportFromPC command converts a PowerCenter repository object XML file to a Model repository object XML file.

The CreateConnection command is updated. It contains new option -ConnectionId (-cid). This option specifies the string that the Data Integration Service uses to identify the connection.


infacmd isp Commands


Version 9.5.0 includes the following new commands:
- RenameConnection. Renames a connection.
- ValidateFeature. Validates that the feature in the specified plug-in file is registered in the domain.

The ImportDomainObjects command is updated. The merge conflict resolution strategy for option -ConflictResolution (-cr) is removed. You can still specify the merge strategy for groups in the import control file. If you created scripts that use the merge conflict resolution strategy, you must update the scripts.

infacmd oie Commands


The Export and Import commands contain new option -OtherOptions (-oo). This option specifies the options you can set when you import or export data files. You can set an option for a probabilistic model file in the rtm group. The possible values are "full" or "trainedOnly." For example, the following options select trained probabilistic model files:
rtm:disName=ds,codePage=UTF-8,refDataFile=/folder1/data.zip,pm=trainedOnly

infacmd ps Commands
Version 9.5.0 includes the following new commands:
- cancelProfileExecution. Cancels the profile model run.
- executeProfile. Runs the profile model.
- getProfileExecutionStatus. Gets the run-time status of a profile model.
- migrateScorecards. Migrates scorecard results from Informatica 9.1.0 to 9.5.0.


infacmd rtm Commands


The following infacmd rtm commands are updated:
- DeployImport. Contains the following changed options:
  - -ConflictResolution (-cr). This option is removed.
  - -DataIntegrationService (-ds). Identifies the Data Integration Service. Previously, you used the -DsServiceName (-dsn) option.
  - -Folder (-f). Identifies a folder on the machine that runs the command. Previously, this option identified a folder on a Data Integration Service machine.
  - -StagingDbName (-sdb). This option is removed.
- Export. Contains changed option -Folder (-f). This option identifies a folder on the machine that runs the command. Previously, this option identified a folder on a Data Integration Service machine.
- Import. Contains the following changed options:
  - -Folder (-f). Identifies a folder on the machine that runs the command. Previously, this option identified a folder on a Data Integration Service machine.
  - -ImportType (-it). Specifies the type of content to import. The DataOnly argument is deprecated for this option. Use the MetadataAndData argument with the -ImportType option to import reference data into the Model repository and reference data database. Use the infacmd oie ImportObjects command to import data to the reference data database only.

If you created scripts that use the changed options, you must update the scripts.

infacmd sql Commands


The following commands are updated:
- UpdateSQLDataServiceOptions. Contains new SQL data service option SQLDataServiceOptions.DTMKeepAliveTime. This option sets the keepalive time for one SQL data service that is deployed to the Data Integration Service.
- UpdateTableOptions. Contains new data object option VirtualTableOptions.RefreshDisabled. This option specifies the name of the table that the Data Integration Service uses to cache the virtual table.

infacmd wfs Commands


Version 9.5.0 includes the following new infacmd wfs commands:
- ListWorkflowParams. Lists the parameters for a workflow and creates a parameter file that you can use when you run a workflow.
- StartWorkflow. Starts an instance of a workflow.


infacmd ws Commands
The UpdateWebServiceOptions command contains new web service option WebServiceOptions.DTMKeepAliveTime. This option sets the keepalive time for one web service that is deployed to the Data Integration Service.

pmrep
The ExecuteQuery, FindCheckout, ListObjects, ListObjectDependencies, and Validate commands contain new option -y. This option displays the database type of sources and targets.

PowerCenter
This section describes new features and enhancements to PowerCenter.

Datatypes
PowerCenter supports the Microsoft SQL Server datetime2 datatype. The datetime2 datatype has a precision of 27 and a scale of 7.

Transformation Language
Use the optional match_from_start argument with the REG_EXTRACT function to return the substring only if the pattern matches from the start of the string. The REG_EXTRACT function uses the following syntax:
REG_EXTRACT( subject, 'pattern', subPatternNum, match_from_start )
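The behavior is analogous to anchored regular-expression matching. The following Python sketch (not Informatica expression code) illustrates the difference between matching anywhere in the subject and matching only from the start:

```python
# Not Informatica code: a Python analogue of the match_from_start
# behavior, using re.match (anchored at the start of the subject)
# versus re.search (match anywhere in the subject).
import re

def extract(subject, pattern, group, match_from_start=False):
    m = re.match(pattern, subject) if match_from_start else re.search(pattern, subject)
    return m.group(group) if m else None

print(extract("area 510", r"(\d+)", 1))                         # '510'
print(extract("area 510", r"(\d+)", 1, match_from_start=True))  # None
print(extract("510 area", r"(\d+)", 1, match_from_start=True))  # '510'
```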

Metadata Manager
This section describes new features and enhancements to Metadata Manager.

Resources
SAP BW Resource

You can create and configure an SAP BW resource to extract metadata from SAP NetWeaver Business Warehouse.

Custom Resource

You can create and configure custom resources to extract metadata from custom files such as comma-separated files. You can create load template files that contain all mapping rules and rule sets used to load the custom resources.


Rule-based Links
Use rule-based links to define rules that Metadata Manager uses to link matching elements between a custom resource type and another custom, packaged, or business glossary resource type. You can also configure rule-based links between a business glossary and a packaged resource type. Configure rule-based links so that you can run data lineage analysis across metadata sources.

Command Line Programs


Version 9.5.0 includes the following new Metadata Manager commands:
- createloadtemplate. Creates a load template file.
- generatedefaultloadtemplate. Generates a default load template to load all top-level classes for the specified model.
- getloadtemplate. Exports a load template file.
- deleteloadtemplate. Deletes a load template file.
- listloadtemplate. Lists all the load template files for a custom resource.
- updateloadtemplate. Updates a load template file.
- createlinkruleset. Creates a linking rule set based on a rule set XML file.
- updatelinkruleset. Updates a linking rule set based on a modified rule set XML file. If the rule set does not exist, the command creates the rule set.
- deletelinkruleset. Deletes a linking rule set.
- exportlinkruleset. Exports all linking rule sets for a resource to XML files. You can import the rule sets into another Metadata Manager repository.
- importlinkruleset. Imports all linking rule sets from XML files in the specified path into the Metadata Manager repository.

PowerExchange Adapters
This section describes new features and enhancements to PowerExchange adapters in version 9.5.

Adapters for PowerCenter


PowerExchange for Greenplum
You can parameterize the following PowerExchange for Greenplum session properties:
- Greenplum Target Table
- Match Columns
- Update Columns
- Update Condition
- Delimiter
- Escape Character
- Null As
- Quote
- Error Table
- Greenplum Pre SQL
- Greenplum Post SQL

PowerExchange for Microsoft Dynamics CRM


- You can use PowerExchange for Microsoft Dynamics CRM for online deployment with passport authentication.
- You can use PowerExchange for Microsoft Dynamics CRM for Internet-facing deployment with claims-based authentication.
- You can read and write the PartyList datatype from Microsoft Dynamics CRM. Intersect entities are writable.

PowerExchange for Salesforce


You can use an encrypted HTTP proxy account password for PowerExchange for Salesforce. Use the PowerCenter command line program, pmpasswd, to encrypt the password.

PowerExchange for SAP NetWeaver

PowerExchange for SAP NetWeaver uses SAP RFC SDK 7.2 libraries.

PowerExchange for Teradata Parallel Transporter API


You can use the serialize mechanism for columns with the Stream system operator.

PowerExchange for Ultra Messaging


- You can connect to Informatica Ultra Messaging Queuing sources and targets to process messages.
- You can read and write Ultra Messaging JMS messages.
- You can configure pass-through partitioning for an Ultra Messaging source and target.
- You can run PowerExchange for Ultra Messaging workflows concurrently.
- You can configure PowerExchange for Ultra Messaging for high availability.
- You can configure PowerExchange for Ultra Messaging to use any Ultra Messaging transport protocol.

Adapters for Informatica


PowerExchange for Facebook
- You can extract social media data such as friends and posts from Facebook.
- You can use Open Authentication to connect to Facebook.
- You can use Informatica Developer to create a Facebook data object, specify resources, and create a data object operation. You can use the data object operation as a source in mappings.
- You can use Facebook search operators in a query parameter to search for data.

PowerExchange for LinkedIn


- You can extract social media data such as connections and profiles from LinkedIn.
- You can use Open Authentication to connect to LinkedIn.
- You can use Informatica Developer to create a LinkedIn data object, specify resources, and create a data object operation. You can use the data object operation as a source in mappings.
- You can use LinkedIn search operators in a query parameter to search for data.


Chapter 1: New Features and Enhancements (9.5.0)

PowerExchange for SAP NetWeaver


PowerExchange for SAP NetWeaver uses SAP RFC SDK 7.2 libraries.

PowerExchange for Twitter


You can extract social media data such as tweets and profiles from Twitter. You can extract Twitter data that is six to nine days old or extract data in real time. You can use Open Authentication to connect to Twitter. You can use Informatica Developer to create a Twitter data object, specify resources, and create a data object operation. You can use the data object operation as a source in mappings.
You can use Twitter search operators in a query parameter to search for data.

Documentation
This section describes new features and enhancements to the documentation.

Documentation DVD
The Informatica Documentation DVD contains product manuals in PDF format. Effective in 9.5.0, the documentation DVD uses a browser-based user interface. Supported browsers are Internet Explorer 7.0 or later and Mozilla Firefox 9.0 or later. Ensure that JavaScript support is enabled and the Adobe Acrobat Reader plug-in is installed in your browser.

Data Quality User Guide


The Informatica Data Quality User Guide contains information about profiles, reference data, rules, and scorecards. It includes data quality information from the Informatica Data Explorer User Guide, Informatica Data Quality Analyst User Guide, Informatica Developer User Guide, and Informatica Developer Transformation Guide.

Data Processor Transformation Guide


The Informatica Data Processor Transformation Guide contains information that can help you design scripts and XMaps in the Data Processor transformation in the Developer tool and implement them in Data Integration Services. It consolidates information from the Data Transformation Studio Editing Guide, Data Transformation Studio User Guide, and Data Transformation Engine Developer Guide.

Data Services Performance Tuning Guide


The Informatica Data Services Performance Tuning Guide contains information that can help you identify and eliminate bottlenecks and tune the Administrator, Developer, and Analyst tools to improve data services performance.

Data Services User Guide


The Informatica Data Services User Guide contains information about data services, virtual data, queries, and data services configuration. It consolidates information from the Informatica JDBC/ODBC Connection Guide, Informatica SQL Reference, and the Informatica Developer User Guide.

Developer Workflow Guide


The Informatica Developer Workflow Guide describes how to create and configure workflows in the Developer tool.


CHAPTER 2

Informatica Data Explorer (9.5.0)


This chapter includes the following topics:
- Edit Profile Action Menus
- Foreign Key Discovery
- Projects
- Scorecards

Edit Profile Action Menus


Effective in version 9.5.0, the actions menus to edit a profile in Informatica Analyst have changed. The following tasks have changed action menus:
- Change the basic properties such as name, description, and profile type: changed from Actions > Edit Properties to Actions > General.
- Choose another matching data source for the profile: changed from Actions > Edit Properties to Actions > Data Source.
- Select the columns you want to run the profile on and configure the sampling and drill-down options: changed from Actions > Run Profile to Actions > Column Profiling.
- Create, edit, and delete filters: changed from Actions > Manage Filters to Actions > Column Profiling Filter.
- Create rules or change current ones: changed from Actions > Add Rule to Actions > Column Profiling Rules.

Foreign Key Discovery


Effective in version 9.5.0, the option to limit the total number of inferred foreign keys has changed. This option now determines the number of foreign keys identified between a child data object and a parent data object. The default value is 2.


The foreign key limiting option changed from Max foreign keys returned to Max foreign keys between data objects.

Previously, the option determined the total number of foreign keys the Developer tool returned in the profile results and the default value was 500.

Projects
Effective in version 9.5.0, Model repository projects include the following changes:
- You can share project contents by assigning permissions to users and groups. You can assign read, write, and grant permissions when you create or edit a project. Previously, to share project contents, you created a shared project. When you upgrade a shared project to version 9.5.0, all domain users inherit read permission on the project.
- The Developer tool hides projects that you do not have read permission on. Previously, the Developer tool displayed all projects regardless of project permissions.

Scorecards
Effective in version 9.5.0, you must migrate scorecard results from version 9.1.0 before you can use existing scorecards. To view the results of existing scorecards, run the infacmd ps migrateScorecards command.
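As a sketch, the migration is a one-time infacmd call. The connection options depend on your environment, so the placeholder below is an assumption; see the Informatica Command Reference for the exact option list:

```
# Sketch only: migrate 9.1.0 scorecard results so that existing scorecards
# display results in 9.5.0. Replace the placeholder with your domain, user,
# and Profiling Service options as documented in the Command Reference.
infacmd ps migrateScorecards <domain, user, and service options>
```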


CHAPTER 3

Informatica Data Quality (9.5.0)


This chapter includes the following topics:
- Address Validator Transformation
- Edit Profile Action Menus
- Export to PowerCenter
- File Directory Fields for Mapping Objects
- Exception Transformation
- Mapping and Mapplet Editors
- Match Transformation
- Projects
- Reference Tables
- Scorecards

Address Validator Transformation


Effective in version 9.5.0, the Address Validator transformation uses version 5.2.9 of the Address Doctor software engine. Address Doctor 5.2.9 allows you to configure the Address Validator transformation to return the formal street name or the street alias in cases where a street has both a formal name and an alias. Previously, the transformation returned the formal name only. Effective in 9.5.0, the Address Validator transformation refreshes the list of available input and output ports when you open the transformation. Previously, Informatica defined the port list in the product code.

Edit Profile Action Menus


Effective in version 9.5.0, the actions menus to edit a profile in Informatica Analyst have changed. The following tasks have changed action menus:
- Change the basic properties such as name, description, and profile type: changed from Actions > Edit Properties to Actions > General.
- Choose another matching data source for the profile: changed from Actions > Edit Properties to Actions > Data Source.
- Select the columns you want to run the profile on and configure the sampling and drill-down options: changed from Actions > Run Profile to Actions > Column Profiling.
- Create, edit, and delete filters: changed from Actions > Manage Filters to Actions > Column Profiling Filter.
- Create rules or change current ones: changed from Actions > Add Rule to Actions > Column Profiling Rules.

Export to PowerCenter
Effective in version 9.5.0, the process to export Model repository objects to the PowerCenter repository writes log message files to the machine that performs the export operation. Previously, the export process did not write log files and displayed log messages for Developer tool export operations only. If you export to a PowerCenter repository from a Developer tool machine, the export process writes log files to the following location:
[9.5.0_Install_Directory]\clients\DeveloperClient\infacmd\exporttopc_cli_logs

If you export to a PowerCenter repository from an Informatica services machine, the export process writes log files to the following location:
[9.5.0_Install_Directory]\tomcat\logs\exporttopc_cli_logs

You must have write access to the log directory. If you do not have write access, Informatica 9.5.0 displays a warning message stating that no logs are stored for the export.

File Directory Fields for Mapping Objects


Effective in version 9.5.0, system mapping parameters are the default values for flat file directory, cache file directory, and temporary file directory fields in mapping objects. System parameters are constant values that define the directories where the Data Integration Service stores cache files, reject files, source files, target files, and temporary files. You define the values of system parameters on a Data Integration Service process in the Administrator tool.


The following table lists the default values for these directories:
Object                       Directory Field          Default Value
Flat file data object        Source file directory    SourceDir
Flat file data object        Output file directory    TargetDir
Flat file data object        Reject file directory    RejectDir
Aggregator transformation    Cache directory          CacheDir
Joiner transformation        Cache directory          CacheDir
Lookup transformation        Cache directory          CacheDir
Rank transformation          Cache directory          CacheDir
Sorter transformation        Work directory           TempDir

Previously, the default value for all of these directories was ".", which stood for the following directory:
<Informatica Services Installation Directory>\tomcat\bin

When you upgrade, the upgrade process does not change the value of these directory fields. If you used the previous default value of ".", the upgrade process retains that value.

Exception Transformation
Effective in version 9.5.0, you can connect the bad record and duplicate record output from an Exception transformation to a data object in a mapping. You use the transformation to create the bad record and duplicate record tables and the data object. Previously, you used the Exception transformation to write data to bad record or duplicate record tables that were not represented as repository objects.
If you upgrade to Data Quality 9.5.0 and the Model repository contains an Exception transformation, complete the following steps to use the transformation in Data Quality 9.5.0:
1. Create a data object from the database table that contains the bad records or duplicate records.
2. Add the data object to the mapping canvas.
3. Connect the bad data or duplicate data output ports to the data object.

When you run a mapping with an Exception transformation in Data Quality 9.5.0, you can use Informatica Analyst or Informatica Data Director for Data Quality to review and edit the table records.

Mapping and Mapplet Editors


Effective in version 9.5.0, when you click Layout > Arrange All in any mapping or mapplet editor, the Developer tool aligns each object in the editor, but retains the object size. Previously, the Arrange All option resized each object to a standard size.

Match Transformation
Effective in version 9.5.0, the Match transformation refreshes the list of identity population files that are installed on the Informatica services machine each time you open a strategy in the transformation. Previously, the Match transformation read the list of identity population files when you started the Developer tool.

Projects
Effective in version 9.5.0, Model repository projects include the following changes:
- You can share project contents by assigning permissions to users and groups. You can assign read, write, and grant permissions when you create or edit a project. Previously, to share project contents, you created a shared project. When you upgrade a shared project to version 9.5.0, all domain users inherit read permission on the project.
- The Developer tool hides projects that you do not have read permission on. Previously, the Developer tool displayed all projects regardless of project permissions.

Reference Tables
Effective in 9.5.0, you can use the Developer tool and Analyst tool to create, edit, and delete reference tables in the Model repository. Previously, you used the Analyst tool to perform reference table operations.

Scorecards
Effective in version 9.5.0, you must migrate scorecard results from version 9.1.0 before you can use existing scorecards. To view the results of existing scorecards, run the infacmd ps migrateScorecards command.


CHAPTER 4

Informatica Data Services (9.5.0)


This chapter includes the following topics:
- Edit Profile Action Menus
- Export to PowerCenter
- File Directory Fields for Mapping Objects
- Flat File Data Object
- Mapping and Mapplet Editors
- Projects
- Scorecards
- Web Service Consumer Transformation
- Web Services

Edit Profile Action Menus


Effective in version 9.5.0, the actions menus to edit a profile in Informatica Analyst have changed. The following tasks have changed action menus:
- Change the basic properties such as name, description, and profile type: changed from Actions > Edit Properties to Actions > General.
- Choose another matching data source for the profile: changed from Actions > Edit Properties to Actions > Data Source.
- Select the columns you want to run the profile on and configure the sampling and drill-down options: changed from Actions > Run Profile to Actions > Column Profiling.
- Create, edit, and delete filters: changed from Actions > Manage Filters to Actions > Column Profiling Filter.
- Create rules or change current ones: changed from Actions > Add Rule to Actions > Column Profiling Rules.


Export to PowerCenter
Effective in version 9.5.0, the process to export Model repository objects to the PowerCenter repository writes log message files to the machine that performs the export operation. Previously, the export process did not write log files and displayed log messages for Developer tool export operations only. If you export to a PowerCenter repository from a Developer tool machine, the export process writes log files to the following location:
[9.5.0_Install_Directory]\clients\DeveloperClient\infacmd\exporttopc_cli_logs

If you export to a PowerCenter repository from an Informatica services machine, the export process writes log files to the following location:
[9.5.0_Install_Directory]\tomcat\logs\exporttopc_cli_logs

You must have write access to the log directory. If you do not have write access, Informatica 9.5.0 displays a warning message stating that no logs are stored for the export.

File Directory Fields for Mapping Objects


Effective in version 9.5.0, system mapping parameters are the default values for flat file directory, cache file directory, and temporary file directory fields in mapping objects. System parameters are constant values that define the directories where the Data Integration Service stores cache files, reject files, source files, target files, and temporary files. You define the values of system parameters on a Data Integration Service process in the Administrator tool. The following table lists the default values for these directories:
Object                       Directory Field          Default Value
Flat file data object        Source file directory    SourceDir
Flat file data object        Output file directory    TargetDir
Flat file data object        Reject file directory    RejectDir
Aggregator transformation    Cache directory          CacheDir
Joiner transformation        Cache directory          CacheDir
Lookup transformation        Cache directory          CacheDir
Rank transformation          Cache directory          CacheDir
Sorter transformation        Work directory           TempDir

Previously, the default value for all of these directories was ".", which stood for the following directory:
<Informatica Services Installation Directory>\tomcat\bin

When you upgrade, the upgrade process does not change the value of these directory fields. If you used the previous default value of ".", the upgrade process retains that value.


Flat File Data Object


Effective in version 9.5.0, you can use the flat file data object to read a directory of files that have the same file properties. Previously, you could not read a directory of files.

Mapping and Mapplet Editors


Effective in version 9.5.0, when you click Layout > Arrange All in any mapping or mapplet editor, the Developer tool aligns each object in the editor, but retains the object size. Previously, the Arrange All option resized each object to a standard size.

Projects
Effective in version 9.5.0, Model repository projects include the following changes:
- You can share project contents by assigning permissions to users and groups. You can assign read, write, and grant permissions when you create or edit a project. Previously, to share project contents, you created a shared project. When you upgrade a shared project to version 9.5.0, all domain users inherit read permission on the project.
- The Developer tool hides projects that you do not have read permission on. Previously, the Developer tool displayed all projects regardless of project permissions.

Scorecards
Effective in version 9.5.0, you must migrate scorecard results from version 9.1.0 before you can use existing scorecards. To view the results of existing scorecards, run the infacmd ps migrateScorecards command.

Web Service Consumer Transformation


Effective in version 9.5.0, you can create a Web Service Consumer transformation for a SOAP 1.2 binding from a single WSDL object. For a SOAP 1.2 binding, the Data Integration Service returns the fault message, code, reason, node, and role elements for the fault. Previously, you could create a Web Service Consumer transformation only for a SOAP 1.1 binding, and the Data Integration Service returned the fault elements defined for a SOAP 1.1 binding.


Web Services
This section describes changes to web services.

Fault Transformation
Effective in version 9.5.0, you can configure a Fault transformation to return a generic error message when the error is not defined by a fault element in the WSDL. When you create a Fault transformation for a generic fault in a web service, you must define the operation mapping logic that returns the error condition. Previously, you could create a Fault transformation only to return a predefined fault from a fault element. The web service used a fault element to define the fault. You could configure a Fault transformation to return a custom error message.

Fault Terminology
Effective in version 9.5.0, the fault handling terminology changed. Faults can be of the following types:
- System defined
- User-defined
  - Predefined
  - Generic

The terminology changed as follows:
- System defined fault: previously called a generic fault.
- User-defined fault: unchanged term.
- Predefined fault and generic fault: new terms with no equivalent previous term.

Manual Web Services


Effective in version 9.5.0, the Developer tool contains the following behavior changes for web services that you create manually:
- When you change an element type from complex to simple, the Developer tool clears the location of the associated port. Previously, the Developer tool deleted the associated port.
- When you change an element type from simple to complex, the Developer tool marks the port as not valid. Previously, the Developer tool cleared the location of the associated port.

SOAP 1.2
Effective in version 9.5.0, the following changes are implemented for SOAP 1.2:
- Each web service can have one or more operations that use either a SOAP 1.1 binding or a SOAP 1.2 binding, but not both.
- The SOAP request can be in SOAP 1.1 or SOAP 1.2 format. The SOAP request is based on the type of binding that is used by the binding operation associated with the operation mapping.
- When you create a fault in an operation that has a SOAP 1.2 binding, the wizard creates the code, reason, node, and role elements.

Previously, you could only create an operation with a SOAP 1.1 binding and create a fault in an operation with a SOAP 1.1 binding.


CHAPTER 5

Informatica Data Transformation (9.5.0)


This chapter includes the following topics:
- Data Transformation Platform
- Deprecated Script Components
- IntelliScript Editor
- Model Repository
- Obsolete Script Components and Options
- Script Objects
- Transformation
- Views
- XML Validation

Data Transformation Platform


Effective in version 9.5.0, Data Transformation moved to the Informatica platform. You can create a Data Processor transformation that includes XMap objects and script objects. Previously, you could use Data Transformation Studio to create scripts. Data Transformation Studio is still available to edit Data Transformation library projects and validation rule files.


Deprecated Script Components


Effective in version 9.5.0, some IntelliScript components are deprecated. The IntelliScript editor displays deprecated components in scripts that were created in previous Data Transformation versions, but you can no longer add them to scripts. The following list describes the deprecated components and suggested replacements:
- ExternalJavaPreProcessor document processor: develop a custom Java component.
- ExternalPreProcessor document processor: develop a custom C++ component.
- ExternalTransformer transformer: develop a custom C++ component.
- JavaTransformer transformer: develop a custom Java component.
- EDIFACTValidation transformer: use other validation components.

IntelliScript Editor
Effective in version 9.5.0, the IntelliScript editor opens in the Developer tool. The IntelliScript editor uses new icons and a new font. Script components now appear in black letters. Previously, the IntelliScript editor opened in Data Transformation Studio. Script components appeared in brown letters. The IntelliScript editor also had a separate panel to display the example source document for the main input.

Model Repository
Effective in version 9.5.0, you store schemas, example sources, and other project files in the Model repository. You can also import a Data Transformation project or service into the Model repository. Previously, you stored schemas, example sources, and project files in a workspace folder in the file system. You could import projects or services into the workspace folder, or you could copy the files manually to the workspace folder.


Obsolete Script Components and Options


Effective in version 9.5.0, some script components that were deprecated in version 8.6.1 are obsolete. The IntelliScript editor does not load scripts that contain any of these components or options. The following list describes the obsolete components and options and their replacements:
- DownloadFile action: use a custom component.
- EDIValidation validator: use other validation components.
- ExcelToHtml document processor: use the ExcelToXml_03_07_10 document processor.
- ExcelToTextML document processor: use the ExcelToXml_03_07_10 document processor.
- ExcelToTxt document processor: use the ExcelToXml_03_07_10 document processor.
- HtmlForm anchor: use a custom component.
- IBANValidation validator: use other validation components.
- JavaScriptFunction action: use other components that define complex behavior.
- MSMQOutput option of the WriteValue action: use a custom component.
- MSMQOutput option of the WriteSegment action: use a custom component.
- PowerpointToHtml document processor: use a custom component.
- SubmitForm action: use a custom component.
- SubmitFormGet action: use a custom component.
- WordToRTF document processor: use the WordToXml document processor.
- WordToTxt document processor: use the WordToXml document processor.
- WordToHTML document processor: use the WordToXml document processor.
- WordToTextML document processor: use the WordToXml document processor.
- WordperfectToTextML document processor: use a custom component.

Script Objects
Effective in version 9.5.0, you create scripts in the Developer tool. Previously, you used Data Transformation Studio to create TGP file scripts.


Transformation
Effective in version 9.5.0, transformations include the following changes:
- Data Transformation moved to the Informatica platform. You can create a Data Processor transformation in the Developer tool. Create scripts and XMap objects in the transformation instead of in Data Transformation Studio.
- Set the startup component of a Data Processor transformation in the Overview tab. If the startup component is a component of a script, you can set it in the IntelliScript editor. Previously, you set the startup component of a TGP file in the IntelliScript editor.
- Add a schema object to a project in the Model repository, and then reference the schema in the Data Processor transformation. Previously, you added a schema to the XSD node in the Data Transformation Explorer view of Data Transformation Studio.
- The Output panel of the Data Viewer view displays the main output of a Data Processor transformation or the output of an additional output port. Previously, you viewed the main output and the output for additional output ports in a separate view in the editor area.
- View an example source in the Input panel of the Data Viewer view. You can view example source data for the main input and for additional input ports. Previously, you viewed the example sources for the main input in the Input panel of the IntelliScript editor, and you viewed the example source for additional input ports in a separate view.
- Configure document encodings and other settings in the Data Processor transformation Settings view. Previously, you configured document encodings and other settings in the Studio project properties.
- You can no longer use the VarPostData, VarFormAction, and VarFormData system variables. The IntelliScript editor continues to display them in scripts that were created in previous Data Transformation versions.

Views
Effective in 9.5.0, views in the Developer tool replaced views in Data Transformation Studio. The views changed as follows:
- Data Processor Events (replaces the Studio Events view): shows information about events that occur when you run the transformation, including initialization, execution, and summary events.
- Data Processor Script Help (replaces the Studio Help view): displays context-sensitive help for components and properties selected in the IntelliScript editor.
- Data Processor Hex (replaces the Studio Binary Source view): displays the example source documents in hexadecimal format.
- Data Viewer (replaces the IntelliScript editor example panel and other components): view example input data, run the transformation, and view output results.
- Help (no Studio equivalent): displays context-sensitive help for tabs selected in the Data Processor transformation.
- Objects (no Studio equivalent): add, modify, or delete script and XMap objects from the transformation. Configure ports and define the startup component.
- Overview (no Studio equivalent): add or remove schemas from the transformation.
- References (no Studio equivalent): configure transformation settings for encoding, output control, and XML generation.
- Settings (replaces the Studio Project Properties dialog box): configure transformation settings.
- Validation Log (replaces the Studio Problems view): displays details of syntax errors in the Data Transformation project or Data Processor transformation.

The following Studio views have no Developer tool equivalent:
- Data Transformation Explorer: displayed all project files in a hierarchical tree.
- Schema: displayed all schemas available in the project, together with system variables and user-defined variables.
- Repository: displayed information about services in the ServiceDB folder on the local computer.
- Component: displayed all the components of a TGP file in a hierarchical tree.
- IntelliScript Assistant: displayed additional information about the value of the component or property selected in the IntelliScript editor.

XML Validation
Effective in version 9.5.0, the lexical space of the simple type gMonth is --MM, in accordance with W3C erratum E2-12. Previously, the lexical space of the simple type gMonth was --MM--, in accordance with the original W3C XML Schema recommendation.
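As an illustration of the change, a minimal check of the two lexical forms (a simplified sketch: real schema validators also enforce timezone ranges and other constraints that this regex does not):

```python
import re

# Post-erratum E2-12 lexical space of xs:gMonth: "--MM" plus an optional timezone.
# The pre-erratum form "--MM--" no longer matches.
GMONTH_E2_12 = re.compile(r"^--(0[1-9]|1[0-2])(Z|[+-]\d{2}:\d{2})?$")

def is_valid_gmonth(value: str) -> bool:
    """Return True if value matches the corrected xs:gMonth lexical form."""
    return bool(GMONTH_E2_12.match(value))

print(is_valid_gmonth("--06"))    # True: corrected --MM form
print(is_valid_gmonth("--06--"))  # False: pre-erratum --MM-- form
```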


CHAPTER 6

Informatica Domain (9.5.0)


This chapter includes the following topics:
- Connection Management
- Content Management Service
- Data Integration Service
- Pass-through Security
- Web Services

Connection Management
Effective in version 9.5.0, the Data Integration Service identifies each connection by the connection ID. Therefore, you can rename a connection. Previously, the Data Integration Service identified each connection by the connection name. You could not rename a connection. If you upgrade to version 9.5.0, the upgrade process sets the connection ID for each connection to the connection name.

Content Management Service


Effective in version 9.5.0, the Content Management Service specifies the database connection name for the database that stores reference table data values. Previously, the Analyst Service specified the database connection name.

Data Integration Service


Effective in version 9.5.0, when the Data Integration Service restarts, the state of each application associated with the Data Integration Service is restored. Previously, when the Data Integration Service restarted, each application associated with the Data Integration Service was restarted as well.


Pass-through Security
Effective in version 9.5.0, you configure pass-through security in the connection properties of a domain. Previously, you configured pass-through security in the Data Integration Service.

Web Services
Effective in version 9.5.0, you can set the keepalive interval for web services through the Administrator tool. You can also set the keepalive interval through the following infacmd command options:
- infacmd dis UpdateServiceOptions command, WSServiceOptions.DTMKeepAliveTime option
- infacmd ws UpdateWebServiceOptions command, WebServiceOptions.DTMKeepAliveTime option

Previously, you set the keepalive interval for all web services through the infacmd dis UpdateServiceOptions command, WebServiceOptions.DTMKeepAlive option. If you created scripts that use this command option, you must update the scripts.
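As a sketch, assuming infacmd's usual -o option=value syntax for setting service options (verify the exact syntax and the omitted connection options against the Informatica Command Reference):

```
# Set the keepalive interval on a Data Integration Service:
infacmd dis UpdateServiceOptions ... -o WSServiceOptions.DTMKeepAliveTime=<value>

# Set the keepalive interval for an individual web service:
infacmd ws UpdateWebServiceOptions ... -o WebServiceOptions.DTMKeepAliveTime=<value>
```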


CHAPTER 7

PowerCenter (9.5.0)
This chapter includes the following topics:
- Pushdown Optimization
- Exporting Metadata to Excel

Pushdown Optimization
Effective in version 9.5.0, you can disable the creation of temporary views for pushdown optimization to Teradata when the Source Qualifier transformation contains a source filter, user-defined joins, or an SQL override. Previously, pushdown optimization to a Teradata database created and dropped views when the Source Qualifier transformation contained a source filter, user-defined joins, or an SQL override.

Exporting Metadata to Excel


Effective in version 9.5.0, the PowerCenter Repository Service does not export the Domains, Enumerations, Joins, Lookups, Filter, and Rules worksheets when you export metadata to Excel. The export of metadata to Excel is a reporting activity that represents a summary of the data lineage. The PowerCenter Repository Service exports only the Models, Packages, and Mappings worksheets. Previously, the PowerCenter Repository Service also exported the worksheets that do not have summary lineage. The exported worksheets were Models, Packages, Domains, Enumerations, Mappings, Joins, Lookups, Filter, and Rules.


CHAPTER 8

Metadata Manager (9.5.0)


This chapter includes the following topics:
- Data Modeling and Business Intelligence Resources
- Incremental Metadata Load
- mmcmd Command Line Program
- Resource Types
- Metadata Manager Service

Data Modeling and Business Intelligence Resources


Effective in version 9.5.0, properties of the data modeling and business intelligence resources include all the properties available in the Metadata Manager Agent. Previously, these resources included custom properties that were not available in the Metadata Manager Agent.

Incremental Metadata Load


Effective in version 9.5.0, when you configure any business intelligence resource except a Microsoft Analysis and Reporting Services resource, you can choose to incrementally load recent changes to the metadata instead of loading complete metadata. After the first successful load, the Metadata Manager Agent incrementally loads metadata that changed after the last successful load. By default, the incremental extraction option is enabled for all these resources. If you do not want to incrementally load a resource, you can edit the resource and disable the option. Previously, you could incrementally load only Business Objects and Cognos resources.

mmcmd Command Line Program


Effective in version 9.5.0, specify two consecutive hyphens (--) before a long name option in the mmcmd commands. Previously, you specified a hyphen (-) before a long name option in the commands.

39

For example, if the long name option in a command is user, specify --user <user name> instead of -user <user name>.
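If you have saved scripts that call mmcmd with single-hyphen long options, they need a one-time rewrite. The following sketch is not Informatica tooling; it is a hedged example of such a migration, and the option names in LONG_OPTIONS are placeholders for whichever long options your scripts actually pass.

```python
import re

# Long-name options that older scripts may pass with a single hyphen.
# These names are illustrative; list the long options your scripts use.
LONG_OPTIONS = {"user", "password", "url", "resource"}

def modernize_mmcmd_line(line: str) -> str:
    """Rewrite -user style long options to --user for mmcmd 9.5.0."""
    def fix(match: re.Match) -> str:
        name = match.group(1)
        return "--" + name if name in LONG_OPTIONS else match.group(0)
    # Match a single hyphen followed by a word, but skip existing --options.
    return re.sub(r"(?<!-)-(\w+)", fix, line)

old = "mmcmd getresourcefiles -user Administrator -url http://host:10250/mm"
print(modernize_mmcmd_line(old))
```

Short options and long options not in the list are left untouched, so the rewrite is safe to run repeatedly over the same script.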

Resource Types
Some of the resource types in Metadata Manager 9.1.0 are deprecated in Metadata Manager 9.5.0. The following resource types are deprecated in Metadata Manager 9.5.0:
- Business Objects
- Cognos ReportNet
- Microsoft Analysis and Reporting Services
- MicroStrategy
- Oracle Business Intelligence Enterprise Edition
- Erwin
- ERStudio
- Oracle Designer
- Power Designer
- RationalER
- Generic JDBC Xconnect

When you upgrade to Metadata Manager 9.5.0, Metadata Manager appends (Deprecated_9.5.0) to the names of the Metadata Manager 9.1.0 resource types. You can view resources of the deprecated resource types, but you cannot create or edit them. You can also view the existing data lineage for objects of the deprecated resource types. You must create and load resources with the corresponding new resource types in 9.5.0.

Metadata Manager Service


After you upgrade, complete the post-upgrade tasks for each Metadata Manager Service. Enable the Metadata Manager Service when you perform the post-upgrade tasks. You must upgrade to Metadata Manager 9.1.0 before you upgrade to Metadata Manager 9.5.0.

Convert Metadata Manager Resources


The business intelligence, data modeling, and JDBC resources have a new model and representation in Metadata Manager 9.5.0. During the upgrade, Metadata Manager marks the existing business intelligence, data modeling, and JDBC resources as deprecated. You cannot edit or reload the deprecated resources. Instead, for each deprecated resource, you must create, configure, and load an equivalent new resource. You lose all personalization in the deprecated resources and must redo the personalization in the new resources. After you convert a deprecated resource, you can delete it.


Reload Metadata Manager Resources


Before you can use Metadata Manager after the upgrade, you must reload all the resources. To ensure that the connection assignments remain correct, convert and load the resources in the following order:
1. Convert each deprecated JDBC resource to an equivalent new JDBC resource.
2. Reload all database management resources.
3. Convert each deprecated business intelligence or data modeling resource to an equivalent new resource.
4. Reload each of the business intelligence, data modeling, PowerCenter, ERP, and custom resources.
5. Recreate any personalization.
6. Delete the deprecated resources.

For example, suppose you have resources of the following types:
- Database management - JDBC, Oracle
- Data integration - PowerCenter
- Data modeling - ER/win
- Business intelligence - Cognos

After you upgrade to Metadata Manager 9.5.0, the JDBC, ER/win, and Cognos resources are marked as deprecated. Perform resource conversion and load the resources in the following order:
1. Convert the JDBC resource.
2. Load JDBC and Oracle resources.
3. Convert ER/win and Cognos resources.
4. Load PowerCenter, ER/win, and Cognos resources.
5. Recreate any personalization.
6. Delete the deprecated resources.
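The ordering constraint above can be modeled as a fixed sequence of phases. The following sketch is purely illustrative, not an Informatica utility: resource names and category labels are hypothetical, and the phase numbers mirror the documented steps.

```python
# Hypothetical model of the documented conversion/load order.
PHASES = {
    ("jdbc", "convert"): 1,
    ("database", "load"): 2,           # database management, incl. converted JDBC
    ("bi_or_modeling", "convert"): 3,
    ("other", "load"): 4,              # BI, data modeling, PowerCenter, ERP, custom
}

def plan(tasks):
    """Order (name, category, action) tuples by the documented phases."""
    return sorted(tasks, key=lambda t: PHASES[(t[1], t[2])])

tasks = [
    ("Cognos_res", "bi_or_modeling", "convert"),
    ("JDBC_res", "jdbc", "convert"),
    ("PowerCenter_res", "other", "load"),
    ("Oracle_res", "database", "load"),
]
print([t[0] for t in plan(tasks)])
```

Running the sketch prints the resources in the same sequence the procedure prescribes: JDBC conversion first, then database loads, then BI/modeling conversion, then the remaining loads.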

Update the Metadata Manager Properties File


Compare the imm.properties file in the previous installation directory with the 9.5.0 version. Update the 9.5.0 version of the imm.properties file as required. The 9.5.0 version of the imm.properties file is in the following directory:
<9.5.0 InformaticaInstallationDir>\tomcat\shared\class

The changes take effect when you enable the Metadata Manager Service.
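Comparing the two property files by hand is error-prone. A minimal sketch of an automated comparison follows; it assumes simple key=value lines and treats file paths as placeholders you supply.

```python
# A minimal sketch: compare two properties files and report keys whose
# values differ, so you know which settings to carry forward.
def read_properties(path):
    props = {}
    with open(path, encoding="utf-8") as f:
        for raw in f:
            line = raw.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and comments
            key, _, value = line.partition("=")
            props[key.strip()] = value.strip()
    return props

def changed_keys(old_path, new_path):
    old, new = read_properties(old_path), read_properties(new_path)
    return sorted(k for k in old if k in new and old[k] != new[k])
```

For example, changed_keys("old/imm.properties", "new/imm.properties") returns the keys you customized in the previous release that the 9.5.0 file resets.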


CHAPTER 9

Adapters for PowerCenter (9.5.0)


This chapter includes the following topics:
- PowerCenter Dual Load Option for Teradata
- PowerExchange for HP Neoview Transporter
- PowerExchange for JD Edwards EnterpriseOne (JD Edwards OneWorld)
- PowerExchange for Microsoft Dynamics CRM
- PowerExchange for Salesforce
- PowerExchange for Teradata Parallel Transporter API
- PowerExchange for Ultra Messaging

PowerCenter Dual Load Option for Teradata


Effective in version 9.5.0, Informatica dropped support for PowerCenter Dual Load Option for Teradata. If you upgrade to version 9.5.0, sessions that use this option fail.

PowerExchange for HP Neoview Transporter


Effective in version 9.5.0, Informatica dropped support for PowerExchange for HP Neoview Transporter because Hewlett-Packard has discontinued selling Neoview. Informatica continues to support PowerExchange for HP Neoview in previous releases. You can upgrade from version 9.1.0 to a 9.1.0 hotfix version. However, if you upgrade to version 9.5.0, the sessions fail.

PowerExchange for JD Edwards EnterpriseOne (JD Edwards OneWorld)


Effective in version 9.5.0, PowerExchange for JD Edwards OneWorld is renamed PowerExchange for JD Edwards EnterpriseOne.


PowerExchange for Microsoft Dynamics CRM


Effective in version 9.5.0, PowerExchange for Microsoft Dynamics CRM includes the following changes:
- You can use PowerExchange for Microsoft Dynamics CRM for on-premise deployment with Active Directory and claims-based authentication. Previously, you could use PowerExchange for Microsoft Dynamics CRM for on-premise deployment with Active Directory authentication only.
- Intersect entities are readable and writable. Previously, intersect entities were only readable.
- Endorsed jars are placed in the following location: <PowerCenter Installation Directory>\server\bin\javalib\endorsed. Previously, endorsed jars were placed in the following location: <PowerCenter Installation Directory>\clients\java\jre\lib\Endorsed

PowerExchange for Salesforce


Effective in version 9.5.0, PowerExchange for Salesforce uses version 24.0 of the Salesforce API. Previously, application connections pointed to earlier versions of the Salesforce service. To connect to the new version of a Salesforce object, change the service URL in existing application connections from the previous version to the new version. The following table shows the service URLs for application connections created in previous versions of PowerExchange for Salesforce:

PowerExchange for Salesforce Version | Salesforce Service URL
9.1.0 HotFix 3, 9.1.0 HotFix 4 | https://www.salesforce.com/services/Soap/u/23.0
9.1.0 HotFix 2 | https://www.salesforce.com/services/Soap/u/21.0
9.1.0, 9.1.0 HotFix 1 | https://www.salesforce.com/services/Soap/u/20.0
9.0.1 HotFix 1, 9.0.1 HotFix 2, and 8.6.1.0.4 | https://www.salesforce.com/services/Soap/u/19.0
9.0.1 | https://www.salesforce.com/services/Soap/u/18.0
9.0 | https://www.salesforce.com/services/Soap/u/17.0
8.6.1 and 8.6.1.0.3 | https://www.salesforce.com/services/Soap/u/14.0
8.6.0.1 | https://www.salesforce.com/services/Soap/u/12.0
8.5, 8.5.1, and 8.6 | https://www.salesforce.com/services/Soap/u/8.0

The 24.0 version of the Salesforce service URL is:
https://www.salesforce.com/services/Soap/u/24.0

If the new version of a Salesforce object has a different structure than the previous version of the object, re-import the Salesforce object. After you re-import the object, analyze the associated mapping to determine if you need to update transformations in the mapping. For example, if you re-import a source definition that is based on a Salesforce object that contains a new field, you can modify your mapping to extract the new field and write the data to the target.
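If you maintain many application connections, the URL change is mechanical: only the trailing version segment of the service URL changes. The following sketch is not Informatica tooling; it only illustrates the string transformation.

```python
import re

# Illustrative only: bump the API version segment in a Salesforce SOAP
# service URL of the form .../services/Soap/u/<version>.
def upgrade_service_url(url: str, new_version: str = "24.0") -> str:
    """Replace the trailing version number with the new API version."""
    return re.sub(r"(/services/Soap/u/)[\d.]+$", r"\g<1>" + new_version, url)

old = "https://www.salesforce.com/services/Soap/u/19.0"
print(upgrade_service_url(old))
# → https://www.salesforce.com/services/Soap/u/24.0
```

Any URL that does not end in the expected pattern is returned unchanged, so the function is safe to apply to a mixed list of connection URLs.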

PowerExchange for Teradata Parallel Transporter API


Effective in version 9.5.0, PowerExchange for Teradata Parallel Transporter API includes the following change:
- PowerExchange for Teradata Parallel Transporter API is installed with the Informatica 9.5.0 server. Previously, you installed PowerExchange for Teradata Parallel Transporter API separately.

PowerExchange for Ultra Messaging


Effective in version 9.5.0, PowerExchange for Ultra Messaging includes the following changes:

Ultra Messaging Queuing sources and targets
You use PowerExchange for Ultra Messaging to connect to Ultra Messaging Queuing sources and targets. Previously, you used PowerExchange for Ultra Messaging to connect to Ultra Messaging Persistence sources and targets.

Configuration files
You use one XML-based source and target configuration file to define the configuration options that the PowerCenter Integration Service must use to connect to the Ultra Messaging sources and targets contained in one workflow. Previously, you created a source configuration file to connect to Ultra Messaging sources and a target configuration file to connect to Ultra Messaging targets.

Ultra Messaging Connection
You specify the destination name, configuration file path, session ID, maximum number of session IDs, application name, and context name to create an Ultra Messaging connection. Previously, you specified only the destination name and configuration file path.

Flush latency
You configure the flush latency for an Ultra Messaging session in milliseconds. Previously, you configured the flush latency in seconds. If you upgrade from an earlier version, you must manually configure the flush latency in milliseconds.
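The flush latency unit change is a simple multiplication when you carry session settings forward. This is a hedged sketch, not Informatica code, and the "flush_latency" key is a placeholder for however you store the setting.

```python
# Illustrative conversion for upgraded session settings: earlier releases
# stored Ultra Messaging flush latency in seconds, 9.5.0 expects milliseconds.
def seconds_to_milliseconds(settings: dict) -> dict:
    converted = dict(settings)  # leave the caller's dict untouched
    if "flush_latency" in converted:
        converted["flush_latency"] = int(converted["flush_latency"]) * 1000
    return converted

print(seconds_to_milliseconds({"flush_latency": 2}))
# → {'flush_latency': 2000}
```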


Part II: Version 9.1.0


This part contains the following chapters:
- New Features and Enhancements (9.1.0)
- Informatica Data Explorer (9.1.0)
- Informatica Data Quality (9.1.0)
- Informatica Data Services (9.1.0)
- Informatica Domain (9.1.0)
- Metadata Manager (9.1.0)
- PowerCenter (9.1.0)
- Informatica Development Platform (9.1.0)
- Adapters for PowerCenter (9.1.0)


CHAPTER 10

New Features and Enhancements (9.1.0)


This chapter includes the following topics:
- Version 9.1.0 HotFix 4
- Version 9.1.0 HotFix 3
- Version 9.1.0 HotFix 2
- Version 9.1.0 HotFix 1
- Version 9.1.0

Version 9.1.0 HotFix 4


This section describes new features and enhancements in version 9.1.0 HotFix 4.

Informatica Data Services


This section describes new features and enhancements to Informatica Data Services.

Analyst Service Privileges


The License Access for Informatica Analyst privilege is renamed to Run Profiles and Scorecards. The following Analyst Service privileges are required to access mapping specifications and load the results of mapping specifications in the Analyst tool:
- Access Mapping Specifications. Allows access to mapping specifications in the Analyst tool.
- Load Mapping Specification Results. Load the results of a mapping specification to a table or flat file.

After you upgrade to 9.1.0 HotFix 4, an administrator must grant the Access Mapping Specifications and Load Mapping Specification Results privileges from the Administrator tool. After you upgrade to 9.1.0 HotFix 4, users with the License Access for Informatica Analyst privilege will have the Run Profiles and Scorecards privilege. Administrators must grant the Run Profiles and Scorecards privilege to new users.


PowerCenter
This section describes new features and enhancements to PowerCenter.

Data Masking Transformation


The Data Masking Transformation has the following enhancements:
- The substitution masking type can replace datetime values, integers, and floating point numbers.
- You can create repeatable output for any data masking type.
- When you mask credit card numbers, you can keep or replace the six-digit credit card issuer.
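Repeatable substitution masking means the same input always maps to the same replacement value across runs. The following sketch illustrates one common way such repeatability can be achieved, by seeding a hash; it is an assumption for illustration, not Informatica's actual algorithm, and the NAMES dictionary is a stand-in.

```python
import hashlib

# Stand-in substitution dictionary; a real one would come from a lookup table.
NAMES = ["Alex", "Blake", "Casey", "Devon"]

def substitute(value: str, seed: str = "mask-seed") -> str:
    """Deterministically pick a replacement: same input, same output."""
    digest = hashlib.sha256((seed + value).encode()).hexdigest()
    return NAMES[int(digest, 16) % len(NAMES)]

# Repeatable: identical inputs always mask to the identical output.
assert substitute("Jones") == substitute("Jones")
```

Changing the seed produces a different but equally repeatable mapping, which is useful when separate test environments must not share masked values.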

Version 9.1.0 HotFix 3


This section describes new features and enhancements in version 9.1.0 HotFix 3.

Informatica Data Explorer


This section describes new features and enhancements to Informatica Data Explorer.

Profile Export
When you export profile results from the Analyst tool, you can choose to export the complete profile results summary to a Microsoft Excel spreadsheet.

Data Source Import


You can import a fixed-width flat file data source using the Analyst tool.

Informatica Data Quality


This section describes new features and enhancements to Informatica Data Quality.

Address Validator Transformation


The Address Validator transformation includes the Delivery Point Validation Footnote Complete port. This port writes all Delivery Point Validation Footnote data for a United States address as a single string. You find this port in the US Specific port group of the Address Validator transformation.

Profile Export
When you export profile results from the Analyst tool, you can choose to export the complete profile results summary to a Microsoft Excel spreadsheet.

Data Source Import


You can import a fixed-width flat file data source using the Analyst tool.

Informatica Data Services


This section describes new features and enhancements to Informatica Data Services.

Aggregators
You can add aggregators to a mapping specification in the Analyst tool to perform aggregate calculations on multiple rows of data.


When you add an aggregator, you can perform aggregate calculations on groups of columns or all columns. When you group by columns, you can apply the aggregate conditions and rules to multiple columns. You can include filters, rules, conditional clauses, and nested expressions in an aggregator. You can also add different aggregators to multiple columns.
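The group-by behavior described above is the standard aggregate pattern. As a neutral illustration of what an aggregator computes (not the Analyst tool's implementation), with column names invented for the example:

```python
from collections import defaultdict

# Group rows by one or more columns, then apply an aggregate function
# to a target column within each group.
def aggregate(rows, group_by, target, func=sum):
    groups = defaultdict(list)
    for row in rows:
        key = tuple(row[c] for c in group_by)
        groups[key].append(row[target])
    return {key: func(values) for key, values in groups.items()}

rows = [
    {"region": "East", "amount": 10},
    {"region": "East", "amount": 5},
    {"region": "West", "amount": 7},
]
print(aggregate(rows, ["region"], "amount"))
# → {('East',): 15, ('West',): 7}
```

Passing min, max, or len as func gives other aggregate calculations over the same groups, which mirrors applying different aggregators to multiple columns.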

Export to Excel
You can export the mapping specification logic from the Analyst tool to Microsoft Excel to document and share the mapping specification logic with other analysts and developers. You can then modify the mapping specification with the review feedback.

Data Masking
You can create a Data Masking transformation to transform sensitive production data to realistic test data for nonproduction environments. You can create masked data for software development, testing, training, and data mining. You can maintain data relationships in the masked data and maintain referential integrity between database tables.

Web Services
You can create a web service configuration to control the settings the Developer tool applies when you preview the output of an operation mapping or the output of a transformation in the operation mapping.

Informatica Domain
This section describes new features and enhancements to the Informatica domain.

Data Integration Service


You can limit the number of characters that the Data Integration Service processes in a web service request or response message.

Secure Connections
You can configure the Reporting and Dashboards Service to use secure communications.

PowerCenter and Metadata Manager Reports in JasperReports Server


You can associate a Reporting and Dashboards Service with the PowerCenter Repository Service to view the PowerCenter reports in JasperReports Server. You can associate a Reporting and Dashboards Service with the Metadata Manager Service and view the Metadata Manager reports in JasperReports Server. You can also launch the reports from the PowerCenter Client and Metadata Manager to view them in JasperReports Server.

Connection to the Jaspersoft Repository from Jaspersoft iReport Designer


You can connect to the Jaspersoft repository from Jaspersoft iReport Designer.

Adapters for PowerCenter


This section describes new features and enhancements to adapters for PowerCenter.

PowerExchange for Twitter


You can extract up to 100 Twitter user profiles at a time.

PowerExchange for Salesforce


PowerExchange for Salesforce 9.1.0 HotFix 3 uses version 23.0 of the Salesforce API. You can configure a Bulk API session to replace values in the target with null values from the source. You can use the existing Set Fields to NULL session property when configuring the Bulk API session.


PowerExchange for SAP NetWeaver


You can configure the workflow to complete processing when the SAP DataSource does not return any data during business content integration.

PowerExchange for Hadoop


You can set the Hive table name and configure the session to overwrite data in the Hive table. You can configure the session to load Hive table data to a location defined in the output file path.

Metadata Manager
This section describes new features and enhancements in version 9.1.0 HotFix 3.

Configure Server Connection Timeout


You can configure the duration after which a remote connection from mmcmd to the Metadata Manager Service times out.

PowerCenter
This section describes new features and enhancements to PowerCenter.

Aggregators
You can add aggregators to a mapping specification in the Analyst tool to perform aggregate calculations on multiple rows of data. When you add an aggregator, you can perform aggregate calculations on groups of columns or all columns. When you group by columns, you can apply the aggregate conditions and rules to multiple columns. You can include filters, rules, conditional clauses, and nested expressions in an aggregator. You can also add different aggregators to multiple columns.

Data Masking
You can configure dependent data masking for a source column. With dependent masking, the Data Masking transformation masks more than one column of source data from the same row of data in a dictionary. You can maintain a relationship between the columns of source data, such as the relationship between city and state. You can configure data masking for Social Insurance numbers (SIN). Select the SIN masking type when you configure masking for the source SIN.
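The key property of dependent masking is that related columns are drawn from the same dictionary row. The sketch below illustrates that property only; it is not Informatica's algorithm, and the dictionary rows and seed are hypothetical.

```python
import hashlib

# Hypothetical dictionary rows: dependent masking takes city and state
# from the SAME row, so the masked pair stays internally consistent.
DICTIONARY = [
    {"city": "Springfield", "state": "IL"},
    {"city": "Austin", "state": "TX"},
    {"city": "Portland", "state": "OR"},
]

def mask_city_state(city: str, seed: str = "seed") -> dict:
    """Pick one dictionary row deterministically and return both columns."""
    digest = hashlib.sha256((seed + city).encode()).hexdigest()
    row = DICTIONARY[int(digest, 16) % len(DICTIONARY)]
    return {"city": row["city"], "state": row["state"]}
```

Because both output columns come from one row, the masked state always matches the masked city, preserving the city-state relationship the passage describes.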

Export to Excel
You can export the mapping specification logic from the Analyst tool to Microsoft Excel to document and share the mapping specification logic with other analysts and developers. You can then modify the mapping specification with the review feedback.

Version 9.1.0 HotFix 2


This section describes new features and enhancements in version 9.1.0 HotFix 2.


Informatica Data Explorer


This section describes new features and enhancements to Informatica Data Explorer.

Exporting Profile Results from Informatica Developer to Your Local Computer


When you export profile results from the Developer tool, you can export them to a directory on your local computer.

Overlap Discovery
You can determine the percentage of overlapping data between two columns within one or more data sources. You run the overlap discovery function from a profile model in Informatica Developer. You can validate the results that appear in a Venn diagram.
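One common way to define column overlap, used here as an assumption for illustration rather than the tool's exact formula, is the share of distinct values in one column that also appear in the other:

```python
# Percentage of distinct values in col_a that also occur in col_b.
def overlap_percentage(col_a, col_b):
    a, b = set(col_a), set(col_b)
    if not a:
        return 0.0  # avoid division by zero for an empty column
    return 100.0 * len(a & b) / len(a)

customers = ["C1", "C2", "C3", "C4"]
orders = ["C2", "C4", "C9"]
print(overlap_percentage(customers, orders))
# → 50.0
```

Computing the percentage in both directions gives the two asymmetric figures a Venn diagram visualizes: the shared region relative to each column's own distinct values.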

Synchronization of Metadata Changes with Data Objects


Informatica Data Explorer synchronizes the metadata changes to imported data sources with the related profile and scorecard data objects. In Informatica Analyst, you can synchronize a data object with the metadata changes to its external data source.

Informatica Data Quality


This section describes new features and enhancements to Informatica Data Quality.

Address Validator Transformation


The Address Validator transformation has new ports. Use the ports to add the following information to an address record:
- Additional demographic information about United States addresses. The information includes the time zone for the address, the census areas the address belongs to, and whether the address is a home or business.
- The preferred name of a locality in Canada or the United States, where a preferred locality name exists. For example, the transformation recognizes "North York" as a locality name in Canada, but it can return "Toronto" as the preferred locality name.
- The address type in a Canadian or United States address.

The transformation can return the short forms of United States street and locality names when the address reference data contains the short forms. Use the Element Abbreviation option to add the short forms to the address record.

Informatica Data Quality Integration for PowerCenter


Most Data Quality transformations in PowerCenter can run on a grid. The following transformations can run on a grid:
- Address Validator
- Case Converter
- Decision
- Execution Point
- Labeler
- Merge
- Parser
- Comparison
- Standardizer
- Weighted Average

50

Chapter 10: New Features and Enhancements (9.1.0)

Regional Accelerators
The following regional accelerators include prebuilt identity matching mapplets:
- Informatica Data Quality Accelerator for Australia and New Zealand
- Informatica Data Quality Accelerator for Brazil
- Informatica Data Quality Accelerator for France
- Informatica Data Quality Accelerator for Germany
- Informatica Data Quality Accelerator for Portugal
- Informatica Data Quality Accelerator for United Kingdom
- Informatica Data Quality Accelerator for US and Canada

Each regional accelerator includes mapplets that perform the following identity matching operations:
- Company name and address matching
- Family name and address matching
- Individual name and address matching
- Person name and data matching

Templates
A template is a set of repository objects that you can use to create and run business intelligence reports from Informatica Data Quality applications. Use templates to quickly gain insights into the data quality of project data. A template contains mappings, mapplet rules, profile objects, and Informatica reference data. Templates use logical data objects to enable rapid implementation.

Informatica users can download the following templates:

CRM Template
Use the Customer Relationship Management (CRM) Template to measure and report on the customer data processing issues associated with CRM implementations and data migrations.

Dashboard and Reporting Template
Use the Dashboard and Reporting Template to add data quality metrics to enterprise business intelligence reports. The template lets you apply data quality metrics across multiple dimensions for cross-organization and data entity reports.

Informatica Data Services


This section describes new features and enhancements to Informatica Data Services.

Web Services
- You can create an operation from a flat file data object or relational data object. When you create an operation from a flat file data object or relational data object, the operation performs a lookup on the data object.
- You can update the maximum number of occurrences for elements in operations that you create from a reusable object.

Adapters for PowerCenter


This section describes new features and enhancements to adapters for PowerCenter.

PowerExchange for Facebook


You can access Facebook through an HTTP proxy server.

PowerExchange for LinkedIn


You can access LinkedIn through an HTTP proxy server.

PowerExchange for Netezza


You can recover real-time Netezza sessions in normal mode.

PowerExchange for Salesforce


PowerExchange for Salesforce 9.1.0 HotFix 2 uses version 21.0 of the Salesforce API. You can generate success files and error files separately when you monitor a Bulk API session. When you monitor a Bulk API session, you can use the existing Use SFDC Success File session property to generate a success file and the Use SFDC Error File session property to generate an error file. You can configure a session to have Salesforce truncate overflow target data and write truncated data to a Salesforce target. By default, the PowerCenter Integration Service writes overflow data to the session error file.

PowerExchange for Twitter


You can access Twitter through an HTTP proxy server.

Metadata Manager
This section describes new features and enhancements to Metadata Manager.

IMM.Properties
The following table describes properties added to the imm.properties file:

Property | Description
shortcut.classes | Lists the classes that are shortcuts. When you add a shortcut class to this list of classes, children of that shortcut appear in the Catalog tree.
ImpactSummary.MaxObjects | Sets the maximum number of objects to appear in the impact summary user interface.
ElementFetch.ParamSize | Sets the number of elements that Metadata Manager processes to calculate the impact summary.

Command Line Programs


The following table describes new Metadata Manager commands:

Command | Description
backupconfiguration | Backs up the resource configurations, source files associated with resources, and custom models for a Metadata Manager instance.
restoreconfiguration | Restores the resource configurations, source files associated with resources, and custom models from a backup file.
listmodels | Lists all models in Metadata Manager.
getresourcefiles | Retrieves all the source files associated with a resource.

The following table describes the new Metadata Manager options for the command line programs:

Commands | Option | Description
createresource, updateresource, assignconnection, assignparameterfile | -pdir | The directory in which the source files associated with a resource are located.

Version 9.1.0 HotFix 1


This section describes new features and enhancements in version 9.1.0 HotFix 1.

Informatica Data Quality


This section describes new features and enhancements to Informatica Data Quality.

Address Validation
Data Quality address validation meets the certification standards of the Address Matching Approval System (AMAS) of Australia Post.

Transformations

Address Validator Transformation
- The transformation includes output ports that enable address validation to the Australia Post AMAS certification standard.
- The transformation includes ports that provide additional information on United States addresses that are validated to the Coding Accuracy Support System (CASS) certification standard of the United States Postal Service.
- The transformation includes ports that provide additional information on Canadian addresses that are validated to the Software Evaluation and Recognition Program (SERP) certification standard of Canada Post.
- You can use the Address Validator transformation to generate a report for address data that meets the Australia Post AMAS certification standard.

Exception Transformation
- You can configure the Exception transformation to append records to the exception table.

Labeler Transformation
- The Labeler transformation includes a search feature in the token set and character set wizards. You can search name, description, and tag metadata by entering text that contains all or part of the metadata string.
- When you use the Labeler transformation to select objects from a content set, you can override the default label.


Parser Transformation
- The Parser transformation includes a search feature in the token set and character set wizards. You can search name, description, and tag metadata by entering text that contains all or part of the metadata string.

Informatica Data Services


This section describes new features and enhancements to Informatica Data Services.

Informatica Data Director Integration


You can integrate detailed operational data from an SQL data service with Informatica Data Director data to give data stewards access to a complete view of data. Data stewards can view virtual table data on a custom tab when they select a master record in Informatica Data Director.

Informatica Data Integration Analyst


You can load the results of a mapping specification to a flat file to download locally. You can export the mapping specification logic as a virtual table that analysts can use to run SQL queries against the data. You can also export the mapping specification logic to PowerCenter to share the logic in the mapping specification with a PowerCenter developer. You can add a lookup to a mapping specification to look up data from a data object. You can add multiple lookups and edit a lookup to configure the lookup conditions and outputs. You can remove redundant sources from a mapping specification and validate the mapping specification.

Import Objects from a URL


You can add untrusted certificates to the Developer tool when you want to import an object from a URL that requires an untrusted client certificate.

Data Object Lookup Optimization


When the logical data object is a lookup in a mapping, the Data Integration Service can attempt to push transformation logic in the logical data object read map to a relational source. Mapping performance improves when the Data Integration Service pushes transformation logic in the logical data object read map to a relational source.

Result Set Caching


You can cache the results of web service requests. The Data Integration Service caches the results for a specified expiration period. When a web service client makes the same request before the cache expires, the Data Integration Service returns the cached results. If a cache does not exist or has expired, the Data Integration Service starts a DTM instance to process the request. When a web service uses WS-Security, the Data Integration Service caches the results by user and returns cached results only to the user that sent the web service request.
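The caching behavior described above, keyed per user with time-based expiry, can be modeled with a toy cache. This is a conceptual sketch, not the Data Integration Service implementation; class and method names are invented.

```python
import time

# Toy model of result set caching: entries are keyed by (user, request)
# so WS-Security results are only returned to the requesting user, and
# entries expire after a configured period.
class ResultSetCache:
    def __init__(self, expiration_seconds: float):
        self.expiration = expiration_seconds
        self.entries = {}  # (user, request) -> (result, stored_at)

    def get(self, user, request):
        entry = self.entries.get((user, request))
        if entry and time.monotonic() - entry[1] < self.expiration:
            return entry[0]
        return None  # missing or expired: the caller processes the request

    def put(self, user, request, result):
        self.entries[(user, request)] = (result, time.monotonic())

cache = ResultSetCache(expiration_seconds=60)
cache.put("alice", "getOrders", ["order1"])
assert cache.get("alice", "getOrders") == ["order1"]
assert cache.get("bob", "getOrders") is None  # per-user isolation
```

A get that returns None corresponds to the case where the service must start a DTM instance and then repopulate the cache with put.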

Schema Objects
You can view global attributes on the Schema view of a schema object.

Virtual Stored Procedures


You can preview a source or transformation in a virtual stored procedure.

Web Services
- You can create an operation from a reusable mapplet, a reusable transformation, or a reusable logical data object. When you create an operation from a reusable logical data object, the operation performs a lookup on the data in the logical data object.


- The Developer tool extracts nodes in the first level of the operation hierarchy to ports when you choose to extract the first level of the hierarchy. The Developer tool also creates the ports to perform the extraction.
- When you extract nodes from the operation input, you can extract the complete SOAP request as XML instead of returning groups of relational data in separate output ports. When you extract ports to the operation output and operation fault, you can extract XML data from one string or text input port to the entire SOAP response.
- You can configure a SOAP message from a WSDL or schema that contains derived types, anyTypes, and substitution groups. You must choose the types that can appear in the data.
- You can create a SOAP message from denormalized input data.
- You can configure composite keys in a SOAP message by extracting multiple ports to the same key. You can create keys that contain string, bigint, or integer values.
- When you use the Data Viewer view to preview the output of a transformation in an operation mapping and the preview fails, a system-defined fault displays in the Output area of the Data Viewer view.

Web Service Consumer Transformation
- The Developer tool extracts nodes in the first level of the operation hierarchy to ports when you choose to extract the first level of the hierarchy. The Developer tool also creates the ports to perform the extraction.
- You can configure a SOAP message from a WSDL or schema that contains derived types, anyTypes, and substitution groups. You must choose the types that can appear in the data.
- You can create a SOAP message from denormalized input data.
- You can configure composite keys in a SOAP message by extracting multiple ports to the same key. You can create keys that contain string, bigint, or integer values.

Informatica Domain
This section describes new features and enhancements to the Informatica domain.

Connection Permissions
You can view connection permission details for a user or group. When you view permission details, you can view the origin of effective permissions. Permission details display direct permissions assigned to the user or group and direct permissions assigned to groups that the user or group belongs to. In addition, permission details display whether the user or group is assigned the Administrator role which bypasses the permission check.
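The resolution order described above, the Administrator role bypass, then direct permissions, then group-inherited permissions, can be sketched as a small function. This is a simplified illustrative model, not the Administrator tool's implementation, and all names are hypothetical.

```python
# Simplified model of effective connection permissions for one user:
# direct permissions, permissions inherited from groups, and the
# Administrator role bypass.
def effective_permissions(user, direct, group_perms, groups, admins):
    """Return (permissions, origins) for a user on one connection."""
    if user in admins:
        return {"all"}, ["Administrator role (bypasses permission check)"]
    perms, origins = set(), []
    if user in direct:
        perms |= direct[user]
        origins.append("direct")
    for group in groups.get(user, []):
        if group in group_perms:
            perms |= group_perms[group]
            origins.append(f"group: {group}")
    return perms, origins

direct = {"dana": {"read"}}
group_perms = {"analysts": {"read", "execute"}}
groups = {"dana": ["analysts"]}
perms, origins = effective_permissions("dana", direct, group_perms, groups, admins=set())
print(perms, origins)
```

The origins list is what "view the origin of effective permissions" corresponds to in this model: each entry records whether a permission came directly or through a named group.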

Custom Keystore File


If you use a custom keystore file, you can specify the location and password when you upgrade a gateway node to a different node configuration.

Monitoring Reports
You can view the following monitoring reports:
Longest Duration SQL Data Service Connections. Shows SQL data service connections that were open the longest during the specified time period.
Longest Duration SQL Data Service Requests. Shows SQL data service requests that ran the longest during the specified time period.
Longest Duration Web Service Requests. Shows web service requests that were open the longest during the specified time period.

Version 9.1.0 HotFix 1

Minimum, Maximum, and Average Duration Report. Shows the total number of SQL data service and web service requests during the specified time period.
Most Active IP for SQL Data Service Requests. Shows the total number of SQL data service requests from each IP address.
Most Frequent Errors for Jobs. Shows the most frequent errors for jobs, regardless of job type.
Most Frequent Errors for SQL Data Service Requests. Shows the most frequent errors for SQL data service requests.
Most Frequent Faults for Web Service Requests. Shows the most frequent faults for web service requests.

Reporting and Dashboards Service


You can create the Reporting and Dashboards Service from Informatica Administrator. You can use the service to create and run reports from the JasperReports application. JasperReports is an open source reporting library that can be embedded into any Java application.

Command Line Programs


This section describes new commands and options for the Informatica command line programs.

infacmd
With infacmd rds commands, you can create a Reporting and Dashboards Service and list service process options. The following table describes new infacmd rds commands:
CreateService. Creates a Reporting and Dashboards Service in a domain.
ListServiceProcessOptions. Lists the Reporting and Dashboards Service process options.

The following table describes a new option for the infacmd command:
cms CreateService, option -ds. Specifies the name of the Data Integration Service to assign to the Content Management Service.

Adapters for PowerCenter


This section describes new features and enhancements to adapters for PowerCenter.

PowerExchange for Hadoop


You can use PowerExchange for Hadoop to extract data from and load data to Hadoop. This adapter provides connectivity between Informatica and Hadoop.


Chapter 10: New Features and Enhancements (9.1.0)

Use PowerExchange for Hadoop to complete the following tasks:


Extract data from Hadoop through a Hadoop source to load into a data warehouse and other targets. A Hadoop source is a Hadoop Distributed File System (HDFS) database.
Load data from enterprise data sources into Hadoop through a Hadoop target. A Hadoop target is an HDFS database.

PowerExchange for SAP NetWeaver


You can read LRAW datatypes from SAP in file mode.

PowerExchange for LinkedIn


You can extract social media data from LinkedIn such as FirstName, Specialties, and Honors. You can create source definitions for LinkedIn source types such as Connections and People. You can use a query string to search for data.

PowerExchange for Facebook


You can extract social media data from Facebook such as LikesCount, FromId, and CommentsCount. You can create source definitions for Facebook source types such as Posts. You can use a query string to search for data.

PowerExchange for Twitter


You can extract social media data from Twitter such as FromUser, Entry, and followers_count. You can create source definitions for Twitter source types such as Entry and User. You can use a query string to search for data.

Metadata Manager
This section describes new features and enhancements to Metadata Manager.

Export Service Log


You can export the service log file from Metadata Manager for a specified date and view it offline.

Right-click Menu Options


You can use right-click menu options to perform the following actions:
Create objects from custom objects, business categories, and business terms.
Propose and publish business terms in the business glossary.
Edit resources in the Load perspective.

Command Line Programs


The following table describes new Metadata Manager commands.
version. Displays the current version of Metadata Manager.
testsourceconnection. Tests the connection to the source system of the resource.
getlinkreport. Exports the link report. The link summary report contains the resource, connection, assigned database, and assigned schema details.


encrypt. Encrypts the text you specify. You can specify the encrypted text when you use the -ep option in a command.
getServiceLog. Exports the service log file from Metadata Manager for a specified date.

The following table describes the new Metadata Manager options for the command line programs.
assignConnection, option -s. During refresh, skips retrieving the connection objects you specify in the resource configuration file.
Option -ep. Specifies the encrypted password generated using the encrypt command. You can use this option in multiple commands.

Version 9.1.0
This section describes new features and enhancements in version 9.1.0.

Informatica Data Explorer


This section describes new features and enhancements to Informatica Data Explorer.

Data Explorer Repository Migration


You can migrate profile metadata and data source connection metadata from a Data Explorer Legacy repository to the Model repository that Data Explorer uses. The migration does not remove metadata from the Data Explorer Legacy repository.

Profiles
You can create and run a profile on bad record tables and duplicate record tables in the Analyst tool.
You can create and run profiles to identify primary keys, foreign keys, and functional dependencies between columns in data objects. You can also define relationships between columns, and profile the columns to verify the relationship.
You can create a data model of the data objects that you want to profile. You can create and run profiles on the data objects in the profile model.


You can replace the data object in a profile and run the profile on the new data object without editing the profile parameters. The old and new data objects must be the same type and must have the same data structure.
You can use a profile to infer the Date type for source data values.
You can apply one or more filters to a profile.


You can apply one or more filters to drilldown data in the Analyst tool.
You can create profiles for multiple data objects in a single operation. The profiles return separate results for each data object.
You can run a profile on two objects in a mapping and view a comparison of the profile results. Use this profile to review the changes that the mapping can make to the source data.
You can run profiling reports that include column profiling statistics and their summary from Data Analyzer.

Reference Tables
You can use the Analyst tool to create reference tables that store data in databases that you specify. Reference tables that store data in the staging database are called managed reference tables. Unmanaged reference tables store data in user-specified databases.
When you use Informatica Developer to import and export reference tables, you specify file paths recognized by the local file system. Previously, when you used the Developer tool to import and export reference tables, you specified directories recognized by the machine that ran the Data Integration Service.
You can apply a text filter when you search the Model repository for a reference table from a data quality transformation. The filter narrows your search to reference table names that meet your filter criteria.
When you use the infacmd oie exportObjects command to export Model repository objects, you can include the reference tables associated with these objects. The command exports reference table data from the staging database into a .zip file. When you run the infacmd oie importObjects command to import the Model repository objects, the command writes reference table data from the .zip file into the staging database.

Tags
A tag is metadata that defines an object in the Model repository based on business usage. Create tags to group objects according to their business usage. Use a tag to informally define an object in the Model repository. Create a tag and assign it to multiple objects in the Model repository. You can also search for objects by a tag.

Informatica Data Quality


This section describes new features and enhancements to Informatica Data Quality.

Accelerators
Accelerators are content bundles that contain rules, reference tables, demonstration mappings, and demonstration data objects. Each accelerator provides solutions to common data quality issues in a country, region, or industry. The Data Quality Content installer includes the Informatica Data Quality Core Accelerator, which contains general data quality rules. You can purchase the following accelerators separately:
Informatica Data Quality Accelerator for Australia and New Zealand
Informatica Data Quality Accelerator for Brazil
Informatica Data Quality Accelerator for Financial Services
Informatica Data Quality Accelerator for Portugal
Informatica Data Quality Accelerator for United Kingdom
Informatica Data Quality Accelerator for US and Canada

Content Sets
A content set is a Model repository object that you use to store reusable, user-defined expressions. These expressions include pattern sets, character sets, token sets, and regular expressions. When you configure a Labeler or Parser transformation, you can choose to include reusable expressions from a content set. Create content sets in the Developer tool.


Exception Management
You can perform the following exception management tasks in the Analyst tool:
Apply status and priority filters to rows in a bad record table.
Save changes to multiple rows at a time in a bad record or duplicate record table.
View the previous version of a data value in an audit trail table.

Integration with Microsoft Excel


You can use Data Quality for Microsoft Excel to start a mapping that reads input data from a Microsoft Excel spreadsheet. The Data Integration Service runs the mapping and returns the results to the Microsoft Excel spreadsheet. Data Quality for Microsoft Excel calls the mapping through Informatica web services.

Object Deployment
The process to export repository objects to PowerCenter resolves conflicts and dependencies.

Profiles
You can create and run a profile on bad record tables and duplicate record tables in the Analyst tool.
You can replace the data object in a profile and run the profile on the new data object without editing the profile parameters. The old and new data objects must be the same type and must have the same data structure.
You can use a profile to infer the Date type for source data values.
You can apply one or more filters to a profile.
You can apply one or more filters to drilldown data in the Analyst tool.
You can create profiles for multiple data objects in a single operation. The profiles return separate results for each data object.
You can run a profile on two objects in a mapping and view a comparison of the profile results. Use this profile to review the changes that the mapping can make to the source data.
You can run profiling reports that include column profiling statistics and their summary from Data Analyzer.

Reference Tables
You can use the Analyst tool to create reference tables that store data in databases that you specify. Reference tables that store data in the staging database are called managed reference tables. Unmanaged reference tables store data in user-specified databases.
When you use Informatica Developer to import and export reference tables, you specify file paths recognized by the local file system. Previously, when you used the Developer tool to import and export reference tables, you specified directories recognized by the machine that ran the Data Integration Service.
You can apply a text filter when you search the Model repository for a reference table from a data quality transformation. The filter narrows your search to reference table names that meet your filter criteria.
When you use the infacmd oie exportObjects command to export Model repository objects, you can include the reference table data associated with these objects. The command exports reference table data from the staging database into a .zip file on the Developer client machine. When you run the infacmd oie importObjects command to import the Model repository objects, the command writes reference table data from the .zip file into the staging database.

Tags
A tag is metadata that defines an object in the Model repository based on business usage. Create tags to group objects according to their business usage. Use a tag to informally define an object in the Model repository. Create a tag and assign it to multiple objects in the Model repository. You can also search for objects by a tag.


Transformations
Address Validator Transformation
You can configure the Address Validator transformation in Suggestion List mode. Use this mode to find all possible matches for an input address in the reference data. Suggestion List mode works for partial and complete addresses.
You can define parameters for default country, line separator, casing style, and mode.
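Suggestion List mode returns every candidate an incomplete address could match, rather than a single best match. The following is a toy illustration of that behavior; the prefix rule and the sample addresses are simplifying assumptions, not the address reference data the transformation actually uses.

```python
# Toy "suggestion list": return every reference address a partial input could match.
def suggestions(partial, reference_addresses):
    p = partial.lower()
    return [a for a in reference_addresses if a.lower().startswith(p)]

reference = [
    "100 Main St, Springfield",
    "100 Main St, Shelbyville",
    "12 Oak Ave, Springfield",
]
# A partial address yields all candidates; a more complete input narrows the list.
matches = suggestions("100 main", reference)
```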

Association Transformation
You can set the minimum amount of cache memory that the Association transformation uses. If you enter a cache memory value that is lower than 65536, the Association transformation reads the value in megabytes.
The Association transformation generates log entries while mappings are running. The transformation generates log entries after every 100,000 rows.
You can define parameters for the cache directory.
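The cache memory rule above can be sketched as a small normalization function. Treating values of 65536 and above as bytes is an assumption for illustration; check the transformation documentation for the exact rule.

```python
# Sketch of the cache sizing rule: values below 65536 are read as megabytes;
# larger values are assumed here to be read as bytes.
def cache_size_bytes(value):
    MB = 1024 * 1024
    return value * MB if value < 65536 else value

cache_size_bytes(400)          # interpreted as 400 MB
cache_size_bytes(100_000_000)  # interpreted as a byte count
```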

Comparison Transformation
You can use custom-built identity population files when you perform identity operations.
Consolidation Transformation
The Consolidation transformation can use row-based consolidation strategies. The transformation uses these strategies to choose a single cluster row. The transformation populates the master row with the data values from the chosen row.
The Consolidation transformation can use custom consolidation strategies that you define. You define the custom strategies using decision expressions.
You can define parameters for the cache directory.
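A row-based strategy picks one survivor row from a cluster of duplicates and uses its values to populate the master row. The "most populated fields" rule below is a hypothetical strategy chosen for illustration, not necessarily one the Consolidation transformation ships with.

```python
# Sketch of a row-based consolidation strategy: choose the cluster row with
# the most populated fields as the survivor for the master record.
def choose_master(cluster):
    def populated(row):
        return sum(1 for v in row.values() if v not in (None, ""))
    return max(cluster, key=populated)

cluster = [
    {"name": "Ann Smith", "phone": None, "city": "York"},
    {"name": "A. Smith", "phone": "555-0100", "city": "York"},
]
master = choose_master(cluster)  # the second row: more fields are populated
```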

Decision Transformation
You can configure the Decision transformation to recognize input NULL values.
Exception Transformation
The Exception transformation creates database tables that you can review for data quality issues in the Analyst tool. Use the Exception transformation in a mapping to create tables that identify poor quality or duplicate records using conditions that you specify. Use the Analyst tool to correct bad records or consolidate duplicate records in the tables.
Java Transformation
You can implement the resetNotification method in a Java transformation. When the Data Integration Service machine runs in restart mode, this method resets variables that you use in the Java code after a mapping run.
Match Transformation
You can analyze a Match transformation to preview the number of computations that the transformation will perform.
You can analyze a Match transformation to preview the size and number of clusters the transformation will create.
You can define partitioned identity match operations to improve identity match operation performance. Use the Execution Instance property to configure the number of partitions.
You can use custom-built identity population files to perform identity operations.
You can define parameters for threshold, weight, matched pairs cache directory, clustering cache directory, and index file directory.
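The value of previewing match computations comes from the quadratic cost of pairwise matching. The following simplified cost model (an illustration, not the transformation's actual estimator) shows why splitting records into groups or partitions sharply reduces the work.

```python
# Simplified cost model: comparing every pair within a group of n records
# takes n*(n-1)/2 computations; the total is summed over all groups.
def pair_comparisons(group_sizes):
    return sum(n * (n - 1) // 2 for n in group_sizes)

one_group = pair_comparisons([10000])       # 49,995,000 comparisons
partitioned = pair_comparisons([2500] * 4)  # 12,495,000 comparisons
```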


Informatica Data Services


This section describes new features and enhancements to Informatica Data Services.

Business Terms
You can use a Metadata Manager business term in the Analyst tool to search for objects in the Metadata Manager repository. You can select Metadata Manager objects from the search results and import them as tables in the Analyst tool. You can use these tables as sources for profiles or mapping specifications.
You can access Metadata Manager and the Metadata Manager Business Glossary from the Analyst tool to manage business terms. You can view or edit business terms in the Metadata Manager Business Glossary.

Logical Data Object Models


You can import logical data models from the export files of the following modeling tools:
CA ERwin 7.x Data Modeler
IBM Cognos BI Reporting - Framework Manager
SAP BusinessObjects Designer
Sybase PowerDesigner CDM 7.5 to 15.x
Sybase PowerDesigner OOM 9.x to 15.x
Sybase PowerDesigner PDM 7.5 to 15.x

Mapping Specification
A mapping specification is an object in the Model repository that describes the movement and transformation of data from a source to a target. Use a mapping specification to define business logic that populates a target table with data that you can use for reporting.
You can create a mapping specification in the Analyst tool to transform and move data from the source to the target.
You can configure the sources, target, rules, filters, and joins to transform the data in a mapping specification.
You can load the results of a mapping specification to a target.

Monitoring
You can access the Monitoring tool from the Developer and Analyst tools to monitor the status of applications and jobs, such as a profile job.

Object Export and Import


You can import individual objects from an XML file into the repository. You can edit the object names in an export XML file with an infacmd xrf command. You can use an infacmd control file when you run the infacmd oie ExportObjects or ImportObjects command.

Use an infacmd control file to complete the following tasks during an export or import process:
- Filter the objects that are exported or imported.
- Configure conflict resolution strategies for specific object types or objects.
- Map connections in the source repository to connections in the target repository.

Performance Tuning
Mapping Performance
You can improve mapping performance with the cost-based optimization method. The Data Integration Service can evaluate a mapping, generate semantically equivalent mappings, and run the mapping with the best performance. Cost-based optimization is most effective for mappings that contain multiple Joiner transformations. The Data Integration Service applies cost-based optimization when you select the full optimizer level.
Pushdown Optimization
The Data Integration Service can push Expression and Joiner transformation logic to the source database.
The Data Integration Service can push transformation logic to IBM DB2 for i5/OS, DB2 for LUW, and DB2 for z/OS sources when expressions contain supported functions with the following logic:
- TO_BIGINT includes more than one argument.
- TO_CHAR converts a date to a character string without the format argument.
- TO_DATE converts a character string to a date without the format argument.
- TO_DECIMAL converts a string to a decimal value.
- TO_INTEGER includes more than one argument.
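The idea behind pushdown is that instead of evaluating an expression function itself, the service rewrites it into SQL that the source database executes. The sketch below illustrates the rewriting step only; the translation table is hypothetical, and the SQL actually generated depends on the database and the Informatica release.

```python
# Hedged sketch of pushdown rewriting: map an expression function to SQL text
# the source database can run. Translations here are illustrative assumptions.
PUSHDOWN_SQL = {
    "TO_CHAR":    lambda col: "CHAR({})".format(col),     # date -> string, no format argument
    "TO_DATE":    lambda col: "DATE({})".format(col),     # string -> date, no format argument
    "TO_DECIMAL": lambda col: "DECIMAL({})".format(col),  # string -> decimal
}

def push_down(func, column):
    return PUSHDOWN_SQL[func](column)

pushed = push_down("TO_CHAR", "ORDER_DATE")  # SQL fragment for the database to run
```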

Physical Data Objects


Nonrelational Data Objects
After you import a nonrelational data object, you can use the Developer tool to view the following types of information:
Mapping of nonrelational records and fields to columns in a relational table
Copybook definition
Data map metadata

Schema Objects
You can import a schema and store it as a schema object in the repository. When you create a web service, you can define input, output, and fault signatures from the schema types.
WSDL Data Objects
Import a WSDL file to create a WSDL data object. You can use a WSDL data object to create a web service or a Web Service Consumer transformation.

SQL Data Services


Result Set Caches
You can temporarily cache the results of SQL queries run against an SQL data service. The Data Integration Service caches the results by user for a specified expiration period. When the same user makes the same request before the cache expires, the Data Integration Service returns the cached results. If a cache does not exist or has expired, the Data Integration Service starts a DTM instance to process the request.
SQL Data Service Datatypes
SQL data services support the binary SQL 99 datatype.
SQL Data Service Functions
SQL data services support the COALESCE function in SQL queries.
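The result set cache behavior can be sketched as a small per-user cache with an expiration period. This is an analogy to the behavior described above, not Informatica internals; `run_query` stands in for starting a DTM instance to process the request.

```python
import time

# Minimal per-user result set cache with an expiration period (TTL).
class ResultSetCache:
    def __init__(self, ttl_seconds, run_query):
        self.ttl = ttl_seconds
        self.run_query = run_query  # fallback when no valid cache entry exists
        self._cache = {}            # (user, sql) -> (expires_at, rows)

    def fetch(self, user, sql):
        key = (user, sql)
        entry = self._cache.get(key)
        if entry and entry[0] > time.monotonic():
            return entry[1]  # same user, same request, cache not expired
        rows = self.run_query(user, sql)
        self._cache[key] = (time.monotonic() + self.ttl, rows)
        return rows
```

A second identical request by the same user within the TTL returns cached rows; a different user, or an expired entry, triggers a fresh run.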

Tags
A tag is metadata that defines an object in the Model repository based on business usage. Create tags to group objects according to their business usage. Use a tag to informally define an object in the Model repository. Create a tag and assign it to multiple objects in the Model repository. You can also search for objects by a tag.


Transformations
Java Transformation
You can implement the resetNotification method in a Java transformation. When the Data Integration Service machine runs in restart mode, this method resets variables that you use in the Java code after a mapping run.
Lookup Transformation
The Lookup transformation can perform a lookup on a logical data object. The transformation can return one row or it can return multiple rows. You can configure the Lookup transformation to perform the lookup in a web service operation mapping.
Web Service Consumer Transformation
The Web Service Consumer transformation consumes web services in a mapping. The transformation can consume an Informatica web service or an external web service. The transformation returns related groups of output data from hierarchical SOAP response messages. Create a Web Service Consumer transformation from a WSDL data object.
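The single-row versus multi-row distinction for lookups can be illustrated with a toy function. This is a conceptual sketch only; the Lookup transformation itself is configured in the Developer tool, and the field names below are hypothetical.

```python
# Sketch of single-row vs. multi-row lookup behavior on a keyed table.
def lookup(rows, key, return_all=False):
    matches = [r for r in rows if r["id"] == key]
    if return_all:
        return matches  # lookup configured to return multiple rows
    return matches[0] if matches else None  # return a single matching row

orders = [{"id": 7, "item": "bolt"}, {"id": 7, "item": "nut"}]
single = lookup(orders, 7)                 # first matching row only
both = lookup(orders, 7, return_all=True)  # every matching row
```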

Web Services
Informatica web services provides data integration functionality through a web service interface. Create an operation mapping to define how the Data Integration Service processes the web service request. The operation mapping can include logical data objects or transformations. You can create a web service from a WSDL, or you can create a web service without using a WSDL. You can configure message layer security and transport layer security for a web service. Message layer security includes user authentication and user permissions.

Informatica Documentation
This section describes new documentation and enhancements to Informatica documentation.

Informatica Analyst User Guide


The Informatica Analyst User Guide is divided into the following guides:
Informatica Data Explorer User Guide
Informatica Data Integration Analyst User Guide
Informatica Data Quality Analyst User Guide

Informatica Getting Started Guide


The Informatica Getting Started Guide is divided into the following guides:
Informatica Administrator Getting Started Guide
Informatica Data Integration Analyst Getting Started Guide
Informatica Data Explorer Getting Started Guide
Informatica Data Quality Getting Started Guide
Informatica Data Services Getting Started Guide

Informatica Domain
This section describes new features and enhancements to the Informatica domain.

Connections
If you have PowerExchange, you can create an Adabas connection.


Content Management Service


The Content Management Service is an application service that manages address validation reference data properties and identifies the identity population data files that are available to Developer tool users. It updates the Data Integration Service with configuration information for the address reference data. The Match transformation and Comparison transformation display the list of currently installed identity population files, including any custom population file that you add to the system.

Data Integration Service


You can limit the amount of memory that the Data Integration Service allocates for requests such as data previews, mappings, and SQL queries. You can configure the maximum amount of memory that the Data Integration Service allocates for running all concurrent requests. You can also limit the maximum amount of memory that the Data Integration Service allocates for any request.

Data Transformation
You can install Data Transformation Engine and Data Transformation Studio through the Informatica platform installer. When you run the server installation, you can install or upgrade Informatica or you can install only Data Transformation Engine. When you run the client installation, you can install Informatica Developer, the PowerCenter Client, and Data Transformation Studio and Engine.

Dependencies
In the Services and Nodes view on the Domain tab, you can now view dependencies among application services, nodes, and grids.

Domain Configuration Database


You can create a domain configuration database on Sybase ASE.

Monitoring
You can configure the Service Manager to store historical run-time statistics about objects that run on a Data Integration Service. The Service Manager stores the statistics in the Model repository. You can view the statistics and reports in the Monitoring tab of the Administrator tool for different objects, such as applications, web services, logical data objects, and SQL data services. For example, you can view a report to determine the longest running jobs. You can also monitor objects from the Analyst tool and Developer tool.

Object Export and Import


You can use the infacmd command line program to migrate objects between different domains of the same version. You might migrate domain objects from a development environment to a test or production environment. Use the following infacmd commands:
infacmd isp ExportDomainObjects. Exports native users, native groups, roles, and connections to an XML file.
infacmd xrf GenerateReadableViewXML. Generates a readable XML file from the export file. Review the readable XML file to determine if you need to filter the objects that you import.
infacmd isp ImportDomainObjects. Imports native users, native groups, roles, and connections into an Informatica domain.
If you do not want to migrate all objects, use an infacmd control file to filter the objects during the export or import.


Permissions
Origin of Effective Permissions
You can view domain object, SQL data service, or web service permission details for a user or group. When you view permission details, you can view the origin of effective permissions. Permission details display direct permissions assigned to the user or group, direct permissions assigned to groups that the user or group belongs to, and permissions inherited from parent objects. In addition, permission details display whether the user or group is assigned the Administrator role, which bypasses permission checking.
Search Filters
You can use search filters to search for a user or group when you assign permissions, view permission details, or edit permissions.

Privileges
You can assign the following new types of privileges:
Connection Privileges
Assign the Manage Connection privilege to enable a user or group to create, edit, and remove connections.
Monitoring Privileges
Assign the monitoring privileges to enable a user or group to configure and view historical run-time statistics and reports. Sample monitoring privileges include Configure Global Settings, Configure Statistics and Reports, and Access Monitoring.
Profiling Privileges
Assign the Drilldown and Export Results privilege to enable a user to drill down or export profiling results.

Secure Communication
To configure services to use the Transport Layer Security (TLS) protocol to transfer data securely within the domain, enable the TLS protocol for the domain. When you enable TLS for the domain, services use TLS connections to communicate with other Informatica application services and clients.

Views
The Domain tab now has new views:
Services and Nodes view. View and manage services and nodes in the domain.
Connections view. View and manage connections in the domain.

Command Line Programs


This section describes new commands and options for the Informatica command line programs.

infacmd
With infacmd cms commands, you can create and remove a Content Management Service. The following table describes new infacmd cms commands:
CreateService. Creates the Content Management Service.
RemoveService. Removes the Content Management Service.


The following table describes a new infacmd dis command:


PurgeResultSetCache. Purges the result set caches for an application.

The following table describes new infacmd isp commands:


ExportDomainObjects. Exports native users, native groups, roles, and connections from an Informatica domain to an XML file.
ImportDomainObjects. Imports native users, native groups, roles, and connections from an XML file into an Informatica domain.
removeUserPermission. Removes permission on an object from a user.
removeGroupPermission. Removes permission on an object from a group.

The following table describes a new infacmd pwx command:


UpgradeModels. Upgrades PowerExchange 9.0.1 nonrelational data objects.

With infacmd ws commands, you can manage web services. The following table describes new infacmd ws commands:
ListOperationOptions. Lists operation options.
ListOperationPermissions. Lists operation permissions.
ListWebServiceOptions. Lists web service options.
ListWebServicePermissions. Lists web service permissions.
ListWebServices. Lists the web services in an application. If the application name is not provided, all web services are listed.
RenameWebService. Renames a web service.
SetOperationPermissions. Sets operation permissions.
SetWebServicePermissions. Sets web service permissions.
StartWebService. Starts a web service so it can receive web service requests.
StopWebService. Stops a web service.


UpdateOperationOptions. Updates operation properties.
UpdateWebServiceOptions. Updates web service properties.

With infacmd xrf commands, you can generate a readable XML file from an export file. You can also edit the readable XML file and update the changes in the export file. The following table describes new infacmd xrf commands:
GenerateReadableViewXML. Generates a readable XML file from the export file.
UpdateExportXML. Updates the export file with the changes made to the readable XML file.

The following table describes new options for infacmd commands:


isp ImportUsersAndGroups, option -exportedFromPowercenter. Specifies that the export file containing users and groups was exported from a PowerCenter version 8.6.1 domain.
mrs BackupContents, option -BackupSearchIndices. Saves the search index to the backup file.
oie ExportObjects, option -OverwriteExportFile. Overwrites the export file if it exists.
oie ExportObjects, option -ControlFilePath. Specifies the path and file name of the export control file. Use an export control file to filter the objects that are exported from the Model repository.
oie ExportObjects, option -OtherOptions. Specifies additional options to export reference table data.
oie ImportObjects, option -skipCRC. Skips the CRC check that detects whether the import file was modified.
oie ImportObjects, option -ConflictResolution. Configures how to handle conflicts during the import.
oie ImportObjects, option -ControlFilePath. Specifies the path and file name of the import control file. Use an import control file to filter the objects that are imported into the Model repository.
oie ImportObjects, option -SkipConnectionValidation. Skips connection validation during the import.
oie ImportObjects, option -OtherOptions. Specifies options to import reference table data.
ps purge, option -rd. Specifies the number of days that the profiling warehouse stores profile or scorecard results before it purges the results.
ps purge, option -pf. Identifies the Model repository project and folder where the profile or scorecard is stored.


Chapter 10: New Features and Enhancements (9.1.0)

Additional options for infacmd ps purge:
-pt. Specifies the name of the profile task.
-r. Purges results from folders recursively.
-pa. Purges all results for a specified profile or scorecard from the Profiling Warehouse.

infasetup
The following infasetup commands are updated to enable Transport Layer Security (TLS).
DefineDomain
DefineGatewayNode
DefineWorkerNode
UpdateGatewayNode
UpdateWorkerNode

pmrep
The following table describes new pmrep commands:
GenerateAbapProgramToFile. Generates the ABAP program for a mapping and saves the program as a file.
InstallAbapProgram. Generates and installs an ABAP program in the SAP system.
UninstallAbapProgram. Uninstalls the ABAP program. Uninstall an ABAP program when you no longer want to associate the program with a mapping.

Metadata Manager
This section describes new features and enhancements to Metadata Manager.

Business Glossary
When you create or edit a business term, you can add hyperlinks to any other business term in the same or a different business glossary. You can also provide links to external web pages as a reference for a business term. These internal and external links help you browse related business terms in the business glossary.

Class Properties
You can organize the way that class properties are displayed. When you edit the properties, you can drag them to change their order or to ensure that they appear in either the Basic or Advanced section of the class properties, in all Metadata Manager perspectives. For a class, the Source Creation Date, Source Update Date, MM Creation Date, and MM Update Date properties are referred to as the synthetic date properties. You can set the Show_Synthetic_Dates_In_Basic_Section property in the imm.properties file to specify whether these properties appear in the Basic or Advanced section.


Search Results Configuration


You can use the elements in the searchpriorities.xml file to create groups of class_ids, class_types, or favorite_types and assign a priority value to each group, where one indicates a low priority and ten indicates a high priority. The search results appear based on the priority assigned to the group. By default, searchpriorities.xml is in the following directory:
<Informatica installation directory>\services\shared\jars\pc\classes
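As a purely hypothetical sketch, a priority group along the lines described above might look like the following. The element and attribute names here are assumptions for illustration only; the authoritative schema is the searchpriorities.xml file shipped in the directory above.

```xml
<!-- Hypothetical sketch; element and attribute names are assumptions.
     Consult the shipped searchpriorities.xml for the real schema. -->
<searchpriorities>
    <!-- A group of class types with priority 10 (high priority). -->
    <group priority="10">
        <class_type>BusinessTerm</class_type>
    </group>
    <!-- A group of class types with priority 1 (low priority). -->
    <group priority="1">
        <class_type>Report</class_type>
    </group>
</searchpriorities>
```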

Hide or Display Empty and Read-only Properties


You can set the value of the Hide_Empty_Uneditable_Properties property in the imm.properties file to hide or display the empty and read-only properties in all Metadata Manager perspectives.

Filtering Impact Summary


You can filter the objects that appear in the impact summary based on the type of resource. The filter criteria are persisted across objects in the catalog.

JDBC Resource
You can create and configure a JDBC resource to extract metadata from any relational database management system that is accessible through JDBC. Informatica has tested the JDBC resource for IBM DB2/iSeries. You cannot connect to relational databases through ODBC. Where available, use the existing resource type that is specific to the relational database instead of the JDBC resource. The database-specific resource types perform better and extract more metadata aspects. For example, to load metadata from an Oracle database, create an Oracle resource instead of a JDBC resource.

Missing Links Report


You can set the value of the Missing_Links_Report_Limit property in the imm.properties file to limit the maximum number of missing links that you want to export.

backupCmdLine Command Line Program


backupCmdLine includes an optional nThreads argument and a task argument for the Backup and Restore commands. The value of the nThreads argument specifies the number of threads to use for the backup or restore task. The value must be greater than zero. The task argument specifies the type of task, which can be backup or restore.

Gathering Statistics
You can gather statistics for DB2 resources when the GatherStatistics property in the imm.properties file is set to Yes.
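Taken together, the imm.properties settings described in this Metadata Manager section might look like the following sketch. The property names come from this section; the values shown are illustrative assumptions, not product defaults.

```properties
# Illustrative values only; confirm the supported values in the product documentation.
# Show the synthetic date properties in the Basic section of class properties.
Show_Synthetic_Dates_In_Basic_Section=true
# Hide empty and read-only properties in all Metadata Manager perspectives.
Hide_Empty_Uneditable_Properties=true
# Limit the maximum number of missing links to export.
Missing_Links_Report_Limit=10000
# Gather statistics for DB2 resources.
GatherStatistics=Yes
```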

PowerCenter
This section describes new features and enhancements to PowerCenter.

PowerCenter Repository
You can create a PowerCenter repository on Sybase ASE.


Data Analyzer
This section describes new features and enhancements to Data Analyzer.

Data Analyzer Repository


You can create a Data Analyzer repository on Sybase ASE.

Adapters for PowerCenter


This section describes new features and enhancements to adapters for PowerCenter.

PowerExchange for Salesforce


PowerExchange for Salesforce uses version 20.0 of the Salesforce API.
You can configure the Enable Field Truncation Attribute session property to have Salesforce truncate overflow target data and write truncated data to a Salesforce target.
You can use a forced authentication HTTP proxy server.

PowerExchange for SAP NetWeaver


You can use pmrep to generate and install ABAP programs for mappings with SAP source tables.
You can load data to SAP target tables.
When you generate the ABAP program, you can require that SAP perform an authorization check on ABAP programs that are generated and installed from the Designer.

PowerExchange for SAS


You can join SAS sources in a mapping.
You can configure the SIP Server timeout parameter in the SPI Server. The SIP Server timeout determines the amount of time that the SIP Server waits for the PowerCenter Integration Service to connect to the SPI Server.

PowerExchange for Teradata Parallel Transporter API


PowerExchange for Teradata Parallel Transporter API supports Teradata Parallel Transporter version 13.10.
You can extract data from a Teradata source or load data to a Teradata target when the PowerCenter Integration Service runs on SUSE Linux Enterprise Server 11 and you use Teradata Parallel Transporter 13.10.
You can extract data from a Teradata source or load data to a Teradata target when the PowerCenter Integration Service runs on the SunOS x64 platform.


CHAPTER 11

Informatica Data Explorer (9.1.0)


This chapter includes the following topics:
Oracle Database Requirements
Profiling Warehouse

Oracle Database Requirements


Effective in 9.1.0, you can configure the NLS_CHARACTERSET and NLS_LENGTH_SEMANTICS environment variables. These variables ensure that the Profiling Service Module does not truncate Unicode characters when you profile a data source. Previously, the Profiling Service Module truncated Unicode characters when you profiled a data source and the profiling warehouse was an Oracle database.

Profiling Warehouse
Effective in 9.1.0, you purge result data for profiles or scorecards from the profiling warehouse. Run the infacmd ps purge command to purge the result data. Previously, the profiling warehouse purged result data when the Data Integration Service was idle.


CHAPTER 12

Informatica Data Quality (9.1.0)


This chapter includes the following topics:
Address Validator Transformation
Association Transformation
Data Quality Content Installer
Data Quality for Siebel
Decision Transformation
Exception Transformation
Export to PowerCenter
Match Transformation
Reference Data Configuration
Web Service Operations

Address Validator Transformation


Effective in 9.1.0 HotFix 3, the Address Validator transformation uses version 5.2.8 of the Address Doctor software engine.

Effective in 9.1.0 HotFix 3, when you run a mapping with an Address Validator transformation that is configured in Suggestion List mode, the transformation performs a runtime check to verify that all input ports belong to the Discrete port group. The mapping fails if the Address Validator transformation uses an input port from any other port group.

Effective in 9.1.0 HotFix 2, the Address Validator transformation uses version 5.2.7 of the Address Doctor software engine.

Effective in 9.1.0 HotFix 1, Address Validator transformation port names are composed of complete words separated by character spaces. Previously, all words and numbers were concatenated in Address Validator port names, and postal terms in transformation names were abbreviated. For example, the AddressComplete1 port is renamed to Address Complete 1, and the CBSAID port is renamed to Core-Based Statistical Area Identification.

Note: The port name changes do not affect transformation or mapping performance or Informatica version compatibility.


Association Transformation
Effective in 9.1.0, Association transformation behavior changes in the following ways:
The Association transformation accepts string and numerical values on association ports. If you configure a column of another data type as an association port, the transformation converts the port data values to strings. Previously, the Association transformation accepted string data only on association ports.

The association ID output is hard-coded as an integer. Previously, the association ID output was hard-coded as a string. If you upgrade to version 9.1.0, the Association transformation preserves the string data type on the association ID port and writes the port output values as a string.

Data Quality Content Installer


Effective in 9.1.0, Data Quality Content Installer behavior changes in the following ways:
The Data Quality Content Installer has a single executable installer. If you are a Data Quality user, run the installer on a machine that the Data Integration Service can access. If you are a PowerCenter user, run the installer on a machine that the PowerCenter Integration Service can access. Previously, the Data Quality Content Installer had an additional installer that you ran on the Informatica Developer machine or PowerCenter Designer machine.

The Data Quality Content Installer can install address validation reference data, identity population data, and sample data sources. Previously, the Content Installer also installed reference table data and Informatica rules. In Informatica 9.1.0, you use the Developer tool to import reference table data and Informatica rules to the Model repository.

Data Quality for Siebel


Effective in 9.1.0 HotFix 2, Data Quality for Siebel contains the following changes:
Data Quality for Siebel uses the Informatica 9 platform. Users can create mappings in Informatica Data Quality 9 and export these mappings to PowerCenter for use with Data Quality for Siebel. Previously, users created data quality plans in Data Quality 8.6 before exporting to PowerCenter.

Data Quality for Siebel uses Address Doctor for all realtime and batch address validation. Previously, Data Quality for Siebel used specific vendors for particular realtime or batch scenarios.

Decision Transformation
Effective in 9.1.0 HotFix 2, when you use a Decision transformation in a web service operation, the DTM resources remain in memory between data requests. Previously, a web service operation that included a Decision transformation stopped and started the DTM for each data request.


Effective in version 9.1.0, the Decision transformation implements the function INSTR(str1, str2) by looking for str2 in str1. Previously, the Decision transformation looked for str1 in str2. If you configured a Decision transformation with this function in Informatica 9.0.1, edit the transformation to reverse the order of the strings in the function.
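The argument-order change can be sketched with a Python stand-in for the INSTR function. The Decision transformation uses its own expression language, so this is only a model of the search semantics described above (1-based position, 0 when the substring is not found; the 1-based, 0-on-miss convention is an assumption for illustration).

```python
def instr_910(str1: str, str2: str) -> int:
    """9.1.0 behavior: look for str2 inside str1.

    Returns the 1-based position of the match, or 0 if not found.
    """
    return str1.find(str2) + 1

def instr_901(str1: str, str2: str) -> int:
    """9.0.1 behavior: look for str1 inside str2."""
    return str2.find(str1) + 1

# The same call produces different results under the two versions,
# which is why upgraded transformations must swap the arguments.
print(instr_910("informatica", "format"))  # found at position 3
print(instr_901("informatica", "format"))  # not found: 0
```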

Exception Transformation
Effective in version 9.1.0 HotFix 2, the Exception transformation contains the following change: column names that the Exception transformation writes to a staging table do not use the prefix "Ex_". Previously, all column names contained the "Ex_" prefix.

Effective in version 9.1.0 HotFix 1, the Exception transformation contains the following changes:
The Consolidation Exception type is called the Duplicate Record Exception. You choose a transformation type when you create an Exception transformation.


The names of port groups, views, and options changed. The following list shows the location, the previous name, and the current name:

Port group: "Labels" is now "Quality Issues".
Configuration view: "From" is now "Lower Threshold".
Priority view (view name): "Priority" is now "Issue Assignment".
Priority view: "Label Field" is now "Quality Issue".
Priority view: "Priority" is now "Label Priority".
Configuration view: "To" is now "Upper Threshold".
Configuration view, Data Routing Options: "Definite Matches" is now "Automatic Consolidation Records (Above upper threshold)".
Configuration view, Data Routing Options: "Potential Matches" is now "Manual Consolidation Records (Between thresholds)".
Configuration view, Data Routing Options: "Unique Records" is now "Unique Records (Below lower threshold)".

Exception transformations that identify duplicate records now write clusters with only one record to the unique record category. Previously, the transformation wrote these records to the definite matches category.

The Exception transformation no longer contains Match Score and IsMaster ports in the input and output groups. The removal of these ports does not affect transformation functionality.

The Score input port is no longer mandatory for Exception transformations that identify bad records. The Exception transformation can identify bad records by determining if quality issue ports contain data.


Export to PowerCenter
Effective in version 9.1.0, the earliest version of PowerCenter to which you can export mappings, mapplets, and logical data object read mappings is PowerCenter 8.6.1. Previously, you could export objects to PowerCenter 8.6.

Match Transformation
Effective in version 9.1.0 HotFix 2, identity match strategies contain primary required fields and secondary required fields. You must assign input ports to all primary required fields. You must assign input ports to at least one secondary required field. Previously, identity match strategies contained only primary required fields.

Effective in version 9.1.0 HotFix 1, the Execution Instances property on the Advanced view of the Match transformation is available only for Match transformations that use identity matching. Previously, the Execution Instances property was available for Match transformations that used field matching.

Reference Data Configuration


Effective in 9.1.0, you must add a Content Management Service to the domain before you can use address reference data and identity populations. To use address reference data, you must also configure address validation process properties in the Content Management Service. Previously, identity matching did not require a Content Management Service, and you configured the address validation process properties in the Data Integration Service.

Web Service Operations


Effective in 9.1.0 HotFix 1, when you add one or more of the following transformations to a web service operation, the DTM resources remain in memory between data requests:
Address Validator
Case Converter
Comparison
Labeler
Merge
Parser
Standardizer
Weight Based Analyzer

Previously, the web service operation stopped and started the DTM for each data request.


CHAPTER 13

Informatica Data Services (9.1.0)


This chapter includes the following topics:
Application Redeployment
Informatica Data Integration Analyst Action Menus
Deployment Menus
Export to PowerCenter
Logical Data Object Model Import
Web Services
Web Service Consumer Transformation

Application Redeployment
Effective in version 9.1.0, when you change an application that contains a mapping, and you redeploy the mapping with the update option, the Data Integration Service preserves the Administrator tool mapping redeployment properties. Previously, the Data Integration Service replaced the Administrator tool mapping deployment properties with the Developer tool mapping deployment properties.

Informatica Data Integration Analyst Action Menus


Effective in version 9.1.0 HotFix 1, the action menus that you use to perform mapping specification tasks have changed:

Remove object and target column relationships: changed from Actions > Unlink to Actions > Clear Transformation.
Load mapping specification results to the target: changed from Actions > Export to Actions > Run Mapping Specification > Load to Data Object.


Deployment Menus
Effective in version 9.1.0 HotFix 2, based on the object type that you select, the Deploy menu may contain a secondary menu. When you right-click the following objects in the Object Explorer view, the Deploy > Deploy as a web service option appears:
Flat file physical data objects
Relational physical data objects
Logical data objects
Transformations, except the Web Service Consumer transformation
Mapplets

When you right-click the following objects in the Object Explorer view, the Deploy > Deploy as SQL data service option appears:
Physical data objects
Logical data objects

Previously, when you right-clicked one of the following objects in the Object Explorer view, the Deploy option appeared with no secondary menu option:
Physical data objects
Logical data objects

Export to PowerCenter
Effective in version 9.1.0, the earliest version of PowerCenter to which you can export mappings, mapplets, and logical data object read mappings is PowerCenter 8.6.1. Previously, you could export objects to PowerCenter 8.6.

Logical Data Object Model Import


Effective in version 9.1.0, the option names and steps have changed for importing a logical data object model from an XSD file. You can also configure import properties for the model. The changes are as follows:

New dialog box: In 9.1, select Logical Data Object Model from Data Model. In 9.0.1, select Logical Data Object Model.
New Logical Data Object Model wizard: In 9.1, select W3C XML Schema 1.0 (XSD) as the model type and optionally configure import properties. In 9.0.1, select Create Existing Model from File.


Web Services
This section describes changes to web services.

Generate Requests in the Data Viewer View


Effective in version 9.1.0 HotFix 1, to generate a new web service request in the Data Viewer view, click the Reset button. Previously, you clicked Generate to generate a new web service request in the Data Viewer view.

Deleted WSDL Data Object


Effective in version 9.1.0 HotFix 1, if the associated WSDL data object is deleted from the repository, the Developer tool retains the location in the Location column for ports and nodes in the Input, Output, and Fault transformations. If you associate another WSDL with the web service, the Developer tool checks whether the location is valid. The Developer tool clears the location if the location is no longer valid. Previously, if you associated another WSDL with the web service, the Developer tool did not check whether the location in the Location column was valid.

Create a Web Service Wizard


Effective in version 9.1.0 HotFix 1, you can map ports to nodes for the Input, Output, and Fault transformations using the Create Web Service wizard. Previously, you could map ports to nodes only on the Transformation tab of the Input, Output, and Fault transformations.

Ports Tab
Effective in version 9.1.0 HotFix 1, you use the Ports tab to complete the following tasks:
Extract operation input nodes to output ports.
Extract input ports to the operation output nodes.
Extract input ports to operation fault nodes.

Previously, you extracted input ports to nodes in the operation hierarchy on the Transformation tab in the Properties view of the Output and Fault transformation. You extracted nodes to output ports on the Transformation tab in the Properties view of the Input Transformation.

Create Web Service from a WSDL Data Object Wizard


Effective in version 9.1.0 HotFix 2, you can map ports to nodes for the Input, Output, and Fault transformations using the Create Web Service from a WSDL data object wizard. Previously, when you created a web service from a WSDL data object, you could map ports to nodes only on the Ports tab of the Input, Output, and Fault transformations.

Web Services

79

Deployment
Effective in version 9.1.0 HotFix 2, you can right-click a mapplet, reusable transformation, logical data object, flat file data object, or relational data object and deploy it as a web service. Previously, you created a web service operation for a reusable object from the Create a Web Service wizard and then deployed the web service in an application.

Ports Tab Options


Effective in version 9.1.0 HotFix 2, some option names on the Ports tab of the Input, Output, and Fault transformation have changed:

Map (previously Extract)
Map first level hierarchy (previously Extract the first level of the hierarchy)
Map as XML (previously Extract As XML)

Cache Property in the Lookup Transformation


Effective in version 9.1.0 HotFix 3, when you create a web service from a customized data object, the Developer tool does not enable caching for the Lookup transformation in the operation mapping. Previously, when you created a web service from a customized data object, the Developer tool enabled caching for the Lookup transformation that it creates in the operation mapping.

Web Service Consumer Transformation


This section describes changes to the Web Service Consumer transformation.

Deleted WSDL Data Object


Effective in version 9.1.0 HotFix 1, if the associated WSDL data object is deleted from the repository, the Developer tool retains the location of the operation nodes in the output mapping. When you show the output mapping, the Ports area still displays the location of the operation nodes in the Location column for the output ports. If you associate another WSDL with the transformation, the Developer tool checks whether the location of each operation node is valid. The Developer tool clears the location of operation nodes in the Ports area of the output mapping if the location is no longer valid. Previously, if you associated another WSDL with the transformation, the Developer tool did not check whether the location of each operation node was valid.


Ports Tab
Effective in version 9.1.0 HotFix 1, you complete the following tasks on the Ports tab:
Extract input ports to nodes in the input operation hierarchy.
Extract nodes in the output operation hierarchy to output ports.

Previously, you extracted input ports to nodes in the operation hierarchy on the Transformation Input tab. You extracted nodes to output ports on the Transformation Output tab.

Ports Tab Options


Effective in version 9.1.0 HotFix 2, some option names on the Ports tab have changed:

Map (previously Extract)
Map first level hierarchy (previously Extract the first level of the hierarchy)
Map as XML (previously Extract As XML)


CHAPTER 14

Informatica Domain (9.1.0)


This chapter includes the following topics:
Analyst Service
Data Integration Service
Domain Management
infacmd Control Files
Model Repository Service
Permissions
Reports
Privileges

Analyst Service
Effective in version 9.1.0, you can associate a Metadata Manager Service with the Analyst Service. You can also configure the Metadata Manager Service Options in the Analyst Service properties. Associate a Metadata Manager Service with the Analyst Service to connect to the Metadata Manager Business Glossary when searching for business terms in the Analyst tool. When you upgrade to 9.1.0, you can edit the Analyst Service to configure the Metadata Manager Service options. You can select a Metadata Manager Service when you create the Analyst Service.

Data Integration Service


This section describes changes to the Data Integration Service.

Address Validation Reference Data Properties


Effective in version 9.1.0, you configure address validation reference data properties in the Content Management Service instead of in the Data Integration Service. Use the Content Management Service to configure the following address validation reference data properties:
License Key. License key to activate address validation reference data.
Reference Data Location. Location of the address validation reference data.
Full Pre-Load Countries. List of countries for which all available address reference data will be loaded into memory before address validation begins.
Partial Pre-Load Countries. List of countries for which the address reference metadata and indexing structures will be loaded into memory before address validation begins.
No Pre-Load Countries. List of countries for which no address reference data will be loaded into memory before address validation begins.
Full Pre-Load Geocoding Countries. List of countries for which all geocoding reference data will be loaded into memory before address validation begins.
Partial Pre-Load Geocoding Countries. List of countries for which geocoding metadata and indexing structures will be loaded into memory before address validation begins.
No Pre-Load Geocoding Countries. List of countries for which no geocoding reference data will be loaded into memory before address validation begins.
Memory Usage. Number of megabytes of memory that address validation can allocate.
Max Address Object Count. Maximum number of address validation instances to run at the same time.
Max Thread Count. Maximum number of threads that the address validation can use.
Cache Size. Size of cache for databases that are not preloaded. Caching reserves memory to increase lookup performance in reference data that has not been preloaded.

When you upgrade to 9.1.0, create a Content Management Service to update the properties.

HTTP Proxy Server Authentication


Effective in version 9.1.0 HotFix 1, the Service Manager encrypts the HTTP proxy server password if you configure the Data Integration Service to use an HTTP proxy server with authentication. The Data Integration Service decrypts the HTTP proxy server password when it runs Web Service Consumer transformations. After you upgrade, you must reset the HTTP proxy server password if you configured the Data Integration Service to use an HTTP proxy server with authentication. If you do not reset the password, then the Data Integration Service cannot successfully process Web Service Consumer transformations. Previously, the Service Manager did not encrypt the HTTP proxy server password.

Result Set Cache Properties


Effective in version 9.1.0 HotFix 1, you configure values for the following properties in bytes:
Maximum Total Disk Size
Maximum Per Cache Memory Size
Maximum Total Memory Size

Previously, you configured the Maximum Total Disk Size in megabytes, and you configured the Maximum Per Cache Memory Size and Maximum Total Memory Size in kilobytes. After you upgrade, the number that you configured for these properties is retained. You must update the number to match the number of bytes that you require.
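The unit conversion can be sketched as follows. The property names come from the list above; the configured values are illustrative assumptions, and the sketch assumes binary multiples (1 KB = 1024 bytes, 1 MB = 1024 * 1024 bytes).

```python
# Convert pre-HotFix 1 result set cache values to the bytes that the
# properties expect after the upgrade. Example values are assumptions,
# not product defaults.
KB = 1024
MB = 1024 * 1024

# Previously configured value and its old unit, per property.
old_values = {
    "Maximum Total Disk Size": (100, MB),        # was configured in megabytes
    "Maximum Per Cache Memory Size": (512, KB),  # was configured in kilobytes
    "Maximum Total Memory Size": (4096, KB),     # was configured in kilobytes
}

# Values to enter after the upgrade, in bytes.
new_values = {name: value * unit for name, (value, unit) in old_values.items()}
print(new_values["Maximum Total Disk Size"])  # 104857600
```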


Domain Management
Effective in version 9.1.0, domain management has changed in the following ways:
The details panel no longer appears on the Domain tab. Previously, you could view details on the Domain tab.

You can manage connections from the Connections view on the Domain tab. Previously, you managed connections from the Manage > Connections menu command.

infacmd Control Files


This section describes changes to infacmd control files.

Export Control Files for Domain Objects


Effective in version 9.1.0 HotFix 1, you can use an object name attribute and a time attribute in an objectList element. The infacmd isp ExportDomainObjects command exports objects that match both the specified object name and the time filter. For example, the following lines export user1 and user2 if they were created before the specified date:
<objectList type="user" createdBefore="2010-11-12 10:00:00 +0530">
    <object name="user1" />
    <object name="user2" />
</objectList>

Previously, you could not use both an object name and a time attribute in an objectList element.

Import Control Files for Domain Objects


Effective in version 9.1.0 HotFix 1, if you do not define an object element for the objectList element, then the default value for the select attribute is all. The infacmd isp ImportDomainObjects command imports all objects of the specified type. For example, the following line imports all roles:
<objectList type="role" resolution="replace" />

Previously, you had to set the select attribute to all to import all objects of the specified type.

Model Repository Service


This section describes changes to the Model Repository Service.

Backup File Directory


Effective in version 9.1.0, the Model Repository Service writes repository backup files to the service backup directory. The service backup directory is a subdirectory of the node backup directory with the name of the Model Repository Service. For example, a Model Repository Service named MRS writes repository backup files to the following location:
<node_backup_directory>\MRS

When you view backup files for a Model Repository Service, you can view the backup files for that service. Previously, the Model Repository Service wrote repository backup files to the node backup directory. When you viewed backup files for a Model Repository Service, you could view all backup files for all Model Repository Services running on the node.

Search Index Backup


Effective in version 9.1.0, when you back up a Model repository, the Model Repository Service saves the search index to the repository backup file to reduce the amount of time needed to restore the file. Previously, the Model Repository Service did not save the search index to the backup file. When you restored a backup file, the Model Repository Service re-indexed the search index. To use the previous behavior when you back up the repository, use the infacmd mrs BackupContents command and set the backup search index option to false.

Blocking Other Operations During a Backup


Effective in version 9.1.0, the Model repository backup operation blocks all other repository operations until the backup completes. Blocking other operations ensures that the Model Repository Service creates a consistent backup file. Previously, the Model Repository Service did not block other repository operations during a backup of the repository.

Permissions
Effective in 9.1.0, the Permissions tab for any domain object in the Administrator tool shows up to 1,000 users or groups. If there are more than 1,000 users or groups, a message appears that asks you to create a filter to limit the number of users or groups. Previously, the Permissions tab did not have a limit on the number of users or groups that it could display.

Reports
Effective in version 9.1.0 HotFix 3, you run the PowerCenter reports from the PowerCenter Client and view them in JasperReports Server. You run the Metadata Manager reports from Metadata Manager and view them in JasperReports Server. You must associate a Reporting and Dashboards Service with the PowerCenter Repository Service to view the PowerCenter reports in JasperReports Server. You must associate a Reporting and Dashboards Service with the Metadata Manager Service to view the Metadata Manager reports in JasperReports Server. Previously, you ran the reports using Data Analyzer.


Privileges
Connection Privileges
Effective in version 9.1.0, a user must have the Manage Connection privilege to create, edit, and remove connections. By default, only users with the Administrator role have the Manage Connection privilege. Previously, a user did not need a privilege to create, edit, or remove a connection. If you upgrade to 9.1.0, assign users and groups the Manage Connection privilege to enable them to create, edit, and remove connections.

Profiling Privileges
Effective in version 9.1.0, a user must have the Drilldown and Export Results privilege to drill down and export profiling results. By default, users are not assigned this privilege after you upgrade to 9.1.0. Previously, a user did not need a privilege to drill down or export profiling results. If you upgrade to 9.1.0, assign users and groups the Drilldown and Export Results privilege to enable them to drill down and export profiling results.


Chapter 14: Informatica Domain (9.1.0)

CHAPTER 15

Metadata Manager (9.1.0)


This chapter includes the following topics:
- Incremental Metadata Load, 87
- backupCmdLine Command Line Program, 87
- Business Glossary Custom Models, 88
- Class Properties, 88
- Hide or Display Empty and Read-only Properties, 88
- Link Reports, 89
- Search Results Configuration, 89
- Metadata Manager Agent Validation Level, 89
- Resources, 89

Incremental Metadata Load


Effective in version 9.1.0, when you configure Business Objects and Cognos resources, you can choose to incrementally load recent changes to the metadata instead of loading the complete metadata. After the first successful load, the Metadata Manager Agent incrementally loads metadata that changed after the last successful load. By default, the incremental extraction option is enabled for all Business Objects and Cognos resources. If you do not want to incrementally load a resource, you can edit the resource and disable the option. Previously, all metadata was loaded when you loaded a resource.

backupCmdLine Command Line Program


Effective in version 9.1.0 HotFix 3, the default value of the number of threads used in the backup and restore commands is 5. Previously, the default value was 1. Effective in version 9.1.0, backupCmdLine includes the task argument for the Backup and Restore command. Use the task argument to specify if the task type is backup or restore. Previously, you could type the backup or restore command followed by the required arguments.


The names of the arguments of the Backup and Restore command have also changed. The following table lists the previous and current names for the arguments:

Previous Argument Name    Current Argument Name
MM_DBTYPE                 dbType
MM_DB_CONNECTION_URL      jdbcURL
MM_DB_USER                user
MM_DB_PASSWORD            pass
MM_DB_NAME                file
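For example, a backup invocation that uses the task argument and the current argument names might look like the following hypothetical sketch. The connection values are placeholders, and the exact argument syntax should be verified against the backupCmdLine reference for your version:

```
backupCmdLine backup dbType=Oracle jdbcURL=<JDBC connection URL> \
    user=mm_user pass=<password> file=mm_backup.zip
```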

Business Glossary Custom Models


Effective in version 9.1.0, you cannot create a custom model using an existing business glossary model as a template. Previously, you could create custom models using packaged business glossary models and custom business models as templates.

Class Properties
Effective in version 9.1.0, you can set the Show_Synthetic_Dates_In_Basic_Section property in the imm.properties file to specify whether the synthetic date properties appear in the Basic or the Advanced section. After you upgrade to Metadata Manager 9.1.0, any previous change to the order of the class properties is lost. You must edit the model to manually update the order in which the class properties of metadata objects appear in all Metadata Manager perspectives. Previously, you could not set the location of the synthetic date properties.

Hide or Display Empty and Read-only Properties


Effective in version 9.1.0, you can set the value of the Hide_Empty_Uneditable_Properties property in the imm.properties file to hide or display the empty and read-only properties in all Metadata Manager perspectives. This property is enabled by default. After you install or upgrade to 9.1.0, to view the empty and read-only properties, update the property value in the imm.properties file. Previously, all empty and read-only properties were displayed in all Metadata Manager perspectives.
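Taken together, the two imm.properties settings described in this chapter might look like the following fragment. The values shown are illustrative; confirm the exact property semantics in the comments of the imm.properties file shipped with your installation:

```
# Show the synthetic date properties in the Basic section (assumed true/false values)
Show_Synthetic_Dates_In_Basic_Section=true
# Hide empty and read-only properties in Metadata Manager perspectives (enabled by default)
Hide_Empty_Uneditable_Properties=true
```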


Link Reports
Effective in version 9.1.0, the link details are not part of the acquisition summary logs or the Load Details tab. You can view the link details on the Link Details tab. The link summary contains the resource, connection, assigned database, assigned schema, links, missing links, and correctness (%) details. Previously, the link details were part of the Load Details tab.

Search Results Configuration


Effective in version 9.1.0, you can use the elements in the searchpriorities.xml file to create groups of class_ids, class_types, or favorite_types and assign a priority value to each group, where one indicates a low priority and ten indicates a high priority. The search results appear based on the priority assigned to the group. The searchpriorities.xml file is in the following directory:
<Informatica installation directory>\tomcat\shared\classes

After you upgrade to 9.1.0, you must reindex all resources to ensure that the default search priorities are applied to metadata objects in the Metadata Manager warehouse. If required, you can later configure the priorities of the various entities in the searchpriorities.xml file. Previously, you could not configure the priority of the search results.
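The structure below is a hypothetical illustration only. The element and attribute names in the searchpriorities.xml file shipped with Metadata Manager may differ, so copy the structure of the existing file rather than this sketch:

```
<!-- Hypothetical sketch: groups of class_types ranked from one (low) to ten (high) -->
<searchpriorities>
    <group priority="10">
        <class_type>BusinessTerm</class_type>
    </group>
    <group priority="3">
        <class_type>Report</class_type>
    </group>
</searchpriorities>
```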

Metadata Manager Agent Validation Level


Effective in version 9.1.0 HotFix 3, the default validation level of the Metadata Manager Agent is NONE. Previously, the default validation level was BASIC.

Resources
This section describes changes to resources.

Embarcadero ERStudio
Effective in version 9.1.0, you can specify the physical models from which you want to extract metadata. You can also import owner schemas for tables, views, and other database objects for physical models. Previously, Metadata Manager extracted the logical model and the physical model that was created from the logical model. Metadata Manager displayed all objects extracted from logical and physical models under the logical model.


SAP
Effective in version 9.1.0, you must install the SAP JCo 3 libraries before you can create an SAP R/3 resource or use an upgraded SAP R/3 resource. You can download the SAP JCo libraries that are specific to the operating system from SAP Service Marketplace. Previously, you had to install the SAP RFC libraries.


CHAPTER 16

PowerCenter (9.1.0)
This chapter includes the following topics:
- Session Recovery, 91
- Informatica Data Integration Analyst Action Menus, 91

Session Recovery
Effective in version 9.1.0 HotFix 3, the PowerCenter Integration Service resets mapping variables to the start value during session recovery.

Informatica Data Integration Analyst Action Menus


Effective in version 9.1.0 HotFix 1, the actions menus to perform mapping specification tasks have changed. The following table describes mapping specification tasks that have changed action menus:
Task                                                Changed from        Changed to
Remove object and target column relationships.      Actions > Unlink    Actions > Clear Transformation
Load mapping specification results to the target.   Actions > Export    Actions > Run Mapping Specification > Load to Data Object


CHAPTER 17

Informatica Development Platform (9.1.0)


This chapter includes the following topic:
- Relational Data Adapter API, 92

Relational Data Adapter API


Effective in version 9.1.0 HotFix 1, the name of the adapter properties file uses the format ConnectionTypeBundle_lang.properties. ConnectionType is the value of the connectionType attribute of the adapter and lang is the language code for the locale as defined by ISO 639. If the properties file does not require a language code suffix, the properties file name can use the format ConnectionTypeBundle.properties. Previously, the name of the adapter properties file used the format ConnectionType_lang.properties or ConnectionType.properties. When you upgrade the Relational Data Adapter API to version 9.1.0 HotFix 1, change the names of existing properties files to follow the new naming format.
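The rename from the old to the new format can be illustrated with a hypothetical adapter whose connectionType attribute is "MyJdbc". The directory and file names below are examples only:

```shell
# Hypothetical adapter with connectionType="MyJdbc"; file names are examples only.
cd "$(mktemp -d)"
touch MyJdbc_en.properties                          # old format: ConnectionType_lang.properties
mv MyJdbc_en.properties MyJdbcBundle_en.properties  # new format: ConnectionTypeBundle_lang.properties
ls MyJdbcBundle_en.properties
```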


CHAPTER 18

Adapters for PowerCenter (9.1.0)


This chapter includes the following topics:
- PowerExchange for HP Neoview Transporter, 93
- PowerExchange for Hadoop, 93

PowerExchange for HP Neoview Transporter


Effective in 9.1.0, PowerExchange for HP Neoview Transporter is deprecated, and Informatica will drop support in a future release. HP has discontinued selling Neoview. Informatica continues to support PowerExchange for HP Neoview for previous releases. Upgraded sessions continue to run in 9.1.0. You can upgrade from version 9.1.0 to a 9.1.0 HotFix version.

PowerExchange for Hadoop


Effective in version 9.1.0 HotFix 3, the Hadoop and Hive JAR files are installed in the following location:
Informatica\<version>\PowerCenter\Server\bin\javalib\hadoop\<DistributionName>

You do not need to configure the JVMClassPath custom property for supported Hadoop distributions. You do not need to delete the Hive table before you run a session again. You can overwrite the Hive table data.


Part III: Version 9.0.1


This part contains the following chapters:
- New Features and Enhancements (9.0.1), 95
- Informatica Data Quality and Informatica Data Explorer Advanced Edition (9.0.1), 108
- Informatica Data Services (9.0.1), 109
- Informatica Domain (9.0.1), 111
- PowerCenter (9.0.1), 119
- Metadata Manager (9.0.1), 121


CHAPTER 19

New Features and Enhancements (9.0.1)


This chapter includes the following topics:
- Version 9.0.1 HotFix 2, 95
- Version 9.0.1 HotFix 1, 96
- Version 9.0.1, 97

Version 9.0.1 HotFix 2


This section describes new features and enhancements in version 9.0.1 HotFix 2.

Informatica Data Services


This section describes new features and enhancements to Informatica Data Services.

Pushdown Optimization
A Data Integration Service that runs on a UNIX machine can push filter transformation logic to Microsoft SQL Server sources when the Data Integration Service uses an ODBC connection. When you configure the ODBC connection, configure Microsoft SQL Server as the ODBC provider.

Informatica Data Quality and Informatica Data Explorer Advanced Edition


This section describes new features and enhancements to Informatica Data Quality and Informatica Data Explorer Advanced Edition.

Match Transformation
You can use multiple threads to run a mapping with a Match transformation configured for identity matching. Use a system variable on the Data Integration Service machine to set the number of threads. You can set this variable on the PowerCenter Integration Service machine if you export the mapping to PowerCenter.

Profiling
You can create a mapping from a profile in the Developer tool. The mapping uses the profile source as a data source and converts any rule defined on the profile into transformations or mapplets.


Pushdown Optimization
A Data Integration Service that runs on a UNIX machine can push filter transformation logic to Microsoft SQL Server sources when the Data Integration Service uses an ODBC connection. When you configure the ODBC connection, configure Microsoft SQL Server as the ODBC provider.

Repository
The Data Quality 8.6.2-9.0.1 repository migration process migrates all reference data files from the 8.6.2 file system unless the files are present in the default Data Quality Content Installer file set. You do not need to install country pack or accelerator pack reference data to Data Quality 9.0.1 if the dictionary files used by the 8.6.2 plans are present on the 8.6.2 file system when you start the migration process.

Validation Rule Pushdown Optimization


A Data Integration Service can push validation rule logic to a relational database source when the rule statement is convertible to an SQL equivalent in the database.

Version 9.0.1 HotFix 1


This section describes new features and enhancements in version 9.0.1 HotFix 1.

Informatica Domain
This section describes new features and enhancements to the Informatica domain.

Upgrade with Changes to Node Configuration
Informatica provides an upgrade option to change the node configuration.

Informatica Data Quality and Informatica Data Explorer Advanced Edition


This section describes new features and enhancements to Informatica Data Quality and Informatica Data Explorer Advanced Edition.

Informatica Data Quality Transformations


Address Validator transformation
The Address Validator transformation supports additional output port options from the Address Doctor 5.1.3 engine.

Parser transformation
When you define a pattern-based parsing operation, you can use wildcards to search for multiple consecutive instances of the same token. For example, you can define WORD+ as a pattern to search a string for any number of consecutive words.

Match transformation
The Match transformation can accept data on its GroupKey input port in identity matching mode. Use the GroupKey port to reduce identity match and field match operation times.


Profiling
Column Profile Results
Profile results report three new types of information: the inferred datatype, the minimum value, and the maximum value in the column.

Join Analysis Results
When you perform join analysis on data sources in the Developer tool, you can export the common and unique records from the Data Viewer to a flat file.

Pushdown Optimization
The Data Integration Service can push filter transformation logic to SAP sources. The Data Integration Service can push filter transformation logic to Sybase ASE sources when the Data Integration Service uses an ODBC connection. When you configure the ODBC connection, configure Sybase as the ODBC provider.

Repository Migration
The process to migrate objects from the Informatica Data Quality 8.6.2 repository to the 9.0.1 Model repository provides greater support for address validation operations defined in Data Quality 8.6.2.

Informatica Data Services


This section describes new features and enhancements to Informatica Data Services.

Pushdown Optimization
The Data Integration Service can push filter transformation logic to SAP sources. The Data Integration Service can push filter transformation logic to Sybase ASE sources when the Data Integration Service uses an ODBC connection. When you configure the ODBC connection, configure Sybase as the ODBC provider.

Metadata Manager
This section describes new features and enhancements to Metadata Manager.

Microsoft SQL Server Authentication


You can authenticate the database user credentials using the Windows protocol and establish a trusted connection to a Microsoft SQL Server repository when you complete the following tasks:
- Create the Metadata Manager Service when the Metadata Manager repository is on Microsoft SQL Server.
- Create a Microsoft SQL Server resource.
- Create a PowerCenter resource when the PowerCenter repository is on Microsoft SQL Server.
- Back up a Metadata Manager repository on Microsoft SQL Server.
- Restore a Metadata Manager repository on Microsoft SQL Server.

Version 9.0.1
This section describes new features and enhancements in version 9.0.1.


Informatica Data Quality and Informatica Data Explorer Advanced Edition


This section describes new features and enhancements to Informatica Data Quality and Informatica Data Explorer Advanced Edition.

Analyst Viewer Permissions


You can create a user or group in the Administrator tool with read-only permissions on objects in the Analyst tool. Analyst Viewer users and groups can view a profile, scorecard, bad or duplicate record table, or reference data table. They cannot create, edit, or run an object or export profile results to reference tables.

Informatica Content
You can define an INFA_CONTENT environment variable on a PowerCenter Integration Service machine to set the path PowerCenter uses to read reference data. Use this environment variable if you cannot install the reference data to the expected location on the PowerCenter machine.
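For example, on a UNIX machine you might point PowerCenter at an alternative reference data location as follows. The path is a placeholder; set it to the directory where the reference data actually resides:

```shell
# Hypothetical reference data location; use the actual directory on your machine.
export INFA_CONTENT=/data/informatica/reference_data
echo "$INFA_CONTENT"
```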

Informatica Data Quality Transformations


Address Validator transformation
You can configure the Address Validator transformation to generate Coding Accuracy Support System (CASS) reports for United States addresses and Software Evaluation and Recognition Program (SERP) reports for Canadian addresses. CASS and SERP reports verify that the address validation operations performed on the source address data meet standards set by the USPS and Canada Post.

Multi-strategy transformations
You can configure multiple strategies within a single transformation object for the following transformations: Case, Decision, Key Generator, Labeler, Match, Merge, Parser, and Standardizer. You can apply each strategy to a set of data ports in a mapplet or mapping.

Decision transformation
Use the Decision transformation to write business rules using IF-THEN-ELSE logic and regular expressions and apply the rules to selected ports in a mapplet or mapping.

Note: You must have Informatica Data Quality to use the Data Quality transformations.

Mapping Export and Import


You can use the Developer tool to import mappings, mapplets, transformations, and any reference table data used by these objects to the Model repository in a single step.

Mapping Performance
You can tune the performance of mappings by updating the mapping optimizer level through the mapping configuration or mapping deployment properties. The optimizer level determines which optimization methods the Data Integration Service applies to the mapping at run time. You can choose no, minimal, normal, or full optimization for mappings.

Parameters and Parameter Files


You can create parameters to define values that change between mapping runs. For example, create a parameter to represent a connection so you can run one mapping with different relational source connections. Create a parameter file to define parameter values. The Data Integration Service applies the parameter values when you run a mapping from the command line and specify the parameter file.


Physical Data Objects


Customized physical data objects
You can create customized physical data objects to perform the following tasks:
- Join source data that originates from the same source database.
- Select distinct values from the source.
- Filter rows when the Data Integration Service reads source data.
- Specify sorted ports.
- Specify an outer join instead of the default inner join.
- Create a custom query to issue a special SELECT statement for the Data Integration Service to read source data.
- Add pre- and post-mapping SQL commands.
- Define parameters for the data object.
- Retain key relationships when you synchronize the object with the sources.

You can add customized physical data objects to mappings and mapplets as read, write, or lookup objects.

Create and import
- You can create physical data objects from flat file, nonrelational database, relational database, and SAP resources.
- You can create physical data objects from resources that contain Developer tool illegal characters or reserved words. For example, you can import a view named "CONCAT" or a table that contains a column with a period in the column name.
- You can import tables, synonyms, and views from databases that use mixed case metadata. For example, you can import tables "CUST" and "Cust" as separate physical data objects.

Profiles
Profile comments
You can choose to delete profile comments on a profile.

Profile results
- You can profile multilingual data from different data objects and view profile results based on the locale settings in the browser. The Analyst tool changes the Datetime, Numeric, and Decimal datatypes based on the browser locale.
- You can sort on multilingual data. The Analyst tool displays the sort order based on the browser locale.
- After you run a profile, the Analyst tool purges the last profile run results from the profiling warehouse.
- The profiling warehouse stores 16,000 unique highest frequency values, including NULL values, for profile results by default.
- If there is at least one NULL value in the profile results, the Analyst tool can display NULL values as patterns.

Running a profile
- After you add a rule to a profile that has previously run, you can select the rule and associated columns and run the profile again. The Analyst tool displays the previous profile results and the recent rule and column results. You can modify the rule and run the profile again to view changes to profile results for the rule.
- When you run a profile, you can choose to discard the profile results for previously profiled columns and display results for the columns and rules selected for the latest profile run.


Drilling down on columns
- You can choose to drill down on profile data for a column.
- You can select the drill-down columns without profiling all the source columns again after you run the profile.

Repository
You can migrate objects from the Informatica Data Quality 8.6.2 repository to the 9.0.1 Model repository. You can also migrate reference data files used by the objects.

Rules
You can choose to drill down on live data for a rule. You can select the rules for drill down without profiling all the source columns again after running the profile.

Scorecards
- You can group related scores within a scorecard to view a set of scores for a particular business concept. When you add a profile column to a scorecard, you can choose to add it to a group. You can add a score to a group within a scorecard. You can move scores between groups within a scorecard and edit and remove groups from a scorecard.
- You can select columns in the scorecard before running a scorecard again. You can choose to drill down on live data for columns in a score.
- You can view the total number of valid and not valid records and the refresh date for each score in the scorecard.

Informatica Data Services


This section describes new features and enhancements to Informatica Data Services.

Applications
- Objects in the Application view are sorted by default.
- Projects in the Application view have a new icon. They do not use the folder icon anymore.
- You can rename an application in the Administrator tool.
- You can refresh the Application view to update newly deployed, undeployed, and restored applications.
- You can update an application to resolve the conflict when you use the Administrator tool to deploy an application with the same name as an existing application. Also, when you select the update or replace option during a conflict, you can select an option to stop the existing application if it is running.

Custom Data Transformation


You can configure a Custom Data transformation in a mapping. The Custom Data transformation processes unstructured and semi-structured file formats, such as messaging formats, HTML pages, and PDF documents. The Custom Data transformation also processes structured formats such as ACORD, HL7, EDI-X12, EDIFACT, and SWIFT. The Custom Data transformation calls a Data Transformation service to process the data.

Data Object Cache


You can purge data object and virtual table cache for an application. You can purge the cache from the Applications view in the Administrator tool. You can also purge cache with the infacmd PurgeDataObjectCache command line program.


Mapping Performance
You can tune the performance of mappings by updating the mapping optimizer level through the mapping configuration or mapping deployment properties. The optimizer level determines which optimization methods the Data Integration Service applies to the mapping at run time. You can choose no, minimal, normal, or full optimization for mappings.

Parameters and Parameter Files


You can create parameters to define values that change between mapping runs. For example, create a parameter to represent a connection so you can run one mapping with different relational source connections. Create a parameter file to define parameter values. The Data Integration Service applies the parameter values when you run a mapping from the command line and specify the parameter file.

Physical Data Objects


Customized physical data objects
You can create customized physical data objects to perform the following tasks:
- Join source data that originates from the same source database.
- Select distinct values from the source.
- Filter rows when the Data Integration Service reads source data.
- Specify sorted ports.
- Specify an outer join instead of the default inner join.
- Create a custom query to issue a special SELECT statement for the Data Integration Service to read source data.
- Add pre- and post-mapping SQL commands.
- Define parameters for the data object.
- Retain key relationships when you synchronize the object with the sources.

You can add customized physical data objects to mappings and mapplets as read, write, or lookup objects.

Create and import
- You can create physical data objects from flat file, nonrelational database, relational database, and SAP resources.
- You can create physical data objects from resources that contain Developer tool illegal characters or reserved words. For example, you can import a view named "CONCAT" or a table that contains a column with a period in the column name.
- You can import tables, synonyms, and views from databases that use mixed case metadata. For example, you can import tables "CUST" and "Cust" as separate physical data objects.

Staging Database
The staging database properties include the database connection name and the properties for an IBM DB2 EEE database or a Microsoft SQL Server database.

SQL Data Services


- You can view the JDBC string for an SQL data service in the general properties of the SQL Data Service view in the Administrator tool.
- You can rename an SQL data service in the Administrator tool.


Virtual Data
Data preview
When you preview virtual table data, you can view a graphical representation of the SQL query you enter. You can view the query plan for the original query and for the optimized query. Use the query plan to troubleshoot queries that end users run against a deployed SQL data service. You can also use the query plan to troubleshoot your own queries and to understand the log messages.

Column level security
You can set permissions at the column level to deny queries against a column in a virtual table. You can restrict user access to a column without denying the user access to the table. You can fail a query that selects the column or replace the column value with a default value in a query. Configure column-level security with infacmd.

Informatica Domain
This section describes new features and enhancements to the Informatica Domain.

Connections
Pass-through security
The Data Integration Service uses the client user name and password for connection objects in an SQL data service. The Data Integration Service connects to source objects with the client credentials instead of the default credentials from the connection object. You can restrict users from the data in an SQL data service based on user permissions on the physical data object.

Object names
The Data Integration Service can generate SQL against Oracle, DB2, Microsoft SQL Server, or ODBC connections that have case-sensitive table and column names. You can use the Administrator tool or Developer tool to configure the connection. You can specify whether to include quotes around table and column names in the connection.

Microsoft SQL Server
- You can use the Administrator tool to specify the owner name and schema name for a Microsoft SQL Server connection.
- You can use the Administrator tool or Developer tool to configure a Microsoft SQL Server connection as a trusted connection in the domain.

IBM DB2
You can use the Administrator tool to specify the tablespace name for an IBM DB2 connection.

Connection types
If you have PowerExchange, you can create the following connection types:
- DB2 for i5/OS
- IMS
- Sequential z/OS
- VSAM

If you have PowerExchange for SAP NetWeaver, you can create the following connection type:
- SAP


Connection permissions
You can assign users the read, write, and execute permissions on the database connection. The execute permission grants other users the ability to preview data and run profiles and scorecards on data objects created with the connection.

Command Line Program Environment Variables


INFA_DEFAULT_PWX_OSEPASSWORD
Stores the encrypted password for the operating system. You can use INFA_DEFAULT_PWX_OSEPASSWORD to set an encrypted password or use the corresponding command line -e option on infacmd pwx commands. You can set this password to access the operating system.

INFA_DEFAULT_PWX_OSPASSWORD
Stores the plain text password for the operating system. You can use INFA_DEFAULT_PWX_OSPASSWORD to set a plain text password or use the corresponding command line -p option on infacmd pwx commands. You can set this password to access the operating system.

INFA_PASSWORD
Stores the encrypted password for miscellaneous options when you set LDAP connectivity, define gateway nodes, define worker nodes, create new users, reset user passwords, and update SMTP options. You can use INFA_PASSWORD to set a password or use the corresponding command line option. You can set this password to access your Informatica user account, data source, LDAP services, and outbound mail server.

INFA_REPOSITORY_PASSWORD
Stores the default password for the PowerCenter repository. You can use INFA_REPOSITORY_PASSWORD to set a password or use the corresponding command line option. You can set this password to create a PowerCenter Integration Service, an SAP BW Service, or a Web Services Hub, and to get log events for the most recent run of a session.
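As a sketch, the variables might be set in a UNIX shell before you run the related commands. All values below are placeholders; encrypted values are normally produced by an encryption utility rather than typed in plain text:

```shell
# All values are placeholders; substitute values generated for your environment.
export INFA_DEFAULT_PWX_OSEPASSWORD="<encrypted-os-password>"
export INFA_DEFAULT_PWX_OSPASSWORD="<plain-text-os-password>"
export INFA_PASSWORD="<encrypted-password>"
export INFA_REPOSITORY_PASSWORD="<encrypted-repository-password>"
echo "$INFA_PASSWORD"
```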

Domain Configuration
You can use infasetup to back up and restore the domain.
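A domain backup using infasetup might look like the following sketch. All connection values are placeholders, and the option names should be verified against the infasetup BackupDomain reference for version 9.0.1:

```
infasetup BackupDomain -da dbhost:1521 -du domain_user -dp <password> \
    -dt Oracle -ds orcl -bf domain_backup.tar -f
```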

License Management Report


The License Management Report is enhanced to provide more statistics, such as user statistics and CPU statistics for each host. The report contains information for all licenses assigned to the domain. An administrator can track the number of times a user logs in to the Analyst tool and how often the user runs profiles and scorecards.

Service Upgrade
Use the Service Upgrade Wizard in the Administrator tool to upgrade multiple services at one time. You can save the upgrade report to a file.

Metadata Manager
This section describes new features and enhancements to Metadata Manager.

Browse Metadata
Impact summary
When you view a business term or any metadata object except PowerCenter objects on the Browse tab, you can view impacted objects from the most relevant classes. The most relevant classes include business intelligence reports, relational database tables, PowerCenter mappings, and business terms.


Business Glossary
Email notifications
Metadata Manager sends an email notifying users about the following events:
- A data steward proposes a draft business term for review. Metadata Manager displays the email options so that the data steward can send an email to other users.
- A user adds a comment to a business term in any phase. Metadata Manager sends an email to the data steward assigned to the term.

Data Lineage
SQL inline views for PowerCenter and relational resources
You can view data lineage on a database table, view, or synonym used in an SQL query with an inline view. The SQL query can exist in the following objects:
- SQL override in a PowerCenter Source Qualifier or Lookup transformation.
- Database views, stored procedures, functions, and triggers.

Custom objects
- When you run data lineage analysis on a custom object, the data lineage diagram includes the custom object and all child custom objects.
- You can view correct data lineage for relationships created between a custom metadata object or a business term and the following PowerCenter objects:
  - Reusable transformation or session. When you run data lineage analysis on the custom object or business term, the data lineage diagram displays the PowerCenter object and all instances of the object. When you run data lineage analysis on any instance of the PowerCenter object, the data lineage diagram displays the associated custom metadata object or business term.
  - Instance of a transformation or session. When you run data lineage analysis on the custom object or business term, the data lineage diagram displays the PowerCenter object instance within the context where the instance is used. For example, the diagram shows a transformation instance within its corresponding mapping and session context.
  - Shortcut. When you run data lineage analysis on the custom object or business term, the diagram displays the instances of the original object and all instances of the shortcuts to the original object.

PowerCenter
This section describes new features and enhancements to PowerCenter.

Mapping Analyst for Excel
User interface. Mapping Analyst for Excel includes the following user interface enhancements:
- Simple or advanced view of a worksheet by displaying the minimum or maximum columns available.
- Icons for the most common tasks.
- Annotation of cells with descriptions from other worksheets.


Chapter 19: New Features and Enhancements (9.0.1)

Domains and enumerations. You can configure domains and enumerations to define reference data within a mapping specification. A domain is a reference table. An enumeration includes the reference table values.
Reusable rules. You can define reusable rules to use as expressions on the Mappings worksheet. You can use rules in a mapping specification to perform simple data cleansing.
Validation. When you validate a mapping specification, Mapping Analyst for Excel provides more detailed error messages.
Multiple mappings. You can configure multiple mappings in a single mapping specification based on the Standard mapping specification template.

Mapping Objects
Data Transformation source and target. You can configure a Data Transformation source or a Data Transformation target in a mapping. The Data Transformation source and the Data Transformation target process unstructured and semi-structured file formats, such as messages, HTML pages, and PDF documents. The source and target also transform structured formats such as HIPAA, HL7, EDI-X12, and EDIFACT. The Data Transformation source and Data Transformation target call a Data Transformation service. The Data Transformation service is the application that transforms the unstructured and semi-structured file formats. The Data Transformation service receives data from the PowerCenter Integration Service, transforms the data, and returns it to the PowerCenter Integration Service.
Unstructured Data transformation. The Unstructured Data transformation accepts hierarchical groups of input ports. You can pass data that represents relational tables. Groups are related by primary key-foreign key relationships. To increase performance, you can flush sorted input data to the Unstructured Data transformation.
Identity Resolution transformation. The Identity Resolution transformation is an active transformation that you can use to search and match data in databases. The PowerCenter Integration Service uses the search definition that you specify in the Identity Resolution transformation to search and match data residing in the Informatica Identity Resolution (IIR) tables. The input and output views in the search definition determine the input and output ports of the transformation. Configure match tolerance and search width parameters in the Identity Resolution transformation to determine the matching scheme and search level. The Identity Resolution transformation returns the candidate records along with the search link port, respective scores, and the number of records found for the search.

PowerExchange
This section describes new features and enhancements to PowerExchange.

CDC Support for Oracle Materialized Views
If you use Oracle materialized views, PowerExchange can capture change data from the tables that underlie those views. PowerExchange CDC supports any type of materialized view.


CDC Support for Oracle Transparent Data Encryption
If you use Oracle Transparent Data Encryption (TDE) to encrypt data, PowerExchange can capture change data without your having to perform any additional CDC configuration task.

Data Maps Caching
If you configure a cache size, PowerExchange caches NRDB data maps on the system where they are used by a single PowerExchange Listener, or by multiple PowerExchange Listener, netport, and batch jobs.

DB2 for i5/OS and DB2 for z/OS Stored Procedures as a Source
If you use the PowerExchange Client for PowerCenter (PWXPC) in PowerCenter, you can now execute DB2 for i5/OS and DB2 for z/OS database stored procedures as override SQL for a data source.

GetCurrentFileName Function
For a data map record defined for a nonrelational data source, the GetCurrentFileName function gets the name of the source data file. Use this function to determine from which data file the data for a record was read.

infacmd pwx Commands to Manage PowerExchange Application Services
- infacmd pwx commands to manage a PowerExchange Listener Service. With the infacmd pwx program, you can issue the CloseListener, CloseForceListener, ListTaskListener, and StopTaskListener commands from the command line to manage a PowerExchange Listener Service.
- infacmd pwx commands to manage a PowerExchange Logger Service. With the infacmd pwx program, you can issue the CondenseLogger, DisplayAllLogger, DisplayCheckpointsLogger, DisplayCPULogger, DisplayEventsLogger, DisplayMemoryLogger, DisplayRecordsLogger, DisplayStatusLogger, FileSwitchLogger, and ShutDownLogger commands from the command line to manage a PowerExchange Logger Service.
To issue commands to a PowerExchange process that is not managed by a PowerExchange application service, you must use the pwxcmd program.

Writer Partitioning for Bulk Data Movement Sessions


You can use pass-through partitioning at writer partition points for bulk data movement sessions that have VSAM or sequential file targets. The writer partitions process insert operations only. Because the partitions process inserts concurrently, this feature can help improve session performance. If you enable offload processing, offload processing also runs in the partitions concurrently.

Enhanced Timeout Processing


PowerExchange provides the following types of timeouts for unsuccessful connection attempts or failed connections:
- Connection timeouts detect unsuccessful connection attempts.
- Heartbeat timeouts detect a failure of the PowerExchange client or PowerExchange Listener to send or receive heartbeat data.
- Network operation timeouts detect network operations that exceed the timeout period.

Adapters for Data Quality and Data Services


This section describes new features for adapters that you can use with Informatica Data Quality and Informatica Data Services.

PowerExchange for SAP NetWeaver
You can integrate PowerExchange for SAP NetWeaver with any SAP industry solution or mySAP application to read data from mySAP applications using the ABAP program. You can use Informatica Developer (the Developer tool) to create an SAP data object, add tables to the data object, and create a read operation that reads data from the SAP tables. Generate and install the ABAP program on the SAP server that extracts the source data. The Data Integration Service accesses staged files through FTP, SFTP, or standard file I/O, typically using network file sharing, such as NFS.

Adapters for PowerCenter


This section describes the new features and enhancements to PowerExchange for Teradata Parallel Transporter API.

PowerExchange for Teradata Parallel Transporter API
- You can provide the query band expression that you want to pass to Teradata Parallel Transporter API as a session property. A query band expression is a set of name-value pairs that identify a query's originating source.
- PowerExchange for Teradata Parallel Transporter API supports Teradata Parallel Transporter version 13.
- You can load data in parallel through multiple instances into a Teradata PT API target from a source. You can provide the number of instances in the session properties.
- You can pause active loading to a Teradata PT API target in a session. You can acquire data from several sources with subsequent runs of the session. Run the session without staged loading to load data from all sources at once to the Teradata PT API target.
- You can extract data from a Teradata source or load data to a Teradata target when the PowerCenter Integration Service runs on Windows 2008 R2.
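As an illustration, a query band expression is written as a semicolon-terminated list of name=value pairs. The names and values below are hypothetical examples, not required settings:

```
ApplicationName=PowerCenter;LoadType=Nightly;Importance=High;
```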


CHAPTER 20

Informatica Data Quality and Informatica Data Explorer Advanced Edition (9.0.1)
This chapter includes the following topic:
- Transformations

Transformations
Effective in version 9.0.1, the configurable options for some transformations have new names.

For example, most transformations use the term Strategy to specify the data operations defined in the transformation. Previously, transformations used different names for user-defined data operations.
Effective in version 9.0.1, the configurable options for some transformations are redesigned.

For example, the design of the Labeler transformation is simplified. Previously, you selected token labeler or character labeler mode when you created a Labeler transformation. This distinction is removed.


CHAPTER 21

Informatica Data Services (9.0.1)


This chapter includes the following topics:
- Relational Physical Data Objects
- Relational Physical Data Object Source and Target Transformations
- Relational Physical Data Object Keys
- Flat File Physical Data Object File Path

Relational Physical Data Objects


Effective in version 9.0.1, there are two types of relational physical data objects. Relational data objects are physical data objects that use relational resources as sources. Customized data objects also use relational resources as sources, but they allow you to perform tasks such as joining data from related resources and filtering rows. When you upgrade to version 9.0.1 of the Developer tool, the relational physical data objects you created in version 9.0 of the Developer tool are customized data objects. Therefore, you can configure the data objects to perform the following tasks:
- Join source data that originates from the same source database.
- Select distinct values from the source.
- Filter rows when the Data Integration Service reads source data.
- Specify sorted ports.
- Specify an outer join instead of the default inner join.
- Create a custom query to issue a special SELECT statement for the Data Integration Service to read source data.
- Add pre- and post-mapping SQL commands.
- Define parameters for the data object.
- Retain key relationships when you synchronize the object with the sources.

Previously, you could not perform these tasks within a relational physical data object.


Relational Physical Data Object Source and Target Transformations


When you view a customized data object, the Read and Write views contain source and target transformations that display the metadata of the native, relational resources. By default, these transformations have the same name as the relational resource. When you upgrade a relational physical data object from version 9.0, it becomes a customized data object. The source transformation in the Read view has the name "Read" instead of the name of the relational resource. Similarly, the target transformation in the Write view has the name "Write" instead of the name of the relational resource. The names of the source and target transformations have no effect on the query that the Developer tool executes against the relational resource. If you want to, you can rename the source and target transformations.

Relational Physical Data Object Keys


Effective in version 9.0.1, when you upgrade a relational physical data object, it becomes a customized data object. In customized data objects, a foreign key must be linked to a primary key in another resource or to a primary key in the same resource. If you delete a primary key, and the primary key is referenced by a foreign key, the Developer tool also removes the foreign key. Because foreign keys in a 9.0 relational physical data object do not reference any primary key, the upgrade process removes foreign keys from the relational physical data object. To restore key relationships, synchronize the objects that contain the primary key and foreign key at the same time.

Flat File Physical Data Object File Path


Effective in version 9.0.1, if you use the Developer tool to change a flat file data object that you uploaded in Analyst tool version 9.0, the Analyst tool displays the file path instead of "Uploaded." Previously, the Analyst tool displayed the file path as "Uploaded."


CHAPTER 22

Informatica Domain (9.0.1)


This chapter includes the following topics:
- 8.6.1 Features in 9.0.1
- Connections
- Command Line Programs
- Data Integration Service
- LDAP
- Logs
- Maximum Heap Size
- Node Diagnostics
- Reports

8.6.1 Features in 9.0.1


This section describes features that exist in versions 8.6.1 and 9.0.1. These features do not exist in version 9.0.

Alerts
You can subscribe to domain and service alerts to receive them through email. You can use infacmd isp to configure alerts.

Domain Configuration
You can use the command line interface to back up and restore the domain.

Permissions
Connection permissions. You can use the Administrator tool or command line interface to configure permissions on connections to limit the users that can perform particular actions on a connection. In 9.0, you could not assign permissions on connections. If you upgrade from 9.0 to 9.0.1, connections that were created in 9.0 remain without permissions. Assign permissions on the 9.0 connections to restrict access to them.
Service and folder permissions. Unless a specific permission is assigned, services and folders inherit permissions from the parent folder.
Domain administrator role permissions. Domain administrators automatically inherit permissions for all services and folders, including new services and folders that are created after the user is assigned the Administrator role.


Users
Case-sensitive distinguished name attributes. You can configure Informatica to support case-sensitive distinguished name attributes for LDAP security domains. If you upgrade from 9.0 to 9.0.1, distinguished name attributes that were created in 9.0 are still not case-sensitive. You can configure 9.0 distinguished name attributes to be case-sensitive.
User preferences. You can configure user preferences to subscribe to domain and service alerts and to show custom properties. If you upgrade from 8.x to 9.0.1, the following user preferences are no longer available: Show Upgrade Option, Show Tooltips in the Overview Dashboards and Properties, and Overview Grid Refresh Time. As a result, the Administrator tool performs the following actions:
- Shows the Upgrade tab if you have privileges to upgrade PowerCenter.
- Shows tooltips in the Overview and Properties tabs of the Administrator tool.
- Refreshes the grid in the Overview tab every 30 seconds.

License Management Report


You can send the License Management Report in an email.

Connections
This section describes changes to connections.

IBM DB2 Connections


Effective in version 9.0.1, you can configure the tablespace name parameter for IBM DB2 in the advanced properties of the staging database connection for the Analyst Service. During upgrade, this parameter is removed from the staging database properties. You can no longer configure this from the Analyst Service. After you upgrade to version 9.0.1, an administrator needs to add the tablespace name to all upgraded IBM DB2 connections.

Connection Permissions
Effective in version 9.0.1, you can assign read, write, and execute permission on connections. Previously, all users had all permissions on every connection. All users are upgraded with read, write, and execute permissions. To restrict access to connections, a domain administrator must reset permissions.


Command Line Programs


This section describes changes to the command line programs.

infacmd Changed Commands


The following infacmd dis command is changed:
- UpdateServiceOptions. Configures Data Integration Service options. You can now enter the following options:
  - AllowCaching. Allows data object caching when pass-through security is enabled.
  - ConnectionNames. Enter the connection object name of each connection that allows pass-through security.

The following infacmd ms command is changed:
- RunMapping. Runs a mapping from a deployed application. You can now use a parameter file when you run a mapping. The -ParameterFile (-pf) option is added. Enter the name of the parameter file.

The following infacmd sql commands are changed:
- AddGroupPermission. Adds a permission to a group. Renamed to AssignGroupPermission.
- AddUserPermission. Adds a permission for a user. Renamed to AssignUserPermission.
- ListSchemaPermissions. Lists the permissions for a virtual schema. Deprecated.
- SetSchemaPermissions. Sets group permissions or user permissions for a virtual schema. Deprecated.

infacmd New Commands

The following infacmd isp commands are new:
- AddConnectionPermissions. Assigns permissions for a user or group to a connection.
- AssignGroupPermission. Assigns a group permissions on an object.
- AssignUserPermission. Assigns a user permission on an object.
- CreateConnection. Creates a new database connection.
- ListConnections. Lists existing connections.
- ListConnectionOptions. Lists available options for connections that you can use when you create a connection.
- ListConnectionPermissions. Lists permissions that a user or group has for a connection.
- ListConnectionPermissionsByGroup. Lists all groups that have permission for a connection and the type of permissions they have.
- ListConnectionPermissionsByUser. Lists all users that have permissions on the given connection, along with the type of permissions.
- ListGroupPermissions. Lists the domain objects that a group has permission on.
- ListUserPermissions. Lists the domain objects that a user has permission on.
- RemoveConnection. Removes a database connection.
- RemoveConnectionPermissions. Removes permissions for a specific user or group.
- SyncSecurityDomains. Synchronizes an LDAP security domain.
- UpdateConnection. Updates an existing connection.

The following infacmd mrs commands are new:
- ListBackupFiles. Lists files in the backup folder.
- ListServiceOptions. Lists options for the Model Repository Service.
- ListProjects. Lists projects in the Model repository.
- ListServiceProcessOptions. Lists service process options for the Model Repository Service.
- UpdateServiceOptions. Updates options for the Model Repository Service.
- UpdateServiceProcessOptions. Updates service process options for the Model Repository Service.
- UpgradeContents. Upgrades content for the Model repository.

The following infacmd ms command is new:
- ListMappingParams. Lists the parameters for a mapping with the default values. You can use the output as a parameter file template.
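For example, you might redirect the ListMappingParams output to a file, edit the default values, and then pass the file to RunMapping with the new -pf option. The file name is hypothetical, and the options that identify the domain, service, application, and mapping are omitted:

```shell
infacmd ms ListMappingParams <options> > MyMapping_params.xml
infacmd ms RunMapping <options> -pf MyMapping_params.xml
```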


The following infacmd pwx commands are new:
- CloseForceListener. Cancels subtasks and stops.
- CloseListener. Stops after subtasks complete.
- CondenseLogger. Starts another logging cycle.
- DisplayAllLogger. Displays verbose status of a PowerExchange Logger Service.
- DisplayCheckpointsLogger. Reports information about the latest checkpoint file.
- DisplayCPULogger. Displays the CPU time spent.
- DisplayEventsLogger. Displays events in wait status.
- DisplayMemoryLogger. Displays memory use.
- DisplayRecordsLogger. Displays changed record counts.
- DisplayStatusLogger. Displays the status of a PowerExchange Logger Service.
- FileSwitchLogger. Switches to a new set of log files.
- ListTaskListener. Displays information about active tasks.
- ShutDownLogger. Shuts down a PowerExchange Logger Service.
- StopTaskListener. Stops a task.

The following infacmd sql commands are new:
- ExecuteSQL. Runs SQL queries against virtual tables in a SQL data service.
- ListColumnPermissions. Lists permissions for a column.
- SetColumnPermissions. Sets the permissions for a column.
- UpdateColumnOptions. Sets column options to return a constant instead of failing the query when a user does not have permissions on the column.


New Environment Variables

The following environment variables are new:
- INFA_CLI_CONNECTION_PASSWORD. Database password used to create connections with infacmd isp CreateConnection.
- INFA_PASSWORD. User password used to create users or maintain LDAP credentials.
- INFA_PC_REPOSITORY_PASSWORD. PowerCenter repository password used to create PowerCenter services or retrieve logs.
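As a sketch, you can export these variables in the shell before you run the command line programs so that passwords do not appear on the command line. The values here are placeholders:

```shell
# Placeholder values; in practice, read the passwords from a secure source.
export INFA_PASSWORD='MyUserPassword'
export INFA_CLI_CONNECTION_PASSWORD='MyDatabasePassword'
export INFA_PC_REPOSITORY_PASSWORD='MyRepositoryPassword'

# The command line programs can read these variables when the
# corresponding password options are omitted from the command.
echo "password environment variables set"
```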

Data Integration Service


This section describes changes to the Data Integration Service.

Data Integration Service Privilege


Effective in version 9.0.1, users must have the following privileges and permission to manage applications in the Administrator tool:
- Manage Applications privilege for the Data Integration Service
- Manage Services privilege for the domain
- Permission on the Data Integration Service

To manage applications in the infacmd command line program, users must have the Manage Applications privilege for the Data Integration Service. Previously, users needed the Manage Services privilege for the domain and permission on the Data Integration Service to manage applications. After you upgrade, you must assign users the Manage Applications privilege for the Data Integration Service.

Maximum # of Concurrent Connections Property


Effective in version 9.0.1, Maximum # of Concurrent Connections is on the Data Integration Service Processes view. In version 9.0, the Maximum # of Concurrent Connections was on the Data Integration Service Properties view. When you upgrade from version 9.0 to 9.0.1, the property value reverts to the default value. The default value is 100. If you configured a value for this property in version 9.0, you must reconfigure the property in version 9.0.1.

Maximum Execution Pool Size Property


Effective in version 9.0.1, Maximum Execution Pool Size is on the Data Integration Service Processes view. In version 9.0, the Max Execution Pool Size was on the Data Integration Service Properties view. When you upgrade from version 9.0 to 9.0.1, the property value reverts to the default value. The default value is 10. If you configured a value for this property in version 9.0, you must reconfigure the property in version 9.0.1.


LDAP
This section describes changes to LDAP.

User Import
Effective in version 9.0.1, the default maximum size for user import is set to 1000. Previously, the default value was set to 0, which indicated that there was no maximum value. When you upgrade, all users are imported into the domain. However, all users over 1000 will be dropped in reverse alphabetic order the next time the Service Manager synchronizes with the LDAP service directory. To avoid dropping users, reset the maximum size in the LDAP server configuration.

Logs
This section describes changes to logs.

View Log Events from the Previous Informatica Version
Effective in version 9.0.1, use the infacmd isp ConvertLogFile command to view log files from PowerCenter 8.1.1, 8.5.x, and 8.6.x. The infacmd isp ConvertLogFile command uses the following syntax:
ConvertLogFile <-InputFile|-in> input_file_name [<-Format|-fm> format_TEXT_XML] [<-OutputFile|-lo> output_file_name]
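For example, the following invocation converts a log file to XML format; the file names are placeholders:

```shell
infacmd isp ConvertLogFile -in node01_861.log -fm XML -lo node01_861.xml
```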

Maximum Heap Size


Effective in version 9.0.1 HotFix 1, specify the units for the Maximum Heap Size property value. You can configure the Maximum Heap Size property for an Analyst Service process, Data Integration Service process, Model Repository Service, or Web Services Hub Service. Append one of the following letters to the property value to specify the units:
- b for bytes.
- k for kilobytes.
- m for megabytes.
- g for gigabytes.

Previously, you specified the value in megabytes. When you upgrade, the Administrator tool appends "m" to the value.
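For example, under this convention a 512-megabyte heap and a 2-gigabyte heap are written as follows:

```
Maximum Heap Size: 512m    (512 megabytes)
Maximum Heap Size: 2g      (2 gigabytes)
```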


Node Diagnostics
Effective in version 9.0.1, you do not need to import SSL certificates to generate node diagnostics on a secure node. The Configuration Support Manager web application runs on the same web application as the Administrator tool. Previously, the Configuration Support Manager ran on a separate web application. If you wanted to ensure security when you connected to the Configuration Support Manager, you had to configure the nodes for security.

Reports
This section describes changes to reports.

User Activity Report


Effective in version 9.0.1, the User Activity Report is obsolete. You can view user activity details in user activity log events, which appear on the Log tab.

License Management Report


Effective in version 9.0.1, the License Management Report contains information for all licenses assigned to the domain. Previously, there was one report for each license key.


CHAPTER 23

PowerCenter (9.0.1)
This chapter includes the following topics:
- Mapping Analyst for Excel
- Web Services Hub

Mapping Analyst for Excel


This section describes changes to Mapping Analyst for Excel.

Excel Add-in
Effective in version 9.0.1, Mapping Analyst for Excel includes an Excel add-in that adds a Metadata menu or ribbon to Microsoft Excel. Use the Metadata menu or ribbon to complete the following tasks:
- Show and hide columns on a worksheet.
- Annotate cells with descriptions from other worksheets.
- Format a worksheet to resize the columns to fit the text.
- Validate the mapping specification.
- Insert another worksheet of a specific type.

You can install the add-in for Microsoft Excel 2003 or 2007. However, use Microsoft Excel 2007 to use the improved user interface. Previously, Mapping Analyst for Excel did not provide a Metadata menu or ribbon. You used the Validate button on each worksheet to validate data.

Export Option
Effective in version 9.0.1, you do not configure the Operation export option. Mapping Analyst for Excel determines the type of export operation to perform. To configure the Format export option, you select the Standard mapping specification template. Previously, you configured the Operation export option and typed the name of a template in the Format export option.


Standard Mapping Specification Template


Effective in version 9.0.1, Mapping Analyst for Excel includes a single mapping specification template named Standard-Blank.xlsx. A mapping specification based on the Standard template contains a single or multiple mappings configured on multiple Excel worksheets. Mapping specifications based on this template can contain source definitions, target definitions, rules, and filter, join, lookup, aggregate, and non-aggregate expressions. Previously, you created a mapping specification based on one of the following mapping specification templates:
- Mapping template
- Source-Target-Matrix template
- Custom template

You can no longer import from or export to mapping specifications based on these templates. If you have existing mapping specifications, you must reconfigure the mapping metadata in a mapping specification based on the Standard template.

Web Services Hub


This section describes changes to the Web Services Hub.

Sources and Targets


Effective in version 9.0.1, you can edit sources and targets in web service mappings created in PowerCenter 9.0.1. You cannot edit web service sources and targets upgraded from previous versions of PowerCenter. To update a web service source or target from previous versions of PowerCenter, re-create the source or target in PowerCenter 9.0.1.


CHAPTER 24

Metadata Manager (9.0.1)


This chapter includes the following topics:
- backupCmdLine Command Line Program
- Business Glossary
- Data Lineage for Custom Objects
- Data Lineage for SQL Inline Views
- Email
- Impact Summary
- mmcmd Command Line Program
- Searching for PowerCenter Metadata Extensions

backupCmdLine Command Line Program


Effective in version 9.0.1, backupCmdLine includes an optional commit interval argument for the Restore command. The commit interval specifies the number of rows to use as a basis for commits to the Metadata Manager repository. The Restore command uses a batch commit each time it writes this number of rows. Default is 10,000. Previously, you could not configure the commit interval. The Restore command committed all rows at the end of the restore operation.

Business Glossary
Effective in version 9.0.1, Metadata Manager sends an email notifying users about the following events:
- A data steward proposes a draft business term for review. Metadata Manager displays the email options so that the data steward can send an email to other users.
- A user adds a comment to a business term in any phase. Metadata Manager sends an email to the data steward assigned to the term.
Previously, Metadata Manager did not send emails notifying users about these events.


Data Lineage for Custom Objects

You can create a relationship between a custom metadata object or a business term and the following PowerCenter objects:
- Reusable transformation or session.
- Instance of a transformation or session.
- Shortcut.

Effective in version 9.0.1, a data lineage diagram displays these relationships in the following ways:
- Custom metadata object or business term related to a reusable transformation or session. When you run data lineage analysis on the custom object or business term, the data lineage diagram displays the PowerCenter object and all instances of the object. When you run data lineage analysis on any instance of the PowerCenter object, the data lineage diagram displays the associated custom metadata object or business term.
- Custom metadata object or business term related to an instance of a transformation or session. When you run data lineage analysis on the custom object or business term, the data lineage diagram displays the PowerCenter object instance within the context where the instance is used. For example, the diagram shows a transformation instance within its corresponding mapping and session context.
- Custom metadata object or business term related to a shortcut. When you run data lineage analysis on the custom object or business term, the data lineage diagram displays the instances of the original object and all instances of the shortcuts to the original object.

Previously, data lineage diagrams did not display all instances of PowerCenter reusable objects and did not display the context of a PowerCenter object instance for these relationships. After you upgrade, complete one of the following tasks to view these relationships in data lineage diagrams:
- Reload the custom resource.
- Reload each PowerCenter resource that is related to the custom resource or to the business glossary.

Data Lineage for SQL Inline Views


Effective in version 9.0.1, you can view data lineage on a database table, view, or synonym used in an SQL query with an inline view. The SQL query can exist in the following objects:
- SQL override in a PowerCenter Source Qualifier or Lookup transformation
- Database views, stored procedures, functions, and triggers

After you upgrade, reload any relational database or PowerCenter resource that uses SQL inline views. Previously, Metadata Manager did not correctly display data lineage for a database table, view, or synonym used in an SQL query with an inline view.
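An inline view is a subquery in the FROM clause of an outer query; lineage must be traced through the outer query, into the subquery, and down to the underlying table. The following is a generic SQL illustration (the table and data are invented for the example), shown here with Python's sqlite3 module:

```python
import sqlite3

# Generic example of an SQL inline view: a subquery in the FROM clause.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("EAST", 100.0), ("EAST", 50.0), ("WEST", 75.0)])

# The parenthesized SELECT is the inline view; lineage for the columns
# of the outer query passes through it to the orders table.
rows = conn.execute("""
    SELECT region, total
    FROM (SELECT region, SUM(amount) AS total
          FROM orders
          GROUP BY region) AS regional_totals
    WHERE total > 60
    ORDER BY region
""").fetchall()
print(rows)  # [('EAST', 150.0), ('WEST', 75.0)]
```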

122

Chapter 24: Metadata Manager (9.0.1)

Email
Effective in version 9.0.1, you use the Administrator tool to configure the host name and port number of the outgoing mail server in the domain SMTP configuration settings. You must configure these properties before Metadata Manager can send email notifications.

In version 9.0, you configured the host name and port number of the outgoing mail server in the imm.properties file. After you upgrade from version 9.0, use the Administrator tool to configure the email properties in the domain SMTP configuration settings. In version 8.6.x, you configured the host name and port number of the outgoing mail server in the domain SMTP configuration settings.

Impact Summary
Effective in version 9.0.1, you can view an impact summary when you view the details of a business term or a metadata object in a packaged resource. You cannot view an impact summary for a PowerCenter metadata object or a custom metadata object. The impact summary for a metadata object displays the following details:
- Impact Summary Downstream. Lists all downstream objects. Changes to the selected metadata object impact these objects.
- Impact Summary Upstream. Lists all upstream objects. Changes to these objects impact the selected metadata object.

The impact summary for a business term displays the following details:
- Impact Summary Downstream. Lists all objects that are downstream from the object related to the business term. Changes to the related object impact these objects.
- Impact Summary Upstream. Lists all objects that are upstream from the object related to the business term. Changes to these objects impact the related object.

After you upgrade, reload all PowerCenter resources to view the impact summary for business terms and metadata objects. Previously, Metadata Manager did not display an impact summary. You needed to run data lineage analysis to determine the impact of metadata changes.

mmcmd Command Line Program


Effective in version 9.0.1, the mmcmd importmodel command has changed. The -modelName option is optional. If you do not include a model name, mmcmd imports all models in the XML file. Previously, the -modelName option was required.


Searching for PowerCenter Metadata Extensions


Effective in version 9.0.1, you cannot search for PowerCenter metadata extensions defined for PowerCenter metadata objects. When you upgrade, Metadata Manager modifies saved searches that search for a PowerCenter metadata extension property to search on Any Property. Previously, you could search for PowerCenter metadata extensions.


Part IV: Version 9.0


This part contains the following chapters:
- New Features and Enhancements (9.0), 126
- Informatica Domain (9.0), 139
- PowerCenter (9.0), 146
- Metadata Manager (9.0), 147


CHAPTER 25

New Features and Enhancements (9.0)


This chapter includes the following topics:
- Informatica Data Quality and Informatica Data Services, 126
- Informatica Analyst, 129
- Informatica Domain, 130
- PowerCenter, 131
- Metadata Manager, 132
- PowerExchange, 134
- Adapters for PowerCenter, 136

Informatica Data Quality and Informatica Data Services


Use Informatica Data Quality and Informatica Data Explorer Advanced Edition for data quality solutions. Use Informatica Data Services for data services solutions. You can also use the Profiling option with Informatica Data Services to profile data. Components for Data Quality and Data Services include a repository, application clients, and application services.

Repository
Data Services and Data Quality both use a Model repository to store objects. If you have Data Services and Data Quality, you can use the same repository.

Application Clients
Data Services and Data Quality both use the following application clients to create objects and preview results:
- Informatica Developer (Developer tool). Developers use this application to design and implement data quality and data services solutions. The Developer tool includes an editor to edit the objects that you create.
- Informatica Analyst (Analyst tool). Analysts use this web-based application client to analyze, cleanse, standardize, profile, and score data in an enterprise.


Application Services
Data Services and Data Quality use the following application services to process the objects that you create in the client applications:
- Data Integration Service. Performs data integration tasks for Informatica Analyst and Informatica Developer and stores metadata in a Model repository.
- Model Repository Service. Manages connections to the Model repository.
- Analyst Service. Runs Informatica Analyst.

Data Quality
Informatica Data Quality is enhanced in version 9.0 with new desktop and web-based client applications. Use the Developer tool to design and distribute data quality mappings and rules from your desktop. Use the Analyst tool to analyze data quality and run rules from any Internet browser. With Data Quality, you can perform the following tasks:
- Profile data. Create and run a profile to analyze the structure and content of enterprise data and to identify strengths and weaknesses in the data. After you run a profile, you can selectively drill down to see the underlying rows in the profile results. You can also add columns to scorecards and add column values to reference tables.
- Score data. Create scorecards to score the valid values for any column or the output of rules. Scorecards display the value frequency for columns in a profile as scores. Use scorecards to measure and visually represent data quality progress. You can also view trend charts to view the history of scores over time.
- Standardize data values. Standardize data to remove errors and inconsistencies that you find when you run a profile. You can standardize variations in punctuation, formatting, and spelling.
- Parse records. Parse data records to improve record structure and derive additional information from your data. You can split a single field of freeform data into fields that contain different information types. You can also add information to your records. For example, you can flag customer records as personal or business customers.
- Validate postal addresses. Address validation evaluates and enhances the accuracy and deliverability of your postal address data. Address validation corrects errors in addresses and completes partial addresses by comparing address records against reference data from national postal carriers. Address validation can also add postal information that speeds mail delivery and reduces mail costs.
- Manage bad and duplicate records. Duplicate record analysis compares a set of records against each other to find similar or matching values in selected data columns. You set the level of similarity that indicates a good match between field values. You can also set the relative weight given to each column in match calculations. For example, you can prioritize surname information over forename information.
- Create and run data quality rules. Informatica provides pre-built rules that you can run or edit to meet your project objectives. Create and apply rules within profiles. A rule is reusable business logic that defines conditions applied to data when you run a profile. Use rules to further validate the data in a profile and to measure data quality progress.
- Collaborate with Informatica users. The rules and reference data tables you add to the Model repository are available to users in the Developer tool and the Analyst tool. Users can collaborate on projects, and different users can take ownership of objects at different stages of a project.
- Export mappings to PowerCenter. You can export mappings to PowerCenter to reuse the metadata for data integration tasks or to create web services.


- Manage reference data. Create and update reference tables for analysts and developers to use in data quality rules. Create, edit, and import data quality dictionary files as reference tables. Create reference tables to establish relationships between source data and valid and standard values.

The following table lists the data quality tasks that you can perform in the Developer tool and the Analyst tool:
Informatica Developer:
- Create and run mappings.
- Create and run rules.
- Perform profiling.
- Score data.
- Export mappings to PowerCenter.

Informatica Analyst:
- Perform profiling.
- Score data.
- Manage reference tables.
- Create profiling rules.
- Run rules in profiles.
- Manage bad and duplicate records.

Note: Informatica Data Explorer Advanced Edition functionality is a subset of Informatica Data Quality functionality.
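The duplicate-record analysis described above can be sketched in outline: each pair of column values gets a similarity score, the scores are combined using per-column weights, and record pairs above a user-set threshold are flagged as matches. This is a generic illustration, not Informatica's matching algorithm; the field names, weights, and threshold are invented for the example.

```python
from difflib import SequenceMatcher

def field_similarity(a, b):
    """Similarity between two field values, in the range 0.0-1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def record_similarity(rec_a, rec_b, weights):
    """Weighted average of per-column similarities."""
    total_weight = sum(weights.values())
    score = sum(field_similarity(rec_a[col], rec_b[col]) * w
                for col, w in weights.items())
    return score / total_weight

# Prioritize surname over forename, as in the example above.
weights = {"surname": 3.0, "forename": 1.0}

a = {"surname": "Smith", "forename": "John"}
b = {"surname": "Smyth", "forename": "Jon"}
c = {"surname": "Jones", "forename": "Mary"}

threshold = 0.75  # the user-set level of similarity for a "good match"
print(record_similarity(a, b, weights) >= threshold)  # True: likely duplicates
print(record_similarity(a, c, weights) >= threshold)  # False: distinct records
```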

Data Services
Informatica Data Services provides a way to find, understand, integrate, and manage data across an enterprise. With Informatica Data Services, you can create data models that describe how to represent and access data in an enterprise. You can use the components of the data model for data integration and data quality projects. Reuse the components for multiple projects to eliminate redundant work.

You can also create a virtual database that allows all applications to consume data regardless of how the integration logic is physically implemented. The virtual database also isolates applications and other data consumers from changes in underlying data sources. With Data Services, you can perform the following tasks:
- Define logical views of data. A logical view of data describes the structure and use of data in an enterprise. You can create a logical data object model that shows what types of data your enterprise uses and how that data is structured.
- Map logical models to data sources or targets. Create a mapping that links objects in a logical model to data sources or targets. You can link data from multiple, disparate sources to have a single view of the data. You can also load data that conforms to a model to multiple, disparate targets.
- Create virtual views of data. You can deploy a logical model to a virtual federated database. End users can run SQL queries against the virtual data without affecting the actual source data.
- Export mappings to PowerCenter. You can export mappings to PowerCenter to reuse the metadata for physical data integration or to create web services.
- Create and deploy mappings that end users can query. You can create mappings and deploy them so that end users can query the mapping results.
- Profile data. Create and run profiles to reveal the structure and content of your data. Profiling is a key step in any data project, as it can identify strengths and weaknesses in your data and help you define your project plan. This task is available if you have the Profiling option.
- Create rules. Create rules with Data Services transformations. This task is available if you have the Profiling option.
- Manage reference data. Create and update reference tables for analysts and developers to use in data quality standardization and validation rules. Create, edit, and import data quality dictionary files as reference tables. Create reference tables to establish relationships between source data and valid and standard values. Developers use reference tables in standardization and lookup transformations in Informatica Developer.


The following table lists the data services tasks that you can perform in the Developer tool and the Analyst tool:
Informatica Developer:
- Create logical data object models.
- Create and run mappings with Data Services transformations.
- Create SQL data services.
- Profile data.
- Create rules.
- Export objects to PowerCenter.

Informatica Analyst:
- Manage reference data.

Note: If you have the profiling option, you can perform profiling and also create rules with Data Services transformations.
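The virtual-view idea can be illustrated with a plain SQL view: consumers query a stable, logical name, and the logic that maps it to physical storage can change without the consumers' queries changing. This is a generic SQL analogy only; an SQL data service federates multiple sources and is deployed to a Data Integration Service rather than defined inside one database. The table and column names here are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Physical source table: how the data happens to be stored today.
conn.execute("CREATE TABLE crm_customers (cust_name TEXT, cust_region TEXT)")
conn.execute("INSERT INTO crm_customers VALUES ('Acme', 'EAST')")

# Logical, consumer-facing view: a stable contract over the physical table.
conn.execute("""
    CREATE VIEW customer AS
    SELECT cust_name AS name, cust_region AS region
    FROM crm_customers
""")

# End users query the view and never reference crm_customers directly,
# so the physical schema can change behind the view.
rows = conn.execute("SELECT name, region FROM customer").fetchall()
print(rows)  # [('Acme', 'EAST')]
```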

Informatica Analyst
Informatica Analyst is a new web-based application that analysts can use to analyze, cleanse, standardize, profile, and score data in an enterprise. Business analysts and developers use Informatica Analyst for data-driven collaboration. You can perform column and rule profiling, scorecarding, and bad record and duplicate record management. You can also manage and provide reference data to developers in a data quality solution. Use Informatica Analyst to accomplish the following tasks:
- Profile data. Create and run a profile to analyze the structure and content of enterprise data and identify strengths and weaknesses. After you run a profile, you can selectively drill down to see the underlying rows from the profile results. You can also add columns to scorecards and add column values to reference tables.
- Create rules in profiles. Create and apply rules within profiles. A rule is reusable business logic that defines conditions applied to data when you run a profile. Use rules to further validate the data in a profile and to measure data quality progress.
- Score data. Create scorecards to score the valid values for any column or the output of rules. Scorecards display the value frequency for columns in a profile as scores. Use scorecards to measure and visually represent data quality progress. You can also view trend charts to view the history of scores over time.
- Manage reference data. Create and modify reference tables for analysts and developers to use in data quality standardization and validation rules. Create, edit, and import data quality dictionary files as reference tables. Create reference tables to establish relationships between source data and valid and standard values. Developers use reference tables in standardization and lookup transformations in Informatica Developer.
- Manage bad records and duplicate records. Fix bad records and consolidate duplicate records.
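A column profile of the kind described above reduces to simple per-column statistics: value frequencies, distinct counts, and the percentage of rows that pass some validity test, which is the basis of a scorecard score. A minimal sketch, using an invented "non-empty" validity rule:

```python
from collections import Counter

def profile_column(values):
    """Basic column profile: value frequencies, distinct count, and a
    score defined here (for illustration) as the percentage of
    non-empty values in the column."""
    freq = Counter(values)
    valid = [v for v in values if v not in ("", None)]
    return {
        "rows": len(values),
        "distinct": len(set(values)),
        "frequencies": dict(freq),
        "score_pct": 100.0 * len(valid) / len(values) if values else 0.0,
    }

states = ["CA", "NY", "CA", "", "TX", "CA", None, "NY"]
p = profile_column(states)
print(p["distinct"])   # 5 distinct values, counting "" and None
print(p["score_pct"])  # 75.0 -- 6 of 8 rows are non-empty
```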


Informatica Domain
The PowerCenter domain is renamed to the Informatica domain. It is expanded to include objects and services for the Informatica platform.

Command Line Interface


infacmd is expanded to allow management of all Informatica application services. The Informatica domain command line interface has the following infacmd programs:
- infacmd as. Manage Analyst Services.
- infacmd dis. Manage Data Integration Services.
- infacmd isp. Administer the domain, security, and PowerCenter application services. Previously, this was called infacmd.
- infacmd mcf. Export mappings from the Model repository to the PowerCenter repository.
- infacmd mrs. Manage Model Repository Services.
- infacmd oie. Export objects from the Model repository to an export file. Import objects to the Model repository from the file.
- infacmd ps. Manage the profiling warehouse contents, profiles, and scorecards.
- infacmd rtm. Manage the staging database for the Analyst tool.
- infacmd sql. Manage SQL data services that you deploy to the Data Integration Service.

Services
The Informatica domain includes services for PowerExchange, Informatica Analyst, and Informatica Developer.

Analyst Service
Application service that runs Informatica Analyst in the Informatica domain. Create and enable an Analyst Service on the Domain tab of Informatica Administrator. When you enable the Analyst Service, the Service Manager starts Informatica Analyst. You can open Informatica Analyst from Informatica Administrator.

Data Integration Service


Application service that processes requests from Informatica Analyst and Informatica Developer to preview or run data profiles and mappings. It also generates data previews for SQL data services and runs SQL queries against the virtual views in an SQL data service. Create and enable a Data Integration Service on the Domain tab of Informatica Administrator.

Model Repository Service


Application service that manages the Model repository. The Model repository is a relational database that stores the metadata for projects created in Informatica Analyst and Informatica Developer. The Model repository also stores run-time and configuration information for applications deployed to a Data Integration Service. Create and enable a Model Repository Service on the Domain tab of Informatica Administrator.

PowerExchange Listener Service


Manages the PowerExchange Listener for bulk data movement and change data capture. The PowerCenter Integration Service connects to the PowerExchange Listener through the Listener Service.


PowerExchange Logger Service


Manages the PowerExchange Logger for Linux, UNIX, and Windows to capture change data and write it to PowerExchange Logger log files. Change data can originate from DB2 recovery logs, Oracle redo logs, a Microsoft SQL Server distribution database, or data sources on an i5/OS or z/OS system.

Management
This section describes new features and enhancements to domain management.

Informatica Administrator
The PowerCenter Administration Console is renamed to Informatica Administrator (Administrator tool). The Informatica Administrator has a new interface. Some of the properties and configuration tasks from the PowerCenter Administration Console have been moved to different locations in Informatica Administrator.

Connection Management
Database connections are centralized in the domain. You can create and view database connections in Informatica Administrator, Informatica Developer, or Informatica Analyst. Create, view, edit, and grant permissions on database connections in Informatica Administrator.

Deployment
You can configure, deploy, and enable applications in the Developer tool. Deploy applications to one or more Data Integration Services.

Licensing
The Informatica domain enforces the licensing restrictions on the number of CPUs and PowerCenter repositories.

Monitoring
You can monitor profile jobs, scorecard jobs, preview jobs, mapping jobs, and SQL data services for each Data Integration Service. View the status of each monitored object on the Monitoring tab of the Administrator tool.

PowerCenter
This section describes new features and enhancements to PowerCenter.

Real-time Sessions
- Session log file rollover. You can limit the size of session logs for real-time sessions. You can limit the size by time or by file size. You can also limit the number of log files for a session.

Lookup Transformation
- Cache updates. You can update the lookup cache based on the results of an expression. When an expression is true, you can add to or update the lookup cache. You can update the dynamic lookup cache with the results of an expression.
- Database deadlock resilience. In previous releases, when the Integration Service encountered a database deadlock during a lookup, the session failed. Effective in 9.0, the session does not fail. When a deadlock occurs, the Integration Service attempts to run the last statement in a lookup. You can configure the number of retry attempts and the time period between attempts.
- Multiple rows return. You can configure the Lookup transformation to return all rows that match a lookup condition. A Lookup transformation is an active transformation when it can return more than one row for any given input row.
- SQL overrides for uncached lookups. In previous versions, you could create an SQL override for cached lookups only. You can now create an SQL override for uncached lookups. You can include lookup ports in the SQL query.
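The multiple-row return behavior is analogous to a join that can yield several rows per input row. A generic SQL illustration (the table and values are invented): a lookup on one input key matches two rows, so one input row produces two output rows, which is why the transformation becomes active.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE phone_book (name TEXT, phone TEXT)")
conn.executemany("INSERT INTO phone_book VALUES (?, ?)",
                 [("Lee", "555-0101"), ("Lee", "555-0102"),
                  ("Kim", "555-0200")])

# A lookup condition on name = 'Lee' matches two rows. A Lookup
# transformation configured to return all matching rows behaves like
# this query: one input row yields two output rows.
rows = conn.execute(
    "SELECT phone FROM phone_book WHERE name = ? ORDER BY phone", ("Lee",)
).fetchall()
print(len(rows))  # 2
```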

SQL Transformation
- Auto-commit for connections. You can enable auto-commit for each database connection. Each SQL statement in a query defines a transaction. A commit occurs when the SQL statement completes or the next statement is executed, whichever comes first.
- Exactly-once processing. The Integration Service provides exactly-once delivery of real-time source messages to the SQL transformation. If there is an interruption in processing, the Integration Service can recover without requiring the message to be sent again. To perform exactly-once processing, the Integration Service stores a set of operations for a checkpoint in the PM_REC_STATE table.
- Passive transformation. You can configure the SQL transformation to run in passive mode instead of active mode. When the SQL transformation runs in passive mode, the SQL transformation returns one output row for each input row.
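Auto-commit in the generic database sense means each statement commits as soon as it completes, so a second connection sees the change immediately without an explicit commit. A sketch with Python's sqlite3 module, which enables auto-commit when `isolation_level` is `None` (the shared-cache in-memory database is just a way to give two connections the same data):

```python
import sqlite3

# Shared in-memory database so two connections see the same data.
uri = "file:autocommit_demo?mode=memory&cache=shared"
writer = sqlite3.connect(uri, uri=True, isolation_level=None)  # auto-commit
reader = sqlite3.connect(uri, uri=True)

writer.execute("CREATE TABLE log (msg TEXT)")
writer.execute("INSERT INTO log VALUES ('written')")  # commits immediately

# No explicit writer.commit() was needed: the reader already sees the row.
rows = reader.execute("SELECT msg FROM log").fetchall()
print(rows)  # [('written',)]
```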

XML Transformation
- XML Parser buffer validation. The XML Parser transformation can validate an XML document against a schema. The XML Parser transformation routes invalid XML to an error port. When the XML is not valid, the XML Parser transformation routes the XML and the error messages to a separate output group that you can connect to a target.
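The route-invalid-XML pattern can be sketched generically: attempt to parse each document, send good documents to the main output, and send bad documents, together with the error message, to a separate error output. This sketch checks well-formedness only; real schema validation requires an XSD-aware library, which Python's standard library does not provide.

```python
import xml.etree.ElementTree as ET

def route_xml(documents):
    """Split documents into (valid, errors). Each error entry carries
    the message, mirroring an error output group connected to a target."""
    valid, errors = [], []
    for doc in documents:
        try:
            ET.fromstring(doc)
            valid.append(doc)
        except ET.ParseError as exc:
            errors.append((doc, str(exc)))
    return valid, errors

docs = ["<order id='1'/>", "<order id='2'>"]  # second is not well-formed
good, bad = route_xml(docs)
print(len(good), len(bad))  # 1 1
```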

Mapping Architect for Visio


- Shortcuts. You can configure a transformation to use a shortcut. You can create a mapping template from a mapping that contains shortcuts to reusable transformations.
- Mapping template. You can include the following objects in a mapping template:
  - Pipeline Normalizer transformation
  - Custom transformation
  - PowerExchange source definition
  - PowerExchange target definition

Command Line Programs


- pmrep ExecuteQuery. The pmrep ExecuteQuery command includes an -n option. When you use this option, pmrep does not include the full parent path of non-reusable objects in the query result. This option can improve pmrep performance.

Metadata Manager
This section describes new features and enhancements to Metadata Manager.

Resources
- Microsoft Analysis and Reporting Services resource. Metadata Manager added the Microsoft Analysis and Reporting Services resource. Use this resource to extract reporting metadata from Microsoft Reporting Services and to extract an analysis schema from Microsoft Analysis Services.


- Test a connection to validate the configuration of a resource. When you create or edit a resource, click Test Connection to test the connection to the source system, validate the Metadata Manager Agent URL, or validate the source file configuration.
- PowerCenter parameter file syntax. Metadata Manager supports additional forms of parameters in PowerCenter parameter files for a PowerCenter resource.
- Upload multiple PowerCenter parameter files. When you configure a PowerCenter resource, you can upload multiple PowerCenter parameter files at the same time.
- Abort loads. You can abort a load of any resource that requires the Metadata Manager Agent or any IBM DB2 for z/OS resource.
- Single log file. You can download a single log file in Microsoft Excel format that includes load details, session statistics, and summary details.
- Edit resources. Use a single dialog box to edit any property for a resource, including connection properties, connection assignments, resource parameters, and schedules.

Connection Assignments
- Link objects in connected resources. If you change connection assignments for a resource, you do not need to reload the resource to create the links between matching objects in connected resources. You can use the Resource Link Administration window to direct Metadata Manager to create the links between matching objects in the resources.
- Export missing link details. The Load Details tab contains summary information for links created between objects in connected resources. You can export the details of objects that Metadata Manager could not link to a Microsoft Excel file for further analysis.
- Automatic connection assignment. Metadata Manager can automatically configure connection assignments for a data integration, business intelligence, or data modeling resource. Metadata Manager configures the connection assignments during a resource load or link process. Use the Load Details tab to review the connection assignments and make corrections as needed. Or, you can manually configure the connection assignments.
- Purging resources keeps connection assignments. Metadata Manager keeps connection assignments for a purged resource. The connection assignment properties display the schema status as purged. If you reload the resource, Metadata Manager changes the status to active if the schema still exists in the source.

Business Glossary
- Business Glossary approval workflow. You can use the business glossary to create and edit draft business terms, propose the business terms for review by data stewards, and then publish the terms. After publication, the business term is visible to all users.
- Migrate a business glossary. You can migrate business glossaries to and from XML. The XML file includes all categories, business terms, and custom objects, and object comments, links, and relationships.
- Business glossary terms in data lineage diagram. When you launch data lineage, Metadata Manager displays the business terms associated with each object in a lineage diagram. Metadata Manager no longer displays the upstream and downstream connections between business terms in a data lineage diagram.
- Link business glossary terms to reference tables. You can associate a business term with a reference table name and a URL to the reference table. You can specify a URL to a reference table in Informatica Analyst, or you can include any valid URL to a reference table. Previously, you associated a business term with a reference table in Reference Table Manager. Reference Table Manager no longer exists.
- Audit trail. View the history of changes to business glossary categories, business terms, and custom objects. The audit trail includes the old and new values of a property and the user who made the edits. You can also search the audit trail.


Browse Metadata
- Use the URL API to access Metadata Manager objects and features. You can use the URL API to access Metadata Manager objects and features from external applications. For example, you can bookmark a link to a particular catalog object, or you can access data lineage on a specific object from a business intelligence tool.
- Display one instance of an object in the lineage diagram. PowerCenter mappings and database tables, views, and synonyms are no longer split across multiple instances in different nodes in a lineage diagram. You can also configure additional classes whose objects are not split.

mmcmd Commands
The mmcmd command line program includes commands in the following areas:
- Resource management. Added commands to create, update, delete, and purge resources. Added commands to configure connection assignments, configure PowerCenter parameter files, and create links between objects in connected resources. Added a command to cancel a resource load.
- Metadata Manager Service. Added commands to create and delete Metadata Manager repository content and to restore PowerCenter repository content.

Import/Export
- Export and import in XML. You can export any custom resource, business glossary, or property added to a packaged resource type to an XML file and import it into another Metadata Manager instance.

Audit Trail
- Audit trail. View the history of changes to business glossary categories and terms, custom resources, and properties added to packaged resource types. The audit trail includes the old and new values of a property and the user who made the edits. You can also search the audit trail.

Search
Searching metadata in Metadata Manager includes the following enhancements:
- Links. You can search for text in links for metadata objects, including the link name, link description, and link URL.
- Location property. By default, the location of an object appears in search results. You can click the link to open the metadata object from the search results.
- Include audit trail in search results. You can choose to search the history of changes made to custom resource objects, business glossary categories and terms, and properties added to packaged resource types.

User Interface Enhancements


- Updated user interface pages. The Browse, Model, Load, and Security pages contain updates to the look and feel, Action menu, context-sensitive menus, and toolbar buttons. The Search menu is more conveniently placed for accessibility. The user interface is more consistent with other Informatica web-based tools.

PowerExchange
This section describes new features and enhancements to PowerExchange.
- Asynchronous network communication. PowerExchange uses asynchronous network communication for most send and receive operations between a PowerExchange client and a PowerExchange Listener. With asynchronous communication, PowerExchange uses separate threads for network processing and data processing, so that network processing overlaps with data processing. PowerExchange sends and receives heartbeat signals across the network that can be used for early detection of failure situations.


- Partitioning for bulk data movement sessions. PowerExchange 9.0 provides the following partitioning enhancements:
  - You can use pass-through partitioning without SQL overrides for bulk data movement sessions that include any of the following offloaded data sources: VSAM data sets, sequential data sets, and DB2 for z/OS unload data sets. PowerExchange opens a single connection to the data source and distributes the data across the partitions.
  - For all other nonrelational bulk data sources, you can use pass-through partitioning with disjoint SQL overrides. If you do not provide overrides with these data sources, data is read into the first partition only.
- DB2 for z/OS Stored Procedure transformations. You can use Stored Procedure transformations for DB2 for z/OS stored procedures in a PowerCenter mapping. Use Stored Procedure transformations for read or write bulk data movement and change data capture (CDC) operations.
- DB2 for i5/OS multiple-row FETCH statements. To enhance the performance of DB2 for i5/OS bulk data movement operations that use the DB2 access method, PowerExchange uses a DB2 multiple-row FETCH statement to retrieve multiple rows of data at a time from a source table. In PowerCenter, you can configure the number of rows to be retrieved by setting the Array Size attribute on a PWX DB2i5OS relational connection used by PWXPC.
- PowerExchange Logger for Linux, UNIX, and Windows performance. The following enhancements improve PowerExchange Logger for Linux, UNIX, and Windows processing and management:
  - PowerExchange Logger configuration parameters. You can enter additional parameters in the PowerExchange Logger for Linux, UNIX, and Windows configuration file to control how expired CDCT records are deleted, specify the maximum number of days to hold retention array items in memory, and control whether PowerExchange displays a user confirmation prompt for a cold start or for a warm start from a previous position in the change stream.
  - PowerExchange Logger cold starts. PowerExchange 9.0 provides a COLDSTART parameter that enables you to control whether the PowerExchange Logger for Linux, UNIX, and Windows cold starts or warm starts.
  - DISPLAY commands. Additional DISPLAY commands are available for the PowerExchange Logger for Linux, UNIX, and Windows to help you monitor and manage PowerExchange Logger processing. You can issue these commands from the command line or by using the pwxcmd program.
  - Checkpoint file. For more efficient checkpoint processing and smaller checkpoint files, PowerExchange 9.0 changes the format of the PowerExchange Logger checkpoint files from CISAM files to sequential files with the extension .ckp.
- PowerExchange Logger file management. Use the PWUCDCT utility to back up, restore, and regenerate the PowerExchange Logger for Linux, UNIX, and Windows CDCT file and to manage expired CDCT records and orphaned log files. Also use the utility to print reports on PowerExchange Logger files such as checkpoint files and log files.
pwxcmd command support for i5/OS and z/OS. You can use the pwxcmd program to issue commands from a Linux, UNIX, or Windows system to a PowerExchange Listener or PowerExchange Condense process running on an i5/OS or z/OS system.
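As a sketch of what this looks like in practice — the service name i5os_listener1 and the -sv option are assumptions rather than syntax confirmed by this guide, so check the PowerExchange Command Reference for your installation — commands such as listtask and close might be issued from a Linux, UNIX, or Windows prompt like this:

```shell
# Illustrative only: i5os_listener1 and the -sv option are assumptions.
# The commands are echoed so the sketch runs anywhere; remove the echo
# to issue them against a real PowerExchange Listener.
echo pwxcmd listtask -sv i5os_listener1   # display active Listener tasks
echo pwxcmd close -sv i5os_listener1      # stop the Listener after tasks complete
```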
Extraction of relative record numbers during DB2 for i5/OS CDC processing. When you create a capture registration, you have the option to capture the relative record number of the change record.
PowerExchange Listener Service. Use this application service to manage the local PowerExchange Listener. You can create, start, or stop a PowerExchange Listener service or view status information in the Informatica Administrator.
PowerExchange Logger Service. Use this application service to manage the local PowerExchange Logger for Linux, UNIX, and Windows. You can create, start, or stop a PowerExchange Logger service or view status information in the Informatica Administrator.


Adapters for PowerCenter


This section describes new features and enhancements to the adapters for PowerCenter.

PowerExchange for HP Neoview Transporter


Support for the following platforms:
- SUN SPARC 64-bit
- Windows 2003 on EM64T
You can import a Neoview source or target definition from Neoview. Previously, you connected to Neoview through ODBC to create a Neoview source or target definition.
You can generate and execute SQL for a Neoview target.
You can preview source and target data.
Support for all Neoview datatypes except Interval.
Precision to the nanosecond for Neoview date and time datatypes.
Multibyte character support.
You can configure the following properties in the Source Qualifier:
- Number of sorted ports
- Select distinct
- Source filter
You can override an SQL query in the Source Qualifier.
You can use the following transformations in a Neoview mapping:
- Lookup
- SQL
- Stored Procedure
- Update Strategy
Relational connection object. Use a relational connection object for the Integration Service to connect to Neoview. Select a relational or bulk reader for the connection when you configure the session properties. You can enter mapping variables and workflow variables in the connection object properties.
You can configure parameters for a Neoview session.
You can configure a Neoview session for recovery.
You can enter pre-session and post-session SQL commands.
You can override the source table name and source table name owner in the session properties.
You can configure the following session properties for a Neoview session:
- Commit type
- Enable high precision
- Error log type
- Error row handling
- Pipeline partitioning
- Session on grid
If you use PowerExchange change data capture, you can load the changed data to Neoview as a target in the same mapping.


Chapter 25: New Features and Enhancements (9.0)

PowerExchange for Netezza


Filtering metadata during source import. You can filter the metadata you want to appear when you create a Netezza source definition.
You can import a Netezza source or target definition from Netezza. Previously, you connected to Netezza through ODBC to create a Netezza source or target definition.
You can generate and execute SQL for a Netezza target.
You can preview source and target data.
Support for all Netezza datatypes except Interval.
Precision to the nanosecond for Netezza date and time datatypes.
Multibyte character support.
You can configure the following properties in the Source Qualifier:
- Number of sorted ports
- Select distinct
- Source filter
You can override an SQL query in the Source Qualifier.
You can use the following transformations in a Netezza mapping:
- Lookup
- SQL
- Update Strategy
Relational connection object. Use a relational connection object for the Integration Service to connect to Netezza. Select a relational or bulk reader for the connection when you configure the session properties. You can enter mapping variables and workflow variables in the connection object properties.
You can configure parameters for a Netezza session.
You can recover non real-time Netezza sessions in normal mode.
You can enter pre-session and post-session SQL commands.
You can override the source table name and source table name owner in the session properties.
You can configure the following session properties for a Netezza session:
- Commit type
- Enable high precision
- Error log type
- Error row handling
- Pipeline partitioning
- Pushdown optimization
- Session on grid
If you use PowerExchange change data capture, you can load the changed data to Netezza as a target in the same mapping.


PowerExchange for Oracle E-Business Suite


This section describes new features in PowerExchange for Oracle E-Business Suite 9.0:
Parameterized properties. You can parameterize the following PowerExchange for Oracle E-Business Suite properties in PowerCenter:
- SQL properties: Source Filter and SQL Query.
- Target properties: User Name, Responsibility Name, Security Group Name, Server Name, and Schema Name.
- Session properties: Source Filter, SQL Query, User Name, Responsibility Name, Security Group Name, Server Name, Schema Name, and Cache Directory.
- Connection properties: Connection Objects.
Commit Interval session property. Specify the number of rows that the PowerCenter Integration Service commits to the interface table during each commit.

PowerExchange for SAP NetWeaver


Writing data to the IS-U Migration Workbench. You can import an IS-U transformation to generate files for an SAP migration file. The Integration Service can write the data to the IS-U Migration Workbench.

PowerExchange for Teradata Parallel Transporter API


You can add partitions to a session that includes a Teradata PT API source.

PowerExchange for Web Services


Configuring a user name and password for authentication. When a web service is protected or secure, you can include a user name and password in a web service target or Web Services Consumer transformation.

PowerExchange for webMethods


You can import documents with special characters as source definitions.


CHAPTER 26

Informatica Domain (9.0)


This chapter includes the following topics:
PowerCenter Domain, 139
Command Line Programs, 141
Metadata Manager Privileges, 145

PowerCenter Domain
This section describes changes to the PowerCenter domain.

PowerCenter Domain Name Change


Effective in version 9.0, the PowerCenter domain is renamed to Informatica domain.

Administration Console Name Change


Effective in version 9.0, the PowerCenter Administration Console is renamed to Informatica Administrator.

Informatica Administrator URL Change


Effective in version 9.0, the host and port in the Informatica Administrator URL represent the host name of the master gateway node and the Informatica Administrator port number. You configure the Informatica Administrator port when you define the domain. You can define the domain during installation or with the infasetup DefineDomain command line program. If the domain fails over to a different master gateway node, the host name in the Informatica Administrator URL changes to the host name of the elected master gateway node. Previously, the host and port in the Informatica Administrator URL represented the host name and port number of any gateway node.

Domain Ports
Effective in version 9.0, each worker node in the Informatica domain uses the following ports:
Domain port. Port number for the node.
Service Manager port. Port number used by the Service Manager on the node.

Each gateway node uses the following ports:

Domain port. Port number for the node.
Service Manager port. Port number used by the Service Manager on the node.
Informatica Administrator port. Port number used by the Administrator tool.

The Service Manager logs show the Service Manager port number. Previously, each node used a single domain port. On a worker node, the node and the Service Manager on the node used the same domain port number. On a gateway node, the node, the Service Manager on the node, and the Administration Console used the same domain port number. The Service Manager logs showed the domain port number.

Object Name Length


Effective in version 9.0, domain object names, except folder names, can be up to 128 characters long. Previously, domain object names could be up to 80 characters.

Shared Object Names


Effective in version 9.0, some objects within a domain can have the same name. The following table lists whether domain objects can have the same names:
          Node   Folder   Grid   License   Service
Node      No     Yes      Yes    Yes       Yes
Folder    Yes    Yes*     Yes    Yes       Yes
Grid      Yes    Yes      No     Yes       Yes
License   Yes    Yes      Yes    No        Yes
Service   Yes    Yes      Yes    Yes       No

*Folders can have the same name if they are not at the same level.

Previously, domain objects could not have the same name.

Domain Configuration
Effective in version 9.0, the domain configuration metadata uses the same database structure as the Model repository database. Back up the domain configuration database on a regular basis. You can restore the domain configuration from a backup. If you migrate the domain configuration to another database or change the database connection information, you must update the database connection on all gateway nodes.

You create a license object when you install PowerCenter. You can also create license objects in Informatica Administrator. You can view the license and all licensed options in Informatica Administrator.


Command Line Programs


This section describes changes to the command line programs.

New Commands
The following table describes the new infacmd as commands:
CreateAuditTables             Creates audit tables.
CreateService                 Creates an Analyst Service.
DeleteAuditTables             Deletes audit tables.
ListServiceOptions            Lists Analyst Service properties that you can update.
ListServiceProcessOptions     Lists Analyst Service process options that you can update.
UpdateServiceOptions          Updates Analyst Service properties.
UpdateServiceProcessOptions   Updates Analyst Service process properties.

The following table describes the new infacmd dis commands:


BackupApplication              Backs up an application from a Data Integration Service.
CreateService                  Creates the Data Integration Service.
DeployApplication              Deploys an application to a Data Integration Service.
ListApplicationOptions         Lists application properties that you can update.
ListDataObjectOptions          Lists the data object properties that you can update.
ListServiceOptions             Lists Data Integration Service properties that you can update.
ListServiceProcessOptions      Lists Data Integration Service process properties that you can update.
RestoreApplication             Restores an application to a Data Integration Service.
StartApplication               Starts an application.
UpdateApplication              Updates a deployed application from an application file and maintains existing properties.
UpdateApplicationOptions       Updates application properties.
UpdateDataObjectOptions        Updates data object properties.
UpdateServiceOptions           Updates Data Integration Service properties.
UpdateServiceProcessOptions    Updates Data Integration Service process properties.
CancelDataObjectCacheRefresh   Stops a refresh of the data object cache.
ListApplicationObjects         Lists the paths of objects in an application.
ListApplications               Lists deployed applications for a Data Integration Service.
PurgeDataObjectCache           Purges the data object cache.
RefreshDataObjectCache         Refreshes the data object cache.
RenameApplication              Renames a deployed application.
StopApplication                Stops an application from running.
UndeployApplication            Removes an application from a Data Integration Service.

The following table describes the new infacmd ipc commands:


ExportToPC   Exports an object in PowerCenter format.

The following table describes the new infacmd mrs commands:


BackupContents                Backs up Model repository contents.
CreateContents                Creates Model repository contents.
CreateService                 Creates a Model repository service.
DeleteContents                Deletes Model repository contents.
ListBackupFiles               Lists backup files.
ListServiceOptions            Lists service options.
ListServiceProcessOptions     Lists service process options.
RestoreContents               Restores Model repository contents.
UpdateServiceOptions          Updates Model repository service options.
UpdateServiceProcessOptions   Updates service process options.
UpgradeContents               Upgrades the Model repository.

The following table describes the new infacmd ms commands:


ListMappingParams   Creates a parameter file for a parameterized mapping.
ListMappings        Lists deployed mappings.
RunMapping          Runs a deployed mapping.
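For example, a deployed mapping might be run from the command line as follows. The domain, service, application, and mapping names are hypothetical, and the options shown (-dn, -sn, -un, -pd, -a, -m) follow the usual infacmd option pattern rather than syntax confirmed by this guide; the command is echoed so the sketch runs without an Informatica installation:

```shell
# Hypothetical names; remove the echo to run against a real domain.
echo infacmd ms RunMapping -dn MyDomain -sn DIS_dev -un Administrator -pd secret \
  -a MyApplication -m m_LoadCustomers
```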

The following table describes the new infacmd ps commands:


CreateWH   Creates Profiling Warehouse contents.
DropWH     Drops Profiling Warehouse contents.
Execute    Executes a profile or scorecard.
List       Lists profiles or scorecards.
Purge      Purges profile or scorecard results from the warehouse.

The following table describes the new infacmd pwx commands:

CloseForceListener         Cancels subtasks and stops the Listener.
CloseListener              Stops the Listener after subtasks complete.
CreateListenerService      Creates a PowerExchange Listener service.
ListTaskListener           Displays information about active tasks.
StopTaskListener           Stops a task.
UpdateListenerService      Updates a PowerExchange Listener service.
CondenseLogger             Starts another logging cycle.
CreateLoggerService        Creates a PowerExchange Logger service.
DisplayAllLogger           Displays verbose status of a PowerExchange Logger service.
DisplayCPULogger           Displays the CPU time spent.
DisplayCheckpointsLogger   Reports information about the latest checkpoint file.
DisplayEventsLogger        Displays events being waited on.
DisplayMemoryLogger        Displays memory use.
DisplayRecordsLogger       Displays counts of change records.
DisplayStatusLogger        Displays status of a PowerExchange Logger service.
FileSwitchLogger           Switches to a new set of log files.
ShutDownLogger             Shuts down a PowerExchange Logger service.
UpdateLoggerService        Updates a PowerExchange Logger service.

The following table describes the new infacmd rtm commands:


Deployimport   Imports content to the staging database.
Export         Exports dictionary content.
Import         Imports dictionary content.

The following table describes the new infacmd sql commands:


ListSQLDataServiceOptions        Lists SQL data service options.
ListSQLDataServicePermissions    Lists SQL data service permissions.
ListStoredProcedurePermissions   Lists stored procedure permissions.
ListTableOptions                 Lists table options.
ListTablePermissions             Lists table permissions.
SetSQLDataServicePermissions     Sets permissions on a SQL data service.
SetStoredProcedurePermissions    Sets permissions on a stored procedure.
SetTablePermissions              Sets permissions on a table.
StartSQLDataService              Starts a SQL data service.
UpdateSQLDataServiceOptions      Updates SQL data service properties.
UpdateTableOptions               Updates table properties.
ListSQLDataServices              Lists all SQL data service names for a Data Integration Service.
PurgeTableCache                  Purges the virtual table cache.
RefreshTableCache                Refreshes the virtual table cache.
RenameSQLDataService             Renames a SQL data service.
StopSQLDataService               Stops a SQL data service.

New Environment Variable


The following table describes the new environment variable:
INFA_DEFAULT_SECURITY_DOMAIN   Default security domain name that a user belongs to.
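For example, on Linux or UNIX the variable can be exported in the shell that runs the Informatica client programs so that a user name alone resolves against the named security domain. The domain name LDAP_corp is a placeholder:

```shell
# LDAP_corp is a placeholder security domain name.
export INFA_DEFAULT_SECURITY_DOMAIN=LDAP_corp
echo "$INFA_DEFAULT_SECURITY_DOMAIN"   # prints LDAP_corp
```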

Metadata Manager Privileges


Effective in version 9.0, Metadata Manager includes the new privilege Draft/Propose Business Terms in the Catalog privilege group. The Manage Glossary privilege includes the Draft/Propose Business Terms privilege. When you upgrade, users assigned the Manage Glossary privilege are also assigned the Draft/Propose Business Terms privilege.


CHAPTER 27

PowerCenter (9.0)
This chapter includes the following topic:
Reference Table Manager, 146

Reference Table Manager


Effective in version 9.0, the Analyst tool contains the Reference Table Manager functionality. You can create and import reference tables into the Analyst tool. Previously, PowerCenter had a web application named Reference Table Manager. After you upgrade to 9.0, you must create the reference tables in the Analyst tool again. The Analyst tool stores reference tables in the staging database. The Reference Table Manager external connections and privileges are not valid in the Analyst tool. When you create reference tables in the Analyst tool, a developer can view these tables in the Developer tool. A developer can open a reference table to view the contents of the reference table and use them in Lookup and Standardizer transformations. A developer can launch the Analyst tool from the Developer tool to edit the reference table.


CHAPTER 28

Metadata Manager (9.0)


This chapter includes the following topics:
Business Glossary, 147
mmcmd Command Line Program, 147
Data Lineage, 148
Logging, 148
Resources, 149

Business Glossary
Effective in version 9.0, you create links between business terms and reference tables by associating a business term with a URL to a reference table. You can specify the following types of URLs:
Informatica Analyst URL. You can include a URL to a reference table in Informatica Analyst.
Other URL. You can include any valid URL to a reference table.

Previously, you created a link from a business term in Metadata Manager to a reference table in Reference Table Manager. Reference Table Manager does not exist in version 9.0.

mmcmd Command Line Program


Effective in version 9.0, mmcmd includes new commands. The following table describes the new mmcmd commands:
assignConnection      Configures connection assignments for a resource using the properties in the specified resource configuration file.
assignParameterFile   Assigns parameter files to PowerCenter workflows for a PowerCenter resource using the properties in the specified resource configuration file.
cancel                Cancels a resource load.
createRepository      Creates the Metadata Manager warehouse tables and import models for metadata sources in the Metadata Manager repository.
createResource        Creates a resource using the properties in the specified resource configuration file.
deleteRepository      Deletes Metadata Manager repository content including all metadata and repository database tables.
deleteResource        Deletes the resource and all metadata for the resource from the Metadata Manager repository.
export                Exports a custom resource or business glossary from the Metadata Manager repository to an XML file.
getResource           Writes all properties for the specified resource to an XML resource configuration file.
import                Imports a custom resource or business glossary from an XML file into the Metadata Manager repository.
link                  Creates the links between resources that share a connection assignment to run data lineage analysis across the metadata sources.
listResources         Lists all resources in the Metadata Manager repository.
purgeMetadata         Deletes metadata for a resource from the Metadata Manager repository.
restorePCRepository   Restores a repository backup file packaged with PowerCenter to the PowerCenter repository database.
updateResource        Updates a resource using the properties in the specified resource configuration file.
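As an illustration of how these commands might chain together in a load workflow — the resource configuration file and resource name are hypothetical, real invocations also need connection details such as the Metadata Manager URL and credentials, and the commands are echoed so the sketch runs without Metadata Manager:

```shell
# Hypothetical file and resource names; remove the echo to run for real.
echo mmcmd createResource oracle_dw_resource.xml     # create the resource
echo mmcmd assignConnection oracle_dw_resource.xml   # configure connection assignments
echo mmcmd link OracleDW                             # link objects for lineage analysis
```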

Data Lineage
Effective in version 9.0, a data lineage diagram displays the database schema name as the parent object of a table, view, or synonym. Previously, a data lineage diagram did not display the database schema name. Instead, Metadata Manager displayed the schema name in the location of the object. You could view the object location in the Details panel and when you moved the pointer over the object.

Logging
Effective in version 9.0, the Load Monitor is renamed the Load Details tab. The Load Details tab contains the following views:
Log view. Contains resource load events. Metadata Manager updates the Log view as it loads a resource.
Objects view. Contains summary information for metadata objects.
Errors view. Contains summary information for errors.
Sessions view. Contains session statistics for each session in the PowerCenter workflows that Metadata Manager uses to load metadata.
Links view. Contains summary information for links created between objects in connected resources.

You can export all of the contents in the Load Details tab to a single Microsoft Excel file. The file contains a worksheet for each view. You can export the contents of the Links view separately to analyze the missing link details. Previously, the Load Monitor listed the object, link, and error information in the Summary tab. You saved each tab in the Load Monitor window to a separate PDF file. After you upgrade, the Load Details tab for all resources displays "No items to show" until you reload the resources.

Resources
This section describes changes to resources.

Removed Resource Types


Effective in version 9.0, Metadata Manager removed the following resource types:
- Data Analyzer
- Hyperion Essbase
- IBM DB2 CubeViews
- Microsoft Visio Database
- ERX

You cannot create or load these resource types.

Deprecated Metadata Source Versions


Effective in version 9.0, extracting metadata from the following metadata source versions is deprecated and will become obsolete:
- Business Objects 5.x, 6.x, and 11.0
- Cognos 8.0 and 8.1
- Cognos Impromptu 7.0.x, 7.1.5, and 7.3.x
- Cognos ReportNet 1.x
- Embarcadero ERStudio 5.1 through 6.6
- ERwin 3.x and 4.x
- IBM DB2 for LUW 8.x
- IBM DB2 for z/OS 8.x
- IBM Rational Rose 4.0, 98(I) to 2000
- Microsoft SQL Server 2000
- Microstrategy 7.0, 7.5.2, and 8.0
- Oracle Designer 1.3.2, 2.1.2, 6.0, 6i, and 9i
- Sybase PowerDesigner 6.1.x, 7.5 to 12.0
- Teradata v2R6 and v2R6.1

Use the supported versions to load metadata from these sources. You can still create, edit, and load resources from these deprecated versions. However, Informatica cannot help you resolve an issue encountered on a deprecated version.

Linking Objects in Connected Resources


Effective in version 9.0, if you change resource connection assignments, you do not need to reload the resource to create the links between matching objects for data lineage analysis. You can use the Resource Link Administration window to direct Metadata Manager to create the links between matching objects in the connected resources. After upgrading, you must reload the following resource types:
- Relational database resources that share a connection assignment to any data integration, business intelligence, or data modeling resource
- Data integration resources
- Business intelligence resources
- Data modeling resources

Previously, you needed to reload resources after modifying the connection assignments.

Connection Assignments for Purged Resources


Effective in version 9.0, Metadata Manager keeps connection assignments for a purged resource. The Connection Assignment properties display the schema status as purged. If you reload the resource, Metadata Manager changes the status to active if the schema still exists in the source. Previously, Metadata Manager deleted connection assignments for a purged resource.

Automatic Connection Assignment


Effective in version 9.0, Metadata Manager can automatically configure connection assignments for a data integration, business intelligence, or data modeling resource. Metadata Manager configures the connection assignments during a resource load or link process. Use the Links view in the Load Details tab to review the automatic connection assignments and change the assignments as needed. Or, you can manually configure the connection assignments. Upgraded resources do not have Auto Connection Assignment selected by default. Previously, you needed to manually configure connection assignments after creating a resource.

PowerCenter Source Increment Extract Window


Effective in version 9.0, the maximum value of the Source Increment Extract Window parameter for a PowerCenter resource is 8,000 days. Previously, the maximum value was 4,000 days. PowerCenter resources that are upgraded retain the previous value of the parameter. Change the parameter value to extract PowerCenter metadata for more than the past 4,000 days.


PowerCenter Parameter Files


Effective in version 9.0, Metadata Manager can extract workflow parameters from PowerCenter parameter files when you load a PowerCenter resource. When you configure a PowerCenter resource, you can upload multiple PowerCenter parameter files at the same time. Previously, Metadata Manager ignored workflow parameters in PowerCenter parameter files.

