Release Guide
Informatica Release Guide Version 9.5.0 June 2012 Copyright (c) 1998-2012 Informatica. All rights reserved. This software and documentation contain proprietary information of Informatica Corporation and are provided under a license agreement containing restrictions on use and disclosure and are also protected by copyright law. Reverse engineering of the software is prohibited. No part of this document may be reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica Corporation. This Software may be protected by U.S. and/or international Patents and other Patents Pending. Use, duplication, or disclosure of the Software by the U.S. Government is subject to the restrictions set forth in the applicable software license agreement and as provided in DFARS 227.7202-1(a) and 227.7702-3(a) (1995), DFARS 252.227-7013(1)(ii) (OCT 1988), FAR 12.212(a) (1995), FAR 52.227-19, or FAR 52.227-14 (ALT III), as applicable. The information in this product or documentation is subject to change without notice. If you find any problems in this product or documentation, please report them to us in writing. Informatica, Informatica Platform, Informatica Data Services, PowerCenter, PowerCenterRT, PowerCenter Connect, PowerCenter Data Analyzer, PowerExchange, PowerMart, Metadata Manager, Informatica Data Quality, Informatica Data Explorer, Informatica B2B Data Transformation, Informatica B2B Data Exchange Informatica On Demand, Informatica Identity Resolution, Informatica Application Information Lifecycle Management, Informatica Complex Event Processing, Ultra Messaging and Informatica Master Data Management are trademarks or registered trademarks of Informatica Corporation in the United States and in jurisdictions throughout the world. All other company and product names may be trade names or trademarks of their respective owners. 
Portions of this software and/or documentation are subject to copyright held by third parties, including without limitation: Copyright DataDirect Technologies. All rights reserved. Copyright Sun Microsystems. All rights reserved. Copyright RSA Security Inc. All Rights Reserved. Copyright Ordinal Technology Corp. All rights reserved.Copyright Aandacht c.v. All rights reserved. Copyright Genivia, Inc. All rights reserved. Copyright Isomorphic Software. All rights reserved. Copyright Meta Integration Technology, Inc. All rights reserved. Copyright Intalio. All rights reserved. Copyright Oracle. All rights reserved. Copyright Adobe Systems Incorporated. All rights reserved. Copyright DataArt, Inc. All rights reserved. Copyright ComponentSource. All rights reserved. Copyright Microsoft Corporation. All rights reserved. Copyright Rogue Wave Software, Inc. All rights reserved. Copyright Teradata Corporation. All rights reserved. Copyright Yahoo! Inc. All rights reserved. Copyright Glyph & Cog, LLC. All rights reserved. Copyright Thinkmap, Inc. All rights reserved. Copyright Clearpace Software Limited. All rights reserved. Copyright Information Builders, Inc. All rights reserved. Copyright OSS Nokalva, Inc. All rights reserved. Copyright Edifecs, Inc. All rights reserved. Copyright Cleo Communications, Inc. All rights reserved. Copyright International Organization for Standardization 1986. All rights reserved. Copyright ej-technologies GmbH. All rights reserved. Copyright Jaspersoft Corporation. All rights reserved. Copyright is International Business Machines Corporation. All rights reserved. Copyright yWorks GmbH. All rights reserved. Copyright Lucent Technologies 1997. All rights reserved. Copyright (c) 1986 by University of Toronto. All rights reserved. Copyright 1998-2003 Daniel Veillard. All rights reserved. Copyright 2001-2004 Unicode, Inc. Copyright 1994-1999 IBM Corp. All rights reserved. Copyright MicroQuill Software Publishing, Inc. All rights reserved. 
Copyright PassMark Software Pty Ltd. All rights reserved. This product includes software developed by the Apache Software Foundation (http://www.apache.org/), and other software which is licensed under the Apache License, Version 2.0 (the "License"). You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0. Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. This product includes software which was developed by Mozilla (http://www.mozilla.org/), software copyright The JBoss Group, LLC, all rights reserved; software copyright 1999-2006 by Bruno Lowagie and Paulo Soares and other software which is licensed under the GNU Lesser General Public License Agreement, which may be found at http:// www.gnu.org/licenses/lgpl.html. The materials are provided free of charge by Informatica, "as-is", without warranty of any kind, either express or implied, including but not limited to the implied warranties of merchantability and fitness for a particular purpose. The product includes ACE(TM) and TAO(TM) software copyrighted by Douglas C. Schmidt and his research group at Washington University, University of California, Irvine, and Vanderbilt University, Copyright () 1993-2006, all rights reserved. This product includes software developed by the OpenSSL Project for use in the OpenSSL Toolkit (copyright The OpenSSL Project. All Rights Reserved) and redistribution of this software is subject to terms available at http://www.openssl.org and http://www.openssl.org/source/license.html. This product includes Curl software which is Copyright 1996-2007, Daniel Stenberg, <daniel@haxx.se>. All Rights Reserved. 
Permissions and limitations regarding this software are subject to terms available at http://curl.haxx.se/docs/copyright.html. Permission to use, copy, modify, and distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. The product includes software copyright 2001-2005 () MetaStuff, Ltd. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://www.dom4j.org/ license.html. The product includes software copyright 2004-2007, The Dojo Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://dojotoolkit.org/license. This product includes ICU software which is copyright International Business Machines Corporation and others. All rights reserved. Permissions and limitations regarding this software are subject to terms available at http://source.icu-project.org/repos/icu/icu/trunk/license.html. This product includes software copyright 1996-2006 Per Bothner. All rights reserved. Your right to use such materials is set forth in the license which may be found at http:// www.gnu.org/software/ kawa/Software-License.html. This product includes OSSP UUID software which is Copyright 2002 Ralf S. Engelschall, Copyright 2002 The OSSP Project Copyright 2002 Cable & Wireless Deutschland. Permissions and limitations regarding this software are subject to terms available at http://www.opensource.org/licenses/mit-license.php. This product includes software developed by Boost (http://www.boost.org/) or under the Boost software license. Permissions and limitations regarding this software are subject to terms available at http://www.boost.org/LICENSE_1_0.txt. This product includes software copyright 1997-2007 University of Cambridge. Permissions and limitations regarding this software are subject to terms available at http:// www.pcre.org/license.txt. 
This product includes software copyright 2007 The Eclipse Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http:// www.eclipse.org/org/documents/epl-v10.php. This product includes software licensed under the terms at http://www.tcl.tk/software/tcltk/license.html, http://www.bosrup.com/web/overlib/?License, http://www.stlport.org/ doc/ license.html, http://www.asm.ow2.org/license.html, http://www.cryptix.org/LICENSE.TXT, http://hsqldb.org/web/hsqlLicense.html, http://httpunit.sourceforge.net/doc/ license.html, http://jung.sourceforge.net/license.txt , http://www.gzip.org/zlib/zlib_license.html, http://www.openldap.org/software/release/license.html, http://www.libssh2.org, http://slf4j.org/license.html, http://www.sente.ch/software/OpenSourceLicense.html, http://fusesource.com/downloads/license-agreements/fuse-message-broker-v-5-3- licenseagreement; http://antlr.org/license.html; http://aopalliance.sourceforge.net/; http://www.bouncycastle.org/licence.html; http://www.jgraph.com/jgraphdownload.html; http:// www.jcraft.com/jsch/LICENSE.txt. http://jotm.objectweb.org/bsd_license.html; . http://www.w3.org/Consortium/Legal/2002/copyright-software-20021231; http:// developer.apple.com/library/mac/#samplecode/HelpHook/Listings/HelpHook_java.html; http://www.jcraft.com/jsch/LICENSE.txt; http://nanoxml.sourceforge.net/orig/ copyright.html; http://www.json.org/license.html; http://forge.ow2.org/projects/javaservice/, http://www.postgresql.org/about/licence.html, http://www.sqlite.org/copyright.html, http://www.tcl.tk/software/tcltk/license.html, http://www.jaxen.org/faq.html, http://www.jdom.org/docs/faq.html; http://www.iodbc.org/dataspace/iodbc/wiki/iODBC/License; http://
www.keplerproject.org/md5/license.html; http://www.toedter.com/en/jcalendar/license.html; http://www.edankert.com/bounce/index.html; http://www.net-snmp.org/about/ license.html; http://www.openmdx.org/#FAQ; http://www.php.net/license/3_01.txt; and http://srp.stanford.edu/license.txt; and http://www.schneier.com/blowfish.html; http:// www.jmock.org/license.html; http://xsom.java.net/. This product includes software licensed under the Academic Free License (http://www.opensource.org/licenses/afl-3.0.php), the Common Development and Distribution License (http://www.opensource.org/licenses/cddl1.php) the Common Public License (http://www.opensource.org/licenses/cpl1.0.php), the Sun Binary Code License Agreement Supplemental License Terms, the BSD License (http:// www.opensource.org/licenses/bsd-license.php) the MIT License (http://www.opensource.org/licenses/mitlicense.php) and the Artistic License (http://www.opensource.org/licenses/artistic-license-1.0). This product includes software copyright 2003-2006 Joe WaInes, 2006-2007 XStream Committers. All rights reserved. Permissions and limitations regarding this software are subject to terms available at http://xstream.codehaus.org/license.html. This product includes software developed by the Indiana University Extreme! Lab. For further information please visit http://www.extreme.indiana.edu/. This Software is protected by U.S. Patent Numbers 5,794,246; 6,014,670; 6,016,501; 6,029,178; 6,032,158; 6,035,307; 6,044,374; 6,092,086; 6,208,990; 6,339,775; 6,640,226; 6,789,096; 6,820,077; 6,823,373; 6,850,947; 6,895,471; 7,117,215; 7,162,643; 7,243,110; 7,254,590; 7,281,001; 7,421,458; 7,496,588; 7,523,121; 7,584,422; 7,676,516; 7,720,842; 7,721,270; and 7,774,791, international Patents and other Patents Pending. 
DISCLAIMER: Informatica Corporation provides this documentation "as is" without warranty of any kind, either express or implied, including, but not limited to, the implied warranties of noninfringement, merchantability, or use for a particular purpose. Informatica Corporation does not warrant that this software or documentation is error free. The information provided in this software or documentation may include technical inaccuracies or typographical errors. The information in this software and documentation is subject to change at any time without notice. NOTICES This Informatica product (the "Software") includes certain drivers (the "DataDirect Drivers") from DataDirect Technologies, an operating company of Progress Software Corporation ("DataDirect") which are subject to the following terms and conditions: 1. THE DATADIRECT DRIVERS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT. 2. IN NO EVENT WILL DATADIRECT OR ITS THIRD PARTY SUPPLIERS BE LIABLE TO THE END-USER CUSTOMER FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF THE USE OF THE ODBC DRIVERS, WHETHER OR NOT INFORMED OF THE POSSIBILITIES OF DAMAGES IN ADVANCE. THESE LIMITATIONS APPLY TO ALL CAUSES OF ACTION, INCLUDING, WITHOUT LIMITATION, BREACH OF CONTRACT, BREACH OF WARRANTY, NEGLIGENCE, STRICT LIABILITY, MISREPRESENTATION AND OTHER TORTS. Part Number: IN-REL-95000-0001
Table of Contents
Preface
    Informatica Resources
        Informatica Customer Portal
        Informatica Documentation
        Informatica Web Site
        Informatica How-To Library
        Informatica Knowledge Base
        Informatica Multimedia Knowledge Base
        Informatica Global Customer Support
Mapping and Mapplet Editors
Match Transformation
Projects
Reference Tables
Scorecards
Part II: Version 9.1.0
    Chapter 10: New Features and Enhancements (9.1.0)
Version 9.1.0 HotFix 4
    Informatica Data Services
    PowerCenter
Version 9.1.0 HotFix 3
    Informatica Data Explorer
    Informatica Data Quality
    Informatica Data Services
    Informatica Domain
    Adapters for PowerCenter
    Metadata Manager
    PowerCenter
Version 9.1.0 HotFix 2
    Informatica Data Explorer
    Informatica Data Quality
    Informatica Data Services
    Adapters for PowerCenter
    Metadata Manager
Version 9.1.0 HotFix 1
    Informatica Data Quality
    Informatica Data Services
    Informatica Domain
    Command Line Programs
    Adapters for PowerCenter
    Metadata Manager
Version 9.1.0
    Informatica Data Explorer
    Informatica Data Quality
    Informatica Data Services
    Informatica Documentation
    Informatica Domain
    Command Line Programs
    Metadata Manager
    PowerCenter
    Data Analyzer
    Adapters for PowerCenter
Create Web Service from a WSDL Data Object Wizard
Deployment
Ports Tab Options
Cache Property in the Lookup Transformation
Web Service Consumer Transformation
    Deleted WSDL Data Object
    Ports Tab
    Ports Tab Options
Part III: Version 9.0.1
    Chapter 19: New Features and Enhancements (9.0.1)
Version 9.0.1 HotFix 2
    Informatica Data Services
    Informatica Data Quality and Informatica Data Explorer Advanced Edition
Version 9.0.1 HotFix 1
    Informatica Domain
    Informatica Data Quality and Informatica Data Explorer Advanced Edition
    Informatica Data Services
    Metadata Manager
Version 9.0.1
    Informatica Data Quality and Informatica Data Explorer Advanced Edition
    Informatica Data Services
    Informatica Domain
    Metadata Manager
    PowerCenter
    PowerExchange
    Adapters for Data Quality and Data Services
    Adapters for PowerCenter
Chapter 20: Informatica Data Quality and Informatica Data Explorer Advanced Edition (9.0.1)
Transformations
Connection Permissions
Command Line Programs
    infacmd Changed Commands
    infacmd New Commands
    New Environment Variables
Data Integration Service
    Data Integration Service Privilege
    Maximum # of Concurrent Connections Property
    Maximum Execution Pool Size Property
LDAP
    User Import
Logs
    View Log Events from the Previous Informatica Version
Maximum Heap Size
Node Diagnostics
Reports
    User Activity Report
    License Management Report
Part IV: Version 9.0
    Chapter 25: New Features and Enhancements (9.0)
Informatica Data Quality and Informatica Data Services
    Data Quality
    Data Services
Informatica Analyst
Informatica Domain
    Command Line Interface
    Services
    Management
PowerCenter
Metadata Manager
PowerExchange
Adapters for PowerCenter
    PowerExchange for HP Neoview Transporter
    PowerExchange for Netezza
    PowerExchange for Oracle E-Business Suite
    PowerExchange for SAP NetWeaver
    PowerExchange for Teradata Parallel Transporter API
    PowerExchange for Web Services
    PowerExchange for webMethods
Connection Assignments for Purged Resources
Automatic Connection Assignment
PowerCenter Source Increment Extract Window
PowerCenter Parameter Files
Preface
The Informatica Release Guide is written for the administrators who are responsible for installing and configuring the Informatica platform, and for the developers and software engineers who implement Informatica. This guide assumes that you have knowledge of the features for which you are responsible. The Informatica Release Guide lists new features and enhancements, behavior changes between versions, and tasks you might need to perform after you upgrade from a previous version.
Informatica Resources
Informatica Customer Portal
As an Informatica customer, you can access the Informatica Customer Portal site at http://mysupport.informatica.com. The site contains product information, user group information, newsletters, access to the Informatica customer support case management system (ATLAS), the Informatica How-To Library, the Informatica Knowledge Base, the Informatica Multimedia Knowledge Base, Informatica Product Documentation, and access to the Informatica user community.
Informatica Documentation
The Informatica Documentation team makes every effort to create accurate, usable documentation. If you have questions, comments, or ideas about this documentation, contact the Informatica Documentation team by email at infa_documentation@informatica.com. We will use your feedback to improve our documentation. Let us know if we can contact you regarding your comments. The Documentation team updates documentation as needed. To get the latest documentation for your product, navigate to Product Documentation from http://mysupport.informatica.com.
Informatica How-To Library
The Informatica How-To Library, available through the Informatica Customer Portal, includes articles and interactive demonstrations that provide solutions to common problems, compare features and behaviors, and guide you through performing specific real-world tasks.
Informatica Global Customer Support
Standard-rate telephone numbers:
Belgium: +31 30 6022 797
France: +33 1 4138 9226
Germany: +49 1805 702 702
Netherlands: +31 306 022 797
United Kingdom: +44 1628 511445
CHAPTER 1
Version 9.5.0
This section describes new features and enhancements in version 9.5.0.
Informatica Installer
This section describes new features and enhancements to the Informatica platform installer.
Connections
Version 9.5.0 includes the following enhancements for connections:
- You can rename connections.
- You can configure advanced properties of a database connection when you create a database connection in the Analyst tool.
- You can edit database connections in the Analyst tool.
Enterprise Discovery
You can run multiple data discovery tasks on a large number of data sources across multiple connections and generate a consolidated summary of the profile results. This data discovery method includes running a column profile, performing data domain discovery, and discovering primary key and foreign key relationships. You can view the results in both graphical and tabular formats. You can run enterprise discovery from a profile model in the Developer tool.
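The discovery steps described above can be pictured in a short, non-Informatica sketch: column profiling plus primary-key and foreign-key candidate discovery. The table data, column names, and the simple uniqueness/containment rules below are invented for illustration; the Developer tool's actual discovery logic is more sophisticated.

```python
def profile_column(values):
    """Basic column profile: row count, null count, and distinct count."""
    non_null = [v for v in values if v is not None]
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
    }

def primary_key_candidates(table):
    """A column is a primary-key candidate if it has no nulls and
    every value is distinct."""
    candidates = []
    for name, values in table.items():
        p = profile_column(values)
        if p["nulls"] == 0 and p["distinct"] == p["rows"]:
            candidates.append(name)
    return candidates

def foreign_key_candidates(child_table, parent_key_values):
    """A child column is a foreign-key candidate if all of its non-null
    values appear in the candidate parent key column."""
    parent = set(parent_key_values)
    return [
        name for name, values in child_table.items()
        if values and all(v in parent for v in values if v is not None)
    ]

customers = {"id": [1, 2, 3], "name": ["Ann", "Bo", "Ann"]}
orders = {"order_id": [10, 11, 12, 13], "customer_id": [1, 1, 3, 2]}

print(primary_key_candidates(customers))                 # ['id']
print(foreign_key_candidates(orders, customers["id"]))   # ['customer_id']
```

A consolidated enterprise-discovery summary would run these checks for every table across every connection and merge the results into one report.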
Find in Editor
In the Developer tool, you can search for attributes, columns, expressions, groups, ports, or transformations in any type of mapping editor, in a logical data object editor, in a mapplet editor, or in a workflow editor.
Project Permissions
You can assign read, write, and grant permissions to users and groups when you create a project and when you edit project details.
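One way to picture the read, write, and grant permission levels described above is a small per-project permission registry. The class below is a hypothetical sketch, not the Model repository's actual authorization code; it assumes that only a user who holds the grant permission can assign permissions to others.

```python
class Project:
    """Toy project with per-user permission sets: read, write, grant."""

    def __init__(self, name):
        self.name = name
        self.permissions = {}  # user -> set of permission names

    def assign(self, grantor, user, perms):
        """Assumed rule: only a holder of 'grant' may assign permissions."""
        if "grant" not in self.permissions.get(grantor, set()):
            raise PermissionError(f"{grantor} cannot grant on {self.name}")
        self.permissions.setdefault(user, set()).update(perms)

    def can(self, user, perm):
        return perm in self.permissions.get(user, set())

project = Project("Customer_DQ")
project.permissions["owner"] = {"read", "write", "grant"}  # project creator
project.assign("owner", "analyst1", {"read", "write"})
print(project.can("analyst1", "write"))  # True
print(project.can("analyst1", "grant"))  # False
```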
Scorecards
You can configure a third-party application to get the scorecard results and run reports. You can copy the scorecard URL from the Developer tool and add it to the source code of external applications or portals. You can also drill down into source rows and view trend charts from external applications.
Content Management Service
You configure the path to the probabilistic model files on each Content Management Service.
You can configure a master Content Management Service for an Informatica domain or grid. You specify a
master Content Management Service when you want to run a mapping that reads probabilistic model data on multiple nodes. When you use a master Content Management Service, any probabilistic model file that you create or update on the master service host machine is copied from the master service machine to the locations specified by the other Content Management Services on the domain or grid.
The Content Management Service enables dynamic configuration updates for the Address Validator
transformation and the Match transformation. The Content Management Service updates the input port list in the Address Validator transformation each time you open the transformation. You can install an address validation engine update from Informatica without performing a product reinstall. The Content Management Service updates the list of identity population files in the Match transformation each time you open the transformation.
Data Masking
The Data Masking transformation includes the following enhancements:
- Masking that produces deterministic results for the same source data, masking rules, and seed value.
- Dependent masking. Replaces the values of one source column based on the values of another source column.
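As an illustration of these two behaviors, the sketch below derives a masked value from the source value and a seed, so repeated runs with the same inputs produce the same result, and masks one column from another. The hashing approach is an assumption for the example, not Informatica's masking algorithm.

```python
import hashlib

def mask_value(value, seed):
    """Deterministic masking sketch: the same value and seed always
    yield the same masked output (illustrative only)."""
    return hashlib.sha256(f"{seed}:{value}".encode()).hexdigest()[:8]

def mask_dependent(rows, source_col, dependent_col, seed):
    """Dependent masking sketch: replace one column's values based on
    the values of another source column."""
    for row in rows:
        row[dependent_col] = mask_value(row[source_col], seed)
    return rows

rows = mask_dependent([{"name": "Alice", "alias": ""}], "name", "alias", seed=7)
```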
Decision Transformation
The Decision transformation handles integer values in IF/ELSE statements in addition to boolean values. The transformation processes a 0 value as False and other integer values as True.
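The integer handling described above matches ordinary zero/nonzero truthiness. A minimal Python sketch of the rule (illustrative only, not the transformation's implementation):

```python
def evaluate_condition(value):
    """Treat an integer condition the way the Decision transformation
    is described to: 0 is False, any other integer is True."""
    return value != 0

# 0 evaluates as False; positive and negative integers evaluate as True.
results = [evaluate_condition(v) for v in (0, 1, -5, 42)]
# results == [False, True, True, True]
```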
Probabilistic Models
A probabilistic model is a content set that you can use to identify data values on input ports that contain one or more values in a delimited string. A probabilistic model uses probabilistic matching logic to identify data values by the types of information the values contain. You can use a probabilistic model in Labeler and Parser transformations. You create a probabilistic model in the Developer tool. You select the model from a project folder in the Model repository. The Developer tool writes probabilistic model data to a file you specify in the Content Management Service.
Workflows
A workflow is a graphical representation of a set of events, tasks, and decisions that define a business process. You use the Developer tool to add objects to a workflow and to connect the objects with sequence flows. The Workflow Service Module is the component in the Data Integration Service that uses the instructions configured in the workflow to run the objects. A workflow can contain the following objects:
- Start event that represents the beginning of the workflow.
- End event that represents the end of the workflow.
- Mapping task that runs a mapping.
- Command task that runs a single shell command.
- Human task that involves user interaction with an application. For example, you view bad or duplicate records.
- Notification task that sends an email notification. Before a workflow can send emails, you must use the Administrator tool to configure the email server properties for the Data Integration Service.
- Assignment task that assigns a value to a user-defined workflow variable.
- Exclusive gateway that makes a decision to split and merge paths in the workflow.
A sequence flow connects workflow objects to specify the order that the Data Integration Service runs the objects. You can create a conditional sequence flow to determine whether the Data Integration Service runs the next object. You can define and use workflow variables and parameters to make workflows more flexible. A workflow variable represents a value that records run-time information and that can change during a workflow run. A workflow parameter represents a constant value that you define in a parameter file before running a workflow. After you validate a workflow to identify errors, you add the workflow to an application and deploy the application to a Data Integration Service. You run an instance of the workflow from the deployed application using the infacmd wfs command line program. You monitor the workflow instance run in the Monitoring tool.
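The object and sequence-flow model above can be illustrated with a small Python sketch; the runner, task names, and condition format are invented for this example and do not represent the Workflow Service Module's implementation.

```python
def run_workflow(objects, parameters):
    """Run workflow objects in sequence order. Each object is a
    (name, action, condition) tuple; a conditional sequence flow
    (condition) decides whether the next object runs. Variables
    record run-time values; parameters are constants set before the run."""
    variables = dict(parameters)
    for name, action, condition in objects:
        if condition is None or condition(variables):
            variables[name] = action(variables)  # record run-time information
    return variables

state = run_workflow(
    [
        ("start", lambda v: "started", None),
        ("mapping_task", lambda v: v["batch_size"] * 2, None),
        # Conditional sequence flow: notify only if the mapping produced rows.
        ("notification", lambda v: "email sent", lambda v: v["mapping_task"] > 0),
    ],
    {"batch_size": 5},
)
```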
Mapping Specifications
Version 9.5.0 includes the following enhancements for mapping specifications in the Analyst tool:
- You can select multiple source columns and drag them to insert between target columns in a mapping specification.
- When you edit a mapping specification, all objects appear in a single tabbed dialog. Analysts can select sources, joins, lookups, reusable rules, expressions, filters, aggregators, and a target from a tab and edit these objects.
- You can run a profile on a source, source columns, or target columns in a mapping specification to better understand the data, for example before you join two sources.
- The Analyst tool updates a mapping specification when you open it again after deleting a source column, and adjusts the remaining column positions.
Performance
Version 9.5.0 includes the following performance enhancements:
- You can configure early selection and push-into optimization with the Java transformation and the Web Service Consumer transformation.
- You can add hints to a source query so that the database chooses an efficient query run plan to access the source. The source database must be Oracle, Sybase, IBM DB2, or Microsoft SQL Server.
SQL Data Services
Version 9.5.0 includes the following enhancements for SQL data services:
- You can submit correlated subqueries that return rows that match a specific set of criteria. You can submit correlated subqueries from an ODBC or JDBC client, or from the query plan window in the Developer tool.
- You can connect to an SQL data service through a default ODBC or JDBC connection specified in the SQL data service and then create and drop local temporary tables in a relational database.
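The create-and-drop temporary table flow can be sketched with Python's sqlite3 module standing in for an ODBC or JDBC connection to the SQL data service; the connection target and table name are placeholders for the example.

```python
import sqlite3

# sqlite3 stands in for an ODBC/JDBC connection to an SQL data service;
# the flow shown (create, use, drop a local temporary table) is the point.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TEMPORARY TABLE staging (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO staging VALUES (?, ?)", [(1, "a"), (2, "b")])
row_count = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
conn.execute("DROP TABLE staging")  # drop the local temporary table
conn.close()
```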
Web Services
Web Service Consumer Transformation
Version 9.5.0 includes the following enhancements for the Web Service Consumer transformation:
You can enable the Web Service Consumer transformation to create multiple concurrent connections to a
web service so that it can send multiple web service requests in parallel. When you enable the Web Service Consumer transformation to create multiple concurrent connections to the web service, you can set the memory consumption limit and the number of concurrent connection limits.
The Web Service Consumer transformation can process SOAP 1.2 messages with document/literal encoding. You can create a Web Service Consumer transformation with a SOAP 1.2 binding. The fault output ports for SOAP 1.2 are code, reason, node, role, and detail.
Generic Fault
You can define a generic fault to return an error message to a web service client when an error is not defined by a fault element in the WSDL. Create a Fault transformation to return a generic error message.
Schema Objects
Version 9.5.0 includes the following enhancements for schema objects:
You can add multiple root .xsd files to a schema object. You can also remove .xsd files from a schema
object.
You can update a schema object when elements, attributes, types, or other schema components change.
When you update a schema object, the Developer tool updates objects that use the schema.
The following table describes the methods that you can use to update a schema object:
- Synchronize the schema. Synchronize a schema object when you update the schema files outside the Developer tool. The Developer tool re-imports all of the schema .xsd files that contain changes.
- Edit a schema file. Edit a schema file when you want to update a file from within the Developer tool. The Developer tool opens the file in your XSD file editor or in an editor that you select.
Hierarchy Level of Elements
You can change the hierarchy of the elements in an operation mapping.
Operations
You can create and configure operations in the web service Overview view. After you manually create a web service, you can create an operation from a reusable object.
SOAP 1.2
The Data Integration Service can process SOAP 1.2 messages with document/literal encoding. Each web service can have an operation that uses a SOAP 1.2 binding. When you create a fault using SOAP 1.2, the wizard creates the code, reason, node, and role elements.
WSDL Synchronization
You can synchronize a WSDL data object when the WSDL files change. When you synchronize a WSDL data object, the Developer tool re-imports the object metadata from the WSDL files. The Developer tool also updates objects that reference the WSDL or marks them as changed when you open them.
Informatica Domain
This section describes new features and enhancements to the Informatica domain.
Connection Management
You can rename connections.
When you enable the Informatica Data Director Service, the Service Manager starts Informatica Data Director for Data Quality. You can open Informatica Data Director for Data Quality in a web browser.
The Data Integration Service includes the following directory properties:
- Log Directory. Directory for log files. Default is <home directory>/disLogs.
- Cache Directory. Directory for index and data cache files for transformations. Default is <home directory>/Cache.
- Source Directory. Directory for source flat files used in a mapping. Default is <home directory>/source.
- Target Directory. Default directory for target flat files used in a mapping. Default is <home directory>/target.
- Rejected Files Directory. Directory for reject files. Reject files contain rows that were rejected when running a mapping. Default is <home directory>/reject.
Out of Process Execution
You can run each Data Integration Service job as a separate operating system process. Each job can run separately without affecting other jobs running on the Data Integration Service. For optimal performance, run batch jobs and long jobs, such as preview, profile, scorecard, and mapping jobs, out of process.
Email Server Properties
You can configure email server properties for the Data Integration Service. The email server properties configure the SMTP server that the Data Integration Service uses to send email notifications from a workflow.
Grid
You can run the Data Integration Service on a grid. When you run an object on a grid, you improve scalability and performance by distributing the work across multiple DTM processes running on nodes in the grid.
Human Task Service Module
The Human Task Service Module is the component in the Data Integration Service that manages requests to run a Human task in a workflow.
Logical Data Object Properties
If you want to manage the data object cache through the database, you can specify a cache table name for each logical data object. When you specify a cache table name, the database user or a third-party tool that you configure populates and refreshes the cache.
SQL Properties You can configure the following SQL properties for the Data Integration Service:
- DTM Keep Alive Time. Number of milliseconds that the DTM process stays open after it completes the last request. Identical SQL queries can reuse the open process. You can set this property globally or for each SQL data service that is deployed to the Data Integration Service.
- Table Storage Connection. Relational database connection that stores temporary tables for SQL data services.
- Skip Log Files. Prevents the Data Integration Service from generating log files when the SQL data service request completes successfully and the tracing level is set to INFO or higher.
Virtual Table Properties
If you want to manage the data object cache through the database, you can specify a cache table name for each virtual table. When you specify a cache table name, the database user or a third-party tool that you configure populates and refreshes the cache.
Web Service Properties
You can configure the following web service properties for the Data Integration Service:
- DTM Keep Alive Time. Number of milliseconds that the DTM process stays open after it completes the last request. Web service requests that are issued against the same operation can reuse the open process. You can set this property globally or for each web service that is deployed to the Data Integration Service.
- Logical URL. Prefix for the WSDL URL if you use an external HTTP load balancer.
- Skip Log Files. Prevents the Data Integration Service from generating log files when the web service request completes successfully and the tracing level is set to INFO or higher.
Workflow Service Module The Workflow Service Module is the component in the Data Integration Service that manages requests to run workflows.
Monitoring
You can monitor a workflow instance run in the Monitoring tab of the Administrator tool. You can view the status of running workflow and workflow object instances. You can abort or cancel a running workflow instance. You can also view workflow reports, workflow logs, and mapping logs for mappings run by Mapping tasks in the workflow.
The function of the NODE statement did not change for PowerCenter Integration Service workflows.
Profile Privilege
Assign the Manage Data Domains Model Repository Service privilege to enable a user to create, edit, and delete data domains in the data domain glossary.
Security
- The Model Repository Service includes the Show Security Details privilege. When you disable this privilege, error and warning message details do not display the names of projects for which users do not have read permission.
- The Informatica domain locks out a user if the user exceeds the maximum number of failed logins. The administrator can configure the maximum number of failed logins. The administrator can also unlock an account.
Upgrade
Commands that are new or changed include PurgeDataObjectCache and UpdateServiceOptions. If you created scripts that use the changed Data Integration Service options, you must update the scripts.
infacmd ps Commands
The following table describes new commands:
- cancelProfileExecution. Cancels the profile model run.
- executeProfile. Runs the profile model.
- getProfileExecutionStatus. Gets the run-time status of a profile model.
- migrateScorecards. Migrates scorecard results from Informatica 9.1.0 to 9.5.0.
Commands that are new or changed include Export, Import, UpdateTableOptions, and StartWorkflow. If you created scripts that use the changed options, you must update the scripts.
infacmd ws Commands
The following table describes an updated command:
- UpdateWebServiceOptions. Contains the new web service option WebServiceOptions.DTMKeepAliveTime. This option sets the keepalive time for one web service that is deployed to the Data Integration Service.
pmrep
The following table describes updated commands:
- ExecuteQuery, FindCheckout, ListObjects, ListObjectDependencies, and Validate. Each command contains the new option -y. This option displays the database type of sources and targets.
PowerCenter
This section describes new features and enhancements to PowerCenter.
Datatypes
PowerCenter supports the Microsoft SQL Server datetime2 datatype. The datetime2 datatype has a precision of 27 and a scale of 7.
Transformation Language
Use the optional argument, match_from_start, with the REG_EXTRACT function to return the substring if a match is found from the start of the string. The REG_EXTRACT function uses the following syntax:
REG_EXTRACT( subject, 'pattern', subPatternNum, match_from_start )
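To illustrate the effect of match_from_start, here is an analogous helper using Python's re module; this mirrors the described behavior (anchoring the match at the start of the string) and is not the Informatica engine's regex implementation.

```python
import re

def reg_extract(subject, pattern, sub_pattern_num=1, match_from_start=False):
    """Analogy for REG_EXTRACT: return the numbered subpattern.
    With match_from_start, the pattern must match at the start of the string."""
    match = re.match(pattern, subject) if match_from_start else re.search(pattern, subject)
    return match.group(sub_pattern_num) if match else None

anywhere = reg_extract("abc123", r"(\d+)")                          # "123"
from_start = reg_extract("abc123", r"(\d+)", match_from_start=True)  # None
```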
Metadata Manager
This section describes new features and enhancements to Metadata Manager.
Resources
SAP BW Resource
You can create and configure a SAP BW resource to extract metadata from SAP NetWeaver Business Warehouse.
Custom Resource
You can create and configure custom resources to extract metadata from custom files such as comma-separated files. You can create load template files that contain all mapping rules and rule sets used to load the custom resources.
Rule-based Links
Use rule-based links to define rules that Metadata Manager uses to link matching elements between a custom resource type and another custom, packaged, or business glossary resource type. You can also configure rule-based links between a business glossary and a packaged resource type. Configure rule-based links so that you can run data lineage analysis across metadata sources.
New commands manage load template files and linking rule sets. Commands export, delete, list, and update the load template files for a custom resource, and create or update a linking rule set based on a rule set XML file. If a rule set does not exist when you update it, the command creates the rule set. The following commands manage linking rule sets:
- deletelinkruleset. Deletes a linking rule set.
- exportlinkruleset. Exports all linking rule sets for a resource to XML files. You can import the rule sets into another Metadata Manager repository.
- importlinkruleset. Imports all linking rule sets from XML files in the specified path into the Metadata Manager repository.
PowerExchange Adapters
This section describes new features and enhancements to PowerExchange adapters in version 9.5.0.
PowerExchange for Greenplum includes the following new properties:
- Null As
- Quote
- Error Table
- Greenplum Pre SQL
- Greenplum Post SQL
- You can use PowerExchange for Microsoft Dynamics CRM for an Internet-facing deployment with claims-based authentication.
- You can read and write the PartyList datatype from Microsoft Dynamics CRM. Intersect entities are writable.
You can use the PowerCenter command line program pmpasswd to encrypt the password.
PowerExchange for SAP NetWeaver
PowerExchange for SAP NetWeaver uses SAP RFC SDK 7.2 libraries.
PowerExchange for Facebook
- You can use a Facebook data object operation as a source in mappings.
- You can use Facebook search operators in a query parameter to search for data.
PowerExchange for LinkedIn
- You can use a LinkedIn data object operation as a source in mappings.
- You can use LinkedIn search operators in a query parameter to search for data.
PowerExchange for Twitter
- You can use a Twitter data object operation as a source in mappings.
- You can use Twitter search operators in a query parameter to search for data.
Documentation
This section describes new features and enhancements to the documentation.
Documentation DVD
The Informatica Documentation DVD contains product manuals in PDF format. Effective in 9.5.0, the documentation DVD uses a browser-based user interface. Supported browsers are Internet Explorer 7.0 or later and Mozilla Firefox 9.0 or later. Ensure that JavaScript support is enabled and the Adobe Acrobat Reader plug-in is installed in your browser.
CHAPTER 2
The option to limit the number of foreign keys identified between a child data object and a parent data object changed from Max foreign keys returned to Max foreign keys between data objects.
Previously, the option determined the total number of foreign keys the Developer tool returned in the profile results and the default value was 500.
Projects
Effective in version 9.5.0, Model repository projects include the following changes:
You can share project contents by assigning permissions to users and groups. You can assign read, write, and grant permissions when you create or edit a project. Previously, to share project contents, you created a shared project. When you upgrade a shared project to version 9.5.0, all domain users inherit read permission on the project.
The Developer tool hides the projects that you do not have read permission on.
Previously, the Developer tool displayed all projects regardless of project permissions.
Scorecards
Effective in version 9.5.0, you must migrate scorecard results from version 9.1.0 before you can use existing scorecards. To view the results of existing scorecards, run the infacmd ps migrateScorecards command.
CHAPTER 3
The following table describes tasks to edit a profile that have changed action menus:
- Change the basic properties such as name, description, and profile type: Actions > General (previously Actions > Edit Properties).
- Choose another matching data source for the profile.
- Select the columns you want to run the profile on and configure the sampling and drill-down options.
- Create, edit, and delete filters: Actions > Column Profiling Filter.
- Create rules or change current ones: Actions > Column Profiling Rules.
Export to PowerCenter
Effective in version 9.5.0, the process to export Model repository objects to the PowerCenter repository writes log message files to the machine that performs the export operation. Previously, the export process did not write log files and displayed log messages for Developer tool export operations only. If you export to a PowerCenter repository from a Developer tool machine, the export process writes log files to the following location:
[9.5.0_Install_Directory]\clients\DeveloperClient\infacmd\exporttopc_cli_logs
If you export to a PowerCenter repository from an Informatica services machine, the export process writes log files to the following location:
[9.5.0_Install_Directory]\tomcat\logs\exporttopc_cli_logs
You must have write access to the log directory. If you do not have write access, Informatica 9.5.0 displays a warning message to state that no logs are stored for the export.
The following table lists the default values for these directories:
- Flat file data object, Source file directory. Default: SourceDir.
- Flat file data object, Output file directory. Default: TargetDir.
- Flat file data object, Reject file directory. Default: RejectDir.
- Aggregator transformation, Cache directory. Default: CacheDir.
- Joiner transformation, Cache directory. Default: CacheDir.
- Lookup transformation, Cache directory. Default: CacheDir.
- Rank transformation, Cache directory. Default: CacheDir.
- Sorter transformation, Work directory. Default: TempDir.
Previously, the default value for all of these directories was ".", which stood for the following directory:
<Informatica Services Installation Directory>\tomcat\bin
When you upgrade, the upgrade process does not change the value of these directory fields. If you used the previous default value of ".", the upgrade process retains that value.
Exception Transformation
Effective in version 9.5.0, you can connect the bad record and duplicate record output from an Exception transformation to a data object in a mapping. You use the transformation to create the bad record and duplicate record tables and the data object. Previously, you used the Exception transformation to write data to bad record or duplicate record tables that are not represented as repository objects. If you upgrade to Data Quality 9.5.0 and the Model repository contains an Exception transformation, complete the following steps to use the transformation in Data Quality 9.5.0:
1. Create a data object from the database table that contains the bad records or duplicate records.
2. Add the data object to the mapping canvas.
3. Connect the bad record or duplicate record output ports to the data object.
When you run a mapping with an Exception transformation in Data Quality 9.5.0, you can use Informatica Analyst or Informatica Data Director for Data Quality to review and edit the table records.
Match Transformation
Effective in version 9.5.0, the Match transformation refreshes the list of identity population files that are installed on the Informatica services machine each time you open a strategy in the transformation. Previously, the Match transformation read the list of identity population files when you started the Developer tool.
Projects
Effective in version 9.5.0, Model repository projects include the following changes:
You can share project contents by assigning permissions to users and groups. You can assign read, write, and grant permissions when you create or edit a project. Previously, to share project contents, you created a shared project. When you upgrade a shared project to version 9.5.0, all domain users inherit read permission on the project.
The Developer tool hides the projects that you do not have read permission on.
Previously, the Developer tool displayed all projects regardless of project permissions.
Reference Tables
Effective in 9.5.0, you can use the Developer tool and Analyst tool to create, edit, and delete reference tables in the Model repository. Previously, you used the Analyst tool to perform reference table operations.
Scorecards
Effective in version 9.5.0, you must migrate scorecard results from version 9.1.0 before you can use existing scorecards. To view the results of existing scorecards, run the infacmd ps migrateScorecards command.
CHAPTER 4
Export to PowerCenter
Effective in version 9.5.0, the process to export Model repository objects to the PowerCenter repository writes log message files to the machine that performs the export operation. Previously, the export process did not write log files and displayed log messages for Developer tool export operations only. If you export to a PowerCenter repository from a Developer tool machine, the export process writes log files to the following location:
[9.5.0_Install_Directory]\clients\DeveloperClient\infacmd\exporttopc_cli_logs
If you export to a PowerCenter repository from an Informatica services machine, the export process writes log files to the following location:
[9.5.0_Install_Directory]\tomcat\logs\exporttopc_cli_logs
You must have write access to the log directory. If you do not have write access, Informatica 9.5.0 displays a warning message to state that no logs are stored for the export.
Projects
Effective in version 9.5.0, Model repository projects include the following changes:
You can share project contents by assigning permissions to users and groups. You can assign read, write, and grant permissions when you create or edit a project. Previously, to share project contents, you created a shared project. When you upgrade a shared project to version 9.5.0, all domain users inherit read permission on the project.
The Developer tool hides the projects that you do not have read permission on.
Previously, the Developer tool displayed all projects regardless of project permissions.
Scorecards
Effective in version 9.5.0, you must migrate scorecard results from version 9.1.0 before you can use existing scorecards. To view the results of existing scorecards, run the infacmd ps migrateScorecards command.
Web Services
This section describes changes to web services.
Fault Transformation
Effective in version 9.5.0, you can configure a Fault transformation to return a generic error message when the error is not defined by a fault element in the WSDL. When you create a Fault transformation for a generic fault in a web service, you must define the operation mapping logic that returns the error condition. Previously, you could create a Fault transformation to return a predefined fault from a fault element. The web service used a fault element to define the fault. You could configure a Fault transformation to return a custom error message.
Fault Terminology
Effective in version 9.5.0, the fault handling terminology changed. Faults can be of the following types:
- System-defined
- User-defined: predefined or generic
- When you delete an element, the Developer tool retains the associated port. Previously, the Developer tool deleted the associated port.
When you change an element type from simple to complex, the Developer tool marks the port as not valid.
Previously, the Developer tool cleared the location of the associated port.
SOAP 1.2
Effective in version 9.5.0, the following changes are implemented for SOAP 1.2:
Each web service can have one or more operations that use either a SOAP 1.1 binding or a SOAP 1.2 binding.
The SOAP request can be of SOAP 1.1 or SOAP 1.2 format. The SOAP request is based on the type of binding
that is used by the binding operation associated with the operation mapping.
When you create a fault in an operation that has a SOAP 1.2 binding, the wizard creates the code, reason,
node, and role elements. Previously, you could only create an operation with a SOAP 1.1 binding and create a fault in an operation with a SOAP 1.1 binding.
CHAPTER 5
IntelliScript Editor
Effective in version 9.5.0, the IntelliScript editor opens in the Developer tool. The IntelliScript editor uses new icons and a new font. Script components now appear in black letters. Previously, the IntelliScript editor opened in Data Transformation Studio. Script components appeared in brown letters. The IntelliScript editor also had a separate panel to display the example source document for the main input.
Model Repository
Effective in version 9.5.0, you store schemas, example sources, and other project files in the Model repository. You can also import a Data Transformation project or service into the Model repository. Previously, you stored schemas, example sources, and project files in a workspace folder in the file system. You could import projects or services into the workspace folder, or you could copy the files manually to the workspace folder.
Script Objects
Effective in version 9.5.0, you create scripts in the Developer tool. Previously, you used Data Transformation Studio to create TGP file scripts.
Transformation
Effective in version 9.5.0, transformations include the following changes:
- Data Transformation moved to the Informatica platform. You can create a Data Processor transformation in the Developer tool. Create scripts and XMap objects in the transformation instead of in Data Transformation Studio.
- Set the startup component of a Data Processor transformation in the Overview tab. If the startup component is a component of a script, you can set it in the IntelliScript editor. Previously, you could set the startup component of a TGP file in the IntelliScript editor.
- Add a schema object to a project in the Model repository, and then reference a schema in the Data Processor transformation. Previously, you added a schema to the XSD node in the Data Transformation Explorer view of Data Transformation Studio.
- The Output panel of the Data Viewer view displays the main output of a Data Processor transformation or the output of an additional output port. Previously, you could view the main output and output for additional output ports in a separate view in the editor area.
- View an example source in the Input panel of the Data Viewer view. You can view example source data for the main input and for additional input ports. Previously, you could view the example sources for the main input in the Input panel of the IntelliScript editor, and you could view the example source for additional input ports in a separate view.
- Configure document encodings and other settings in the Data Processor transformation Settings view. Previously, you configured document encodings and other settings in the Studio project properties.
- You can no longer use the VarPostData, VarFormAction, and VarFormData system variables. The IntelliScript editor continues to display them in scripts that were created in previous Data Transformation versions.
Views
Effective in version 9.5.0, views in the Developer tool replaced views in Data Transformation Studio. The following list describes the Developer tool views, the Studio views they replace, and the Studio views that have no Developer tool equivalent:
- Data Processor Events (replaces the Studio Events view). Shows information about events that occur when you run the transformation. Shows initialization, execution, and summary events.
- Help. Displays context-sensitive help for components and properties selected in the IntelliScript editor.
- Binary Source. Displays the example source documents in hexadecimal format.
- Data Viewer (replaces the IntelliScript editor example panel and other components). View example input data, run the transformation, and view output results.
- Help (no Studio equivalent). Displays context-sensitive help for tabs selected in the Data Processor transformation.
- Objects (no Studio equivalent). Add, modify, or delete script and XMap objects from the transformation.
- Overview (no Studio equivalent). Configure ports and define the startup component.
- References (no Studio equivalent). Add or remove schemas from the transformation.
- Settings (no Studio equivalent). Configure transformation settings for encoding, output control, and XML generation.
- Problems (replaces the Studio Validation Log view). Displays details of syntax errors in the Data Transformation project or Data Processor transformation.
- Studio Data Transformation Explorer view (no Developer tool equivalent). Displayed all project files in a hierarchical tree.
- Studio Schema view (no Developer tool equivalent). Displayed all schemas available in the project, together with system variables and user-defined variables.
- Studio Repository view (no Developer tool equivalent). Displayed information about services in the ServiceDB folder on the local computer.
- Studio Component view (no Developer tool equivalent). Displayed all the components of a TGP file in a hierarchical tree.
- Studio IntelliScript Assistant view (no Developer tool equivalent). Displayed additional information about the value of the component or property selected in the IntelliScript editor.
XML Validation
Effective in version 9.5.0, the lexical space of the simple type gmonth is --MM, in accordance with W3C erratum E2-12. Previously, the lexical space of the simple type gmonth was --MM--, in accordance with the original W3C XML Schema recommendation.
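A quick way to see the difference is to check both lexical forms against simplified patterns (a sketch in Python; the regular expressions cover only the month core and ignore the optional timezone designators that the XML Schema datatypes allow):

```python
import re

# Lexical space per W3C erratum E2-12 (used effective in 9.5.0): --MM
GMONTH_NEW = re.compile(r"^--(0[1-9]|1[0-2])$")
# Lexical space per the original W3C XML Schema recommendation: --MM--
GMONTH_OLD = re.compile(r"^--(0[1-9]|1[0-2])--$")

print(bool(GMONTH_NEW.match("--04")))    # the new form validates
print(bool(GMONTH_NEW.match("--04--")))  # the old form no longer validates
print(bool(GMONTH_OLD.match("--04--")))  # the old form validated previously
```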
CHAPTER 6
Connection Management
Effective in version 9.5.0, the Data Integration Service identifies each connection by the connection ID. Therefore, you can rename a connection. Previously, the Data Integration Service identified each connection by the connection name. You could not rename a connection. If you upgrade to version 9.5.0, the upgrade process sets the connection ID for each connection to the connection name.
Pass-through Security
Effective in version 9.5.0, you configure pass-through security in the connection properties of a domain. Previously, you configured pass-through security in the Data Integration Service.
Web Services
Effective in version 9.5.0, you can set the keepalive interval for web services through the Administrator tool. You can also set the keepalive interval through the following infacmd command options:
- infacmd dis UpdateServiceOptions command, WSServiceOptions.DTMKeepAliveTime option
- infacmd ws UpdateWebServiceOptions command, WebServiceOptions.DTMKeepAliveTime option
Previously, you set the keepalive interval for all web services through the infacmd dis UpdateServiceOptions command, WebServiceOptions.DTMKeepAlive option. If you created scripts that use this command option, you must update the scripts.
CHAPTER 7
PowerCenter (9.5)
This chapter includes the following topics:
- Pushdown Optimization
- Exporting Metadata to Excel
Pushdown Optimization
Effective in version 9.5.0, you can disable the creation of temporary views for pushdown optimization to Teradata when the Source Qualifier transformation contains a source filter, user-defined join, or SQL override. Previously, pushdown optimization to a Teradata database created and dropped views when the Source Qualifier transformation contained a source filter, user-defined join, or SQL override.
CHAPTER 8
For example, if the long name option in a command is user, specify --user <user name> instead of -user <user name>.
Resource Types
Some of the resource types in Metadata Manager 9.1.0 are deprecated in Metadata Manager 9.5.0. The following resource types are deprecated:
- Business Objects
- Cognos ReportNet
- Microsoft Analysis and Reporting Services
- MicroStrategy
- Oracle Business Intelligence Enterprise Edition
- Erwin
- ERStudio
- Oracle Designer
- Power Designer
- RationalER Generic JDBC Xconnect
When you upgrade to Metadata Manager 9.5.0, Metadata Manager appends (Deprecated_9.5.0) to the Metadata Manager 9.1.0 resource types. You can view resources of the deprecated resource type, but you cannot create or edit resources. You can also view the existing data lineage for the objects of the deprecated resource types. You must create and load resources with the corresponding new resource types in 9.5.0.
After you upgrade to Metadata Manager 9.5.0, the JDBC, ER/win, and Cognos resources are marked as deprecated. Perform resource conversion and load the resources in the following order:
1. Convert the JDBC resource.
2. Load JDBC and Oracle resources.
3. Convert ER/win and Cognos resources.
4. Load PowerCenter, ER/win, and Cognos resources.
5. Recreate any personalization.
6. Delete the deprecated resources.
The changes take effect when you enable the Metadata Manager Service.
CHAPTER 9
Effective in version 9.5.0, you can use PowerExchange for Microsoft Dynamics CRM with claims-based authentication. Previously, you could use PowerExchange for Microsoft Dynamics CRM for on-premise deployment with Active Directory authentication.
Intersect entities are readable and writable.
If the new version of a Salesforce object has a different structure than the previous version of the object, re-import the Salesforce object. After you re-import the object, analyze the associated mapping to determine if you need to update transformations in the mapping. For example, if you re-import a source definition that is based on a Salesforce object that contains a new field, you can modify your mapping to extract the new field and write the data to the target.
Effective in version 9.5.0, you no longer need to install PowerExchange for Teradata Parallel Transporter API separately. Previously, you installed it separately.
CHAPTER 10
After you upgrade to 9.1.0 HotFix 4, an administrator must grant the Access Mapping Specifications and Load Mapping Specification Results privileges from the Administrator tool. After you upgrade to 9.1.0 HotFix 4, users with the License Access for Informatica Analyst privilege will have the Run Profiles and Scorecards privilege. Administrators must grant the Run Profiles and Scorecards privilege to new users.
PowerCenter
This section describes new features and enhancements to PowerCenter.
Profile Export
When you export profile results from the Analyst tool, you can choose to export the complete profile results summary to a Microsoft Excel spreadsheet.
Aggregators
You can add aggregators to a mapping specification in the Analyst tool to perform aggregate calculations on multiple rows of data.
When you add an aggregator, you can perform aggregate calculations on groups of columns or all columns. When you group by columns, you can apply the aggregate conditions and rules to multiple columns. You can include filters, rules, conditional clauses, and nested expressions in an aggregator. You can also add different aggregators to multiple columns.
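The group-by behavior described above can be sketched in plain Python (illustrative only; the rows, column names, and the aggregate helper are invented for the example and are not an Analyst tool API):

```python
from collections import defaultdict

# Hypothetical order data; the column names are illustrative only.
rows = [
    {"region": "East", "product": "A", "amount": 100.0},
    {"region": "East", "product": "B", "amount": 250.0},
    {"region": "West", "product": "A", "amount": 300.0},
    {"region": "West", "product": "B", "amount": 50.0},
]

def aggregate(rows, group_by, agg_column, condition=lambda r: True):
    """Sum agg_column per group_by value, counting only rows that pass the condition."""
    totals = defaultdict(float)
    for row in rows:
        if condition(row):
            totals[row[group_by]] += row[agg_column]
    return dict(totals)

# Aggregate over all rows, grouped by a column.
print(aggregate(rows, "region", "amount"))
# Apply a conditional clause: only rows with an amount above 75.
print(aggregate(rows, "region", "amount", condition=lambda r: r["amount"] > 75))
```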
Export to Excel
You can export the mapping specification logic from the Analyst tool to Microsoft Excel to document and share the mapping specification logic with other analysts and developers. You can then modify the mapping specification with the review feedback.
Data Masking
You can create a Data Masking transformation to transform sensitive production data to realistic test data for nonproduction environments. You can create masked data for software development, testing, training, and data mining. You can maintain data relationships in the masked data and maintain referential integrity between database tables.
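As an illustration of the referential-integrity point, a deterministic hash-based mask keeps a masked foreign key aligned with its masked primary key (a sketch; the mask_id function, seed, and output format are assumptions, not the product's masking algorithm):

```python
import hashlib

def mask_id(value, seed="demo-seed"):
    """Deterministically mask a sensitive key. The same input always yields the
    same masked output, so joins between tables still line up and referential
    integrity is preserved. The seed and digest length are illustrative."""
    digest = hashlib.sha256((seed + value).encode()).hexdigest()
    return "CUST-" + digest[:8].upper()

customers = [{"cust_id": "12345", "name": "Jane Doe"}]
orders = [{"order_id": "O-1", "cust_id": "12345"}]

masked_customers = [{**c, "cust_id": mask_id(c["cust_id"]), "name": "Test User"}
                    for c in customers]
masked_orders = [{**o, "cust_id": mask_id(o["cust_id"])} for o in orders]

# The masked foreign key still matches the masked primary key.
assert masked_orders[0]["cust_id"] == masked_customers[0]["cust_id"]
```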
Web Services
You can create a web service configuration to control the settings the Developer tool applies when you preview the output of an operation mapping or the output of a transformation in the operation mapping.
Informatica Domain
This section describes new features and enhancements to the Informatica domain.
Secure Connections
You can configure the Reporting and Dashboards Service to use secure communications.
Metadata Manager
This section describes new features and enhancements in version 9.1.0 HotFix 3.
PowerCenter
This section describes new features and enhancements to PowerCenter.
Aggregators
You can add aggregators to a mapping specification in the Analyst tool to perform aggregate calculations on multiple rows of data. When you add an aggregator, you can perform aggregate calculations on groups of columns or all columns. When you group by columns, you can apply the aggregate conditions and rules to multiple columns. You can include filters, rules, conditional clauses, and nested expressions in an aggregator. You can also add different aggregators to multiple columns.
Data Masking
You can configure dependent data masking for a source column. With dependent masking, the Data Masking transformation masks more than one column of source data from the same row of data in a dictionary. You can maintain a relationship between the columns of source data, such as the relationship between city and state. You can configure data masking for Social Insurance numbers (SIN). Select the SIN masking type when you configure masking for the source SIN.
Export to Excel
You can export the mapping specification logic from the Analyst tool to Microsoft Excel to document and share the mapping specification logic with other analysts and developers. You can then modify the mapping specification with the review feedback.
Overlap Discovery
You can determine the percentage of overlapping data between two columns within one or more data sources. You run the overlap discovery function from a profile model in Informatica Developer. You can validate the results that appear in a Venn diagram.
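Column overlap can be illustrated with a set-based calculation (a sketch; the overlap_percentage helper and its choice of denominator are assumptions, and the product's exact formula is not specified here):

```python
def overlap_percentage(col_a, col_b):
    """Percentage of distinct values shared by two columns, relative to the
    smaller column's distinct value count (one common way to express overlap)."""
    a, b = set(col_a), set(col_b)
    if not a or not b:
        return 0.0
    return 100.0 * len(a & b) / min(len(a), len(b))

# Hypothetical columns from two data sources.
ship_to = ["US", "CA", "MX", "BR"]
bill_to = ["US", "CA", "FR"]
print(overlap_percentage(ship_to, bill_to))  # 2 shared values out of 3 distinct
```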
The transformation can return the following information for an address:
- The census areas the address belongs to, and whether the address is a home or business.
- The preferred name of a locality in Canada or the United States, where a preferred locality name exists. For example, the transformation recognizes "North York" as a locality name in Canada, but it can return "Toronto" as the preferred locality name.
- The address type in a Canadian or United States address.
- The short forms of United States street and locality names, when the address reference data contains the short forms. Use the Element Abbreviation option to add the short forms to the address record.
Regional Accelerators
The following regional accelerators include prebuilt identity matching mapplets:
- Informatica Data Quality Accelerator for Australia and New Zealand
- Informatica Data Quality Accelerator for Brazil
- Informatica Data Quality Accelerator for France
- Informatica Data Quality Accelerator for Germany
- Informatica Data Quality Accelerator for Portugal
- Informatica Data Quality Accelerator for United Kingdom
- Informatica Data Quality Accelerator for US and Canada
Each regional accelerator includes mapplets that perform the following identity matching operations:
- Company name and address matching
- Family name and address matching
- Individual name and address matching
- Person name and data matching
Templates
A template is a set of repository objects that you can use to create and run business intelligence reports from Informatica Data Quality applications. Use templates to quickly gain insights into the data quality of project data. A template contains mappings, mapplet rules, profile objects, and Informatica reference data. Templates use logical data objects to enable rapid implementation. Informatica users can download the following templates:
- CRM Template. Use the Customer Relationship Management (CRM) Template to measure and report on the customer data processing issues associated with CRM implementations and data migrations.
- Dashboard and Reporting Template. Use the Dashboard and Reporting Template to add data quality metrics to enterprise business intelligence reports. The template lets you apply data quality metrics across multiple dimensions for cross-organization and data entity reports.
Web Services
- You can create an operation from a flat file data object or relational data object. When you create an operation from a flat file data object or relational data object, the operation performs a lookup on the data object.
- You can update the maximum number of occurrences for elements in operations that you create from a reusable object.
Metadata Manager
This section describes new features and enhancements to Metadata Manager.
IMM.Properties
The following properties were added to the imm.properties file:
- shortcut.classes. Lists the classes that are shortcuts. When you add a shortcut class to this list of classes, children of that shortcut appear in the Catalog tree.
- ImpactSummary.MaxObjects. Sets the maximum number of objects to appear in the impact summary user interface.
- ElementFetch.ParamSize. Sets the number of elements that Metadata Manager processes to calculate the impact summary.
New commands include restoreconfiguration, a command that lists all models in Metadata Manager, and a command that retrieves all the source files associated with a resource.
The createresource, updateresource, assignconnection, and assignparameterfile commands include a new -pdir option, which specifies the directory in which the source files associated with a resource are located.
Address Validation
Data Quality address validation meets the certification standards of the Address Matching Approval System (AMAS) of Australia Post.
Transformations
Address Validator Transformation
- The transformation includes output ports that enable address validation to the Australia Post AMAS certification standard.
- The transformation includes ports that provide additional information on United States addresses that are validated to the Coding Accuracy Support System (CASS) certification standard of the United States Postal Service.
- The transformation includes ports that provide additional information on Canadian addresses that are validated to the Software Evaluation and Recognition Program (SERP) certification standard of Canada Post.
- You can use the Address Validator transformation to generate a report for address data that meets the Australia Post AMAS certification standard.

Exception Transformation
You can configure the Exception transformation to append records to the exception table.

Labeler Transformation
- The Labeler transformation includes a search feature in the token set and character set wizards. You can search name, description, and tag metadata by entering text that contains all or part of the metadata string.
- When you use the Labeler transformation to select objects from a content set, you can override the default label.
Parser Transformation The Parser transformation includes a search feature in the token set and character set wizards. You can search name, description, and tag metadata by entering text that contains all or part of the metadata string.
Schema Objects
You can view global attributes on the Schema view of a schema object.
Web Services
- You can create an operation from a reusable mapplet, reusable transformation, or reusable logical data object. When you create an operation from a reusable logical data object, the operation performs a lookup on the data in the logical data object.
- The Developer tool extracts nodes in the first level of the operation hierarchy to ports when you choose to extract the first level of the hierarchy. The Developer tool also creates the ports to perform the extraction.
- When you extract nodes from the operation input, you can extract the complete SOAP request as XML instead of returning groups of relational data in separate output ports. When you extract ports to the operation output and operation fault, you can extract XML data from one string or text input port to the entire SOAP response.
- You can configure a SOAP message from a WSDL or schema that contains derived types, anyTypes, and substitution groups. You must choose the types that can appear in the data.
- You can create a SOAP message from denormalized input data.
- You can configure composite keys in a SOAP message by extracting multiple ports to the same key.
- You can preview the output of an operation mapping. If the preview fails, a system-defined fault displays in the Output area of the Data Viewer view.
Informatica Domain
This section describes new features and enhancements to the Informatica domain.
Connection Permissions
You can view connection permission details for a user or group. When you view permission details, you can view the origin of effective permissions. Permission details display direct permissions assigned to the user or group and direct permissions assigned to groups that the user or group belongs to. In addition, permission details display whether the user or group is assigned the Administrator role which bypasses the permission check.
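The effective-permission rules above can be modeled roughly as follows (an illustrative sketch; the function and permission names are invented and are not an Administrator tool API):

```python
def effective_permissions(direct_perms, group_memberships, group_perms, is_administrator):
    """Compute a user's effective connection permissions: direct permissions
    plus the permissions of every group the user belongs to, with the
    Administrator role bypassing the permission check entirely."""
    if is_administrator:
        return {"ALL"}  # the Administrator role bypasses the check
    perms = set(direct_perms)
    for group in group_memberships:
        perms |= set(group_perms.get(group, ()))
    return perms

# Hypothetical groups and their direct permissions.
group_perms = {"analysts": {"read"}, "developers": {"read", "write"}}
print(effective_permissions({"execute"}, ["analysts"], group_perms, False))
```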
Monitoring Reports
You can view monitoring reports, including Longest Duration SQL Data Service Connections, Most Frequent Errors for SQL Data Service Requests, and Most Frequent Faults for Web Service Requests. The reports show the following information:
- SQL data service connections that were open the longest during the specified time period.
- SQL data service requests that ran the longest during the specified time period.
- Web service requests that were open the longest during the specified time period.
- The total number of SQL data service and web service requests during the specified time period.
- The total number of SQL data service requests from each IP address.
- The most frequent errors for jobs, regardless of job type.
- The most frequent errors for SQL data service requests.
- The most frequent faults for web service requests.
infacmd
With infacmd rds commands, you can create a Reporting and Dashboards Service and list service process options. The new infacmd rds commands are:
- CreateService. Creates a Reporting and Dashboards Service in a domain.
- ListServiceProcessOptions. Lists the Reporting and Dashboards Service process options.
The infacmd cms CreateService command includes a new -ds option, which specifies the name of the Data Integration Service to assign to the Content Management Service.
Metadata Manager
This section describes new features and enhancements to Metadata Manager.
The following new Metadata Manager commands are available:
- encrypt. Encrypts the text you specify. You can specify the encrypted text when you use the -ep option in a command.
- getServiceLog. Exports the service log file from Metadata Manager for a specified date.
The following new Metadata Manager options are available for the command line programs:
- assignConnection, -s option. During refresh, skips retrieving the connection objects you specify in the resource configuration file.
- -ep option, available in multiple commands. Specifies the encrypted password generated with the encrypt command.
Version 9.1.0
This section describes new features and enhancements in version 9.1.0.
Profiles
- You can create and run a profile on bad record tables and duplicate record tables in the Analyst tool.
- You can create and run profiles to identify primary keys, foreign keys, and functional dependencies between columns in data objects. You can also define relationships between columns, and profile the columns to verify the relationship.
- You can create a data model of the data objects that you want to profile. You can create and run profiles on the data model.
- You can replace the data object in a profile and run the profile on the new data object without editing the profile parameters. The old and new data object must be the same type and must have the same data structure.
- You can use a profile to infer the Date type for source data values.
- You can apply one or more filters to a profile.
- You can apply one or more filters to drilldown data in the Analyst tool.
- You can create profiles for multiple data objects in a single operation. The profiles return separate results for each data object.
- You can run a profile on a mapping or mapplet object to review the changes that the mapping can make to the source data.
- You can run profiling reports that include column profiling statistics and their summary from Data Analyzer.
Reference Tables
- You can use the Analyst tool to create reference tables that store data in databases that you specify. Reference tables that store data in the staging database are called managed reference tables. Unmanaged reference tables store data in user-specified databases.
- When you use Informatica Developer to import and export reference tables, you specify file paths recognized by the local file system. Previously, when you used the Developer tool to import and export reference tables, you specified directories recognized by the machine that ran the Data Integration Service.
- You can apply a text filter when you search the Model repository for a reference table from a data quality transformation. The filter narrows your search to reference table names that meet your filter criteria.
- When you use the infacmd oie exportObjects command to export Model repository objects, you can include the reference tables associated with these objects. The command exports reference table data from the staging database into a .zip file. When you run the infacmd oie importObjects command to import the Model repository objects, the command writes reference table data from the .zip file into the staging database.
Tags
A tag is metadata that defines an object in the Model repository based on business usage. Create tags to group objects according to their business usage. Use a tag to informally define an object in the Model repository. Create a tag and assign it to multiple objects in the Model repository. You can also search for objects by a tag.
Accelerators
Accelerators are content bundles that contain rules, reference tables, demonstration mappings, and demonstration data objects. Each accelerator provides solutions to common data quality issues in a country, region, or industry. The Data Quality Content installer includes the Informatica Data Quality Core Accelerator, which contains general data quality rules. You can purchase the following accelerators separately:
- Informatica Data Quality Accelerator for Australia and New Zealand
- Informatica Data Quality Accelerator for Brazil
- Informatica Data Quality Accelerator for Financial Services
- Informatica Data Quality Accelerator for Portugal
- Informatica Data Quality Accelerator for United Kingdom
- Informatica Data Quality Accelerator for US and Canada
Content Sets
A content set is a Model repository object that you use to store reusable, user-defined expressions. These expressions include pattern sets, character sets, token sets, and regular expressions. When you configure a Labeler or Parser transformation, you can choose to include reusable expressions from a content set. Create content sets in the Developer tool.
Exception Management
You can perform the following exception management tasks in the Analyst tool:
- Apply status and priority filters to rows in a bad record table.
- Save changes to multiple rows at a time in a bad record or duplicate record table.
- View the previous version of a data value in an audit trail table.
Object Deployment
The process to export repository objects to PowerCenter resolves conflicts and dependencies.
Profiles
- You can create and run a profile on bad record tables and duplicate record tables in the Analyst tool.
- You can replace the data object in a profile and run the profile on the new data object without editing the profile parameters. The old and new data object must be the same type and must have the same data structure.
- You can use a profile to infer the Date type for source data values.
- You can apply one or more filters to a profile.
- You can apply one or more filters to drilldown data in the Analyst tool.
- You can create profiles for multiple data objects in a single operation. The profiles return separate results for each data object.
- You can run a profile on a mapping or mapplet object to review the changes that the mapping can make to the source data.
- You can run profiling reports that include column profiling statistics and their summary from Data Analyzer.
Reference Tables
- You can use the Analyst tool to create reference tables that store data in databases that you specify. Reference tables that store data in the staging database are called managed reference tables. Unmanaged reference tables store data in user-specified databases.
- When you use Informatica Developer to import and export reference tables, you specify file paths recognized by the local file system. Previously, when you used the Developer tool to import and export reference tables, you specified directories recognized by the machine that ran the Data Integration Service.
- You can apply a text filter when you search the Model repository for a reference table from a data quality transformation. The filter narrows your search to reference table names that meet your filter criteria.
- When you use the infacmd oie exportObjects command to export Model repository objects, you can include the reference table data associated with these objects. The command exports reference table data from the staging database into a .zip file on the Developer client machine. When you run the infacmd oie importObjects command to import the Model repository objects, the command writes reference table data from the .zip file into the staging database.
Tags
A tag is metadata that defines an object in the Model repository based on business usage. Create tags to group objects according to their business usage. Use a tag to informally define an object in the Model repository. Create a tag and assign it to multiple objects in the Model repository. You can also search for objects by a tag.
Transformations
Address Validator Transformation
- You can configure the Address Validator transformation in Suggestion List mode. Use this mode to find all possible matches for an input address in the reference data. Suggestion List mode works for partial and complete addresses.
- You can define parameters for default country, line separator, casing style, and mode.

Association Transformation
- You can set the minimum amount of cache memory that the Association transformation uses. If you enter a cache memory value that is lower than 65536, the Association transformation reads the value in megabytes.
- The Association transformation generates log entries while mappings are running.

Comparison Transformation
You can use custom-built identity population files when you perform identity operations.

Consolidation Transformation
- The Consolidation transformation can use row-based consolidation strategies. The transformation uses these strategies to choose a single cluster row. The transformation populates the master row with the data values from the chosen row.
- The Consolidation transformation can use custom consolidation strategies that you define.

Decision Transformation
You can configure the Decision transformation to recognize input NULL values.

Exception Transformation
The Exception transformation creates database tables that you can review for data quality issues in the Analyst tool. Use the Exception transformation in a mapping to create tables that identify poor quality or duplicate records using conditions that you specify. Use the Analyst tool to correct bad records or consolidate duplicate records in the tables.

Java Transformation
You can implement the resetNotification method in a Java transformation. When the Data Integration Service machine runs in restart mode, this method resets variables that you use in the Java code after a mapping run.

Match Transformation
- You can analyze a Match transformation to preview the number of computations that the transformation will perform.
- You can analyze a Match transformation to preview the size and number of clusters the transformation will create.
- You can define partitioned identity match operations to improve identity match operation performance.
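The Association transformation cache rule mentioned above can be sketched as follows (illustrative; the source states only that values below 65536 are read as megabytes, so treating larger values as bytes is an assumption, and the function name is invented):

```python
def cache_size_in_bytes(value):
    """Interpret an Association transformation cache setting.
    Values below 65536 are read as megabytes; larger values are
    assumed here to be taken literally as bytes."""
    if value < 65536:
        return value * 1024 * 1024  # read as megabytes
    return value                    # assumed: read as bytes

print(cache_size_in_bytes(400))      # 400 MB expressed in bytes
print(cache_size_in_bytes(1048576))  # large value taken as bytes
```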
Business Terms
- You can use a Metadata Manager business term in the Analyst tool to search for objects in the Metadata Manager repository. You can select Metadata Manager objects from the search results and import them as tables in the Analyst tool. You can use these tables as sources for profiles or mapping specifications.
- You can access Metadata Manager and the Metadata Manager Business Glossary from the Analyst tool to manage business terms. You can view or edit business terms in the Metadata Manager Business Glossary.
Mapping Specification
A mapping specification is an object in the Model repository that describes the movement and transformation of data from a source to a target. Use a mapping specification to define business logic that populates a target table with data that you can leverage to report on the target table.
- You can create a mapping specification in the Analyst tool to transform and move data from the source to the target.
- You can configure the sources, target, rules, filters, and joins to transform the data in a mapping specification.
- You can load the results of a mapping specification to a target.
Monitoring
You can access the Monitoring tool from the Developer and Analyst tools to monitor the status of applications and jobs, such as a profile job.
Use an infacmd control file to complete the following tasks during an export or import process:
- Filter the objects that are exported or imported.
- Configure conflict resolution strategies for specific object types or objects.
- Map connections in the source repository to connections in the target repository.
Performance Tuning
Mapping Performance
You can improve mapping performance with the cost-based optimization method. The Data Integration Service can evaluate a mapping, generate semantically equivalent mappings, and run the mapping with the best performance. Cost-based optimization is most effective for mappings that contain multiple Joiner transformations. The Data Integration Service applies cost-based optimization when you select the full optimizer level.

Pushdown Optimization
- The Data Integration Service can push Expression and Joiner transformation logic to the source database.
- The Data Integration Service can push transformation logic to IBM DB2 for i5/OS, DB2 for LUW, and DB2 for z/OS sources when expressions contain supported functions with the following logic:
  - TO_BIGINT includes more than one argument.
  - TO_CHAR converts a date to a character string without the format argument.
  - TO_DATE converts a character string to a date without the format argument.
  - TO_DECIMAL converts a string to a decimal value.
  - TO_INTEGER includes more than one argument.
Schema Objects
You can import a schema and store it as a schema object in the repository. When you create a web service, you can define input, output, and fault signatures from the schema types.

WSDL Data Objects
Import a WSDL file to create a WSDL data object. You can use a WSDL data object to create a web service or a Web Service Consumer transformation.
Tags
A tag is metadata that defines an object in the Model repository based on business usage. Create tags to group objects according to their business usage. Use a tag to informally define an object in the Model repository. Create a tag and assign it to multiple objects in the Model repository. You can also search for objects by a tag.
Version 9.1.0
Transformations
Java Transformation
You can implement the resetNotification method in a Java transformation. When the Data Integration Service machine runs in restart mode, this method resets variables that you use in the Java code after a mapping run.
Lookup Transformation
The Lookup transformation can perform a lookup on a logical data object. The transformation can return one row or it can return multiple rows. You can configure the Lookup transformation to perform the lookup in a web service operation mapping.
Web Service Consumer Transformation
The Web Service Consumer transformation consumes web services in a mapping. The transformation can consume an Informatica web service or an external web service. The transformation returns related groups of output data from hierarchical SOAP response messages. Create a Web Service Consumer transformation from a WSDL data object.
Web Services
Informatica web services provide data integration functionality through a web service interface. Create an operation mapping to define how the Data Integration Service processes the web service request. The operation mapping can include logical data objects or transformations. You can create a web service from a WSDL, or you can create a web service without using a WSDL. You can configure message layer security and transport layer security for a web service. Message layer security includes user authentication and user permissions.
Informatica Documentation
This section describes new documentation and enhancements to Informatica documentation.
Informatica Domain
This section describes new features and enhancements to the Informatica domain.
Connections
If you have PowerExchange, you can create an Adabas connection.
Data Transformation
You can install Data Transformation Engine and Data Transformation Studio through the Informatica platform installer. When you run the server installation, you can install or upgrade Informatica or you can install only Data Transformation Engine. When you run the client installation, you can install Informatica Developer, the PowerCenter Client, and Data Transformation Studio and Engine.
Dependencies
In the Services and Nodes view on the Domain tab, you can now view dependencies among application services, nodes, and grids.
Monitoring
You can configure the Service Manager to store historical run-time statistics about objects that run on a Data Integration Service. The Service Manager stores the statistics in the Model repository. You can view the statistics and reports in the Monitoring tab of the Administrator tool for different objects, such as applications, web services, logical data objects, and SQL data services. For example, you can view a report to determine the longest running jobs. You can also monitor objects from the Analyst tool and Developer tool.
You can generate a readable XML file to determine if you need to filter the objects that you import.
infacmd isp ImportDomainObjects. Imports native users, native groups, roles, and connections into an Informatica domain. If you do not want to migrate all objects, use an infacmd control file to filter the objects during the export or import.
Permissions
Origin of Effective Permissions
You can view domain object, SQL data service, or web service permission details for a user or group. When you view permission details, you can view the origin of effective permissions. Permission details display direct permissions assigned to the user or group, direct permissions assigned to groups that the user or group belongs to, and permissions inherited from parent objects. In addition, permission details display whether the user or group is assigned the Administrator role, which bypasses permission checking.
Search Filters
You can use search filters to search for a user or group when you assign permissions, view permission details, or edit permissions.
Privileges
You can assign the following new types of privileges:
Connection Privileges
Assign the Manage Connection privilege to enable a user or group to create, edit, and remove connections.
Monitoring Privileges
Assign the monitoring privileges to enable a user or group to configure and view historical run-time statistics and reports. Sample monitoring privileges include Configure Global Settings, Configure Statistics and Reports, and Access Monitoring.
Profiling Privileges
Assign the Drilldown and Export Results privilege to enable a user to drill down or export profiling results.
Secure Communication
To configure services to use the Transport Layer Security (TLS) protocol to transfer data securely within the domain, enable the TLS protocol for the domain. When you enable TLS for the domain, services use TLS connections to communicate with other Informatica application services and clients.
Views
The Domain tab now has new views:
- Services and Nodes view. View and manage services and nodes in the domain.
- Connections view. View and manage connections in the domain.
infacmd
With infacmd cms commands, you can create and remove a Content Management Service. The following table describes new infacmd cms commands:
- CreateService. Creates the Content Management Service.
- RemoveService. Removes the Content Management Service.
Other new or updated infacmd commands include ImportDomainObjects, removeUserPermission, and removeGroupPermission.
With infacmd ws commands, you can manage web services. The following table describes new infacmd ws commands:
- ListOperationOptions. Lists operation options.
- ListOperationPermissions. Lists operation permissions.
- ListWebServiceOptions. Lists web service options.
- ListWebServicePermissions. Lists web service permissions.
- ListWebServices. Lists the web services in an application. If the application name is not provided, all web services are listed.
- RenameWebService. Renames a web service.
- SetOperationPermissions. Sets operation permissions.
- SetWebServicePermissions. Sets web service permissions.
- StartWebService. Starts a web service so it can receive web service requests.
- StopWebService. Stops a web service.
With infacmd xrf commands, you can generate a readable XML file from an export file. You can also edit the readable XML file and update the changes in the export file. The following table describes new infacmd xrf commands:
- GenerateReadableViewXML. Generates a readable XML file from the export file.
- UpdateExportXML. Updates the export file with the changes made to the readable XML file.
New options for infacmd commands include -ConflictResolution, -ControlFilePath, and -pf.
The infacmd ps Purge command includes options to specify the name of the profile task, to purge results from folders recursively, and to purge all results for a specified profile or scorecard from the Profiling Warehouse.
infasetup
The following infasetup commands are updated to enable Transport Layer Security (TLS):
- DefineDomain
- DefineGatewayNode
- DefineWorkerNode
- UpdateGatewayNode
- UpdateWorkerNode
pmrep
The following table describes new pmrep commands:
- GenerateAbapProgramToFile. Generates the ABAP program for a mapping and saves the program as a file.
- InstallAbapProgram. Generates and installs an ABAP program in the SAP system.
- UninstallAbapProgram. Uninstalls the ABAP program. Uninstall an ABAP program when you no longer want to associate the program with a mapping.
Metadata Manager
This section describes new features and enhancements to Metadata Manager.
Business Glossary
When you create or edit a business term, you can add hyperlinks to any other business term in the same or different business glossary. You can also provide links to external web pages as a reference to a business term. These internal and external links help you to browse through related business terms in the business glossary.
Class Properties
You can organize the way the class properties display. When you edit the properties, you can drag them to change their order or to ensure that they appear in either the Basic or Advanced section of the class properties in all Metadata Manager perspectives. For a class, the Source Creation Date, Source Update Date, MM Creation Date, and MM Update Date properties are referred to as the synthetic date properties. You can set the Show_Synthetic_Dates_In_Basic_Section property in the imm.properties file to specify whether these properties appear in the Basic or Advanced section.
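A minimal imm.properties entry might look like the following sketch. The property name comes from the text above; the value shown is an assumption, so check the imm.properties documentation for the accepted values.

```
# Place the synthetic date properties in the Basic section (value is illustrative).
Show_Synthetic_Dates_In_Basic_Section=true
```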
JDBC Resource
You can create and configure a JDBC resource to extract metadata from any relational database management system that is accessible through JDBC. You can create a JDBC resource for any relational database that supports JDBC. Informatica has tested the JDBC resource for IBM DB2/iSeries. You cannot connect to relational databases through ODBC. Where available, you should use the existing database resource types specific to that relational database instead of the JDBC resource. The database-specific resource types perform better and extract more metadata aspects. For example, to load metadata from an Oracle database, create an Oracle resource instead of creating a JDBC resource.
Gathering Statistics
You can gather statistics for DB2 resources when the GatherStatistics property in the imm.properties file is set to Yes.
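For example, to enable statistics gathering for DB2 resources, set the property in imm.properties as described above:

```
# imm.properties: gather statistics for DB2 resources during the load.
GatherStatistics=Yes
```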
PowerCenter
This section describes new features and enhancements to PowerCenter.
PowerCenter Repository
You can create a PowerCenter repository on Sybase ASE.
Data Analyzer
This section describes new features and enhancements to Data Analyzer.
You can configure the amount of time that the SIP Server waits for the PowerCenter Integration Service to connect to the SPI Server.
You can extract data from a Teradata source or load data to a Teradata target when the PowerCenter Integration Service runs on SUSE Linux Enterprise Server 11 and you use Teradata Parallel Transporter 13.10.
CHAPTER 11
Profiling Warehouse
Effective in 9.1.0, you purge result data for profiles or scorecards from the profiling warehouse. Run the infacmd ps purge command to purge the result data. Previously, the profiling warehouse purged result data when the Data Integration Service was idle.
CHAPTER 12
Association Transformation
Effective in 9.1.0, Association transformation behavior changes in the following ways:
- The Association transformation accepts string and numerical values on association ports. If you configure a column of another data type as an association port, the transformation converts the port data values to strings. Previously, the Association transformation accepted string data only on association ports.
- The association ID output is hard-coded as an integer. Previously, the association ID output was hard-coded as a string. If you upgrade to version 9.1.0, the Association transformation preserves the string data type on the association ID port and writes the port output values as strings.
Run the Data Quality Content Installer on a machine that the Data Integration Service can access. If you are a PowerCenter user, run the installer on a machine that the PowerCenter Integration Service can access. Previously, the Data Quality Content Installer had an additional installer that you ran on the Informatica Developer machine or PowerCenter Designer machine.
The Data Quality Content Installer can install address validation reference data, identity population data, and sample data sources. Previously, the Content Installer also installed reference table data and Informatica rules. In Informatica 9.1.0, you use the Developer tool to import reference table data and Informatica rules to the Model repository.
Users create data quality mappings in Informatica Developer 9 and export these mappings to PowerCenter for use with Data Quality for Siebel. Previously, users created data quality plans in Data Quality 8.6 before exporting to PowerCenter.
Data Quality for Siebel uses Address Doctor for all realtime and batch address validation. Previously, Data Quality for Siebel used specific vendors for particular realtime or batch scenarios.
Decision Transformation
Effective in 9.1.0 HotFix 2, when you use a Decision transformation in a web service operation, the DTM resources remain in memory between data requests. Previously, a web service operation that included a Decision transformation stopped and started the DTM for each data request.
Effective in version 9.1.0, the Decision transformation implements the function INSTR(str1, str2) by looking for str2 in str1. Previously, the Decision transformation looked for str1 in str2. If you configured a Decision transformation with this function in Informatica 9.0.1, edit the transformation to reverse the order of the strings in the function.
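The direction of this change can be sketched in Python. The functions instr_901 and instr_910 below are illustrative stand-ins for the previous and current behavior, not Informatica functions.

```python
def instr_910(str1: str, str2: str) -> int:
    """9.1.0 behavior: 1-based position of str2 in str1, or 0 if absent."""
    return str1.find(str2) + 1

def instr_901(str1: str, str2: str) -> int:
    """9.0.1 behavior: the same search with the arguments reversed."""
    return str2.find(str1) + 1

# A transformation configured in 9.0.1 keeps its result in 9.1.0
# only after the order of the strings is reversed.
assert instr_901("MA", "INFORMATICA") == instr_910("INFORMATICA", "MA")
```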
Exception Transformation
Effective in version 9.1.0 HotFix 2, the Exception transformation contains the following change: Column names that the Exception transformation writes to a staging table do not use the prefix "Ex_". Previously, all column names contained the "Ex_" prefix.
Effective in version 9.1.0 HotFix 1, the Exception transformation contains the following changes:
- The Consolidation exception type is called the Duplicate Record exception. You choose a transformation type name when you create the transformation.
- Option names changed in several locations. On the port group, Labels is renamed Quality Issues. On the Configuration view, From is renamed Lower Threshold. On the Priority view, the view name Priority is renamed Issue Assignment. Other current option names on the Priority view and the Configuration view are Quality Issue Label, Priority, and Upper Threshold. Current names in the Data Routing Options on the Configuration view are Automatic Consolidation Records (Above upper threshold), Potential Matches, and Unique Records.
- Exception transformations that identify duplicate records now write clusters with only one record to the unique record category. Previously, the transformation wrote these records to the definite matches category.
- The Exception transformation no longer contains Match Score and IsMaster ports in the input and output groups. The removal of these ports does not affect transformation functionality.
- The Score input port is no longer mandatory for Exception transformations that identify bad records. The Exception transformation can identify bad records by determining if quality issue ports contain data.
Export to PowerCenter
Effective in version 9.1.0, the earliest version of PowerCenter to which you can export mappings, mapplets, and logical data object read mappings is PowerCenter 8.6.1. Previously, you could export objects to PowerCenter 8.6.
Match Transformation
Effective in version 9.1.0 HotFix 2, identity match strategies contain primary required fields and secondary required fields. You must assign input ports to all primary required fields. You must assign input ports to at least one secondary required field. Previously, identity match strategies only contained primary required fields. Effective in version 9.1.0 HotFix 1, the Execution Instances property on the Advanced view of the Match transformations is only available for Match transformations that use identity matching. Previously, the Execution Instances property was available for Match transformations that used field matching.
CHAPTER 13
Application Redeployment
Effective in version 9.1.0, when you change an application that contains a mapping, and you redeploy the mapping with the update option, the Data Integration Service preserves the Administrator tool mapping deployment properties. Previously, the Data Integration Service replaced the Administrator tool mapping deployment properties with the Developer tool mapping deployment properties.
Deployment Menus
Effective in version 9.1.0 HotFix 2, based on the object type that you select, the Deploy menu may contain a secondary menu. When you right-click the following objects in the Object Explorer view, the Deploy > Deploy as a web service option appears:
- Flat file physical data objects
- Relational physical data objects
- Logical data objects
- Transformations, except for the Web Service Consumer transformation
- Mapplets
When you right-click the following objects in the Object Explorer view, the Deploy > Deploy as SQL data service option appears:
- Physical data objects
- Logical data objects
Previously, when you right-clicked one of the following objects in the Object Explorer view, the Deploy option appeared with no secondary menu option:
- Physical data objects
- Logical data objects
Export to PowerCenter
Effective in version 9.1.0, the earliest version of PowerCenter to which you can export mappings, mapplets, and logical data object read mappings is PowerCenter 8.6.1. Previously, you could export objects to PowerCenter 8.6.
Web Services
This section describes changes to web services.
Ports Tab
Effective in version 9.1.0 HotFix 1, you use the Ports tab to complete the following tasks:
- Extract operation input nodes to output ports.
- Extract input ports to the operation output nodes.
- Extract input ports to operation fault nodes.
Previously, you extracted input ports to nodes in the operation hierarchy on the Transformation tab in the Properties view of the Output and Fault transformation. You extracted nodes to output ports on the Transformation tab in the Properties view of the Input Transformation.
Deployment
Effective in version 9.1.0 HotFix 2, you can right-click a mapplet, reusable transformation, logical data object, flat file data object, or relational data object and deploy it as a web service. Previously, you created a web service operation for a reusable object from the Create a Web Service wizard and then deployed the web service in an application.
Ports Tab
Effective in version 9.1.0 HotFix 1, you complete the following tasks on the Ports tab:
- Extract input ports to nodes in the input operation hierarchy.
- Extract nodes in the output operation hierarchy to output ports.
Previously, you extracted input ports to nodes in the operation hierarchy on the Transformation Input tab. You extracted nodes to output ports on the Transformation Output tab.
CHAPTER 14
Analyst Service
Effective in version 9.1.0, you can associate a Metadata Manager Service with the Analyst Service. You can also configure the Metadata Manager Service Options in the Analyst Service properties. Associate a Metadata Manager Service with the Analyst Service to connect to the Metadata Manager Business Glossary when searching for business terms in the Analyst tool. When you upgrade to 9.1.0, you can edit the Analyst Service to configure the Metadata Manager Service options. You can select a Metadata Manager Service when you create the Analyst Service.
The Content Management Service includes the following properties:
- Reference Data Location. Location of the address validation reference data.
- Full Pre-Load Countries. List of countries for which all available address reference data is loaded into memory before address validation begins.
Address validation performance is lower for reference data that has not been preloaded. When you upgrade to 9.1.0, create a Content Management Service to update the properties.
Previously, you configured the Maximum Total Disk Size in megabytes and you configured the Maximum Per Cache Memory Size and Maximum Total Memory Size in kilobytes. After you upgrade, the number that you configured for these properties is retained. You must update the number to match the number of bytes that you require.
Domain Management
Effective in version 9.1.0, domain management has changed in the following ways:
- The details panel no longer appears on the Domain tab.
- You manage connections on the Connections view of the Domain tab. Previously, you managed connections from the Manage > Connections menu command.
- You can use both an object name and a time attribute in an objectList element. Previously, you could not use both an object name and a time attribute in an objectList element.
- You do not have to set the select attribute to import all objects of a specified type. Previously, you had to set the select attribute to all to import all objects of the specified type.
The Model Repository Service writes repository backup files to a directory named for the service within the node backup directory. For example, a Model Repository Service named MRS writes repository backup files to the following location:
<node_backup_directory>\MRS
When you view backup files for a Model Repository Service, you can view only the backup files for that service. Previously, the Model Repository Service wrote repository backup files to the node backup directory. When you viewed backup files for a Model Repository Service, you could view all backup files for all Model Repository Services running on the node.
Permissions
Effective in 9.1.0, the Permissions tab for any domain object in the Administrator tool shows up to 1,000 users or groups. If there are more than 1,000 users or groups, a message appears that asks you to create a filter to limit the number of users or groups. Previously, the Permissions tab did not have a limit on the number of users or groups that it could display.
Reports
Effective in version 9.1.0 HotFix 3, you run the PowerCenter reports from the PowerCenter Client and view them in JasperReports Server. You run the Metadata Manager reports from Metadata Manager and view them in JasperReports Server. You must associate a Reporting and Dashboards Service with the PowerCenter Repository Service to view the PowerCenter reports in JasperReports Server. You must associate a Reporting and Dashboards Service with the Metadata Manager Service to view the Metadata Manager reports in JasperReports Server. Previously, you ran the reports using Data Analyzer.
Privileges
Connection Privileges
Effective in version 9.1.0, a user must have the Manage Connection privilege to create, edit, and remove connections. By default, only users with the Administrator role have the Manage Connection privilege. Previously, a user did not need a privilege to create, edit, or remove a connection. If you upgrade to 9.1.0, assign users and groups the Manage Connection privilege to enable them to create, edit, and remove connections.
Profiling Privileges
Effective in version 9.1.0, a user must have the Drilldown and Export Results privilege to drill down and export profiling results. By default, users are not assigned this privilege after you upgrade to 9.1.0. Previously, a user did not need a privilege to drill down or export profiling results. If you upgrade to 9.1.0, assign users and groups the Drilldown and Export Results privilege to enable them to drill down and export profiling results.
CHAPTER 15
The names of the arguments of the Backup and Restore commands have also changed. The following list shows the previous and current argument names:
- MM_DBTYPE is now dbType.
- MM_DB_CONNECTION_URL is now jdbcURL.
- MM_DB_USER is now user.
- MM_DB_PASSWORD is now pass.
- MM_DB_NAME is now file.
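The renames above can be captured as a simple lookup, for example when updating saved backup scripts. This Python sketch is illustrative and is not part of Metadata Manager.

```python
# Previous argument name -> current argument name (from the list above).
ARG_RENAMES = {
    "MM_DBTYPE": "dbType",
    "MM_DB_CONNECTION_URL": "jdbcURL",
    "MM_DB_USER": "user",
    "MM_DB_PASSWORD": "pass",
    "MM_DB_NAME": "file",
}

def rename_args(args: dict) -> dict:
    """Return a copy of an argument map with previous names replaced."""
    return {ARG_RENAMES.get(key, key): value for key, value in args.items()}
```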
Class Properties
Effective in version 9.1.0, you can set the Show_Synthetic_Dates_In_Basic_Section property in the imm.properties file to specify whether the synthetic date properties appear in the Basic or Advanced section of the class properties. After you upgrade to Metadata Manager 9.1, any previous change to the order of the class properties is lost. You must edit the model to manually update the order in which class properties of metadata objects appear in all Metadata Manager perspectives. Previously, you could not set the location of the synthetic date properties.
Link Reports
Effective in 9.1.0, the link details are not part of the acquisition summary logs or the Load Details tab. You can view the link details on the Link Details tab. The link summary contains the resource, connection, assigned database, assigned schema, links, missing links, and correctness (%) details. Previously, the link details were part of the Load Details tab.
After you upgrade to 9.1.0, you must reindex all resources to ensure that the default search priorities are applied to metadata objects in the Metadata Manager warehouse. If required, you can later configure the priorities of the various entities in the searchpriorities.xml file. Previously, you could not configure the priority of the search results.
Resources
This section describes changes to resources.
Embarcadero ERStudio
Effective in version 9.1.0, you can specify the physical models from which you want to extract metadata. You can also import owner schemas for tables, views, and other database objects for physical models. Previously, Metadata Manager extracted the logical model and the physical model that was created from the logical model. Metadata Manager displayed all objects extracted from logical and physical models under the logical model.
SAP
Effective in version 9.1.0, you must install the SAP JCo 3 libraries before you can create an SAP R/3 resource or use an upgraded SAP R/3 resource. You can download the SAP JCo libraries that are specific to the operating system from SAP Service Marketplace. Previously, you had to install the SAP RFC libraries.
CHAPTER 16
PowerCenter (9.1.0)
This chapter includes the following topics:
- Session Recovery
- Informatica Data Integration Analyst Action Menus
Session Recovery
Effective in version 9.1.0 HotFix 3, the PowerCenter Integration Service resets mapping variables to the start value during session recovery.
CHAPTER 17
CHAPTER 18
You do not need to configure the JVMClassPath custom property for supported Hadoop distributions. You do not need to delete the Hive table before you run a session again. You can overwrite the Hive table data.
CHAPTER 19
Pushdown Optimization
A Data Integration Service that runs on a UNIX machine can push Filter transformation logic to Microsoft SQL Server sources when the Data Integration Service uses an ODBC connection. When you configure the ODBC connection, configure Microsoft SQL Server as the ODBC provider.
Match Transformation
You can use multiple threads to run a mapping with a Match transformation configured for identity matching. Use a system variable on the Data Integration Service machine to set the number of threads. You can set this variable on the PowerCenter Integration Service machine if you export the mapping to PowerCenter.
Profiling
You can create a mapping from a profile in the Developer tool. The mapping uses the profile source as a data source and converts any rule defined on the profile into transformations or mapplets.
Repository
The Data Quality 8.6.2-9.0.1 repository migration process migrates all reference data files from the 8.6.2 file system unless the files are present in the default Data Quality Content Installer file set. You do not need to install country pack or accelerator pack reference data to Data Quality 9.0.1 if the dictionary files used by the 8.6.2 plans are present on the 8.6.2 file system when you start the migration process.
Informatica Domain
This section describes new features and enhancements to the Informatica domain.
Upgrade with Changes to Node Configuration
Informatica provides an upgrade option to change the node configuration.
Profiling
Column Profile Results
Profile results report three new types of information: the inferred datatype, the minimum value, and the maximum value in the column.
Join Analysis Results
When you perform join analysis on data sources in the Developer tool, you can export the common and unique records from the Data Viewer to a flat file.
Pushdown Optimization
The Data Integration Service can push Filter transformation logic to SAP sources. The Data Integration Service can push Filter transformation logic to Sybase ASE sources when the Data Integration Service uses an ODBC connection. When you configure the ODBC connection, configure Sybase as the ODBC provider.
Repository Migration
The process to migrate objects from the Informatica Data Quality 8.6.2 repository to the 9.0.1 Model repository provides greater support for address validation operations defined in Data Quality 8.6.2.
Metadata Manager
This section describes new features and enhancements to Metadata Manager.
Version 9.0.1
This section describes new features and enhancements in version 9.0.1.
Informatica Content
You can define an INFA_CONTENT environment variable on a PowerCenter Integration Service machine to set the path PowerCenter uses to read reference data. Use this environment variable if you cannot install the reference data to the expected location on the PowerCenter machine.
Mapping Performance
You can tune the performance of mappings by updating the mapping optimizer level through the mapping configuration or mapping deployment properties. The optimizer level determines which optimization methods the Data Integration Service applies to the mapping at run time. You can choose no, minimal, normal, or full optimization for mappings.
Mapping Parameters
You can create a parameter to represent a connection so you can run one mapping with different relational source connections. Create a parameter file to define parameter values. The Data Integration Service applies the parameter values when you run a mapping from the command line and specify the parameter file.
You can configure customized physical data objects to complete the following tasks:
- Filter source data.
- Add pre- and post-mapping SQL commands.
- Define parameters for the data object.
- Retain key relationships when you synchronize the object with the sources.
You can add customized physical data objects to mappings and mapplets as read, write, or lookup objects.
You can create and import physical data objects:
- You can create physical data objects from flat file, nonrelational database, relational database, and SAP resources.
- You can create physical data objects from resources that contain Developer tool illegal characters or reserved words. For example, you can import a view named "CONCAT" or a table that contains a column with a period in the column name.
- You can import tables, synonyms, and views from databases that use mixed case metadata. For example, you can import tables "CUST" and "Cust" as separate physical data objects.
Profiles
Profile comments
You can choose to delete profile comments on a profile.
Profile results
- You can profile multilingual data from different data objects and view profile results based on the locale settings in the browser. The Analyst tool changes the Datetime, Numeric, and Decimal datatypes based on the browser locale.
- You can sort on multilingual data. The Analyst tool displays the sort order based on the browser locale.
- After you run a profile, the Analyst tool purges the last profile run results from the profiling warehouse.
- The profiling warehouse stores 16,000 unique highest frequency values, including NULL values, for profile results by default.
- If there is at least one NULL value in the profile results, the Analyst tool can display NULL values as patterns.
Running a profile
- After you add a rule to a profile that has previously run, you can select the rule and associated columns and run the profile again. The Analyst tool displays the previous profile results and the recent rule and column results. You can modify the rule and run the profile again to view changes to profile results for the rule.
- When you run a profile, you can choose to discard the profile results for previously profiled columns and display results for the columns and rules selected for the latest profile run.
Repository
You can migrate objects from the Informatica Data Quality 8.6.2 repository to the 9.0.1 Model repository.
Rules
You can choose to drill down on live data for a rule. You can select the rules for drill down without profiling all the source columns again after running the profile.
Scorecards
You can group related scores within a scorecard to view a set of scores for a particular business concept.
When you add a profile column to a scorecard, you can choose to add it to a group. You can add a score to a group within a scorecard. You can move scores between groups within a scorecard and edit and remove groups from a scorecard.
You can select columns in the scorecard before running a scorecard again. You can choose to drill down on live data for a score in a scorecard.
Applications
Objects in the Application view are sorted by default. Projects in the Application view have a new icon; they no longer use the folder icon. You can rename an application in the Administrator tool. You can refresh the Application view to update newly deployed, undeployed, and restored applications. You can update an application to resolve the conflict that occurs when you use the Administrator tool to deploy an application with the same name as an existing application. When you select the update or replace option during a conflict, you can also choose to stop the existing application if it is running.
Custom Data Transformation
The Custom Data transformation processes unstructured and semi-structured file formats, such as messaging formats, HTML pages, and PDF documents. The Custom Data transformation also processes structured formats such as ACORD, HL7, EDI-X12, EDIFACT, and SWIFT. The Custom Data transformation calls a Data Transformation service to process the data.
Mapping Performance
You can tune the performance of mappings by updating the mapping optimizer level through the mapping configuration or mapping deployment properties. The optimizer level determines which optimization methods the Data Integration Service applies to the mapping at run time. You can choose no, minimal, normal, or full optimization for mappings.
You can create a parameter to represent a connection so that you can run one mapping with different relational source connections. Create a parameter file to define parameter values. The Data Integration Service applies the parameter values when you run a mapping from the command line and specify the parameter file.
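As an illustration of how a parameter file binds values at run time, the sketch below parses a simplified, assumed XML layout. The real Informatica parameter-file schema is product-specific and differs from this; the element and parameter names here are hypothetical.

```python
# Illustrative only: a simplified stand-in for a mapping parameter file.
# The real Informatica parameter-file schema differs; this shows the idea of
# binding a connection parameter by name at run time.
import xml.etree.ElementTree as ET

PARAM_FILE = """
<parameters>
  <parameter name="Src_Connection">Oracle_Prod</parameter>
  <parameter name="Tgt_Schema">SALES</parameter>
</parameters>
"""

def load_parameters(xml_text):
    """Parse parameter names and values into a dictionary."""
    root = ET.fromstring(xml_text)
    return {p.get("name"): p.text for p in root.findall("parameter")}

params = load_parameters(PARAM_FILE)
print(params["Src_Connection"])  # Oracle_Prod
```

Running the same mapping against a different source is then a matter of supplying a different parameter file, not editing the mapping.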
You can perform the following tasks when you configure a physical data object:
- Add pre- and post-mapping SQL commands.
- Define parameters for the data object.
- Retain key relationships when you synchronize the object with the sources.
You can add customized physical data objects to mappings and mapplets as read, write, or lookup objects.

Create and import
You can create physical data objects from flat file, nonrelational database, relational database, and SAP
resources.
You can create physical data objects from resources with names that contain characters that are illegal in the Developer tool or that are reserved words. For example, you can import a view named "CONCAT" or a table that contains a column with a period in the column name.
You can import tables, synonyms, and views from databases that use mixed case metadata. For example,
you can import tables "CUST" and "Cust" as separate physical data objects.
Staging Database
The staging database properties include the database connection name and the properties for an IBM DB2 staging database.
Virtual Data
Data preview
When you preview virtual table data, you can view a graphical representation of the SQL query that you enter. You can view the query plan for the original query and for the optimized query. Use the query plan to troubleshoot queries that end users run against a deployed SQL data service. You can also use the query plan to troubleshoot your own queries and to understand the log messages.

Column-level security
You can set permissions at the column level to deny queries against a column in a virtual table. You can restrict user access to a column without denying the user access to the table. You can fail a query that selects the column, or replace the column value with a default value in the query. Configure column-level security with infacmd.
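The two column-level behaviors, failing the query or substituting a default value, can be sketched as follows. The names and structures are illustrative; the real enforcement is configured with infacmd rather than coded by hand.

```python
# Sketch of the two column-level security behaviors described above:
# deny (fail the query) or substitute a default value. Illustrative only.

RESTRICTED = {"SSN": {"action": "substitute", "default": "XXX-XX-XXXX"},
              "SALARY": {"action": "deny"}}

def apply_column_security(row, selected_columns, user_allowed):
    """Return the row a user may see, or raise if a denied column is selected."""
    result = {}
    for col in selected_columns:
        rule = RESTRICTED.get(col)
        if rule and col not in user_allowed:
            if rule["action"] == "deny":
                raise PermissionError(f"query failed: no access to column {col}")
            result[col] = rule["default"]   # replace with the default value
        else:
            result[col] = row[col]
    return result

row = {"NAME": "Ada", "SSN": "123-45-6789", "SALARY": 90000}
print(apply_column_security(row, ["NAME", "SSN"], user_allowed=set()))
```

The substitute behavior lets a query succeed without exposing the restricted column; the deny behavior fails the whole query.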
Informatica Domain
This section describes new features and enhancements to the Informatica Domain.
Connections
Pass-through security
The Data Integration Service uses the client user name and password for connection objects in an SQL data service. The Data Integration Service connects to source objects with the client credentials instead of the default credentials from the connection object. You can restrict user access to the data in an SQL data service based on user permissions on the physical data object.

Object names
The Data Integration Service can generate SQL against Oracle, DB2, Microsoft SQL Server, or ODBC connections that have case-sensitive table and column names. You can use the Administrator tool or the Developer tool to configure the connection. You can specify whether to include quotes around table and column names in the connection.

Microsoft SQL Server
You can use the Administrator tool to specify the owner name and schema name for a Microsoft SQL
Server connection.
You can use the Administrator tool or Developer tool to configure a Microsoft SQL Server connection as a
trusted connection in the domain.

IBM DB2
You can use the Administrator tool to specify the tablespace name for an IBM DB2 connection.

Connection types
If you have PowerExchange, you can create the following connection types:
- DB2 for i5/OS
- IMS
- Sequential z/OS
- VSAM
If you have PowerExchange for SAP Netweaver, you can create the following connection type:
SAP
Connection permissions
You can assign users the read, write, and execute permissions on the database connection. The execute permission grants other users the ability to preview data and run profiles and scorecards on data objects created with the connection.
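A minimal sketch of this permission model, with hypothetical helper names:

```python
# Sketch of the permission model described above: execute permission on a
# connection gates data preview and profile/scorecard runs. Illustrative only;
# these helpers are not part of any Informatica API.

def can_preview_data(user_permissions):
    """Data preview and profile/scorecard runs require execute permission."""
    return "execute" in user_permissions

def can_edit_connection(user_permissions):
    """Editing the connection requires write permission."""
    return "write" in user_permissions

perms = {"read", "execute"}
print(can_preview_data(perms), can_edit_connection(perms))  # True False
```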
Domain Configuration
You can use infasetup to back up and restore the domain.
statistics for each host. The report contains information for all licenses assigned to the domain. An administrator can track the number of times a user logs in to the Analyst tool and how often the user runs profiles and scorecards.
Service Upgrade
Use the Service Upgrade Wizard in the Administrator tool to upgrade multiple services at one time.
Metadata Manager
This section describes new features and enhancements to Metadata Manager.
Browse Metadata
Impact summary
When you view a business term or any metadata object except PowerCenter objects on the Browse tab, you can view impacted objects from the most relevant classes. The most relevant classes include business intelligence reports, relational database tables, PowerCenter mappings, and business terms.
Business Glossary
Email notifications
Metadata Manager sends an email to notify users about the following events:
A data steward proposes a draft business term for review. Metadata Manager displays the email options so that you can send the email to the data steward assigned to the term.
Data Lineage
SQL inline views for PowerCenter and relational resources
You can view data lineage on a database table, view, or synonym used in an SQL query with an inline view. The SQL query can exist in the following objects:
- SQL override in a PowerCenter Source Qualifier or Lookup transformation.
- Database views, stored procedures, functions, and triggers.
Custom objects
When you run data lineage analysis on a custom object, the data lineage diagram includes the custom object and its related objects. A data lineage diagram displays relationships between a custom metadata object or business term and the following PowerCenter objects:
- Reusable transformation or session.
When you run data lineage analysis on the custom object or business term, the data lineage diagram displays the PowerCenter object and all instances of the object. When you run data lineage analysis on any instance of the PowerCenter object, the data lineage diagram displays the associated custom metadata object or business term.
- Instance of a transformation or session.
When you run data lineage analysis on the custom object or business term, the data lineage diagram displays the PowerCenter object instance within the context where the instance is used. For example, the diagram shows a transformation instance within its corresponding mapping and session context.
- Shortcut.
When you run data lineage analysis on the custom object or business term, the diagram displays the instances of the original object and all instances of the shortcuts to the original object.
PowerCenter
This section describes new features and enhancements to PowerCenter.
Mapping Analyst for Excel

Domains and enumerations
You can configure domains and enumerations to define reference data within a mapping specification. A domain is a reference table. An enumeration includes the reference table values.

Reusable rules
You can define reusable rules to use as expressions on the Mappings worksheet. You can use rules in a mapping specification to perform simple data cleansing.

Validation
When you validate a mapping specification, Mapping Analyst for Excel provides more detailed error messages.

Multiple mappings
You can configure multiple mappings in a single mapping specification based on the Standard mapping specification template.
Mapping Objects
Data Transformation source and target
You can configure a Data Transformation source or a Data Transformation target in a mapping. The Data Transformation source and the Data Transformation target process unstructured and semi-structured file formats, such as messages, HTML pages, and PDF documents. The source and target also transform structured formats such as HIPAA, HL7, EDI-X12, and EDIFACT. The Data Transformation source and Data Transformation target call a Data Transformation service. The Data Transformation service is the application that transforms the unstructured and semi-structured file formats. The Data Transformation service receives data from the PowerCenter Integration Service, transforms the data, and returns it to the PowerCenter Integration Service.

Unstructured Data transformation
The Unstructured Data transformation accepts hierarchical groups of input ports. You can pass data that represents relational tables. Groups are related by primary key-foreign key relationships. To increase performance, you can flush sorted input data to the Unstructured Data transformation.

Identity Resolution transformation
The Identity Resolution transformation is an active transformation that you can use to search and match data in databases. The PowerCenter Integration Service uses the search definition that you specify in the Identity Resolution transformation to search and match data residing in the Informatica Identity Resolution (IIR) tables. The input and output views in the search definition determine the input and output ports of the transformation. Configure the match tolerance and search width parameters in the Identity Resolution transformation to determine the matching scheme and search level. The Identity Resolution transformation returns the candidate records along with the search link port, respective scores, and the number of records found for the search.
PowerExchange
This section describes new features and enhancements to PowerExchange.
data without your having to perform any additional CDC configuration task.
a single PowerExchange Listener, or by multiple PowerExchange Listener, netport, and batch jobs.
DB2 for i5/OS and DB2 for z/OS Stored Procedures as a Source
If you use the PowerExchange Client for PowerCenter (PWXPC) in PowerCenter, you can now execute DB2 for
i5/OS and DB2 for z/OS database stored procedures as override SQL for a data source.
GetCurrentFileName Function
For a data map record defined for a nonrelational data source, the GetCurrentFileName function gets the name
of the source data file. Use this function to determine from which data file the data for a record was read.
With the infacmd pwx program, you can issue the CloseListener, CloseForceListener, ListTaskListener, and StopTaskListener commands from the command line to manage a PowerExchange Listener Service.
You can use infacmd pwx commands to manage a PowerExchange Logger Service. With the infacmd pwx program, you can
issue the CondenseLogger, DisplayAllLogger, DisplayCheckpointsLogger, DisplayCPULogger, DisplayEventsLogger, DisplayMemoryLogger, DisplayRecordsLogger, DisplayStatusLogger, FileSwitchLogger, and ShutDownLogger commands from the command line to manage a PowerExchange Logger Service. To issue commands to a PowerExchange process that is not managed by a PowerExchange application service, you must use the pwxcmd program.
You can use multiple writer partitions to write to VSAM or sequential file targets. The writer partitions process insert operations only. Because the partitions process inserts concurrently, this feature can help improve session performance. If you enable offload processing, offload processing also runs in the partitions concurrently.
connections:
- Connection timeouts detect unsuccessful connection attempts.
- Heartbeat timeouts detect a failure of the PowerExchange client or PowerExchange Listener to send or receive data.
Integration Service accesses staged files through FTP, SFTP, or standard file I/O, typically using network file sharing, such as NFS.
You can define a query band expression in the session properties. A query band expression is a set of name-value pairs that identify a query's originating source.
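A query band is conventionally written as semicolon-terminated name=value pairs, for example "ApplicationName=nightly_load;JobStep=stage;". A small sketch of building such an expression; the pair names here are examples, not required keys:

```python
# Sketch: assembling a Teradata-style query band expression from name-value
# pairs. The pair names are illustrative examples only.

def build_query_band(pairs):
    """Join name=value pairs into a semicolon-terminated query band string."""
    return "".join(f"{name}={value};" for name, value in pairs.items())

qb = build_query_band({"ApplicationName": "nightly_load", "JobStep": "stage"})
print(qb)  # ApplicationName=nightly_load;JobStep=stage;
```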
PowerExchange for Teradata Parallel Transporter API supports Teradata Parallel Transporter version 13. You can load data in parallel through multiple instances into a Teradata PT API target from a source. You can use staged loading to load data from selected sources with subsequent runs of the session. Run the session without staged loading to load data from all sources at once to the Teradata PT API target.
You can extract data from a Teradata source or load data to a Teradata target when the PowerCenter
CHAPTER 20
Informatica Data Quality and Informatica Data Explorer Advanced Edition (9.0.1)
This chapter includes the following topic:
Transformations
Transformations
Effective in version 9.0.1, the configurable options for some transformations have new names.
For example, most transformations use the term Strategy to specify the data operations defined in the transformation. Previously, transformations used different names for user-defined data operations.
Effective in version 9.0.1, the configurable options for some transformations are redesigned.
For example, the design of the Labeler transformation is simplified. Previously, you selected token labeler or character labeler mode when you created a Labeler transformation. This distinction is removed.
CHAPTER 21
Effective in version 9.0.1, you can perform the following tasks with a relational physical data object:
- Add pre- and post-mapping SQL commands.
- Define parameters for the data object.
- Retain key relationships when you synchronize the object with the sources.
Previously, you could not perform these tasks within a relational physical data object.
CHAPTER 22
Alerts
You can subscribe to domain and service alerts to receive them through email. You can use infacmd isp to configure alerts.
Domain Configuration
You can use the command line interface to back up and restore the domain.
Permissions
Connection permissions. You can use the Administrator tool or command line interface to configure permissions on connections to limit the users that can perform particular actions on a connection. In 9.0, you could not assign permissions on connections. If you upgrade from 9.0 to 9.0.1, connections that were created in 9.0 remain without permissions. Assign permissions on the 9.0 connections to restrict access to them.

Service and folder permissions. Unless a specific permission is assigned, services and folders inherit permissions from the parent folder.

Domain administrator role permissions. Domain administrators automatically inherit permissions for all services and folders, including new services and folders that are created after the user is assigned the Administrator role.
Users
Case-sensitive distinguished name attributes. You can configure Informatica to support case-sensitive distinguished name attributes for LDAP security domains. If you upgrade from 9.0 to 9.0.1, distinguished name attributes that were created in 9.0 are still not case-sensitive. You can configure 9.0 distinguished name attributes to be case-sensitive.

User preferences. You can configure user preferences to subscribe to domain and service alerts and to show custom properties. If you upgrade from 8.x to 9.0.1, the following user preferences are no longer available: Show Upgrade Option, Show Tooltips in the Overview Dashboards and Properties, and Overview Grid Refresh Time. As a result, the Administrator tool performs the following actions:
- Shows the Upgrade tab if you have privileges to upgrade PowerCenter.
- Shows tooltips in the Overview and Properties tabs of the Administrator tool.
- Refreshes the grid in the Overview tab every 30 seconds.
Connections
This section describes changes to connections.
Connection Permissions
Effective in version 9.0.1, you can assign read, write, and execute permission on connections. Previously, all users had all permissions on every connection. All users are upgraded with read, write, and execute permissions. To restrict access to connections, a domain administrator must reset permissions.
SetSchemaPermissions
Deprecated.
The following commands manage connections and connection permissions:

CreateConnection. Creates a new database connection.
ListConnections. Lists existing connections.
ListConnectionOptions. Lists available options for connections that you can use when you create a connection.
ListConnectionPermissions. Lists permissions that a user or group has for a connection.
ListConnectionPermissionsByGroup. Lists all groups that have permission for a connection and the type of permissions they have.
ListConnectionPermissionsByUser. Lists all users that have permissions on a connection, along with the type of permissions.
ListGroupPermissions. Lists the domain objects that a group has permission on.
ListUserPermissions. Lists the domain objects that a user has permission on.
RemoveConnection. Removes a database connection.
RemoveConnectionPermissions. Removes permissions for a specific user or group.
SyncSecurityDomains. Synchronizes an LDAP security domain.
UpdateConnection. Updates an existing connection.
To manage applications in the infacmd command line program, users must have the Manage Applications privilege for the Data Integration Service. Previously, users needed the Manage Services privilege for the domain and permission on the Data Integration Service to manage applications. After you upgrade, you must assign users the Manage Applications privilege for the Data Integration Service.
LDAP
This section describes changes to LDAP.
User Import
Effective in version 9.0.1, the default maximum size for user import is set to 1000. Previously, the default value was set to 0, which indicated that there was no maximum value. When you upgrade, all users are imported into the domain. However, all users over 1000 will be dropped in reverse alphabetic order the next time the Service Manager synchronizes with the LDAP service directory. To avoid dropping users, reset the maximum size in the LDAP server configuration.
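Dropping surplus users in reverse alphabetic order is equivalent to keeping the alphabetically first users up to the maximum. An illustrative sketch of that behavior, not the Service Manager's code:

```python
# Sketch of the synchronization behavior described above: when more users
# exist than the configured maximum, the surplus is dropped in reverse
# alphabetic order. Illustrative logic only.

def users_after_sync(users, max_size=1000):
    """Keep the first `max_size` users alphabetically; drop the rest."""
    return sorted(users)[:max_size]

users = ["dana", "alice", "carol", "bob"]
print(users_after_sync(users, max_size=2))  # ['alice', 'bob']
```

Here "dana" and "carol" (the last names in reverse alphabetic order) are the ones dropped once the cap of 2 is reached.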
Logs
This section describes changes to logs.
Previously, you specified the value in megabytes. When you upgrade, the Administrator tool appends "m" to the value.
Node Diagnostics
Effective in version 9.0.1, you do not need to import SSL certificates to generate node diagnostics on a secure node. The Configuration Support Manager web application runs on the same web application as the Administrator tool. Previously, the Configuration Support Manager ran on a separate web application. If you wanted to ensure security when you connected to the Configuration Support Manager, you had to configure the nodes for security.
Reports
This section describes changes to reports.
CHAPTER 23
PowerCenter (9.0.1)
This chapter includes the following topics:
Mapping Analyst for Excel
Web Services Hub
Excel Add-in
Effective in version 9.0.1, Mapping Analyst for Excel includes an Excel add-in that adds a Metadata menu or ribbon to Microsoft Excel. Use the Metadata menu or ribbon to complete the following tasks:
- Show and hide columns on a worksheet.
- Annotate cells with descriptions from other worksheets.
- Format a worksheet to resize the columns to fit the text.
- Validate the mapping specification.
- Insert another worksheet of a specific type.
You can install the add-in for Microsoft Excel 2003 or 2007. However, use Microsoft Excel 2007 to use the improved user interface. Previously, Mapping Analyst for Excel did not provide a Metadata menu or ribbon. You used the Validate button on each worksheet to validate data.
Export Option
Effective in version 9.0.1, you do not configure the Operation export option. Mapping Analyst for Excel determines the type of export operation to perform. To configure the Format export option, you select the Standard mapping specification template. Previously, you configured the Operation export option and typed the name of a template in the Format export option.
You can no longer import from or export to mapping specifications based on these templates. If you have existing mapping specifications, you must reconfigure the mapping metadata in a mapping specification based on the Standard template.
CHAPTER 24
Business Glossary
Effective in version 9.0.1, Metadata Manager sends an email notifying users about the following events:
A data steward proposes a draft business term for review. Metadata Manager displays the email options so that you can send the email to the data steward assigned to the term.
Previously, Metadata Manager did not send emails notifying users about these events.
Effective in version 9.0.1, a data lineage diagram displays these relationships in the following ways:
Custom metadata object or business term related to a reusable transformation or session.
When you run data lineage analysis on the custom object or business term, the data lineage diagram displays the PowerCenter object and all instances of the object. When you run data lineage analysis on any instance of the PowerCenter object, the data lineage diagram displays the associated custom metadata object or business term.
Custom metadata object or business term related to an instance of a transformation or session.
When you run data lineage analysis on the custom object or business term, the data lineage diagram displays the PowerCenter object instance within the context where the instance is used. For example, the diagram shows a transformation instance within its corresponding mapping and session context.
Custom metadata object or business term related to a shortcut.
When you run data lineage analysis on the custom object or business term, the data lineage diagram displays the instances of the original object and all instances of the shortcuts to the original object. Previously, data lineage diagrams did not display all instances of PowerCenter reusable objects and did not display the context of a PowerCenter object instance for these relationships. After you upgrade, complete one of the following tasks to view these relationships in data lineage diagrams:
Reload the custom resource. Reload each PowerCenter resource that is related to the custom resource or to the business glossary.
After you upgrade, reload any relational database or PowerCenter resource that uses SQL inline views. Previously, Metadata Manager did not correctly display data lineage for a database table, view, or synonym used in an SQL query with an inline view.
Email
Effective in version 9.0.1, you use the Administrator tool to configure the host name and port number of the outgoing mail server in the domain SMTP configuration settings. You must configure these properties before Metadata Manager can send email notifications. In version 9.0, you configured the host name and port number of the outgoing mail server in the imm.properties file. After you upgrade from version 9.0, use the Administrator tool to configure the email properties in the domain SMTP configuration settings. In version 8.6.x, you configured the host name and port number of the outgoing mail server in the domain SMTP configuration settings.
Impact Summary
Effective in version 9.0.1, you can view an impact summary when you view the details of a business term or a metadata object in a packaged resource. You cannot view an impact summary for a PowerCenter metadata object or a custom metadata object. The impact summary for a metadata object displays the following details:
Impact Summary Downstream. Lists all downstream objects. Changes to the selected metadata object impact
these objects.
Impact Summary Upstream. Lists all upstream objects. Changes to these objects impact the selected metadata
object. The impact summary for a business term displays the following details:
Impact Summary Downstream. Lists all objects that are downstream from the object related to the business term. Changes to the related object impact these objects.

Impact Summary Upstream. Lists all objects that are upstream from the object related to the business term. Changes to these objects impact the related object.

After you upgrade, reload all PowerCenter resources to view the impact summary for business terms and metadata objects. Previously, Metadata Manager did not display an impact summary. You needed to run data lineage analysis to determine the impact of metadata changes.
CHAPTER 25
Repository
Data Services and Data Quality both use a Model repository to store objects. If you have Data Services and Data Quality, you can use the same repository.
Application Clients
Data Services and Data Quality both use the following application clients to create objects and preview results:
Informatica Developer (Developer tool). Developers use this application to design and implement data
quality and data services solutions. The Developer tool includes an editor to edit the objects that you create.
Informatica Analyst (Analyst tool). Analysts use this web-based application client to analyze, cleanse, standardize, profile, and score data.
Application Services
Data Services and Data Quality use the following application services to process the objects that you create in the client applications:
Data Integration Service. Performs data integration tasks for Informatica Analyst and Informatica Developer.
Data Quality
Informatica Data Quality is enhanced in version 9.0 with new desktop and web-based client applications. Use the Developer tool to design and distribute data quality mappings and rules from your desktop. Use the Analyst tool to analyze data quality and run rules from any Internet browser. With Data Quality, you can perform the following tasks:
Profile data. Create and run a profile to analyze the structure and content of enterprise data and to identify
strengths and weaknesses in the data. After you run a profile, you can selectively drill down to see the underlying rows in the profile results. You can also add columns to scorecards and add column values to reference tables.
Score data. Create scorecards to score the valid values for any column or the output of rules. Scorecards
display the value frequency for columns in a profile as scores. Use scorecards to measure and visually represent data quality progress. You can also view trend charts to view the history of scores over time.
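A score of this kind reduces to the percentage of column values that satisfy a validity rule. A minimal sketch, not the Analyst tool's code:

```python
# Sketch: a scorecard score as the percentage of values in a column that
# satisfy a validity rule. Illustrative only.

def score(values, is_valid):
    """Return the percentage of values that satisfy the validity rule."""
    valid = sum(1 for v in values if is_valid(v))
    return 100.0 * valid / len(values)

ages = [34, 27, -1, 45, 200, 31]
print(score(ages, lambda v: 0 <= v <= 120))  # valid: 34, 27, 45, 31 -> 4 of 6
```

Tracking this percentage over successive profile runs is what the trend charts described above visualize.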
Standardize data values. Standardize data to remove errors and inconsistencies that you find when you run a profile.

Parse data. You can split a single field of freeform data into fields that contain different information types. You can also add information to your records. For example, you can flag customer records as personal or business customers.
Validate postal addresses. Address validation evaluates and enhances the accuracy and deliverability of your
postal address data. Address validation corrects errors in addresses and completes partial addresses by comparing address records against reference data from national postal carriers. Address validation can also add postal information that speeds mail delivery and reduces mail costs.
Manage bad and duplicate records. Duplicate record analysis compares a set of records against each other
to find similar or matching values in selected data columns. You set the level of similarity that indicates a good match between field values. You can also set the relative weight given to each column in match calculations. For example, you can prioritize surname information over forename information.
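The weighted-match idea can be sketched as a weighted average of per-column similarities compared against a threshold. The similarity function below is a trivial stand-in for real fuzzy matching, and the column weights are example values:

```python
# Sketch of weighted duplicate-record matching as described above: per-column
# similarity scores are combined with relative weights, and a threshold
# decides a good match. Illustrative only.

def field_similarity(a, b):
    """Crude stand-in similarity: 1.0 for a case-insensitive match, else 0.0."""
    return 1.0 if a.lower() == b.lower() else 0.0

def match_score(rec1, rec2, weights):
    """Weighted average of per-field similarities, in the range 0.0 to 1.0."""
    total = sum(weights.values())
    return sum(w * field_similarity(rec1[f], rec2[f])
               for f, w in weights.items()) / total

# Surname weighted more heavily than forename, as in the example above.
weights = {"surname": 3.0, "forename": 1.0}
a = {"surname": "Smith", "forename": "John"}
b = {"surname": "smith", "forename": "Jon"}
print(match_score(a, b, weights) >= 0.7)  # surname matches -> score 0.75
```

With surname weighted 3:1 over forename, a surname-only match still clears a 0.7 threshold, which is the effect of prioritizing surname information.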
Create and run data quality rules. Informatica provides pre-built rules that you can run or edit to meet your
project objectives. Create and apply rules within profiles. A rule is reusable business logic that defines conditions applied to data when you run a profile. Use rules to further validate the data in a profile and to measure data quality progress.
Collaborate with Informatica users. The rules and reference data tables you add to the Model repository are
available to users in the Developer tool and the Analyst tool. Users can collaborate on projects, and different users can take ownership of objects at different stages of a project.
Export mappings to PowerCenter. You can export mappings to PowerCenter to reuse the metadata for data integration projects.
Manage reference data. Create and update reference tables that analysts and developers can use in data quality rules. Create, edit, and import data quality dictionary files as reference tables. Create reference tables to establish relationships between source data and valid and standard values.

The following table lists the data quality tasks that you can perform in the Developer tool and the Analyst tool:
Informatica Developer:
- Create and run mappings.
- Create and run rules.
- Perform profiling.
- Score data.
- Export mappings to PowerCenter.

Informatica Analyst:
- Perform profiling.
- Score data.
- Manage reference tables.
- Create profiling rules.
- Run rules in profiles.
- Manage bad and duplicate records.
Note: Informatica Data Explorer Advanced Edition functionality is a subset of Informatica Data Quality functionality.
Data Services
Informatica Data Services provides a way to find, understand, integrate, and manage data across an enterprise. With Informatica Data Services, you can create data models that describe how to represent and access data in an enterprise. You can use the components of the data model for data integration and data quality projects. Reuse the components for multiple projects to eliminate redundant work. You can also create a virtual database that allows all applications to consume data regardless of how the integration logic is physically implemented. The virtual database also isolates applications and other data consumers from changes in underlying data sources. With Data Services, you can perform the following tasks:
Define logical views of data. A logical view of data describes the structure and use of data in an enterprise.
You can create a logical data object model that shows what types of data your enterprise uses and how that data is structured.
Map logical models to data sources or targets. Create a mapping that links objects in a logical model to
data sources or targets. You can link data from multiple, disparate sources to have a single view of the data. You can also load data that conforms to a model to multiple, disparate targets.
Create virtual views of data. You can deploy a logical model to a virtual federated database. End users can
run SQL queries against the virtual data without affecting the actual source data.
Export mappings to PowerCenter. You can export mappings to PowerCenter to reuse the metadata.

Profile data. Profiling is a key step at the start of any data project, as it can identify strengths and weaknesses in your data and help you define your project plan. This is available if you have the profiling option.
Create rules. Create rules with Data Services transformations. This is available if you have the profiling option.

Manage reference data. Create and update reference tables that analysts and developers can use in data quality standardization and validation rules. Create, edit, and import data quality dictionary files as reference tables. Create reference tables to establish relationships between source data and valid and standard values. Developers use reference tables in standardization and lookup transformations in Informatica Developer.
The following table lists the data services tasks that you can perform in the Developer tool and the Analyst tool:
Informatica Developer:
- Create logical data object models.
- Create and run mappings with Data Services transformations.
- Create SQL data services.
- Profile data.
- Create rules.
- Export objects to PowerCenter.

Informatica Analyst:
- Manage reference data.
Note: If you have the profiling option, you can perform profiling and also create rules with Data Services transformations.
Informatica Analyst
Informatica Analyst is a new web-based application that analysts can use to analyze, cleanse, standardize, profile, and score data in an enterprise. Business analysts and developers use Informatica Analyst for data-driven collaboration. You can perform column and rule profiling, scorecarding, and bad record and duplicate record management. You can also manage and provide reference data to developers in a data quality solution. Use Informatica Analyst to accomplish the following tasks:
Profile data. Create and run a profile to analyze the structure and content of enterprise data and identify
strengths and weaknesses. After you run a profile, you can selectively drill down to see the underlying rows from the profile results. You can also add columns to scorecards and add column values to reference tables.
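Conceptually, a column profile reports counts such as rows, nulls, distinct values, and value frequencies. The following Python sketch is a hand-rolled illustration of the kind of statistics a profile run surfaces, not the profiling engine itself; the function name and sample data are hypothetical.

```python
from collections import Counter

def profile_column(values):
    """Minimal column profile: row count, null count, distinct values,
    and value frequencies (a sketch of what a profiling run reports)."""
    non_null = [v for v in values if v is not None]
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "frequencies": Counter(non_null),
    }

profile = profile_column(["US", "US", "DE", None, "FR"])
```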
Create rules in profiles. Create and apply rules within profiles. A rule is reusable business logic that defines
conditions applied to data when you run a profile. Use rules to further validate the data in a profile and to measure data quality progress.
Score data. Create scorecards to score the valid values for any column or the output of rules. Scorecards
display the value frequency for columns in a profile as scores. Use scorecards to measure and visually represent data quality progress. You can also view trend charts to view the history of scores over time.
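A score is essentially the percentage of rows in which a column value, or the output of a rule, is valid. As a rough illustration only, with a hypothetical validity rule for five-digit ZIP codes:

```python
def score(values, is_valid):
    """Score = percentage of rows whose value passes the validity rule,
    the kind of value-frequency metric a scorecard displays."""
    valid = sum(1 for v in values if is_valid(v))
    return round(100.0 * valid / len(values), 1)

# Hypothetical rule: a valid ZIP code is exactly five digits.
zip_scores = score(["90210", "10001", "ABCDE", "60601"],
                   lambda v: v.isdigit() and len(v) == 5)
```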
Manage reference data. Create and modify reference tables for analysts and developers to use in
data quality standardization and validation rules. Create, edit, and import data quality dictionary files as reference tables. Create reference tables to establish relationships between source data and valid and standard values. Developers use reference tables in standardization and lookup transformations in Informatica Developer.
Manage bad records and duplicate records. Fix bad records and consolidate duplicate records.
Informatica Domain
The PowerCenter domain is renamed to Informatica domain. It is expanded to include objects and services for the Informatica platform.
infacmd
- infacmd mcf. Export mappings from the Model repository to the PowerCenter repository.
- infacmd mrs. Manage Model Repository Services.
- infacmd oie. Export objects from the Model repository to an export file. Import objects to the Model repository.
Services
The Informatica domain includes services for PowerExchange, Informatica Analyst, and Informatica Developer.
Analyst Service
Application service that runs Informatica Analyst in the Informatica domain. Create and enable an Analyst Service on the Domain tab of Informatica Administrator. When you enable the Analyst Service, the Service Manager starts Informatica Analyst. You can open Informatica Analyst from Informatica Administrator.
Management
This section describes new features and enhancements to domain management.
Informatica Administrator
The PowerCenter Administration Console is renamed to Informatica Administrator (Administrator tool). The Informatica Administrator has a new interface. Some of the properties and configuration tasks from the PowerCenter Administration Console have been moved to different locations in Informatica Administrator.
Connection Management
Database connections are centralized in the domain. You can create and view database connections in Informatica Administrator, Informatica Developer, or Informatica Analyst. Create, view, edit, and grant permissions on database connections in Informatica Administrator.
Deployment
You can configure, deploy, and enable applications in the Developer tool. Deploy applications to one or more Data Integration Services.
Licensing
The Informatica domain enforces the licensing restrictions on the number of CPUs and PowerCenter repositories.
Monitoring
You can monitor profile jobs, scorecard jobs, preview jobs, mapping jobs, and SQL data services for each Data Integration Service. View the status of each monitored object on the Monitoring tab of the Administrator tool.
PowerCenter
This section describes new features and enhancements to PowerCenter.
Real-time Sessions
Session log file rollover. You can limit the size of session logs for real-time sessions. You can limit the size
by time or by file size. You can also limit the number of log files for a session.
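Size-based rollover with a capped file count is a common logging pattern. The following sketch shows the analogous behavior using Python's standard-library RotatingFileHandler; it illustrates the concept only and is unrelated to the Integration Service's implementation.

```python
import logging
import logging.handlers
import os
import tempfile

# Roll the log over when it reaches ~200 bytes, keeping at most two
# backups -- i.e., limit both log size and the number of log files.
log_dir = tempfile.mkdtemp()
path = os.path.join(log_dir, "session.log")
handler = logging.handlers.RotatingFileHandler(path, maxBytes=200, backupCount=2)
logger = logging.getLogger("session")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

for i in range(50):
    logger.info("row processed: %d", i)
handler.close()

# Only session.log plus two backups survive, regardless of volume.
files = sorted(f for f in os.listdir(log_dir) if f.startswith("session.log"))
```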
Lookup Transformation
Cache updates. You can update the dynamic lookup cache based on the results of an expression. When the expression evaluates to true for a row, the Integration Service adds to or updates the lookup cache.
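In pseudocode terms, the dynamic cache applies a row only when the update expression is true. The sketch below is a hypothetical Python illustration of that decision logic, not the transformation itself:

```python
def update_cache(cache, key, row, should_update):
    """Apply a row to a dynamic lookup cache only when the update
    expression evaluates to true for that row (hypothetical sketch)."""
    if key not in cache:
        cache[key] = row          # new key: insert into the cache
        return "insert"
    if should_update(cache[key], row):
        cache[key] = row          # expression true: update cached row
        return "update"
    return "no-op"                # expression false: keep cached row

cache = {}
# Hypothetical expression: update only when the new amount is larger.
update_cache(cache, 1, {"amt": 10}, lambda old, new: new["amt"] > old["amt"])
result = update_cache(cache, 1, {"amt": 5},
                      lambda old, new: new["amt"] > old["amt"])
```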
Database deadlock resilience. In previous releases, when the Integration Service encountered a database
deadlock during a lookup, the session failed. Effective in 9.0, the session does not fail. When a deadlock occurs, the Integration Service attempts to rerun the last statement in a lookup. You can configure the number of retry attempts and the time period between attempts.
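The retry behavior can be pictured as a loop that reruns the statement up to a configured number of attempts, pausing between tries. A minimal Python sketch, where RuntimeError stands in for a database deadlock and all names are hypothetical:

```python
import time

def run_with_retry(statement, attempts=3, wait_seconds=0):
    """Rerun the last lookup statement after a deadlock, up to a configured
    number of attempts with a configurable wait (conceptual sketch)."""
    for attempt in range(1, attempts + 1):
        try:
            return statement()
        except RuntimeError:          # stand-in for a database deadlock
            if attempt == attempts:
                raise                 # retries exhausted: surface the error
            time.sleep(wait_seconds)  # wait before the next attempt

calls = []
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("deadlock")
    return "row"

result = run_with_retry(flaky, attempts=3)
```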
Multiple rows return. You can configure the Lookup transformation to return all rows that match a lookup
condition. A Lookup transformation is an active transformation when it can return more than one row for any given input row.
SQL overrides for uncached lookups. In previous versions, you could create an SQL override for cached
lookups only. You can now create an SQL override for an uncached lookup. You can include lookup ports in the SQL query.
SQL Transformation
Auto-commit for connections. You can enable auto-commit for each database connection. Each SQL
statement in a query defines a transaction. A commit occurs when the SQL statement completes or the next statement is executed, whichever comes first.
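Per-statement auto-commit means every statement is its own transaction. The sqlite3 sketch below is a standard-library analogy, not the SQL transformation: a second connection observes an insert with no explicit COMMIT.

```python
import os
import sqlite3
import tempfile

db = os.path.join(tempfile.mkdtemp(), "demo.db")

# isolation_level=None puts sqlite3 in auto-commit mode: every statement
# is its own transaction and commits as soon as it completes.
writer = sqlite3.connect(db, isolation_level=None)
writer.execute("CREATE TABLE t (x INTEGER)")
writer.execute("INSERT INTO t VALUES (1)")   # committed immediately

# A second connection sees the row without any explicit COMMIT.
reader = sqlite3.connect(db)
count = reader.execute("SELECT COUNT(*) FROM t").fetchone()[0]
```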
Exactly-once processing. The Integration Service provides exactly-once delivery of real-time source
messages to the SQL transformation. If there is an interruption in processing, the Integration Service can recover without requiring the message to be sent again. To perform exactly-once processing, the Integration Service stores a set of operations for a checkpoint in the PM_REC_STATE table.
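Exactly-once delivery amounts to recording, per checkpoint, which messages have already been applied, so a redelivered message is skipped rather than reprocessed. A simplified Python sketch in which a set stands in for the PM_REC_STATE checkpoint state; the names are hypothetical:

```python
processed = set()   # stand-in for checkpoint state (PM_REC_STATE)

def deliver(message_id, payload, sink):
    """Deliver a message exactly once: skip ids already recorded in the
    checkpoint state, so redelivery after a failure adds no duplicates."""
    if message_id in processed:
        return False               # already applied -- idempotent skip
    sink.append(payload)           # apply the operation
    processed.add(message_id)      # record it in the checkpoint state
    return True

sink = []
deliver(1, "a", sink)
deliver(2, "b", sink)
deliver(1, "a", sink)              # redelivered after an interruption
```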
Passive transformation. You can configure the SQL transformation to run in passive mode instead of active
mode. When the SQL transformation runs in passive mode, the SQL transformation returns one output row for each input row.
XML Transformation
XML Parser buffer validation. The XML Parser transformation can validate an XML document against a
schema. The XML Parser transformation routes invalid XML to an error port. When the XML is not valid, the XML Parser transformation routes the XML and the error messages to a separate output group that you can connect to a target.
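The routing pattern, reduced to its essentials: documents that parse go to the data output, and documents that fail go to an error group together with the error message. The sketch below uses Python's standard ElementTree and checks well-formedness rather than schema validity, so it is only a simplified stand-in for the transformation's behavior:

```python
import xml.etree.ElementTree as ET

def route(xml_docs):
    """Route parseable XML to the data output and unparseable XML (plus
    the parser error) to an error group. Simplified: well-formedness
    check rather than schema validation."""
    good, errors = [], []
    for doc in xml_docs:
        try:
            ET.fromstring(doc)
            good.append(doc)
        except ET.ParseError as e:
            errors.append((doc, str(e)))   # invalid XML plus error message
    return good, errors

good, errors = route(["<a><b/></a>", "<a><b></a>"])
```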
pmrep. You can configure pmrep to exclude the full parent path of non-reusable objects from the query result. This option can improve pmrep performance.
Metadata Manager
This section describes new features and enhancements to Metadata Manager.
Resources
Microsoft Analysis and Reporting Services resource. Metadata Manager added the Microsoft Analysis and
Reporting Services resource. Use this resource to extract reporting metadata from Microsoft Reporting Services and to extract an analysis schema from Microsoft Analysis Services.
Test a connection to validate the configuration of a resource. When you create or edit a resource, click
Test Connection to test the connection to the source system, validate the Metadata Manager Agent URL, or validate the source file configuration.
PowerCenter parameter file syntax. Metadata Manager supports additional forms of parameters in PowerCenter parameter files.
z/OS resource.
Single log file. You can download a single log file in Microsoft Excel format that includes load details, session statistics, and errors.
Connection Assignments
Link objects in connected resources. If you change connection assignments for a resource, you do not need
to reload the resource to create the links between matching objects in connected resources. You can use the Resource Link Administration window to direct Metadata Manager to create the links between matching objects in the resources.
Export missing link details. The Load Details tab contains summary information for links created between
objects in connected resources. You can export the details of objects that Metadata Manager could not link to a Microsoft Excel file for further analysis.
Automatic connection assignment. Metadata Manager can automatically configure connection assignments
for a data integration, business intelligence, or data modeling resource. Metadata Manager configures the connection assignments during a resource load or link process. Use the Load Details tab to review the connection assignments and make corrections as needed. Or, you can manually configure the connection assignments.
Purging resources keeps connection assignments. Metadata Manager keeps connection assignments for a
purged resource. The connection assignment properties display the schema status as purged. If you reload the resource, Metadata Manager changes the status to active if the schema still exists in the source.
Business Glossary
Business Glossary approval workflow. You can use the business glossary to create and edit draft business
terms, propose the business terms for review by data stewards and then publish the terms. After publication, the business term is visible to all users.
Migrate a business glossary. You can migrate business glossaries to and from XML. The XML file includes all
categories, business terms, and custom objects, as well as object comments, links, and relationships.
Business glossary terms in data lineage diagrams. When you launch data lineage, Metadata Manager
displays the business terms associated with each object in a lineage diagram. Metadata Manager no longer displays the upstream and downstream connections between business terms in a data lineage diagram.
Link business glossary terms to reference tables. You can associate a business term with a reference table
name and a URL to the reference table. You can specify a URL to a reference table in Informatica Analyst, or you can include any valid URL to a reference table. Previously you associated a business term with a reference table in Reference Table Manager. Reference Table Manager no longer exists.
Audit trail. View the history of changes to business glossary categories, business terms, and custom objects.
The audit trail includes the old and new values of a property and the user who made the edits. You can also search the audit trail.
Browse Metadata
Use the URL API to access Metadata Manager objects and features. You can use the URL API to access
Metadata Manager objects and features from external applications. For example, you can bookmark a link to a particular catalog object, or you can access data lineage on a specific object from a business intelligence tool.
Display one instance of an object in the lineage diagram. PowerCenter mappings and database tables,
views, and synonyms are no longer split across multiple instances in different nodes in a lineage diagram. You can also configure additional classes whose objects are not split.
mmcmd Commands
The mmcmd command line program includes commands in the following areas:
Resource management. Added commands to create, update, delete, and purge resources. Added commands
to configure connection assignments, configure PowerCenter parameter files, and create links between objects in connected resources. Added a command to cancel a resource load.
Metadata Manager Service. Added commands to create and delete Metadata Manager repository content and
Import/Export
Export and import in XML. You can export any custom resource, business glossary, or property added to a
packaged resource type to an XML file and import it into another Metadata Manager instance.
Audit Trail
Audit trail. View the history of changes to business glossary categories and terms, custom resources, and
properties added to packaged resource types. The audit trail includes the old and new values of a property and the user who made the edits. You can also search the audit trail.
Search
Searching metadata in Metadata Manager includes the following enhancements:
Links. You can search for text in links for metadata objects, including the link name, link description, and link URL.
Location property. By default, the location of an object appears in search results. You can click the location link to go to the object.
Searchable objects. You can search resource objects, business glossary categories and terms, and properties added to packaged resource types.
User interface. The Metadata Manager interface has a new look and feel, Action menu, context-sensitive menus, and toolbar buttons. The Search menu is more conveniently placed for accessibility. The user interface is more consistent with other Informatica web-based tools.
PowerExchange
This section describes new features and enhancements to PowerExchange.
Asynchronous network communication. PowerExchange uses asynchronous network communication for
most send and receive operations between a PowerExchange client and a PowerExchange Listener. With asynchronous communication, PowerExchange uses separate threads for network processing and data processing, so that network processing overlaps with data processing. PowerExchange sends and receives heartbeat signals across the network that can be used for early detection of failure situations.
134
Partitioning for bulk data movement sessions. PowerExchange 9.0 provides the following partitioning
enhancements:
- You can use pass-through partitioning without SQL overrides for bulk data movement sessions that include
any of the following offloaded data sources: VSAM data sets, sequential data sets, and DB2 for z/OS unload data sets. PowerExchange opens a single connection to the data source and distributes the data across the partitions.
- For all other nonrelational bulk data sources, you can use pass-through partitioning with disjoint SQL
overrides. If you do not provide overrides with these data sources, data is read into the first partition only.
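Conceptually, pass-through partitioning with a single source connection reads one stream and deals the rows across the partitions. A toy round-robin illustration in Python; the actual distribution strategy is not specified here, so this only shows the single-reader, many-partitions shape:

```python
def distribute(rows, partitions):
    """Read from a single source connection and deal rows round-robin
    across partitions (conceptual sketch of pass-through partitioning)."""
    buckets = [[] for _ in range(partitions)]
    for i, row in enumerate(rows):
        buckets[i % partitions].append(row)   # next partition in turn
    return buckets

buckets = distribute(list(range(10)), 3)
```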
DB2 for z/OS Stored Procedure transformations. You can use Stored Procedure transformations for DB2 for
z/OS stored procedures in a PowerCenter mapping. Use Stored Procedure transformations for read or write bulk data movement and change data capture (CDC) operations.
DB2 for i5/OS multiple-row FETCH statements. To enhance the performance of DB2 for i5/OS bulk data
movement operations that use the DB2 access method, PowerExchange uses a DB2 multiple-row FETCH statement to retrieve multiple rows of data at a time from a source table. In PowerCenter, you can configure the number of rows to be retrieved by setting the Array Size attribute on a PWX DB2i5OS relational connection used by PWXPC.
PowerExchange Logger for Linux, UNIX, and Windows performance. The following enhancements improve
PowerExchange Logger for Linux, UNIX, and Windows processing and management:
- PowerExchange Logger configuration parameters. You can enter additional parameters in the
PowerExchange Logger for Linux, UNIX and Windows configuration file to control how expired CDCT records are deleted, specify the maximum number of days to hold retention array items in memory, and control whether PowerExchange displays a user confirmation prompt for a cold start or for a warm start from a previous position in the change stream.
- PowerExchange Logger cold starts. PowerExchange 9.0 provides a COLDSTART parameter that enables
you to control whether the PowerExchange Logger for Linux, UNIX, and Windows cold starts or warm starts.
- DISPLAY commands. Additional DISPLAY commands are available for the PowerExchange Logger for
Linux, UNIX, and Windows to help you monitor and manage PowerExchange Logger processing. You can issue these commands from the command line or by using the pwxcmd program.
- Checkpoint file. For more efficient checkpoint processing and smaller checkpoint files, PowerExchange 9.0
changes the format of the PowerExchange Logger checkpoint files from CISAM files to sequential files with the extension .ckp.
PowerExchange Logger file management. Use the PWUCDCT utility to back up, restore, and regenerate the
PowerExchange Logger for Linux, UNIX, and Windows CDCT file and to manage expired CDCT records and orphaned log files. Also use the utility to print reports on PowerExchange Logger files such as checkpoint files and log files.
pwxcmd command support for i5/OS and z/OS. You can use the pwxcmd program to issue commands
from a Linux, UNIX, or Windows system to a PowerExchange Listener or PowerExchange Condense process running on an i5/OS or z/OS system.
Extraction of relative record numbers during DB2 for i5/OS CDC processing. When you create a capture
registration, you have the option to capture the relative record number of the change record.
PowerExchange Listener Service. Use this application service to manage the local PowerExchange Listener.
You can create, start, or stop a PowerExchange Listener service or view status information in the Informatica Administrator.
PowerExchange Logger Service. Use this application service to manage the local PowerExchange Logger for
Linux, UNIX, and Windows. You can create, start, or stop a PowerExchange Logger service or view status information in the Informatica Administrator.
Neoview. Select a relational or bulk reader for the connection when you configure the session properties. You can enter mapping variables and workflow variables in the connection object properties. You can configure parameters for a Neoview session. You can configure a Neoview session for recovery. You can enter pre-session and post-session SQL commands. You can override the source table name and source table owner name in the session properties. You can configure the following session properties for a Neoview session:
- Commit type
- Enable high precision
- Error log type
- Error row handling
- Pipeline partitioning
- Session on grid
If you use PowerExchange change data capture, you can load the changed data to Neoview as a target in the same mapping.
Netezza. Select a relational or bulk reader for the connection when you configure the session properties. You can enter mapping variables and workflow variables in the connection object properties. You can configure parameters for a Netezza session. You can recover non-real-time Netezza sessions in normal mode. You can enter pre-session and post-session SQL commands. You can override the source table name and source table owner name in the session properties. You can configure the following session properties for a Netezza session:
- Commit type
- Enable high precision
- Error log type
- Error row handling
- Pipeline partitioning
- Pushdown optimization
- Session on grid
If you use PowerExchange change data capture, you can load the changed data to Netezza as a target in the same mapping.
properties in PowerCenter:
- SQL properties: Source Filter and SQL Query.
- Target properties: User Name, Responsibility Name, Security Group Name, Server Name, and Schema Name.
- Session properties: Source Filter, SQL Query, User Name, Responsibility Name, Security Group Name.
SAP migration file. The Integration Service can write the data to the IS-U Migration Workbench.
You can include a user name and password in a web service target or Web Services Consumer transformation.
CHAPTER 26
PowerCenter Domain
This section describes changes to the PowerCenter domain.
Domain Ports
Effective in version 9.0, each worker node in the Informatica domain uses the following ports:
Domain port. Port number for the node.
Service Manager port. Port number used by the Service Manager on the node.
The Service Manager logs show the Service Manager port number. Previously, each node used a single domain port. On a worker node, the node and Service Manager on the node used the same domain port number. On a gateway node, the node, Service Manager on the node, and the Administration Console used the same domain port number. The Service Manager logs showed the domain port number.
Domain Configuration
Effective in version 9.0, the domain configuration metadata uses the same database structure as the Model repository database. You should back up the domain configuration database on a regular basis. You can restore the domain configuration from a backup. If you migrate the domain configuration to another database or change the database connection information, you must update the database connection on all gateway nodes.
You create a license object when you install PowerCenter. You can also create license objects in Informatica Administrator. You can view the license and all licensed options in Informatica Administrator.
New Commands
The following infacmd as commands are new:
- CreateAuditTables. Creates audit tables.
- CreateService. Creates an Analyst Service.
- DeleteAuditTables. Deletes audit tables.
- ListServiceOptions. Lists Analyst Service properties that you can update.
- ListServiceProcessOptions. Lists Analyst Service process options that you can update.
- UpdateServiceOptions. Updates Analyst Service properties.
- UpdateServiceProcessOptions. Updates Analyst Service process properties.
The following commands manage the Data Integration Service and its applications:
- CancelDataObjectCacheRefresh. Stops a refresh of the data object cache.
- ListApplicationObjects. Lists the paths of objects in an application.
- ListApplications. Lists deployed applications for a Data Integration Service.
- ListServiceProcessOptions. Lists Data Integration Service process options.
- PurgeDataObjectCache. Purges the data object cache.
- RefreshDataObjectCache. Refreshes the data object cache.
- RenameApplication. Renames a deployed application.
- StopApplication. Stops an application from running.
- UndeployApplication. Removes an application from a Data Integration Service.
- UpdateApplicationOptions. Updates application options.
- UpdateDataObjectOptions. Updates data object properties.
- UpdateServiceOptions. Updates Data Integration Service properties.
- UpdateServiceProcessOptions. Updates Data Integration Service process properties.
DisplayCPULogger. Displays the CPU time spent.
DisplayCheckpointsLogger. Reports information about the latest checkpoint file.
DisplayEventsLogger. Displays events being waited on.
DisplayMemoryLogger. Displays memory use.
DisplayRecordsLogger. Displays counts of change records.
DisplayStatusLogger. Displays the status of a PowerExchange Logger Service.
FileSwitchLogger. Switches to a new set of log files.
ShutDownLogger. Shuts down a PowerExchange Logger Service.
UpdateLoggerService. Updates a PowerExchange Logger Service.
ListSQLDataServices. Lists all SQL data service names for a Data Integration Service.
Additional commands purge the virtual table cache, refresh the virtual table cache, rename an SQL data service, and stop an SQL data service.
CHAPTER 27
PowerCenter (9.0)
This chapter includes the following topic:
Reference Table Manager
CHAPTER 28
Business Glossary
Effective in version 9.0, you create links between business terms and reference tables by associating a business term with a URL to a reference table. You can specify the following types of URLs:
Informatica Analyst URL. You can include a URL to a reference table in Informatica Analyst.
Other URL. You can include any valid URL to a reference table.
Previously, you created a link from a business term in Metadata Manager to a reference table in Reference Table Manager. Reference Table Manager does not exist in version 9.0.
assignParameterFile. Assigns a PowerCenter parameter file to a resource.
cancel. Cancels a resource load.
createRepository. Creates the Metadata Manager warehouse tables and imports models for metadata sources in the Metadata Manager repository.
createResource. Creates a resource using the properties in the specified resource configuration file.
deleteRepository. Deletes Metadata Manager repository content, including all metadata and repository database tables.
deleteResource. Deletes the resource and all metadata for the resource from the Metadata Manager repository.
export. Exports a custom resource or business glossary from the Metadata Manager repository to an XML file.
getResource. Writes all properties for the specified resource to an XML resource configuration file.
import. Imports a custom resource or business glossary from an XML file into the Metadata Manager repository.
link. Creates the links between resources that share a connection assignment to run data lineage analysis across the metadata sources.
updateResource. Updates a resource using the properties in the specified resource configuration file.
Additional commands list all resources in the Metadata Manager repository, delete metadata for a resource from the Metadata Manager repository, and restore a repository backup file packaged with PowerCenter to the PowerCenter repository database.
Data Lineage
Effective in version 9.0, a data lineage diagram displays the database schema name as the parent object of a table, view, or synonym. Previously, a data lineage diagram did not display the database schema name. Instead, Metadata Manager displayed the schema name in the location of the object. You could view the object location in the Details panel and when you moved the pointer over the object.
Logging
Effective in version 9.0, the Load Monitor is renamed the Load Details tab. The Load Details tab contains the following views:
Log view. Contains resource load events. Metadata Manager updates the Log view as it loads a resource.
Objects view. Contains summary information for metadata objects.
Errors view. Contains summary information for errors.
Sessions view. Contains session statistics for each session in the PowerCenter workflows that Metadata Manager runs to load a resource.
You can export all of the contents in the Load Details tab to a single Microsoft Excel file. The file contains a worksheet for each view. You can export the contents of the Links view separately to analyze the missing link details. Previously, the Load Monitor listed the object, link, and error information in the Summary tab. You saved each tab in the Load Monitor window to a separate PDF file. After you upgrade, the Load Details tab for all resources displays "No items to show" until you reload the resources.
Resources
This section describes changes to resources.
Use the supported versions to load metadata from these sources. You can still create, edit, and load resources from these deprecated versions. However, Informatica cannot help you resolve an issue encountered on a deprecated version.
Previously, you needed to reload resources after modifying the connection assignments.