IBM InfoSphere Foundation Tools, IBM InfoSphere Information Server, Version 8.5
Classic federation
Classic federation provides standards-based relational access to non-relational mainframe data by using applications and tools that use SQL.

Change-data capture
Classic data event publishing and Classic replication use change-data capture technology that monitors source databases or file systems for changes to data that you want to publish or replicate.

Classic data event publishing
Classic data event publishing captures changes to mainframe data as they occur, and publishes them to applications or to files.

Classic replication
Classic replication maintains a copy of your mainframe data by using a subscription paradigm that pulls data changes to the copy as they occur.

Classic integration with IBM information management products
IBM InfoSphere Classic products deliver your mainframe data to numerous IBM information management products, including IBM InfoSphere Information Server components, IBM Optim Integrated Data, and IBM Cognos Business Intelligence.
Classic federation
Classic federation provides standards-based relational access to non-relational mainframe data by using applications and tools that use SQL.
SQL support
Classic federation enables you to read from and write to mainframe data sources by using SQL. You can return SQL result sets to federated queries or update the data at the source. Classic federation provides secure and reliable SQL query and update support for reading and writing your data. In addition, Classic federation takes advantage of multi-threading with native drivers for scalable performance. You can query and update data from multiple source systems as if they were one system, including transactional updates across multiple systems such as Transactional VSAM and CA-Datacom. The level of update support depends on the capabilities of the source database management system (DBMS).
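Because a federated client simply submits standard SQL, the access pattern can be sketched locally. The following sketch uses Python's sqlite3 as a stand-in for the ODBC/JDBC connection an application would open to the Classic data server; the EMP table, its columns, and the sample values are illustrative assumptions, not part of the product.

```python
import sqlite3

# Illustrative stand-in: sqlite3 plays the role of the ODBC/JDBC connection
# that a federated application would open to the Classic data server. The
# table EMP (a hypothetical VSAM file mapped to a relational table) and its
# columns are assumptions for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE EMP (EMPNO TEXT PRIMARY KEY, LASTNAME TEXT, SALARY REAL)")
conn.execute("INSERT INTO EMP VALUES ('000010', 'HAAS', 52750.00)")

# Read: a standard SQL query, exactly what a federated client submits.
row = conn.execute(
    "SELECT LASTNAME, SALARY FROM EMP WHERE EMPNO = '000010'"
).fetchone()

# Write: whether an update like this succeeds at the real source depends on
# the capabilities of the source DBMS; the local stand-in accepts it.
conn.execute("UPDATE EMP SET SALARY = SALARY * 1.05 WHERE EMPNO = '000010'")
new_salary = conn.execute(
    "SELECT SALARY FROM EMP WHERE EMPNO = '000010'"
).fetchone()[0]
print(row, new_salary)
```

The point of the sketch is that the client-side code is ordinary SQL; the federation layer, not the application, handles the mapping to the non-relational source.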
Change-data capture
Classic data event publishing and Classic replication use change-data capture technology that monitors source databases or file systems for changes to data that you want to publish or replicate. Change-data capture provides low latency and minimal processing overhead on the mainframe by using methodologies that are native to the source system, such as exits, log readers, or application programming interfaces (APIs).

When change-data capture detects modifications to source data, it initiates the following process:
1. Change-data capture sends a description of the change to the Classic data server in a change message.
2. The data server performs any required transactional processing.
3. The data server delivers the change in a relational format that downstream databases, applications, or tools can process.

Parent topic: A closer look at Classic capabilities and features

This topic is also in the IBM InfoSphere Information Server Introduction.
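The three-step flow can be modeled in a few lines. In this sketch, ChangeMessage, DataServer, and the ORDERS source are illustrative inventions, not Classic APIs; the transactional-processing step is noted but omitted.

```python
from dataclasses import dataclass

@dataclass
class ChangeMessage:
    # Step 1: the capture component describes the change it detected
    # (hypothetical message shape, for illustration only).
    table: str
    operation: str     # "INSERT", "UPDATE", or "DELETE"
    after_image: dict  # column values after the change

class DataServer:
    def __init__(self):
        self.delivered = []

    def receive(self, msg: ChangeMessage):
        # Step 2: transactional processing would happen here (for example,
        # holding changes until the source unit of work commits); omitted.
        # Step 3: deliver the change in relational (row) form downstream.
        row = (msg.table, msg.operation, tuple(sorted(msg.after_image.items())))
        self.delivered.append(row)
        return row

server = DataServer()
row = server.receive(ChangeMessage("ORDERS", "INSERT", {"ORDER_ID": 7, "QTY": 3}))
print(row)
```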
Last updated: 2010-09-30
Classic data event publishing

Classic data event publishing captures and publishes change data with minimal latency. It uses a publishing paradigm that pushes change messages to the specified target. You can stage the changes in WebSphere MQ queues or in hierarchical file system (HFS) files, and applications can read them in near-real time or periodically. IBM provides an IBM WebSphere MQ license for use with IBM InfoSphere Classic Data Event Publisher for z/OS.

Data events in the source system drive subsequent processing by any downstream relational application or API that requires the changed data:
- Applications
- Process integration middleware
- Java Message Service (JMS)-aware applications, tools, or message brokers

Classic data event publishing delivers current information to the programs and interfaces that drive your information management solution:
- Business intelligence (BI) systems
- Extract-transform-load (ETL) products
- Enterprise application integration (EAI) products
- Customer relationship management (CRM) applications
- Financial applications
- Web interfaces
- Payroll systems
- Inventory systems
Message formats
Classic data event publishing packages change data as self-describing relational messages in XML or delimited format. You choose the message format that balances your requirements for performance and information. Delimited formats, which use specified characters as separators between fields, yield the highest performance by reducing message size and workload. XML formats provide transactional or row-level information for ease of integration.
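The trade-off between the two formats can be seen by packaging the same change record both ways. The field names, the comma delimiter, and the XML element layout below are assumptions for illustration, not the product's actual message schema.

```python
import xml.etree.ElementTree as ET

# One change record, packaged two ways. Names and delimiter are illustrative.
record = {"EMPNO": "000010", "LASTNAME": "HAAS", "SALARY": "52750.00"}

# Delimited: just the values, separated by a chosen character.
# Smallest message, least self-description.
delimited = ",".join(record.values())

# XML: each column travels with its name, so any consumer can interpret
# the message without out-of-band metadata, at the cost of message size.
root = ET.Element("row", table="EMP", operation="UPDATE")
for name, value in record.items():
    ET.SubElement(root, name).text = value
xml_msg = ET.tostring(root, encoding="unicode")

print(delimited)
print(xml_msg)
print(len(delimited), len(xml_msg))  # the delimited form is much smaller
```

Running the sketch makes the documented trade-off concrete: the delimited message carries the same values in a fraction of the bytes, while the XML message is self-describing.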
Classic replication
Classic replication maintains a copy of your mainframe data by using a subscription paradigm that pulls data changes to the copy as they occur. Classic replication moves mainframe data to either relational or non-relational copies with near-real-time latency. Use Classic replication in failover or load-balancing scenarios, in data warehousing solutions, or to maintain data consistency across networks. After equalizing the source and target data stores with an initial load or refresh, you subscribe to data changes to maintain synchronization.

Classic replication delivers mainframe data to a full spectrum of industry-standard database management or file systems:
- DB2 servers
- DB2 federated servers that deliver data to the following databases:
  - Oracle
  - SQL Server
  - Informix
  - Sybase

If you replicate your data to relational databases, you need IBM InfoSphere Replication Server to write the changes to DB2. Classic replication uses the Q Apply program in IBM InfoSphere Replication Server to update the target server. WebSphere MQ, included with IBM InfoSphere Classic Replication Server for z/OS, transports the change messages to Q Apply.

Classic replication also supports replication from VSAM files to mirror copies on the mainframe. VSAM-to-VSAM replication differs from replication to relational targets in that the data server writes changes directly to a copy of the data store or hands off changes to a CICS program that writes them.

Parent topic: A closer look at Classic capabilities and features

This topic is also in the IBM InfoSphere Information Server Introduction.
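The subscription model described above amounts to an apply loop on the target side: pull each staged change message and apply it to the copy. The following toy loop uses an in-memory deque as a stand-in for the WebSphere MQ queue; the message shape and keys are illustrative, not the Q Apply protocol.

```python
from collections import deque

# Stand-in for the MQ queue that stages change messages (illustrative shape).
queue = deque([
    {"op": "INSERT", "key": "A100", "row": {"key": "A100", "qty": 5}},
    {"op": "UPDATE", "key": "A100", "row": {"key": "A100", "qty": 8}},
    {"op": "DELETE", "key": "A100"},
    {"op": "INSERT", "key": "B200", "row": {"key": "B200", "qty": 1}},
])

target_copy = {}  # keyed copy of the source data store, already equalized

while queue:                       # pull changes as they arrive
    msg = queue.popleft()
    if msg["op"] == "DELETE":
        target_copy.pop(msg["key"], None)
    else:                          # INSERT and UPDATE both upsert the row image
        target_copy[msg["key"]] = msg["row"]

print(target_copy)
```

Because changes are applied in source order, the copy converges on the source's state after each pulled batch, which is the synchronization guarantee the subscription maintains.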
Classic integration with IBM information management products

InfoSphere DataStage uses Classic federation to extract, transform, and repackage z/OS data as you need it before delivering it to one or more target databases. In addition to the flexible model that supports SQL-based access, InfoSphere Classic products provide a unique interface for InfoSphere DataStage that optimizes data transformation flow while minimizing the use of z/OS resources. This special interface shifts the reformatting of the z/OS data from the Classic federated server that runs on z/OS to the InfoSphere Information Server engine on Linux, UNIX, or Microsoft Windows. As with the standard interface, InfoSphere DataStage uses Classic metadata to ensure consistency and reusability.

Other examples of how Classic federation provides access to IBM information management products are as follows:
- InfoSphere QualityStage matches, merges, and standardizes your z/OS data.
- InfoSphere Information Analyzer analyzes and reports on the content of your z/OS data sources. This product helps you to discover patterns, models, and anomalies within the databases or files.
- Optim Test Data Management accesses your z/OS data, masks sensitive and private information within the data while creating a test database, and improves the quality and speed of iterative testing tasks.
- Optim Data Growth reads your z/OS data and repackages that data into a compact format that you can use to archive historical transaction records from mission-critical applications.
- Cognos Business Intelligence reads your z/OS data for business intelligence and analytical processing.