
Data Warehouse Test Effectiveness

It's All about the Planning!


Assuring Data Warehouse Content, Structure and Quality

Wayne Yaddow, 2013

Agenda
- Challenges of DWH testing
- Planning for DWH tests
- Tester skills for DWH testing
- Basic ETL verifications
- Defects you can expect to find
- Testing tools identified


DWH -- Definition
A data warehouse or enterprise data warehouse (DW, DWH, or EDW) is a database used for reporting and data analysis. It is a central repository of data which is created by integrating data from one or more disparate sources. Data warehouses store current as well as historical data and are used for creating trending reports for senior management reporting such as annual and quarterly comparisons.
Source: Wikipedia.org, 2013

Got DWH Quality Issues?

Reprinted with permission from Informatica Corp., 2013


DWH Typical Structure


Sources and DWH Targets


The Data Testing Process


Plan QA for typical DWH phases


Data Model Example


Source to Target Mapping Example


Plan QA for DWH Lifecycle


Primary goals for verification

- Data completeness (see the sample queries after this list)
- Data transformations
- Data quality
- Performance and scalability
- Integration testing
- User-acceptance testing
- Regression testing
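
A minimal completeness check needs nothing more than SQL once source and target are reachable from the same environment. This sketch assumes hypothetical table and column names (src_customer, stg_customer, customer_id); adapt it to the actual mapping.

-- Row-count comparison: totals should match unless rows were intentionally rejected
SELECT (SELECT COUNT(*) FROM src_customer) AS source_rows,
       (SELECT COUNT(*) FROM stg_customer) AS target_rows
FROM dual;   -- dual is Oracle; other databases allow SELECT without FROM

-- Rows present in the source but missing from the target
SELECT customer_id, customer_name FROM src_customer
MINUS                              -- Oracle; use EXCEPT on SQL Server / DB2
SELECT customer_id, customer_name FROM stg_customer;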

Challenges for DWH Testers (1)


1. Often inadequate ETL design documents
2. Source table field values unexpectedly null
3. Excessive ETL errors discovered after entry to QA
4. Source data does not meet table mapping specs (e.g., dirty data)
5. Source to target mappings:
   - Often not reviewed by all stakeholders
   - Not consistently maintained through the dev lifecycle
   - Therefore, in error

Challenges for DWH Testers (2)


6. Data models not maintained
7. Target data does not meet mapping specifications
8. Duplicate field values when defined to be DISTINCT
9. ETL SQL errors that lead to missing rows and invalid field values
10. Constraint violations in source data
11. Table keys are incorrect for important RDB linkages

Challenges for DWH Testers (3)


12. Huge source data volumes and many data types
13. Source data quality that must be profiled before loading to the DWH
14. Redundant, duplicate source data
15. Many source data records to be rejected
16. ETL logs with messages to be acted upon
17. Source field values may be missing where they should always be present


Challenges for DWH Testers (4)


19. SMEs and business rules may not be available
20. Because data ETLs must often pass through multiple phases, transaction-level traceability is difficult to attain in a data warehouse
21. The data warehouse will be a strategic enterprise resource that is heavily relied upon


Planning the DWH QA Strategy


Carefully review:
- Requirements documentation
- Data models for source and target schemas
- Source to target mappings
- ETL / stored procedure design and logic
- QA deployment tasks / steps
- Required QA tools


Planning for DWH QA (1)


Data integration planning (data model, LLDs)
1. Gain an understanding of the data to be reported by the application (e.g., through profiling) and the tables on which each user report will be based
2. Review and understand the data model; gain an understanding of keys and flows from source to target
3. Review and understand the data LLDs and mappings: add and update sequences for all sources of each target table


Planning for DWH QA (2)


ETL planning and testing (source inputs and ETL design)
1. Participate in ETL design reviews
2. Gain in-depth knowledge of ETL sessions, their order of execution, constraints, and transformations
3. Participate in development ETL test case reviews
4. After ETLs are run, use checklists for QA assessments of rejects, session failures, and errors


Planning for DWH QA (3)


Assess ETL logs: session, workflow, errors
1. Review ETL workflow outputs and source to target counts
2. Verify source to target mapping documents against loaded tables using TOAD and other tools
3. After ETL runs or manual data loads, assess data in every table with a focus on key fields (dirty data, incorrect formats, duplicates, etc.), using TOAD and Excel tools (SQL queries, filtering, etc.); see the sample queries below
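
The post-load assessment in step 3 can be scripted directly in TOAD or any SQL editor. A minimal sketch, assuming a hypothetical dim_customer table with a customer_key column and a text date column effective_date_txt:

-- Duplicate key values loaded into a dimension table
SELECT customer_key, COUNT(*) AS occurrences
FROM dim_customer
GROUP BY customer_key
HAVING COUNT(*) > 1;

-- Key fields that arrived null or in an unexpected format (Oracle REGEXP_LIKE shown)
SELECT *
FROM dim_customer
WHERE customer_key IS NULL
   OR NOT REGEXP_LIKE(effective_date_txt, '^\d{4}-\d{2}-\d{2}$');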

Planning for DWH QA (4)


GUI and report validations

1. Compare report data with target data
2. Verify that reporting meets user expectations

Analytics test team data validation
1. Test data as it is integrated into the application
2. Provide tools and tests for data validation


Valuable Books


Plan for QA Methodology & Tools


Data Profiling
Column / attribute / field profiling provides statistical measurements associated with:
- Frequency distribution of data values
- Number of records
- Number of null (i.e., blank) values
- Data types (e.g., integers, characters)
- Field length
- Unique values
- Patterns in the data
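
When a profiling tool is not available, most of these measurements can be gathered with one aggregate query per column. A sketch assuming a hypothetical dim_customer table and email column:

-- Basic profile for one attribute
SELECT COUNT(*)                AS total_rows,
       COUNT(email)            AS non_null_values,
       COUNT(*) - COUNT(email) AS null_values,
       COUNT(DISTINCT email)   AS unique_values,
       MIN(LENGTH(email))      AS min_length,
       MAX(LENGTH(email))      AS max_length
FROM dim_customer;

-- Frequency distribution of values
SELECT email, COUNT(*) AS frequency
FROM dim_customer
GROUP BY email
ORDER BY frequency DESC;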

Identify QA skills (1)


- Understanding of fundamental DWH and DB concepts
- High skill with SQL queries and stored procedures
- Understanding of the data used by the business
- Data profiling
- Developing strategies, test plans, and test cases specific to the DWH and the business
- Creating effective ETL test cases / scenarios based on loading technology and business requirements


Identify QA skills (2)


- Understanding of data models, data mapping documents, ETL design, and ETL coding; ability to provide feedback to designers and developers
- Experience with Oracle, SQL Server, Sybase, and DB2 technology
- Informatica session troubleshooting
- Deploying DB code to databases
- Unix scripting, Autosys, Anthill, etc.
- SQL editors
- Use of Excel and MS Access for data analysis

Valuable Book


Basic ETL Verifications (1)


Verify data mappings, source to target:
- Verify that all table fields were loaded from source to staging
- Verify that keys were properly generated using the sequence generator
- Verify that not-null fields were populated
- Verify no data truncation in each field
- Verify that data types and formats are as specified in the design phase
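
Several of these verifications reduce to short queries against the staging schema. The table and column names below (src_orders, stg_orders, order_sk, ship_address) are assumptions for illustration:

-- Surrogate keys produced by the sequence generator should be unique
SELECT order_sk, COUNT(*) AS occurrences
FROM stg_orders
GROUP BY order_sk
HAVING COUNT(*) > 1;

-- Mandatory (not-null) fields that arrived empty
SELECT COUNT(*) AS null_violations
FROM stg_orders
WHERE order_date IS NULL OR customer_id IS NULL;

-- Possible truncation: staged value shorter than its source value
SELECT s.order_id
FROM src_orders s
JOIN stg_orders t ON t.order_id = s.order_id
WHERE LENGTH(t.ship_address) < LENGTH(s.ship_address);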

Basic ETL Verifications (2)


- Verify no duplicate records in target tables
- Verify transformations based on the data low-level designs (LLDs)
- Verify that numeric fields are populated with correct precision
- Verify that every ETL session completed with only planned exceptions
- Verify all cleansing, transformation, error, and exception handling
- Verify PL/SQL calculations and data mappings
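
Duplicate and precision checks can likewise be written as plain queries; fact_sales and its columns below are hypothetical:

-- Full-row duplicates in a target fact table
SELECT order_id, product_id, order_date, COUNT(*) AS dupes
FROM fact_sales
GROUP BY order_id, product_id, order_date
HAVING COUNT(*) > 1;

-- Precision check: monetary amounts should carry at most two decimal places
SELECT order_id, sale_amount
FROM fact_sales
WHERE sale_amount <> ROUND(sale_amount, 2);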

Examples of DWH Defects (1)


1. Inadequate ETL and stored procedure design documents
2. Field values are null when specified as NOT NULL
3. Field constraints and SQL not coded correctly for Informatica ETL
4. Excessive ETL errors discovered after entry to QA
5. Source data does not meet table mapping specifications (e.g., dirty data)
6. Source to target mappings: 1) often not reviewed, 2) in error, and 3) not consistently maintained through the dev lifecycle

Examples of DWH Defects (2)


7. Data models are not adequately maintained during the development lifecycle
8. Target data does not meet mapping specifications
9. Duplicate field values when defined to be DISTINCT
10. ETL SQL / transformation errors leading to missing rows and invalid field values
11. Constraint violations in source data
12. Target data is incorrectly stored in nonstandard formats
13. Table keys are incorrect for important relationship linkages


Verifying Data Loads


From RTTS


DQ tools / techniques for QA team


TOAD / SQL Navigator
- Data profiling for value range and boundary analysis
- Null field analysis
- Row counting
- Data type analysis
- Referential integrity analysis (see the sample query after this list)
- Distinct value analysis by field
- Duplicate data analysis (fields and rows)
- Cardinality analysis
- Stored procedure and package verification

Excel
- Data filtering for profile analysis
- Data value sampling
- Data type analysis

MS Access
- Table and data analysis across schemas

Testing automation
- Informatica's Data Validation Option (DVO)
- RTTS QuerySurge

Analytics tools
- J: statistics, visualization, data manipulation
- Perl: data manipulation, scripting
- R: statistics
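
As one example of the referential integrity analysis listed above, an orphan-key query finds fact rows that point at missing dimension rows. The table and column names are placeholders:

-- Fact rows whose customer key has no matching dimension row (orphans)
SELECT f.customer_key, COUNT(*) AS orphan_rows
FROM fact_sales f
LEFT JOIN dim_customer d ON d.customer_key = f.customer_key
WHERE d.customer_key IS NULL
GROUP BY f.customer_key;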


Bottom Line Recommendations


- Involve the test team in the entire DWH SDLC
- Profile source and target data
- Remember: DWH QA is much more than source and target record counts
- Develop testers' SQL and DWH structure skills
- Assure availability of source to target mapping documents
- Plan for regression and automated testing


Planning Dev/Unit Tests


Unit testing checklist
Some programmers are not well trained as testers. They may like to program, deploy the code, and move on to the next development task without a thorough unit test. A checklist helps database programmers systematically test their code before formal QA testing:
- Check the mapping of fields that support data staging and in data marts.
- Check for duplication of values generated using sequence generators.
- Check the correctness of surrogate keys that uniquely identify rows of data.
- Check for data-type constraints of the fields present in staging and core levels.
- Check the data loading status and error messages after ETLs (extracts, transformations, loads).
- Look for string columns that are incorrectly left- or right-trimmed (see the sample query below).
- Make sure all tables and specified fields were loaded from source to staging.
- Verify that not-null fields were populated.
- Verify that no data truncation occurred in each field.
- Make sure data types and formats are as specified during database design.
- Make sure there are no duplicate records in target tables.
- Make sure data transformations are correctly based on business rules.
- Verify that numeric fields are populated with correct precision.
- Make sure every ETL session completed with only planned exceptions.
- Verify all data cleansing, transformation, and error and exception handling.
- Verify stored procedure calculations and data mappings.
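
The trimming check in particular is easy to miss in unit tests; a quick query such as this sketch (table and column names assumed) surfaces values with stray leading or trailing spaces:

-- String values that were not correctly left- or right-trimmed
SELECT customer_id, '[' || customer_name || ']' AS padded_value
FROM stg_customer
WHERE customer_name <> TRIM(customer_name);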

Planning for Performance Tests


As the volume of data in the warehouse grows, ETL execution times can be expected to increase and query performance often degrades. These changes can be mitigated by a solid technical architecture and efficient ETL design. The aim of performance testing is to point out potential weaknesses in the ETL design, such as reading a file multiple times or creating unnecessary intermediate files. A performance and scalability testing checklist helps discover performance issues:
- Load the database with peak expected production volumes to help ensure that the volume of data can be loaded by the ETL process within the agreed-on window.
- Compare ETL loading times to loads performed with a smaller amount of data to anticipate scalability issues.
- Compare the ETL processing times component by component to pinpoint any areas of weakness.
- Monitor the timing of the reject process and consider how large volumes of rejected data will be handled.
- Perform simple and multiple-join queries to validate query performance on large database volumes (a sample query follows).
- Work with business users to develop sample queries and acceptable performance criteria for each query.
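
A representative multi-join query, run against peak-volume data with the SQL client's timing enabled (for example, SET TIMING ON in SQL*Plus), gives a simple performance baseline. The star-schema names below are hypothetical:

-- Sample multi-join query used to baseline query performance at production volumes
SELECT d.calendar_year, c.customer_segment, SUM(f.sale_amount) AS total_sales
FROM fact_sales f
JOIN dim_date d ON d.date_key = f.date_key
JOIN dim_customer c ON c.customer_key = f.customer_key
GROUP BY d.calendar_year, c.customer_segment
ORDER BY d.calendar_year, c.customer_segment;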


Recommendations for data verifications


Detailed Recommendations for Data Development and QA

1. Need analysis of (a) source data quality and (b) data field profiles before input to Informatica and other data-build services.
2. QA should participate in all data model and data mapping reviews.
3. Need complete review of ETL error logs and resolution of errors by ETL teams before DB turn-over to QA.
4. Early use of QC during ETL and stored procedure testing to target vulnerable process areas.
5. Substantially improved documentation of PL/SQL stored procedures.
6. QA needs a dev or separate environment for early data testing. QA should be able to modify data in order to perform negative tests. (QA currently does only positive tests because the application and database tests work in parallel in the same environment.)
7. Need substantially enhanced verification of target tables after each ETL load before data turn-over to QA.
8. Need mandatory maintenance of data models and source to target mapping / transformation rules documents from elaboration until transition.
9. Investments in more Informatica and off-the-shelf data quality analysis tools for pre- and post-ETL.
10. Investments in automated DB regression test tools and training to support frequent data loads.

Plan QA for All DWH Dev. Phases


Plan methods & tools for testing

