
Structured Approach to Business Intelligence (B.I.) Applications Testing


Copyright Notice: All content of this document is copyrighted by Thinksoft Global Services. All rights reserved. No portion of the content may be directly or indirectly copied, published, reproduced, modified, performed, displayed, sold, transmitted, broadcast, rewritten for broadcast or publication, or redistributed in any medium without express permission from Thinksoft Global Services. Readers of this document are deemed to have read and accepted the conditions of this copyright notice.


1. Executive Summary
The client plans a Data Warehouse (DWH) Program to significantly overhaul and upgrade the scope and depth of its current reporting and data mining applications. The objective is to equip a much larger user community, whose information needs have multiplied, with the tools and capabilities for extracting customer-centric knowledge from a comprehensive data warehouse. Two key aspects of the project are: a) re-designing the data model/schema to integrate data from all channels and sources and provide customer-focused data, and b) putting in place and executing a robust and efficient test strategy to avoid costly production-time errors.

This document consists of two sections: Section 2 lists the specific capabilities that qualify Thinksoft as the organization best suited to perform the validation and testing for the DWH initiative; Section 3 sets out, in brief, Thinksoft's initial draft approach to this assignment.


2. Thinksoft has the Critical Ingredients


2.1 Track Record in similar Testing projects
- Customer Centric One View Data Warehousing, Morgan Stanley, UK
- MIS Infoedge, Morgan Stanley, UK
- Securitization, Morgan Stanley, UK
- Decision Scope, First Data Solutions, USA
- Mosaic, Deutsche Bank, UK
- ProcFee for GMAC, UK
- Testing of complex reports in various projects

2.2 Projects involving testing complex business rules


Thinksoft also has a track record of validating complex business rules and logic for the following products:

- Probe Strategy Manager (Experian)
- Kondor+ (Reuters)
- Urbis (Unisys)
- Corporate GL (Finance One)
- CS Eximbills - Trade Finance (China Systems)
- Flexcube Funds Transfer (iFlex Solutions)
- Straight Through Services-CWS (Standard Chartered Bank)


2.3 Domain focused model that assures coverage


Our test approach, based on knowledge of the domains listed below, ensures a clear understanding of requirements, complete coverage and a reduced knowledge-transfer time:

- Retail banking
- Credit cards
- Investment banking
- Private banking
- Lending systems
- Insurance
- Loan products
- Mutual funds
- Brokerages
- Payment systems

2.4 Capabilities for handling Data Intensive testing


Our capabilities in testing data-intensive applications include:

- Procedures and tools
- Data mapping
- Generation of voluminous data subject to business rules
- Simulation of real-time feed files and message structures


3. A Structured Approach to Business Intelligence (B.I.) Applications Testing


3.1 Objectives
To create a strategic test approach that optimizes the overall test effort while ensuring adequate test and functional coverage through:

- Built-in domain-based validation techniques
- Creation of stage-wise repeatable test packs
- Judicious selection of representative data samples
- Usage of tools at the appropriate stages
- Adoption of a test framework amenable to automation

3.2 Key challenges


Each challenge is set out below with the issues it raises and the corresponding Thinksoft solution.

Challenge: Data Volume
Issues: Only selective data can be covered in testing.
Thinksoft Solution: Sampling technique; validate a specific data set selected on agreed criteria (a sampling sketch follows this table).

Challenge: Data Integrity
Issues: Aggregation, de-duplication, cleansing and mapping.
Thinksoft Solution: Data Mapping document; extensive coverage of business rules and logic in data simulation.

Challenge: Consolidation of Resultant Data Values
Issues: Different data snapshots; data in different sources is updated or cycled at different intervals (e.g. daily, weekly, monthly).
Thinksoft Solution: Effective planning of the incremental feeds from the different sources; optimize the data feeds from each source based on requirements.

Challenge: History of Data
Issues: Deciding on the number of years of data required for testing.
Thinksoft Solution: Decision taken in the Test Planning phase after reviewing the reporting requirements of the business users; concurrence obtained from the client.

Challenge: Test Data Simulation
Issues: Data coverage (business rules and logic); adequacy of data (to validate the reporting requirements).
Thinksoft Solution: Data Mapping document; Data Guidelines document; automated data generation.

Challenge: Complexity of Data Transformation
Issues: Changes in data sources; updates from multiple sources to multiple data elements in the star schema; complex logic and calculations during the transformation stage.
Thinksoft Solution: Expected data values derived from the details in the Data Mapping and Data Guidelines documents.

Challenge: End-to-End Testing
Issues: Incorrect data mapping in the ETL procedures would impact the reports.
Thinksoft Solution: Verify data elements and values from source to data warehouse to reports, for select data elements within a specific data set.

Challenge: Testing OLAP Analytics
Issues: Complex calculations and statistics.
Thinksoft Solution: Use of Excel macros and lookup features.

Challenge: Flexible Front-End Tools
Issues: Only critical and commonly used features can be tested.
Thinksoft Solution: Defined scope; ensure 100% coverage of the scope identified.
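The sampling technique for the data-volume challenge could, for example, look like the sketch below. It is a minimal illustration only; the feed layout, the field names ("segment", "account_type") and the selection criteria are assumptions, and the actual criteria would be agreed during test planning.

```python
# Illustrative criteria-based (stratified) sampling of a source feed.
# Field names and criteria are hypothetical, not the client's data model.
import csv
import random
from collections import defaultdict

def stratified_sample(feed_path, strata_fields, per_stratum=10, seed=42):
    """Group source records by the chosen criteria and pick a fixed number
    of records from each group, so every data category is represented."""
    random.seed(seed)
    groups = defaultdict(list)
    with open(feed_path, newline="") as fh:
        for rec in csv.DictReader(fh):
            key = tuple(rec[f] for f in strata_fields)
            groups[key].append(rec)
    sample = []
    for records in groups.values():
        sample.extend(random.sample(records, min(per_stratum, len(records))))
    return sample

# Example usage (hypothetical feed and criteria):
# subset = stratified_sample("customer_feed.csv", ["segment", "account_type"])
```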

3.3 Types of Testing envisaged


- ETL Testing
- Functional Testing
- Performance Testing


3.4 ETL Testing


3.4.1. Scope
1. Ensure that the extraction routines correctly access all the required source data.
2. Ensure that all computations involving business rules are correct and that the correct values are inserted into the target DWH tables by the transformation routines.
3. Apply database techniques such as (a reconciliation sketch follows this list):
   - Checking row counts
   - Verification of summary tables
   - Reviewing the exception logs from the ETL process for:
     o Data type
     o Size
     o Number of rows loaded
     o Number of rows rejected
4. Validations relating to primary keys, surrogate keys, etc.
5. Verification of the consolidation of data obtained from multiple sources at the customer level, i.e. verification of the de-duplication process.
6. Verification of error logs for the following cases:
   - Missing feed files
   - Duplicate upload of the same file
   - Header/footer mismatch
7. Verification of the rollback procedures adopted during the upload/extraction of data files.
8. Verification of the audit trail generated for each upload/extraction of source data.
9. Processing of rejected records as per specifications.
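As a hedged illustration of the row-count checks in point 3, the sketch below reconciles the number of records in a source feed against the rows loaded into a target table plus the rows written to a reject log. The table names, the "batch_id" column and the "?" parameter style are assumptions; any DB-API compatible connection (with its own parameter style) could be substituted.

```python
# Minimal row-count reconciliation sketch; names and parameter style are illustrative.
import csv

def count_feed_rows(feed_path, has_header=True):
    """Count the data rows in a delimited source feed file."""
    with open(feed_path, newline="") as fh:
        total = sum(1 for _ in csv.reader(fh))
    return total - 1 if has_header else total

def reconcile_counts(cursor, feed_path, target_table, reject_table, batch_id):
    """Check: rows in feed == rows loaded into target + rows rejected."""
    feed_rows = count_feed_rows(feed_path)
    cursor.execute(f"SELECT COUNT(*) FROM {target_table} WHERE batch_id = ?", (batch_id,))
    loaded = cursor.fetchone()[0]
    cursor.execute(f"SELECT COUNT(*) FROM {reject_table} WHERE batch_id = ?", (batch_id,))
    rejected = cursor.fetchone()[0]
    return {
        "feed_rows": feed_rows,
        "loaded": loaded,
        "rejected": rejected,
        "balanced": feed_rows == loaded + rejected,
    }
```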

3.4.2. Approach
1. Review the various data sources and their modes of transmission to the data warehouse, including feed file formats and layouts.
2. Review the data elements in the logical model (star schema), including primary and surrogate keys.
3. Review the mapping of the data elements from source to target. This would be clearly documented in a Data Mapping document and sign-off obtained.
4. Plan for specific source data values to emulate real-time data.
5. Unit, system and integration testing should be complete and the results verified by the client before the commencement of ETL testing by Thinksoft.
6. Thinksoft test data planning would include deciding on the adequacy of data, the requirement for historical data, etc., depending on the reporting requirements. This would be documented in the Data Guidelines document.
7. Generate data for the source feed files using Thinksoft's proprietary tool, Data Builder.
8. Define the batch upload and extraction requirements, based on the batch run requirements, intervals and incremental processing for each data source.
9. Populate the target tables by having the ETL programs extract the relevant data from the source files.
10. Randomly select rows from the source and compare them with the data stored in the data warehouse.
11. Execute simple SQL queries to extract rows from the data warehouse based on predefined conditions and compare them against the source values (see the sketch following this list).
12. Verify the data transformations from source to the data warehouse using the details from the Data Mapping and Data Guidelines documents.
13. If no testing or validation is possible for certain data elements (for example, due to difficulties in testing), the reason and the impact of not testing them would be documented.
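Points 10 to 12 could be sketched as below. The feed layout, key column and source-to-target mapping shown here are placeholders; in practice they would come from the Data Mapping document, and the query parameter style would follow the warehouse's database driver.

```python
# Illustrative source-to-target spot check; all names are assumptions.
import csv
import random

def sample_source_records(feed_path, key_field, sample_size=25):
    """Pick a random sample of records from a delimited source feed."""
    with open(feed_path, newline="") as fh:
        records = list(csv.DictReader(fh))
    return random.sample(records, min(sample_size, len(records)))

def compare_with_warehouse(records, cursor, table, key_column, mapping, key_field):
    """mapping: {source_field: target_column}; returns a list of mismatches."""
    mismatches = []
    for rec in records:
        cols = ", ".join(mapping.values())
        cursor.execute(
            f"SELECT {cols} FROM {table} WHERE {key_column} = ?", (rec[key_field],)
        )
        row = cursor.fetchone()
        if row is None:
            mismatches.append((rec[key_field], "missing in warehouse"))
            continue
        for (src_field, _), dwh_value in zip(mapping.items(), row):
            if str(rec[src_field]).strip() != str(dwh_value).strip():
                mismatches.append((rec[key_field], src_field, rec[src_field], dwh_value))
    return mismatches
```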

3.5 Functional Testing


3.5.1. Scope
1. Ensure that the data from the DWH schema (star schema) is correctly reflected in the reports.
2. Verification of the querying features in the OLAP/reporting tool.
3. Verification of data items involving computations, and of the units in which the data items are displayed (a worked sketch follows this list).
4. Verification of data items that are not directly available from the data warehouse schema.
5. End-to-end testing involving verification of data elements from the source file through to their representation in the reports.
6. Verification of reports for the following:
   - Functionality
   - Layout
   - Header
   - Footer
   - Business rules
7. Cosmetic checking of display, fonts, colour and format.
8. Verification of graphs, as applicable.
9. Excel will be used for report extraction and verification.
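A hedged sketch of point 3 follows: a derived report figure is recomputed from the warehouse and compared with the value shown in a report extract. It assumes the report can be exported to CSV and that balances are displayed in thousands; the table, column and file names are illustrative only.

```python
# Illustrative check of a computed report figure (displayed in '000s)
# against the warehouse; all names are hypothetical.
import csv

def report_value(report_csv, row_label, column):
    """Read one displayed value from a report extract."""
    with open(report_csv, newline="") as fh:
        for row in csv.DictReader(fh):
            if row["label"] == row_label:
                return float(row[column].replace(",", ""))
    raise LookupError(f"{row_label} not found in {report_csv}")

def expected_balance_in_thousands(cursor, segment):
    """Recompute the expected figure from the warehouse."""
    cursor.execute(
        "SELECT SUM(balance) FROM fact_account WHERE customer_segment = ?", (segment,)
    )
    total = cursor.fetchone()[0] or 0
    return round(total / 1000.0, 1)

# displayed = report_value("segment_summary.csv", "Retail", "balance_000s")
# assert abs(displayed - expected_balance_in_thousands(cur, "Retail")) < 0.05
```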

3.5.2. Approach
1. Review the user requirements and reporting needs.
2. Define the scope for functional testing by identifying the business-critical and commonly used reports and features. This needs to be signed off by the client.
3. Review the mapping of the data elements in the source to the fields/columns available in the reports.
4. Review the data required to complete these reports.
5. The data required for testing reports would be generated and uploaded into the test data warehouse as part of ETL testing.
6. Execute SQL queries/views to obtain the resultant data from the back end for the verification of reports.
7. Functional testing of reports would commence only once the data warehouse is fully loaded with sufficient data for the testing needs and the daily load procedures are operational.
8. Report verification would be based on thorough checking of a select few rows.
9. End-to-end checking would be done for select data elements within a specific data set; this would be documented and tracked separately (see the end-to-end trace sketch after this list).
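The end-to-end check in point 9 could, under assumptions about the feed layout, warehouse schema and report extract, look like the sketch below: one data element is traced from the source feed through the warehouse to the report for a specific key. The file, table and column names are placeholders.

```python
# Illustrative end-to-end trace of a single data element
# (source feed -> data warehouse -> report extract) for one customer key.
import csv

def value_from_csv(path, key_field, key, field):
    """Look up one field for one key in a delimited file (feed or report extract)."""
    with open(path, newline="") as fh:
        for rec in csv.DictReader(fh):
            if rec[key_field] == key:
                return rec[field]
    return None

def value_from_warehouse(cursor, key):
    cursor.execute(
        "SELECT credit_limit FROM dim_customer WHERE customer_id = ?", (key,)
    )
    row = cursor.fetchone()
    return row[0] if row else None

def trace(feed_path, cursor, report_csv, key):
    """Return the three observed values so a tester can confirm they agree."""
    return {
        "source": value_from_csv(feed_path, "CUST_ID", key, "CREDIT_LIMIT"),
        "warehouse": value_from_warehouse(cursor, key),
        "report": value_from_csv(report_csv, "customer_id", key, "credit_limit"),
    }
```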

3.6 Performance Testing


3.6.1. Scope
1. The performance testing scope would include:
   - Volume testing for feed file uploads from the source systems
   - Multiple-user simulation
   - Refresh time for standard and complex reports

2. Frequently used business scenarios will be tested for performance.

3.6.2. Approach
1. The expected performance benchmarks will be obtained from the client to define the performance testing goals.
2. Performance testing is recommended at two stages during the testing project:
   - Stage I: after completion of functional testing, using the test bed, to assess whether the system meets the minimum performance expectations at that level of data volumes.
   - Stage II: repeated in the production environment before roll-out, with production data subsets, to assess whether the system meets the expected performance criteria with increased data volumes.
3. In each of these stages, performance testing would cover:
   - The upload of feed files at the ETL end
   - Response times for front-end user queries/reports

Performance at the ETL end:
  o Increase the volume of data uploaded to the ETL process; this voluminous data would be generated using an automated data generation tool, Thinksoft's Data Builder.
  o Verify the successful upload of the data through sample verification.
  o Verify the exception logs from the ETL process.
  o Compare the actual upload time against the benchmark advised by the client (a timing sketch follows this list).
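The upload-time comparison could be sketched as below, assuming the ETL load can be invoked from a command line and that the benchmark time is supplied by the client; the command shown is a placeholder, not an actual tool invocation.

```python
# Illustrative timing of a feed-file upload against a client-supplied benchmark.
import subprocess
import time

def time_upload(etl_command, benchmark_seconds):
    """Run the ETL load for a feed file and compare the elapsed time to the benchmark."""
    start = time.perf_counter()
    subprocess.run(etl_command, check=True)  # placeholder ETL invocation
    elapsed = time.perf_counter() - start
    return {
        "elapsed_seconds": round(elapsed, 1),
        "benchmark_seconds": benchmark_seconds,
        "within_benchmark": elapsed <= benchmark_seconds,
    }

# outcome = time_upload(["run_etl.sh", "--feed", "transactions.dat"], 1800)
```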

Performance at the user front end: an appropriate performance measurement tool, compatible with the data warehouse front-end applications, would be identified for performance testing.
  o Identify the critical and commonly used business reports.
  o Plan and upload the data for the reports identified.
  o Simulate multiple, concurrent users (see the simulation sketch after this list).
  o Measure CPU usage, memory usage, SQL statistics and response times at different user levels and loads, for standard and complex reports.
  o Randomly verify the outputs.
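The multi-user simulation and response-time measurement could be sketched as below, assuming report queries can be issued directly against the warehouse through a DB-API connection factory; in practice a dedicated performance measurement tool would take this role.

```python
# Illustrative concurrent-user simulation measuring query response times.
# connect_fn must return a new DB-API connection; the SQL stands in for a
# report's backing query.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def run_report(connect_fn, sql):
    """Execute one report query on its own connection and time it."""
    start = time.perf_counter()
    conn = connect_fn()
    try:
        cur = conn.cursor()
        cur.execute(sql)
        cur.fetchall()
    finally:
        conn.close()
    return time.perf_counter() - start

def simulate_users(connect_fn, sql, users=10):
    """Fire the same report query from several concurrent 'users'."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        timings = list(pool.map(lambda _: run_report(connect_fn, sql), range(users)))
    return {
        "users": users,
        "avg_seconds": round(statistics.mean(timings), 2),
        "max_seconds": round(max(timings), 2),
    }
```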

3.7 Test Environment and Access


1. The logistics of testing will be discussed and mutually agreed after acceptance of this approach paper.
2. Access to SAS / Business Objects / other front-end reports, the back-end database (SQL) and the Hyperion interfaces would be provided to Thinksoft.
