Data Migration Strategy for AFP Reengineering Project

Version 1.0

ABOUT THIS DOCUMENT

Purpose
The purpose of this document is to lay out the structure of data migration for an application
reengineering project.

Intended Audience
This document is primarily for the use of consultants associated with data migration projects.

Glossary
TCS Tata Consultancy Services


Contents
1 INTRODUCTION.........................................................................................................4
1.1 Background................................................................................................................................ 4
1.2 Scope......................................................................................................................................... 4
1.3 Assumptions............................................................................................................................... 5
1.4 Open Items................................................................................................................................. 6
1.5 System Description..................................................................................................................... 6
1.5.1 Source System Description.................................................................................................... 6
1.5.2 Target System Description..................................................................................................... 6
2 Migration Approach...................................................................................................7
2.1 Introduction................................................................................................................................. 7
2.2 Planning..................................................................................................................................... 8
2.3 Analysis.................................................................................................................................... 10
2.3.1 Analysis of Source Inventory................................................................................................ 10
2.3.2 Source Data Analysis........................................................................................................... 11
2.3.3 Data Cleansing..................................................................................................................... 12
2.3.4 Extraction programs............................................................................................................. 12
2.3.5 Analysis of Target Database................................................................................................ 12
2.4 Strategy definition..................................................................................................................... 14
2.4.1 Proof of concept................................................................................................................... 14
2.5 Design...................................................................................................................................... 15
2.5.1 Mapping rules....................................................................................................................... 16
2.5.2 Data Format – Source to Text File.......................................................................................16
2.5.3 Non-key source fields becoming key fields in target.............................................................17
2.5.4 Date and time stamp / load date fields and user id..............................................................17
2.6 Construction............................................................................................................................. 17
2.6.1 Data migration approach...................................................................................................... 18
2.6.2 Source System (VSAM / DB2) to Staging database (Oracle)...............................................18
2.6.3 Staging database (Oracle) to Target database (Oracle).......................................................20
2.6.4 Cleansing............................................................................................................................. 21
2.6.5 Audit trail data, summary data.............................................................................................. 22
2.6.6 Reports................................................................................................................................. 22
2.6.7 Special Requirements.......................................................................................................... 22
2.7 Testing...................................................................................................................................... 23
2.7.1 Validation............................................................................................................................. 25
2.7.2 Audit..................................................................................................................................... 26
2.7.3 Testing Lifecycle.................................................................................................................. 26
2.8 Pre-Implementation(Dry Runs)................................................................................................. 27
2.9 Implementation......................................................................................................................... 27
2.9.1 Cutover Considerations........................................................................................................ 30
2.9.2 Change Control.................................................................................................................... 30
Scope of Change Control....................................................................................................................... 30
2.9.3 Traceability........................................................................................................................... 31
2.9.4 Backup and Recovery.......................................................................................................... 33
3 Risks.........................................................................................................................33
4 Guidelines................................................................................................................34
5 Recommendation.....................................................35
6 Responsibility Matrix..............................................36


1 INTRODUCTION

1.1 Background

ING has initiated a program to replace the existing Pension Fund Management applications running on
mainframe systems with a J2EE application. This project will replace these legacy systems with more
flexible systems built on up-to-date technological platforms and functionality.

As part of the replacement, the data from the existing mainframe applications must be moved to the
target Oracle database. ING has engaged Tata Consultancy Services (TCS) Limited to prepare the data
migration strategy document. This document details the steps necessary across the life cycle of the
data migration project that will move the legacy data into the Oracle database.

1.2 Scope
The scope of this document is to define the strategy for the various phases of data migration. The phases
in this data migration project are as follows.

 Preparation Stage

o Planning

o Analysis

o Design

o Construction

o Testing

 Implementation Stage

o Pre-Implementation/Dry Runs

o Implementation/Production data migration

This document also addresses

 Tools

 Cutover Considerations

 Proof of Concepts

 Guidelines


 Special Requirements

 Change Control and Traceability

 Challenges and Risks

 Roadmap

1.3 Assumptions

 The target data model will be developed iteration-wise and so may undergo several changes. Source
data analysis therefore has to be done against the evolving target data model. Once the target data
model is baselined, unmapped fields in the source will be analyzed further to confirm whether they can
actually be ignored.

 ING will define the strategy, analysis, design and construction scripts for data cleansing. TCS will
support and complement this.

 The production cut-over window for implementation is expected to be 48 hours over a weekend. This
could change based on the volume of records and the relationships between tables, which define the
order of migration.

 The source inventory and corresponding data are based on the assumption that the go-live date will
be on a weekend that does not fall on a month-end.

 The current strategy is to extract the data from the mainframe source using Informatica Power
Exchange and to use Informatica Power Center to transform and load the Oracle target database.

 Existing master data will not be updated during the migration window.

 Data to be migrated is frozen before the start of the migration.

 There will not be any explicit lock on the data to be migrated by any of the applications accessing
the data during the outage window.

 The current existing model is baselined and assumed to be 100% complete.

 The scope of the data migration project is to migrate only the data that will be accessed by the
target application system.

 ING will provide the list of concurrent activities during the outage window. Their impact will be
studied and the outage window size decided accordingly.


1.4 Open Items

 The need for migrating the historic and backup data on tapes that will not be accessed by
the target application, the target tables and the strategy for the same will be analyzed by ING and
then discussed and finalized. ING and TCS will jointly resolve the extra effort involved
and the impact on the plan.
 A possible solution could be a one-time migration, either through a regular interface or using
scripts, followed by incremental migration using the regular interface.
 The scope of migrating the data present on tapes that are rarely used by the application needs
to be finalized. The feasibility of the target application system accessing the same tapes needs
to be studied.
 Risk analysis, implementation details, rollback strategy and handling of exceptions are yet to be
finalized.
 The migration strategy for backup data whose layout differs is yet to be finalized.

1.5 System Description

The scope of the data migration project is to migrate the data from the existing mainframe system to
the Oracle database. The system architecture related to these systems is:

1.5.1 Source System Description

System   Operating System         Software Platform    Database
1        IBM Mainframe (OS/390)   COBOL, VSAM, CICS    DB2

1.5.2 Target System Description

System   Operating System         Software Platform    Database
1        UNIX                     Java/J2EE            Oracle


2 Migration Approach

2.1 Introduction
Data migration is the process by which data is moved from source databases to target databases.
Currently the source data is in VSAM files, flat files and DB2 tables on the mainframe. This data needs
to be moved to target databases in Oracle. The phases involved in this endeavor are as described below.

 Preparation Stage

o Planning

o Analysis

o Strategy Definition

o Design

o Construction

o Testing

 Implementation Stage

o Pre-Implementation/Dry Runs

o Implementation/Production data migration

The preparation stage will be used to develop the data migration strategy and the data migration
programs. These will be tested in a non-production environment. All the factors that influence the
implementation stage, such as business requirements, data volumes and infrastructure constraints, should
be taken into account in the preparation stage. This stage is vital to the success of any data migration
program. It will be done in seven iterations and will be synchronized with the iterations in the ING
Core AFP Project.

The actual execution of the data migration programs on the production data will be done in the
implementation stage. Implementation is planned in two phases. Preceding each implementation will be a
pre-implementation or dry run to test the data migration scripts with production data in a simulated
test environment.


2.2 Planning
All planning activities required for data migration will be done in this phase. Other activities that
will be taken up in this phase are the finalization of the source inventory, creation of standards,
strategy for data analysis, cleansing and implementation, and selection of tools.

Assumptions
 Project Plan is available

Activities

SL  Category            Task                                                          Schedule (Week-Day)
1   Planning            Conduct kick-off meeting for the phase
2   Planning            Prepare detailed plan for the strategy documentation phase
3   Planning            Prepare detailed plan for the iterations
4   Planning            Consolidate source inventory
5   Planning            Creation of standards
6   Planning            Identify and evaluate tools for data migration
7   Planning            Set up environment for next phase
    Planning            Identify candidates for Proof of Concept (POC)
14  Documentation       Document results of proof of concept (POC) for identified candidates
15  Tools               Finalize the list of tools & environment setup definitions
16  Environment         Identify development/testing environment
18  Configuration Data  Identify, document and obtain approval for the configuration and reference data requirements
26  Acceptance          Define Acceptance Criteria

Deliverables
 Updated Project Plan
 Source Inventory list
 Inventory List for POC
Tools

The tools required for the various phases of data migration have been identified during the POC; the
list is given below.

Sl  Process         Sub-process       Tools
1   Extraction      VSAM              Informatica Power Exchange
                    DB2               Informatica Power Exchange
                    File Comparison   DFSORT, COBOL
2   Transformation                    Informatica Power Center, COBOL
3   Loading                           Informatica Power Center Source Analyzer and Warehouse Designer
4   Cleansing       Pre Extraction    << ING >>
                    Extraction        << ING >>
                    Transformation    << ING / TCS >>
                    Target Database   << ING >>
5   Data Analysis                     Manual/SQL/Excel
6   Audit                             Informatica
7   Validation                        Informatica Reports
8   Reporting                         Informatica
9   Scheduling                        Informatica Power Center Workflow Manager

2.3 Analysis
Detailed analysis of the source and target databases will be carried out in this phase. Data analysis
will be carried out to understand the contents of the source data, and the findings documented. Data
cleansing requirements are documented, and criteria for extraction, audit and validation of source data
are agreed upon.


2.3.1 Analysis of Source Inventory

The VSAM files, DB2 tables and flat files (structures, data and copybook layouts) are assumed to be
baselined for inventory purposes. Since archive data migration will take place only if the archives are
in the current source format, their inventory needs to be documented.

When data is migrated from VSAM and DB2 to Oracle, the data that needs to be migrated and the data
that is left in the source (because of duplication etc.) need to be identified as part of scope analysis.

Sl  Description                               Quantity  Link for the list
1   No of VSAM files in inventory             667       List of VSAM files
2   No of DB2 tables in inventory             313       List of tables
3   No of VSAM files to be migrated
4   No of DB2 tables to be migrated
5   No of VSAM backups
6   No of DB2 backups
7   Volume of data
8   Size of DB2 database                      25GB
9   Size of VSAM database                     245GB
10  No of DB2 tables with Reference Data
11  No of VSAM files with Reference Data
12  No of DB2 tables with Transaction Data
13  No of VSAM files with Transaction Data
14  No of DB2 tables with Master Data
15  No of VSAM files with Master Data
16  No of Databases in the system


2.3.2 Source Data Analysis

Data analysis for all the source entities needs to be documented. This will be done iteration-wise,
based on the evolving target data model. ING will provide the field descriptions, ranges and
domain values for all the fields. This will help in deciding whether an unmapped source field
can be ignored or not. The following Excel format has been agreed upon, and ING and TCS will jointly
complete it for all the VSAM file and DB2 table attributes and their descriptions.

Field Analysis Template.xls

As part of the standardization measure, the domain values of the source database may have to be
standardized for the target (based on international standards, ING specifics or the new application
design). Such domain values should be agreed upon and signed off well in advance, as part of the
analysis phase.
The analysis should also cover the following aspects of the source and target data models:

- Business dependencies between the entities

- Understanding of multiple record layouts

- Technical dependencies between the entities

- Database-specific constructs that may have a potential impact on the data conversion (for
example the impact of migrating COMP-3, OCCURS, REDEFINES, etc. from a mainframe
environment to Unix/Oracle)
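
As an illustration of the last point, a repeating group declared with OCCURS in a copybook has no direct
relational equivalent; it is typically either flattened into repeated columns or normalized into a child
table in the staging schema. A minimal SQL sketch of both options, using a hypothetical FUND-BALANCE
group that repeats three times (the table and column names are illustrative, not taken from the actual
inventory):

    -- Hypothetical copybook item: 05 FUND-BALANCE OCCURS 3 TIMES PIC S9(11)V99 COMP-3.
    -- Option 1: flatten the repeating group into repeated columns.
    CREATE TABLE STG_AFFILIATE_FLAT (
        AFFILIATE_RUT   VARCHAR2(12) NOT NULL,
        FUND_BALANCE_1  NUMBER(13,2),
        FUND_BALANCE_2  NUMBER(13,2),
        FUND_BALANCE_3  NUMBER(13,2)
    );

    -- Option 2: normalize the group into a child table keyed by occurrence number.
    CREATE TABLE STG_AFFILIATE_FUND (
        AFFILIATE_RUT   VARCHAR2(12) NOT NULL,
        OCCURRENCE_NO   NUMBER(2)    NOT NULL,
        FUND_BALANCE    NUMBER(13,2)
    );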

2.3.3 Data Cleansing

Based on the data analysis, the fields that need to be cleansed should be identified. Data cleansing
is required to ensure that only accurate, consistent and complete data is loaded into the target
database. Data cleansing will be required for:
- Junk characters / characters not supported by Oracle, such as nulls

- Invalid domain values

- Domain value standardization

- Values not within the range of the field

- Format consolidation (e.g., dates, amount fields)

- Referential integrity (e.g., an affiliate RUT in any transaction table should also be present in
the affiliate master)


The cleansing requirements should be documented clearly, stating the present conditions and the
proposed corrective action. The field analysis template itself can be used for documenting
cleansing requirements. Data cleansing requirements and routines will be provided by ING. We
also need to identify at what stage each cleansing rule can be applied (extraction, transformation
or load).
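
For instance, a cleansing rule applied at the staging stage might be expressed in SQL as sketched below;
the table and column names are hypothetical:

    -- Replace junk characters (here NUL and tab) in a name field with blanks.
    UPDATE STG_AFFILIATE
       SET AFFILIATE_NAME = TRANSLATE(AFFILIATE_NAME, 'x' || CHR(0) || CHR(9), 'x  ')
     WHERE AFFILIATE_NAME <> TRANSLATE(AFFILIATE_NAME, 'x' || CHR(0) || CHR(9), 'x  ');

    -- Referential integrity check: list transaction rows whose affiliate RUT
    -- has no match in the affiliate master.
    SELECT T.AFFILIATE_RUT, COUNT(*) AS ORPHAN_ROWS
      FROM STG_CONTRIBUTION T
     WHERE NOT EXISTS (SELECT 1
                         FROM STG_AFFILIATE M
                        WHERE M.AFFILIATE_RUT = T.AFFILIATE_RUT)
     GROUP BY T.AFFILIATE_RUT;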

2.3.4 Extraction programs

The extraction rules will be based on the business need and the data required for each iteration.
The rules to extract data from the source (VSAM / DB2) need to be defined jointly by ING
and TCS, and the same will be incorporated in the extraction programs.

2.3.5 Analysis of Target Database

Once the target database design is completed and baselined, the following table will be updated.

Sl     Table Name   Total   Not Null   Date   Unique Key
1
Total
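
Once the schema exists in Oracle, most of these columns can be derived from the data dictionary rather
than counted by hand. A sketch of such a query against the standard Oracle catalog views (run as the
schema owner):

    -- Per-table counts of total, NOT NULL and DATE columns.
    -- Unique keys can similarly be listed from USER_CONSTRAINTS
    -- (CONSTRAINT_TYPE IN ('P', 'U')).
    SELECT TABLE_NAME,
           COUNT(*)                                            AS TOTAL_COLUMNS,
           SUM(CASE WHEN NULLABLE  = 'N'    THEN 1 ELSE 0 END) AS NOT_NULL_COLUMNS,
           SUM(CASE WHEN DATA_TYPE = 'DATE' THEN 1 ELSE 0 END) AS DATE_COLUMNS
      FROM USER_TAB_COLUMNS
     GROUP BY TABLE_NAME
     ORDER BY TABLE_NAME;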

Assumptions
 Updated Project Plan is available
 Finalized Source inventory list for current iteration is available
 Target data model for current iteration is available

Activities

SL  Category        Task                                                          Schedule (Week-Day)
1   Analysis        Document base-lined source inventory
2   Analysis        Categorize the source entities into Reference, Transaction and Master
3   Analysis        Identify candidate fields. Analyze and understand the domains and range/set of valid values of the identified candidate fields
4   Analysis        Analyze the source and target data models for cardinality, optionality and relationships
5   Analysis        Understand the record identifiers for data stores with multiple layouts (internal to COBOL programs; may be hidden in the data definition)
6   Analysis        Understand the impact of environment-specific constructs such as compressed data items (COMP variables in COBOL), repeating data groups (OCCURS clause in COBOL), reuse of storage space (REDEFINES and VALUE clauses in COBOL), and date structures (date may not have a century part, may be a Julian date)
7   Analysis        Identify system dependencies (e.g., the character set on the mainframe is EBCDIC while it is ASCII on UNIX; dates carry Date + Time in target Oracle while that may not be the case in the source)
8   Analysis        Classify the entities that must be converted for the target, entities that must only be used for transformation, entities that are redundant, entities that are not required for the target, and entities that are in question. Identify the owner for the entities that are in question
9   Analysis        Finalize and document the criteria for data extraction
10  Analysis        Identify the right source, based on discussion with the maintenance and business teams: the right instance of the data
11  Analysis        Define the general flow for the migration process (VSAM extract flat files versus master files)
12  Analysis        Review the standards for data mapping from target to source
13  Data Cleansing  Identify and document data cleansing requirements

Deliverables
 Data analysis findings
 Updated Inventory list

Challenges
 It is essential to baseline both source and target data models to reduce rework. However it is not
practical when analysis is done in iterations. It is vital that any changes to the source and target
baseline should be informed to the data migration team immediately. The changes should be
immediately analysed and data analysis document updated.
 All environment specific constructs should be identified. It should be verified whether the
informatica tool will handle it. If the tool does not handle it suitable solutions should be identified
for migrating them to target. During POC we have identified the following list
o Character set in mainframe and Unix are different. Mainframe uses EBCDIC while Unix
uses ASCII. Informatica power center is able to handle this conversion.
o Occurs , and Redefines can be handled by Informatica power center.
o For Occurs depending we have to manually alter the data to make it the maximum
number before loading in informatica power center. Usage of Power Exchange will be
able to address this problem.
o Loading of DB2 null data into Oracle was found to be a problem. An extra field was
manually added before every column that may contain null. This is to hold the null
indicator. Usage of Power Exchange will be able to address this problem
o In Oracle Date is defined as YYYY-MM-DD-Time but in Vsam files it can be of any
combination. A transformation rule was written in power center to transform source date
to target format
o We could not find any Julian dates in POC. So a strategy for transforming it is not
identified. Further analysis to be done to check if ING core AFP system uses Julian date
or not.

2.4 Strategy definition

The various strategies related to data migration are defined in this phase, and the data migration
strategy document is prepared. A proof of concept has been done to validate the migration strategy
for extraction, transformation and load. This document will be updated with best practices and lessons
learnt after each iteration.
2.4.1 Proof of concept

The migration of the following VSAM files and DB2 tables was the scope for the proof of concept. The
extraction, transformation and load were done for these sample data in the development environment.

VSAM

1. CUENTAS.PROD.PMC321D1
2. CUENTAS.PROD.PMC321D2
3. CUENTAS.PROD.COT905D1
4. BENEFIC.PROD.PCB150D1
5. BENEFIC.PROD.PCT200D1
6. BENEFIC.PROD.PPR100D1
7. INCORPOR.DESA.EAE02M
8. INCORPOR.PROD.EAE03M

DB2

1. PER_INC_REC
2. RECLAMO
3. EMPLEADO
4. DIRECCION_POSTAL
5. DIRECCION_PERSONA

The proof of concept is completed and the following were proven:

1. Extraction of VSAM files to flat files and FTP to text files
2. Extraction of DB2 to flat files and FTP to text files
3. Mapping and transformation between source and staging tables using Informatica Power Center
4. Mapping and transformation between staging and target tables using Informatica Power Center
5. Loading of VSAM and DB2 extract flat files into staging tables using Informatica Power Center
6. Moving data from the staging database to the target database by executing the mapping and
transformation scripts in an Informatica Power Center workflow
7. Transfer of scripts and integration between offshore and onsite

Assumptions
 Project Plan is available

Activities

SL  Category             Task                                             Schedule (Week-Day)
1   Strategy definition  Define data migration strategy
2   Strategy definition  Define testing strategy
3   Strategy definition  Define implementation strategy
4   Strategy definition  Create data migration strategy document
5   POC                  Do proof of concept
6   Review               Review the data migration strategy document
7   Presentation         Presentation to selected audience
8   Sign-off             Obtain sign-off from clients on the strategy documents

Deliverables
 Data Migration Strategy Document

2.5 Design
The objective of this phase is to define a set of rules to transform data from source to target. The
mapping rules are based on the source and target data structures and the domain information provided by
ING. The mapping repository is created to maintain the list of mapping rules.
The following template is used for the mapping repository:

Mapping repository template.xls

2.5.1 Mapping rules

Direct mapping
Identify target fields with a one-to-one relationship with the source and specify the source value to
be used.

Transformation rule mapping
For the remaining target fields, document the transformation rule in detail, specifying the source
fields and the computation clearly.

Default value mapping
Identify target fields that have no relation to the source and specify the default value to be
populated. Functional and design people need to be involved in taking these kinds of decisions.

Unmapped fields in source
Unmapped fields in the source will be analyzed and the risk of not migrating this data will be
estimated. This analysis will be done only if a field remains unmapped after all iterations are
completed.

2.5.2 Data Format – Source to Text File

VSAM to flat file (any COBOL layout to free-format layout)

All the following conversions will be done by Informatica Power Center itself, based on the standards.

VSAM Data Type   Flat File                            Remarks
COMP-3           Free-format numeric field            Signed edited text
COMP-2           Free-format numeric display field
Signed Decimal   Sign-edited text field
COMP             Free-format numeric field            Signed edited text
Numeric          Numeric

DB2 to flat file

DB2 Data Type                    Flat File           Remarks
SMALLINT                         PIC -9(4)           1 <= n <= 15
INTEGER                          PIC -9(9)           16 <= n <= 31
DECIMAL (p,s) or NUMBER (p,s)    PIC -9(p-s).9(s)    p = precision, s = scale; 1 <= p <= 31 and 0 <= s <= p
CHAR (n)                         PIC X(n)            1 <= n <= 255
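
A sketch of how these type mappings carry through to the staging schema, for a hypothetical row
containing a SMALLINT, a DECIMAL(11,2) and a CHAR(40) (the table and column names are illustrative):

    -- Staging table mirroring the source layout; Oracle NUMBER precision/scale
    -- follow the DB2 column definitions, and CHAR becomes VARCHAR2.
    CREATE TABLE STG_EMPLOYEE (
        BRANCH_CODE    NUMBER(4),      -- DB2 SMALLINT
        SALARY_AMT     NUMBER(11,2),   -- DB2 DECIMAL(11,2)
        EMPLOYEE_NAME  VARCHAR2(40)    -- DB2 CHAR(40)
    );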

2.5.3 Non-key source fields becoming key fields in target

For source data where non-key fields become key fields in the target, proper integrity checks and the
correct order of migration must be enforced so that the complete information is retained without any
data inconsistency or redundancy. Unique and non-unique constraints will be analyzed and the proper
validation technique ascertained, so that there is no undefined information in the system. Proper
indexes will be defined in the target system so that access times stay within
the SLA.
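
A minimal sketch of enforcing such a key and its supporting index in the target (all names
hypothetical):

    -- A field that was non-key in the source becomes part of the key in the target:
    -- enforce uniqueness explicitly so inconsistent data is rejected at load time.
    ALTER TABLE TGT_ACCOUNT
      ADD CONSTRAINT UQ_TGT_ACCOUNT UNIQUE (AFFILIATE_RUT, FOLIO_NUMBER);

    -- Additional index on a frequently queried column to keep access within the SLA.
    CREATE INDEX IX_TGT_ACCOUNT_FOLIO ON TGT_ACCOUNT (FOLIO_NUMBER);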

2.5.4 Date and time stamp / load date fields and user id
Dates will be in the Oracle format of mm/dd/ccyy, with default values set by the business. Timestamps
will also use the default Oracle timestamp. For load date fields and update user-id fields, the date
on which the loading/migration is done and a default user id will be assigned.
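
A sketch of how the load-date and user-id defaults might be declared on a target table (names
hypothetical):

    CREATE TABLE TGT_AFFILIATE (
        AFFILIATE_RUT   VARCHAR2(12)  NOT NULL,
        AFFILIATE_NAME  VARCHAR2(60),
        LOAD_DATE       DATE          DEFAULT SYSDATE    NOT NULL,  -- date of the migration run
        UPDATE_USER_ID  VARCHAR2(10)  DEFAULT 'MIGRATE'  NOT NULL   -- default migration user id
    );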


Assumptions
 Baselined source and target data models for the current iteration are available
 Data analysis findings are available

Activities

SL  Category  Task                            Schedule (Week-Day)
1   Design    Create mapping repository
2   Review    Review the mapping repository

Deliverables
Mapping repository

2.6 Construction
The objective of this phase is the development of the data migration suite. The phase consists of the
creation of extraction, transformation and load scripts for data migration.

2.6.1 Data migration approach

The data migration will occur in two stages. In the first stage, data will be migrated from the source
systems to the staging Oracle database in the same layout as the source files. In the second stage we
will move the data from the staging database to the target Oracle database. The following diagram
depicts the data migration steps:

[Figure: Data migration steps. VSAM files are extracted via COBOL/JCL and DB2 tables via DB2 Unload
into text files, which are FTP'd and loaded as-is into the staging database; the data is then
transformed and loaded into the target database. Cleansing can occur at each stage: extract, transform
and load.]


2.6.2 Source System (VSAM / DB2) to Staging database (Oracle)

2.6.2.1 Extract

The extraction strategy given here is without Informatica Power Exchange. The impact of having
Informatica Power Exchange on the extraction process will be analyzed, and this document will be
updated accordingly after iteration 1.

The data from VSAM files and DB2 tables is extracted by the following steps.

Steps for extraction of VSAM files

1. REPRO JCLs to extract the VSAM files into flat files will be written. A temporary
variable is to be used in the JCL so that the name of the file is hard-coded in only one
place.
2. The JCL should also contain a step to FTP the flat file in binary format to the FTP server.
3. The logical grouping of files in one JCL should be determined and standardized.

Steps for extraction of DB2 tables

1. DB2 Unload JCLs to extract the DB2 tables' data into flat files will be written.
A temporary variable is to be used in the JCL so that the names of the table and the load
file are hard-coded in only one place.
2. The JCL should also contain a step to FTP the flat file in binary format to the FTP server.
3. The logical grouping of tables in one JCL should be determined and standardized.

Pre-processing - Informatica Power Center

1. The COBOL format programs with copybook names with the ".CBL" extension will be
written.
2. The copybooks in the same folder with the ".CPY" extension will be copied.
3. The source descriptions will be defined in the Informatica Source Analyzer.
4. Using the source descriptions, the target table descriptions (staging Oracle db)
will be defined in the Informatica Warehouse Designer.
5. The staging target tables are created in the database.

Note: Any compatibility issues between mainframe data and loading data into Informatica Power Center
will be analyzed; these may have an impact on the extraction process. The document will be updated
accordingly.

The following are also done as part of the extraction process:

 Some degree of data cleansing will be performed as a part of the
extraction process. This will include replacing junk characters with blanks and
substituting zero for invalid numeric fields.
 A reporting mechanism for each extraction process will also be developed. This
will report the details of the rejected records, bad records, excluded records and
bad data.
 Transferring the text files from the mainframe to the UNIX environment will be performed
by standard FTP. The file to be transferred will be split into a number of files, and the
split files will be compressed with the PKZIP software. The compressed files will be
transferred through the UNIX box to the Informatica server. On UNIX the files will be
decompressed with the help of the PKUNZIP software and loaded into the
Informatica server.


2.6.2.2 Transform
The following steps need to be followed in Informatica Power Center:
1. The mapping rules are defined and linked between source and target in the
Mapping Designer.
2. The transformation rules are designed and scripted in the Transformation Developer.
3. Some degree of data cleansing will be performed as part of this stage.

2.6.2.3 Load
The following steps are involved in loading the data from VSAM and DB2 into the staging Oracle
database:
1. Reusable sessions are created, which define the mapping.
2. The workflow is created in the Workflow Manager, which defines which sessions
need to be executed and the sequence and time of execution.
3. The workflow is executed to load the data from the source into the staging
database. The number of workflows will be decided based on the sequence of
the migration.
4. Referential integrity will not be maintained in this database.
5. Indexes will be created based on the performance requirements.

2.6.3 Staging database (Oracle) to Target database (Oracle)

2.6.3.1 Extract

The data from the staging database is not extracted as such; the move is physically represented as
mappings and transformations, and Informatica Power Center picks the data up from the staging database
and moves it to the target database.

Pre-processing - Informatica Power Center

1. The source descriptions (the staging database now needs to be defined as a source) will
be defined in the Informatica Source Analyzer.
2. Using the logical target database design, the target table descriptions (target
Oracle db) will be defined in the Informatica Warehouse Designer.
3. The target tables will already be available in the database, created by the
application team.

2.6.3.2 Transform
The following steps need to be followed in Informatica Power Center:
1. The mapping rules are defined and linked between source and target in the
Mapping Designer.
2. The transformation rules are designed and scripted in the Transformation Developer.
3. Cleansing activities will also be done here.
4. The data will be ported to the Informatica server through the UNIX box.

2.6.3.3 Load
The following steps are involved in loading the data from the staging Oracle database into the target
Oracle database:
1. Reusable sessions are created, which define the mapping.


2. The workflow is created in the Workflow Manager, which defines which sessions
need to be executed and the sequence and time of execution.
3. The workflow is executed to load the data from the staging database into the target
database.
4. Referential integrity will be maintained in this database, so the data
loading will have to be performed in the defined loading sequence.
5. Indexes will be created based on the target database schema requirements.
6. Additional indexes may also be necessary to meet the performance
requirements.

Note: Input source data that does not require cleansing in staging will be migrated directly to the
target. The analysis of cleansing requirements for the files plays a major role in deciding this
strategy. This approach will save a lot of time during the implementation.
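
Because the target enforces referential integrity, parent (master) rows must be loaded before their
dependent rows. In practice the Power Center workflow sequences the sessions; the SQL sketch below only
illustrates the ordering constraint, with hypothetical master/detail tables:

    -- Load the master first so that foreign keys on the detail table can be satisfied.
    INSERT INTO TGT_AFFILIATE (AFFILIATE_RUT, AFFILIATE_NAME)
    SELECT S.AFFILIATE_RUT, S.AFFILIATE_NAME
      FROM STG_AFFILIATE S;

    -- Then load the detail rows, restricted to those whose parent now exists.
    INSERT INTO TGT_CONTRIBUTION (AFFILIATE_RUT, PERIOD, AMOUNT)
    SELECT S.AFFILIATE_RUT, S.PERIOD, S.AMOUNT
      FROM STG_CONTRIBUTION S
     WHERE EXISTS (SELECT 1
                     FROM TGT_AFFILIATE T
                    WHERE T.AFFILIATE_RUT = S.AFFILIATE_RUT);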


2.6.4 Cleansing

Data cleansing at different stages:

[Figure: Data cleansing at different stages. W% of the data is cleansed by COBOL/JCL on the mainframe
source; X% and Y% are cleansed during transformation of the VSAM files and DB2 tables respectively; Z%
is cleansed in the staging database by Java/SQL before the load into the Oracle target.]

2.6.4.1 Pre-Migration (Production Phase)

This process will cleanse all the non-voluminous and business non-critical data. The main purpose of
cleaning the data directly in production is to avoid cleaning the same data again in the subsequent
migration; the data that is cleaned will remain clean throughout the different phases of migration. The
types of data that will be cleaned are:

 Name: entity property data such as customer name, customer address, dealer name, bank name and
DSSO name.
 Comment: entity attribute data such as descriptions, comments, attention fields and any other
fields that do not participate in business validation.


 Appropriation: any standardization data, such as customer name standardization and address
standardization.

2.6.4.2 Extraction Process

This level of cleansing is for voluminous, business non-critical data. It also includes data whose
cleansing is routine and static. The data cleaned in this process are:

 Technical data (does not need any business intervention)
 Default data (handling of space, null, date)
 Cleaning of junk characters
 User-identified incorrect data

2.6.4.3 During Transformation

The major part of the data cleansing rules is applied at this stage. Cleansing at transformation occurs
both while transforming data from source to staging and while transforming from staging to target. This
includes:

 Inconsistency in business data
 Domain values (ZIP code, RUT)
 Unmapped data

2.6.4.4 In Staging

Some level of cleansing will be done on the data present in the staging tables. Either Java programs or
SQL will be written to clean the data present in staging.

Note: If cleansing is to be done during both transformation and staging, ING and TCS have to analyze
the impact on the effort involved and the changes to the plan.
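
As an example of the kind of staging-level script mentioned above, a domain-value standardization rule
might look like the following SQL (a Java routine would implement the same rule procedurally; all names
and domain values are hypothetical):

    -- Standardize a single-character legacy civil-status code to the agreed target domain.
    UPDATE STG_AFFILIATE
       SET CIVIL_STATUS = CASE CIVIL_STATUS
                            WHEN 'S' THEN 'SINGLE'
                            WHEN 'C' THEN 'MARRIED'
                            WHEN 'V' THEN 'WIDOWED'
                            ELSE 'UNKNOWN'
                          END
     WHERE CIVIL_STATUS NOT IN ('SINGLE', 'MARRIED', 'WIDOWED', 'UNKNOWN');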

2.6.5 Audit trail data, summary data

Audit trail data will contain the total number of records migrated and the summation of a numeric
field. This will be re-validated in the target system to confirm the correctness of the file transfer.
Record rejections and record appropriations can also be included in the audit data.

Summary data will contain all types of key information for the migration of a particular entity. For
example, for the Affiliate Master: RUT, name of the affiliate and any other information that is
critical to the entity will be considered.
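
A sketch of the audit figures for one entity, computed on the target after the load and matched against
the same figures computed on the source extract (table and column names hypothetical):

    -- Record count and a checksum over a business-important numeric field.
    SELECT COUNT(*)         AS RECORDS_MIGRATED,
           SUM(BALANCE_AMT) AS BALANCE_CHECKSUM
      FROM TGT_ACCOUNT;

    -- The same figures can be stored in an audit trail table for later reporting.
    INSERT INTO MIG_AUDIT_TRAIL (ENTITY_NAME, RUN_DATE, RECORD_COUNT, CHECKSUM_AMT)
    SELECT 'TGT_ACCOUNT', SYSDATE, COUNT(*), SUM(BALANCE_AMT)
      FROM TGT_ACCOUNT;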


2.6.6 Reports

Exception reports will be analyzed, gaps will be studied, and new/changed data cleansing definitions
will be incorporated. Both the rule definitions and the programs will be configured. The use of a
reporting tool will be analyzed and finalized.

2.6.7 Special Requirements

Any special requirements that arise as part of data analysis will be documented and updated frequently.

Assumptions
 Baselined source and target data model for the current iteration is available
 Mapping repository available
 Data cleansing requirements available

Activities

SL  Category        Task                                                          Schedule (Week-Day)
1   Construction    Extraction routines to be written for extracting data from the mainframe
2   Construction    Source and target definitions to be created in Informatica Power Center using information from the source and target data models
3   Construction    Mapping and transformation rules are created in Informatica Power Center based on the information collected in the mapping repository
4   Construction    Sessions and workflows are created using Power Center for executing the mapping and transformation rules
5   Data Cleansing  Data cleansing rules are also written, if required, in this stage
6   Validation      Finalize and document the criteria for data validation, to verify the correctness of migration (business validation)
7   Audit           Finalize and document the criteria to verify the completeness of migration (technical validation)

Deliverables

 Extraction routines
 Source and target definitions
 Mapping and transformation rules
 Load routines (Sessions and workflows)
 Audit and validation routines

2.7 Testing
This phase comprises testing of the data migration suite for each iteration. Testing will check all the
transformations, mappings, workflows, cleansing, audits and validations. Individual test cases need to
be prepared for testing the various functionalities. The following matrix illustrates the broad areas
that the test cases will pertain to:

Attributes: Business-important fields for checksum
Measurement plan:
1. Identify all business-important fields that can be used for summation checks on data extracts and in target tables
2. Perform summations on the identified fields in incoming data files and match the sums
3. Perform summations on the identified fields in the ODS and match with those of the incoming data
Remarks: Business-important fields that can be used for checksums need to be requested from ING users, and they should be included in the extracts.

Attributes: Business rules
Measurement plan: All data elements are to be mapped to business rules. All data elements and relationships should pass the associated business rules (e.g., a data attribute can contain only one out of a set of values).
Remarks: All business rules should be provided by ING users, and TCS will do a feasibility analysis for the same.

Attributes: Integrity checks
Measurement plan:
1. Identify all integrity constraints
2. All data must pass the associated integrity constraints (e.g., there can be no detail records in the absence of a master)
Remarks: Integrity constraints should be specified by ING users and verified and validated.

Attributes: Outlier conditions
Measurement plan: Identify the minimum, maximum and default values for data attributes. All data attributes should contain a valid value. Raise an alert when invalid values are detected.
Remarks: Min, max and default values should be provided, and verified and validated.

Attributes: Alert mechanism
Measurement plan:
1. Identify all steps which need to generate an alert (e.g., invalid incoming data, failed integrity checks, outliers, load failures)
2. Raise alerts
Remarks: Any specific alert requirements should be specified in the ETL strategy so that they can be incorporated in development.

Attributes: Correctness of calculations
Measurement plan: Identify fields involving complex calculations. Recalculate once loading is complete. Match with previously calculated values.
Remarks: ING users to specify critical fields involving complex calculations, and the same will be incorporated.

Attributes: Audit trail
Measurement plan: Identify data to be captured in the audit trail (e.g., file name, number of records on file, records inserted from file). Capture audit attributes during the load process and store them in the audit table.
Remarks: Any specific audit requirements should be specified in the ETL specs and will be incorporated.

Attributes: Incoming data summary
Measurement plan: Identify summary information for input data to be sent in an additional file (file name, number of records, date). Perform checks on incoming data (match the record count in the control file against the actual number of records received). Raise an alert in case of mismatch.
Remarks: The incoming control summary file specification is to be provided, and the same should be incorporated in the extract.

Attributes: Business test cases
Measurement plan: TCS will write 40 to 50 test cases to check the business scenarios for audit. Business test cases could be SQL queries that get the data from the target and verify it using the existing mainframe data. The business criteria can be identified from
the legacy reports or may be provided by ING.
Remarks: Information on critical reports to be provided by ING.
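
For instance, one of the business test cases described above might be a query whose result is
reconciled against the corresponding figure on a legacy mainframe report (names hypothetical):

    -- Business test: count of active affiliates per branch in the target, to be
    -- matched against the equivalent total on the legacy report.
    SELECT BRANCH_CODE,
           COUNT(*) AS ACTIVE_AFFILIATES
      FROM TGT_AFFILIATE
     WHERE STATUS = 'ACTIVE'
     GROUP BY BRANCH_CODE
     ORDER BY BRANCH_CODE;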

2.7.1 Validation
The following validations will be performed to ensure the correctness of the data migrated.

No  Category                             Source                Destination           Criteria
1   Number of physical records           All entities          All entities          Exact match or deviation justified
2   Sum                                  Field-1; Table-1      Field-1; Table-1      Exact match or deviation justified
                                         Field-2; Table-2      Field-2; Table-2
                                         Field-1; Table-1      Field-1; Table-3
3   Sum against a branch                 Field-1; Table-1      Field-1; Table-1      Exact match or deviation justified
4   Total number of active affiliates    Field-1; Table-1      Field-1; Table-1      Exact match or deviation justified
5   Total number of deceased affiliates  Field-1; Table-1      Field-1; Table-1      Exact match or deviation justified
6   Totals                               Field-1; Table-1      Field-1; Table-1      Exact match or deviation justified
7   Status fields                        Group-by count        Group-by count        Exact match or deviation justified
8   Null fields                          Count Field-1         Count Field-1         Exact match
                                         Count Field-2         Count Field-2
9   Blank fields                         Count Field-1         Count Field-1         Exact match
                                         Count Field-2         Count Field-2
10  Not-null fields                      Count Field-1         Count Field-1         Exact match
11  Duplicate rows                       Table-1               Table-1               Exact match
                                         Table-2               Table-2
12  Deleted rows                                                                     Justify
13  Key fields (RUT, Folio Number)       Group by range        Group by range        Exact match
14  Name fields                          Compare by key        Compare by key        Exact match
16  Round-off                            Verify correct decimal places  Verify correct decimal places  Exact match
17  Truncation error on identified field Correct truncation    Correct truncation    Exact match
18  Exceptions                           Defined               Defined               Validate
19  Bad records                          Identify              Defined               Validate
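
A few of the checks above, expressed as SQL sketches to be run once against staging and once against
the target and then compared (all names hypothetical):

    -- Check 7: the status distribution must match between staging and target.
    SELECT STATUS, COUNT(*) AS ROW_COUNT FROM STG_ACCOUNT GROUP BY STATUS;
    SELECT STATUS, COUNT(*) AS ROW_COUNT FROM TGT_ACCOUNT GROUP BY STATUS;

    -- Checks 8 and 10: null and not-null counts on a field must match exactly.
    SELECT COUNT(*) - COUNT(BALANCE_AMT) AS NULL_COUNT,
           COUNT(BALANCE_AMT)            AS NOT_NULL_COUNT
      FROM TGT_ACCOUNT;

    -- Check 11: no duplicate rows on the key may appear in the target.
    SELECT AFFILIATE_RUT, FOLIO_NUMBER, COUNT(*) AS DUPLICATES
      FROM TGT_ACCOUNT
     GROUP BY AFFILIATE_RUT, FOLIO_NUMBER
    HAVING COUNT(*) > 1;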

2.7.2 Audit

Audit rules are expected to be defined by the ING Core AFP Data Migration team in the following format.
The auditing should be done based on reliable reports from the business; business reports will be
provided by ING to be used for auditing.

No  Category  Source  Destination  Criteria

2.7.3 Testing Lifecycle

 Construction and unit testing will be done by the TCS onsite/offshore team after the finalization
of the design document. This will be on a going-forward basis.

 The migration components will be delivered by TCS upon completion of construction/unit testing.

 The components will be validated by the ING Data Migration team.

 After this primary validation, a larger Revolution Unit Testing will be performed by the ING Data
Migration team after the ING Core AFP application is delivered. TCS will support the testing.

 After the completion of this phase, Performance Testing and Revolution Unit Testing will be
performed in parallel. TCS will support these two types of testing.

Assumptions
 Data migration suite available (Extraction, transformation and load routines)
 Audit and validation routines available
 Source Data for migration is available

Activities

SL  Category              Task                                             Schedule (Week-Day)
1   Testing               Test the data migration suite
2   Audit and validation  Run the audit and validation scripts and verify the completeness and correctness of data migration


Deliverables
Tested data migration suite

2.8 Pre-Implementation (Dry Runs)
A pre-implementation or dry run is a simulation of the production implementation in the test
environment. The objective is to understand the complexities of implementation in terms of the window
for data migration and the infrastructure requirements, and to fine-tune the programs and
implementation procedures if required. Data migration implementation is planned in two phases, so a
pre-implementation run will also be done for each of these phases. This will be done by the ING Data
Migration team and the Business Capability Team. TCS will support this testing.
 The strategy describes go/no-go checkpoints after the different stages, and a Root Cause Analysis
(RCA) will be done for each checkpoint. Based on the RCA, the data mapping, data model, design,
migration design and migration component code will be revisited and necessary actions taken.

Assumptions
 Tested Data migration suite available for the current implementation phase
 Test environment that is simulated based on production is available
 Source Data for pre-implementation dry run is available

Activities

SL  Category              Task                                             Schedule (Week-Day)
1   Pre-implementation    Test the data migration suite
2   Audit and validation  Run the audit and validation scripts and verify the completeness and correctness of data migration
3   Performance           Performance tuning of the data migration suite, if required

Deliverables
Full-volume tested data migration suite

2.9 Implementation
The implementation phase comprises the activities for carrying out the actual production data
migration.
The implementation of data migration depends mainly on the implementation window, the volume of data
to be migrated and the type of data. On further analysis of the data, and after discussions, the
implementation strategy will be finalized.

As of now only the implementation of the phase 1 roll-out is considered. Based on further analysis,
the document will be updated for the implementation of the phase 2 roll-out.


All the backup data will be migrated two weeks ahead, the reference data will be migrated one week
ahead, and the transaction and master data will be migrated on the weekend before go-live. The same is
depicted in the figure below.

[Figure: Implementation timeline. 17th Sep - 22nd Oct: one-off migration and testing of backup files
(data unlikely to change). 28th - 29th Oct (Sat-Sun): one-off migration of reference data. 30th Oct -
3rd Nov (Mon-Fri): testing, catch-up for changed reference data, and pre-processing; master data on
view-only hold. 4th - 5th Nov (Sat-Sun): one-off migration of frozen transaction and master data and
account files. 6th Nov (Mon): go-live; corrections and application of post-dated transactions (those of
2nd Nov Thu - 3rd Nov Fri).]

Points to be considered in adopting this approach:

1. All the backup data can be extracted in 48 hours
2. All the reference data can be extracted in 48 hours
3. All the transaction, master and catch-up reference data can be extracted, cleansed,
transformed and loaded in 48 hours
4. It is assumed that data migrated on the first weekend is not going to change at all
5. It is assumed that data migrated on the second weekend (reference data) may not
change in one week
6. Additional effort is involved in doing the catch-up for reference data
7. Testing of the data will be done during the parallel run time
8. Incremental migration may be required for reference data

Note: Data cleansing implementation is not considered here. The cleansing implementation will have an
impact on the strategy defined above, and this document will be updated based on the cleansing
implementation.

The source files of the ING Core AFP System are split according to the modules below; the best strategy
and time for migrating these data will be tabulated in the following format once the approach is
finalized.

# The record counts and database sizes below are based on the information available in production.

Sl  System      Data         # (M)  Volume (GB)  Vertical Split (By Design)  Horizontal Split  Special Treatment  Link for the list of Tables/Files  Proposed date of Migration
1   Contracts   Transaction
2   Contracts   Master
3   Contracts   Reference
4   Accounts-1  Transaction
5   Accounts-1  Master
6   Accounts-1  Reference
7   Claims-1    Transaction
8   Claims-1    Master
9   Claims-1    Reference
10  Accounts-2  Transaction
11  Accounts-2  Master
12  Accounts-2  Reference
13  Claims-2    Transaction
14  Claims-2    Master
15  Claims-2    Reference
16  Pensions    Transaction
17  Pensions    Master
18  Pensions    Reference
19  Bonds       Transaction
20  Bonds       Master
21  Bonds       Reference

Assumptions
 Full volume Tested Data migration suite available for the current implementation phase

Activities

SL  Category              Task                                             Schedule (Week-Day)
1   Implementation        Back up the source data to be migrated, if required
2   Implementation        Back up the target data in the phase 2 implementation, as the target database will be operational between the phase 1 and phase 2 implementations
3   Implementation        Execute the data migration suite (extraction, cleansing, transformation and load scripts)
4   Implementation        Resolve and reconcile any data errors
5   Audit and validation  Execute the audit and validation scripts and verify the completeness and correctness of data migration
6   Implementation        Resolve and reconcile any errors encountered
7   Implementation        Invoke fallback procedures if unable to resolve and reconcile the errors encountered
8   Implementation        Make the target application go live

Deliverables
Data migrated to the target tables as per the data migration requirements.
2.9.1 Cutover Considerations

Sl  Candidate                          Issue
1   Master files                       A weekend cutover will not have any issue. Go-live on a weekday will require the files to be kept on hold.
2   Quarterly backup files             These files, which are not going to be modified, can be migrated two weeks ahead.
3   Contracts                          All the contracts-related files should go live at month-end only.
4   Deceased data                      Data related to the deceased can be migrated well ahead, as it is not going to be modified.
5   Closed claims                      All data pertaining to closed claims can be migrated well ahead.
6   Inactive affiliates                Data related to inactive affiliates can be migrated well ahead, as it is not going to be modified.
7   DB2 tables                         A weekend cutover will not have any issue. Go-live on a weekday will require the records to be locked.
8   Maintenance changes                Stop online users from doing any maintenance transactions in the last 3-4 days before implementation. This will make the database more static.
9   Regulatory changes                 Stop applying regulatory changes in the last month before implementation.
10  Final backups prior to migration   After the completion of the batch cycle, final backups need to be taken.

2.9.2 Change Control

Scope of Change Control


Sl Artifacts Owner Repository Formal


1 Source Database Schema Business Analyst Team
2 Source Data Business Analyst Team
3 Target Data Model Business Analyst Team
4 Extraction Rules Business Analyst Team
5 Extraction Programs Technical Team
6 Extraction Jobs/Schedules Technical Team
7 Extracted Data on mainframe Technical Team
8 Transfer Programs Technical Team
9 Transformation Rules Business Analyst Team
10 Transformation Programs Technical Team
11 Transformation Jobs/Schedules Technical Team
12 Transferred Data (in UNIX) Technical Team
13 Conversion Database Technical Team
14 Transformed Data Technical Team
15 Loading Programs Technical Team
16 Loaded Data Technical Team
17 Loading Jobs/Schedules Technical Team
18 Cleansing Rules Business Analyst Team
19 Cleansing Programs/Scripts Technical Team
20 Cleansing Report Technical Team
21 Validation Rules Technical Team
22 Validation Programs Technical Team
23 Validation Reports Technical Team
24 Test Case (Unit/Integration) Technical Team
25 Test Script (Unit/Integration) Technical Team
26 Test Result (Unit/Integration) Technical Team
27 Test Report (Unit/Integration) Technical Team
28 Audit Rules Business Analyst Team
29 Audit Programs Technical Team
30 Audit Reports Technical Team

2.9.3 Traceability
The data migration artifacts (documents and programs) are to be traced from the target fields back to
the audit and validation routines in the ING Core AFP system. The following diagram depicts the
traceability requirements at the different stages.


[Figure: Traceability chain. Target (tables, fields) → Source (tables, fields) → Clean (rules,
programs, reports) → Extract (rules, programs, jobs, schedules) → Transfer (spec, programs, jobs,
schedules) → Transform (rules, programs, jobs, schedules) → Clean (rules, programs, reports) → Load
(spec, programs, jobs, schedules) → Clean (rules, programs, reports) → Test (cases, scripts, results,
reports) → Validation (rules/spec, scripts/programs, reports) → Audit.]

The following example can be used as a template for the traceability matrix.

Sl  Trace                           Tracing to
    Field Number                    <Table Number>-<Field Number>
1   Target Field
2   Target Table
3   Source Field
4   Source Table / File
5   Clean Rule
6   Clean Program
7   Clean Report
8   Extract Rule
9   Extract Program
10  Extract Job
11  Extract Schedule
12  Transfer Spec
13  Transfer Program
14  Transfer Job
15  Transfer Schedule
16  Transform Spec
17  Transform Program
18  Transform Job
19  Transform Schedule
20  Clean Rule
21  Clean Program
22  Clean Report
23  Load Spec
24  Load Program
25  Load Job
26  Load Schedule
27  Clean Rule
28  Clean Program
29  Clean Report
30  Test Case (Unit/Integration)
31  Test Script (Unit/Integration)
32  Test Result (Unit/Integration)
33  Test Report (Unit/Integration)
34  Validation Rule
35  Validation Script
36  Validation Report
37  Audit Rule
38  Audit Script
39  Audit Report

2.9.4 Backup and Recovery

As part of the backup and recovery process, the version-controlled artifacts will be placed in the
Rational ClearCase tool under corresponding folders. The frequency of the backups will depend on the
type of artifact.

The strategy developed for the data migration has been modularized to enable restart of the process at
any stage of failure.

The exception handling during the outage window will be decided based on the business criticality of
the data being migrated.

The possible ways of handling exceptions are:

1. Stop the migration. Delete everything and start from the beginning.

2. Write the exception to a separate file and continue with the migration without inserting
that record.

3. Write the exception to a separate file and continue with the migration, inserting the
record with pre-defined values.

4. Stop the migration. Analyze the exception, solve it and restart the migration from the
last commit point.
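
A sketch of option 2 in SQL terms: log the offending records to an exception table and load only the
clean rows, so the migration continues past the bad data (all names hypothetical):

    -- Exception table for records rejected during the load.
    CREATE TABLE MIG_EXCEPTIONS (
        ENTITY_NAME   VARCHAR2(30),
        RECORD_KEY    VARCHAR2(50),
        ERROR_REASON  VARCHAR2(200),
        LOGGED_AT     DATE DEFAULT SYSDATE
    );

    -- Log rows that fail a validation (here: RUT missing from the affiliate master)...
    INSERT INTO MIG_EXCEPTIONS (ENTITY_NAME, RECORD_KEY, ERROR_REASON)
    SELECT 'TGT_ACCOUNT', S.AFFILIATE_RUT, 'RUT missing from affiliate master'
      FROM STG_ACCOUNT S
     WHERE NOT EXISTS (SELECT 1 FROM TGT_AFFILIATE T
                        WHERE T.AFFILIATE_RUT = S.AFFILIATE_RUT);

    -- ...and load only the rows that pass it.
    INSERT INTO TGT_ACCOUNT (AFFILIATE_RUT, FOLIO_NUMBER, BALANCE_AMT)
    SELECT S.AFFILIATE_RUT, S.FOLIO_NUMBER, S.BALANCE_AMT
      FROM STG_ACCOUNT S
     WHERE EXISTS (SELECT 1 FROM TGT_AFFILIATE T
                    WHERE T.AFFILIATE_RUT = S.AFFILIATE_RUT);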

3 Risks
 Target database design is not available on time. This may impact the definition of mapping rules
and transformation rules, and the whole migration process.

 Delay in source inventory analysis by ING

 Delay in data cleansing activities by ING


 The production cut-over window for implementation is expected to be 48 hours over a weekend.
This window might get reduced.

 Environment readiness

 Cutover window – Network, Link, Database, Extended Production Window

 Software Version Change (Oracle, Informatica, OS)

 Major changes in source due to SAFP Regulatory Changes

 Change in the layout of the files

4 Guidelines
 Pre-extraction data cleansing is advisable for voluminous non-critical data.

 A vertical split of the source data is preferable only when the design of the target data model
demands it. A vertical split to handle voluminous data is not recommended.

 The scripts written for extraction and all other activities need to be written in standard
formats, with backups taken periodically.

 Non-critical and backup data can be migrated two weeks before the system goes live.


5 Recommendation
1. The target data model for the conversion database is yet to be firmed up. Once we have the
firmed-up target model, the mapping can be started. However, the iterative development model of the
ING Core AFP project demands an iterative construction and unit testing phase for the data migration
programs.

2. The assumption of a month-end-weekend implementation of the whole ING Core AFP data migration may
not hold good. The DM POC can be used as a contingency plan for the complete implementation.

3. The fine line between the several interfaces and the cutoff scenario of data migration has to be
properly monitored. Several cutoff issues are related to the handling of the interfaces during the
cutover window. We recommend formal weekly interaction among the interface team, migration team,
business capability team and maintenance team.

4. The complete migration life cycle (extraction to loading into the target database) has been
designed with redundancy and modularity. This is to ensure that at every logical break point one can
commit or restart.

5. To minimize risk, the implementation strategy assumes a one-off data migration on the weekend and
an incremental build for five days. This portion will include only the data that is unlikely to
change. The data related to active accounts will be migrated on the production cutover weekend.

6. We recommend a comprehensive traceability matrix based on the target fields in the target
database. This will provide proper insight into the project as well as help in the change control
mechanism.

7. The data migration for phase I and phase II will be assumed to be two separate implementations.

8. Candidate field analysis for all the source data is recommended upfront, to identify the potential
data-cleansing requirements.

6 Responsibility Matrix

"Responsibility
matrix.xls"
