
Ravi Verma

Informatica Developer - Accenture Service Pvt. Ltd


Bangalore, Karnataka - Email me on Indeed: indeed.com/r/Ravi-Verma/e3e5e4011cf18a40
A competent and result-oriented professional with 36 months of experience in Software Development and
Project Execution.
Extensively involved in extraction of data from multiple sources for ETL processes.
Currently associated with Accenture Services Pvt. Ltd., Bangalore as a Software Engineering Analyst.
Sound experience in project execution activities including module Estimation, Design, Build, and quality
management of the applications.
Experience in end-to-end implementation of the SDLC, including finalization of design and implementation, for
Data Migration, System Integration and Data Warehousing projects.
Possess excellent interpersonal, communication and analytical skills with demonstrated abilities in interacting
with clients.
Significant experience working with Clients, Project Managers and technical teams for securing & executing
concurrent multi-technology BI projects.
Conversant with designing, building and supporting applications in Informatica Data Quality (IDQ) 9.5,
Informatica PowerCenter 9.5 and SQL.
Willing to relocate: Anywhere

WORK EXPERIENCE

Informatica Developer
Accenture Service Pvt. Ltd - Bangalore, Karnataka - February 2013 to Present
Title: Planning/ Stabilization of business applications for Client Nike.
Environment: IDQ 9.5, Flat Files, Oracle 10g, Linux
Description: Nike is one of the largest US-based companies dealing in apparel and footwear in the
fashion industry. My team's responsibility was to develop the
mapping specifications and implement business rules according to the business
requirements, then profile the mappings, create data quality metrics and
carry out the data cleansing activities.
Responsibilities:
Extensively involved in extraction of data from Oracle and Flat files.
Designed a data quality validation process using the IDQ tool to check the health of data by running
profiles and validation rules on the data sets.
Designed and developed various Mappings in Informatica Analyst for generic rule validation of
data, profiling with rules and creating scorecards for reporting.

Used various Transformations like Standardizer, Parser, Decision, Match and Key Generator for better data
cleansing, so as to have clean and consistent data across the data quality dimensions.
As a member of Data Cleansing Team, responsible for analyzing, designing and developing mapping
specifications, profiles and scorecards for Data Quality check and further implementing the Cleansing rules.
Developed standard and re-usable mappings and mapplets using various transformations like Expression,
Aggregator, Joiner, Router, Standardizer, Match, Labeler, Parser and other data cleansing transformations
Performed data quality checks in Analyst and implemented the cleansing business rules in Developer to clean
the data before loading it into the data warehouse.
Optimized various Mappings, Mapplets and Scorecards for efficient execution.
Performed unit testing on the developed Mappings.
Title: System Integration of Business data for the Client Mattel
Environment: IDQ 9.5, Informatica PowerCenter 9.5, Flat Files, Oracle, Linux, Putty,
HPQC
Description: Mattel is an American toy manufacturing company. Its products and brands
include Fisher-Price, Barbie dolls, Monster High dolls, Hot Wheels and
Matchbox toys, Masters of the Universe, American Girl dolls, board games, WWE
toys and early-1980s video games. My team was responsible for performing data
cleansing activities on the existing data and then developing the ETL mappings and workflows to integrate
the data from different systems.
Responsibilities:
Extraction of data from Flat files, Oracle and XML files.
Performing data quality checks to ensure the quality of data and interacting with the business analysts.
As a member of Data Cleansing Team, responsible for analyzing, designing and developing data quality
Mapping specifications, Profiles and Scorecards for cleansing of master data.
Profiling and analyzing the data to identify all the issues and propose the technical validation rules for the
data quality report generation.
Developed re-usable mapplets and flows to clean the existing master data from the different legacy systems
and load it into the current system.
Designed ETL processes and developed ETL mappings and workflows using Informatica to load from Source
to Target after implementing business logic and transforming the data as per the requirement.
Developed re-usable sessions and flows to integrate the customer ratings and feeds with the product, using
transformations like Transaction Control, Expression, Filter, Sorter, Joiner, Router and Lookup to
implement business logic.
Created unit test cases and performed unit testing on the developed Mappings.
Title: System Integration of Business data for the client TeliaSonera
Environment: IDQ 9.5, PowerCenter 9.5, Flat Files, MS SQL Server, Toad, Oracle, Linux, TIBCO-MFT, Putty
Description: TeliaSonera is the dominant telephone company and mobile network operator in
Sweden, Finland and Denmark. My team was responsible for integration between the two systems:
extracting the data from various
legacy systems and loading the cleansed data into the current system, to be used by transactional flows for SAP
ERP systems.
Responsibilities:
Extensively involved in extraction of data from Flat files and Oracle database
Responsible for the reference data for the RR2R system: extracting the data from various source systems
and loading it into the reference tables after the cleansing activities.
Responsible for implementing the Cleansing rules as per the specifications.
Designed data quality Mapplets for the reference data as well as the transactional data using IDQ to load
from 39 different Source Systems to Target system after performing Data Cleansing.
Created Data Services and deployed as Applications to be used by the business users from the third-party
tools.
Created Custom and Quick profiles to analyze the data and the corresponding patterns in the data.
Created scorecards to represent the actual health of the data, enabling stakeholders to take further business decisions.
Performed join analysis profiling to verify the accuracy of a join prior to using it in a Mapplet/
Application.
As a member of Data Quality Team, responsible for analyzing the data sets, designing and developing Data
quality mapplets to clean the Reference and Master Data to be used by transaction flows.
As a member of ETL Team, responsible for designing and developing ETL Mappings to generate the target
files for SAP system.
Created an error-handling mapplet which is used across 350+ mappings.
Developed standard and re-usable mapplets for the Reference Auditing of the Target files.
Responsible for implementing error handling logic, audit logic and data quality checks to ensure the quality
of data and its smooth flow.
Created and Executed unit test cases on all the developed mappings to ensure smooth delivery across the
phases of the project.

EDUCATION

CBSE
Holy Mission High School - Samastipur, Bihar
2006 to 2008

CBSE in Online Brain Quiz on ibibo


D.A.V. Public School - Kanti, Bihar
2005 to 2006

Bachelor of Technology in Computer Science and Engineering


ANNAMALAI UNIVERSITY

SKILLS
Informatica, ETL, IDQ, Informatica Data Quality (3 years)

ADDITIONAL INFORMATION
AREAS OF EXPERTISE
Technical Skill - Informatica Data Quality (IDQ), INFORMATICA PowerCenter, SQL
Database: Oracle, MS SQL Server
Other Skills: Talend, Unix, Putty, TIBCO - MFT, PL/SQL, SAP HANA Studio, TOAD
Ticketing Tool: HPQC
Industry Domain: Resources, Telecom and Consumer Goods & Services (Fashion)
Functional Skill
Project Execution
Interfacing with Project Leads / Clients for requirement gathering and finalizing of technical specifications.
Adhering to plans & schedules, participating in team meetings for individual projects.
Interacting with team members to ensure smooth progress of project work.
Maintaining quality and coding norms throughout the development & implementation process.
Software Development
Handling various technical aspects like project documentation, system design & integration, monitoring critical
processes & taking appropriate actions.
Designing and implementing modules, delivering them per scheduled deadlines, and planning and coordinating
post-implementation and maintenance support with the technical support team and client.
Interacting with client for requirement gathering, system study & analysis.
Preparing technical design document as per the functional requirements.
Assisting in functional testing of the new system and ensuring that it meets the user specifications.
Proficient in SDLC activities related to requirements gathering, documentation, prototyping, development,
testing, implementation, user training and support for clients.
Informatica Experience:
Proficient in Data Quality (Informatica) development using Oracle, MS SQL Server and Flat Files as both Sources
and Targets.
Proficient in using Informatica Workflow Manager, Workflow monitor to create, schedule and control
Workflows, Tasks, and Sessions.

Proficient in the development of Data Quality Metrics for checking the health of data and further implementing
the data cleansing rules to clean the data.
Profiling of data sets in the Analyst and further analysis to identify discrepancies and inconsistencies
in the data as per the data quality metrics.
Scorecarding and charting for analysis purposes after implementing the data cleansing rules.
Cleansing of the data in IDQ Developer using data quality transformations such as Standardizer,
Parser, Match, Merge, Consolidation, Association and Key Generator.
Performed different types of profiling to analyze the data, identify inconsistencies within it, and
propose the validation and cleansing rules.
Created Data Services and applications in IDQ to be used by third-party interfaces for Reporting/
Dashboards.
Created reference tables in IDQ-Analyst and used the same for the data standardization as well.
Created workflows in IDQ and executed them through Unix jobs.
Testing of Mappings, Mapplets, Rules for Profiles and Custom Development.
Sound experience in Informatica 9.1 and 9.5 using components such as Mapping Designer, different types of
Transformations (Source Qualifier, Filter, Aggregator, Router, Update Strategy, Lookup, Union, etc.)
and Workflow Manager for creating sessions and workflows.
Deployed workflows and sessions from one environment to another using repository manager.
Proficient in creating Reusable Sessions, Mapplets and Transformations.
Proficient in designing ETL processes using the Informatica tool to load from Sources to Targets through data
Transformations.
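The rule-based validation and scorecarding described above can be illustrated outside Informatica's GUI tooling with a small Python sketch; the record layout, column names and rules here are hypothetical, not taken from any of the projects above:

```python
# Minimal sketch of rule-based data quality checks and a simple "scorecard",
# analogous to IDQ profiles/rules. Records, columns and rules are hypothetical.
import re

RULES = {
    "customer_id": lambda v: v is not None and str(v).isdigit(),
    "email": lambda v: v is not None
        and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "country": lambda v: v in {"SE", "FI", "DK"},
}

def score(records):
    """Return the percentage of records passing each rule."""
    totals = {col: 0 for col in RULES}
    for rec in records:
        for col, check in RULES.items():
            if check(rec.get(col)):
                totals[col] += 1
    n = len(records) or 1
    return {col: round(100.0 * passed / n, 1) for col, passed in totals.items()}

records = [
    {"customer_id": "101", "email": "a@b.com", "country": "SE"},
    {"customer_id": None,  "email": "bad",     "country": "FI"},
]
scorecard = score(records)
print(scorecard)  # {'customer_id': 50.0, 'email': 50.0, 'country': 100.0}
```

In IDQ the equivalent rules would be mapplets applied in a profile, with the scorecard built in the Analyst; this sketch only mirrors the pass-percentage idea.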
Data warehousing/SQL:
Data warehouse design and Physical/Logical Data Modeling, Star Schema and Snowflake Schema
Good knowledge in data warehousing techniques for Data Cleansing, Surrogate Key Assignment and Change
Data Capture
Loading and Unloading Data from the Data warehouse Tables
Good understanding of Business Intelligence (BI)
Capable of incorporating various data sources such as Microsoft Access, MS SQL Server, Oracle and Flat
Files into the Staging Area. Competent in database concepts and SQL analysis.
Capable of writing complex SQL queries for analysis as well as implementation across mappings.
Strong understanding of business processes and their interface with IT across all phases of the Software
Development Life Cycle (SDLC)
Good technology and business experience in the areas of Fashion and Communications Industries.
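The surrogate key assignment and change data capture techniques mentioned above can be sketched in plain Python; the dimension layout and the Type 1 overwrite policy here are hypothetical choices for illustration, not a description of any specific project:

```python
# Hypothetical sketch of surrogate key assignment with simple change data
# capture: incoming rows are compared against the current dimension, new
# natural keys get a fresh surrogate key, changed rows are overwritten
# (SCD Type 1), unchanged rows are left alone.
def load_dimension(dim, incoming, next_key):
    """dim maps natural_key -> {'sk': int, 'attrs': dict}; returns (dim, next_key)."""
    for nk, attrs in incoming.items():
        row = dim.get(nk)
        if row is None:                    # new record: assign a surrogate key
            dim[nk] = {"sk": next_key, "attrs": dict(attrs)}
            next_key += 1
        elif row["attrs"] != attrs:        # changed record: Type 1 overwrite
            row["attrs"] = dict(attrs)
    return dim, next_key

dim = {"C1": {"sk": 1, "attrs": {"city": "Oslo"}}}
incoming = {"C1": {"city": "Bergen"}, "C2": {"city": "Malmo"}}
dim, next_key = load_dimension(dim, incoming, next_key=2)
print(dim["C1"]["attrs"]["city"], dim["C2"]["sk"])  # Bergen 2
```

A Type 2 variant would append a new row with a fresh surrogate key and effective dates instead of overwriting; the detection logic (compare incoming attributes against the current row) is the same.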

Strong communication and interpersonal skills to maintain effective work relationships with all levels of
personnel
Self-motivated and energetic, able to set priorities with adherence to guidelines and deadlines
Using prototyping skills when appropriate to clarify customer requests/problems and determine appropriate
action, with proven success exceeding customer expectations
Capable of working in high-stress environments with resource constraints
Strong organizational and problem solving capabilities
Excellent communicator with emphasis on clearly presenting information and developing strong interpersonal
relationships
