
Avinash Reddy
avireddy03@gmail.com | 217-381-7766

Summary

- Over 8 years of experience in Information Technology with a strong background in analyzing, designing, developing, testing and implementing database solutions in verticals such as Financial, Health Payer, Pharmaceutical and Telecom.
- Over 7 years of Data Warehousing experience using Informatica PowerCenter 8.5.1/8.1.1/7.x/6.x/5.x, working with data sources such as Oracle, SQL Server, Excel, flat files, Teradata and DB2.
- Experience with loader utilities including SQL*Loader, FastLoad and MultiLoad.
- Experience integrating various data sources and multiple relational databases (Teradata, Oracle, SQL Server, Sybase, IBM DB2), XML files and mainframes; worked on integrating data from fixed-width and delimited flat files.
- Hands-on knowledge of Informatica administration, including grid management, creation and upgrade of repository contents, and creation of folders and users and their permissions.

- Well acquainted with Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
- Worked extensively with complex mappings using transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected/Connected Lookup and Aggregator.
- Expertise in several key areas of enterprise data warehousing, such as Change Data Capture (CDC), data quality, lookup tables and ETL data movement.

- Strong experience with Teradata loading and unloading utilities such as FastExport, FastLoad and MultiLoad (MLoad).
- Extensive experience using Teradata SQL Assistant for analysis and BTEQ scripts to load data from Teradata staging to the Teradata warehouse.

- Extensively worked with Informatica Designer components (Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Mapplet Designer) and documented the resulting designs.
- Extensively worked with Informatica PowerCenter transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator and XML Source Qualifier.
- Good experience using Informatica Workflow Manager to create and schedule workflows and worklets.

- Hands-on experience tuning mappings and identifying and resolving performance bottlenecks at the source, target, mapping and session levels.
- Expertise in developing SQL and PL/SQL code, including procedures, functions and packages, to implement database business logic in Oracle.
- Good exposure to development, testing, debugging, implementation, documentation and production support.
- Excellent working knowledge of UNIX shell scripting and job scheduling on multiple platforms such as Windows NT/2000, HP-UX and Sun Solaris.

- Hands-on experience working with the data masking tool DMSuite.

- A self-starting, target-oriented, self-disciplined and proactive professional with excellent communication, interpersonal, analytical, problem-solving, intuitive and leadership skills, and the ability to work efficiently on multiple tasks both independently and in team environments.

Technical Skills:

Operating Systems: Windows NT/2000/2003/XP, UNIX (HP-UX 11.x, Sun Solaris 2.x), Red Hat Linux 5.1
Languages: Visual Basic 6.0, ASP, VB.NET, ASP.NET, HTML, Java, JavaScript, SQL, PL/SQL, XML, COBOL, Perl scripting
Databases: MS Access 2000, Oracle 8/9/10g/11g, Teradata V2R5/V2R3, SQL Server 2000, DB2 UDB, AS/400, Lotus Notes 4.0, Sybase, Essbase, Salesforce
Data Warehousing Tools: Informatica 6.x/7.x/8.1/8.6, Apex Data Loader, Informatica Cloud, Cognos Impromptu 6.5, PowerPlay Web Reports, MicroStrategy 8.0, UpFront, Business Objects 4.x/5.x/11i, Web Intelligence 2.6, SQL*Loader, Erwin, Siebel Analytics/OBIEE
Web Servers: IIS 5.0, PWS, Apache
Applications: MS Office, MS Project, MS FrontPage, Rational Rose, Visio, Mall-Surfer, Dreamweaver, Crystal Reports, SAS, Visual InterDev, DMSuite

EXPERIENCE:

Client: Perficient, Denver, CO
Sr. ETL Developer
Mar '11 - Present

Project: Perficient implemented this project for Janus Capital, an investment company. The project involved migrating employee 401(k) accounts from Milliman to Fidelity (the new provider) and keeping iComply in sync with these changes.

Responsibilities:
- Worked on dimensional modeling to design and develop star schemas by identifying the facts and dimensions.
- Designed logical models (relationships, cardinality, attributes and candidate keys) per business requirements using Erwin.
- Involved in designing the process flow for extracting data across various source systems.
- Extensively used Informatica PowerCenter and created mappings using transformations such as Source Qualifier, Router, Lookup, Update Strategy and Sequence Generator.
- Worked with Teradata SQL Assistant to analyze existing data and implemented new business rules to handle various source data anomalies.
- Performed data masking and worked with sensitive data; analyzed the structure of the data to be masked in the Dev, Test and UAT databases.
- Extensively used Teradata utilities such as FastLoad and MultiLoad to load data into the target database.
- Developed sessions using partitioning schemes such as round-robin and hash-key partitioning for better performance.
- Developed UNIX shell scripts to generate parameter files and executed Oracle procedures as batch jobs.
- Optimized and tuned SQL queries and PL/SQL blocks to eliminate full table scans and reduce disk I/O and sorts.
- Automated UNIX shell scripts to verify the count of records added each day by the incremental data load for several base tables, in order to check for consistency (see the sketch following this list).
- Performed unit testing, verified the data extracted from different source systems against user requirements, and performed error handling on sessions in the workflow.
- Created scripts for batch tests and set the required options for overnight, automated execution of test scripts.
- Scheduled batch jobs using Autosys to run the workflows.

- Conceptualized and developed initial and incremental data loads in Informatica using the Update Strategy transformation and mapping variables.
- Responsible for regression testing ETL jobs before test-to-production migration.
- Provided support in handling file validation errors, database outages and connectivity errors for daily and weekly batch loads.
- Investigated and fixed problems encountered in the production environment on a day-to-day basis.
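
For illustration only, a minimal sketch of the kind of UNIX shell record-count check described in the list above; the table name, Oracle connect string and log path are hypothetical placeholders, not project artifacts.

#!/bin/ksh
# Sketch: verify the number of rows added by today's incremental load.
# TABLE, ORA_CONN and LOG below are hypothetical placeholders.
TABLE=stg_owner.base_table
ORA_CONN="etl_user/etl_pwd@DWDEV"
LOG=/tmp/row_count_$(date +%Y%m%d).log

# Count rows stamped with today's load date through a silent SQL*Plus session.
COUNT=$(sqlplus -s "$ORA_CONN" <<EOF
set heading off feedback off pagesize 0
select count(*) from $TABLE where load_date >= trunc(sysdate);
exit;
EOF
)
COUNT=$(echo "$COUNT" | tr -d '[:space:]')

echo "$(date): $TABLE incremental rows = $COUNT" >> "$LOG"

# Flag an empty load so it can be investigated before downstream jobs run.
if [ "${COUNT:-0}" -eq 0 ]; then
    echo "WARNING: no rows loaded into $TABLE today" >> "$LOG"
    exit 1
fi
exit 0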

Environment: Informatica PowerCenter 8.6, Business Objects XI, Teradata, Oracle 10g, MS SQL Server, XML files, TOAD, SQL, PL/SQL, Windows XP, UNIX, Autosys, DMSuite.

Client: DADS (Department of Aging and Disability Services), TX
Sr. Informatica Developer
Jul '10 - Mar '11

Project: Developed an ANE (abuse, neglect, exploitation) data mart for the Texas Department of Aging and Disability Services. Designed and developed a system to collect and maintain data on ANE investigations involving individuals receiving DADS waiver services, and to report on ANE investigations by program and provider.

Responsibilities:
- Interacted with business users to identify the process metrics and the key dimensions and measures.
- Involved in the complete life cycle of the project; developed the FRD (Functional Requirements Document) and the data architecture document and communicated them to the concerned stakeholders.
- Conducted impact and feasibility analysis.
- Worked on dimensional modeling to design and develop star schemas by identifying the facts and dimensions; designed logical models per business requirements using Erwin.
- Designed and developed ETL mappings with transformation logic to extract data from various source systems.
- Worked extensively with Teradata SQL Assistant to analyze existing data and implemented new business rules to handle various source data anomalies.
- Developed BTEQ scripts to load data from the Teradata staging area to the data warehouse, and from the data warehouse to data marts for specific reporting requirements (see the sketch at the end of this entry).
- Extensively used Teradata utilities such as FastLoad and MultiLoad to load data into the target database.
- Optimized high-volume tables (including collection tables) in Teradata using join index techniques, secondary indexes, join strategies and hash distribution methods.
- Involved in performance tuning and optimization of Informatica mappings and sessions using features such as partitioning and data/index caches to manage very large data volumes.
- Used Informatica debugging techniques to debug mappings, and used session log files and bad files to trace errors that occurred during loading.
- Created, tested and debugged stored procedures, functions, packages, cursors and triggers using PL/SQL Developer.
- Used EXPLAIN PLAN to find bottlenecks in a given query and improve job performance.
- Involved in unit testing and user acceptance testing to verify that data extracted from different source systems was loaded into the target according to user requirements.
- Documented ETL test plans, test cases, test scripts, test procedures, assumptions and validations based on design specifications for unit and system testing, including expected results, test data preparation and loading, and error handling and analysis.
- Automated UNIX shell scripts to verify the count of records added each day by the incremental data load for several base tables, in order to check for consistency.
- Involved in the production and deployment phase to ensure job schedules and dependencies were built so that daily SLAs were not missed.

Environment: Informatica PowerCenter 8.6.1, Business Objects XI, Teradata, Oracle 10g, Sybase 12.5, SQL Server, XML files, TOAD, SQL, PL/SQL, Windows XP, UNIX.
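
For illustration only, a minimal sketch of the style of shell-wrapped BTEQ staging-to-warehouse load described above; the tdpid, credentials, database and table names are hypothetical placeholders rather than project artifacts.

#!/bin/ksh
# Sketch: run a BTEQ script that moves one day's data from staging to the warehouse.
# The tdpid, credentials and table names below are hypothetical placeholders.
bteq <<EOF
.LOGON tdprod/etl_user,etl_pwd;

INSERT INTO edw_db.ane_investigation_fact
SELECT stg.investigation_id,
       stg.program_cd,
       stg.provider_id,
       stg.reported_dt
FROM   stg_db.ane_investigation_stg stg
WHERE  stg.load_dt = CURRENT_DATE;

/* Abort with a non-zero return code if the insert failed. */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
EOF
exit $?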

Client: NYCHA (New York City Housing Authority), Manhattan, NY
ETL Developer

Aug '09 - Jul '10

Project: The New York City Housing Authority aims to provide decent, affordable housing in a safe and secure living environment for low- and moderate-income residents. The project objective was to build a data warehouse using history data, pricing strategies and pricing practices to provide statistics for improving business processes and financial performance. This data is used by the reporting application to present to management the key measures needed to help the Authority continue to provide customers an affordable and secure environment.

Responsibilities:
- Modified the logical and physical data models to capture new requirements, reflecting them in the data model using Erwin.
- Involved in designing the process flow for extracting data across various source systems; extracted source data from legacy systems using PowerExchange.
- Designed and developed ETL logic implementing CDC by tracking changes in the critical fields required by the users.
- Created workflows and used tasks such as Email, Event-Wait, Event-Raise, Timer, Scheduler, Control, Decision and Session in Workflow Manager.
- Coded BTEQ scripts to load data from Teradata staging to the enterprise data warehouse.
- Worked with Teradata external loader connections such as MLoad Upsert, MLoad Update and FastLoad in the Informatica Workflow Manager while loading data into target tables in the Teradata database.
- Optimized the existing applications at the mapping, session and database levels for better performance.
- Developed complex mappings to implement Type 2 slowly changing dimensions using transformations such as Source Qualifier, Aggregator, Expression, Static Lookup, Dynamic Lookup, Filter, Router, Rank, Union, Normalizer, Sequence Generator, Update Strategy and Joiner.
- Developed Informatica workflows and sessions associated with the mappings using Workflow Manager.
- Created pre- and post-session stored procedures to drop and recreate the indexes and keys of source and target tables (see the sketch at the end of this entry).
- Performed tuning of Informatica sessions by implementing database partitioning, increasing block size, data cache size and sequence buffer length (with the help of DBAs), adjusting the target-based commit interval and using SQL overrides.
- Performed data validation, reconciliation and error handling in the load process.
- Migrated mappings and workflows from the development server to the test server for integration and system testing.
- Used parallel processing capabilities, session partitioning and target table partitioning utilities.
- Used PL/SQL procedures and UNIX shell scripts with daily/weekly/monthly batches to reduce or eliminate manual testing effort.
- Responsible for writing unit test cases and performing unit testing for data validity based on business rules.
- Member of the on-call team providing support for daily and weekly batch loads.

Environment: Informatica PowerCenter 8.1.1, Teradata V2R5, UNIX, DB2, PowerExchange 8.1, Erwin, TOAD, PL/SQL, flat files, Oracle 10g/9i, SQL Server.
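
As an illustrative sketch only, one way the pre/post-session index maintenance described above could be wired up as a shell wrapper around SQL*Plus; the connect string and the dw_admin procedures are hypothetical, not the project's actual objects.

#!/bin/ksh
# Sketch: pre/post-session command to drop indexes before a bulk load and
# rebuild them afterwards. Connect string, schema and procedure names
# (dw_admin.drop_target_indexes / rebuild_target_indexes) are hypothetical.
PHASE=$1
ORA_CONN="etl_user/etl_pwd@EDWDEV"

case "$PHASE" in
  pre)
    # Drop the target table's secondary indexes before the load.
    sqlplus -s "$ORA_CONN" <<EOF
      whenever sqlerror exit 1
      exec dw_admin.drop_target_indexes('TENANT_FACT')
      exit;
EOF
    ;;
  post)
    # Recreate the indexes and keys once the load has completed.
    sqlplus -s "$ORA_CONN" <<EOF
      whenever sqlerror exit 1
      exec dw_admin.rebuild_target_indexes('TENANT_FACT')
      exit;
EOF
    ;;
  *)
    echo "usage: $0 pre|post" >&2
    exit 2
    ;;
esac
exit $?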

Client: Medica, Minnetonka, MN
Senior Informatica Developer
Jun '08 - Jul '09

Project: Provider Data Repository (PDR). The goal of the project was to provide an enhanced data load process and ad-hoc reporting capabilities for selected business users of the Provider Data group repository. It also included a redesign of the existing schema for new business requirements.

Responsibilities:
- Responsible for business analysis and requirements collection; translated requirements into business rules and made recommendations for innovative IT solutions.
- Involved in analyzing the scope of the application and defining relationships within and between groups of data, the star schema, etc.
- Analyzed the star schema in dimensional modeling and identified suitable dimensions and facts.
- Involved in the design and development of the data mart and populated it from different data sources using Informatica.
- Documented data conversion, integration, load and verification specifications; parsed high-level design specs into simple ETL coding and mapping standards.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Extensively used ETL to load data from a wide range of sources, such as flat files, SAP BW sources and Oracle, to XML documents.
- Worked with the various enterprise groups to document user requirements, translate requirements into system solutions, and produce development, testing and implementation plans and schedules.
- Used transformations such as Joiner, Expression, Connected and Unconnected Lookup, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Java Transformation, Router and Sequence Generator.
- Developed, documented and executed unit test plans for the components.
- Involved in Informatica administrative work such as creating Informatica folders and repositories and managing folder permissions.
- Worked with ETL developers and the SQL Server DBA team; collected performance data for sessions and tuned performance by adjusting Informatica session parameters.
- Used XML schemas to extract data from Oracle and Teradata into XML using the export option in Informatica.
- Created pre-session and post-session shell scripts and mail notifications (see the sketch at the end of this entry).
- Developed UNIX shell scripts to drop and recreate indexes and key constraints.
- Used TOAD to develop Oracle PL/SQL stored procedures.
- Created the design and technical specifications for the ETL process of the project.
- Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
- Maintained development, test and production mapping migration using Repository Manager.
- Involved in enhancement and maintenance activities of the data warehouse, including performance tuning.
- Involved in interactions with the batch scheduling team and the Control-M team, who scheduled most of the processes on the remote UNIX box.
- Extensively used Business Objects for report generation.

Environment: Informatica PowerCenter/PowerMart 6.x, PowerCenter Designer, Workflow Manager, Workflow Monitor, PowerConnect, Oracle 9i, SQL Server 2005, UNIX, Business Objects 6.5, Windows XP, TOAD.
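
A minimal sketch, for illustration only, of the kind of post-session shell script with mail notification mentioned above; the session arguments, distribution list and log path are hypothetical placeholders.

#!/bin/ksh
# Sketch: post-session notification script of the kind called from a session's
# post-session command. Session name, status, mail list and log path are
# hypothetical placeholders.
SESSION=${1:-unknown_session}
STATUS=${2:-1}
MAILTO="etl_support@example.com"
LOG=/tmp/${SESSION}_$(date +%Y%m%d).log

echo "$(date): session $SESSION finished with status $STATUS" >> "$LOG"

if [ "$STATUS" -ne 0 ]; then
    # Alert the support distribution list when the session fails.
    mailx -s "FAILURE: Informatica session $SESSION" "$MAILTO" < "$LOG"
else
    mailx -s "SUCCESS: Informatica session $SESSION" "$MAILTO" < "$LOG"
fi
exit 0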

Client: XO Communications, Plano, TX
Informatica Developer
Jul '07 - May '08

Project: XO Communications is a telecommunications provider offering nationwide communication services such as voice, VoIP and internet to small and growing businesses as well as larger enterprises. The Infostore data warehouse, created from the merger of Allegiance Telecom and XO Communications, Inc., integrates data from different sources.

Responsibilities:
- Implemented the software development life cycle methodology for creating reports; performed requirements analysis in support of data warehousing efforts.
- Worked on PowerCenter Designer client tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Tuned the performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and the target-based commit interval.
- Created multiple catalogs and resolved loops by creating table aliases and contexts.
- Organized data in reports using filters, sorting and ranking, and highlighted data with alerts.
- Created transformations such as Sequence Generator, Lookup, Joiner and Source Qualifier in Informatica Designer.
- Extensively worked with Informatica Designer, Repository Manager, Repository Server, Workflow Manager/Server Manager and Workflow Monitor.
- Created workflows containing command, email, session, decision and a wide variety of other tasks to load data into the target database.
- Scheduled batches and sessions within Informatica using the Informatica scheduler and also wrote shell scripts for job scheduling.
- Employed performance tuning to improve the performance of the entire system; performed pipeline partitioning to speed up mapping execution time.
- Performed code review testing.
- Developed PL/SQL aggregation programs and stored procedures for use by Business Objects queries.
- Provided production support by monitoring the processes running daily.
- Developed Informatica mappings and tuned them for better performance; created Informatica mappings with PL/SQL procedures/functions to build business rules for loading data.
- Performed performance tuning of Oracle PL/SQL scripts; designed and developed Perl scripts.

Environment: Informatica PowerCenter 6.x, Workflow Manager, Business Objects 5.0, Oracle 9i/8i, SQL, PL/SQL, TOAD, SQL*Loader, MS SQL Server 2000, UNIX, Windows NT 4.0.

Client: Satyam / Bharti Airtel, Gurgaon, India
Jr. Informatica Developer
Aug '03 - May '07

Project: Key Management Report System. This project is a critical downstream application of the Bharti Airtel data warehouse. Its main objective is to calculate monthly commissions/rebates for Bharti Airtel employees and indirect retailers based on complex business rules. The commissions system extracts data from the data warehouse using Informatica, applies complex rules through PL/SQL code, and arrives at the monthly payouts to be paid to salespeople distributed across different sales channels. Several daily reports are run on this system to evaluate the business from various perspectives.

Responsibilities:
- Involved in data extraction and transformation from various source databases and flat files, and loaded the data into the data warehouse using Informatica.
- Created functional design documents and technical design specification documents for ETL based on the requirements.
- Created tables, views, indexes, sequences and constraints.
- Developed stored procedures, functions and database triggers using PL/SQL according to specific business logic.
- Transferred data to the database using SQL*Loader (see the sketch following this list).
- Created source and target definitions, transformations, mapplets and mappings using Designer.
- Implemented transformations such as Expression, Lookup, Sequence Generator, Source Qualifier, Rank and Aggregator to load data from the staging area to the data warehouse.
- Used Server Manager to create, run and monitor sessions.
- Involved in debugging mappings to fix data transformation problems and in tuning sessions to improve performance.
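
As an illustration of the SQL*Loader data transfer described in the list above, a minimal sketch that writes a control file and invokes sqlldr; the file layout, paths, table name and connect string are hypothetical placeholders.

#!/bin/ksh
# Sketch: load a delimited monthly payout extract into a staging table with SQL*Loader.
# The control-file layout, file paths, table and connect string are hypothetical placeholders.
CTL=/tmp/payout_load.ctl

# Write a simple control file describing the delimited input.
cat > "$CTL" <<'EOF'
LOAD DATA
INFILE '/data/in/monthly_payouts.dat'
APPEND
INTO TABLE stg_monthly_payouts
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  emp_id,
  sales_channel,
  payout_amount,
  pay_period_start DATE "YYYY-MM-DD"
)
EOF

# Run the load and propagate a non-zero return code (warnings included) to the scheduler.
sqlldr userid=etl_user/etl_pwd@KMRS control="$CTL" log=/tmp/payout_load.log
RC=$?
if [ "$RC" -ne 0 ]; then
    echo "SQL*Loader failed with return code $RC" >&2
fi
exit "$RC"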

Environment: Informatica PowerCenter 5.1, Oracle 8i, MS SQL Server 2000/7.0, XML, Windows NT/2000, HP-UX.
