
Avijit Roy Nandi

Objective: To give my best effort in everything I do.

Technical Expertise:
7 years of IT experience in client-server, multi-tier, and data warehouse domains.
- Experience in designing relational and dimensional models.
- Experience handling different kinds of databases: Oracle 7.3, Oracle 8x, Oracle 9i, Oracle9iAS, and Developer 2000 (4.5 & 6i), in UNIX and Windows environments.
- Detailed understanding of Oracle, including conversion of old data systems to new systems, loading data through SQL*Loader, and Export/Import.
- Extensive working experience in data warehousing using ETL tools such as Informatica and Ascential DataStage.
- Experience in test design, reviews, defect logging and tracking, and preparation of test plans and test summary reports.
- Worked at client locations in London, Prague, and the USA for a period of 3 years for DHL.

Project Details:
Project Name: Nor'easter


Organization: Capgemini Consulting India Pvt. Ltd.

Duration: April 2007 to date


Client: FairPoint Communications, Inc.
Team Size: 400 offshore, 200 onshore
Environment: Informatica PowerCenter 8.1, Oracle 9i, TOAD, HP-UX 11.23, Windows XP
Project Description: FairPoint Communications, Inc. is a leading provider of communications services to rural communities. FairPoint offers a wide range of services to residential and business customers, including local voice, long distance, and data products. FairPoint is among the more than 1,100 small, independently owned firms known as RLECs (rural local exchange carriers). Capgemini's goal under the Nor'easter program is to help FairPoint by providing expertise, leadership, and program management across the full software development life cycle, and to support organizational design activities. The business goal of this program is to assist FairPoint Communications to

provide a transition of all current products and services from Verizon to a new platform within a 15-month timeframe, making FairPoint independent of the Verizon TSA. The program will also define the new organizational structure and transition acquired employees to support the new company in operating and delivering the current products and services.
Responsibilities:
- Acted as team lead for one module (ERP-HR Systems) with respect to data conversion.
- Took part in analysis, mapping two different systems (PeopleSoft to Oracle Apps); this was a challenge, as it required an understanding of both systems.
- Drove daily client calls to discuss the mapping analysis, target requirements, and source data analysis, and to raise defects.
- Created and reviewed Informatica mapping code developed by the developers under me, and coordinated with the testing teams to carry the data through to the end.
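The source-to-target mapping work described above can be illustrated with a minimal sketch. All field names below are hypothetical placeholders, not the actual PeopleSoft or Oracle Apps columns:

```python
# Minimal sketch of a source-to-target field mapping between two HR systems,
# in the spirit of the PeopleSoft-to-Oracle-Apps conversion described above.
# All field names here are illustrative assumptions, not the real mapping.

FIELD_MAP = {
    "EMPLID": "employee_number",   # hypothetical source ID -> target key
    "NAME": "full_name",
    "HIRE_DT": "hire_date",
    "DEPTID": "department_code",
}

def map_record(src_row):
    """Translate one source row into the target layout, flagging gaps."""
    target, issues = {}, []
    for src_field, tgt_field in FIELD_MAP.items():
        value = src_row.get(src_field)
        if value is None:
            issues.append(f"missing source field: {src_field}")
        target[tgt_field] = value
    return target, issues

row = {"EMPLID": "1001", "NAME": "A. Roy Nandi", "HIRE_DT": "2001-10-15"}
mapped, problems = map_record(row)
```

In practice such a mapping document would be reviewed with the client on the daily calls before any Informatica mapping is built from it.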

Project Details:
Project Name: CONTAINERIZATION

Organization: ITC Infotech India Ltd.


Duration: July 2006 to March 2007
Client: DHL IT Services, Scottsdale, USA
Team Size: 10
Environment: MQT, DataStage 7.1 (DataStage Manager, Designer, and Director), Oracle 9i, Informix on UNIX AIX 5
Job duties and responsibilities:
- System and integration test planning.
- Analysis of requirements for preparation of the test plan using business rules and the SRS.
- Preparation of system and integration test plans and the project plan.
- Preparation of estimates for the entire system test life cycle.
- Preparation of defect cycle and maintenance plans for use in the TeamTrack defect tracking tool and Mercury Quality Center.
- Preparation of test cases using the Mercury QTP tool.
- Preparation of the test reporting approach using Mercury Quality Center.
- Preparation of build reports.
- Risk management.

Project Details:

Organization: ITC Infotech India Ltd.


Project Name: European Global Customer Database (EU GCDB)
Client: DHL, Czech Republic
Duration: Jan 2005 to July 2006
Team Size: 16
Environment: Software: Ascential DataStage 7.1 as ETL tool. Operating System: HP-UX 11.20. Databases: Oracle 8i. Other tools: Quest Software Toad 8.
Description: The main objective of this project was to migrate and validate the customer data existing in the disparate legacy systems of all the integrated DHL partner companies across the EU into a central consolidated customer database, the Global Customer Database (GCDB). GCDB is a single consolidated database that encompasses the customer-related data existing in various applications across the 18 EU countries. It contains data related to customers, accounts, addresses, agreements, and so forth from various source applications in the areas of billing, customer services, sales, and marketing.
Role: Actively involved in business/source system analysis for different EU countries and in the creation of the GCMF file. GCMF stands for Global Customer Common Message Format. The GCMF layout is used for providing data extracts from legacy systems to GCDB, from GCDB to global solutions (e.g. Dashboard and COMET) requiring customer data, and for updates flowing back from those global solutions. A GCMF file may be created when customer information is extracted from legacy systems for upload into GCDB, when data from GCDB is to be provided to applications requiring customer information, or when global solutions need to provide customer information updates back to GCDB. The GCMF is used both for complete extracts of customer information and for incremental updates. Played a major role in validating the legacy data, applying general and country-specific business validation rules while migrating the data into GCDB.
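The general-plus-country-specific validation described above can be sketched as follows; the field names and rules are illustrative assumptions, not the actual GCMF layout:

```python
# Sketch of country-specific validation applied while building a common
# message format record, loosely modelled on the GCMF flow described above.
# Field names and validation rules are illustrative assumptions only.

import re

COUNTRY_RULES = {
    # hypothetical postcode patterns per country code
    "DE": re.compile(r"^\d{5}$"),
    "GB": re.compile(r"^[A-Z0-9 ]{5,8}$"),
}

def validate_customer(rec):
    """Apply general rules first, then country-specific rules; return errors."""
    errors = []
    if not rec.get("account_number"):          # general rule: key must exist
        errors.append("account_number missing")
    rule = COUNTRY_RULES.get(rec.get("country", ""))
    if rule and not rule.match(rec.get("postcode", "")):
        errors.append("postcode invalid for country")
    return errors

ok = validate_customer({"account_number": "A1", "country": "DE", "postcode": "80331"})
bad = validate_customer({"account_number": "", "country": "DE", "postcode": "X"})
```

A record with an empty error list would be written to the common-format extract; anything else would be rejected back to the source country for correction.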

Project Details:
Organization: ITC Infotech India Ltd.

Project Name: COMET
Client: DHL, London (UK)


Duration: Feb 2004 to Jan 2005
Team Size: 25
Environment: Ascential DataStage 7.1 as ETL tool, ProfileStage 7.0, and QualityStage 7.0. Informix as database. Operating System: HP-UX 11.20.

Overview: The objective of the project was to integrate data from different locations into one centralized location for business use. Within the project scope, data from seventeen countries (Italy, Sweden, Norway, Switzerland, Denmark, etc.) was integrated in one place. The project had two phases: migration and production. In the migration phase, various types of data (customer, sales, account) for a particular country were extracted, then validated and filtered. In the production phase, the validated data went to the COMET server after all lookup validation. The production phase also took care of ongoing data, loading new data on a daily basis.
Responsibilities:
- Understand the data and business rules.
- Update the mapping document.
- Code DataStage job designs.
- Test the jobs against predefined sets of results and data.
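The daily incremental pattern described for the production phase can be sketched in a few lines; the row layout below is an illustrative assumption:

```python
# Tiny sketch of the daily incremental load described above: only rows
# changed since the previous load's cutoff are picked up and sent on.
# Field names are illustrative assumptions, not the actual COMET schema.

from datetime import date

def incremental_batch(rows, last_load_date):
    """Select rows modified after the previous load's cutoff date."""
    return [r for r in rows if r["modified_on"] > last_load_date]

rows = [
    {"id": 1, "modified_on": date(2004, 6, 1)},
    {"id": 2, "modified_on": date(2004, 6, 3)},
]
batch = incremental_batch(rows, date(2004, 6, 2))
```

In a DataStage job this selection would typically be pushed down into the source extract query rather than done in application code.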

Project Details:
Organization: WDC, Kolkata.

Project Name: GM GRF MARS
Client: General Motors Corporation
Duration: July 2003 to Jan 2004
Team Size: 18
Environment: Software: Informatica 6.2 as ETL tool. Databases: Oracle 9i. Other tools: Microsoft Visual SourceSafe (VSS) 6.0, Toad 7.0.4.26 (Quest Software), Advanced Query Tool (AQT) SQL 6.1, Lotus Notes 5.1.3.
Overview: GM (General Motors) had been investigating solutions to provide comprehensive business information services to WWP executives, managers, and individuals since at least 1997. Difficulty in reaching consensus on needs, scope, technology, service providers, and cost had led to partial solutions being

implemented, but a complete business intelligence and data management solution had not previously been defined or implemented. MARS (Metrics and Reporting System) provides on-demand, web-delivered, self-service business information to the broad GM Worldwide Purchasing organization worldwide. The business information must be consistent, complete, accurate, up to date, and verifiable. GM sought expert assistance in implementing the WWP (Worldwide Purchasing) MARS system, and IBM used applicable components from its GM Global Reporting Factory (GRF) delivery model to execute this project. In the first phase, incremental data is pulled from the GIF database, validated, and subsequently loaded into GMNEW ODS tables. Error logging is performed for any record that does not pass validation.
Responsibilities:
- Go through the functional specification for a particular object.
- Construct the mapping based on the functional specification in PowerCenter Designer, and run the mappings through Workflow Manager and Workflow Monitor.
- Test (unit/integration) the jobs with data according to the test plan.
- Finally, load the data from source into the ODS staging area.
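The validate-then-load step with error logging, as in the GIF-to-ODS flow above, can be sketched like this; the record layout and rule are illustrative assumptions:

```python
# Sketch of validate-then-load with error logging: records failing
# validation are diverted to an error log instead of the target table,
# mirroring the GIF-to-ODS flow described above. Names are illustrative.

def load_with_error_log(records, validate):
    """Split records into loadable rows and logged rejects."""
    loaded, error_log = [], []
    for rec in records:
        problems = validate(rec)
        if problems:
            error_log.append({"record": rec, "errors": problems})
        else:
            loaded.append(rec)
    return loaded, error_log

def check(rec):
    # hypothetical business rule: amounts must be non-negative
    return [] if rec.get("amount", 0) >= 0 else ["negative amount"]

loaded, errors = load_with_error_log(
    [{"id": 1, "amount": 10}, {"id": 2, "amount": -5}], check
)
```

In Informatica this split is usually realized with a router or filter transformation feeding a separate error table.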

Organization: TSG (Total Solutions Group, Kolkata).


Project Name: Nestle GLOBE DC/ETL
Client: Nestle Global
Duration: December 2002 to June 2003
Team Size: 40
Environment: Software: Informatica 6.0 as ETL tool. Server: Windows 2000 Server, with Windows 2000 Professional/98 as client. Databases: IBM DB2 7.2, Oracle 8i. Other tools: Microsoft Visual SourceSafe (VSS) 6.0, Advanced Query Tool (AQT) SQL 6.1, Embarcadero Rapid SQL 7.1.5, Lotus Notes 5.1.3.
Overview: Project GLOBE at Nestle was the biggest SAP implementation in the world, with 55 sites and 300 different SAP installations, and one of the largest ETL projects in Europe. It was a challenging project because of its size and its extremely tight deadlines. The assignment included: design of the ODS schema for the data and metadata, design of PowerCenter mappings and workflows, construction of mappings, unit testing of mappings, and system testing of mappings.

Migration of market legacy data to SAP was done in two phases: a market-dependent phase and a generic phase. The market-dependent phase largely dealt with the extraction and pre-processing of legacy data into a preliminary data structure that already resembled the final SAP structure. Once legacy data was molded into a uniform structure, generic data processing could be applied. The system extracted, transported, and loaded data from markets with legacy systems (Malaysia, Chile, Poland, etc.) to the ERP (SAP) systems located in Switzerland, Poland, and other countries. The Informatica repository server was situated in Switzerland, with Informatica PowerCenter 6.0 clients connecting to it from the IBM office in Kolkata. Informatica PowerConnect 6.0 enabled communication between the repository server and SAP/AS400.
Responsibilities: As a system consultant on this project, I was primarily responsible for the development of the Switzerland and Poland objects. This involved understanding the functional specification, creating a technical specification and a unit test plan, developing in Informatica 6, and testing the objects. The following Informatica PowerCenter tools were used:
Repository Administrator: a) Create repositories such as ETLGL100 (Switzerland) and ETLPL300 (Poland). b) Create and manage users. c) Folder management (copy, delete, etc.).
PowerCenter Designer: a) Analyze sources from any RDBMS or flat files. b) Design the target warehouse and create targets from the Informatica 6.0 Designer. c) Create the necessary transformations. d) Create the necessary mappings for migration. e) Work with reusable logic such as mapplets.
Workflow Manager & Workflow Monitor: a) Create sessions. b) Create workflows. c) Monitor submitted tasks through Workflow Monitor using logs, and debug sessions.
PowerConnect: a) SAP/R3 for connecting to SAP. b) PowerConnect AS/400 (Detail 4.2.0 of Striva Corporation).

Organization: TSG (Total Solutions Group, Kolkata).
Project: BSkyB CRM Program
Client: BSkyB (British Sky Broadcasting)
Environment: Software: Informatica 6.1 as ETL tool. Server: Windows 2000 Server, with Windows 2000 Professional/98 as client. Databases: Oracle 8i on UNIX. Other tools: Microsoft Visual SourceSafe (VSS) 6.0, Toad 7.0.4.26 (Quest Software), Advanced Query Tool (AQT) SQL 6.1, Embarcadero Rapid SQL 7.1.5, Lotus Notes 5.1.3.
Overview:

BSkyB is a giant broadcasting company and runs several systems for its daily operations. On top of these heterogeneous systems sits a data warehouse solution, and data from the different systems flows to the data warehouse on a scheduled basis. BSkyB opted for Informatica to pull data from the different source systems into one target, the data warehouse. Informatica developments were carried out according to the functional specifications of the various interfaces. The source system is an Oracle database populated from a CRM system developed in Chordiant. The target database is also Oracle, where data is accessed by the MIDAS data warehouse. The extract is carried out in two stages: source to ETL staging, and then DW staging to DW. The transformed data is used for DW reporting.
Responsibilities:
- User requirement study.
- Analysis and design.
- Prepare the functional specification for a particular object.
- Design of the ODS schema for the data and metadata.
- Construction of mappings and workflows in PowerCenter Designer.
- Test (unit/integration) the mappings with data according to the test plan.
- Finally, load the data from source into the data warehouse staging area.
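The two-stage extract described above can be sketched as a pair of functions; the transformation shown (uppercasing a name) is an illustrative assumption, not the actual interface logic:

```python
# Minimal sketch of a two-stage extract: source rows land in an ETL
# staging area unchanged first, then a transformed pass moves them to
# the DW staging area, as in the flow described above. The specific
# transformation and field names are illustrative assumptions.

def to_etl_staging(source_rows):
    """Stage raw source rows unchanged (stage 1)."""
    return list(source_rows)

def to_dw_staging(etl_rows):
    """Apply transformations before the warehouse load (stage 2)."""
    return [{**r, "customer_name": r["customer_name"].upper()} for r in etl_rows]

etl = to_etl_staging([{"customer_name": "sky viewer", "package": "sports"}])
dw = to_dw_staging(etl)
```

Keeping stage 1 untransformed makes reconciliation against the source system straightforward, which is the usual reason for a separate ETL staging layer.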

Organization: National Informatics Centre (NIC), Ministry of Information Technology, WBSU, Kolkata, on behalf of SYSCOM.

Project: Information Management for Promotion of Administration in the Commercial Taxes (IMPACT), West Bengal.
Client: Directorate of Commercial Tax, Government of West Bengal.
Duration: October 2001 to December 2002.

Environment: Database: Oracle 9i. Internet Developer Suite: Oracle9iAS Portal, Developer 2000 (Forms 6i & Reports 6i). Application Server: Oracle 9i Application Server. Other tools: ERwin 4.0. Operating System: HP Tru64 UNIX 11, Windows 2000 Advanced, Windows 98.
Team Size: 15
Overview: IMPACT provides an integrated management information system covering all activities under VAT at the Directorate of Commercial Taxes, W.B., at its headquarters and its circle, charge, zonal, and range offices and check posts. The web-based architecture provides authenticated and authorised access to its central database from any node in the network. IMPACT would enable the West Bengal Commercial Taxes Directorate to switch over to a VAT-based model, build up a comprehensive database, provide a strong management information system, and integrate all the offices of the Directorate through LAN/WAN, benefiting both the Directorate and the taxpayers.
Role:

- Flowchart creation for the processes (viewable pages) and their structure.
- Creation of database objects: involved in designing and normalization of tables; data conversion from the old system to the new system; defining table-level and column-level constraints; creating views, indexes, synonyms, and sequences.
- Development of forms: creating and linking forms with the database tables; developing various triggers such as validation triggers and format triggers; extensive querying on the database; calling various graphs and reports inside forms; preliminary testing with sample data.
- Building reports: creating various reports, including character-based reports, using Report Builder; calling graphs inside reports; performance testing with sample data.
- Building graphs: creating charts using Graphics Builder; preliminary testing with sample data.

Qualification and Personal Details:
--------------------------------------------------
DoEACC O & A Level (60%) in 2000 from Electronic Research & Development Centre of India, Kolkata.
B.Sc. in 1997 with Physics, Chemistry, and Mathematics from Calcutta University.
H.S. (62.5%) in 1995 and Madhyamik (71.5%) in 1993 from West Bengal.
Date of Birth: 04-01-1977
Passport No: B 2507877
Marital Status: Married
