Transcript
Page 1: Informatica Resume Dec2011

Mahesh Rao

Date of Birth: 1 Jan 1982

Gender: Male

Nationality: India

Hyderabad

Phone: Not specified

Mobile: 91-9885524901

Email: [email protected]

Alternate email: [email protected]

Current Location: Hyderabad

 

  Informatica

Work Experience : 2 years 9 months

Skills : informatica

Domain Knowledge : IT/ Computers - Software

Industry : IT/ Computers - Software

Category : IT

Roles : Software Engineer/ Programmer

Current Employer : Accenture

Current Annual Salary : 3.00 lacs per annum

Work Authorization : Need H1 Visa - Authorized to work in India

Highest Degree Held : M.Sc, Computers, Nagarjuna University

2nd Highest Degree Held : B.Sc, Computers, Nagarjuna University

Preferred Job Location : Anywhere

 

Skill Name : Oracle

Last Used : Mar 2011

Skill Level : Intermediate

Experience : 2 months

 

 

Maheswara Rao B.V.

Email: [email protected]

Contact: 9885524901

CAREER OBJECTIVE

Looking for a challenging and responsible position in the field of Information Technology, with the flexibility to adapt to any new environment and work on any project, and wishing to utilize this experience as part of an organization.

PROFESSIONAL SUMMARY


A highly dedicated and dynamic professional with around 2.10 years of experience in data warehouse implementation as an ETL Developer using Informatica.
Main areas of expertise are analyzing, developing and testing data warehousing projects.
Experience in developing mappings with the needed transformations using Informatica.
Experience in data warehousing covering data extraction, data transformation and data loading.
Used the ETL process to extract, transform and load data into the staging area and the data warehouse.
Prepared documentation for the mappings according to the business requirements.
Good knowledge of data warehousing concepts such as star schema, dimensions and fact tables.
Optimized Informatica mappings and sessions to improve performance.
Extensively used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Informatica Repository Manager and Informatica Workflow Manager.
Possess complete domain and development life cycle knowledge of data warehousing.
Excellent problem-solving skills with a strong technical background and good interpersonal skills.
Quick learner and excellent team player, with the ability to meet deadlines and work under pressure.

EDUCATION

M.Sc Computer Science from Nagarjuna University, SSN PG College, Ongole, 2006-2008.

WORK EXPERIENCE

Presently working at TATA CONSULTANCY SERVICES, Hyderabad, from June 2008 till date.

SOFTWARE PROFICIENCY

ETL Tool: Informatica Power Center 8.1.1/7.1.2

Database: Oracle 8i/9i/10g, SQL Server 7.0, Microsoft Access

Operating Systems: UNIX, Windows

1.) PROJECT NAME

APO (Advanced Planning Optimization)

Client ELI LILLY, US

Period June 2008 to April 2009

Solution Environment

Informatica 8.5, Oracle 10g, Windows 2000, TOAD

Role Informatica Developer

Domain Pharma

Description: The project comprises various applications across multiple platforms. These applications provide data to various marketing groups of a major pharmaceutical company. The data is received from an external vendor, processed, and then loaded into various databases and supplied to the respective business owners for further analysis and processing.

Responsibilities:
Designed the mapping technical specifications on the basis of functional requirements.
Created ETL mappings using Informatica Power Center to move data from multiple sources such as flat files and Oracle into common target areas such as staging, the data warehouse and the data marts.
Involved in source code review of the mappings created by my team members.
Prepared the deployment document and assisted the deployment team during migration.
Used Workflow Manager for creating, validating, testing and running sequential and concurrent sessions and scheduling them to run at specified times.
Tested the Informatica mappings manually and assisted the QA team; involved in creating test scripts.
Developed different mappings using transformations such as Aggregator, Lookup, Expression, Update Strategy, Joiner, Router and Sequence Generator to load the data into staging tables and then into the target, along with the associated processes, test cases and error control routines.
Involved in performance tuning by optimizing the sources, targets, mappings and sessions.
Monitored transformation processes using Informatica Workflow Monitor.
Used SQL tools like TOAD to run SQL queries and validate the data loaded into the target tables.
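As an illustration of the kind of post-load validation run in TOAD, the sketch below reconciles staging and warehouse row counts and looks for fact rows whose keys never arrived in a dimension. The table and column names (STG_SALES, F_SALES, D_PRODUCT) are hypothetical placeholders, not the project's actual objects.

-- Row-count reconciliation between the staging table and the warehouse fact
SELECT (SELECT COUNT(*) FROM stg_sales) AS staged_rows,
       (SELECT COUNT(*) FROM f_sales)   AS loaded_rows
FROM dual;

-- Fact rows that reference a product key missing from the dimension
SELECT f.product_key, COUNT(*) AS orphan_rows
FROM   f_sales f
WHERE  NOT EXISTS (SELECT 1 FROM d_product d WHERE d.product_key = f.product_key)
GROUP  BY f.product_key;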

2.) PROJECT NAME

Bank of Montreal Management System

Client Bank of Montreal, Canada

Period May 2009 to March 2010

Solution Environment

Informatica 6.2, Oracle 8i, Windows 2000, TOAD

Role ETL Developer

Domain Banking

Description: This project involved the development of a data warehouse for BOM based on four data marts: Accounts, Loans, Credit Cards and Insurance. Each data mart represents a collection of data pertaining to a single business process. In the loans data mart, BOM is involved in the disbursal of loan amounts for various purposes such as personal loans, educational loans, vehicle loans and housing loans. The bank requires different levels of analysis regarding loan amounts, types of loans, types of customers, types of payment schedules, interest rates (variable or fixed), defaulters lists, penal interest calculations and so on. It needed a data warehouse to maintain historical data in a central location for integration, and to analyze business across different locations and profit areas, serving as a DSS for decision makers.

Responsibilities:
Mainly involved in ETL development.
Analyzed the sources and targets, transformed and mapped the data, and loaded the data into the targets using Informatica.
Mainly used transformations such as Expression, Router, Joiner, Lookup, Update Strategy and Sequence Generator.
Created the mappings and transformations using Informatica.
Created reusable mappings using the Mapplet Designer and used those mapplets in mappings.
Used the ETL process to extract, transform and load the data into the staging area and the data warehouse.
Prepared documentation for the mappings according to the business requirements.

3.) PROJECT NAME

Banking Management System

Client Shinhan Bank, Korea

Period April 2010 till date

Solution Environment Informatica 6.1, Oracle 8i

Role ETL Developer

Domain Banking

Description: This project provides different reports and supports business queries for making intelligent banking decisions based on data available in various branches across the country, collected over a period of time. The bank wants a tool that will help it analyze current business trends and make predictions for future business trends. It is especially focused on getting analytic information about targeted customer groups, loss-making schemes, session-wise financial analysis, etc. The bank wants to aggressively launch new products for its loan and deposit customers in a very competitive market, for which it wants an edge in identifying key customers in various age groups and persuading them to buy its products.

Responsibilities:
Created Informatica mappings, mapplets and reusable transformations.
Designed and developed Informatica mappings with error-logging logic.
Created sessions and scheduled them for loading data.
Coded and tested SQL*Loader control files to load data from various data files into Oracle tables.
Created various transformations such as Lookup, Joiner, Aggregator, Filter, Expression, Router and Update Strategy.
Performed debugging and tuning of mappings.
Involved in the preparation of test cases and testing of mappings.
Prepared ETL mapping documents.
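A minimal sketch of the kind of SQL*Loader control file referred to above, assuming a hypothetical comma-delimited loan feed and a LOAN_STG staging table; the real column lists, data types and file names would come from the project's interface specifications.

LOAD DATA
INFILE 'loan_feed.dat'
BADFILE 'loan_feed.bad'
APPEND
INTO TABLE loan_stg
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  loan_id,
  customer_id,
  loan_type,
  disbursal_date  DATE "YYYY-MM-DD",
  loan_amount     DECIMAL EXTERNAL
)

Such a file is typically invoked with, for example, sqlldr userid=... control=loan_stg.ctl log=loan_stg.log, with the bad and log files reviewed as part of testing.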

PERSONAL PROFILE

Name : B.V. Maheswara Rao

Nationality : INDIAN

Status : Unmarried

Permanent Address : 2-266, Nehru nagar, Guntur Road,

Ongole-523002 (AP).

Contact Address : SR nagar, Hyderabad-5000082.

[email protected] +91-9035167263


Vibha Venkannagari

Date of Birth: Not Specified

Gender: Female

Nationality: United States

Phone: 1-408.657.4648

Mobile: Not specified

Email: [email protected]

Current Location: US

 

  Informatica

Work Experience : 7 years 1 month

Skills : Informatica

Domain Knowledge : IT/ Computers - Software

Industry : IT/ Computers - Software

Category : IT

Roles : Software Engineer/ Programmer

Current Employer : JPMorgan Chase

Current Annual Salary : Not Specified

Previous Employer : PNC Bank

Highest Degree Held : B.E/B.Tech, Computers, JNTU, Hyderabad

Preferred Job Location : US

 

Skill Name : Informatica

Last Used : Not specified

Skill Level : Expert

Experience : 48 months

 

Vibha Venkannagari

Professional Summary:
• Over 7 years of IT experience in business requirements analysis, data modeling, application design, development, testing, implementation and support for data warehousing applications.
• Over 5 years of extensive experience in developing ETL solutions using Informatica Power Center.
• Developed and designed ETL methodology for supporting data transformations and processing in a corporate-wide ETL solution using Informatica Power Mart / Power Center 5.x/6.x/7.x/8.x/9.x.
• Created mappings using various transformations like Lookup, Aggregator, Expression, Source Qualifier, Joiner, Update Strategy, Router, Sequence Generator, Normalizer, etc.
• Extensively worked on performance tuning of ETL mappings and sessions.
• Experience in the design and development of Operational Data Stores (ODS), data marts and Decision Support Systems (DSS) using multidimensional models (Kimball and Inmon), addressing Slowly Changing Dimensions (SCDs).
• Involved in upgrading Informatica Power Center from 6.x to 7.x and from 7.x to 8.x.
• Extensively worked on Informatica Designer components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
• Analyzed session, event and error logs for troubleshooting mapping sessions.
• Performed various kinds of testing, including integration, unit and user acceptance testing.
• Extensively used SQL queries while performing the ETL process.
• Set up sessions to schedule the loads at the required frequency using Power Center Workflow Manager and PMCMD.
• Strong exposure to Metadata Manager.
• Strong expertise in relational database systems like Oracle 8i/9i/10g, SQL Server 2000/2005, MS Access and Teradata, including design and database development using SQL, PL/SQL, SQL*Plus, TOAD and SQL*Loader.
• In-depth understanding of star schema, snowflake schema, normalization, 1NF, 2NF, 3NF, fact tables and dimension tables.
• Proficient in writing UNIX shell scripts for scheduling.
• Experience in SQL*Plus and TOAD as interfaces to databases, to analyze, view and alter data.
• Highly proficient in writing, testing and implementing triggers, stored procedures, functions, packages and cursors using PL/SQL.
• Experience in working with scheduling tools like Autosys and Control-M.
• Knowledge of reporting tools like Cognos and Business Objects.
• Outstanding communication and interpersonal skills, ability to learn quickly, good analytical reasoning and a strong facility with new technologies and tools.

Technical Skills:
ETL Tools: Informatica Power Center, Power Exchange, IDQ
Analysis/Reporting Tools: Business Objects, Cognos
Databases: Oracle 10g/9i/8i, MS SQL Server 2000/2005, MS Access
DB Tools: SQL*Plus, SQL*Loader, TOAD
Operating Systems: UNIX (LINUX), Windows XP/2000/98/95
Languages: VB.Net, SQL, PL/SQL, UNIX Shell Scripting
Data Modeling: Dimensional Data Modeling, Star/Snowflake Schema, Fact and Dimension Tables, ERWIN
Scheduling Tools: Autosys, Control-M

Professional Experience:

JPMorgan Chase, Delaware Aug'10 - Present
Informatica Developer

JPMorgan Chase is one of the oldest financial institutions in the United States. They are a leader in investment banking, financial services for consumers, small business and commercial banking, financial transaction processing, asset management and private equity. The Corporate Data Warehouse (CDW) serves as the centralized repository for the bank's assets and liabilities. Transactional data from the loans, deposits, home equity and other legacy applications is fed to the data warehouse on a daily basis. CDW provides the ability to analyze exposures at a cross-product level and generate various reports (on-demand and canned) for analysis, and helps the user monitor and control the exposures.

Responsibilities:
• Designed and developed ETL mappings/workflows using Informatica Power Center 8.x with Oracle and MS SQL Server as source/target, giving account, customer, securities, sales, marketing and operations users access to a single consolidated view of the data.
• Used ETL methodology for all data loads and developed mapping specifications and transformation logic to implement business logic.
• Extensively used Informatica Power Center and created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Update Strategy and Sequence Generator.
• Used the Lookup transformation to access data from tables that are not a source for the mapping, and used Unconnected Lookups to improve performance.
• Worked with the Informatica Debugger to debug the mappings in Designer.
• Used Workflow Manager for creating, validating, testing and running sequential and concurrent batches and sessions, and scheduled them to run at specified times.
• Developed complex mappings such as Slowly Changing Dimensions Type II with time stamping in the Mapping Designer.
• Involved in upgrading Informatica Power Center from 7.x to 8.x.
• Worked with business analysts, source application developers and report developers to identify and resolve data issues between the source systems and the CDW.
• Involved in business requirements review; developed the design specifications for the ETL process.
• Responsible for unit testing and integration testing.

Environment: Informatica 8.6.1/7.1, Oracle 10g/9i, SQL Server 2005, UNIX (LINUX), Windows XP, Shell Programming, TOAD, Business Objects.
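To make the Type II time-stamping logic concrete, here is a rough SQL equivalent of what such a mapping does when a tracked attribute changes: the current dimension row is closed out and a new version is inserted with effective/expiry dates. CUSTOMER_DIM, its columns and the sequence are hypothetical stand-ins; in practice this logic lives inside the Informatica mapping (Lookup, Expression and Update Strategy transformations) rather than in hand-written SQL.

-- Expire the current version of a changed customer
UPDATE customer_dim
SET    expiry_date  = SYSDATE,
       current_flag = 'N'
WHERE  customer_id  = :customer_id
AND    current_flag = 'Y';

-- Insert the new version with an open-ended effective period
INSERT INTO customer_dim
       (customer_key, customer_id, customer_name, address,
        effective_date, expiry_date, current_flag)
VALUES (customer_dim_seq.NEXTVAL, :customer_id, :customer_name, :address,
        SYSDATE, TO_DATE('9999-12-31', 'YYYY-MM-DD'), 'Y');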

PNC Bank, Lawrenceville, NJ Jan'09 - Jul'10
Informatica Developer

PNC is one of the nation's premier financial organizations, offering a wide range of services for all customers. The Middle Office Data Mart (MODM) is a data-warehousing platform compiling trade data originating on the bank's trading desks, validating it against their settlement functions, enriching it with market and company static data, and also taking on trial balance and general ledger data. MODM serves as the primary application for much of the bank's regulatory requirements.

Responsibilities:
• Worked closely with end users and business analysts for requirement gathering and analysis.
• Responsible for data analysis.
• Extensively used ETL to load data from fixed-width and delimited flat files.
• Extensively worked with Repository Manager, Designer, Workflow Manager and Workflow Monitor.
• Designed and developed complex mappings from varied transformation logic such as unconnected and connected Lookups, Router, Filter, Expression, Aggregator, Joiner and Update Strategy.
• Created FTP connections and database connections for the sources and targets.
• Worked with variables and parameters in the mappings to pass values between sessions.
• Implemented Type 1 Slowly Changing Dimensions.
• Created reusable transformations and mapplets and used them in complex mappings.
• Set up sessions to schedule the loads at the required frequency using Power Center Workflow Manager, PMCMD and scheduling tools; generated completion messages and status reports using Workflow Manager.
• Created workflows and worklets for the designed mappings.
• Worked on dimension as well as fact tables; developed mappings and loaded data into the relational database.
• Wrote session commands to configure pre-session and post-session tasks.
• Created Event Wait, Event Raise, Timer and Control events in Workflow Manager according to the business requirements.
• Extensively worked on performance tuning of programs, ETL procedures and processes.
• Used Informatica Data Explorer and Informatica Data Quality for data quality management and data profiling purposes.
• Tested data using IDQ (Informatica Data Quality) to resolve inconsistencies in the data.
• Used reverse engineering in Erwin 4.x to understand the existing data model of the data warehouse.
• Participated in the process of upgrading Informatica Power Center from 6.x to 7.x.
• Performed troubleshooting when workflows failed while loading the data.
• Performed unit testing and was involved in tuning the sessions and workflows for better performance.

Environment: Informatica Power Center/Power Mart 8.1/7.1/6.2, Oracle 9.2, Workflow Manager, PMCMD command, IDQ, PL/SQL, SQL, SQL*PLUS, SQL*LOADER, Business Objects, ERWIN, UNIX Shell Scripting, Windows XP.

Hartford Insurance, CT Feb'08 - Nov'08
Informatica Developer

Hartford Financial Services is one of the largest insurance and investment companies based in the United States. They are a leading provider of life insurance, group and employee benefits, automobile and homeowners insurance, and business insurance. The Wealth Insurance Group primarily serves as the single platform for high-net-worth and ultra-high-net-worth individuals. The Wealth Insurance Data Warehouse (WIDW) serves the business involving Home, Auto, Private Collections, Workers' Compensation, Yacht, etc. The system receives data from several legacy OLTP systems and other external vendors and serves the business as a single point of source for all the reporting requirements of the group.

Responsibilities:
• Worked on Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
• Identified and tracked the slowly changing dimensions and heterogeneous sources, and determined the hierarchies in dimensions.
• Performed extraction, transformation and loading of data from flat files and Oracle sources into the Oracle database; created Informatica mappings and mapplets using different transformations.
• Involved in performance tuning for better performance.
• Responsible for automation of ETL processes using Autosys.
• Responsible for creating workflows and worklets; created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
• Analyzed data quality and reported anomalies to business users using TOAD.
• Wrote pre-session and post-session shell scripts for dropping and creating indexes for tables, Email tasks and various other applications.
• Involved in unit, system integration and user acceptance testing of mappings.
• Involved in production support by monitoring the workflows and resolving problems in production.

Environment: Informatica Power Mart 7.0, Oracle 9i/8i, Shell Scripting, SQL, PL/SQL, TOAD, UNIX (LINUX), Windows XP, Autosys.
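The pre-session/post-session index handling mentioned above usually comes down to a pair of SQL statements fired around the load; a hedged sketch under assumed names (WIDW_POLICY_FACT and its index) follows, with the real scripts wrapping such statements in shell and error checks.

-- Pre-session: drop the index so the bulk load is not slowed down by index maintenance
DROP INDEX widw_policy_fact_ix1;

-- Post-session: recreate the index once the load has finished
CREATE INDEX widw_policy_fact_ix1
    ON widw_policy_fact (policy_key, effective_date)
    NOLOGGING;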

Fossil, Richardson, TX Sep'06 - Dec'07
Warehouse Developer

Fossil is a leader in the design, development and distribution of contemporary, high-quality watches, apparel and accessories. In order to quickly identify customer needs and improve services to its huge subscriber base, a business intelligence system was implemented using Informatica Power Mart software. With Power Mart, a large amount of customer-related data was collected from diverse sources, including customer billing, ordering, support and service usage. Business users then analyzed this data to understand which customers stayed the longest with the service and why, and to gauge the effectiveness of promotional activities. The results have improved customer-retention efforts and produced more focused, effective customer promotions.

Responsibilities:
• Involved in developing the data warehouse using star schema.
• Involved in conducting design sessions and reviews; communicated with the users to observe various business rules in implementing the data warehouse.
• Analyzed the requirements and functional specifications and identified the source data to be moved to the data warehouse.
• Developed mappings using the Informatica Power Center Designer.
• Configured Informatica with different source systems.
• Developed PL/SQL stored procedures for loading data into the data warehouse.
• Created, scheduled and monitored the sessions and batches using Workflow Manager.
• Configured Workflow Manager to commit the expected rows.
• Worked with pre-session and post-session settings and flat files.
• Developed the workflow diagrams and worked with technical specifications.
• Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure successful execution of the data loading processes.
• Involved in developing the mapping that contains all the Lookups used by all the facts, to improve performance.
• Involved in performance tuning of mappings and sessions.

Environment: Oracle 8i, PL/SQL, Informatica Power Mart 6.2, Windows 2000, Erwin, UNIX (LINUX).

ICICI Bank, India Jun’05 – Aug’06

ETL Developer

ICICI Bank is India's second-largest bank. It offers a wide range of banking products and financial services to corporate and retail customers through its subsidiaries in the areas of investment banking, life and non-life insurance, venture capital and asset management. The system helps customer service representatives deal and transact with customers' loans, credit, debit, portfolios, investments, etc. The operational data of different financial departments is loaded into the central data warehouse and transformed into different regional data marts. Various corporate metrics/web reports, such as Credit Profile, Product Profile and Funds Transfer, are generated periodically. Informatica PowerCenter is used to extract the base tables in the data warehouse, and the source databases include Oracle.

Responsibilities:
• Involved in requirement definition and analysis in support of data warehousing efforts.
• Created the repository using Repository Manager.
• Worked extensively on Informatica tools - Repository Manager, Designer and Server Manager.
• Involved in the Extraction, Transformation and Loading (ETL) process.
• Created the source and target definitions using Informatica Power Center Designer.
• Developed and tested all the backend programs, Informatica mappings and update processes.
• Created and monitored batches and sessions using the Informatica Power Center Server.
• Tuned the mappings to increase their efficiency and performance.
• Used Informatica Workflow Manager to create workflows.
• Used Workflow Monitor to monitor and run workflows.

Environment: Informatica Power Mart 5.1, Oracle 8i, UNIX, TOAD, SQL*Loader, Windows XP.

Insight Innovative Solutions, India Feb'04 - Apr'05
MS SQL Server Developer

Insight Innovative Solutions is a leading provider of information technology, consulting, web designing, web hosting, domain name registration, marketing and business process outsourcing services. Insight Innovative Solutions' single-minded mission is to consistently outperform clients' best vendors by a nominal margin, thereby creating the most efficient customer-centric enterprise for clients and customers.

Responsibilities:
• Designed logical/physical data models and defined primary and foreign keys using the Erwin tool.
• Wrote SQL statements using DML and DDL for retrieval of the data.
• Developed stored procedures, functions, views and triggers.
• Actively participated in gathering user requirements and system specifications.
• Created a new database logical and physical design to fit new business requirements, and implemented the new design in SQL Server 2000.
• Created users and roles, and managed security and permissions of the database.

Environment: SQL Server 2000, ASP .Net, Excel, Windows 2000

RamaKrishna Ganni

Date of Birth: Not Specified

Gender: Male

Nationality: United States

Phone: 1-972.782.9943

Mobile: Not specified

Email: [email protected]

Current Location: US

 


  Informatica

Work Experience : 7 years

Skills : Informatica

Domain Knowledge : IT/ Computers - Software

Industry : IT/ Computers - Software

Category : IT

Roles : Software Engineer/ Programmer

Current Employer : Wells Fargo

Current Annual Salary : Not Specified

Previous Employer : Global Wireless Solutions

Highest Degree Held : B.E/B.Tech, JNTU, Hyderabad

Preferred Job Location : US

 

Skill Name : Informatica

Last Used : Not specified

Skill Level : Expert

Experience : 48 months

 

RAMAKRISHNA GANNI

CAREER OBJECTIVE:

Objective: Around 7 years of experience in system analysis, design, development, implementation and testing of business applications with RDBMS, Data Warehouse/Data Mart, ETL, OLAP and client/server environment applications for the Financial, Pharmacy and Insurance verticals. Five-plus years of solid ETL data integration and data warehouse experience using Informatica Power Center 8.6/8.1/8.0/7.1 and Power Mart, with client tools (Mapping Designer, Workflow Manager/Monitor) and server tools (Informatica Server Manager, Repository Server Manager).

SUMMARY:
7+ years of IT experience in databases, data warehousing and decision support systems.
Over 6 years of experience in the implementation of the Extraction, Transformation and Loading (ETL) life cycle using Informatica Power Center / Power Exchange 8.0/7.0/6.x.
Experience with SQL/PL-SQL in Oracle, SQL Server, DB2 and MS Access as RDBMSs, with procedures, functions, triggers and SQL*Plus.
Experience with Informatica Designer, Informatica Workflow Manager, Workflow Monitor and Repository Manager.
Identification of user requirements, system design, writing program specifications, coding and implementation of systems.
Data analysis and profiling for source and target systems; data warehousing concepts including staging tables, dimensions, facts and star schema.
Expertise in developing and running mappings, sessions/tasks, workflows, worklets and batch processes on the Informatica server.
Experience in database design, entity-relationship modeling and dimensional modeling using star and snowflake schemas.
Experience with the Erwin 7.x/4.x modeling tools.
Extensively worked with mappings using different transformations like Filter, Joiner, Router, Source Qualifier, Expression, Union, Update Strategy, Unconnected/Connected Lookup, Aggregator and SCD Types 1, 2 and 3.
Experience in tuning mappings and sessions for better performance.
Experience in unit testing of each mapping developed.
Extensively worked on creating and executing test cases and test scripts using manual and automated methods.
Worked in a production support team maintaining the mappings, sessions and workflows that load data into the data warehouse.
Experience in writing triggers, stored procedures, functions and packages using PL/SQL.
Experience in UNIX shell scripting.
Sound theoretical and practical background in the principles of operating systems and multi-threaded applications.
Highly motivated, with the ability to work effectively in teams as well as independently.

EDUCATION :

Masters in Computer Science

TECHNICAL SKILLS:

OPERATING SYSTEMS: Windows XP Professional, Server 2003, HP UNIX, Linux
ETL Tool: Informatica Power Center 8.x/7.x
WEB TECHNOLOGIES: HTML, DHTML, XML
LANGUAGES: C, C++, PL/SQL, JAVA, VB, .NET, J2EE, SQL, JCL and JMS
SCRIPTING LANGUAGES: TSL, JavaScript, VBScript, HTML, CSS, Perl, UNIX Shell
DATABASES: Oracle 9i/10g/11g, SQL Server 2005, MS Access and DB2
WEB SERVERS: WebLogic, Apache 2.0, LDAP, LAMP, DNS, NIS
Other TOOLS: Toad, Putty, Telnet, WinSCP, Erwin 7.x/4.x, MS PowerPoint, Visio, Remedy, SharePoint, Mercury Quality Center, Version Control tool

PROFESSIONAL EXPERIENCE:

ORGANIZATION | DESIGNATION | DURATION
Wells Fargo | Informatica Consultant | 09/2009 - Present
Global Wireless Solutions, VA | Sr. ETL Developer | 06/2008 - 08/2009
St. John's Mercy Medical Center, MO | ETL Developer | 11/2007 - 05/2008
Volkswagen, MI | ETL Developer | 08/2006 - 10/2007
Bayer, Berkeley, CA | ETL Consultant | 10/2005 - 07/2006
Hucon Solutions, Bangalore, INDIA | Data Warehouse Developer | 05/2004 - 09/2005

PROJECT/WORK EXPERIENCE:

Wells Fargo, CA 09/2009 - Present

Informatica Consultant

An overall project is currently underway to migrate Wachovia Global Treasury Solution (GTS) products from the WGC portal to the existing Wells Fargo CEO Portal, to eliminate redundancy in like systems, which include corporate-level customer-facing portals.

Responsibilities:
Analyzed the source systems data and the current reports at the client side to gather requirements for the design inception.
Extracted and transformed data from high-volume data sets of fixed-width, delimited and relational sources to load into target systems.
Developed and maintained critical and complex mappings and transformations involving Aggregator, Expression, Joiner, Filter, Sorter, Sequence Generator, Stored Procedure, connected and unconnected Lookup, Update Strategy and SQL transformations using Informatica Power Center 8.6.
Debugged mappings and sessions by creating break points using the debugger wizard.
Redesigned some of the existing mappings in the system to meet new business logic.
Created mapplets, worklets and other transformations that enable the reusability of code.
Used parameters and variables extensively in all mappings, sessions and workflows for easier code modification and maintenance.
Created pipeline session partitions for concurrent loading of data and to optimize the performance of loading target tables.
Effectively used an error-handling logic mechanism for data and process errors.
Performed performance tuning to increase throughput at both the mapping and session level for large data files by increasing block size, data cache size, sequence buffer length and the target-based commit interval.
Extensively used pmcmd commands at the command prompt and executed UNIX shell scripts to automate workflows and to populate parameter files.
Coordinated and led the complete code migration/merge process from Development and UAT to Production environments.
Involved in extensive backend testing of various modules and documented the results using Quality Center.

Environment: Informatica Power Center 8.6, SQL Server, Oracle 10g, UNIX, Quality Center, CVS

Global Wireless Solutions, VA 06/2008 - 08/2009
Sr. ETL Developer

Global Wireless Solutions is a leading independent benchmarking solutions vendor for the wireless industry. Here, benchmarking refers to the process of comparing one operator's network-delivered quality against competitors and comparing network performance between markets. It also provides unique benchmarking solutions for the largest wireless carriers worldwide. My main job was to use data marts as the source data, where each data mart belongs to a client, and work on the ETL process using Informatica.

Responsibilities:
Collected requirements from business users and analyzed them.
Worked on the development, test and production support of Informatica.
Handled folder management, i.e. repository administration, using Repository Manager.
Created complex mappings using Aggregator, Expression, Joiner, Filter, Sequence, Stored Procedure, connected and unconnected Lookup, and Update Strategy transformations in the Informatica PowerCenter Designer.
Developed and maintained ETL (data extract, transformation and loading) mappings to extract data from multiple source systems such as Oracle, SQL Server and flat files into Oracle.
Designed and implemented the Informatica mappings to extract data from the source systems to staging and from staging to the target.
Developed mappings, mapplets, worklets and other reusable objects.
Used SCD Type 1, Type 2 and Type 3 to load current and historical data into the EDW.
Extensively used the various features of Informatica and the database.
Tuned the performance of the Informatica mappings using components such as parameter files, variables and dynamic cache.
Performed performance tuning using round-robin, hash auto-key and key-range partitioning.
Analyzed existing SQL queries, tables and indexes for performance tuning and advised based on loading time.
Developed UNIX shell scripts and PL/SQL procedures, packages and triggers in Oracle 10g as required.
Designed and developed the Framework Manager model and projects using Cognos.
Developed advanced reports in IBM Cognos 8 Report Studio upon requests from business users.
Developed prompts and customized prompts as per business user requirements using Cognos Report Studio.

Environment: Informatica Power Center 8.6, IBM Cognos 8, Oracle 11g/10g, MS SQL Server, Toad, Erwin, UNIX, Windows.
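As a flavor of the Oracle 10g PL/SQL work described above, the following is a small, hypothetical procedure of the kind a post-load step might call to purge aged rows from a staging table. The table names, columns and retention period are illustrative only, not the project's actual objects.

CREATE OR REPLACE PROCEDURE purge_stg_call_detail (
    p_keep_days IN NUMBER DEFAULT 30
) AS
    v_rows PLS_INTEGER;
BEGIN
    -- Remove staging rows older than the retention window
    DELETE FROM stg_call_detail
    WHERE  load_date < TRUNC(SYSDATE) - p_keep_days;

    v_rows := SQL%ROWCOUNT;

    -- Record what was purged for audit purposes
    INSERT INTO etl_audit_log (run_date, step_name, row_count)
    VALUES (SYSDATE, 'PURGE_STG_CALL_DETAIL', v_rows);

    COMMIT;
END purge_stg_call_detail;
/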

St. John's Mercy Medical Center, MO 11/2007 - 05/2008
ETL Developer

St. John's Mercy is one of the nation's largest Catholic hospitals and the second-largest hospital in metropolitan St. Louis. The Clinical Outcomes Data Mart is a star schema data repository, maintained in an integrated Oracle environment optimized for reporting. The environment extracts data from heterogeneous source systems and integrates Clinical Management, Membership, Benefits, Eligibility and Claims information. The services empower individuals, expand consumer choice and strengthen patient-provider relationships across the health care spectrum.

Responsibilities:
Gathered the system requirements and created a mapping document giving detailed information about source-to-target mapping and business rules implementation.
Designed, developed and debugged ETL mappings using the Informatica Designer tool.
Created complex mappings using Aggregator, Expression, Joiner, Filter, Sequence, Stored Procedure, connected and unconnected Lookup, and Update Strategy transformations in the Informatica Power Center Designer.
Extensively used ETL to load data from different sources such as flat files and XML into Oracle.
Worked on mapping parameters and variables for the calculations done in the Aggregator transformation.
Implemented slowly changing dimensions for accessing the full history of accounts and transaction information.
Tuned and monitored Informatica workflows using the Informatica Workflow Manager and Workflow Monitor tools.
Created and configured workflows, worklets and sessions using Informatica Workflow Manager.
Scheduled the jobs using the third-party scheduler tool Autosys.
Implemented the National City standard data population method: loaded the data into a new set of tables and, after successful completion of the data loading, rolled over to the original tables. This approach keeps the data highly available to users during loading, instead of making them wait to run their reports.
Designed and developed universes for report generation from the warehouse database using Business Objects.
Resolved loops by creating aliases and contexts.
Created different types of reports, such as list and cross-tab reports.

Environment: Informatica Power Center, SAP Business Objects XI R3, Oracle 11g/10g, MS SQL Server, Toad, Erwin, UNIX and Windows.
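The "load into a new set of tables, then roll over" approach described above is commonly implemented in Oracle by loading a shadow table and then repointing a synonym (or renaming tables) in one quick step, so report users never see a half-loaded table. A hedged sketch with hypothetical CLAIMS_FACT names, not the project's actual rollover scripts:

-- Load the shadow copy while users keep reading the current table
INSERT /*+ APPEND */ INTO claims_fact_new
SELECT * FROM stg_claims;
COMMIT;

-- Roll over: repoint the synonym used by the reports in a single, fast step
CREATE OR REPLACE SYNONYM claims_fact FOR claims_fact_new;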

Volkswagen, MI 08/2006 - 10/2007
Role: ETL Developer

Volkswagen manufactures vehicles in Germany and Mexico and distributes them to dealers all over the US and Canada. All the information pertaining to dealers, such as dealer information, warranty information and labor information, is stored in the data warehouse. Worked on the dealer coding project and the Vehicle In Service project. The dealer project's goal was to store all the information on the US and Canada dealers in a centralized database where users can use it for different business applications. The Vehicle In Service project's aim is to store all the warranty start dates of the 2007-model VW minivan vehicles in the operational databases.

Responsibilities:
Coordinated, designed and implemented the loading and updating of the warehouse.
Worked on loading data from source systems such as DB2 and XML into Oracle tables using various transformations - Expression, Lookup, Source Qualifier, Update Strategy, etc.
Applied business and application knowledge to design the data loads for consistency and integrity.
Worked with production load standards and error handling.
Assisted in performance tuning by running test sessions.
Created and facilitated presentations and demonstrations for Informatica.
Scheduled Informatica workflows using shell scripts.
Created tasks, workflows and worklets using Workflow Manager.
Created the transformation routines to transform and load the data.
Tuned mappings and workflows for better performance.
Defined a target load order plan for loading data into target tables.
Extracted data from SAP sources using Informatica PowerConnect.
Used the Informatica PowerCenter Server Manager to create sessions and batches to run with the logic embedded in the mappings.
Used the Informatica PowerExchange change data capture option to capture updates to the existing data.
Maintained defect tracking in QC and analyzed the variation between expected and actual results.
Fine-tuned transformations and mappings for better performance.
Involved in the process design documentation of the data warehouse dimensional upgrades.
Worked with analysts and data source system experts to map requirements to ETL code.

Environment: Informatica 7.1.3, Oracle 10g, Erwin 3.5, TOAD, Business Objects, DB2, PL/SQL, Perl/UNIX/Shell scripting, XML, Tivoli, Power Exchange, Rational ClearCase.

Bayer, Berkeley, CA 10/2005 - 07/2006
Role: ETL Consultant

Bayer is one of the leading companies in bio-manufacturing, with its primary capacity devoted to producing a leading recombinant FVIII therapy. The Berkeley site began to focus on the emerging field of biotechnology and has robust capabilities for developing and manufacturing Bayer HealthCare pharmaceuticals.

Responsibilities:
Used Repository Manager to create groups and users, and managed users by setting up their privileges and profiles.
Performed analysis using the existing transactional database schema and designed a star schema model to support user needs for reporting.
Created the mappings in Informatica for extraction, transformation and loading from sources to targets.
Followed the SDLC from analysis to production implementation, with emphasis on identifying the sources and validating source data, implementing business logic, using transformations as required in developing mappings, and loading the data into the target.
Developed the ETL procedure to ensure conformity, compliance with standards and lack of redundancy, translating business rules and functionality requirements into ETL procedures in PL/SQL.
Implemented complex mappings such as Slowly Changing Dimensions Type 2 using a flag.
Developed a number of complex Informatica mappings, mapplets and reusable transformations for other mappings.
Extensively used transformations such as Router, Aggregator, Source Qualifier, Joiner, Expression and Sequence Generator.
Designed and developed Oracle PL/SQL procedures.
Developed stored procedures using PL/SQL to backfill data in the production environment.
Used the PMCMD command to trigger workflows from the command line in the UNIX environment.

Environment: Informatica Power Center 7.1.1 (Informatica Server, Informatica Repository Server, Repository Manager, Designer, Workflow Manager, Workflow Monitor), Erwin 3.5, Oracle 10g/9i, SQL, PL/SQL, SQL*Loader, TOAD, MS SQL Server 2000, Flat files, UNIX

Hucon Solutions, Bangalore, INDIA 05/2004 - 09/2005
Role: Data Warehouse Developer

The Amgen-CCS data warehouse collects, denormalizes and integrates data from its operational database, as well as third-party hospital, pharmacy and physician patient reports, to provide critical analytical data for decision makers. Informatica PowerCenter 6.2.2 is used to extract, transform and load the data from a staging area, built through Oracle 8i snapshot replication of the operational database, into the data warehouse implemented in Oracle 8i.

Responsibilities:
Designed and implemented the loading and updating of the warehouse.
Implemented data integration from source systems into Oracle data marts using various transformations - Aggregator, Expression, Lookup, Source Qualifier, Update Strategy, etc.
Applied business and application knowledge to design the data loads for consistency and integrity.
Worked with production load standards and error handling.
Worked with IMS data to validate Sample History module data.
Assisted in performance tuning by running test sessions.
Created and facilitated presentations and demonstrations for Informatica.
Scheduled Informatica workflows using shell scripts.
Created tasks, workflows and worklets using Workflow Manager.
Created the transformation routines to transform and load the data.
Tuned mappings and workflows for better performance.
Used the Informatica PowerCenter Server Manager to create sessions and batches to run with the logic embedded in the mappings.
Fine-tuned transformations and mappings for better performance.
Involved in the process design documentation of the data warehouse dimensional upgrades.
Worked with analysts and data source system experts to map requirements to ETL code.

Environment: Informatica PowerCenter 7.x, Oracle 8i, Erwin, TOAD, JSP, WebLogic, XML, MS SQL Server 2000, MS Visio, HP UNIX.

Roshitha Bellam

Date of Birth: Not Specified

Gender: Female

Nationality: United States

Phone: 1-408.884.1548

Mobile: Not specified

Email: [email protected]

Current Location: US

 


  Informatica

Work Experience : 7 years

Skills : Informatica

Domain Knowledge : IT/ Computers - Software

Industry : IT/ Computers - Software

Category : IT

Roles : Software Engineer/ Programmer

Current Employer : Bank Of America

Current Annual Salary : Not Specified

Previous Employer : Bayer Healthcare Pharmaceuticals

Highest Degree Held: B.E/B.Tech, Computers, Amity Business School

Preferred Job Location : US

 

Skill Name : Informatica

Last Used : Not specified

Skill Level : Expert

Experience : 48 months

 

Roshitha Bellam
Sr. ETL Informatica Developer

Summary:
Over 7 years of IT experience in software design, development, analysis, testing and implementation of business applications for the Insurance, Banking, Pharmaceutical and Realtor verticals.
Data processing experience in designing and implementing data mart applications using the ETL tool Informatica Power Center 8.6.1/7.1.4/6.2/5.1 and Power Exchange 8.x.
Adept at understanding business processes/requirements and implementing them through mappings and transformations.
Involved in database design, entity-relationship modeling and dimensional modeling using star and snowflake schemas.
Extensively worked with mappings using different transformations like Filter, Joiner, Router, Source Qualifier, Expression, Union, Update Strategy, Unconnected/Connected Lookup, Aggregator and SCD Types 1, 2 and 3.
Experience in tuning mappings and sessions for better performance.
Experience in loading various data sources like Oracle, SQL Server, ERP, DB2 and flat files into data marts.
Developed complex logical and physical data models using Erwin 4.x/3.5 and Oracle Designer tools, and extensively worked on integration.
Excellence in satisfying organizational goals and objectives by using UDB DB2.
Developed various projects using the SDLC methodology.
Skilled professional experience in developing test plans, test cases and test procedures, and in executing applications using both manual and automated testing tools.
Experience in requirements gathering, creating documentation and unit testing of each developed mapping.
Worked in a production support team maintaining the mappings, sessions and workflows that load data into the data warehouse.
Experience in Oracle 10g/9i/8i, writing triggers, stored procedures, functions and packages using PL/SQL.
Experience in data modeling using Erwin 4.0/3.5 and UNIX shell scripting.

Technical Skills:

ETL: Informatica 7.x/8.x, SQL Server 2005 (SSIS)
BI/OLAP: MicroStrategy 8.0.1, Crystal Reports 8.0/7.0, Business Objects
Data Modeling: ERwin 3.x/4.x
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2005/2000, MS Access, DB2, Netezza, Informix, Teradata V2R5
Operating Systems: Windows 95/98/2000/2003 Server/NT Server Workstation 4.0, UNIX
Programming: Visual Basic 6.0/5.0, PowerBuilder 6.0, C, PL/SQL, JavaScript, PERL, VBScript, HTML, XML, DHTML
Other Tools: SQL*Loader, SQL*Plus, TOAD, MS Visio

Professional Experience:

Bank of America, Merrill Lynch, NJ Jan'10 - Present

Sr. Informatica Developer

Description: Working as an Informatica Developer in one of the leading banks in the United States. My primary responsibilities are development and validation of new mappings, maintaining existing mappings, fixing issues and documenting the findings. Data sources used are Mainframe DB2, Oracle, SQL Server and flat files. The target database is Oracle/SQL Server. Control-M is used as the scheduling tool.

Responsibilities:
• Used third-party tools like Control-M for scheduling along with Informatica PowerCenter 8.6.0 and Oracle 9i.
• Developed ETL jobs to extract information from the Enterprise Data Warehouse.
• Responsible for developing the mappings from the given technical documents using various transformations like Aggregator, Sorter, Update Strategy, Expression, etc.
• Experience in modifying existing mappings according to user requirements.
• Worked with sources like flat files, Oracle and DB2 tables; the target database is Oracle.
• Wrote various types of SQL queries involving joins and aggregate calculations, and tuned the queries to optimize performance.
• Performed extensive testing and wrote queries in SQL to ensure correct loading of the data.
• Performed unit testing at various levels of the ETL and documented the results.
• Set standards for naming conventions and best practices for Informatica mapping development and migration of mappings to QA and Production.
• Responsible for creating workflows and worklets.
• Created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
• Used Informatica debugging techniques to debug the mappings, and used session log files and bad files to trace errors that occurred while loading.
• Extensively used Expression, Joiner (for heterogeneous sources), Lookup, Filter, Aggregator and Update Strategy transformations to transform data before loading into the target tables.
• Developed workflow sequences to control the execution sequence of various jobs and to email support personnel.
• Implemented transformation logic using transformations like Joiner, Aggregator, Sorter, Update Strategy, etc.
• Responsible for implementation of one-time data movement and data replication methods.
• Prepared the migration checklist, fixed the errors, and documented the changes applied and the test results.
• Worked extensively on parameterizing the workflows and configuring reusable success and failure emails and command tasks.
• Set up batches and sessions to schedule the loads at the required frequency using Power Center Workflow Manager and scheduling tools; generated completion messages and status reports using Workflow Manager.

Environment: Informatica Power Center 8.6 (Workflow Manager, Workflow Monitor, Worklets, Source Analyzer, Mapping Designer, Mapplet Designer, Transformations), Oracle 10g, DB2 OS/390 8.1.5, SQL*Plus, PL/SQL, Microsoft SQL Server 2005, Windows XP.
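One example of the join-and-aggregate checks described above is a duplicate test on the natural key of a loaded table; ACCT_DIM and its key column are invented for the illustration, with the real checks written against the project's actual tables.

-- Natural keys that loaded more than once indicate a duplicate-processing problem
SELECT account_number, COUNT(*) AS version_count
FROM   acct_dim
GROUP  BY account_number
HAVING COUNT(*) > 1
ORDER  BY version_count DESC;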

Bayer Healthcare Pharmaceuticals, NJ Nov'09 - Dec'10
Sr. Informatica Developer

Description: Bayer is a leading pharmaceutical company in the United States. They have several regional offices interconnected with one another. The objective of this project was to develop a complete data warehouse so that they can analyze and perform slicing, dicing and drill-down on patient data. The database was maintained in Oracle 8.x on Solaris. The data warehouse was designed using Erwin with different fact and dimension tables. Multiple instances were created for redundancy. Data is transferred to the data warehouse.

Responsibilities:
• Designed data models to support users' business requirements.
• Designed and developed complex aggregate, join and lookup transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions, using Informatica PowerCenter 6.0.
• Used Slowly Changing Dimensions (Type 2) to update the data in the target dimension tables.
• Created sessions, database connections and batches using Informatica Server Manager / Workflow Manager.
• Optimized mappings, sessions/tasks, and source and target databases as part of performance tuning.
• Configured the server and email variables using Informatica Server Manager / Workflow Manager.
• Used the pmcmd command-line program to communicate with the server to start and stop sessions and batches, to stop the Informatica server and to recover sessions.
• Used all types of caches, such as dynamic, static and persistent caches, while creating sessions/tasks.
• Used Metadata Reporter to run reports against the repository.
• Created and populated intermediate aggregate tables to make reporting efficient and fast.
• Designed the physical structures necessary to support the logical database design.
• Designed processes to extract, transform and load data to the data mart.
• Involved in Informatica mapping development using the PowerCenter Designer and Server Manager / Workflow Manager to create the sessions, and did a lot of testing and data cleansing.
• Performed impact analysis and remedial actions and documented process and application failures related to Informatica ETL and PowerAnalyzer.
• Performed developments and changes using the ETL tool (Informatica) and the Oracle database/data warehouse.
• As part of the optimization and revival process, performed design changes in Informatica mappings, transformations and sessions; involved in development of complex mappings.
• Interacted with external and internal users and the DBA in troubleshooting problems reported in huge-volume testing (100-200 million records) systems.

Environment: Informatica PowerCenter 6.x/7.x, Oracle 8i, SQL Server 2000, MS Access, SQL, PL/SQL, TOAD 7.x, Windows NT.

Move.com, CA Feb'08 - Oct'09
Informatica Developer

Description: MOVE is a realtor company dealing with a number of agency systems, maintaining and storing their data in a centralized database where the end users can use it for different business applications. This project maintains and enhances the agencies' data warehouse, thus enabling them to perform trend analysis. The data was extracted from various databases such as Oracle, SQL Server and flat files and loaded into an Oracle database using the ETL tool Informatica. The reporting was done using OBIEE.

Responsibilities:
• Designed the ETL processes using Informatica to load data from Oracle and flat files into the target SQL Server and Oracle databases.
• Followed and implemented standards for BI & DW at various levels of the SDLC.
• Developed and improved existing and new Oracle procedures.
• Developed mappings in Informatica to load the data from various sources using transformations like Source Qualifier, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Filter, Router, etc.
• Created new mappings and customized existing OOTB mappings as per end-user requirements.
• Responsibilities included resolving tickets raised by the business users on ad-hoc reporting requests.
• Used Informatica Workflow Manager to create, schedule and monitor workflows and to send messages in case of process failures.
• Designed SQL queries with multiple joins to pull relative data during the import stage.
• Designed and modified PL/SQL stored procedures to modify data flow.
• Used triggers in order to enforce business rules.
• Worked on tuning and performance improvement of jobs in Informatica; translated business requirements to Informatica mappings; involved in unit testing of mappings.
• Worked closely with the end users and business analysts to understand the requirements and developed transformation logic to be used in Informatica.
• Created the transformation routines to transform and load the data; developed processes for automated loading of data using parameter-driven sessions for batch schedule processes, and for verification and reconciliation of data stored in several different source systems.
• Developed complex logical and physical data models using the Erwin 4.x tool and extensively worked on integration.
• Worked with analysts and data source system experts to map requirements to ETL code.
• Designed and documented validation rules, error-handling routines and a testing strategy for the mappings.

Environment: Informatica 8.6/8.6.1, Power Exchange 8.5, TOAD, SQL*Plus, Oracle, Erwin 4.0, XML, Autosys, PL/SQL, Oracle Business Intelligence 10.1.3x, UNIX/Perl/Shell Scripting.
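As an illustration of "using triggers to enforce business rules", the sketch below rejects listings with a non-positive price before they reach a staging table. LISTING_STG and its columns are invented for the example, not the project's actual schema.

CREATE OR REPLACE TRIGGER listing_stg_biu
BEFORE INSERT OR UPDATE ON listing_stg
FOR EACH ROW
BEGIN
    -- Business rule: a listing must always carry a positive asking price
    IF :NEW.asking_price IS NULL OR :NEW.asking_price <= 0 THEN
        RAISE_APPLICATION_ERROR(-20001,
            'Listing ' || :NEW.listing_id || ' rejected: asking price must be positive');
    END IF;
END;
/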

Chase, NY May'07 - Feb'08

Informatica Developer

Description: This project was done for Chase, in the credit card division, building a customer-centric data warehouse to analyze transactions across finance, marketing, risk collections and consumer relations. The data warehouse is growing in analytical richness as customer data is successfully implemented.

Responsibilities:
• Analyzed various schemas for implementation and modeled the data warehousing data marts using star schema.
• Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Rank, Sequence Generator, Update Strategy, etc.
• Extracted data from flat files and the Oracle database and applied business logic to load them into the central Oracle database.
• Developed complex mappings and mapplets in Informatica to load the data using different transformations.
• Created and monitored sessions and batches using Server Manager.
• Extensively used various performance-tuning techniques, such as partitioning, to improve session performance.
• Successfully moved the sessions and batches from the development to the production environment.
• Extensively used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Server Manager.
• Generated completion messages and status reports using Workflow Manager.
• Created workflows and worklets for the designed mappings.
• Developed mappings and loaded data into the relational database; worked on dimension as well as fact tables.
• Extensively used PL/SQL procedures and functions to build business rules.
• Identified and fixed bottlenecks and tuned the complex Informatica mappings for better performance.
• Scheduled the batches using UNIX.
• Worked to parameterize all variables and connections at all levels in UNIX.

Environment: Informatica Power Center 7.1.4, Cognos, Oracle, Teradata, Toad 8.6, UNIX, HP-UX 11i, Sun Solaris 5.4, SQL, FACETS 4.X, MS Office Visio 2003.

Boston Mutual Life Insurance, MA Sep'06 - Apr'07
Sr. Informatica Developer

Description: Boston Mutual Life Insurance Company sells traditional group and individual life insurance, as well as disability and supplemental accident and illness coverage. Migrating and managing the data from various regions required that a data warehouse be designed.

Responsibilities:
• Used Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer and Informatica Repository Manager.
• Developed and tested mappings and workflows.
• Experienced in performing the analysis, design and programming of ETL processes for Teradata.
• Extensively worked on performance tuning of the ETL mappings and workflows.
• Involved in writing shell scripts and adding them to Autosys, scheduled daily, weekly and monthly.
• Extensively created mappings/mapplets and reusable transformations using transformations like Lookup, Filter, Expression, Stored Procedure, Aggregator, Update Strategy, etc.
• Strong expertise in using both connected and unconnected Lookup transformations.
• Extensively worked with various lookup caches, such as static cache, dynamic cache and persistent cache.
• Developed reusable transformations, reusable mappings (mapplets) and reusable workflows (worklets).
• Wrote shell scripts to schedule batches as cron jobs.
• Used the Informatica scheduler to run workflows on the UNIX box and monitored the results.

Environment: Informatica Power Center 6.x/7.1.1, Oracle 10g, OBIEE 10.1.3.3.x, TOAD, UNIX, Windows XP, Teradata.

Airtel, India Aug 05-July 06

Data Analyst
Description: Airtel is a powerful integrator of telecom products, services, and e-commerce

Page 29: Informatica Resume Dec2011

technologies, specializing in the distribution of pre-paid telecom products and services, including phone cards and pre-paid wireless products. The project involved normalizing and designing tables, constraints, views, indexes, etc. in coordination with the application development team, fine-tuning application logic for better performance, and developing complex queries, functions, stored procedures, and triggers using PL/SQL.
Responsibilities:

• Documented business workflows textually as well as in UML diagrams (use cases, activity diagrams, state diagrams); identified and documented attributes for entities and defined domains for all attributes.
• Analyzed the DW project database requirements from the users in terms of the dimensions they want to measure and the facts for which the dimensions need to be analyzed.
• Created use cases, business processes, and workflow diagrams using MS Visio and MS Excel.
• Collected information about the different entities and attributes.
• Used Active Directory to store information and settings in a central database.
• Involved in the design and normalization of complex OLTP database models.
• Followed 2NF/3NF normalization standards when creating the OLTP databases.
• Carried out denormalization for the DW and generated the star schema.
• Used Erwin 4.0 to design physical (PDM) and logical (LDM) models for the relational database.
• Mapped logical data models to physical data models.
• Worked extensively on creating schemas to create tables in the database.
• Worked with business managers to confirm logical models based on data requirements.
• Performed logical-to-physical mapping.
• Performed data profiling in spreadsheets.
• Created and maintained the data dictionary.
• Used Active Directory to store information about the network resources across a domain and to centralize the network.
• Fine-tuned application logic for better performance; developed complex queries, functions, stored procedures, and triggers using PL/SQL.
• Designed and created stored subprograms with dynamic SQL at both the client and server side.
• Designed and developed back-end PL/SQL packages, stored procedures, functions, and triggers in the database layer.

Page 30: Informatica Resume Dec2011

• Implemented session management to provide security for web access.
• Tuned stored procedures and developed business logic in the database layer using PL/SQL.
• Worked on PL/SQL packages, procedures, objects, indexing, functions, and REF cursors.
• Worked with developers to improve the performance of SQL statements.
• Actively implemented system controls; provided production support with the DBA, using Active Directory to assign policies, deploy software, and apply critical updates across the organization.
• Tuned the performance of the existing reports and successfully completed testing of the reports.

Environment: Oracle 9i, Reports 9i, SQL*Plus, Active Directory, MS PowerPoint, MS Excel, MS Access, MS Visio, SQL Navigator, UML, VB, PL/SQL, Oracle Designer 6i, Windows Server 2000, Toad, ERWIN 4.0
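As a rough illustration of the REF cursor and dynamic SQL work listed above (a sketch only; the package name SUBSCRIBER_RPT_PKG and the SUBSCRIBERS table are hypothetical, not the project's actual objects):

CREATE OR REPLACE PACKAGE subscriber_rpt_pkg IS
  TYPE rpt_cursor IS REF CURSOR;
  PROCEDURE get_subscribers(p_region IN VARCHAR2, p_rc OUT rpt_cursor);
END subscriber_rpt_pkg;
/

CREATE OR REPLACE PACKAGE BODY subscriber_rpt_pkg IS
  PROCEDURE get_subscribers(p_region IN VARCHAR2, p_rc OUT rpt_cursor) IS
    v_sql VARCHAR2(500);
  BEGIN
    -- Dynamic SQL with a bind variable; the calling report fetches rows
    -- from the returned REF cursor.
    v_sql := 'SELECT subscriber_id, plan_code, activation_date ' ||
             'FROM subscribers WHERE region = :1';
    OPEN p_rc FOR v_sql USING p_region;
  END get_subscribers;
END subscriber_rpt_pkg;
/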

REFERENCES AVAILABLE UPON REQUEST

Mohit Rastogi    Date of Birth: 4 Oct 1983    Gender: Male    Nationality: India

Flat No. 21/3, Mutha Chambers, Senapati Bapat Road, Pune - 411057

Phone: 91-120-2776873    Mobile: 91-9881722477

Email: [email protected] email: [email protected]

Current Location: Pune

 

  INFORMATICA

Work Experience : 5 years 7 months

Skills : INFORMATICA

Domain Knowledge : Not specified

Industry : IT/ Computers - Software

Category : IT

Roles : Team Leader/ Technical Leader

Current Employer : COGNIZANT TECHNOLOGY LTD

 

Page 31: Informatica Resume Dec2011


Current Annual Salary : 6.90 lacs per annum

Previous Employer : INFOSYS TECHNOLOGY LTD

Highest Degree Held: B.E/B.Tech, Electronics/Telecommunications, LNCT

Preferred Job Location : Anywhere

  Skill Name          Last Used          Skill Level          Experience

Informatica Jul 2011 Intermediate 48 months

Unix Jul 2011 Beginner 8 months

Oracle Jul 2011 Beginner 12 months

 

 

MOHIT RASTOGI

Row House No. 1, Yash Residency, Phase 2, Sus Road, Pune - 411057

Mobile: +91-9881722477

Email: [email protected]

OBJECTIVE

A challenging, growth-oriented position where my technical and professional skills can be effectively utilized and improved, eventually contributing to the growth of the organization.

PROFESSIONAL SYNOPSIS

Worked extensively on data warehousing projects for 5 years 7 months.

Skills in UNIX, Oracle, Informatica Power Center, data warehousing concepts, Teradata, and Microsoft Visual SourceSafe.

Proven ability to lead the complete SDLC, spanning analysis and development.

An effective communicator with excellent relationship management skills, strong analytical and problem-solving abilities, and a flexible, detail-oriented attitude.

Page 32: Informatica Resume Dec2011

ORGANIZATIONAL EXPERIENCE

Current Organization: Cognizant Technologies Ltd. (April 2010 – till date)

Project: Investment Bank Intelligence Engine    19 April 2010 till date (14 months)

Client: JP Morgan Chase.

Team Size:   5

Role: Associate Consultant (Lead)

Scope: JP Morgan Chase (JPMC) is a Fortune 25 financial giant with revenues of over US $125 billion. JPMC is a global banking firm serving the complex needs of its customers (including government organizations, the corporate sector, and institutional investors) through a sophisticated and highly integrated range of financing, advisory, trading, investment, and associated capabilities.

JPMC provides a full range of IB (Investment Bank) and commercial banking products and services. The Firm also commits its own capital to proprietary investing and trading activities. IB Intelligence Engine is one of its prominent business areas.

JPMC has initiated a program called IBIE-Integration to integrate data from different source systems into a centralized data warehouse (IBIE). Data from source systems such as CMR, AQUEDUCT, IBCRM, and HEADCOUNT is loaded into IBIE. Incoming data is first loaded into staging tables; after the stage load, the dimension and fact tables are loaded. Slowly Changing Dimension Type 2 (SCD 2) is generally used for the dimensions. Data from the dimension tables is then loaded into Vertica data mart tables, and OLAP views are created on top of the data mart tables for reporting purposes.
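As a rough illustration of the SCD Type 2 pattern described above (not the project's actual code; DIM_ACCOUNT, STG_ACCOUNT, and their columns are hypothetical), the dimension load can be expressed in Oracle SQL along these lines:

-- Step 1: close out the current version of rows whose tracked attributes changed.
UPDATE dim_account d
SET    d.current_flag = 'N',
       d.effective_end_date = TRUNC(SYSDATE) - 1
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_account s
               WHERE  s.account_id = d.account_id
               AND    s.account_status <> d.account_status);

-- Step 2: insert a new current version for changed rows and for brand-new keys
-- (after step 1, neither group has a row still flagged as current).
INSERT INTO dim_account
       (account_key, account_id, account_status,
        effective_start_date, effective_end_date, current_flag)
SELECT dim_account_seq.NEXTVAL,
       s.account_id,
       s.account_status,
       TRUNC(SYSDATE),
       DATE '9999-12-31',
       'Y'
FROM   stg_account s
LEFT JOIN dim_account d
       ON d.account_id = s.account_id
      AND d.current_flag = 'Y'
WHERE  d.account_id IS NULL;

COMMIT;

In the project itself this logic sits in Informatica mappings (typically Lookup plus Update Strategy transformations) rather than hand-written SQL; the SQL is shown only to make the pattern concrete.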

Responsibility: Requirement Analysis, Estimation, Design, Coding, Reviewing, Testing, Release,

Page 33: Informatica Resume Dec2011

Support and Monitoring

Platform: Informatica Power Center, Oracle, Microsoft Project, Microsoft Visual SourceSafe

Previous Organization: Infosys Technologies Ltd. (Dec 2005 – March 2010)

Project: Anthem Care Comparison    15 June 2008 to 31 March 2010 (21 months)

Client: WellPoint.

Team Size:   4

Role: Software Engineer (Lead)

Scope: This project is intended to manage the flow of data between WellPoint and a single WebMD vendor through various logical processes. The data is extracted from source systems across the regions in a particular format for re-pricing claims of all types; the claims are then re-priced through the ACC process. These re-priced claims are used to create various reports for analysis, and the extract files are sent to WebMD, which allows users to compare prices and the experience/volume of tests across facilities.

Responsibility: Requirement Analysis, Estimation, Design, Coding, Reviewing, Testing, Release, Support and Monitoring

Platform: Informatica Power Center, MS SQL Server 2000, Microsoft Project, Microsoft Visual SourceSafe

Project: Geriatric Care Management    03 Oct 2007 – 15 June 2008 (8 months)

Client: WellPoint.

Team Size:   3

Role: Software Engineer (Module Lead)

Scope: This project is intended to manage the flow of data between WellPoint and several vendors to provide risk analysis for Medicare Advantage patients. The creation of a data mart gives a better opportunity to analyze and determine drivers around the cost of care, to better assess

Page 34: Informatica Resume Dec2011

and determine the special needs and gaps in care among members, and to enhance and monitor the performance of Quality Improvement initiatives.

Responsibility: Requirement Analysis, Estimation, Design, Coding, Reviewing, Testing, Release, Support and Monitoring.

Platform: Informatica Power Center, Teradata, Microsoft Visual SourceSafe, Microsoft Project

Project: Enterprise Data Warehouse    12 June 2006 to 03 Oct 2007 (16 months)

Client: WellPoint

Team Size:   5

Role: Software Engineer

Scope: This project is aimed at creating a common data warehouse platform, for the purpose of statistics and analysis, from the various existing WellPoint data warehouses, which existed in different formats and with different parameters as a result of WellPoint's acquisition of various other firms serving different sets of customers. This is an ETL (Extraction, Transformation and Loading) based project.
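Purely as an illustration of consolidating differently formatted legacy warehouses into one platform (the actual WellPoint schemas are not shown; every table and column name below is hypothetical):

-- Two acquired companies keep member data in different shapes; both are
-- standardized into one conformed table in the common warehouse.
INSERT INTO edw_member (member_id, source_system, full_name, state_code)
SELECT m.mbr_id,
       'LEGACY_A',
       m.first_nm || ' ' || m.last_nm,
       m.st_cd
FROM   legacy_a_member m
UNION ALL
SELECT TO_CHAR(c.customer_no),
       'LEGACY_B',
       c.customer_name,
       SUBSTR(c.region_code, 1, 2)
FROM   legacy_b_customer c;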

Responsibility: Requirement Analysis, Estimation, Design, Coding, Reviewing, Testing, Release, Support and Monitoring

Platform: Informatica Power Center, Teradata, Microsoft Visual SourceSafe

KEY ACCOMPLISHMENTS

Certifications: Domain: American Health Insurance Plan (AHIP) certified.

Technology: Informatica Certified

IT SKILLS

Languages:    Java

Page 35: Informatica Resume Dec2011

Database: Teradata, Oracle

Operating Systems:     Windows, UNIX

Tools: Informatica Power Center, Microsoft Visual SourceSafe

EDUCATION

B.E. in Electronics and Telecom Engineering from LNCT (2001-2005), 74%

12th, CBSE (2001), 82%

10th, CBSE (1999), 78%

PERSONAL PROFILE

Date of Birth    :   04 Oct 1983

Correspondence Address   :   Row House No. 1, Yash Residency, Phase 1, Sus Road, Pashan, Pune

Permanent Address   :   Flat No. 105, Nanda Tower, Kaushambhi, Near Pacific Mall, Ghaziabad.

   Phone No: 01202776873

Marital Status   :   Single

Language Known   :   English and Hindi

Nationality   :   Indian

Place: Pune

Mr. Mohit Rastogi

Satish Kolla    Date of Birth: Not Specified    Gender: Male    Nationality: India

Phone: Not specified    Mobile: 91-9949999815

Email: [email protected]

Current Location: Hyderabad

 

Page 36: Informatica Resume Dec2011

  Informatica

Work Experience : 6 years 1 month

Skills : Informatica, Oracle, Unix, PL/SQL

Domain Knowledge : IT/ Computers - Software

Industry : IT/ Computers - Software

Category : IT

Roles : Software Engineer/ Programmer

Current Employer : Verizon Data Services India Pvt Ltd

Current Annual Salary : Not Specified

Previous Employer : IBM India Pvt Ltd

Highest Degree Held : M.Sc, Computers, Bharathidasan University

Preferred Job Location : Bangalore, Chennai

 

  Skill Name          Last Used          Skill Level          Experience

Informatica          Not specified          Not specified          6 months

Unix          Not specified          Not specified          3 months

Oracle          Not specified          Not specified          6 months

 

 

Satish K

Mobile No: +91-9949999815
Email: [email protected]

Summary:
6+ years of experience in Information Technology as a Software Engineer / ETL Developer in data warehousing development.
Extensive experience in ETL development with Informatica.
Good understanding of RDBMS concepts and experience in writing queries, procedures, and scripts using Oracle SQL; proficient in database programming using the Oracle RDBMS.
Strong knowledge of data warehouse tools such as Informatica PowerCenter 6.2/7.1.2/8.6.
Analyzed the types of business tools and transformations to be applied.
Implemented performance tuning logic on sources, mappings, sessions, and targets.
Good communication and interpersonal skills.

Technical Skills:
ETL Tools: Informatica Power Center

Page 37: Informatica Resume Dec2011

6.2/7.x/8.x
Databases: Oracle 9i, DB2
Programming Languages: SQL
Operating Systems: MS Windows 98/NT/2000

Experience Summary:
Working as a Senior Analyst with Verizon Data Services India Pvt Ltd, Hyderabad, since July 2006.
Worked as a Software Engineer with IBM India Pvt. Ltd., Kolkata, from Sept 2005 to June 2006.
Worked as a Software Engineer with Optimal Solutions, Bangalore, from Feb 2005 to May 2005.

Educational Qualification:
Master's Degree in Information Technology from Bharathidasan University in 2001.

Project Profiles:

Project # 1
Project Title: Metrics Data Repository
Organization: Verizon Data Services India Pvt Ltd
Environment: Informatica 8.6, Oracle, Unix
Duration: July '06 till date

Verizon Communications is one of the leading providers of communication services in the world and the largest provider of wireline and wireless communications in the USA.

The Metrics Data Repository deals with DSL and FTTP sales data and enables top-level management to analyze DSL and FTTP sales. At present we are working on enhancements and, at the same time, analyzing the existing design to improve performance; this involves redesigning the mappings and sessions and tuning the queries for the best possible performance.
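As a loose illustration of the kind of aggregation the sales mappings in this repository perform (the real MDR tables are not shown; SALES_FACT and PRODUCT_DIM are assumed names):

-- Monthly DSL/FTTP sales rolled up by product line and region, the sort of
-- summary an Aggregator transformation would produce before the target load.
SELECT p.product_line,                      -- e.g. 'DSL' or 'FTTP'
       s.region_code,
       TRUNC(s.sale_date, 'MM') AS sale_month,
       COUNT(*)                 AS orders,
       SUM(s.sale_amount)       AS total_sales
FROM   sales_fact s
JOIN   product_dim p
       ON p.product_key = s.product_key
GROUP BY p.product_line,
         s.region_code,
         TRUNC(s.sale_date, 'MM');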

Responsibilities:

Created mappings and reusable transformations using the Power Center Designer (Informatica).
Implemented sophisticated transformations using Informatica features such as Aggregator, Filter, Expression, Lookup, Update Strategy, Sequence Generator, and Source Qualifier.
Used the Joiner transformation for extracting data from multiple sources.
Implemented performance tuning logic on sources, mappings, sessions, and targets to provide maximum efficiency and performance.
Coding and debugging; sorting out day-to-day technical problems.
Created sessions, worklets, and workflows as per the requirements.
Analyzed the types of business tools and transformations to be applied.
Analyzed functional specifications.
Performed unit testing.

Project # 2
Project: Project North - Metrics Data Reconciliation (MDR) - SPINCO & RETAINCO
Client: Frontier

Page 38: Informatica Resume Dec2011

Role: Team Member
Environment: Informatica 8.6, Oracle 9i, Unix, SQL Navigator
Duration: Nov '09 - Apr 2010
Description: Verizon sold the western part of its business to Frontier, covering the FIN, HSI, FTV, and FDV lines of business in 12 states and a few wire centers in CA. Project North was essentially a demerger of the complete Verizon software, hardware, data, resources, and more (a spin-out of a new company from an existing organization on one of the world's largest scales, under stringent, tightly time-bound deadlines, with figures such as 700+ projects, 2,000+ servers, 4,000 resources, 6 billion dollars, and one year) into two parts: SPINCO (transferred to Frontier) and RETAINCO (retained by Verizon), adhering to both organizations' regular business rules, data security requirements (thin-client and Citrix web-interface published applications in a clean-room environment), and federal rules. MDR, the broadband metrics data warehouse, had to produce PN-MDR as a replica with the same applications and existing functionality, running in parallel with the existing BAU.
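A loose sketch of the state-based split such a demerger implies (the actual Project North logic is not shown; SALES_FACT, SPINCO_SALES_FACT, RETAINCO_SALES_FACT, and the state list are hypothetical):

-- Rows for the divested states go to the SPINCO copy; everything else is
-- retained. The state list here is illustrative only.
INSERT INTO spinco_sales_fact
SELECT *
FROM   sales_fact
WHERE  state_code IN ('WA', 'OR', 'ID');

INSERT INTO retainco_sales_fact
SELECT *
FROM   sales_fact
WHERE  state_code NOT IN ('WA', 'OR', 'ID');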

Responsibilities:

Identified the complete set of stakeholders: source database systems, target DB systems, inbound DB, inbound XFER, outbound XFER, outbound DB, retired/non-PN applications, software, hardware servers and configuration, and licence procurement details.
Obtained blanket security clearance and published all MDR applications in the clean room with biometric access for PN resources.
Project-related activities (identified hardware and software requirements): DB servers, ETL servers, and Appworx client and server setup.
Developed the methodology and work plan; identified upstream interfaces in and downstream interfaces out.
Application and database setup, i.e. established the hardware and software configuration in FTW.
Replicated the production environment to the PN environment and provided access for all VDSI PN resources from the clean room.
Set up the replication process: created FTW DB instances, FTW Informatica instances, etc.
Extracted Frontier data using Data Pump export scripts; exported all online data, including history, for the Project North states.
Tested the application load/unit test, inbound and outbound feeds, regression, and client acceptance testing.
Post-production support and job monitoring using the Informatica scheduler.

Project # 3
Project Title: Nestle Globe ETL
Organization: IBM India Pvt. Ltd.
Environment: Informatica 6.2, DB2
Duration: Sept '05 - June '06
Description: Nestle is a multinational company that is in the process of implementing the SAP R/3 ERP system in more than 80 countries. Nestle maintains market-specific data in the local SAP R/3 implementations as well as a portion of it in the centralized SAP R/3 database on DB2 in Switzerland. Data from various disparate

Page 39: Informatica Resume Dec2011

legacy systems needs to be extracted, cleansed, transformed, and loaded into these SAP databases. The ETL tool used is Informatica Power Center 6.1.2, and DB2 is the target database. Additional Informatica components are used for accessing and modifying data in proprietary data stores such as SAP R/3 and AS/400; this is achieved through the respective Informatica Power Connect 6.2 modules.

Responsibilities:
Extracted source data from legacy systems and moved it into the staging area (DB2 and flat files).
Created mappings and reusable transformations using the Power Center Designer (Informatica).
Used Power Connect to generate ABAP code and import SAP R/3 tables.
Implemented sophisticated transformations using Informatica features such as Aggregator, Filter, Expression, Normalizer, Lookup, Update Strategy, and Source Qualifier.
Used the Joiner transformation for extracting data from multiple sources.
Implemented performance tuning logic on sources, mappings, sessions, and targets to provide maximum efficiency and performance.
Coding and debugging; sorting out day-to-day technical problems.
Created sessions, worklets, and workflows as per the requirements.
Ran the workflows and sessions and extracted data from the SAP R/3, AS/400, SQL Server, and DB2 databases.
Analyzed the types of business tools and transformations to be applied.
Analyzed functional specifications.
Tested the extraction of data from SAP.
Performed unit testing.

Project # 4
Project Title: Risk Information System (RIS), Australia
Client: ANZ Bank
Role: ETL Developer
Environment: Informatica 7.1.2, Oracle 10g
Duration: Feb 2005 to May 2005
Description: ANZ is significantly expanding its risk capability via projects such as Basel, RXM Replacement (RXM-R), and One Customer View - Data Sourcing (OCV-DS). RXM-R is a replacement by RAZOR, an application with nine interfaces. RAZOR stores the interface data, consisting of the customer, deal, and limit information of all account holders of ANZ Bank; the main scope of this project is to calculate the bank's risk. The Basel steering committee decided to adopt a new Risk Information Store (RIS), keeping the existing platforms such as DB2 on mainframes and flat files, with enhancements and additions (in the form of ETL), to provide a base that will enable future Basel as well as non-Basel needs to be met. The project has been extended to include the requirements for capital calculations, internal reporting, and the development of a modeling test-bed platform.

Page 40: Informatica Resume Dec2011

Responsibilities:
Worked on the Informatica tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
Created mappings using the Informatica Designer to build business rules for loading data.
Used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Router, Filter, Sequence Generator, Expression, Joiner, and Update Strategy.
Developed and modified mappings according to the business logic.
Created mapping variables, mapping parameters, and session parameters.
Recovered data from failed sessions and workflows.
Created workflows and sessions using the Workflow Manager.
Coding and debugging; sorting out day-to-day technical problems.
Analyzed the types of business tools and transformations to be applied.
Implemented performance tuning logic on sources, mappings, sessions, and targets to provide maximum efficiency and performance.
Developed mapplets using the Mapplet Designer.
Wrote SQL and PL/SQL programs.
Wrote unit test cases.
Fixed bugs in the mappings, sessions, and parameter files using Test Director 8.0.
Performed unit testing.

Sinraj N    Date of Birth: 10 Mar 1983    Gender: Male    Nationality: India

Phone: 91-7299-404072    Mobile: 91-7299404072

Email: [email protected]

Current Location: Chennai

 

  informatica

Work Experience : 5 years 1 month

Skills : informatica

Domain Knowledge : Banking/ Financial Services, IT/ Computers - Software

 

Page 41: Informatica Resume Dec2011


Industry : IT/ Computers - Software

Category : IT

Roles : Software Engineer/ Programmer

Current Employer : TCS

Current Annual Salary : 4.20 lacs per annum

Previous Employer : Wipro Technologies

Highest Degree Held: B.E/B.Tech, Electronics/Telecommunications, Anna University

Preferred Job Location : Chennai

  Skill Name          Last Used          Skill Level          Experience

informatica Jul 2011 Intermediate 40 months

oracle Jul 2011 Intermediate 48 months

 

 

SINRAJ NALLAPPAN
Email: [email protected]
Mobile: +91-7299404072

EXPERIENCE SUMMARY:
5 years of IT experience in data warehousing applications and business intelligence.
Experience in the design, development, and testing of different data warehousing projects.
Good experience in all aspects of the SDLC.
Well versed in the latest versions of Informatica (8.6).
Extensively worked on developing and debugging Informatica mappings, sessions, and workflows.
Good knowledge of and experience with the Oracle RDBMS.
Exceptional ability to quickly master new concepts and applications.
Ability to handle multiple tasks and work independently as well as in a team.
Excellent interpersonal, communication, presentation, and analytical skills required to work effectively in applications development and maintenance.

PROFESSIONAL EXPERIENCE:
Working as a Senior Software Engineer in TCS (through Future Focus InfoTech) from August 2010 till date.
Worked as a Software Engineer in Wipro Technologies (through Pyramid IT Consulting) from February 2010 to July 2010.
Worked as a Content Engineer in Utopia India Pvt Ltd from March 2007 to August 2009.
Worked as a Data Engineer in Altius Content Solutions Pvt Ltd from October 2005 to March 2007.

EDUCATIONAL QUALIFICATION:
Bachelor of Engineering (Electronics & Communication), Anna University, India, 2005

TECHNICAL SKILLS:
ETL TOOL: Informatica 7.1/8.1/8.6
RDBMS: Oracle 9i/10g.

Page 42: Informatica Resume Dec2011

OPERATING SYSTEMS: Windows 95/98/XP, Windows NT/2000.

Project #1
TITLE: DESA (Direct Engine Sales Analysis)
Client: Cummins Enterprises (USA)
Period: Jan 2011 till date
Role: ETL Developer (Informatica)
Description:

DESA is a mainframe-to-DW migration project. DESA is currently a mainframe application that, by virtue of three successive projects, has moved its reporting, its database, and its submission process into a distributed, web-based environment. In light of the corporate initiative to get off the mainframe by 2014, it is now time to take the final step of removing the DESA processes from the mainframe and providing the ability to run and support an independent DW solution for the DESA business user community's reporting requirements. The work includes removing the mainframe dependencies and moving the entire core DESA process to the data warehouse; creating a common feed for the mainframe downstream application; implementing the KGAF and KGAJ screens; re-engineering the MKG0002 job functionality, which extracts the ESN file with all the history related to the engines updated in the DESA system during the current month (this needs to be implemented in the EBU DW); and incorporating the mainframe invoice/cost matching logic into the data warehouse in order to apply the shop order costs to the engine (a sketch of this matching step follows the responsibilities below).

Responsibilities:

1. Requirement gathering, analysis, and conversion of IMS mainframe code to BI as per DW standards.

2. Held discussions with the IMS team and understood the functionality of the IMS mainframe logic.

3. Data profiling based on the requirements and impact analysis on the existing BI application.

4. Developed ETL mappings incorporating all the IMS logic and all dependencies.

5. Prepared the unit test cases, sent them to the IMS team for validation, and corrected the logic as per the IMS comments; set up the environment for SIT and UAT and handled pre- and post-production activities.
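Purely as an illustration of the invoice/cost matching step mentioned in the description (the actual DESA tables and matching rules are not shown; ENGINE_FACT, SHOP_ORDER_COST, and the join keys are hypothetical):

-- Apply shop order costs to the matching engine records, keyed here by engine
-- serial number (ESN) and shop order number.
UPDATE engine_fact e
SET    e.shop_order_cost = (SELECT SUM(c.cost_amount)
                            FROM   shop_order_cost c
                            WHERE  c.esn = e.esn
                            AND    c.shop_order_no = e.shop_order_no)
WHERE  EXISTS (SELECT 1
               FROM   shop_order_cost c
               WHERE  c.esn = e.esn
               AND    c.shop_order_no = e.shop_order_no);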

Project #2
TITLE: OBI Analytics
Client: Cummins Enterprises (USA)
Period: August 2010 to December 2010
Role: ETL Developer

Description:

The purpose of this project is to ensure that a robust process and system is developed to enable OOB reports for the Order Management and Supply Chain Management subject areas, with New and Recon parts data for the Mechelen, Memphis, and Singapore PDCs (Parts Distribution Centers). The integrated IT solution should support business

Page 43: Informatica Resume Dec2011

requirements and provide accurate Business Intelligence reporting.

Responsibilities:

1. Created ETL code to load the Mechelen, Memphis, and Singapore PDC (Parts Distribution Center) New and Recon parts data from the Engine Business Unit data warehouse into the Oracle Business Analytics Warehouse for the Order Management, Finance, and Supply Chain Management subject areas.

2. Validated the mapped OOB reports/dashboards.

3. Created unit test cases and SIT test cases, and set up the environment for UAT.

Project #3
TITLE: Collateral Management and Client Statement Reporting (DBLite)
Client: Credit Suisse LLC (USA)
Period: Feb 2010 to July 2010
Role: ETL Developer
Description: Credit Suisse is one of the world's largest investment banks. The purpose of this project is to meet the requirements for automating and generating BD Lite client statements, monthly reviews, and valuation reports. It is the global application supporting the Americas, Europe, and Asia Pacific regions. The transactional data comes from different data sources and is loaded into the data warehouse via ETL. The trades, positions, balances, and cash flow activities are enriched with reference data (i.e. counterparty and product), which is also loaded into the data warehouse. CMACSR addresses the requirements to create, distribute, and archive the monthly and ad hoc client statements, the semiannual tax statement, P&L trade facts reports, AMS and adjustment audit trail reports, etc. All these reports can be viewed by end users through a GUI client interface.
Responsibilities:
Analyzed the business requirements through analysis of the BRD and FRD.
Involved in the development of the BRD and data traceability documents.
Created mapping schemas, interacting with the onsite coordinator.
Analyzed and imported source tables from the respective databases.
Created mappings as per specifications by understanding the technical and functional requirements.
Joined different database tables through joiner and lookup transformations on source and target tables as per the requirements of the related tables and ports.
Created source, target, and staging tables in the database and loaded them.
Analyzed target tables after extract, transform, and load.
Designed test case documents and performed unit testing and SIT on the ETL before moving to UAT.
Involved in unit testing and system testing to check whether the data loaded into the targets is accurate.
Environment: Informatica Power Center 8.6, Rapid SQL, Oracle 10g, Control-M.

Project #4
Title: UWEB Grainger

Page 44: Informatica Resume Dec2011

Client: Grainger - USA
Role: ETL Developer
Duration: Apr 2008 to August 2009
Description: Grainger is one of the leading retail companies in the USA. This data warehousing application was built for the analysis and reporting of the company's sales and marketing data. The main objective of the project is to help the senior and middle-level management of the organization improve sales and gain knowledge of new business opportunities. Data is extracted from the client's servers and loaded into the staging and target tables through Informatica. Using this project, various types of reports can be generated, such as Best Employee, Best Customer, Best Product Sold, and Best Promotion reports.
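As a loose illustration of the kind of "Best Customer" report query such a warehouse supports (the real Grainger schema is not shown; SALES_FACT and DIM_CUSTOMER are assumed names):

-- Top ten customers by total sales amount for a given year.
SELECT *
FROM  (SELECT c.customer_name,
              SUM(f.sale_amount) AS total_sales
       FROM   sales_fact f
       JOIN   dim_customer c
              ON c.customer_key = f.customer_key
       WHERE  f.sale_date >= DATE '2009-01-01'
       AND    f.sale_date <  DATE '2010-01-01'
       GROUP BY c.customer_name
       ORDER BY total_sales DESC)
WHERE ROWNUM <= 10;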

Responsibilities:
Analyzed the business requirements through analysis of the BRD, FRD, and TDS.
Involved in the setup and installation of the Informatica Power Center server.
Analyzed and imported source tables from the respective databases.
Created mappings using different transformations.
Created different mapplets and reusable transformations to reuse business logic across many mappings.
Analyzed and fixed ETL problems through the Debugger.
Analyzed target tables after extract, transform, and load.
Generated and executed test cases and maintained the associated documents.
Prepared and tested code to fix issues in the development and QA environments.
Documented the project.
Environment: Informatica Power Center 7.1, Rapid SQL, Oracle 9i.

Project #5
Title: UTPR
Client: PERTAMINA, Indonesia
Role: ETL Developer
Duration: Apr 2007 to Mar 2008
Description: PERTAMINA is an oil and gas company. This data warehouse captures the sales and marketing process of the business, which enables the business intelligence layer to be built, i.e. reports viewed by the end users, through which top-level management can make decisions on the basis of business performance; it also helps calculate the revenue of the organization. The system helps the decision-making team of the organization monitor and improve sales and explore new business opportunities.
Responsibilities:
Analyzed the business requirements through analysis of the BRD, FRD, and TDS.
Created source and target tables from the respective databases.
Created reusable transformations and used them in mappings.
Developed and tested the mappings.
Joined different database tables through joiner and lookup transformations on source and target tables as per the requirements of the related tables and ports.
Created source, target, and staging tables in

Page 45: Informatica Resume Dec2011

the database and loaded them.
Validated and executed the sessions in Informatica.
Implemented the business rules using various transformations.
Prepared the test cases and validated the data by executing them.
Involved in data validation as per the business requirements.
Environment: Informatica Power Center 7.1, Oracle 9i.

Project #6
Title: Attendance Recorder
Client: Altius
Role: SQL Developer
Duration: Jan 2006 to March 2007
Description: Involved in the development of an attendance recording system that integrates with and extends the functionality of the access control mechanism and its associated software. While the card-reader-based access control system logs all door transactions and implements the preset time zones and access levels for the office, the attendance recording system facilitates the import of these raw transactions via text files, implementing the office rules. The system also provides database loading, maintenance, backup, and a powerful report generation feature.
Responsibilities:
Created tables and views, and defined table columns at the database level.
Loaded data into Oracle database tables.
Analyzed and corrected the data and created audit columns.
Created, monitored, and maintained Oracle databases.
Configured ODBC connect
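As a small illustration of the audit columns mentioned above (a sketch only; ATTENDANCE_TXN and its columns are hypothetical, not the project's actual schema):

-- Raw door transactions imported from the text files, with audit columns
-- recording who loaded each row and when.
CREATE TABLE attendance_txn (
  txn_id        NUMBER        PRIMARY KEY,
  employee_id   NUMBER        NOT NULL,
  door_id       VARCHAR2(20)  NOT NULL,
  swipe_time    DATE          NOT NULL,
  created_by    VARCHAR2(30)  DEFAULT USER,
  created_date  DATE          DEFAULT SYSDATE,
  updated_by    VARCHAR2(30),
  updated_date  DATE
);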

