John Vincent
ETL/DWH & BI Quality Analyst
+91 9941730490
SUMMARY
Driven professional with 45 months of experience as an ETL/DWH & BI Quality Analyst
Proficient in all phases of STLC from test planning to defect management
Extensive experience in reviewing Business Requirement Documents and Functional Requirement
Documents, and in preparing and executing test cases
Ample knowledge of data warehousing concepts and RDBMS databases - Teradata, Oracle & SQL
Server
Expertise in Data Warehouse Architecture – IBM-BDW Model, EDW Model and Teradata-FSLDM
Extensive experience in Banking - Credit Risk Data Mart and Business Data Analysis
Extensive experience in developing SQL scripts and Teradata macros for validating database tables
and report data in backend database testing
Good exposure to data quality checks, boundary value checks, static testing, integration testing
and regression testing
Exposure to ETL tools like Abinitio & Informatica
Expertise in BI Tools like Cognos (Report Studio & Query Studio) & SSRS
Extensive experience in developing ETL graphs in Abinitio using components like Reformat,
Join, Filter, Partition by Key, Normalize, and built-in HDFS components
Exposure to Hadoop Big Data Testing on IBM Big Insights
Extensive experience in History Data Integration testing in Data Mart and History Management
Process Validation
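The backend database validations mentioned above typically reduce to source-to-target reconciliation: matching row counts and running "minus"-style queries to surface rows that failed to load. A minimal sketch of that pattern, using SQLite as a stand-in for Teradata/Oracle; the table and column names are illustrative, not taken from any project:

```python
import sqlite3

# Stand-in source and target tables (illustrative names, in-memory database).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_account (acct_id INTEGER, balance REAL);
    CREATE TABLE tgt_account (acct_id INTEGER, balance REAL);
    INSERT INTO src_account VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO tgt_account VALUES (1, 100.0), (2, 250.5);
""")

# Row-count reconciliation: source vs. target.
src_count = cur.execute("SELECT COUNT(*) FROM src_account").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_account").fetchone()[0]

# "Minus"-style query (EXCEPT in SQLite, MINUS in Teradata/Oracle):
# rows present in the source but missing from the target.
missing = cur.execute("""
    SELECT acct_id, balance FROM src_account
    EXCEPT
    SELECT acct_id, balance FROM tgt_account
""").fetchall()

print(src_count, tgt_count, missing)  # 3 2 [(3, 75.0)]
```

In a real engagement the same queries would run against the actual source and warehouse connections, with the mismatch list feeding the defect log.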
TECHNICAL SKILL SET
Database : Teradata, Oracle 10g, SQL Server, DB2 8.0
Utility Tools : SQL Server, TOAD and SQL Assistant
ETL Tool : Abinitio, Informatica
Reporting Tool : Cognos, SSRS
Test Management Tool : HP Quality Center 10.0/11.0 ALM
Scheduling Tool : Mainframe (TWT)
Language Skills : SQL, PL/SQL, Unix/Linux Scripting
Data Model : IBM-BDW & EDW, Teradata – FSLDM
BIG Data Tools : Hadoop – IBM Big-Insights
PROFESSIONAL EXPERIENCE
PROJECT: Credit Risk Reporting
CLIENT: KeyBank
DURATION: September 2013 - Present
ORGANIZATION: Cognizant Technology Solutions, Chennai
ROLE: DWH Quality Analyst / ETL Test lead
DOMAIN: Banking and Financial Services
DESCRIPTION: Credit Risk Reporting (CRR) is used to review the credit standing and rating of the
bank's customers. The report is designed to help financial institutions and credit grantors strengthen their
credit risk management and improve turnaround time for credit evaluation, by making comprehensive and
reliable credit information and ratings available within a single report.
The bank wanted to build a data mart to review the credit standing and risk reports of its customers via
Cognos reporting. The CRR architecture moves data from 15 different sources through layers such as the
Harmonization and Canonical layers, and integrates the data into a single database based on the
IBM-BDW model, called Shared Foundation Data (SFD). Data from SFD is then moved to the CRR
Data Mart.
Later, the bank improved CRR Mart performance by replacing the SFD IBM-BDW model with a
Hadoop framework using IBM BigInsights, in order to handle the data volume and reduce data-load
complications; it also reduced the complexity of the SFD-BDW data model.
RESPONSIBILITIES UNDERTAKEN:
Performed static testing to find discrepancies between the Business/Functional Requirements
documents and the Mapping Document across each layer, to avoid mapping defects during the STLC phases
Performed Boundary value checks based on the Business Requirements against Source data
Extracted data from various sources like Mainframe, Sybase, Oracle, SQL Server, DB2, flat files,
and vector and conditional DML files
Extensively used Abinitio for file conversion from vector and conditional DML files to flat files
Worked on Data profiling to collect statistics and information about source data
Developed Test Plans, Test Strategy, Test scripts and executed the test scripts for CRR-DWH
Project.
Developed complex SQL queries and SQL procedures for querying data against different
databases for the source data verification process
Developed test scripts based on technical specifications/Data design documents and Source to
Target mappings
Extensively interacted with ETL developers, Line of Business & Business analyst teams to
understand the CRR project business requirements and ETL design document specifications
Participated in regular Project status meetings, Line of Business meetings and QA status meetings
Defects identified in the SIT environment were communicated to the ETL developers and Line of
Business via Quality Center defect triage meetings
Prepared daily and weekly status reports with details of executed, passed, and failed test cases
and defect status based on severity of defects
Set up, monitored and used the Mainframe TWT scheduler in the SIT/QA region
Worked on issues with migration of Data from Development to SIT/QA Environment
Developed Unix Scripts to validate the file availability in NAS region
Developed test scripts to validate the data flow from the HDFS source to the IL layer in the Hadoop
environment using IBM BigInsights
Developed SQL procedures for regression testing between Source to Canonical Layer
Worked with the PDM team and Business Analysts to understand the IBM-BDW model
Gained strong exposure to all subject areas in the IBM-BDW model and monitored the data flow
across all areas in SFD (Shared Foundation Data)
Developed Teradata macros to validate the Slowly Changing Dimension (SCD) process in SFD
and the Mart, the History Data Management process in SFD, and referential integrity across all tables
Developed data analysis dashboard reports to validate record counts and key facts across all
the layers
Worked on the General Ledger reconciliation process to verify that the data in the GL proof
balance sheet matches the CRR mart
Worked on Business validation along with Line of Business in CRR mart during UAT
Extensively tested several Cognos reports - Line of Business reports and analytic dashboards -
to validate both the data and the cosmetics of the reports
Developed Ad-hoc reports using Report/Query Studio for data comparison between Production
and SIT region during UAT cycles
Provided encouragement, support, knowledge and guidance to team members during test
execution
Ensured that processes and standards defined by the company and clients were followed by
team members
Developed Abinitio graphs for regression testing of source files vs. Hadoop source files for
ancient-history FICO source data
Worked on Validating History Data Integration in Mart for both Analytic FICO Data and Credit
Card Data
Successfully delivered Credit Risk Mart to Business stakeholders in both SIT and UAT cycles by
leading a team of 3
Environment: Teradata, IBM BigInsights, Abinitio, Cognos, Tivoli
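The SCD validation macros described above usually assert two invariants on a Type 2 dimension: each business key has exactly one current row, and no two history rows for the same key have overlapping effective-date ranges. A hedged sketch of those checks using SQLite in place of Teradata; the `dim_customer` table and its columns are hypothetical:

```python
import sqlite3

# Hypothetical SCD Type 2 dimension table (names are illustrative, not from the project).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_customer (
        cust_id INTEGER, eff_start TEXT, eff_end TEXT, curr_flag TEXT);
    INSERT INTO dim_customer VALUES
        (1, '2013-01-01', '2013-06-30', 'N'),
        (1, '2013-07-01', '9999-12-31', 'Y'),
        (2, '2013-03-01', '9999-12-31', 'Y');
""")

# Check 1: every business key has exactly one current row.
bad_current = cur.execute("""
    SELECT cust_id FROM dim_customer
    GROUP BY cust_id
    HAVING SUM(CASE WHEN curr_flag = 'Y' THEN 1 ELSE 0 END) <> 1
""").fetchall()

# Check 2: no overlapping effective-date ranges per business key
# (self-join each pair of rows once via rowid and test interval overlap).
overlaps = cur.execute("""
    SELECT a.cust_id FROM dim_customer a
    JOIN dim_customer b
      ON a.cust_id = b.cust_id AND a.rowid < b.rowid
     AND a.eff_start <= b.eff_end AND b.eff_start <= a.eff_end
""").fetchall()

print(bad_current, overlaps)  # [] [] when the SCD history is clean
```

A Teradata macro would package the same two SELECTs so testers can run them per table with a single EXEC.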
PROJECT: ALM Technology Infrastructure
CLIENT: BNY Mellon
DURATION: February 2013- August 2013
ORGANIZATION: Cognizant Technology Solutions, Chennai
ROLE: ETL Tester
DOMAIN: Banking and Financial Services
DESCRIPTION: The bank has an existing Asset and Liability Management System (ALMIS) that helps
review the bank's balance sheet, reconcile its value against the company's general ledger, and generate
various reports requested by the US government and financial regulators under frameworks like BASEL.
The existing system is mostly manual and has limited scalability. With the introduction of the new BASEL III
rules, the bank wanted to build a robust, automated ALM system to meet current regulatory demands and
create a system that can be scaled over time. The architecture moves data from 55 different sources across
the bank and integrates the data into one central data warehouse using SSIS, through various data layers
such as staging, normalized and denormalized SQL Server 2008 databases, and provides a full-fledged
reporting system using QRM (Quantitative Risk Management), a third-party application with
industry-proven experience in financial reporting.
RESPONSIBILITIES UNDERTAKEN:
Identified gaps between the mapping document, Data Model and Business Requirements
Developed Test Plan, Test Scenario and Test Cases
Developed SQL Queries based on mapping document for Test Execution
Validated Data flow between SQL Server 2008 tables across all layers
Defects were tracked, analyzed, reviewed using HP Quality Center
Constantly monitored the batch process to make sure it ran smoothly, contacting the
development team if any errors were encountered
Involved actively in daily status meetings and provided testing status to Development team and
other stakeholders
Worked proactively with the team to provide better solutions for any issues
encountered
Developed Data Governance dashboard reporting using SSRS for ALMTIP
Developed automated SQL procedures for regression testing
Environment: SQL Server 2008, SSRS
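Automated SQL regression procedures of the kind listed above commonly compare a lightweight "signature" per table (row count plus a numeric checksum) between a baseline run and the current load. A minimal sketch, again using SQLite in place of SQL Server 2008; the table and column names are hypothetical:

```python
import sqlite3

def table_signature(cur, table, num_col):
    """Return (row count, sum of a numeric column) as a cheap regression signature."""
    return cur.execute(
        f"SELECT COUNT(*), COALESCE(SUM({num_col}), 0) FROM {table}"
    ).fetchone()

# Hypothetical baseline snapshot vs. current load of the same feed.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE baseline_gl (acct INTEGER, amount REAL);
    CREATE TABLE current_gl  (acct INTEGER, amount REAL);
    INSERT INTO baseline_gl VALUES (10, 500.0), (11, 125.5);
    INSERT INTO current_gl  VALUES (10, 500.0), (11, 125.5);
""")

baseline = table_signature(cur, "baseline_gl", "amount")
current = table_signature(cur, "current_gl", "amount")
print(baseline == current)  # True when the new load reproduces the baseline
```

In practice the signatures would be stored per test cycle so any drift between runs immediately flags the affected table for deeper column-level comparison.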
EDUCATIONAL QUALIFICATION
Course | Institution | Board/University | Duration | Percentage
M.E. Communication Systems | Sri Venkateswara College of Engineering | Anna University | 2010-2012 | CGPA: 7.99
B.E. Electronics and Communication Engineering | St. Peter's Engineering College | Anna University | 2006-2010 | 81
Higher Secondary Education | Little Flowers Higher Secondary School | State Board | 2005-2006 | 86.75
Secondary School Education | Little Flowers Matriculation School | Matriculation | 2003-2004 | 89.72
PERSONAL DOSSIER
Date of Birth : 19-04-1989
Address : 53/29, Ponnusamy Street, New Washermanpet, Chennai - 81