TRANSCRIPT
W7 Test Management 5/4/16 13:45
From Zero to Hero in 205 Days!
Presented by:
Michael Wasielczyk
T. Rowe Price
Brought to you by:
350 Corporate Way, Suite 400, Orange Park, FL 32073 · 888-268-8770 · 904-278-0524 · [email protected] · http://www.stareast.techwell.com/
Michael Wasielczyk T. Rowe Price CTFL Michael Wasielczyk has thirty years of experience in the areas of software testing, quality assurance, metrics, process improvement, and project management. In his current assignment at T. Rowe Price, Michael is responsible for establishing the testing practice and building an organization to support system integration testing for all investment data systems. Previously, he spent fourteen years at UPS, responsible for the successful implementation of the software testing practice across UPS’ Customer Technology Portfolio, and fourteen years as a test analyst/test manager for companies including CSC, Nichols Research, and TRW.
FROM ZERO TO HERO
IN 205 DAYS!
Michael Wasielczyk, T. Rowe Price
2016 STAREAST Conference
May 4, 2016
TOPICS
– Introduction
– What Contributed to Success?
– Results
– Summary
Introduction
Day 1:
– No team
– No defined test practice/processes
– Corporate Culture: Developers and BAs test
– No Test Environment
– No experience in managing a Data Warehouse test effort
– Software Development Practices absent (Requirements Management, Design, Documentation, etc.)
– V1.0 Deployment Date cannot change
“Can you be ready to test something in a month?”
What Contributed to Success?
Senior Management Support
Best Practices
Guiding Principles (Based on Experience)
“Pick Your Battles”
Team
What Contributed to Success?
Senior Management Support
– Buck the Corporate Culture
  Independent Testing Team
  Testing Focused vs. Business Knowledge Focused
– Partnering with Business Analysts – necessary and valuable
– Test Environment
  Convince management and support organizations of the need/value
Historical Test Phases, Environments and Owners

Test Phase                       Environment   Owner
Unit Test                        DEV           Development
System Integration Test (SIT)    DEV           Development/Business Analysts
User Acceptance Test (UAT)       QUAL          Business Users

(Release to Production follows UAT)
Current Test Phases, Environments and Owners

Test Phase                       Environment   Owner                   Output
Unit Test                        DEV           Development             -
System Integration Test (SIT)    SIT           Independent Test Team   SIT Test Summary and Metrics
User Acceptance Test (UAT)       QUAL          Business Users          UAT Test Summary and Metrics
Performance Test                 QUAL          Independent Test Team   Performance Test Results

(Release to PROD follows UAT and Performance Test)
What Contributed to Success?
Best Practices
– Test Planning/Strategy
– Test Automation
– Performance Testing
– Metrics/Reporting
– Lessons Learned/Process Improvement
Test Planning/Strategy

T1 – Source to CIM
– Verify the format of the feed files from source systems per information provided in requirements documents
– Validate corresponding trigger/checksum files have been received
– Validate header records in the Source Feed file for filename, timestamp, record count, and column names

T2 – IDW Ingestion (IDH Layer)
– Verify the CIM/CMM file format (i.e., message format) is consistent with baselined specifications
– Process via Direct and Indirect Adapters
– Verify integrity of the source data is retained

T3 – Landing Zone
– Validate data load into Landing Zone tables
– Verify definition of Landing Zone tables
– Verify the count of records in the Landing Zone is consistent with the count of records from the source
– Verify integrity of the source data is retained

T4 – Staging Area to Warehouse
– Validate that the source data has been loaded into the Warehouse per rules specified in functional specifications
– Verify integrity of the source data is retained
– Verify record counts based on batch processing of data loads from the Landing Zone
– Validate scenarios that involve rejection of the data
– Validate basic data audit requirements
– Validate records with Record Types (N, U, D)

T5 – Warehouse to Publications
– Validate the Warehouse data is distributed through Publications per the desired format and criteria
– Validate the standard publication file format
– Validate the record count per the filter criteria
– Validate generation of the checksum file
– Validate columns in the publication file
– Validate data per the mapping with Publication views
– Verify integrity of the source data is retained
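To make the T1-style checks concrete, here is a minimal sketch in Python of a feed-file validation. The pipe-delimited layout, header fields, column names, and MD5 checksum convention are assumptions for illustration only, not the actual IDW feed formats.

import csv
import hashlib
from pathlib import Path

# Hypothetical header layout: filename|timestamp|record_count, followed by a
# row of column names. The real IDW feed format is not published in this deck.
EXPECTED_COLUMNS = ["entity_id", "price", "currency", "as_of_date"]  # assumed

def validate_feed(feed_path: Path, checksum_path: Path) -> list[str]:
    """Run T1-style checks on a source feed file; return a list of failures."""
    failures = []

    # Check 1: the trigger/checksum file must have arrived with the feed.
    if not checksum_path.exists():
        failures.append(f"missing checksum file: {checksum_path}")
    else:
        expected = checksum_path.read_text().strip()
        actual = hashlib.md5(feed_path.read_bytes()).hexdigest()  # assumed MD5
        if actual != expected:
            failures.append(f"checksum mismatch: {actual} != {expected}")

    with feed_path.open(newline="") as f:
        rows = list(csv.reader(f, delimiter="|"))
    header, columns, records = rows[0], rows[1], rows[2:]

    # Check 2: the header record must carry the right filename and record count.
    if header[0] != feed_path.name:
        failures.append(f"header filename {header[0]!r} != {feed_path.name!r}")
    if int(header[2]) != len(records):
        failures.append(f"header count {header[2]} != {len(records)} records")

    # Check 3: column names must match the requirements document.
    if columns != EXPECTED_COLUMNS:
        failures.append(f"unexpected columns: {columns}")

    return failures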
How did the Test Plan/Strategy contribute to our success?
Development work was sequential in nature:
– Requirements were developed for source systems (external) for data feeds, then reviewed and approved by the source system
– Source systems (external) developed the data feeds and delivered files to the Data Hub team (driven by source system team availability)
– Once a test file was received and verified, the Data Hub team developed adapters to ingest the data and land it in the Landing Zone (LZ)
– Once data was landed in the LZ, the Data Warehouse team developed the Universal Load Controllers (ULCs) to load the data into the Warehouse
Result of the Development Process?
The Perfect Storm!

Entities tracked per release: Pricing/FX, Issuer Rating, Corp Action, HiNet SRD, Transaction, Party, Position, Classification, Portfolio/Acct, Eagle SRD Sec Ref, Other Entity

Release 1 (2 cycles, 10/14/14 – 11/04/14): 1 entity covered
Release 2 (2 cycles, 11/28/14 – 12/19/14): 4 entities covered
Release 3 (3 cycles; details below):
– Cycle 1 (01/13/15 – 02/09/15): 6 entities covered; Additional Functionality: Table Driven Queue Processing; Regression Testing
– Cycle 2 (02/13/15 – 03/06/15): 7 entities covered; Additional Functionality: Exception Handling / Diagnostics (IDH); Monitoring; Regression Testing
– Cycle 3 (03/11/15 – 03/27/15): 2 entities covered; Additional Functionality: Exception Handling / Diagnostics (IDW); Data Services; Pricing Publications; Multiple Change Controls; HiNet File Changes; Required Defect Fixes; Regression Testing

Majority of functionality deployed at the latest point in the project schedule
Project Timeline (09/02/2014 – 03/29/2015)

09/02/2014 – Started Work at TRP
10/14/2014 – Started Testing – Bld 1
11/04/2014 – Completed Testing Bld 1 (1.6% of planned sourced attributes)
11/10/2014 – Test Automation Begins
11/28/2014 – Started Testing – Bld 2 (including Automation)
12/19/2014 – Completed Testing Bld 2 (11.4% of planned sourced attributes)
12/23/2014 – 1st Performance Test
01/06/2015 – SIT Environment Available
02/09/2015 – Completed Testing Bld 3 Cyc 1 (39.1% of planned sourced attributes)
03/06/2015 – Completed Testing Bld 3 Cyc 2 (97.9% of planned sourced attributes)
03/27/2015 – Bld 3 Cyc 3 Testing Complete
03/29/2015 – IDW 1.0 in Production
Test Automation
Introduced automation after the first test release in November
– Approach: Automate new functionality from the previous test release, in the time we had
Framework developed and automation driven using HP Unified Functional Testing (UFT)
– Data-Driven, Modular Framework Implementation
By Deployment:
– The team achieved 64% test automation of all test cases written for the first two releases turned over to test
Current State of Automation:
– Smoke Test Automation (Database Schema Validation, Source Feed File Processing)
– Regression Test Automation: automation in place for all T1, T2, T3, and T5 test cases (~3,200+ manual test cases)
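The deck does not include framework code, but the data-driven, modular idea can be sketched roughly as follows (in Python rather than UFT's VBScript; the datasheet columns and the record-count module are hypothetical examples, not the team's actual checks):

import csv
import sqlite3  # stand-in connection; the actual warehouse platform is not named here

def check_record_count(conn: sqlite3.Connection, table: str,
                       expected: int, test_id: str) -> None:
    """One reusable module: source-to-target record count reconciliation."""
    (actual,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    status = "PASS" if actual == expected else "FAIL"
    print(f"{test_id}: {table} expected={expected} actual={actual} -> {status}")

def run_data_driven_checks(datasheet: str, conn: sqlite3.Connection) -> None:
    """Each row of the datasheet drives one execution of the same module.
    That is the core of a data-driven framework: new test cases are new
    data rows, not new code."""
    # Hypothetical datasheet columns: test_id, target_table, expected_count
    with open(datasheet, newline="") as f:
        for row in csv.DictReader(f):
            check_record_count(conn, row["target_table"],
                               int(row["expected_count"]), row["test_id"])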
Automation Benefits Received

Focus Area                       Manual Execution Time (min)   Automated Execution Time (min)   Time Saved (min)   Time Saved (%)
T1 – Source to CIM               18.00                         0.40                             17.60              97.74%
T2 – IDW Ingestion               30.00                         3.00                             27.00              90.00%
T3 – Landing Zone                7.56                          0.36                             7.20               95.23%
T4 – Staging Area to Warehouse   9.00                          0.03                             8.97               99.66%
TOTAL                            64.56                         3.79                             60.78              94.12%

Identified savings based on Pricing/FX test cases
Performance Testing
No performance requirements
No GUI or Consumers
Focus:
– Establish the performance baseline for the IDW architecture by processing multiple entity source data feed files of varying record sizes through the system architecture to evaluate the following metrics:
  CPU Utilization
  Data Feed Processing Time (Ingestion through Landing)
  I/O Monitoring
Value added:
– Bottlenecks identified
– Re-architecture of the Solution (V1.2)
– Comfort level that future consumer business processes would not be impacted
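As a rough sketch of how one baseline run could be instrumented (assuming the Python psutil package; the deck does not name the monitoring tools actually used):

import time
import psutil  # assumed tooling; the deck does not describe the monitoring stack

def baseline_run(process_feed, feed_file: str) -> dict:
    """Process one feed file and record the metrics the team baselined:
    CPU utilization, end-to-end processing time, and I/O volume."""
    disk_before = psutil.disk_io_counters()
    psutil.cpu_percent(interval=None)          # prime the CPU counter
    start = time.perf_counter()

    process_feed(feed_file)                    # ingestion through landing

    elapsed = time.perf_counter() - start
    cpu = psutil.cpu_percent(interval=None)    # average since the primed call
    disk_after = psutil.disk_io_counters()
    return {
        "feed": feed_file,
        "processing_time_s": round(elapsed, 2),
        "cpu_percent": cpu,
        "read_mb": (disk_after.read_bytes - disk_before.read_bytes) / 1e6,
        "write_mb": (disk_after.write_bytes - disk_before.write_bytes) / 1e6,
    }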
Metrics/Reporting
Daily Status Email
– Environment Availability
– Test Execution Status (Textual and Tabular)
– Test Execution (Plan vs. Actual)
– List of Defects Found that Day
– Count of Defects by Severity and State
– Defect Age by Priority
– Defect Estimate vs. Actuals
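A minimal sketch of assembling such a daily status email (the defect fields, addresses, and mail relay below are hypothetical; the team's actual reporting tooling is not described in the deck):

import smtplib
from collections import Counter
from email.message import EmailMessage

def build_daily_status(defects: list[dict], executed: int, planned: int,
                       env_available: bool) -> str:
    """Assemble the textual portion of the daily status email."""
    found_today = [d for d in defects if d["found_today"]]   # hypothetical field
    by_severity = Counter(d["severity"] for d in defects)
    by_state = Counter(d["state"] for d in defects)
    return "\n".join([
        f"Environment available: {'Yes' if env_available else 'No'}",
        f"Test execution: {executed} of {planned} planned tests executed",
        f"Defects found today: {len(found_today)}",
        "Defects by severity: " + ", ".join(f"{k}={v}" for k, v in sorted(by_severity.items())),
        "Defects by state: " + ", ".join(f"{k}={v}" for k, v in sorted(by_state.items())),
    ])

def send_status(body: str) -> None:
    msg = EmailMessage()
    msg["Subject"] = "SIT Daily Status"
    msg["From"] = "test-team@example.com"       # placeholder address
    msg["To"] = "stakeholders@example.com"      # placeholder address
    msg.set_content(body)
    with smtplib.SMTP("localhost") as server:   # assumed local mail relay
        server.send_message(msg)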
Test Execution (Plan vs. Actual)

[Chart: Test Execution Plan vs. Actuals – IDW Maintenance Release 1. Cumulative test executions over time; series: Actual, Plan - Blocked, Plan, Linear (Actual)]
Defect Counts

[Pie chart: All Defects Found in V1.2 by Severity. Segments: 30 (64%), 11 (23%), 4 (9%), 2 (4%). Categories: 01-Crash or Data Loss, 02-Functional-No Workaround, 03-Functional-Workaround, 04-Cosmetic]

[Pie chart: All Defects Found in V1.2 by State. Segments: 29 (60%), 8 (17%), 5 (11%), 2 (4%), and four of 1 (2%) each. States: New, Pending Assignment, Fixed, In Development, Ready for SIT, Closed, Verified in SIT, Rejected]
Defect Age by Priority

[Bar chart: Defect Age Graph – Grouped by Priority. X-axis: Defect Age (0D, 3D, 4D, 6D, 2W); Y-axis: Number of Defects (0 to 35); series: 01-Urgent, 02-High, 03-Medium, 04-Cosmetic]
Defect Estimate vs. Actuals

[Line chart: V1.2 Defect Estimate vs. Actuals. X-axis: Date (10-Sep through 1-Oct); Y-axis: Count (0 to 60); series: Actual, Estimate, Linear (Actual)]
Metrics/Reporting (cont.)
Test Closure Report
– Overall Test Status, Key Risks/Issues
– Exit Criteria Summary
– Test Execution Results
– Defect Analysis (including Open Defects and Resolution Plan)
– Environment Availability Summary
– Value Delivered/Lessons Learned
Lessons Learned/Process Improvement
“Constructive Dissatisfaction”
Focus on the entire software lifecycle
Driven by the Test Team; ownership transferred to the PMO
Test team perspective: it was easier to identify/recognize improvements than to identify what was working
What Contributed to Success? (cont.)
Guiding Principles (Based on Experience)
– Transparency
– Process-Oriented
– Accountability
– Flexibility
“Pick Your Battles”
Battles Fought:
– Documentation
– Defect Management
– Release Notes
– Defect Fix SLA
Battles Ignored (or Fought Less Vigorously):
– Requirements Management/Change Management
– Requirements Traceability
Meeting Defect Fix SLAs
Incorporated Fix Complexity into the SLA Calculation (see the sketch after this list)
Positive Influence:
– Defects were fixed faster as deployment approached
Negative Influences:
– Amount of development work
– Number of defects that were reopened (fixed, and then not fixed in the next build)
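The deck does not spell out the SLA formula; a hedged sketch of what folding fix complexity into the calculation might look like follows, with base targets and multipliers invented purely for illustration:

from datetime import datetime, timedelta

# Invented values for illustration; the actual SLA targets are not in the deck.
BASE_SLA_DAYS = {"01-Urgent": 1, "02-High": 3, "03-Medium": 5}
COMPLEXITY_MULTIPLIER = {"Low": 1.0, "Medium": 1.5, "High": 2.0}

def sla_met(priority: str, complexity: str,
            opened: datetime, fixed: datetime) -> bool:
    """A defect meets its SLA if the fix lands within the base target for
    its priority, scaled by the complexity of the fix."""
    allowed = timedelta(days=BASE_SLA_DAYS[priority]
                        * COMPLEXITY_MULTIPLIER[complexity])
    return fixed - opened <= allowed

# Example: a High-priority, High-complexity defect gets 3 * 2.0 = 6 days.
print(sla_met("02-High", "High",
              datetime(2015, 2, 1), datetime(2015, 2, 5)))  # True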
Meeting Defect Fix SLAs (cont.)

Release   Cumulative Success
1         36% (4 of 11)
2         41% (11 of 27)
3         49.6% (75 of 151)

Priority      Release 1      Release 2      Release 3
01 – Urgent   0% (0 of 2)    0% (0 of 3)    50% (4 of 8)
02 – High     25% (1 of 4)   20% (3 of 15)  44% (41 of 94)
03 – Medium   60% (3 of 5)   89% (8 of 9)   61% (30 of 49)
Meeting Defect Fix SLAs (cont.)

[Line chart: Defect fix turnaround, 01/06/2015 through 04/06/2015; series: Turnaround, Linear (Turnaround)]

As expected, Defect Fixes were turned over quicker the closer we got to Deployment
What Contributed to Success?
The Test Team (4.5 local; 6 off-shore)
– Dedication: 9-10 hour workdays
– Flexibility: weekends as needed
– Innovation: creation of a Test Case Generator
– Communication: early morning / late evening hand-off meetings
RESULTS
Successful Deployment of IDW v1.0
Test design and test execution activities started in early September 2014 for the IDW v1.0 implementation
Timeframe: September 2014 – March 2015

– A total of 4,560 test cases were created (T1 – T4)
– A cumulative total of 7,571 test case executions were performed
– A 90% average pass rate was observed across all test cycles
– Only 6% of defects were rejected due to testing errors
– The team achieved 64% test automation of all test cases written for the first two releases turned over to test
  • Test Release 1 – 73% automation
  • Test Release 2 – 54% automation
– Created 311 test cases for Performance Testing; executed 275 test cases in two performance test cycles
– Overall, 96.9% of attributes were landing correctly in the Warehouse
– 42 defects/enhancements deferred:
  • 22 Coding Defects
  • 6 Documentation Defects
  • 9 Enhancements
  • 4 Environment/Configuration Issues
  • 1 Source Data Issue
– No Urgent Priority Defects open at Deployment
SUMMARY
Summary
When faced with a difficult project assignment, it's important to utilize the knowledge gained in previous assignments:
– Best Practices (aka, What Worked)
– Your Guiding Principles (Steadfast, Unwavering)
– What are the things (processes, tools, etc.) worth fighting for?
(BONUS: Having Senior Management Support and a Great Team Doesn't Hurt!)
THANK YOU
Bio Page
Michael Wasielczyk, Manager, Quality Assurance and Testing, Investment Data Systems

Michael Wasielczyk, CTFL, has 30 years of experience in the areas of Software Testing, Quality Assurance, Metrics, Process Improvement (CMMi), and Project Management. In his current assignment, Michael is responsible for establishing the testing practice and building an organization to support System Integration Testing for all Investment Data Systems at T. Rowe Price. Prior to T. Rowe Price, Michael spent fourteen years at UPS, responsible for the successful implementation of the software testing practice across UPS’ Customer Technology Portfolio, and fourteen years as a test analyst/test manager for multiple companies (CSC, Nichols Research, and TRW).

E-mail: [email protected]