TRANSCRIPT
Reducing Project Risk Through Comprehensive Testing Planning
Lisa Newell
Stephanie Hanko
November 14, 2014
Agenda
Introductions (Lisa Newell & Stephanie Hanko)
Overview: The Testing Planning and Execution Life Cycle (Stephanie Hanko)
Project Initiation: High-Level Testing Strategy (Stephanie Hanko)
Project Planning: Detailed Testing Strategy (Lisa Newell)
Project Execution: Testing Planning, Design, Training, and Execution (Lisa Newell)
Monitoring and Controlling: Testing Tracking and Progress Monitoring (Lisa Newell)
Project Close-Out: Testing Close-Out (Lisa Newell)
Wrap-Up / Q&A (Stephanie Hanko & Lisa Newell)
Lisa Newell, PMP
Lisa is a Senior Consultant with CSG Government Solutions. She is a PMI-certified PMP and a Six Sigma Green Belt with Lean experience. She holds an ITIL Foundation v3 certification and has completed SCAMPI (Standard CMMI Appraisal Method for Process Improvement) training.
Over 30 years in IT
Project planning and strategy development
System design and implementation
Project assurance, including testing, quality, and IV&V
Business process reengineering and process improvement
Over 20 years of software testing and quality assurance
IV&V Project Manager, Tapestry Project
Stephanie Hanko, PMP
Stephanie is a Senior Practice Adviser in the CSG Government Solutions Consulting Services Practice. She is a PMI-certified PMP.
Over 20 years of experience in systems development, implementation, and oversight
Over 20 years of experience in project management, planning, quality, and policy
Expertise in:
  Project Management Office (PMO)
  Quality Assurance (QA)
  Independent Verification and Validation (IV&V)
The Importance of Testing Cannot Be Overestimated!
Testing reduces project risks and maximizes the likelihood of meeting the system’s intended objectives and the user’s needs.
Testing can take over 30% of the project work effort to complete.
Testing hours are often cut or time compressed because of schedule delays and cost overruns.
Confusion between the State and the system vendor about who will do the testing and how it will be done reduces testing effectiveness and extends testing time.
Overview: The Testing Planning and Execution Life Cycle
Testing Types
Across the software development life cycle, basic types of testing must occur to confirm the system is developed as expected:
Unit Testing: Tests the smallest testable part of a system. When the tests pass, the code is considered complete.
Integration Testing: Tests how the units (software modules) operate when put together. Tests the interface between system components.
System Testing: Tests the software as a whole based on requirements specifications. Final test of the software by the development team.
Stress and Performance Testing: Tests 1) the system benchmarks for performance (e.g., time to load a screen), and 2) how the system recovers when it is overloaded (e.g., increasing numbers of users). Typically done concurrently with User Acceptance Testing.
User Acceptance Testing: Tests how the system works for the user and confirms it meets expected business and technical functionality.
Other types include regression, smoke, data conversion, and interface testing.
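To make the first two levels concrete, here is a minimal sketch of a unit test and an integration test using Python's built-in unittest module; the two functions under test are hypothetical examples, not from the presentation.

```python
import unittest

# Hypothetical units under test (illustrative only).
def parse_amount(text: str) -> float:
    """Smallest testable part: convert user-entered text to a number."""
    return float(text.strip().replace(",", ""))

def apply_benefit_rate(amount: float, rate: float) -> float:
    """A second unit that consumes the output of the first."""
    return round(amount * rate, 2)

class UnitTests(unittest.TestCase):
    def test_parse_amount(self):
        # Unit testing: exercise one unit in isolation.
        self.assertEqual(parse_amount(" 1,250 "), 1250.0)

class IntegrationTests(unittest.TestCase):
    def test_parse_then_apply(self):
        # Integration testing: exercise the interface between the two units.
        self.assertEqual(apply_benefit_rate(parse_amount("100"), 0.5), 50.0)

if __name__ == "__main__":
    unittest.main()
```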
Testing V-Model
[Figure: The Testing V-Model pairs each development phase with the testing level that verifies it: module code with unit testing, application design with integration testing, requirements with system testing, and the business concept with acceptance testing.]
The Testing Planning and Execution Life Cycle
Project Initiation: High-Level Testing Strategy
Expectation Setting: Testing Work Effort
Testing can range from a low of 10% to over 30%* of the total work hours for a project.
The level of testing effort is influenced by:
Project size and complexity
Risks and risk level
Number and type of external interfaces
Developer experience and business knowledge
Quality of requirements and design specifications
Approach to the SDLC (software development life cycle, e.g., waterfall, iterative, Agile)
Previous testing experience
(*CDC, 2007: 20-30%; Gartner, Oct 2006: 10-35%; stackoverflow.com blog posts, 2009-13: 10-50%)
Expectation Setting: Initial Level of Effort Estimate
The complexity of the system impacts the level of effort needed for testing. Complexity factors include:
Complete system replacement
Enhancement of certain components of a current system
Development of a completely new system
Use of a commercial-off-the-shelf (COTS) product
Number and complexity of functional requirements
Number and complexity of technical requirements
Type and complexity of the required technical architecture

Several high-level approaches can assist with determining the testing (and overall project) level of effort:
Using a Request for Information (RFI) or asking vendors to demonstrate their products
Discussing with peer agencies in other states that are developing a similar system
Expectation Setting: Assumptions and Constraints
Each project has a set of assumptions or constraints that set boundaries. These boundaries impact testing.
Project constraints that can impact testing include:
A mandatory implementation deadline
Limited resource availability
Reliance on unallocated funding
The source, amount, and requirements tied to available funding

Project assumptions that can impact testing include:
The vendor is providing the system testers.
The vendor is preparing the test environments and data.
The vendor is providing automated testing tools.
The vendor is providing traceability from requirements to testing artifacts.
The vendor is writing test cases for system testing.
The State is writing test cases for user acceptance testing.
Expectation Setting: Roles and Responsibilities
Type of Testing | Responsibility | Role
Unit | System Vendor | Developer
Integration | System Vendor | Business Analyst
System | System Vendor | Business Analyst
User Acceptance | State Organization with support from System Vendor | Subject Matter Expert (SME)
Stress and Performance | System Vendor with support from the State Organization | Technical Staff
Expectation Setting: State Resources
State staff are needed for testing activities across the software development life cycle, including:
Developing the State testing strategy and plan
Providing oversight of vendor testing efforts
Managing the State's testing efforts
Reviewing and confirming vendor testing plans, defect criteria and prioritization, testing reports, and progress
Creating user acceptance test cases, scenarios, and conditions
Training State staff to perform user acceptance testing
Conducting user acceptance testing
Backfilling job duties

State infrastructure is needed for conducting testing, such as:
Setting up a separate testing lab
Setting up separate system testing environments, including legacy environments
Determining and setting up testing equipment (e.g., rooms, desks, computers, scanners)
Expectation Setting: System Vendor Parameters
It is important to understand the variety of automated testing tools a vendor may use. Whether and how they are used can impact project costs, resources, and testing efficiency and reliability. For example, tools can be used for:
Requirements traceability
Defect tracking and reporting
Test case execution
Stress or performance testing

A vendor's control of how changes are made to a system (i.e., configuration management) impacts the reliability of the system and needs to be defined and understood. For example:
What environments for system development and testing will be set up and used?
What processes will the State require the vendor to use (and does the State have a required process for configuration management)?
Putting It All Together: The Procurement Process
The key elements from the high-level testing strategy (i.e., level of work effort, roles and responsibilities, system vendor parameters) are used in the system vendor procurement to:
Define the requirements for testing
Determine the evaluation criteria and weighting to assess the vendor’s testing approach and methodology
Determine the reasonableness of the system vendor’s work effort estimates
Develop the questions asked of a system vendor’s references
Project Planning: Detailed Testing Strategy
Detailed Testing Strategy: Refinement Begins
Once a system vendor is engaged, the high-level testing strategy is refined based on the vendor's response, contract, evaluation results, and responses to reference questions. For example:
Have any testing assumptions or constraints changed?
What SDLC methodology is the vendor using, and how will that impact the testing strategy?
Have there been changes in the roles and expectations for system or user acceptance testing?
Did the testing level-of-work estimates increase or decrease? How? Why?
What is the vendor's testing strategy and schedule? What is the strategy for working with the vendor to understand and be engaged in the testing they will do?
What automated tools will be used, and for what?
Detailed Resource Planning
Developing a detailed testing strategy includes confirming the specific State resources needed for the entire testing planning and execution life cycle. This includes determining:
Who the best SMEs are for the project
Whether the SMEs will function as business analysts, testers, or both
Whether the agency will hire a contractor to assist with testing
Whether the SMEs will be available for the entire project
The training they need to learn to be testers
How much of their work time will be needed
The plan to back-fill the job duties of the testers
The skills the back-fill individuals need
Validating Resource Estimates
A key component of the detailed testing strategy is to review and validate the resource estimates developed for the work effort by both the State and the system vendor to:
Level-set expectations between the State and system vendor on what it will take to get the testing work done
Confirm the testing work activities, work hours, timeline, and number of resources needed to support the overall project

Testing resource constraints also need to be considered, such as:
Availability of resources (e.g., full-time for a short period vs. part-time for a longer period)
Vacations, holidays, or other planned time off
Schedule delays and their impacts on the testing schedule
Validating Resource Estimates: Methodologies
The choice of resource estimation methods depends on the SDLC approach and the estimator's experience.

Estimating methods range from best guess and experience to formulaic methods, such as:
Task Decomposition/Delphi Technique
Three Point Estimation (illustrated below)
Development Effort Percentage
Distribution Percentage
Testing Point Analysis
Use Case Point Estimation
Adjust the baseline project schedule based on the final resource estimates and manage to that schedule.
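As one illustration, Three Point Estimation is often computed as the PERT weighted average of optimistic, most likely, and pessimistic estimates; the sketch below assumes that form, and the sample figures are hypothetical.

```python
def three_point_estimate(optimistic: float, most_likely: float, pessimistic: float) -> float:
    """PERT weighted average: E = (O + 4M + P) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

# Hypothetical example: hours to write and execute one test scenario.
print(three_point_estimate(optimistic=4, most_likely=6, pessimistic=14))  # 7.0
```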
Finalizing the Test Strategy
The detailed testing strategy lays the foundation for test plans and provides information to address:
Testing scope
Testing approach based on the SDLC methodology
Level of work effort
Resource expectations, roles, and responsibilities
Assumptions and constraints
Project Execution: Testing Planning, Design, Training, and Execution
Project Execution: Testing Planning
There should be a test plan for each level of testing.
Test plans build on the detailed testing strategy.
Test plans elaborate the tactics and logistics of each level of testing.
Test plans define how you will do the testing.
Test Plans For Each Testing Level
Test plans define the following:
Test artifacts – The test cases/scenarios, scripts, and conditions that will be developed and used for the testing level.
Pre-conditions – What must be in place for testing to occur (e.g., testing environment established, artifacts developed).
Entrance and exit criteria – What must occur for the testing level to start and end (e.g., system testing is completed and all known critical defects are repaired before user acceptance testing begins).
Schedule – The testing tasks and the time allotted for completing them.
Resources – The testing roles, who will fulfill them, and their experience level.
Defect management – How defects are defined, and the process for how defects are identified, monitored, and tracked.
Defect Management: Definitions
A critical aspect of testing planning is defining with the system vendor what constitutes a defect:
What is a defect vs. an enhancement/change request?
What are the defect severity levels and how are they defined? For example:
  Critical (show stopper): Testing cannot proceed until the defect is fixed.
  High: Important to resolve, but testing can continue.
  Medium: A workaround is available.
  Low: Functionality is not perfect but is not impairing anything.
  Cosmetic: e.g., color changes needed, a report that needs to be adjusted.
What severity levels need to be resolved before moving to a new testing level?
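One way to keep these definitions unambiguous is to encode the agreed severity levels, and the gate for advancing to the next testing level, directly in a script or tracker. A minimal sketch, assuming the five levels above and a hypothetical rule that High and Critical defects must be resolved first:

```python
from enum import IntEnum

class Severity(IntEnum):
    # Ordered so that numeric comparison reflects urgency.
    COSMETIC = 1
    LOW = 2
    MEDIUM = 3
    HIGH = 4
    CRITICAL = 5  # show stopper: testing cannot proceed

def ready_for_next_level(open_defects, gate=Severity.HIGH):
    """True when no open defect is at or above the agreed gate severity."""
    return all(d < gate for d in open_defects)

# Hypothetical open-defect list at the end of system testing.
print(ready_for_next_level([Severity.MEDIUM, Severity.LOW]))       # True
print(ready_for_next_level([Severity.CRITICAL, Severity.MEDIUM]))  # False
```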
Defect Management: Process
A key component of defect management is setting up the processes for resolving defects, including:
Developing a process to prioritize defects (i.e., what gets worked on first), which includes considering:
  What severity level a defect has
  Whether another defect depends on this one being repaired
Understanding the system vendor's defect triage process (i.e., what happens before a defect is routed for resolution), including:
  Reviewing, validating, and reproducing the defect
  Assessing whether the defect is a duplicate
  Reviewing the documentation for understandability and thoroughness
  Using the triage process to improve how defects are identified and documented
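A prioritization rule that weighs both considerations (severity first, then whether other defects are waiting on the repair) can be sketched as follows; the Defect record and the ordering are illustrative assumptions, not the vendor's actual process.

```python
from dataclasses import dataclass, field

@dataclass
class Defect:
    defect_id: str
    severity: int                                # e.g., 5 = Critical ... 1 = Cosmetic
    blocks: list = field(default_factory=list)   # ids of defects waiting on this one

def prioritize(defects):
    """Highest severity first; ties broken toward defects that block others."""
    return sorted(defects, key=lambda d: (d.severity, len(d.blocks)), reverse=True)

queue = prioritize([
    Defect("D-101", severity=3),
    Defect("D-102", severity=5),
    Defect("D-103", severity=3, blocks=["D-101"]),
])
print([d.defect_id for d in queue])  # ['D-102', 'D-103', 'D-101']
```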
Project Execution: Testing Design
Analyze requirements to determine how to evaluate the implementation of each:
Criticality – Identify the critical business functions the requirement must satisfy.
Risk – Identify the consequences of not meeting those functions.

Develop test artifacts (cases, scenarios, scripts, and conditions) to make sure the software satisfies the requirements.
Cross-check test artifacts for consistency and quality using reviews and walkthroughs.
Ensure traceability so each requirement has been satisfied (requirements to software features to test artifacts and back to requirements).
Document testing processes and procedures.
Configure testing environments and automated testing tools.
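Once requirements and test artifacts are cross-referenced, traceability gaps can be found mechanically. A minimal sketch, with hypothetical identifiers:

```python
# Hypothetical traceability data: requirement id -> covering test cases.
traceability = {
    "REQ-001": ["TC-001", "TC-002"],
    "REQ-002": ["TC-003"],
    "REQ-003": [],  # gap: no test artifact covers this requirement
}

uncovered = [req for req, cases in traceability.items() if not cases]
print(uncovered)  # ['REQ-003']
```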
Testing Training and Support
Understand and plan for the training needed for both the system vendor staff and the State staff.

Train the testers not only in how to test but also in how the new system works (i.e., there may be differences between the old and new systems).

Ensure the testers understand the testing process and terms, such as:
What is a defect?
How do you report a defect?
What level of information do you need to document a defect?
What kind of detail is needed so a person can correct the defect?
Leverage the user training materials as reference materials when the testers are doing the testing.
Plan for the support of the business analyst team as well as the developer team during system and user acceptance testing.
Project Execution: Testing Execution
Execute the testing process and procedures using written process documents and process flows.
Log each test executed and whether it passed or failed.
Document defects in detail.
Triage, prioritize, and assign for follow-up.
Monitor the testing process to ensure adherence.
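A minimal sketch of the execution log these steps describe, one record per test run; the file name and fields are illustrative assumptions.

```python
import csv
import datetime

# Log each test executed, who ran it, the result, and any defect raised.
with open("test_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["date", "test_case", "tester", "result", "defect_id"])
    today = datetime.date.today().isoformat()
    writer.writerow([today, "TC-014", "SME-3", "FAIL", "D-102"])
    writer.writerow([today, "TC-015", "SME-3", "PASS", ""])
```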
Monitoring and Controlling: Testing Tracking and Progress Monitoring
Testing Tracking and Reporting
State staff need to understand, track, and monitor all aspects of the system vendor testing. This includes:
Visibility into the system vendor's testing progress
Working with the vendor to define defect levels
Working with the system vendor to determine and track defect priorities
Tracking the State's own testing progress, including:
  Test plan development and finalization
  Resource allocation and environment set-up
  User acceptance training and test case development
  Testing progress and risk management
Testing Metrics and Progress Monitoring
It’s important to design metrics to evaluate the health and productivity of the testing process. For example:
Number of test artifacts developed
Number and percent of test scenarios tested, passed, and failed
Number and percent of test scripts tested, passed, and failed
Repair time per defect
Number of cycles per defect to resolution
Number and severity of defects per test scenario and test script
Number and severity of defects per requirement
Number and severity of design defects
Number and severity of software defects
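Several of these metrics reduce to counts and percentages over the execution log; a minimal sketch with hypothetical results:

```python
from collections import Counter

# Hypothetical results for executed test scripts.
results = ["PASS", "PASS", "FAIL", "PASS", "FAIL", "PASS"]

counts = Counter(results)
executed = len(results)
print(f"executed: {executed}")
print(f"passed:   {counts['PASS']} ({100 * counts['PASS'] / executed:.0f}%)")
print(f"failed:   {counts['FAIL']} ({100 * counts['FAIL'] / executed:.0f}%)")
```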
Testing Metrics and Progress Monitoring
Metrics should be appropriate for the complexity of the project.
Reassess the value of metrics as testing progresses:
Eliminate metrics of little value.
Refine metrics as needed.
Add metrics based on the needs of the project.

Metrics reports and graphs should provide an instant snapshot of progress.

Align metrics with testing levels, goals, and objectives:
Metrics should be reasonable for the level of testing.
Performance criteria and requirements need different sets of metrics.
Schedule checkpoints with the vendor throughout the testing process to ensure visibility into that process, confirm the status of testing, and confirm traceability from requirements to testing artifacts.
Project Close-Out: Testing Close-Out
The State needs to ensure test activities are complete and repositories are current:
Traceability is complete.
All test cases have passed.
All exit criteria for user acceptance testing have been met.
Testing artifacts have been archived.
The testing process has been documented and is reusable for defect repair, enhancement testing, and implementation.
Document lessons learned.
Wrap-Up / Q & A
Wrap-Up
Plan early and often! Early planning reduces overall project risk.
Early strategizing for the testing effort helps ensure the State gets what is needed in the procurement process.
Testing must be controlled and managed from the beginning of the project.
The State must be engaged in the vendor’s testing process from the very beginning and throughout.
The State and vendor must agree on key points such as defect definitions, testing-level entrance and exit criteria, the defect management process, metrics, and defect prioritization.
Level-set expectations of the system vendor and staff from the beginning and throughout the project.
Think through resource estimates and needs, then review and refine throughout the testing planning and execution life cycle.
Wrap-Up
Lisa Newell
(505) 259-0569
Stephanie Hanko
(602) 206-3197
Wrap-Up
Questions?