
Independent Project Monitoring

Friday, November 14, 2014

Agenda

Agenda/Discussion Topics
• Welcome and Introductions
• Why Might Projects Need Independent Monitoring?
• Does QA/IV&V Really Help?
• QA vs. IV&V
• Roles and Differentiators for Implementation, Quality Assurance, and IV&V Teams or Vendors
• Qualities of a Good QA/IV&V Function
• Lessons Learned from IV&V
• Q&A

With You Today

Eric Applewhite, PMP, Director

Jack Umansky, PMP, Director

William Bangs, PMP, Manager

Why Might Projects Need Independent Monitoring?

The Challenges of IT Projects

“72% of all IT Projects are late, over budget, lack functionality, or are never delivered as planned.” – Meta Group

“Only 25 percent of telecom IT projects succeed, a rate that is worse than IT projects across all industries, where 28 percent succeed.” – Standish Group

“…the odds of a FORTUNE 500 software development project coming in on time and on budget are unfavorable – only one in four succeed.” – Standish Group

The Project Management Profession Lives in Troubled Times…

“75% of 1,450 firms surveyed exceeded their IT deadlines and more than 50% of them substantially exceeded their budgets. Key reasons were poor planning, inadequate studies on how IT projects relate to the firm’s needs, and lack of management support.” – KPMG Survey

Why?

Why Do Projects Need QA and IV&V?

Project Outcomes (2000 – 2010)

Year        2000  2002  2004  2006  2008  2009  2010
Succeeded    28%   34%   29%   35%   32%   32%   37%
Challenged   49%   51%   53%   46%   44%   44%   42%
Failed       23%   15%   18%   19%   24%   24%   21%

Source: Standish Group CHAOS Manifesto 2012.

Recent IT Project Failures at State and Local Governments

• For Child Support Enforcement systems nationally, 14 states have been penalized for lack of certification. From 1988 to 1998, 7 out of 54 state projects failed, at a cost to taxpayers of more than $350 million. California paid $1 billion for two failed implementations and over $1 billion in additional penalties before it succeeded on its third attempt. South Carolina has paid over $50 million to date and will likely have paid close to $100 million by the time its system is certified.

• Implementation of a new $55 million payroll system for a large urban school district resulted in $53 million in overpayments to teachers, months of non-payments, and additional costs of $35 million for repairs and delays.

• A Southern state planned a $68 million construction project of two data centers located on the floodplain of a dam determined by the Army Corps of Engineers to be at risk of failure.

• With the $100 million price tag already spent, a state’s enterprise financial management system project was indefinitely suspended after an independent assessment determined that the state had failed to adhere to industry best practices for funding, planning, and implementation.

• A $171 million five-year project to build and run a state’s Medicaid claims system is behind schedule and bogged down in litigation, with the state blaming the vendor for unacceptable work product and the vendor blaming the state for inefficient project management and lack of knowledge about the current system.

• Nearly half of eight months’ worth of Medicaid fraud documents were destroyed and scores of prosecutions were compromised when the outsourcing vendor for one of the largest states failed to follow file backup procedures.

• A large Northeastern city has yet to deploy a new billing system for its half-million water customers after spending over three years and $18 million, more than twice what it expected to pay.


Why Make an Investment in Some Type of Project Assurance?

• Independent, third-party assessment of specific products, activities, and processes
• Subject matter knowledge in individual disciplines
• Insight into leading practices from other areas of the state, from other states, or the private sector
• Partnership with the client to provide informed assessments and recommendations
• The U.S. federal government supports and funds third-party QA for federally funded projects (45 CFR Part 95)
• Jurisdictions require QA for significant projects (e.g., NYC, NASA, California)

Costs of Project Assurance (IV&V, QA, others) can include:
• Fiscal costs
• Time and resource costs to coordinate with assurance teams
• Cultural impacts
• Sponsorship costs

Does QA/IV&V Really Help?

The Quality Assurance (QA) Value Proposition

• A Titan study conducted at NASA determined:
– The use of an Independent Project Oversight can reduce the total ownership costs of the system
– Nearly a two-thirds reduction in Defect Density with full-lifecycle Independent Project Oversight

• A case study conducted by Digital, Finding Defects Earlier Yields Enormous Savings, indicated that for each phase a defect survives, it costs approximately three times as much to fix (see http://www.digital.com/solutions/roi-cs2.php)

• Another case study, conducted by NASA, A Case Study of IV&V Return on Investment (ROI), indicated that the ROI for IV&V on a software development project was between 1.25 and 1.82

• For another project, the IV&V ROI was 11.8 (Estimating Direct Return on Investment of Independent Verification and Validation using COCOMO-II, J.B. Dabney, G. Barber, and D. Ohi)
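The cost-escalation and ROI claims above reduce to simple arithmetic, sketched below. The phase names, base cost, and dollar amounts are illustrative assumptions; only the roughly-3x-per-phase multiplier and the savings-per-dollar ROI definition come from the studies cited.

```python
# Illustrative sketch of the defect-cost and IV&V-ROI arithmetic cited above.
# Phase names and dollar figures are assumptions for illustration only.

PHASES = ["requirements", "design", "build", "test", "deployment"]
ESCALATION = 3  # cost roughly triples for each phase a defect survives

def fix_cost(base_cost, found_in, introduced_in="requirements"):
    """Cost to fix a defect that escaped from one phase to a later one."""
    survived = PHASES.index(found_in) - PHASES.index(introduced_in)
    return base_cost * ESCALATION ** survived

def ivv_roi(rework_avoided, ivv_cost):
    """ROI expressed as dollars saved per dollar spent on IV&V."""
    return rework_avoided / ivv_cost

# A $100 requirements defect not caught until deployment costs 3^4 = 81x more.
print(fix_cost(100, "deployment"))      # 8100
# $1.82M of avoided rework on a $1M IV&V spend corresponds to an ROI of 1.82.
print(ivv_roi(1_820_000, 1_000_000))    # 1.82
```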

The Titan Study argues the ROI figures understate the full value of an Independent Project Oversight as a result of the following factors:
• Watchdog effect – The presence of a QA vendor makes the developer more conscientious and less likely to cut corners
• Improved maintainability – QA reviews improve the accuracy, readability, and general usability of system documentation
• Better understanding of and response to risks – QA offers impartial evaluations and recommendations as to how to proceed when there are difficult alternatives

IV&V and QA – Return on Investment (ROI)

Cost of Correcting Defects in SDLC


Why Projects Need QA and IV&V – A Return?

Source: http://onproductmanagement.net/wp-content/uploads/2010/08/treecomicbig.jpg.


QA vs. IV&V

QA/IV&V Defined

Independent Verification and Validation (IV&V) is often associated with a formal, objective review of project results. It is commonly associated with the IEEE definition for Software Verification and Validation and Quality Assurance (IEEE Standard 1012-2004). The following are IEEE’s definitions of IV&V and Quality Assurance.

Verification: “Ensures that the solution was built (or is being built) according to the requirements and design specifications.”

Validation: “Ensures that the solution actually meets the client’s needs, fulfills its specific intended use, and that the specifications (requirements) were correct in the first place.”

As defined in the IEEE standards, IV&V processes include activities such as assessment, analysis, measurement, inspection, and testing of software products and processes. These IV&V processes further include assessing software in the context of the system, including the operational environment, hardware, interfacing software, operators, and users. The IEEE standard seeks to assure that software IV&V is performed in parallel with software development, not intermittently or at the conclusion of the software development.

Quality Assurance: “Is the process for providing assurance that the products and processes in the project lifecycle conform to their specified requirements and adhere to their established plans.”

Both of these methods are further described on the following pages.


Typical QA Scope
• Observe, analyze, and provide feedback on the project governance process and related sub-processes.
• Review project management, business process, change management, technology, and internal controls.
• Evaluate processes utilized to develop products associated with a project.
• Measure/monitor resultant quality of the project’s outputs (e.g., planning documents, requirements and design documents, incremental releases, test results, training, deployment, and knowledge transfer).
• Support process improvements throughout the project.
• Develop and monitor adherence to quality consistent with leading practices and quality standards.
• Conduct quality-based reviews of project artifacts in a proactive manner (i.e., comment on draft versions before submission in addition to the final review).
• Engage in or actively observe the quality outcomes of behavioral and organizational change management activities.
• Proactively engage and provide regular, real-time assessments, and recommendations for remediation, throughout the project.

Note: QA helps “ensure” quality – but does not provide absolute assurance.

What is Quality Assurance?

Quality Assurance encompasses both of the following:
• Quality Assurance is based on a set of activities designed to evaluate the process by which products are developed.
• Quality Control is based on a set of activities designed to evaluate the quality of products developed.

Even though there are differing definitions across various industries regarding QA and QC, for the purposes of this KPMG methodology, Quality Assurance covers both as defined above.

Quality is achieved when the project and processes are performed:
• According to approved specifications (requirements) and standards
• To stakeholders’ needs and expectations
• On time
• Within budget

Quality standards such as those from the PMBOK and ISO 9000 are commonly used.

What is Independent Verification & Validation?

Verification focuses on the process of providing objective evidence that the application and its associated products conform to requirements (e.g., for correctness, completeness, consistency, accuracy) for all lifecycle activities:
• Throughout each process (planning, acquisition, development, operation, and maintenance)
• To satisfy standards, leading practices, and conventions during lifecycle processes
• To successfully complete each lifecycle activity and satisfy all the criteria for initiating subsequent activities (e.g., building the application correctly)

Validation focuses on the process of providing evidence that the application and its associated products satisfy system requirements at the end of each lifecycle activity:
• Solves the right problem (e.g., implements business rules, uses the proper system assumptions)
• Satisfies the intended use and user needs

Typical IV&V Scope (can also apply to QA)
• Inspect and provide comments/feedback on project management practices and system development life cycle (SDLC) planning artifacts against key quality attributes.
• Verify that the practices used to develop the product and project artifacts conform to industry-leading standards and practices.
• Inspect software and system engineering artifacts (e.g., requirements documentation, system design, technical architecture, system development and configuration, testing, training, implementation, deployment, and system maintenance documentation).
• Provide artifact feedback regarding deficiencies when project documentation does not comply with industry standards or other project-specific references.
• Validate the traceability of project requirements throughout the development lifecycle.
• Provide verification of test results reports via an appropriate method (i.e., hands-on, over the shoulder).
• Provide risk assessments and mitigation recommendations. May be “point-in-time” (e.g., end of SDLC phase, quarterly).
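The “validate traceability” task above is essentially a coverage check: every requirement should map to at least one downstream artifact in each trace matrix. A minimal sketch, with hypothetical requirement and artifact IDs:

```python
# Minimal requirements-traceability check. All IDs below are hypothetical.
requirements = {"REQ-001", "REQ-002", "REQ-003"}

# Trace matrices: requirement -> downstream artifacts (design docs, test cases)
design_trace = {"REQ-001": ["DES-10"], "REQ-002": ["DES-11"]}
test_trace = {"REQ-001": ["TC-100"], "REQ-003": ["TC-101"]}

def untraced(reqs, trace):
    """Return requirements with no (or an empty) entry in a trace matrix."""
    return sorted(r for r in reqs if not trace.get(r))

print(untraced(requirements, design_trace))  # ['REQ-003'] has no design link
print(untraced(requirements, test_trace))    # ['REQ-002'] has no test case
```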


Monitoring versus QA: 3 Differentiators

Key differences can be summarized in three basic areas:
• Independence
– The IV&V provider must usually be more independent of the project team than QA
– The QA provider works at the direction of the project team
• Risk Remediation
– IV&V typically cannot participate in the design or execution of solutions for identified risks and recommended mitigations
– QA can advise and assist project management more directly in risk remediation
• Participation
– IV&V can be conducted either as a periodic assessment or ongoing
– QA is generally involved on an ongoing basis throughout the project lifecycle

What Quality Assurance is NOT

• QA assesses information and provides recommendations based on those assessments:
– The objective is not to find fault with staff; it is to help provide potential improvements in existing processes, alternatives that may not have been recognized or considered, and independent, unbiased views and suggestions
• QA helps “ensure” quality – but does not provide absolute assurance
• QA is not a guarantee of project success – issues in or out of a project’s control may ultimately overtake other best efforts:
– Incorrect or insufficient staffing
– Funding issues
– Lack of timely or appropriate decision making
– Absence of appropriate tools
– Absence of senior management or key stakeholder support
• QA is not “staff augmentation” or an open checkbook:
– QA does not perform a client’s job if the client happens to have insufficient staffing
– QA must perform contracted functions and not “stop-gap” or out-of-scope activities

Implementation Team, Quality Assurance, and Independent Verification & Validation Teams

Roles and Differentiators

4 Major Business/Technology Transformation Roles

Entity: Agency/Program Project Manager
Role: Identify transformation need and strategy; procure and manage outsourced vendors to achieve goals; manage and control the overall engagement; provide critical program and other subject matter expertise; define business and technology requirements; validate that new systems and processes meet requirements. Manage all project resources to deliver the broader scope.
Typical Challenges: May be resource- or skillset-constrained and have to outsource/procure expertise from vendors to deliver the transformation agenda.

Entity: System Integrator or Implementation Team
Role: Define/develop/implement process changes required to deliver outsourced components of the business or technology transformation agenda. Manage and coordinate the activities of its resources to deliver its scope. SI quality efforts focus on ensuring completeness and accuracy of requirements and design through the development phase, not overall project quality, including project management and implementation readiness.
Typical Challenges: Tends to focus only on its own roles and responsibilities without a wider program perspective; tends to treat quality as a testing exercise only; may attempt to pass quality validation onto the Agency/QA/IV&V. Often a delivery organization that does not have, or chooses not to employ, the project control or quality resources necessary to support an agency.

Entity: Quality Assurance Entity
Role: Performed by a contractor in direct support of stakeholders to provide an assessment of risks and issues threatening project success, as well as recommendations for mitigating them (less focus on independence, more focus on designing and executing mitigations). Focuses on both development and management processes.
Typical Challenges: Can be used to remediate risks and issues directly, but may be accidentally or purposefully biased or constrained due to reporting relationships with the project team. Because QA can design and execute, not all activities may be permissible for auditors.

Entity: IV&V Provider
Role: Performed by a contractor in direct support of stakeholders to provide an assessment of risks and issues threatening project success, as well as recommendations for mitigating them (more focus on independent reporting; some or all design and execution activities excluded). Focuses on both development and management processes.
Typical Challenges: Cannot be used to remediate risks and issues, but the reporting structure is built to provide independent and direct access to senior program leadership. Because of stricter independence requirements, IV&V is a permissible service for auditors.

Independent Verification and Validation (IV&V)

Qualities of a Good QA/IV&V Function

Qualities of a Good IV&V Function

A Qualified IV&V Function Should:
• Be Independent and Objective
• Bring an Already Established Approach that is Focused on Risks and Controls
• Have Tool Sets Built Around a Structured, Repeatable Process Based on IEEE Standard 1012-2004 and the Project Management Body of Knowledge (PMBOK)
• Have the Bench Strength to Deploy and Maintain Continuity for a Team Experienced with the Program/Agency or Domain, IV&V, IEEE, Federal Compliance, and Relevant Technology Skills that Can Serve as a Trusted Advisor
• Possess Federal Compliance Experience to Help Monitor Development of a System Design that will Achieve Federal Certification as well as Satisfy Single Audit Requirements
• Articulate an Approach that Builds in Opportunities for Collaboration Without Compromising Independence and Makes Efficient Use of State and Integration Team Time
• Provide Reasonable and Actionable Recommendations for Identified Deficiencies

IV&V/QA’s goal is the same as yours – the successful implementation of your system and its subsequent Federal Certification:
– The best review is a review with no findings.
– IV&V/QA does not get rewarded for the volume of findings we identify.

Ensuring Feedback Throughout Each Review Cycle

• How can a State’s process help ensure there is regular and timely feedback?
– On-site presence for regular communication and dialogue
– Some form of formal documented feedback at least monthly
– Escalation Reports for when there is an immediate issue
– Project Deliverable Observation Reviews (PDORs) for when the State would like IV&V to take a deeper, more specific look at a risk area
• What’s important to ensure:
– There are numerous channels available to provide feedback on a daily, weekly, and monthly basis
– The goal is not to allow the formality of our process to get in the way of providing timely, valuable information

How is QA/IV&V Delivered?

Good QA Starts with the Right Framework

• Provides a context for conducting QA
• Provides a high-level concept for conducting a project
• Breaks a project into smaller, manageable components
• Specifies management and product development processes
• Shows how the pieces fit together
– By project phase
– By project process
– By deliverable

Process Areas and Industry Standards – Examples

Strategy and Governance
• PMI’s Portfolio Management Standards
• COBIT
• SEI’s CMMI

Project Management
• PMI’s Project Management Body of Knowledge
• PMI’s Program Management Standards
• PRINCE2® (PRojects IN Controlled Environments)

Business Process
• IIBA’s Business Analysis Body of Knowledge
• SEI’s CMMI

Organizational Change
• Thought leaders (John Kotter and many others)
• CMMI (People)

Technology Integration
• Proprietary SDLCs (Rational Unified Process)
• Agile
• IT Infrastructure Library (ITIL)
• SEI’s CMMI; ISO; IEEE

Internal Controls
• AICPA’s COSO
• Proprietary Business System Controls Methods

SDLC Maps to the QA Framework

Value proposition: identify issues and risks early to increase efficiency and reduce project risks

Project Framework and Project Life Cycle (diagram summarized):
• Initiation Stage – Assemble QA Team
• Planning Stage – Establish QA Standards
• Execution and Control Stage – Selection Assistance, Requirements QA, Design QA, Development QA, Testing QA, Implementation QA, with PM QA Assistance throughout
• Close Stage – Post-Implementation Review

A Typical IV&V Review – Approach Overview

A typical review proceeds through six phases (diagram summarized):

Initiate Phase → Plan Phase → Assessment Phase → Analysis Phase → Review/Report Phase → Closeout Phase

How to Know When and What to Measure?

The Checklist is Everything!

A key to assessing status is the review of processes or deliverables along the way:
• Initial assessments to confirm appropriate process design
• Periodic reviews of Project Management or SDLC processes while you are underway
• Post-implementation reviews to gather lessons learned

How to Know When and What to Measure? (Cont.)

ASSESS PROJECT MANAGEMENT AND STAKEHOLDER ALIGNMENT PRACTICES FOR APPROPRIATENESS AND EFFECTIVENESS

PM-1 Assess and recommend improvement, as needed, to assure continuous executive stakeholder buy-in, participation, support, and commitment, and that open pathways of communication exist among all stakeholders. (Sources: Project Management Institute, A Guide to the Project Management Body of Knowledge, 2008 Edition; Software Acquisition Capability Maturity Model (SA-CMM) v1.03)

PM-1.1 Are business cases, project goals, objectives, and expected outcomes documented and supported by executive stakeholders? (PMBOK, 2008 Edition)

PM-1.2 Are the commitments coordinated among affected executive stakeholders? (SA-CMM v1.03)

PM-1.3 Are executive stakeholder requirements managed and influenced by the project management team? (PMBOK, 2008 Edition)

PM-1.4 Are differences among stakeholder expectations resolved in favor of the core business requirements? (PMBOK, 2008 Edition)

PM-1.5 Do executive sponsors of the project provide sufficient financial resources to support the project? (SA-CMM v1.03)

PM-1.6 Are communication needs defined for stakeholders, and methods and technologies identified for meeting stakeholder communication needs? (PMBOK, 2008 Edition)

PM-1.7 Is stakeholder satisfaction assessed at key project milestones? (PMBOK, 2008 Edition)

PM-1.8 Does the project document and perform standard procurement management processes such as procurement planning, solicitation planning, solicitation, source selection, contract administration, and contract closeout? (PMBOK, 2008 Edition)
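In practice, a checklist like PM-1 can be tracked as structured data so that coverage can be summarized at each review cycle. The record layout and sample answers below are illustrative assumptions, not part of any cited methodology:

```python
# Illustrative sketch: an assessment checklist as structured data.
# Question IDs mirror the PM-1 items above; the answers are hypothetical.
from collections import Counter

checklist = [
    {"id": "PM-1.1", "source": "PMBOK 2008", "answer": "yes"},
    {"id": "PM-1.2", "source": "SA-CMM v1.03", "answer": "no"},
    {"id": "PM-1.3", "source": "PMBOK 2008", "answer": "partial"},
]

def summarize(items):
    """Count answers by category to gauge checklist coverage."""
    return Counter(item["answer"] for item in items)

print(summarize(checklist))  # e.g. Counter({'yes': 1, 'no': 1, 'partial': 1})
```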


IV&V / QA

Lessons Learned

Things to Look and Prepare For

Challenge: Project work plan cannot be used to manage the project

Recommendations:
• Invest resources in a schedule management process
– Competent PM/PMO
– Maintenance is as or more resource-intensive than plan creation
• Be very specific about your vendor’s responsibilities relative to the work plan

Challenge: Requirements management process not MUTUALLY understood

Recommendations:
• Define your expectations around requirements management as clearly as possible in your RFP
• Stipulate a Requirements Management Plan as an early deliverable in your RFP
• Ensure requirements traceability (Federal and State) at each project phase

Challenge: Pace and volume of project deliverable reviews overwhelm State and project staff

Recommendations:
• Create a Deliverable Expectation Documentation process
• Create a process for deliverable review management and tracking
• Use the schedule to proactively identify and eliminate deliverable bottlenecks before they occur

Things to Look and Prepare For (Cont.)

Challenge: External partners resist or cannot meet CSE program expectations for interfaces

Recommendations:
• Assume complexity in negotiation and scheduling with external partners
• Develop plans for when interfaces fall behind
• Create a process for aligning and communicating with partners – engage as early as possible!
– Develop Memoranda of Understanding (MOUs) with participating organizations and build escalation processes in advance
– Develop communications plans to support project implementation
– Develop operations Service Level Agreements (SLAs) for post-go-live support of the system and associated interfaces

Challenge: Known or unknown resource shortages create skill gaps for the program

Recommendations:
• Create a formal definition of responsibilities
• Create a resource management/skill analysis process with a long enough runway to allow you to fill gaps
• Confirm the existence of a staff augmentation process that allows you to rapidly procure unexpected skill sets when needed

Things to Look and Prepare For (Cont.)

Challenge: Leading-edge tools are purchased, but an approach and resources for learning to use them are not clear

Recommendations:
• Newer, more powerful tools are often more challenging to implement and use
• Be specific about your plan for identifying roles and responsibilities for tool use and who will train in their use
• Be specific about vendor and State roles regarding tool use, training, ownership, and transition

Challenge: Data quality issues cause schedule delays

Recommendations:
• Create an early and incremental process for classifying and resolving data cleansing issues
• Identify a process for managing State-based data cleansing efforts
• Be specific about vendor expectations for data cleansing assistance
• Engage early and incentivize county users to assist with data cleansing efforts

Things to Look and Prepare For (Cont.)

Challenge: Insufficient project and program management processes and resources
– Absence of project management processes and metrics
– Inadequate risk management, including insufficient risk definition and lack of mitigation strategies and prioritization
– Inadequate or slow procurement and vendor management processes

Recommendations:
• Create specific vendor deliverables and scope to define and keep project and program management processes updated
• Retain a PMO and QA vendor to execute and/or monitor successful execution of processes
• Require the Integration Vendor to provide a dedicated QA Manager and/or function

Challenge: Insufficient technology skills and resources to validate vendor technology efforts

Recommendations:
• Create specific vendor deliverables and scope related to configuration management and SDLC definition (including design and development approach and quality standards)
• Determine whether specific technology skills will be needed and, where there are gaps, create a plan to fill them; confirm a process for quickly filling unanticipated gaps

Things to Look and Prepare For (Cont.)

Challenge: Inability to assume Operations and Maintenance (O&M) of the system post-certification, and continued reliance on the Implementation Vendor beyond the warranty period

Recommendations:
• Assess internal staffing levels, skills, and gaps early on
• Integrate State resources into the project early on, including business analysts, developers, testers, architects, and others, from requirements through design and development
• Identify the future-state (post-implementation) operating model, including partner and support agency involvement; engage them early to establish the operating model and draft Service Level Agreements (SLAs)
• Require the Integration Vendor to submit O&M and transition-related deliverables at least 9-12 months prior to the conclusion of the warranty period (generally ~3 months post-implementation)
• Avoid delays in O&M planning and the Implementation Vendor’s responsibilities related to system documentation

Challenge: Early implementation planning – early formal assessment of new system impacts on counties

Recommendations:
• Conduct an early assessment of business processes, especially variations across counties (yes, they exist no matter how centralized and uniform a State’s CSE program may think it is)
• Create advocates for the new system and local county “super-users” by identifying such individuals and engaging them early in the project, from planning, requirements, design, testing, training, and implementation through ongoing support
• Assess and document the impact of the application on business processes across counties from the early stages of requirements validation/elaboration

Things to Look and Prepare For (Cont.)

Challenge: Challenged communication and collaboration due to dispersed project teams

Recommendations:
• Identify and secure adequate and appropriate office space for co-location
• Plan for office space to house the integrated project team, including State, implementation vendor, QA vendor, PMO vendor, IV&V vendor, and others
• Determine out-of-state and offshore development requirements and set expectations with the implementation vendor for business analysts and developers
• Temporarily establish space for State management, team leads, and BAs to co-locate with project vendors

Challenge: Lack of an actionable Business Continuity and Disaster Recovery (DR) Plan

Recommendations:
• Establish the Disaster Recovery Plan early and test it prior to statewide implementation
• Conduct Disaster Recovery testing regularly (at least annually)
• Document lessons learned and refine process documentation with each test executed
• Establish Service Level Agreements (with related vendors and partner agencies)

Contact Information

John Kennedy, Partner
505 880 [email protected]

Alfonso R. Salazar, Director
213-817-3185
[email protected]

Eric Applewhite, PMP, Director
[email protected]

Jack Umansky, PMP, Director
[email protected]

William Bangs, PMP, Manager
[email protected]

KPMG LLP
6565 Americas Parkway NE, Suite 700
Albuquerque, NM 87110

Appendix: High-Level Case Studies

Case Study 1: Independent Verification and Validation (IV&V) Over the Replacement of a State Child Support Enforcement System

KPMG provided federally required Independent Verification and Validation (IV&V) project monitoring services for the implementation of a new Child Support Enforcement System.

As part of our scope of work, the KPMG team has conducted semi-annual reviews of the project control, technical, and functional processes and work products, including the following phases:
• Requirements Definition
• Functional and Technical Design
• System Development/Build
• Test (Unit, System, UAT, Regression)
• Implementation
• Maintenance (in a future review)

Our reviews also include an examination of project management progress and processes.

The purpose of these reviews is to provide the project team with specific observations and recommendations related to specific integrator deliverables and project management procedures, as well as leading practices for project monitoring, quality, and tools for project planning, requirements validation, documentation, and tracking. At the end of each review, the team produces a standardized report of risks, findings, and recommendations relative to the project. This report is submitted to the federal Office of Child Support Enforcement (OCSE), which provides oversight, as well as to the State Department of Human Services (DHS), which was undertaking the project. The project was federally certified in 2010.

Case Study 2: Quality Assurance (QA) Over the Creation of a Common Enterprise Platform for Health and Human Services

KPMG was selected to provide Quality Assurance services over the integration of systems scattered across 13 City agencies serving over two million people. The vision for the project focused on breaking down fragmented methods of service delivery and moving towards a coordinated service delivery model and client-centric approach. Foundational projects included:
• Common Client Index
• Worker Portal
• Client Screening and Online Application Portal

KPMG provided QA services at the project portfolio management level, as well as within each case management system. Services for each individual project included:

Phase I – Assessment
• Creation of the Requirements Traceability Matrix (RTM)
• Assessment of systems operational in other jurisdictions (cities, counties, states)
• Assessment of available solutions and systems, including the development of a vendor comparison matrix comprising 15 vendors for an Alternative Solutions Overview

Phase II – RFP Support and Selection Assistance
• Assistance with solicitation for the SI and evaluation of SI proposals
• Implementation Planning Support

Phase III – System Integration
• Project Monitoring
• Quality Assurance
• Proof of Concepts
• Testing Support

Case Study 3: Traditional QA and Selection Assistance at a Large Municipal Justice Agency

As part of the new Jail Management System developed for the City, KPMG’s QA services included:

Phase I – Assessment
• Creation of the Requirements Traceability Matrix (RTM)
• Assessment of systems operational in other jurisdictions (cities, counties, states)
• Assessment of available solutions and systems, including the development of a vendor comparison matrix comprising 15 vendors for an Alternative Solutions Overview

Phase II – RFP Support and Selection Assistance
• Assistance with solicitation for the SI and evaluation of SI proposals
• Implementation Planning Support

Phase III – System Integration
• Project Monitoring
• Quality Assurance

Case Study 4: Current State Analysis and Strategy Development at a Large Municipal Buildings Agency

As part of the new Construction Monitoring and Validation System developed for the City, KPMG’s QA services included:

Phase I – Assessment
• Analysis of current-state business processes and issues
• Leading practices review of peer agencies within the USA and internationally

Phase II – Strategy Development and Cost-Benefit Analysis
• Development of a phased strategy for improvements to customer self-service, planning approval processes, building inspection processes, and IT modernization
• Cost-Benefit Analysis for each recommendation

Phase III – System Integration
• Project Monitoring
• Quality Assurance

Case Study 5: Project Monitoring and Quality Assurance Over Construction of a Large Data Center

A large City agency is in the process of contracting with additional vendors to build a new Data Center and migrate all of its information. As part of this in-flight Project Monitoring and Quality Assurance engagement, KPMG’s QA services include:

Phase I – Assessment
• Analysis of the Data Center lease and financial operating models
• Observations and recommendations for the Data Center commissioning procedures and support documentation
• Onsite validation of the commissioning process (covering all electrical and mechanical aspects, including rack, power, and cable design)

Phase I – Results
The client received timely observations that have allowed them to take rapid action to reduce risk, increase quality, and control the cost of the DC program. For example:
• Lease: 27 observations, impacts, and recommendations
• Commissioning: tracked to closure 96 commissioning punch-list items and 8 issues identified during the commissioning walkthrough
• Rack, Power, and Cabling Design: 80 observations across 25 design documents