
Developing and Implementing Program Efficiency Measures: Challenges, Differentiation, Strategic Issues – A Discussion Paper

Prepared for the PPX Event "Developing Effective Efficiency Indicators: A discussion workshop on challenges & solutions"

Evert Lindquist, School of Public Administration, University of Victoria

Ottawa, Canada, 22 January 2013

Overview

• The government has established a sub-committee of Planning & Priorities on “Government Administration”

• The Treasury Board has directed departments to start developing program efficiency indicators

• Measuring productivity, measurement more generally, and even cost/efficiency measurement are not new (OPMS)

• The purpose of this event and the discussion paper is to stimulate an early discussion among departments about how to respond

• You have received an excerpt of the discussion paper

The Flow of the Discussion Paper

1. The Challenge: Measuring and Reporting on Program Costs and Efficiency

2. Differentiation: From Ascertaining Costs to Displaying Efficiency Information (the focus of today's PPX dialogue)

   o Cost Dimensions of Programs
   o Comparing Programs
   o Comparing Costs
   o Displaying Cost Drivers

3. Reporting on Program Efficiency: Cautionary and Optimistic Notes

1. The Challenge: Measuring and Reporting on Program Costs & Efficiency

Call for Efficiency and Cost Indicators

• To "gain insight into the cost of delivering a program relative to the work performed" (a minimal worked example follows this list)

• Should be aligned with MRRS strategic outcome indicators for specific programs under PAA structures

• Parsimony: one indicator per program, year-over-year

• A tight fiscal environment: the government will continue strategic/program reviews, and cost/efficiency indicators are to accompany expenditure proposals

• Not the normal multi-year roll-out for TBS reporting?

• Some departments and agencies may have significant investments in cost/efficiency reporting; others may not

• Parsimonious reporting requires back-up data & analysis
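To make "cost relative to work performed" concrete, here is a minimal sketch of a year-over-year unit-cost indicator. The fiscal years, cost figures, and output counts are all hypothetical, and the discussion paper does not prescribe this formula; dividing total program cost by units of output is simply one plausible reading of the TBS direction.

```python
# Minimal sketch: one parsimonious efficiency indicator per program,
# tracked year-over-year. All figures are invented for illustration.

yearly = {
    # fiscal year: (total program cost in $, units of output delivered)
    "2013-14": (12_500_000, 250_000),
    "2014-15": (12_800_000, 265_000),
    "2015-16": (13_100_000, 268_000),
}

previous = None
for year, (cost, outputs) in yearly.items():
    unit_cost = cost / outputs  # cost per unit of work performed
    trend = "" if previous is None else f" ({unit_cost / previous - 1:+.1%} vs. prior year)"
    print(f"{year}: ${unit_cost:.2f} per output{trend}")
    previous = unit_cost
```

Note that even this trivial indicator requires back-up data (total costs and a defensible output count), which is the point of the "parsimonious reporting requires back-up data & analysis" caveat above.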

2. Differentiation: From Ascertaining Costs to Displaying Efficiency Information

Cost/Efficiency and Differentiation

• When cost/efficiency indicators are sought for review and decision purposes, comparability across programs springs to the fore as an important issue.

• Today's focus is on what data might be required as back-up for parsimonious cost/efficiency indicators for programs, and whether programs can be classified into distinct categories to facilitate comparisons:

   o Cost Dimensions of Programs
   o Comparing Programs

• However, such data must be analyzed, displayed & compared within and across entities. The need for a typology becomes clearer when one briefly considers how this could be done:

   o Comparing Costs
   o Displaying Cost Drivers

Some Cost Dimensions to Consider

A. Program: Work and Client

• Here the focus is on ascertaining the nature of the program in terms of the kind of work or tasks undertaken, the clients served, and whether intermediaries are involved:

   o Program activities: Does the program focus on service delivery, regulation, funding, policy, communications, internal services, etc.?

   o Program outputs and clients: What does the program "produce"? What clients does it serve? Does the program provide direct or indirect delivery? Are intermediaries involved in the service or delivery chain?

   o Speed of service delivery: Do the tasks of the program require immediate service responses or interactions over months or years?

B. Program Factor Inputs

• The cost structure of programs may vary according to the nature of inputs and where transforming activities occur. Some dimensions to consider:

   o Point of service: What is the extent of reliance on headquarters vs. regional operations?

   o Reliance on information technology: What is the extent of reliance on information technology and other systems (IT vs. non-IT costs)?

   o Staffing costs: What is the extent of reliance on staff or human resource capabilities (ratio of HR to non-HR costs)? Have the costs of benefits and training been factored in?

   o Service channels: How many channels are used to deliver the services associated with a program? Do they have different cost structures? Have the mix and volumes changed?

C. Program Infrastructure Cost Cycle

• Programs vary with respect to their newness and the lifecycle of the technologies (technological and social) they need to carry out their work. Such costs may vary with respect to:

   o Maturity: Is the program old or new? Does the program have sunset provisions?

   o Infrastructure costs: Do costs need to be amortized over the length of the program?

   o Expected spikes and dips: Does the program have cycles within and across years?

   o Phases: Will the program have phases in which fixed (as opposed to variable) costs are completed, partners engage, and putative benefits are anticipated to come on stream?

D. Department Cost Allocation

• Departments may have distinct corporate practices for assigning costs in support of programs and corporate services (HR, FM, IT, accounting, contract management, real property, communications, etc.), which will affect measured program efficiency. These may vary with respect to the following (a simple allocation sketch follows this list):

   o Costing internal services: Does the department carry these costs as a central corporate program or assign them to programs on a distributed basis? A mixed model?

   o Costing program services: Do all of the costs of a program rest with that program, or are they shouldered by program clusters or corporate services? A mixed model?

   o Costing overhead: What constitutes overhead? Are there different levels? How are overhead costs calculated, allocated, and amortized?
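To illustrate how the choice of allocation model changes a program's measured cost, here is a minimal sketch of the "distributed basis" option: a corporate overhead pool spread over programs in proportion to their direct costs. The program names and all figures are hypothetical, and proportional allocation is only one of several models a department might use.

```python
# Hypothetical sketch: distributing a corporate overhead pool to programs
# in proportion to their direct costs. Figures are invented for illustration.

direct_costs = {"Program X": 6_000_000, "Program Y": 3_000_000, "Program Z": 1_000_000}
overhead_pool = 2_000_000  # HR, FM, IT, accommodation, etc.

total_direct = sum(direct_costs.values())
for program, direct in direct_costs.items():
    allocated = overhead_pool * direct / total_direct
    print(f"{program}: direct ${direct:,}, allocated overhead ${allocated:,.0f}, "
          f"full cost ${direct + allocated:,.0f}")
```

Under a central corporate model, by contrast, the same $2M would sit in an internal-services line and each program's reported cost would be its direct cost alone, so the two models yield different efficiency indicators for identical activity.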

E. Accounting for Scale

• Programs and their constituent elements vary with respect to scale, which makes comparing programs across departments difficult and affects the robustness of the cost and efficiency indicators used to describe them.

   o Considering scale: How many clients are served? How many staff resources are used? What amount of funding is moved to clients or intermediaries?

   o Level for comparison: Should programs, distinct activities, or even sub-activities be compared with each other?

Lots of Information to Produce Parsimonious Cost Indicators?

Differentiating Programs

                              Outcomes Observable?
                              Yes                        No
  Process        Yes          Production Organizations   Procedural Organizations
  Observable?    No           Craft Organizations        Coping Organizations


Program categories that might be mapped onto this typology:

• Communications programs
• Policy programs
• Regulatory programs
• Direct service delivery by department
• Contributions to external service delivery providers
• In-house science programs
• Funded external science programs
• Internal services (HR, IT, FM, etc.)

Displaying Indicators

Comparing Drivers and Programs

[Figure: example of a parallel coordinates graph]

One Possible Adaptation: Comparing Programs

Each coordinate represents a different department's bundle of programs (Dept A through Dept E); the coloured lines represent cost or efficiency indicators for different kinds of programs (see the typology above). A minimal plotting sketch follows.
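As a sketch of how such a display could be generated, the following uses pandas' built-in parallel_coordinates helper. The department names, program categories, and indicator values are all hypothetical.

```python
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

# Hypothetical indicators: one row per program category (a coloured line),
# one column per department (a vertical coordinate axis).
data = pd.DataFrame({
    "Program": ["Regulatory", "Policy", "Direct Service", "Internal Services"],
    "Dept A": [42, 18, 55, 30],
    "Dept B": [38, 22, 61, 28],
    "Dept C": [45, 15, 49, 35],
    "Dept D": [40, 20, 58, 31],
    "Dept E": [36, 17, 52, 29],
})

parallel_coordinates(data, class_column="Program")
plt.ylabel("Cost/efficiency indicator (hypothetical units)")
plt.title("Comparing program indicators across departments")
plt.show()
```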

Another Possibility: Program and Cost Drivers Over Time

Each coloured line represents a different cost driver; the coordinates are fiscal years, 2013-14 through 2017-18. A brief sketch follows.
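A corresponding sketch, again with invented cost drivers and figures, plots each driver as a line across the fiscal years.

```python
import matplotlib.pyplot as plt

# Hypothetical cost drivers for one program over five fiscal years ($M).
years = ["2013-14", "2014-15", "2015-16", "2016-17", "2017-18"]
drivers = {
    "Staffing": [5.0, 5.2, 5.1, 5.4, 5.6],
    "IT": [1.8, 2.1, 2.4, 2.3, 2.5],
    "Overhead": [1.2, 1.2, 1.3, 1.3, 1.4],
}

for name, costs in drivers.items():
    plt.plot(years, costs, marker="o", label=name)

plt.ylabel("Cost ($M, hypothetical)")
plt.title("Program and cost drivers over time")
plt.legend()
plt.show()
```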

3. Reporting on Program Efficiency: Cautionary and Optimistic Notes

Cautionary Notes

Broader institutional and system issues to consider:

• What happens when policy shifts increase costs?

• What if aggregation masks diverging cost trends?

• Do cost trends measured against "the base" amount to de facto benchmarking?

• How will cost indicators and public value get analyzed?

• Focusing on cost trends alongside program achievements: how realistic are strategic outcome indicators? What about attribution?

• How costly will measuring costs be, and how will its value be assessed?

• Will a focus on costs require new analytic expertise?

• Will cost reporting crowd out other analysis and reporting?

• Is cost reporting likely to affect decision-making?

Optimistic Notes

What has changed in the 40 years since OPMS was rolled out?

• significant improvements in information technology and financial management systems;

• vast increases in computational power, significantly lowering the costs of assembling and manipulating data;

• considerable experience rolling out corporate reporting initiatives (MC, RfC, MAF, and MRRS/PAA) as multi-year exercises;

• successful responses to the reporting requirements of the Federal Accountability Act;

• most of the data needed for developing cost/efficiency indicators already resides in systems;

• many departments and agencies already pay close managerial attention to costing and efficiency reporting; and

• visualization techniques for analyzing and presenting data have greatly improved.

Thank You!

Time for questions.

evert@uvic.ca
