ORF
Performance Management Presentation
Division of Policy and Program Assessment
Team Members: Farhad Memarzadeh, Sheri Bernstein, Reza Jafari, Robert Henry, Mike Shaw, and Clarence Dukes
21 January 2004
Manage Policy and Program Assessment for the Delivery of NIH Owned and Leased Facilities
Overview and Contents
• Background on DPPA
• PMP Template
  – Value proposition and strategy
  – Performance objectives and measures
• Customer Perspective: Critical Measures
  – Project Review Board (PRB) reviews that meet due dates
  – Customers attending training
• Internal Business Process Perspective: Critical Measures
  – Qualified projects using Earned Value Analysis (EVA)
• Learning and Growth: Critical Measures
  – Training hours related to EVM and related professional development
• Financial Perspective: Critical Measures
  – Unit cost for DPPA discrete services
• Conclusions
Background on DPPA
• DPPA resulted from the re-alignment of the Office of Research Services.
• Its mission is to responsibly execute the public trust by managing and developing policies and procedures to help achieve the best value and quality at all levels of project design, construction, commissioning, and operations within the Office of Research Facilities (ORF).
• DPPA consists of Policy and Quality Assurance Branches.
• DPPA strives to provide tools for efficient and effective delivery of NIH real property.
Division of Policy and Program Assessment Performance Management Plan (PMP)

Service Group: Manage policy and program assessment for the delivery and operations of NIH owned and leased facilities

Definition: Develop and implement life-cycle based policies, procedures, and guidelines, as well as performance management and assessment systems and tools, for efficient and effective delivery and operations of NIH owned and leased facilities.

Discrete Services:
DS1: Provide policies, standards, and guidelines for NIH owned and leased facilities.
DS2: Review ORF operations for compliance with applicable regulations, codes, standards, and guidelines.
DS3: Develop suitable performance management methods to evaluate, improve, and enhance the delivery of facilities.
DS4: Provide ORF staff training to ensure effective implementation of policies and procedures to deliver quality facilities.
DS5: Prepare the facilities section of the NIH Government Performance and Results Act (GPRA) Plan.

Value Proposition: DPPA translates government policy on facilities design, construction, commissioning, and operations into accurate and reliable policies to help ensure NIH facilities provide the highest quality and best value while meeting NIH researchers' needs. Our professional team of engineers, scientists, and architects delivers thorough and reliable guidance to our customers while assuring time-sensitive schedules are met. We combine our team's hands-on experience in complex design, construction, maintenance, and operations with state-of-the-art quality assessment and performance measurement programs to assure our customers and stakeholders get the best value for their budget. We partner with our customers to continuously improve our methods and practices to deliver guidance that works for everyone.

Strategy Description: The DPPA team of cross-functional experts focuses on fast response with reliable guidance to help our customers get their jobs done. We will be the central repository for all the information necessary to make timely decisions, and will provide the tools and training required to be a state-of-the-art policy and performance team.

Team Leader: Clarence Dukes
Team Members: Farhad Memarzadeh, Sheri Bernstein, Reza Jafari, Robert Henry, and Mike Shaw

Strategy dimensions: Operational Excellence / Customer Intimacy / Product Leadership; Growth / Sustain / Harvest
Division Approval/Date:            Associate Director Approval/Date:
DPPA Performance Plan FY 04 – FY 05

Customer Perspective
Objective C1. Increase customer satisfaction
  C1.a. Customer survey overall score — FY 03: N/A; FY 04 target: Baseline; FY 05 target: ≥ FY 04
Objective C2. Improve availability and reliability of guidance and expertise
  C2.a. % of PRB reviews that meet due dates — FY 03: N/A; FY 04 target: 40%; FY 05 target: 50%
  C2.b. # of disputes that go to Dispute Resolution Board — FY 03: N/A; FY 04 target: ≤ 20%; FY 05 target: ≤ 15%
  C2.c. # of customer policy, standards & guideline gaps identified — FY 03: N/A; FY 04 target: 5; FY 05 target: 5
Objective C3. Our customers are familiar with the Guidelines
  C3.a. % of customers attending training — FY 03: N/A; FY 04 target: 50%; FY 05 target: 50%

Internal Business Perspective
Objective IB1. Improve thoroughness, reliability and accuracy of policies, guidelines, standards and procedures
  IB1.a. Customer survey — FY 03: N/A; FY 04 target: TBD; FY 05 target: ≥ FY 04
  IB1.b. Customer survey of relevance of policies, standards and guidelines — FY 03: N/A; FY 04 target: TBD; FY 05 target: ≥ FY 04
  IB1.c. % of variances requested vs. approved — FY 03: N/A; FY 04 target: 10%; FY 05 target: 10%
  IB1.d. # of changes to design policies and guidelines resulting from variances — FY 03: N/A; FY 04 target: 2; FY 05 target: 2
  IB1.e. % of project reviews complete — FY 03: N/A; FY 04 target: 50%; FY 05 target: 75%
Objective IB2. Improve best value analysis throughout design and construction of facilities
Objective IB3. Provide best-of-class performance management assessment
  (Remaining targets for IB2/IB3 measures — FY 04: 3, ≥ 50%, ≤ 10, 30–50%; FY 05: 6, ≥ 80%, ≤ 5, 60%)
DPPA Performance Plan FY 04 – FY 05 (continued)

Learning and Growth Perspective
Objective L1. Be experts in best practices for project analysis and quality assessment (DPPA staff and customers – PO's)
  L1.a. Customer survey — FY 03: N/A; FY 04 target: Baseline; FY 05 target: ≥ FY 04
  L1.b. % of training hours for assessment and interpretation of Earned Value Analysis (EVA) and related professional development courses — FY 03: Baseline; FY 04 target: 10%; FY 05 target: 15%

Financial Perspective
Objective F1. Minimize unit cost of discrete services
  F1.a. Yr-to-Yr comparison in discrete services costs — FY 03: N/A; FY 04 target: TBD; FY 05 target: TBD
  F1.b. Qtr-to-Qtr change in discrete service charges — FY 03: N/A; FY 04 target: TBD; FY 05 target: TBD
  F1.c. Cost per permit review — FY 03: N/A; FY 04 target: TBD; FY 05 target: TBD
  F1.d. Cost per guideline revision — FY 03: N/A; FY 04 target: TBD; FY 05 target: TBD
Strategy Mapping Conclusions and Actions
• Strategy mapping revealed that our strategic objectives should be goals within DPPA's span of control. This means the team must focus on those activities that directly contribute to successfully accomplishing the long-term objectives that benefit our first-line customers – the Project Officers and Project Managers.
• We modified our PMP to reflect this approach.
• Strategy map analysis: DPPA will be successful if it achieves C1, "Increase customer satisfaction." Accomplishing this goal relies on the internal business process objective IB2, "Improve best value analysis throughout design and construction of facilities." The capacity to improve best value analysis depends on the ORF project services community becoming "experts in best practices for project analysis and quality assurance."
Customer Perspective Critical Measures

Objective C2. Improve availability and reliability of guidance and expertise
  C2.a. % of PRB reviews that meet due dates — FY 03: N/A; FY 04 target: 40%; FY 05 target: 50%
Objective C3. Our customers are familiar with the Guidelines
  C3.a. % of customers attending training — FY 03: N/A; FY 04 target: 50%; FY 05 target: 50%
PERMITTING PROCESS

[Flowchart: PRB Design Review Process – Step 2 (PRB Project Review Plan, page 2). Swim lanes: A/E, PRB, DPPA, Intermediate (Supervisor, AO, CO, Dispute Board), and Project Officer. The C.O. issues a Notice to Proceed to the A/E. The A/E assembles the design team and prepares for the kick-off meeting; the P.O. invites the PRB and A/E to the kick-off meeting and chairs it with the A/E, customer, and PRB. The A/E develops the design submission and delivers it to the P.O. The P.O. delivers the submission to PRB reviewers and submits a transmittal notification with a tracking number to DPPA. DPPA reviews the submission for completeness. The PRB reviews the submittal for technical adequacy (10 business days or more, with P.O. decision) and prepares and sends comments. The P.O. transmits PRB comments to the A/E with the progress payment letter.]
Objective C2. Improve availability and reliability of guidance and expertise
Objective C3. Our customers are familiar with the Guidelines
Measure: PRB's meeting review schedule
Measure: % of scheduled customers attending training

[Chart: PRB's Meeting Review Schedule, Target = 75%, Actual = 75%. Monthly bars (Oct through Oct, plus total) comparing # of PRBs vs. # on time.]
[Chart: Scheduled Customers Attending Training, Target = 75%, Actual = 75%. Bars (Oct through Oct) comparing # of customers scheduled vs. # attending.]

Notional data – actual to be reported in FY 04.
Customer Scorecard Methodology
• FY 04 will be the first year DPPA schedules a customer satisfaction survey. The intent is to determine how well DPPA is delivering on its primary mission of providing the expertise, practices, tools, and guidance that help project officers monitor and measure project status.
• Our plan is to measure customer satisfaction twice each year.
• Our audience will include project officers plus a sample of A&E professionals from the contractor community who are on the job.
What will we learn from this customer data? What actions will it indicate we should take?

Data analysis will indicate several possible conditions:
• The estimated time to perform reviews may require a new time frame.
• A lack of available resources to perform reviews – could reflect over-utilization of the professional staff or inefficient prioritization of tasks.
• More training in the overall review process may be required.
• Some projects require different levels of professional resources – may indicate a need to realign resources.
Internal Business Perspective Critical Measures

Objective IB1. Improve thoroughness, reliability and accuracy of policies, guidelines, standards and procedures
  IB1.b. Customer survey of relevance of policies, standards and guidelines — FY 03: N/A; FY 04 target: TBD; FY 05 target: ≥ FY 04
Objective IB2. Improve best value analysis
Measure IB2.1: # of qualified projects using EVA

[Chart: quarterly counts (0–90) of qualified projects using EVA, 1st through 4th quarter, for Projects A, B, and C. Notional data – actual data to be reported in FY 04.]
SAMPLE S-CURVE TRACKING: SCHEDULED VS. ACTUAL EXECUTION GOALS

[Chart: cumulative progress (0–100%) by month, May-98 through Mar-01, comparing the original planned progress curve against actual progress and forecast progress.]
What will we learn from this internal business process data? What actions will it indicate we should take?
• If we are behind on the goals, the data will show us where we need to change.
• Indicate where more analysis is necessary to reduce the risk of a project falling behind schedule or exceeding its budget.
• Indicate the ability of the organization to adopt the new methods and procedures for implementing EVA on qualified projects.
• Serve as a risk management tool.
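The EVA indicators behind these bullets follow a standard arithmetic that the deck does not spell out. A minimal sketch in Python of the usual EVM quantities (PV, EV, SPI, CPI, EAC); the project figures below are hypothetical, not actual NIH data:

```python
# Illustrative sketch of Earned Value Analysis (EVA/EVM) arithmetic.
# All inputs are notional; nothing here reflects real project data.

def evm_metrics(bac, pct_planned, pct_complete, actual_cost):
    """Return the basic EVM indicators for one project.

    bac          -- budget at completion (total planned budget, $)
    pct_planned  -- fraction of work scheduled to be done by now
    pct_complete -- fraction of work actually done by now
    actual_cost  -- money spent to date ($)
    """
    pv = bac * pct_planned      # Planned Value (BCWS)
    ev = bac * pct_complete     # Earned Value (BCWP)
    sv = ev - pv                # Schedule Variance (negative = behind)
    cv = ev - actual_cost       # Cost Variance (negative = over budget)
    spi = ev / pv               # Schedule Performance Index (< 1 = behind)
    cpi = ev / actual_cost      # Cost Performance Index (< 1 = over budget)
    eac = bac / cpi             # Estimate At Completion at current efficiency
    return {"PV": pv, "EV": ev, "SV": sv, "CV": cv,
            "SPI": spi, "CPI": cpi, "EAC": eac}

# A project budgeted at $90M, 50% scheduled, 40% complete, $45M spent:
m = evm_metrics(90e6, 0.50, 0.40, 45e6)
print(f"SPI={m['SPI']:.2f}  CPI={m['CPI']:.2f}  EAC=${m['EAC']/1e6:.1f}M")
```

An SPI and CPI both below 1.0, as in this example, is exactly the pattern the S-curve and dashboard slides are designed to surface early.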
Learning & Growth Critical Measures

Objective L1. Be experts in best practices for project analysis and quality assessment (DPPA staff and customers – PO's)
  L1.b. % of training hours for assessment and interpretation of EVM and related professional development courses — FY 03: Baseline; FY 04 target: 10%; FY 05 target: 15%
Objective L1. Be experts in best practices for project analysis and quality assurance
Measure L1.1: Training hours for assessment and interpretation of EVM and related professional development courses

[Chart: training hours by type (0–300) per quarter, 1st through 4th quarter, split into EVM, professional development, and other. Notional data – actual to be provided in FY 04.]
NOTIONAL PROJECT HEALTH INDICATOR DASHBOARD

Project Name | On Schedule (Y/N) | Within Budget (Y/N) | Estimated to Complete On Time (Y/N) | Estimated to Complete Within Budget (Y/N) | High Risk Project (Y/N) | Management Intervention Req'd? (Y/N)
Construct Consolidated Physical Training Facility, Phoenix, Arizona | Y | N | Y | N | N | N
Construct Insectaries, Miami, Fla. | N | N | N | N | Y | Y
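The dashboard's Yes/No flags could be derived mechanically from EVM indices. A hedged sketch: the 0.95 thresholds and the high-risk rule below are illustrative assumptions, not stated DPPA policy.

```python
# Sketch of deriving dashboard Yes/No flags from EVM indices.
# Thresholds (0.95) and the high-risk rule are assumptions for illustration.

def health_flags(spi, cpi, forecast_spi, forecast_cpi):
    """Return (on schedule, within budget, est. complete on time,
    est. complete within budget, high risk, intervention req'd) as Y/N."""
    on_schedule = spi >= 0.95
    within_budget = cpi >= 0.95
    finish_on_time = forecast_spi >= 0.95
    finish_within_budget = forecast_cpi >= 0.95
    # Flag high risk when both current indices are unfavorable.
    high_risk = not on_schedule and not within_budget
    # Intervene when high risk, or when neither forecast looks recoverable.
    intervention = high_risk or not (finish_on_time or finish_within_budget)
    yn = lambda flag: "Y" if flag else "N"
    return tuple(yn(flag) for flag in (on_schedule, within_budget,
                                       finish_on_time, finish_within_budget,
                                       high_risk, intervention))

# A project behind schedule and over budget, like the notional Miami row:
print(health_flags(0.80, 0.85, 0.82, 0.88))  # ('N', 'N', 'N', 'N', 'Y', 'Y')
```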
What will we learn from this Learning & Growth data? What actions will it indicate we should take?
• Identify the training needs of the organization.
• Identify individual requirements for types of training.
• Help identify the need for additional training resources – outsource versus in-house provider.
• Act as a roadmap for the future – show where we need to go.
• Help establish the budget necessary to provide the requisite training.
Financial Perspective Critical Measures

Objective F1. Minimize unit cost of discrete services
  F1.a. Yr-to-Yr comparison in discrete services costs — FY 03: N/A; FY 04 target: TBD; FY 05 target: TBD
NOTIONAL DATA FOR REPORTING TIME BY COST ACCOUNT

DPPA Time Card Report (% of hours by cost account):
1.1 RNU – 13%
1.2 VAR – 1%
1.3 TRKG – 11%
1.4 DEVN – 36%
1.5 SPECS – 1.5%
1.6.1 PROJS – 12%
1.6.2 SPCM – 2%
1.6.3 ADMS – 23%
Objective F1. Minimize unit costs
Measure F1.1: Comparison Yr to Yr of discrete service costs

[Chart: discrete service costs year over year ($1000's), FY 03 estimate vs. FY 04 actual vs. FY 05 budget, for DS 1 through DS 4.]
What will we learn from this financial data? What actions will be indicated?
• Identify the makeup of all the costs associated with each discrete service.
• Help establish baselines of the minimum resources necessary to produce outputs.
• Indicate how resources should be reallocated from one area to another.
• Help establish a more robust budgeting mechanism based on actual throughputs.
• Build actual-to-budget comparisons for better planning.
• Show trends over time.
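The F1 unit-cost measures reduce to simple division once time-card costs are attributed to each discrete service. A minimal sketch with notional figures (the dollar amounts and output counts below are invented for illustration):

```python
# Sketch of the F1 unit-cost measures. All figures are notional.

def unit_cost(total_cost, outputs):
    """Cost per unit of output, e.g. per permit review (F1.c)
    or per guideline revision (F1.d)."""
    return total_cost / outputs

def yr_to_yr_change(prev, curr):
    """F1.a: fractional year-to-year change in a discrete service's cost."""
    return (curr - prev) / prev

# Notional FY figures: (fully loaded cost in $, outputs produced)
per_permit_review = unit_cost(120_000, 48)   # hypothetical F1.c figure
per_guideline_rev = unit_cost(90_000, 5)     # hypothetical F1.d figure
print(f"Per permit review: ${per_permit_review:,.0f}")
print(f"Per guideline revision: ${per_guideline_rev:,.0f}")
```

Tracking `yr_to_yr_change` per discrete service over several years is what turns these raw unit costs into the trend and reallocation signals listed above.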
Conclusions

We need to thoroughly analyze how professional resources are allocated to perform the discrete services.

Our key initiatives for FY 04 and FY 05:
• Training
• Design-Build
• Pre-Project Planning (PDRI)
• Guidelines
• EVMS
• Permitting Process
• Quality System Manual
• New A/E Selection Form
• Policy Development
• Safety Policy
• Post Occupancy Evaluation
• Building Occupancy
• Security
• Color of Money
• Portfolio Management

Other actions:
• Conduct a study to verify ORF/ORS training needs