Air Force Integrated Master Schedule (IMS) Assessment Process

Version 4.0, 21 September 2012



Document Configuration Management Log

V. 1.0 (27 July 2011)
Content: Includes IMS Quick Look process for assessing schedule health using the first 5 of 8 Generally Accepted Scheduling Principles (GASP).

V. 1.1 (23 September 2011)
Changes: Revised Quick Look test table to include metrics (thresholds). Added Beginner section with scheduling concepts and terms. Revised Comprehensive Assessment Matrix for readability. Added Schedule Tool Appendix.

V. 2.0 (7 May 2012)
Changes: Added Performing an IMS Quick Look Assessment Using Run!23. Updated IMS Quick Look table. Updated Schedule Tool Appendix.

V. 3.0 (7 June 2012)
Changes: Added Run!23 installation instructions and Performing an IMS Quick Look Assessment on Open Plan Professional Schedules.

V. 4.0 (21 September 2012)
Changes: Updated IMS Comprehensive Assessment section. Added Common Program Manager Questions for IMS Assessments.


Contents

Document Configuration Management Log
Contents
Overall Document Introduction / Purpose / Organization
  Purpose and Organization
Part 1 - Introduction to Integrated Master Schedule (IMS)
  IMS Composition / Use / Contractor Expectations
    What is an IMS?
    How is the IMS used?
    Who uses the IMS?
    The IMS and Contract Requirements
    Expectations of Contractor-Delivered IMS Files
  Basic Scheduling Concepts
    Milestones
    Tasks
    Task Duration
    Logic Networks
    Dates
    Predecessors
    Successors
    Constraints
    Float
    Critical Path
    Resources
    Calendar(s)
    Baseline Dates
    Statusing
    Recording History
    Part 1 Conclusion
Part 2 – Basic Integrated Master Schedule (IMS) Assessments
  Introduction / Purpose / Scope
  Overview / Refresher
  Standards for IMS and IMS Assessments
    Contract Requirements
    Contractor Requirements
    Agency Standards / Guidelines
    Generally Accepted Scheduling Principles (GASP)
    GASP Description
    Additional Schedule References
  Key Terms and Concepts
    Government Oversight Schedules and Supplier Schedules
    IMS Hierarchy (WBS Levels, Summary Tasks, and Indentures)
    Integrated Master Plan (IMP)
    IMS Supplemental Information
    Program Milestones or Control Milestones
    Control Account or IPT Start, Finish, and Interim Milestones
    Planning Packages
    Task Types
    Task Names (Nomenclature)
    Task Durations
    Logical Relationships Between Tasks
    Logical Relationships Between Summary Tasks
    Leads and Lags
    Constraints
    Deadlines
    Calendar(s)
    Critical Path and Driving Path(s)
    Crashing the Critical Path
    Resource Loading
    Baselines
    Building an IMS
    Maintaining an IMS
  IMS Quick Look Assessment
    Overview of Mechanics
    Tests by GASP Tenet
    Using the Quick Look Results
  Performing an IMS Quick Look Assessment Using Run!23
    Installing Run!23
    Configure MS Project (MSP) 2007 and the Run!23 (.MPP) Template
    Preparation for Using Run!23
    Use "Quick Look" Button to Execute Filters
    Using Run!23 to Perform an IMS Quick Look Assessment
  Performing an IMS Quick Look Assessment Using Open Plan Professional
    BK3 File
    Views
    Filters
    Using Open Plan Professional to Perform an IMS Quick Look Assessment
Part 3 – Integrated Master Schedule (IMS) Comprehensive Assessment
  IMS Comprehensive Assessment
    Scope of the IMS Comprehensive Assessment
    Reasons for an IMS Comprehensive Assessment
    Expertise Required for the IMS Comprehensive Assessment
    Tailoring Tests / Activities in the IMS Comprehensive Assessment
    Steps in the IMS Comprehensive Assessment
  IMS Comprehensive Assessment Test / Activity List
  Documenting an IMS Comprehensive Assessment
    IMS Comprehensive Assessment Report
    Turning Reports into Results
  Common Program Manager Questions Answered by IMS Comprehensive Assessments
  Specific IMS Comprehensive Assessment Techniques and Measurements
    Measures of Schedule Execution
    Late and Critical Tasks
    Critical Path / Driving Path Determination
    Bow Wave Analysis
    Rolling Wave / Planning Packages
    First and Last Tasks Analysis
    Duration Variance / Pace / Earned Value Method Analysis
    Risks & Opportunities Integration Analysis
Acronyms
Appendix – Schedule and Schedule Assessment Tools
  Schedule Generating Software
  Schedule Assessment Tools Associated with Schedule Software
    Acumen Fuse
    Air Force IMS Performance Trends Tool
    Defense Contract Management Agency (DCMA) 14 Point IMS Assessment
    Deltek Open Plan Professional
    Run!AzTech for MS Project
    Run!23 for MS Project
    Steelray Project Analyzer


Overall Document Introduction / Purpose / Organization

Purpose and Organization

This document is written for Air Force program managers and their functional staff. Its purpose is to define the process of performing an IMS assessment at two levels: a basic Quick Look assessment and a comprehensive assessment. The document has three parts. The first part is for the beginner who is learning about schedules and schedule assessments. The second part is for a person with intermediate scheduling experience who wants to perform a basic schedule assessment. The third part is for experienced schedulers performing a comprehensive schedule assessment. An outline of the document follows.

- Overall Document Introduction / Purpose / Scope
- Part 1 - Introduction to the Integrated Master Schedule
  o IMS Composition / Uses / Contractor Expectations
  o Basic Scheduling Concepts (terms and definitions)
- Part 2 - Basic IMS Assessments
  o Introduction / Purpose / Scope
  o Overview / Refresher
  o Standards and Principles including GASP
  o Key Terms and Concepts
  o IMS Quick Look Schedule Assessment
    - Overview
    - Tests by GASP Tenet
  o Performing an IMS Quick Look Assessment Using Run!23
  o Performing an IMS Quick Look Assessment Using Open Plan Professional
- Part 3 – IMS Comprehensive Assessment
  o Overview of the IMS Comprehensive Assessment
  o Steps in the IMS Comprehensive Assessment
  o IMS Comprehensive Assessment Test / Activity List
  o Documenting an IMS Comprehensive Assessment
  o Common Program Manager Questions
  o Specific IMS Comprehensive Assessment Techniques
- Acronyms
- Appendix – Scheduling and Schedule Assessment Tools


Part 1 - Introduction to Integrated Master Schedule (IMS)

This section provides the program team with a basic understanding of Integrated Master Schedules: what they are, how they are used, their background requirements, and the value and benefit they provide in managing large, complex programs. This part does not provide specific schedule analysis techniques or procedures. It does provide the background information needed to understand the desired characteristics of a robust Integrated Master Schedule.

IMS Composition / Use / Contractor Expectations

This section describes the IMS and its uses. It also helps the program office understand what it can expect from the contractor's IMS deliverable.

What is an IMS?

Foremost, an Integrated Master Schedule is a schedule that embodies all the work effort needed to produce a product or accomplish specific goals that have an associated cost. The work effort is represented by well-defined tasks that program participants execute (expending cost) to generate the desired products (hardware, software, documents, or systems). Tasks are organized in the proper sequence to facilitate efficient execution (socks before shoes), enabling the program to know what work must be accomplished to achieve its objectives. Task sequencing occurs by recognizing the dependencies among all the tasks (the socks have to be on before shoes even come up). Then, by identifying the expected time to perform each task, the program can project its completion time.

The IMS pulls together several pieces of information to develop the program schedule through the planning process. Understanding the requirements (why) helps establish goals and objectives. The scope (what) is broken down into manageable tasks. The approach (how) and location (where) determine the task sequences and method of execution. Identifying the human resources (who) that perform and manage the work, along with the material resources, places the other characteristics in perspective and generates the timing (when). Aligning the tasks with the goals provides visibility into the effort required to meet the program objectives. These characteristics, taken collectively and refined through iterative planning, comprise the IMS.

The IMS incorporates a program's work scope and resource planning into a plan that satisfies requirements and achieves cost, schedule, and technical goals, and it serves as an effective tool for managing the program.

How is the IMS used?

First and foremost, the IMS is a tool to help the program team manage the program. The IMS provides program history, existing conditions, and projections to program stakeholders and customers.

The IMS incorporates many of the source documents that define the program contract. The statement of work and the work breakdown structure are two of the more common basis documents that establish the work scope and work organization reflected in the IMS. The IMS integrates the work scope and organization, along with the basis of estimates, organizational structure, and other technical source information, aligning this data to meet program goals and objectives.


The IMS uses all this source information in the planning process to develop a time-phased capability that records history, projects future outcomes, and reports remaining efforts and meaningful performance indices. The IMS is a program navigation tool, used to determine course direction and manage the inevitable changes that require course adjustment during the program's life cycle.

The program team engages in the planning process, combining the source information with its collective specialized work knowledge and management objectives to produce the IMS.

The IMS helps the program team:

- Describe the entire known work effort at an executable level of detail
- Identify program goals and key objectives represented as milestones
- Quantify the amount of time anticipated to perform each task
- Identify the dependencies among all tasks that provide the work sequencing
- Align the sequenced work and its anticipated completion to the program milestones
- Reflect progress achieved and identify the remaining effort to complete the program
- Provide visibility into what lies ahead and which tasks have higher priority for focused attention
- Enable reliable predictions based on identified trends and remaining efforts
- Communicate progress, productivity, and predicted outcomes to the team, customer, and stakeholders

Who uses the IMS?

The primary user of the IMS is the program management team. The program manager and program management office staff, along with the control account managers (CAMs) who are responsible for planning and executing the work, and other program team members use the IMS on a daily basis to manage the program work effort. Other members of the organization not directly involved in managing day-to-day activities rely on information produced by the IMS to coordinate resource assignments to meet program needs and to synchronize efforts with other programs within the organization. Persons outside the organization, such as the customer, depend on valid program information to manage their responsibilities, determine priorities for providing needed assistance across all programs under their control, and report status to their management organization.

Typical users:

- Program manager
- Contract manager
- Chief engineer
- Supply chain / subcontracts manager
- Business manager
- Earned Value Management (EVM) analysts
- Control Account Managers
- Integrated Product Team (IPT) leads
- Functional managers
- Executive leadership
- Government Program Management Office (PMO)
- Government oversight organizations such as the Defense Contract Management Agency (DCMA)

The IMS and Contract Requirements

The program contract stipulates many of the requirements for an IMS. Government requirements for managing the program using Earned Value Management may also be in effect and are communicated through the contract. The contractor's own governing policies and procedures may determine the IMS construct, use, and reporting obligations. Understanding and incorporating all of these requirements is paramount when assessing an IMS.

The program contract communicates the scope of work and deliverables through the Statement of Work (SOW). The SOW or other contractual documents may also establish requirements such as the level of task detail, quantities and lot-size deliveries and related due dates, required technical and program reviews, and reserved code fields in the contractor's project management software for customer use. The contract uses a Contract Data Requirements List (CDRL) to further identify deliverables and may also tailor government specification documents such as DI-MGMT-81650, which is required for Earned Value Management.

Expectations of Contractor-Delivered IMS Files

The contractor delivers the IMS at the frequency defined in the program contract. The IMS must meet all the specifications defined in the contract and comply with the contractor's own governing policies.

Typically, the contractor delivers the IMS and other required EVM reporting information on a monthly basis, although delivery can be more frequent if agreed to. The IMS is normally delivered in its native project management software format and may be accompanied by the same data in other electronic formats that the government customer specified for accessibility reasons. Hard copy reports may be produced and delivered along with the electronic data. If the format and content do not satisfy the contractual obligations, the government representative may reject the contractor's submittal, citing the reasons for refusal and requiring the contractor to comply with the requirements before resubmitting the data.

A narrative document explaining changes to the schedule should accompany the IMS submittal. The explanations should cover schedule differences between the current schedule and the agreed-to schedule. The narrative offers reasons for these differences, referred to as variances, and should address the changes encountered since the previous submittal, the potential or realized impacts of those changes, and the remedies considered or employed to lessen the ill effects of the impacts, referred to as mitigation efforts. Common topics center on variances to the baseline (the agreed-to plan) and on conditions of or changes to the critical path or near-critical path (more on this later).


Reviewing the IMS to ensure that it remains a functioning tool for program management decision-making and reporting is part of the assessment process. A schedule health assessment checks the IMS for the recognized qualities and characteristics of a sound schedule, confirming that the IMS is satisfactory or identifying specific deficiencies that must be addressed. A poorly maintained IMS can quickly degenerate into an ineffective and unreliable tool. Routinely assessing the IMS can detect areas requiring improvement that, when implemented, keep the schedule performing reliably and providing value-added information to the program team.

Be aware that IMS data converted from its native electronic format into another electronic format may not always reflect exactly the same information. The receiver should check items in the IMS to verify that they align with the contractor's published information and the receiver's knowledge of the program. Differences in the data should be reconciled with the contractor.

The IMS should have the following qualities:

- Reflects the current status
- Accurately records information for completed tasks
- Contains no invalid dates (incomplete tasks in the past or completed tasks in the future)
- Tasks have proper predecessors and successors
- Accurate remaining task durations that properly forecast the future tasks
- Remaining scheduled work that is achievable
- Properly represents the impacts of the status update on the program milestones
- Able to pass a schedule health assessment (discussed in Part 2 of this document)
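Several of these qualities can be checked mechanically. As a minimal illustration, the sketch below (Python, with an invented task-record layout; this document does not prescribe any particular tooling) flags the invalid-dates condition: incomplete tasks forecast in the past or completed tasks recorded in the future, relative to the status date.

```python
from datetime import date

# Hypothetical minimal task records; a real IMS export carries many more fields.
tasks = [
    {"name": "Design review",  "percent_complete": 100, "actual_finish": date(2012, 10, 5), "forecast_finish": None},
    {"name": "Build test rig", "percent_complete": 40,  "actual_finish": None,              "forecast_finish": date(2012, 8, 1)},
]

def invalid_date_tasks(tasks, status_date):
    """Return tasks that violate the invalid-dates rule relative to the status date."""
    violations = []
    for t in tasks:
        complete = t["percent_complete"] == 100
        if complete and t["actual_finish"] and t["actual_finish"] > status_date:
            # A completed task claims an actual finish in the future.
            violations.append((t["name"], "completed task finishes after status date"))
        if not complete and t["forecast_finish"] and t["forecast_finish"] < status_date:
            # An incomplete task is still forecast to finish in the past.
            violations.append((t["name"], "incomplete task forecast in the past"))
    return violations

print(invalid_date_tasks(tasks, status_date=date(2012, 9, 21)))  # both records flagged
```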

Basic Scheduling Concepts

This section discusses basic scheduling terms and concepts.

Milestones

Milestones are zero-duration tasks, typically events, starting points, or end points for work.

Tasks

A task identifies work content, a further decomposition of the total work scope into an actionable effort. The task has a name (task description) that uniquely identifies the effort from all other tasks in the IMS and is descriptive enough for all users to understand the intent of the work and what performing it produces. Ideally, each task has an identified owner (CAM), someone who is responsible for its content and performance. This identification helps with obtaining status and communicating schedule issues and conditions.

Generically, "task" refers to any activity in the IMS, whose duration can range from zero days, for milestones, to several months for discrete effort, planning packages, or Level of Effort (LOE). These are described in more detail below: Summary Tasks vs. Tasks, Tasks vs. Milestones, Discrete vs. Planning Packages vs. LOE.


Task Duration

Task duration quantifies the expected amount of time to execute a specific work effort under normal conditions. The duration begins with the start date, ends with the finish date, and uses a working-day calendar. Duration is generally expressed in days, but may be in other units such as hours, minutes, weeks, or months.

Fundamentally, the more the work effort is broken down into smaller, actionable steps, the shorter task durations become. Shorter-duration tasks are more manageable because the work effort has been refined to its elemental steps. This makes organizing the various tasks easier and facilitates assessing progress during status updates.

There are several duration types involved in scheduling. Each has its use when discussing specific task characteristics under different conditions.

A bit more technical:

- Task / Original Duration: the anticipated amount of time from start to finish for a task to complete its scope of work. The span of time may include periods of low or no activity, depending on how the task's execution is defined.
- Actual Duration: the realized amount of time from start to finish for completing a task. Tasks that are partially complete (started but not finished) may reflect a partial actual duration.
- Remaining Duration: the anticipated amount of time to finish a partially completed task's associated scope of work. Tasks that have not started have a Remaining Duration equal to their Task / Original Duration.
- Baseline Duration: the planned amount of time from planned start to planned finish for a task to complete its scope of work. The span reflects the task's baselined amount of time for executing the work under normal conditions and is compared to the task's Original / Actual / Remaining durations to help determine whether execution is proceeding as anticipated. The baseline is defined before the work is started.
- Variance: the calculated difference, in time units, between the forecast or actual dates and the baselined dates, expressed as a duration. Values can be positive (over-running the planned dates), zero (matching the planned dates), or negative (beating, or under-running, the planned dates).
- Total Float: the calculated time difference between the forecast dates and the baselined need dates (the dates required to achieve the desired program end date), expressed as a duration. Values can be positive (not affecting the end date), zero (may be determining the end date), or negative (impacting the end date and not supporting the desired baseline date). A task's total float value is a measure of how long the task can slip before delaying program completion. Total float may also be referred to as total slack.
- Free Float: the calculated amount of time a task's finish date may slip before delaying the start of its successor task, expressed as a duration. If a task has more than one successor, free float is the least amount of slip before the earliest successor task's start date is delayed. Free float may also be referred to as free slack.
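To make the duration terms concrete, here is a small worked example (Python, with invented numbers expressed as working-day offsets; the positive-means-over-running sign convention follows the Variance definition above):

```python
# Illustrative bookkeeping for one in-progress task, in working days.
original_duration = 22   # current total estimate, start to finish
actual_duration   = 10   # working days expended so far
remaining_duration = original_duration - actual_duration  # 12 days still anticipated

# Variance: forecast vs. baselined dates, expressed as a duration.
baseline_finish_day = 40  # planned finish, as a working-day offset (invented)
forecast_finish_day = 43  # current forecast finish (invented)
variance = forecast_finish_day - baseline_finish_day  # +3: over-running the plan

print(remaining_duration, variance)
```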


Logic Networks

The logic network refers to the dependencies that exist among the tasks that make up the IMS. Listing the known tasks in an order perceived as correct, and determining the related dates for their execution against a calendar, may be adequate for the moment, but things change. Tasks may start or finish earlier or later than originally planned for a multitude of reasons, or new tasks may be added to reflect additional scope as directed by the customer or to better define previously generalized work. Rather than re-figuring all remaining and newly added tasks each time a change occurs, the push and pull of the dependencies reflects the impacts of these dynamic conditions in a way that would be difficult to perform manually: the logic network automatically establishes and re-establishes start and finish dates based on the changes. The dependencies exist regardless of when in time the tasks execute and are used to determine the forecast dates of remaining tasks, giving users immediate awareness of the impacts.

The logic network is a good forecaster of future remaining work efforts. It can also identify which tasks, and which paths of tasks, are more important than others in achieving program milestone dates and the program end date. This relative importance helps program team members decide where to concentrate limited resources and focus management attention.

The logic network is often viewed as a diagram where each task is a box on the chart, read from left to right (earlier to later). The tasks appear tied together to form linked sequences. These sequences are often referred to as paths, and a viewer can follow a path forward (to the right) to see what tasks come next, or backward (to the left) to see what tasks came before. These paths represent the precedence logic of what has to happen, and in what order, to execute the work as planned.

Collectively, all tasks connected by predecessor and successor relationships make up the logic network. The method for constructing this logic network, and the rules governing its application, is called Critical Path Method (CPM) scheduling.
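As a minimal illustration of a logic network as data, the sketch below (Python; the tasks and links are invented) stores each task's successors and walks the downstream path from a focus task:

```python
# A small logic network: each task maps to the list of its successors.
successors = {
    "A": ["B"],         # A must finish before B starts
    "B": ["C", "D"],
    "C": ["E"],
    "D": ["E"],
    "E": [],            # E is the end milestone
}

def downstream(task, successors):
    """Collect every task reachable along the successor (downstream) path."""
    seen, stack = set(), [task]
    while stack:
        for nxt in successors[stack.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

print(sorted(downstream("B", successors)))  # ['C', 'D', 'E']
```

Swapping predecessors for successors in the same structure gives the upstream walk; the two directions together are what the forward and backward passes traverse.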

Dates

There are several start and finish date types involved in scheduling. Each has its use when discussing specific task characteristics under different conditions.

A bit more technical:

- Start / Forecast Start: the anticipated beginning of a task's duration to execute the associated scope of work. Generally, these are the dates associated with the current schedule, reflecting when the task is expected to begin execution.
- Finish / Forecast Finish: the anticipated conclusion of a task's duration to execute the associated scope of work. Generally, these are the dates associated with the current schedule, reflecting when the task is expected to complete execution. Dates can be entered.
- Early Start: the earliest a task is able to start based on its predecessor path impacts or applied "no earlier than" constraints. Dates are established by performing a "forward pass" calculation through the logic network (from the beginning to the end of the network), recognizing the predecessor relationships and the task durations / remaining durations. In most cases, Early Start is the same as Start / Forecast Start.
- Early Finish: the earliest a task is able to finish based on its predecessor path impacts, remaining duration, or applied "no earlier than" constraints. Dates are established during the "forward pass" calculation through the logic network. In most cases, Early Finish is the same as Finish / Forecast Finish.
- Late Start: the latest a task is able to start based on the need dates. The late dates are established by performing a "backward pass" calculation through the logic network (from the end of the network to the beginning), recognizing the task dependencies, task durations, and "no later than" constraints. Delaying a Late Start potentially delays the program end date.
- Late Finish: the latest a task is able to finish based on the need dates. The Late Finish is calculated during the "backward pass" and is determined before the same task's Late Start. Again, "no later than" constraints affect the late dates established during the backward pass. Delaying a Late Finish potentially delays the program end date.
- Actual Start: the date the task started. In-progress tasks, started but not completed, have an Actual Start date but not an Actual Finish date.
- Actual Finish: the completed end of a task's duration, indicating when the associated scope of work was concluded.
- Constraint: an imposed date used to fine-tune a task's scheduled or need start or finish dates when the logic network alone cannot effectively determine them.
- Baseline Start: the planned beginning of a task's duration to execute the associated scope of work.
- Baseline Finish: the planned completion of a task's duration to execute the associated scope of work.
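The forward and backward passes can be sketched compactly. The example below (Python) uses a deliberately simplified network: finish-to-start relationships only, whole working-day durations, no constraints, and dates expressed as day offsets from Timenow rather than calendar dates:

```python
# Tasks with durations (working days) and finish-to-start predecessors (invented).
duration = {"A": 3, "B": 2, "C": 4, "D": 1}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
order = ["A", "B", "C", "D"]  # already in topological (predecessor-first) order

early_start, early_finish = {}, {}
for t in order:  # forward pass: earliest dates, given predecessor impacts
    early_start[t] = max((early_finish[p] for p in preds[t]), default=0)
    early_finish[t] = early_start[t] + duration[t]

project_end = max(early_finish.values())  # day 8
late_start, late_finish = {}, {}
for t in reversed(order):  # backward pass: latest dates that still meet the end
    succs = [s for s in order if t in preds[s]]
    late_finish[t] = min((late_start[s] for s in succs), default=project_end)
    late_start[t] = late_finish[t] - duration[t]

print(early_start)  # {'A': 0, 'B': 3, 'C': 3, 'D': 7}
print(late_start)   # {'A': 0, 'B': 5, 'C': 3, 'D': 7}
```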

Predecessors

Predecessors are other tasks that must be performed before the focus task executes. The focus task is therefore dependent on one or more tasks, and this dependency is identified in the IMS. The logically connected path of tasks leading up to the focus task is referred to as the predecessor path, or the upstream path.

The important thing about predecessors is that they determine the dates of the focus task. If a task in the predecessor path moves to the right (becomes later), this can have a chain-reaction effect on all the tasks in its path, moving them to the right, and can impact the date of the focus task, driving it to the right also. Conversely, if a task in the predecessor path moves to the left (becomes earlier), the same chain reaction through the network up to the focus task can allow those tasks to move to the left and become earlier too.

A task that is missing a predecessor may be in jeopardy of starting too early. The problem with starting too early is that other work may not yet have delivered items or knowledge the task needs. Performing some or all of the work too early may result in rework or disrupted effort. In general, starting and completing tasks early is good, when the work is performed in the proper order, all prerequisites are satisfied, and available resources are used without disruption.


Successors

Successors are the opposite of predecessors, or at least the view from the logic network looking in the opposite direction. Successors are other tasks that cannot be performed until the focus task executes.

The focus task is therefore a determining influence on one or more tasks, and this dependency is identified in the IMS. The logically connected path of tasks leading away from the focus task is referred to as the successor path, or the downstream path.

The important thing about successors is that their dates are determined by the focus task. If the focus task moves to the right (becomes later), this can have a chain-reaction effect on all the tasks in its successor path, moving them to the right, and can potentially impact the date of a program milestone, driving it to the right also. Conversely, if the focus task moves to the left (becomes earlier), the same chain reaction through the network up to the program milestone can allow those tasks to move to the left and become earlier too.

A task that is missing a successor may not reflect the correct deliverable date, and possibly not the correct amount of time needed to meet a program milestone or the program completion date.

One other item to mention about successors, discussed further in later sections, is to avoid adding unnecessary successors. Assigning successors to tasks without a valid rationale for the dependency may cause problems later if tasks are inappropriately delayed because of the false relationship.

Another type of unnecessary successor assignment is applying "redundant logic" to the network. Redundant logic is best described by example: Task A has a successor of Task B, and Task B has a successor of Task C. It is not necessary to also assign Task C as a successor to Task A; that link is redundant.

Task A -> Task B
Task B -> Task C
Task A -> Task C (redundant)

There are other scenarios that result in redundant logic. Assigning these unnecessary relationships does not cause an improper schedule calculation, but it does create difficulties when analyzing a schedule for the tasks that may be causing a schedule problem.
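Redundant links can also be detected mechanically: a direct link is redundant when its target is still reachable from its source after that one link is ignored. A sketch (Python, using the Task A / B / C example above):

```python
# Successor lists including the redundant A -> C link from the example above.
successors = {"A": ["B", "C"], "B": ["C"], "C": []}

def reachable(start, successors, skip_direct=None):
    """Tasks reachable from start, optionally ignoring one direct link."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for nxt in successors[node]:
            if (node, nxt) == skip_direct:
                continue  # pretend this one link does not exist
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

redundant = [
    (a, b)
    for a, succs in successors.items()
    for b in succs
    if b in reachable(a, successors, skip_direct=(a, b))
]
print(redundant)  # [('A', 'C')]: C is still reachable via B, so A -> C is redundant
```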

Constraints

Constraints refer to several types of restrictions that influence a task's start or finish dates; they may determine a task's need date and possibly affect other tasks in the logic network. Ideally, the logic network should establish the start and finish dates for all tasks in the IMS. However, logic relationships alone cannot always reflect the conditions that exist on a program.

For example, a task may have a predecessor that determines the task's start date, but the planner may be aware of a related condition that influences when the task may commence, such as plans to use a test facility that is not under the program's control. The planner can apply a documented "no earlier than" date constraint that does not permit the task to schedule earlier than a specific date (the first availability for accessing and using the test facility). This places the task at the proper time, correctly impacts the remaining successor path tasks, and records the rationale for applying the constraint.

Other types of date constraints affect the need dates of tasks that are logically tied to a specific task or milestone. Recall from the logic network discussion that the network can predict when tasks occur in the schedule and their impact on milestones and the end date. The network also provides a prioritized value indicating each task's criticality for achieving the identified need dates. Again using the test facility's availability as an example: all program test activities must be completed by a known date, because other programs and operations need the test facility and are planning their respective activities around its reserved schedule availability. Placing a related "no later than" date constraint on the milestone depicting the end of program test activities establishes an end date that enforces a need date for all predecessor path tasks leading up to the milestone. Each task in the path now reflects this priority for meeting the identified end date, which may be earlier than the priority established by the logic network alone.

Date constraints should be continuously monitored and updated in the IMS to reflect the most current dates known; these changes must be reflected in the IMS for it to provide value to its users. Constraint dates should be minimized, relying on the logic network to establish dates and determine criticality as much as practical. Overuse of constraints reduces the network's ability to provide reliable forecasts and can negate the significance of established need dates and the importance of criticality values. Recording the rationale for each date constraint in the IMS is a best practice.
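In forward-pass terms, a "no earlier than" constraint simply raises the floor on a task's start. A minimal sketch (Python, with invented working-day offsets, continuing the test-facility example):

```python
# Start No Earlier Than (SNET): the task cannot schedule before the constraint date.
predecessor_finish = 12   # working day the predecessor path delivers (logic-driven)
facility_available = 20   # documented SNET date: first day the test facility is free
duration = 5

# The task starts at the later of its logic-driven date and its constraint date.
early_start = max(predecessor_finish, facility_available)  # 20: the constraint governs
early_finish = early_start + duration                      # 25
print(early_start, early_finish)
```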

Float

This unusual-sounding term embodies one of the more telling aspects of critical path method scheduling. The discussion so far has mentioned task priority and criticality; float is the term that captures these characteristics. The two types of float commonly used are total float and free float.

Total float relates to need dates and whether the forecast dates, as calculated by the logic network, are going to be earlier, on time, or later than required to meet the desired end date. Fundamentally, total float is a calculated value for each incomplete task, expressed in the common duration time unit (normally days), equal to the difference between the forecast schedule date (when a task will occur) and the need date (when a task must occur to support the end date).

The forecast start and finish dates are determined by the logic network and any applicable "no earlier than" constraints. These are the earliest dates a task can start or finish and are referred to as the Early Start and Early Finish. Technically, total float is the calculated difference between the early dates and the late dates. Total float provides a meaningful value depicting whether a task is supporting, determining, or not supporting the program end date.


For incomplete tasks, total float values can be:

- Positive: reflecting the number of workdays a task can slip before impacting the end date
- Zero: determining the program end date (where negative values do not exist)
- Negative: reflecting the number of workdays by which a task is impacting the program end date, making it later than desired

Generally speaking, the tasks with the most negative total float values are impacting the end date more than any others and are the primary focus of attention when attempting to resolve the negative impacts and reclaim the desired end date. Completed tasks have a total float value of zero.

Keep in mind that a total float value is determined by several things, not just the focus task alone. It results from the combination of the logic network and all the tasks on the path to the end milestone, their task durations / remaining durations, any constraints affecting their start or finish dates, and the need date established for the end milestone. The later discussion of calendars introduces the number of working days available to tasks, which can also affect the total float value.

Free float is the second common float type, and it is slightly easier to understand than total float. Basically, free float is the amount of time a task can slip without affecting the start of its successor task. This information can be helpful when making trade-off decisions, such as assigning scarce resources to work tasks and trying to discern where flexibility exists to make the best choice. Awareness of free float may sway the decision, knowing that one task can afford a delay without affecting its successors' dates.
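Both float types fall out of the dates produced by the forward and backward passes. Continuing the simplified day-offset sketch from the Dates section (Python; invented network):

```python
# Dates (working-day offsets) as produced by a forward and backward pass.
early_start  = {"A": 0, "B": 3, "C": 3, "D": 7}
early_finish = {"A": 3, "B": 5, "C": 7, "D": 8}
late_start   = {"A": 0, "B": 5, "C": 3, "D": 7}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

# Total float: slip available before the program end date is impacted.
total_float = {t: late_start[t] - early_start[t] for t in early_start}

# Free float: slip available before the earliest successor's start is delayed.
def free_float(task):
    succ_starts = [early_start[s] for s, ps in preds.items() if task in ps]
    # Tasks with no successors are treated as 0 here for simplicity.
    return min(succ_starts) - early_finish[task] if succ_starts else 0

print(total_float)                        # {'A': 0, 'B': 2, 'C': 0, 'D': 0}
print({t: free_float(t) for t in preds})  # B can slip 2 days without delaying D
```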

Critical Path

The term critical path is one of the more misused terms in scheduling, although it is quite simple to understand. The critical path is the longest sequence of logically connected tasks from the present time to the program end date. If a task on the critical path slips, the program end date will slip. The present time is the status date of the schedule, also called the Timenow date.

Collectively, the tasks and durations that comprise the critical path equate to the total program duration, the span time. It is important to know which group of logically tied tasks is affecting the program end date. Maintaining control of the program's critical path is the single most important influence program management has for completing the program on time and with the best opportunity to control costs.

Generally, when discussing the critical path before work on the program commences, the critical path is determined from all tasks in the IMS. Once execution has begun and the schedule is assessed for progress and remaining effort, the schedule reflects progress from the date the status is taken. That date is referred to as the status date, data date, or Timenow. During execution, the critical path is no longer concerned with completed tasks; it is only concerned with incomplete or remaining tasks and their potential impact on the program's completion. Therefore, determining the longest sequence of tasks from Timenow to the program end date considers only the incomplete tasks that can impact program completion.
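Given total float values for the incomplete tasks, identifying the critical tasks is mechanical: they are the tasks sharing the lowest total float in the network. A sketch under the same simplified assumptions as the earlier float example:

```python
# Total float per incomplete task, from the earlier float sketch.
total_float = {"A": 0, "B": 2, "C": 0, "D": 0}

# Critical tasks share the lowest total float in the network
# (zero here; it would be negative if the end date were being impacted).
lowest = min(total_float.values())
critical = [t for t, tf in total_float.items() if tf == lowest]
print(critical)  # ['A', 'C', 'D']: the longest path from Timenow to the end
```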


Reducing the Critical Path Length

When the end date moves to the right, beyond an acceptable limit, efforts to pull it back to the left involve focusing on the critical path tasks. If the program needs to end sooner, shorten the critical path.

Efforts to reduce the impact of tasks on the critical path may move the program end date back to the left. Additional efforts to reduce the critical path duration may be required to bring the end date back to its desired date. Through this iterative process of reducing the total duration to program completion, another set of logically related tasks may become the determining factor on the program completion date, comprising a new critical path. That is the dynamic nature of the critical path: task durations increasing or decreasing determine which set of logically connected tasks becomes the critical path. That is why developing a clear method to determine the critical path is important, and why minimizing the "no later than" constraints applied to tasks, which can mask critical path visibility, is essential.

Resources

The IMS reflects the work to be performed. It takes resources, in the form of people, tools and equipment, facilities, and materials, for the work to happen. The sum of all resources employed and consumed to accomplish the scope of work equates to the total cost.

Identifying the required resources, in the correct amounts and available at the right time, is the great leveler of plans and schedules. Directly allocating defined resources to tasks in the IMS provides constant knowledge of what it takes to perform the work associated with each task and keeps that knowledge connected to the scope when task dates change. Knowing the resource amounts required and the resource amounts available provides the insight into schedule achievability that program management needs to manage the program effectively.

Assigning resources to tasks in the IMS provides visibility into the program's ability to execute the tasks as planned. Choosing the level of resource detail is similar to choosing the level of task detail: the more well-defined the resources, the better the granularity for managing their use. Human resources defined at too high a level, for example "engineer," can still reflect the amount needed throughout the IMS, but "Test Engineer IV" defines a more specific category, reflecting the skill level and the amount available for use on the program.

More specific resource categories:

- Aid decisions about assigning the correct resources, with the applicable skill level, to appropriate tasks.
- Provide confidence that sufficient resource amounts exist to perform tasks as scheduled by the logic network.
- Identify where and when additional specific resource types are required to perform crucial or critical tasks.
- Help reflect specific resource amounts in relation to all resource type amounts required on the program.


Calendar(s)

So how do tasks get their dates? The same way most everyone else does: from a calendar. The IMS uses a defined calendar so that tasks reflect proper start and finish dates and so that duration-based items such as total float can be calculated. "Thirty days hath September..." does not define September well enough for scheduling purposes. How many of those September days are working days, days on which work can occur?

Calendars define working days so that tasks are scheduled to begin, end, and occur during working time, not counting weekends and holidays. A seven-day task that is scheduled to start on a Tuesday should not reflect a finish date of the following Monday if the established calendar uses a typical five-day work week of Monday through Friday; the task should reflect a finish date of the following Wednesday, seven working days later. Calendars also establish the length of a day, typically eight hours, comprising a 40-hour work week.
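The Tuesday-to-Wednesday example is easy to verify with working-day arithmetic. A minimal sketch (Python standard library only; a real IMS calendar would also carry program-specific holidays and shutdown periods):

```python
from datetime import date, timedelta

def add_working_days(start, workdays, holidays=()):
    """Finish date for a task of `workdays` duration starting on `start`,
    counting the start day itself and skipping weekends and holidays."""
    current, remaining = start, workdays
    while True:
        if current.weekday() < 5 and current not in holidays:  # Mon-Fri only
            remaining -= 1
            if remaining == 0:
                return current
        current += timedelta(days=1)

# A seven-day task starting Tuesday 2012-09-11 finishes the following Wednesday.
print(add_working_days(date(2012, 9, 11), 7))  # 2012-09-19, a Wednesday
```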

Further calendar definition can identify non-working periods such as recognized holidays or organizational shutdown periods. By identifying non-work days in the calendar, the IMS does not count those days when applying task durations against the available work days and establishing valid dates for tasks.

Defining an accurate calendar that represents how the program operates contributes valuable insight into forecasting the remaining program effort. Planning to conduct work on days normally reserved as non-work days can falsely permit the IMS to project a more optimistic end date than common sense dictates is possible.

Awareness of multiple calendars in the IMS is important for understanding how dates and duration-based calculations, such as total float values, are determined. Sometimes it is practical to model a portion of the tasks in the IMS with a different calendar than the default calendar used by most tasks.

Baseline Dates

Creating the IMS involves significant planning effort by most of the program team. The central idea behind this planning is to develop an executable schedule, one that incorporates as much knowledge and foresight into potential difficulties as possible and serves as the best tool to manage the effort during the execution stage. The team captures this IMS version prior to execution as their planned schedule. This IMS version is referred to as the baseline schedule and is used to help the program team understand how well things are going during program execution compared to the established expectations. Setting or “snapping” the baseline prior to commencing task execution is a preferred practice. Formalized baseline information maintenance is required during execution to ensure the baseline remains relevant and continues to provide meaningful feedback to program management.

Baseline dates are the planned start and finish dates as anticipated during the planning phase. Comparing each task’s current forecast or actual start date to its baseline start date, and its current forecast or actual finish date to its baseline finish date, provides insight into execution performance. The comparison indicates how specific tasks are performing against the plan. Trends can indicate where program management attention is required to determine the cause of unsatisfactory performance indicators and facilitate applying

solutions to improve the performance on remaining efforts. Understanding past performance trends provides an opportunity to learn from these conditions and apply solutions to future tasks. This action relays lessons learned to future efforts and provides the realism that makes forecast information more reliable.
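
A minimal sketch of the baseline comparison, assuming simple task records with baseline and actual finish dates (the field names and dates are illustrative):

    from datetime import date

    tasks = [
        {"name": "Static test analysis", "baseline_finish": date(2012, 8, 10),
         "actual_finish": date(2012, 8, 24)},
        {"name": "Fixture design", "baseline_finish": date(2012, 7, 2),
         "actual_finish": date(2012, 6, 29)},
    ]

    for t in tasks:
        slip = (t["actual_finish"] - t["baseline_finish"]).days
        verdict = "behind plan" if slip > 0 else "on or ahead of plan"
        print(f'{t["name"]}: {slip:+d} calendar days ({verdict})')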

For example, performance trends of conducting post-test analyses indicate that the baseline was

too optimistic in estimating the amount of time required to execute the work. Program

management may decide to increase the current task durations on similar future work efforts to

more accurately project the required effort and any potential impacts to subsequent downstream

tasks. It is also beneficial to update the baseline information for the related tasks affected by this

decision, facilitating an accurate comparison on those future tasks once they are executed. It does

not benefit the program and does not provide meaningful metric information to use baseline

information that does not accurately reflect the method of execution. The program should update

the baseline to reflect changes to the execution approach. The baseline continues providing

value-added information only when it reflects the method of execution. Since the baseline

comparison provides a great deal of performance information to the program, regimented

process controls and procedures are established to control and manage baseline changes.

Permitting unlimited and uncontrolled baseline modifications can render the comparison

information meaningless.

Baselines and Planning Packages

Typically, knowledge about task definition and approaches has a higher degree of confidence in the near term than in later periods of the program. Recognizing this, planners use techniques to describe future periods of work in comprehensive, less detailed tasks. These tasks are referred to as “planning packages”. Planning packages contain all the related scope that must be performed, but the detailed definition effort is reserved for later in the program, when the timing for detailing the scope into executable tasks is closer and more specifics may be understood about the effort required. This approach permits developing more accurate estimates about the work without sacrificing visibility of the effort in its proper time period in the schedule.

Planning packages still adhere to proper planning and logic network application and contribute to defining the total IMS period of performance and the critical path. Exercise care that planning packages are not so long in duration and so vague in definition that they mask visibility into their content and their part in establishing a meaningful critical path. A duration that is too long may eliminate the ability to reflect proper logic ties to other tasks that should connect within that span but cannot be identified because the work is modeled as one single task.

Statusing

Probably the first thing to notice about this topic is that “statusing” is not a proper word, but it is

a common scheduling term and one that encompasses much of what establishes the validity of

the IMS. Statusing is the process of updating the IMS to reflect progress achieved and

forecasting remaining work. Several derivations of the term status may be used to:

Take status – solicit updates from task owners about the condition of their tasks (started, completed, or in-progress), determining the amount of completion and forecasting finish dates for partially completed tasks.

Status the schedule – apply the collected task update information to the related tasks in

the IMS and recalculate the effects on the remaining tasks and milestones through the

logic network.

Report the status – generate the reports and communicate the effects from updating the

IMS with the latest performance information, highlighting the changes with the most

pronounced impacts.

Recording History

Recording accurate historical information in the IMS is important for several reasons. Metrics that compare performance against baseline information are enhanced by recording accurate actual start and actual finish dates. Approximations of actual dates may skew the performance comparison, leading to inaccurate measurements and possibly indicating falsely poor or falsely exceptional performance. Accurate actual dates also increase the reliability of decisions to modify future similar tasks using the historical performance information.

Avoiding Invalid Dates

It may be possible to enter an actual start date or an actual finish date in the future (to the right of Timenow); however, a task with an actual start date or an actual finish date that is later than Timenow is inaccurate. This condition is similar to saying that this task completed (past tense) in the future.

Likewise, allowing incomplete tasks that do not have the appropriate start dates and finish dates to remain to the left of Timenow presents invalid information. A task with a forecast start date or forecast finish date (and no related actual start or actual finish date) that is earlier than Timenow has not been properly scheduled.

Both conditions not only represent inaccurate schedule status, but also can affect successor path

task calculations, causing inaccurate forecast information in the IMS. In addition to entering

accurate start and finish dates, care must be taken to recognize the existing logical relationships

on tasks when performing status updates. Reflecting the actual dates as tasks are executed may

possibly require modifying related logic.
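
Both invalid-date conditions are mechanical to detect. The sketch below is illustrative, assuming simple task records; the field names are not any particular tool’s schema.

    from datetime import date

    timenow = date(2012, 9, 21)
    tasks = [
        {"name": "Design review", "actual_finish": date(2012, 10, 5)},
        {"name": "Draft test plan", "forecast_start": date(2012, 8, 1)},
    ]

    def invalid_date_flags(task):
        flags = []
        # Condition 1: actual dates recorded in the future (right of Timenow).
        for field in ("actual_start", "actual_finish"):
            d = task.get(field)
            if d and d > timenow:
                flags.append(f"{field} is after Timenow")
        # Condition 2: forecast dates left in the past with no matching actual.
        for field in ("forecast_start", "forecast_finish"):
            d = task.get(field)
            actual = task.get(field.replace("forecast", "actual"))
            if d and d < timenow and not actual:
                flags.append(f"{field} is before Timenow")
        return flags

    for t in tasks:
        print(t["name"], invalid_date_flags(t))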

Avoiding Out-of-Sequence Status Conditions

Sometimes task execution produces situations where the existing logical relationships must be

modified to permit recording accurate actual date information and properly reflecting the logical

impacts to other tasks.

A task that was able to commence for valid reasons, but does so without its identified

predecessor completing, is said to be out-of-sequence with its logic. The remedy is to modify the

existing predecessor relationship to reflect the appropriate task dependencies. By not modifying

the logic, the out-of-sequence task can produce incorrect forecast conditions on its successor

path tasks and skew calculations such as total float and potential impacts to milestones and end

dates. Modifying the logic relationships for these conditions keeps the IMS healthy and

projecting accurate forecast information throughout the logic network.
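
For finish-to-start logic, the basic out-of-sequence condition is also simple to detect mechanically. A hedged sketch with hypothetical task records:

    # Task "B" has started even though its FS predecessor "A" has not finished.
    tasks = {
        "A": {"actual_start": None, "actual_finish": None, "preds": []},
        "B": {"actual_start": "2012-09-10", "actual_finish": None,
              "preds": [("A", "FS")]},
    }

    for name, t in tasks.items():
        for pred, rel in t["preds"]:
            if rel == "FS" and t["actual_start"] and not tasks[pred]["actual_finish"]:
                print(f"{name} is out-of-sequence: FS predecessor {pred} is not finished")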

Accurately Recording Progress

Addressing tasks that are not started is easy; they are zero percent complete. Determining the

condition of completed tasks is also easy; they are 100 percent complete. Those tasks that are

started, but not completed present a different situation. Determining the amount of progress

achieved and the amount remaining on in-progress tasks can be a bit more involved.

Previously, the topic of task duration discussed the benefit of defining tasks with short durations,

stating that shorter duration tasks are more manageable and easier to determine progress during

status updates. Being able to claim a task started and completed during the same status period

makes it easier to update than having to assess how much work was accomplished and how

much work remains for the unfinished portion of the task. Typically, shorter duration tasks that

start and are in-progress during one status period will complete during the next status period.

This occurs where the status update period is monthly and shorter duration tasks are less than

two months long. For programs using Earned Value Management practices, guidelines establish

the methods for claiming the amount of value “earned”, such as a “50-50” Earned Value

Technique (EVT). Here 50% of the value is earned when the task is started and 50%

of the value is earned when it is completed. The idea is to simplify the assessment because the

amount of claimed progress and remaining effort is governed by consistently applying these

business rules.
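
A minimal sketch of the “50-50” rule; the budget figure is illustrative:

    def earned_value_50_50(budget, started, finished):
        # Half the budget is earned at start, the remainder at completion.
        if finished:
            return budget
        if started:
            return 0.5 * budget
        return 0.0

    print(earned_value_50_50(200.0, started=True, finished=False))  # 100.0
    print(earned_value_50_50(200.0, started=True, finished=True))   # 200.0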

Longer duration tasks that span more than two status periods have an inherent problem in determining the amount of progress achieved when not using techniques similar to “50-50”, which are designed for use on tasks that span only two status periods. Tasks with estimated durations of 60

working days (3 months) for example, could realistically span four months and possibly more if

encountering any delays once started. The ability to assess accurate and consistent progress is

more problematic for these tasks. Typically, these tasks use a method that defines and quantifies

portions of the included scope as steps. All the steps together equate to 100% of the task’s

included scope. Knowing which steps have been accomplished during its execution provides the

objective evidence to claim progress against those steps. Totaling all the completed steps represents the percent complete achieved for these long duration tasks. Definition of steps occurs prior to task execution and is part of the task’s configuration control.
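
A hedged sketch of step-based progress, with hypothetical steps whose weights sum to 100 percent of the task’s scope:

    # (step name, weight as a fraction of task scope, completed?)
    steps = [
        ("Prepare test fixture",   0.20, True),
        ("Run static test",        0.40, True),
        ("Analyze test data",      0.30, False),
        ("Publish results report", 0.10, False),
    ]

    assert abs(sum(w for _, w, _ in steps) - 1.0) < 1e-9  # weights cover the scope
    percent_complete = 100 * sum(w for _, w, done in steps if done)
    print(f"{percent_complete:.0f}% complete")  # 60% complete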

Knowing how much has been accomplished and how much remains to be accomplished is

important. But equally, if not more important, is projecting when the task will finish.

Determining the forecast finish date for in-progress tasks takes many things into consideration.

The task owner knows the work and is the best person to assess its condition and status.

Understanding the task’s scope, how easy or difficult it was to achieve the progress to date, and how easy or difficult it will be to achieve the remaining scope is part of the assessment. Knowledge about technical challenges and possibly competing priorities that may

draw attention away from this task may have an influence on estimating the productivity

expected to accomplish the remaining amount of work content in an achievable amount of time.

This remaining amount of time is the remaining duration. It is the amount of time from Timenow

to accomplish the remaining work that produces the forecast finish date. This finish date

projection will not only have an effect on the in-progress task, but also on the successor path

tasks. Underestimating the remaining duration could project a more optimistic schedule on

downstream tasks and possibly understate the impact to milestones and the program end date.

Part 1 Conclusion

In Part 1, the document has introduced the beginner scheduler to the IMS and to basic scheduling concepts, terms, and definitions. In the process, Part 1 has also established the importance of having a sound schedule, one that is complete, accurate, and properly statused.

With this foundation established, the scheduler can begin to perform schedule assessments.

Part 2 – Basic Integrated Master Schedule (IMS) Assessments

Introduction / Purpose / Scope

This part of the document is written for intermediate level scheduling personnel desiring to

perform a schedule assessment on a contractor prepared IMS. This part contains key terms and

concepts and other discussion topics. A person with a sound understanding of scheduling

fundamentals may elect to skip Part 1 and begin with this part of the document.

Overview / Refresher

The IMS is a contractor-prepared schedule. It is often preceded by and aligned with program

level government-prepared schedules. The government prepared program schedule may be

called a Summary Master Schedule, Master Plan, Roadmap, Master Phasing Schedule or

Integrated Program Schedule. As the highest, least detailed schedule, this government-prepared

schedule highlights the contract period of performance, program milestones, and other

significant, measurable program events and phases. This schedule is normally first prepared in the pre-proposal phase.

The IMS is the focal point where all related artifacts are integrated through the planning process, establishing the program baseline and the executable schedule used to manage the program. The IMS is foundational for performance measurement and for forecasting the remaining effort toward achieving program goals.

Valid, useful, and effective schedules are an essential part of a successful program execution.

Ideally, schedules help the program team:

Describe the entire discrete program effort to the detailed task and milestone level.

Integrate internal, Contractor, and external organizational schedules.

Define achievable, objective, and measurable targets or goals for key deliverables and events.

Logically link dependent tasks and milestones to key schedule targets (control milestones) to achieve vertical schedule integration.

Logically link dependent tasks within and between control accounts to achieve horizontal schedule integration.

Assess and record progress that accurately depicts the program status.

Determine remaining effort with realistic forecast dates.

Understand the impacts of delays or changed priorities on remaining work.

Manage weekly and monthly operations and priorities.

Communicate status, forecast, and impacts to the program team and sponsors.

Control changes and manage the above via a controlled, repeatable process.

Enable fact-based decision making to effectively manage the program.

Assessing the schedule health helps ensure the IMS is supporting the above objectives and

determines if the IMS needs improvement. An IMS can easily be 10,000 lines or more.

Without assessment tools and techniques, it is easy for anomalies to occur and accumulate that

may impact the overall schedule accuracy and predictability.

Standards for IMS and IMS Assessments

Prior to objectively assessing an IMS, understanding the standards that are used to measure a

particular program IMS is important. The information below discusses the body of knowledge

that should be reviewed early in the assessment process.

Contract Requirements

Review the contract to determine the requirements for the IMS. IMS requirements often exist in

the Statement of Work (SOW) and may include granularity limits, definitions of custom fields

and unique codes used for givers and receivers. The Contract Data Requirements List (CDRL)

may also contain tailoring for the IMS Data Item Description (DID) DI-MGMT-81650.

The Data Item Description DI-MGMT-81650 is a good start for specifying an IMS. The IMS

requirements may be tailored in the SOW or through DID tailoring. Requirements defined in the

SOW under the program management section often have more visibility than a tailored DID.

Subcontractor IMS requirements should be identified in the tailored DID when subcontractors

plan to participate on the same SOW.

If there is a series of IMSs on the program, consider time-phasing the delivery of schedules for a

complete rollup of all contractor progress at the top tier. If multiple schedules are involved,

consider using external links across the schedules or giver-receiver fields so that the multiple

IMSs can be correlated.

The DoD Integrated Master Plan (IMP) and Integrated Master Schedule Preparation and Use Guide contains recommended SOW language for an IMS. The DoD Earned Value Management Implementation Guide also contains recommended tailoring for the IMS CDRL.

If an IMP is required by the contract, it will contain the essential Events and Accomplishments

that must be included in the IMS. The IMP is normally a contractual approval document and the

Program Events, Significant Accomplishments, and Accomplishment Criteria in the IMP are the

foundation for IMS development.

Contractor Requirements

IMS schedules, submitted as part of Earned Value Management (EVM) programs, are required to comply with the contractor’s Earned Value Management System (EVMS) Description document. Unique contract requirements that conflict with the EVMS Description’s processes must be documented in a Program Instruction or the equivalent and are typically approved by DCMA. Requirements may include the treatment of Level of Effort (LOE), resource loading, and the use of additional custom fields in the project file.

Note: A contract requirement that is less stringent than the Earned Value

Management System Description (EVMSD) should not result in non-compliance

with the EVMSD. For example, a contract that only requires submitting a statused

IMS once per quarter should not override the need to status the IMS monthly as

defined in the EVMSD.

Agency Standards / Guidelines

There are a number of government and civilian agencies that have published guidelines for

Integrated Master Schedules. Government Accountability Office (GAO), Navy Center for

Earned Value Management (CEVM), Defense Contract Management Agency (DCMA), Project

Management Institute (PMI), and others provide excellent insight into what constitutes a credible

schedule. The chart that follows shows a comparison of several standards or guidelines.

Generally Accepted Scheduling Principles (GASP)

The GASP are eight concise overarching tenets for building, maintaining, and using schedules as

effective management tools. The first five GASP tenets describe the requisite qualities of a valid

schedule; that is, one that provides complete, reasonable, and credible information based on

realistic logic, durations, and dates. The latter three GASP tenets reflect increased scheduling

maturity that yields an effective schedule. An effective schedule provides timely and reliable

data, helps align time-phased resources, and is built and maintained using controlled and

repeatable processes.

Comparison of Scheduling Standards and Guidelines

Generally Accepted Scheduling Principles (GASP): Valid (Complete, Traceable, Transparent, Statused, Predictive) and Effective (Usable, Resourced, Controlled).

ANSI/EIA-748-B Earned Value Management Systems (EVMS) Guidelines:

1 - WBS; 2 - OBS; 3 - Integration; 5 - RAM / Control Accounts; 6 - Vertical & Horizontal Integration; 6 - IMS Effective Planning, Statusing, & Forecasting for Achieving Requirements & Measuring Performance; 6 - IMS Provides Current Status for All Authorized Work; 6 - IMS Provides Forecast of Completion Dates for All Authorized Work; 7 - Interim Milestones; 8 - Baselined Performance Measurement Baseline (PMB); 9 - Control Account Level Budgets & Detail; 10 - Work Packages & Planning Packages; 10 - Control Account Plans (CAPs); 10 - Work Package Characteristics, Objective Techniques; 12 - LOE; 29-32 - Documented processes for tracing authorized changes & controlling retroactive changes.

U.S. Government Accountability Office (GAO) 10 Scheduling Best Practices:

1. Capturing all activities. 2. Sequencing all activities. 3. Assigning resources to all activities. 4. Establishing the duration of all activities. 5. Schedule is traceable horizontally and vertically. 6. Establishing the critical path. 7. Reasonable total float. 8. Conducting a schedule risk analysis. 9. Updating the schedule. 10. Creating a baseline schedule.

Defense Contract Management Agency (DCMA) IMS 14 Point Assessment:

1. Logic – Tasks without predecessors and / or successors (Goal < 5%). 2. Leads – (negative lag) Distorts the critical path (Goal 0%). 3. Lags – (positive lag) Excessive size and / or usage distorts the critical path (Goal < 5%). 4. Relationship Types – Finish-to-Start (FS) should be the most common (Goal > 90%). 5. Hard Constraints – Tasks not designated as As Soon As Possible (ASAP), Start No Earlier Than (SNET), or Finish No Earlier Than (FNET) (Goal < 5%). 6. High Float – Tasks with Total Float > 44 working days (Goal < 5%). 7. Negative Float – Tasks with Total Float < 0 working days (Goal 0%). 8. High Duration – Tasks with Duration > 44 working days (Goal < 5%). 9. Invalid Dates – Tasks with invalid forecast (incomplete tasks to the left of Time Now) or actual start/finish (completed tasks in the future) dates (Goal 0%). 10. Resources – If loaded, tasks without dollars or hours assigned (Goal 0%). 11. Missed Tasks – Tasks not completed or forecasted to complete as planned (Goal < 5%). 12. Critical Path Test – Tests the logic in the critical path (Goal “Pass”). 13. Critical Path Length Index (CPLI) – Measures the “realism” of the critical path (Goal > 0.95). 14. Baseline Execution Index (BEI) – Tasks completed as a ratio to those tasks that should have been completed to date according to the baseline plan (Goal > 0.95).

Naval Center for Earned Value Management (CEVM) Program Schedule Assessment (PSA):

1. Does the schedule reflect the work to be done? 2. Are critical target dates identified; are they being used to plan the work? 3. Is work sequenced logically? 4. Are interdependencies planned in a logical manner? 5. Are constraints, leads, and lags justified? 6. Are duration estimates meaningful? 7. Are resource estimates reasonable; are key resources available to support the plan? 8. Does the critical path make sense; does the scheduling software calculate it? 9. Are float times reasonable? 10. Does the schedule provide logical status and forecasts of completion dates for all authorized work? 11. Can the current program schedule be accomplished at an acceptable risk level?
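
The two index metrics in the DCMA column reduce to simple ratios. The sketch below follows the commonly published DCMA formulas; treat the formulas as assumptions to be confirmed against the current DCMA guidance.

    def bei(tasks_completed, tasks_baselined_to_complete):
        # Baseline Execution Index (goal > 0.95): execution pace vs. baseline.
        return tasks_completed / tasks_baselined_to_complete

    def cpli(critical_path_length, total_float):
        # Critical Path Length Index (goal > 0.95): values below 1.0 mean the
        # critical path must gain time to meet the baseline end date.
        return (critical_path_length + total_float) / critical_path_length

    print(round(bei(90, 100), 2))    # 0.90 -> behind the baseline plan
    print(round(cpli(200, -10), 2))  # 0.95 -> ten days must be recovered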

GASP Description

Achieving a GASP-compliant schedule indicates the schedule is a useful and practical tool for

effective program management. Thus, meeting all eight GASP tenets demonstrates that the

program team builds and maintains the schedule with rigor and discipline, so the IMS remains a

meaningful management tool from program start through completion. The GASP were originally

developed as a governance mechanism for the Program Planning and Scheduling Subcommittee

(PPSS). The PPSS is a subcommittee formed by the Industrial Committee on Program

Management (ICPM) working group under the auspices of the National Defense Industrial

Association (NDIA). The GASP were developed collaboratively with inputs from both

government and industry.

The GASP serve several purposes. First, they are high level tenets, or targets, used as guides for

sound scheduling. The GASP also serve as a validation tool for the program team or organization

to assess schedule maturity or schedule areas needing improvement. Last, the GASP can be used

collectively as a governance tool to assess new or different scheduling approaches with

objectivity and detachment.

It is essential to understand that the GASP are intentionally broad and should not limit program teams from continuous improvement and creativity when exploring tools and processes for building and maintaining robust schedules. New practices or techniques are encouraged, if and when they meet the GASP. There will be times when a given practice diminishes compliance with one principle in favor of another. This is expected and unavoidable; what is paramount is that the program team weighs the overall benefits against the risks and implements accordingly. The GASP provide an independent mechanism to determine the acceptability of proposed schedule practices. A description of the GASP follows:

GASP Description

The chart below shows the GASP criteria, characteristics of a schedule that meets GASP and

the artifacts observed in performing the Quick Look and IMS Comprehensive Assessments.

GASP Characteristics and Artifacts

Additional Schedule References

Listed below are a number of government and non-government published documents and references that provide schedule guidance.

General Scheduling Guidance

o ANSI / EIA 748B, Earned Value Management Systems, June 2007

General discussion of program schedules

Lists the requirement for all levels of schedules to be integrated

Discusses the timing of detailed planning vice planning packages

o PMI Practice Standard for Scheduling, 2007

Defines a schedule development process

Contains guidelines for granularity

Defines schedule maintenance process

Detailed list of schedule components

Comprehensive list of scheduling terms & definitions

o DMSC Scheduling Guide for Program Managers, Oct 2001

General primer on scheduling in DoD environment written for

Program Managers

Discusses schedule types, levels of schedules, & scheduling theory

Emphasizes tying risk to schedule

Discusses resource constrained scheduling & schedule crashing

Discusses production scheduling & Line of Balance

IMS Specific Scheduling Guidance

o Air Force Instruction 63-101, 20 July 2010

Establishes requirement for an IMS to be maintained by the

government PM & contain the contractor’s IMS

Requires government PM to perform recurring analysis of

contractor schedules

Discusses Systems Engineering detail required in IMS

o DCMA Integrated Master Schedule Assessment Guide, Rev 7, 11 Dec 2009

Provides 15 or more mechanical checks on the wellness of an IMS

Provides two pages of general guidance for IMS

Identifies key checks performed by DCMA during reviews

o Defense Acquisition Guidebook

Requires Acquisition Strategy to include highlights from IMS

Emphasizes that IMS should be event based

Requires configuration control of the IMS as part of technical

baseline configuration control

Requires integration of SEP & IMS

Requires WBS as predecessor to IMS development

Recommends cross reference of CTP & TPM in IMS

o Defense Acquisition Program Support Methodology, 9 Jan 2009

Stresses early industry involvement in program schedule

development

Emphasizes risk handling activities be included in IMS

Discusses schedule reserve or schedule margin

Contains IMS attribute list & IMS relationship to EVMS chart

Provides focus questions for IMS evaluation

o Depicting Schedule Margin in Integrated Master Schedules, May 2009

NDIA white paper on techniques to depict schedule margin in IMS

o DI-MGMT-81650, Integrated Master Schedule, 30 May 2005

The DID required to be placed on EVMS applicable contracts

Identifies requirement for performance measurement & SRA

o DI-MGMT-81861, Integrated Program Management Report, 21 June 2012

The DID required for EVMS applicable contracts after 01 July 2012

Identifies requirements for IMS and SRA

o DoD Integrated Master Plan and Integrated Master Schedule Preparation

and Use Guide, 21 Oct 2005

Provides guidance to government for IMS language in RFP

Recommends duplicating the PE, SA, & AC from the IMP into the

IMS

Contains generic steps for IMS preparation

Contains discussion of SRA

Contains procedures to evaluate IMS submitted with proposal

o DoD Earned Value Management Implementation Guide, October 2006

Contains tailoring guidance for IMS CDRL

Recommends contract have procedures & timeline for rolling wave

planning

Discusses OTS

o GAO Schedule Assessment Guide, May 2012

Ten principles for a quality schedule

Stresses the importance of WBS in IMS development.

Recommends WBS be outline for IMS

o Navy CEVM Analysis Toolkit, Aug 2008

Contains steps for IMS analysis

Contains NAVAIR 11 Point Program Schedule Assessment

Methodology

Contains NAVAIR SRA process

o NASA Schedule Management Handbook, January 2010

Recommends schedule management plan for each project

Contains best practices for scheduling

Lists project documentation needed to build an IMS

Contains detailed procedures for resource loaded schedules

Contains schedule health metrics

Contains Joint Confidence Level techniques

o NDIA PMS EVMS Intent Guide, June 2009

Shows attributes of IMS

Shows objective evidence of IMS complying with EVMS

guidelines

Shows the contribution of the IMS to multiple EVMS guidelines

o Planning and Scheduling Excellence Guide (PASEG), June 2012

Practical guidance for building and maintaining schedules

Contains section on schedule assessment and analysis

Includes section with terms and definitions

o Prime Contractor’s EVMS Description Documents

Addresses preparation of IMS to establish baseline

Addresses schedule maintenance

Addresses baseline change procedures

o Prime Contractor’s Scheduling Guidelines & Handbooks

Contractors often have separate scheduling document to standardize

their scheduling practices

Contractors may have program unique scheduling instructions

where the contract or unusual situations establish more rigorous

requirements than in the EVMS Description or contractor

scheduling practices

Key Terms and Concepts

This section lists key terms and concepts for the preparation and maintenance of IMSs and is

provided as a tutorial for the assessment team to facilitate understanding the basis for performing

schedule assessments.

The IMS Assessment Process uses these terms synonymously:

IMS and Schedule.

Program and Project.

Total Float and Total Slack.

Free Float and Free Slack.

Earned Value Method and Earned Value Technique (with respect to work packages).

Planners and Schedulers.

Supplier and Contractor.

Terminology may vary depending on which scheduling tool is used. Microsoft Project and Open

Plan Professional use the following terms somewhat synonymously.

Microsoft Project / Open Plan Professional

Task / Activity

Summary Tasks / Subprojects

Constraint/Deadline Date / Target Date

Constraint Types (Start/Finish No Earlier Than, Start/Finish No Later Than, As Late As Possible, As Soon As Possible, Must Start/Finish On) / Target Types (Start/Finish Not Earlier Than, Start/Finish Not Later Than, On Target, Fixed Target)

Government Oversight Schedules and Supplier Schedules

The Government Program Management Office (PMO) has the challenging role of providing

oversight and management of multiple contractors and major subcontractors with multiple IMS

files that are integrated using a combination of electronic and manual interfaces or integration

points or techniques. Program schedules may be made up of one or more IMS files. While the

supplier’s PMO typically produces and maintains most schedule products provided to the

government, it is the whole program team, including government and supplier members, who

must claim ownership for schedule content and validity.

The government PMO is charged with managing the entire program for the Program Executive

Officer (PEO). Each program is typically comprised of one or more contracts. Each

development/production contract, in turn, has a single IMS that pertains to that contractor’s

contractual SOW. Each contract typically has one or more major subcontractors that may also

have their own IMS. Each program must also have a total program schedule that includes all

government responsible scope and interfaces. This content may be in a separate government

schedule or part of the contractor’s IMS.

IMS Hierarchy (WBS Levels, Summary Tasks, and Indentures)

The IMS typically starts with a total program task and summary tasks that may follow a number

of patterns. Some IMSs use the Program Events from the IMP as an initial indenture level with

supporting tasks aligned beneath. An indenture level refers to a particular level within a

hierarchical structure, in this case levels of an IMS, providing a meaningful arrangement of tasks

and data. Other programs may use the Work Breakdown Structure (WBS) as the initial indenture

level. IMSs may contain a summary section for milestones followed by an indentured level of

discrete tasks. Below each summary task level is a further breakdown until reaching the detailed

task level. At the detailed level these are simply called tasks or detailed tasks and can also be

milestones (tasks with zero duration).

IMS indentures, or levels, vary based on the program size, duration, and complexity. There is no

hard rule that requires planning to one level (e.g., level 3) for the entire program. Instead, the

program team needs to evaluate each WBS leg and determine the appropriate indenture that

provides manageable Control Account and Work Package sizes (budgets) and durations.

Major Subcontracts or supplier efforts typically fall under one WBS element, though it is

possible that a supplier provides effort across multiple WBS elements.

Ultimately, the IMS hierarchy should help the program team to plan the work in detail to a

manageable level such that each control account can be assigned to a single person (the CAM)

and the work can be statused and forecast per the routine business rhythm (e.g., weekly, bi-

weekly, or monthly updates) with consideration for collecting actual costs for control accounts

(minimally) or work packages (ideally). This provides effective management visibility, critical

path identification, analysis, and control.

This structured approach affords the program management team greater visibility and capability

to plan necessary resources adequately and to ensure adequate budget is available to accomplish

the work as planned.

Integrated Master Plan (IMP)

The IMP is a hierarchical, event-based program structure breakdown that defines the program’s

entire scope of work and approach to execute the work by Events, Accomplishments, and

Criteria through successive levels of supporting detail. This structure takes the form of high-level

interim Events to assess program progress, the more specific Accomplishments that define the

objectives to meet those progress points and the much more defined Criteria that are measurable

indicators providing tangible evidence of meeting the objectives.

Typically, there is more than one Accomplishment per Event and more than one Criterion per

Accomplishment. Event, Accomplishment, and Criteria naming conventions follow the practice of ending the name with a past-tense verb. For example, the Criterion for performing test data analysis might be Static Test Data Analysis Completed.

While the IMP and IMS are inherently related, they are two separate products. The IMP answers the question of “what,” and the IMS answers “when” by time-phasing the scope.

IMS Supplemental Information

Any submittal of the IMS to the government should be accompanied by supplemental information. This information is normally provided in a narrative that accompanies the native

scheduling software data files. This information should include schedule status date, dates of

related artifacts such as IMP and risk database, and reference to any program unique scheduling

procedures. The narrative should also include explanations of schedule content such as LOE,

custom field use, deadline use, schedule margin, and external milestones.

Program Milestones or Control Milestones

The SOW or other contractual documents identify the customer required major events and

program start and finish dates.

Typically, an IMS includes a program start milestone and a program finish milestone. The start

milestone is used as a predecessor for work that begins at the start of a program. The finish

milestone is the successor for the ends of all logic paths. Milestones for major events are usually

included. These are Systems Requirements Review (SRR), Preliminary Design Review (PDR),

Critical Design Review (CDR), and Production Readiness Review (PRR). Other major events

such as important tests and demonstrations, achieving flight readiness, or first flight, in the case

of an aircraft program, may also be incorporated. When milestones are included in a section at

the top of the IMS, it improves readability.

Tip: Derive program milestones from the IMP Events and Significant

Accomplishments. These are typically significant start, interim, or end points for

major phases or efforts. Before baselining the IMS, synchronize the IMP and the

IMS program milestones.

Control Account or IPT Start, Finish, and Interim Milestones

Complex programs typically have many control accounts, IPTs, or other major sections that

contain internal predecessor-successor logic; that is, links among tasks within the control

account or IPT. Ideally, each control account has a defined start and finish as well as interim

milestones that establish natural start or break points within the work. Using these kinds of

milestones is useful for validating internal logic and for creating links among control accounts or

IPTs.

Planning Packages

Planning packages are groups of tasks that cannot be defined in detail at the current time and must be turned into detail-planned work packages before work can begin. Planning packages

have scheduled start and completion dates, determined by precedence logic. They also have

associated resource cost budgets and forecasts to support the scope they represent, providing that

information in the EVM System to maintain the Performance Measurement Baseline (PMB) and

forecast. Because planning packages lack sufficient detail for measured progress, EV methods or

techniques are not assigned until the detailed planning is conducted.

CAMs should be able to explain the content of any planning packages in their control accounts.

Planning packages should have durations of six months or less, if practical. Shorter periods align

with detailed rolling wave exercises where planning packages are efficiently converted to

detailed work packages at one time. Shorter planning package durations facilitate enhanced

critical path validity. Some programs conduct detailed planning at least once per year. Other

programs require planning packages be converted one or more accounting periods before work

can begin. Considering the stability of the program, it is advisable to require conversion of planning packages as soon as they can be planned with reasonable confidence.

Task Types

Microsoft Project scheduling software is widely used in the Air Force. For this reason the

discussion regarding task types will address those found in the Microsoft tool. There are three

task types; fixed-units, fixed-work, and fixed-duration.

Fixed-units: think resources - the assigned units or resources are a fixed value, and changes to a task’s duration or amount of work do not affect the amount of units or resources.

Fixed-work: think hours of work - the amount of work (usually hours) for a task are a

fixed value and changes to a task’s duration or amount of units (resources) do not affect

the amount of work.

Fixed-duration: think the span of work-time from start to finish - the task’s duration is

a fixed value and changes to a task’s work or assigned units or resources do not affect

the task’s duration.

The Fixed-duration task type provides the most stable condition for establishing the task duration. When establishing the schedule, review and validate the assigned resources required to accomplish the work in the stated amount of time. When changes are made to the schedule, the result depends on the task type. The figure below shows the result of changes by task type.

Task Type Changes

Each calculation uses Duration multiplied by Units equals Work (Duration x Units = Work).
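
A hedged arithmetic sketch of that identity, assuming an eight-hour working day; it shows the simple algebra only, not every recalculation rule in the scheduling tool.

    HOURS_PER_DAY = 8  # a typical working day

    def work_hours(duration_days, units):
        # Duration x Units = Work, with units as a decimal
        # (1.0 = one full-time resource).
        return duration_days * HOURS_PER_DAY * units

    def duration_days(work, units):
        # The same identity solved for duration.
        return work / (units * HOURS_PER_DAY)

    # Fixed-duration: duration is held, so doubling units doubles the work.
    print(work_hours(10, 1.0), work_hours(10, 2.0))        # 80.0 160.0

    # Fixed-work: work is held, so doubling units halves the duration.
    print(duration_days(80, 1.0), duration_days(80, 2.0))  # 10.0 5.0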

Task Names (Nomenclature)

Consistent and clear task naming conventions increase the usability and effectiveness of an IMS.

For example, performing name searches for similar tasks in multiple parts of the schedule is

easier when the naming structure is well-defined and constant. When naming tasks, it is

important to define the task (scope) and its output (deliverable). Write descriptive task names so the user understands the content without relying on its Summary Task structure for clarification. Task names are most effective when they begin with a present-tense action verb and describe the scope in a manner that assists in determining completion. An example is “Complete Flight Survivability Test Analysis.”

Task Durations

Task durations reflect how much time it takes to perform the work under normal conditions.

Express this most-likely task duration in working days, as the time unit appropriate for most

programs. Shorter task durations are more manageable than longer task durations, complement

precedence logic development, and facilitate progress measurement. A task duration goal of less

than two months is desirable, normally 44 working days or fewer. Refrain from arbitrarily

breaking tasks into shorter duration increments if the natural task duration exceeds this desired

length. Well-defined tasks provide greater critical path assurance. In addition, planning package

durations should be less than six months if practical. LOE task durations typically align with an

associated period for providing the related effort, but should not encompass multiple Fiscal Years, which complicates schedule administration.

Note: Use consistent time units for all task durations in the same schedule. Avoid

mixing days, weeks, and months, as this makes analysis difficult.

Logical Relationships Between Tasks

The logic network consists of the tasks logically tied to one another with precedence relationships. A task’s predecessor relationship means the task depends on that predecessor to determine its start or finish date. A task’s successor relationship means the task determines the start or finish of that dependent successor. Collectively, all tasks connected by predecessor and successor relationships make up the logic network.

It is important to identify technical and meaningful dependence among tasks when developing the network logic. Assigning realistic and meaningful network logic builds the schedule correctly, because the dependence on other tasks is not arbitrary and it establishes relationships that horizontally integrate the likely sequence of work. This approach schedules the tasks for the correct timing based on the predecessor tasks and their durations and enables more accurate forecasting, as well as more efficient task execution.

Stated differently, tasks need logical relationships so the schedule as a whole moves based on

status. Modeling the schedule with appropriate logical relationships and ensuring all tasks have

at least one predecessor and one successor supports the schedule as a predictive tool. Any

discrete activity in the network can have an impact on the total program.
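
Because every task except the first and last should have at least one predecessor and one successor, missing logic is easy to detect mechanically. A minimal sketch with hypothetical task IDs and links:

    # (predecessor, successor) pairs; task 6 has no logic ties at all.
    links = [(1, 2), (2, 3), (2, 4), (3, 5), (4, 5)]
    tasks = {1, 2, 3, 4, 5, 6}
    start, finish = 1, 5

    has_pred = {s for _, s in links}
    has_succ = {p for p, _ in links}
    print("No predecessor:", sorted(tasks - has_pred - {start}))  # [6]
    print("No successor:", sorted(tasks - has_succ - {finish}))   # [6]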

Identify all required relationships, but avoid redundant ties. These are unnecessary and add

confusion to understanding and analyzing the logic network. See the exhibit illustrating a

redundant logic tie condition below.

Example of a redundant logic tie

Caution: Avoid assigning “any” convenient predecessor or successor to a task just

to ensure it has one. For example, avoid linking tasks to the program start or finish

milestones unless those are truly the most appropriate relationships. Avoid linking

tasks merely because they happen to place tasks in some preconceived, ideal time

period. These practices might seem to help schedule health in the short term, but

ultimately increase the effort required for logic maintenance and resolving false

schedule impact drivers. The problem with this approach is such tasks usually

reflect excessive amounts of positive total float and become suspects of not having a

proper predecessor or successor identified. The concern is that these tasks are

missing a logic tie that might impact the critical path. If the more appropriate logic

tie does not exist and there truly is no impact to other tasks, consider inserting a

control milestone to identify the related effort completion and assign a deadline to

control the amount of total float calculated. Thus, assigning valid relationships

enables better schedule management and analysis—and helps create meaningful

total float values.

Identify the technical outputs needed from other tasks (predecessors) to perform the effort and

the tasks that need its technical outputs (successors) in order for those efforts to execute. Identify

and assign at least one predecessor and one successor to all tasks in the schedule, except for the

first and last task or milestone.

Some schedules depict feed-in points from external sources as milestones for Government

Furnished Equipment (GFE) or material delivered to the program. These are commonly referred

to as ‘feed-in’ milestones. Planners may choose not to have a predecessor for these conditions,

but certainly have successors identified for the tasks that require the delivered material.

Likewise, some schedules may generate a deliverable to an outside source, such as a technical

documents package that a supplier requires to perform their related efforts. In this case, the

schedule may not explicitly reflect the suppliers’ efforts and treat them as an external hand-off

without a successor task. These are commonly referred to as “feed-out” milestones. These hand-off

milestones should still have predecessors identified whose efforts generate the product and

determine the timing for the deliverable.

Acknowledge and explain such special conditions and the related techniques in the IMS Supplemental Guidance. Explain the feed-in and hand-off methodology, including identifying the tasks, using constraint dates for anticipated delivery dates and need dates, and the considerations for total float.

Assigning logic to tasks may require taking into consideration a preference for the order of

execution. A task may not have a technical dependency to another task, but perhaps the same

person (or resource) is required to perform both tasks and both have the same predecessor.

Instead of scheduling both tasks in parallel with the knowledge that the same person cannot

execute both parallel scopes simultaneously, make one task the predecessor to the other, so one

occurs prior to the other.

Caution: Avoid assigning too many predecessor or successor relationships to a

single task or milestone. Having more than ten relationships makes analysis

difficult. For example, employ toll-gate milestones as a method of collecting the

common predecessor ends into a single toll-gate milestone whose name describes

the related tasks. Make successor ties to the toll-gate milestone, thereby reducing

the number of individual logic path relationships.

Logical relationships

Logic relationships define how predecessors and successors interface. The predecessor task must

start and / or finish before the successor task according to the following conditions. There are

four possible types:

Finish-to-Start (FS) – The predecessor task must finish before the successor task can

start.

Start-to-Start (SS) – The predecessor task must start before the successor task can start.

Finish-to-Finish (FF) – The predecessor task must finish before the successor task can

finish.

Start-to-Finish (SF) – The predecessor task must start before the successor task can

finish (rare condition).

Always read the relationship from left to right to avoid confusion, from predecessor to successor

task.
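
The four types reduce to a rule about which predecessor date constrains which successor date. A hedged sketch, using plain integer working-day numbers instead of calendar dates for clarity:

    def successor_constraint(rel, pred_start, pred_finish, lag=0):
        # Returns which successor date is constrained and its earliest
        # allowed value, given an optional lag in working days.
        if rel == "FS":
            return ("start no earlier than", pred_finish + lag)
        if rel == "SS":
            return ("start no earlier than", pred_start + lag)
        if rel == "FF":
            return ("finish no earlier than", pred_finish + lag)
        if rel == "SF":
            return ("finish no earlier than", pred_start + lag)
        raise ValueError(f"unknown relationship type: {rel}")

    # Predecessor spans working days 10 through 14.
    print(successor_constraint("FS", 10, 14))         # start no earlier than 14
    print(successor_constraint("SS", 10, 14, lag=3))  # start no earlier than 13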

Caution: Tasks with start-to-start successor relationships should also have a finish

relationship to another task, so that there is a consequence to the task’s completion.

Tasks in Microsoft Project that have an actual start, but do not have an actual finish, and have FF

predecessors, do not automatically adjust their finish dates by honoring their FF predecessors’

finish dates. This is unique to this tool. Thus, it is possible for a task to have an earlier finish date

than its FF predecessor’s finish date (not honoring the relationship). MS Project tries to alert the

user to this condition by reflecting a negative total slack value equal to the amount of duration

overlap. The user must decide if the relationship is still valid. If so, the task’s remaining duration

must be revised to reflect a finish date that is equal to or later than its FF predecessor’s finish

date. If the relationship is not valid, the FF predecessor must be modified and / or replaced by a more appropriate relationship, such as FS or SS.

Finish to Finish Relationship

MS Project screen shot correctly depicting an FF relationship: the second in-progress task does not finish earlier than its FF predecessor’s finish.

Incorrect Finish to Finish Relationship

MS Project exhibit depicting an FF relationship where the second in-progress task incorrectly finishes earlier than its FF predecessor’s finish. Note the negative three days of total slack (aka total float), indicating the amount of duration that is incongruent with the relationship. This affects the third task and milestone, as they are incorrectly scheduled three working days earlier.

Caution: Check in-progress tasks with finish-to-finish predecessor relationships for

correct finish dates that honor their predecessor’s finish date. Also logically limit

the use of FF unless absolutely necessary.

Logical Relationships Between Summary Tasks

Caution: Software programs that use Summary Tasks (such as Microsoft Project)

permit assigning schedule logic relationships at the Summary Task level. Avoid

assigning logic relationships at the Summary Task level as this may have unintended

consequences on its subtasks and their logic relationships to tasks outside the

Summary. It is also difficult to analyze the schedule when sorting detail tasks if the

logic exists at the Summary Task level. Assign predecessors and successors at only

the detail task level.

Leads and Lags

A lag is a modification to a logical relationship that directs a delay in the successor activity. For example, with a finish-to-start relationship and a positive three-working-day lag, the successor starts three working days after the end of the first task. Lags can be used to model wait time, not to represent contracted work effort. For example, a 30-day government review of a contractor document may be modeled with lag. However, never use a lead or lag to plan when a task starts or ends. Lags should have a rationale for their use, documented in the task notes.

Consider using a documented “no earlier than” constraint to model resource

availability instead of applying lag for this purpose.
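As a rough illustration of the lag arithmetic described above, the following sketch uses numpy's business-day offset as a stand-in for the schedule tool's working-day calendar (holidays are ignored in this simplification, and the dates are invented):

```python
import numpy as np

# FS relationship with a positive 3 working day lag: the successor starts
# three working days after the predecessor finishes, skipping weekends.
pred_finish = np.datetime64("2012-09-21")  # a Friday
successor_start = np.busday_offset(pred_finish, 3, roll="forward")
print(successor_start)  # 2012-09-26, the following Wednesday
```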

A lead is a logical relationship modification that allows acceleration in the successor activity, modeling the successor's start and possibly finish to occur earlier than the predecessor's dates.

Leads should be avoided, as negative time is not demonstrable. Lags should not be used to

manipulate float / slack or constrain the schedule.

Consider decomposing a task into more than one task to enable properly modeling

finish-to-start logic relationships to avoid using a lead.

Constraints

Constraints are specific controls applied to tasks that help establish dates or monitor when

conditions are such that achieving objectives might be in jeopardy. A constraint potentially

overrides the calculated start or finish of a task with a date. There are two conditions that allow

only the logic to determine the date: As Soon as Possible and As Late as Possible.

The “No Earlier Than” constraints prohibit tasks from starting or finishing prior to a specific

date. The network logic can cause these tasks to schedule later, but not earlier than, the

constraint date. Modeling the anticipated receipt of material or components, receiving the

authorization or approval to proceed, or acknowledging the earliest availability of a limited

resource are examples of using this type of constraint.

The “No Later Than” constraints can have one of two possible effects, depending on the

schedule tool settings. They may either prohibit the start or finish date from reflecting the impact

of the network logic that would schedule the task later than its constraint date or allow the later

date, but establish the need to achieve the date. Regardless of the settings, the “No Later Than”

constraints calculate the total slack correctly and provide the related visibility into achieving the

need date. A negative total slack condition reflects the working days predecessor tasks must gain

so that the start and finish dates support the milestone constraint date.

Minimize use of all constraints in favor of establishing dates by the free flowing logic network.

Use documented “No Earlier Than” constraints to refine near future period dates that reflect

resource availability conditions. These constraints should be in addition to the logic relationships

and establish feed-in point dates for known deliverables not readily identified by logic

relationships. Apply “No Later Than” constraints and deadlines sparingly, and preferably with

the tool settings configured to allow predecessor impacts to determine the achievable dates.

They should reflect known need dates that differ from what the downstream logic path naturally determines, and be used to closely monitor portions of the schedule that are prone to schedule slip, as a method of managing total float. Identify and apply these constraints to milestones, as opposed to tasks, to clearly indicate the intent of their use.

As an example, the following is a list of constraints used in MSP:

As Soon As Possible (ASAP) – This constraint schedules the task to begin as early as possible.

This is the default condition for tasks when scheduling from the project start date. Do not enter a

start or finish date with this constraint.

As Late As Possible (ALAP) – This constraint schedules the task as late as possible according

to its successor path tasks. This is the default constraint for tasks when scheduling from the

project finish date. Do not enter a start or finish date with this constraint. Generally, minimize

the use of ALAP in that it may be difficult to manage and may have unintended consequences

that are difficult to detect.

Start No Earlier Than (SNET) – This constraint schedules the task to start on or after a

specified date. Use this constraint to ensure that a task does not start before a specified date. It is

sometimes used for representing a physical resource constraint.

Finish No Earlier Than (FNET) – This constraint schedules the task to finish on or after a

specified date. Use this constraint to ensure that a task does not finish before a certain date.

Start No Later Than (SNLT) – This constraint schedules the task to start on or before a

specified date. Use this constraint to ensure that a task does not start after a specified date.

Finish No Later Than (FNLT) – This constraint schedules the task to finish on or before a

specified date. Use this constraint to ensure that a task does not finish after a certain date.

Must Start On (MSO) – This constraint schedules the task to start on a specified date. This sets

the early, scheduled, and late start dates to the date entered and anchors the task in the schedule.

Must Finish On (MFO) – This constraint schedules the task to finish on a specified date. This

sets the early, scheduled, and late finish dates to the date entered and anchors the task in the

schedule.

Note: Reference specific tool settings to understand each constraint's treatment under different configurations and the related task behavior that causes SNLT, FNLT, MSO, and MFO to be considered "hard" constraints. Regardless of settings, it is recommended not to use MSO and MFO to reflect need dates.
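As a rough, tool-agnostic illustration of the distinction drawn above, a "No Earlier Than" constraint acts as a floor on the logic-driven date, while an appropriately configured "No Later Than" constraint or deadline leaves the forecast date to the logic and surfaces any breach through total slack. The function names and dates below are invented for the sketch:

```python
from datetime import date

def apply_snet(logic_start, constraint_date):
    """Start No Earlier Than: logic may push the start later, never earlier."""
    return max(logic_start, constraint_date)

def slack_vs_need_date(forecast_finish, need_date):
    """Slack against a need date, in calendar days for simplicity.
    Negative slack = the days predecessor tasks must gain."""
    return (need_date - forecast_finish).days

print(apply_snet(date(2012, 9, 3), date(2012, 9, 10)))   # 2012-09-10: constraint governs
print(apply_snet(date(2012, 9, 17), date(2012, 9, 10)))  # 2012-09-17: logic governs
print(slack_vs_need_date(date(2012, 9, 28), date(2012, 9, 21)))  # -7: need date breached
```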

Deadlines

MS Project offers a Deadline feature that behaves similarly to the “No Later Than” constraints

without restricting the effects of the network logic determining the task’s start or finish date.

There are benefits to using deadline dates rather than constraints. The use of deadlines allows a schedule movement to the right to be reflected in the downstream successor path tasks.


The MFO constraint does not permit the task or milestone to move and subsequently does not reflect the impact to successor path tasks (if the option 'Tasks will always honor their constraint dates' is checked under the Tools, Options, Schedule tab). The restrictive characteristics of MFO and other hard constraints obscure visibility into the predictive capability of the logic network.

Alternatively, deadlines allow tasks or milestones to reflect the impact from predecessors, thus

reflecting the correct dates and also permitting the downstream successor path impacts to

forecast dates. The Total Slack is calculated correctly using either method for the focus task, but

using deadlines permits accurate successor path forecasting and therefore does not prohibit the

correct total slack calculation for those tasks.

Calendar(s)

Most tasks use the default program calendar to determine their start and finish dates. A program

calendar is typically based on an eight hour workday, Monday through Friday, 40 hour

workweek, and recognizes company holidays as non-work days. The calendar establishes those

available start and finish dates, based on the tasks’ timing. Apply a separate unique calendar of

workdays / hours representing those conditions to tasks requiring workdays and hours that are

different from the default standard calendar. This more accurately reflects how related tasks are

scheduled. For example, customized calendars could reflect working Saturdays for the next two

months or working two ten-hour shifts, seven days per week for a period. Be advised that

applying different calendars to tasks in the same schedule may result in some date / duration

calculations providing mixed values and making analysis more complex. For example, tasks with

different calendars that are on the same logical path to a program milestone could have different

total float values. Tasks using a five workdays per week calendar might have negative five days

of total float and tasks using a seven workdays per week calendar might have seven days of

negative total float. Both sets of tasks have one week of negative total float when measured

against their respective calendars.

Tip: Minimize the use of unique task and resource calendars and be aware of their

impact on the critical path. Use the indicators field to highlight the use of other than

the project calendar.
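A small worked check of the mixed-calendar float example above, assuming numpy's business-day counting stands in for the schedule tool's calendars (the dates are invented; the point is the same week of slip measured on two calendars):

```python
import numpy as np

need_date = np.datetime64("2012-09-21")
forecast  = np.datetime64("2012-09-28")  # one calendar week late

# Total float measured on a 5-workday (Mon-Fri) and a 7-workday calendar.
tf_5day = np.busday_count(forecast, need_date, weekmask="1111100")
tf_7day = np.busday_count(forecast, need_date, weekmask="1111111")
print(tf_5day, tf_7day)  # -5 -7: the same slip, counted on each calendar
```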

Critical Path and Driving Path(s)

The critical path is the longest path of related incomplete tasks in the logic network from

Timenow whose total duration determines the earliest program completion. Similarly, the

longest path of related incomplete tasks in the logic network from Timenow whose total duration

determines the earliest interim milestone completion is called the driving path. Defining the

driving path to relate to interim milestones and critical path to relate to program completion

differentiates these two terms. The critical path generally reflects the least amount of total float,

but this may not be the case in all situations, when constraints are applied to tasks and milestones

to reflect need dates. Any delay of tasks on the critical path results in an equal delay to program

completion or similarly with the driving path to interim milestone completion. Conversely, any

critical path shortening can result in an earlier program completion, earlier interim milestone

completion and sometimes results in a new set of related tasks determining the critical path.

Constraints configured to permit and reflect precedence impacts to dates, while still accurately calculating total float values, such as appropriately configured "No Later Than" constraints or deadlines, should be applied to endpoint milestones.

Warning: If LOE is included in the IMS, never link these tasks to discrete work. LOE

should never drive the critical path.

Crashing the Critical Path

After identifying all tasks and establishing logical relationships, observe the effects to the

program completion and program milestone dates. Often the initial schedule development

includes timing scenarios that do not support the target goals of the program. Find the milestone

dates and note any negative total float, caused by logic paths whose total duration results in a

calculated finish date that exceeds the desired end date.

Crashing the critical path involves iterative passes of driving path reviews to assess the tasks and

relationships that comprise these long-duration sequences of tasks. Analyze task durations and

logical relationships for possible optimization. These reviews require inquiry into what trade-offs

are possible and acceptable with respect to risks to reduce the total time required to achieve the

end date.

Questions to be asked are:

Can certain tasks be performed in less time?

Can certain tasks be performed in parallel instead of the existing serial logic?

If the task were split into shorter duration tasks, would that provide the opportunity to

begin a subsequent task earlier, where its dependence is on the “up front” steps?

Can task durations be shortened if more resources are applied to the task? Would

productivity increase through acknowledged longer working days or additional work

periods?

As one path is reviewed and adjusted, another path of logically tied tasks may become the next

critical path that is not as long in duration as the previous path of tasks, but is too long to achieve

desired program goals. The same review process continues until all logical paths to the program

milestone are resolved for acceptable and executable conditions. It is through this type of

critique that task durations and logical relationships are tailored to reduce total duration and

achieve the program goals within acceptable amounts of risk.

If assigning resources after critical path development, the optimized schedule must be reviewed

in light of the resource allocation required to execute the schedule. The schedule may indicate

peak amounts of resources required at different time intervals of the program. Where demand does not exceed available resource quantities, some resource smoothing may take place to avoid the peaks and valleys of the resource load. This requires rescheduling tasks in the network to take advantage of total float to level the amount of resources needed during the period. If resources are scarce or the resource type with the required skill-set is in short supply, a more rigorous

review of task durations and precedence logic may be required to resolve resource allocation

issues. If the executable schedule envisioned cannot be achieved after resource allocation

smoothing, it is advisable that the supplier and customer negotiate a new achievable milestone

end date.


Resource Loading

There are different approaches to developing a resource loaded IMS. Some schedulers prefer

loading resources along with the other task information during schedule development. Others

prefer to develop the schedule, including all related information except for resources, to

concentrate on resolving schedule conflicts without the burden of the resources affecting any of

the adjustments. Whichever approach is taken, resource loading the IMS is an excellent method

to determine cost, observe the settings for assigning resources to tasks and the effect this action

has on task duration. After applying resources, review the critical path to determine any impact

of resource loading and resource leveling.

Assigning and maintaining resources in the IMS provides great visibility during program

execution. The dynamics encountered during execution require logic and task duration changes

to reflect the manner the program is unfolding. These changes could reschedule tasks such that

the resources required to perform the work are greater than available staffing. The lack of

resource loading in the IMS restricts the program’s visibility regarding over-allocated resource

conditions. Analyzing the schedule with resource allocation visibility enables the program to

make decisions on rescheduling tasks with the insight of available resources and the preferred

order of task execution.

Caution: Software programs that use Summary Tasks, such as Microsoft Project,

permit assigning resources at the Summary Task level. Avoid assigning resources at

Summary Task level, as this may have unintended consequences on the subtasks and

the ability to apply resource loading to specific task requirements. Assign resources

at the detail task level.

Another benefit to having a resource loaded IMS is that the costs developed at the work package / planning package level establish integration with the separate cost system, provide a method to

validate that the entire scope is planned, and provide a weighted value for each task.

The resource loaded IMS, along with the SOW and WBS, validates that the IMS contains all

program scope and is useful for reconciling program budget changes when performing baseline

change activity.

Not all tasks are equal, and a resource loaded IMS quantifies task magnitude by identifying the

number of resources involved or the amount of budget assigned to each task. This is especially

useful when analyzing the schedule and making management decisions. Without the benefit of

identifying resources at the task level, it is difficult to ascertain priorities and the importance of

one task over another or to assess the impact of management decisions. Quantifiable magnitude

is especially helpful when analyzing the schedule and making decisions regarding critical path /

near-critical path tasks, or tiebreaker decisions on task execution.

Resource-loading can be by name, functional category, or skill-set. The approach most likely

varies by program scope and complexity. A mixture is typical: by name for the key resources

and by category for the majority of the program resources.


Baselines

When the basic task information is established and the program has confidence the tasks can

execute within the constraints of capabilities, budget, and available resources to achieve program

targets and goals, the program is ready to baseline the tasks in the IMS. Setting the baseline for

tasks establishes a benchmark of start and finish dates, task durations, and estimated costs used

to measure periodic actual performance and revised projections on remaining work. Set the

baseline on all tasks in the IMS prior to entering the execution and maintenance / status phase.

Set the baseline for new tasks when adding them to the IMS. Establishing the baseline places the

IMS under configuration control and all subsequent task changes now follow the program’s

formal baseline change process.

Establish the baseline at the start of the program as equal to the forecast. The baseline is the

executable plan against which performance is measured. In practical terms, the baseline uses the IMS’s

early dates and is controlled throughout program execution through the baseline change

management process.

Building an IMS

The following two sections offer guidance on building and maintaining an IMS. They offer

insight into the methods and procedures that contractors use in preparing IMS deliverables.

Building the IMS is a collaborative effort involving the entire program management team,

utilizing all pertinent technical, schedule, and resource information in constructing a tool with

the predictive capabilities necessary to manage the program effectively. This instruction is

intended for the person acting as the program planner and provides knowledge, suggestions, and

tips to use in building the IMS as a way to understand the features and characteristics of an IMS.

In reality, constructing the IMS is dependent upon the information provided by the program

management team, especially the CAMs. The CAMs are the task owners responsible for

planning, executing, and managing their efforts and performance. Avoid the IMS becoming the

planner’s schedule by involving the team throughout the process in developing, setting the

baseline, and managing the schedule.

The following are the major steps in constructing an IMS. The order of steps varies slightly

based on contractor procedures.

IMS Structure

Determine the program structure best suited for managing the effort, single project or multiple

integrated subprojects. Single project structure is easy to manage and requires less integration

effort for programs but limits the schedule management to a single user. Large programs with

separately managed entities may prefer the flexibility of managing these components

simultaneously. This requires more discipline and coordination to manage the integration and

configuration control of schedule techniques and file management.

Utilize the IMP

Enter the IMP Events, Accomplishments, and Criteria structure into the IMS, by employing user

defined codes, populated with IMP numbers or other schedule tool structure features. This


provides assurance of IMP / IMS alignment and the IMP numbering system provides a

convenient task cross reference with the IMP.

Establish Calendars

Define the project calendar. Organizations typically follow a standard workday calendar that

identifies non-workday holidays and accounting month-end periods. Define the default workdays

in the calendar. Determine if any unique task calendars or resource calendars are needed and

define them at this time if possible.

Identify Tasks

The planners, working with the CAMs, define the tasks in sufficient detail to accomplish the

program objectives using the source information described above. The tasks have the duration and sequence required to perform the effort, calculated against a workday calendar to produce a

schedule. The tasks align with program milestones to determine if the program’s goals are

achievable. The IMS is a predictive forecaster of future efforts and budgets required to achieve

success and a performance measurement tool for completed work effort.

Identify Task Constraints and Relationships

Program IPTs and CAMs identify the relationships among their tasks and milestones, external

activities, and other IPT tasks. Constraints for tasks are identified and documented.

Earned Value Methods or Earned Value Techniques (EVTs)

Maintain EVTs at the work package level, as required. It is recommended to maintain EVTs in

the IMS for best usability.

Numbering System

Provide cross reference numbering for the IMS tasks. Cross references to the IMP, SOW, WBS,

Control Account, program risks, deliverables, and baseline changes are some of the cross

reference numbers that should exist in the IMS.

Resources

If the IMS is a resource loaded schedule, add resources at the work package level and if possible

at the IMS discrete task level. Allocate resources to match availability. Following revision of task structures, redo the analysis performed in the previous step. If resources are not included

in the IMS, then a detailed and frequent reconciliation to the baseline and forecast of resources to

IMS timelines is required to ensure alignment.

Initial Analysis

Conduct a critical path analysis and a Schedule Risk Assessment (SRA) on the completed IMS.

Document IMS

Prepare the IMS Supplemental Guidance or similar named document, such as IMS Basis and

Assumptions document, to provide a comprehensive understanding of the IMS construct and

intended use. The IMS Supplemental Guidance provides the user with instructions on how to use

the IMS, explanations on special techniques applied to its content, and defined methods for

determining program critical path or driving path to interim program milestones.


Maintaining an IMS

Maintenance is performed on the IMS to add new work, remove unnecessary work, modify

existing work, or portray the effects of executing the work through historical recording and

forecasting remaining work. This maintenance is accomplished through baseline changes and

status updates to the schedule.

Updating Forecasts

The IMS contains baseline dates for each task as well as forecast dates. The baseline dates reflect

the approved and controlled plan. The forecast dates reflect the currently projected activity dates

from the task owners. Forecast dates are updated during each IMS update cycle. This is done at

least monthly but on many programs is performed weekly.

Resource Changes

Resources frequently change and the schedule should reflect those updates. CAMs may be

reassigned and the IMS reflects the responsibility for each task. Resource types and quantities

may change to keep in alignment with project dynamics. These changes may be made to the

approved plan (baseline) or the forecast information. Approved plan changes require a Baseline

Change Request.

Baseline Changes

Change management in the IMS generally involves changes that affect the schedule baseline.

These changes have a potential impact to the schedule, control account budgets, estimates at

completion, and claiming performance. The Baseline Change Request (BCR) process is the

manner in which documented and approved scope changes and related budget changes are

reflected in the IMS. The BCR quantifies scope changes with both time and cost impacts.

Recording Progress

Perform schedule status during the execution phase. Schedule status refers either to the

schedule’s condition, as in the schedule projects completing prior to the need dates, or to the

process of determining the condition, such as in performing the monthly status cycle. The status

process seeks to determine if the scheduled work accomplished is ahead of, on, or behind

schedule or if the remaining work effort is ahead of, on, or behind schedule and with the

resulting projected forecast impacts properly reflected. The baseline dates provide the basis for

determining if completed, in-progress, or future tasks are ahead of, on, or behind schedule.

Assessing the program’s status condition is most meaningful when determined on a cyclical

basis. For example, once per month with the month end date coinciding with the accounting

period close. Soliciting status requires generating the schedule information pertaining to the

status cycle period for the responsible person to assess progress and record the proper

information. In an Earned Value Management environment, the responsible person is the CAM;

they are in control of executing the work and claiming performance on work accomplished. The

CAM must record accurate historical information for completed work and make accurate

projections for remaining work. This takes the form of recording Actual Start and Actual Finish

dates for completed tasks, determining remaining duration or forecasting finish dates for work

started but not completed, and projecting the start and or finish dates of work that should have

started or completed during the status cycle period, but did not. It is also important to look ahead

to the immediate period or two following the status cycle period to determine if those tasks can

execute as scheduled or validate rescheduling as a result of the current actual date performance

or projections reflected through the successor relationships. Tasks in the look-ahead period may


require modified logical relationships to reflect their revised execution due to other known

reasons. This is a direct reflection that consequences and program dynamics are better

understood in the near future as opposed to assumptions made during the planning phase.

Adjustments here ensure the most accurate schedule information and the predictability of the

IMS remains effective.

Note: Coordinate complementary status cycles and status dates with subcontractors

so the integrated schedule information is relevant.

The Status Date is the “as of” date for the schedule information in the program; also known as

Timenow or Data Date. In general, the status date is the point in time that identifies all tasks’

condition as either completed, in-progress, or future. The status date marks a relevant point in

time. Tasks scheduled earlier than the date should have completed, tasks scheduled later than the

date are either started and in-progress, with their finish dates projected later than the status date,

or they have not started, and are scheduled to start and finish in the future. Typically, status dates

represent an end to a monthly accounting period cycle and occur on a regular basis to enable

program period performance measurement. Status dates reflect the amount of effort completed

up to that point in time, how much progress is achieved since the previous status date and how

much effort is remaining. Metrics based on these end of period dates provide meaningful

measures to help managers assess progress, know where difficulties exist through performance

analysis and projected schedule impacts, and enable fact-based decision making to effectively

manage the program.

Determining the amount of task percent complete is easy for un-started or completed work; they

are either zero or 100 percent complete respectively. In-progress tasks present a more

challenging scenario when determining progress or percent complete. Assessing the amount of

work accomplished as opposed to the amount of time that has passed is the best method in

determining progress achieved. When applying status to an in-progress task (started but not

completed), identifying the percent complete can take the forms below:

Task Duration Percent Complete

Percent Complete (often represented as “% Complete”) refers to task duration percent complete.

It is a simple percentage of time through status date as compared to the total task’s duration. For

example, a task with a 20 day duration that is 50% complete means that 50% of the duration is

completed (10 working days) and 50% of the duration is remaining (10 working days).

In Microsoft Project (MSP) the percent complete field alone is inadequate for schedule status

due to how it treats remaining duration. For MSP schedules, the in-progress tasks should be updated

to the status date and the task duration percent complete is automatically calculated. Do not rely

on task duration percent complete as a progress indicator; EV percent complete and physical

percent complete convey more meaningful indications of progress. The remaining duration must

be forecast beginning at the status date to have the tool calculate an accurate finish date.

An example illustrates this MSP-unique statusing behavior. For this example, assume the following scenario:

o A 10 day task is planned to start on 5/11/11 and finish on 5/24/11 (Calendar is

set for Saturday and Sunday as non-working)


o The task actually starts on 5/11/11

o The status date is Tue 5/17. The percent complete to the original duration is 50%

(5/10 days)

o As of the status date the forecast of completion is 5/30/2011 or a remaining

duration of 9 days from the status date. The percent complete to the forecast is

35.7% (5/14 days)

o The new duration of 14 days is calculated as 5 days complete plus 9 days

forecast

In the screenshot below, the effect of only statusing percent complete is analyzed. MSP uses the

following formulas:

o Actual Duration + Remaining Duration = Duration

o % Complete = Actual Duration / Duration

In the picture the vertical red line is the status date, and in the bars the blue is the original

duration, the black the progress based on percent complete, and the green is the remaining

duration.

Please note that despite different percentage completions, the end date of the task remains

unchanged at 5/24/11. This is not correct according to the forecast, and is why percent complete alone is not sufficient to capture status. Also note that the realistic percent complete based on the forecast shows progress ending before Timenow, which is not appropriate.

The second example adds to the same scenario a status of remaining duration based on the

current forecast of 9 days to reflect the projected 5/30/11 date.

Notice now the first task is correctly reflecting the forecast finish date of Monday 5/30/11. The

second task is not projecting the correct forecast finish date; its 5/27/11 finish date is calculating

the remaining duration from the past period.

The graphical example showed the only accurate MSP status to be when percent complete is

through the status date and the revised remaining duration is also input. Only then is the forecast

finish date accurate.
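A minimal sketch of the duration arithmetic in this example, using the two MSP formulas quoted above (the variable names are illustrative; the numbers are the scenario's):

```python
# MSP formulas: Actual Duration + Remaining Duration = Duration
#               % Complete = Actual Duration / Duration
actual_duration = 5     # working days complete through the 5/17 status date
remaining_duration = 9  # forecast working days from the status date to 5/30

duration = actual_duration + remaining_duration  # 14 days, the revised total
pct_complete = 100 * actual_duration / duration  # 35.7%
print(duration, round(pct_complete, 1))          # 14 35.7
```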


Note: Statusing a task to Timenow demonstrates that the task has been statused,

focuses attention on the task’s remaining duration, and projects more accurate

forecast finish dates on the task and the downstream successor logic path tasks.

EV Percent Complete

Earned Value Percent Complete (often represented as % EV Complete) refers to the Earned

Value Method based percent complete. It simply reflects the percent complete based on the

EVM technique selected for the task. For example, consider a task that has a 50/50 EVM

technique. The 50/50 type earns half the value credit for starting the effort and takes credit for

the remaining half value credit when the effort has completed. For a task that has started and is

in-progress (not completed) the CAM claims 50% earned value (EV). The CAM can claim 100%

value upon task completion. A task with an EVM technique of 0/100 would not claim EV until

the task is completed. A Percent Complete (PC) EVM technique relies on the percent complete

method, possibly requiring defined criteria to substantiate the percent complete residing in

supplemental objective evidence known as Quantifiable Backup Data (QBD), as required by the

contractor’s documented EVM Process. The task owner or CAM establishes QBD items for a PC

task for which the items represent the steps and their weighting as a percentage of the total effort.

These steps do not have an ordered sequence and are meant to depict the relative value of work

in the task. The CAM updates the steps in the QBD as part of the status process, earning the

weighted percent complete for each step as effort is completed. The cumulative earned value of

QBD items establishes the total percent complete for the related task.

Regardless of the EVM technique selected for a work package, the EV percent complete value

must always match the conditions of its method and the status of the work. A 50/50 task cannot

reflect an EV Percent Complete value of 75% and a 0/100 task cannot reflect any values other

than zero for an incomplete task or 100% for a completed task. PC tasks cannot reflect a

percentage that is different than any combination of its QBD items. That is, if there are four

items each worth 25% (totaling 100% value), the EV Percent Complete value cannot be 37%. It

must be 25, 50, 75, or 100% EV Complete.
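A minimal sketch of deriving EV percent complete from weighted QBD steps, as described above (the step names and weights are invented for illustration):

```python
# Hypothetical QBD for a Percent Complete (PC) task: four steps, each worth 25%.
qbd = {
    "draft document":       (0.25, True),   # (weight, step complete?)
    "internal review":      (0.25, True),
    "incorporate comments": (0.25, False),
    "release document":     (0.25, False),
}

# EV % complete is the sum of the completed steps' weights, so with four
# 25% steps the only possible values are 0, 25, 50, 75, or 100%.
ev_pct = 100 * sum(weight for weight, done in qbd.values() if done)
print(ev_pct)  # 50.0
```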

Each program decides the best available EVM technique types for its team to use.

Note: When assigning EVM techniques for work packages, consider conditions such as the type of work effort, the task’s planned duration, and the number of accounting periods the work is planned to span.

Physical Percent Complete

Physical Percent Complete refers to the amount of work completed. The Physical Percent

Complete approach strives to determine and reflect the amount of work accomplished as

opposed to the amount of time that has passed or the Earned Value method assigned. This

determination may be based on the CAM’s subjective but knowledgeable assessment of the work

completed based on physical accomplishment.

Physical Percent Complete is the most accurate means of determining progress achieved because

it focuses on the amount of effort accomplished and considers the remaining amount of effort

relative to the remaining duration. It also requires the status provider to consider whether the


remaining effort can be accomplished in the remaining amount of time (remaining duration) or if

more or less time is required, prompting a revised forecast finish date.

Invalid Dates

Exercise care when performing status in the IMS. Recording the Actual Start and / or the Actual

Finish dates involves status date recognition and precedence logic awareness of the tasks.

Recording invalid dates is a common problem affecting schedules, impeding accurate schedule

information and forecast projections.

Invalid dates can take the form of these conditions:

Actual Start or Actual Finish dates are later than the Status Date. The status date

indicates the cut-off between the past and the future periods. It is not feasible to have

completed work in the future. Actual dates should remain in the past relative to the status

date.

Start and Finish dates that are earlier than the Status Date and do not have applicable

Actual Start and / or Actual Finish dates. It is not feasible to have incomplete work in the

past: a forecasted Start or Finish that is before Timenow without a related Actual Start and /

or Actual Finish date. That is equivalent to stating these tasks WILL start or finish in the

past.
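A minimal sketch of screening for both invalid-date conditions, assuming task records in a simple Python structure (field names are illustrative, not a schedule-tool API):

```python
from datetime import date

status_date = date(2012, 9, 21)  # Timenow

# Hypothetical task records; None means no actual date has been recorded.
tasks = [
    {"id": 1, "actual_start": date(2012, 9, 25), "actual_finish": None,
     "start": date(2012, 9, 25), "finish": date(2012, 10, 2)},
    {"id": 2, "actual_start": None, "actual_finish": None,
     "start": date(2012, 9, 10), "finish": date(2012, 9, 14)},
]

for t in tasks:
    # Condition 1: actual dates later than the status date (completed work in the future).
    if any(d and d > status_date for d in (t["actual_start"], t["actual_finish"])):
        print(t["id"], "actual date later than the status date")
    # Condition 2: forecast dates in the past with no corresponding actuals.
    if t["actual_start"] is None and t["start"] < status_date:
        print(t["id"], "forecast start in the past without an actual start")
    if t["actual_finish"] is None and t["finish"] < status_date:
        print(t["id"], "forecast finish in the past without an actual finish")
```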

Be aware of out-of-sequence status conditions that affect the calculated total float values and

downstream successor path tasks and their forecasted dates. Recording an Actual Start and / or

an Actual Finish without satisfying the predecessor requirements causes an out-of-sequence

status condition as illustrated in the examples below.

Finish to Start Logic

Finish-to-start logic on this in-progress task dictates that its successor cannot start until it is

finished.


Incorrect Finish to Start Logic

The second task reflecting an Actual Start before its finish-to-start predecessor completes ignores

the relationship and incorrectly shows that it can complete earlier than the logic determines it

should. This condition also allows the third task to start earlier and does not reflect the effects

that the logic intended. This distorts critical path determination along with total float values. It is

also logically impossible.

Corrected Logic

In this example, the second task is able to start after the first task starts and the logic reflects this

with the modified start-to-start relationship. The third task has a dependency on the completion

of the first and second tasks. The revised logic shows both tasks as finish-to-start relationships to

the third task, still starting earlier than originally modeled.

This completes the section on IMS key terms and concepts for the intermediate scheduler. The

next section contains processes and procedures to conduct a schedule assessment.


IMS Quick Look Assessment

The IMS Quick Look Assessment is an abbreviated IMS evaluation using the GASP. The goal of

the IMS Quick Look Assessment is not just to provide a report card, but to objectively assess

pros and cons of the IMS content and, more importantly, provide concrete recommended actions

for improving the schedule. Some actions might be significant and entail considerable effort by

the program team. Other actions might seem minor or administrative in nature, but may be

essential to improving data integrity, reliability, and integration. The IMS Quick Look

Assessment applies at any stage of the program life cycle, from source selection to validating the

schedule as it evolves after contract award. The program manager may evaluate the IMS for

routine oversight, major program reviews, or major replanning or rebaselining efforts. The IMS

Assessment process typically begins with a decision by the PMO to conduct a Quick Look

Assessment. A small data call is provided to the contractor. The data items required for a Quick

Look Assessment are defined below. The latest IMS deliverable should be adequate for the IMS data item. If the contractor has not provided any supplemental guidance as part of the IMS deliverable, the PMO may need to ask the contractor for it to help clarify how the IMS was developed and which columns serve what function. After gathering artifacts, the team performs

an IMS Quick Look Assessment.

The time and effort required to complete IMS Quick Look Assessment actions will vary by

program and by sections within the IMS. Program management must decide when the actions

are sufficiently completed to move on to an IMS Comprehensive Assessment. Minimally, the

first five GASP tenets should be satisfactorily met to achieve a valid schedule.

Overview of Mechanics

The IMS Quick Look Assessment focuses on non-summary tasks unless otherwise stated. Tests

including summary tasks are specified. Most Quick Look tests involve incomplete tasks. Tests

involving all tasks or tasks with either Actual Starts or Actual Finishes are specified.

Tests by GASP Tenet

The tests for performing a Quick Look Assessment are organized by GASP tenet. Each test is

listed in the test entries that follow.

General Information about the Quick Look Tests

Prior to performing analysis on an IMS, save an archived copy of the original IMS file for file

backup assurance in a separate folder. Rename the IMS to designate it as the analysis version,

and change the IMS file attributes to NOT Read-Only and eliminate any password protection on

the analysis version file.

The Quick Look Assessment requires the following data items:

IMS

IMS Field Mapping (user defined fields in the IMS file)

IMS Basis and Assumptions or IMS Supplemental Guidance Document

Do NOT convert durations to days prior to conducting the IMS Quick Look Assessment, as this could invalidate some tests. However, tasks with elapsed durations may cause false detections during IMS Quick Look Assessments due to Microsoft Project conversion interpretation. For example, a 20 elapsed day duration task (just under three weeks) is detected as having greater than 44 working days because of the manner in which Microsoft Project handles the underlying duration value. Ensure that incomplete tasks do not use "elapsed days"; convert task durations to days if elapsed units of measure are detected.

Tip: After conducting the IMS Quick Look Assessment, consider converting all

incomplete tasks with task duration in “hours”, “weeks”, or “months” to “days” as

the unit of measure for subsequent analysis. For example, analyzing tasks with

common task duration units is easier when comparing total float values.

Analysts may use the accompanying IMS Quick Look Template Word format document to

record IMS Quick Look test results and focus program attention on schedule weaknesses

requiring improvement. The template facilitates recording Pass / Fail / NA results for each test

conducted. The analyst may determine if the IMS Quick Look test results constitute a Yes / No

rating for each of the applicable five GASP tenets.

The template has a section for highlighting favorable schedule observations, and identifying

unfavorable findings with related, actionable suggested improvements discovered from

performing each test. Using the template for subsequent IMS Quick Look assessments enables

trend visibility into the program’s IMS progress towards achieving schedule validity as the

program implements schedule improvements.

Summary Tasks vs. Tasks, Tasks vs. Milestones, Discrete vs. Planning Packages vs. LOE

The Quick Look examines and filters the schedule across a variety of activity types. The chart

below shows the normal hierarchy of schedule activities.

Task Definitions


Several of the tests require that LOE tasks be excluded. LOE may or may not be included in the

IMS depending on contractor processes. While LOE work packages must be clearly identified in

the IMS, LOE may be identified in a number of ways, including the EVT field, a prefix or suffix to the task name, or a separate user ID field. Contact a Schedule Subject Matter Expert (SME) or the

contractor to determine the best way to isolate LOE.

Most of the tests are concerned with the schedule condition going forward. As a result, most tests

are focused on incomplete tasks. Tests can be run against the total number of tasks to get a feel

for the discipline used when initially planning the entire project.

The IMS Quick Look Assessment tests below explain how to perform each test by GASP tenet. The entry labeled "Test Description" describes the test or check and provides a tip as to the intent of performing the related test. The entry labeled "How to Determine" describes the schedule items to include / exclude, how to perform the counts and determine percentages, and the test threshold goal. The entries labeled "Why It Matters" and "Corrective Action" provide insight into related conditions and suggestions for making schedule improvements.

The appendix contains information about government and commercial tools that help filter the

data and perform the task counts.
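Most Quick Look tests reduce to the same counting pattern: build a denominator of incomplete tasks after the stated exclusions, count the numerator of tasks failing the check, and compare the percentage to the threshold goal. A minimal sketch of that pattern, using invented task records and the forecast-duration check as the example:

```python
# Hypothetical task records; "excluded" stands in for the LOE / planning
# package / milestone / external exclusions a given test calls for.
tasks = [
    {"dur": 60, "complete": False, "excluded": False},
    {"dur": 30, "complete": False, "excluded": False},
    {"dur": 10, "complete": False, "excluded": False},
    {"dur": 90, "complete": False, "excluded": True},   # e.g., LOE
    {"dur": 50, "complete": True,  "excluded": False},  # has an actual finish
]

# D: incomplete tasks remaining after the test's exclusions.
denominator = [t for t in tasks if not t["complete"] and not t["excluded"]]
# N: of those, forecast durations greater than 44 working days (2 months).
numerator = [t for t in denominator if t["dur"] > 44]

pct = 100 * len(numerator) / len(denominator)
print(f"{pct:.1f}% vs 5% goal:", "pass" if pct <= 5 else "fail")  # 33.3%: fail
```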


Tenet 1: Complete

1. Complete - Schedules reflect comprehensive planning and are effective for execution. Level of Effort may be excluded from the IMS.

Test 1 – Baseline Durations > 2 Months

Test Description: Determine the % of incomplete tasks with baseline durations greater than 44 working days (2 months). Tip: Shorter baseline durations reflect original planning scope granularity for efficient execution and precise performance measurement.

How to Determine: D. Exclude LOE, planning packages, summary tasks, milestones, and non-baselined tasks; include tasks w/o actual finishes and count for the total number of tasks. N. Further, include tasks with baseline duration > 44 working days and count. Divide the numerator (N) count by the denominator (D) count. Goal: 5% or less. Compares: (N) the number of incomplete, non-LOE, non-planning package, non-external, non-summary, non-milestone tasks that have baseline durations greater than 44 working days to (D) the number of incomplete, non-LOE, non-planning package, non-external, non-summary, non-milestone tasks with baseline durations greater than 0 days.

Why It Matters: Shorter activities (2 months or less in duration) provide more visibility into how the activities were planned and allow a more objective evaluation of progress.

Corrective Action: Review and verify tasks with baseline durations longer than 44 working days, or split them into tasks of less than 44 days.

Test 2 – Forecast Durations > 2 Months

Test Description: Determine the % of incomplete tasks with durations greater than 44 working days (2 months). Tip: Shorter task durations are easier to status and provide scope granularity for precise performance measurement.

How to Determine: D. Exclude LOE, planning packages, external tasks, and milestones; include tasks w/o actual finishes and count for the total number of tasks. N. Further, include tasks with durations > 44 working days and count. Divide the numerator (N) count by the denominator (D) count. Goal: 5% or less. Compares: (N) the number of incomplete, non-LOE, non-planning package, non-external, non-summary, non-milestone tasks that have durations greater than 44 working days to (D) the number of incomplete, non-LOE, non-planning package, non-external, non-summary, non-milestone tasks.

Why It Matters: Shorter activities (2 months or less in duration) provide more visibility into how the activities were planned and allow a more objective evaluation of progress.

Corrective Action: Review and verify tasks with forecast durations longer than 44 working days, or split them into tasks of less than 44 days.


Test 3 – Forecast Durations > 2 Months in 3 Month Look Ahead

Test Description: Determine the % of incomplete tasks with durations greater than 44 working days (2 months) that are within the next 3 months. Tip: Activities clearly defined and well planned, with easier-to-status shorter durations, provide granularity for precise performance measurement.

How to Determine: D. Exclude LOE, planning packages, external tasks, and milestones; include tasks w/o actual finishes that are scheduled in the next 3 months and count for the total number of tasks. N. Further, include tasks with durations > 44 working days and count. Divide the numerator (N) count by the denominator (D) count. Goal: 5% or less. Compares: (N) within 3 months of the status date, the number of incomplete, non-LOE, non-planning package, non-external, non-summary, non-milestone tasks that have durations greater than 44 working days to (D) the same-period number of incomplete, non-LOE, non-planning package, non-external, non-summary, non-milestone tasks.

Why It Matters: The 3 month look ahead period scope must be understood and planned to execute efficiently. Shorter tasks (2 months or less in duration) provide more visibility into how the tasks were planned and allow a more objective evaluation of progress.

Corrective Action: Review and verify tasks with forecast durations longer than 44 working days or split them into shorter tasks; apply this approach to advanced look ahead periods to effect changes.

Test 4 – Estimated Durations

Test Description: Determine the number of incomplete tasks with estimated durations. Tip: Indicates incomplete planning (durations have not been addressed).

How to Determine: Include tasks w/o actual finishes; include estimated tasks and count. Goal: Zero exceptions. Detects: The number of incomplete tasks that have estimated durations.

Why It Matters: Estimated durations are the default in MSP, indicating there has not been any duration input for that task. This suggests the planning has not yet been completed.

Corrective Action: Replace estimated durations for all non-milestone tasks with durations from the CAM.


Tests 5 & 6

Missing Baseline Dates &

Baseline Duration

Determine all tasks without

baseline dates & valid baseline

durations.

Tip: Cannot determine if tasks

are early or late during

execution without proper

baseline.

Include all tasks w/o baseline start or

baseline finish or valid baseline

duration & count.

Goal: All tasks have baseline dates &

baseline duration.

Detects: Number of tasks that do not

have established baseline start, baseline

finish, or baseline duration.

Why It Matters:

Missing baseline

information may indicate

lapse in proper schedule

management processes &

exhibit lack of

performance measure

capabilities.

Corrective Action:

Populate & maintain

proper baseline dates &

durations (baseline the

schedule).

Test 7 – Cross Reference Fields

Test Description: Comprehensive data field referencing in the IMS. Tip: Demonstrates that source information tracks together, is represented in the IMS, and enables better program management.

How to Determine: Verify all documents cross-referenced to the IMS are represented with their own field in the IMS and are appropriately populated. Required: CAMs, Control Account (CA), IMP, WBS, SOW, EVT, Work Package, Planning Package. Recommended: OBS/IPT. Determine the related fields in the IMS for each artifact and search for completeness. The analyst uses judgment to determine if the IMS is adequately cross-referenced. Goal: All required fields complete.

Why It Matters: Data cross reference fields exist and are populated to demonstrate source data alignment and provide a verifiable basis for IMS planning.

Corrective Action: Populate and maintain proper artifact data fields in the IMS.


GASP Complete Evaluation


Test 8 – Duplicate / Blank Names

Test Description: Search for blank or duplicate task names in the entire IMS. Tip: Unique and descriptive task names define the scope content and deliverable, aid user comprehension, and facilitate determining progress during status.

How to Determine: Sort the entire IMS by task name and observe obvious task name duplicates and blank names. Be aware of sorting parameters; e.g., in MSP, do not check the Keep Outline Structure option when sorting with summary tasks included; leaving it unchecked eliminates outline structure as the primary sort, which would otherwise prevent task name alignment for comparison. Through several iterations, search task names containing common words to discern repetitive phrases that do not exhibit uniqueness, such as several tasks that merely state Perform Test, not differentiating specific tests. Goal: All names are unique and not blank.

Why It Matters: IMS task nomenclature is best understood when organized, unique, meaningful, and not reliant on summary or grouping titles to supplement comprehension.

Corrective Actions: Use present tense action verbs, as described in the IMP if applicable, for each non-summary task where possible when revising task names. Words such as analyze, design, draft, determine, produce, conduct, review, and approve provide insight into unique descriptive task names and aid understanding of each task deliverable.
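A minimal sketch of the duplicate / blank name screen in Test 8, assuming the non-summary task names have been exported to a simple list (the names are invented):

```python
from collections import Counter

# Hypothetical non-summary task names pulled from an IMS export.
names = ["Draft SRS", "Perform Test", "Perform Test", "", "Review SRS"]

counts = Counter(names)
blanks = counts.pop("", 0)                               # blank names
duplicates = {n: c for n, c in counts.items() if c > 1}  # repeated names
print(blanks, duplicates)  # 1 {'Perform Test': 2}
```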

Test 9 - Missing Logic

Test Description: Determine the number of incomplete tasks without logic (predecessors or successors).
Tip: Logic is fundamental for establishing an achievable schedule & imperative for its predictive capability. Missing logic calls into question schedule soundness & critical path validity.

How to Determine: Exclude LOE tasks, external tasks, & summary tasks. Include tasks w/o actual finishes that do not contain predecessors or successors & count.
Goal: Zero exceptions.
Detects: Number of incomplete, non-LOE, non-external, non-summary tasks that do not have at least one predecessor or one successor.

Why It Matters: External feed-in milestones w/o a predecessor or feed-out milestones w/o a successor may be appropriate, but all other activities need proper logic found within the IMS.

Corrective Action: Determine appropriate predecessors & / or successors for tasks missing logic.
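As a zero exceptions test, Test 9 reduces to a filter-and-count. A minimal sketch follows, assuming a hypothetical task representation (dictionaries with task-type flags and predecessor / successor lists); field names in a real IMS export will differ.

```python
def missing_logic(tasks):
    """Test 9 sketch: incomplete, non-LOE, non-external, non-summary tasks
    lacking at least one predecessor or one successor. Goal: zero."""
    return [
        t for t in tasks
        if t["actual_finish"] is None                       # incomplete
        and not (t["is_loe"] or t["is_external"] or t["is_summary"])
        and (not t["predecessors"] or not t["successors"])  # missing logic
    ]
```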


Tenet 2: Traceable

2. Traceable - Schedules have full network logic that reflects potential impacts to program completion. Schedules have populated code fields relating to required field mapping.


Test 10 - Summary Logic (& Constraints / Deadlines)

Test Description: Identify summary tasks with applied logic or constraints.
Tip: Applying logic or constraints to summary tasks potentially obscures impacts to detailed tasks & hinders schedule analysis.

How to Determine: Include only summary tasks. Include summary tasks containing predecessors or successors, or constraint dates, or MSP deadline dates, & count.
Goal: Zero exceptions.
Detects: Number of all summary tasks that have predecessors, successors, constraint dates, or deadline dates applied.

Why It Matters: Logic or constraints applied to summary tasks may have unintended consequences for subordinate detail tasks & may be difficult to discover when reviewing / analyzing schedule information.

Corrective Action: Remove logic, constraints, & deadlines from summary tasks & apply logic & appropriate constraints & deadlines to detailed tasks.

Test 11 - Finish-to-Start (FS) Relationships

Test Description: Determine % of incomplete tasks using FS relationships (preferred).
Tip: FS relationships avoid scheduling activities in parallel & ensure the least opportunity for creating resource conflicts.

How to Determine: D. Exclude LOE tasks & summary tasks. Include tasks w/o actual finishes & count for the total number of tasks. N. Further, include tasks that contain FS predecessors & count. Divide the numerator (N) count by the denominator (D) count.
Goal: 90% or greater.
Compares: (N) number of incomplete, non-LOE, non-summary tasks that have finish-to-start predecessor relationships to (D) number of incomplete, non-LOE, non-summary tasks.

Why It Matters: Promoting parallel activities risks scheduling more work than can be executed & potentially understates an accurate program finish projection.

Corrective Action: Verify the use of any non-FS relationships & change to FS if appropriate.
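The numerator / denominator arithmetic for Test 11 can be sketched as follows, over the same hypothetical task dictionaries; `pred_types`, a list of the relationship types on each task's predecessors, is an assumed field for illustration.

```python
def fs_relationship_score(tasks):
    """Test 11 sketch: % of incomplete, non-LOE, non-summary tasks with at
    least one finish-to-start (FS) predecessor. Goal: 90% or greater."""
    pool = [t for t in tasks                                # denominator (D)
            if t["actual_finish"] is None
            and not t["is_loe"] and not t["is_summary"]]
    with_fs = [t for t in pool if "FS" in t["pred_types"]]  # numerator (N)
    return 100.0 * len(with_fs) / len(pool) if pool else 100.0
```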


Test 12 - Start-to-Start (SS) or Start-to-Finish (SF) Successor w/o also Finish-to-Start (FS) or Finish-to-Finish (FF)

Test Description: Determine the number of incomplete activities using only SS or SF successor relationships.
Tip: SS relationships may be valid, but not having at least one additional FS successor relationship prohibits establishing finish consequences, resulting in meaningless total float values.

How to Determine: Exclude LOE tasks & summary tasks. Include tasks w/o actual finishes that contain SS or SF successors & do not also contain FS or FF successors (to another task) & count.
Goal: Zero exceptions.
Detects: Number of incomplete, non-LOE activities that have an SS or SF successor but do not also have at least one FS or FF successor relationship to another activity.
Note: This condition is potentially equivalent to a missing successor.

Why It Matters: Relying only on SS or SF successor relationships does not model a finish consequence for the activity. Once in progress, the activity loses its impact on other activities, does not retain priority to finish, & can reflect a meaningless total float value to program end.

Corrective Action: Determine & apply additional, appropriate FS or FF successor relationships.
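A hedged sketch of the Test 12 condition, assuming each task carries a `succ_types` list of its successor link types (an invented field for illustration):

```python
def ss_sf_only_successors(tasks):
    """Test 12 sketch: incomplete, non-LOE, non-summary tasks whose successor
    links include SS or SF but no FS or FF. Goal: zero exceptions."""
    flagged = []
    for t in tasks:
        if t["actual_finish"] is not None or t["is_loe"] or t["is_summary"]:
            continue
        types = set(t["succ_types"])
        if types & {"SS", "SF"} and not types & {"FS", "FF"}:
            flagged.append(t)
    return flagged
```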


Test 13 - Total Float > 3 Months

Test Description: Determine % of tasks with total float > 60 working days.
Tip: Indicates a task may slip more than 3 months without impact to program completion. Suggests a task is starting too early (missing an identified predecessor) or is not reflecting potential impacts to the critical path (missing an identified successor). There is also the possibility that some scope is not identified (tasks not present in the IMS).

How to Determine: D. Exclude LOE tasks & summary tasks. Include tasks w/o actual finishes & count for the total number of tasks. N. Further, include tasks with total float > 60 working days & count. Divide the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-summary tasks that have total float greater than 60 working days to (D) number of incomplete, non-LOE, non-summary tasks.

Why It Matters: Excessive total float is an indication the task is not properly sequenced: it is either starting too early or missing a potential successor that could impact critical path determination, & it is not properly forecasting program completion. Usually, checking the end task in a path for missing successors is effective in addressing high total float for all tasks in the path.

Corrective Action: Determine appropriate predecessors & / or successors for tasks with excessive total float.
Tip: Sort the detected tasks in descending total float order to focus corrective actions on the tasks with the largest total float values (see the sketch below).
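The sort-first tip above is easy to express in code. A sketch under the same hypothetical task model, with `total_float` assumed to be expressed in working days:

```python
def high_float_worklist(tasks, threshold=60):
    """Test 13 follow-up: detected tasks sorted by descending total float so
    corrective action starts with the worst offenders."""
    detected = [t for t in tasks
                if t["actual_finish"] is None
                and not t["is_loe"] and not t["is_summary"]
                and t["total_float"] > threshold]
    return sorted(detected, key=lambda t: t["total_float"], reverse=True)
```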


GASP Traceable Evaluation


Test 14 - SNETs / FNETs Beyond 3 Month Look Ahead

Test Description: Determine % of SNET or FNET constraints on tasks beyond a 3 month look ahead.
Tip: Anticipate using fewer "no earlier than" constraints in periods further out, due to uncertainty & related rationale, relying more on logic alone to schedule the project.

How to Determine: D. Exclude LOE tasks, summary tasks, & external tasks. Include tasks w/o actual finishes; include tasks that start beyond 3 months from the status date & count for the total number of tasks. N. Further, include tasks containing "no earlier than" constraints & count. Divide the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-summary, non-external tasks beyond 3 months from the status date that have SNETs or FNETs to (D) number of incomplete, non-LOE, non-summary, non-external tasks beyond 3 months from the status date.

Why It Matters: Generally, assumptions are less accurate in further look ahead periods, especially when attempting to model resource availability with SNETs / FNETs.

Corrective Action: Review the "No Earlier Than" constraints & replace with logic relationships where practical.

Test 15 - SNETs / FNETs Within 3 Month Look Ahead

Test Description: Determine % of SNET or FNET constraints on tasks within a 3 month look ahead.
Tip: Anticipate using more "no earlier than" constraints in the immediate period, due to certainty, to refine dates where logic alone may not adequately model the project.

How to Determine: D. Exclude LOE tasks, summary tasks, & external tasks. Include tasks w/o actual finishes; include tasks that start within 3 months from the status date or are in progress & count for the total number of tasks. N. Further, include tasks containing "no earlier than" constraints & count. Divide the numerator (N) count by the denominator (D) count.
Goal: 10% or less.
Compares: (N) number of incomplete, non-LOE, non-summary, non-external tasks within 3 months from the status date that have SNETs or FNETs to (D) number of incomplete, non-LOE, non-summary, non-external tasks within 3 months from the status date.

Why It Matters: Generally, conditions are well known in the very near term & predecessors alone may not sufficiently model resource availability for task execution. Use SNETs / FNETs appropriately, but not in place of logic.

Corrective Action: Validate the "No Earlier Than" constraints & replace with logic relationships where practical.


Tenet 3: Transparent

3. Transparent - Schedules are constructed, used, maintained, and analyzed consistently with the IMS Supplemental Guidance (or equivalent documentation), rely on status and network logic as the primary forecast technique, identify risks and opportunities, and reflect rationale for constraints and lags.


Test 16 - Tasks with Leads

Test Description: Determine the number of incomplete tasks with leads greater than one day (imposed logic accelerations to successors).
Tip: Ignores the condition of a task finishing & its successor starting on the same day. The "time overlap" created using leads is difficult to understand & manage.

How to Determine: Exclude LOE tasks & summary tasks. Include tasks w/o actual finishes that contain predecessor leads greater than one day & count.
Note: Leads may be defined as a negative lag.
Goal: Zero exceptions.
Detects: Number of incomplete, non-LOE, non-summary tasks that have negative lag predecessors (greater than one day).

Why It Matters: Leads can distort total float & mask potential impacts to successor path tasks. Promote decomposing tasks & durations to facilitate FS relationships without leads.

Corrective Action: Eliminate leads to allow schedule logic to drive dates.
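In Microsoft Project, a lead is entered as negative lag in the Predecessors column, e.g. `12FS-5 days`. The sketch below pulls leads greater than one day out of that textual form; the exact string format varies with settings and locale, so treat the pattern as illustrative only.

```python
import re

# Matches MSP-style predecessor entries such as "12FS-5 days" or "7SS+3 days".
LINK = re.compile(r"(\d+)(FS|SS|FF|SF)?([+-]\d+)?")

def leads_over_one_day(pred_field):
    """Test 16 sketch: (predecessor_id, lag) pairs where the lag is a lead
    (negative lag) of more than one day."""
    return [(int(pid), int(lag))
            for pid, _ltype, lag in LINK.findall(pred_field or "")
            if lag and int(lag) < -1]

print(leads_over_one_day("12FS-5 days,7SS+3 days,4FS-1 day"))  # [(12, -5)]
```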

Test 17 - Tasks with Lags

Test Description: Determine % of incomplete tasks with lags (imposed logic delays to successors).
Tip: The "time gap" created using lags is difficult to understand & manage.

How to Determine: D. Exclude LOE tasks & summary tasks. Include tasks w/o actual finishes & count for the total number of tasks. N. Further, include tasks that contain predecessor lag & count. Divide the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-summary tasks that have predecessors with lag to (D) number of incomplete, non-LOE, non-summary tasks.

Why It Matters: Lags interject vagueness related to the "time gap" represented by the lag & are difficult to understand & manage. Lags should only model "wait time", not replace work effort or be used to anticipate successor start dates.

Corrective Action: Minimize lags to allow schedule logic to drive dates.


Test 18 - Constraints w/o Rationale

Test Description: Determine % of incomplete tasks that have constraints without comments (rationale) in the Notes field.
Note: Recognize that the schedule authors may utilize another custom field or document to explain constraint use (such as the IMS Supplemental Guidance documentation); the test results may need to be adjusted accordingly.
Tip: Rationale aids understanding of applied constraints.

How to Determine: D. Exclude LOE tasks & summary tasks. Include tasks w/o actual finishes that contain constraint types other than "as soon as possible" & count for the total number of tasks. N. Further, include tasks without documented rationale & count. Divide the numerator (N) count by the denominator (D) count.
Note: Consider MSP deadlines as constraints for this test.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-summary tasks that are not ASAP & do not have Notes entries to (D) number of incomplete, non-LOE, non-summary tasks that are not ASAP.

Why It Matters: Documented explanations are required to understand constraint use, including validity & underlying intent. Aids decision making & schedule maintenance.

Corrective Action: Add explanations for deadlines & constraints to the Notes field.

Test 19 - Lead/Lag w/o Rationale

Test Description: Determine % of incomplete tasks that have leads or lags without comments (rationale) in the Notes field.
Tip: Rationale aids understanding of applied delays or accelerations.

How to Determine: D. Exclude LOE tasks & summary tasks. Include tasks w/o actual finishes that contain predecessors with lead or lag relationships & count for the total number of tasks. N. Further, include tasks without documented rationale & count. Divide the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-summary tasks that have predecessor leads or lags & do not have Notes entries to (D) number of incomplete, non-LOE, non-summary tasks that have predecessor leads or lags.

Why It Matters: Rationale is required to understand lead / lag use, including validity & underlying intent. Aids decision making & schedule maintenance.

Corrective Action: Add explanations for leads / lags to the Notes field. Also see Leads (Test 16) above for alternative techniques.


Test 20 - Hard Constraints

Test Description: Determine the number of incomplete tasks utilizing hard constraints, which prohibit the free flow of a logic-driven IMS.
Tip: Hard constraints prevent dates from reflecting driving predecessor impacts. E.g. in MSP this includes: Must Start On, Must Finish On, Start No Later Than, Finish No Later Than.

How to Determine: Exclude LOE tasks. Include tasks w/o actual finishes that contain "no later than" or "must" constraint types & count.
Goal: Zero exceptions.
Detects: Number of incomplete, non-LOE tasks that have MSP-like constraints such as MSO, MFO, SNLT, or FNLT applied.

Why It Matters: Documented constraints affecting late dates may be necessary to establish key need dates & total float rather than relying solely on backward pass calculations (use sparingly).

Corrective Action: Eliminate hard constraints from the IMS & consider using constraints similar to MSP deadlines instead. Deadlines enable forecast impacts while providing accurate total float values.

Test 21 - Excessive Lags

Test Description: Determine the number of incomplete tasks with excessive lags (delay values greater than one month).
Tip: Excessive lag values potentially extend beyond one status period, complicating dates.

How to Determine: Exclude LOE tasks & summary tasks. Include tasks w/o actual finishes that contain predecessor or successor lag values greater than 20 working days & count.
Goal: Zero exceptions.
Detects: Number of incomplete, non-LOE, non-summary tasks that have predecessors or successors with lag values greater than 20 working days.

Why It Matters: Excessive "wait times" complicate schedule management / visibility.

Corrective Action: Replace excessive lags with documented / maintained "no earlier than" constraints.

GASP Transparent Evaluation


Tenet 4: Statused

4. Statused - Schedules reflect valid actual and forecast dates, and tasks maintain previously established logical relationships.


Test 22 - Invalid Forecast Dates

Test Description: Determine the number of incomplete tasks that are not statused up to the status date.
Tip: Includes incomplete tasks without appropriate actual start or actual finish dates earlier than the status date, or in-progress tasks with remaining duration starting earlier than the status date.
Tip: Unaccomplished work in the past is not accurate status; it causes inaccurate projections & diminishes schedule reliability.

How to Determine: Exclude summary tasks. Include tasks w/o actual starts & forecast starts less than or equal to the status date; include tasks w/o actual finishes & forecast finishes less than or equal to the status date; include in-progress tasks with remaining duration beginning earlier than the status date & count.
Goal: Zero exceptions.
Detects: Number of non-summary tasks that have forecast start or forecast finish dates earlier than the status date without the applicable actual start or actual finish dates, or, for in-progress tasks, remaining duration not beginning at the status date.

Why It Matters: It is not possible to perform future work in the past; therefore all tasks with work scheduled earlier than the status date must re-schedule that work later than the status date.

Corrective Action: Address invalid dates & incomplete tasks that are earlier than Timenow by providing accurate status & / or forecast dates. Not reflecting proper status jeopardizes performance measurement & successor path task projections.

Test 23 - Invalid Actual Dates

Test Description: Determine the number of tasks with actual start or actual finish dates in the future.
Tip: Tasks reflecting achievement in the future do not have accurate status, which causes inaccurate projections & diminishes schedule reliability.

How to Determine: Exclude summary tasks. Include tasks with actual starts greater than the status date; include tasks with actual finishes greater than the status date & count.
Goal: Zero exceptions.
Detects: Number of non-summary tasks that have actual start or actual finish dates later than the status date.
Note: The IMS cannot have tasks with invalid actual dates.

Why It Matters: The status date defines the separation between past & future. It is not possible to accomplish effort in the future, beyond Timenow (the status date).

Corrective Action: Correct the actual start or finish dates of tasks listed in the future. Not reflecting proper status jeopardizes performance measurement & successor path task projections.
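Tests 22 and 23 are both comparisons against the status date and can be sketched together. Field names below are hypothetical, and the sketch assumes every task carries forecast start and finish dates.

```python
def invalid_dates(tasks, status_date):
    """Tests 22 & 23 sketch: unstatused work in the past and actual dates in
    the future, relative to the status date (Timenow). Summary tasks are
    excluded."""
    findings = []
    for t in tasks:
        if t["is_summary"]:
            continue
        # Test 22: forecast work left scheduled earlier than the status date.
        if t["actual_start"] is None and t["forecast_start"] <= status_date:
            findings.append((t["name"], "forecast start in the past"))
        elif t["actual_finish"] is None and t["forecast_finish"] <= status_date:
            findings.append((t["name"], "forecast finish in the past"))
        # Test 23: actual dates claimed in the future.
        if t["actual_start"] is not None and t["actual_start"] > status_date:
            findings.append((t["name"], "actual start after status date"))
        if t["actual_finish"] is not None and t["actual_finish"] > status_date:
            findings.append((t["name"], "actual finish after status date"))
    return findings
```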


GASP Statused Evaluation


Test 24 - Out-of-Sequence (OOS) Status Conditions

Test Description: Determine the number of tasks that contain status conditions violating their logic relationships.
Tip: Any tasks with out-of-sequence status conditions render IMS projecting capabilities unreliable.

How to Determine: Review & detect tasks reflecting actual starts or actual finishes in the current status cycle that are incongruent with predecessor logical relationships. E.g. consider an incomplete FS predecessor to an in-progress successor: a successor that has an actual start while its predecessor does not have an actual finish does not honor the relationship.
Goal: Zero exceptions.
Note: This test is performed more efficiently using an analysis tool designed to automate OOS status condition detections. See the Appendix - Schedule and Schedule Assessment Tools for products that perform this automated function.

Why It Matters: Out-of-sequence status conditions override logic & potentially return overly optimistic successor path projections & meaningless total float values.

Corrective Action: Resolve out-of-sequence status issues by either changing logic (if appropriate) or correcting the actual start or finish dates.
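One common OOS pattern, an in-progress successor whose FS predecessor has not actually finished, can be sketched as below; a real analysis tool checks the analogous conditions for SS, FF, and SF links as well. The `predecessors` list of (task_id, link_type) pairs is an assumed representation.

```python
def out_of_sequence(tasks_by_id):
    """Test 24 sketch: in-progress tasks (actual start recorded) whose FS
    predecessor has no actual finish, i.e. the status violates the link."""
    violations = []
    for t in tasks_by_id.values():
        if t["actual_start"] is None:
            continue  # not started, so it cannot violate predecessor logic yet
        for pred_id, link_type in t["predecessors"]:
            pred = tasks_by_id[pred_id]
            if link_type == "FS" and pred["actual_finish"] is None:
                violations.append((pred["name"], t["name"]))
    return violations
```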


Tenet 5: Predictive

5. Predictive - Schedules provide logic-driven forecast information, meaningful critical paths, and reflect achievable program completion dates.


Test 25 - Push Forward Test

Test Description: Assess logic network integrity to program completion.
Tip: Delaying an incomplete task with the least total float should reflect a proportionate delay to program completion, demonstrating a logic path to program completion.

How to Determine: Observe / record the program completion milestone early finish date. Perform a successor trace by selecting a current period task with the least amount of total float, add 600 working days to its existing duration, & recalculate the schedule. Verify the program completion milestone early finish date reflects a proportionate delay; the delayed milestone demonstrates it is logically tied to the selected task. Check for a logic break if the milestone does not reflect a proportionate delay. The test fails when the milestone does not reflect the anticipated delay. Repeat this test on other current period tasks to ensure consistency.
Note: If the task with the least total float has 25 working days of positive total float, expect only a 575 working day delaying impact to the milestone.

Why It Matters: Adding 600 working days is more than two years of duration, introducing a dramatic impact to program completion. Failing the test indicates either broken logic exists or hard constraints prevent delays to successor path tasks.

Corrective Action: Address missing logic or applied hard constraint issues.


Test 26 - Program Completion Trace Test

Test Description: Determine % of non-LOE, incomplete tasks logically tied to program completion.
Tip: Interjecting a one year earlier need date identifies tasks that may not be tied to the milestone through the resulting negative total float values; feed-out tasks detected during this test should have documented rationale. Note any hard constraints assigned.

How to Determine: D. Exclude LOE tasks & summary tasks. Include tasks w/o actual finishes & count for the total number of tasks. Temporarily remove hard constraints. N. Apply a one year earlier "no later than" constraint to the program completion milestone, recalculate the IMS, & count tasks with negative / greatly reduced total float values. Divide the numerator (N) count by the denominator (D) count.
Goal: 95% or greater.
Note: Ensure one year earlier is within the IMS project calendar range (adjust accordingly); typically 240 - 260 working days.
Note: Review tasks without negative / greatly reduced total float values. These tasks may not be logically tied to the program completion milestone. Ensure the majority of tasks do not have excessive positive total float values (resolve applicable issues).

Why It Matters: Although a percentage is calculated for the test, it is more meaningful to review the suspect tasks. Even a relatively few significant tasks without a successor path to program completion are reason for concern. Ideally all incomplete, non-LOE, non-summary tasks are logically tied to the completion milestone.

Corrective Action: Investigate tasks not detected by the test & address missing successor path logic to the milestone. Essential tasks not logically tied to program completion render the IMS not predictive & invalidate the critical path.


Test 27 - No LOE in Path to Program Completion

Test Description: Use the Program Completion Trace Test set-up for this check.
Tip: Identify LOE tasks detected as having logical successor paths to program completion.

How to Determine: Use the Program Completion Trace Test parameters, except include LOE tasks. Review the LOE tasks impacted (those with negative or reduced total float values). Investigate to confirm whether these LOE tasks are logically tied to discrete tasks & the milestone, & recommend changing the logic.
Goal: No LOE tied to discrete effort.

Why It Matters: LOE should not be logically tied to discrete work & should not be part of the critical path.

Corrective Action: Investigate & remove LOE logic to discrete tasks & program completion to ensure LOE will not become part of the critical path. Recommend using an LOE completion milestone to terminate LOE logic if necessary.


GASP Predictive Evaluation


Test 28 - Appropriate Constraints Applied to Endpoint Milestones

Test Description: Verify related milestones have appropriate constraints that provide meaningful schedule measures.
Tip: Missing constraints diminish program management prioritization. Avoid using hard constraints that override the predictive nature of the logic network.

How to Determine: Identify & review the endpoint milestones to ensure appropriate, documented constraints provide meaningful total float values & permit driving predecessors to establish forecast dates.
Note: The method & rationale for establishing need dates (late dates) should align with the IMS Supplemental Guidance documentation.
Goal: All endpoint milestones should have constraints applied.

Why It Matters: Need dates reflect management's target. Constraints affecting the backward pass to program end & major milestones (if applicable) enable accurate total float calculation & permit precedence logic impacts.

Corrective Actions: Validate appropriate constraints used on endpoint milestones. Consider using documented MSP equivalent deadlines.

Test 29 - Critical Path Length Index (CPLI)

Test Description: A project performance indication of the ability to finish on time.

How to Determine: Determine the working days duration from the status date to the program completion early finish date in the IMS; this is A (critical path length). Add the amount of total float, B (least positive or negative value), to A. Divide the total (A + B) by A (the critical path length, as determined above): (A + B) / A.
Goal: Should not be less than 0.95, with a target of 1.00 (greater than 1.00 is favorable; less than 1.00 is unfavorable).

Why It Matters: Although geared towards performance, this test reflects the realism of the IMS completing on time & is meaningful when satisfactorily passing all previous GASP tests.
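As a worked example of the CPLI arithmetic with invented numbers: a critical path length of A = 200 working days and total float of B = -10 gives (200 - 10) / 200 = 0.95, right at the floor. A one-line helper:

```python
def cpli(critical_path_length, total_float):
    """Test 29: Critical Path Length Index, (A + B) / A."""
    return (critical_path_length + total_float) / critical_path_length

print(cpli(200, -10))  # 0.95: at the 0.95 floor; below 1.00 is unfavorable
print(cpli(200, 10))   # 1.05: above 1.00 is favorable
```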


Using the Quick Look Results

The primary benefit of the Quick Look results is providing feedback to the contractor so that they may improve the schedule. Most Air Force PMOs perform some evaluation of the IMS deliverable. Some may use the DCMA 14 Point Assessment, which is included in the tests above. Others may go beyond the 14 Points with a Quick Look. The key is to feed the information back to the contractor in time for changes to be incorporated in the next deliverable.

A number of the tests have a basis in DI-MGMT-81650 and may be grounds to reject the IMS deliverable in addition to providing recommended changes. Some, but not all, of the tests that relate to the DID are: cross references to the SOW, IMP, and contract milestones; identifying LOE, if included; and duration-based percent complete and physical percent complete for tasks that have at least begun.

After the assessment team evaluates the IMS against the first five GASP items and passes the results to the contractor, they may decide to proceed with the IMS Comprehensive Assessment. An IMS Comprehensive Assessment may not be considered if substantial corrective actions remain for the current IMS. The following are examples of conditions that could result in a No-Go decision for performing an IMS Comprehensive Assessment:

Complete - SOW / IMP references omitted.
Traceable - Missing logic or excessive total float.
Transparent - IMS does not have documented explanations for leads, lags, and constraints.
Statused - Invalid dates detected.
Predictive - LOE is logically tied to the program end milestone.

There may be a situation where the IMS has significant discrepancies against the Quick Look tests but a program elects to perform the Comprehensive IMS Assessment. Normally this is when the PMO wants to explore and determine the root cause of the discrepancies.


Performing an IMS Quick Look Assessment Using Run!23

This section describes using the Run!23 tool to perform an IMS assessment. Please refer to the IMS Quick Look Assessment section for all guidance on performing a Quick Look assessment.

Prior to performing analysis on an IMS, save an archived copy of the original IMS file in a separate folder for backup assurance. Rename the IMS to designate it as the analysis version, change the IMS file attributes to NOT Read-Only, and eliminate any password protection on the analysis version file.

Do NOT convert durations to days prior to conducting the IMS Quick Look Assessment, as this could invalidate some tests. However, tasks with elapsed durations may cause false detections during IMS Quick Look Assessments due to Microsoft Project conversion interpretation. For example, a 20 elapsed day duration task (just under three weeks) is detected as having greater than 44 working days because of the manner in which Microsoft Project handles the underlying duration value. Ensure all incomplete tasks do not use "elapsed days"; convert task durations to days if tasks with elapsed units of measure are detected.

Tip: After conducting the IMS Quick Look Assessment, consider converting all incomplete tasks with task durations in "hours", "weeks", or "months" to "days" as the unit of measure for subsequent analysis. For example, analyzing tasks with common task duration units is easier when comparing total float values.

Run!23 is a freeware add-in for Microsoft Project that provides a number of the filters needed for performing the IMS Quick Look Assessment tests. Filters for use in the Quick Look Assessment are prefixed with "QL" labels and executed through the "Quick Look" button on the Run!23 toolbar. The tool also provides a task counting capability that helps automate filtering and counting tasks or metrics, where required for manual processes, within the Microsoft Project file, as well as performing forward and backward traces to analyze schedule logic. Reference the Run!23 Toolbar section below for tool navigation tips and assistance.

Installing Run!23

Access Run!23 and the Run!23 Microsoft Project 2003 / 2007 Quick Start at the following location:

https://www.my.af.mil/gcss-af/USAF/ep/globalTab.do?channelPageId=s5FDEA9F02769C1090127867185EE02F8

Download and save the AzTech Run!23.mpp file.

Open Run!23 from Microsoft Project. If the user has other Microsoft Project versions on their computer and wants to use version 2007, open Microsoft Project 2007 first, then open Run!23. Otherwise, launching Run!23 from the Start menu first, as described, will open the latest Microsoft Project version. Having Run!23 open prior to opening other schedule files allows the user to use its functionality to navigate and analyze the other open Microsoft Project schedules.

Configure MS Project (MSP) 2007 and the Run!23 (.MPP) Template

Open MSP 2007, then go to Tools > Options > Security tab and select the third radio button under Legacy Formats to "Allow loading files with legacy or non-default file formats."


While on the same Security tab, click the Macro Security… button. In the dialog box select Medium Security, click OK, and then click OK at the bottom of the Security tab.


Close Microsoft Project. Microsoft Project must be closed and reopened for the Macro Security settings to take effect.

Open Microsoft Project and open the Run!23 file. You are ready to configure Run!23. A Microsoft Office Project Security Notice appears; click Enable Macros. A Run!23 splash screen automatically appears. All users must click Rebuild Toolbar only the first time they open a new or updated Run!23 version. After the initial installation of Run!23, there is no need to rebuild the toolbar; just click Continue. This splash screen will appear every time you open Run!23.


Click File > Save to save all of the initial configuration settings within Run!23. Then close MS Project. If prompted to save in Microsoft Project 2007 format, answer "Yes".

Please note, the above instructions are only necessary to install and initially configure Run!23. These steps are required only upon a new installation or an upgrade to this tool. Moving forward, launch Run!23 from the Start menu to open MS Project.

Preparation for using Run!23

Run!23 Tool Bar.

Click the Run!23 button on the Run!23 Toolbar. This provides version details and allows the user to check for updates to Run!23 online.

Click the Set-Up button on the Run!23 Toolbar. When clicked with the Run!23.mpp template file as the active window, it acts as a one-time set-up button that copies the Run!23 toolbar to the Global.mpp template file so that the toolbar opens every time MS Project is opened. When clicked with a live project file as the active window, it sets up the appropriate Run!23 views and settings for working with the file and provides the opportunity to change the EVM field from the tool's default Text29 to another Text field. This should be run each time a live project file is opened.


Use “Quick Look” Button to Execute Filters

Use Run!23 to perform the IMS Quick Look Assessment using filters that populate the open Microsoft Project schedule for review, after performing the Set-up on the open schedule. The assessment filters are designated with a "QL" prefix label, are executed using the "Quick Look" button, and align with the first five Generally Accepted Scheduling Principles (GASP), with the filters grouped by GASP tenet. The Quick Look filters are sequentially numbered to correlate to the tests described in the table below. Run the applicable Quick Look test for each assessment test; Run!23 automatically applies the related filter or filters, and the results are displayed in a message box. Filters are designed for percent score numerator and denominator calculations and single-filter zero exceptions tests. The accompanying Run!23 Quick Start document illustrates this capability.

Execute the Run!23 "Quick Look" filters from the Quick Look button on the Run!23 Toolbar. Click the applicable Test Number to perform the desired test.


Check These Custom Fields

Ensure the Flag19 custom field does not contain important data, perhaps needed for analysis, and then clear existing data and formulas from this field. Copy the existing data from such custom fields to other available custom fields to preserve the data, if desired. Run!23 uses these custom fields to perform some of its functionality, and clearing existing data and eliminating resident formulas in these fields assures the tool can perform as intended.

Text29 is the Run!23 default custom field for the Earned Value Method / Technique (EVM). Identify the schedule's EVM field and modify Run!23 to recognize the field accordingly using the Set-up button on the active project. Determine how Level of Effort (LOE) and Planning Packages (PPkgs) are identified in the schedule. Populate the appropriate Text field defined for EVM with LOE and PPkg if these tasks are not already coded in the EVM field.

Tip: Using a copy of the original IMS file for analysis ensures modifications such as populating the EVM field with planning package or LOE identifiers do not affect the original IMS.

Using Run!23 to Perform an IMS Quick Look Assessment

The IMS Quick Look assessment table explains how to perform each test by GASP tenet using Run!23. The first part of each entry, "Test Description", describes the test or check and provides a tip as to the intent of performing the related test. The second part, "How to Determine", describes the applicable Run!23 Quick Look test to use, with a description of the filters involved, or uses the existing manual method if the feature is not included in Run!23. Run!23 performs the counts, determines percentages where applicable, and identifies test threshold goals. The third part, "Why It Matters / Corrective Action", provides insight into the related unsatisfactory conditions and suggestions to make schedule improvements.

Filters stating "non-external" exclusions refer to external tasks that reside in a separate project, in a multiple project environment, but have a logical relationship to tasks in the analyzed project. External tasks are often referred to as "ghost tasks".

Most tests are either percent score or zero exceptions tests. Run!23 uses filters that identify the number of tasks for the numerator and the denominator, and then calculates and displays the percentage for percent score tests. For zero exceptions tests, Run!23 identifies the number of tasks detected for the related condition and displays that count. Other tests require performing described steps to determine the results.

Percent score tests use two related filters for each test. The first filter detects the tasks meeting the stated condition and counts them for the numerator; the second filter uses the same parameters without the stated condition to determine the task count for the denominator. Run!23 performs the division and provides the percentage of tasks detected in a displayed message box, as seen in the screen shot below. The analyst compares the result to the stated goal to determine if the condition exceeds the threshold. Goals are stated in the table for each applicable test, such as "5% or less" for example. The message box also indicates the Task Types considered for each related test.
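Conceptually, the paired-filter pattern reduces to the sketch below. This is not AzTech code; the `base` and `condition` callables stand in for the QL-prefixed denominator and numerator filters.

```python
def percent_score(tasks, base, condition):
    """Percent-score pattern: the denominator filter (base) selects the test
    population; the numerator filter (condition) selects tasks within it
    meeting the stated condition."""
    denominator = [t for t in tasks if base(t)]
    numerator = [t for t in denominator if condition(t)]
    pct = 100.0 * len(numerator) / len(denominator) if denominator else 0.0
    return pct, len(numerator), len(denominator)

# E.g. for Test 17 (lags): base = incomplete, non-LOE, non-summary tasks;
# condition = task has a predecessor lag. Compare the result to the 5% goal.
```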


Run!23 Quick Look GASP 1: Complete Test 1 displayed message box

The blue highlighted Reference Filter(s) in the message box above are hyperlinks that filter the related tasks in the schedule. Click a hyperlink filter and click OK. This closes the message box and filters the related tasks in the schedule. Investigate the detected tasks to understand more about what tasks comprise the selection and the tasks' condition.

Tip: Exceeding the threshold requires further investigation to understand the condition and possibly recommend corrective actions to enable schedule improvements.

Zero exceptions tests have a single filter used to detect the stated condition. Tasks detected in these tests do not require a mathematical operation, just a count of the filtered tasks. Use the blue hyperlink to filter the related tasks in the schedule. Tests for zero exceptions conditions may require further investigation to understand the condition, but usually result in recommending corrective actions to resolve the detected condition and make schedule improvements.

Filter exclusions identify the types of tasks not included in each test. Exclusions may be stated such as "incomplete, non-LOE, non-planning package, non-external, non-summary, non-milestone tasks" for example. Again, where used for a percent score test, the numerator and denominator contain the same exclusions. The message box in the screen shot below indicates that Test 4 considers all incomplete tasks and does not exclude other task types.


Run!23 Quick Look GASP 1: Complete Test 4 displayed message box

Test 4 is an example of a zero exceptions test, as there are no paired numerator and denominator filters applied to determine a percent score. Zero exceptions tests apply a single filter, and any detected items must be addressed, with the message box displaying the number of tasks detected. The blue colored Reference Filter(s) are hyperlinks that, when selected and OK is clicked, activate the filter to display the detected tasks in the schedule. This makes it handy to investigate tasks identified by the filter used in the test.

Tests requiring manual steps can use the Run!23 Count feature to determine the number of tasks detected, where applicable. Determine the number of tasks detected by highlighting a column in the table and clicking the "Count" button. The number of tasks in the count is reported at the lower left of the Microsoft Project screen in this format: Total Tasks Summary Tasks_In-Progress Tasks_Completed Tasks / Total Non-Summary Tasks => xx%.


Run!23 Toolbar

The Run!23 Toolbar is the navigation tool for conducting the various tests found in Run!23. Each menu item is described below:

Run!23
Gives version details and allows the user to rebuild the toolbar.

Set-Up
When clicked with a live project file as the active window, sets up the appropriate Run!23 views and settings for working with the file. Should be run each time a live project file is opened.

Expand
Unfilters the active project to show all tasks. Also resets any current AutoFiltering.

Summary
Toggles whether or not to show summary tasks.

Split
Toggles the screen between showing two panes (top and bottom) and hiding the bottom pane.
1. Before you split the screen, select the View to appear in the top pane (e.g. Gantt).
2. Click the Split button to make the task Form View appear in the bottom pane.
3. To change the Details that are shown, right click in the grey area to the right and select the Details to show, or choose View, then More Views, to pick a different View for the bottom pane.
4. To toggle the bottom pane on or off, click the Split button again.

Details
Splits the screen, showing the task Details Form in the bottom pane.
1. Select the task to see the Details.
2. Click the Details button to make the Task Details View appear in the bottom pane.
3. To change the Details shown, right click in the grey area to the right, and select the Details to show.
4. To remove the split from the screen, click the Split button.
NOTE: Right-click the bottom pane to switch the information to be displayed.

Filter
Filters for a given input located in the Name field. The results do not include Summary tasks unless a Summary task contains the input value. The filter is not case sensitive, but every space or letter counts.


Count
For the selected tasks, counts the Total Number of tasks, Summary tasks, In-Progress tasks, Completed Detailed tasks, Detailed tasks, and the % Complete of Detailed tasks (# Actual Finishes / # Detailed tasks).
1. Highlight all of the tasks to count (select any column to count ALL tasks shown in the current Filter).
2. Click the Count button.
NOTE: If the results do not show up, go to Tools, Options, View, Show, and verify that 'Status Bar' is checked.

Sort by ID
Sorts all tasks by ID number (ascending order).

by Finish
Sorts all tasks by Finish Date (earliest to latest).

Unique ID
Jumps to the task with the user-entered Unique ID.

Bookmark / Go Back
Stores the selected task's location. Go Back is used to return to the bookmarked task. Only one location can be bookmarked at a time.

Gantt
Changes the Gantt Chart view to the AzTech default (A_Gantt view).

Trace
Shows the predecessor tasks for the selected task (always select only ONE task or milestone).
1. Highlight the single task (or milestone) to trace; this ONLY works off of ONE task.


2. Click Trace, choose the desired option from the Trace dialog box, and click OK.
NOTE: The default is to hide Summary tasks.

Stats
Creates a spreadsheet containing statistics about all or selected tasks.
1. Highlight any tasks in the project.
2. Click the Stats button.
3. Optional: Enter a name for the Excel document to be created.
4. Open the folder where the Project is saved.
5. The Excel document created is named 'AzTechStats_', followed by the Project's filename and the name entered in Step 3.
6. Open the Excel document to see the program statistics.
NOTE: If the MS Project file has an .MPX extension or any extension other than .MPP, save the file with the .MPP extension before running the Stats tool.

Quick Look
Displays the automated Quick Look test menu. Select the desired Quick Look test from the drop-down to automatically quantify data anomalies against that test in the IMS.

AzTech
Displays the full Run!AzTech menu, available through Run!AzTech software.

goAzTech.com
Link to AzTech's website.

Performing a Trace of Predecessor Path or Successor Path Logic

With your project schedule as the active window, find and select the milestone or task you want to trace.
NOTE: To find the last task or milestone in the schedule, click Summaries On/Off until Summary tasks are hidden, click by Finish to sort by finish date, then press CTRL-Arrow Down to find the last task or milestone.

Click the Trace button on the Run!23 toolbar. Select the appropriate Trace options from the dialog box (e.g. select C to perform a Critical Path Trace), then click OK.
NOTE: There is no need to select an option for each category, but you may only select and enter one option within each category (Predecessor Trace Options, Successor Trace Options, and Sort Options) in the edit line at the bottom of the AzTech Trace Options window.

EXAMPLES:
For all predecessors, leave the options blank.
For a critical path trace organized by finish, choose and enter "CF".


For a critical path trace organized by finish, and to check for any LOE on the path, choose and enter "CFL".


Tenet 1: Complete

Please refer to the basic, non-tool specific IMS Quick Look section for definitions of task types such as detailed, discrete, LOE, planning packages, and milestones.

1. Complete - Schedules reflect comprehensive planning and are effective for execution. Level of Effort may be excluded from the IMS.


Test 1 - Baseline Durations > 2 Months

Test Description: Determine % of incomplete tasks with baseline durations greater than 44 working days (2 months).
Tip: Shorter baseline durations reflect original planning scope granularity for efficient execution & precise performance measurement.

How to Determine:
1. Apply Run!23 Quick Look GASP 1: Complete Test 1.
2. Observe & record the percent score displayed in the message box.
Uses Quick Look Filters: QL_01A_BL_Dur_>2mo_Numerator, QL_01B_BL_Dur_>2mo_Denominator.
Run!23 divides the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-planning package, non-external, non-summary, non-milestone tasks that have baseline duration greater than 44 working days to (D) number of incomplete, non-LOE, non-planning package, non-external, non-summary, non-milestone tasks with baseline durations greater than 0 days.

Why It Matters: Shorter activities (2 months or less in duration) provide more visibility into how the activities were planned & allow a more objective evaluation of progress.

Corrective Action: Review & verify tasks with baseline durations longer than 44 working days or split into tasks less than 44 days.


Test 2 - Forecast Durations > 2 Months

Test Description: Determine % of incomplete tasks with durations greater than 44 working days (2 months).
Tip: Shorter task durations are easier to status & provide scope granularity for precise performance measurement.

How to Determine:
1. Apply Run!23 Quick Look GASP 1: Complete Test 2.
2. Observe & record the percent score displayed in the message box.
Uses Quick Look Filters: QL_02A_Fcst_Dur_>2mo_Numerator, QL_02B_Fcst_Dur_>2mo_Denominator.
Run!23 divides the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-planning package, non-external, non-summary, non-milestone tasks that have durations greater than 44 working days to (D) number of incomplete, non-LOE, non-planning package, non-external, non-summary, non-milestone tasks.

Why It Matters: Shorter tasks (2 months or less in duration) provide more visibility into how the tasks were planned & allow a more objective evaluation of progress.

Corrective Action: Review & verify tasks with forecast durations longer than 44 working days or split into tasks less than 44 days.


Test 3 - Forecast Durations > 2 Months in 3 Month Look Ahead

Test Description: Determine % of incomplete tasks with durations greater than 44 working days (2 months) that are within the next 3 months.
Tip: Tasks clearly defined & well planned, with shorter durations that are easier to status, provide granularity for precise performance measurement.

How to Determine:
1. Apply Run!23 Quick Look GASP 1: Complete Test 3.
2. Observe & record the percent score displayed in the message box.
Uses Quick Look Filters: QL_03A_Fcst_Dur_>2mo_within_3mo_Numerator, QL_03B_Fcst_Dur_>2mo_within_3mo_Denominator.
Run!23 divides the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-planning package, non-external, non-summary, non-milestone tasks within 3 months of the status date that have durations greater than 44 working days to (D) number of incomplete, non-LOE, non-planning package, non-external, non-summary, non-milestone tasks within the same period.

Why It Matters: The 3 month look ahead period scope must be understood & planned to execute efficiently. Shorter tasks (2 months or less in duration) provide more visibility into how the tasks were planned & allow a more objective evaluation of progress.

Corrective Action: Review & verify tasks with forecast durations longer than 44 working days or split into shorter tasks; apply this approach to later look ahead periods to effect changes.

Test 4: Estimated Durations
Description: Determine number of incomplete tasks with estimated durations. Tip: Indicates incomplete planning (durations have not been addressed).
How to Determine: 1. Apply Run!23 Quick Look GASP 1: Complete Test 4. 2. Observe & record detected number displayed in message box. Uses Quick Look Filter: QL_04_Est_Dur.
Goal: Zero exceptions.
Detects: Number of incomplete tasks that have estimated durations.
Why It Matters: Estimated durations are the default in MSP, indicating there has not been any duration input for that task. This suggests the planning has not yet been completed.
Corrective Action: Replace estimated durations for all non-milestone tasks with durations from the CAM.


Tests 5 & 6: Missing Baseline Dates & Baseline Duration
Description: Determine all tasks without baseline dates & valid baseline durations. Tip: Cannot determine if tasks are early or late during execution without a proper baseline.
How to Determine: 1. Apply Run!23 Quick Look GASP 1: Complete Tests 5 & 6. 2. Observe & record detected number displayed in each message box. Uses Quick Look Filters: QL_05_No_BL_Dates, then QL_06_No_BL_Dur.
Goal: All tasks have baseline dates & baseline durations.
Detects: Number of all tasks that do not have an established baseline start, baseline finish, or baseline duration.
Why It Matters: Missing baseline information may indicate a lapse in proper schedule management processes & exhibit a lack of performance measurement capabilities.
Corrective Action: Populate & maintain proper baseline dates & durations (baseline the schedule).

Test 7: Cross Reference Fields
Description: Comprehensive data field referencing in the IMS. Tip: Demonstrates source information tracks to each other, is represented in the IMS, & enables better program management.
How to Determine: Use the "A_AllFields" Table to identify related User Defined Fields for Test 7. Verify all documents cross-referenced to the IMS are represented with their own field in the IMS & are appropriately populated. Required: CAMs, CAs, IMP, WBS, SOW, EVT, Work Package, Planning Package. Recommended: OBS/IPT. Determine related fields in the IMS for each artifact & search for completeness. The analyst uses judgment to determine if the IMS is adequately cross-referenced.
Goal: All required fields complete.
Why It Matters: Data cross reference fields exist & are populated to demonstrate source data alignment & provide a verifiable basis for IMS planning.
Corrective Action: Populate & maintain proper artifact data fields in the IMS.


GASP Complete Evaluation


Test 8: Duplicate / Blank Names
Description: Search for blank or duplicate task names in the entire IMS. Tip: Unique & descriptive task names define the scope content & deliverable, aid user comprehension, & facilitate determining progress during status.
How to Determine: Sort the entire IMS by task name & observe obvious task name duplicates & blank names for Test 8. Be aware of sorting parameters: e.g., in MSP, do not check the Keep Outline Structure option when sorting with summary tasks included; leaving it unchecked removes outline structure as the primary sort, which would otherwise prevent aligning task names for comparison. Through several iterations, search task names containing common words to discern repetitive phrases that do not exhibit uniqueness, such as several tasks that merely state "Perform Test" without differentiating specific tests.
Goal: All names are unique & not blank.
Why It Matters: IMS task nomenclature is best understood when organized, unique, meaningful, & not reliant on summary or grouping titles to supplement their comprehension.
Corrective Actions: Use present tense action verbs, as described in the IMP if applicable, for each non-summary task where possible when revising task names. Words such as analyze, design, draft, determine, produce, conduct, review & approve provide insight into unique descriptive task names & aid understanding of each task deliverable.


Tenet 2: Traceable
2. Traceable - Schedules have full network logic that reflects potential impacts to program completion. Schedules have populated code fields relating to required field mapping.

Test 9: Missing Logic
Description: Determine number of incomplete tasks without logic (predecessors or successors). Tip: Logic is fundamental for establishing an achievable schedule & imperative for its predictive capability. Missing logic calls into question schedule soundness & critical path validity.
How to Determine: 1. Apply Run!23 Quick Look GASP 2: Traceable Test 9. 2. Observe & record detected number displayed in message box. Uses Quick Look Filter: QL_09_No_Logic.
Goal: Zero exceptions.
Detects: Number of incomplete, non-LOE, non-external, non-summary tasks that do not have at least one predecessor or one successor.
Why It Matters: External feed-in milestones without a predecessor or feed-out milestones without a successor may be appropriate, but all other activities need proper logic found within the IMS.
Corrective Action: Determine appropriate predecessors & / or successors for tasks missing logic.

Test 10: Summary Logic (& Constraints / Deadlines)
Description: Identify summary tasks with applied logic or constraints. Tip: Applying logic or constraints to summary tasks potentially obscures impacts to detailed tasks & hinders schedule analysis.
How to Determine: 1. Apply Run!23 Quick Look GASP 2: Traceable Test 10. 2. Observe & record detected number displayed in message box. Uses Quick Look Filter: QL_10_Summary_Logic.
Goal: Zero exceptions.
Detects: Number of all summary tasks that have predecessors, successors, constraint dates, or deadline dates applied.
Why It Matters: Logic or constraints applied to summary tasks may have unintended consequences for subordinate detail tasks & may be difficult to discover when reviewing / analyzing schedule information.
Corrective Action: Remove logic, constraints, & deadlines from summary tasks & apply logic & appropriate constraints & deadlines to detailed tasks.


Test 11: Finish-to-Start (FS) Relationships
Description: Determine % of incomplete tasks using FS relationships (preferred). Tip: FS relationships avoid scheduling activities in parallel & ensure the least opportunity for creating resource conflicts.
How to Determine: 1. Apply Run!23 Quick Look GASP 2: Traceable Test 11. 2. Observe & record percent score displayed in message box. Uses Quick Look Filters: QL_11A_FS_Rel_Numerator, QL_11B_FS_Rel_Denominator. Run!23 divides the numerator (N) count by the denominator (D) count.
Goal: 90% or greater.
Compares: (N) number of incomplete, non-LOE, non-summary tasks that have finish-to-start predecessor relationships to (D) number of incomplete, non-LOE, non-summary tasks.
Why It Matters: Promoting parallel activities risks scheduling more work than can be executed & can understate an accurate program finish projection.
Corrective Action: Verify the use of any non-FS relationships & change to FS if appropriate.

Test 12: Start-to-Start (SS) or Start-to-Finish (SF) Successor w/o also Finish-to-Start (FS) or Finish-to-Finish (FF)
Description: Determine number of incomplete activities using only SS or SF successor relationships. Tip: SS relationships may be valid, but not having at least one additional FS successor relationship prohibits establishing finish consequences, resulting in meaningless total float values.
How to Determine: 1. Apply Run!23 Quick Look GASP 2: Traceable Test 12. 2. Observe & record detected number displayed in message box. Run!23 function.
Goal: Zero exceptions.
Detects: Number of incomplete, non-LOE tasks that have an SS or SF successor but do not also have at least one FS or FF successor relationship to another task. Note: This condition is potentially equivalent to a missing successor.
Why It Matters: Relying only on SS or SF successor relationships does not model a finish consequence for the activity. Once in progress, it loses its impact on other activities, does not retain priority to finishing, & can reflect a meaningless total float value to program end.
Corrective Action: Determine & apply additional, appropriate FS or FF successor relationships.


Test 13: Total Float > 3 Months
Description: Determine % of tasks with total float greater than 60 working days. Tip: Indicates a task may slip more than 3 months without impact to program completion. Suggests a task is starting too early (missing an identified predecessor) or is not reflecting potential impacts to the critical path (missing an identified successor). There is also the possibility that some scope is not identified (tasks not present in the IMS).
How to Determine: 1. Apply Run!23 Quick Look GASP 2: Traceable Test 13. 2. Observe & record percent score displayed in message box. Uses Quick Look Filters: QL_13A_TF_>3mo_Numerator, QL_13B_TF_>3mo_Denominator. Run!23 divides the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-summary tasks that have total float greater than 60 working days to (D) number of incomplete, non-LOE, non-summary tasks.
Why It Matters: Excessive total float is an indication the task is not properly sequenced: it is either starting too early or missing a potential successor that could impact critical path determination, & it is not properly forecasting program completion. Usually, checking the end task in a path for missing successors is effective in addressing high total float for all tasks in the path.
Corrective Action: Determine appropriate predecessors & / or successors for tasks with excessive total float. Tip: Sort the detected tasks in descending total float order to focus corrective actions on tasks with the largest total float values.


Test 14: SNETs / FNETs Beyond 3 Month Look Ahead
Description: Determine % of SNET or FNET constraints on tasks beyond the 3 month look ahead. Tip: Anticipate using fewer "no earlier than" constraints in periods further out, due to uncertainty & related rationale, relying more on logic alone to schedule a project.
How to Determine: 1. Apply Run!23 Quick Look GASP 2: Traceable Test 14. 2. Observe & record percent score displayed in message box. Uses Quick Look Filters: QL_14A_SNETorFNET_beyond_3mo_Numerator, QL_14B_SNETorFNET_beyond_3mo_Denominator. Run!23 divides the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-summary, non-external tasks beyond 3 months from the status date that have SNETs or FNETs to (D) number of incomplete, non-LOE, non-summary, non-external tasks beyond 3 months from the status date.
Why It Matters: Generally, assumptions are less accurate in further look ahead periods, especially when attempting to model resource availability with SNETs / FNETs.
Corrective Action: Review the "No Earlier Than" constraints & replace with logic relationships where practical.


GASP Traceable Evaluation


Test 15: SNETs / FNETs Within 3 Month Look Ahead
Description: Determine % of SNET or FNET constraints on tasks within the 3 month look ahead. Tip: Anticipate using more "no earlier than" constraints in the immediate period, due to certainty, to refine dates where logic alone may not adequately model the project.
How to Determine: 1. Apply Run!23 Quick Look GASP 2: Traceable Test 15. 2. Observe & record percent score displayed in message box. Uses Quick Look Filters: QL_15A_SNETorFNET_within_3mo_Numerator, QL_15B_SNETorFNET_within_3mo_Denominator. Run!23 divides the numerator (N) count by the denominator (D) count.
Goal: 10% or less.
Compares: (N) number of incomplete, non-LOE, non-summary, non-external tasks within 3 months from the status date that have SNETs or FNETs to (D) number of incomplete, non-LOE, non-summary, non-external tasks within 3 months from the status date.
Why It Matters: Generally, conditions are well known in the very near term, & predecessors alone may not sufficiently model resource availability for task execution. Use SNETs / FNETs appropriately, but not in place of logic.
Corrective Action: Validate the "No Earlier Than" constraints & replace with logic relationships where practical.


Tenet 3: Transparent
3. Transparent - Schedules are constructed, used, maintained, and analyzed consistently with the IMS Supplemental Guidance (or equivalent documentation), rely on status and network logic as the primary forecast technique, identify risks and opportunities, and reflect rationale for constraints and lags.

Test 16: Tasks with Leads
Description: Determine number of incomplete tasks with leads greater than one day (imposed logic accelerations to successors). Tip: Ignores the condition of a task finishing & its successor starting on the same day. The "time overlap" created using leads is difficult to understand & manage.
How to Determine: 1. Apply Run!23 Quick Look GASP 3: Transparent Test 16. 2. Observe & record detected number displayed in message box. Uses Quick Look Filter: QL_16_Leads_>1d. Note: Leads may be defined as a negative lag.
Goal: Zero exceptions.
Detects: Number of incomplete, non-LOE, non-summary tasks that have negative lag predecessors (greater than one day).
Why It Matters: Leads can distort total float & mask potential impacts to successor path tasks. Promote decomposing tasks & durations to facilitate Finish-to-Start relationships without leads.
Corrective Action: Eliminate leads to allow schedule logic to drive dates.

Test 17: Tasks with Lags
Description: Determine % of incomplete tasks with lags (imposed logic delays to successors). Tip: The "time gap" created using lags is difficult to understand & manage.
How to Determine: 1. Apply Run!23 Quick Look GASP 3: Transparent Test 17. 2. Observe & record percent score displayed in message box. Uses Quick Look Filters: QL_17A_Lags_Numerator, QL_17B_Lags_Denominator. Run!23 divides the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-summary tasks that have predecessors with lag to (D) number of incomplete, non-LOE, non-summary tasks.
Why It Matters: Lags interject vagueness related to the "time gap" represented by the lag & are difficult to understand & manage. Lags should only model "wait time", not replace work effort or be used to anticipate successor start dates.
Corrective Action: Minimize lags to allow schedule logic to drive dates.


Test 18: Constraints w/o Rationale
Description: Determine % of incomplete tasks that have constraints without comments (rationale) in the Notes field. Note: Recognize that schedule authors may use another custom field or document to explain constraint use (such as the IMS Supplemental Guidance documentation); test results may need to be adjusted accordingly. Tip: Rationale aids understanding of applied constraints.
How to Determine: 1. Apply Run!23 Quick Look GASP 3: Transparent Test 18. 2. Observe & record percent score displayed in message box. Uses Quick Look Filters: QL_18A_Constraints_No_Notes_Numerator, QL_18B_Constraints_No_Notes_Denominator. Run!23 divides the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-summary tasks that are not ASAP & do not have Notes entries to (D) number of incomplete, non-LOE, non-summary tasks that are not ASAP.
Why It Matters: Documented explanations are required to understand constraint use, including validity & underlying intent. Aids in decision making & schedule maintenance.
Corrective Action: Add explanations for deadlines & constraints to the Notes field.


Test 19: Lead/Lag w/o Rationale
Description: Determine % of incomplete tasks that have leads or lags without comments (rationale) in the Notes field. Tip: Rationale aids understanding of applied delays or accelerations.
How to Determine: 1. Apply Run!23 Quick Look GASP 3: Transparent Test 19. 2. Observe & record percent score displayed in message box. Uses Quick Look Filters: QL_19A_Leads_Lags_No_Notes_Numerator, QL_19B_Leads_Lags_No_Notes_Denominator. Run!23 divides the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-summary tasks that have predecessor leads or lags & do not have Notes entries to (D) number of incomplete, non-LOE, non-summary tasks that have predecessor leads or lags.
Why It Matters: Rationale is required to understand lead / lag use, including validity & underlying intent. Aids in decision making & schedule maintenance.
Corrective Action: Add explanations for leads / lags to the Notes field. Also see Leads (Test 16) above for alternative techniques.

Test 20: Hard Constraints
Description: Determine number of incomplete tasks using hard constraints, which prohibit the free flow of a logic-driven IMS. Tip: Hard constraints prevent dates from reflecting driving predecessor impacts. Includes: Must Start On, Must Finish On, Start No Later Than, Finish No Later Than.
How to Determine: 1. Apply Run!23 Quick Look GASP 3: Transparent Test 20. 2. Observe & record detected number displayed in message box. Uses Quick Look Filter: QL_20_Hard Constraints.
Goal: Zero exceptions.
Detects: Number of incomplete, non-LOE tasks that have MSO, MFO, SNLT, or FNLT constraints applied.
Why It Matters: Documented constraints affecting late dates may be necessary to establish key need dates & total float, rather than relying solely on backward pass calculations (use sparingly).
Corrective Actions: Eliminate hard constraints from the IMS & consider using deadlines instead. Deadlines enable forecast impacts while providing accurate total float values.


GASP Transparent Evaluation


Test 21: Excessive Lags
Description: Determine number of incomplete tasks with excessive lags (delay values greater than one month). Tip: Excessive lag values potentially extend beyond one status period, complicating analysis of dates.
How to Determine: 1. Apply Run!23 Quick Look GASP 3: Transparent Test 21. 2. Observe & record detected number displayed in message box. Run!23 function.
Goal: Zero exceptions.
Detects: Number of incomplete, non-LOE, non-summary tasks that have predecessors or successors with lag values greater than 20 working days.
Why It Matters: Excessive "wait time" complicates schedule management / visibility.
Corrective Action: Replace excessive lags with documented / maintained "no earlier than" constraints.


Tenet 4: Statused
4. Statused - Schedules reflect valid actual and forecast dates, and tasks maintain previously established logical relationships.

Test 22: Invalid Forecast Dates
Description: Determine number of incomplete tasks that are not statused up to the status date. Tip: Includes incomplete tasks without the appropriate actual start or actual finish dates earlier than the status date, or in-progress tasks with remaining duration starting earlier than the status date. Tip: Unaccomplished work in the past is not accurate status; it causes inaccurate projections & diminishes schedule reliability.
How to Determine: 1. Apply Run!23 Quick Look GASP 4: Statused Test 22. 2. Observe & record detected number displayed in message box. Uses Quick Look Filter: QL_22_Invalid_Forecast_Dates.
Goal: Zero exceptions.
Detects: Number of non-summary tasks that have forecast start or forecast finish dates earlier than the status date without the applicable actual start or actual finish dates, or remaining duration not beginning at the status date for in-progress tasks. Note: The IMS cannot have tasks with invalid forecast dates.
Why It Matters: It is not possible to perform future work in the past; therefore, all tasks with work scheduled earlier than the status date must have that work rescheduled to after the status date.
Corrective Actions: Address invalid dates & incomplete tasks that are earlier than Timenow by providing accurate status & / or forecast dates. Not reflecting proper status jeopardizes performance measurement & successor path task projections.

Test 23: Invalid Actual Dates
Description: Determine number of tasks with actual start or actual finish dates in the future. Tip: Tasks reflecting achievement in the future do not have accurate status, which causes inaccurate projections & diminishes schedule reliability.
How to Determine: 1. Apply Run!23 Quick Look GASP 4: Statused Test 23. 2. Observe & record detected number displayed in message box. Uses Quick Look Filter: QL_23_Invalid_Actual_Dates.
Goal: Zero exceptions.
Detects: Number of non-summary tasks that have actual start or actual finish dates later than the status date. Note: The IMS cannot have tasks with invalid actual dates.
Why It Matters: The status date defines the separation between past & future. It is not possible to accomplish effort in the future, beyond Timenow (the status date).
Corrective Actions: Correct the actual start or finish dates of tasks listed in the future. Not reflecting proper status jeopardizes performance measurement & successor path task projections.


GASP Statused Evaluation


Test 24: Out-of-Sequence (OOS) Status Conditions
Description: Determine number of tasks that contain status conditions violating their logic relationships. Tip: Any tasks with an out-of-sequence status condition render the IMS projecting capabilities unreliable.
How to Determine: Review & detect tasks reflecting Actual Starts or Actual Finishes in the current status cycle that are incongruent with predecessor logical relationships for Test 24. E.g., an in-progress successor with an Actual Start whose incomplete FS predecessor does not have an Actual Finish does not honor the relationship.
Goal: Zero exceptions.
Note: This test is performed more efficiently using an analysis tool designed to automate OOS status condition detections. See the Appendix – Schedule and Schedule Assessment Tools for products that perform this automated function.
Why It Matters: Out-of-sequence status conditions override logic & potentially return overly optimistic successor path projections & meaningless total float values.
Corrective Action: Resolve out-of-sequence status issues by either changing logic (if appropriate) or correcting the actual start or finish dates.
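The FS case in the example above can be automated. The following is a minimal sketch in Python, assuming task records expose actual start / finish dates & FS predecessor links; the Task class & field names are illustrative only, not a Run!23 or Open Plan API.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Task:
        uid: str
        actual_start: Optional[str] = None    # None until the task actually starts
        actual_finish: Optional[str] = None   # None until the task actually finishes
        fs_predecessors: List["Task"] = field(default_factory=list)

    def out_of_sequence(tasks: List[Task]) -> List[tuple]:
        """Flag started tasks whose FS predecessor has not actually finished."""
        violations = []
        for t in tasks:
            if t.actual_start is None:
                continue  # task has not started; no OOS condition possible here
            for pred in t.fs_predecessors:
                if pred.actual_finish is None:
                    violations.append((pred.uid, t.uid))
        return violations

    # Example: B started even though its FS predecessor A has not finished.
    a = Task("A", actual_start="2012-08-01")
    b = Task("B", actual_start="2012-08-15", fs_predecessors=[a])
    print(out_of_sequence([a, b]))  # [('A', 'B')]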


Tenet 5: Predictive
5. Predictive - Schedules provide logic-driven forecast information, meaningful critical paths, and reflect achievable program completion dates.

Test 25: Push Forward Test
Description: Assess logic network integrity to program completion. Tip: Delaying an incomplete task with the least total float reflects a proportionate delay to program completion, demonstrating the logic path to program completion.
How to Determine: Observe / record the program completion milestone Early Finish date. Perform a successor trace by selecting a current period task with the least amount of total float, add 600 working days to its existing duration, recalculate the schedule, & click the Trace button using the R (for Right) option. Verify the program completion milestone Early Finish date reflects a proportionate delay; the milestone is in the filtered set of tasks if it is logically tied to the successor trace task. Check for a logic break if the milestone is not present in the filtered set of tasks. The test fails when the milestone does not reflect the anticipated delay. Repeat this test on another current period task to ensure consistency. Note: If the task with the least total float has 25 working days of positive total float, expect only a 575 working day delaying impact to the milestone.
Why It Matters: Adding 600 working days is more than two years of duration, introducing a dramatic impact to program completion. Failing the test indicates either broken logic exists or hard constraints prevent delays to successor path tasks.
Corrective Action: Address missing logic or applied hard constraint issues.
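As a check on the note above, the anticipated milestone delay is the added duration less any positive total float on the delayed task. A minimal sketch of that arithmetic (illustrative only, not a tool function):

    def expected_milestone_delay(added_working_days: int, total_float: int) -> int:
        """Expected delay to program completion when a traced task is extended.
        Positive total float absorbs part of the added duration."""
        return added_working_days - max(total_float, 0)

    print(expected_milestone_delay(600, 25))  # 575, matching the note above
    print(expected_milestone_delay(600, 0))   # 600, a fully proportionate delay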


Test 26: Program Completion Trace Test
Description: Determine % of non-LOE, incomplete tasks logically tied to program completion. Tip: Feed-out tasks detected during this test should have documented rationale. Note any hard constraints assigned.
How to Determine: 1. Perform a predecessor trace by selecting the program completion milestone & clicking the Trace button, using defaults (no options); highlight all tasks & Count. Note any LOEs detected in the path & decrement the number of LOE from the detected number for an accurate calculation (see Test 27 with respect to detected LOE). 2. Apply Run!23 Quick Look GASP 5: Predictive Test 26. 3. Observe & record detected number displayed in message box for the denominator. Uses Quick Look Filter: QL_26B_Program_Completion_Trace_Test_Denominator. Divide the Trace Count by the total number of incomplete, non-LOE, non-summary tasks (QL_26B).
Goal: 95% or greater.
Note: Review tasks not detected in the path by selecting Flag19 = "No" before continuing with other tests (Trace populates Flag19 with "Yes"). These are the tasks not logically tied to the program completion milestone.
Why It Matters: Although a percentage is calculated for the test, it is more meaningful to review suspect tasks. Even a relatively few significant tasks without a successor path to program completion is reason for concern. Ideally, all incomplete, non-LOE, non-summary tasks are logically tied to the completion milestone.
Corrective Actions: Investigate tasks not detected by the test & address missing successor path logic to the milestone. Essential tasks not logically tied to program completion render the IMS not predictive & invalidate the critical path.
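The trace-count arithmetic above, including the LOE decrement, in a minimal illustrative sketch (the counts shown are hypothetical):

    def completion_trace_percent(trace_count: int, loe_in_path: int, denominator: int) -> float:
        """Percent of incomplete, non-LOE, non-summary tasks logically tied to
        program completion; LOE tasks caught by the trace are removed first."""
        return 100.0 * (trace_count - loe_in_path) / denominator

    print(f"{completion_trace_percent(940, 15, 960):.1f}%")  # 96.4% -- meets the 95% goal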


Test 27: No LOE in Path to Program Completion
Description: Use the Program Completion Trace Test set-up for this check. Tip: Identify LOE tasks detected as having logical successor paths to program completion.
How to Determine: Perform a predecessor trace by selecting the program completion milestone & clicking the Trace button, using the L option to detect LOE in the path; a window displays the number of LOE detected. Review the LOE tasks detected. Investigate to confirm whether these LOE tasks are logically tied to discrete tasks & the milestone, & recommend changing logic.
Goal: No LOE tied to discrete effort.
Why It Matters: LOE should not be logically tied to discrete work & should not be part of the critical path.
Corrective Action: Investigate & remove LOE logic to discrete tasks & program completion to ensure LOE will not become part of the critical path. Recommend using an LOE completion milestone to terminate LOE logic if necessary.

Test 28: Appropriate Constraints Applied to Endpoint Milestones
Description: Verify related milestones have appropriate constraints that provide meaningful schedule measures. Tip: Missing constraints diminish program management prioritization. Avoid using hard constraints that override the predictive nature of the logic network.
How to Determine: Identify & review the endpoint milestones to ensure appropriate, documented constraints provide meaningful total float values & permit driving predecessors to establish forecast dates. Note: The method & rationale for establishing need dates (Late Dates) should align with the IMS Supplemental Guidance documentation.
Goal: All endpoint milestones should have constraints applied.
Why It Matters: Need dates reflect management's target. Constraints affecting the backward pass to program end & major milestones (if applicable) enable accurate total float calculation & permit precedence logic impacts.
Corrective Action: Validate appropriate constraints used on endpoint milestones. Consider using documented deadlines.


GASP Predictive Evaluation


Test 29: Critical Path Length Index (CPLI)
Description: A project performance indication of the ability to finish on time.
How to Determine: 1. Determine the working days duration from the status date to the program completion Early Finish date in the IMS, A (the critical path length). 2. Add the amount of total float, B (least positive or negative value), to A. 3. Divide the total (A + B) by A (the critical path length, as determined above): CPLI = (A + B) / A.
Goal: Should not be less than 0.95, with a target of 1.00 (>1.00 is favorable, <1.00 is unfavorable).
Why It Matters: Although geared towards performance, this test reflects IMS realism of completing on time & is meaningful when satisfactorily passing all previous GASP tests.
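A minimal sketch of the CPLI arithmetic described above, where A is the critical path length in working days from the status date & B is the least total float, both taken from the IMS (the function & values shown are illustrative):

    def cpli(critical_path_length: int, total_float: int) -> float:
        """Critical Path Length Index: (A + B) / A."""
        return (critical_path_length + total_float) / critical_path_length

    # 200 working days to completion with -10 days of total float:
    print(round(cpli(200, -10), 2))  # 0.95 -- at the unfavorable threshold
    print(round(cpli(200, 0), 2))    # 1.0  -- on target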


Performing an IMS Quick Look Assessment Using Open Plan Professional
This section describes using Open Plan to perform an IMS Assessment. Please refer to the IMS Quick Look Assessment section for all guidance on performing an IMS Quick Look Assessment. Prior to performing analysis on an IMS, save a copy of the IMS for analysis using a unique project name.
BK3 File
A file, Air Force Open Plan.bk3, is provided with this process. The BK3 file provides views, calculated fields, and filters to use for performing a Quick Look Assessment.
Note: The Air Force BK3 file also contains additional views, filters, sorts, and bar sets to aid in other schedule analysis functions.
Follow the steps below to import the BK3 file.
1. Access the BK3 file on the Air Force Portal at: https://www.my.af.mil/gcss-af/USAF/ep/globalTab.do?channelPageId=s5FDEA9F02769C1090127867185EE02F8
2. Save the file to your computer.
3. In Open Plan, select File > Manage Files > Restore File.


4. Navigate to the folder the BK3 file was saved in.


5. Select the Air Force Open Plan BK3 file and select Open.


6. All fields, filters, views, sorts and bar charts included in the Air Force Open Plan BK3 file are checked within the next two screens. Select Next.


7. Select Finish.

8. For any items that already exist, select Skip restore of item so that any elements previously modified by the user are not overwritten. However, if previously modified items need to be replaced by the original, Overwrite existing item may be selected.


Views
Three views are provided to use with Quick Look Assessments. These views are named AF QL Assessment, AF QL CPLI, and AF QL Leads & Lags. In Open Plan Explorer, these may be selected within the Open Plan Library > Views folder.
Note: Other Air Force views are provided in the BK3 file to assist with additional schedule analysis. These include the AF Critical Activity, AF Delinquent Finish, AF Float Sort, AF IMS Overview, and AF Target Types views.
AF QL Assessment View
The AF QL Assessment view contains fields that are helpful when performing a Quick Look Assessment. There is a Count column that totals the number of activities resulting when different filters are applied.


Date fields include baseline dates, early dates, late dates, actual dates, and target dates. Start and finish dates appear in the same cell, consolidating space. The start dates are reflected on the top of each cell and the finish dates on the bottom. If a date is not listed, it does not exist for the specific activity.
For subproject activities (identified in the Activity Type field as Subproject), the date fields reflect the earliest start and the latest finish dates of the lower level activities that roll up to the subproject.
The Target Dates field reflects the target type on top and the target date on bottom for both start and finish targets that have been assigned to an activity.


The Predecessors and Successors fields list the activity id, relationship type, and lag value (positive or negative) on a line for each relationship assigned to an activity.
If desired, the bar chart may be viewed by expanding the right section of the screen.


AF QL CPLI View
The AF QL CPLI view contains fields that are helpful when determining the CPLI for a schedule. The view is also filtered and sorted for the CPLI test.
AF QL Leads & Lags View
The AF QL Leads & Lags view is a relationship view that contains fields and groupings that are helpful when performing the Activities with Leads and Excessive Lags checks for a Quick Look Assessment. This view counts relationships, rather than individual activities.
Filters
The QL Filters provided in the BK3 file are used for the various tests included in a Quick Look Assessment. To apply a filter, either select the filter icon on the toolbar or select Tools > Filters > Manage Filters.


The filters to be used for the Quick Look Assessment checks all begin with QL. There are 29 Quick Look activity filters available.


There are also four Quick Look relationship filters available in the AF QL Leads & Lags view.

Types of Quick Look Filters

Most Quick Look tests are either percent score or zero exceptions tests.

Percent Score Filters

Percent score tests use two related filters for each test. The first filter detects the number of activities for the stated condition and counts the activities for the numerator. The second filter uses the same parameters without the stated condition to determine the task count for the denominator. The mathematical division needs to be completed manually to provide the percentage of activities detected. The analyst compares the result to the stated goal to determine if the condition exceeds the threshold. Exceeding the threshold requires further investigation to understand the condition and possibly to recommend corrective actions to enable schedule improvements.
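Because Open Plan only surfaces the two filtered counts, the division & threshold comparison are done by hand. A minimal sketch of that step (the counts & threshold shown are illustrative):

    def percent_score(numerator: int, denominator: int) -> float:
        """Percent of activities meeting the stated filter condition."""
        return 100.0 * numerator / denominator if denominator else 0.0

    score = percent_score(12, 480)             # counts from the A & B filters
    print(f"{score:.1f}% (goal: 5% or less)")  # 2.5% -- within threshold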

Zero Exception Filters

For zero exceptions tests, Open Plan Professional uses a single filter to identify the number of activities detected for the related condition. Activities detected in these tests do not require a mathematical operation, just a count of the filtered activities, which is displayed in the Count column of the AF QL Assessment view. Tests for zero exception conditions may require further investigation to understand the condition, but usually result in recommending corrective actions to resolve the detected condition and make schedule improvements.

Note: Filtering is not viable for some tests, which require manually performing the described steps in the table below to determine the results.


Filter Exclusions
Filters may exclude certain types of activities from a test. Exclusions may include activities that have completed, LOE, planning packages, milestones, subprojects, external subprojects, foreign activities, foreign subprojects, and foreign projects. Again, where used for a percent score test, the numerator and denominator contain the same exclusions.

Using Open Plan Professional to Perform an IMS Quick Look Assessment
Most of the IMS deliveries for analysis will be in BK3 format. The PMO analyst will restore the file using Open Plan. Prior to performing a Quick Look Assessment, ensure Time Analysis has been run on the schedule.
The IMS Quick Look Assessment table explains how to perform each test by GASP tenet using Open Plan.
The first column, Test Description, describes the test or check and provides a tip regarding the intent of performing the related test.
The second column, How to Determine, lists the steps necessary to perform each check. This includes the applicable Open Plan Quick Look filter to use, with a description, or describes the manual method if a filter is not viable for a check. Threshold goals are also stated in this column for each applicable test, such as "5% or less". Unless otherwise stated, the AF QL Assessment view should be used.
In the descriptions, some Open Plan fields are referenced in general terms. For example, External Subproject, Foreign Activity, Foreign Subproject, and Foreign Project are not listed individually but are included in the Non-External and Non-Foreign references.
The third column, Why It Matters / Corrective Action, provides insight into related unsatisfactory conditions and suggestions to make schedule improvements.


Tenet 1: Complete
Please refer to the basic, non-tool specific IMS Quick Look section for definitions of activity types such as detailed, discrete, LOE, planning packages, and milestones.
1. Complete - Schedules reflect comprehensive planning and are effective for execution. Level of Effort may be excluded from the IMS.

Test 1: Baseline Durations > 2 Months
Description: Determine % of incomplete activities with baseline durations greater than 44 working days (2 months). Tip: Shorter baseline durations reflect original planning scope granularity for efficient execution & precise performance measurement. Note: The AF Baseline Duration field calculates the number of working days between the baseline start & baseline finish dates.
How to Determine: 1. Apply QL filter: QL_01A_BL_Dur_GT_2mo_Numerator. 2. Observe & record count. 3. Apply QL filter: QL_01B_BL_Dur_GT_2mo_Denominator. 4. Observe & record count. 5. Divide the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-planning package, non-subproject, non-milestone, non-external, non-foreign activities that have baseline durations greater than 44 working days to (D) number of incomplete, non-LOE, non-planning package, non-subproject, non-milestone, non-external, non-foreign activities.
Why It Matters: Shorter activities (2 months or less in duration) provide more visibility into how the activities were planned & allow a more objective evaluation of progress.
Corrective Action: Review & verify activities with baseline durations longer than 44 working days, or split them into activities less than 44 days.


Test 2: Forecast Durations > 2 Months
Description: Determine % of incomplete activities with durations greater than 44 working days (2 months). Tip: Shorter activity durations are easier to status & provide scope granularity for precise performance measurement.
How to Determine: 1. Apply QL filter: QL_02A_Fcst_Dur_GT_2mo_Numerator. 2. Observe & record count. 3. Apply QL filter: QL_02B_Fcst_Dur_GT_2mo_Denominator. 4. Observe & record count. 5. Divide the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-planning package, non-subproject, non-milestone, non-external, non-foreign activities that have durations greater than 44 working days to (D) number of incomplete, non-LOE, non-planning package, non-subproject, non-milestone, non-external, non-foreign activities.
Why It Matters: Shorter activities (2 months or less in duration) provide more visibility into how the activities were planned & allow a more objective evaluation of progress.
Corrective Action: Review & verify activities with forecast durations longer than 44 working days, or split them into activities less than 44 days. Note that near term activities may be prohibited from being split due to baseline change freeze periods.


Test 3: Forecast Durations > 2 Months in 3 Month Look Ahead
Description: Determine % of incomplete activities with durations greater than 44 working days (2 months) that are within the next 3 months. Tip: Activities that are clearly defined & well planned, with shorter durations that are easier to status, provide granularity for precise performance measurement.
How to Determine: 1. Apply QL filter: QL_03A_Fcst_Dur_GT_2mo_within_3mo_Numerator. 2. Observe & record count. 3. Apply QL filter: QL_03B_Fcst_Dur_GT_2mo_within_3mo_Denominator. 4. Observe & record count. 5. Divide the numerator (N) count by the denominator (D) count.
Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-planning package, non-subproject, non-milestone, non-external, non-foreign activities within 3 months of Timenow that have durations greater than 44 working days to (D) number of incomplete, non-LOE, non-planning package, non-subproject, non-milestone, non-external, non-foreign activities within the same period.
Why It Matters: The 3 month look ahead period scope must be well understood & planned to execute efficiently.
Corrective Action: Review & verify activities with forecast durations longer than 44 working days, or split them into shorter activities. Apply this approach to advanced look ahead periods to effect changes.


Test 4: Estimated Durations
Description: Determine number of incomplete activities with estimated durations. Tip: Indicates incomplete planning (durations have not been addressed).
How to Determine: 1. Apply QL filter: QL_04_Est_Dur. 2. Observe & record count.
Goal: Zero exceptions.
Detects: Number of incomplete activities that have estimated durations.
Why It Matters: A zero day duration may indicate there has not been any duration input for a non-milestone activity. This suggests the planning has not yet been completed.
Corrective Action: Replace estimated durations for all non-milestone activities with durations from the CAM.

Tests 5 (& 6): Missing Baseline Dates
Description: Determine all activities without baseline dates. Tip: Cannot determine if activities are early or late during execution without a proper baseline. Note: Missing Baseline Durations (Test 6) does not need to be checked when using Open Plan, since this is not a field that Open Plan provides (it is only provided via the BK3 for AF use).
How to Determine: 1. Apply QL filter: QL_05_No_BL_Dates. 2. Observe & record count.
Goal: All activities have baseline dates.
Detects: Number of activities that do not have established baseline start or baseline finish dates.
Why It Matters: Missing baseline information may indicate a lapse in proper schedule management processes & exhibit a lack of performance measurement capabilities.
Corrective Action: Populate & maintain proper baseline dates (baseline the schedule).


Test 7: Cross Reference Fields
Description: Comprehensive data field referencing in the IMS. Tip: Demonstrates source information tracks to each other, is represented in the IMS, & enables better program management.
How to Determine: 1. Remove all filters. 2. Determine related fields in the IMS for each artifact & add them to the view. Required: CAMs, CAs, IMP, WBS, SOW, EVT, Work Package, Planning Package. Recommended: OBS/IPT. 3. Verify all documents cross-referenced to the IMS are represented with their own fields in the IMS & are appropriately populated. 4. Determine related fields in the IMS for each artifact & search for completeness. The analyst uses judgment to determine if the IMS is adequately cross-referenced.
Goal: All required fields complete.
Why It Matters: Data cross reference fields exist & are populated to demonstrate source data alignment & provide a verifiable basis for IMS planning.
Corrective Action: Populate & maintain proper artifact data fields in the IMS.


GASP Complete Evaluation


Test 8 – Duplicate / Blank Names

Description: Search for blank or duplicate activity names in the entire IMS.
Tip: Unique & descriptive activity descriptions define the scope content & deliverable, aid user comprehension, & facilitate determining progress during status.

How to Determine:
1. Remove all filters
2. Sort the IMS by activity description
3. Observe obvious activity description duplicates
4. Through several iterations, search activity descriptions containing common words to discern repetitive phrases that do not exhibit uniqueness, such as several activities that merely state "Perform Test", not differentiating specific tests

Goal: All descriptions unique & not blank.

Why It Matters: IMS task nomenclature is best understood when organized, unique, meaningful, & not reliant on subproject or grouping titles to supplement comprehension.

Corrective Action: Use present tense action verbs, as described in the IMP, if applicable, for each non-subproject activity where possible, when revising activity descriptions. Words such as analyze, design, draft, determine, produce, conduct, review & approve provide insight into unique, expressive activity descriptions & aid understanding of each activity deliverable.
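For large schedules, this check can also be scripted against an exported activity list. The following is a minimal sketch, not part of the AF process: it assumes the activity descriptions have been exported to a CSV file named "ims_activities.csv" with a "Name" column (both names are hypothetical).

    # dup_names.py - sketch: flag blank and duplicate activity names in an export
    import csv
    from collections import Counter

    with open("ims_activities.csv", newline="") as f:
        names = [row["Name"].strip() for row in csv.DictReader(f)]

    blanks = sum(1 for n in names if not n)                      # blank descriptions
    dupes = {n: c for n, c in Counter(n.lower() for n in names if n).items()
             if c > 1}                                           # case-insensitive dupes

    print(f"Blank names: {blanks}")
    for name, count in sorted(dupes.items(), key=lambda kv: -kv[1]):
        print(f"{count:4d} x {name}")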


Tenet 2: Traceable

2. Traceable - Schedules have full network logic that reflects potential impacts to program completion. Schedules have populated code fields relating to required field mapping.

Test 9 – Missing Logic

Description: Determine the number of incomplete activities without logic (predecessors or successors).
Tip: Logic is fundamental for establishing an achievable schedule & imperative for its predictive capability. Missing logic calls into question schedule soundness & critical path validity.

How to Determine:
1. Apply QL filter: QL_09_No_Logic
2. Observe & record count

Goal: Zero exceptions.
Detects: Number of incomplete, non-LOE, non-external, non-subproject activities that do not have at least one predecessor or one successor.

Why It Matters: External feed-in milestones w/o a predecessor or feed-out milestones w/o a successor may be appropriate, but all other activities need proper logic found within the IMS.

Corrective Action: Determine appropriate predecessors & / or successors for activities missing logic.

Test 10 – Subproject Logic (& Target Dates)

Description: Identify subproject activities with applied logic or constraints.
Tip: Applying logic or constraints to subproject activities potentially obscures impacts to detailed activities & hinders schedule analysis.

How to Determine:
1. Apply QL filter: QL_10_Subproject_Logic
2. Observe & record count

Goal: Zero exceptions.
Detects: Number of all subproject activities that have predecessors, successors, or target dates applied.

Why It Matters: Logic or target dates applied to subproject activities may have unintended consequences for subordinate detail activities that may be difficult to discover when reviewing / analyzing schedule information.

Corrective Action: Remove logic & target dates from subproject activities & apply logic & appropriate target dates to detailed activities.


Test 11 – Finish-to-Start (FS) Relationships

Description: Determine the % of incomplete activities using FS relationships (preferred).
Tip: FS relationships avoid scheduling activities in parallel & ensure the least opportunity for creating resource conflicts.

How to Determine:
1. Apply QL filter: QL_11A_FS_Rel_Numerator
2. Observe & record count
3. Apply QL filter: QL_11B_FS_Rel_Denominator
4. Observe & record count
5. Divide the numerator (N) count by the denominator (D) count

Goal: 90% or greater.
Compares: (N) number of incomplete, non-LOE, non-subproject activities that have finish-to-start predecessor relationships to (D) number of incomplete, non-LOE, non-subproject activities.

Why It Matters: Promoting parallel activities risks scheduling more work than can be executed & potentially understates an accurate projection of program finish.

Corrective Action: Verify the use of any non-FS relationships & change to FS if appropriate.
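The N/D arithmetic used by this test (and by the other percentage tests in this table) is straightforward. A minimal sketch of the calculation is shown below; the two counts are illustrative stand-ins for the recorded filter results, not values from any real IMS.

    # fs_ratio.py - sketch of the Test 11 percentage calculation
    def ratio_pct(numerator: int, denominator: int) -> float:
        """Return N/D as a percentage; 0 if the denominator is empty."""
        return 100.0 * numerator / denominator if denominator else 0.0

    n = 1480  # count recorded from QL_11A_FS_Rel_Numerator (illustrative)
    d = 1600  # count recorded from QL_11B_FS_Rel_Denominator (illustrative)

    pct = ratio_pct(n, d)
    print(f"FS relationships: {pct:.1f}% (goal: 90% or greater)")
    print("PASS" if pct >= 90.0 else "FAIL")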

Test 12 – Start-to-Start (SS) or Start-to-Finish (SF) Successor w/o also Finish-to-Start (FS) or Finish-to-Finish (FF)

Description: Determine the number of incomplete activities using only SS or SF successor relationships.
Tip: SS relationships may be valid, but not having at least one additional FS successor relationship prohibits establishing finish consequences, resulting in meaningless total float values.

How to Determine:
1. Apply QL filter: QL_12_SS_or_SF_Succ_no_F_Succ
2. Observe & record count

Goal: Zero exceptions.
Detects: Number of incomplete, non-LOE activities that have a SS or SF successor but do not also have at least one FS or FF successor relationship to another activity.
Note: This condition is potentially equivalent to a missing successor.

Why It Matters: Relying only on SS or SF successor relationships does not model a finish consequence for the activity. Once in progress, it loses its impact to other activities, does not retain priority to finish, & can reflect a meaningless total float value to program end.

Corrective Action: Determine & apply additional, appropriate FS or FF successor relationships.


Test 13 – Total Float > 3 Months

Description: Determine the % of activities with total float greater than 60 working days.
Tip: Indicates an activity may slip more than 3 months without impacting program completion. Suggests an activity is starting too early (missing an identified predecessor) or is not reflecting potential impacts to the critical path (missing an identified successor). There is also the possibility that some scope has not been identified (activities not present in the IMS).

How to Determine:
1. Apply QL filter: QL_13A_TF_GT_3mo_Numerator
2. Observe & record count
3. Apply QL filter: QL_13B_TF_GT_3mo_Denominator
4. Observe & record count
5. Divide the numerator (N) count by the denominator (D) count

Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-subproject activities that have total float greater than 60 working days to (D) number of incomplete, non-LOE, non-subproject activities.

Why It Matters: Excessive total float is an indication the activity is not properly sequenced, either starting too early or missing a potential successor that could impact critical path determination. Usually, identifying the end activity in a path for missing successors is effective in addressing high total float for all activities in the path.

Corrective Action: Determine appropriate predecessors & / or successors for activities with excessive total float.
Tip: Sort the detected activities in descending total float order to focus corrective actions on activities with the largest total float values (use the AF Float Sort view).
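The descending-float sort in the tip can also be scripted against an export. A minimal sketch, assuming total float in working days has been exported to a hypothetical "ims_float.csv" with ID, Name, and TotalFloat columns:

    # float_sort.py - sketch: list activities whose total float exceeds 60 working
    # days, sorted descending so the largest values are reviewed first (per the tip)
    import csv

    THRESHOLD = 60  # working days (approximately 3 months)

    with open("ims_float.csv", newline="") as f:        # hypothetical export
        rows = list(csv.DictReader(f))                  # columns: ID, Name, TotalFloat

    exceptions = sorted(
        (r for r in rows if float(r["TotalFloat"]) > THRESHOLD),
        key=lambda r: float(r["TotalFloat"]),
        reverse=True,
    )

    if rows:
        pct = 100.0 * len(exceptions) / len(rows)
        print(f"{len(exceptions)} of {len(rows)} activities ({pct:.1f}%; goal 5% or less)")
    for r in exceptions[:20]:                           # worst offenders first
        print(f'{float(r["TotalFloat"]):8.0f}  {r["ID"]}  {r["Name"]}')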


Test 14 – Not Earlier Than or On Target Beyond 3 Month Look Ahead

Description: Determine the % of Not Earlier Than or On Target start or finish dates on activities beyond a 3 month look ahead.
Tip: Anticipate using fewer Not Earlier Than & On Target dates in periods further out, due to uncertainty & related rationale, relying more on logic alone to schedule the project.

How to Determine:
1. Apply QL filter: QL_14A_Not_Earlier_Than_or_On_Target_beyond_3mo_Numerator
2. Observe & record count
3. Apply QL filter: QL_14B_Not_Earlier_Than_or_On_Target_beyond_3mo_Denominator
4. Observe & record count
5. Divide the numerator (N) count by the denominator (D) count

Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-subproject, non-external activities beyond 3 months from Timenow that have Not Earlier Than or On Target start or finish dates to (D) number of incomplete, non-LOE, non-subproject, non-external activities beyond 3 months from Timenow.

Why It Matters: Generally, assumptions are less accurate in further look ahead periods, especially when attempting to model resource availability with Not Earlier Than & On Target dates.

Corrective Action: Review the Not Earlier Than and On Target dates & replace with logic relationships where practical.


GASP Traceable Evaluation


Test 15 – Not Earlier Than or On Target within 3 Month Look Ahead

Description: Determine the % of Not Earlier Than or On Target start or finish dates on activities within a 3 month look ahead.
Tip: Anticipate using more Not Earlier Than & On Target dates in the immediate period, due to certainty, to refine dates where logic alone may not adequately model the project.

How to Determine:
1. Apply QL filter: QL_15A_Not_Earlier_Than_or_On_Target_within_3mo_Numerator
2. Observe & record count
3. Apply QL filter: QL_15B_Not_Earlier_Than_or_On_Target_within_3mo_Denominator
4. Observe & record count
5. Divide the numerator (N) count by the denominator (D) count

Goal: 10% or less.
Compares: (N) number of incomplete, non-LOE, non-subproject, non-external activities within 3 months from Timenow that have Not Earlier Than or On Target start or finish dates to (D) number of incomplete, non-LOE, non-subproject, non-external activities within 3 months from Timenow.

Why It Matters: Generally, conditions are well known in very near term periods & predecessors alone may not sufficiently model resource availability for task execution. Use Not Earlier Than & On Target dates appropriately, but not in place of logic.

Corrective Action: Validate the Not Earlier Than & On Target dates & replace with logic relationships where practical.


Tenet 3: Transparent

3. Transparent - Schedules are constructed, used, maintained, and analyzed consistently with the IMS Supplemental Guidance (or equivalent documentation), rely on status and network logic as the primary forecast technique, identify risks and opportunities, and reflect rationale for constraints and lags.

Test 16 – Activities with Leads

Description: Determine the number of relationships with an incomplete predecessor with leads greater than one day (imposed logic accelerations to successors).

How to Determine:
1. Open view: AF QL Leads & Lags
2. Apply QL filter: QL_16_Leads_GT_1d
3. Observe & record count
Note: Leads may be defined as negative lag.

Goal: Zero exceptions.
Detects: Number of relationships with an incomplete, non-LOE, non-subproject predecessor that have leads (greater than one day).

Why It Matters: Leads can distort total float & mask potential impacts to successor path activities. Promote decomposing activities & durations to facilitate FS relationships without leads.

Corrective Action: Eliminate leads to allow schedule logic to drive dates.

Test 17 – Activities with Lags

Description: Determine the % of incomplete relationships with lags (imposed logic delays to successors).
Tip: The "time gap" created using lags is difficult to understand & manage.

How to Determine:
1. Open view: AF QL Leads & Lags (if not already open)
2. Apply QL filter: QL_17A_Lags_Numerator
3. Observe & record count
4. Apply QL filter: QL_17B_Lags_Denominator
5. Observe & record count
6. Divide the numerator (N) count by the denominator (D) count

Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-subproject relationships that have predecessors or successors with lag to (D) number of incomplete, non-LOE, non-subproject relationships.

Why It Matters: Lags interject vagueness related to the "time gap" represented by the lag & are difficult to understand & manage. Lags should only model "wait time", not replace work effort or be used to anticipate successor start dates.

Corrective Action: Minimize lags to allow schedule logic to drive dates. Appropriate target dates, rather than lags, should be used to model resource availability.


Test 18 – Target Dates w/o Rationale

Description: Determine the % of incomplete activities that have target dates without comments (rationale) in the Notes field.
Note: Recognize that the schedule authors may utilize another user defined field or document to explain target date use (such as in the IMS Supplemental Guidance documentation). May need to adjust test results accordingly. The calculated field created for Notes only detects notes in the Default category. If a project has additional note categories, these need to be added to the calculated field expression.
Tip: Rationale aids understanding of applied constraints.

How to Determine:
1. Apply QL filter: QL_18A_Target_Dates_No_Notes_Numerator
2. Observe & record count
3. Apply QL filter: QL_18B_Target_Dates_No_Notes_Denominator
4. Observe & record count
5. Divide the numerator (N) count by the denominator (D) count

Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-subproject activities that have target dates & do not have Notes entries to (D) number of incomplete, non-LOE, non-subproject activities that have target dates.

Why It Matters: Documented explanations are required to understand target date use, including validity & underlying intent. Aids in decision making & schedule maintenance.

Corrective Action: Add explanations for target dates to the Notes field.


Test 19 – Lead/Lag w/o Rationale

Description: Determine the % of incomplete activities that have leads or lags without comments (rationale) in the Notes field.
Note: The calculated field created for Notes only detects notes in the Default category. If a project has additional note categories, these need to be added to the calculated field expression.
Tip: Rationale aids understanding of applied delays or accelerations.

How to Determine:
1. Apply QL filter: QL_19A_Leads_Lags_No_Notes_Numerator
2. Observe & record count
3. Apply QL filter: QL_19B_Leads_Lags_No_Notes_Denominator
4. Observe & record count
5. Divide the numerator (N) count by the denominator (D) count

Goal: 5% or less.
Compares: (N) number of incomplete, non-LOE, non-subproject activities that have predecessor leads or lags & do not have Notes entries to (D) number of incomplete, non-LOE, non-subproject activities that have predecessor leads or lags.

Why It Matters: Rationale is required to understand lead / lag use, including validity & underlying intent. Aids in decision making & schedule maintenance.

Corrective Action: Add explanations for leads / lags to the Notes field. Also see Leads (Test 16) above for alternative techniques.

Test 20 – Hard Target Dates

Description: Determine the number of incomplete activities utilizing Fixed target dates, prohibiting free flow of the logic-driven IMS.
Tip: Fixed targets prevent dates from reflecting driving predecessor impacts.
Includes: Start & Finish Fixed Target Dates.

How to Determine:
1. Apply QL filter: QL_20_Hard_Targets
2. Observe & record count

Goal: Zero exceptions.
Detects: Number of incomplete, non-LOE activities that have start or finish Fixed target dates applied.

Why It Matters: Hard target dates impact calculation of the critical path & total float values.

Corrective Action: Eliminate Fixed target dates from the IMS & use target types that enable forecast impacts while providing accurate total float values.


GASP Transparent Evaluation


Test 21 – Excessive Lags

Description: Determine the number of relationships with an incomplete predecessor with excessive lags (delay values greater than one month).
Tip: Excessive lag values potentially extend beyond one status period, complicating analysis of dates.

How to Determine:
1. Open view: AF QL Leads & Lags
2. Apply QL filter: QL_21_Excessive_Lags
3. Observe & record count

Goal: Zero exceptions.
Detects: Number of relationships with an incomplete, non-LOE, non-subproject predecessor that have lag values greater than 20 working days.

Why It Matters: Excessive lag, or "wait time", complicates schedule management & visibility.

Corrective Action: Replace excessive lags with documented / maintained Not Earlier Than target dates.


Tenet 4: Statused

GASP Statused Evaluation

4. Statused - Schedules reflect valid actual and forecast dates, and activities maintain previously established logical relationships.

Test 22 – Invalid Forecast Dates

This test is not required when using Open Plan because activities are statused to Timenow when using Time Analysis.

Test 23 – Invalid Actual Dates

Description: Determine the number of activities with actual start or actual finish dates in the future.
Tip: Activities reflecting achievement in the future do not have accurate status, which causes inaccurate projections & diminishes schedule reliability.

How to Determine:
1. Apply QL filter: QL_23_Invalid_Actual_Dates
2. Observe & record count

Goal: Zero exceptions.
Detects: Number of non-subproject activities that have actual start or actual finish dates later than Timenow.

Why It Matters: Timenow defines the separation between past & future. It is not possible to accomplish effort in the future, beyond Timenow (status date). Not reflecting proper status jeopardizes performance measurement & successor path activity projections.

Corrective Action: Correct the actual start or finish dates of activities listed in the future.

Test 24 – Out-of-Sequence (OOS) Status Conditions

Description: Determine the number of activities that contain status conditions violating their logic relationships.
Tip: Any activities with an out-of-sequence status condition render IMS projecting capabilities unreliable.

How to Determine:
1. Run Time Analysis from the Project menu
2. Apply QL filter: QL_24_OOS
3. Observe & record count of OOS activities

Goal: Zero exceptions.
Example of an OOS condition: an in-progress successor whose FS predecessor is incomplete. The successor has an actual start even though its predecessor does not have an actual finish, so the relationship is not honored.

Why It Matters: Out-of-sequence status conditions override logic & potentially return overly optimistic successor path projections & meaningless total float values.

Corrective Action: Resolve out-of-sequence status issues by either changing logic (if appropriate) or correcting the actual start or finish dates.
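The underlying OOS rule can be stated directly: for an FS relationship, a successor must not have an actual start while its predecessor lacks an actual finish. A minimal sketch under that assumption, with hypothetical activity IDs and a hand-built status table rather than any scheduling tool's API:

    # oos_check.py - sketch: flag out-of-sequence FS relationships
    # actuals: activity id -> (actual_start, actual_finish); None if not statused
    actuals = {
        "A100": ("2012-08-01", None),   # started, not yet finished
        "A110": ("2012-08-15", None),   # started anyway -> out of sequence
        "A120": (None, None),           # not started
    }
    fs_links = [("A100", "A110"), ("A110", "A120")]  # (predecessor, successor)

    for pred, succ in fs_links:
        pred_finish = actuals[pred][1]
        succ_start = actuals[succ][0]
        if succ_start is not None and pred_finish is None:
            print(f"OOS: {succ} has an actual start but FS predecessor "
                  f"{pred} has no actual finish")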


Tenet 5: Predictive

5. Predictive - Schedules provide logic-driven forecast information, meaningful critical paths, and reflect achievable program completion dates.

Test 25 – Push Forward Test

Description: Assess logic network integrity to program completion.
Tip: Delaying an incomplete activity with the least total float should reflect a proportionate delay to program completion, demonstrating a logic path to the milestone.

How to Determine:
1. Determine & record the program completion milestone's early finish date
2. Select the program completion milestone
3. Select Trace Critical Path from the Add-Ins menu
4. Select one activity with the least amount of total float that was identified as being on the critical path (coded as 01 in the Trace Critical Path field)
5. Add 600 working days to the activity's existing duration
6. Run Time Analysis from the Project menu
7. Verify the program completion milestone Early Finish date reflects a proportionate delay. The test is failed when the milestone does not reflect the anticipated delay
8. Repeat this test on other current period activities to ensure consistency

Note: If the activity with the least total float has positive 25 working days of total float, expect only a 575 working day delaying impact to the milestone.

Why It Matters: Adding 600 working days is more than two years duration, introducing a dramatic impact to program completion. Failing the test indicates either broken logic exists or hard target dates prevent delays to successor path activities.

Corrective Action: Address missing logic & applied hard target date issues.
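The note above implies a simple expectation for the push-forward result: the milestone should slip by the added duration minus any positive total float on the selected activity. A minimal sketch of that arithmetic, with illustrative values rather than measurements from a real IMS:

    # push_forward.py - sketch of the expected outcome of the push forward test
    ADDED_DAYS = 600     # working days added to the selected activity's duration
    total_float = 25     # the activity's total float, working days (illustrative)

    # Positive float absorbs part of the added duration before the program
    # completion milestone moves, per the note above (600 - 25 = 575).
    expected_slip = ADDED_DAYS - max(total_float, 0)

    observed_slip = 575  # working days the milestone early finish moved (illustrative)
    verdict = ("PASS" if observed_slip >= expected_slip
               else "FAIL: check for broken logic or hard target dates")
    print(f"expected >= {expected_slip}d, observed {observed_slip}d -> {verdict}")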


Test 26 – Program Completion Trace Test

Description: Determines the % of non-LOE, incomplete activities logically tied to program completion.
Tip: Feed-out activities detected during this test should have documented rationale. Note any hard target dates assigned.

How to Determine:
1. Ensure the LOGICT view has been copied from the Open Plan Library to the IMS project to enable trace logic functionality to run properly
2. Select the program completion milestone
3. Select Trace Logic from the Add-Ins menu, using User Numeric Field 1
4. Show the AF QL Assessment view
5. Apply QL filter: QL_26A_Program_Completion_Trace_Test_Numerator
6. Observe & record count
7. Apply QL filter: QL_26B_Program_Completion_Trace_Test_Denominator
8. Observe & record count
9. Divide the numerator (N) count by the denominator (D) count

Goal: 95% or greater.
See Test 27, No LOE in Path to Program Completion, with respect to detected LOE.
Note: Review activities not detected in the path by selecting those for which Trace Logic equals 0.00 before continuing with other tests. Activities not logically tied to the program completion milestone are identified with 0.00. The program completion milestone selected for the trace is identified with 3.00.

Why It Matters: Although a percentage is calculated for the test, it is more meaningful to review suspect activities. Even a relatively few significant activities without a successor path to program completion are reason for concern. Ideally all incomplete, non-LOE, non-subproject, non-foreign-subproject, & non-foreign-project activities are logically tied to the completion milestone.

Corrective Action: Investigate activities not detected by the test & address missing successor path logic to the milestone. Essential activities not logically tied to program completion render the IMS not predictive & invalidate the critical path.


Test 27 – No LOE in Path to Program Completion

Description: Use the Program Completion Trace Test set-up for this check.
Tip: Identify LOE activities detected as having logical successor paths to program completion.

How to Determine (if Test 26 was previously performed, skip to step 5 because steps 1-4 have already been completed):
1. Ensure the LOGICT view has been copied from the Open Plan Library to the IMS project to enable trace logic functionality to run properly
2. Select the program completion milestone
3. Select Trace Logic from the Add-Ins menu, using User Numeric Field 1
4. Show the AF QL Assessment view
5. Apply QL filter: QL_27_LOE_to_Program_Completion
6. Review the LOE activities detected & investigate to confirm whether they are logically tied to discrete activities & milestones

Goal: No LOE tied to discrete effort.

Why It Matters: By definition, LOE does not reflect schedule slips & may mask an accurate critical path determination. LOE should not be logically tied to discrete work & should not be part of the critical path.

Corrective Action: Investigate & remove LOE logic to discrete activities & program completion to ensure LOE will not become part of the critical path. Recommend using a LOE completion milestone to terminate LOE logic if necessary.

Test 28 – Appropriate Target Dates Applied to Endpoint Milestones

Description: Verify related milestones have appropriate target types that provide meaningful schedule measures.
Tip: Missing targets diminish program management prioritization. Avoid using hard target types that override the predictive nature of the logic network.

How to Determine: Identify & review the endpoint milestones to ensure appropriate, documented targets provide meaningful total float values & permit driving predecessors to establish forecast dates.
Note: The method & rationale for establishing need dates (late dates) should align with IMS Supplemental Guidance documentation.

Why It Matters: Need dates reflect management's target to measure progress against. Targets can enable accurate total float calculation & permit precedence logic impacts to the IMS.

Corrective Action: Validate appropriate targets used on endpoint milestones. Consider using documented Not Later Than or On Target dates.


GASP Predictive Evaluation


Test 29 – Critical Path Length Index (CPLI)

Description: Project performance indication of the ability to finish on time.

How to Determine:
1. Open view: AF QL CPLI
2. Determine the working days duration from Timenow to the program completion early finish date in the IMS, A (critical path length):
   Add a dummy activity and apply a SNET = Status Date
   Adjust the duration of the dummy activity to match the early finish of the last linked activity
3. Add the total float, B (least positive or negative value), to A
4. Divide the total (A + B) by A (critical path length, as determined above): CPLI = (A + B) / A

Goal: Should not be less than 0.95, with a target of 1.00 (>1.00 is favorable, <1.00 is unfavorable).

Why It Matters: Although geared towards performance, this test reflects IMS realism of completing on time & is meaningful when satisfactorily passing all previous GASP tests.
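The CPLI arithmetic reduces to the single formula above: CPLI = (A + B) / A, where A is the critical path length in working days from Timenow to program completion and B is the total float on that path. A minimal sketch with illustrative numbers (not from any real program):

    # cpli.py - sketch of the Test 29 CPLI calculation
    def cpli(critical_path_length: float, total_float: float) -> float:
        """CPLI = (A + B) / A; 1.00 is on plan, below 0.95 is a red flag."""
        return (critical_path_length + total_float) / critical_path_length

    A = 200   # working days, Timenow to program completion early finish (illustrative)
    B = -12   # least (here negative) total float on the path (illustrative)

    value = cpli(A, B)  # (200 - 12) / 200 = 0.94, which would be unfavorable
    print(f"CPLI = {value:.2f} (goal: not less than 0.95, target 1.00)")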


Part 3 – Integrated Master Schedule (IMS) Comprehensive Assessment

IMS Comprehensive Assessment

This part of the IMS Assessment Process document provides the scope, steps, and procedures for performing an IMS Comprehensive Assessment.

Scope of the IMS Comprehensive Assessment

The IMS Comprehensive Assessment is the most detailed review of the IMS. The IMS Quick Look Assessment determines the mechanical soundness of an IMS. Further tests are required in the IMS Comprehensive Assessment to determine if all aspects of the Generally Accepted Scheduling Principles (GASP) tenets are achieved. The IMS Comprehensive Assessment considers many program artifacts to create a more extensive analysis. The table below highlights the differences between the Quick Look and the IMS Comprehensive Assessment.

Objective
  Quick Look: Determine schedule health
  Comprehensive: Determine root cause(s) for significant schedule health discrepancies; identify preclusive action(s)

Scope
  Quick Look: First five GASP tenets (mechanical checks of Complete, Traceable, Transparent, Statused, and Predictive)
  Comprehensive: All eight GASP tenets (adds Resourced, Usable, and Controlled tenets)

Data Analyzed
  Quick Look: Single deliverable IMS, Basis and Assumptions, IMS Supplemental Guidance Document, IMS Field Mapping / Data Dictionary (if separate document)
  Comprehensive: Extensive data call addressing most artifacts that interface with the IMS; may address multiple IMS deliveries for trends

Expertise Required
  Quick Look: Scheduler or Schedule Analyst
  Comprehensive: Senior or Master Scheduler

Time Required
  Quick Look: One or two days
  Comprehensive: One or more weeks depending upon tests selected

Tailoring
  Quick Look: All 29 tests should be performed to adequately assess schedule health
  Comprehensive: Only those tests needed to address the reasons for the assessment need to be performed

Tools Required / Suggested
  Quick Look: Run!23 for MSP and the BK3 file for OPP views and filters, and the reporting template provided with the AF Process document
  Comprehensive: Third party schedule analysis tools for added efficiency in performing the IMS Comprehensive Assessment

The IMS Comprehensive Assessment covers areas not addressed by the Quick Look, particularly the last three GASP tenets. The IMS Comprehensive Assessment also explores additional aspects of the first five GASP tenets, reaching beyond simple observations to determine the root cause and preclusive actions appropriate to prevent like errors, or to recommend / enforce improvements in future schedules.


In addition to more detailed schedule health assessment tests, the IMS Comprehensive Assessment investigates project schedule performance issues. A number of tests and specific measurements within those tests seek to focus on areas and causes of poor project performance. The Quick Look is predominately a schedule health assessment of a single deliverable IMS. The IMS Comprehensive Assessment may examine multiple submissions of an IMS over time to explore performance trends. Performance results may indicate the thoroughness of the IMS planning and the ability of the IMS to be an effective management tool.

Reasons for an IMS Comprehensive Assessment

There are several reasons the Air Force requires an IMS Comprehensive Assessment:

1. The program may be approaching a major decision point and the credibility and effectiveness of the schedule to be used in that decision may need to be verified.
2. The IMS has not been useful in predicting future performance, rather only recording slips and delays as they occur. Thus, the IMS's use as a predictive tool needs to be evaluated.
3. A Schedule Risk Assessment (SRA) is pending and there is a need to ensure the IMS is ready and demonstrates "fitness" for the simulation and the ability to project completion dates and their probabilities.
4. Significant discrepancies were discovered during a Quick Look and the program leadership needs to identify root cause(s) and preclusive action(s) through a more detailed and focused analysis.
5. The program leadership desires to complete an assessment of the IMS against the eight GASP tenets to gain assurance of its alignment with program artifacts, its predictive capabilities, and maturity for use as a decision making tool.
6. There is a need to evaluate project schedule performance and schedule performance trends.
7. Program leadership has surfaced areas of concern regarding the program schedule and needs to determine if corrective actions are necessary.
8. There is a need to evaluate subcontractor schedule integration.
9. Program leadership wants to understand potential schedule risks and opportunities not previously identified.

Expertise Required for the IMS Comprehensive Assessment

The IMS Comprehensive Assessment should be performed by experienced schedulers or senior schedule analysts. The process for IMS Comprehensive Assessments has been written with the assumption that scheduling subject matter experts (SMEs) are leading the effort. The IMS Comprehensive Assessment will likely require participation on the part of the contractor to include program SMEs.

A number of the tests in the IMS Comprehensive Assessment require detailed knowledge of the applicable scheduling software and the use of third party schedule analysis tools (such as Steelray Project Analyzer or Acumen Fuse). Attempting many of the tests without these tools can be very labor intensive. Later discussion explains tailoring of the IMS Comprehensive Assessment.


Tailoring Tests / Activities in the IMS Comprehensive Assessment

There are a multitude of tests and investigative activities that can be performed in an IMS Comprehensive Assessment. The tests and activities selected depend upon the results of the Quick Look and the reason for conducting the Comprehensive Assessment. Having selected the applicable tests, the scheduler also validates the criteria to apply for each test. The scheduler documents the tests selected and the rationale for each criterion for inclusion in the assessment report.

The first five GASP tenets are examined in the Quick Look. If only minor discrepancies were discovered during the Quick Look in a particular GASP tenet, the scheduler may elect not to pursue any additional tests in that GASP tenet. The IMS Comprehensive Assessment Test / Activity List is organized by GASP tenet to facilitate such tailoring.

The PMO and scheduler discuss the next steps where the Quick Look results did not meet the expected standard or guideline. As an example, if the percent of tasks without logic is twice the standard / guideline, the scheduler may elect to review: (1) the corrections performed after the Quick Look, to verify they were appropriate; (2) any remaining tasks without logic, to determine if potential critical tasks are missing from the program critical path; and (3) other tasks, to assure they have appropriate logic. The scheduler may elect to document the test plan in a spreadsheet and include the following information:

Test Selected
Performance Expected
Test Results
Recommendation for further investigation or recommended corrective actions

The scheduler performs all the selected tests. As results feed back from the tests, the scheduler may elect to include additional tests where doing so would help identify root causes. After the tests are complete, the scheduler begins to analyze the results. For each deficient result, its impact on the ability to execute the program is determined and included in the assessment report.

Steps in the IMS Comprehensive Assessment

The chart below shows the key steps in the IMS Comprehensive Assessment, with detail for each step following the chart.


[Chart: IMS Comprehensive Assessment Steps. Key steps: Perform IMS Quick Look; Update IMS; Go-No Go Decision; Assign Assessment Lead; Scope Assessment; Build Assessment Schedule; Issue Data Call; PMO Gathers Data; Contractor / PMO Participation; Document Assessment Results; Results Meeting; Corrective and Preclusive Actions.]

The following are the key IMS Comprehensive Assessment steps, beginning with the Quick Look:

Perform IMS Quick Look

If the Quick Look Assessment was not performed earlier and the PMO is moving directly into a Comprehensive Assessment, a Quick Look will need to be performed. The results of the Quick Look Assessment are provided to the contractor so that the IMS may be improved. It is imperative that the PMO communicate contractor commitment and progress toward IMS improvements to the scheduler performing the IMS Comprehensive Assessment to help determine test and criterion decisions.

Update the IMS

After the contractor improves the IMS based on the Quick Look results, the PMO provides the IMS to the scheduler responsible for the assessment.

Go-No Go Decision

Upon satisfactorily concluding the IMS Quick Look Assessment, the Go-No Go decision for the Comprehensive Assessment is made. If the Quick Look revealed significant issues with the IMS and those issues have not been corrected, the driving reason for proceeding with a Comprehensive Assessment may be to discover root causes or complete the remaining GASP tenet checks. If the IMS is still in an unacceptable condition, the PMO may elect to delay the Comprehensive Assessment. If the IMS has been adequately corrected, the Comprehensive Assessment can begin.


Assign Assessment Lead

Typically the scheduler is assigned as the IMS Comprehensive Assessment Lead and manages data requirements / delivery and PMO SME participation, as required.

Scope Assessment

Identify specific tests and checks to perform for the IMS Comprehensive Assessment. Include the previous documents needed for trend analysis.

Build Assessment Schedule

The PMO and scheduler decide on a plan and timeline for the IMS Comprehensive Assessment and consider resource availability.

Issue Data Call

The PMO sends correspondence to the contractor, including the data call, explaining the nature of the assessment and setting expectations. The data call items will be tailored to the agreed upon scope of the Comprehensive Assessment. The PMO requests specific, non-CDRL data from the contractor based on the data call list. Previous reporting period data may be required to facilitate trends analysis. The data call items may include, but are not limited to:

WBS and WBS Dictionary
Integrated Master Plan
Integrated Master Schedule (CDRL deliverable)
IMS file used to perform SRA
Dollarized Responsibility Assignment Matrix (RAM)
Subcontract Management Plan
Subcontractor schedules (if applicable)
Control Account Plan (CAP)
Master Phasing Schedule / Milestone Chart / Top Tier Program Schedule
IMS Supplemental Guidance (or IMS Basis and Assumptions (B&A)), Data Dictionary / IMS Field Mapping (if separate document)
Program Monthly Calendar or "Business Rhythm"
Desktop procedures, if not included in the IMS Supplemental Guidance
Staffing Profile (including previous periods as identified)
Quantifiable Backup Data (QBD) for Percent Complete EVT work packages
Resource Histogram (including previous periods as identified)
Latest Estimate at Completion (EAC) / Latest Revised Estimate (LRE)
Latest Monthly IMS Analysis Narrative
Applicable Variance Analysis Reports (Format 5)
Latest Schedule Health Assessment Results
Program Budget Logs
Baseline Change Logs
Contractor Performance Reports
Statement of Work (SOW), Statement of Objectives (SOO), Performance Work Statement (PWS)
System Engineering Management Plan (SEMP)
Basis of Estimate (BOE)


Work Authorization Documents (WAD)
GFE / GFI Lists
Risk and Opportunities Management Plan (ROMP)
Risk Register
EVM System Description (EVMSD)
Additional program specific data artifacts as identified by the scheduler

Note: Items that are contract deliverables (CDRL Items) need not be included on data call requests.

PMO Gathers Data

The scheduler gathers PMO information including System Metric and Reporting Tool (SMART) data.

Contractor / PMO Participation

The contractor provides the data requested in the data call to the PMO. Depending upon the roles and responsibilities agreed to when the Comprehensive Assessment plan was completed, a number of coordination meetings between the contractor and PMO may be appropriate. Establish the required amount and frequency of telephone or video conference calls between the contractor and PMO based on the assessment scope and schedule. Depending upon discoveries during the IMS Comprehensive Assessment, additional data may need to be requested from the contractor. These meetings will review status and track new and existing action items. The government scheduler leads these meetings.

Document Assessment Results

Develop the report that includes the assessment results and recommended actions for any issues. Provide this to the Government PMO.

Results Meeting

A Results Meeting is conducted with the PMO and the contractor to present the outcome of the IMS Comprehensive Assessment.

Corrective and Preclusive Actions

Based on the results of the assessment, the PMO and contractor will agree on a corrective action plan.

IMS Comprehensive Assessment Test / Activity List

The information on the following pages shows the possible activities in a Comprehensive Assessment. Tests that were performed in the predecessor Quick Look are cross referenced.

Complete Tenet Comprehensive Assessment Activities

Baseline Durations Greater than 2 Months

Quick Look Assessment Cross Reference: Test 1

Rationale: Smaller task durations provide granularity for precise performance measurement.

Activity: Review justifications for baseline durations over 44 working days. Check forecast durations as a reflection of project performance. The difference between the number of tasks with over 44 forecast working days and the number of tasks with over 44 baseline working days could be an indicator of project schedule performance issues. Find similar tasks (by portion of task name) and compare baseline / forecast duration deltas. For MSP schedules, check that baseline duration is the calculated difference between baseline finish and baseline start.

Tip: Justification or explanation may be in task notes or in justification code fields. Check the IMS Basis and Assumptions (B&A) or IMS Supplemental Guidance documents for identified justification codes.

Forecast Durations Greater than 2 Months

Quick Look Assessment Cross Reference: Test 2

Rationale: Smaller task durations provide granularity for precise performance measurement.

Activity: Review justifications for forecast durations over 44 working days. Check forecast durations as a reflection of project performance. The difference between the number of tasks with over 44 forecast working days and the number of tasks with over 44 baseline working days could be an indicator of project schedule performance issues. Compare baseline duration to forecast duration. If forecast durations break the threshold but baseline does not, schedule performance issues may be indicated and corrective action may be necessary to respond to schedule variance.

Tip: Justification or explanation may be in task notes or in justification code fields. Check the IMS B&A or IMS Supplemental Guidance documents for identified justification codes.

Forecast Durations Greater than 2 Months in a 3 Month Look Ahead

Quick Look Assessment Cross Reference: Test 3

Rationale: Near-term shorter duration tasks reflect clarity and provide granularity for precise performance measurement.

Activity: Review justifications for forecast durations over 44 working days. Examine justification codes if used to rationalize durations. If duration analysis shows more near-term deltas compared to the future term, this may indicate lack of attention to reflecting accurate task information. Compare baseline duration to forecast duration. If forecast durations break the threshold but baseline does not, schedule performance issues may be indicated and corrective action may be necessary to respond to schedule variance.

Tip: Justification or explanation may be in task notes or in justification code fields. Check the IMS B&A or IMS Supplemental Guidance documents for identified justification codes.

Estimated Durations

Quick Look Assessment Cross Reference: Test 4

Rationale: Tasks with estimated durations may indicate incomplete planning.

Activity: Review tasks with estimated durations. Determine the reason for estimated durations and plans to update estimates.

Tip: Estimated durations may occur because one day is the MSP default duration and schedulers may not reenter the duration to remove the estimated duration. A scheduler may turn off the question mark in the duration field that flags estimated durations. In Open Plan Professional, estimated task durations have durations of zero and are not milestones.

Missing Baseline Dates and Baseline Duration

Quick Look Assessment Cross Reference: Tests 5 and 6

Rationale: The IMS must be baselined to measure performance.

Activity: Check that baseline dates and baseline durations agree, as it is possible to manually enter baseline dates in MSP. Baseline durations do not need to be checked when using Open Plan Professional. Perform schedule vertical integration checks to ensure that summary task baseline dates reflect subordinate task baseline dates (see Traceable: Confirm Vertical Integration).

Tip: MSP has a function "ProjDateDiff" that may be used to calculate the working day difference between baseline start and finish dates. ProjDateDiff works properly for contiguous tasks. Several third party schedule analysis tools check for vertical schedule integration.
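Outside MSP, the same working-day consistency check can be scripted against an export. A minimal sketch using numpy's busday_count; the file name, column names, inclusive-duration convention, and default Mon-Fri calendar are all assumptions, so mismatches it reports should be treated as leads for review rather than confirmed errors:

    # bl_duration_check.py - sketch: compare the stored baseline duration to the
    # working-day span between baseline start and finish dates
    import csv
    import numpy as np

    with open("ims_baseline.csv", newline="") as f:
        # assumed columns: ID, BLStart, BLFinish, BLDuration (ISO dates, working days)
        for row in csv.DictReader(f):
            # busday_count excludes the end date, so add 1 for an inclusive span;
            # uses numpy's default Mon-Fri calendar, which a real IMS may not match
            span = np.busday_count(row["BLStart"], row["BLFinish"]) + 1
            if span != int(row["BLDuration"]):
                print(f'{row["ID"]}: stored {row["BLDuration"]}d, dates span {span}d')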

Cross Reference Fields

Quick Look Assessment Cross Reference: Test 7

Rationale: Data cross reference fields should exist and be populated. Data fields required: CAM, IMP, WBS, SOW, EVT/WP/PPkg, and CA (OBS/IPT is also recommended but not required). Cross references ensure all artifacts track to each other and enable better management.

Activity: The Quick Look checked for the presence of the cross reference fields and that they were appropriately populated. This check examines the cross reference fields for correct data. Review the IMS B&A or IMS Supplemental Guidance documents to determine user defined field use. Check that user defined field entries are complete and valid. Check Baseline Change documents against the IMS to ensure that changes are reflected in cross reference fields. This may also include a check for inappropriately or inconsistently populated fields, such as numbers in the CAM name field or names in the WBS field instead of numbers. (See the IMP Cross Reference check below for specific procedures when checking IMP cross references.)

Tip: Sorts and auto filter selections can assist with checks against these other documents. For example, an auto filtered view of the SOW cross reference field may be compared to the table of contents of the SOW to ensure all SOW paragraphs are represented in the IMS.

Duplicate / Blank Names

Quick Look Assessment Cross Reference: Test 8

Rationale: Duplicate or blank task names make schedule progress reporting and analysis difficult and make the use of filters and groupings impractical.

Activity: Check that IMS task nomenclature is organized. Action verbs described in the IMP should be used for each task where possible. IMP tasks should be written in past tense. IMS tasks should be written in present tense and are most effective when they begin with a present-tense action verb. Check that task descriptions are written so the exit criteria for task completion are clear. Task names should describe the scope in a manner that clearly defines the intent, such as "Analyze Flight Survivability Test Data". Check that summary tasks are not needed to understand task meaning.

Tip: For large schedules, a filter for action verbs may be created and used to find exceptions. Be aware that searching on action verbs or words may result in detecting similar words used in different contexts; results should be vetted.

Review IMS Basis and Assumptions and IMS Supplemental Guidance Documents

Quick Look Assessment Cross Reference: N/A

Activity: Check that the IMS B&A or IMS Supplemental Guidance (or similar, but differently named guidance) documents exist. Topics should include data dictionary, tools, tool settings, file architecture, calendar use, baseline process, field mapping, treatment of handoffs and external tasks, supplier schedule integration, health metrics, schedule maintenance, schedule techniques, critical and driving path analysis, and statusing procedures. Review the documents for accuracy and completeness. Determine if the IMS complies with these documents. Note: The data dictionary may be a separate document.

Tip: The requirement for the IMS B&A exists in DI-MGMT-81650. The requirement for an IMS Supplemental Guidance document may exist in the EVMSD, or in contractor schedule guidance processes and procedures.

Check GFE, GFI, and GFP in Contractor IMS

Quick Look Assessment Cross Reference: N/A

Activity: Check for GFE, GFI, and GFP in the IMS. Compare these to the lists in the contract and verify GF material is contained in the IMS. Validate delivery dates if included in the contract. Search the task / activity name field and related custom fields in the IMS to verify Government furnished items are included.

Tip: Create task description filters to determine if Government Furnished items are in the IMS.

IMS Scope Check

Quick Look Assessment Cross Reference: N/A

Activity: Check for all scope in the IMS. The IMS is required by DI-MGMT-81650 to represent the entire contract. Compare the Performance Work Statement (PWS), SOW, SOO, WBS, and WAD to ensure all scope of work is addressed. Note: LOE activities may not be in the IMS; check the IMS B&A or IMS Supplemental Guidance. Verify that subcontracted effort is included in the IMS.

Tip: Use sorts or auto filters on cross reference fields to compare to other documents such as the SOW or CWBS.

Review Earned Value Techniques

Quick Look Assessment Cross Reference: N/A

Activity: Compare baseline durations to the selected EVT. Look at relative percentages of each EVT type. Examine "No EV" work packages to ensure EVTs are correctly applied. Check that EVTs used are consistent with those identified in the EVMSD. Ensure planning packages are converted to work packages in the appropriate look ahead period, according to the EVMSD / Program Instructions (see Transparent: Verify Rolling Wave Planning).

Tip: Create a table or view that shows baseline dates or baseline duration and the EVT field. Use group by EVTs to determine the relative percent of each type of EVT. Alternate: Group by durations to align with the EVT.
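The relative-percentage view in the tip can also be produced in script form. A minimal sketch, assuming work package EVT assignments have been exported to a simple list (the values shown are illustrative, not from any real program):

    # evt_mix.py - sketch: relative percentage of each Earned Value Technique
    from collections import Counter

    evts = ["0/100", "50/50", "Percent Complete", "LOE", "Percent Complete",
            "Milestone", "Percent Complete", "LOE"]      # illustrative export

    counts = Counter(evts)
    total = sum(counts.values())
    for evt, n in counts.most_common():                  # largest share first
        print(f"{evt:>18}: {n:3d}  ({100.0 * n / total:.0f}%)")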

Evaluate Essential Subcontracted or External Work

Quick Look Assessment Cross Reference: N/A

Activity: Check that subcontracted work that is within the scope of the contract is integrated yet distinguishable from internal work. Check that key external work is reflected in the IMS as milestones (at a minimum), representative tasks, or entire external schedules.

Tip: Major subcontracted efforts are often included as separate control accounts. External work may be reflected as feed-in milestones. Check the field mapping to determine if specific codes are used to identify subcontracted and external work, and check the IMS Supplemental Guidance for the identified subcontracted / external approach.

IMP Cross Reference

Quick Look Assessment Cross Reference: N/A

Activity: Check for IMP cross references. The IMS may have IMP Events, Accomplishments

and Criteria duplicated in the IMS. Other IMSs may have a cross reference field that uses IMP

coding structure to reflect Events, Accomplishments, and Criteria. Determine procedure from the

IMS B&A or IMS Supplemental Guidance document. Check IMP content or cross references for

accuracy and completeness.


Program Milestones

Quick Look Assessment Cross Reference: N/A

Activity: Evaluate if Program Milestones are clearly identified in the IMS. Confirm milestones’ name, phase, and baseline dates align with contractual information. Check that program milestones have Deadlines / Targets reflecting the schedule margin approach identified in the IMS Supplemental Guidance.

Tip: Program milestones appear in a view or table enabling easy identification and are

aligned with the appropriate phase and PoP. Coordinate with Traceable Vertical Integration.

Clarification may be provided in the IMS B&A or IMS Supplemental Guidance.

Missing Logic

Quick Look Assessment Cross Reference: Test 9

Rationale: Missing logic calls into question schedule soundness; complete logic is essential to critical path (CP) validity.

Activity: Review all tasks that have missing logic. Review tasks with missing logic to consider

if potential exists for inclusion in CP. Review the paths between major milestones with

appropriate Government SMEs. Verify that the sequence of activities reflects the systems

engineering and integration approach defined in the Systems Engineering Plan or Systems

Engineering Management Plan.

Tip: Filter for tasks with missing logic and then browse the tasks, looking for tasks that need

predecessors or successors.
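
Example (illustrative): a minimal Python sketch of such a filter, assuming the tasks have been exported from the scheduling tool into plain dicts; the field names (uid, summary, preds, succs) are hypothetical and will vary by tool.

    tasks = [
        {"uid": 101, "summary": False, "preds": [100], "succs": []},
        {"uid": 102, "summary": False, "preds": [100], "succs": [103]},
        {"uid": 103, "summary": False, "preds": [102], "succs": [104]},
    ]

    def missing_logic(tasks):
        # Detail tasks lacking a predecessor or a successor are exceptions
        # to review for potential critical path (CP) membership.
        return [t["uid"] for t in tasks
                if not t["summary"] and (not t["preds"] or not t["succs"])]

    print(missing_logic(tasks))  # -> [101]: no successor identified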

Traceable Tenet Comprehensive Assessment Activities

Summary Logic (& Constraints / Deadlines)

Quick Look Assessment Cross Reference: Test 10.

Rationale: Applying logic or constraints to summary tasks potentially obscures impacts to

detailed tasks and may hinder analysis.

Activity: Test for the ratio of summary tasks to effort tasks. A schedule with at least one

summary for every 20 tasks will be easier to collapse for display and analysis. Conversely, having

too many summary tasks will make the schedule difficult to read and understand.

Tip: Use schedule application or third party schedule analysis tools to count tasks and

determine this ratio. Once the Quick Look summary task with logic check has been

evaluated, then test for the ratio of summary to effort tasks.

Finish-to-Start Relationships

Quick Look Assessment Cross Reference: Test 11


Rationale: FS relationships avoid scheduling tasks in parallel. Other relationships can cause

resource conflicts.

Activity: The Quick Look test examined the percent of FS relationships. Review the use of other

than FS relationships. The rationale for use of other than FS relationships should be documented

in task notes. Check that the rationale justifies the use of the other relationships. Note if the use

of other predecessors or successors would increase the use of FS relationships.

Tip: Filter on the predecessor field in MSP or use relationship views in Open Plan

Professional to see the other relationships.

Start-to-Start or Start-to-Finish Successor w/o also Finish-to-Start or Finish-to-Finish

Quick Look Assessment Cross Reference: Test 12

Rationale: Tasks with only SS or SF successor relationships need to have at least one FS or FF

successor to another task or the total float values will be meaningless.

Activity: If exceptions from the Quick Look test still exist in a schedule, determine impact of

not having finish successors on driving or critical paths. If corrected, check that added FS

successors are appropriate for the activity. Check that all discrete tasks in the IMS have at least

one non-FF predecessor.

Tip: Investigate the successors of the SS successor to determine if a similar FS or FF

relationship for the focus task is appropriate.

Total Float > 3 Months

Quick Look Assessment Cross Reference: Test 13

Rationale: Suggests tasks starting too early or missing an identified successor. Excessive total

float could also reflect missing scope in the IMS.

Activity: Determine if cause of high total float is bad or missing logic. If logic is sound,

determine reason for starting work early. Discuss specific high total float examples with PMO

and contractor. Check for documentation of acceptable high total float conditions and that the procedures for designating these conditions are contained in the IMS B&A or IMS Supplemental Guidance documents.

Tip: Create a table that shows total float sorted high to low and displays successor tasks.

Browse this view to identify potential problems. Isolate the tasks that do not have zero free

float, eliminating the filtered tasks in a chain, to focus on the end task in the chain.

Resolving a missing or applying a more appropriate successor to the end task may correct

the high total float for all tasks in the chain.
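
Example (illustrative): a minimal Python sketch of the filtering described above, assuming total float and free float (in working days) are available per task; 60 working days is used here as a rough 3 month threshold and should be set to the program’s own limit.

    def high_float_chain_ends(tasks, threshold_days=60):
        # Keep high total float tasks with nonzero free float: these end a
        # chain, so fixing one successor may clear the whole chain.
        hits = [t for t in tasks
                if t["total_float_days"] > threshold_days
                and t["free_float_days"] != 0]
        return sorted(hits, key=lambda t: t["total_float_days"], reverse=True)

    tasks = [
        {"uid": 1, "total_float_days": 120, "free_float_days": 0},
        {"uid": 2, "total_float_days": 120, "free_float_days": 120},
    ]
    print([t["uid"] for t in high_float_chain_ends(tasks)])  # -> [2]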

SNETs / FNETs Beyond 3 Month Look Ahead

Quick Look Assessment Cross Reference: Test 14


Rationale: Some use of constraints to refine dates due to resource availability is acceptable.

Constraints should be used less frequently beyond a 3 month Look Ahead period than near-term,

as predicting resource availability is not practical in the far-term. Excessive use inhibits critical

path analysis.

Activity: Discuss the use of constraints delaying the forecast start / finish, rather than logic, with

the contractor scheduler. Examine the rationale for constraint use provided in task notes.

Constraint use should reflect known dates. Since resource availability is often a primary reason

for constraint use, conduct a bow wave analysis to determine if adequate resources exist.

Determine if constraints delaying the forecast start / finish are creating a bow wave of work.

Check for the percentage of milestones using constraints affecting the forecast dates.

Tip: Consider using third party schedule analysis software to test for tasks riding the status

date and for bow wave analysis.

SNETs / FNETs within a 3 Month Look Ahead

Quick Look Assessment Cross Reference: Test 15

Rationale: Some use of constraints to refine dates due to resource availability is acceptable.

Excessive use inhibits critical path analysis.

Activity: Discuss the use of constraints delaying the forecast start / finish, rather than logic, with

the contractor scheduler. Examine the rationale for constraint use provided in task notes.

Constraint use should reflect known dates. Resource availability is often a primary reason for

constraint use. Check for the percentage of milestones using constraints.

Tip: Consider using third party schedule analysis software to test for tasks riding the status

date and for bow wave analysis. Sort unstarted tasks by start and check for SNET dates and

verify no other appropriate predecessors exist.

Supplemental Logic Metrics

Quick Look Assessment Cross Reference: N/A

Activity: There are a number of checks for sound logic in the schedule. The logic density

measures the average number of relationships per activity or task. An average greater than three relationships per task is reason for concern. The milestone ratio is the ratio of milestones to effort tasks.

There should be at least one milestone for each 20 tasks. Logic hot spots are tasks with a large

number of predecessors or successors or both. When logic hot spots are on the critical path they

should be of significant concern.

Tip: Consider using third party schedule analysis software to perform these logic tests. A

counting capability combined with filters will enable these metrics to be generated from the schedule.
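
Example (illustrative): a minimal Python sketch of these three checks over exported task data. Field names are hypothetical, and note that summing predecessors and successors counts each relationship at both of its ends, so adjust to the program’s own definition of logic density.

    def logic_metrics(tasks, hot_spot_threshold=10):
        detail = [t for t in tasks if not t["summary"]]
        # Logic density: average relationships per detail task.
        rels = sum(len(t["preds"]) + len(t["succs"]) for t in detail)
        density = rels / len(detail) if detail else 0.0
        # Milestone ratio: at least one milestone per 20 effort tasks.
        milestones = sum(1 for t in detail if t["milestone"])
        effort = len(detail) - milestones
        ratio = milestones / effort if effort else 0.0
        # Logic hot spots: tasks with a large number of relationships.
        hot_spots = [t["uid"] for t in detail
                     if len(t["preds"]) + len(t["succs"]) >= hot_spot_threshold]
        return density, ratio, hot_spots

Flag a density above three relationships per task, a ratio below one milestone per 20 tasks, and any hot spot that sits on or near the critical path.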

Compare Deadlines (Constraints / Targets) and Milestone Baseline Dates

Quick Look Assessment Cross Reference: N/A


Activity: Check for Major Program Milestones with later deadline dates than baseline dates

without comments. Review comments and evaluate rationale for later deadlines.

Tip: This test can be performed with a table or view containing milestones, deadlines or

target dates, and baseline finish dates. Alternatively, create a flag field with a formula that detects the later deadlines.
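
Example (illustrative): a minimal Python sketch of that flag-field formula, assuming milestone records carry hypothetical deadline, baseline_finish, and note fields.

    from datetime import date

    def late_deadline_flags(milestones):
        # Flag milestones whose deadline / target is later than the
        # baseline finish and which carry no explanatory comment.
        return [m["uid"] for m in milestones
                if m["deadline"] > m["baseline_finish"] and not m["note"]]

    ms = [{"uid": 7, "deadline": date(2013, 3, 1),
           "baseline_finish": date(2013, 1, 15), "note": ""}]
    print(late_deadline_flags(ms))  # -> [7]: later deadline with no rationale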

Compare IMS and Master Phasing Schedule

Quick Look Assessment Cross Reference: N/A

Activity: Check that the contractor IMS matches the Government Master Schedule, Master

Phasing Schedules, Program Roadmap or other presentation material provided for the

Comprehensive IMS Assessment. Check that summarized tasks and program milestones align

with the Master Phasing Schedule. Check that IMS dates match dates in all EVMS artifacts.

Tip: Third party software, such as Kidasa Milestones Professional, will electronically link

subordinate schedules to Master or Program Level schedules. Their use will make this test

more efficient.

Deliverables listed in the IMS

Quick Look Assessment Cross Reference: N/A

Activity: Check to ensure all contract deliverables (CDRL or SDRL numbers) are listed or

referenced in the IMS. Deliverables may be referenced in task descriptions, a user defined field

or as a milestone. Cross check deliverables in the IMS to WBS Dictionary, SOW, and CDRL.

Tip: Check the IMS B&A or IMS Supplemental Guidance documents to determine how

deliverables (CDRL Items) are to be shown in the IMS.

Verify Horizontal Integration

Quick Look Assessment Cross Reference: N/A

Activity: The schedule should reflect the handoffs between OBS elements through milestones

and network logic. Determining that horizontal integration is adequate may involve the use of SMEs. Handoffs between product development, integration organizations, and test

organizations are obvious points to check. Check horizontal integration between control accounts

through the use of milestones.

Tip: The check for horizontal integration can often be performed as part of the technical risk

review by having the technical panel review the IMS as part of risk identification. Group

tasks by performing IPT. Then search task description for other IPT names. Ensure there is a

task logic dependency that is consistent with the handoff described by the task. Knowing the

basic relationship between systems engineering, design, development, integration and test

organizations is essential when performing this test. Group tasks by WBS element and

ensure the WBS product is transitioning from one IPT to another as appropriate.


Confirm Vertical Integration

Quick Look Assessment Cross Reference: N/A

Activity: Vertical schedule integration was checked under the Complete Tenet (See QL Test 5 &

6). Typically an IMS contains a top level tier of milestones that represent the overall program at

a high level. Subordinate parts of the schedule should trace to this top level milestone set.

Perform a predecessor logic trace from a program milestone or IMP Event (if included in IMS)

and determine those activities, identified as supporting the milestone / event through task

description or related code field entries that are not logically tied to the end-point. Consider

using a User Defined Field (UDF) for identifying the traced items while performing subsequent

analysis.

Tip: Provide Government SMEs with a relationship trace of activities to milestones in their

area of interest. They can then determine if all essential activities are included in the path to

the milestones.

Transparent Tenet Comprehensive Assessment Activities

Tasks with Leads

Quick Look Assessment Cross Reference: Test 16

Rationale: Leads create time overlaps that are difficult to understand and manage.

Activity: Evaluate the documented reason for the leads. Reasons for leads should be documented

in task notes and possibly in the IMS Supplemental Guidance to describe the application of this

technique. Anticipate acceptable one-day leads to model a same-day finish-to-start condition where

appropriate.

Tasks with Lags

Quick Look Assessment Cross Reference: Test 17

Rationale: Lags can adversely affect analysis and time gaps are difficult to understand and

manage. Lag use is permitted but should not be excessive.

Activity: Evaluate the documented reason for the lags and determine if soft constraints would

better reflect the relationship.

Tip: Combine a filter that shows tasks with lags with a table that shows task notes. If

justification codes are used for lags, display that field and filter on tasks with lag. Investigate

IMS Supplemental Guidance & data dictionary for justification / exclusion codes in user

defined fields. Look for inconsistencies where codes, notes / rationale, and technique

applications are not aligned.

Constraints w/o Rationale

Quick Look Assessment Cross Reference: Test 18


Rationale: Use of any schedule driver other than FS relationships needs to be documented to aid

understanding of the IMS.

Activity: Review all the notes providing rationale for constraints looking for meaningful

explanations. If justification codes or fields are provided, review those as well. Investigate IMS

Supplemental Guidance & data dictionary for justification / exclusion codes in user defined fields. Look for inconsistencies where codes, notes / rationale, and technique

applications are not aligned.

Tip: Combine a filter that shows tasks with constraints with a table that shows task notes.

Lead / Lag w/o Rationale

Quick Look Assessment Cross Reference: Test 19

Rationale: Use of any schedule driver other than FS relationships needs to be documented to aid

understanding of the IMS.

Activity: Review all the notes providing rationale for leads and lags looking for meaningful

explanations. If justification codes or fields are provided, review those as well. Investigate IMS

Supplemental Guidance & data dictionary for justification / exclusion codes in user defined fields. Look for inconsistencies where codes, notes / rationale, and technique applications are not aligned.

Tip: Combine a filter that shows tasks with leads and lags with a table that shows task notes.

Hard Constraints

Quick Look Assessment Cross Reference: Test 20

Rationale: Hard constraints prevent tasks from reflecting predecessor impacts.

Activity: Evaluate rationale for Hard Constraint use. If hard constraints remain in the schedule,

determine the impact on the critical path and possible SRA (if performed with constraints in

place).

Tip: Present the list of hard constraints to the PM and ask if they are valid and necessary,

and for reasons for not replacing these with more appropriate constraints and targets that

permit forecast impacts while reflecting accurate total float values.

Excessive Lags

Quick Look Assessment Cross Reference: Test 21

Rationale: Excessive (greater than 20 working days) lags cannot be statused and tend to hide

detail in schedules.

Activity: Excessive lags may result from poor dependency selection. Examine lags to determine

if alternate relationships could reduce the size of the lag. Also ask the contractor scheduler to


consider replacing lags with documented "no earlier than" type constraints to model the

anticipated start / finish.

Tip: Task notes may provide insight into the selection of the dependent task relationship and

size of the lag, and what constitutes the wait time.

Perform Critical Path (CP) and Driving Path (DP) Identification and Analysis

Quick Look Assessment Cross Reference: N/A

Activity: Review contractor's method for identifying and analyzing critical and driving paths

(check the IMS B&A or IMS Supplemental Guidance documents). Compare CP of previous

IMSs to determine CP movement. Determine if there is a joint contractor / customer agreement

on the definition of "near critical" paths. Evaluate how CP information is distributed to managers

that are on the critical path or the near critical paths. Determine CP / DP & related near paths

where schedule was corrected for invalid status or if previously used hard constraints were

converted to see impact. Also, consider performing CP / DP where code fields or separately

produced CP / DP documents exist & lack documented methodology to identify CP /DP. Check

for multiple calendar use & potential CP / DP impacts. (See Controlled Calendars for additional

calendar use analysis)

Program Completion Trace Test

Quick Look Assessment Cross Reference: Test 26

Analyze Path to Program Completion: Perform a backward trace from program completion

and analyze the tasks on the path. Determine if tasks on the path are not crucial to completion or

if crucial tasks are missing from this path.

Tip: Some scheduling software contains extensive trace features. Others have limited trace

capability. Most third party schedule analysis applications have forward and backward trace

capabilities.

Perform IMS Comparison

Quick Look Assessment Cross Reference: N/A

Activity: Perform a comparison of recent IMS versions to determine if unauthorized changes

have been made. Unauthorized changes are schedule modifications of configuration control

elements without an approved baseline change document. Also, pay particular attention to

changes in forecast duration and modification of logic. These changes could disguise schedule performance issues, especially where near-term schedule slips are observed and program goal end dates risk being overrun. Check whether a total float erosion trend is developing and persists across later deliveries.

Identify any performance trends and unauthorized changes to the IMS. Examine situations where

progress was de-earned.

Tip: Most schedule applications have the capability to compare two versions of the same

schedule. Third party schedule analysis software also has this capability.


Verify Deadlines or Targets

Quick Look Assessment Cross Reference: N/A

Activity: Examine the use of deadlines or targets. Check deadline / target use as described in

B&A or IMS Supplemental Guidance document, and if used as part of the schedule margin

approach. Evaluate how deadlines / targets are used and documented. Check the percentage of

deadline / target use.

Evaluate Schedule Health Assessment (SHA)

Quick Look Assessment Cross Reference: N/A

Schedule Health Assessment (SHA): Determine if contractor prepares and maintains schedule

health metrics. Some contractors have adopted the DCMA 14 Point Assessment as their default

metric. Evaluate quantity and quality of metrics, their use, and schedule health trends. If

possible, validate the contractor's metrics by running the same metrics using another application.

Determine if the contractor's metrics are altered by the use of justification codes. If so, run the

metrics with and without the justification codes to determine the impact of justifications.

Investigate how SHA results are reviewed and applied. Evaluate the SHA trend.

Tip: Several third party schedule analysis applications provide the DCMA 14 Point

Assessment.

Verify Rolling Wave Planning

Quick Look Assessment Cross Reference: N/A

Activity: The IMS may not be detailed planned through to program completion. Check the IMS

B&A and IMS Supplemental Guidance documents for rolling wave planning procedures. Verify

the IMS is consistent with these procedures. Check that only planning packages are changed

during the rolling wave planning process by referencing applicable change logs and determining that newly added tasks and related baseline information align with previous planning package tasks.

Tip: Use EVT field and baseline start field to isolate portions of the IMS that are applicable

to the next rolling wave of detailed planning.

Evaluate Risk and Risk Mitigation Identification

Quick Look Assessment Cross Reference: N/A

Activity: For the IMS to be an effective management tool, it should be closely integrated with

the program risk and opportunity management efforts. Risks with schedule impacts should be

identified in the schedule. A common technique is to establish a user defined field to reflect the

risk number, typically from the risk register or risk management plan. Risk mitigation activities

should also be included in the IMS. Check the IMS B&A or IMS Supplemental Guidance

document to ensure that procedures for risk and risk management integration with the IMS are

established.


Note: Risk Integration is also addressed in the Usable Tenet (See Usable Risk Integration

into the IMS).

Determine Treatment of Schedule Visibility Tasks (SVT)

Quick Look Assessment Cross Reference: N/A

Activity: Review the procedures for the use of SVTs in the IMS B&A or IMS Supplemental

Guidance Documents. Verify that SVTs are being used and managed in accordance with

documented procedures. If the SVTs are more than 10 percent of discrete tasks, determine SHA

metrics without SVTs. Verify that SVTs are not resource loaded. Verify that SVTs have criteria

for progress measurement and are statused.

Verify Program Performance Metrics

Quick Look Assessment Cross Reference: N/A

Activity: Determine if the contractor maintains schedule performance metrics. IMS schedule

performance metrics should be translated into PM reports. Evaluate schedule performance

reports, how they are used, and program trends. A combination of several performance metrics is

appropriate for most schedules. Overall project metrics are important to determine current

performance, as well as predicting project completion. Lower tier performance metrics (task

level) are useful for determining potential problem areas and predicting specific organization or

WBS element performance. Trend analysis is important at both the project and WBS/OBS level

to determine if corrective and preventive measures are effective.

Tip: Consider using Air Force IMS Performance Trends Tool to determine schedule

performance trends.

Statused Tenet Comprehensive Assessment Activities

Invalid Forecast Dates

Quick Look Assessment Cross Reference: Test 22

Rationale: Includes improper status, such as in-progress tasks with forecast dates left of (earlier than) Timenow. The IMS

is not useful for predictions unless accurately statused.

Activity: Invalid forecast dates identified in the Quick Look should be corrected. Review the

process for statusing the schedule. Determine if tasks are being individually statused by CAMs

or Work Package Managers (preferred approach). The process should be documented in the IMS

B&A or IMS Supplemental Guidance documents. Determine if the process is being followed

through investigative follow-up with the contractor scheduler. Check configuration control for

meaningful forecast information. Determine if the contractor scheduler is performing checks

when statusing is complete. Count the number of tasks that have not been statused to the status

date. Then determine the impact on TF and significant milestones if the IMS was statused to

Timenow.


Invalid Actual Dates

Quick Look Assessment Cross Reference: Test 23

Rationale: Actual start or finish dates in the future. Invalid actual date entries imply that the

schedule is not properly statused.

Activity: Invalid actual dates should be corrected. Review the process for statusing the schedule.

The process should be documented in the IMS B&A or IMS Supplemental Guidance documents.

Determine if the process is being followed through investigative follow-up with the contractor

scheduler. Check configuration control of actual information. Determine if the contractor

scheduler is performing checks for this when statusing is complete.
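
Example (illustrative): a minimal Python sketch of this check, assuming exported tasks with hypothetical actual_start / actual_finish fields (None when not yet recorded).

    from datetime import date

    def invalid_actuals(tasks, status_date):
        # Actual dates later than the status date imply improper statusing.
        bad = []
        for t in tasks:
            for field in ("actual_start", "actual_finish"):
                if t[field] is not None and t[field] > status_date:
                    bad.append((t["uid"], field))
        return bad

    demo = [{"uid": 3, "actual_start": date(2012, 10, 5), "actual_finish": None}]
    print(invalid_actuals(demo, date(2012, 9, 21)))  # -> [(3, 'actual_start')]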

Out-of-Sequence (OOS) Status Conditions

Quick Look Assessment Cross Reference: Test 24

Rationale: Any task with an OOS condition renders IMS predicting capabilities unreliable.

Activity: The Quick Look determines if OOS conditions exist. If found, determine cause by

interviewing contractor scheduler. OOS status reflects incorrect logic or not operating to plan.

Sometimes execution strategies are changed after the schedule is developed and the logic needs

to be updated. If the logic is incorrect, it can be updated as part of the statusing process.

Determine if a process exists to detect and prevent OOS conditions.
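
Example (illustrative): a minimal Python sketch that detects one common OOS condition, a task that has started before its finish-to-start predecessor finished; other link types need their own rules, and the field names are hypothetical.

    from datetime import date

    def out_of_sequence(tasks):
        by_uid = {t["uid"]: t for t in tasks}
        oos = []
        for t in tasks:
            if t.get("actual_start") is None:
                continue  # unstarted tasks cannot be out of sequence
            for pred_uid in t["preds"]:
                finish = by_uid[pred_uid].get("actual_finish")
                # FS predecessor unfinished, or finished after this start.
                if finish is None or finish > t["actual_start"]:
                    oos.append((pred_uid, t["uid"]))
        return oos

    demo = [
        {"uid": 1, "preds": [], "actual_start": date(2012, 8, 1), "actual_finish": None},
        {"uid": 2, "preds": [1], "actual_start": date(2012, 8, 10), "actual_finish": None},
    ]
    print(out_of_sequence(demo))  # -> [(1, 2)]: task 2 started before task 1 finished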

Check Tasks Riding the Status Date

Quick Look Assessment Cross Reference: N/A

Activity: Check to see if a number of forecast start dates are close to or equal to the status date.

This may mean that tasks are being pushed out and a bow wave of effort is being created.

Tip: Create a filter to list the detailed tasks close to the status date. Detected tasks may be

more significant if their predecessors are completed (not driving) and if their baseline dates

are earlier than forecast dates (late tasks that may not have been addressed). Count the

number of tasks and compare to previous accounting period schedules to determine if a

negative trend is developing. An alternative is to compare the last two accounting period

schedules and review all changes to forecast start dates.
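
Example (illustrative): a minimal Python sketch of the filter described in the Tip, with hypothetical field names; the five day window is arbitrary and should match the program’s accounting calendar.

    from datetime import date, timedelta

    def riding_status_date(tasks, status_date, window_days=5):
        # Unstarted tasks whose forecast start hugs the status date; count
        # these each period to see if a bow wave trend is developing.
        limit = status_date + timedelta(days=window_days)
        return [t["uid"] for t in tasks
                if t["actual_start"] is None
                and status_date <= t["forecast_start"] <= limit]

    demo = [{"uid": 9, "actual_start": None, "forecast_start": date(2012, 9, 24)}]
    print(riding_status_date(demo, date(2012, 9, 21)))  # -> [9]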

Evaluate Logic Changes

Quick Look Assessment Cross Reference: N/A

Activity: Most contractors do not baseline or configuration control changes to schedule logic.

Using a comparison tool, check the amount of logic changed from one IMS delivery to the next.

Logic changes can disguise schedule delays. Discuss any significant logic changes with the

contractor scheduler and ensure current logic reflects accurate execution strategy.


Tip: Most scheduling application and third party schedule analysis software have a file

comparison function. Acknowledge that completed or in-progress tasks may reflect logic

changes made to enable execution to occur without inducing OOS status conditions.

Note: This activity can also be performed as part of Transparent or Traceable

tenets.

Determine Subcontractor / Supplier Schedule Status

Quick Look Assessment Cross Reference: N/A

Activity: Determine if subcontractor and supplier schedules are being entered into the IMS with

the same status date. Spot check subcontractor schedules against the contractor’s IMS. Check that

subcontractor’s EVTs are consistent with contractor EVTs when data is directly transferred. If

subcontractor schedules are reflected as milestones in the contractor schedule, ensure that

subcontractor predecessors are 100% complete before marking milestones as 100% complete.

Check that subcontractor schedules have Quantifiable Backup Data where appropriate for

percent complete EVT.

Validate Remaining Durations

Quick Look Assessment Cross Reference: N/A

Activity: Verify that remaining durations are accurate. If appropriate, have them reviewed by

Government SMEs. The IMS must reflect sound estimates to completion. Spot check against

history by performing a duration variance analysis. Overly optimistic forecasting can invalidate

project completion predictions. Check resource loading against remaining duration to determine

achievable finish dates (for resource loaded schedules).

Tip: The Air Force IMS Performance Trends Tool has a duration variance analysis function

that will compare task actual and forecast durations to task baseline durations.

Predictive Tenet Comprehensive Assessment Activities

Push Forward Test

Quick Look Assessment Cross Reference: Test 25

Rationale: Used to assess logic network integrity to program completion.

Activity: If the Push Forward Test failed, perform a forward trace on each of the selected tasks

for the test. Determine the cause for the failure. Check to see if documented scheduling practices

in the IMS B&A or IMS Supplemental Guidance documents should have caught the problem.

See also critical path (CP) and driving path (DP) Identification and Analysis test in Transparent

Tenet.

Tip: Some scheduling applications and most third party schedule analysis software have

forward trace capability.


Program Completion Trace Test

Quick Look Assessment Cross Reference: Test 26

Rationale: Determines the percentage of non-LOE incomplete tasks logically tied to program

completion.

Activity: A high percent of tasks should be on the path to program completion. Mark or flag the

tasks that were on the Program Completion Trace. Filter for the unmarked tasks to determine if

any should be on the path to program completion. Discuss any questionable tasks with the

contractor scheduler.

No LOE in Path to Project Completion

Quick Look Assessment Cross Reference: Test 27

Rationale: LOE tasks should not be linked to other discrete tasks or be in the driving path to

project completion.

Activity: Examine the treatment of LOE tasks as described in the IMS B&A or IMS

Supplemental Guidance documents. Some contractors exclude LOE from the IMS. Others do not

apply logic to LOE. Check to see if the approach for LOE is consistently applied. If not,

determine the cause.

Appropriate Constraints Applied to Endpoint Milestones

Quick Look Assessment Cross Reference: Test 28

Rationale: Should be used for important endpoint milestones to generate meaningful total float

values and permit predecessor impacts. Investigate for issues.

Activity: Determine the impact these constraints have on critical path and driving path analysis

and in conducting SRAs. Using a performance trends tool, examine the total float changes on

these endpoint milestones.

Critical Path Length Index (CPLI)

Quick Look Assessment Cross Reference: Test 29

Rationale: Project performance indication of the ability to finish the project on time.

Activity: Examine the trend of CPLI over prior IMS reporting periods. Determine if the contractor

is taking action to correct adverse trends.
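
Example (illustrative): the CPLI calculation as commonly defined in the DCMA 14 Point Assessment, sketched in Python; verify the program uses the same definition before comparing thresholds.

    def cpli(critical_path_length_days, total_float_days):
        # CPLI = (CPL + total float) / CPL. A value of 1.0 means the plan
        # must execute exactly as scheduled; below roughly 0.95 is an
        # adverse indicator, while above 1.0 indicates margin.
        return (critical_path_length_days + total_float_days) / critical_path_length_days

    print(cpli(200, -10))  # -> 0.95: float erosion along the critical path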

Check Redundant Logic

Quick Look Assessment Cross Reference: N/A


Activity: Check for redundant logic. Redundant logic can be checked with a network view of the

schedule. Redundant logic, if removed, would not result in a change to task dates. Redundant

logic adds confusion to understanding and analyzing the network.

Tip: Several third party schedule analysis applications provide automated tests for redundant

logic.
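
Example (illustrative): a minimal Python sketch of a redundancy test over a set of (predecessor, successor) pairs, assuming finish-to-start links with zero lag so that removing a transitively implied link cannot change dates.

    from collections import defaultdict

    def redundant_links(links):
        succs = defaultdict(set)
        for a, b in links:
            succs[a].add(b)

        def reachable(start, goal, skip):
            # Depth-first search that ignores the direct link under test.
            stack, seen = [start], set()
            while stack:
                n = stack.pop()
                for m in succs[n]:
                    if (n, m) == skip or m in seen:
                        continue
                    if m == goal:
                        return True
                    seen.add(m)
                    stack.append(m)
            return False

        return [(a, b) for (a, b) in links if reachable(a, b, skip=(a, b))]

    print(redundant_links({(1, 2), (2, 3), (1, 3)}))  # -> [(1, 3)] is implied by 1 -> 2 -> 3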

Determine Parallel Critical Paths

Quick Look Assessment Cross Reference: N/A

Activity: Parallel critical paths are two separate paths of logically tied tasks with equal impact

determining the end date, where if one path was removed, no change in the end task dates would

occur. Check for existence of parallel critical paths and determine if they are appropriate,

through discussion with the contractor's scheduler.

Tip: Several third party schedule analysis applications provide automated tests for parallel

paths.

Evaluate Merge Points, Diverge Points, and Merge Hot Spots

Quick Look Assessment Cross Reference: N/A

Activity: Merge points are tasks with a large number of predecessors. Diverge points are tasks

with a large number of successors. Merge hot spots have both a high number of predecessors and

successors and may be on or nearly on the critical path. These points and hot spots pose higher

than normal schedule risk and should be closely monitored. Check to determine if the contractor

schedule assessment addresses these points and hot spots.

Tip: Merge / Diverge points and hot spots may be easily identified with third party schedule

analysis applications.

Perform Measures of Schedule Execution

Quick Look Assessment Cross Reference: N/A

Activity: The Quick Look provides one measure of overall project progress, the CPLI. If project

performance is less than desired, several tests or metrics may be applied to focus on the cause.

Since the number of metrics or tests in schedule analysis applications is significant, it is best to

begin with a premise of the schedule execution issue and select the appropriate metric to validate

the issue and focus on a cause. For example, if tasks tend to be starting on schedule but taking

longer than baselined, a Baseline Hit Ratio or Current Execution Index may be appropriate.

Other measures of overall schedule execution include Baseline Execution Index, Forecast

Execution Index, Rate Charts, Schedule Performance Index, Earned Schedule, Project Schedule

Variance, Total Float, Total Float erosion, and Schedule Margin Burn-down.

Tip: Refer to the Specific IMS Comprehensive Assessment Techniques and Measurements

section narrative after this section for additional information. The PASEG provides several

suggested schedule execution metrics. Third party schedule analysis applications provide


additional capabilities. Review their offering when deciding on the appropriate metric for the

schedule issues.

Usable Tenet Comprehensive Assessment Activities

The Usable, Resourced, and Controlled Tenets do not have any Quick Look cross references.

IMS Structure

Activity: Check that the IMS is logically organized to align with other documents. Check if the

IMS is WBS, OBS, or IMP based and consistent. If the IMS is WBS based, check for

consistency with MIL-STD-881C. Check that there are sufficient summary or hammock tasks.

Determine if the IMS structure is defined in the IMS B&A or IMS Supplemental Guidance

documents.

IMS Statistics and Performance Metrics

Activity: The IMS statistics reports contain quantified data about the schedule. IMS

Performance Metrics provides quantified data about schedule execution. Determine which

reports are developed from the IMS and how they are used to manage the program. Examine the

business rhythm for generating and distributing these reports. Examine the accuracy and the use

of the IMS related reports. Discuss statistics and performance metrics used with the contractor

scheduler.

Note: See related items on metrics in Transparent and Predictive Tenets.

Risk Integration into the IMS

Activity: Verify that identified risks are integrated into the IMS. Moderate and High Program

Risks that have a schedule impact or which can be identified with a task or work package, should

have a cross reference to the risk register. This can be accomplished with a user defined field,

typically referencing the risk ID number. Risk mitigation efforts containing assigned budget or

having potential schedule impact must be included in the IMS. Cross check the risk register with the

IMS. Make sure that the risk register “as-of-date” is consistent with the IMS status date. The

check should include risk cross referencing, as well as a comparison of risk mitigation status.

Tip: Review the contractor Risk and Opportunities Management Plan, the IMS B&A and

IMS Supplemental Guidance documents to determine the process for integrating identified

risks into the IMS.

Schedule Hierarchy

Activity: On large complex programs, the Performance Measurement Baseline (PMB) may be

represented by a number of linked or inserted and linked schedules. Check the IMS B&A and

IMS Supplemental Guidance documents to see if multiple files are used to represent the IMS. If


multiple files are used, check to see that their settings and status dates are consistent. Check the links

from schedule to schedule to ensure that correct linkages have been established.

IMS Usable for Decision Making

Activity: Through conversations with contractor IPT Leads and PM, determine what decisions

are made using the IMS. Determine if the IMS data used for these decisions is timely, accurate,

and summarized in IMS reports in a meaningful manner.

Adequacy of IMS for SRA

Activity: Check that the IMS is fundamentally sound enough to be used for an SRA. The soundness criteria for an IMS are defined in the Air Force SRA Process Document. Determine if a

previous SRA has been performed. If an SRA was performed, the IMS should contain the three

point values and distribution curves selected for the simulation. Review previously run SRA

results and analysis. Examine the schedule file that was used for the SRA to ensure that schedule

changes and SRA settings were consistent with the Air Force process.

Resourced Tenet Comprehensive Assessment Activities

The activities in this category assume that the IMS is a resource loaded schedule. Resource

loading may vary from contractor to contractor. Check the EVMSD, IMS B&A, or IMS

Supplemental Guidance document to determine resource loading standards for the IMS.

Resource Types and Categories

Activity: Resource types may include material, labor, and subcontractors. Resource categories

may include items such as engineers, technicians, and managers. Review the resource sheet or

equivalent in the schedule tool. Check the number and type of IMS resources. There should be

adequate types and categories to reflect the breadth of resources used on the project. For

example, a single resource category for engineering may not be adequate when electrical,

mechanical, and software subsystems are being developed. Determine if subcontractor resources

and material are included in the resource types and categories.

Tip: Scheduling applications have resource views or tables that show all the potential

resource types and categories that may be used in the schedule.

Complete Resource Loading

Activity: Check the percentage of detailed tasks containing resource assignments. All effort

work packages (excluding SVTs) in a resource loaded schedule should have resources assigned

consistent with the contractor's scheduling procedures. Check to ensure there is no mixing of

OBS elements in a single control account. Check that the resources are consistent with the task

description. Compare IMS resource loading to the RAM for appropriate OBS/WBS elements.


Tip: Filters may be used to isolate tasks that do not contain resources and to check for

proper resources. For example, filter on “software” in the task name and use a task usage

view to see if software engineers are assigned to a task.

Resource Profiles / Histograms

Activity: Review any resource profiles prepared as part of schedule analysis or EVM reporting.

Check to ensure that resource profiles are reasonable. Ramp ups and phase outs should be

realistic and achievable. Crosscheck BOEs to IMS resource loading. Consider performing a task

density analysis to examine if in progress task loading and near term starts are reasonable. Check

that the resource profiles from the IMS are consistent with the staffing profiles presented in the

Contractor Performance Report and other management reports if applicable.

Resource Alignment with other EVMS Artifacts

Activity: Check that IMS resource assignments align with the CAP and address elements of cost, which should match the baseline. Check the resource profiles or histograms to ensure

they are consistent with the ETC and EAC.

IMS with Resource Budgets

Activity: Some contractors include rates and budgets for resources. This provides the capability

for the IMS to generate the PMB for the program. If this is a requirement of the EVMSD or IMS

B&A documents, check the baseline totals for control accounts against the CAP or Cost System.

Tasks with Budgets

Activity: Observe tasks with budgets. For resource loaded schedules, effort tasks should have

some resources assigned. Match tasks, work packages, and control accounts to CAP and WAD.

Resource Changes

Activity: Compare Resource Baselines with Current Resource Forecasts. Where significant

differences occur, they should be documented in the task notes. Check with the contractor’s

scheduler for any significant undocumented changes.

Tip: Open a table or view showing baseline and forecast resources. Browse for significant

changes.

Resource Loading Consistent with Task Durations

Activity: Check that resource loaded tasks contain work (task description includes effort within

scope of contract). If the task description contains effort (a present tense action verb), the task

should be resource loaded, unless it is a SVT. Check for unrealistic duration and resource


combinations, such as a one day duration and 1000 labor hours. Check that milestones with

resources (such as material received) are shown as 1 day tasks.

Controlled Tenet Comprehensive Assessment Activities

Configuration Control

Activity: Verify configuration control of the IMS. The IMS is part of the PMB and requires

configuration control. Check the IMS B&A or IMS Supplemental Guidance document to

determine which elements of the IMS are baselined. Determine who has the authority for

changing the content of the IMS. Compare multiple IMSs to identify items requiring baseline

configuration controls that changed without an authorizing related Baseline Change document.

Run a file compare application or routine to detect changes between IMS deliveries. Coordinate

this test with the Traceable Baseline History activity below.

Tip: Several file comparison routines require the user to specify the data elements to be

compared. Limit the comparison to baselined data elements such as Baseline Start, Baseline

Finish, Baseline Duration, and Baseline Work / Cost to speed the comparison routine. Check

the IMS B&A or IMS Supplemental Guidance documents for identified alternate baseline

field use.
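
Example (illustrative): a minimal Python sketch of such a limited comparison, assuming each delivery has been exported to a dict keyed by task unique ID; the field names mirror the baseline elements listed above and are hypothetical.

    BASELINE_FIELDS = ("baseline_start", "baseline_finish",
                       "baseline_duration", "baseline_work")

    def baseline_changes(prev, curr, fields=BASELINE_FIELDS):
        # Every hit should trace to an approved baseline change document.
        changes = [(uid, "task removed") for uid in prev.keys() - curr.keys()]
        changes += [(uid, "task added") for uid in curr.keys() - prev.keys()]
        for uid in prev.keys() & curr.keys():
            for f in fields:
                if prev[uid].get(f) != curr[uid].get(f):
                    changes.append((uid, f))
        return changes

    old = {1: {"baseline_finish": "2013-01-15"}}
    new = {1: {"baseline_finish": "2013-02-01"}}
    print(baseline_changes(old, new))  # -> [(1, 'baseline_finish')]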

Field Mapping

Activity: Check that IMS Field Mapping is maintained. All user defined fields must be defined

and documented. Customers may impose external requirements on user defined fields. Check the

contract for any special field mapping requirements. Review the Field Mapping document for

accuracy and completeness. If codes are used in any user defined fields, determine where the

codes are defined and check for their proper use. Maintain awareness of formula use in user defined fields; such use should be identified in the B&A or IMS Supplemental Guidance documents.

Business Rhythm

Activity: Review the EVMSD and IMS business rhythm. The rhythm should address all

activities in the typical program EVM operational cycle and describe a time phased process for

recording progress and updates, implementing baseline changes, generating reports, and

performing schedule analysis. The Business Rhythm may be in the EVMSD, IMS B&A, or IMS

Supplemental Guidance documents. The time-phasing should provide adequate time for variance

analysis and corrective action planning. The business rhythm should describe the update process

to ensure that the IMS is consistent with other EVMS artifacts such as WAD, CAP, and CPR.

Calendars

Activity: Check that Project, Task, and Resource Calendars are appropriately used. Specific

calendars should be used where their use adds precision to the IMS. Resource and task calendars

can alter the forecast duration of tasks. Their use should be documented in IMS B&A or IMS

Supplemental Guidance documents. Review application, appropriateness, and accuracy of


calendars. Evaluate multiple calendars in use on the critical and near critical paths to compare

float values. Review Resource Sheet for appropriate resource calendar use.

Tip: A change in the project calendar can alter dates in the schedule. One third party

schedule analysis application checks for calendar changes.

Traceable Baseline History

Activity: Determine if baseline history is traceable. Each change to the IMS baseline should be

documented. Check individual baseline change documents against the IMS. Compare latest

baseline change documents to determine if any changes impact the IMS baseline schedule.

Check IMS baseline information alignment with the schedule related baseline change document.

Verify a trace between BCR documentation and the IMS.

Tip: Determine if a user defined baseline change field is in the IMS (documented in field mapping,

IMS B&A or IMS Supplemental Guidance); check for alignment with related documents such as

BCR log and MR log activity.

Documenting an IMS Comprehensive Assessment

A well-documented IMS Assessment provides the PMO with the information they need to make

important decisions. The IMS Quick Look Assessment steps address the material required in an

IMS Quick Look Assessment out-brief and report.

The IMS Comprehensive Assessment should culminate in an out-briefing as well as a prepared

assessment report. The out-brief and the report should both contain executive summaries that are

appropriate for Program Manager, PEO and higher level reviews. The focus at this level is on

whether the schedule reflects the program plan, whether it has the granularity necessary to execute the program, and whether the schedule is mechanically complete and used effectively for decision-making, such that it is reliable for predicting future program performance.

Consider including the completed Quick Look Report Template as an attachment to the IMS

Comprehensive Assessment Report. If possible, include the current status (items identified for

correction and the status of those correction efforts).

IMS Comprehensive Assessment Report

The report content includes:

Executive Summary (One page with focus on significant observations, conclusions, and

recommendations).

Ground Rules and Assumptions

o Primary Purpose of the IMS Comprehensive Assessment

o Schedule and artifacts observed (address any significant artifacts not provided)

o Personnel contacted (significant contributors of information)

o Standards and guidelines applied (such as DID, contract, GASP)

o Measurements, techniques and tools used in the assessment

Observations.

o Instances where the IMS met or did not meet the criteria or guideline


o Observations on personnel competencies

o Observations on IMS related processes and procedures

Conclusions.

o On the schedule health and fitness

o On the program health as reflected by the IMS

o On program schedule performance

o On the use of the IMS to predict future program performance

Recommendations (include next steps and their priority) that will:

o Improve the IMS

o Maintain the improvements

o Improve the personnel competencies

o Improve processes and procedures

o Improve program schedule performance

Turning Reports into Results

An IMS Comprehensive Assessment report is a starting point for improving the contractor’s IMS

and ensuring it remains a useful tool for program management. The challenge is to turn the

report into results. In the case of non-compliance with standards, contracts, and contractor’s

procedures, results are usually achieved, although the pace may seem slow. For deliverables that

are approval documents, the government has a strong position. For other documents, such as the

IMS that is normally not an approval deliverable, other leverage can be pursued. Do not hesitate

to change the IMS to an approval document during a routine contract modification. An IMS that

does not meet the contractor’s EVMS Description requirements or the program unique

scheduling procedures can be pursued through the contractor’s management if necessary to

effect improvements.

Where using leverage is not feasible, and the contractor may be slow to react to needed changes,

interface between the government PM and the contractor PM is a logical escalation step. When

viewed from the higher government PM perspective, the program is a team effort with the

government and the contractor on the same team. Contractors may deemphasize the need for a

credible IMS. Or the contractor’s priorities levied upon them by the government may not allow

them to place priority upon maintaining the validity of the IMS. Both the government and

contractor should work together to create and maintain the best possible IMS for successful

program management.

Common Program Manager Questions Answered by IMS Comprehensive Assessments

In this section common questions are posed regarding a contractor’s IMS. Following each

question is a description of potential government scheduler activity, followed by suggested

responses to the PM. Where applicable, schedule analysis tools that would facilitate the response

are included.

Are the duration estimates in the schedule realistic?

Perform a Duration Variance Analysis to help determine if past performance supports future

projections. Future task durations should incorporate the demonstrated productivity of previously

completed and projected work efforts to determine a reliable forecast. Duration Variance

Analysis measures the difference in actual duration and baseline duration, as well as the


difference between baseline duration and forecast duration. If the program has a history of

significant differences then future expected performance should be questioned and the observed

results folded into SRA three point duration estimates.

Focus attention on the performance of similar completed work effort tasks to understand the

likelihood of achieving the stated durations, and the resulting projected finishes. The IMS may

not be accurately predicting future completion, and possibly cost, where significant differences

exist between longer actual performance and unaltered future projections. Discuss the reasons

for the demonstrated poor performance of completed tasks and what will prevent this from

occurring on similar work efforts in the future. If the explanations seem void of reasonable one-

time conditions, not applicable to future execution, then suggest the contractor perform a what-if

exercise applying the relatively longer durations on similar work to see IMS impacts and discuss

plans for possible mitigation.

Suggested Response to PM: Prepare charts for the PM that show the results of the duration

variance analysis (bar charts or scatter diagrams). Identify significant contributors to the duration

variances by WBS or Control Account.

Automated Tools: The IMS Performance Trends Tool (Duration Variance), Run!AzTech

(Duration Variance Analysis)
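
Example (illustrative): a minimal Python sketch of the two comparisons, assuming durations in working days under hypothetical field names; actual_duration is None for tasks not yet complete.

    def duration_variances(tasks):
        # Positive variance: the task ran, or is forecast to run, longer
        # than its baseline duration.
        completed, remaining = [], []
        for t in tasks:
            if t["actual_duration"] is not None:
                completed.append((t["uid"], t["actual_duration"] - t["baseline_duration"]))
            else:
                remaining.append((t["uid"], t["forecast_duration"] - t["baseline_duration"]))
        return completed, remaining

    demo = [
        {"uid": 1, "baseline_duration": 10, "actual_duration": 14, "forecast_duration": 14},
        {"uid": 2, "baseline_duration": 10, "actual_duration": None, "forecast_duration": 10},
    ]
    print(duration_variances(demo))  # -> ([(1, 4)], [(2, 0)])

A history of large positive variances on completed tasks, with zero variances still forecast on similar remaining work, is the pattern this analysis is designed to expose.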

Is work being pushed to the right, forming a bow wave of work that the contractor cannot sustain?

Perform a Bow Wave Analysis to help determine if the work can be executed as scheduled. An

additional discussion of Bow Wave Analysis is contained in a later section under Specific IMS

Comprehensive Assessment Techniques and Measurements. The number of tasks forecast to

start in the next month may point to an increasing amount (bow wave) of future work. If the

number of tasks forecast to start next month has a rising trend, a bow wave of work may be

forming (increasingly plowing the work ahead).

Tasks that appear in the current period may have slipped from an earlier period. This may

happen due to predecessors not executing as scheduled or resources unavailable to perform the

work. Bow wave conditions often reflect in-progress tasks and unstarted tasks with start dates at

the status date (Timenow) and not determined by predecessor impacts. This suggests that these

tasks have been pushed ahead by the passage of time and may be adding to the entire volume of

work existing in current and near term periods.

Review the history of completed tasks. If the current period indicates an inordinate number of

tasks that have not been successfully executed as planned, discuss the contributing causes with

the contractor. Inquire if the contractor plans to increase resources to accomplish this volume of

work or has implemented solutions to address the causes for drag-on in-progress tasks. Discuss

with the contractor any plans for adding appropriate predecessors or applying “No-Earlier Than”

type constraints to reflect a near term look-ahead schedule that is executable. Focus attention on tasks that may be preventing additional work from executing.

Suggested Response to PM: If the IMS is resource loaded, build a resource profile that shows if a

bow wave is forming. If the IMS is not resource loaded, consider a task density chart of forecast

starts or forecast in-progress tasks to show if a bow wave is forming. Provide any responses

from contractor queries on this matter. Use Baseline Execution Index (BEI) to gauge execution


to plan; if the percentage is low (threshold goal is 95% and higher), this may be an indication of

work being pushed into the future. Combine with Current Execution Index (CEI) to understand if

the contractor is executing to its forecast from one period to the next. Setting aside potential baseline management challenges, determine whether the contractor "did what they said they would do."

Automated Tools: The IMS Performance Trends Tool (Task Density, Task Burn-down),

Run!AzTech (STAT-Bow Wave)

What is the total float trend and what is it telling me?

Perform a Total Float Trend Analysis to help determine if project goals are attainable.

Total float erosion or growth over time indicates the program's ability to achieve its completion goal. If total float to project completion (less the schedule margin) is decreasing, the risk to timely project completion is increasing.

Determine the growth or loss of total float values on remaining tasks over the past few periods.

Focus attention on the overall program to see how the average total float is changing relative to

the remaining time to program completion. Observe the total float changes on program

milestones to gain an understanding of the general program trend. Expand this awareness to

remaining work within control accounts and then to all remaining tasks to see more specific

areas of total float changes. Combine this with the consistency of applied constraints affecting

the backward pass (late dates). Determine if the quantity of constraints and the dates applied are

changing by conducting a file compare to the previous IMSs. Determine if changes to the

constraints are consistent with the documented schedule margin approach.

Decreasing total float values may indicate a schedule slippage trend. Compare the critical /

driving paths from period to period (Critical Path Density) to see if there is a consistent picture

of same remaining tasks or similar type work, and if the percentage of tasks becoming critical /

near critical or driving / near driving is growing. If total float is negative, discuss the contributing

causes with the contractor and understand mitigation efforts to eliminate the negative values.

Where negative total float is not remediable to the current targets, suggest validating the IMS forecast and using new targets as constraints to represent achievable goals and provide meaningful total float values to aid program decision-making.

Suggested Response to PM: Show a chart reflecting the total float on program milestones for the

previous four reporting periods, note any constraint or deadline date changes that would affect

total float calculated values. Show the calculation of the TFCI for the past four reporting periods.

Correlate the amount of total float loss or gain to the remaining time to go to the applicable

program milestone or program completion.

Automated Tools: Scheduling Applications or Analysis tools that trace critical or driving paths
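A minimal sketch of the trend calculation follows, assuming milestone total float values have already been extracted from the previous four IMS submittals (the milestone names and values are invented):

    # Total Float Trend sketch: track milestone total float across reporting periods.
    float_by_period = {
        "CDR": [25, 18, 12, 5],              # working days of total float, oldest first
        "TRR": [40, 38, 39, 36],
        "Program Complete": [30, 24, 15, 9],
    }

    for milestone, floats in float_by_period.items():
        change = floats[-1] - floats[0]
        trend = "eroding" if change < 0 else "stable/growing"
        print(f"{milestone}: {floats} -> net change {change:+d} days ({trend})")
        if floats[-1] < 0:
            print(f"  {milestone} shows negative total float; discuss mitigation.")

Correlate any erosion to the remaining time to go, as described above, before drawing conclusions.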

When I compare cost performance to schedule performance, what is that telling me?

SPI reflects the effectiveness of executing the schedule against the baseline plan. Individual

monthly SPI measurements, as well as the cumulative SPI, can indicate problems that need to be

investigated. Anticipate relative performance between schedule and cost metrics. CPI and SPI

can be compared from the program level down to the control account level. CV and SV are analogous measures.


A positive CV and a negative SV reflect efficiency in performing planned work but delays in the

performance of the work. Check to make sure that actual cost reporting is not lagging behind

schedule performance reporting. If actual cost reporting checks out, another possibility is that technical delays or inadequate resources are preventing cost from being accumulated as planned.

A negative CV and a positive SV are indicators that the work is being performed when planned

or even ahead of schedule, but work may be inefficiently executed. Further analysis may be

necessary to determine the cause for work taking more effort, perhaps due to underestimated

work.

Having a positive CV and SV is good news. The work is being accomplished as planned or

ahead of schedule at greater than planned efficiency.

Having a negative CV and SV is cause for concern: the work is not being accomplished when

planned and it is consuming more budget than allocated. Isolate the control account(s), where

possible, to focus attention.

Compare WBS element or control account schedule variances (SV) in the IMS against the Contract Performance Report (CPR) or Integrated Program Management Report (IPMR) Formats 1 and 5

for alignment over the previous few periods and determine if Format 5 explanations make sense.

Verify that variance reporting satisfies contractual requirements for detailed reporting and

effectively addresses root cause issues. Compare the EAC with the IMS for period of

performance alignment. Discuss EAC composition and corrective action efforts with the

contractor to determine the projected cost and schedule impacts to the program.

Suggested Response to PM: Prepare a summarized chart depicting WBS elements that should be cause for concern to the PM. Investigate underspent or overspent and late conditions to determine whether lateness is adequately explained. Use this data to ask the contractor about potentially continuing trends or related get-well plans.

Automated Tools: MPM, Cobra, or wInsight
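The four CV/SV combinations above reduce to a simple sign check. A hypothetical sketch, assuming CV and SV values have already been pulled from Format 1 data at the control account level (account IDs and values are invented):

    # CV/SV quadrant interpretation sketch.
    def interpret(cv, sv):
        if cv >= 0 and sv >= 0:
            return "On or ahead of schedule at planned or better efficiency (good news)"
        if cv >= 0 and sv < 0:
            return "Efficient but late; check for lagging actual cost reporting or delays"
        if cv < 0 and sv >= 0:
            return "On or ahead of schedule but inefficient; possibly underestimated work"
        return "Late and over budget; isolate the control accounts driving this"

    control_accounts = {"CA-1100": (5000, -12000), "CA-1200": (-8000, 3000)}
    for ca, (cv, sv) in control_accounts.items():
        print(f"{ca}: {interpret(cv, sv)}")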

Which CAMs and WBS elements need help?

Determine the portions of the program that are most sensitive to impacting program completion.

These are the efforts on the critical and near critical paths. For those efforts, determine which are not performing as planned: late starts, late finishes, longer durations, and greater-than-planned resource consumption. Analyze at the OBS level to determine if

bow wave conditions may impact particular functional areas.

Understand how the contractor integrates subcontracted efforts in the IMS. Whether the contractor imports the entire subcontractor schedule, uses representative tasks to capture the meaningful portions, or applies delivery milestones to reflect the feed-in points to tasks in the IMS, the subcontracted efforts remain a CAM's responsibility and fall under a WBS element.

Investigate identified handoff conditions that might delay related CAM and WBS element tasks

due to late material deliveries, awaiting authority to proceed, or gaining access to required

facilities or equipment needed to perform the work.


Suggested Response to PM: Generate and show Bow Wave and Late and Critical tasks by CAM

and WBS elements to historically demonstrate problem work that may potentially threaten

program goals. Present the contractor's chart of subcontracted tasks or receipt milestones and discuss whether it provides sufficient visibility into potential program impacts.

Automated Tools: The Air Force IMS Performance Trends Tool

Do we have enough schedule margin left?

Schedules should have a project level margin of time between the last activity and a contract

event or end item deliverable / end of the project (contract completion milestone). Verify the

contractor identifies milestones such as Preliminary Design Review (PDR), Critical Design

Review (CDR), and Test Readiness Review (TRR), as well as the Program Completion

milestone, and understand the management of margin. The difference between baseline and

forecast dates for these milestones is an indicator of margin consumption along the path to

program completion. Plot the remaining schedule margin for each IMS reporting period, such as

the previous four IMSs.

Decreasing trends of buffer amounts may indicate a general schedule slip or at least a slip to the

critical path or driving paths when the decreases do not align with the remaining time to go. For

example, there may be less concern over a 35% decrease in buffer with four months remaining to a milestone than over a 35% reduction with 24 months remaining. Discuss the

program’s schedule margin approach and understand the factors for rapid buffer erosion with the

contractor, and determine whether efforts to abate or reverse the trend are achievable.

Perform a Schedule Risk Assessment and compare the probable program completion date to the

schedule margin remaining.

Suggested Response to PM: Show a schedule margin burn-down chart over the last six reporting

periods. Compare the remaining margin to the 80 percent probability completion date from the

SRA. If necessary, make recommendations to compress or crash the schedule. A sample

schedule margin burn-down chart is shown below.


Automated Tools: SRA Tool, Air Force IMS Performance Trends Tool
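The burn-down itself is a simple consumption-rate projection. A sketch with invented values, assuming the remaining schedule margin has been recorded at each status date:

    # Schedule margin burn-down sketch (values illustrative).
    periods = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
    margin_days = [60, 55, 47, 40, 28, 20]   # remaining margin at each status date
    months_to_complete = 18                  # time remaining at the latest status date

    consumed = margin_days[0] - margin_days[-1]
    rate = consumed / (len(periods) - 1)     # days of margin consumed per period
    periods_to_exhaustion = margin_days[-1] / rate if rate > 0 else float("inf")

    print(f"Margin consumed: {consumed} days over {len(periods) - 1} periods "
          f"({rate:.1f} days/period)")
    print(f"At this rate margin exhausts in {periods_to_exhaustion:.1f} periods, "
          f"versus {months_to_complete} months to program completion.")

Comparing the exhaustion point to the time remaining gives the same erosion-versus-time-to-go judgment described above.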

What tasks have been moving closer to the critical path (less and less total float)?

Perform a Total Float Trend Analysis with a focus on tasks with low (and possibly negative)

total float values to help identify tasks that may become CP / DP tasks. Observe trends that

indicate significant loss of total float over the periods and flag tasks that show up supporting this

trend. These tasks may be slightly off the critical path radar. The causes for the tasks becoming

more critical may require immediate attention to avoid being caught by surprise if these tasks

suddenly appear on the critical / near critical paths or driving / near driving paths.

Discuss the trend with the contractor to ensure awareness of the condition and plan to monitor

the related tasks in future IMS submittals to maintain vigilance.

Conduct a more extensive analysis by performing CP / DP Analysis for each of the previous

trend periods, again with a focus on the higher total float tasks, to determine total float erosion conditions that indicate a trend toward becoming either a critical or driving path. This more

detailed analysis provides a definitive ranking of tasks with potential to impact the critical path / driving paths. Compared over several periods, this indicates the trend toward the specific milestone or program completion.

Suggested Response to PM: Show and compare the previous four reporting periods’ total float

consumption charts reflecting tasks that may be approaching the total float values represented by

the primary, secondary, and tertiary critical and driving paths for those periods. Recognize that

some of the tasks could become critical or driving path from a previous period.

Automated Tools: Run!23 (Trace)
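A sketch of the screening step, assuming per-task total float histories have been collected from successive submittals (task names, values, and thresholds are illustrative):

    # Total float erosion sketch: flag tasks trending toward the critical path.
    task_float = {                              # oldest period first
        "Integrate radar SW": [22, 15, 9, 4],
        "Fabricate bracket":  [30, 29, 30, 28],
        "Qual test article":  [18, 12, 12, 3],
    }

    EROSION_FLAG = 10    # assumed threshold: losses of 10+ days over the window
    NEAR_CRITICAL = 5    # assumed threshold: now within 5 days of critical

    for name, history in task_float.items():
        loss = history[0] - history[-1]
        if loss >= EROSION_FLAG or history[-1] <= NEAR_CRITICAL:
            print(f"{name}: float {history} (lost {loss} days); watch in next submittal")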

What is the probability that we can finish by a specific date?

There are several evaluations and analyses available to answer this question effectively. A combination of some of the following will provide insight into determining the probability of on-time completion.

Calculate the Baseline Execution Index (BEI) to determine if work is completing as planned.

Baseline Execution Index (BEI) measures the number of tasks completed as a ratio to those tasks

that should have completed to date, according to the original (baseline) plan.

Calculate the Start Execution Index (SEI) to determine if work is starting as planned.

Start Execution Index (SEI) measures the number of tasks started as a ratio to those tasks that

should have started to date, according to the original (baseline) plan.

If the SEI remains high but the BEI is dropping, then duration estimates may be overly optimistic; this may indicate work is starting on schedule but not completing on schedule.

Perform Current Execution Index (CEI) to measure how accurately the program is forecasting

and executing to its forecast from one period to the next. Use the results of CEI as a basis to

discuss the contractor’s ability to execute the immediate look-ahead period.
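The three indices are simple ratios. A sketch with invented counts tallied from a submittal:

    # Execution index sketch (counts are illustrative tallies from a submittal).
    baseline_finishes_due = 200   # tasks baselined to finish by the status date
    actual_finishes       = 178   # tasks actually finished to date
    baseline_starts_due   = 240   # tasks baselined to start by the status date
    actual_starts         = 235   # tasks actually started to date
    forecast_to_finish    = 30    # tasks forecast last period to finish this period
    finished_as_forecast  = 21    # of those, the number that actually finished

    bei = actual_finishes / baseline_finishes_due
    sei = actual_starts / baseline_starts_due
    cei = finished_as_forecast / forecast_to_finish

    print(f"BEI = {bei:.2f} (goal 0.95+), SEI = {sei:.2f}, CEI = {cei:.2f}")
    if sei >= 0.95 and bei < 0.90:
        print("Work is starting on schedule but not finishing: durations may be optimistic.")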


Perform Total Float Consumption Index (TFCI) to consider what would happen if the program

continued at its current rate of total float consumption and predict where a project would

complete if trends persist or corrective action is not taken. Total Float Consumption Index

(TFCI) applies the schedule’s current rate of total float consumption to the remaining scope of

work and projects a forecast finish date for the entire project. Discuss the predicted total float trend with the contractor to understand whether previous schedule performance conditions are expected to continue. Compare the resulting projected finish to the EAC to

determine reasonableness.
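One way to mechanize the idea is shown below as an illustrative projection, not a canonical TFCI formula; all values are invented, and working days are converted to calendar days with a rough 7/5 factor:

    # Total float consumption projection sketch (illustrative, not a canonical formula).
    from datetime import date, timedelta

    status_date     = date(2012, 9, 21)
    forecast_finish = date(2014, 3, 1)
    float_start     = 40   # total float to completion four periods ago (working days)
    float_now       = 16   # total float to completion at the current status date
    periods_elapsed = 4    # monthly reporting periods in the trend window

    burn_per_period = (float_start - float_now) / periods_elapsed
    periods_remaining = (forecast_finish - status_date).days / 30.0
    projected_loss = burn_per_period * periods_remaining

    # If the trend persists, float goes negative and the finish slips by the shortfall.
    slip_working_days = max(0.0, projected_loss - float_now)
    slip_calendar = timedelta(days=round(slip_working_days * 7 / 5))
    print(f"Consuming {burn_per_period:.1f} days of float per period; "
          f"projected further loss of {projected_loss:.0f} days.")
    print(f"Trend-adjusted completion: {forecast_finish + slip_calendar}")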

Schedule risk assessments (SRAs) provide a probability of achieving a projected completion date

based on a range of possible durations and applied conditions, and understanding the period to

period results increases the confidence trend. The SRA provides a probability of project

completion on a specific date. Reflect the program’s projected and baseline completion date

along with the probabilistic completion to understand the potential risk. Discuss the results with

the contractor to understand their awareness and how they address the effects of schedule

uncertainty. Combined with poor schedule performance, the SRA may necessitate revising schedule projections to more accurately reflect expected execution and possibly to address the contractor's ability to achieve program target goals.

Suggested Response to PM: Refer to the latest SRA or recommend conducting an SRA on the

contractor’s IMS. If there is time and the project has at least 15 percent BCWP, do an SRA using

global banding based on past performance. If there is not time to do the SRA, use the TFCI as a

projection of completion dates. Show the SEI, BEI, and Forecast Execution Index results over

the previous four reporting periods to reflect if work is trending later or earlier to the baseline

and as validation of the SRA and TFCI data. Present the Current Execution Index results for the

previous four reporting periods to provide an indication that the near term work is accurately

planned and executed.

Automated Tools: The Air Force IMS Performance Trends Tool (Schedule Performance),

Run!AzTech.

Specific IMS Comprehensive Assessment Techniques and Measurements

This section describes several assessment techniques and measurements not developed earlier in Parts 1 or 2 of the IMS Assessment Process. Tests that apply to earlier Part 3 items are identified with references to those earlier IMS Comprehensive Assessment Test / Activity List sections.

Measures of Schedule Execution

Reference: Predictive Tenet / Measures of Schedule Execution Test

There are a number of tests that may be used to measure project schedule performance. Most of

these measures are used as part of performance trends analysis. If the scope of the IMS

Comprehensive Assessment includes an analysis of project schedule performance, consider

using several of the metrics below to assess the project. Some of these measures may be labor

intensive. Select those that are most closely related to the postulated project performance issues.

Potential measures and a description of their potential uses are shown below:


- Critical Path Length Index: The critical path length plus total float, divided by the critical path length. A measure of project completion realism against the planned completion. Used in trend analysis. (A computational sketch follows this list.)

- Number of missed tasks (baseline compliance): The number of tasks that were not completed by the baseline completion date.

- Number of late starts: The number of tasks that started later than baselined. Used in trend analysis.

- Number of late finishes: The number of tasks that finished (completed) later than baselined. Used in trend analysis.

- Number of overdue starts: The number of tasks that were baselined to start by the status date but have not yet started. Used in trend analysis.

- Number of look-ahead late finishes: The number of tasks forecast to finish late to their baseline finish in the next TBD days. Used in trend analysis.

- Task Density: The number of tasks in progress. Used in trend analysis.

- Task Burn-down: The number of tasks completed each reporting period, establishing performance expectations. Used in trend analysis.

- Remaining work (where the IMS contains resources or hours): The amount of work remaining as of the status date. Can be used in trend analysis and to determine the work completed each period.

- Earned Schedule (SVt and SPIt): The measure of schedule performance in units of time rather than dollars. Useful for determining confidence in predicting completion dates.

- Schedule Performance Index (SPI): An EVM measure, cumulative BCWP divided by cumulative BCWS. Used in trend analysis.

- Forecast Execution Index (FEI): The percentage of forecasted finishes matching or earlier than their baseline finish dates / periods.

- Baseline Execution Index (BEI): The percentage of total actual finishes compared to total baseline finishes planned through the status date. Determine the percentage of tasks with actual finish dates up to TBD days later than their baseline finish date against the tasks with baseline finish dates earlier than the status date. A decreasing percentage over time indicates the schedule is slipping.

- Start Execution Index (SEI): The percentage of total actual starts compared to total baseline starts planned through the status date. Determine the percentage of tasks with actual start dates up to TBD days later than their baseline start date against the tasks with baseline start dates earlier than the status date. A decreasing percentage over time indicates the schedule is slipping.

- Current Execution Index (CEI): The number of tasks completed in a period compared to their previously forecasted completion in that period. May be used in trend analysis.

- Rate Charts: Trend charts of actual finishes compared to baseline finishes, often segregated by OBS or WBS element. Used in trend analysis.

- Bow Wave Charts: The number of forecast starts near the status date. Used to detect slipping forecast starts; points to a potential resource issue.

- Project Schedule Variance: The measure of the difference between BCWP and BCWS in dollars or labor hours. Often segregated by OBS or WBS element.

- Average days late: The average difference between task forecast completion dates and baseline completion dates for incomplete tasks.

- Number of tasks that started in the period planned (baselined): Used in trend analysis.

- Number of tasks that finished in the period planned (baselined): Used in trend analysis.

- Number of tasks that started on time but finished late: A potential measure of estimating accuracy. May be used in trend analysis.

- Total Float trend: The amount of time a task can slip without impact to the project completion milestone. Compare amounts of total float to program completion across periods. Compare total float to milestone(s) between periods. Used in trend analysis for key milestones.

- Schedule Margin Burn-down: A trend chart displaying the consumption of schedule margin over time.

- Schedule Risk Assessment: A probability distribution of project dates based on a simulation.

- Schedule Overrun (Duration Variance): The number of tasks with remaining duration greater than original duration or baseline duration.

- Remaining total project critical duration: The project critical path in task days. Accounts for parallel critical paths; used in trend analysis.

- Average baseline duration compared to average remaining duration for incomplete tasks.

- Total task days started early compared with total task days started late: A measure of overall schedule performance.

- Number of tasks that have been completed and took longer than baselined: Useful in period-by-period analysis and trend analysis.

- Total Float Consumption Index (TFCI): A measure of the change in project total float, used to predict the forecast project finish date.
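As an illustration, the sketch referenced in the CPLI entry above computes CPLI and average days late from invented inputs:

    # Sketch of two measures from the list (inputs are illustrative).

    # Critical Path Length Index: (critical path length + total float) / CP length.
    cp_length_days = 220    # remaining critical path length in working days
    total_float    = -10    # total float to the completion milestone
    cpli = (cp_length_days + total_float) / cp_length_days
    print(f"CPLI = {cpli:.2f}  (1.00 = on plan; below 1.00 = schedule pressure)")

    # Average days late: forecast finish minus baseline finish for incomplete tasks.
    finish_deltas = [5, 0, 12, -2, 8]   # invented per-task differences in days
    print(f"Average days late = {sum(finish_deltas) / len(finish_deltas):.1f}")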

A number of schedule analysis software tools exist that provide schedule execution metrics.

These tools include Acumen Fuse, Air Force IMS Performance Trends Tool, Run!Aztech, and

Steelray Project Analyzer.

Late and Critical Tasks

Reference: Predictive Tenet / Measures of Schedule Execution

Monitor late and critical tasks to understand the magnitude of tasks that are not performing to

their baseline dates. The combination of a late and critical condition focuses attention on those

tasks with greater impact to program milestones, as opposed to all late tasks. Comparing the

amount of late and critical tasks to previous status cycle periods offers insight into a possible

trend toward poor schedule performance and potential schedule impacts.

Late and critical tasks are, for example, incomplete tasks with forecast dates later than their baseline dates and total float values less than the critical threshold. A program may decide to define the critical threshold as total float of less than six working days to maintain awareness of tasks close to reflecting negative total float values.
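A sketch of the filter, using an illustrative data model (field names and the six-day threshold are assumptions):

    # Late-and-critical filter sketch (records invented for illustration).
    CRITICAL_THRESHOLD = 6   # program-defined: total float under 6 working days

    tasks = [
        {"name": "Install harness", "late_days": 4,  "total_float": 2,  "complete": False},
        {"name": "Write test plan", "late_days": 10, "total_float": 25, "complete": False},
        {"name": "Run EMI test",    "late_days": 0,  "total_float": 1,  "complete": False},
    ]

    late_and_critical = [
        t for t in tasks
        if not t["complete"]
        and t["late_days"] > 0                      # forecast later than baseline
        and t["total_float"] < CRITICAL_THRESHOLD   # at or near the critical path
    ]
    print(f"{len(late_and_critical)} late-and-critical task(s):")
    for t in late_and_critical:
        print(f"  {t['name']} ({t['late_days']} days late, TF {t['total_float']})")

Tracking this count across status cycles gives the trend comparison described above.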

Critical Path / Driving Path Determination

Reference: Transparent Tenet / Critical Path and Driving Path Identification and Analysis, Analyze Path to Program Completion, and IMS Comparison tests

Note: the following describes critical path analysis but is also applicable to driving path analysis. When using the IMS as a management tool, it is important that the critical path can be identified, analyzed, and validated.

Because of the importance of the critical path, a number of IMS assessment tools are available

that facilitate identification and analysis of the critical path. Scheduling software is adding

features to assist in critical path identification. Microsoft Project 2007 and later versions have an

increased capability for identifying task drivers.

Absent a tool for automatically analyzing the critical path or predecessor path to a milestone, one

technique that can be used in any scheduling tool is to constrain the endpoint milestone (focus task) with an appropriate constraint and temporarily neutralize other existing constraints that may prevent tasks from reflecting the impacts of the focus task's constraint. Next, change the focus task's constraint date to a date much earlier in the program, such as six months

or a year earlier. This creates negative total float for the critical and near critical paths. Filter for

the lowest total float value to see the critical path. Then filter for the next lowest total float value

to see the near critical path and so on. Optionally, use a text or flag field to annotate the critical

paths.

After determining the critical path, examine the tasks that comprise the path. Pay particular

attention to the critical tasks and their relationships. Sometimes a task may wind up on the

critical path due to an erroneously placed relationship. For example, a test result even in draft

form may be needed before integration can begin, but the critical path includes test results, draft

report, final report, and report approval all as predecessors to integration. The three subsequent efforts after test results, although needed in the program, are not prerequisites to integration. Including these tasks as predecessors to integration adds unnecessary time and, in this example, unnecessarily extends the critical path. Review and revise the logic so the tasks

are not prerequisites for integration, but have appropriate logic ties to other tasks.

After the critical path is determined, look closely at the tasks that do not have complete logic

(both predecessors and successors). It is possible these tasks, if they had complete logic, would

be on the critical path. The point to remember is that the analysis work has just begun when

determining the critical path.

Associated with critical path determination is near-critical path determination. A task may be just a one-day delay away from joining or establishing the critical path. Most scheduling software that highlights critical paths permits setting the total float value used as the criticality threshold. Use this software feature or other tools to look for tasks that are near

critical so that they also receive management attention.

Bow Wave Analysis

Reference: Statused Tenet / Riding the Status Date Test; Predictive Tenet / Measures of

Schedule Execution Test

In scheduling terms, bow wave tasks refer to those riding the status date or forecasted just

beyond the status date. This occurs for two reasons:

Tasks are running late and have been updated with a new Start Date that is determined

by the push of the new status date.


Tasks are running late and have been updated with a new Start Date using a near term

constraint, as opposed to precedence logic.

Objective: Count the number of tasks on the bow wave and determine if the work stacked up on

the status date can realistically be completed as scheduled when compared to historical

performance. If not, the scheduler should work with the CAMs to re-schedule or re-sequence the

work more realistically.

Typically, these tasks do not have a predecessor or their predecessors are complete and no longer

determine their start dates. They appear immediately following the status date because they are

rescheduled to Timenow or their start dates appear shortly after the status date due to the use of a

“no earlier than” type constraint date. Regardless, bow wave tasks are problematic because they

add tasks, and therefore work, in the immediate period that may not be executable in light of the

total scheduled work. An increasing bow wave obscures priorities regarding what tasks to

perform.

Another problem associated with bow wave conditions is lack of schedule predictability. The

increasing number of bow wave tasks is an indication that the rate of work accomplished is not

keeping pace with expectations, possibly due to technical problems or not having sufficient

resources to perform the work. This suggests that the tasks are pushed along as the status date moves and are not executable. Because the tasks are not re-planned into more achievable periods, predictability of potential impacts to successor-path tasks is lost. Bow wave conditions present an overly optimistic view of the schedule and obscure realistic achievement until accurate forecasts are finally projected. Only then does program management realize that program goals are not achievable.
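The objective counting can be sketched as follows, under the assumption that start dates, status flags, and open-predecessor counts have been extracted from the IMS (all records and the look-ahead window are invented):

    # Bow wave count sketch: tasks riding or stacked just past the status date.
    from datetime import date, timedelta

    status_date = date(2012, 9, 21)
    WINDOW = timedelta(days=10)   # assumed look-ahead window past the status date

    tasks = [
        {"name": "Code module X",  "start": date(2012, 9, 21),
         "started": False, "open_predecessors": 0},
        {"name": "Bench test Y",   "start": date(2012, 9, 26),
         "started": False, "open_predecessors": 0},
        {"name": "Draft report Z", "start": date(2012, 11, 2),
         "started": False, "open_predecessors": 2},
    ]

    bow_wave = [
        t for t in tasks
        if not t["started"]
        and status_date <= t["start"] <= status_date + WINDOW
        and t["open_predecessors"] == 0   # start set by Timenow or a constraint, not logic
    ]
    print(f"{len(bow_wave)} task(s) in the bow wave; compare this count period over "
          f"period, since a rising trend suggests work is being plowed ahead.")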

Rolling Wave / Planning Packages

Reference: Complete Tenet / Earned Value Techniques Test; Transparent Tenet / Verify Rolling Wave Planning

Identify work not easily defined in future periods as planning packages to capture all program

scope along with the detailed planning identified in the work packages. As the program

continues and planning packages come into the near term horizon, they should be converted to

work packages by defining the task or tasks that reflect the approach for completing the work.

This includes defining the steps and the resources needed to execute the work. Converting

planning packages to work packages also requires establishing an Earned Value Technique for

each work package. Typically, the BCR process is used when converting planning packages to

work packages as part of the IMS configuration control. It is important that planning packages

are not scheduled to start within one month of the current status date. To ensure this has

been accomplished, filter for planning package tasks that have a start date within one month of

the current status date. If any exist, these should be rescheduled or converted to work packages.

Planning packages with long durations of at least six months can distort the results of an SRA.

Consider asking for planning packages to be broken into shorter durations. Planning package conversion into work packages is often best performed as a comprehensive exercise. Review the

program plans or instructions for converting planning packages. Validate that other CAMs


impacted by these planning package conversions have a chance to coordinate the detailed

planning.
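A sketch of both checks, with invented planning package records; the 130-working-day cutoff is an approximation of six months:

    # Planning package screen sketch (field names and records invented).
    from datetime import date, timedelta

    status_date = date(2012, 9, 21)

    packages = [
        {"id": "PP-014", "start": date(2012, 10, 5), "duration_days": 45},
        {"id": "PP-022", "start": date(2013, 6, 1),  "duration_days": 220},
    ]

    for p in packages:
        if p["start"] <= status_date + timedelta(days=30):
            print(f"{p['id']}: starts within a month of the status date; "
                  f"convert to work packages or reschedule.")
        if p["duration_days"] >= 130:   # roughly six months of working days
            print(f"{p['id']}: long duration may distort SRA results; "
                  f"consider breaking it up.")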

First and Last Tasks Analysis

Reference: Complete Tenet / Program Milestones Test; Traceable Tenet / Finish-to-Start

Relationships Test; Predictive Tenet / Program Completion Trace Test; Controlled Tenet /

Calendars Test

Locate the first and last detail tasks or milestones that have successor and predecessor

relationships respectively in the IMS to determine the number of logic ties to the beginning and

end of the program.

Analyzing “First” Tasks

After defining the First task, such as Contract Award or Project Start, conduct the following

sanity checks:

The First task should be baselined and fall within the program’s PoP.

Assess the number of successors. Analyze the time period between the start date of the

successor and the finish date of the First task.

Review the SS and SF relationships that do not provide a logical sequence throughout a

schedule.

Determine what types of calendar(s) are used. Using multiple calendars can cause

resource conflicts and make schedule analysis difficult.

Sometimes the successors have Start Dates well beyond the Project Start task and are often

“nailed down” using a “no earlier than” type constraint. This indicates poor logic ties where a

more meaningful predecessor may be more appropriate or an attempt at “beating the metrics” by

linking the Project Start (or other similar task) to future work just to avoid the Missing

Predecessor metric. Evaluate the total period of performance and ensure that all tasks fall within

the Period of Performance (PoP).

Analyzing “Last” Tasks in the IMS

Find the last task in the IMS that has a predecessor. Review those tasks for more appropriate

successor relationships, rather than assigning the last task in the IMS for their successor. After

defining the Last task, such as Contract Complete or Project Complete, conduct the following

sanity checks:

The Last task should be baselined and fall within the program's PoP.

Assess the number of predecessors. Analyze the time period between the finish date of

the predecessor and the start of the Last task; programs tend to link tasks to project

complete milestones that inflate total float values.

Look for tasks with negative or zero total float as they may be pushing the Last task

(deliverable date); determine if these tasks represent the scope or type of work that

makes sense for determining the last task date.

Review the SS and SF relationships that do not provide a logical sequence throughout a

schedule.


Determine what types of calendar(s) the predecessors employ (task calendar). As the

project comes to an end, using multiple calendars can cause resource conflicts.

Software tools exist that can automatically determine the first and last tasks with corresponding

predecessors and successors.
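Absent such a tool, the determination can be sketched over a simple predecessor/successor link table (the links below are invented for illustration):

    # First/Last task sanity-check sketch over a simple link table.
    links = [("Contract Award", "Design A"), ("Contract Award", "Design B"),
             ("Design A", "Build A"), ("Build A", "Project Complete"),
             ("Design B", "Project Complete")]

    predecessors, successors = {}, {}
    for pred, succ in links:
        successors.setdefault(pred, []).append(succ)
        predecessors.setdefault(succ, []).append(pred)

    all_tasks = set(predecessors) | set(successors)
    first = [t for t in all_tasks if t not in predecessors]   # tasks with no predecessor
    last  = [t for t in all_tasks if t not in successors]     # tasks with no successor

    print(f"First task(s): {first}; "
          f"successor links: {sum(len(successors[t]) for t in first)}")
    print(f"Last task(s): {last}; "
          f"predecessor links: {sum(len(predecessors[t]) for t in last)}")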

Duration Variance / Pace / Earned Value Method Analysis

Reference: Statused Tenet / Remaining Duration Test

Pace Analysis compares baseline versus actual or forecast durations to help understand if the

current schedule is realistic. For example, if tasks are typically taking twice the baseline

duration, then why are the future tasks not forecasted to take longer? If not, is there an

explanation why the forecast durations are more or less than the baseline or past performance? Is

there a mitigation plan in place to assure the forecast durations are achievable and realistic?

Conduct a Pace Analysis by looking for anomalies, such as:

0/100 forecast or actual duration greater than one month, maybe even spanning several

months or longer--or much longer than the baseline.

50/50 forecast or actual duration greater than two months, maybe even spanning several

months or longer--or much longer than the baseline.

Percent Complete (sometimes called PC or % Complete) baselined at greater than three months' duration (i.e., > 12 weeks). This is not atypical, but the longer the task runs without

interim measures, the more difficult it is to assess the objective percent complete without

examining the quantifiable backup data (QBD or rationale) that supports the earned

value percent complete.

Percent Complete forecast or actual duration greater than three months--or much longer

than the baseline.

LOE or Planning Packages greater than one year (i.e., > 52 weeks) are not

recommended.
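A sketch of these anomaly screens, using the checklist thresholds above converted to approximate working days (records and field names are invented):

    # EV-method anomaly screen sketch (thresholds approximate the checklist above).
    tasks = [
        {"name": "Ship unit 1",   "evt": "0/100", "forecast_days": 45,  "baseline_days": 10},
        {"name": "Integrate LRU", "evt": "PC",    "forecast_days": 70,  "baseline_days": 70},
        {"name": "Sustain lab",   "evt": "LOE",   "forecast_days": 400, "baseline_days": 400},
    ]

    for t in tasks:
        d, b = t["forecast_days"], t["baseline_days"]
        if t["evt"] == "0/100" and d > 22:            # roughly one month
            print(f"{t['name']}: 0/100 task spanning {d} days")
        if t["evt"] == "50/50" and d > 44:            # roughly two months
            print(f"{t['name']}: 50/50 task spanning {d} days")
        if t["evt"] == "PC" and (b > 60 or d > 60):   # > 12 weeks
            print(f"{t['name']}: percent complete task over three months")
        if t["evt"] in ("LOE", "PPkg") and d > 260:   # > 52 weeks
            print(f"{t['name']}: {t['evt']} task longer than one year")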

Risks & Opportunities Integration Analysis

Reference: Transparent Tenet / Risk and Risk Mitigation Identification Test; Usable Tenet / Risk

Integration into the IMS Test

Ideally, the IMS contains risk and opportunity (R&O) activities that are coded to readily identify

them as such. Typically, the R&O tasks have an R&O ID that aligns to the R&O register. At a

minimum, IMS activities should be traceable to individual risks and opportunities in the R&O

database/register.

To assess whether R&O tasks are identified, examine the Text fields, Notes fields, or the Name

field. If the R&O register is available, spot check if recent items exist in the schedule.

Tip: R&O visibility enables focused attention on related tasks.


Verify that related R&O tasks are identified in the IMS and align with related source documentation. Query schedule authors for explanations regarding items detected that do not

satisfy R&O alignment.
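The spot check can be sketched as a text scan, assuming the register uses IDs like "RISK-001"; the ID pattern and field names are assumptions for illustration:

    # R&O traceability spot check sketch: scan task text fields for register IDs.
    import re

    register_ids = {"RISK-001", "RISK-007", "OPP-003"}   # from the R&O register

    tasks = [
        {"name": "Mitigate antenna risk", "text1": "RISK-001"},
        {"name": "Pursue COTS radio",     "text1": "OPP-003"},
        {"name": "Qualify new vendor",    "text1": ""},
    ]

    found = set()
    for t in tasks:
        found.update(re.findall(r"(?:RISK|OPP)-\d{3}", t["text1"] + " " + t["name"]))

    missing = register_ids - found
    print(f"Register items with no matching IMS task: {sorted(missing) or 'none'}")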


Acronyms

AC Accomplishment Criteria

ACWP Actual Cost of Work Performed

ALAP As Late As Possible

ASAP As Soon As Possible

BCR Baseline Change Request

BCWP Budgeted Cost of Work Performed

BCWS Budgeted Cost of Work Scheduled

BEI Baseline Execution Index

BOE Basis of Estimate

CAM Control Account Manager

CAP Control Account Plan

CDR Critical Design Review

CDRL Contract Data Requirements List

CEI Current Execution Index

CEVM Center for Earned Value Management

CLIN Contract Line Item Number

CP Critical Path

CPLI Critical Path Length Index

CPM Critical Path Method

CPR Contract Performance Report

CV Cost Variance

DCMA Defense Contract Management Agency

DID Data Item Description

DP Driving Path

EAC Estimate At Completion

ETC Estimate To Complete

EV Earned Value

EVM Earned Value Management

EVM Earned Value Method

EVMS Earned Value Management System

EVMSD Earned Value Management System Description

EVT Earned Value Technique

FAR Federal Acquisition Regulation

FF Finish to Finish

FNET Finish No Earlier Than

FNLT Finish No Later Than

FS Finish to Start

GAO Government Accountability Office

GASP Generally Accepted Scheduling Principles

GFE Government Furnished Equipment

GFI Government Furnished Information

GFP Government Furnished Property

IBR Integrated Baseline Review

IMP Integrated Master Plan

IMS Integrated Master Schedule

IPMR Integrated Program Management Report


IPT Integrated Product Team

LOE Level of Effort

LRE Latest Revised Estimate

MFO Must Finish On

MPM Microframe Program Manager

MR Management Reserve

MSO Must Start On

MSP Microsoft Project

NDIA National Defense Industrial Association

OBS Organizational Breakdown Structure

OOS Out Of Sequence

OOSS Out Of Sequence Status

OPP Open Plan Professional

PASEG Planning and Scheduling Excellence Guide

PC Percent Complete

PDR Preliminary Design Review

PE Program Event

PEO Program Executive Officer

PMB Performance Measurement Baseline

PMI Project Management Institute

PMO Program Management Office

PoP Period of Performance

PPKg Planning Package

PRR Production Readiness Review

PWS Performance Work Statement

QBD Quantifiable Backup Data

QL Quick Look

RAM Responsibility Assignment Matrix

ROMP Risk and Opportunity Management Plan

R&O Risk and Opportunity

SA Significant Accomplishment

SAF Secretary of the Air Force

SDRL Subcontract Data Requirements List

SEI Start Execution Index

SEMP Systems Engineering Management Plan

SF Start to Finish

SHA Schedule Health Assessment

SMART System Metric And Reporting Tool

SME Subject Matter Expert

SNET Start No Earlier Than

SNLT Start No Later Than

SOO Statement of Objectives

SOW Statement of Work

SPI Schedule Performance Index

SPIt Schedule Performance Index based on time

SRA Schedule Risk Assessment

SS Start to Start

SV Schedule Variance


SVt Schedule Variance based on time

SVT Schedule Visibility Task

TFCI Total Float Consumption Index

TRR Test Readiness Review

UDF User Defined Field

WAD Work Authorization Document

WBS Work Breakdown Structure

WP Work Package


Appendix – Schedule and Schedule Assessment Tools

This process document has described methods to perform IMS Quick Look and Comprehensive

IMS Assessments. Test methods are described independent of scheduling software and schedule

assessment tools to provide an understanding of the underlying scheduling principles. A section was added to increase the efficiency of tests on Microsoft Project files using Run!23.

A number of government owned and commercial tools are available that make scheduling and

schedule assessment much more efficient. This appendix describes the more common of these

tools.

Schedule Generating Software

Within the aerospace industry, the companies working with the Air Force use primarily three

scheduling software applications: Deltek Open Plan Professional, Microsoft Project, and Oracle's Primavera.

Microsoft Project is currently supported in three versions: 2003, 2007, and 2010. There are server

editions for all three versions, but IMSs prepared using the server version may be examined by

government agencies with a non-server version. Server versions are normally used on large

programs where multiple schedulers need to work on a schedule simultaneously.

Deltek Open Plan Professional is normally operated in a server environment on large programs

for the same reason as companies use Microsoft Project in a server edition. It also is available in

standalone desktop versions.

Oracle’s Primavera is the third most common scheduling software used by Air Force contractors

on acquisition programs. Primavera comes in several versions, P3 and P6 being the most

currently used applications. These applications come in server versions as well as standalone

applications.

DI-MGMT-81650, the required schedule data item description for Earned Value programs, mandates that schedules be provided in native scheduling software format. As a result, most

program offices should have an adequate number of copies of the same scheduling software used

by their contractors to evaluate submitted schedules.

Schedule Assessment Tools Associated with Schedule Software

There are a number of products that work within the schedule software to facilitate schedule

assessments. Most of these products perform or facilitate the user in performing the DCMA 14

Point Assessment. Some require the schedule software to reside on the same computer as the

schedule assessment tool. Other tools can import the schedule data independently of the schedule

software being available on the same computer. These tools are listed alphabetically with no

advocacy or recommendation by SAF/AQXC.

Acumen Fuse

Acumen Fuse is a standalone commercially available tool for schedule analysis that analyzes

Primavera, Open Plan Professional, and Microsoft Project files. The software contains over 250

metrics and produces an MS Word document for the DCMA 14 Point Assessment. It does require


Open Plan Professional to be installed on the same computer as Acumen Fuse. It does not

require MS Project or Primavera to import and analyze those native files.

Air Force IMS Performance Trends Tool

This government MS Excel-based tool provides for an automatic import of Microsoft Project and

manual import of other schedule files using an Excel template. The tool is oriented to evaluating

project performance, particularly trend data. It can help perform several of the Quick Look tests

as well. This tool is available from SAF/AQXC.

Defense Contract Management Agency (DCMA) 14 Point IMS Assessment

This government tool comes in versions for Open Plan Professional and Microsoft Project. All

filters are provided in narrative form and may be created in the scheduling software. The

Microsoft Project application also comes in an Excel spreadsheet that can import the MSP

schedule data.

Deltek Open Plan Professional

Deltek Open Plan Professional is capable of generating a Schedule Health Assessment using

services provided by Deltek. Deltek Support Services will provide a set of metrics as an add-in

to the scheduling tool. The Air Force is currently developing a set of filters, views, sorts, and

calculated fields that will automate a number of the tests described in this process document using a native Open Plan schedule. This set of Open Plan Professional

elements will be provided in the next update to this Process Document.

Run!AzTech for MS Project

Run!AzTech is a commercial schedule analysis tool that operates as an add-in for Microsoft Project. Run!AzTech contains over 240 filters and generates GASP checks and other advanced analyses and assessments. It also generates a statistics report of basic

and advanced schedule characteristics and an out-of-sequence status report, among others.

Run!23 for MS Project

Run!23 is a freeware add-in for Microsoft Project that provides a number of the filters needed for

the Quick Look Assessment tests. Filters for use in the IMS Quick Look Assessment are prefixed

with “QL” labels and the test number. The tool provides a task counting capability that helps

automate filtering and counting tasks or metrics as well as forward and backward traces to

analyze schedule logic.

Steelray Project Analyzer

Steelray Project Analyzer is a standalone commercial tool that analyzes schedules in Primavera

and Microsoft Project native formats. The software comes with a tailorable filter set of 90

metrics that includes most of the filters needed to perform the IMS Quick Look Assessment.

Project Analyzer also performs the DCMA 14 Point Assessment and generates a report to the clipboard for transfer to other applications such as MS Word or PowerPoint.