
Systematic Assessment of the Program/Project Impacts of Technological Advancement and Insertion

Revision A

James W. Bilbro
JB Consulting International*

October, 2007

* This is a revision of a white paper originally written during my tenure at the George C. Marshall Space Flight Center


Acknowledgements

I would like to acknowledge and express my appreciation for the valuable contributions to the preparation of this paper by the following individuals: Dale Thomas, Jack Stocky, Bill Burt, Dave Harris, Jay Dryer, Bill Nolte, James Cannon, Uwe Hueter, Mike May, Joel Best, Steve Newton, Richard Stutts, Wendel Coberg, Pravin Aggarwal, Endwell Daso, and Steve Pearson.


Executive Summary

It is incredibly important that program/project managers and systems engineers understand that technology development plays a far greater role in the life cycle of a program/project than has been traditionally considered and that it can have a tremendous impact on cost, schedule and risk. From a program/project perspective, technology development has traditionally been associated with the development and incorporation of “new” technology necessary to meet requirements. But frequently overlooked, and thus almost always unexpected, is the technology development associated with using “heritage” systems. Because these systems are thought to be mature, critical systems engineering steps are often skipped or given short shrift. However, when “heritage” systems are incorporated into different architectures and operated in different environments from the ones for which they were designed, they frequently require modification. If the required modification falls within the existing experience base, then it is straightforward engineering development; if it falls outside that experience base, it becomes technology development - and it is extremely difficult to know a priori which is which.

In order to determine whether or not technology development is required - and to subsequently quantify the associated cost, schedule and risk - it is necessary first to systematically assess the maturity of each system, sub-system or component in terms of the architecture and operational environment in which it is to be used, and then to assess what is required in the way of development to advance the maturity to a point where it can be successfully incorporated within cost, schedule and performance constraints. It should be noted at this point that, in the absence of adequate knowledge of the architecture and the operational environment, it is not possible to have a technology maturity beyond Level 4 (Appendix A).1 In order to minimize impacts, the program/project manager must devote adequate resources to ensure that the assessments are done as early in the life cycle as possible. This will not be inexpensive, but then neither will the “fixes” if such assessments are not done.

It is the role of the systems engineer to develop an understanding of the extent of impact of technology development - maximizing benefits and minimizing adverse effects for the program/project. This is in many respects a new role for systems engineering, or at least a new perspective. Technology Assessment needs to play a role in all aspects of systems engineering throughout the design and development process from concept development through PDR. Lessons learned from a technology development point-of-view should then be captured in the final phase of the program.

Stakeholder Expectation: GAO studies have consistently identified the “mismatch” between stakeholder expectation and developer resources (specifically the resources required to develop the technology necessary to meet program/project requirements) as a major driver in schedule slip and cost overrun2.

Requirements Definition: If requirements are defined without fully understanding the resources required to accomplish needed technology developments, the program/project is at risk. Technology assessment must be done iteratively until requirements and available resources are aligned within an acceptable risk posture.

Design Solution: As in the case of requirements development, the design solution must iterate with the technology assessment process to ensure that performance requirements can be met with a design that can be implemented within the cost, schedule and risk constraints.

Risk Management: In many respects, technology assessment can be considered a subset of risk management and as such should be a primary component of the risk assessment. A stand-alone report of the technology readiness assessment must be provided as a deliverable at PDR per NPR 7120.5D.

Technical Assessment: Technology assessment is also a subset of technical assessment. Implementing the assessment process provides a substantial contribution to overall technical assessment.

Trade Studies: Technology assessment is a vital part of determining the overall outcome of trade studies, particularly with decisions regarding the use of heritage equipment.

Verification/Validation: The verification/validation process needs to incorporate the requirements for technology maturity assessment in that ultimately maturity is demonstrated only through test and/or operation in the appropriate environment.


Lessons Learned: Part of the reason for the lack of understanding of the impact of technology on programs/projects is that we have not systematically undertaken the processes to understand impacts.


Introduction, Purpose and Scope

NASA’s programs and projects, by their very nature, frequently require the development and infusion of new technological advances in order to meet performance requirements arising out of mission goals and objectives. Frequently, problems associated with technological advancement and subsequent infusion have resulted in schedule slips, cost overruns and occasionally even cancellations or failures.3 It is the role of the Systems Engineer to develop an understanding of where those technological advances are required and to determine their impact on cost, schedule and risk. It should be noted that this issue is not confined to “new” technology. Often major technological advancement is required for a “heritage” system that is being incorporated into a different architecture and operated in a different environment from that for which it was originally designed. In this case, it is frequently not recognized that the adaptation requires technological advancement and, as a result, key systems engineering steps in the development process are given short shrift – usually to the detriment of the program/project.

In both contexts of technological advancement (new and adapted heritage), infusion is a very complex process that has been dealt with over the years in an ad hoc manner differing greatly from project to project with varying degrees of success. In post mortem, the root cause of such events has often been attributed to “inadequate definition of requirements.” If such were indeed the “root cause,” then correcting the situation would simply be a matter of requiring better requirements definition, but since history seems frequently to repeat itself, this must not be the case - at least not in total. In fact there are many contributors to schedule slip, cost overrun, project cancellation and failure - among them lack of adequate requirements definition. The case can be made that most of these contributors are related to the degree of uncertainty at the outset of the project and that a dominant factor in the degree of uncertainty is the lack of understanding of the maturity of the technology required to bring the project to fruition and a concomitant lack of understanding of the cost and schedule reserves required to advance the technology from its present state to a point where it can be qualified and successfully infused with a high degree of confidence - in other words, where requirements and available resources are in line.3,4,5,6 Although this uncertainty cannot be eliminated, it can be substantially reduced through the early application of good systems engineering practices focused on understanding the technological requirements, the maturity of the required technology and the technological advancement required to meet program/project goals, objectives and requirements. The only way to ensure the necessary level of understanding is for the systems engineer to conduct a systematic assessment of all systems, subsystems and components at various stages in the design/development process. It is extremely important to begin the assessment at the earliest possible point since results play a major role in the determination of requirements, the outcome of trade studies, the available design solutions, the assessment of risk, and the determination of cost and schedule.

There are a number of processes that can be used to develop the appropriate level of understanding required for successful technology insertion. None of them provide the complete answer, but it is the intent of this section to describe a systematic process that can be used as an example of how to apply standard systems engineering practices to perform a comprehensive Technology Assessment (TA) that will enable a reduction in uncertainty in program/project success. It should be noted that the examples provided in this section are just examples. The process can and should be tailored to meet the needs of the particular program/project to which it is being applied.

The TA comprises two parts: a Technology Maturity Assessment (TMA) and an Advancement Degree of Difficulty (AD2) assessment. The process begins with the TMA, which is used to determine technological maturity via NASA’s Technology Readiness Level (TRL) scale (Appendix A). It then proceeds to develop an understanding of what is required to advance the level of maturity through the Advancement Degree of Difficulty (AD2) assessment (Appendix C).

It is necessary to conduct a TA at various stages throughout a program/project in order to provide the Key Decision Point (KDP) products required for transition between phases identified below:

KDP A – Transition from Pre-Phase A to Phase A: Requires an assessment of potential technology needs versus current and planned technology readiness levels, as well as potential opportunities to use commercial, academic, and other government agency sources of technology. This assessment is included as part of the draft integrated baseline.


KDP B – Transition from Phase A to Phase B: Requires a Technology Development Plan identifying technologies to be developed, heritage systems to be modified, alternate paths to be pursued, fall-back positions and corresponding performance de-scopes, milestones, metrics and key decision points. This is to be incorporated in the preliminary Project Plan.

KDP C – Transition from Phase B to Phase C/D: Requires a Technology Readiness Assessment Report (TRAR) demonstrating that all systems, subsystems and components have achieved a level of technological maturity with demonstrated evidence of qualification in a relevant environment. This is a stand-alone report.

The initial TMA serves to provide the baseline maturity of the system at program/project outset. Subsequent assessments can be used to monitor progress throughout development. The final TMA is performed just prior to the Preliminary Design Review (PDR) and forms the basis for the Technology Readiness Assessment Report (TRAR), which documents the maturity of the systems, subsystems and components demonstrated through test and analysis. The initial AD2 assessment provides the material necessary to develop preliminary cost and schedule plans and preliminary risk assessments. In subsequent assessments, the information is then used to build the technology development plan, in the process identifying alternative paths, fall-back positions and performance de-scope options. The information provided by the AD2 assessments is also vital to the preparation of milestones and metrics for subsequent Earned Value Management.

The assessment is performed against the hierarchical breakdown of the hardware and software products of the program/project Work Breakdown Structure (WBS) in order to achieve a systematic, overall understanding at the system, subsystem and component level.

Inputs/Entry Criteria

It is extremely important that a Technology Assessment process be defined at the beginning of the program/project and that it be performed beginning at the earliest possible stage (concept development) and continuing throughout the program/project through PDR. Inputs to the process will vary in level of detail according to the phase of the program/project, and even though there is a lack of detail in pre-Phase A, the TA will drive out the major critical technological advancements required. Therefore, at the beginning of pre-Phase A, the following should be determined:

Refinement of Technology Readiness Level Definitions (beyond those contained in Appendix A.)

Refinement of definition of terms to be used in the assessment process (beyond those contained in Appendix B.)

Establishment of meaningful evaluation criteria and metrics that will allow for clear identification of gaps and shortfalls in performance.

Establishment of the TA team.

Establishment of an independent TA review team.

How to do Technology Assessment

The technology assessment process makes use of basic systems engineering principles and processes. As mentioned previously, it is structured to occur within the framework of the Work Breakdown Structure (WBS) (note: it should be a product-oriented WBS) in order to facilitate incorporation of the results. Using the WBS as a framework has a two-fold benefit – it breaks the “problem” down into systems, subsystems and components that can be more accurately assessed, and it provides the results of the assessment in a format that can readily be used in the generation of program costs and schedules. It can also be highly beneficial in providing milestones and metrics for progress tracking using Earned Value Management. An example of a general product-oriented WBS is shown in Figure 1.

Figure 1. General WBS example

As discussed above, Technology Assessment is a two-step process: step one, the determination of the current technological maturity in terms of Technology Readiness Levels (TRLs), and step two, the determination of the difficulty associated with moving a technology from one TRL to the next through the use of the Advancement Degree of Difficulty (AD2). The overall process, shown in Figure 2, is iterative, starting at the conceptual level during program formulation, establishing the initial identification of critical technologies and the preliminary cost, schedule and risk mitigation plans. Continuing into Phase A, it is used to establish the baseline maturity, the Technology Development Plan and associated costs and schedule. The final TA consists only of the TMA and is used to develop the TRAR, which validates that all elements are at the requisite maturity level.

Even at the conceptual level, it is important to use the formalism of a WBS to avoid having important technologies slip through the cracks. Because of the preliminary nature of the concept, the systems, subsystems and components will of necessity be defined at a level that will not permit a detailed assessment. The process of performing the assessment, however, is the same as that used for subsequent, more detailed steps that occur later in the program/project where systems are defined in greater detail.

Once the concept has been formulated and the initial identification of critical technologies made, it is necessary to perform detailed architecture studies with the Technology Assessment Process intimately interwoven. The purpose of the architecture studies is to refine end-item system design to meet the overall scientific requirements of the mission. It is imperative that there be a continuous relationship between architectural studies and maturing technology advances as shown in Figure 3. The architectural studies must incorporate the results of the technology maturation, planning for alternate paths and identifying new areas required for development as the architecture is refined. Similarly, it is incumbent upon the technology maturation process to identify requirements that are not feasible and development routes that are not fruitful and to transmit that information to the architecture studies in a timely manner. The architecture studies in turn must provide feedback to the technology development process relative to changes in requirements. Particular attention must be given to “heritage” systems in that they are often used in architectures and environments different from those in which they were designed to operate.

[Figure 1 content: Project XYZ is broken into System A, System B and System C; each system into Subsystems a, b and c; each subsystem into Components α, β, …]


Figure 2. Technology Assessment Process

Establishing Technology Readiness Levels

TRL is, at its most basic, a description of the “performance history” of a given system, subsystem or component relative to a set of levels first described at NASA Headquarters in the 1980s.7,8 The TRL essentially describes the level of maturity of a given technology and provides a “baseline” from which maturity is gauged and advancement defined. Even though the concept of TRL has been around for almost 20 years, it is not well understood and is frequently misinterpreted. As a result, we often undertake programs without fully understanding either the maturity of key technologies or what is needed to develop them to the required level. It is impossible to understand the magnitude and scope of a development program without having a clear understanding of the baseline technological maturity of all elements of the system.

[Figure 2 content: identify systems, subsystems and components per the hierarchical product breakdown of the WBS; assign a TRL to all components based on an assessment of maturity; assign a TRL to subsystems based on the lowest TRL of their components plus the TRL state of integration; assign a TRL to systems based on the lowest TRL of their subsystems plus the TRL state of integration; identify all components, subsystems and systems that are at lower TRLs than required by the program (the baseline technological maturity assessment); perform AD2 on all identified elements that are below the requisite maturity level; the outputs feed the Technology Development Plan, cost plan, schedule plan and risk assessment.]

Figure 3. Architectural studies and technology development

[Figure 3 content: requirements and concepts, the TRL/AD2 assessment, architecture studies and system design, and technology maturation iterate with one another.]


Establishing the TRL is a vital first step on the way to a successful program. A frequent misconception is that in practice it is too difficult to determine TRLs and that, even when they are determined, the result is not meaningful. On the contrary, identifying TRLs can be a straightforward systems engineering process of determining what was demonstrated and under what conditions it was demonstrated.

The Technology Readiness Levels for hardware and software, along with the exit criteria for each level and a set of definitions for terms used within the TRL descriptions, can be found in the NASA document NPR 7120.8 and are shown in Appendix A. It should be noted that these descriptions include an expanded definition of hardware levels along with a definition of software levels and a set of corresponding exit criteria.

Careful attention should be paid to levels 4 and 5. Because testing requires knowledge of the final operating environment, technology cannot be matured beyond level 3 independent of the end-use requirements. In other words, higher levels of technology maturity can only be defined in terms of the program/project into which it is to be incorporated. This is not to say that progress cannot be made toward a higher level of maturity by defining a set of generic operating requirements and architectures that will cover a broad range of potential end-use, but in the final analysis, within a given program/project, the technology maturity is defined by the specific architecture and set of operational environments for that program/project.

Even though the expanded definitions offer a measure of improvement over previous descriptions, there is still plenty of room for interpretation and to that end, a definition of terms is also provided in NPR 7120.8 and shown in Appendix A. It may be necessary to expand this list of definitions to include specific terms that are of particular importance to the project. It is absolutely vital to have a set of common terminology throughout the life of a project.

Having established a common set of terminology, it is necessary to proceed to the next step – quantifying “judgment calls” on the basis of past experience. Even with clear definitions there will be the need for “judgment calls” when it comes time to assess just how similar a given element is relative to what is needed (i.e., is it “close enough” to a prototype to be considered a prototype, or is it more like an engineering breadboard?) Describing what has been done in terms of form, fit and function provides a means of quantifying an element based on its design intent and subsequent performance. Figure 4 provides a graphical representation of a 3-dimensional continuum where various models, breadboards, engineering units, prototypes, etc. are plotted. The X-axis of the graph represents “function,” the Y-axis represents “form” and the Z-axis represents “fit.” A breadboard, according to our definition, demonstrates function only, without regard to form or fit, and would consequently fall on the X-axis. If the breadboard were of the full “system,” then it would be at the 100% mark; if instead it were a breadboard of a subsystem or component, it would fall somewhere to the left of the 100% mark. Another example would be that of a wind tunnel model. These models are typically less than 1 % scale, but demonstrate aerodynamic properties associated with “form.” A mass model could demonstrate both form and fit, but would have no function. The prototype would be as close to the final article as possible and would consequently demonstrate form, fit and function. By plotting the device under question, it will be easier to classify its state of development.


Figure 4. Form, Fit & Function

[Figure 4 content: examples plotted against the form, fit and function axes include proof of concept, breadboard, brassboard, wind tunnel model, mass model, subscale unit, engineering unit, prototype, and flight/proto-flight/qualification units.]
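To make the form, fit and function framing concrete, the following is a minimal illustrative sketch (in Python, not part of the original paper) of how an element's demonstrated form, fit and function percentages might be recorded and given a rough label consistent with the examples above. The category names and the 80%/20% thresholds are assumptions chosen purely for illustration; in practice the placement remains a judgment call by the assessment team.

from dataclasses import dataclass

@dataclass
class DemonstratedArticle:
    """Where an article falls on the form/fit/function continuum (0-100%)."""
    name: str
    form: float      # % of final geometry/packaging demonstrated
    fit: float       # % of final interfaces/integration demonstrated
    function: float  # % of final functionality demonstrated

def rough_label(a: DemonstratedArticle, high: float = 80.0, low: float = 20.0) -> str:
    """Rough, illustrative classification mirroring the Figure 4 examples."""
    if a.function >= high and a.form >= high and a.fit >= high:
        return "prototype-like (form, fit and function)"
    if a.function >= low and a.form < low and a.fit < low:
        return "breadboard-like (function only)"
    if a.form >= low and a.fit >= low and a.function < low:
        return "mass-model-like (form and fit, no function)"
    if a.form >= low and a.function < low and a.fit < low:
        return "wind-tunnel-model-like (form only)"
    return "intermediate article (judgment call required)"

# Example: an article demonstrating most of the function but little form or fit
print(rough_label(DemonstratedArticle("pump breadboard", form=10, fit=5, function=70)))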

A third critical element of any assessment relates to the question of who is in the best position to make “judgment calls” relative to the status of the technology in question. For this step, it is extremely important to have a well-balanced, experienced assessment team. Team members do not necessarily have to be “discipline experts,” but they must have a good understanding at the system, or subsystem level of what has been done, under what type of conditions, and how that relates to what is under evaluation. Establishing a team with the appropriate level of experience is the most critical aspect of technology assessment.

Having established a set of definitions, defined a process for quantifying “judgment calls,” and assembled an expert assessment team, the process primarily consists of asking the right questions. It may be desirable to perform an initial screening of systems at a top level to identify initial targets; however, such screening should be undertaken with great caution since it can easily lead to missing critical areas that are not immediately apparent (such as those dealing with heritage equipment). A full assessment should be completed on all elements at the earliest possible time. An initial set of screening questions is shown in Figure 5. Again, it needs to be emphasized that these questions cannot be answered accurately if there is insufficient knowledge of the final architecture and operating environment. It is understandable that early in a program/project, this knowledge will be incomplete, but at the same time, it should be recognized that all assessments are “preliminary” and subject to revision until the knowledge is itself mature.

Figure 5. Initial screening questions.

Again, if either the architecture or the environment in which the system is operating has changed from that for which it was originally designed, then the TRL for that system drops to at most TRL 4 – at least initially. If, in subsequent analysis, the new environment is sufficiently close to the old environment, or the new architecture sufficiently close to the old architecture, the resulting evaluation could be TRL 5, 6 or 7. The most important thing to realize is that it is no longer at TRL 9.

[Figure 5 content: the screening questions map to TRLs as follows.
TRL 9 – Has an identical unit been successfully operated/launched in an identical configuration and environment?
(If an identical unit has been successfully operated in space or launch but in a different configuration/system architecture, it initially drops to TRL 5 until the differences are evaluated.)
TRL 8 – Has an identical unit been flight qualified, but not yet operated in space or launch?
TRL 7 – Has a prototype unit (or one similar enough to be considered a prototype) been successfully operated in space or launch?
TRL 6 – Has a prototype unit (or one similar enough to be considered a prototype) been demonstrated in a relevant environment?
TRL 5 – Has a breadboard unit been demonstrated in a relevant environment?
TRL 4 – Has a breadboard unit been demonstrated in a laboratory environment?
TRL 3 – Has analytical and experimental proof-of-concept been demonstrated?
TRL 2 – Has a concept or application been formulated?
TRL 1 – Have basic principles been observed and reported?
If none of the above apply: rethink your position regarding this technology!]
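Read top to bottom, the screening ladder above behaves like a simple decision procedure. The sketch below is illustrative only (question wording abbreviated into hypothetical keys); it returns an initial TRL from yes/no answers, including the heritage caveat that a unit flown in a different configuration or architecture initially drops to TRL 5 until the differences are evaluated.

def initial_trl(answers: dict[str, bool]) -> int | None:
    """Illustrative walk down the Figure 5 screening questions.

    `answers` maps abbreviated question keys to yes/no. Returns the first
    TRL whose question is answered 'yes', or None (rethink the technology).
    """
    ladder = [
        ("identical_unit_flown_identical_config_env", 9),
        # Heritage caveat: flown, but in a different configuration/architecture
        ("identical_unit_flown_different_config", 5),
        ("identical_unit_flight_qualified_not_flown", 8),
        ("prototype_operated_in_space_or_launch", 7),
        ("prototype_demonstrated_relevant_env", 6),
        ("breadboard_demonstrated_relevant_env", 5),
        ("breadboard_demonstrated_lab_env", 4),
        ("analytical_and_experimental_proof_of_concept", 3),
        ("concept_or_application_formulated", 2),
        ("basic_principles_observed_and_reported", 1),
    ]
    for question, trl in ladder:
        if answers.get(question, False):
            return trl
    return None  # "Rethink your position regarding this technology!"

# Example: a heritage unit reused in a new architecture starts at TRL 5
print(initial_trl({"identical_unit_flown_different_config": True}))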


Applying this process at the system level and then proceeding to the lower levels of subsystem and component identifies those elements that require development and sets the stage for the subsequent phase, determining the AD2. A method for formalizing this process is shown in Figure 6. In this figure, the process has been incorporated into a table. On the left-hand side of the table, the rows identify the systems, subsystems, and components that are under assessment. The columns identify the categories that will be used to determine the TRL, i.e., what units have been built, to what scale, and in what environments they have been tested. Answers to these questions determine the TRL of an item under consideration. The TRL of the system is determined by the lowest TRL present in the system, i.e., a system is at TRL 2 if any single element in the system is at TRL 2. The problem of multiple elements being at low TRLs is dealt with in the AD2 process. It should be noted that the issue of integration affects the TRL of every system, subsystem and component. All of the elements can be at a higher TRL, but if they have never been integrated as a unit, the TRL will be lower for the unit. How much lower depends on the complexity of the integration.

Figure 6. TRL Assessment Matrix Example
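The roll-up rule described above – a parent element's TRL is bounded by the lowest TRL among its children, and is lower still if the elements have never been integrated as a unit – can be sketched as follows. The tree structure and the fixed one-level integration penalty are illustrative assumptions, not the paper's algorithm; as noted above, how much lower the integrated TRL falls depends on the complexity of the integration.

from dataclasses import dataclass, field

@dataclass
class WbsElement:
    name: str
    trl: int | None = None                      # assessed TRL for leaf components
    children: list["WbsElement"] = field(default_factory=list)
    integrated_as_unit: bool = True             # has this assembly been integrated/tested as a unit?

def rolled_up_trl(e: WbsElement, integration_penalty: int = 1) -> int:
    """Illustrative roll-up: lowest child TRL, reduced if never integrated as a unit."""
    if not e.children:
        assert e.trl is not None, f"leaf {e.name} needs an assessed TRL"
        return e.trl
    trl = min(rolled_up_trl(c, integration_penalty) for c in e.children)
    if not e.integrated_as_unit:
        trl = max(1, trl - integration_penalty)  # actual reduction depends on integration complexity
    return trl

subsystem = WbsElement("Subsystem a", integrated_as_unit=False, children=[
    WbsElement("Component alpha", trl=6),
    WbsElement("Component beta", trl=4),
])
print(rolled_up_trl(subsystem))  # 3: lowest component TRL (4) less the integration penalty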

An automated TRL calculator has been developed by AFRL which incorporates all of these issues (and more) and is available for use upon request.9,10 The process flow for the calculator is shown in Figure 7. The calculator can be set up to address TRLs, MRLs (Manufacturing Readiness Levels)11 and PRLs (Program Readiness Levels). Sample pages from the calculator are shown in Figures 8 and 9, and an example of an assessment is shown in Figure 10.


Figure 7. The TRL Calculator Process Flow

Figure 8. The TRL Calculator option selection

[Figures 7–9 content: the AFRL Hardware and Software Transition Readiness Level Calculator (Version 2.2) steps through each TRL in turn, scoring the level Green or Yellow according to the percentage of its questions that have been answered, and repeating up to TRL 9. The option-selection page allows the user to include hardware, software, or both, and to include or omit the TRL, MRL and PRL elements; omitted elements are removed from the summary. The default set points are 100% for Green and 67% for Yellow; Green may be set to any value above 75% and Yellow to any value from 50% to 85%, with the Yellow set point always at least 15% below the Green set point.]
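Purely as an illustration of the set-point logic summarized above (and not the actual AFRL spreadsheet implementation), the following sketch scores each TRL from the fraction of its questions answered and reports the highest levels achieved at the Green and Yellow set points, using the default values of 100% and 67%.

def score_levels(percent_complete: dict[int, float],
                 green: float = 100.0, yellow: float = 67.0) -> tuple[int, int]:
    """Illustrative Green/Yellow scoring by TRL.

    `percent_complete[level]` is the percentage of that level's questions answered.
    Returns (highest Green level, highest Yellow-or-better level); 0 means none.
    A level only counts if every lower level also meets the same set point.
    """
    green_level = yellow_level = 0
    for level in range(1, 10):
        pct = percent_complete.get(level, 0.0)
        if pct >= yellow and yellow_level == level - 1:
            yellow_level = level
        if pct >= green and green_level == level - 1:
            green_level = level
    return green_level, yellow_level

# Example: levels 1-4 fully answered, level 5 roughly two-thirds answered
print(score_levels({1: 100, 2: 100, 3: 100, 4: 100, 5: 70}))  # (4, 5)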


Figure 9. TRL Calculator Summary Page

Figure 10. TRL Calculator Example Results

A modification of the AFRL TRL Calculator has been developed and is also available upon request.12 The beginning of the assessment page from this calculator is shown in Figure 11.


Figure 11. AFRL_NASA Calculator Assessment page header.

[Figures 9 and 10 content: the summary page reports the Green and Yellow levels achieved and an overall TRL that aggregates the selected readiness-level elements; the example results tabulate TRL and MRL scores by WBS element for a launch vehicle example (primary and secondary structure, flight separation system, TPS, main propulsion ducts, lines, valves and actuators, upper stage and first stage reaction control systems, thrust vector control actuators and hydraulic power systems, avionics, flight software, logistics support GSE, and manufacturing and assembly tooling).]


The modified calculator integrates the TRL calculator with the AD2 calculator and allows for information to be entered and stored according to WBS element. The calculator does not use any algorithms, either for the TRL calculation or the AD2 calculation. In the case of the TRL calculation, the calculator may be set to perform calculations for any combination of hardware, software and manufacturing. The questions are based on the steps needed to satisfy the exit criteria shown in Appendix A. As such, the steps are either completed or not. Partial completion may be shown, but the level is not attained until all of the steps are complete. The detailed set of questions for Level 5 is shown in Figure 12.

[Figure 12 content: the Level 5 checklist, with each question tagged as applying to hardware, software or manufacturing and answered by check box or percent-complete slider. Representative questions include: critical functions and associated subsystems and components identified; relevant environments finalized; subsystem/component breadboards/brassboards identified, designed and built; subsystem/component integrations and implementations completed; modeling and simulation pre-test performance predictions completed; facilities, GSE and STE available to support testing in a relevant environment; subsystems/integrated components (or breadboards/brassboards) successfully demonstrated in a relevant environment and the demonstration documented along with scaling requirements; scaling requirements defined and documented; system interface requirements known; system-level performance predictions made for subsequent development phases; design techniques defined to the point where the largest problems are defined; manufacturing process developed; limited pre-production hardware produced; some special-purpose components combined with available laboratory components; process, tooling, inspection and test equipment in development; trade studies and lab experiments define key manufacturing processes; production processes reviewed with manufacturing for producibility; tooling and machines demonstrated in the lab; a manufacturing flow chart developed; initial assessment of assembly needs conducted; facility requirements identified; and quality and reliability considered, but target levels not yet established.]

Figure 12. Questions pertaining to Level 5.
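In contrast to the Green/Yellow set points of the AFRL calculator, the modified calculator described above credits a level only when every exit-criterion step is complete. A minimal illustrative sketch of that rule follows; the checklist contents are abbreviated and hypothetical.

def achieved_trl(checklists: dict[int, list[bool]]) -> tuple[int, dict[int, float]]:
    """Level attained only when all of its steps (and all lower levels) are complete.

    Returns the highest TRL achieved plus percent-complete per level, which may
    be displayed for partially finished levels without crediting them.
    """
    percent = {lvl: 100.0 * sum(steps) / len(steps) for lvl, steps in checklists.items()}
    achieved = 0
    for lvl in sorted(checklists):
        if lvl == achieved + 1 and all(checklists[lvl]):
            achieved = lvl
        else:
            break
    return achieved, percent

# Example: TRL 4 steps all complete, TRL 5 steps only partially complete
print(achieved_trl({1: [True], 2: [True], 3: [True, True],
                    4: [True, True, True], 5: [True, True, False, False]}))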

Assessing the AD2 in moving technology to a higher TRL

Once a TRL has been established for the various elements of the system, subsystem or component under development, it becomes necessary to assess what will be required to advance the technology to the level required by the program. The importance of this step cannot be overemphasized because everything that transpires from this point forward will depend upon the accuracy of the assessment: the ability to prepare realistic schedules, make realistic cost projections, meet milestones and ultimately produce the desired results all depends upon having as accurate an AD2 assessment as possible. This assessment is one of the most challenging aspects of technology development – not all technologies are the same. It requires the art of “prediction,” and the accuracy of the prediction relies on:

• Expert personnel
• Detailed examination of required activity
• Review by independent advisory panel


Establishing an accurate AD2 is difficult, but if approached in a systematic manner such as that used for establishing the TRL, then it becomes tractable. Once again, the key is to break it down to a fine enough granularity that problems can be quantified and solution probabilities accurately estimated. The AD2 of the overall system under development will be a function of the AD2 of the individual subsystems and components. It is not a straightforward combination of AD2s. Neither is it determined solely by the most difficult element (although basing an entire system around a key component that is made of “unobtainium” would definitely place the entire system in the “breakthrough physics” category). A more likely case is where some of the elements have been demonstrated at the appropriate level, some are relatively close to being demonstrated and some have significant hurdles to overcome. The more elements you have with low AD2 values, the greater the difficulty in advancing the system as a whole to the requisite level. This is intuitively obvious; however, knowing that a large number of elements have low AD2 values means that the problem has been examined at sufficient granularity. This is a difficult and time-consuming process that requires a broad mix of highly experienced individuals, and it is a process that is often inadequately performed. Simply put, a cursory examination of a problem may determine that there is only one element with a low AD2 value when in fact there are two or more such elements. If the program is structured to attack the identified area, it may in fact resolve that problem, only to find that the program is brought to a halt by one or more of the remaining problems.

Early recognition of the need for “quantifying” the difficulty in maturing technology resulted in establishing five categories of difficulty.13 The AD2 expands this concept into a systems engineering approach to systematically evaluate the effort required to advance the technological maturity of the elements identified in the TRL assessment as being insufficiently mature. It combines a number of processes – Integration Readiness Levels,14 System Readiness Levels,15 Manufacturing Readiness Levels, Design Readiness Levels, Materials Readiness Levels, etc. – into one overarching process. The AD2 increases the number of levels of difficulty from five to nine and reorganizes them, in recognition of a need for greater distinction in difficulty. The nine levels of AD2 are shown in Appendix C. It should be noted that level 9 for AD2 has been chosen to reflect the greatest difficulty and is therefore an undesirable level, in contrast to level 9 for the TRL, which is the desired level. Inverting the scale has been tried; while that makes it comparable to the TRL scale, it also runs counter to the concept of increasing difficulty, and therefore the present designation has been retained.

In determining the AD2, the focus is on the tools, processes and capabilities associated with the categories that address the development process required to produce a given element: i.e., design, analysis, manufacturing, operation, test and evaluation. It also includes categories of operations that must be considered in the development process and the development stages that must be undertaken to reduce risk. The result of the AD2 process is a matrix that provides a quantifiable assessment of the degree of difficulty in advancing the technology. The table shown in Figure 13 is an example showing the process in assessing the status of turbomachinery. It involves the systems, subsystems and components identified in the maturity assessment phase as insufficiently mature. The left-hand side of the table is identical in form to that of the TRL assessment table but may in fact go to even finer granularity. The identification of the columns, however, is quite different. In this case, we are not looking for the existence of historical information relative to the current status of an element; rather, we are looking for an assessment of the degree of difficulty in maturing the element. This requires addressing appropriate questions regarding the development process:

[Figure 13 content: an AD2 matrix for a turbopump example. Rows list the WBS elements under assessment (turbopump: inducer, impeller, pump housing volute and diffuser, turbine manifold blades and nozzles, dynamic seals, bearings and supports, secondary flow path, axial thrust balance, materials, design integration and assembly; fabrication; validation testing). Columns group the assessment criteria by Design & Analysis (databases; design methods and tools; analytical methods and tools; appropriate models; personnel skills; manufacturability; testability), Manufacturing (materials; facilities; machines; tooling; metrology; manufacturing software; manufacturing processes; minimized process variability; reproducibility; assembly and alignment; integration; personnel skills), Operability (maximized reliability, availability and maintainability; minimized life cycle and operating costs; GSE required; facilities; personnel skills), Test & Evaluation (test facilities; environmental facilities; test equipment; analytical tools; testability; personnel skills), and Demonstration Units (breadboard; brassboard; scale model; engineering unit; components), together with the current TRL. Cell scores are color coded: red, high risk = 1–4; orange, moderate risk = 5–6; green, existing to low risk = 7–9; white = not considered.]


Figure 13. AD2 Example (N.B. This example used a reverse scale where 9 represented 0% risk)

Design and Analysis
• Do the necessary data bases exist and if not, what level of development is required to produce them?
• Do the necessary design methods exist and if not, what level of development is required to produce them?
• Do the necessary design tools exist and if not, what level of development is required to produce them?
• Do the necessary analytical methods exist and if not, what level of development is required to produce them?
• Do the necessary analysis tools exist and if not, what level of development is required to produce them?
• Do the appropriate models with sufficient accuracy exist and if not, what level of development is required to produce them?
• Do the available personnel have the appropriate skills and if not, what level of development is required to acquire them?
• Has the design been optimized for manufacturability and if not, what level of development is required to optimize it?
• Has the design been optimized for testability and if not, what level of development is required to optimize it?

Manufacturing
• Do the necessary materials exist and if not, what level of development is required to produce them?
• Do the necessary manufacturing facilities exist and if not, what level of development is required to produce them?
• Do the necessary manufacturing machines exist and if not, what level of development is required to produce them?
• Does the necessary manufacturing tooling exist and if not, what level of development is required to produce it?
• Does the necessary metrology exist and if not, what level of development is required to produce it?
• Does the necessary manufacturing software exist and if not, what level of development is required to produce it?
• Do the available personnel have the appropriate skills and if not, what level of development is required to acquire them?
• Has the design been optimized for manufacturability and if not, what level of development is required to optimize it?
• Has the manufacturing process flow been optimized and if not, what level of development is required to optimize it?
• Has the manufacturing process variability been minimized and if not, what level of development is required to optimize it?
• Has the design been optimized for reproducibility and if not, what level of development is required to optimize it?
• Has the design been optimized for assembly & alignment and if not, what level of development is required to optimize it?
• Has the design been optimized for integration at the component, subsystem and system level and if not, what level of development is required to optimize it?
• Are breadboards required and if so what level of development is required to produce them?
• Are brassboards required and if so what level of development is required to produce them?
• Are subscale models required and if so what level of development is required to produce them?
• Are engineering models required and if so what level of development is required to produce them?
• Are prototypes required and if so what level of development is required to produce them?
• Are breadboards, brassboards, engineering models and prototypes at the appropriate scale and fidelity for what they are to demonstrate, and if not what level of development is required to modify them accordingly?
• Are Qualification models required and if so what level of development is required to produce them?

Operations
• Has the design been optimized for maintainability and servicing and if not, what level of development is required to optimize it?
• Has the design been optimized for minimum life cycle cost and if not, what level of development is required to optimize it?
• Has the design been optimized for minimum annual recurring / operational cost and if not, what level of development is required to optimize it?
• Has the design been optimized for reliability and if not, what level of development is required to optimize it?
• Has the design been optimized for availability {ratio of operating time (reliability) to downtime (maintainability/supportability)} and if not, what level of development is required to optimize it?
• Do the necessary ground systems facilities & infrastructure exist and if not, what level of development is required to produce them?
• Does the necessary ground systems equipment exist and if not, what level of development is required to produce it?
• Does the necessary ground systems software exist and if not, what level of development is required to produce it?
• Do the available personnel have the appropriate skills and if not, what level of development is required to acquire them?

Test & Evaluation
• Do the necessary test facilities exist and if not, what level of development is required to produce them?
• Does the necessary test equipment exist and if not, what level of development is required to produce it?
• Does the necessary test tooling exist and if not, what level of development is required to produce it?
• Do the necessary test measurement systems exist and if not, what level of development is required to produce them?
• Does the necessary software exist and if not, what level of development is required to produce it?
• Do the available personnel have the appropriate skills and if not, what level of development is required to acquire them?
• Has the design been optimized for testability and if not, what level of development is required to optimize it?
• Are breadboards required to be tested and if so what level of development is required to test them?
• Are brassboards required to be tested and if so what level of development is required to test them?
• Are subscale models required to be tested and if so what level of development is required to test them?
• Are engineering models required to be tested and if so what level of development is required to test them?
• Are prototypes required to be tested and if so what level of development is required to test them?
• Are Qualification models required to be tested and if so what level of development is required to test them?

Each element is evaluated according to the scale in Appendix C. The resulting scores are tabulated at the far right of the table to indicate how many categories are at what level. The penultimate column provides a numerical assessment of the overall difficulty, and the final column displays a color-coded assessment corresponding to a level of concern. In scoring the table, the degree of difficulty is never higher than the lowest value in the row. However, the degree of difficulty may be increased further by the presence of multiple categories having low values. An algorithm for calculating AD2 scores is given in Appendix D. It should be carefully noted that the value in the AD2 process is in the generation of the data; it cannot easily be reduced to a single number, and consequently such reductions are intended to be used only as general indications of difficulty.
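As an assumed illustration of the row-scoring rule just described – the overall degree of difficulty is never better than the worst category in the row, and is degraded further when several categories score poorly – the sketch below follows the reverse scale of the Figure 13 example, where 9 represents low risk. It is not the Appendix D algorithm; the threshold and the one-point penalty per additional low category are arbitrary choices for illustration only.

def ad2_row_score(category_scores: list[int], low_threshold: int = 6) -> int:
    """Illustrative AD2 roll-up for one WBS row (reverse scale: 9 = low risk).

    The row score starts at the worst (lowest) category value and is reduced by
    one for each additional low-scoring category, but never drops below 1.
    """
    if not category_scores:
        raise ValueError("row has no scored categories")
    worst = min(category_scores)
    extra_low = sum(1 for s in category_scores if s <= low_threshold) - 1
    return max(1, worst - max(0, extra_low))

# Example: one very difficult category (4) plus two more moderately difficult ones (5, 6)
print(ad2_row_score([9, 9, 6, 5, 4, 8]))  # 2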

Again it is important to remember that the questions should be tailored to the project and the level of detail should be appropriate to the phase of the program. Many of the questions outlined above can be expanded upon, e.g. in the area of design:

• Design Life: related to wear-out and replacement
• Identification of key critical functions/failures and design solutions identified through testing and demonstration
• Ergonomics: human limitations and safety considerations
• Reduction in complexity
• Duplication to provide fault tolerance
• Is derating criteria established to limit stress on components?
• Are modularization concepts applied?
• Commonality with other systems
• Reconfigurability
• Feedback of failure information (lessons learned, reliability predictions, failure history, etc.) to influence design
• Maintenance philosophy
• Storage Requirements
• Transportation Requirements
• Design for parts orientation and handling
• Design for ease of assembly

Or in manufacturing:
• Integrated Design/manufacturing tools?
• Materials? Are they compatible with manufacturing technology selected?
• Non-destructive evaluation? Other inspections?
• Workforce (with right skills) availability?
• Can building-block approach be followed?

Or in operations:
• Maximized Launch Readiness Probability?

o Related to Availability but covers the period of time from start of ground processing (e.g. start of launch vehicle stacking) to start of launch countdown (e.g. “LCC Call to Station”).

o For example: Launch Readiness of 85% or greater - Ability to be ready for launch countdown 85% of the time or better on the first scheduled launch attempt (excluding weather and ground systems, mission systems, and payload / spacecraft anomalies)

o Variables that influence the launch readiness probability: availability, MTBF, MTTR, and the ability to meet H/W delivery dates (see the sketch below).
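For the availability variable listed above, the standard relation availability = MTBF / (MTBF + MTTR) applies. The sketch below is illustrative only: it combines that relation with an assumed, independent hardware-delivery confidence to give a rough launch-readiness figure of the kind targeted above; the numbers are hypothetical.

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Inherent availability: ratio of operating time to total time."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def launch_readiness(avail: float, hw_delivery_confidence: float) -> float:
    """Rough illustrative estimate: treat the contributing probabilities as independent."""
    return avail * hw_delivery_confidence

a = availability(mtbf_hours=2000.0, mttr_hours=100.0)   # ~0.952
print(f"availability = {a:.3f}, launch readiness ~ {launch_readiness(a, 0.95):.3f}")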

Successful determination of AD2 in this process requires that personnel with unique skills be added to the team that performed the initial TRL assessment. The TRL assessment required knowledge of what had and had not been done to date; it is imperative that those individuals know what has happened in the field, but they do not necessarily need to be experts. In order for the AD2 assessment to have any validity, however, experts must perform the assessment. It is for all intents and purposes a predictive undertaking, and since crystal balls are in short supply, the only rational approach is to employ individuals who have the background and experience necessary to provide realistic assessments and projections. It is rare that any one individual (or even two individuals) will possess the requisite knowledge to cover all areas, and so expertise must be drawn from throughout the community.

Occasionally the required expertise does not exist at all, meaning that diverse groups will have to be put together to form a collective opinion. An example of this might involve a laser in which the laser researchers have no experience in space qualification and the personnel with space qualification experience have no experience with lasers. The only solution to this is to get the two groups together for an extended period of time, long enough that they can each understand the issues as a collective group. Establishing the proper assessment group is one of the most challenging activities that a program/project must face and one most critical to success.


As in the case of the TRL Calculator, an automated version has been developed and integrated into the aforementioned AFRL/NASA Calculator and is available upon request.12 The assessment question page is shown in Figure 14. As was mentioned previously, the calculator does not use an algorithm but rather provides the information as a function of the risk associated with each WBS element. An example is shown in Figure 15.

Figure 14. AD2 Evaluation Page

[Figure 14 content: the AD2 evaluation page captures the project, title, evaluator, evaluation date and WBS product hierarchy (system/subsystem, subsystem/component), then lists the AD2 criteria questions, to be answered only where they apply, with technical degree of difficulty, schedule, cost and comment entries for each. Representative Design and Analysis questions include: Do the necessary data bases, design methods, design tools, analytical methods, analysis tools and sufficiently accurate models exist, and if not, what level of development is required to produce them? Do the available personnel have the appropriate skills, and if not, what level of development is required to acquire them? Has the design been optimized for manufacturability, and if not, what level of development is required to optimize it? N.B. The "Title" field is used to identify saved data and should name the element under evaluation; the additional hierarchy level can be used to provide more depth to the assessment.]

[Figure 15 content: an AD2 roll-up of subsystem drivers for an example turbopump, listing WBS elements (inducer, impeller, pump housing, volute, diffuser, turbine blades, turbine nozzles, turbine housing, manifolds, guide vanes, dynamic seals, bearings/rotor, axial thrust balance, pressure control and bleed valves) together with their problem areas, schedule and cost impacts, and required technology development, rolled up to a 60% development risk for the subsystem.]

Figure 15. AD2 Roll-up Example


Objective Independent Assessments - The Independent Advisory Panel

It is expected that the assessment of both the TRL and the AD2 will be accomplished from within the program/project, with program/project personnel augmented as necessary when critical skills are absent. In order to validate this assessment, it is important to establish an Independent Advisory Panel (IAP) to periodically review the assessment process. There are many pitfalls in maturing technology, and even the most experienced program/project manager with the best team in the world will benefit from the advice of an independent advisory panel. An outside group of experts can provide insight into the direction and progress in a positive, constructive and "non-threatening" manner. This insight will help keep the program on track and maximize the probability of success. The advice from this panel will be of extreme importance throughout the duration of the program; however, its greatest impact will be in the evaluation of the initial set of goals/requirements and the results of the Assessment Teams that will identify the key technologies, establish their readiness levels and determine the degree of difficulty involved in advancing them to the requisite readiness levels. These results form the basis for success of the entire program/project. They are used to build roadmaps, establish priorities, estimate costs, and create development paths, milestones and schedules. It is incredibly important that these assessments are part of the base-lining process. Once the initial reviews have been conducted and the results assimilated, the IAP should be called upon to periodically assess results at critical points throughout the life of the program/project through PDR.

The makeup of the advisory panel is very important, and considerable time and effort should be expended in making sure that the proper mix of expertise is included. The panel should be comprised of very senior people who have no vested interest in the program/project. The panel should have as broad a range of experience as possible and contain genuine experts in all the critical technology areas. As a minimum the panel should include a senior manager with experience in managing technology programs/projects, a senior manager with experience in managing flight hardware programs/projects, a senior technologist, a senior systems engineer and a senior operations engineer. Depending upon the program/project, the panel should also include a senior member from the Department of Defense or any other agency that has similar activity underway. It should also include discipline experts drawn from the academic and industrial communities.

The IAP lead must be highly respected in the community with a broad range of experience and, most importantly, the ability to devote the necessary time to the program, particularly in the early stages. The role of the lead is to provide straight, unvarnished, objective advice relative to the program/project. Consequently, it is important that a lead be chosen who is capable of such interaction.

Establishing Milestones and Technical Performance Measures (TPMs) relative to technology maturation

The purpose of establishing milestones is to enable progress to be tracked, to anticipate problems as far in advance as possible, and to make timely decisions relative to alternative paths and fall-back positions. They also allow external entities to track progress, and therein is the problem. It is extremely important in the technology maturation process that these milestones have quantifiable metrics associated with them. These time-phased metrics are known as Technical Performance Measures (TPMs). There is a natural tendency to call out milestones and define metrics that are easily met in order to "protect" the activity from unwanted outside influence. Unfortunately, there is a severe downside to this approach: milestones and metrics that do not provide any insight to outside micromanagers also do not provide any insight to the program/project manager, which is not good for the program/project. The only true alternative is to take the time to establish metrics and milestones that will be of value in assuring that the technological maturation required by the program/project occurs in a timely and effective manner, and to deal with unwanted advice when it comes.

Establishing good, quantifiable metrics/milestones is an art that requires in-depth knowledge of what is being undertaken, its current state and where it is supposed to be going; in other words, the information provided by the TRL and AD2 assessments. It is extremely important to emphasize the word quantifiable! If you can't measure it, you can't make it! The heart of progress measurement is testing. Once it is accepted that "testing" is the underpinning of the way to measure progress, the definition of appropriate metrics becomes somewhat more tractable. One of the most difficult areas in which to track progress today is software development, whether in a flight program or a technology program. Lines of code do not tell you what "functions" are being developed. Neither does the number of "builds," since certain functions are often deferred to the next build. What is quantifiable is the function a given set of software can perform; in other words, what would be the result if it were tested? Difficult problems must be broken down into manageable sized "chunks" that have (1) reasonably well understood technical goals, (2) short enough development times that success or failure can have the appropriate impact on the overall effort, and (3) measurable output that demonstrates appropriate incremental progress. As was the case with costs, schedules and milestones, the data generated during the AD2 assessment provides important insight into establishing the appropriate metrics required.
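As a concrete illustration of the quantifiable, test-anchored approach argued above, the sketch below shows one way a project might record TPMs against milestones; the field names and example values are hypothetical, not drawn from this paper.

```python
# Hypothetical TPM bookkeeping sketch; not a NASA-prescribed format.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TPM:
    name: str           # what is measured
    milestone: str      # milestone at which the value must be demonstrated by test
    required: float     # quantifiable target
    units: str
    demonstrated: Optional[float] = None  # value actually measured in test

    def met(self) -> bool:
        # For simplicity this assumes "higher is better"; invert as needed.
        return self.demonstrated is not None and self.demonstrated >= self.required

tpms = [
    TPM("Software functions verified by test", "Build 3 complete", 24, "functions", 26),
    TPM("Demonstrated pump efficiency", "Brassboard test complete", 0.72, "fraction"),
]

for t in tpms:
    status = "met" if t.met() else "not yet demonstrated"
    print(f"{t.name} ({t.milestone}): {status}")
```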

Outputs/Products/Exit Criteria

Technology Development Plan

The Technology Development Plan identifies key technological advances and describes the steps necessary to bring them to a level of maturity (TRL) that will permit them to be successfully integrated into a program/project. It provides the overall direction of the effort. The plan is developed after the completion of the TMA and AD2 assessments, which also provide critical data for program costing, scheduling and implementation planning. The Technology Assessment process identifies through the TMA the maturity of technologies required to be incorporated into the program, and the AD2 assessment establishes the requirements for maturing those technologies. Once these difficult processes have been successfully completed, generation of the plan is simply documentation of the results: a hierarchical collection of roadmaps that starts at the highest system level and follows the breakdown into subsystems and components as established in the TRL assessment. The AD2 assessment is used to determine where parallel approaches should be put in place. Again, in order to maximize the probability of success it will be highly desirable to have multiple approaches to the most difficult activities (to the extent possible within cost constraints). The AD2 assessment also provides considerable insight into what breadboards, engineering models and/or prototypes will be needed, and what type of testing and test facilities will be required. As was mentioned earlier, the AD2 assessment can play a principal role in establishing program costs and schedules, and it is impossible to overemphasize the importance of having an accurate assessment as early as possible. A preliminary version of the Technology Development Plan is required for transition from Pre-Phase A to Phase A. The final version of the plan is required for transition from Phase A to Phase B. A template is included as Appendix E.

Technology Assessment Report

The Technology Assessment Report is prepared at the end of Phase B and presented at the PDR. It serves to document that all systems, subsystems and components have been demonstrated through test and analysis to be at a maturity level at or above TRL 6. The TAR is required for transition from Phase B to Phase C. A template is included as Appendix F.

Bibliography:

1. Mandelbaum, Jay, "Technology Readiness Assessment," AFRL 2007 Technology Maturity Conference, Virginia Beach, Virginia, September 11-13, 2007.

2. Schinasi, Katherine V., Sullivan, Michael, "Findings and Recommendations on Incorporating New Technology into Acquisition Programs," Technology Readiness and Development Seminar, Space System Engineering and Acquisition Excellence Forum, The Aerospace Corporation, April 28, 2005.

3. “Better Management of Technology Development Can Improve Weapon System Outcomes,” GAO Report, GAO/NSIAD-99-162, July 1999.

4. “Using a Knowledge-Based Approach to Improve Weapon Acquisition,” GAO Report, GAO-04-386SP. January 2004.

5. “Capturing Design and Manufacturing Knowledge Early Improves Acquisition Outcomes,” GAO Report, GAO-02-701, July 2002.

6. Wheaton, Marilee, Valerdi, Ricardo, “EIA/ANSI 632 as a Standardized WBS for COSYSMO,” 2005 NASA Cost Analysis Symposium, April 13, 2005.

7. Sadin, Stanley T.; Povinelli, Frederick P.; Rosen, Robert, "NASA technology push towards future space mission systems," Space and Humanity Conference, Bangalore, India, Selected Proceedings of the 39th International Astronautical Federation Congress, Acta Astronautica, Vol. 20, pp. 73-77, 1989.

8. Mankins, John C., "Technology Readiness Levels," A White Paper, April 6, 1995.

9. Nolte, William, "Technology Readiness Level Calculator," Technology Readiness and Development Seminar, Space System Engineering and Acquisition Excellence Forum, The Aerospace Corporation, April 28, 2005.

10. TRL Calculator is available at the Defense Acquisition University Website at the following URL: https://acc.dau.mil/communitybrowser.aspx?id=25811

11. Manufacturing Readiness Level description is found at the Defense Acquisition University Website at the following URL: https://acc.dau.mil/CommunityBrowser.aspx?id=18231

12. For information on the automated NASA/AFRL integrated TRL/AD2 Calculator, contact [email protected].

13. Mankins, John C. , “Research & Development Degree of Difficulty (RD3)” A White Paper, March 10, 1998.

14. Sauser, Brian J., "Determining System Interoperability Using an Integration Readiness Level," NDIA Conference Proceedings.

15. Sauser, Brian; Ramirez-Marquez, Jose; Verma, Dinesh; Gove, Ryan, "From TRL to SRL: The Concept of Systems Readiness Levels," Paper #126, Conference on Systems Engineering Proceedings.

Additional material of interest

De Meyer, Arnould, Loch, Christoph H., and Pich Michael T., “Managing Project Uncertainty: From Variation to Chaos,” MIT Sloan Management Review, pp. 60-67, Winter 2002.


Appendix A – NASA Technology Readiness Level Descriptions


Technology Readiness Level (TRL) – Definition, Hardware Description, Software Description and Exit Criteria

TRL 1 – Basic principles observed and reported
Hardware: Scientific knowledge generated underpinning hardware technology concepts/applications.
Software: Scientific knowledge generated underpinning basic properties of software architecture and mathematical formulation.
Exit Criteria: Peer-reviewed publication of research underlying the proposed concept/application.

TRL 2 – Technology concept or application formulated
Hardware: Invention begins; practical application is identified but is speculative; no experimental proof or detailed analysis is available to support the conjecture.
Software: Practical application is identified but is speculative; no experimental proof or detailed analysis is available to support the conjecture. Basic properties of algorithms, representations and concepts defined. Basic principles coded. Experiments performed with synthetic data.
Exit Criteria: Documented description of the application/concept that addresses feasibility and benefit.

TRL 3 – Analytical and/or experimental critical function or characteristic proof-of-concept
Hardware: Analytical studies place the technology in an appropriate context, and laboratory demonstrations, modeling and simulation validate analytical prediction.
Software: Development of limited functionality to validate critical properties and predictions using non-integrated software components.
Exit Criteria: Documented analytical/experimental results validating predictions of key parameters.

TRL 4 – Component or breadboard validation in laboratory
Hardware: A low fidelity system/component breadboard is built and operated to demonstrate basic functionality, and critical test environments and associated performance predictions are defined relative to the final operating environment.
Software: Key, functionally critical software components are integrated and functionally validated to establish interoperability and begin architecture development. Relevant environments defined and performance in this environment predicted.
Exit Criteria: Documented test performance demonstrating agreement with analytical predictions. Documented definition of relevant environment.

TRL 5 – Component or breadboard validation in a relevant environment
Hardware: A mid-level fidelity system/component brassboard is built and operated to demonstrate overall performance in a simulated operational environment with realistic support elements, demonstrating overall performance in critical areas. Performance predictions are made for subsequent development phases.
Software: End-to-end software elements implemented and interfaced with existing systems/simulations conforming to the target environment. End-to-end software system tested in a relevant environment, meeting predicted performance. Operational environment performance predicted. Prototype implementations developed.
Exit Criteria: Documented test performance demonstrating agreement with analytical predictions. Documented definition of scaling requirements.

TRL 6 – System/subsystem model or prototype demonstration in a relevant environment
Hardware: A high-fidelity system/component prototype that adequately addresses all critical scaling issues is built and operated in a relevant environment to demonstrate operations under critical environmental conditions.
Software: Prototype implementations of the software demonstrated on full-scale realistic problems. Partially integrated with existing hardware/software systems. Limited documentation available. Engineering feasibility fully demonstrated.
Exit Criteria: Documented test performance demonstrating agreement with analytical predictions.

TRL 7 – System prototype demonstration in space
Hardware: A high fidelity engineering unit that adequately addresses all critical scaling issues is built and operated in a relevant environment to demonstrate performance in the actual operational environment and platform (ground, airborne or space).
Software: Prototype software exists having all key functionality available for demonstration and test. Well integrated with operational hardware/software systems, demonstrating operational feasibility. Most software bugs removed. Limited documentation available.
Exit Criteria: Documented test performance demonstrating agreement with analytical predictions.

TRL 8 – Actual system completed and flight qualified through test and demonstration
Hardware: The final product in its final configuration is successfully demonstrated through test and analysis for its intended operational environment and platform (ground, airborne or space).
Software: All software has been thoroughly debugged and fully integrated with all operational hardware and software systems. All user documentation, training documentation, and maintenance documentation completed. All functionality successfully demonstrated in simulated operational scenarios. V&V completed.
Exit Criteria: Documented test performance verifying analytical predictions.

TRL 9 – Actual system flight proven through successful mission operations
Hardware: The final product is successfully operated in an actual mission.
Software: All software has been thoroughly debugged and fully integrated with all operational hardware/software systems. All documentation has been completed. Sustaining software engineering support is in place. The system has been successfully operated in the operational environment.
Exit Criteria: Documented mission operational results.
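For assessment tooling that needs the scale programmatically, the level names can be captured as a simple lookup; the mapping below merely restates the definitions in the table above and is an illustrative convenience, not part of the formal definition.

```python
# TRL -> definition, restating the NASA TRL table above.
TRL_DEFINITION = {
    1: "Basic principles observed and reported",
    2: "Technology concept or application formulated",
    3: "Analytical and/or experimental critical function or characteristic proof-of-concept",
    4: "Component or breadboard validation in laboratory",
    5: "Component or breadboard validation in a relevant environment",
    6: "System/subsystem model or prototype demonstration in a relevant environment",
    7: "System prototype demonstration in space",
    8: "Actual system completed and flight qualified through test and demonstration",
    9: "Actual system flight proven through successful mission operations",
}
```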


Appendix B – Technology Development Terminology

Proof of Concept: (TRL 3) Analytical and experimental demonstration of hardware/software concepts that may or may not be incorporated into subsequent development and/or operational units.

Breadboard: (TRL 4) A low fidelity unit that demonstrates function only, without respect to form or fit in the case of hardware, or platform in the case of software. It often uses commercial and/or ad hoc components and is not intended to provide definitive information regarding operational performance.

Developmental Model/Developmental Test Model: (TRL 4) Any of a series of units built to evaluate various aspects of form, fit, function or any combination thereof. In general these units may have some high fidelity aspects but overall will be in the breadboard category.

Brassboard: (TRL 5 – TRL 6) A mid-fidelity functional unit that typically tries to make use of as much operational hardware/software as possible and begins to address scaling issues associated with the operational system. It does not have the engineering pedigree in all aspects, but is structured to be able to operate in simulated operational environments in order to assess performance of critical functions.

Mass Model: (TRL 5) Nonfunctional hardware that demonstrates form and/or fit for use in interface testing, handling, and modal anchoring.

Subscale Model: (TRL 5 – TRL 7) Hardware demonstrated in subscale to reduce cost and address critical aspects of the final system. If done at a scale that is adequate to address final system performance issues, it may become the prototype.

Proof Model: (TRL 6) Hardware built for functional validation up to the breaking point, usually associated with fluid system over-pressure, vibration, force loads, environmental extremes, and other mechanical stresses.

Prototype Unit: (TRL 6 – TRL 7) The prototype unit demonstrates form (shape and interfaces), fit (must be at a scale to adequately address critical full-size issues), and function (full performance capability) of the final hardware. It can be considered the first Engineering Model. It does not have the engineering pedigree or data to support its use in environments outside of a controlled laboratory environment, except for instances where a specific environment is required to enable the functional operation, including in-space. It is to the maximum extent possible identical to flight hardware/software and is built to test the manufacturing and testing processes at a scale that is appropriate to address critical full-scale issues.

Engineering Model: (TRL 6 – TRL 8) A full-scale, high-fidelity unit that demonstrates critical aspects of the engineering processes involved in the development of the operational unit. It demonstrates function, form, fit or any combination thereof at a scale that is deemed to be representative of the final product operating in its operational environment. Engineering test units are intended to closely resemble the final product (hardware/software) to the maximum extent possible and are built and tested so as to establish confidence that the design will function in the expected environments. In some cases, the engineering unit will become the protoflight or final product, assuming proper traceability has been exercised over the components and hardware handling.

Flight Qualification Unit: (TRL 8) Flight hardware that is tested to the levels that demonstrate the desired margins, particularly for exposing fatigue stress, typically 20-30%. Sometimes this means testing to failure. This unit is never flown. Key overtest levels are usually +6 dB above maximum expected for 3 minutes in all axes for shock, acoustic, and vibration; thermal vacuum 10°C beyond acceptance for 6 cycles; and 1.25 times static load for unmanned flight.

Protoflight Unit: (TRL 8 – TRL 9) Hardware built for the flight mission that includes the lessons learned from the Engineering Model but where no qualification model was built, in order to reduce cost. It is, however, tested to enhanced environmental acceptance levels. It becomes the mission flight article. A higher risk tolerance is accepted as a tradeoff. Key protoflight overtest levels are usually +3 dB for shock, vibration, and acoustic; 5°C beyond acceptance levels for thermal vacuum tests.

Flight Qualified Unit: (TRL 8 – TRL 9) Actual flight hardware/software that has been through acceptance testing. Acceptance test levels are designed to demonstrate flight-worthiness and to screen for infant failures without degrading performance. The levels are typically less than anticipated levels.

Flight Proven: (TRL 9) Hardware/software that is identical to hardware/software that has been successfully operated in a space mission.

Environmental Definitions:

Laboratory Environment: An environment that does not address in any manner the environment to be encountered by the system, subsystem or component (hardware or software) during its intended operation. Tests in a laboratory environment are solely for the purpose of demonstrating the underlying principles of technical performance (functions), without respect to the impact of environment.

Relevant Environment: Not all systems, subsystems and/or components need to be operated in the operational environment in order to satisfactorily address performance margin requirements. Consequently, the relevant environment is the specific subset of the operational environment that is required to demonstrate critical "at risk" aspects of the final product performance in an operational environment.

Operational Environment: The environment in which the final product will be operated. In the case of spaceflight hardware/software, it is space. In the case of ground-based or airborne systems that are not directed toward space flight, it will be the environments defined by the scope of operations. For software, the environment will be defined by the operational platform and software operating system.

Additional Definitions:

Mission Configuration: The final architecture/system design of the product that will be used in the operational environment. If the product is a subsystem/component, then it is embedded in the actual system in the actual configuration used in operation.

Verification – Demonstration by test that a device meets its functional and environmental requirements (i.e., did I build the thing right?).

Validation – Determination that a device was built in accordance with the totality of its prescribed requirements by any appropriate method. Commonly uses a verification matrix of requirements and methods of verification (i.e., did I build the right thing?).

Part – A single piece, or joined pieces impaired or destroyed if disassembled – e.g., a resistor.


Subassembly or Component – Two or more parts capable of disassembly or replacement – e.g., a populated printed circuit board.

Assembly or Unit – A complete and separate lowest-level functional item – e.g., a valve.

Subsystem – An assembly of functionally related and interconnected units – e.g., the electrical power subsystem.

System – The composite equipment, methods, and facilities required to perform an operational role.

Segment – The constellation of systems, segments, software, ground support, and other attributes required for an integrated constellation of systems.


Appendix C – Advancement Degree of Difficulty Levels

Degree of Difficulty – Description

9 – 100% Development Risk: Requires new development outside of any existing experience base. No viable approaches exist that can be pursued with any degree of confidence. Basic research in key areas needed before feasible approaches can be defined.

8 – 80% Development Risk: Requires new development where similarity to the existing experience base can be defined only in the broadest sense. Multiple development routes must be pursued.

7 – 60% Development Risk: Requires new development, but similarity to existing experience is sufficient to warrant comparison in only a subset of critical areas. Multiple development routes must be pursued.

6 – 50% Development Risk: Requires new development, but similarity to existing experience is sufficient to warrant comparison in only a subset of critical areas. Dual development approaches should be pursued in order to achieve a moderate degree of confidence for success. (Desired performance can be achieved in subsequent block upgrades with a high degree of confidence.)

5 – 40% Development Risk: Requires new development, but similarity to existing experience is sufficient to warrant comparison in all critical areas. Dual development approaches should be pursued to provide a high degree of confidence for success.

4 – 30% Development Risk: Requires new development, but similarity to existing experience is sufficient to warrant comparison across the board. A single development approach can be taken with a high degree of confidence for success.

3 – 20% Development Risk: Requires new development well within the experience base. A single development approach is adequate.

2 – 10% Development Risk: Exists but requires major modifications. A single development approach is adequate.

1 – 0% Development Risk: Exists with no or only minor modifications being required. A single development approach is adequate.
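For use in assessment tooling, the scale above can be captured as a simple lookup; the dictionary below merely restates the table and is an illustrative convenience, not part of the formal definition.

```python
# AD2 level -> nominal development risk, restating the Appendix C table.
AD2_DEVELOPMENT_RISK = {
    9: 1.00,  # new development outside any existing experience base
    8: 0.80,  # similarity only in the broadest sense; multiple routes
    7: 0.60,  # comparison in only a subset of critical areas; multiple routes
    6: 0.50,  # subset of critical areas; dual approaches, moderate confidence
    5: 0.40,  # comparison in all critical areas; dual approaches, high confidence
    4: 0.30,  # comparison across the board; single approach, high confidence
    3: 0.20,  # well within the experience base
    2: 0.10,  # exists but requires major modifications
    1: 0.00,  # exists with no or only minor modifications
}
```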


Appendix D: AD 2 Weighting Algorithms

Note: reducing the AD2 process to a single number should be viewed very cautiously in that it can mask critical single point failures. The important information is gained in the process and cannot be adequately described in a single number.

If any category of any system, subsystem or component is at an AD2 Level 5 or above, then the system, subsystem and component are all at that level.

If all of the categories are at or below AD2 Level 4, then:

Composite AD2 = Σ(category levels) / (number of categories)

An alternative algorithm would be as follows, where the AD2 level for an element is set at the highest level plus an additional factor that accounts for multiple categories that have high levels:

Element AD2 = highest category level + [(number of categories at Level 9 × 10) + (number of categories at Level 8 × 9) + …] / [(number of categories) × 10]

For the subsystem, the overall subsystem risk level would be calculated in a similar manner:

Subsystem AD2 = highest element overall value + Σ(element overall scores) / [(number of elements) × 10]

And finally for the system, the overall system risk level is calculated on the basis of the subsystem values:

System AD2 = highest subsystem overall value + Σ(subsystem overall scores) / [(number of subsystems) × 10]

The color code associated with each row is a somewhat subjective assessment of the concern associated with accomplishing the tasks in that row and serves to provide a visual cue relative to the degree of difficulty:

Red: High Risk (AD2 7-9)
Yellow: Moderate Risk (AD2 5-6)
Green: Low Risk (AD2 1-4)
White: Not considered
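A minimal sketch of how these roll-ups might be implemented in assessment tooling is shown below. It assumes the 1-9 scale of Appendix C (9 = 100% development risk); the function names, the continuation of the weighting series beyond the Level 9 and Level 8 terms, and the choice of Python are illustrative assumptions rather than part of this paper.

```python
# Sketch of the Appendix D AD2 roll-up rules; names and structure are illustrative.

def element_ad2(category_levels):
    """Roll up one element from its category scores (Appendix C scale, 1-9)."""
    if max(category_levels) >= 5:
        # Any category at Level 5 or above drives the whole element to that level.
        return max(category_levels)
    # Otherwise the composite is the average of the category levels.
    return sum(category_levels) / len(category_levels)

def element_ad2_alternative(category_levels):
    """Alternative algorithm: highest category level plus a factor for multiple
    high-level categories.  The paper shows Level 9 weighted by 10 and Level 8 by 9;
    continuing the weights as (level + 1) down through Level 5 is an assumption here."""
    n = len(category_levels)
    factor = sum(category_levels.count(lvl) * (lvl + 1) for lvl in range(9, 4, -1))
    return max(category_levels) + factor / (n * 10)

def rollup(child_values):
    """Subsystem (from elements) or system (from subsystems) risk level:
    highest child value plus the normalized sum of the child values."""
    return max(child_values) + sum(child_values) / (len(child_values) * 10)

def concern_color(level):
    """Color cue used in the tables: red 7-9, yellow 5-6, green 1-4."""
    if level >= 7:
        return "Red (High Risk)"
    if level >= 5:
        return "Yellow (Moderate Risk)"
    return "Green (Low Risk)"

# Example: an element with categories scored 3, 4, 7 and 2 is driven to AD2 7 (red);
# an all-low element scored 2, 3, 3 and 2 averages to 2.5 (green).
print(element_ad2([3, 4, 7, 2]), concern_color(element_ad2([3, 4, 7, 2])))
print(element_ad2([2, 3, 3, 2]), concern_color(element_ad2([2, 3, 3, 2])))
```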


Appendix E: Technology Development Plan Template

Technology Development Plan Template
TBD Project

Technology Development Plan


Table of Contents

1 Introduction
1.1 Purpose and Scope
1.2 Relation to Other Plans
1.3 Approach to Updates
1.4 Document Overview

2 Project Requirements
2.1 Mandatory Outcomes
2.2 Specified Methods

3 Criteria and Definitions
3.1 Technology Readiness Levels
3.2 Advancement Degree of Difficulty
3.3 Associated Definitions

4 Roles and Responsibilities
4.1 Planning and Execution
4.2 Oversight and Approval

5 Implementation Approach
5.1 Identify and Classify Technologies
5.2 Develop and Integrate Planning
5.3 Track and Status Activities
5.4 Document and Report Achievements

6 Plan Support Resources
6.1 Templates and Schema
6.2 Methods and Techniques
6.3 Facilities and Equipment

7 Integrated Plan & Schedule
7.1 Critical Technologies List
7.2 Integrated Schedule
7.3 Current Issues
7.4 Technology Roadmaps

Appendices
Appendix A Acronyms
Appendix B Template 1


Introduction

Introduce the plan and provide the context for understanding its intent.

Purpose and Scope

Provide the purpose of the Technology Development (TD) Plan. State why the plan was written and what is expected from its execution.

Provide the scope of the TD Plan, stating the programs, organizations, processes, systems, and timeframes to which the plan applies.

Provide the philosophy of the TD Plan, stating the rationale and derivation of the approach and the key tenets of the process.

Relation to Other Plans

Discuss how this TD plan, and the processes described within, interacts with other Project plans and processes. Discuss how they complement each other and don’t duplicate effort or responsibilities.

In particular, discuss the relationship with the Architecture Definition process, as these two processes are performed in parallel and highly interwoven.

In addition, discuss the interaction with the Risk Management plan, as technology development planning is primarily a form of risk mitigation for the program.

Also discuss the relation to the Project Test Plan, as much of the validation of achieving readiness levels is via test.

Approach to Updates

Describe how and when the TD Plan gets updated. Name the person responsible for updating the document. Identify the event triggers that initiate an update. For a given revision, describe the major changes since the last release.

Document Overview

Provide a short outline of the information contained in the plan and how it is arranged.


Project Requirements

Describe the programmatic and technical requirements for technology development activities levied on the Program.

Mandatory Outcomes

State the required outcomes of the TD plan. They can include specific goals & objectives, products & data, and decisions & status, with associated timeliness & quality conditions.

Include the Enterprise level requirements that apply to all Programs, and any additional Program specific requirements. Identify the timeframes when they are to be met.

Common outcomes include:

achievement of a system level TRL by a given milestone (like TRL 6 by PDR)

production of specific TRL planning and validation documents (like Technology Assessment Report)

stipulation of exit criteria and fall back approaches for each relevant technology

Specified Methods

Identify any methods that must be utilized when implementing technology development for the Program. These can be levied from the enterprise level or the program level.

Common specified methods include:

coverage approaches (like product breakdown structure)

system architecture assessment methods (like RoSA)

technology classification methods (like NASA TRL)

difficulty of achievement determination methods (like AD2)

how to roll up TRLs and AD2s to form composite values

how to handle heritage hardware in new configurations and environments

having certain aspects performed by independent body

stipulated status reporting schedules;


Criteria and Definitions

Outline the frameworks and definitions used to formulate evaluation and achievement criteria for each technology and the overall system. Discuss the sanctioning of the frameworks. State who developed them and who approved them. Discuss what process improvement provisions are in place.

Technology Readiness Levels

Outline the Technology Readiness Level (TRL) framework as agreed to by the program and other stakeholders. Provide figures and tables as appropriate.

Define the discrete levels for each in terms of architecture and environment fidelity and associated criteria. Include as appropriate definitions in terms of form, fit, and function.

If appropriate and distinct, outline the TRL framework for software. Define the levels in terms of architecture and integration fidelity. (Software technology is mainly concerned with the maturity of complex algorithms and functional flows).

Define quantifiable exit criteria for each level (documentation of achievement, such as demonstration of agreement between prediction and test, under stipulated conditions).

Advancement Degree of Difficulty

Outline the Advancement Degree of Difficulty (AD2) framework applicable to the program. Provide figures and tables as appropriate.

Identify the development stages (design, manufacturing, test, operation) and associated resource areas to be addressed in determining the AD2 for each technology.

If appropriate, outline the Manufacturing Readiness Levels (MRL) and Integration Readiness Levels (IRL) as part of an AD2 approach.

Describe the scaling and weighting conventions for scoring the resource areas, and the approach to combining results to determine the composite AD2 for each technology.

Associated Definitions

Define the terms used to characterize the readiness levels and degrees of difficulty. It is common for the levels and degrees to have been defined in the previous sections.

Name and define the discrete architecture fidelity categories. Common items include: breadboard, brassboard, prototype, flight, etc.

Name and define the discrete environment fidelity categories. Common items include: laboratory, relevant, operational, etc.

Name and define the discrete software integration fidelity categories. Common items include: non-integrated, basic components, existing system, operational system, etc.

Provide other secondary definitions as required. Common examples include: architecture, environment, function, capability, technology.


Roles and Responsibilities

Describe the duties of the individuals, teams, and organizations associated with the implementation of the TD Plan.

Planning and Execution

Identify the individuals, teams, and/or organizations responsible for:

Identification and classification of critical technologies

Individual technology and system level integrated planning (roadmap development)

Execution of plan activities, including modeling, analysis, manufacture, testing, progress tracking and status reporting

Documentation of the individual technology and final assessment reports.

Include as appropriate the roles of researchers, developers, users, and contractors.

Oversight and Approval

Identify the responsibilities for TD plan related decision making, including:

Approval of technologies being identified and classified as critical (making the list)

Approval of individual activity plans and the overall program integrated TD plan and schedule

Approval of individual assessment reports and the overall integrated technology assessment report (documentation of meeting the TD requirements)

Approval of changes, as required, to the TRL and AD2 definition frameworks.

Include as appropriate the roles of program managers and independent advisory panels.


Implementation Approach

Describe the general activities to be conducted to meet the stated TD requirements.

Identify and Classify Technologies

Discuss the general approach for identifying critical technologies.

Define how the system will be surveyed to reveal technologies for classification. Include how the approach ensures completeness of the assessment.

Describe the method for assessing technologies to identify those falling below the minimum criteria threshold. (Reference the criteria provided in Section 4 above).

Discuss the considerations for prioritizing the Critical Technologies List (CTL). Include how heritage elements will be assessed. Include how software will be addressed.

Develop and Integrate Planning

Discuss the general approach for developing an integrated plan.

Describe the steps for formulating, documenting and approving a roadmap for each critical technology. (Reference an individual plan template in Section 6 below).

Discuss how the collection of roadmaps is consolidated into an integrated plan and schedule. (Reference an individual plan template in Section 6 below.)

Discuss how initial TD plans are modified to account for interim findings, external influences, etc. Identify the triggers, the re-planning steps, and the decision makers.

Track and Status Activities

Discuss the general approach for managing TD activities.

Describe the method of tracking technology development activities. State who the secretariat is and how it is updated. (Reference a database schema in Section 6 below.)

Describe the method of assessing issues: what are the thresholds, and who decides what action to take. Discuss how TD issues convert to project risks.

Describe the method of reporting status: how often, what forums, what format. Include how Technical Performance Measures (TPMs) will be used to indicate progress.

Document and Report Achievements

Discuss the general approach for providing TD deliverables.

Describe the method of capturing results, recommendations and certification for each individual critical technology. (Reference a report template in Section 6 below).

Describe the method for consolidating information into a system-level technology readiness assessment final report. This document contains the validation that all TD requirements have been met. (Reference a template in Section 6 below.)


Plan Support Resources

Describe the tools and resources available for managing and conducting the technology development activities.

Templates and Schema

Describe any templates or schema that technologists should utilize in documenting and reporting TD activities. If applicable, reference the relevant forms in the Appendix.

Define the minimum format and content for an individual technology roadmap. Common elements of a roadmap include: TRL and AD2 values and rationale, steps (analysis and test), responsibilities, cost, schedule, risks, alternative paths, decision gates, off-ramps, fallback positions, and specific exit criteria.

Define the minimum schema for the critical technology database needed to ensure adequate tracking, reporting and assessment. Also, define the minimum format and content for status charts.
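As a sketch of what such a schema might contain, assuming the fields called out in this plan (TRL, AD2, roadmap steps, exit criteria, fallbacks), a minimal record could look like the following; the record layout and names are hypothetical, not an Agency standard.

```python
# Hypothetical minimal record for a critical-technology database.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CriticalTechnologyRecord:
    name: str
    wbs: str                        # WBS / product breakdown mapping
    owner: str
    trl: int                        # current TRL (1-9)
    ad2: int                        # advancement degree of difficulty (1-9)
    rationale: str                  # basis for the TRL/AD2 scores
    exit_criteria: List[str] = field(default_factory=list)   # quantifiable, test-based
    milestones: List[str] = field(default_factory=list)      # decision gates, off-ramps
    fallback: Optional[str] = None  # fallback position if development stalls
    status: str = "open"            # e.g. open / on track / at risk / closed

example = CriticalTechnologyRecord(
    name="Example: fatigue-tolerant turbine blade",
    wbs="1.4.0", owner="Turbomachinery team", trl=4, ad2=7,
    rationale="Breadboard tested in laboratory; no relevant-environment data yet",
    exit_criteria=["Blade survives required cycles at design stress in a relevant environment"],
)
```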

Define the minimum format and content for an individual technology assessment report needed to ensure the correct information is archived. Describe the planned content and format for documenting the final system level technology assessment report.

Methods and Techniques

Describe any methods, procedures, processes or techniques to be used in identifying, classifying, or assessing critical technologies.

This provides an overflow from the previous section to provide more detail as required to ensure consistency in the classification of technologies, scoring of their related TRLs and determination of their AD2s.

Facilities and Equipment

Describe any stipulated or available infrastructure for use in managing and conducting the technology development activities.

Identify any specified equipment, including computer applications and other data management tools that may be utilized.

Identify any specified facilities, including simulations, test support equipment or other items that may be utilized.


Integrated Plan & Schedule

Provide the current critical technologies list and integrated schedule, indicating the tasks required and decisions made to meet the stipulated TD requirements.

Note: the information in this chapter can be handled in a variety of ways with respect to the content and organization of this document.

The content can be included or not included:

Treat this plan as a living document and update it on a regular basis. With each update, provide the latest baseline of the CTL, the integrated schedule and the critical issues list.

Or, maintain the application status of this plan (CTL, schedule and issues) in a separate database. Possibly include a list of potential critical items in the appendix when the plan is first released as a starting point. The CTL will be formally recorded in the final report.

If included, the content can be placed in one of several locations:

Leave it in this location, providing a quasi-chronological flow to the document.

Move it to the Appendices to accommodate easier updating of the document.

Move it forward in the document to emphasize its importance.

Critical Technologies List

Provide the table of identified critical technologies. Include technology, category, WBS mapping, owner, current TRL, status, trend, the difficulty factor, etc. Provide references to the associated plans and reports.

Integrated Schedule

Provide an integrated schedule of the TD activities related to the CTL. Include milestones, activities and discrete events. Include major tasks, including testing. Provide references to more detailed schedules for each technology.

Current Issues

Provide the list of current TD issues and risks that are being worked. Indicate issue, status, trend, owner, and a summary of the action plan.

Technology Roadmaps

Include a copy, or reference a document, of each individual technology roadmap. If included, suggest placing roadmaps in the Appendix and pointing to them from here.


Appendix F: Technology Readiness Assessment Report


Technology Maturity Assessment Report Template

TBD Project
Technology Readiness Assessment Report


Table of Contents

1 Introduction
1.1 Purpose and Scope
1.2 Relation to Other Reports, Plans
1.3 Document Overview

2 Assessment Approach
2.1 Mandatory Outcomes
2.2 Specified Methods

3 Criteria and Definitions
3.1 Technology Readiness Levels
3.2 Associated Definitions

4 Roles and Responsibilities
4.1 Planning and Execution
4.2 Oversight and Approval

5 Assessment Results
5.1 Coverage
5.2 Maturity

Appendices
Appendix A Acronyms


Introduction

Introduce the report and provide the context for understanding its intent.

Purpose and Scope

Provide the purpose of the Technology Readiness Assessment Report (TRAR). State why the report was written and what is expected from its execution. (In essence, this fulfills a requirement of NPR 7120.5D for KDP C, the transition from Phase B to Phase C/D.)

Provide the scope of the TRAR, stating the project, organizations, processes, systems, and system elements to which the report applies.

Provide the philosophy associated with the choice of assessment approach stating the rationale and derivation of assessment processes along with key tenets of those processes.

Relation to Other Reports, Plans and Processes

Discuss how this TRAR, and the processes described within, interacts with other Project plans and processes.

In particular, discuss the relationship with the Technology Development Plan in that this plan together with the Project Test Plan provides the basis for the report.

In addition, discuss the interaction with the Risk Management plan, as the TRAR is intended to document that all risks associated with the technological maturity of the systems, subsystems and components have been appropriately mitigated.

Document Overview

Provide a short summary of the information contained in the report and describe how it is arranged.


Assessment Approach

Describe the assessment approach used in the preparation of the report.

Mandatory Outcomes

The Technology Readiness Assessment Report (TRAR) is required to document that all systems, subsystems and components have achieved a level of technological maturity with demonstrated evidence of qualification in a relevant environment.

The TRAR is required to be delivered at PDR.

Specified Methods

Identify and describe the methods and tools utilized in implementing the technology readiness assessment used in the preparation of the report. The assessment must be made against the Technology Readiness Levels identified in Chapter 3.

Common specified methods include:

coverage approaches (e.g. WBS product breakdown structure)

assessment approaches/tools (e.g. TRL Calculator)

maturity classification (e.g. TRL Scale)

how to roll up TRLs from component to system level to form composite values (one possible convention is sketched after this list)

how to handle heritage hardware in new configurations and environments

having certain aspects performed by an independent body

stipulated status reporting schedules.
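Whichever rollup convention is chosen, it should be stated explicitly rather than left implicit. The sketch below illustrates one common convention, under the assumption that a parent element's composite TRL is taken as the minimum of its children's TRLs across the WBS product breakdown; the element names and the "minimum of children" rule are illustrative assumptions, not requirements of this template.

```python
# Illustrative sketch only: one possible "minimum of children" rollup
# convention for composite TRLs across a WBS product breakdown.
# The element names and structure below are hypothetical examples.

from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class WbsElement:
    name: str
    trl: Optional[int] = None            # assessed TRL for leaf elements
    children: List["WbsElement"] = field(default_factory=list)

    def rolled_up_trl(self) -> int:
        """Return the composite TRL: the assessed value for a leaf element,
        or the minimum of the children's composite TRLs for a parent."""
        if not self.children:
            if self.trl is None:
                raise ValueError(f"{self.name}: leaf element has no assessed TRL")
            return self.trl
        return min(child.rolled_up_trl() for child in self.children)


# Hypothetical example: a subsystem inherits the TRL of its least mature element.
avionics = WbsElement("Avionics", children=[
    WbsElement("Flight computer (heritage, new environment)", trl=5),
    WbsElement("Star tracker", trl=6),
    WbsElement("New navigation algorithm", trl=4),
])
print(avionics.rolled_up_trl())  # -> 4
```

Under this convention, a single immature component drives the composite value, which is consistent with treating the least mature element as the pacing item for cost, schedule, and risk.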


Criteria and Definitions

Outline the frameworks and definitions used to formulate evaluation and achievement criteria for each component, subsystem, and system. Discuss the sanctioning of the frameworks. State who developed them and who approved them. Discuss what process improvement provisions are in place.

Technology Readiness Levels

Outline the Technology Readiness Level (TRL) framework as specified by the Agency. Provide figures and tables as appropriate.

Define each discrete level in terms of architecture and environment fidelity and the associated criteria. Include, as appropriate, definitions in terms of form, fit, and function.

If appropriate and distinct, outline the TRL framework for software. Define the levels in terms of architecture and integration fidelity. (Software technology is mainly concerned with the maturity of complex algorithms and functional flows).

Define quantifiable exit criteria for each level (documentation of achievement, such as demonstration of agreement between prediction and test, under stipulated conditions).

Assess the documented results to verify maturity as specified by the exit criteria.
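Where it helps, exit criteria of this kind can be expressed so that achievement is directly checkable against documented evidence. The sketch below is a hypothetical illustration of one such criterion: agreement between predicted and measured performance within a stipulated tolerance under stated conditions. The parameter, values, and tolerance are invented for illustration and are not part of this template.

```python
# Illustrative sketch of recording and checking a quantifiable exit criterion:
# agreement between prediction and test within a stipulated tolerance.
# Parameter names, values, and tolerances are hypothetical examples.

from dataclasses import dataclass


@dataclass
class ExitCriterion:
    parameter: str          # quantity being compared
    predicted: float        # model/analysis prediction
    measured: float         # test result
    tolerance: float        # stipulated allowable relative error
    environment: str        # conditions under which the test was run

    def satisfied(self) -> bool:
        """True if the measured value agrees with the prediction
        to within the stipulated relative tolerance."""
        return abs(self.measured - self.predicted) <= self.tolerance * abs(self.predicted)


criterion = ExitCriterion(
    parameter="thrust (N)",
    predicted=1200.0,
    measured=1165.0,
    tolerance=0.05,                     # 5% stipulated agreement
    environment="thermal-vacuum, qualification levels",
)
print(criterion.satisfied())  # -> True (measured value is within 5% of prediction)
```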

Associated Definitions

Define the terms used to characterize the readiness levels and degrees of difficulty. It is common for the levels and degrees to have been defined in the previous sections.

Name and define the discrete architecture fidelity categories. Common items include: breadboard, brassboard, prototype, flight, etc.

Name and define the discrete environment fidelity categories. Common items include: laboratory, relevant, operational, etc.

Name and define the discrete software integration fidelity categories. Common items include: non-integrated, basic components, existing system, operational system, etc.

Provide other secondary definitions as required. Common examples include: architecture, environment, function, capability, technology.
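If it helps reviewers apply these definitions consistently, the named categories can be captured as a controlled vocabulary so that every assessment uses the same terms. The sketch below is one hypothetical way to encode the fidelity categories listed above; it carries no definitions beyond those the project itself supplies.

```python
# Illustrative sketch: encoding the fidelity categories named in this section
# as controlled vocabularies so assessments use consistent terms.
# The enumerations simply mirror the common items listed above.

from enum import Enum


class ArchitectureFidelity(Enum):
    BREADBOARD = "breadboard"
    BRASSBOARD = "brassboard"
    PROTOTYPE = "prototype"
    FLIGHT = "flight"


class EnvironmentFidelity(Enum):
    LABORATORY = "laboratory"
    RELEVANT = "relevant"
    OPERATIONAL = "operational"


class SoftwareIntegrationFidelity(Enum):
    NON_INTEGRATED = "non-integrated"
    BASIC_COMPONENTS = "basic components"
    EXISTING_SYSTEM = "existing system"
    OPERATIONAL_SYSTEM = "operational system"


# Hypothetical usage: tag an assessed element with the fidelity actually demonstrated.
demonstrated = (ArchitectureFidelity.BRASSBOARD, EnvironmentFidelity.RELEVANT)
print(demonstrated)
```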


Roles and Responsibilities

Describe the duties of the individuals, teams, and organizations associated with the preparation of the report.

Planning and Execution

Identify the individuals, teams, and/or organizations responsible for:

Identification and classification of critical technologies

Identification of assessment process

Identification of assessment coverage

Evaluation of test and analyses results

Documentation of assessment reports.

Include as appropriate the roles of researchers, developers, users, and contractors.

Oversight and Approval

Identify the responsibilities for TRAR-related decision making, including:

Approval of assessment process

Approval of assessment coverage

Approval of assessment report

Include as appropriate the roles of program managers and independent advisory panels.


Assessment Results

Describe the results of the assessment.

Coverage

Provide a detailed description of what was covered in the assessment (e.g., the WBS product breakdown).

Maturity

Discuss the maturity of each element identified in the description of coverage. Particular attention should be paid to those elements that were brought to maturity by means of executing the Technology Development Plan.

Test data and analyses validating the maturity assessment should be documented (they may be included by reference).


Appendices

Appendix A  Acronyms


Appendices

Appendix A  Acronyms
Appendix B  Template 1
