
Industrial Technologists’ Toolkit For Technical Management (ITTTM) Courseware Applications, Project Portfolio Template

Synchronous Tools (31-36), Regular Critique Assessments and Standard Dedicated Assessments

Regular critique assessments (RCAs) and standard dedicated assessments (SDAs) are part of the ITTTM courseware system. After reading the assigned ITTTM long and short forms, team members do independent research based on the RCAs and SDAs, posting their RCA/SDA work forms to threads in the e-classroom discussion boards. All team members also serve as compilers: using all researcher postings, they collectively present a final portfolio for each tool and at the end of the course. Three RCAs are repeated and evolved for each tool critique and are presented ahead of the SDAs in the portfolio, as follows:

• Project Portfolio Assessment, Research Methodology, Plan (PPARMP);
• Review Of Literature, Documentation Assessment (ROLDA); and
• Portfolio Presentation Management Team Assessment (PPMTA).

This template should be filled in for each assignment, in the order presented, based on individual and team work following the instructions in the forms (note, again, that everyone on the team works independently on each form, and then all work is compiled into one portfolio of forms). Each tool portfolio uses all compiled SDAs to reflect the total team research, enhanced over time to address course outcomes and to produce project deliverables for each objective. As each tool portfolio is completed, phase work is a "rollover" of work already done in the tool portfolios, but enhanced and better focused on the project and course objectives and outcomes. This is further explained in the Gantt charts below, in the steps of each separate form, in the documentation in the ITTTM courseware, and in the student project examples at www.bgsu.edu/colleges/technology/qs.

PROJECT PORTFOLIO ASSESSMENT, RESEARCH METHODOLOGY, PLAN (PPARMP) RCA

General Use/Application (done by team leadership; note that team leaders rotate continuously, just like all other team functions):
Part 1 A, B, C, D. Team leaders assign independent and grand-form team members RCA and SDA compilation responsibilities and due dates.
Part 2 A, B, C, D. Continuously evolve "Project" information and diagrams to help explain and connect all aspects of team work.
Part 3. FACR: the RCA is updated and addressed with each tool/phase based on the FACRs done at specific SDAs.
Part 4. General methodological reflection, based on the total completed portfolio and input from all, to assist everyone in improving.

PART 1. Team leaders prepare threads in the discussion board work areas for this tool, where all work can be posted as assigned. Every person on the team should be shown on a line below and given an assignment corresponding to work posts in the discussion board threads.
Phase: Tool: Date: Compiler(s), Team Leader: Team:

Researcher | 1A. Assigned Forms Due, First Half Posting Cycle | 1B. Researcher Independent Work Reviewed By | 1C. Grand Form Compiled, Second Half Posting Cycle | 1D. Final Compiled Work Reviewed By

Example: Joe Smith | ROLDA | John Doe | 10/8/03 | James Hale

PART 2. This section should be expanded and evolved to address and develop Parts 2 A, B, C, and D over time, becoming several pages in length and helping non-team participants (as well as team participants) understand the project focus and context.
2 A. Project Background: (e.g., number/type of persons in systems, recent changes, history of systems/processes, product information, etc.)
2 B. Project Problem Statement: (e.g., how are systems organized and managed to obtain the full benefit of collaborative communication?)
2 C. Project Objectives: (three to a maximum of six, perhaps provided by the customer or through consultation with others, continuously refined).
2 D. Project Research Methodology: (how data is gathered and analyzed with SDAs/RCAs; teams do a modified Gantt chart like those below).
Part 2 includes three diagrams, as follows, all designed to graphically help describe how the project is being done, and also continuously evolved.

Part 2 Diagram 1: Project Organizational Chart For Team (illustrate how team is organized to do work)

Part 2 Diagram 2: Facility Layout Of Process/System (illustrate the work area used to complete the project)

[Diagram placeholders. Example organizational chart: Project Team Leadership over three team members. Example facility layout: Work center 1, general processing; Work center 2, metal processing; Work center 3, packaging processing; with Workers 1, 2, 3 and Workers 4, 5, 6 placed at the work centers.]

Part 2 Diagram 3: Flow Chart Of Process (how product or service is produced as part of facility layout)

Part 3 A, B, C, and D. Findings, Analysis, Conclusions, Recommendations (FACR) RCA. As each SDA is used and a FACR is done for each, the general results and findings from all persons on the team are brought together as general project findings, analyses, conclusions, and recommendations for future planning and work. All work should be organized around the objectives in the project.

Researcher | 3A. Project objective(s) written | 3B. Findings, analyses observed as data, documentation in SDA | 3C. Conclusions, recommendations in methods, course outcomes | 3D. ROLDA, PPMTA, PPARMP relationships?

Researcher Contribution

Collect From All, Compile

Part 4 A, B, C. General Methodological Reflection. Based on the tool and phase portfolio methodologies shown in the Gantt chart forms below, and all other input from all, what PPARMP issues and opportunities are we seeing, and how can we best address these to improve and move forward? (Note that places for these responses are provided after the Gantt charts.)
Part 4 A, Tool/SDA Project Methodology, Rollout: The tool/SDA methodology is a planned completion of tools and SDAs in the course and project with each tool and phase portfolio. Relationships to FACR and the tool/phase methodologies are updated with each portfolio, coinciding with the rollout of work shown in the course syllabus. The final phase II portfolio presentation has a minimum of 18 tools completed. SDAs 1 and 2 are enhanced on an ongoing basis throughout the course; SDAs 3, 4, and 5 are done as shown below; and other SDAs may be modified per team methodology (e.g., up to two of the SDAs used in columns 3, 4, and 5 can be repeated and "grown," and other SDAs may be used from a different set of tools within the total 42 ITTTM). Are any changes recommended for the rollout at this time, and if yes, what is the basis for the proposed changes?

Tool | SDA 1 | SDA 2 | SDA 3 | SDA 4 | SDA 5 | Other(s)
31 | OPCP | FMEA | DSDC | LVAOACA | ISOQSAOPP |
32 | OPCP | FMEA | GCA | KCA | MAACE |
33 | OPCP | FMEA | SOPATA | A-VISPCS | PSDOE |
34 | OPCP | FMEA | APQPVC | APEIAR | OCA |
35 | OPCP | FMEA | APQPVC | QFD | CEAS |
36 | OPCP | FMEA | GBAPS or GFAPS | LVAOAVA | GSICPC or MTA |
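For teams that want to check or print their own rollout programmatically, the assignments in the table above can be restated as a simple mapping. This is only an illustrative sketch (the dictionary simply repeats the table and is not part of the ITTTM courseware):

```python
# Sketch only: tool-to-SDA assignments from the rollout table above.
TOOL_SDAS = {
    31: ["OPCP", "FMEA", "DSDC", "LVAOACA", "ISOQSAOPP"],
    32: ["OPCP", "FMEA", "GCA", "KCA", "MAACE"],
    33: ["OPCP", "FMEA", "SOPATA", "A-VISPCS", "PSDOE"],
    34: ["OPCP", "FMEA", "APQPVC", "APEIAR", "OCA"],
    35: ["OPCP", "FMEA", "APQPVC", "QFD", "CEAS"],
    36: ["OPCP", "FMEA", "GBAPS or GFAPS", "LVAOAVA", "GSICPC or MTA"],
}

# Print the planned SDA rollout per tool.
for tool, sdas in TOOL_SDAS.items():
    print(f"Tool {tool}: " + ", ".join(sdas))
```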

Part 4 B, Tool Portfolio Methodology, Rollout. Tool portfolio completion in a two-week posting cycle combines the tasks, who should do them, and when to complete them in relation to all other work. Three tool portfolios are done before phase I and three after, all leading to the final phase II compilation (both phases result in a "grand" accumulative portfolio, explained in the next chart). The posting cycle includes all days, and work can be done ahead of time. *Asterisked tasks are first done by the team leader or assistant team leader on a rotating basis, and all coincide with the rollout of work in the syllabus.

Timeline in Days/Weeks/Months (denoted by color in the original), days 01-23; X marks the days on which each task/action/step is active.

Task/Action/Step: Days Active
Faculty prepares tool discussion board area: X
Team leadership prepares PPARMP, assigns work*: X
Team leadership prepares threads in work area*: X
SDAs researched by all, all do all SDAs: X X X
SDAs independently posted at threads by all: X X X X
Non-required team chat about work in motion: X X X
SDAs compiled by team members per PPARMP: X X X X X X
RCAs independently posted at threads by all: X X X X X X
RCAs compiled by team members per PPARMP: X X X X X X
Tool portfolio assembled by team leadership*: X X
Tool portfolio reviewed by all on team: X X
Tool portfolio posted by team leadership*: X
Faculty assessment, post in new discussion board: X X
Required team chat about POAM, next tool: X X

Part 4 C, Phase Portfolio Methodology, Rollout. Phase portfolios at mid-term and final are completed along with all other elements of the course. Three tool portfolios are done before phase I and three after, all leading into the final phase II compilation (the phases result in a "grand" accumulative portfolio). *Startup at the outset of the course leads into the first tool, and all tasks coincide with the rollout of work as shown in the course syllabus.

(Charts and diagrams may be drawn with text boxes and AutoShapes in MS Word; use text to explain symbols.)

Timeline in Days/Weeks/Months (denoted by color in the original), days 01-23; X marks the days on which each task/action/step is active.

Task/Action/Step: Days Active
*Team prepares startup portfolio, start of course: X X
Team completes first tool portfolio: X X
Faculty assesses portfolio per POAM, requires chat: X
Team completes second tool portfolio: X X
Faculty assesses portfolio per POAM, requires chat: X
Team completes third tool portfolio: X X
Faculty assesses portfolio per POAM, requires chat: X
Team enhances all tools, compiled as grand forms: X X
Phase portfolio reviewed by all on team: X
Faculty assesses phase portfolio, POAM chat: X

Part 4 D. Reflections on Methodology, Plan. Based on completion of work and updates in the PPARMP, what is everyone on the team seeing that may be modified methodologically to improve the deliverables in the project?

Researcher | 1. Project focus as noted in abstract, problem statement, objectives, methodology, diagrams, FACRs, others: | 2. Assessment question and reflective thoughts, pros and cons, for use in chats and in other ways, to improve:

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Review Of Literature, Documentation Assessment (ROLDA) RCA

General Use/Application (top part): Read the tool long and short forms from the ITTTM and provide general tool bibliographic information and a 300-500 word abstract of the tool in the appropriate areas below (NOTE: all must do this individually; one person on the team compiles all work posted). 1. Response 1A, give the main technological concept of the tool. 2. Response 2A, explain how the tool added value to the team project. 3. Response 3A, form a question pertaining to the tool for chat discussion.

General Use/Application (bottom part): Identify an article/source related to the tool, and summarize it in 300-500 words (abstract). 4. Response 1B, give the bibliographic source, being careful to cite all details correctly. 5. Response 2B, explain how this article added value to the team project. 6. Response 3B, note the SDA connection to the article (usually one or two, minimum). Phase: Tool: Date: Researcher: Compiler(s): Team: Information Source (Tool reviewed for this toolkit completion) In Bibliographic Form:

Abstract And Synthesis Of Key Information (usually approximately 300-500 words), completed by compiler(s) based on submissions by all on the team as well as their own review of the tool content:

Researcher | 1A. Main Technological Concept: | 2A. Relationship, Value Added, To Project/Technology: | 3A. Assessment Question, Pros And Cons, For Chat:

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Article Abstract Of Key Information (usually approximately 300-500 words; each tool submission requires one for each team member):

Researcher 1B. Bibliographic Source: Author; How To Access; How/Who, Published; Type Source, etc.:

2B. Why Is This Source Relevant; How Does It Add Value (Reflections By Researcher)?

3B. SDA’s Connected:

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Portfolio Presentation Management Team Assessment (PPMTA) RCA

General Use/Application (NOTE: all do this independently, and one person on the team compiles all posted inputs to reflect the team's collective views): Part 1 A, B, C. Analyze/review SDA/RCA compilations at the discussion board threads, open agenda items, and actions needed. Part 2 A, B. Based on all work, do Excel numerical ratings for individuals on your team and for other teams based on portfolio submissions. General Note: Excel tables are activated by double-clicking the table when using a machine with Excel installed. Phase: Tool: Date: Researcher(s): Compiler(s): Team:

Researcher/Compiler | 1A. Technical Project Open Agenda Items, How Done, Content Specific? | 1B. Presentation Improvement Actions, Management/Organization, Process? | 1C. Ongoing Listing, How Issues Were Resolved, Adding Value In PPARMP.

Observation

Collect All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

2A. Analyze/review portfolio management contributions and reflect all in numerical ratings below for persons on your team.

2B. Compare and benchmark the work of all teams numerically below: how all work is managed and organized relative to the Gantt charts above, to what must be accomplished in the PPARMP, and to the overall outcomes desired in the course.

Internal Team Assessment. Work/rating (1-10): 1 = low/bad; 10 = high/good; 0 = no contribution. Ratings are given in twelve categories, followed by a grand total (average) per individual.

Categories: General Communication | Threaded Communications | Chat Participation (non/req.) | SDA/RCA Contributions | Writing Quality In General | Analysis, Reflection In Writing | Format Compliance + Details | Data Analysis, Accuracy | Did More/Less Than Asked | General Timeliness, Delivery | Cooperation And Attitude | Leadership, Prof. Demeanor | Grand Total Per Individual

Member (Team 1; up to 12 members):
Member 1:  1  8  7  5  5  5  5  5  5  5  8  2 | 5.08333
Member 2:  1  5  9  0  9  9  9  9  9  9  9  9 | 7.25
Member 3:  1 10  8  8  2  8  8  8  8  1  8  8 | 6.5
Member 4:  1 10  7  7  7  7  7  7  7  7  7  7 | 6.75
Member 5:  6  2  6  6 10 10  3  6  6  6  6  6 | 6.08333
Member 6: 10  4 10 10 10 10 10 10 10  1  8 10 | 8.58333
Member 7:  4 10  4  4  4  4  1  4  0  4  4  4 | 3.91667
Member 8:  4  5  4  4  4  4  4  4  0  4  4  4 | 3.75
Member 9:  4  7  4  4  4  4  4  4  0  4  4  4 | 3.91667
Member 10: 4  9  4  4  5  4  4  4  0  4  4  4 | 4.16667
Member 11: 4  7  4  4  1  4  4 10  0 10 10  4 | 5.16667
Member 12: 4  9  4  4  4  4  4 10  0 10 10  4 | 5.58333

TOTAL TEAM AVERAGE: 5.5625

External Team Assessment. Work/rating (1-10): 1 = low/bad; 10 = high/good; 0 = no contribution. The same twelve categories are rated for each team, followed by a grand total (average) per team.

Categories: General Communication | Threaded Communications | Chat Participation (non/req.) | SDA/RCA Contributions | Writing Quality In General | Analysis, Reflection In Writing | Format Compliance + Details | Data Analysis, Accuracy | Did More/Less Than Asked | General Timeliness, Delivery | Cooperation And Attitude | Leadership, Prof. Demeanor | Grand Total Per Individual

Team Name:
Team 1: 1  8  7  5  5  5  5  5  5  5  8  2 | 5.08333
Team 2: 1  5  9  0  9  9  9  9  9  9  9  9 | 7.25
Team 3: 1  5  8  8  2  8  8  8  8  1  8  8 | 6.08333
Team 4: 1 10  7  7  7  7  7  7  7  7  7  7 | 6.75
Team 5: 6  2  6  8 10 10  3  6  6  6  6  6 | 6.25

TOTAL TEAM AVERAGE: 6.28333
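The roll-up behind the two tables above is simple arithmetic: each grand total per individual (or per team) is the mean of the twelve category ratings, and the total team average is the mean of those grand totals. A minimal sketch of that roll-up (illustrative only; the course itself uses the embedded Excel tables, and the rows below are just the first two example rows):

```python
# Sketch: compute PPMTA grand totals and the team average from 1-10 ratings.
# Each row is one member's (or one team's) ratings in the twelve categories.
ratings = [
    [1, 8, 7, 5, 5, 5, 5, 5, 5, 5, 8, 2],   # example row from the table above
    [1, 5, 9, 0, 9, 9, 9, 9, 9, 9, 9, 9],   # example row from the table above
]

grand_totals = [sum(row) / len(row) for row in ratings]   # mean of the 12 categories
team_average = sum(grand_totals) / len(grand_totals)      # mean of the grand totals

for i, gt in enumerate(grand_totals, start=1):
    print(f"Member {i} grand total: {gt:.5f}")
print(f"Total team average: {team_average:.4f}")
```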

Continuous Documentation Tools (31-36) SDA-Ongoing Process Control Plan (OPCP)

The continuous SDA is developed over the entire course based on what is learned throughout the course. A separate discussion board work area will be set up where applied research can be conducted, systematically, on the continuous SDAs. Note that it is assigned in the SDA tool/SDA project methodology table provided earlier, and should be assigned to various persons on the team routinely for continuous improvement. At phase review, this SDA should be highlighted and explained based on progress made toward "best practices" and accomplishment of the outcomes of the course as reflected in the POAM from the course syllabus.

General Use/Application: The SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on the team complete the information below and submit it to the compiler for finalizing in the portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and on their own review and reflections of the team's work. This also involves use of the SDA-FACR (below) to determine findings, analysis, conclusions, and recommendations (FACR), completed by compilers for inclusion in the final portfolio when compiled by team leadership. Compiler(s): Researcher(s): Team: Phase: Tool: Date:

Part Name: Process Location: OPCP Form Of Supplier: Revision:
Other Pertinent General Information:

Process Description. Provide a written description of the process in text format and a process flow chart depicting the same:
Tools For Manufacture/Production. Describe equipment required to produce the product, consistent with the flow chart above:
Process Parameters. Describe unique elements involved in processing the product which require additional explanation:
Product Characteristics. Describe all characteristics of the product, primarily as attributes or variables, which are critical:
Class. Describe any codes or other unique identifiers per the specific industry or organization to help further distinguish the product:
Product/Process Specification. Provide specific measurable definitions for each product characteristic identified above:
Evaluation Method. Describe how specifications of characteristics are evaluated and what the measurement systems are:
Sample Size, Frequency of Inspection. Describe the sample and how it is derived, including devices used to inspect, in a step-by-step manner:
Analysis Methods. Describe the methods and systems used to analyze the sampled data, such as statistical, experimental, and so on:
Reaction Program. Describe steps to be taken when the product or process is found to be functioning inappropriately:

SDA-FACR General Use/Application:
o Identify the project objective being addressed (usually at least one or two would be anticipated from the PPARMP).
o Explain findings, analyses observed as various data and documentation in the SDA (ex: how does the form/work relate to ISO 9000?).
o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to the form, other tools to try).
o Address relationships in other forms, particularly the PPARMP, to provide direction in methodology for the project.
o SDA compilers use the SDA FACRs posted at discussion boards, compiled as grand forms for portfolio presentation with all SDAs.
o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, and course outcome progress.

Researcher | 1. Project objective(s) written | 2. Findings, analyses observed as data, documentation in SDA | 3. Conclusions, recommendations in methods, course outcomes | 4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Continuous Documentation Tools (31-36) SDA-Failure Mode And Effects Analysis (FMEA)

The continuous SDA is developed over the entire course based on what is learned throughout the course. A separate discussion board work area will be set up where applied research can be conducted, systematically, on the continuous SDA’s. Note that it is assigned in the SDA tool/SDA project methodology table provided earlier, and should be assigned to various persons on the team routinely for continuous improvement. At phase review, this SDA should be highlighted and explained based on progress made toward “best practices” and accomplishment of outcomes of the course as reflected in the POAM from the course syllabus.

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issue, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR), completed by compilers for inclusion in the final portfolio when compiled by team leadership. Compiler (s): Researcher (s): Team: Phase: Tool: Date:

Part Name/Service: Process/Service Location: FMEA Form Of Supplier: Product, Process or Design FMEA, and Why? Revision:
Other Pertinent General Information, Particularly Oriented To Other Supportive Documentation And Data Attached:

Process Description. Provide a general written text description of the process, and detailed flow chart of same:

Product Description. Provide a general written text description of the product, and detailed step by step functioning of same:

Potential Failure Mode. Describe the general failure which has occurred or may occur, giving appropriate details to help all improve on same:

Key Characteristics. Describe key characteristics in product or process to be addressed to decrease likelihood of failure (rank order according to level of criticality, 1 being low and 10 being high):

Characteristic Ranking 1-10:

Potential Effects of Failure. Describe the general failure effects, particularly focused on implications for damage to surrounding environment (provide a rating of severity where 1 is low and 10 is high):

Severity Rating 1-10:

Potential Causes of Failure. Describe the general failure cause, particularly focused on the process or product design implicated, with sufficient details to enable improvement. Also provide an occurrence of failure rating (1 = low and 10 = high):

Occurrence Failure 1-10:

Current Detection Control. Describe systems or methods of detecting failure which can lead to control of failure, including methods for same. Also provide a rating from 1-10 to indicate the likelihood of detection (where 1 = high and 10 = low):

Detection Control 1-10:

Risk Priority Number (RPN). Provide an RPN based on multiplying severity X occurrence X detection. RPN Calculated:
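Since the RPN is simply the product of the three 1-10 ratings entered above, a minimal sketch of the calculation (illustrative only; the ratings shown are hypothetical):

```python
# Sketch: FMEA Risk Priority Number = severity x occurrence x detection.
def risk_priority_number(severity: int, occurrence: int, detection: int) -> int:
    for rating in (severity, occurrence, detection):
        if not 1 <= rating <= 10:
            raise ValueError("FMEA ratings are expected to be 1-10")
    return severity * occurrence * detection

# Hypothetical ratings for one failure mode:
print(risk_priority_number(severity=7, occurrence=4, detection=5))  # 140
```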

Recommended Action. Describe appropriate actions to be taken in process or product to alleviate failure, including area to take action and when (if multiple actions are needed, add rows):

Responsible Area: When:

SDA-FACR General Use/Application:
o Identify the project objective being addressed (usually at least one or two would be anticipated from the PPARMP).
o Explain findings, analyses observed as various data and documentation in the SDA (ex: how does the form/work relate to ISO 9000?).
o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to the form, other tools to try).
o Address relationships in other forms, particularly the PPARMP, to provide direction in methodology for the project.
o SDA compilers use the SDA FACRs posted at discussion boards, compiled as grand forms for portfolio presentation with all SDAs.
o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, and course outcome progress.

Researcher | 1. Project objective(s) written | 2. Findings, analyses observed as data, documentation in SDA | 3. Conclusions, recommendations in methods, course outcomes | 4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 31 SDA-ISO/QS Audit, Objective Prioritization Plan (ISOQSAOPP)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Compiler (s): Researcher: Team: Phase: Tool: Date:

Organization Under Discussion: Current Operation: Location: General Description Of Operations/Production Functions/Systems Being Assessed As An Internal Audit:

Internal audit table columns (each with its instruction):
ISO/QS Element (TS 16949:2002 or other standard): list each element as a separate category for analysis/audit.
Organization Area/Function Technical Description: describe the technical function being audited.
Actual Audit Findings/Preliminary Information: explain how the area/function compares to the standard.
Actions Required, Recommendations For Change: explain anticipated necessary actions, changes required.
Criticality Level: H, M, or L, where H = 5, M = 3, L = 1.

Planning Statement Or Issue, Particularly As Associated With The Service Environment:

Fac = Facility issues; Cost = Cost issues; Time = Time issues; Pers = Personnel issues.

Criticality (HML) | Objectives Prioritized, 5 Being Highest | Fac | Cost | Time | Pers
5.
4.
3.
2.
1.
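One possible way to derive the prioritized list above is to score each objective across the four issue areas using the H/M/L weights and sort by the total. The sketch below is only an illustration under that assumption (the objective names and ratings are hypothetical), not a prescribed ITTTM method:

```python
# Sketch: prioritize objectives using the HML weights (H=5, M=3, L=1)
# summed across the four issue areas (Fac, Cost, Time, Pers).
HML = {"H": 5, "M": 3, "L": 1}

# Hypothetical objectives with a criticality rating per issue area.
objectives = {
    "Reduce changeover time": {"Fac": "M", "Cost": "H", "Time": "H", "Pers": "L"},
    "Document SOPs for work center 1": {"Fac": "L", "Cost": "M", "Time": "M", "Pers": "H"},
    "Close ISO/QS audit findings": {"Fac": "H", "Cost": "M", "Time": "H", "Pers": "M"},
}

scored = sorted(
    ((sum(HML[r] for r in ratings.values()), name) for name, ratings in objectives.items()),
    reverse=True,
)
for priority, (score, name) in zip(range(5, 0, -1), scored):
    print(f"{priority}. {name} (criticality score {score})")
```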

Problem Background: Resources? (Budget/Other Concerns)?

Task/Deliverable/Objective: Who Will Do? How Measured?

Gantt Chart Showing Time/Task/Deliverable (Goal/Step/Objective) Relationships (Indicate Weeks/Days/Months: X = )
Task/Deliverable | Periods 1-22

1.
2.

SDA-FACR General Use/Application:
o Identify the project objective being addressed (usually at least one or two would be anticipated from the PPARMP).
o Explain findings, analyses observed as various data and documentation in the SDA (ex: how does the form/work relate to ISO 9000?).
o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to the form, other tools to try).
o Address relationships in other forms, particularly the PPARMP, to provide direction in methodology for the project.
o SDA compilers use the SDA FACRs posted at discussion boards, compiled as grand forms for portfolio presentation with all SDAs.
o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, and course outcome progress.

Researcher | 1. Project objective(s) written | 2. Findings, analyses observed as data, documentation in SDA | 3. Conclusions, recommendations in methods, course outcomes | 4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 31 SDA-Documentation System Design, Communication (DSDC)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Compiler (s): Team: Phase: Tool: Date:

Researcher | Communication Method, Internal And External Customers: | Toolkit/Blackboard Design Element, How Does It Work? | How To Improve Professional Relationships, Communication?
1.
1.

Identify How The Systems' Design You Are Advocating Can Assist In Tying Together PPDPOA, TPMSS, FACR, TRIRPA And Other Elements Of Team Problem Solving And Improvement:
Explain ISO/QS Characteristics And Other Necessary Entities To Be Possessed By Documentation And Communication Systems, Categorically, Particularly Focused On Service Functions:
Using The Template At The End Of The Applications, Complete A Cause And Effect Diagram Indicating Issues And Opportunities For Systemic Documentation Relationships/Improvements, Particularly Focused On Service Functions:
Identify And Explain Why Your Organization Is Well Suited To The Documentation And Communication System Being Developed, Particularly Focused On Service Functions:

SDA-FACR General Use/Application:
o Identify the project objective being addressed (usually at least one or two would be anticipated from the PPARMP).
o Explain findings, analyses observed as various data and documentation in the SDA (ex: how does the form/work relate to ISO 9000?).
o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to the form, other tools to try).
o Address relationships in other forms, particularly the PPARMP, to provide direction in methodology for the project.
o SDA compilers use the SDA FACRs posted at discussion boards, compiled as grand forms for portfolio presentation with all SDAs.
o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, and course outcome progress.

Researcher | 1. Project objective(s) written | 2. Findings, analyses observed as data, documentation in SDA | 3. Conclusions, recommendations in methods, course outcomes | 4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 31 SDA-Launch Value Analysis, Open Agenda, Corrective Action (LVAOACA)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Compiler (s): Team: Phase: Tool: Date:

Identified Nature Of Product To Be Supplied, Specific Details:

Explain ISO/QS Characteristics (And Other Necessary Definitions) To Be Possessed By Launch Systems, Categorically:

Explain Potential Opportunities For improvement In Supplier Chain Systems:

Identify And Explain Why Your Organization Is Well Suited To Supply The Product Being Contracted:

Develop A Flow Chart, With Explanation, Of The Overall Launch System:

Explain How Key Documentation, Data Systems (OPCP, SPC, DOE, APQPVC, FMEA, Others) Interface With Launch System (Use Flow Chart Above To Explain):

Perform Value Analysis Based On The Following Notes.
Note 1: Value Rating (VR) is low value = 1; high value = 10.
Note 2: Part Cost (PC) is the actual production cost.
Note 3: % Of Total Cost (TC%) is an estimated % based on the best information available.
Note 4: Value Added Weight (VAW) is calculated by multiplying (VR) (PC) (TC%) to provide a value which can be compared to other parts of the product.

Current Product Cost: Target Production Cost: Current Sales Cost: Target Sales Cost: Other Costs or Functions:

Part/Component | Function/Purpose | Value Rating (VR) (1-10) | Part Cost (PC) | % Of Total Cost (TC%) | Value Added Weight (VAW) | Alternatives To Add Value

Column Totals:

Value Analysis Summary Statement:
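Based on the value analysis notes above (VAW = VR x PC x TC%, compared across the parts of the product), a minimal sketch of the calculation; the part names and figures are hypothetical:

```python
# Sketch: Value Added Weight (VAW) = Value Rating x Part Cost x % of Total Cost.
parts = [
    # (part/component, VR 1-10, part cost $, % of total cost as a fraction)
    ("Housing",   8, 4.00, 0.40),
    ("Fasteners", 3, 0.50, 0.05),
    ("Gasket",    6, 1.50, 0.15),
]

for name, vr, pc, tc in parts:
    vaw = vr * pc * tc
    print(f"{name}: VAW = {vaw:.2f}")

# Column total for part cost, as asked for in the table above.
print("Column total (part cost):", sum(pc for _, _, pc, _ in parts))
```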

Open Agenda Topic/Issue/Activity When Started/Done/By Who? Follow Up Details/Actions?

Concern/Complaint/Problem Title:

General Background Information On Issue Or Problem Requiring Improvement:

Identified Defect Or Problem Requiring Improvement, Detailed Description:

Root Causes/Definition (Attach A Cause And Effect Diagram):

Immediate Interim Action/Effective Date, Containment:

Permanent Action, Effective Date And Verification (describe improvements made in components, assemblies, SOPs, processes and locations, inspection procedures, or other permanent changes): Control For Prevention:

Other pertinent information/source and general description of concern. Attach or refer to appropriate information, and explain how to access:

SDA-FACR General Use/Application:
o Identify the project objective being addressed (usually at least one or two would be anticipated from the PPARMP).
o Explain findings, analyses observed as various data and documentation in the SDA (ex: how does the form/work relate to ISO 9000?).
o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to the form, other tools to try).
o Address relationships in other forms, particularly the PPARMP, to provide direction in methodology for the project.
o SDA compilers use the SDA FACRs posted at discussion boards, compiled as grand forms for portfolio presentation with all SDAs.
o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, and course outcome progress.

Researcher | 1. Project objective(s) written | 2. Findings, analyses observed as data, documentation in SDA | 3. Conclusions, recommendations in methods, course outcomes | 4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 32 SDA-Kaizen Cost Analysis (KCA)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Compiler (s): Researcher (s): Team: Phase: Tool: Date:

Product/Process Under Discussion (include attachments such as Man Machine Analysis, SPC Data, Time Studies, Other):

Current Operation/location:

Setup Time: Changeover Time: Defect Rate %:

Non-Value Added Time: Other Suspected Wastes: Throughput Time:

General Description Of New Targeted System/Method Being Proposed:

Analysis Item | Current Method | Cost Estimate | Targeted New Method | Results (Time, Cost, Output, Other Improvements)

Daily Requirements
Shifts Per Day
Available Minutes/Day = (Shifts/Day x Hrs/Shift x 60 Min/Hr) - (Breaks + Cleanup + Changeovers, Etc.)
Operating Rate % = (Avail Min/Day - Scheduled Downtime) / (Avail Min/Day)
Scrap Rate %
Takt Required (Min/Piece) = (Avail Min/Day x Op Rate) / [Daily Req. x (1 + Scrap Rate)]
Line Balance = (Op1 + Op2 + ... + Op n) / (# Of Ops x Slowest Op)
Slowest Pace Time (Min/Pc)
Output/Hr (Pcs/Hr) = (60 Min/Hr) / (Slowest Pace Time)
# Of Ops
Output/Op/Hr (Pcs/Op/Hr) = (Output/Hr) / (# Of Ops)
Productivity % = (New Output/Op/Hr - Old Output/Op/Hr) x 100 / (Old Output/Op/Hr)
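The formulas in the analysis items above chain together; the sketch below works through them with hypothetical shop-floor figures (only the relationships shown in the form are used):

```python
# Sketch: Kaizen cost analysis arithmetic from the analysis items above.
shifts_per_day = 2
hours_per_shift = 8
breaks_cleanup_changeovers = 60           # minutes per day (hypothetical)
scheduled_downtime = 30                   # minutes per day (hypothetical)
daily_requirement = 400                   # pieces per day (hypothetical)
scrap_rate = 0.02
operation_times = [1.8, 2.0, 1.6, 2.2]    # minutes per piece per operation (hypothetical)

available_min = shifts_per_day * hours_per_shift * 60 - breaks_cleanup_changeovers
operating_rate = (available_min - scheduled_downtime) / available_min
takt_required = (available_min * operating_rate) / (daily_requirement * (1 + scrap_rate))
slowest = max(operation_times)
line_balance = sum(operation_times) / (len(operation_times) * slowest)
output_per_hr = 60 / slowest
output_per_op_hr = output_per_hr / len(operation_times)

old_output_per_op_hr = 6.0  # hypothetical baseline for the productivity comparison
productivity_gain = (output_per_op_hr - old_output_per_op_hr) * 100 / old_output_per_op_hr

print(f"Available minutes/day: {available_min}")
print(f"Operating rate: {operating_rate:.1%}")
print(f"Takt required: {takt_required:.2f} min/piece")
print(f"Line balance: {line_balance:.1%}")
print(f"Output/hr: {output_per_hr:.1f} pcs, Output/op/hr: {output_per_op_hr:.1f} pcs")
print(f"Productivity change: {productivity_gain:.1f}%")
```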

SDA-FACR General Use/Application:
o Identify the project objective being addressed (usually at least one or two would be anticipated from the PPARMP).
o Explain findings, analyses observed as various data and documentation in the SDA (ex: how does the form/work relate to ISO 9000?).
o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to the form, other tools to try).
o Address relationships in other forms, particularly the PPARMP, to provide direction in methodology for the project.
o SDA compilers use the SDA FACRs posted at discussion boards, compiled as grand forms for portfolio presentation with all SDAs.
o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, and course outcome progress.

Researcher | 1. Project objective(s) written | 2. Findings, analyses observed as data, documentation in SDA | 3. Conclusions, recommendations in methods, course outcomes | 4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 32 SDA-General Cost Analysis (GCA)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Compiler (s): Researcher (s): Team: Phase: Tool: Date:

Operation/Part: Process Location:

General Use/Application: 1. Study examples and general information below, after reviewing content and discussions in long and short forms. 2. Identify all necessary costs associated with product as part/unit costs. 3. Calculate the Break Even Point (BEP) and Profit or Loss (P or L) based upon team’s project. 4. Calculate Payback (PB) In Percent based upon team project figures.

Part/Unit Costs | Materials/Components | Operation Or Function | Specifications/Related Information | Sub-Unit Cost | Total Unit Cost | Dir. Cost | Ind. Cost | Other Costs | Total Part Costs

Column Grand Totals:

Cost Descriptions:
Break Even Point (BEP): where income equals expense. BEP = FC / (SIU - VCU).
Fixed Costs (FC): taxes, interest, utilities.
Variable Costs Per Unit (VCU): material and labor (also called direct costs).
Sales Income Per Unit (SIU): revenue generated per unit = (profit) + (cost of production, sales cost).
Indirect Costs: support services such as sales, engineering, quality.
Payback (PB) In Percent: PB = CI / ITC.
Cash Inflow (CI): labor savings + scrap/reject savings (others).
Initial Total Cost (ITC): cost of system (shipping, setup, training, etc.).
Sub-Unit Cost: various parts or costs.
Total Unit Cost: multiple sub-units can be summed as a system.

Example Break Even Point (BEP) based upon: BEP = FC/(SIU - VCU). FC = $100.00; VCU = $0.50 per unit; SIU = $1.00 per unit. BEP = 100.00 / (1.00 - 0.50) = 100 / 0.50 = 200 units, or $200.00 of sales income.

Example profit or loss (P or L) based upon: P or L = I – (FC + VC). I = $200.00 FC = $100.00 VC = $100.00 P or L = 200.00 – (100.00 + 100.00) = 200.00 – (200.00) = 0

Calculate BEP based upon team’s project. BEP = FC/(SIU – VCU) FC = VCU = SIU = BEP = FC/(SIU – VCU)

Calculate profit or loss (P or L) based upon team’s project. P or L = I – (FC + VC) I = FC = VC = P or L = I – (FC + VC)

Technical Description Of Proposed Improvement: This typically will include general information about where the improvement is located in production, how it works, and anticipated general enhancements to be noted. Specific cash inflow generated by upgrade is listed also, such as:

• Quality defects will be reduced by 20% = $1/unit inflow • Production rate will be increased by 10% = $.10/unit inflow • Safety improvement reduces lost time accidents by 50% = $.20/unit inflow • Work stoppage reductions will enhance throughput by 50% = $.50/unit inflow • WIP reductions at workplace will be enhanced = $.40/unit inflow

Total cash inflow = $2.20/unit

Example Payback (PB) Calculation: PB = CI/ITC. CI = $700.00; ITC = $1050.00; PB = $700.00 / $1050.00 ≈ 67% (annually).

Calculate Payback (PB) for project: CI/ITC = PB CI = ITC = PB =
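A minimal sketch of the three calculations requested above (break-even point, profit or loss, and payback), using the same example figures given in the form:

```python
# Sketch: break-even, profit or loss, and payback, per the formulas above.
def break_even_units(fixed_costs, sales_income_per_unit, variable_cost_per_unit):
    return fixed_costs / (sales_income_per_unit - variable_cost_per_unit)

def profit_or_loss(income, fixed_costs, variable_costs):
    return income - (fixed_costs + variable_costs)

def payback_percent(cash_inflow, initial_total_cost):
    return 100 * cash_inflow / initial_total_cost

# Example figures from the form:
print(break_even_units(100.00, 1.00, 0.50))        # 200.0 units ($200.00 of sales at $1/unit)
print(profit_or_loss(200.00, 100.00, 100.00))      # 0.0
print(f"{payback_percent(700.00, 1050.00):.0f}%")  # 67% (annually)
```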

SDA-FACR General Use/Application:
o Identify the project objective being addressed (usually at least one or two would be anticipated from the PPARMP).
o Explain findings, analyses observed as various data and documentation in the SDA (ex: how does the form/work relate to ISO 9000?).
o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to the form, other tools to try).
o Address relationships in other forms, particularly the PPARMP, to provide direction in methodology for the project.
o SDA compilers use the SDA FACRs posted at discussion boards, compiled as grand forms for portfolio presentation with all SDAs.
o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, and course outcome progress.

Researcher | 1. Project objective(s) written | 2. Findings, analyses observed as data, documentation in SDA | 3. Conclusions, recommendations in methods, course outcomes | 4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 32 SDA-Methods Analysis And Cost Estimates (MAACE)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Compiler (s): Researcher (s): Team: Phase: Tool: Date:

Operation/Process: Location:

Product/Component Under Discussion:

Persons Interviewed:

Current Processing/Other Pertinent Information Reviewed/Analyzed: General Focus Of Study:

Cost Estimates And Methods, Comparison | Current Method | Proposed Method | Potential Savings | Potential Cost

Maintenance Costs:
Transport and Handling:
Service Requirements:
Processing Capability:
Weight Issues:
Storage and Contract:
Preparation:
Aesthetics/Market Appeal:
Availability:
Others As Defined:
Column Totals:
Summary Interpretations/Observations:

Issues/Functions To Be Improved? How To Improve? Who Will Do? When Completed? Other Observations?
1.
2.
Identify And Attach Other Pertinent Inspection Information, Or Related Data:

SDA-FACR General Use/Application:
o Identify the project objective being addressed (usually at least one or two would be anticipated from the PPARMP).
o Explain findings, analyses observed as various data and documentation in the SDA (ex: how does the form/work relate to ISO 9000?).
o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to the form, other tools to try).
o Address relationships in other forms, particularly the PPARMP, to provide direction in methodology for the project.
o SDA compilers use the SDA FACRs posted at discussion boards, compiled as grand forms for portfolio presentation with all SDAs.
o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, and course outcome progress.

Researcher | 1. Project objective(s) written | 2. Findings, analyses observed as data, documentation in SDA | 3. Conclusions, recommendations in methods, course outcomes | 4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 33 SDA-SOP And Takt Analysis (SOPATA)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Compiler (s): Researcher (s): Team: Phase: Tool: Date:

Operation/Process: Location: Reason For Analysis: Individuals Interviewed:

# Of Pieces: Other Pertinent Information: Part:

General Operation Information | Distance | Man. Work | Proc. Work | Capacity Time | Potential Waste Reductions
Column details: Proc Symb | Step | Operation Description | Ft. | In. | Hr. | Min. | WIP | Cyc T | Other | Totals

General Interpretation Of Findings Based On Ongoing Improvement In Product: the example provided requires addressing Operations 5 and 4 to get a better balance. Also note that this analysis does not account for handling time, storage, WIP, and related issues.

[Bar graph template: identify the time for each operation in seconds, minutes, or hours (vertical scale 1-10) and color in the values to form a bar graph across Operations 1-14, identifying each operation in order of occurrence. The example operations are: Drill Pilot Holes; Face At Lathe; Mill To Size; Turn To Diameter; Drill Finish Holes; Assemble; Test.]

TAKT Time = (OpT1 + OpT2 + OpT3 + ... + OpTn) / n = Σ OpT / n

Operations To Be Combined, Eliminated Or Improved; How?
Total Operations Time = 48 Minutes; Total n Operations = 7; Current TAKT Time = 6.85 Minutes.
1. Operation 5 is 9 minutes, the highest and well above TAKT.
2. Operation 4 is low, causing unbalance in the line.
3. TAKT target time = 6 minutes.
4. Attach flow charts, facility layouts, man-machine analyses.
5. TAKT time is calculated where OpT represents the time for each operation, summed and divided by the number of operations (n). In the example given, TAKT is found by adding 7 + 8 + 6 + 5 + 9 + 7 + 6 (all minutes) and dividing by 7.
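A minimal sketch of the TAKT calculation as defined in this form (the grand average of the operation times), using the seven example operation times above:

```python
# Sketch: TAKT time as defined in this SDA = sum of operation times / number of operations.
operation_minutes = [7, 8, 6, 5, 9, 7, 6]   # example operations from the bar graph above

total_time = sum(operation_minutes)          # 48 minutes
takt = total_time / len(operation_minutes)   # 48 / 7, about 6.86 minutes

print(f"Total operations time: {total_time} min")
print(f"Current TAKT time: {takt:.2f} min (target: 6 min)")
print(f"Slowest operation: {max(operation_minutes)} min (well above TAKT)")
```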

Process/Functions To Be Improved Or Eliminated? How? By Who? When?
1.
2.
Actual TAKT Time (Grand Average Of All Process Times):
Targeted TAKT Time (Identify The Best Case Time Projected, Or Goal, For This Grand Average Of All Times):
Potential WIP Waste (Identify Any Specific Work In Process Waste Opportunities For Reduction, Not Covered Elsewhere):
Other Possible Savings (With Explanation):
Total Estimated Savings (With Explanation):
Recommendations From Study (Provide A Detailed Listing For Future Follow Through In Various Ways, Including As Shown Below):
Functions To Be Improved? How To Improve? Who Will Do? When Completed? Other Observations?

1.

SDA-FACR General Use/Application:
o Identify the project objective being addressed (usually at least one or two would be anticipated from the PPARMP).
o Explain findings, analyses observed as various data and documentation in the SDA (ex: how does the form/work relate to ISO 9000?).
o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to the form, other tools to try).
o Address relationships in other forms, particularly the PPARMP, to provide direction in methodology for the project.
o SDA compilers use the SDA FACRs posted at discussion boards, compiled as grand forms for portfolio presentation with all SDAs.
o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, and course outcome progress.

Researcher | 1. Project objective(s) written | 2. Findings, analyses observed as data, documentation in SDA | 3. Conclusions, recommendations in methods, course outcomes | 4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 33 SDA-Attribute-Variable Inspection, Statistical Process Control System (A-VISPCS)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Compiler (s): Team: Phase: Tool: Date:

Part:

Reviewed By:

Supplier:

Gage/Device:

Operation:

Characteristic:

General Inspection Description:

No. | Date | Operator | Run Quantity | Sample Size | Number Rejected | Number Accepted | Corrective Actions
1
2

Based On Existing Data, Design A System To Do PSDOE For Attributes, And Collect Actual Data, As Part Of The Broader Synchronous Program:

Complete A Flow Chart Analysis, Using The Form Attached, Demonstrating The Flow And Standard Procedure For Attribute PSDOE Analysis:

Explain How Factors And Levels Will Impact The Broader PSDOE System, And How It Will Occur: Define Attributes To Be Analyzed, Who Will Do, Where It Will Happen, And How Related To PSDOE?

What Should Be The Sampling Plan, And What Is The Estimated Cost To Perform PSDOE For Each Attribute Identified: Address Relationships Inherent In Gage R & R, SPC, Cpk, OPCP, FMEA And Other Data And Documentation Systems For PSDOE.

Identify How The PSDOE Information Gathered Can Be Used To Improve Quality In Product Ongoing, Focused On Noise Reduction:

Explain What A Variable Gage Would Consist Of To Do The Same Function, And Why Or Why Not To Move This Direction. How Can This Data And Documentation Form Be Enhanced To Improve Product Quality Ongoing—And Demonstrate Actual Applications: Identify General Interpretation Of Findings Based On The PSDOE Process, And How Documentation Will Occur Cross Functionally:

Data Columns: Date; Time; Measure Value 1, 2, 3, 4, 5; Sum; Average; Range.

Calculate Grand Mean Or X Double Bar = All Averages Summed / n
Calculate UCL = X Double Bar + (A2)(R Bar); Calculate LCL = X Double Bar – (A2)(R Bar), Where A2 = .58 (Constant)
Calculate R Bar = All Ranges Summed / n
Calculate UCL R = (D4)(R Bar); And LCL R = (D3)(R Bar), Where D3 = 0 And D4 = 2.11 (Constants)
Configure Graphs To Accommodate The Values Calculated, And To Fit Within The Work Areas, Illustrating The Process (Samples 1 Through 17).
UCL = X Double Bar = LCL =

UCL R = R Bar = LCL R =
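
Where teams want to verify the control-limit arithmetic before charting, a short sketch following the calculation steps above is given below; the sample measurements and variable names are illustrative assumptions, while the constants are those stated on the form (A2 = .58, D3 = 0, D4 = 2.11 for samples of five).

```python
# Hypothetical sketch of the X-bar / R control-limit calculations described above.
# Each inner list is one sample of five measured values (illustrative data only).
samples = [
    [5.1, 5.0, 4.9, 5.2, 5.0],
    [5.0, 5.1, 5.1, 4.8, 5.0],
    [4.9, 5.2, 5.0, 5.0, 5.1],
]

A2, D3, D4 = 0.58, 0, 2.11                     # constants from the form (n = 5)

averages = [sum(s) / len(s) for s in samples]  # X-bar for each sample
ranges = [max(s) - min(s) for s in samples]    # R for each sample

x_double_bar = sum(averages) / len(averages)   # grand mean (all averages summed / n)
r_bar = sum(ranges) / len(ranges)              # average range (all ranges summed / n)

ucl_x = x_double_bar + A2 * r_bar              # UCL for the X-bar chart
lcl_x = x_double_bar - A2 * r_bar              # LCL for the X-bar chart
ucl_r = D4 * r_bar                             # UCL for the R chart
lcl_r = D3 * r_bar                             # LCL for the R chart

print(f"X double bar = {x_double_bar:.3f}, UCL = {ucl_x:.3f}, LCL = {lcl_x:.3f}")
print(f"R bar = {r_bar:.3f}, UCL R = {ucl_r:.3f}, LCL R = {lcl_r:.3f}")
```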

Identify Trends Or Obvious Issues Based On Graphed Values, General Interpretation Of Statistical Process: General Interpretation Of Implications Of Process Control For Improvement: SDA-FACR General Use/Application: o Identify the project objective being addressed (usually at least one or two would be anticipated from PPARMP). o Explain findings, analyses observed as various data and documentation in SDA (ex: how form/work relates to ISO 9000?). o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to form, other tools to try). o Address relationships in other forms, particularly PPARMP to provide direction in methodology for project. o SDA compilers use SDA FACR’s posted at discussion boards compiled as grand forms for portfolio presentation with all SDA’s. o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, course outcome progress. Researcher 1. Project objective

(s) written…… 2. Findings, analyses observed as data, documentation in SDA……

3. Conclusions, recommendations in methods, course outcomes……

4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 33 SDA-Preliminary Systematic Design Of Experiments (PSDOE)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Researcher (s): Compiler (s): Team: Phase: Tool: Date:

Part:

Operator:

Characteristic: Operation:

Inspection Description:

Run 1; Date:     Run 2; Date:     Run 3; Date:
Subgroup Date/Time: 1-1 1-2 1-3 1-4 1-5 (Run 1); 2-1 2-2 2-3 2-4 2-5 (Run 2); 3-1 3-2 3-3 3-4 3-5 (Run 3)

Sample Measures (Run 1 Example):
Subgroup:   1-1  1-2  1-3  1-4  1-5
Measure 1:   1    2    1    1    1
Measure 2:   2    2    1    1    1
Measure 3:   1    2    1    2    1
Measure 4:   2    2    1    2    2
Measure 5:   1    1    2    2    2
Sums:        7    9    6    8    7
Averages:   1.4  1.8  1.2  1.6  1.4

Run 1: All Averages Summed = 7.4; Total Average/5 = Mean = 7.4/5 = 1.48
Run 2: All Averages Summed = ; Total Average/5 = Mean =
Run 3: All Averages Summed = ; Total Average/5 = Mean =
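
The sketch below reproduces the Run 1 example arithmetic (column sums, subgroup averages, and the overall mean) so researchers can substitute their own attribute codes; the variable names are illustrative.

```python
# Minimal sketch of the Run 1 subgroup calculation shown above.
# Each row is one sample measure across the five subgroups (attribute codes 1 or 2).
run_1 = [
    [1, 2, 1, 1, 1],
    [2, 2, 1, 1, 1],
    [1, 2, 1, 2, 1],
    [2, 2, 1, 2, 2],
    [1, 1, 2, 2, 2],
]

n_subgroups = len(run_1[0])                                        # 5 subgroups
sums = [sum(row[j] for row in run_1) for j in range(n_subgroups)]  # 7, 9, 6, 8, 7
averages = [s / len(run_1) for s in sums]                          # 1.4, 1.8, 1.2, 1.6, 1.4
all_averages_summed = sum(averages)                                # 7.4
mean = all_averages_summed / n_subgroups                           # 7.4 / 5 = 1.48

print("Sums:", sums)
print("Averages:", averages)
print("All averages summed:", all_averages_summed, "Mean:", mean)
```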

Trial: Notes: Trial: Notes: Trial: Notes: Conditions: Conditions: Conditions: Factor: Factor: Factor: Factor: Factor: Factor: Level: Level: Level: Level: Level: Level: Level: Level: Level: Level: Level: Level: Level: Level: Level: Level: Level: Level: SDA-FACR General Use/Application: o Identify the project objective being addressed (usually at least one or two would be anticipated from PPARMP). o Explain findings, analyses observed as various data and documentation in SDA (ex: how form/work relates to ISO 9000?). o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to form, other tools to try). o Address relationships in other forms, particularly PPARMP to provide direction in methodology for project. o SDA compilers use SDA FACR’s posted at discussion boards compiled as grand forms for portfolio presentation with all SDA’s. o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, course outcome progress. Researcher 1. Project objective

(s) written…… 2. Findings, analyses observed as data, documentation in SDA……

3. Conclusions, recommendations in methods, course outcomes……

4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:


Tool 34 SDA-Operation Capacity Analysis (OCA)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Compiler (s): Researcher (s): Team: Phase: Tool: Date:

Operation: Process: Location: Other Pertinent Information (attach charts, tables, information as needed):

General Operation Information Production Functions Dist. Capacity Time

Potential Waste Reductions/Other

Process Symbol

Operation Description People Req.

Mech. Proc.

Manual Work

WIP Lev.

Units Prod.

Inv. Lev.

Ft./In. (Dist.) Hr./Min. (Time)

TAKT Time: Process/Functions To Be Improved – How? Total Cycle Time: 1. Potential WIP Waste: 2. Others: 3. SDA-FACR General Use/Application: o Identify the project objective being addressed (usually at least one or two would be anticipated from PPARMP). o Explain findings, analyses observed as various data and documentation in SDA (ex: how form/work relates to ISO 9000?). o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to form, other tools to try). o Address relationships in other forms, particularly PPARMP to provide direction in methodology for project. o SDA compilers use SDA FACR’s posted at discussion boards compiled as grand forms for portfolio presentation with all SDA’s. o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, course outcome progress. Researcher 1. Project objective

(s) written…… 2. Findings, analyses observed as data, documentation in SDA……

3. Conclusions, recommendations in methods, course outcomes……

4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 34 SDA-Advanced Production Quality Planning, Verification, Compliance (APQPVC)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. This is a comprehensive form designed to assist all in understanding and using APQP systems appropriately. This form covers part production feasibility; part measurement verification; characteristic change verifications; production part tooling verification; and compliance certification. Although these can be one document they are frequently separated for various reasons, generally done in the order shown. Compiler (s): Researcher (s): Team: Phase: Tool: Date:

Part:

Characteristic:

Supplier:

Operation:

General Explanation Of Procedure: This form should be completed initially as an internal self study to determine potential to respond to a specific part production. Taken through compliance this will serve as demonstration of a supplier’s ability to produce the part as quoted.

No Yes 1. Part Production Feasibility Does your organization have sufficient information to properly respond to the part design and management? Explain.

Revised Drawing New Part New Supplier Process Change Characteristic Change

Can the proposed product performance and specifications be met by your organization? Explain. Are existing products or processes currently being produced which are similar? Explain. Can data be presented which demonstrates capability to produce the product? Explain. Is sufficient production capacity available in house to enable proper response to producing the product? Explain. Will additional costs be incurred to respond to any areas of this proposal? Explain.

Tooling: Capital:

Other: Will training be required to respond to this proposal? Explain.

Does engineering support exist compatible to technical requirements of the product? Explain. Reviewed By: Gage/Device: Characteristic:

2. Part Verification Measurement Procedure:

No. Char. Ident.

Drawing Dimension

Actual Measure

Tolerance Deviation General Observations

1 2

Determine Difference Between Drawing And Actual Measure For Each Characteristic, Entering This As Deviation. Use Separate Form For Each. Compare Deviation Against Tolerance Allowed By Contract. Deviation Expiration Date: General Observations: Other/Details/Attachments:
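
As an illustration of the deviation check described above, the sketch below compares an actual measure against its drawing dimension and the tolerance allowed by contract; the characteristics, values, and tolerances shown are hypothetical.

```python
# Hypothetical sketch of the Part Verification deviation check above:
# deviation = actual measure - drawing dimension, compared to the contract tolerance.
characteristics = [
    # (identifier, drawing dimension, actual measure, tolerance) - illustrative values
    ("hole diameter", 12.50, 12.53, 0.05),
    ("flange length", 75.00, 75.09, 0.08),
]

for ident, drawing, actual, tolerance in characteristics:
    deviation = actual - drawing
    within = abs(deviation) <= tolerance
    status = "OK" if within else "exceeds tolerance - record deviation and expiration date"
    print(f"{ident}: deviation {deviation:+.3f} (tolerance +/-{tolerance}) -> {status}")
```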

3. Characteristic Change Verification Part: Gage/Device: Operation: Characteristic:

Measurement Procedure:

No. Char. Ident.

Curr. Drawing X Y Z

New Drawing X Y Z

Tolerances X Y Z

Deviation X Y Z

1 2

Determine Difference Between Current Drawing And New Drawing For Each Characteristic. Measure Actual Parts To Determine Extent Of Deviation From Desired. Compare Deviation Against Tolerance Allowed By Contract. Deviation Expiration Date: General Observations:

No Yes 4. Part Production Tooling Considerations Does your organization have sufficient information management systems to properly track and manage tooling? Explain.

Are existing products or processes currently being produced using similar tooling? Explain. Can statistical data be presented which demonstrates capability to document existing tooling systems? Explain. As related to maintenance issues? As related to insurance issues?

As related to transfer/disposition issues? As related to engineering design issues? As related to production use?

Is sufficient capacity available in house for proper storage and security response for proprietary tooling issues? Explain. Will additional costs be incurred to respond to any areas of this part proposal related to proprietary tooling? Explain. Will training be required to respond to this proposal? Explain.

Does engineering support exist compatible to technical requirements of the production tooling? Explain. 5. Certificate Of Compliance

General Product Description: Product #: Blanket P.O.#: Supplier: Statistical Indices: Characteristic: Tests Performed:

Inspection Procedure:

Date Lot Size Lot # General Observations: Attachments Supporting Lots Gathered: Certified By: Date:

Production Signoff: Date:

Engineering Signoff: Date:

Disposition Status:

Quality Signoff: Date:

Feasibility Determination (From Part 1, Part Production Feasibility): Feasible: Product can be produced. Possibilities: Feasible but with concerns as attached. Not feasible: Significant changes must occur as detailed.

SDA-FACR General Use/Application: o Identify the project objective being addressed (usually at least one or two would be anticipated from PPARMP). o Explain findings, analyses observed as various data and documentation in SDA (ex: how form/work relates to ISO 9000?). o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to form, other tools to try). o Address relationships in other forms, particularly PPARMP to provide direction in methodology for project. o SDA compilers use SDA FACR’s posted at discussion boards compiled as grand forms for portfolio presentation with all SDA’s. o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, course outcome progress.

Researcher 1. Project objective(s) written…… 2. Findings, analyses observed as data, documentation in SDA……

3. Conclusions, recommendations in methods, course outcomes……

4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 34 SDA-Applications/Process Engineering, Innovation, Applied Research (APEIAR)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Compiler (s): Researcher (s): Team: Phase: Tool: Date:

Technological Process, Application, Systems Or Function Being Audited, Description (Attach Appropriate Documentation To Help Define): Technology Systems Required For Doing The Work (Address Categories Below For Each Application/Process Engineering Issue):

Application/Process Engineering: 1.

Design Changes Described: 1.

Materials And Process Issues: 1.

Data, Documentation Support: 1.

Key Energy Sources Consumed By The Technology Being Audited (Address Categories Below For Each Application/Process Engineering Issue): Application/Process Engineering: 1.

Design Changes Described: 1.

Materials And Process Issues: 1.

Data, Documentation Support: 1.

Resources Used To Maintain, “Run”, The Technology, Once Produced (Address Categories Below For Application/Process Engineering): Application/Process Engineering: 1.

Design Changes Described: 1.

Materials And Process Issues: 1.

Data, Documentation Support: 1.

Resources And Environmental Impact, General Statement (Address The Categories Below For Application/Process Engineering): Application/Process Engineering: 1.

Design Changes Described: 1.

Materials And Process Issues: 1.

Data, Documentation Support: 1.

Explain The Technologists’ Responsibility, Other Ethical Issues (Address The Categories Below For Application/Process Engineering): Application/Process Engineering: 1.

Design Changes Described: 1.

Materials And Process Issues: 1.

Data, Documentation Support: 1.

Assess Recommended Changes To Make The Technology More Efficient (Address The Categories Below For Application/Process Engineering): Application/Process Engineering: 1.

Design Changes Described: 1.

Materials And Process Issues: 1.

Data, Documentation Support: 1.

Address Each Of The Following Areas Of Thought And Action Based On The Above Audit Information And Findings. Applied Research Opportunities Innovation Issues Infrastructural Change Change, Management Plan

1.

1. 1. 1.

Training Methods: 1.

Team Relationships And Opportunities: 1.

Measurables: 1.

Explain A Possible Standard Procedure (SOP) For Training To Implement Change, Address Innovation, Applied Research:

Explain How Applied Research And Innovation Systems Relate To Technological Leadership And Management Issues:

SDA-FACR General Use/Application: o Identify the project objective being addressed (usually at least one or two would be anticipated from PPARMP). o Explain findings, analyses observed as various data and documentation in SDA (ex: how form/work relates to ISO 9000?). o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to form, other tools to try). o Address relationships in other forms, particularly PPARMP to provide direction in methodology for project. o SDA compilers use SDA FACR’s posted at discussion boards compiled as grand forms for portfolio presentation with all SDA’s. o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, course outcome progress.

Researcher 1. Project objective(s) written……

2. Findings, analyses observed as data, documentation in SDA……

3. Conclusions, recommendations in methods, course outcomes……

4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 35 SDA-Characteristic Evaluation And Audit System (CEAS)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Compiler (s): Researcher (s): Team: Phase: Tool: Date:

General Product/Part Description: Inspection Description:

Operation:

Characteristic: Gage/Device: Charts Used:

No. Date/ Operator

Run Size/ Sample

Cp Cpk X-bar R SPC UCL

SPC LCL

R&R%

Corrective Actions/Observations/Other

1 2

What Does SPC Charting, Cp And Cpk Indicate Regarding Potential Changes In Characteristics?

What Does The Most Recent Audit Of Systems Indicate Regarding Characteristics Changes?

What Engineering/Design Changes Are Being Considered And What Will Be The Potential Impacts On Characteristics?

What Relationships And Implications Can Be Noted Based On Operator, Date, Size Of Run, And Other Issues Such As Maintenance:

What Relationships And Implications Can Be Noted Based On USL And LSL, Tolerancing, And Gage R & R %:

Identify And Attach Other Pertinent Inspection Information, Or Related Data:

SDA-FACR General Use/Application: o Identify the project objective being addressed (usually at least one or two would be anticipated from PPARMP). o Explain findings, analyses observed as various data and documentation in SDA (ex: how form/work relates to ISO 9000?). o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to form, other tools to try). o Address relationships in other forms, particularly PPARMP to provide direction in methodology for project. o SDA compilers use SDA FACR’s posted at discussion boards compiled as grand forms for portfolio presentation with all SDA’s. o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, course outcome progress. Researcher 1. Project objective

(s) written…… 2. Findings, analyses observed as data, documentation in SDA……

3. Conclusions, recommendations in methods, course outcomes……

4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 35 SDA-Quality Function Deployment (QFD)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio.

Compiler (s): Researchers (s): Team: Phase: Tool: Date:

Supplier: Customer: Product Or Process:

Supplier Technical Capabilities. Describe How Customer Demands Are Met. Provide A Priority Rating By Supplier Of Technical Capabilities, 1, 5 Or 9. Customer Requirements Stated And Prioritized By Customer. 1 Is A Low Priority And 9 Is High. (1, 5, 9)

Supplier Priority Rating For Each Customer Requirement. Use Priority Rating Of 1, 5 Or 9. 1 Is Low Priority And 9 Is High. Supplier Comments About Customer Requirements Should Be Placed Here. Grand Totals: Sum All Values In Horizontal Rows.

Customer Should Provide Priority Ratings Of Each Technical Capability Statement Provided By Supplier As A 1, 5 Or 9. Grand Totals: Sum All Values In Boxes. Higher Numbers Indicate Top Priorities. Customer Should Provide Comments About Supplier Technical Capabilities.

At Matrix Intersections Where Customer Demands And Supplier Capabilities Cross, Supplier Places Priority Rating In Upper Right Of Box. Customer Priority Rating Goes In Lower Left Of Box. Averaged Two Numbers Goes In Center Of Box As Sum For Grand Totals. Values Used Are 1, 2 Or 3.
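
A small sketch of the grand-total arithmetic described above follows; the requirement and capability names, and the ratings entered at each intersection, are purely illustrative, with each cell's center value taken as the average of the supplier and customer ratings as the form specifies.

```python
# Illustrative sketch of the QFD matrix totals described above.
# Requirement/capability names and ratings are hypothetical example values.
customer_requirements = ["easy assembly", "low weight", "long service life"]
supplier_capabilities = ["stamping", "heat treatment", "final inspection"]

# cell_ratings[i][j] = (supplier rating, customer rating) where requirement i
# crosses capability j (supplier in upper right of box, customer in lower left).
cell_ratings = [
    [(1, 3), (2, 2), (3, 3)],
    [(3, 1), (1, 1), (2, 3)],
    [(2, 2), (3, 3), (1, 2)],
]

# Center value of each box is the average of the two ratings.
cell_values = [[(s + c) / 2 for (s, c) in row] for row in cell_ratings]

# Grand totals: sum each row (per customer requirement) and each column
# (per supplier capability); higher numbers indicate top priorities.
row_totals = [sum(row) for row in cell_values]
col_totals = [sum(col) for col in zip(*cell_values)]

for requirement, total in zip(customer_requirements, row_totals):
    print(f"Requirement '{requirement}': grand total {total}")
for capability, total in zip(supplier_capabilities, col_totals):
    print(f"Capability '{capability}': grand total {total}")
```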

SDA-FACR General Use/Application: o Identify the project objective being addressed (usually at least one or two would be anticipated from PPARMP). o Explain findings, analyses observed as various data and documentation in SDA (ex: how form/work relates to ISO 9000?). o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to form, other tools to try). o Address relationships in other forms, particularly PPARMP to provide direction in methodology for project. o SDA compilers use SDA FACR’s posted at discussion boards compiled as grand forms for portfolio presentation with all SDA’s. o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, course outcome progress. Researcher 1. Project objective

(s) written…… 2. Findings, analyses observed as data, documentation in SDA……

3. Conclusions, recommendations in methods, course outcomes……

4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 36 SDA-General Safety Inspection Checklist/Pareto Chart (GSICPC)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. How To Use: • Assume GSIC is used 2 times per month, generating opportunities for OK/not safe a total of 28 times, annually. • Provide simulated data which demonstrates main safety/maintenance issues, organized in Pareto format, for an annualized basis. • Modify as deemed appropriate to best make the case for team project, and delete these instructions prior to presentation. Compiler (s): Researcher (s): Team: Phase: Tool: Date:

Operation: Location: Reason For Inspection: Inspection By: Individuals Interviewed: Reviewed By: Other Pertinent Information: Approval Date:

Checklist Item OK Not Safe

Not App.

Actual Site Explanations/Action/Other

Good Housekeeping/Cleanliness
Piling and Storage/Tagging Systems
Aisles, Walkways, and Exits
Tools And Supplies
Ladders And Stairs
Machinery And Equipment
Floors, Platforms, and Railings
Electrical Fixtures/Equipment
Dust, Ventilation, and Explosives
Overhead Valves, Pipes, Markings
Protective Clothing/Equipment
Washroom, Lockers, Shower, Deluge
Unsafe Practices/Horseplay/SOPs
First Aid Facilities
Vehicles, Hand and Power Trucks
Fire Fighting Equipment
Guards And Safety Devices
Lighting, Work Tables/Areas
General Maintenance
Safety Training, Communication
Company/OSHA Standards – Comply
Cranes, Hoists, Conveyors
Scrap And Rubbish
Other Items, Circumstances
Other Pertinent Information (Attached And Explained):

Pareto Chart Work Area (Bar Chart Template): Vertical Axes Show Frequency Of Occurrence (0 To 20) And % Of Total Occurrence (1 To 42).

General Shape Of Chart Is Constant, But Frequency And % Values Shift To Present Relationships In Facts. Pareto Is Often Done Early In Analyses, Based On Histograms. It Is Used To Show Areas Needing Attention Versus Those We Can Postpone; A Good Decision Tool.

Shade Columns To Show Differences Among Findings. Rank Findings From Highest To Lowest, And Left To Right.

% Of Occurrences Is For Each Attribute Relative To The Total Of All Occurrences. The Chart Can Be Expanded To Show Unlimited Numbers Of Attributes.
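
To organize simulated GSIC findings into Pareto order, a minimal sketch is shown below; the checklist items and counts are made-up values used only to demonstrate the ranking and percentage calculations.

```python
# Minimal sketch of Pareto ordering for GSIC findings: rank "Not Safe" counts from
# highest to lowest and compute each item's share of all occurrences (simulated data).
findings = {
    "General Maintenance": 18,
    "Good Housekeeping/Cleanliness": 12,
    "Guards And Safety Devices": 7,
    "Protective Clothing/Equipment": 5,
    "Lighting, Work Tables/Areas": 3,
}

total = sum(findings.values())
ranked = sorted(findings.items(), key=lambda item: item[1], reverse=True)

cumulative = 0.0
for item, count in ranked:
    share = 100 * count / total
    cumulative += share
    print(f"{item}: {count} occurrences, {share:.1f}% of total, {cumulative:.1f}% cumulative")
```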

SDA-FACR General Use/Application: o Identify the project objective being addressed (usually at least one or two would be anticipated from PPARMP). o Explain findings, analyses observed as various data and documentation in SDA (ex: how form/work relates to ISO 9000?). o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to form, other tools to try). o Address relationships in other forms, particularly PPARMP to provide direction in methodology for project. o SDA compilers use SDA FACR’s posted at discussion boards compiled as grand forms for portfolio presentation with all SDA’s. o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, course outcome progress. Researcher 1. Project objective

(s) written…… 2. Findings, analyses observed as data, documentation in SDA……

3. Conclusions, recommendations in methods, course outcomes……

4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 36 SDA-Maintenance Tracking Analysis (MTA)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Compiler (s): Researcher (s): Team: Phase: Tool: Date:

Operation: Location:

Adjustments Inspections Parts Installed Others Work Date

Track General Work With A Brief Explanation Here And An X With Time Identified Categorically To The Right

1 2 3 Time Req.

1 2 3 Time Req.

1 2 3 Time Req.

1 2 3 Time Req.

X Identify Planned Preventive Maintenance Functions According To Frequency Of When It Is To Be Done (D = Daily; W = weekly; M = monthly; Q = quarterly; B = biannual; A = annual), Along With A Brief Description Of The Planned Functions.

# Work, Maintenance Function Planned To Be Performed Freq Project Cost

Time Standard

SOP Prepared? How To Access?

Person To Do?

Adjustments 1 Parts/Materials Installed 1 Inspections 1 Other 1
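
One way to project annual cost from the planned-maintenance table above is sketched below; the frequency-code conversions follow the D/W/M/Q/B/A definitions given earlier, while the maintenance functions and unit costs are illustrative assumptions.

```python
# Hypothetical sketch converting planned-maintenance frequency codes into annual
# occurrences so a projected yearly cost can be estimated for each function.
OCCURRENCES_PER_YEAR = {"D": 365, "W": 52, "M": 12, "Q": 4, "B": 2, "A": 1}  # B = twice a year

planned_functions = [
    # (description, frequency code, cost per occurrence) - illustrative entries
    ("Lubricate spindle bearings", "W", 15.00),
    ("Replace hydraulic filter", "Q", 85.00),
    ("Full alignment inspection", "A", 400.00),
]

for description, freq, unit_cost in planned_functions:
    annual_cost = OCCURRENCES_PER_YEAR[freq] * unit_cost
    print(f"{description} ({freq}): projected cost ${annual_cost:,.2f} per year")
```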

Remedial Maintenance Functions, Unplanned But Required To Keep The Operation Running.

Date Description Of Work Performed Parts Required Time? Costs Who? Other SDA-FACR General Use/Application: o Identify the project objective being addressed (usually at least one or two would be anticipated from PPARMP). o Explain findings, analyses observed as various data and documentation in SDA (ex: how form/work relates to ISO 9000?). o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to form, other tools to try). o Address relationships in other forms, particularly PPARMP to provide direction in methodology for project. o SDA compilers use SDA FACR’s posted at discussion boards compiled as grand forms for portfolio presentation with all SDA’s. o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, course outcome progress. Researcher 1. Project objective

(s) written…… 2. Findings, analyses observed as data, documentation in SDA……

3. Conclusions, recommendations in methods, course outcomes……

4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 36 SDA-General Benchmarking/Auditing Process, Systems (GBAPS)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Compiler (s): Researcher (s): Team: Phase: Tool: Date:

Describe The Process, Product, Issue Or Problem Requiring Benchmarking/Auditing:

What Are The Comparative Groups Or Organizations To Be Benchmarked/Audited?

Describe The General Data Collection And Analysis Methods:

Describe General Differences Noted Between Our Group And The Group Being Benchmarked/Audited: Describe The Overall Plan To Be Followed In The Benchmark/Audit, Including Pre And Post Steps, Timeframes And Follow Through:

Specific Issue/Concern? Timeframe What Is The Measurable? Who Will Do And How?

1.

Describe Basic Steps, Procedures And Protocol To be Followed In The Daily Conduct Of The Benchmarking Audit:

Contact Methods Clearances/Security General Arrangements Methods Of Communication

1.

Data To Be Reviewed? How To Review Data? Documentation To Be Reviewed? How To Review Documentation?

1.

Identify What Existing Resources Will Be Used In The Improvement, Based On The Benchmarking Process:

Existing Equipment?

Existing Space? Operator Training? Other?

1.

Identify What New Resources Will Be Required For Improvement, Based On The Benchmarking/Auditing Process: New Equipment? New Space? New SOP’s, Training? Other?

1.

After Completion Of The Benchmarking/Auditing Process, Identify All Findings Requiring Corrective Actions, And In What Ways:

Findings? Corrective Actions? Who Will Address? When To Be Completed?

1.

Explain How Operators, Supervisors, Quality, Engineering And Other Personnel Were Involved In The Process, Pre And Post:

Describe General Differences Noted Between Our Group And The Group Being Benchmarked/Audited: Describe New Goals, Or Changes To Be Made, Based On What Was Studied, Potential Implications For Strategic Planning:

Describe Groups To Be Communicated With, And Communication Methods To Be Used, Internal And External:

Describe New Broad Based Objectives Recommended At The Group/Team Level And Beyond, Based On Benchmarking/Auditing: Describe, In Detail, The Process Benchmarked, Differences Between The Way We Do It And The Process Benchmarked, Based on The Study:

Describe Changes/Improvements Recommended In Our Benchmarking/Auditing System Or Process, Based On This Study:

SDA-FACR General Use/Application: o Identify the project objective being addressed (usually at least one or two would be anticipated from PPARMP). o Explain findings, analyses observed as various data and documentation in SDA (ex: how form/work relates to ISO 9000?). o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to form, other tools to try). o Address relationships in other forms, particularly PPARMP to provide direction in methodology for project. o SDA compilers use SDA FACR’s posted at discussion boards compiled as grand forms for portfolio presentation with all SDA’s. o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, course outcome progress. Researcher 1. Project objective

(s) written…… 2. Findings, analyses observed as data, documentation in SDA……

3. Conclusions, recommendations in methods, course outcomes……

4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

Tool 36 SDA-General Failsafing/Auditing Process, Systems (GFAPS)

General Use/Application: SDA focuses technical content and deliverables on project objectives, all associated with technological assessment for improvement. All researchers on team complete the information below and submit to the compiler for finalizing in portfolio. When completed, all researchers should be listed as contributing, detailing technical assessment issues, as reflections by all related to the SDA. Compilers also develop reflections synthesized over time, based on inputs contributed by each, and their own review and reflections of the team’s work. This also involves use of SDA-FACR (below) to determine findings, analysis, conclusions and recommendations (FACR) ultimately merged into grand form format by compilers for inclusion in the final portfolio when compiled by team leadership. Persons not contributing substantially at SDA’s, and FACR responses should have lower scores shown in PPMTA at final grand portfolio. Compiler (s): Researcher (s): Team: Phase: Tool: Date:

General Background Information On Process, Product, Issue Or Problem Requiring Failsafing/Auditing:

Describe The Overall Plan To Be Followed In The Audit, Including Pre And Post Audit Steps, Timeframes And Follow Through:

Specific Issue/Concern? Timeframe What Is The Measurable? Who Will Do?

1. Describe Basic Steps, Procedures And Protocol To be Followed In The Daily Conduct Of The Audit:

Contact Methods Clearances/Security General Arrangements Methods Of Communication

1. Identify Who Will Be Interviewed In The Audit, Purpose Of The Interview:

Interviewee? Interviewer? Specific Focus/Questions? Time And Place? 1. What Errors Are Needed To Be Failsafed?

Specific Issue/Concern? Timeframe What Is The Measurable? Who Will Do?

1. Describe Visibility In The Work Area:

Clutter/Lost Product Issues Potential Safety Issues Housekeeping Issues General Organization

1. Identify Specific Data And Documentation To Be Reviewed, And In What Ways:

Data To Be Reviewed? How To Review Data? Documentation To Be Reviewed? How To Review Documentation?

1. Identify What Existing Resources Will Be Used In The Improvement, Based On The Failsafing/Audit Process:

Existing Equipment? Existing Space? Operator Training? Other?

1. Identify What New Resources Will Be Required In The Improvement, Based On The Failsafing/Audit Process:

New Equipment? New Space? New SOP’s, Training? Other? 1. After Completion Of The Failsafing/Audit Process, Identify All Findings Requiring Corrective Actions, And In What Ways:

Findings? Corrective Actions? Who Will Address? When To Be Completed? 1. Explain How The Five Why’s Were Applied In The Process:

Explain How Upstream And Downstream Operations May Be Impacted By The Proposed Changes, Improvements:

After Completion Of The Failsafing/Audit, Identify All Findings Requiring Corrective Actions, And In What Ways:

Findings? Corrective Actions? Who Will Address? When To Be Completed? 1. Explain How Operators, Supervisors, Quality, Engineering And Other Personnel Were Involved In The Process:

Explain How This Audit Can Be Improved:

SDA-FACR General Use/Application: o Identify the project objective being addressed (usually at least one or two would be anticipated from PPARMP). o Explain findings, analyses observed as various data and documentation in SDA (ex: how form/work relates to ISO 9000?). o Explain conclusions, recommendations based on findings, analyses in data, documentation (ex: improvements to form, other tools to try). o Address relationships in other forms, particularly PPARMP to provide direction in methodology for project. o SDA compilers use SDA FACR’s posted at discussion boards compiled as grand forms for portfolio presentation with all SDA’s. o Team leadership summarizes SDA FACR inputs to show what was learned, objective accomplishment, course outcome progress. Researcher 1. Project objective

(s) written…… 2. Findings, analyses observed as data, documentation in SDA……

3. Conclusions, recommendations in methods, course outcomes……

4. PPARMP, ROLDA, PPMTA relationships?

Researcher Contribution

Collect From All, Compile

Compiler Portfolio Reflections, Summary Of Technical Work, Assessment, Based On Individual Researcher Inputs:

Compiler Reflections To Help Improve Quality Of Work In Portfolio Based On All Individual Researcher Inputs:

General Cause And Effect Analysis Diagram (Work Area): A Central Problem, Cause, Or Major Issue, With Branches For Each Major Effect/Symptom And Sub-Branches For The Related Sub-Issues And Effects.