TRANSCRIPT
GMD Major Subcontractor Quality Performance Measurement Process
Michael L. Swenson
Quality & Mission Assurance
Boeing GMD Program
21 September 2002
What is GMD?
Our Suppliers Build:
Booster Vehicles
Kill Vehicles
Radar Systems
Missile Fields
BMC3
GMD is a System of Systems
What is GMD?
Boeing Does:
Program Management
Systems Engineering
Supplier Management
Missile Field Operations
Flight Tests
Ground Tests
GMD Geographical Deployment
• Raytheon Electronic Systems – Boston, MA
• Lockheed Martin – Denver, CO
• TRW – Huntsville, AL
• Lockheed Martin – Sunnyvale, CA
• Northrop Grumman – Sunnyvale, CA
• Boeing – Anaheim, CA* (GBI, Radars)
• Raytheon Missile Systems – Tucson, AZ
• Orbital Sciences – Chandler, AZ
• Boeing – Colorado Springs, CO
• Boeing – Huntsville, AL* (T&E, D&S, Operations)
• JPO – Huntsville, AL
• Boeing – WDC* (Program Management, Systems Engineering, BMC3, Business Operations, Quality, Supplier Management)
• MDA – WDC
• VAFB
• Bechtel – Fort Greely, AK
• Cheyenne Mountain – Colorado Springs, CO
• Boeing – Ogden, UT
• Fort Greely, AK
• Lockheed Martin – Courtland, AL
• RTS – South Pacific
• Bechtel – San Francisco, CA
Legend: Red = Boeing Sites; Green = Customer Sites; Blue = Suppliers; Black = Test Sites; * = IPT Leader Location
Background – May 2002
• Existing Supplier Quality measure in Boeing’s SPMS did not provide adequate visibility relative to Quality performance of GMD Major Subcontractors.
• DCMA was developing a new supplier performance metric with which to measure Boeing and their major GMD subcontractors.
• Through the auspices of the integrated supply chain GMD Quality Council, Boeing and DCMA agreed to develop a common Supplier Quality measure.
Development Team Statement of Work
• Develop a common Boeing/DCMA Supplier Quality metric that provides visibility and meaningful/actionable data to:
• The Boeing GMD IPT, Quality and Supplier Management
• DCMA for their MAR process
• The Supplier to facilitate continuous improvement action plans
• The metric needs to be compatible with SPMS to support the PSC process and must support the DCMA MAR process.
• The metric must be simple, easy to understand, and is to measure all aspects of Quality, not just hardware.
• Complete the development in 6 weeks or less.
Integrated Development Team
Name – Org – Function – Location
Belinda Sturz – DCMA – Leader – Anaheim
Bret Batterton – Boeing – Deputy – Anaheim
John Crissone – DCMA – Quality – Huntsville
Wayne Estes – JPO – Quality – Huntsville
Jack Fowler – Boeing – Quality – Huntsville
Victor Grasaglio – RSA – Quality – Huntsville
Dennis Hattan – DCMA – Quality – Anaheim
Bob Lance – Mevatec – Quality – Huntsville
Gary Larkin – RSA – Quality – Huntsville
Ron Newton – Boeing – SM&P – Anaheim
Philip Ofton – DCMA – Quality – Anaheim
Ricardo Pinos – Boeing – SM&P – WDC
Dan Smith – Boeing – SM&P – Huntsville
Deployment
• Boeing and DCMA agreed to evaluate the suppliers together on a monthly basis.
• Eight GMD facilities selected for initial use of tool based on complexity of procurement and locations to which letters of delegation have been issued:
TRW (Huntsville) Raytheon (Tucson)
Lockheed Martin (Denver) Northrop Grumman (Sunnyvale)
Raytheon (Andover) Orbital Sciences Corp (Chandler)
Boeing (Huntsville) Boeing (Anaheim)
• Feedback from initial launch will be used to continually improve the metric and feedback process.
Current Status
• Evaluation tool and process developed ahead of schedule.
• Tool has 10 line item elements that are grouped into 3 categories (Systems, Defects & Management) and assesses the quality of hardware, software, supplier quality, documentation, etc.
• 2 line items are still in development
• Deployed at 2 suppliers in July and 2 in August.
• Working with Boeing SM&P to input data into SPMS to support the Preferred Supplier Certification process.
• Interest in this process has been expressed by NASA and Lockheed Martin.
QUALITY SYSTEM ASSESSMENT
GMD Program (incorporates changes from the September 2002 Council). For each data source, review the records for the reporting period and select the rating which best describes the data. Note the rating in the last column. Once each element has been assessed, Boeing and DCMA assign an overall assessment rating based on the significance of the data source, the issues identified, and the degree to which corrective action has evolved.
Data Source Green Yellow Red Rating
1. Boeing/DCMA Corrective Action Requests Issued (Verbal or Written)
To assess the types of Boeing/DCMA-issued Corrective Action Requests
Customer-issued corrective action requests
Reference: PROCAS CAR level definitions for Level I, Level II, Level III, and Level IV.
No quality related deficiencies or only minor contractual noncompliance which requires no special management attention to correct. Corrective Action Requests (CAR) in this category may be either verbal or written.
Contractual noncompliances that are systemic in nature and could adversely affect cost, schedule or performance if not corrected in a timely manner. Nonconformances in this category are written and directed to the contract management level responsible for the process.
Serious contractual noncompliances which may include contractual remedies such as reductions of progress payments, cost disallowances, cure notices, show cause letters or business management system disapprovals. Nonconformances in this category are written and directed to top program management for resolution within a negotiated time frame.
Rated by: Program; n/a not permitted.
2. External Audit Findings
To assess the severity of findings detected during customer/third party audits or process surveillance.
Scheduled audits and process surveillance conducted by DCMA representatives or Boeing Procurement Quality Assurance Specialists assigned to major subcontractor facilities.
Third party ISO assessments
Quality System Surveys, Process Verification Assessments, First Article Inspections, Functional Configuration Audits, Physical Configuration Audits, Hardware Acceptance Reviews and Independent Product/Process Assessments.
There are no major findings.
MINOR: A nonconformance that is not likely to materially reduce the usability of the supplies or services for their intended purpose, or is a departure from established standards having little bearing on the effective use or operation of the supplies or services.
Findings are major or critical but a corrective action response is being developed within the specified due date.
MAJOR: A nonconformance, other than critical, that is likely to result in failure, or to materially reduce the usability of the supplies or services for their intended purpose.
Findings are major or critical and the corrective action response is unacceptable.
CRITICAL: A nonconformance that judgment and experience indicate is likely to result in hazardous or unsafe conditions for individuals using, maintaining, or depending upon the supplies or services; or is likely to prevent performance of a vital agency mission.
Rated by: Program; n/a not permitted.
3. Supplier Self Governance Program
To assess the effectiveness of the supplier’s internal audit system.
Supplier Internal Audit Process
(Compliance / Product Quality)
Self audits are being performed and issues are non-repetitive, corrective action is effective and the Supplier has an auditable preventive action system in place.
Corrective/Preventive Action – action taken to eliminate the root cause(s) and symptom(s) of an existing undesirable deviation or nonconformity to prevent recurrence and to prevent potential occurrence of similar problems.
Self audits are behind schedule or are being performed on-time yet issues are repetitive and corrective action is being taken, or the Supplier has no auditable preventive action system in place.
Self audits are not being performed or corrective action is ineffective.
Rated by: Program or System; n/a not permitted.
4. Supplier Software Self Governance Program
To assess the effectiveness of the supplier’s software Quality self governance process.
Supplier’s Software Quality Process Plan
Software audits are being performed on time and issues are non-repetitive, corrective action is effective.
Software audits are behind schedule or are being performed on-time yet issues are repetitive and corrective action is being taken.
Software self audits are not being performed or corrective action is ineffective.
Rated by: Program; n/a not permitted.
5. Supplier Hardware Non-conformances
To assess the effectiveness of the supplier’s use of their hardware nonconformance data
(Not a measure of the supplier’s current mfg performance, but how effectively the supplier uses their mfg metric data to remedy negative/adverse trends)
Subcontractor’s nonconformance metrics
Supplier uses their hardware nonconformance data to effectively remedy poor manufacturing performance.
Supplier does not use their hardware nonconformance data to effectively remedy poor manufacturing performance.
Supplier does not have a hardware nonconformance data system that objectively measures in-house performance.
Rated by: Program; n/a not permitted.
6. Test Defects
To assess the effectiveness of the supplier’s use of their test failure data.
(Failure Reporting and Corrective Action System (FRACAS) data or related Nonconformance test data)
Subcontractor’s FRACAS data
Test Nonconformance data
Supplier uses their test failure data to effectively remedy failures found on units under test.
Supplier does not use their test failure data to effectively remedy failures found on units under test.
Supplier does not have a system that collects test failures.
Rated by: Program; n/a may be used.
7. Supplier Quality Escapes
To assess the impact of supplier escapes.
( Products that are found acceptable through supplier’s quality system yet are later found defective by Boeing/ DCMA)
Boeing/DCMA Mandatory Inspection points.
Only evaluates those defects detected that are deemed supplier responsibility
No escapes.
Defects detected by Boeing/DCMA at the supplier’s facility following acceptance by the supplier.
Defects detected by Boeing/DCMA at Boeing or ship to destination following acceptance by supplier.
Rated by: Program; n/a may be used.
8. Supplier Quality Assurance Management System
To assess the effectiveness of the subcontractor’s use of their supplier quality performance data.
Subcontractor’s supplier quality performance data.
Subcontractor uses their supplier quality performance data to effectively remedy poor performing suppliers as well as ensuring continued success with good performing suppliers.
Subcontractor does not use their supplier quality performance data to effectively remedy poor performing suppliers as well as ensuring continued success with good performing suppliers.
Subcontractor does not have a system that collects supplier quality performance data.
Rated by: Program or System; n/a not permitted.
9. Supplier’s Shipping Documentation
To assess the quality of the supplier’s Shipping documentation.
Any shipping documents required by DCMA or Boeing.
No errors detected or the errors detected during customer review of delivery data documentation are minor, easily correctable, and do not impact delivery.
Errors detected during customer review of delivery data documentation impact delivery or hardware.
Errors detected during customer review of delivery data documentation impact delivery or hardware, and program schedule.
Rated by: Program; n/a may be used.
10. Management Effectiveness
To assess the health of the supplier and their quality system.
Management Responsiveness; Subcontract Management; Program Management
1. Identifies & applies Quality resources required to meet schedule.
2. Software Quality Staffing.
3. Communicates customer requirements to the organization.
4. Commitment to comply with requirements & continually improve the effectiveness of the Quality Management Program.
Two (2) or fewer of the Data sources listed were not met.
Three (3) or four (4) of the Data sources listed were not met.
Five (5) or more of the Data sources listed were not met.
Rated by: Program; n/a not permitted.
10. Management Effectiveness Continued
Assess the health of the supplier and their quality system.
5. Methods are defined for measurement of the organization’s performance in order to determine if planned objectives have been achieved.
6. Establishes a teaming environment with internal and external suppliers to the organization.
7. Labor issues are reported that affect program schedules.
8. Significant Activities (SA) that must be completed prior to entering or exiting an event.
9. Establishes and implements an effective training program.
10. Customer responsiveness/satisfaction.
Two (2) or fewer of the Data sources listed were not met.
Three (3) or four (4) of the Data sources listed were not met.
Five (5) or more of the Data sources listed were not met.
Quality Performance Measurement Elements
Assumptions
• The DCMA and Boeing Quality representatives will establish the specific methodology to accomplish the monthly rating.
• Data used to perform the monthly assessments is currently available at the supplier’s premises.
• Results of the monthly assessment will be posted in the GMD TINS system, but not the detailed Feedback Report. Both the assessment and Feedback Report will be posted in SPMS.
• Ratings are for the current month only and will not be carried over to the next month unless another event/assessment indicates similar performance.
Element # 1 - Boeing/DCMA Corrective Action Requests
Objective: To assess the types of Boeing/DCMA-issued Corrective Action Requests
• Data Sources
• Customer-issued corrective action requests
• Green Rating - No quality related deficiencies or only minor contractual noncompliance which requires no special management attention to correct. Corrective Action Requests (CAR) in this category may be either verbal or written
• Yellow Rating - Contractual noncompliances that are systemic in nature and could adversely affect cost, schedule or performance if not corrected in a timely manner. Nonconformances in this category are written and directed to the contract management level responsible for the process.
• Red Rating - Serious contractual noncompliances which may include contractual remedies such as reductions of progress payments, cost disallowances, cure notices, show cause letters or business management system disapprovals. Nonconformances in this category are written and directed to top program management for resolution within a negotiated time frame
Element #2 – External Audit Findings
Objective: To assess the severity of findings detected during customer/third party audits or process surveillance
• Data sources
• Scheduled audits and process surveillance conducted by DCMA representatives or Boeing Procurement Quality Assurance Specialists assigned to major subcontractor facilities
• Third party ISO assessments
• Quality System Surveys, Process Verification Assessments, First Article Inspections, Functional Configuration Audits, Physical Configuration Audits, Hardware Acceptance Reviews and Independent Product/Process Assessments
• Green Rating - There are no major findings (see definitions on next slide)
• Yellow Rating - Findings are major or critical but a corrective action response is being developed within the specified due date
• Red Rating - Findings are major or critical and the corrective action response is unacceptable
Element #2 – External Audit Findings
Definitions per MIL-STD-109C:
- Minor: A nonconformance that is not likely to materially reduce the usability of the supplies or services for their intended purpose, or is a departure from established standards having little bearing on the effective use or operation of the supplies or services.
- Major: A nonconformance, other than critical, that is likely to result in failure, or to materially reduce the usability of the supplies or services for their intended purpose.
- Critical: A nonconformance that judgment and experience indicate is likely to result in hazardous or unsafe conditions for individuals using, maintaining, or depending upon the supplies or services; or is likely to prevent performance of a vital agency mission.
Element #3 - Supplier Self Governance Program
Objective: To assess the effectiveness of the supplier’s internal audit system
• Data Sources • Supplier Internal Audit process
• Green Rating - Self audits are being performed and issues are non-repetitive, corrective action is effective and the Supplier has an auditable preventive action system in place
• Yellow Rating - Self audits are behind schedule or are being performed on-time yet issues are repetitive and corrective action is being taken, or the Supplier has no auditable preventive action system in place
• Red Rating - Self audits are not being performed or corrective action is ineffective
Definition: (Corrective/Preventive Action – Action taken to eliminate the root cause(s) and symptom(s) of an existing undesirable deviation or nonconformity to prevent recurrence and to prevent potential occurrence of similar problems)
Element #4 Supplier Software Self Governance Program
Objective: To assess the effectiveness of the supplier’s software Quality self governance process
Data Sources
- Supplier’s Software Quality Process Plan
• Green Rating - SW audits are being performed on time and issues are non-repetitive, corrective action is effective
• Yellow Rating - SW audits are behind schedule or are being performed on-time yet issues are repetitive and corrective action is being taken
• Red Rating - Self audits are not being performed or corrective action is ineffective
Element #5 - Supplier Hardware Nonconformances
Objective: To assess the effectiveness of the supplier’s use of their hardware nonconformance data (i.e. not a measure of the supplier’s current mfg performance, but how effectively the supplier uses their mfg metric data to remedy negative/adverse trends)
Data Sources
Subcontractor’s nonconformance metrics
• Green rating – Supplier uses their hardware nonconformance data to effectively remedy poor manufacturing performance
• Yellow rating – Supplier does not use their hardware nonconformance data to effectively remedy poor manufacturing performance
• Red rating – Supplier does not have a hardware nonconformance data system that objectively measures in-house quality performance
Element #6 - Test Defects
Objective: To assess the effectiveness of the supplier’s use of their test failure data (i.e. Failure Reporting and Corrective Action System (FRACAS) data or related Nonconformance test data)
Data Sources
• Subcontractor’s FRACAS data
• Test nonconformance data
• Green rating – Supplier uses their test failure data to effectively remedy failures found on units under test
• Yellow rating – Supplier does not use their test failure data to effectively remedy failures found on units under test
• Red rating – Supplier does not have a system that collects test failures
Element #7 - Supplier Quality Escapes
Objective: To assess the impact of supplier escapes (i.e., products that are found acceptable through the supplier’s quality system yet are later found defective by Boeing/DCMA)
Data Sources
• Boeing/DCMA Mandatory Inspection Points
• Only evaluates those defects detected that are deemed supplier responsibility
• Green rating - No escapes
• Yellow rating - Defects detected by Boeing/DCMA at supplier’s facility following acceptance by supplier
• Red rating - Defects detected by Boeing/DCMA at Boeing or ship to destination following acceptance by supplier
Element #8 - Supplier Quality Assurance Management System
Objective: To assess the effectiveness of the subcontractor’s use of their supplier quality performance data
Data Sources
• Subcontractor’s supplier quality performance data
• Green rating – Subcontractor uses their supplier quality performance data to effectively remedy poor performing suppliers as well as ensuring continued success with good performing suppliers
• Yellow rating – Subcontractor does not use their supplier quality performance data to effectively remedy poor performing suppliers as well as ensuring continued success with good performing suppliers
• Red rating – Subcontractor does not have a system that collects supplier quality performance data
Element #9 – Supplier’s Shipping Documentation
Objective: To assess the quality of the supplier’s shipping documentation
Data Sources
• Any shipping documents required by DCMA or Boeing
• Green rating – No errors detected or the errors detected during customer review of delivery data documentation are minor, easily correctable, and do not impact delivery
• Yellow rating – Errors detected during customer review of delivery data documentation impact delivery or hardware
• Red rating – Errors detected during customer review of delivery data documentation impact delivery or hardware, and program schedule
Element #10 - Management Effectiveness
Objective: To assess the health of the supplier’s quality management system
Data Source Considerations
• Identifies & applies Quality resources required to meet schedule
• Software Quality Staffing
• Communicates customer requirements to the organization
• Commitment to comply with requirements & continually improve the effectiveness of the Quality Management Program
• Methods are defined for measurement of the organization’s performance in order to determine if planned objectives have been achieved
• Establishes a teaming environment with internal and external suppliers to the organization
• Labor issues are reported that affect program schedules
• Significant Activities (SA) that must be completed prior to entering or exiting an event
• Establishes and implements an effective training program
• Customer responsiveness/satisfaction
Element #10 - Management Effectiveness
• Green Rating – Two (2) or fewer of the Data Sources listed were not met.
• Yellow Rating – Three (3) or four (4) of the Data Sources listed were not met
• Red Rating – Five (5) or more of the Data Sources listed were not met
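The Element #10 thresholds above reduce to a simple count of unmet data sources. As a minimal sketch (a hypothetical helper, not part of the actual assessment tool):

```python
def element10_rating(unmet_count: int) -> str:
    """Map the number of unmet Element #10 data sources (0-10) to a color.

    Thresholds per the slides: 0-2 unmet -> Green, 3-4 -> Yellow, 5+ -> Red.
    """
    if unmet_count <= 2:
        return "Green"
    if unmet_count <= 4:
        return "Yellow"
    return "Red"
```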
Applicability Matrix
# – Element – Always Rated? – Program/System?
1 – Corrective Action Requests – Y – P
2 – External Audits – Y – P/S
3 – Self-Governance – Y – P/S
4 – SW Self-Governance – Y – P
5 – HW Nonconformances – Y – P
6 – Test Defects – N – P
7 – Supplier Escapes – N – P
8 – Supplier Quality Management – Y – P/S
9 – Shipping Documentation – N – P
10 – Management Effectiveness – Y – P/S
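For tooling purposes the matrix could be encoded as a lookup table. The sketch below is hypothetical (the dictionary layout and function name are ours, not part of SPMS or the DCMA process):

```python
# Applicability Matrix: element number -> (name, always rated?,
# rated at Program ("P") or Program/System ("P/S") level).
APPLICABILITY = {
    1:  ("Corrective Action Requests",  True,  "P"),
    2:  ("External Audits",             True,  "P/S"),
    3:  ("Self-Governance",             True,  "P/S"),
    4:  ("SW Self-Governance",          True,  "P"),
    5:  ("HW Nonconformances",          True,  "P"),
    6:  ("Test Defects",                False, "P"),
    7:  ("Supplier Escapes",            False, "P"),
    8:  ("Supplier Quality Management", True,  "P/S"),
    9:  ("Shipping Documentation",      False, "P"),
    10: ("Management Effectiveness",    True,  "P/S"),
}

def may_be_na(element: int) -> bool:
    """Elements that are not always rated may be scored n/a for a month."""
    return not APPLICABILITY[element][1]
```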
Future Elements Still In Development
Future Element # 11 – Design Quality
Objective: To assess the quality of supplier’s design and development system
Still Under Construction – Consider as a post-launch element
Future Element # 12 – Contract Required Deliverable Data
Objective: To assess the quality of supplier’s deliverable data
Data Sources
• Any data required to be delivered to Boeing by Purchase Order Contract
Still Under Construction
Consider as post-launch element
Rating Formula
• Group I “System” – Elements 1, 2, 3 – 13 pts each
• Group II “Defects” – Elements 5, 6, 7, 9 – 10 pts each
• Group III “Management” – Elements 4, 8, 10 – 7 pts each
• Green – all points available; Yellow – ½ points available; Red – 0 points
• Rating will be developed by calculating: Points Scored ÷ Points Available
• The resulting percentage will define the monthly rating using the following methodology:
SCORE – BOEING SPMS – DCMA MAR
100% – Gold – Green
93% - 99.9% – Silver – Green
85% - 92.9% – Bronze – Yellow
78% - 84.9% – Yellow – Yellow
77.9% or less – Red – Red
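The scoring arithmetic above can be sketched as a short function. This is a hypothetical illustration: the element weights and rating bands come from the slides, while the handling of n/a elements (excluded from both points scored and points available) and all names are our assumptions.

```python
# Element weights per group: I "System" (13 pts), II "Defects" (10 pts),
# III "Management" (7 pts). All ten elements together total 100 points.
POINTS = {1: 13, 2: 13, 3: 13,          # Group I  "System"
          5: 10, 6: 10, 7: 10, 9: 10,   # Group II "Defects"
          4: 7, 8: 7, 10: 7}            # Group III "Management"
CREDIT = {"green": 1.0, "yellow": 0.5, "red": 0.0}

def monthly_rating(colors: dict) -> tuple:
    """colors maps element number -> 'green' | 'yellow' | 'red' | 'n/a'.

    Returns (percentage, Boeing SPMS rating, DCMA MAR rating).
    """
    rated = {e: c for e, c in colors.items() if c != "n/a"}
    scored = sum(POINTS[e] * CREDIT[c] for e, c in rated.items())
    available = sum(POINTS[e] for e in rated)
    pct = 100.0 * scored / available
    if pct == 100.0:
        return pct, "Gold", "Green"
    if pct >= 93.0:
        return pct, "Silver", "Green"
    if pct >= 85.0:
        return pct, "Bronze", "Yellow"
    if pct >= 78.0:
        return pct, "Yellow", "Yellow"
    return pct, "Red", "Red"
```

For example, a single Red on Element 2 with every other element Green yields 87 of 100 points, i.e. Bronze in Boeing SPMS and Yellow in the DCMA MAR, matching the rationale on the next slide.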
Rating Formula - Rationale
SCORE RATING RATIONALE
100% Gold Consistent with World Class performance
93% - 99.9% Silver Allows a Yellow in Group 1 or 2 or a Red in Group 3
85% - 92.9% Bronze Allows a Red in Group 1 or 2 or 2 Yellows in Group 3
78% - 84.9% Yellow Would not allow a Red in Group 1 or 2
77.9% or less Red Default
Rating Process Deployment Schedule
• July ’02
• Raytheon (Tucson) – Completed
• Orbital Sciences (Chandler) – Completed
• August ’02
• TRW (Huntsville) – Completed
• Raytheon (Andover) – Completed
• September ’02
• Propose Rating Tool to Boeing Supplier Management Council – Completed
• Re-evaluate rating process with GMD Quality Council – Completed
• Boeing (Huntsville) – Completed
• Boeing (Anaheim) – Completed
• Brief NASA Quality Team – Completed
• Next
• Northrop Grumman (Sunnyvale)
• Lockheed (Denver)