


NMC STANDARDIZATION COMMITTEE

FEBRUARY 2012 CONFIRMED MINUTES

FEBRUARY 21, 2012

SHERATON – SAN DIEGO HOTEL & MARINA

SAN DIEGO, CA

These minutes are not final until confirmed by the NMC Standardization Committee in writing or by vote at a subsequent meeting. Information herein does not constitute a communication or recommendation from the NMC Standardization Sub-Team and shall not be considered as such by any agency.

TUESDAY, FEBRUARY 21, 2012—CLOSED MEETING

1.0 OPENING COMMENTS

1.1 Call to Order / Quorum Check

The NMC Standardization Sub-Team was called to order at 8:00 a.m., 21-FEB-12.

The meeting was restricted to Sub-Team members as well as NMC members and invited guests.

A quorum was established with the following representatives in attendance:

Subscriber Members/Participants Present (* Indicates Voting Member)

  Mike Baumann, Boeing
* Pascal Blondet, Airbus
* Richard Blyth, Rolls-Royce
* Robert Bodemuller, Ball Aerospace & Technologies
  Katie Bradley, Lockheed Martin Corporation
  Michael Brandt, Alcoa Inc
* Christian Buck, SAFRAN Group
* Robert Cashman, Parker Aerospace
  Deneige Fitzpatrick, Bombardier
* John Haddock, BAE Systems Air & Information
  Stephen Hunt, Rolls-Royce
  Robert Koukol, Honeywell
* Eric Le Fort, SONACA
* Daniel Lecuru, Eurocopter
  Jeff Lott, Boeing
* Frank Mariot, Triumph Group
* Steve McGinn, Honeywell
* Robin McGuckin, Bombardier
  Gary Merrill, M7 Aerospace
* Heather Meyer, Cessna Aircraft Company (Chairperson)
  Tom Nakamichi, Boeing
  Luis Gustavo Pacheco, Embraer S.A.
  Leandro Eduardo Pereira, Embraer S.A.
* Rick Peterson, Rockwell Collins
* Mark Rechtsteiner, GE Aviation
* Brad Richwine, Raytheon
* Davide Salerno, Alenia
  Laurie Strom, Honeywell Aerospace
  D. Scott Sullivan, Honeywell Aerospace
* David Thornhill, General Dynamics
  Michael Walker, Spirit Aero
* Kevin Ward, Goodrich Corporation
* George Winchester, Northrop Grumman

Other Members/Participants Present (* Indicates Voting Member)

* Tim Crowe, Dearborn Precision
* Robert Custer, AAA Plating & Inspection
* Dave Michaud, Fountain Plating Company
* Mike Schleckman, Voss Industries Inc

PRI Staff Present

Mark Aubele
John Barrett
Jim Borczyk
Mike Graham
Mike Gutridge
Scott Klavon
Jim Lewis
Justin McCabe
Melanie Petrucci
Stan Revers
Jon Steffey

1.2 Review of Nadcap Code of Conduct and Meeting Conduct

The meeting attendees were reminded of the Code of Conduct and Conflict of Interest documented on page 7 of the Attendees Guide, which should guide all participants in the meeting.

Code of Ethics and Conflict of Interest.ppt

1.3 Approval of Previous Meeting Minutes

Motion made to approve the minutes from 17-Oct-11 as written. Motion was seconded and passed unanimously.

NMCStandardizationCommitteeOct2011minutesHLM.doc

2.0 RAIL REVIEW

Current status of RAIL was reviewed.

NMC Standardization RAIL February 2012.xls

3.0 UPDATE: NADCAP AUDITOR CONSISTENCY / EFFECTIVENESS

Richard Blyth reported out on the activities and status of the Auditor Consistency/Effectiveness sub-team.

Nadcap Auditor Consistency Presentation February 2012.ppt

Key feedback:

· May be beneficial for some actions to be given to other sub-teams to complete.

4.0 SMART CHECKLIST STATUS

Laurie Strom reported out on the activities and status of the Smart Checklist sub-team.

Smart Checklists - Feb 2012.ppt

Concerns:

· If checklists were pre-collapsed prior to the audit, what would happen if something changed and the auditor needed sections that had been collapsed?

ACTION ITEM: Jon Steffey to distribute the “Smart Checklist” presentation and survey results to Task Group Chairs and Staff Engineers. Goal: feedback to TGs regarding the effectiveness of each TG in grouping questions and using NAs, and determination of whether any actions are required on their part. (Questions can be addressed to Laurie Strom and Jon Steffey.) In June 2012, feedback is requested from the CP and HT Task Groups. (Reference “Next Steps” on slide 2 of the 21-Feb-12 presentation.)

5.0 UPDATE: FAILURE-RISK MITIGATION

Kevin Ward reported out on activities and status of Failure-Risk Mitigation sub-team.

NOP-011 draft changes Jan 24 2012.doc

Failure Risk Mitigation Final version.ppt

Concerns:

· Task Group Chairs and SEs need to be consulted on proposed changes.

Team Members include: Kevin Ward (Lead), Jim Diamond, Frank Mariot, Pascal Blondet, Bob Cashman, Richard Blyth, Martha Hogan-Battisti, Robin McGuckin, Astrid Colon-Tirado, Michael Graham

ACTION ITEM: Mike Graham to distribute the pre-ballot draft NOP-011 and presentation to NMC and TG Chairs and SEs by 23-Mar-2012, and to request feedback by 20-Apr-2012. (Work with Kevin Ward on verbiage before distributing.)

ACTION ITEM: Mike Graham, Kevin Ward, and Pascal Blondet to put together a communication plan including guidelines/expectations. (Due date: 20-Jun-2012)

6.0 PROPOSED NEW PROJECT – TIME CYCLE DEFINITION

Scott Sullivan of the NDT Task Group gave a presentation on Time Cycle Definitions. NDT has requested that the NMC create a sub-team to standardize the definitions. If adopted by the NMC, the sub-team is to identify the responsible “Standards Body” that would need to be approached to incorporate the requirements. (Nadcap should not establish its own requirements, to avoid conflict with the “Vision”.)

NMC Time Cycle Definition Charter.ppt

Time Cycle Definition Scoring Template.xls

NMC Standardization - Time Definitions 7feb12.pptx

ACTION ITEM: NDT-sponsored cross-commodity initiative. Project approved. NMC Sponsor: Henry Sikorski. NDT (Mike Gutridge) needs to “poll” other commodities and Primes (including ISO/AS) to identify standards that define “yearly”, “annually”, “quarterly”, “monthly”, etc. Bring results to the June NMC Standardization Meeting for review. (Will request volunteers at Planning and Ops.)

7.0 NEW BUSINESS

7.1 Update Committee Member Roster

The NMC Standardization Committee member roster was updated.

NMC Standardization Committee Membership.doc

ACTION ITEM: M. Graham to verify membership with Henry Johansson, Mark Cathey and Serge Labbe.

8.0 ADJOURN

ADJOURNMENT – 21-FEB-12 – Meeting was adjourned at 9:40 a.m.

Minutes Prepared by: Melanie Petrucci – [email protected]

***** For PRI Staff use only: ******

Are procedural/form changes required based on changes/actions approved during this meeting? (select one)

YES* [ ]    NO [ ]

*If yes, the following information is required:

Documents requiring revision: NOP-011
Who is responsible: M. Graham
Due date: March 23, 2012


Failure/Risk Mitigation Sub-team

Team Final Recommendations

Feb 2012

Failed Audits

Failed audits are currently only about 2% of all audits (good news)

Failed audits probably represent the highest risk of product escapes (bad news)

Failed audits have the least amount of documentation within eAuditNet (bad news)

RCCA activity is not documented (bad news)

RCCA is reviewed only by the auditor during the new audit

As-is situation:

- When an audit is failed (initial or reaccreditation), the audit is closed in eAuditNet.

- Therefore the supplier cannot, and is not requested to, provide data on actions conducted for closing findings.

- A 90-day period is required before the supplier can apply for a new audit (initial).

- At the subsequent audit (initial), the auditor has to verify that findings from the failed audit are closed (verification confirmed by ticking a box): a single-person review that is not documented.

(For all other audits, RCCA is a process involving the Staff Engineer and the Task Group, well documented, with multiple people reviewing the activity: a very robust approach.)

Concerns:

- The huge amount of work conducted by the supplier to close the findings is neither visible nor documented, nor is it validated by the normal Nadcap RCCA process.

- Prime customers have no visibility of actions conducted by the supplier, generating redundant requests from the Primes and disruption for both the Primes and the supplier.

- Why wait 90 days before the supplier can apply for a new audit (initial) if the supplier is ready earlier? Allowing an earlier audit reduces the risk period for the Primes.

- At the subsequent audit (initial), how is the auditor to verify that findings from the failed audit are closed appropriately?

- What objective evidence does the auditor have to support this verification? It is not posted in the system.

- What confidence can Prime customers have in this verification without objective evidence?

Improvements:

In order to:

- give the supplier the opportunity to document their action plans for closing findings,

- give Prime customers visibility of the supplier’s action plan,

- give the auditor visibility of the supplier’s action plan prior to the next audit,

- provide evidence to both the auditor and Prime customers of finding closure by using the normal RCCA process,

it is proposed that:

- suppliers are requested to respond to findings for failed audits,

- supplier responses to findings are reviewed by Staff Engineers, limited to 3 cycles within 90 days max. (initial) or 50 days max. (reaccreditation),

- when findings are not closed within the time limit, an Alert is sent out to the Primes,

- the supplier may apply for a new audit (initial) as soon as RCCA is complete.

Prime Support

When risk mitigation is activated per NOP-011, the identified Primes of the failed supplier are expected to be engaged in this activity.

There is no expectation of general Task Group involvement.

Risk activity does affect Program metrics; it is not counted as normal activity.

Failed audit flow:

1. Failed audit: supplier to respond to findings in eAuditNet within 3 cycles / 90 days (initial) or 3 cycles / 50 days (reaccreditation). Advisory type F.
2. Decision: is the number of open findings below the failure criteria?
   - Yes: supplier may apply for a new audit (initial). End.
   - No: Alert sent to Prime Subscribers.
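The decision logic in the flow above can be sketched in a few lines. This is an illustrative sketch only: the function name, argument names, and return strings are invented here, while the 3-cycle limit, the 90/50-day limits, and the failure-criteria comparison come from the slide.

```python
def failed_audit_outcome(audit_type, cycles_used, days_elapsed,
                         open_findings, failure_criteria):
    """Next step for a failed audit under the proposed C/A process.

    audit_type is "initial" or "reaccreditation"; all names here are
    illustrative, not from NOP-011 itself.
    """
    max_cycles = 3                                      # both audit types
    max_days = 90 if audit_type == "initial" else 50    # per the slide
    within_limits = cycles_used <= max_cycles and days_elapsed <= max_days
    if open_findings < failure_criteria and within_limits:
        # Open findings are below the failure threshold: the supplier
        # may apply for a new (initial) audit.
        return "supplier may apply for new audit"
    if not within_limits:
        # Findings still open past the cycle/time limit: an Alert is
        # sent to the Prime Subscribers.
        return "alert primes"
    # Otherwise the supplier keeps responding within the allowed cycles.
    return "continue corrective action"
```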

Recommendations by Mode

Mode A: Supplier Stops Audit

Team recommends the revised process does not apply. (Do not continue review of C/As after failure.)

Mode B: Excessive Number of NCRs (Non-conformance Reports)

Team recommends the revised C/A process applies. The supplier will be given an additional 3 rounds of response to close the findings, within 90 days for an initial audit or 50 days for a reaccreditation audit from the date of failure.

Mode C: Severity

Team recommends the revised C/A process applies. The supplier will be given an additional 3 rounds of response to close the findings, within 90 days for an initial audit or 50 days for a reaccreditation audit from the date of failure.

Recommendations by Mode

Mode D: Excessive Cycles

Team recommends the revised C/A process applies. The supplier will be given an additional 3 rounds of response to close the findings, within 90 days for an initial audit or 50 days for a reaccreditation audit from the date of failure. The TG Sub-team, however, could stop earlier than 3 cycles if the supplier is “non-persuasive”. If the findings are not closed after 3 cycles and the time limit, processing halts and the Primes are notified. The supplier is still able to provide information to the system after this point, but it does not require SE review.

Mode E: Non-responsiveness

Team recommends the revised C/A process applies. The supplier will be given an additional 3 rounds of response to close the findings, within 90 days for an initial audit or 50 days for a reaccreditation audit from the date of failure. The TG Sub-team, however, could stop earlier than 3 cycles if the supplier is “non-persuasive”. If the findings are not closed after 3 cycles and the time limit, processing halts and the Primes are notified. The supplier is still able to provide information to the system after this point, but it does not require SE review.

Benefits:

- Supplier actions are documented; the normal RCCA process is used.

- Prime customers have visibility of actions conducted; Prime TG members are involved in the additional cycles.

- Less disruption for the supplier and Prime customers.

- The supplier may apply for a new audit as soon as they no longer exceed the failure criteria.

- The next auditor has visibility of actions conducted.

- Objective evidence is provided for both the auditor and Prime customers.

- An Alert is sent to the Primes for suppliers that still exceed the failure criteria after 90 days (initial) or 50 days (reaccreditation).

Task Group Process

Note: The purpose of the risk mitigation is not to allow for a lesser failure activity. It is important for TGs to consistently enforce failure criteria to prevent negatively impacting cycle time and “on time” accreditation metrics. The Nadcap Management Council shall periodically monitor failure metrics to ensure that failure criteria continue to be consistently enforced.

Actions

Sub-team requests ballot

Request NMC concurrence

Revise NOP 011, complete for ballot

Update supplier support committee

Implement and monitor activity

Questions?



Smart Checklists

Status Update for Nadcap Meeting

NMC Standardization Committee

(Former "Checklist on Demand" Sub-team)

Feb 2012

Smart Checklist – Feb 2012 Summary

Next steps:

Request pilot groups (CP & HT?) to discuss the pros and cons of ‘pre-collapsed’ checklists based on supplier scope, in both electronic and hard-copy format, and determine whether to proceed.

Discuss what changes, if any, would be necessary for the checklist to be used in paper form ‘pre-collapsed’.

Obtain ROM cost estimate for software changes needed to ‘pre-collapse’ and print full & collapsed versions of the checklist.

Obtain ROM cost estimate for software changes needed to enhance search capability, enabling the auditor to transfer data from paper to electronic form.

Weld Task Group to discuss and provide a timeline and proposed approach for tagging questions to enable ‘pre-collapsed’ checklists based on ‘supplier maturity’.

Created list of ‘Drivers’ to capture benefits.

Increased understanding of checklist generation and the complexities of implementation.

Talked to SEs/TGs and completed a survey to clarify the approach.

Identified a phased approach to minimize disruptions.

Start with a tailored checklist based on supplier scope, an enabler for the future options of Deep Dive and Risk.

Work with the Welding Task Group to determine the feasibility of a pilot for tagged questions.

Concerns:

Use of paper copy audits may limit opportunity for ‘dynamically composed’ checklists.

May be necessary to provide two versions of hard copy checklist – full and ‘pre-collapsed’

Subteam Members:

Laurie Strom (Project Champion) - Honeywell Aerospace

Martha Hogan-Battisti - The Boeing Company

Bob Koukol - Honeywell Aerospace

Doug Matson - The Boeing Company

Heather Meyer - Cessna Aircraft Company

David Thornhill - General Dynamics Corp

Michael Brandt – Alcoa

Bryan Cupples – Bell Helicopter

Lucille Snedaker – General Dynamics Corp

Jeff Conrad – Cessna Aircraft Company

Louise Stefanakis – PRI

Jon Steffey – PRI

Subteam Meeting Dates: Oct SE/TG review; Nov–Dec survey; Jan 20; Weld subteam Feb 1

Survey response rate: 161/922 = 17%

Backup Data

Survey results

Survey data may be useful for general review by task groups.


Constant Contact Survey Results

Survey Name: Nadcap Smart Checklists

Response Status: Partial & Completed

Filter: None

1/9/2012 9:20 AM EST


This survey is to gather input for improvement ideas for Nadcap checklists. This may include software upgrades or improved communications and training on existing functions. Five items are currently being considered:

-- the use of NA to collapse sections of checklists
-- the controlled selection of certain questions for certain audits
-- the use of key words or tagging
-- the use of free text searching
-- and the use of hyperlinks between questions.

The first two would result in some questions not being asked for some audits, but under rules and procedures developed by the task group; the last 3 would be features intended to help the auditor find or group questions in order to fill out the checklist more easily. The purpose of the survey is to understand if these assumptions are correct, and/or if these features would be beneficial.

What is your role in the Nadcap program?

Subscriber: 52 (32.2%)
Supplier: 47 (29.1%)
Auditor: 53 (32.9%)
Other: 8 (4.9%)
No Response(s): 1 (<1%)
Totals: 161 (100%)

What Task Group(s) do you participate in?

AQS: 21 (13.0%)
CMSP: 7 (4.3%)
COMP: 11 (6.8%)
CP: 40 (24.8%)
CT: 11 (6.8%)
ETG: 9 (5.5%)
FLU: 4 (2.4%)
HT: 30 (18.6%)
MTL: 17 (10.5%)
NDT: 36 (22.3%)
NMMM: 4 (2.4%)
NMMT: 5 (3.1%)
NMSE: 18 (11.1%)
SEAL: 4 (2.4%)
SLT: 3 (1.8%)
WLD: 16 (9.9%)
None: 2 (1.2%)
Totals: 161 (100%)

Does your Task Group currently group related questions in the checklists?

Yes: 133 (82.6%)
No: 14 (8.6%)
Not sure: 14 (8.6%)
No Response(s): 0 (0.0%)
Totals: 161 (100%)

Are the question groupings in your Task Group checklists related to (you may select more than 1):

Identified scopes suppliers select for accreditation?: 69 (43.1%)
Prime or Customer Specific requirements?: 58 (36.2%)
Quality System versus process specific questions?: 76 (47.5%)
MOU (Memorandums of Understanding) with other task groups?: 12 (7.5%)
Baseline or Procedural versus Job Audit: 81 (50.6%)
Process steps (e.g. grit blasting, cleaning, material preparation): 113 (70.6%)
Don't know: 9 (5.6%)
Don't group: 8 (5.0%)
Other: 3 (1.8%)
Totals: 160 (100%)

Are the same questions asked or used more than once (e.g. Baseline and Job audit, or in different sections of the checklist)?

Yes: 107 (66.4%)
No: 37 (22.9%)
Not Sure: 16 (9.9%)
Other: 1 (<1%)
No Response(s): 0 (0.0%)
Totals: 161 (100%)

Does your Task Group currently use "NA" (Not Applicable) or "Section NA" to mark groups of questions in the checklist? (Note: if the response is yes for groupings in some parts of the checklists and no for other groupings, please mark yes and add clarifying comments in the comment box.)

Yes: 142 (88.1%)
No: 14 (8.6%)
Not sure: 4 (2.4%)
No Response(s): 1 (<1%)
Totals: 161 (100%)

Under what conditions would it be acceptable to exclude certain questions from the checklist? (You may select more than 1.)

Based on scope or process that supplier does not perform: 149 (93.7%)
Based on merit status: 36 (22.6%)
Based on 'sampling plan' (e.g. 3 out of 5 questions from this section, 2 out of 4 questions from this section, etc): 23 (14.4%)
Based on 'rotation plan' (e.g. this question asked every 3rd audit, or if this supplier does work for 3 primes, rotation will ensure all 3 primes' questions are asked within a 3-year rotation): 33 (20.7%)
Based on initial versus reaccreditation: 48 (30.1%)
Based on risk: 40 (25.1%)
Other: 12 (7.5%)
Totals: 159 (100%)

What, if any, of these factors/considerations might be used to determine selection of questions? (You may select more than 1.)

Number of job audit versus baseline or procedural based on merit status: 49 (31.8%)
Number of job audit versus baseline or procedural based on number or severity of prior NCR: 47 (30.5%)
Results of statistical analysis (e.g. number of NCRs for given question, consistency of question being missed, variability from auditor to auditor, etc.): 40 (25.9%)
'Core' versus 'Non-core': 38 (24.6%)
The response to a particular question earlier in the checklist - scope (e.g. is a supplier performing x): 87 (56.4%)
Response to a particular question earlier in the checklist - risk (e.g. is a supplier performing x CORRECTLY): 69 (44.8%)
Other: 13 (8.4%)
Totals: 154 (100%)

How effective do you feel your Task Group is:

1 = Not effective, 2 = Could be better, 3 = Not sure/undecided, 4 = Somewhat effective, 5 = Very effective

In grouping related questions to make the checklist easier to use?: 158 responses, Rating Score* 3.8
In grouping related questions to collapse and make checklist shorter when Not Applicable?: 158 responses, Rating Score* 3.5
In identifying relationships between questions (e.g. job audit to baseline)?: 158 responses, Rating Score* 3.4
In organizing the checklist in a logical fashion?: 157 responses, Rating Score* 3.9
In providing a format which is easy to search and find applicable questions?: 158 responses, Rating Score* 3.5

*The Rating Score is the weighted average calculated by dividing the sum of all weighted ratings by the number of total responses.

Which of the following search features would be most useful to the auditors for your Task Group in terms of making the checklist easier to use?

1 = Not Useful, 2 = Not likely to be useful, 3 = Not sure/Undecided, 4 = Somewhat Useful, 5 = Very Useful

Use of NA (Not Applicable) to collapse sections of checklists?: 157 responses, Rating Score* 4.6
Free Text Searching (e.g. find word that user types in anywhere in checklist)?: 151 responses, Rating Score* 4.1
Key Word Searching or Tagging (e.g. Export Controlled, Training)?: 152 responses, Rating Score* 4.0
Hyperlinks (e.g. from job audit to baseline questions)?: 153 responses, Rating Score* 3.9

*The Rating Score is the weighted average calculated by dividing the sum of all weighted ratings by the number of total responses.

Your contact information (OPTIONAL):

First Name: 91
Last Name: 90
Company Name: 81
Email Address: 90

Nadcap Smart Checklists

Constant Contact Survey Results

Survey Name: Nadcap Smart Checklists

Response Status: Partial & Completed

Filter 1: Auditor to: What is your role in the Nadcap program?

1/19/2012 3:28 PM EST

1. What is your role in the Nadcap program?

Subscriber: 0 (0.0%)
Supplier: 0 (0.0%)
Auditor: 53 (100.0%)
Other: 0 (0.0%)
No Responses: 0 (0.0%)
Total: 53 (100%)

2. What Task Group(s) do you participate in?

AQS: 11 (20.7%)
CMSP: 4 (7.5%)
COMP: 2 (3.7%)
CP: 16 (30.1%)
CT: 3 (5.6%)
ETG: 1 (1.8%)
FLU: 1 (1.8%)
HT: 7 (13.2%)
MTL: 4 (7.5%)
NDT: 13 (24.5%)
NMMM: 0 (0.0%)
NMMT: 1 (1.8%)
NMSE: 7 (13.2%)
SEAL: 0 (0.0%)
SLT: 1 (1.8%)
WLD: 5 (9.4%)
None: 1 (1.8%)
Total: 53 (100%)

3. Does your Task Group currently group related questions in the checklists?

Yes: 42 (79.2%)
No: 4 (7.5%)
Not sure: 7 (13.2%)
No Responses: 0 (0.0%)
Total: 53 (100%)

12 Comment(s)

4. Are the question groupings in your Task Group checklists related to (you may select more than 1):

Identified scopes suppliers select for accreditation?: 15 (28.3%)
Prime or Customer Specific requirements?: 18 (33.9%)
Quality System versus process specific questions?: 21 (39.6%)
MOU (Memorandums of Understanding) with other task groups?: 1 (1.8%)
Baseline or Procedural versus Job Audit: 26 (49.0%)
Process steps (e.g. grit blasting, cleaning, material preparation): 39 (73.5%)
Don't know: 7 (13.2%)
Don't group: 1 (1.8%)
Other: 1 (1.8%)
Total: 53 (100%)

3 Comment(s)

5. Are the same questions asked or used more than once (e.g. Baseline and Job audit, or in different sections of the checklist)?

Yes: 37 (69.8%)
No: 10 (18.8%)
Not Sure: 6 (11.3%)
Other: 0 (0.0%)
No Responses: 0 (0.0%)
Total: 53 (100%)

12 Comment(s)

6. Does your Task Group currently use "NA" (Not Applicable) or "Section NA" to mark groups of questions in the checklist? (Note: if the response is yes for groupings in some parts of the checklists and no for other groupings, please mark yes and add clarifying comments in the comment box.)

Yes: 49 (92.4%)
No: 2 (3.7%)
Not sure: 1 (1.8%)
No Responses: 1 (1.8%)
Total: 53 (100%)

18 Comment(s)

7. Under what conditions would it be acceptable to exclude certain questions from the checklist?  (You may select more than 1.)

Based on scope or process that supplier does not perform: 46 (88.4%)
Based on merit status: 8 (15.3%)
Based on 'sampling plan' (e.g. 3 out of 5 questions from this section, 2 out of 4 questions from this section, etc): 2 (3.8%)
Based on 'rotation plan' (e.g. this question asked every 3rd audit, or if this supplier does work for 3 primes, rotation will ensure all 3 primes' questions are asked within a 3-year rotation): 4 (7.6%)
Based on initial versus reaccreditation: 12 (23.0%)
Based on risk: 7 (13.4%)
Other: 5 (9.6%)
Total: 52 (100%)

11 Comment(s)

8. What, if any, of these factors/considerations might be used to determine selection of questions?  (You may select more than 1.)

Number of job audit versus baseline or procedural based on merit status: 11 (22.0%)
Number of job audit versus baseline or procedural based on number or severity of prior NCR: 8 (16.0%)
Results of statistical analysis (e.g. number of NCRs for given question, consistency of question being missed, variability from auditor to auditor, etc.): 11 (22.0%)
'Core' versus 'Non-core': 8 (16.0%)
The response to a particular question earlier in the checklist - scope (e.g. is a supplier performing x): 31 (62.0%)
Response to a particular question earlier in the checklist - risk (e.g. is a supplier performing x CORRECTLY): 21 (42.0%)
Other: 3 (6.0%)
Total: 50 (100%)

11 Comment(s)

9. How effective do you feel your Task Group is:

1 = Not effective, 2 = Could be better, 3 = Not sure/undecided, 4 = Somewhat effective, 5 = Very effective

(Each entry shows rating: count (percent of respondents).)

In grouping related questions to make the checklist easier to use? 1: 2 (4%), 2: 12 (23%), 3: 4 (8%), 4: 21 (40%), 5: 13 (25%)
In grouping related questions to collapse and make checklist shorter when Not Applicable? 1: 4 (8%), 2: 18 (35%), 3: 3 (6%), 4: 12 (23%), 5: 15 (29%)
In identifying relationships between questions (e.g. job audit to baseline)? 1: 2 (4%), 2: 11 (21%), 3: 13 (25%), 4: 16 (31%), 5: 10 (19%)
In organizing the checklist in a logical fashion? 1: 1 (2%), 2: 13 (25%), 3: 3 (6%), 4: 18 (35%), 5: 17 (33%)
In providing a format which is easy to search and find applicable questions? 1: 2 (4%), 2: 11 (21%), 3: 10 (19%), 4: 19 (37%), 5: 10 (19%)

10 Comment(s)
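The Rating Score footnote earlier in these results defines the score as the sum of all weighted ratings divided by the total number of responses. As a quick check, the sketch below recomputes it from the auditor-filtered counts for the first effectiveness question (ratings 1–5: 2, 12, 4, 21, 13); the helper name is invented for illustration.

```python
def rating_score(counts):
    # counts[i] = number of respondents who chose rating i+1 (ratings 1-5)
    total = sum(counts)
    weighted = sum(rating * n for rating, n in enumerate(counts, start=1))
    return round(weighted / total, 1)

print(rating_score([2, 12, 4, 21, 13]))  # 3.6
```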

10. Which of the following search features would be most useful to the auditors for your Task Group in terms of making the checklist easier to use?

1 = Not Useful, 2 = Not likely to be useful, 3 = Not sure/Undecided, 4 = Somewhat Useful, 5 = Very Useful

(Each entry shows rating: count (percent of respondents).)

Use of NA (Not Applicable) to collapse sections of checklists? 1: 0 (0%), 2: 0 (0%), 3: 0 (0%), 4: 4 (8%), 5: 48 (92%)
Free Text Searching (e.g. find word that user types in anywhere in checklist)? 1: 2 (4%), 2: 3 (6%), 3: 5 (11%), 4: 15 (32%), 5: 22 (47%)
Key Word Searching or Tagging (e.g. Export Controlled, Training)? 1: 4 (9%), 2: 2 (4%), 3: 8 (17%), 4: 19 (40%), 5: 14 (30%)
Hyperlinks (e.g. from job audit to baseline questions)? 1: 4 (8%), 2: 2 (4%), 3: 14 (29%), 4: 12 (25%), 5: 16 (33%)

9 Comment(s)

11. Your contact information (OPTIONAL):

First Name: 35
Last Name: 35
Company Name: 27
Email Address: 35


Constant Contact Survey Results

Survey Name: Nadcap Smart Checklists

Response Status: Partial & Completed

Filter 1: CP to: What Task Group(s) do you participate in?

1/19/2012 3:24 PM EST

1. What is your role in the Nadcap program?

Subscriber: 7 (17.5%)
Supplier: 14 (35.0%)
Auditor: 16 (40.0%)
Other: 3 (7.5%)
No Responses: 0 (0.0%)
Total: 40 (100%)

2. What Task Group(s) do you participate in?

AQS: 6 (15.0%)
CMSP: 2 (5.0%)
COMP: 2 (5.0%)
CP: 40 (100.0%)
CT: 1 (2.5%)
ETG: 0 (0.0%)
FLU: 0 (0.0%)
HT: 5 (12.5%)
MTL: 1 (2.5%)
NDT: 13 (32.5%)
NMMM: 0 (0.0%)
NMMT: 1 (2.5%)
NMSE: 4 (10.0%)
SEAL: 0 (0.0%)
SLT: 0 (0.0%)
WLD: 1 (2.5%)
None: 0 (0.0%)
Total: 40 (100%)

3. Does your Task Group currently group related questions in the checklists?

Yes: 34 (85.0%)
No: 2 (5.0%)
Not sure: 4 (10.0%)
No Responses: 0 (0.0%)
Total: 40 (100%)

8 Comment(s)

4. Are the question groupings in your Task Group checklists related to (you may select more than 1):

Identified scopes suppliers select for accreditation?: 15 (37.5%)
Prime or Customer Specific requirements?: 9 (22.5%)
Quality System versus process specific questions?: 20 (50.0%)
MOU (Memorandums of Understanding) with other task groups?: 3 (7.5%)
Baseline or Procedural versus Job Audit: 21 (52.5%)
Process steps (e.g. grit blasting, cleaning, material preparation): 33 (82.5%)
Don't know: 4 (10.0%)
Don't group: 0 (0.0%)
Other: 0 (0.0%)
Total: 40 (100%)

2 Comment(s)

5. Are the same questions asked or used more than once (e.g. Baseline and Job audit, or in different sections of the checklist)?

Yes: 34 (85.0%)
No: 4 (10.0%)
Not Sure: 2 (5.0%)
Other: 0 (0.0%)
No Responses: 0 (0.0%)
Total: 40 (100%)

9 Comment(s)

6. Does your Task Group currently use "NA" (Not Applicable) or "Section NA" to mark groups of questions in the checklist? (Note: if the response is yes for groupings in some parts of the checklists and no for other groupings, please mark yes and add clarifying comments in the comment box.)

Yes: 38 (95.0%)
No: 2 (5.0%)
Not sure: 0 (0.0%)
No Responses: 0 (0.0%)
Total: 40 (100%)

11 Comment(s)

7. Under what conditions would it be acceptable to exclude certain questions from the checklist?  (You may select more than 1.)

Based on scope or process that supplier does not perform: 37 (94.8%)
Based on merit status: 16 (41.0%)
Based on 'sampling plan' (e.g. 3 out of 5 questions from this section, 2 out of 4 questions from this section, etc): 5 (12.8%)
Based on 'rotation plan' (e.g. this question asked every 3rd audit, or if this supplier does work for 3 primes, rotation will ensure all 3 primes' questions are asked within a 3-year rotation): 11 (28.2%)
Based on initial versus reaccreditation: 18 (46.1%)
Based on risk: 10 (25.6%)
Other: 5 (12.8%)
Total: 39 (100%)

12 Comment(s)

8. What, if any, of these factors/considerations might be used to determine selection of questions?  (You may select more than 1.)

Number of job audit versus baseline or procedural based on merit status: 19 (47.5%)
Number of job audit versus baseline or procedural based on number or severity of prior NCR: 12 (30.0%)
Results of statistical analysis (e.g. number of NCRs for given question, consistency of question being missed, variability from auditor to auditor, etc.): 12 (30.0%)
'Core' versus 'Non-core': 15 (37.5%)
The response to a particular question earlier in the checklist - scope (e.g. is a supplier performing x): 26 (65.0%)
Response to a particular question earlier in the checklist - risk (e.g. is a supplier performing x CORRECTLY): 21 (52.5%)
Other: 4 (10.0%)
Total: 40 (100%)

6 Comment(s)

9. How effective do you feel your Task Group is:

1 = Not effective, 2 = Could be better, 3 = Not sure/undecided, 4 = Somewhat effective, 5 = Very effective

(Each entry shows rating: count (percent of respondents).)

In grouping related questions to make the checklist easier to use? 1: 3 (8%), 2: 4 (10%), 3: 4 (10%), 4: 18 (46%), 5: 10 (26%)
In grouping related questions to collapse and make checklist shorter when Not Applicable? 1: 4 (10%), 2: 9 (23%), 3: 6 (15%), 4: 12 (31%), 5: 8 (21%)
In identifying relationships between questions (e.g. job audit to baseline)? 1: 3 (8%), 2: 12 (31%), 3: 5 (13%), 4: 12 (31%), 5: 7 (18%)
In organizing the checklist in a logical fashion? 1: 1 (3%), 2: 8 (21%), 3: 7 (18%), 4: 14 (36%), 5: 9 (23%)
In providing a format which is easy to search and find applicable questions? 1: 2 (5%), 2: 12 (31%), 3: 6 (15%), 4: 16 (41%), 5: 3 (8%)

6 Comment(s)

10. Which of the following search features would be most useful to the auditors for your Task Group in terms of making the checklist easier to use?

1 = Not Useful, 2 = Not likely to be useful, 3 = Not sure/Undecided, 4 = Somewhat Useful, 5 = Very Useful

Counts of respondents selecting each rating 1-5; percentages of total respondents in parentheses.

Use of NA (Not Applicable) to collapse sections of checklists? 0, 0, 1, 6, 31 (0%, 0%, 3%, 16%, 82%)
Free Text Searching (e.g. find a word that the user types in anywhere in the checklist)? 2, 2, 1, 11, 20 (6%, 6%, 3%, 31%, 56%)
Key Word Searching or Tagging (e.g. Export Controlled, Training)? 2, 2, 2, 13, 17 (6%, 6%, 6%, 36%, 47%)
Hyperlinks (e.g. from job audit to baseline questions)? 3, 2, 3, 9, 21 (8%, 5%, 8%, 24%, 55%)

5 Comment(s)

11. Your contact information (OPTIONAL):

First Name: 27
Last Name: 27
Company Name: 21
Email Address: 27

Nadcap Smart Checklists

Constant Contact Survey Results

Survey Name: Nadcap Smart Checklists

Response Status: Partial & Completed

Filter 1: HT to: What Task Group(s) do you participate in?

1/19/2012 3:25 PM EST

1. What is your role in the Nadcap program?

Subscriber: 10 (33.3%)
Supplier: 10 (33.3%)
Auditor: 7 (23.3%)
Other: 2 (6.6%)
No Responses: 1 (3.3%)
Total: 30 (100%)

2. What Task Group(s) do you participate in?

AQS: 5 (16.6%)
CMSP: 0 (0.0%)
COMP: 0 (0.0%)
CP: 5 (16.6%)
CT: 1 (3.3%)
ETG: 0 (0.0%)
FLU: 0 (0.0%)
HT: 30 (100.0%)
MTL: 3 (10.0%)
NDT: 5 (16.6%)
NMMM: 0 (0.0%)
NMMT: 1 (3.3%)
NMSE: 3 (10.0%)
SEAL: 0 (0.0%)
SLT: 0 (0.0%)
WLD: 5 (16.6%)
None: 1 (3.3%)
Total: 30 (100%)

3. Does your Task Group currently group related questions in the checklists?

Yes: 26 (86.6%)
No: 2 (6.6%)
Not sure: 2 (6.6%)
No Responses: 0 (0.0%)
Total: 30 (100%)

4 Comment(s)

4. Are the question groupings in your Task Group checklists related to (you may select more than 1):

Identified scopes suppliers select for accreditation? 15 (51.7%)
Prime or Customer Specific requirements? 16 (55.1%)
Quality System versus process specific questions? 18 (62.0%)
MOU (Memorandums of Understanding) with other task groups? 4 (13.7%)
Baseline or Procedural versus Job Audit: 18 (62.0%)
Process steps (e.g. grit blasting, cleaning, material preparation): 20 (68.9%)
Don't know: 1 (3.4%)
Don't group: 1 (3.4%)
Other: 1 (3.4%)
Total: 29 (100%)

0 Comment(s)

5. Are the same questions asked or used more than once (e.g. Baseline and Job audit, or in different sections of the checklist)?

Yes: 22 (73.3%)
No: 2 (6.6%)
Not Sure: 5 (16.6%)
Other: 1 (3.3%)
No Responses: 0 (0.0%)
Total: 30 (100%)

3 Comment(s)

6. Does your Task Group currently use "NA" (Not Applicable) or "Section NA" to mark groups of questions in the checklist? (Note: if the response is yes for groupings in some parts of the checklists and no for other groupings, please mark yes and add clarifying comments in the comment box.)

Yes: 29 (96.6%)
No: 1 (3.3%)
Not sure: 0 (0.0%)
No Responses: 0 (0.0%)
Total: 30 (100%)

5 Comment(s)

7. Under what conditions would it be acceptable to exclude certain questions from the checklist?  (You may select more than 1.)

Based on scope or process that supplier does not perform: 29 (96.6%)
Based on merit status: 7 (23.3%)
Based on 'sampling plan' (e.g. 3 out of 5 questions from this section, 2 out of 4 questions from this section, etc.): 6 (20.0%)
Based on 'rotation plan' (e.g. this question asked every 3rd audit, or if this supplier does work for 3 primes, rotation will ensure all 3 primes' questions are asked within a 3-year rotation): 6 (20.0%)
Based on initial versus reaccreditation: 7 (23.3%)
Based on risk: 6 (20.0%)
Other: 2 (6.6%)
Total: 30 (100%)

7 Comment(s)

8. What, if any, of these factors/considerations might be used to determine selection of questions?  (You may select more than 1.)

Number of job audit versus baseline or procedural based on merit status: 12 (42.8%)
Number of job audit versus baseline or procedural based on number or severity of prior NCRs: 11 (39.2%)
Results of statistical analysis (e.g. number of NCRs for a given question, consistency of a question being missed, variability from auditor to auditor, etc.): 6 (21.4%)
'Core' versus 'Non-core': 5 (17.8%)
The response to a particular question earlier in the checklist - scope (e.g. is a supplier performing x): 16 (57.1%)
Response to a particular question earlier in the checklist - risk (e.g. is a supplier performing x CORRECTLY): 14 (50.0%)
Other: 2 (7.1%)
Total: 28 (100%)

4 Comment(s)

9. How effective do you feel your Task Group is:

1 = Not effective, 2 = Could be better, 3 = Not sure/undecided, 4 = Somewhat effective, 5 = Very effective

Counts of respondents selecting each rating 1-5; percentages of total respondents in parentheses.

In grouping related questions to make the checklist easier to use? 2, 7, 3, 13, 4 (7%, 24%, 10%, 45%, 14%)
In grouping related questions to collapse and make the checklist shorter when Not Applicable? 2, 8, 7, 8, 4 (7%, 28%, 24%, 28%, 14%)
In identifying relationships between questions (e.g. job audit to baseline)? 3, 7, 9, 7, 3 (10%, 24%, 31%, 24%, 10%)
In organizing the checklist in a logical fashion? 2, 6, 3, 13, 4 (7%, 21%, 11%, 46%, 14%)
In providing a format which is easy to search and find applicable questions? 3, 6, 5, 12, 3 (10%, 21%, 17%, 41%, 10%)

5 Comment(s)

10. Which of the following search features would be most useful to the auditors for your Task Group in terms of making the checklist easier to use?

1 = Not Useful, 2 = Not likely to be useful, 3 = Not sure/Undecided, 4 = Somewhat Useful, 5 = Very Useful

Counts of respondents selecting each rating 1-5; percentages of total respondents in parentheses.

Use of NA (Not Applicable) to collapse sections of checklists? 0, 0, 0, 11, 18 (0%, 0%, 0%, 38%, 62%)
Free Text Searching (e.g. find a word that the user types in anywhere in the checklist)? 0, 1, 3, 10, 15 (0%, 3%, 10%, 34%, 52%)
Key Word Searching or Tagging (e.g. Export Controlled, Training)? 1, 1, 5, 11, 11 (3%, 3%, 17%, 38%, 38%)
Hyperlinks (e.g. from job audit to baseline questions)? 1, 2, 3, 10, 13 (3%, 7%, 10%, 34%, 45%)

5 Comment(s)

11. Your contact information (OPTIONAL):

First Name: 15
Last Name: 15
Company Name: 15
Email Address: 15

Nadcap Smart Checklists

Constant Contact Survey Results

Survey Name: Nadcap Smart Checklists

Response Status: Partial & Completed

Filter 1: NDT to: What Task Group(s) do you participate in?

1/19/2012 3:25 PM EST

1. What is your role in the Nadcap program?

Subscriber: 12 (33.3%)
Supplier: 10 (27.7%)
Auditor: 13 (36.1%)
Other: 1 (2.7%)
No Responses: 0 (0.0%)
Total: 36 (100%)

2. What Task Group(s) do you participate in?

AQS: 6 (16.6%)
CMSP: 1 (2.7%)
COMP: 2 (5.5%)
CP: 13 (36.1%)
CT: 1 (2.7%)
ETG: 0 (0.0%)
FLU: 0 (0.0%)
HT: 5 (13.8%)
MTL: 3 (8.3%)
NDT: 36 (100.0%)
NMMM: 0 (0.0%)
NMMT: 1 (2.7%)
NMSE: 4 (11.1%)
SEAL: 0 (0.0%)
SLT: 0 (0.0%)
WLD: 3 (8.3%)
None: 0 (0.0%)
Total: 36 (100%)

3. Does your Task Group currently group related questions in the checklists?

Yes: 31 (86.1%)
No: 2 (5.5%)
Not sure: 3 (8.3%)
No Responses: 0 (0.0%)
Total: 36 (100%)

5 Comment(s)

4. Are the question groupings in your Task Group checklists related to (you may select more than 1):

Identified scopes suppliers select for accreditation? 13 (36.1%)
Prime or Customer Specific requirements? 13 (36.1%)
Quality System versus process specific questions? 16 (44.4%)
MOU (Memorandums of Understanding) with other task groups? 1 (2.7%)
Baseline or Procedural versus Job Audit: 17 (47.2%)
Process steps (e.g. grit blasting, cleaning, material preparation): 24 (66.6%)
Don't know: 2 (5.5%)
Don't group: 1 (2.7%)
Other: 1 (2.7%)
Total: 36 (100%)

1 Comment(s)

5. Are the same questions asked or used more than once (e.g. Baseline and Job audit, or in different sections of the checklist)?

Yes: 19 (52.7%)
No: 14 (38.8%)
Not Sure: 3 (8.3%)
Other: 0 (0.0%)
No Responses: 0 (0.0%)
Total: 36 (100%)

6 Comment(s)

6. Does your Task Group currently use "NA" (Not Applicable) or "Section NA" to mark groups of questions in the checklist? (Note: if the response is yes for groupings in some parts of the checklists and no for other groupings, please mark yes and add clarifying comments in the comment box.)

Yes: 32 (88.8%)
No: 3 (8.3%)
Not sure: 1 (2.7%)
No Responses: 0 (0.0%)
Total: 36 (100%)

9 Comment(s)

7. Under what conditions would it be acceptable to exclude certain questions from the checklist?  (You may select more than 1.)

Based on scope or process that supplier does not perform: 34 (97.1%)
Based on merit status: 8 (22.8%)
Based on 'sampling plan' (e.g. 3 out of 5 questions from this section, 2 out of 4 questions from this section, etc.): 3 (8.5%)
Based on 'rotation plan' (e.g. this question asked every 3rd audit, or if this supplier does work for 3 primes, rotation will ensure all 3 primes' questions are asked within a 3-year rotation): 8 (22.8%)
Based on initial versus reaccreditation: 8 (22.8%)
Based on risk: 5 (14.2%)
Other: 0 (0.0%)
Total: 35 (100%)

7 Comment(s)

8. What, if any, of these factors/considerations might be used to determine selection of questions?  (You may select more than 1.)

Number of job audit versus baseline or procedural based on merit status: 8 (22.8%)
Number of job audit versus baseline or procedural based on number or severity of prior NCRs: 8 (22.8%)
Results of statistical analysis (e.g. number of NCRs for a given question, consistency of a question being missed, variability from auditor to auditor, etc.): 9 (25.7%)
'Core' versus 'Non-core': 8 (22.8%)
The response to a particular question earlier in the checklist - scope (e.g. is a supplier performing x): 22 (62.8%)
Response to a particular question earlier in the checklist - risk (e.g. is a supplier performing x CORRECTLY): 18 (51.4%)
Other: 3 (8.5%)
Total: 35 (100%)

7 Comment(s)

9. How effective do you feel your Task Group is:

1 = Not effective, 2 = Could be better, 3 = Not sure/undecided, 4 = Somewhat effective, 5 = Very effective

Counts of respondents selecting each rating 1-5; percentages of total respondents in parentheses.

In grouping related questions to make the checklist easier to use? 1, 5, 4, 11, 14 (3%, 14%, 11%, 31%, 40%)
In grouping related questions to collapse and make the checklist shorter when Not Applicable? 2, 8, 4, 10, 11 (6%, 23%, 11%, 29%, 31%)
In identifying relationships between questions (e.g. job audit to baseline)? 1, 8, 9, 7, 9 (3%, 24%, 26%, 21%, 26%)
In organizing the checklist in a logical fashion? 1, 5, 5, 14, 10 (3%, 14%, 14%, 40%, 29%)
In providing a format which is easy to search and find applicable questions? 2, 5, 8, 13, 7 (6%, 14%, 23%, 37%, 20%)

6 Comment(s)

10. Which of the following search features would be most useful to the auditors for your Task Group in terms of making the checklist easier to use?

1 = Not Useful, 2 = Not likely to be useful, 3 = Not sure/Undecided, 4 = Somewhat Useful, 5 = Very Useful

Counts of respondents selecting each rating 1-5; percentages of total respondents in parentheses.

Use of NA (Not Applicable) to collapse sections of checklists? 0, 0, 0, 10, 24 (0%, 0%, 0%, 29%, 71%)
Free Text Searching (e.g. find a word that the user types in anywhere in the checklist)? 2, 0, 4, 10, 14 (7%, 0%, 13%, 33%, 47%)
Key Word Searching or Tagging (e.g. Export Controlled, Training)? 2, 0, 6, 15, 8 (6%, 0%, 19%, 48%, 26%)
Hyperlinks (e.g. from job audit to baseline questions)? 2, 1, 6, 13, 10 (6%, 3%, 19%, 41%, 31%)

5 Comment(s)

11. Your contact information (OPTIONAL):

First Name: 22
Last Name: 22
Company Name: 18
Email Address: 22

Nadcap Smart Checklists

Constant Contact Survey Results

Survey Name: Nadcap Smart Checklists

Response Status: Partial & Completed

Filter 1: WLD to: What Task Group(s) do you participate in?

1/19/2012 3:26 PM EST

1. What is your role in the Nadcap program?

Subscriber: 8 (50.0%)
Supplier: 2 (12.5%)
Auditor: 5 (31.2%)
Other: 1 (6.2%)
No Responses: 0 (0.0%)
Total: 16 (100%)

2. What Task Group(s) do you participate in?

AQS: 1 (6.2%)
CMSP: 1 (6.2%)
COMP: 0 (0.0%)
CP: 1 (6.2%)
CT: 1 (6.2%)
ETG: 0 (0.0%)
FLU: 0 (0.0%)
HT: 5 (31.2%)
MTL: 0 (0.0%)
NDT: 3 (18.7%)
NMMM: 0 (0.0%)
NMMT: 1 (6.2%)
NMSE: 1 (6.2%)
SEAL: 0 (0.0%)
SLT: 0 (0.0%)
WLD: 16 (100.0%)
None: 0 (0.0%)
Total: 16 (100%)

3. Does your Task Group currently group related questions in the checklists?

Yes: 12 (75.0%)
No: 1 (6.2%)
Not sure: 3 (18.7%)
No Responses: 0 (0.0%)
Total: 16 (100%)

2 Comment(s)

4. Are the question groupings in your Task Group checklists related to (you may select more than 1):

Identified scopes suppliers select for accreditation? 7 (43.7%)
Prime or Customer Specific requirements? 6 (37.5%)
Quality System versus process specific questions? 6 (37.5%)
MOU (Memorandums of Understanding) with other task groups? 2 (12.5%)
Baseline or Procedural versus Job Audit: 8 (50.0%)
Process steps (e.g. grit blasting, cleaning, material preparation): 11 (68.7%)
Don't know: 1 (6.2%)
Don't group: 1 (6.2%)
Other: 2 (12.5%)
Total: 16 (100%)

2 Comment(s)

5. Are the same questions asked or used more than once (e.g. Baseline and Job audit, or in different sections of the checklist)?

Yes: 8 (50.0%)
No: 6 (37.5%)
Not Sure: 2 (12.5%)
Other: 0 (0.0%)
No Responses: 0 (0.0%)
Total: 16 (100%)

1 Comment(s)

6. Does your Task Group currently use "NA" (Not Applicable) or "Section NA" to mark groups of questions in the checklist? (Note: if the response is yes for groupings in some parts of the checklists and no for other groupings, please mark yes and add clarifying comments in the comment box.)

Yes: 12 (75.0%)
No: 3 (18.7%)
Not sure: 0 (0.0%)
No Responses: 1 (6.2%)
Total: 16 (100%)

7 Comment(s)

7. Under what conditions would it be acceptable to exclude certain questions from the checklist?  (You may select more than 1.)

Based on scope or process that supplier does not perform: 13 (86.6%)
Based on merit status: 7 (46.6%)
Based on 'sampling plan' (e.g. 3 out of 5 questions from this section, 2 out of 4 questions from this section, etc.): 3 (20.0%)
Based on 'rotation plan' (e.g. this question asked every 3rd audit, or if this supplier does work for 3 primes, rotation will ensure all 3 primes' questions are asked within a 3-year rotation): 7 (46.6%)
Based on initial versus reaccreditation: 7 (46.6%)
Based on risk: 4 (26.6%)
Other: 2 (13.3%)
Total: 15 (100%)

3 Comment(s)

8. What, if any, of these factors/considerations might be used to determine selection of questions?  (You may select more than 1.)

Number of job audit versus baseline or procedural based on merit status: 8 (61.5%)
Number of job audit versus baseline or procedural based on number or severity of prior NCRs: 6 (46.1%)
Results of statistical analysis (e.g. number of NCRs for a given question, consistency of a question being missed, variability from auditor to auditor, etc.): 5 (38.4%)
'Core' versus 'Non-core': 6 (46.1%)
The response to a particular question earlier in the checklist - scope (e.g. is a supplier performing x): 2 (15.3%)
Response to a particular question earlier in the checklist - risk (e.g. is a supplier performing x CORRECTLY): 3 (23.0%)
Other: 2 (15.3%)
Total: 13 (100%)

2 Comment(s)

9. How effective do you feel your Task Group is:

1 = Not effective, 2 = Could be better, 3 = Not sure/undecided, 4 = Somewhat effective, 5 = Very effective

Counts of respondents selecting each rating 1-5; percentages of total respondents in parentheses.

In grouping related questions to make the checklist easier to use? 1, 1, 1, 4, 7 (7%, 7%, 7%, 29%, 50%)
In grouping related questions to collapse and make the checklist shorter when Not Applicable? 2, 2, 1, 3, 6 (14%, 14%, 7%, 21%, 43%)
In identifying relationships between questions (e.g. job audit to baseline)? 1, 2, 4, 2, 5 (7%, 14%, 29%, 14%, 36%)
In organizing the checklist in a logical fashion? 1, 2, 1, 4, 6 (7%, 14%, 7%, 29%, 43%)
In providing a format which is easy to search and find applicable questions? 0, 1, 2, 7, 4 (0%, 7%, 14%, 50%, 29%)

3 Comment(s)

10. Which of the following search features would be most useful to the auditors for your Task Group in terms of making the checklist easier to use?

1 = Not Useful, 2 = Not likely to be useful, 3 = Not sure/Undecided, 4 = Somewhat Useful, 5 = Very Useful

Counts of respondents selecting each rating 1-5; percentages of total respondents in parentheses.

Use of NA (Not Applicable) to collapse sections of checklists? 1, 0, 1, 5, 7 (7%, 0%, 7%, 36%, 50%)
Free Text Searching (e.g. find a word that the user types in anywhere in the checklist)? 1, 2, 1, 5, 5 (7%, 14%, 7%, 36%, 36%)
Key Word Searching or Tagging (e.g. Export Controlled, Training)? 0, 2, 1, 7, 4 (0%, 14%, 7%, 50%, 29%)
Hyperlinks (e.g. from job audit to baseline questions)? 0, 1, 5, 6, 2 (0%, 7%, 36%, 43%, 14%)

1 Comment(s)

11. Your contact information (OPTIONAL):

First Name: 9
Last Name: 9
Company Name: 8
Email Address: 9

Nadcap Smart Checklists

Constant Contact Survey Results

Survey Name: Nadcap Smart Checklists

Response Status: Partial & Completed

Filter 1: Subscriber to: What is your role in the Nadcap program?

1/19/2012 3:27 PM EST

1. What is your role in the Nadcap program?

Subscriber: 52 (100.0%)
Supplier: 0 (0.0%)
Auditor: 0 (0.0%)
Other: 0 (0.0%)
No Responses: 0 (0.0%)
Total: 52 (100%)

2. What Task Group(s) do you participate in?

AQS: 5 (9.6%)
CMSP: 2 (3.8%)
COMP: 3 (5.7%)
CP: 7 (13.4%)
CT: 3 (5.7%)
ETG: 2 (3.8%)
FLU: 2 (3.8%)
HT: 10 (19.2%)
MTL: 4 (7.6%)
NDT: 12 (23.0%)
NMMM: 2 (3.8%)
NMMT: 3 (5.7%)
NMSE: 5 (9.6%)
SEAL: 1 (1.9%)
SLT: 0 (0.0%)
WLD: 8 (15.3%)
None: 0 (0.0%)
Total: 52 (100%)

3. Does your Task Group currently group related questions in the checklists?

Yes: 45 (86.5%)
No: 4 (7.6%)
Not sure: 3 (5.7%)
No Responses: 0 (0.0%)
Total: 52 (100%)

9 Comment(s)

4. Are the question groupings in your Task Group checklists related to (you may select more than 1):

Identified scopes suppliers select for accreditation? 27 (51.9%)
Prime or Customer Specific requirements? 20 (38.4%)
Quality System versus process specific questions? 25 (48.0%)
MOU (Memorandums of Understanding) with other task groups? 9 (17.3%)
Baseline or Procedural versus Job Audit: 30 (57.6%)
Process steps (e.g. grit blasting, cleaning, material preparation): 40 (76.9%)
Don't know: 0 (0.0%)
Don't group: 4 (7.6%)
Other: 1 (1.9%)
Total: 52 (100%)

3 Comment(s)

5. Are the same questions asked or used more than once (e.g. Baseline and Job audit, or in different sections of the checklist)?

Yes: 28 (53.8%)
No: 18 (34.6%)
Not Sure: 6 (11.5%)
Other: 0 (0.0%)
No Responses: 0 (0.0%)
Total: 52 (100%)

10 Comment(s)

6. Does your Task Group currently use "NA" (Not Applicable) or "Section NA" to mark groups of questions in the checklist? (Note: if the response is yes for groupings in some parts of the checklists and no for other groupings, please mark yes and add clarifying comments in the comment box.)

Yes: 44 (84.6%)
No: 6 (11.5%)
Not sure: 2 (3.8%)
No Responses: 0 (0.0%)
Total: 52 (100%)

13 Comment(s)

7. Under what conditions would it be acceptable to exclude certain questions from the checklist?  (You may select more than 1.)

Based on scope or process that supplier does not perform: 50 (96.1%)
Based on merit status: 8 (15.3%)
Based on 'sampling plan' (e.g. 3 out of 5 questions from this section, 2 out of 4 questions from this section, etc.): 8 (15.3%)
Based on 'rotation plan' (e.g. this question asked every 3rd audit, or if this supplier does work for 3 primes, rotation will ensure all 3 primes' questions are asked within a 3-year rotation): 13 (25.0%)
Based on initial versus reaccreditation: 14 (26.9%)
Based on risk: 9 (17.3%)
Other: 4 (7.6%)
Total: 52 (100%)

13 Comment(s)

8. What, if any, of these factors/considerations might be used to determine selection of questions?  (You may select more than 1.)

Number of job audit versus baseline or procedural based on merit status: 16 (32.6%)
Number of job audit versus baseline or procedural based on number or severity of prior NCRs: 16 (32.6%)
Results of statistical analysis (e.g. number of NCRs for a given question, consistency of a question being missed, variability from auditor to auditor, etc.): 13 (26.5%)
'Core' versus 'Non-core': 13 (26.5%)
The response to a particular question earlier in the checklist - scope (e.g. is a supplier performing x): 27 (55.1%)
Response to a particular question earlier in the checklist - risk (e.g. is a supplier performing x CORRECTLY): 20 (40.8%)
Other: 6 (12.2%)
Total: 49 (100%)

5 Comment(s)

9. How effective do you feel your Task Group is:

1 = Not effective, 2 = Could be better, 3 = Not sure/undecided, 4 = Somewhat effective, 5 = Very effective

Counts of respondents selecting each rating 1-5; percentages of total respondents in parentheses.

In grouping related questions to make the checklist easier to use? 1, 3, 6, 22, 18 (2%, 6%, 12%, 44%, 36%)
In grouping related questions to collapse and make the checklist shorter when Not Applicable? 2, 3, 10, 20, 15 (4%, 6%, 20%, 40%, 30%)
In identifying relationships between questions (e.g. job audit to baseline)? 2, 9, 14, 15, 11 (4%, 18%, 27%, 29%, 22%)
In organizing the checklist in a logical fashion? 1, 3, 4, 25, 17 (2%, 6%, 8%, 50%, 34%)
In providing a format which is easy to search and find applicable questions? 0, 8, 8, 24, 10 (0%, 16%, 16%, 48%, 20%)

5 Comment(s)

10. Which of the following search features would be most useful to the auditors for your Task Group in terms of making the checklist easier to use?

1 = Not Useful, 2 = Not likely to be useful, 3 = Not sure/Undecided, 4 = Somewhat Useful, 5 = Very Useful

Counts of respondents selecting each rating 1-5; percentages of total respondents in parentheses.

Use of NA (Not Applicable) to collapse sections of checklists? 1, 1, 4, 13, 31 (2%, 2%, 8%, 26%, 62%)
Free Text Searching (e.g. find a word that the user types in anywhere in the checklist)? 1, 5, 6, 21, 16 (2%, 10%, 12%, 43%, 33%)
Key Word Searching or Tagging (e.g. Export Controlled, Training)? 1, 2, 11, 22, 14 (2%, 4%, 22%, 44%, 28%)
Hyperlinks (e.g. from job audit to baseline questions)? 2, 2, 10, 16, 20 (4%, 4%, 20%, 32%, 40%)

8 Comment(s)

11. Your contact information (OPTIONAL):

First Name: 22
Last Name: 22
Company Name: 22
Email Address: 22

Nadcap Smart Checklists

Constant Contact Survey Results

Survey Name: Nadcap Smart Checklists

Response Status: Partial & Completed

Filter 1: Supplier to: What is your role in the Nadcap program?

1/19/2012 3:28 PM EST

1. What is your role in the Nadcap program?

Subscriber: 0 (0.0%)
Supplier: 47 (100.0%)
Auditor: 0 (0.0%)
Other: 0 (0.0%)
No Responses: 0 (0.0%)
Total: 47 (100%)

2. What Task Group(s) do you participate in?

AQS: 4 (8.5%)
CMSP: 0 (0.0%)
COMP: 6 (12.7%)
CP: 14 (29.7%)
CT: 5 (10.6%)
ETG: 6 (12.7%)
FLU: 1 (2.1%)
HT: 10 (21.2%)
MTL: 9 (19.1%)
NDT: 10 (21.2%)
NMMM: 2 (4.2%)
NMMT: 1 (2.1%)
NMSE: 5 (10.6%)
SEAL: 2 (4.2%)
SLT: 2 (4.2%)
WLD: 2 (4.2%)
None: 1 (2.1%)
Total: 47 (100%)

3. Does your Task Group currently group related questions in the checklists?

Yes: 38 (80.8%)
No: 5 (10.6%)
Not sure: 4 (8.5%)
No Responses: 0 (0.0%)
Total: 47 (100%)

4 Comment(s)

4. Are the question groupings in your Task Group checklists related to (you may select more than 1):

Identified scopes suppliers select for accreditation? 21 (44.6%)
Prime or Customer Specific requirements? 16 (34.0%)
Quality System versus process specific questions? 25 (53.1%)
MOU (Memorandums of Understanding) with other task groups? 1 (2.1%)
Baseline or Procedural versus Job Audit: 20 (42.5%)
Process steps (e.g. grit blasting, cleaning, material preparation): 28 (59.5%)
Don't know: 2 (4.2%)
Don't group: 2 (4.2%)
Other: 1 (2.1%)
Total: 47 (100%)

2 Comment(s)

5. Are the same questions asked or used more than once (e.g. Baseline and Job audit, or in different sections of the checklist)?

Yes: 35 (74.4%)
No: 8 (17.0%)
Not Sure: 3 (6.3%)
Other: 1 (2.1%)
No Responses: 0 (0.0%)
Total: 47 (100%)

7 Comment(s)

6. Does your Task Group currently use "NA" (Not Applicable) or "Section NA" to mark groups of questions in the checklist? (Note: if the response is yes for groupings in some parts of the checklists and no for other groupings, please mark yes and add clarifying comments in the comment box.)

Yes: 41 (87.2%)
No: 5 (10.6%)
Not sure: 1 (2.1%)
No Responses: 0 (0.0%)
Total: 47 (100%)

7 Comment(s)

7. Under what conditions would it be acceptable to exclude certain questions from the checklist?  (You may select more than 1.)

Based on scope or process that supplier does not perform: 45 (97.8%)
Based on merit status: 16 (34.7%)
Based on 'sampling plan' (e.g. 3 out of 5 questions from this section, 2 out of 4 questions from this section, etc.): 12 (26.0%)
Based on 'rotation plan' (e.g. this question asked every 3rd audit, or if this supplier does work for 3 primes, rotation will ensure all 3 primes' questions are asked within a 3-year rotation): 12 (26.0%)
Based on initial versus reaccreditation: 19 (41.3%)
Based on risk: 18 (39.1%)
Other: 3 (6.5%)
Total: 46 (100%)

6 Comment(s)

8. What, if any, of these factors/considerations might be used to determine selection of questions?  (You may select more than 1.)

Number of job audit versus baseline or procedural based on merit status: 18 (39.1%)
Number of job audit versus baseline or procedural based on number or severity of prior NCRs: 20 (43.4%)
Results of statistical analysis (e.g. number of NCRs for a given question, consistency of a question being missed, variability from auditor to auditor, etc.): 13 (28.2%)
'Core' versus 'Non-core': 14 (30.4%)
The response to a particular question earlier in the checklist - scope (e.g. is a supplier performing x): 24 (52.1%)
Response to a particular question earlier in the checklist - risk (e.g. is a supplier performing x CORRECTLY): 23 (50.0%)
Other: 3 (6.5%)
Total: 46 (100%)

6 Comment(s)

9. How effective do you feel your Task Group is:

1 = Not effective, 2 = Could be better, 3 = Not sure/undecided, 4 = Somewhat effective, 5 = Very effective

Counts of respondents selecting each rating 1-5; percentages of total respondents in parentheses.

In grouping related questions to make the checklist easier to use? 1, 7, 5, 21, 13 (2%, 15%, 11%, 45%, 28%)
In grouping related questions to collapse and make the checklist shorter when Not Applicable? 2, 14, 6, 13, 12 (4%, 30%, 13%, 28%, 26%)
In identifying relationships between questions (e.g. job audit to baseline)? 3, 10, 8, 16, 9 (7%, 22%, 17%, 35%, 20%)
In organizing the checklist in a logical fashion? 2, 6, 6, 18, 15 (4%, 13%, 13%, 38%, 32%)
In providing a format which is easy to search and find applicable questions? 3, 13, 5, 20, 6 (6%, 28%, 11%, 43%, 13%)

5 Comment(s)

10. Which of the following search features would be most useful to the auditors for your Task Group in terms of making the checklist easier to use?

1 = Not Useful, 2 = Not likely to be useful, 3 = Not sure/Undecided, 4 = Somewhat Useful, 5 = Very Useful

Counts of respondents selecting each rating 1-5; percentages of total respondents in parentheses.

Use of NA (Not Applicable) to collapse sections of checklists? 0, 1, 1, 15, 29 (0%, 2%, 2%, 33%, 63%)
Free Text Searching (e.g. find a word that the user types in anywhere in the checklist)? 1, 0, 6, 13, 26 (2%, 0%, 13%, 28%, 57%)
Key Word Searching or Tagging (e.g. Export Controlled, Training)? 1, 1, 3, 14, 27 (2%, 2%, 7%, 30%, 59%)
Hyperlinks (e.g. from job audit to baseline questions)? 1, 1, 7, 18, 19 (2%, 2%, 15%, 39%, 41%)

6 Comment(s)

11. Your contact information (OPTIONAL):

First Name: 28
Last Name: 27
Company Name: 26
Email Address: 27

Survey data may be useful for general review by task groups. (Weld Example)

7. Under what conditions would it be acceptable to exclude certain questions from the checklist?  (You may select more than 1.)

Based on scope or process that supplier does not perform: 13 (86.6%)
Based on merit status: 7 (46.6%)
Based on 'sampling plan' (e.g. 3 out of 5 questions from this section, 2 out of 4 questions from this section, etc.): 3 (20.0%)
Based on 'rotation plan' (e.g. this question asked every 3rd audit, or if this supplier does work for 3 primes, rotation will ensure all 3 primes' questions are asked within a 3-year rotation): 7 (46.6%)
Based on initial versus reaccreditation: 7 (46.6%)
Based on risk: 4 (26.6%)
Other: 2 (13.3%)
Total: 15 (100%)

3 Comment(s)

9. How effective do you feel your Task Group is:

1 = Not effective, 2 = Could be better, 3 = Not sure/undecided, 4 = Somewhat effective, 5 = Very effective

Counts of respondents selecting each rating 1-5; percentages of total respondents in parentheses.

In grouping related questions to make the checklist easier to use? 1, 1, 1, 4, 7 (7%, 7%, 7%, 29%, 50%)
In grouping related questions to collapse and make the checklist shorter when Not Applicable? 2, 2, 1, 3, 6 (14%, 14%, 7%, 21%, 43%)
In identifying relationships between questions (e.g. job audit to baseline)? 1, 2, 4, 2, 5 (7%, 14%, 29%, 14%, 36%)
In organizing the checklist in a logical fashion? 1, 2, 1, 4, 6 (7%, 14%, 7%, 29%, 43%)
In providing a format which is easy to search and find applicable questions? 0, 1, 2, 7, 4 (0%, 7%, 14%, 50%, 29%)

3 Comment(s)

Smart Checklist Drivers

Driver/Need: Audit Efficiency. Focus the audit only on special processes covered in the scope of activity.
Alternate approach: Checklist structure; slash sheets; NA of sections.
What works: Consistent structure.
What still needs improvement: Time to evaluate applicability of NA; potential differences applying NA; redundant questions.
Option to address improvement need: A checklist tailored to supplier scope with questions drawn from a common database.
Comments: Need to make sure that if the supplier didn't accurately identify scope, the auditor can 'recover' while on-site.

Driver/Need: Audit Consistency / Time Management. Focus the audit only on special processes covered in the scope of activity.
Alternate approach: Auditor training and pre-audit preparation.
What works: Auditors know where to find applicable sections.
What still needs improvement: Adherence to the checklist; portability of the checklist.
Option to address improvement need: A checklist tailored to supplier scope with questions drawn from a common database.
Comments: Variable structure may require careful layout and additional training. Consider notebook tablets to have checklist criteria "in hand" on the shop floor.

Driver/Need: Audit Usefulness (deep-dives / follow-up). Follow-up questions that clarify problems or a chain of discrepant activity.
Alternate approach: Auditor training and pre-audit preparation; review of prior NCRs; layers of linked questions.
What works: Consistent structure.
What still needs improvement: Potential differences in applying NA.
Option to address improvement need: A tailored checklist with questions pulled from a database, with a hierarchy to guide the auditor through additional questions.
Comments: Fault-tree or branching logic may be difficult to program.

Driver/Need: Auditing Prime Requirements.
Alternate approach: Embed requirements in the checklist; slash sheets; focused auditor training.
What works: Consistent structure.
What still needs improvement: Length of the checklist.
Option to address improvement need: Metadata tags to identify Prime questions to be asked based on scope.

Driver/Need: Efficient Revision of Audit Criteria. Simple up/down votes on new questions rather than a complete checklist revision.
Alternate approach: Checklist structure; slash sheets by 'families'.
What works: Consistent structure.
What still needs improvement: Time spent wording questions; time spent locating examples; comparing and contrasting questions to facilitate editing; difficulty and delays in fixing errors or updating the checklist when specifications change.
Option to address improvement need: A database-style format where questions can be revised individually rather than revising a whole checklist.
Comments: Changing the way we review and ballot audit checklists could make the system quicker to react to errors, failures, document changes, and new OEM requirements.

Options for “Smart” Checklists

Level 1: Mapping or tagging questions to more directly match sub-scope/supplier methods through selective use of questions. Enables more dynamic answering during an audit. (Note: paper-copy audits may not allow dynamic answering during the audit, and may require the ability to print two hardcopy versions, full and 'pre-collapsed'.)

Pros: More accurate checklist; auditor spends more time on the most important audit aspects; accurate completeness check; more valuable review time; reduces clutter.
Cons: Significant effort by the Task Group to identify and maintain mappings; limited direct extra value to the Subscribers (lots of indirect value, but hard to measure).
Notes/Feedback: Some commodities already compose their checklists in this format. May be an opportunity to have checklist questions "tagged" for near-term search and grouping (and the future ability to dynamically compose checklists), with TGs able to add tags in a phased approach.

Level 2: Dynamically composed checklists based on a question database. Questions have attributes (e.g., Technology, Prime Spec, Quality System).

Pros:
- Enables dynamic composition of checklists before an audit
- Could start in phases with shared/header metadata (Facility Info, QS Info, Job Audits, etc.), meaning less impact on existing modules
- More detailed reports for Job Audits

Cons:
- Getting and maintaining consensus for rules/questions could be difficult
- Significant effort by TGs to identify and maintain mappings
- Impact to other modules (e.g., the cost estimator) would have to be handled during development

Level 3: Questions asked based on risk (e.g., if a question generated a lot of NCRs, it would be asked more frequently; questions may also be skipped based on performance. Adaptive – if a question generated an NCR, ask a few more questions to drill down).

Pros:
- Enables dynamic composition of checklists during an audit
- Moves us toward a supplier scorecard and risk-based management
- Improves the effectiveness of the audit

Cons:
- Each audit may not have the opportunity to find every NCR
- Most substantial development effort
- Significant effort by TGs to identify mappings

Notes/Feedback:
- How do we implement this in a phased approach?
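To make the Level 3 idea concrete, here is a hedged sketch of risk-driven question selection and NCR-triggered drill-down. The NCR counts, threshold, and follow-up links are invented; a real implementation would pull these from audit history in the database:

```python
# Invented example data: NCR counts per question and drill-down links.
NCR_HISTORY = {"7.1.1": 14, "7.1.4": 1}      # NCRs each question has generated
FOLLOW_UPS = {"7.1.1": ["7.1.2", "7.1.3"]}   # extra questions to drill down

def should_ask(question_id, threshold=5):
    """Ask high-NCR questions every audit; low-NCR questions may be skipped."""
    return NCR_HISTORY.get(question_id, 0) >= threshold

def record_answer(question_id, generated_ncr, queue):
    """If an answer produced an NCR, queue that question's follow-ups."""
    if generated_ncr:
        queue.extend(FOLLOW_UPS.get(question_id, []))

queue = []
record_answer("7.1.1", generated_ncr=True, queue=queue)
print(should_ask("7.1.1"), should_ask("7.1.4"))  # True False
print(queue)                                     # ['7.1.2', '7.1.3']
```

The queue-based drill-down is one way to express the "if your question generated an NCR, ask a few more questions" behavior without hard-coding a fault tree.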

Key learning:

How question relationships and hierarchy are determined, and the currently existing capabilities for collapsing and searching checklists

Audit Efficiency

Collapsing "Section N/As"

The current offline checklist completion tool allows entire checklist sections to be collapsed when a "parent" question is marked "N/A".
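As a rough illustration of the collapse logic (the section structure below is invented; the actual tool's internals may differ): when the parent question is answered N/A, the whole section is marked N/A in one step.

```python
# Invented section structure: a parent question with dependent children.
SECTION = {
    "parent": "6.3",                        # e.g. "Is filler material used?"
    "children": ["6.3.1", "6.3.2", "6.3.3"],
}

def apply_parent_answer(section, parent_answer, answers):
    """Record the parent answer; if it is N/A, cascade N/A to all children."""
    answers[section["parent"]] = parent_answer
    if parent_answer == "N/A":
        for child in section["children"]:
            answers[child] = "N/A"
    return answers

answers = apply_parent_answer(SECTION, "N/A", {})
print(answers["6.3.2"])  # N/A
```

Organizing questions so that children sit under a single gating parent is what makes this cascade possible, which is why question grouping by TGs matters.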

Demonstration

Task Groups are encouraged to take advantage of this feature when organizing questions (many TGs already do)

Slash Sheets

Rather than having a single checklist for a commodity, questions can be grouped based on technical content and organized into Slash Sheets.

E.g.,

AC7109, criteria for Coatings

AC7109/1, thermal spray

AC7109/2, vapor deposition

AC7109/3, diffusion

Etc.

Auditing Prime Requirements

Prime-specific questions can be embedded, ideally when consensus occurs, in regular checklists

Supplemental checklists can be created to capture prime-specific requirements and have them included in the audit where appropriate to the supplier's customer. This can add to the length of the audit and is not used by all TGs.

Audit Efficiency:

Full-text Searching

eAuditNet currently allows for checklists to be displayed, and thus searched, as a single document – but only in online checklist completion.

Audit Efficiency:

Tagging by Keyword

Grouping and search-ability of questions can be improved by "tagging" questions by keyword; once done, checklist navigation could recognize this tagging.

PRI system enhancements would need to be completed up front, but Task Groups could phase this in over time.

Example in MSWord version:

System to administer keywords? Assign keywords to questions? How would navigation (within overall checklists) work?
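One way the tagging could be stored and searched – purely a sketch, with an invented tag vocabulary and question IDs – is an inverted index from keyword to question IDs:

```python
from collections import defaultdict

# Invented keyword assignments; a real system would administer these centrally.
TAGS = {
    "7.1.1": {"cleaning"},
    "7.1.6": {"cleaning", "tooling"},
    "10.1": {"tooling", "maintenance"},
}

def build_index(tag_map):
    """Invert question->keywords into keyword->questions for fast search."""
    index = defaultdict(set)
    for qid, keywords in tag_map.items():
        for kw in keywords:
            index[kw].add(qid)
    return index

index = build_index(TAGS)
print(sorted(index["tooling"]))  # ['10.1', '7.1.6']
```

Because the index is derived from the per-question tags, Task Groups could add keywords in a phased approach and searching would improve incrementally.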

PRI AC7110/5 Revision F


161 Thorn Hill Road
Warrendale, PA 15086-7527

AEROSPACE AUDIT CRITERIA

AC7110/5 REV. F

Issued 1995-01-06

Revised 2010-03

Superseding AC7110/5 REV. E

TO BE USED ON AUDITS ON OR AFTER APRIL 4, 2010

Nadcap

AUDIT CRITERIA FOR

FUSION WELDING

Editorial revision to paragraph 1.0 on January 4, 2010.

Editorial revision to paragraphs 9.5 and 9.6 on March 12, 2010.

1.

SCOPE

This checklist is to be used as a supplement to PRI AC7110 for suppliers seeking fusion welding accreditation.

This checklist utilizes AC7110/12 - Nadcap Audit Criteria for Welder / Welding <> Qualification.

This checklist utilizes AC7110/13 – Nadcap Audit Criteria for Evaluation of Welds.

2.

GENERAL INFORMATION

In completing this assessment, auditors are instructed to respond with a ‘yes’ or ‘no’ to address compliance with each statement or requirement. For any negative response, the auditor must clearly indicate in the NCR if the ‘no’ reflects noncompliance with respect to existence, adequacy, and/or compliance. Existence relates to evidence of a documented procedure or policy, adequacy relates to the completeness of the procedure or policy, and compliance relates to evidence of effective implementation.

All negative responses require a Nonconformance Report (NCR). Not Applicable (N/A) responses do not require an explanation, unless otherwise noted. There is only one plausible reason for an N/A: that a particular operation or issue is not in use at the supplier. An N/A cannot be taken simply for a lack of a customer requirement. If a system is in use, then all questions pertaining to that system are applicable.

The base document defines the minimum requirement for this process. Supplements related to unique customer requirements are contained in AC7110/5S which shall be used in conjunction with this checklist.

The audit results shall not include any customer proprietary information. Technical information on parts which have been designated “Export Controlled – License Required” (EC-LR) cannot be input into eAuditNet. If auditors have any questions about this, they should contact the Staff Engineer for directions.

2.1 Welding Scope

Processes performed at the facility (check appropriate boxes)

[ ] Baseline – Applicable to all processes/materials, etc.

[ ] Supplement A – Process – Additional requirements for SMAW (MMA)

[ ] Supplement B – Process – Additional requirements for SAW

[ ] Supplement C – Process – Additional requirements for Automatic/Semi-Automatic processes

[ ] Supplement D – Material – Additional requirements for titanium

[ ] Supplement E – Casting Repair – Additional requirements

[ ] Supplement F – Filler Materials – Additional requirements

[ ] Supplement G – Processes which use gases – Additional requirements

[ ] Supplement H – Pre/Interpass Heat Treatment – Additional requirements

[ ] Supplement I – Stress Relief Heat Treatment (Furnace & Oven) – Additional requirements

[ ] Supplement J – Tack Welding – Additional requirements

3.

REFERENCES

3.1

Customer Specifications

Are applicable customer specifications available at the facility?

YES NO

4.

MATERIAL CONTROL

4.1

Cleaning Materials, Chemical Solvents & Etching Solutions

Are cleaning materials, chemical solvents, or etching solutions as specified on part drawings or certified/qualified welding procedures/schedules?

YES NO

5.

EQUIPMENT

5.1

Equipment Capability

Is there a documented procedure to ensure welding machines, fixtures, tooling and tooling material are suitable and capable of consistently producing acceptable welds?

YES NO

5.2

Equipment <>

Is there a documented procedure that defines the <> of equipment at established intervals?

YES NO

6.

PROCEDURE CONTROL

6.1

Qualified Welding Schedules

6.1.1 Welding Schedule Established

Is there a documented procedure to ensure that qualified welding procedures/schedules are established for each production weld where qualification is required by the customer/specification?

YES NO

6.1.2 Customer Procedure Approval

Is there a documented procedure that includes a provision for obtaining customer approval when the customer requires welding procedure approval?

YES NO

6.1.3 Customer Schedule Approval

Is there a documented procedure that includes a provision for obtaining customer approval when the customer requires welding schedule approval?

YES NO

6.2 Deterioration of Welds

Is there a documented procedure to ensure that, when deterioration of welding is encountered, an investigation is conducted to assign the cause and implement corrective action?

YES NO

6.3

Filler Material Control

Do supplier procedures specifically prohibit the use of filler material, or, if not, is Supplement F for filler material control included in the scope of the audit?

YES NO

6.4

Weld Start and Run-Off Tabs

6.4.1 Run-On/Run-Off Tab Usage

Does the supplier have a documented procedure to control run-on/run-off tabs to assure they are the same alloy as the part being welded, or alloy specified by the design authority?

YES NO

6.4.2 Run-On/Run-Off Tab Removal

Does the supplier have a documented procedure to control run-on/run-off tab removal to prevent damage to the part?

YES NO

6.5

Stress Relief

Is there a process to ensure stress relief of weldments is performed in accordance with applicable requirements? (If this is performed in-house and stress relief operations are NOT performed in a Nadcap-accredited Heat Treat furnace/facility, then Supplement I is used.)

YES NO

6.6

Heat Treatment

Is there a process to ensure heat treatment of weldments is performed in accordance with applicable requirements?

YES NO

7. PROCESS CONTROL

7.1 Cleaning

7.1.1 Part Cleanliness

Is there a documented procedure to ensure part cleanliness prior to welding?

YES NO

7.1.2 Surface Conditions

Are parts properly cleaned to assure surfaces of the details and representative test specimens are free from contaminants such as oxides, scale, oil, dirt, ink, or other surface conditions that are detrimental to the joining process?

YES NO

7.1.3 Protection of Cleaned Material

Are part details protected after cleaning until welding can be performed?

YES NO

7.1.4 Time Limits

Is there a documented procedure to ensure that after cleaning, all part details are welded within specified time limits?

YES NO

7.1.5 Handling

Is there a documented procedure to ensure that after surface preparation, parts are handled and protected to prevent contamination, including a requirement to re-clean should contamination occur?

YES NO

7.1.6 Jigs, Fixtures, and Measuring Devices

Are jigs, fixtures, and measuring devices free of scale, grease, protective coatings, oxides, dust, oil, and other foreign matter detrimental to the welding process?

YES NO

7.2

Instructions

7.2.1 Sequence of Operations

Does the router list the sequence of operations that control manufacturing and inspection operations?

YES NO

7.2.2 Detailed Operation Sheet

Are there detailed operation sheets/weld schedules with parameter settings for each part number?

YES NO

7.3

Qualification Test Reports

7.3.1 Test Reports

Is there a documented procedure to ensure that parameter settings on operation sheets are traceable to qualification test reports, when required by customer specifications?

YES NO

7.3.2 Retention of Test Reports

Are test reports documented and held on file for customer review?

YES NO

7.3.3 Approval for Evaluation of Welds

Are the applicable supplements of AC7110/13 included as part of this audit for captive lab evaluations of welds, or, if performed externally, does the lab hold AC7110/13 accreditation, national accreditation, or relevant Prime approval?

(Note that N/A is only an option for when metallographic or bend testing of welds is not required)

(Note that unique Prime requirements exist against this question. Refer to AC7110/5S to establish if additional requirements are mandated)

YES NO N/A

7.4

Multiple Pass Welds

Is there a documented procedure to ensure that prior to the deposition of each pass in a multiple pass weld, the welder or welding <> performs inter-pass cleaning and visually examines the previous pass for contamination and defects?

YES NO

7.5

Subsequent Passes

Is there a documented procedure to ensure defects and contaminants are removed prior to the deposition of subsequent passes?

YES NO

7.6

In-Process Corrections

Is there a documented procedure that defines and controls the requirements for in-process weld correction(s) prior to submitting the weld joint for acceptance inspection?

YES NO

7.7

Rework Cycles

Is there a process in place to ensure rework cycles are tracked and recorded?

YES NO

8.

PERSONNEL

8.1

Welder/Welding <>s Certification Specifications

Does the facility performing welder / welding <> qualifications comply with AC7110/12?

YES NO

8.2 Activity Records

Does the supplier maintain activity records for each welder/welding <> to justify extensions of the qualification period or does the qualification frequency comply with applicable specifications?

YES NO

8.3

Inspector Training/Qualification

Is there a documented procedure in place to train and qualify welding visual inspectors?

YES NO

8.4

Inspector Qualifications

Is there documented evidence of training and qualification of welding visual inspectors?

YES NO

9.

INSPECTION AND ACCEPTANCE CRITERIA

9.1

Visual Inspection

Is there a documented procedure to ensure all welds are inspected by qualified visual inspectors?

YES NO

9.2

Correction of Nonconformances

Is there a documented procedure to ensure that weld nonconformances are corrected in accordance with applicable requirements?

YES NO

9.3

Dimensional Inspection

Are inspection tools to measure dimensional features available to the inspector?

YES NO

9.4

Inspection Equipment

Does the facility have the proper inspection equipment to inspect weld characteristics?

YES NO

9.5

NDT Inspection

Does the supplier have a process in place to ensure all required NDT inspections are performed?

(Note that unique Prime requirements exist against this question. Refer to AC7110/5S to establish if additional requirements are mandated)

YES NO

9.6

Records

Is there a documented procedure that defines the record retention requirements for inspection records, including radiographic films (or digital equivalent)?

YES NO

9.7

Traceability

Is there a documented procedure that defines traceability requirements?

YES NO

10.

PERIODIC MAINTENANCE

10.1

Preventative Maintenance

Do documented procedures require preventative maintenance of weld equipment and tooling at a specified frequency?

YES NO

10.2

Maintenance Records

Do records indicate that maintenance is performed on weld equipment and tooling in accordance with the procedures and appropriate standards?

YES NO

11.

COMPLIANCE

NOTE: Compliance to applicable specifications and customer requirements shall be demonstrated by auditing an in-process job. It is mandatory that this job is in-process. Where a specific part cannot be witnessed, it is acceptable for the auditor to witness a representative test piece.

The supplier needs to clearly identify the Export Control status of all parts being used in the audit. Technical information on parts which have been designated “Export Controlled – License Required” (EC-LR) cannot be input into eAuditNet.

Throughout the Compliance section of the audit specific questions / data entry points are marked with an <>. These are questions which have been designated by the Task Group as those which cannot be answered in eAuditNet IF the part is (EC-LR). Do not enter technical data, and state ‘EC technical data restricted’. If auditors have any questions about this, they should contact the Staff Engineer for directions.

11.1

<> Job Audit #1

Job Status

Complete

In-Process

Job Audit Properties

Customer/Prime Contractor

          

Unique Work Order # or Lot #

     

Quantity

     

Job Identification

<> Drawing #

     

<> Part Name

     

Operation Sequence Number

     

<> Material Name

     

<> Material Specification

     

<> Thickness

     

<> Filler Material

     

<> Filler Material Specification

     

<> Gas

     

Welder/<>

     

Welding Process

<> Type:

<> Manual

<> Automatic

<> Semi-Automatic

Reference Specifications

<> Customer Specification/Rev.

     

MIL/Industry Specification/Rev.

     

<> Class

     

<> Welding requirements from purchase order, drawing or specification

     

<> Pre-weld/requirement/cleaning/preheat/prep/assembly on shop paper

     

<> Actual pre-weld conditions of parts observed at weld operation
