Optimal+ GSA 2014

Michael Schuldenfrei, CTO – Leveraging Test Data for Quality
GSA Quality Team Meeting, December 2014


TRANSCRIPT

Page 1: Optimal+ GSA  2014

Michael Schuldenfrei, CTO

Leveraging Test Data for Quality

GSA Quality Team Meeting, December 2014

Page 2: Optimal+ GSA  2014

© Optimal+ 2014

The Need

Shifting from “Defects per Million” to “Defects per Billion”

Page 3: Optimal+ GSA  2014


The Problem

RMA Source:

• No Problem Found: 32%
• Fab Process: 28%
• Test Equipment: 26%
• Test Program: 10%
• Test Operation: 4%

Page 4: Optimal+ GSA  2014


The Challenge

BIG DATA

EXPERTISE

COST

TIME

Page 5: Optimal+ GSA  2014


Big Data – Device DNA

Each die carries an ECID that keys its history across operations: WAT → WS1 → WS2 (→ WS3) → FT1 → Burn-in → FT2.

Example: One package contains:
• 5 dice
• × ~2 WS operations per die
• × ~1.2 iterations per operation
• × 3000 parametric measurements
• + 1000 per-site WAT measurements
• + 3000 FT measurements

A DNA consisting of ~35K measurements!

An SLT lot with 5000 parts could have 150M historical measurements from hundreds of wafers & FT lots
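The package-DNA arithmetic above can be sketched directly. The figures are the slide's example numbers; note that the listed factors multiply out slightly above the quoted ~35K, so treat both numbers as order-of-magnitude:

```python
# "Device DNA" arithmetic, using the slide's example figures.
dice_per_package = 5
ws_ops_per_die = 2            # ~2 wafer-sort operations per die
iterations_per_op = 1.2       # ~1.2 iterations (retests) per operation
parametric_per_iteration = 3000
wat_per_site = 1000           # per-site WAT measurements
ft_measurements = 3000

ws_total = (dice_per_package * ws_ops_per_die
            * iterations_per_op * parametric_per_iteration)
dna = ws_total + wat_per_site + ft_measurements
# The slide quotes "~35K"; the listed factors multiply out to ~40K,
# so presumably not every term applies to every die.
print(f"~{dna/1000:.0f}K measurements per package DNA")

# An SLT lot of 5000 parts then drags along on the order of:
lot_history = 5000 * dna
print(f"~{lot_history/1e6:.0f}M historical measurements")
```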

Page 6: Optimal+ GSA  2014


Back to Basics

THE QUALITY QUESTION:

IS “GOOD” REALLY GOOD?

Page 7: Optimal+ GSA  2014

Quality Solutions

• Outlier Detection: Geographic, Parametric
• Escape Prevention: Test program issues, ATE issues
• Data Feed Forward (more intelligent decision making): Drift, Smart Pairing

Page 8: Optimal+ GSA  2014


Outlier Detection


Page 9: Optimal+ GSA  2014


Outlier Detection – Algorithms

D-PAT: Dynamic Part Average Testing

NNR: Nearest Neighbor Residual

Z-PAT: Z-Axis Part Average Testing

GDBN: Good Die in Bad Neighborhood

Zonal: Low yield zone-based detection

At Final Test, these run either post-Final-Test operation based on Die-ID (ECID etc.), or in real-time at the Final-Test operation without Die-ID.
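A minimal sketch of the D-PAT idea: instead of the static spec limits, limits are derived from the lot's own distribution, so a part far from its population is flagged even though it passes spec. The median/MAD recipe and the k=6 default here are illustrative assumptions, not Optimal+'s algorithm:

```python
import statistics

def dpat_outliers(values, k=6.0):
    """D-PAT sketch: dynamic limits from the lot's own robust
    statistics. Median/MAD is one common robust choice; the exact
    recipe is an assumption here."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    sigma = 1.4826 * mad  # MAD -> sigma under a normal assumption
    lo, hi = med - k * sigma, med + k * sigma
    return [i for i, v in enumerate(values) if not lo <= v <= hi]

# The 5.00 reading passes a loose spec but is far from its lot.
readings = [1.00, 1.02, 0.99, 1.01, 1.00, 0.98, 1.03, 5.00]
print(dpat_outliers(readings))
```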

Page 10: Optimal+ GSA  2014


Cross-Operation Outlier Detection

Cross-operational quality based on Die ID

Contributing operations:
• ETEST/PCM/WAT
• Wafer Sort
• Final-Test
• Burn-In
• System Level Test

Example: E-Test-based bin-switching performed post-Wafer Sort
• The ability to identify potentially bad devices based on geographical analysis of E-test data
• Bin switching occurs post-wafer sort
• Requires data feed forward within the supply chain
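The GDBN rule listed earlier is the simplest geographical example: a passing die surrounded by failures is suspect. A sketch, with an illustrative 8-neighbour rule and threshold (not Optimal+'s parameters):

```python
def gdbn_flags(wafer_map, bad_threshold=3):
    """Good Die in Bad Neighborhood (sketch): flag passing dice whose
    8-neighborhood contains too many failures."""
    flagged = []
    for (x, y), passed in wafer_map.items():
        if not passed:
            continue
        bad = sum(
            1
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx or dy) and wafer_map.get((x + dx, y + dy)) is False
        )
        if bad >= bad_threshold:
            flagged.append((x, y))
    return flagged

# 3x3 patch: the centre die passes but sits among failures.
wmap = {(x, y): True for x in range(3) for y in range(3)}
for xy in [(0, 0), (0, 1), (1, 0), (2, 2)]:
    wmap[xy] = False
print(gdbn_flags(wmap))
```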

Page 11: Optimal+ GSA  2014


Escape Prevention


Page 12: Optimal+ GSA  2014


Escape Prevention – ATE Freeze

A freeze occurs when a tester instrument becomes “stuck” and repeatedly returns the same or similar result for a sequence of parts
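The freeze pattern described above can be detected by scanning for a run of near-identical results. A sketch; the window size and tolerance are illustrative assumptions:

```python
def detect_freeze(results, window=8, epsilon=1e-9):
    """ATE-freeze sketch: a 'stuck' instrument returns the same (or
    nearly the same) value part after part. Return the index at which
    a run of `window` near-identical results completes, else None."""
    for i in range(window - 1, len(results)):
        chunk = results[i - window + 1 : i + 1]
        if max(chunk) - min(chunk) <= epsilon:
            return i
    return None

healthy = [1.01, 0.99, 1.02, 1.00, 0.98, 1.03, 1.01, 0.99, 1.02]
stuck = healthy[:4] + [1.2345] * 8  # instrument freezes at part 5
print(detect_freeze(healthy), detect_freeze(stuck))
```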

Page 13: Optimal+ GSA  2014


Escape Prevention – ATE / TP


The STDF “PRR.NUM_TESTS” field tells us the number of tests executed on the part. It should be relatively stable throughout the lot.
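One way to act on that stability: flag parts whose test count is rare within the lot, e.g. a part that silently skipped blocks of tests. The 5% cut-off is an illustrative assumption:

```python
from collections import Counter

def num_tests_suspects(num_tests_per_part, max_fraction=0.05):
    """Sketch: PRR.NUM_TESTS should be roughly stable across a lot.
    Flag parts whose test count occurs in under `max_fraction` of
    parts (illustrative threshold)."""
    counts = Counter(num_tests_per_part)
    total = len(num_tests_per_part)
    rare = {n for n, c in counts.items() if c / total < max_fraction}
    return [i for i, n in enumerate(num_tests_per_part) if n in rare]

# 412 and 409 are normal flows; one part ran far fewer tests.
lot = [412] * 30 + [409] * 18 + [77]
print(num_tests_suspects(lot))
```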

Page 14: Optimal+ GSA  2014


Escape Prevention – Test Ops

Excessive probing – when the test operation ignores the probe-mark spec for a device and keeps on probing to recover yield

Page 15: Optimal+ GSA  2014


Escape Prevention – Test Program

Human error is one of the main contributors to test escapes and RMAs. Here the PE commented out a few blocks in the TP for debug and forgot to uncomment them before production release.

Traditional SBL is designed to detect yield issues in which a specific bin count spikes. However, human error can result in a drop to 0, which is missed.

SBL example: soft bin 11 dropped from ~3% to 0 following the new TP revision.
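The gap described above is easy to close with a complementary check: compare bin distributions across TP revisions and flag soft bins that historically fired but are now absent. A sketch with illustrative bin numbers and a 1% floor:

```python
def vanished_bins(baseline_counts, current_counts, min_rate=0.01):
    """Sketch: classic SBL alarms on a bin-count spike, but a soft bin
    that silently drops to zero after a TP revision (e.g. a
    commented-out test block) goes unnoticed. Flag bins that fired at
    >= min_rate in the baseline but are absent now."""
    base_total = sum(baseline_counts.values())
    return sorted(
        b
        for b, c in baseline_counts.items()
        if c / base_total >= min_rate and current_counts.get(b, 0) == 0
    )

baseline = {1: 9400, 11: 300, 23: 250, 45: 50}  # bin 11 fires at ~3%
after_rev = {1: 9700, 23: 260, 45: 40}          # bin 11 vanished
print(vanished_bins(baseline, after_rev))
```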

Page 16: Optimal+ GSA  2014


Escape Prevention – Test Program

Extremely loose test limits may mask real test performance problems

(Chart: the test limits sit ~95 sigmas away from the measured distribution.)
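A quick way to quantify how loose the limits are is to measure their distance from the data in sigmas. A sketch with illustrative data:

```python
import statistics

def limit_sigmas(values, lo_limit, hi_limit):
    """Sketch: how many sigmas separate the test limits from the
    observed distribution? Limits dozens of sigmas out cannot catch
    real performance drift."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    return (mu - lo_limit) / sigma, (hi_limit - mu) / sigma

values = [1.00, 1.01, 0.99, 1.02, 0.98, 1.00, 1.01, 0.99]
lo, hi = limit_sigmas(values, lo_limit=0.0, hi_limit=2.0)
print(f"{lo:.0f} / {hi:.0f} sigmas of guard band")
```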

Page 17: Optimal+ GSA  2014


Advanced Quality Solutions


Page 18: Optimal+ GSA  2014

Data Feed Forward

Implementations:
• Within the same test area (e.g. WS, FT, etc.)
• Between test areas (e.g. from WAT to WS to FT)
• Within a single subcon
• Between multiple subcons (hub and spoke)
• Real-time (test program integration)
• Offline bin-switching

Example scenarios:
• Outlier Detection – drift analysis
• Pairing – cherry-picking for power & speed combinations
• Test program tuning
• SLT / Burn-in reduction

Page 19: Optimal+ GSA  2014


Data Feed Forward – Drift

(Diagram: the test program running the FT2 operation queries the database at the subcon: 1. it sends the part’s ECID; 2. it receives that part’s FT1 measurements.)

Real-time data! No test time impact!
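The drift check this enables can be sketched as: look up each part's FT1 value by ECID and compare with its FT2 value; a part that moved too far between insertions is suspicious even if both readings pass. The 0.1 delta is an illustrative assumption:

```python
def drift_outliers(ft1_by_ecid, ft2_by_ecid, max_delta=0.1):
    """Data-feed-forward drift sketch: at FT2 the test program fetches
    each part's FT1 measurement by ECID and compares. `max_delta` is
    an illustrative threshold."""
    return sorted(
        ecid
        for ecid, v2 in ft2_by_ecid.items()
        if ecid in ft1_by_ecid and abs(v2 - ft1_by_ecid[ecid]) > max_delta
    )

ft1 = {"A17": 1.00, "A18": 1.02, "A19": 0.99}
ft2 = {"A17": 1.01, "A18": 1.21, "A19": 0.98}  # A18 drifted after burn-in
print(drift_outliers(ft1, ft2))
```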

Page 20: Optimal+ GSA  2014

Quality Index

One or more numeric values representing the perceived quality of a part, based on:
• Wafer geography (e.g. edge vs. center)
• Outlier detection rule inputs (e.g. GDBN, Z-PAT, D-PAT, etc.)
• Number of iterations to PASS
• Overall lot/wafer yield
• Equipment health during test
• Parametric test results from multiple operations
• Etc.

(Diagram: quality rule inputs such as wafer geography and lot/wafer yield feed into the Quality Index.)
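One common shape for such an index is a weighted combination of normalized penalty inputs. The input names and weights below are illustrative assumptions, not Optimal+'s rule definitions:

```python
def quality_index(part, weights=None):
    """Quality-index sketch: fold several quality inputs into one
    score. Each input is a penalty in [0, 1]; a higher index means
    higher perceived quality. Names/weights are illustrative."""
    weights = weights or {
        "edge_die": 0.2,           # wafer geography: edge vs. centre
        "outlier_score": 0.4,      # GDBN / Z-PAT / D-PAT rule output
        "retest_penalty": 0.2,     # iterations needed to PASS
        "low_yield_penalty": 0.2,  # lot/wafer yield context
    }
    score = 1.0 - sum(w * part.get(k, 0.0) for k, w in weights.items())
    return max(0.0, min(1.0, score))

centre_die = {"edge_die": 0.0, "outlier_score": 0.1, "retest_penalty": 0.0}
edge_die = {"edge_die": 1.0, "outlier_score": 0.5, "retest_penalty": 0.5,
            "low_yield_penalty": 0.3}
print(quality_index(centre_die), quality_index(edge_die))
```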

Page 21: Optimal+ GSA  2014


“No Problem Found”

Combinations of chips causing issues:

(Diagram: a PCB carrying IC1, IC2, and IC3)

Page 22: Optimal+ GSA  2014


Smart Pairing

• New methodology to pair ICs for optimal compatibility

• Customer and suppliers agree on a recipe for the “Best Match” between ICs (e.g. based on power consumption and speed)

• A “Quality Index” is created based on manufacturing and test data to categorize chips

• Data is fed forward to assembly to ensure ICs are pre-sorted into “buckets” based on Quality Index

• MCPs and boards are assembled with well-matched components

(Diagram: both component populations pre-sorted into Grade A / Grade B / Grade C buckets before assembly)
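The bucket-and-match flow above can be sketched as follows; grading by a quality-index threshold is an illustrative stand-in for the agreed "Best Match" recipe (e.g. power vs. speed):

```python
def pair_by_grade(ics_a, ics_b):
    """Smart-pairing sketch: pre-sort two component populations into
    grade buckets and assemble like with like. Grade cut-offs are
    illustrative assumptions."""
    def grade(qi):
        return "A" if qi >= 0.9 else "B" if qi >= 0.7 else "C"

    buckets_b = {}
    for name, qi in ics_b:
        buckets_b.setdefault(grade(qi), []).append(name)

    pairs = []
    for name, qi in ics_a:
        mates = buckets_b.get(grade(qi))
        if mates:  # only pair when a same-grade mate exists
            pairs.append((name, mates.pop()))
    return pairs

ics_a = [("a1", 0.95), ("a2", 0.75), ("a3", 0.50)]
ics_b = [("b1", 0.55), ("b2", 0.92), ("b3", 0.72)]
print(pair_by_grade(ics_a, ics_b))
```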

Page 23: Optimal+ GSA  2014

Conclusions

Supreme quality requires a comprehensive end-to-end approach which takes into account problems arising from:
• Equipment
• Test Process
• Human Error
• Material
…and much more

Page 24: Optimal+ GSA  2014


Q&A
