EDITORIAL: RESEARCH IN INDUSTRIAL STATISTICS—PART I
QUALITY AND RELIABILITY ENGINEERING INTERNATIONAL
Qual. Reliab. Engng. Int. 2001; 17(6): iii–iv (DOI: 10.1002/qre.449)
Statistical quality control has been called one of the great technological innovations of the 20th century. It can be widely applied and is extremely effective, at least in part because the underlying theory is simple and the basic methods are straightforward. However, there are few areas in statistics where the gap between methodological development and application is so wide. In many industrial settings, the only technique employed for process monitoring is the basic Shewhart control chart. This is disturbing, because it means that many industrial organizations have not benefited from the very useful technical advances in these methods that have occurred over the last 50 years. There are many controversies among experts in this field and a number of contradictions in the literature regarding recommendations for good practice. These contradictions and controversies are partly responsible for this situation.
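To illustrate just how straightforward the basic method is, here is a minimal sketch of a Shewhart individuals chart with conventional 3-sigma limits; the function name and the moving-range estimate of sigma (divisor d2 = 1.128 for pairs) follow common textbook practice, but this is an illustrative sketch, not a reference implementation:

```python
import numpy as np

def shewhart_individuals(x):
    """Shewhart individuals chart: return the center line, the lower and
    upper 3-sigma control limits, and the indices of points beyond them.
    Sigma is estimated from the average moving range (d2 = 1.128)."""
    x = np.asarray(x, dtype=float)
    center = x.mean()
    mr_bar = np.abs(np.diff(x)).mean()   # average moving range of size-2 subgroups
    sigma_hat = mr_bar / 1.128
    ucl = center + 3.0 * sigma_hat
    lcl = center - 3.0 * sigma_hat
    signals = np.where((x > ucl) | (x < lcl))[0]
    return center, lcl, ucl, signals
```

A run on in-control data with one large injected shift would flag only the shifted observation, which is exactly the monitoring behavior described above.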
Typically, control charts are applied to complex or high-value-added processes, as these represent the greatest opportunity for improvement in many industries. When a control chart signals, the analyst must determine which parameters of the monitored process have changed and characterize the assignable cause. Pinpointing the time at which the assignable cause occurred is usually an essential part of these activities. This is the problem of determining a changepoint, for which there is a substantial literature; however, little of this work has found its way into process monitoring applications. Bayesian methods, time series modeling and analysis techniques, expert systems, and stochastic calculus are techniques that may be useful in this area.
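One simple estimator from that changepoint literature, shown here as a sketch rather than a recommendation: the maximum-likelihood estimate of the changepoint for a step shift in the mean of normal data with known in-control mean mu0, evaluated at the time of a chart signal. The function name is illustrative:

```python
import numpy as np

def mle_changepoint(x, mu0):
    """MLE of the changepoint for a step shift in the mean of i.i.d. normal
    observations with known in-control mean mu0. Returns the index t such
    that x[t:] are estimated to follow the shifted mean, by maximizing
    (n - t) * (mean(x[t:]) - mu0)**2 over t = 0..n-1."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    best_t, best_val = 0, -np.inf
    for t in range(n):
        val = (n - t) * (x[t:].mean() - mu0) ** 2
        if val > best_val:
            best_t, best_val = t, val
    return best_t
```

Given thirty in-control observations followed by ten shifted ones, the estimator recovers the shift time, which is precisely the diagnostic information an analyst needs after a signal.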
Control charts are usually applied in a monitoring mode, and adjustments are made to the process only after an assignable cause is isolated. Over time, this sequence of monitoring, diagnosis, and correction activities will reduce process variability. The control chart originated in the discrete parts industry, and its statistical framework is hypothesis testing. An alternative approach that has enjoyed great success in the chemical and process industries is engineering process control (EPC). In a typical EPC system, the process output is observed and, based on the current deviation of the output from a desired target, an adjustment is made to a manipulatable input variable that is designed to bring the output closer to target in subsequent periods. Popular EPC schemes include integral control and proportional-integral control. EPC requires (1) a model relating the manipulatable input variables to the output, (2) the ability to forecast the output, and (3) the capability to make adjustments quickly and easily. The statistical basis of EPC is parameter estimation, and it essentially operates by transferring variability in the output variable to the manipulatable input variable. Thus assignable causes are not directly identified and removed, but are adjusted out. There has been some work integrating these two approaches to reducing process variability, but there are many opportunities for innovative research in this area.
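A minimal sketch of the integral-control scheme mentioned above, under the simplifying assumption (mine, not the editorial's) that the output responds one-for-one, one period later, to the manipulatable input; the gain and the disturbance model are illustrative:

```python
def integral_control(disturbance, target=0.0, gain=0.2):
    """Simulate pure integral control: each period the output is the
    disturbance plus the current input setting, and the controller then
    adjusts the input by -gain * (output - target). Variability is thereby
    transferred from the output to the manipulatable input."""
    u = 0.0                           # manipulatable input setting
    outputs, inputs = [], []
    for d in disturbance:
        y = d + u                     # process output this period
        outputs.append(y)
        inputs.append(u)
        u += -gain * (y - target)     # integral adjustment
    return outputs, inputs
```

For a sustained step disturbance, the output decays geometrically back to target while the input settles at an offsetting value; the disturbance is adjusted out rather than identified and removed, exactly the contrast with the control-chart approach drawn above.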
Many of our research opportunities will be driven by the complex and highly diverse set of conditions in which modern manufacturing processes operate. Today's industrial environment is often data-rich and highly automated. For example, in a modern semiconductor fabrication facility, each wafer may pass through several hundred processing steps, and at each step numerous measurements are typically made, sometimes on individual die features. Several thousand wafers may pass through this facility weekly. This leads to a large database consisting of millions of records on hundreds of process variables and product characteristics. In these environments there is a significant need to be able to both detect and diagnose patterns, changes, and problems in process performance. Furthermore, because of the rate of production and the economic consequences, this all needs to be done in real time, not with the usual two-stage approach involving real-time detection and off-line diagnosis. This clearly involves the integration of statistical research with other fields, including various engineering disciplines, information and computer technology, and knowledge-based systems. Capturing
information about a complex manufacturing process and combining statistical analysis of the in-process data with information about the process technology, the sequence of product flow, and engineering knowledge can allow rapid and efficient diagnosis of assignable causes of unwanted process variability. This is an area with a broad collection of research problems for statisticians, spanning multivariate statistics, spatial statistics, statistical computing, statistical graphics, non-parametric methods, time series analysis, and many other technical specialties.
DOUGLAS MONTGOMERY
Copyright 2001 John Wiley & Sons, Ltd. Qual. Reliab. Engng. Int. 2001; 17(6): iii–iv