
Counting on IT - National Physical Laboratory
resource.npl.co.uk/.../counting_on_it/issue10.pdf

Page 1

The third and final year of the first Software Support for Metrology (SSfM) programme (April 1998 - March 2001) has again been full of achievements.

The set of SSfM best-practice guides has been extended from the five produced by the end of the second year to ten at the end of the programme. The following table lists all ten best-practice guides, which together with other SSfM reports are being included on a single CDROM, providing a “road map” to the outputs of the programme:

A number of other reports were published during the year. NEL produced one on continuous modelling in metrology, and NPL produced a set of reports on numerical software testing, including the results of testing functions in MathCAD and S-Plus and comparative studies of various implementations of standard deviation and linear regression algorithms.

There were also reports on guidelines for software selection and software tools for uncertainty evaluation, which were supplemented by conference and journal papers on topics such as data fusion, model validation, and uncertainty evaluation.

In addition, three training courses have been developed on measurement system validation, advanced uncertainty evaluation and statistical modelling, and testing the numerical accuracy of scientific software.

The National Physical Laboratory is operated on behalf of the DTI by NPL Management Limited, a wholly owned subsidiary of Serco Group plc

ISSUE 10 • SPRING 2001

MATHEMATICS AND SCIENTIFIC COMPUTING

Counting on IT

Contents

A THIRD SUCCESSFUL YEAR FOR SSfM 1-2

GENETIC ALGORITHMS IN METROLOGY 2-3

RANDOM TESTING - BRIDGING THE GAP 3

GUM - THE NEXT GENERATION 4-5

MODEL VALIDATION - BEST PRACTICE 6

CMM SOFTWARE APPROVED FOR PUBLICATION 6-7

LIMS - BEST PRACTICE 7

WAVELETS IN METROLOGY 8

NEAR FIELD TO FAR FIELD PREDICTION FOR UNDERWATER ACOUSTICS 9

METROS EXPANDS 10

DATA FUSION 11

GUIDELINES ON SELECTING AND USING SOFTWARE IN METROLOGY APPLICATIONS 12

TESTING THE NUMERICAL CORRECTNESS OF SCIENTIFIC SOFTWARE 13

A GLOBAL POSITIONING SYSTEM TIMING RECEIVER 14

BIOMETRIC AUTHENTICATION PERFORMANCE 15

LOOKING FORWARD TO SSfM-2 15-16

Guide No. | Title | Originator | Status
1 | Measurement System Validation | NPL | Published
2 | Development of Virtual Instruments | Adelard | Published
3 | Developing Software for Metrology - The Guide | Logica | Finalised
4 | Discrete Modelling | NPL | Published
5 | Software Reuse: Guide to METROS | NAG & NPL | Published
6 | Uncertainties and Statistical Modelling | NPL & NEL | Final Draft
7 | Testing Spreadsheet Models | NPL | Published
8 | Mixed Language Programming | NAG | Published
9 | Selection and Use of LIMS | Sira Test & Certification | Finalised
10 | Discrete Model Validation | NPL | Final Draft

Measurement Systems Validation Training Course

A third successful year for SSfM

Page 2

This year we have held a joint workshop on mathematical and statistical modelling, a workshop on data fusion, and a joint workshop on discrete and continuous modelling. We also participated very fully in the Advanced Mathematical and Computational Tools for Metrology conference held in Portugal in May.

The SSfM Club remains very active with 31 full members, 41 UK associates, 62 international associates and 81 NPL members.

Two Club meetings were held in the year. The University of Huddersfield hosted the first in September 2000, at which the topics covered were numerical analysis, continuous modelling, and uncertainty evaluation. The second meeting was held at NPL in February 2001, and the topics covered were statistical modelling, LIMS, looking back, and looking forward. Reports on club meetings are available to members via the SSfM web site, which also provides information on all the projects, and gives access to all publications and to the METROS web site.

For further information contact: Dave Rayner, extension 7040, e-mail: [email protected]

Genetic algorithms in metrology

Traditionally, optimisation problems within metrology are solved by a variety of methods, including gradient-based or other direction search approaches. These techniques can be effective and accurate for finding a local minimum. However, in problems where a global minimum is required these methods can be somewhat limiting, as once they find an appropriate solution they terminate, and will thus not find anything other than the local minimum.

In order for a solution method to find global minima effectively when required, searches must take place throughout the solution space rather than just in the vicinity of a potential minimum. Since a search in this manner by definition needs to be less focused, such techniques lose out in terms of efficiency. However, efficiency may be traded for the ability to find global solutions, and it is often possible to follow a global search with a more localised one to obtain the accuracy required.

Among a number of promising techniques that can be used to assess global minima is the application of genetic algorithms. These computational methods mimic evolutionary processes in nature to obtain the most favourable, or fit, solution.

Genetic algorithms work in the following way: the system under analysis is firstly characterised in terms of a population of “chromosomes” made up of a string of binary numbers. Each string characterises a particular state of the system. The population of possible candidates for global minima is then subjected to a number of “evolutionary” processes governed by probabilistic rules. A check for convergence is performed after the application of the algorithms, and the process continues until a suitable convergence is achieved.

Mathematical modelling workshop

AMCTM 2000 Meeting audience


Recently we faced the task of testing software that “bridged” two tools: taking the output of one and applying a complex transformation to produce input for the other. The two tools were NP-Tools, which models systems and automatically proves their properties, and FaultTree+, a reliability analysis tool.

A test suite for the transformation was developed manually. However, the sizes and the usefulness of the tests were limited, because each test required the construction of a complex data structure. The problem was to develop sufficiently large data structures to test the correctness and efficiency of the transformation. To do this manually would have taken too long and would have been error prone.

NPL has over the years developed a means to generate random self-checking programs, of any size, to test the object code produced by compilers, in particular for PASCAL (including variants), Ada and several other languages. This technique has produced program generators, which have found many bugs in commercial compilers, and has been used to give assurance for compilers in a safety-critical context.

The techniques used to generate random self-checking programs were used to generate random fault trees for FaultTree+, which were then used to test the software bridge. The self-checking of the results was done by using the automatic proof capabilities of NP-Tools.

The technique described above is very good at generating complex tests. It can be applied when the tests require complex structures but have relatively simple semantics, and there is a means to check the results automatically. It has been applied to the testing of compilers and software bridges. It is thought it could also be applied to the testing of interfaces.
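To make the idea concrete, here is a hedged sketch of random self-checking tests in Python. The tree generator, the flattening transformation (a stand-in for the real NP-Tools/FaultTree+ bridge) and the check-by-evaluation (in place of NP-Tools' automatic proof) are all invented for illustration.

```python
import random

random.seed(0)

def random_tree(n_events, depth):
    # Generate a random "fault tree": AND/OR gates over basic events.
    if depth == 0 or random.random() < 0.3:
        return ("event", random.randrange(n_events))
    gate = random.choice(["and", "or"])
    children = [random_tree(n_events, depth - 1)
                for _ in range(random.randint(2, 4))]
    return (gate, children)

def flatten(tree):
    # The "bridge" under test: merge children of nested gates of the
    # same type (semantics-preserving, by associativity).
    if tree[0] == "event":
        return tree
    gate, children = tree
    merged = []
    for child in (flatten(c) for c in children):
        merged.extend(child[1] if child[0] == gate else [child])
    return (gate, merged)

def evaluate(tree, state):
    if tree[0] == "event":
        return state[tree[1]]
    gate, children = tree
    results = [evaluate(c, state) for c in children]
    return all(results) if gate == "and" else any(results)

# Self-check: the original and transformed trees must agree.
failures = 0
for _ in range(100):                        # 100 random trees
    tree = random_tree(n_events=10, depth=4)
    flat = flatten(tree)
    for _ in range(50):                     # 50 random input states each
        state = [random.random() < 0.5 for _ in range(10)]
        if evaluate(tree, state) != evaluate(flat, state):
            failures += 1
print(failures)  # 0 - flattening preserves the tree's semantics
```

The point of the pattern is that the tests are generated, not hand-built, so arbitrarily large structures can be produced, and the check on the results is automatic.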

For further information contact: Graeme Parkin, extension 7104, e-mail: [email protected]

Page 3

Random testing - bridging the gap

Bridges can have complex structures

Due to the probabilistic nature of the process, it is possible to obtain a number of distinct near-optimal solutions that may be further analysed as necessary.

The basic rules governing the “evolution” of the population are as follows:

• selection, where a new population is obtained by weighted probabilistic selection from the previous population,

• crossover, where members of the population are crossed with each other to produce “child” strings,

• mutation, where a new population is created by randomly changing bit values, at a very low probability.

The three stages ensure that the overall “fitness” of the population improves probabilistically, but the search is maintained throughout the function space and is thus done in a truly “global” sense.

Genetic algorithms are a useful and straightforward method of treating global optimisation problems. They are particularly useful when applied to discrete problems, or problems that can be readily discretised. The techniques can be further manipulated to give a predefined number of minima, which may then be further distinguished by alternative qualitative criteria that are beyond the scope of the objective function.
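The three stages above can be sketched as a minimal binary genetic algorithm. The objective (minimise f(x) = (x - 151)² over 8-bit strings), the population size, rates and generation count are all invented for this sketch and are not taken from any SSfM project.

```python
import random

N_BITS, POP_SIZE, GENERATIONS = 8, 30, 60
CROSSOVER_RATE, MUTATION_RATE = 0.8, 0.02

def value(bits):
    return int("".join(map(str, bits)), 2)

def fitness(bits):
    return -(value(bits) - 151) ** 2        # higher is fitter

def select(pop):
    # Selection: weighted probabilistic selection from the previous
    # population (weights shifted so they are all positive).
    offset = min(fitness(c) for c in pop)
    weights = [fitness(c) - offset + 1 for c in pop]
    return random.choices(pop, weights=weights, k=len(pop))

def crossover(a, b):
    # Crossover: cross pairs of members to produce "child" strings.
    if random.random() < CROSSOVER_RATE:
        cut = random.randrange(1, N_BITS)
        return a[:cut] + b[cut:], b[:cut] + a[cut:]
    return a[:], b[:]

def mutate(bits):
    # Mutation: flip bit values with a very low probability.
    return [1 - b if random.random() < MUTATION_RATE else b for b in bits]

random.seed(1)
pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP_SIZE)]
best = max(pop, key=fitness)
for _ in range(GENERATIONS):
    pop = select(pop)
    pop = [child for i in range(0, POP_SIZE, 2)
           for child in crossover(pop[i], pop[i + 1])]
    pop = [mutate(c) for c in pop]
    best = max(pop + [best], key=fitness)   # track best-so-far

print(value(best))                          # close to the global minimum, 151
```

A local refinement step (e.g. a gradient-based search started from the best string) would typically follow this global phase, as described above.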

For further information contact: Mark Dainton, extension 7012, e-mail: [email protected]

Page 4

The Guide to the Expression of Uncertainty in Measurement (GUM), published by ISO, has been very widely used and accepted. Its release in the mid-1990s has had a dramatic effect in improving the evaluation and reporting of measurement uncertainty.

Although the GUM is an extraordinarily rich document, it has generally been implemented in software in ways that can sometimes lead to invalid results. Moreover, GUM users are seeking further guidance, particularly for the more complicated situations or when the uncertainties are large. Further work, with SSfM involvement, is aimed at addressing these issues.

Current GUM. Use of the current version of the GUM (or UKAS M 3003) is widespread. A measurement model is developed that relates the inputs (quantities that can be measured or obtained from suppliers’ data, calibration certificates, etc.) and their uncertainties to the output (the measurand or physical quantity) and its uncertainty. These input values and their uncertainties are then propagated through the model to provide a measurement result and its uncertainty. Finally, an expanded uncertainty, viz., an interval likely to cover say 95% of the values that could plausibly be ascribed to the measurand, is obtained.
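As a minimal sketch of this propagation step (the model and all numbers are invented for illustration, not drawn from any calibration), take an explicit two-input model R = V / I with independent inputs, and apply the GUM's first-order formula u(R)² = (∂R/∂V)² u(V)² + (∂R/∂I)² u(I)²:

```python
import math

# Illustrative inputs: a voltage and a current with standard uncertainties.
V, u_V = 10.0, 0.02          # volts
I, u_I = 2.0, 0.01           # amperes

R = V / I                    # the measurement result
dR_dV = 1.0 / I              # sensitivity coefficients
dR_dI = -V / I ** 2
u_R = math.sqrt((dR_dV * u_V) ** 2 + (dR_dI * u_I) ** 2)

k = 2.0                      # coverage factor for ~95% (normality assumed)
print(f"R = {R:.3f} ohm, u(R) = {u_R:.4f} ohm, U = {k * u_R:.4f} ohm")
```

The choice of coverage factor k = 2 is itself an approximation that rests on the output distribution being close to normal, which is exactly the kind of assumption the revision discussed below addresses.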

So where’s the problem? First, it is often difficult to assign appropriate distributions (rectangular, normal, etc.) to the inputs, from which the input uncertainties are obtained. Second, the propagation has to be carried out reliably. If the first part can be carried out satisfactorily, then usually there will be no problem, since the GUM gives a clear but approximate prescription for uncertainty propagation. The difficulty is that it is hard to say when this prescription will give us the answers we want and when it might let us down. Generally, we want to determine a measurement result and its associated coverage interval to a particular number of figures. Typically, we would be looking for just one correct figure in the uncertainty, e.g. a result of the form (50.000 84 ± 0.000 06) mm. Can we be sure of the 0.000 06? Should it be 0.000 04 or 0.000 08, for example? The importance of this point depends on the application, but in terms of the credentials of a laboratory or in a competitive environment, the differences can be very relevant.

GUM revision. The revision will assist in deciding whether the limitations of the GUM apply in any particular circumstance. It will cover what the validity of results means, and how valid results can be obtained. The current GUM concentrates on a certain class of models; the revision will cover a wider range of model types:

(a) measurements where there is more than one output, e.g. the parameters of a calibration curve

(b) circumstances where it is not possible to express the output explicitly in terms of the inputs, e.g. in some flow measurement models

(c) measurements involving complex variables as arise in electrical, acoustical and optical metrology

Two phases. The GUM revision will handle the problem of the correctness of the output uncertainty by dividing the process into two phases:

• Phase 1 (formulation): developing the model, deciding its inputs, and assigning their distributions.

• Phase 2 (evaluation): the propagation of these distributions through the model to obtain the distribution of the output. The central 95% of this distribution gives the required expanded uncertainty.

How will this help? Phase 1 is the province of the metrologist, perhaps with statistical support. Phase 2 can be carried out, using software or by hand, but without any metrological knowledge. Once the metrologist has finalised the model and the inputs, phase 2 can be undertaken as accurately as required in terms of that information. There is no further source of uncertainty as a result of approximations, as in the GUM today.

The solution process is designed to return a valid answer for the posed problem. It is then down to the metrologist to investigate the effects of his assumptions, perhaps by varying an input distribution to see the changes in the output and its uncertainty. Whatever he does will be under his control, and not exacerbated by numerical effects that are unknown to him.

No change! There will be no change to the GUM as we know it, in order to protect the considerable implementation investment made by many organisations. The GUM revision will take the form of supplemental guides. The first, concerned with the propagation of distributions, will promote the two-phase process described above, especially providing approaches for obtaining the output distribution. The recommended tool is Monte Carlo Simulation (MCS), a computer-intensive process that samples many times from the input distributions, and, by evaluating the model for each sample, provides a distribution for the output. Unless the model is

GUM - The next generation

A North Sea oil export metering system. Courtesy of FLOW Ltd., which has developed a range of generic models as the basis for evaluating the uncertainty of typical metering systems. With regular calibration of the meters and other instrumentation the overall measurement uncertainty is better than 0.25% of the reading, to comply with Department of Trade and Industry and Norwegian Petroleum Directorate regulations.

Page 5

very complicated, a valid answer for phase 2 can be established in minutes, or even seconds in simple cases.

Models. The second supplemental guide will provide a classification of models: one or more than one output, a formula for the output or an equation to be solved for it, real or complex variables; 2 by 2 by 2 = 8 categories in all. Advice on how to handle each category will be given, plus an example, with formulae or an approach that is readily implemented.

GUM validation. An important aspect of MCS is that it can be employed to validate the use of the GUM. If the GUM and MCS results agree in the above sense, the GUM approach can be taken as valid for the application and for sufficiently similar applications. If there is disagreement, there is cause for investigation. A conclusion could be that the GUM is inapplicable for the problem and MCS should be considered instead.
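A hedged sketch of the MCS idea, using an invented two-input model R = V / I with normal input distributions (all values illustrative): sample the inputs, evaluate the model for each sample, and read the coverage interval off the central 95% of the output distribution.

```python
import random
import statistics

random.seed(42)
M = 200_000                        # number of Monte Carlo trials (illustrative)

samples = []
for _ in range(M):
    V = random.gauss(10.0, 0.02)   # input distributions assigned in phase 1
    I = random.gauss(2.0, 0.01)
    samples.append(V / I)          # evaluate the model for each sample

samples.sort()
lo, hi = samples[int(0.025 * M)], samples[int(0.975 * M)]
mean = statistics.fmean(samples)
print(f"R = {mean:.4f}; 95% coverage interval [{lo:.4f}, {hi:.4f}]")
```

If this interval agrees, to the number of figures required, with the mean plus or minus k times the combined standard uncertainty from the GUM's first-order propagation, the GUM approach is validated for the application; a discrepancy is the cue to investigate.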

User impact. MCS can be introduced alongside the current use of the GUM. Only if the GUM is producing inadequate results need any change be made. Of course, a change should be necessary in any circumstance where the approach used does not deliver results that are fit for purpose. The modelling guidance will extend GUM usage to a wider range of models, but leave matters unaffected in terms of “conventional” models.

The SSfM angle. A major part of the first SSfM programme was concerned with the development, application and promotion of sound uncertainty principles. We have been strongly guided by four workshops, attended by over 250 delegates, themed sessions at conferences such as the international Advanced Mathematical and Computational Tools in Metrology series, national and international committee work, and numerous additional contacts. This emphasis will continue in SSfM-2. A best-practice guide resulting from SSfM is available from the SSfM web site, and a training course has been developed.

Applications of the SSfM work, in the MCS and the modelling senses, have been made to a large number of projects. They range from inter-laboratory comparisons through co-ordinate measuring machines to radiation emitted by domestic appliances. The MCS principle has been applied by industry to large-scale flow metering in the oil industry and other areas.

The future. The Joint Committee for Guides in Metrology (JCGM) is responsible for GUM revision. BIPM, OIML, ISO, IEC, IUPAC, IUPAP, IFCC and ILAC are represented on JCGM. Experts from the national measurement institutes, all concerned with the needs of users, also sit on JCGM. The UK, benefiting greatly from the SSfM programme, has made and will continue to make substantial input to this work.

For further information contact: Maurice Cox, extension 6096, e-mail: [email protected]

EMC radiated emission measurement of a substitute electronic product inside a fully anechoic chamber at NPL over the frequency range 30 MHz to 1 GHz. The cables simulate a mains supply and a connection to peripheral equipment.

Page 6

The quality of the information we extract from experimental data depends on the quality of the mathematical model used to describe the system. If the mathematical model inadequately represents the actual behaviour of the system, or the way we analyse the measurement data is not appropriate for the model, then the results we derive could be invalid. As part of the SSfM validation theme, we have looked at case studies to examine validation issues in all aspects of modelling, including:

a) the specification of the functional model in terms of equations relating the various parameters and variables

b) the specification of the statistical model for the measurement errors

c) the method of solution for the model parameters

d) estimation of the uncertainties of the fitted parameters

One suite of case studies concerned roundness assessment of manufactured artefacts from measurement data. While the basic functional model is easy to define, the complete model depends crucially on the systematic behaviour of the measuring instrument and the form errors of the artefact, and if these are not modelled appropriately the estimated parameters and their uncertainties can be wildly unrealistic.

A second suite of case studies concerned models of measurement systems where the focus of attention was on taking into account the measurement sensor behaviour and the multiple correlation in the measurement errors.

These case studies feature in the Best Practice Guide on Discrete Model Validation.

Further Reading: Software Support for Metrology Best Practice Guide 4: Discrete Modelling, M G Cox, A B Forbes and P M Harris, National Physical Laboratory.

Software Support for Metrology Best Practice Guide 10: Discrete Model Validation, R M Barker and A B Forbes, National Physical Laboratory.

See the SSfM website for published guides.

For further information contact: Alistair Forbes, extension 6348, e-mail: [email protected]

Model validation - best practice


Measurement strategy can have a large effect on roundness assessment

CMM software standard approved for publication

Co-ordinate measuring machines (CMMs) are widely used in industrial metrology as part of the inspection of manufactured components. They return co-ordinate measurements of points on the surface of a real feature. In order to yield useful information from these measurements, a common requirement is to fit an associated geometric feature, such as a line, circle, plane or cylinder, to a data set consisting of such points.

The parameters of the fitted feature, which describe the size, shape, position and orientation of the feature, are compared with design tolerances in order to decide the extent to which the feature satisfies dimensional and positional specifications. Typically, the fitting of geometric features to co-ordinate data is undertaken using software. Consequently, the reliability of the information derived from the measurement data, which crucially impacts on the decision to accept or reject a part, is influenced by the quality of the software for computing these features.
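As a hedged sketch of such feature fitting (using the simple algebraic least-squares circle fit, not the exact method any particular CMM package implements), one can fit x² + y² + Dx + Ey + F = 0 to the data in the least-squares sense and recover centre (-D/2, -E/2) and radius √(D²/4 + E²/4 - F):

```python
import math

def fit_circle(points):
    # Build the 3x3 normal equations A^T A p = A^T b for p = (D, E, F).
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for x, y in points:
        row, rhs = (x, y, 1.0), -(x * x + y * y)
        for i in range(3):
            b[i] += row[i] * rhs
            for j in range(3):
                A[i][j] += row[i] * row[j]
    # Solve by Gaussian elimination with partial pivoting.
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        b[c], b[p] = b[p], b[c]
        for r in range(c + 1, 3):
            f = A[r][c] / A[c][c]
            b[r] -= f * b[c]
            for j in range(3):
                A[r][j] -= f * A[c][j]
    sol = [0.0] * 3
    for r in (2, 1, 0):
        sol[r] = (b[r] - sum(A[r][j] * sol[j] for j in range(r + 1, 3))) / A[r][r]
    D, E, F = sol
    return -D / 2, -E / 2, math.sqrt(D * D / 4 + E * E / 4 - F)

# Synthetic data: 12 points on a circle of radius 5 centred at (2, 3).
pts = [(2 + 5 * math.cos(t), 3 + 5 * math.sin(t))
       for t in [k * 2 * math.pi / 12 for k in range(12)]]
cx, cy, r = fit_circle(pts)
print(round(cx, 6), round(cy, 6), round(r, 6))  # → 2.0 3.0 5.0
```

Testing such software against reference data sets with known answers, as above, is exactly the methodology promoted in the SSfM software testing project and the new standard described below.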

Using a CMM to inspect a manufactured component

Page 7

Laboratories produce a vast amount of information, which, because of the regulatory climate, needs to be retained for long periods and must be readily retrievable for processing at any time. The information can be disparate - it might include test data, staff training and workload records, for example. Laboratory Information Management Systems (LIMS) can bring major benefits in terms of productivity, resource management, reduction of paperwork and the efficient handling of customer enquiries.

Selecting and implementing a LIMS is a non-trivial task. Bespoke software or off-the-shelf packages may be used, the trend being to choose packages that can be configured to meet the laboratory’s needs. Thorough preparation of the system specification is essential, and it is recommended that a laboratory audit be undertaken to define what will be required. This will include a review of the workflow, the instruments, the reports, security, and the systems to which the LIMS will have to interface. The specification will also consider the retention of records in electronic or paper format, the control of the software itself and the medium on which it is stored.

Thought will also need to be given, and assurances received, about an upgrade and development path: to obtain a return on the substantial investment required, a LIMS should be seen as a long-term tool lasting at least eight years.

The performance of a LIMS is vital to the laboratory, and one of the keys to realising the desired performance is to build a good partnership with the supplier. Planning, an agreed scope, and flexibility are also important in getting the best from the system.

It can take at least six months, and often much longer, to install, prove and adopt a LIMS, and it is crucial that the supporting documentation and training offered by the vendor are of high quality. The requirements will change, not least because users will become more aware of the capabilities of the system, and there needs to be a scheme for change management.

The system needs to be validated, and problems can arise here. In most cases it is not possible to understand fully the intricacies of the software, and a ‘black box’ approach needs to be adopted. This treats the system as an entity, providing inputs (test data) and validating the outputs. Consideration needs to be given to ‘what if’ tests, as well as thorough preparation of usual scenarios. Calculations, including roundings, need to be closely checked.
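A hedged sketch of the ‘black box’ idea: feed known inputs to a calculation and validate the outputs, paying particular attention to roundings. Here `report_mean` is an invented stand-in for a calculation inside a LIMS; a real validation would drive the installed system itself, not a local function.

```python
from decimal import Decimal, ROUND_HALF_UP

def report_mean(values, decimals):
    # Mean reported to a fixed number of decimals, rounding half-up
    # (as a lab report typically would, unlike Python's round(), which
    # rounds half to even).
    mean = sum(Decimal(str(v)) for v in values) / len(values)
    return float(mean.quantize(Decimal(10) ** -decimals,
                               rounding=ROUND_HALF_UP))

# A usual scenario plus 'what if' rounding edge cases.
assert report_mean([1.0, 2.0, 3.0], 2) == 2.0
assert report_mean([0.0625, 0.0625], 3) == 0.063   # half-up, not banker's
assert report_mean([1.005], 2) == 1.01
print("black-box rounding checks passed")
```

The edge cases matter because two systems can agree on typical data yet disagree on half-way roundings, which is exactly the kind of discrepancy a ‘what if’ test is designed to expose.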

The product alone will not guarantee a successful installation. The right approach and methodology must be followed, and the chosen system should be based on future technology needs, selected after a thorough search to find a supplier with the desired interpersonal skills. With this in mind, once commissioning is complete, there are enormous benefits to be gained from the installation of a LIMS.

Further Reading: Software Support for Metrology Best Practice Guide 9: Selection and Use of LIMS, Sira Test & Certification.

This is available from the SSfM website.

For further information contact: Steve Lower, 020 8467 2636, e-mail: [email protected]

LIMS - best practice

LIMS can bring benefits to a wide variety of measurement laboratories

ISO committee TC213/WG10 (whose meetings are attended by NPL) has approved a standard concerned with testing software for computing best-fit geometric elements [1]. The standard is expected to be published in Summer 2001, and will underpin NPL’s work in testing and validating such software. The new standard utilises the methodology promoted in the SSfM software testing project for the use of reference data sets and the comparison of test and reference results.

Reference
[1] ISO 10360-6:2001(E), Geometrical Product Specifications (GPS) - Acceptance test and reverification test for coordinate measuring machines (CMM), Part 6: Estimation of errors in computing Gaussian associated features.

For further information contact: Peter Harris, extension 6961, e-mail: [email protected]

Page 8

Wavelets in metrology

Traditionally, in spectral analysis, techniques based on the Fourier transform are used to relate a signal that varies in time or space to its frequency-domain representation. However, such methods have limitations when applied to non-stationary signals.

Within the last 20 years, an approach using “wavelets” has been developed. This uses a different set of basis functions from those used in Fourier analysis. Wavelets provide a general set of basis functions that are localised both in time and frequency.

Within the scientific community, wavelet analysis has been applied successfully to a wide range of problems, including the recognition of features in signals, signal compression and de-noising. Wavelets have also been used as a basis in numerical schemes. To date these techniques have not, however, generally been exploited in metrology.

While the traditional Fourier-based signal processing techniques have proved effective for the majority of applications, a capability in wavelet analysis would significantly add to the available techniques and enable more general forms of signal to be studied.

The last eighteen months have seen research undertaken at NPL to investigate the potential for using wavelets in the metrology environment, and to increase awareness within NPL of their usefulness. In addition, the application of wavelets to a specific acoustic application, namely the calibration of underwater electroacoustic transducers in reverberant laboratory tanks, is being investigated.

Potential applications of wavelets in this acoustics area include de-noising and estimation of echo-arrival times (an example of feature recognition). Figure 1 shows a sample signal originating from the calibration of a hydrophone, while Figure 2 shows the signal obtained after de-noising using wavelets. Figure 3 contains a visualisation of a so-called “continuous wavelet transform”, usually abbreviated to CWT and analogous to the Fourier transform, of the original signal, which illustrates its non-stationary characteristics.

The aim throughout the research project is to combine the strengths of wavelet analysis (e.g. sparse representation of signals, feature detection, etc.) with existing techniques, to form a more complete set of analytical tools.
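An illustrative de-noising sketch using the Haar wavelet, the simplest wavelet basis (the NPL work would use more sophisticated wavelets and thresholding rules; the signal and noise level here are invented): transform, soft-threshold the detail coefficients, and invert.

```python
import math
import random

def haar_forward(x):
    # One full Haar decomposition; input length must be a power of two.
    levels = []
    while len(x) > 1:
        approx = [(a + b) / math.sqrt(2) for a, b in zip(x[::2], x[1::2])]
        detail = [(a - b) / math.sqrt(2) for a, b in zip(x[::2], x[1::2])]
        levels.append(detail)
        x = approx
    return x, levels                 # final approximation + details per level

def haar_inverse(approx, levels):
    x = approx
    for detail in reversed(levels):
        out = []
        for a, d in zip(x, detail):
            out += [(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)]
        x = out
    return x

def denoise(signal, threshold):
    # Soft-threshold every detail coefficient, then reconstruct.
    approx, levels = haar_forward(signal)
    soft = lambda d: math.copysign(max(abs(d) - threshold, 0.0), d)
    return haar_inverse(approx, [[soft(d) for d in lvl] for lvl in levels])

# A noisy sine as a stand-in for a hydrophone calibration signal.
random.seed(3)
n = 256
clean = [math.sin(2 * math.pi * 4 * k / n) for k in range(n)]
noisy = [c + random.gauss(0, 0.3) for c in clean]
smooth = denoise(noisy, threshold=0.5)

rmse = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / n)
print(rmse(noisy, clean), rmse(smooth, clean))  # de-noised RMSE is smaller
```

Because the transform is orthonormal, the noise spreads evenly over the coefficients while the signal concentrates in a few large ones, which is why thresholding the small coefficients removes mostly noise.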

For further information contact: Ian Smith, extension 7071, e-mail: [email protected]

Figure 1: a sample signal from the calibration of a hydrophone
Figure 2: the signal after de-noising using wavelets
Figure 3: the continuous wavelet transform of the original signal

Page 9

Near-field to far-field prediction for underwater acoustics

Manufacturers of underwater acoustic transducers want to know the performance characteristics of their devices in the acoustic far-field. However, for some transducers, direct measurement in the far-field requires a large volume of water such as a lake or reservoir, which means making test measurements is an expensive and time-consuming task. As a solution to this problem, NPL has been investigating methods for predicting the far-field response of a transducer from measurements made in the near-field in a laboratory tank. NPL operates a large open tank (5.5 m diameter, 5 m deep, see Figure 4) equipped with a high-resolution positioning system, which is used for the near-field measurement scans.

The propagation of acoustic waves of constant frequency in homogeneous media may be described using the Helmholtz equation, which can be reformulated as an integral equation over a closed surface. This means that if the pressure distribution over a closed surface is known, the pressure value anywhere else can be calculated.
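In its standard (Kirchhoff–Helmholtz) form, assuming a time-harmonic pressure field $p$ with wavenumber $k$ and the free-space Green's function $G$, this reformulation reads (the precise sign and normalisation conventions used in the NPL work may differ):

```latex
\nabla^2 p + k^2 p = 0, \qquad
p(\mathbf{r}) = \oint_S \left[
  p(\mathbf{r}_s)\,\frac{\partial G(\mathbf{r},\mathbf{r}_s)}{\partial n_s}
  - G(\mathbf{r},\mathbf{r}_s)\,\frac{\partial p(\mathbf{r}_s)}{\partial n_s}
\right] \mathrm{d}S, \qquad
G(\mathbf{r},\mathbf{r}_s) = \frac{e^{\mathrm{i} k |\mathbf{r}-\mathbf{r}_s|}}{4\pi\,|\mathbf{r}-\mathbf{r}_s|}
```

The surface integral shows why knowing the pressure (and its normal derivative) on a closed surface $S$ around the transducer suffices to compute the pressure at any exterior point $\mathbf{r}$.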

Boundary element methods provide a numerical technique for solving the Helmholtz integral equation, and NPL has carried out an investigation of such methods. An initial study identified two techniques that were suitable for this problem, and work has concentrated on the validation and comparison of these two methods. Software is currently being developed that will process the scanning device output and calculate results at user-specified points in the far-field.

The pictures show a typical example of a measurement data set on a cylinder (figure 5) made up of about 29000 points and a far-field prediction (figure 6) on a cylinder with a radius 20 times larger, calculated using about 7200 values. The next areas of investigation will be uncertainty estimation and understanding how the results depend on the number and distribution of measurement points.

For further information contact: Louise Wright, extension 6466, e-mail: [email protected]

[Figure 4: the NPL open tank. Figure 5: the measured near-field data set on a cylinder. Figure 6: the predicted far field.]


METROS expands

The SSfM Software Re-use project is now complete, with the prototype METROS library for re-usable software as its major output. The METROS website, www.npl.co.uk/ssfm/metros/, contains specifications of over 50 key functions, and implementations of some of those key functions, provided as source in languages such as Fortran and MATLAB, and as compiled DLLs which allow them to be called from Excel, for example.

The METROS website will be central to the delivery of various outputs from the next SSfM programme. Algorithms for discrete modelling and for uncertainty analysis will be included as key functions, and existing and new software from NPL will be included as implementations of these key functions. We will also extend METROS to include bundled software such as packages and libraries, guidance material such as the existing SSfM best practice guides, and reference data sets and results for testing mathematical software, including results from a project to test contributions to METROS.

The software testing project has made extensive use of reference data sets and these are already available through METROS at: www.npl.co.uk/ssfm/metros/reference_data_sets/. The structure is similar to the rest of METROS: there is a page for each function to be tested, and these pages point to the individual reference data sets, which are indexed by various parameters. There is also guidance on how the data sets should be used and reference material on how they were produced.

NPLFIT

A software package, NPLFIT, has been available for some time for fitting calibration curves and other experimental data. In its original file-based version it had some limitations, but an enhanced version has just been released which provides a graphical user interface to the package with a Windows “feel”.

NPLFIT will be made available on METROS in various forms. Implementations of particular functions from the library are implementations of key functions. The library as a whole will be made available as DLLs, and the NPLFIT-GUI will be available as a package, providing a user interface to the library functions.

NPLFIT provides a coherent and usable interface to curve-fitting software representing current best practice. The package enables two important classes of models, polynomials and splines, to be fitted to data. These models offer considerable flexibility as the algorithms used allow polynomials of high degree, and splines with many knots. In particular, the “best fit” in terms of polynomial degree or number of spline knots can be reliably obtained. The fitting criteria available are 2-norm (least-squares), 1-norm (min-sum-mod) and infinity-norm (min-max).
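The three fitting criteria minimise different norms of the residuals $r_i = y_i - f(x_i)$ between the data and the fitted function $f$:

```latex
\min_f \sum_i r_i^2 \;\;(\text{2-norm, least-squares}), \qquad
\min_f \sum_i |r_i| \;\;(\text{1-norm, min-sum-mod}), \qquad
\min_f \max_i |r_i| \;\;(\infty\text{-norm, min-max}).
```

The 1-norm is comparatively robust to outlying points, while the infinity-norm bounds the worst-case deviation.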

The package has controls such as buttons and boxes, which are easily operated, to allow the user to select from a family of models the one that is most appropriate for the data, and to enable the fitted function to be evaluated, differentiated and integrated. This version is more versatile in that, for example, it allows the user to evaluate the fitted function for any selection of points. For fits obtained using the 2-norm, measures of standard uncertainties can be extracted, and the choice of operations has been extended to include the analysis of features, such as extrema and points of inflexion, of the spline.

Figure 7 shows a window displaying the choices for a data file. The results are displayed graphically on a window and they can be plotted on a printer or saved to a file: see figure 8.

For further information on METROS contact: Robin Barker, extension 6348, e-mail: [email protected]
For NPLFIT contact: Paul Kenward, e-mail: [email protected]

Figure 7: the ‘Choices and Visualisation’ window. Figure 8: a window showing a graph of a fitted spline.


Data fusion

The SSfM data fusion project recently sponsored a second workshop on data fusion in metrology. This was held at NPL on 31 October 2000 and, in contrast to the first workshop held a year earlier, was directed solely at NPL scientists. It consisted of three presentations on the data fusion work undertaken at NPL.

Alistair Forbes showed how data fusion was applicable in many metrology situations, and highlighted the prevalence of multiple-sensor measurements within the laboratory. For instance, with gauge block calibration using interferometry, air temperature, pressure, humidity, artefact temperature, and the coefficient of thermal expansion (and their uncertainties) can all be seen as factors that need to be fused to obtain a reliable measurement result. In this example, measurements are fused over attributes. There are other examples where fusion is made over a set of measurements taken over a time interval, or where the measurements come from many similar sensors, or are from overlapping parts of a physical spectrum.

The data fusion work was shown to be related to the discrete modelling best practice guide, and was described in terms of parameter estimation for multi-sensor systems.

Gavin Kelly presented some work on Bayesian statistics and their role in data fusion. The case was argued that “Bayesian priors” are central to carrying out data fusion, as they provide an ideal way of carrying information from one experiment or calculation to the next.

If we make a series of measurements pertaining to mass, for example, then in a classical interpretation we would estimate the true mass to be the one that most likely explains the measurement data. In a Bayesian world, we update our prior belief using the measurement data, and can then find the value of mass that we have the most belief in. As we take more and more measurements, these views will generally converge.
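For a normally distributed prior and normally distributed measurements, this updating has a simple closed form. The sketch below (with hypothetical numbers, not NPL's software) shows the posterior being pulled from a vague prior towards the data as measurements accumulate:

```python
def update_normal(prior_mean, prior_var, measurement, meas_var):
    # Conjugate update of a normal prior by one normal measurement:
    # posterior precision = sum of precisions,
    # posterior mean = precision-weighted average of prior and measurement
    post_var = 1.0 / (1.0 / prior_var + 1.0 / meas_var)
    post_mean = post_var * (prior_mean / prior_var + measurement / meas_var)
    return post_mean, post_var

# Hypothetical prior belief about a mass: 1000 g with a large variance,
# updated by five measurements each with variance 0.04 g^2
mean, var = 1000.0, 100.0
for y in [999.2, 999.4, 999.3, 999.1, 999.3]:
    mean, var = update_normal(mean, var, y, 0.04)
```

After the five updates the posterior mean sits very close to the average of the measurements (the vague prior carries almost no weight), and the posterior variance has shrunk well below that of a single measurement, mirroring the convergence of the Bayesian and classical views described above.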

The speaker argued that a Bayesian approach not only allowed us to carry measurement data between experiments, but also allowed us to incorporate a wider range of information than just measurement data. For instance, the graph in figure 9 shows the results of using Bayesian priors to build into our uncertainty analysis the knowledge that a certain parameter is non-negative.

Finally, Robin Barker gave an overview of the work being carried out for the INTErSECT project, wherein a review of data fusion techniques is being undertaken. An early version of the resultant guide to data fusion techniques will soon be available at: www.datafusion.org.uk

For further information contact: Gavin Kelly, extension 6975, e-mail: [email protected]

[Figure 9: 90% and 95% Bayesian confidence intervals for a positive parameter.]


Guidelines on selecting and using software in metrology applications

A report providing information and guidelines to help users select and use software in such a way that it is “fit for purpose” has been completed as part of the SSfM programme (CMSC Report 04/00). Fitness for purpose is defined in terms of meeting requirements relating to the numerical accuracy of the results returned by the software.

Four subjects are covered in the report:

• factors that affect how users decide fitness for purpose requirements, accounting for the possible ways that the results returned by scientific software may subsequently be used

• approaches for verifying or testing the numerical accuracy of scientific software

• how to make better use of existing software by pre-processing the input data and using appropriate model parameterisations

• help in locating software that is relevant to metrology applications

Figure 10 illustrates an example from the report. It shows some measurement data (as black circles) and a calibration curve (blue line) established as the least-squares best-fit quadratic to the data. Suppose we wish to communicate the calibration curve to a user by providing the values of the parameters defining the curve rounded to four decimal places (so that, for example, they fit on a calibration certificate). When the user comes to evaluate the calibration curve he obtains the curve shown as a red line, and the errors associated with this curve are evidently larger than the measurement errors associated with the data.

The example shows how important it is for the requirements placed on the accuracy of the results returned by software to reflect their intended use: in this case four decimal places of accuracy is insufficient. Here, a better way to establish the calibration curve is to pre-process the data before fitting so that the x-values are normalised to lie in the interval [-1, +1]; then the curves defined by the parameters before and after rounding are (essentially) identical.
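The effect is easy to reproduce. The sketch below uses a hypothetical quadratic on [100, 101] (illustrative coefficients, not those of CMSC Report 04/00) and compares rounding the coefficients in the raw variable with rounding them after normalising to [-1, +1]:

```python
# Hypothetical calibration quadratic on x in [100, 101]:
#   y = 2.5 + 4.1234567*(x - 100) + 1.2345678*(x - 100)**2

def quad(c2, c1, c0, v):
    return c2 * v * v + c1 * v + c0

def r4(v):
    # round to four decimal places, as on a calibration certificate
    return round(v, 4)

xs = [100.0 + 0.05 * i for i in range(21)]
ys = [quad(1.2345678, 4.1234567, 2.5, x - 100.0) for x in xs]

# Expanded in powers of the raw variable x, the coefficients are large and
# nearly cancelling, so four decimal places cannot reproduce the curve:
a2 = 1.2345678
a1 = 4.1234567 - 200.0 * a2
a0 = 2.5 - 100.0 * 4.1234567 + 10000.0 * a2
err_raw = max(abs(quad(r4(a2), r4(a1), r4(a0), x) - y) for x, y in zip(xs, ys))

# The same curve in the normalised variable t = 2*(x - 100.5), t in [-1, +1],
# has O(1) coefficients, and rounding those is harmless:
b2 = 0.25 * 1.2345678
b1 = 0.5 * (4.1234567 + 1.2345678)
b0 = 2.5 + 0.5 * 4.1234567 + 0.25 * 1.2345678
ts = [2.0 * (x - 100.5) for x in xs]
err_norm = max(abs(quad(r4(b2), r4(b1), r4(b0), t) - y) for t, y in zip(ts, ys))
```

With these numbers the raw-variable rounding error is a few tenths, dominated by the rounding of the x² coefficient multiplied by x² ≈ 10⁴, while the normalised-variable error stays around 10⁻⁴: the rounding precision is the same, but the intended use is only served by the second parameterisation.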

For further information contact: Peter Harris, extension 6961, e-mail: [email protected]

[Figure 10: the measurement data, the fitted calibration curve, and the curve obtained from the rounded parameter values.]


Testing the numerical correctness of scientific software

Two new reports (CMSC 07/00 and CMSC 08/00) have been completed as part of the SSfM programme concerned with testing the numerical correctness of scientific software used in metrology. Each report focuses on a particular numerical calculation and presents the results of testing functions for this calculation taken from a number of spreadsheet, statistical and scientific software packages used in metrology. The calculations considered are, respectively, sample standard deviation and straight-line linear regression, and the software packages covered are Excel, MathCAD, S-PLUS, Matlab, the NAG Fortran library and the IMSL Fortran library.

Figure 11 illustrates an example of the results obtained from testing the functions for sample standard deviation. A sequence of graded reference data sets is generated, where each data set has the same reference sample standard deviation but a larger sample mean than the previous set in the sequence. The sequence therefore mimics measurement data sets with increasing signal-to-noise ratio.

The reference data sets are applied as input to each test function and the results returned are compared with the reference results. A performance measure is used for this comparison: it measures the number of significant figures of accuracy that are lost by the test function over and above the number that can be expected to be lost by a reference algorithm for the calculation. The figure shows, in the form of performance profiles, the values of the performance measure for the sequence of reference data sets and each test function. A near-flat performance profile is good; a strongly increasing profile is bad.

The results illustrated in Figure 11 indicate that the IMSL, NAG, Matlab, S-PLUS and MathCAD packages provide reliable results for these reference data sets (albeit that MathCAD is slightly poorer than the others), whereas there is serious degradation in the performance of the function provided by Excel. The results suggest that the Excel function implements an unstable formula for calculating the sample standard deviation, whereas the other packages implement stable formulae for this calculation. An implementation of a common unstable formula for the standard deviation reproduced exactly the Excel results for the cases examined.
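The contrast between stable and unstable formulae can be sketched as follows (an illustration of the general phenomenon, not the code used in the reports). The graded data sets below share the same reference standard deviation (1.0) but have increasing means:

```python
import math

def std_unstable(x):
    # One-pass "textbook" formula: for data with a large mean the two sums
    # agree in nearly all their digits and the subtraction cancels
    # catastrophically (it can even go negative, hence the guard)
    n = len(x)
    s = sum(x)
    s2 = sum(v * v for v in x)
    return math.sqrt(max(s2 - s * s / n, 0.0) / (n - 1))

def std_stable(x):
    # Two-pass formula: subtract the mean first, then square the deviations
    n = len(x)
    m = sum(x) / n
    return math.sqrt(sum((v - m) ** 2 for v in x) / (n - 1))

# Graded reference data sets: same reference standard deviation (1.0),
# increasing sample mean, mimicking increasing signal-to-noise ratio
results = {}
for mean in (1.0, 1e4, 1e8, 1e12):
    data = [mean - 1.0, mean, mean + 1.0]
    results[mean] = (std_stable(data), std_unstable(data))
```

The stable formula returns 1.0 across the whole sequence; the unstable formula is exact for small means but degrades badly once the mean grows, which is precisely the behaviour of the performance profiles in Figure 11.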

For further information contact: Peter Harris, extension 6961, e-mail: [email protected]

[Figure 11: performance profiles for standard deviation software, plotting the performance measure against reference mean values from 1 to 10^16 for the IMSL, NAG, Matlab, S-PLUS, MathCAD and Excel functions.]


A global positioning system timing receiver

In recent years Global Positioning System (GPS) receivers have become relatively inexpensive. Recent improvements to the GPS have resulted in receivers costing £100 or less being capable of determining a user’s position to within 20 m in real time.

GPS satellites have excellent on-board atomic clocks and this makes them superb sources of high quality timing signals. Primary timing laboratories use GPS for the inter-comparison of the clocks forming the national time scales, but expensive specialist receivers are required. Note here that the primary timing laboratories do not set themselves to GPS time! Instead, GPS is a convenient transfer standard: laboratories A and B compare their time against a satellite S. By recording the differences (A-S) and (B-S), and then subtracting, the time difference between the two laboratories (A-B) can be found. This method is known as GPS common-view.
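The common-view arithmetic is simply a subtraction in which the satellite clock cancels; a minimal sketch with hypothetical readings:

```python
def common_view(a_minus_s, b_minus_s):
    # Differencing two simultaneous observations of the same satellite S
    # cancels the satellite clock, leaving the A - B clock difference
    return a_minus_s - b_minus_s

# Hypothetical readings in nanoseconds: each laboratory's clock minus the
# time received from satellite S at the same instant
a_s = 137.4   # clock A - satellite S
b_s = 112.1   # clock B - satellite S
a_b = common_view(a_s, b_s)   # clock A - clock B, approximately 25.3 ns
```

Because any error in the satellite's own clock appears identically in both readings, it drops out of the difference; only effects that differ between the two signal paths remain.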

The common-view receivers output data in a unique format developed by the Bureau International des Poids et Mesures (BIPM) in Paris, and require hardware modifications not normally available on standard GPS receivers, so that an external clock can be directly connected to the receiver.

Up until the mid-1990s, cost had prevented the widespread use of GPS receivers by UK users of time and frequency. However, inexpensive GPS receiver engines have now been developed with timing capabilities that make them suitable for use not only by primary timing laboratories but also by top-level calibration laboratories within the UK.

NPL has been working in collaboration with a UK manufacturer, Time and Frequency Solutions, to develop a common-view receiver based on an inexpensive GPS receiver engine that can be used for the highest accuracy time and frequency disseminations within the UK. The new equipment allows users to compare their standards against the national time standard.

The software for this project has been produced by CMSC. It takes the form of a user-friendly package written in Visual Basic, to run in a Windows environment. The software both generates timing data in the internationally agreed BIPM format, and provides much additional receiver status information. It continuously reads and decodes satellite information and local clock timings from the receiver. Satellite time readings are corrected using orbital parameters from ephemeris readings, and further corrections are made for transmission delays through the earth’s ionosphere and troposphere.

Following a prearranged schedule, readings are recorded in thirteen-minute tracks, one reading being taken every second. At the end of each track, the data are sub-divided into fifteen-second blocks, and a result is obtained after applying quadratic and linear fits.
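A simplified sketch of this track processing (the actual BIPM procedure involves further corrections and conventions not shown here): fit a quadratic to each 15-second block, take its value at the block midpoint, then fit a line through the midpoint values.

```python
def polyfit_ls(ts, ys, degree):
    # Least-squares polynomial fit via normal equations; adequate for the
    # small, well-scaled abscissae used in this sketch
    n = degree + 1
    A = [[sum(t ** (i + j) for t in ts) for j in range(n)] for i in range(n)]
    b = [sum(y * t ** i for t, y in zip(ts, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    c = [0.0] * n
    for i in reversed(range(n)):
        c[i] = (b[i] - sum(A[i][j] * c[j] for j in range(i + 1, n))) / A[i][i]
    return c  # c[0] + c[1]*t + c[2]*t^2 + ...

def track_result(readings):
    # readings: one value per second over a 13-minute (780 s) track
    mids_t, mids_y = [], []
    for start in range(0, len(readings) - len(readings) % 15, 15):
        block = readings[start:start + 15]
        ts = [float(i - 7) for i in range(15)]   # centred abscissae
        c = polyfit_ls(ts, block, 2)             # quadratic fit per block
        mids_t.append(start + 7.0)
        mids_y.append(c[0])                      # fitted value at midpoint
    c = polyfit_ls(mids_t, mids_y, 1)            # linear fit across blocks
    mid_track = (len(readings) - 1) / 2.0
    return c[0] + c[1] * mid_track               # value at mid-track
```

The block-wise quadratic fits smooth out second-to-second noise; the final linear fit characterises the clock offset and drift over the track.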

The first of the new GPS receivers is at present being used for NPL’s GPS monitoring service, where the timing capability of the GPS system over the UK is being continuously monitored. The receiver automatically provides data, and a system of Real-time Autonomous Integrity Monitoring (RAIM) has been introduced to check for errors in the GPS system in real time.

The main use of the GPS receivers will be in NPL’s GPS common-view service that will be launched early in 2001. This will provide continuous monitoring of caesium clocks distant from NPL.

For further information contact: John Davis, extension 7137, e-mail: [email protected]


Sky map of GPS satellites observed at the National Metrology Laboratory in Dublin using the common-view receiver. Each satellite trajectory is traced using a separate colour. The ‘hole’ to the north is an artefact of the GPS satellite constellation, but the rest of the sky shows a good spread of ‘common-view’ observations. These are important for reliable time comparisons.

[Graph: Quartzlock Maser - UTC(NPL) (ns, uncalibrated), GPS common-view, plotted over MJD 51,871 to 51,876.] Comparison of a passive hydrogen maser at Quartzlock, Totnes, Devon, against UTC(NPL), the UK’s national time scale. Each point represents a separate 13-minute GPS satellite measurement, and each GPS satellite’s measurements are plotted in a separate colour. The x-axis plots the Modified Julian Day (MJD), a calendar that increments by one each day (i.e. the graph shows nearly six days of data).


Biometric authentication performance

NPL has conducted a performance assessment of seven biometric authentication systems for the UK government Biometrics Working Group run by the Communications-Electronics Security Group (CESG). Each system used a different biometric technology: iris pattern, vein pattern, hand geometry, face image, fingerprint (using optical and capacitive chip imaging) and voice print.

The test scenario was one of access control in a normal office environment, and data was collected from over 200 volunteers using the systems several times over a three-month period. The test methodology followed the “Biometric Testing Best Practices” previously developed by NPL jointly with other biometric testing organisations (see: www.afb.org.uk/bwg/bestprac.html).

There are several performance measures, and systems can trade one aspect of performance against another. For example, increasing the false acceptance rate, or the effort required of the user, might reduce the number of false rejections. Which is the “best” system will depend on application requirements, as illustrated by the Detection Error Trade-Off graph shown.
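The trade-off arises from sweeping a decision threshold over the matcher's scores; a sketch with hypothetical scores (real systems derive these rates from large trials):

```python
def det_curve(genuine, impostor, thresholds):
    # For each decision threshold: false rejection rate = fraction of
    # genuine scores below the threshold; false acceptance rate = fraction
    # of impostor scores at or above it
    curve = []
    for t in thresholds:
        frr = sum(1 for s in genuine if s < t) / len(genuine)
        far = sum(1 for s in impostor if s >= t) / len(impostor)
        curve.append((t, far, frr))
    return curve

# Hypothetical matcher scores: genuine attempts score high, impostors low
genuine = [0.91, 0.84, 0.88, 0.79, 0.95, 0.66, 0.90, 0.87]
impostor = [0.12, 0.25, 0.31, 0.08, 0.44, 0.19, 0.52, 0.28]
curve = det_curve(genuine, impostor, [0.3, 0.5, 0.7])
```

Raising the threshold drives the false acceptance rate down and the false rejection rate up; plotting one against the other over all thresholds gives the Detection Error Trade-Off graph.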

The results show typical levels of performance attainable by the technologies tested and, together with our testing experiences, are helping in the development of testable performance criteria for future security evaluations.

For further information contact: Tony Mansfield, extension 7029, e-mail: [email protected]



Looking forward to SSfM-2

The SSfM-2 programme (to run from April 2001 to March 2004) will have a substantial R&D content, with much work on modelling, uncertainties, validation and testing, which will build upon the successful work in these areas in SSfM-1.

The METROS website will be a major vehicle for disseminating the programme results. The programme will also coordinate international contributions to the website through the EuroMETROS EUROMET project. See article “METROS expands” on page 10.

There will be a major uncertainties project, covering a wide spectrum of activities including:

• international contributions to the GUM
• analysis of results of key inter-laboratory comparisons
• research into statistical techniques new to metrology
• the development of an industrial uncertainty methodology, to convey the basic concept of uncertainty to those on the shop-floor who make and use measurements.

In addition, two projects will investigate uncertainties associated with visualisations and continuous models. Finally, two more projects will be devoted to contributions to standardisation, much of which will relate to uncertainty evaluation.



[Detection Error Trade-Off graph: False Rejection Rate against False Acceptance Rate, both on logarithmic percentage scales. Also shown: a typical voice print, and an iris with its iris code.]


General CMSC enquiries

+44 20 8943 7100 (direct line)
+44 20 8977 7091 (facsimile)

Website: www.npl.co.uk/cmsc

General NPL Helpline

For enquiries to NPL outside the scopeof CMSC, please use:

+44 20 8943 6880 (NPL Helpline)
+44 20 8943 6458 (Helpline facsimile)

Making contact

You can contact any of the experts directly by using the direct dial number plus the extension or via e-mail.

Direct line +44 20 8943 + (extension)

Head of CMSC
Dave Rayner ext 7040 e-mail: [email protected]

Software Support for Metrology Club
Wendy Johnson ext 6106 e-mail: [email protected]
SSfM website: www.npl.co.uk/ssfm

If you have a general enquiry or do not know who you should contact, please call our general enquiries number and we will be pleased to help you.

© Crown Copyright 2001. Reproduced by permission of the Controller of HMSO.

Counting on IT

National Physical Laboratory
Queens Road, Teddington, Middlesex, UK, TW11 0LW

Centre for Mathematics and Scientific Computing (CMSC)

The work on validation and testing will include applying numerical software testing to continuous modelling software packages, such as finite element packages. Test services will be developed for the numerical testing of software in simple mass-market products, and for contributions to standards and METROS. The work on validating software in measurement systems will be applied to remote calibration situations, self-calibrating instruments, and safety-critical uses. The latter will be done in association with the DTI CASS scheme.

The programme will also tackle some important new topics. Firstly, it will provide generic support to and coordination of NMS activities on remote calibration over the Internet. Secondly, it will promote best practice in numerical analysis for algorithm design, raising awareness of the problems that can arise from not applying good numerical analysis. Finally, it will investigate new directions for future work, including new work to support legal metrology and digital signal processing, support for soft metrology, and the aspects of bioinformatics needed to support biotechnology measurement work.

For further information contact: Dave Rayner, extension 7040, e-mail: [email protected]

Theme 1: Modelling techniques
• Extension of empirical models
• Algorithms for discrete modelling
• Promoting best practice in discrete modelling
• Continuous modelling for metrology
• Uncertainties and statistical modelling
• Data fusion
• Visual modelling and uncertainties
• Data visualisation

Theme 2: Validation and testing
• Numerical software testing
• Testing continuous modelling software
• Testing algorithms in standards and METROS
• Maintenance of measurement system validation best practice
• Validation of safety-critical measurement systems
• Validation of self-calibrating instruments

Theme 3: Metrology software and algorithm development techniques
• Numerical analysis for algorithm design
• Maintenance of existing guides

Theme 4: Standards support and development
• Mathematics and statistics input to metrology standards
• Standard approaches to uncertainty in calibration and conformity

Theme 5: Support for NMS infrastructure
• Use of the Internet by calibration services
• New directions

Theme 6: Generic technology transfer
• SSfM and METROS website
• SSfM Club
• Newsletter
• Reports on international activities
• Awareness presentations and conferences
• Links to Universities

Overview of themes and projects in SSfM-2. Those in italics and highlighted orange are to be competitively tendered.