
ARKANSAS NUCLEAR ONE - UNIT 2

DOCKET 50-368

CEN-162(A)-NP

REVISION 00

CPC/CEAC SYSTEM

PHASE II SOFTWARE VERIFICATION

TEST REPORT

JUNE 1981

Combustion Engineering, Inc.
Nuclear Power Systems
Power Systems Group
Windsor, Connecticut

8107300146 810720
PDR ADOCK 05000368
PDR


LEGAL NOTICE.

This response was prepared as an account of work sponsored by Combustion Engineering, Inc. Neither Combustion Engineering nor any person acting on its behalf:

a. Makes any warranty or representation, express or implied, including the warranties of fitness for a particular purpose or merchantability, with respect to the accuracy, completeness, or usefulness of the information contained in this response, or that the use of any information, apparatus, method, or process disclosed in this response may not infringe privately owned rights; or

b. Assumes any liabilities with respect to the use of, or for damages resulting from the use of, any information, apparatus, method or process disclosed in this response.


Page 1 of 27


Phase II Testing is performed on the DNBR/LPD Calculator System to (1) verify that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and the system hardware and (2) provide confirmation that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses.

This report presents the Phase II test results for the Arkansas Power and Light ANO-2 plant CPC/CEAC Rev. 04 software.

The Phase II Testing was performed according to previously issued procedures (Reference 1). The test results indicate that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and hardware and that the operation of the integrated system as modified is consistent with the performance predicted by design analyses.

This document was prepared and reviewed in accordance with Sections 5.2 and 5.4, respectively, of QtsDP, Revision 14.


Page 2 of 27


TABLE OF CONTENTS

Section  Title                                            Page No.

1.0    INTRODUCTION                                          4
1.1    Objectives                                            4
1.2    Description of Phase II Testing                       5
1.3    Applicability                                         5
2.0    CPC/CEAC INPUT SWEEP TESTS                            6
2.1    CPC Input Sweep Test Case Selection                   6
2.1.1  CPC Processor Uncertainty Results                     6
2.1.2  Analysis of CPC Input Sweep Test Results              7
2.2    CEAC Input Sweep Test Case Selection                 10
2.2.1  CEAC Processor Uncertainty Results                   11
2.2.2  Analysis of CEAC Input Sweep Test Results            11
3.0    DYNAMIC SOFTWARE VERIFICATION TEST                   12
3.1    DSVT Test Case Selection                             12
3.2    Generation of DSVT Acceptance Criteria               13
3.3    DSVT Test Results                                    19
4.0    LIVE INPUT SINGLE PARAMETER TESTING                  23
4.1    LISP Test Case Selection                             23
4.2    Generation of LISP Acceptance Criteria               23
4.3    LISP Test Results                                    24
5.0    PHASE II TEST RESULTS SUMMARY                        26
6.0    REFERENCES                                           27

Page 3 of 27


1.0 INTRODUCTION

The verification of software modifications of the DNBR/LPD Calculation System consists of several steps which address two major areas of the modification process:

(1) Specification of software modifications
(2) Implementation of software modifications

The specification of the software modifications is documented in the CPC/CEAC Functional Descriptions and Data Base Document and is verified by design analysis contained in recorded calculations. The implementation of software modifications is documented in Software Design Specifications and assembly listings. The verification process for the modified software implementation includes Phase I and Phase II software verification tests.

The requirements of the Phase II Software Verification Testing are based on the fact that the Phase I Testing will be performed. Successful completion of Phase I Testing verifies the correct implementation of the modified software. Phase II Testing completes the software modification process by verifying that the integrated CPC System responds as expected.

This document contains the test results and conclusions for the Phase II Testing.

1.1 Objectives

The primary objective of Phase II testing is to verify that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and the system hardware. In addition, Phase II testing provides confirmation that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses. These objectives are achieved by comparing the response of the integrated system

Page 4 of 27

to the response predicted by the CPC FORTRAN Simulation Code. This comparison is performed for a selected range of simulated static and dynamic input conditions.

1.2 Description of Phase II Testing

Phase II testing consists of the following tests:

(1) Input Sweep Test,
(2) Dynamic Software Verification Test, and
(3) Live Input Single Parameter Test.

These tests are performed on a single channel CPC/CEAC System with integrated software that has undergone successful Phase I testing.

1.3 Applicability

This report applies to the Phase II testing performed on the Arkansas Power and Light ANO-2 plant CPC/CEAC system software. The software revisions documented in this report are designated as Modification Number 4 to the ANO-2 CPC/CEAC system software.

Page 5 of 27


2.0 CPC/CEAC INPUT SWEEP TESTS

The Input Sweep Test is a real time exercise of the CEAC and CPC application software and executive software with steady-state CPC and CEAC input values read from a storage device. This test has the following objectives:

(1) To determine the processing uncertainties that are inherent in the CPC and CEAC designs,

(2) To verify the ability of the CPC and CEAC algorithms used in the system hardware to initialize to a steady state after an auto-restart for each of a large number of input combinations within the CPC/CEAC operating space, and

(3) To complement Phase I module testing by identifying any abnormalities in the CPC and CEAC algorithms used in the system hardware which were not uncovered previously.

2.1 CPC Input Sweep Test Case Selection

[ ] test cases, each involving different combinations of process inputs and addressable constants, were used for CPC design qualification testing of the Revision 04 software.

2.1.1 CPC Processor Uncertainty Results

For each test case, differences between FORTRAN simulation and CPC system results were calculated. A statistical analysis of these differences produced the processing uncertainties.

The DNBR statistics did not include those cases for which the DNBR as calculated on either system was at the limits [ ]. This is because a difference of zero (or close to zero) would be computed and would incorrectly weight the distribution of differences. A total of [ ] cases remained after these cases were eliminated.

Page 6 of 27

The LPD statistics did not include those cases for which the LPD as calculated on either system was equal to or greater than the upper limit of [ ] core average kw/ft ([ ] kw/ft). A total of [ ] cases remained after these cases were eliminated.

Although [ ] cases were not included in the computation of DNBR and LPD statistics, respectively, they were still present for the purpose of identifying software errors.

The processor uncertainties for DNBR and LPD are defined as the one-sided tolerance limits which encompass 95% of the distribution of DNBR and LPD differences for all test cases with a 95% confidence level. The processor uncertainties determined from Input Sweep for DNBR and LPD respectively are [ ] DNBR units and [ ] core average kw/ft. However, since the distribution of differences is so tight, the maximum error may be used (that is, the limits which encompass 100% of the differences). This is more conservative and yet still results in low processor uncertainties. Thus defined, the processor uncertainties for Revision 04 for DNBR and LPD are [ ] DNBR units and [ ] core average kw/ft, respectively.
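The 95/95 processor uncertainty described above can be sketched as follows. This is a minimal illustration, not the report's actual statistical procedure: it assumes the FORTRAN-vs-CPC differences are approximately normally distributed and uses the standard normal-theory one-sided tolerance factor; all function names are illustrative.

```python
import numpy as np
from scipy import stats

def one_sided_tolerance_factor(n, coverage=0.95, confidence=0.95):
    # Exact normal-theory k-factor: mean + k*s bounds `coverage` of the
    # population with probability `confidence` (one-sided upper limit).
    delta = stats.norm.ppf(coverage) * np.sqrt(n)
    return stats.nct.ppf(confidence, df=n - 1, nc=delta) / np.sqrt(n)

def processor_uncertainty(differences, coverage=0.95, confidence=0.95):
    # 95/95 one-sided tolerance limit on the FORTRAN-vs-CPC differences.
    d = np.asarray(differences, dtype=float)
    k = one_sided_tolerance_factor(len(d), coverage, confidence)
    return d.mean() + k * d.std(ddof=1)

def max_error(differences):
    # The report's more conservative alternative: the maximum observed
    # (absolute) difference, covering 100% of the observed distribution.
    return float(np.max(np.abs(differences)))
```

Because the observed distribution of differences was so tight, the report adopts the `max_error` style bound rather than the 95/95 limit.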

2.1.2 Analysis of CPC Input Sweep Test Results

The results of the test cases exceeding the 95/95 tolerance limit were analyzed for evidence of software errors. A review of the statistical analysis of the test cases indicated [ ]

Page 7 of 27

[ ]

Page 8 of 27

The review results of the DNBR and LPD test cases outside the 95/95 tolerance limit will now be discussed. For DNBR there were [ ] cases below the lower tolerance limit of [ ] (DNBR units) and [ ] test cases above the upper tolerance limit of [ ] (DNBR units). [ ]

Page 9 of 27

For the [ ] DNBR cases above the 95/95 tolerance level, the greatest percent error was [ ]. The remaining [ ] test cases had percent errors less than [ ] (absolute). The common input data to these test cases was found in other test cases with less maximum difference and less percent error. It is therefore concluded that no errors are indicated in the CPC Single Channel DNBR program.

For LPD the cases examined were: [ ] cases with differences below the lower 95/95 tolerance limit of [ ] (% of core average kw/ft), [ ] cases with differences greater than the upper tolerance limit of [ ], and [ ] cases with LPD values greater than [ ] of core average kw/ft and with differences outside the above stated tolerance limits. For the LPD cases with values above [ ] of core average kw/ft, the largest percent error was [ ]. The size of this percent error term indicates that the differences between the CPC Single Channel and the FORTRAN simulation are due to machine differences in accuracy when calculating large numbers.

For the test cases with LPD values less than [ ] of core average kw/ft, [ ] cases had percent errors greater than [ ]. The largest percent error was [ ]. Examination of the inputs to these cases showed no common input. Examination of the inputs to all [ ] LPD cases outside the tolerance limits showed that the inputs covered a wide spectrum. No common area was found. It is therefore concluded that there is no indication of software errors in the Single Channel calculation of LPD.
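The screening performed in this section — flag the cases whose difference falls outside the tolerance band, then examine each flagged case's percent error and inputs — can be sketched as follows. This is an illustrative helper, not the report's actual tooling.

```python
import numpy as np

def screen_outliers(fortran, cpc, lower, upper):
    # Flag test cases whose CPC-minus-FORTRAN difference falls outside
    # the tolerance band, reporting each flagged case's percent error
    # relative to the FORTRAN (reference) value.
    fortran = np.asarray(fortran, dtype=float)
    cpc = np.asarray(cpc, dtype=float)
    diff = cpc - fortran
    pct_err = 100.0 * diff / fortran
    outside = (diff < lower) | (diff > upper)
    return [(int(i), float(diff[i]), float(pct_err[i]))
            for i in np.flatnonzero(outside)]
```

Flagged cases are then inspected by hand, as the report describes, for common inputs that might indicate a systematic software error.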

2.2 CEAC Input Sweep Test Case Selection

[ ] test cases, each involving different combinations of CEAC process inputs, were used for CEAC design qualification testing of the Revision 04 software. These test cases covered all CEAC operating space.

Page 10 of 27

2.2.1 CEAC Processor Uncertainty Results

For each test case, differences between FORTRAN simulation and CEAC system results were calculated. The processor uncertainties for DNBR and LPD are defined as the one-sided tolerance limits which encompass 95% of the distribution of DNBR and LPD penalty factor differences for all test cases with a 95% confidence level.

The processor uncertainties for the DNBR and the LPD penalty factor differences are [ ] and [ ], respectively.

2.2.2 Analysis of CEAC Input Sweep Test Results

The results of the test cases exceeding the 95/95 tolerance limit were analyzed for evidence of software errors. [ ] test cases had differences in the big penalty factor flag.

The [ ] test cases with differences in the big penalty factor flag were examined. Results indicated that these differences were due to implementation differences between the CEAC software and the CEAC FORTRAN and not due to software or FORTRAN programming errors. These implementation differences do not impact calculation of the CEAC penalty factor output words. It was concluded that the results of the [ ] test cases did not indicate the existence of software errors.
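The flag comparison described above can be sketched as follows, assuming the big penalty factor flag is collected per test case from each implementation (an illustrative helper, not the report's tooling):

```python
def flag_differences(fortran_flags, ceac_flags):
    # Return the test-case indices where the big penalty factor flag
    # from the FORTRAN simulation disagrees with the CEAC software's flag.
    return [i for i, (f, c) in enumerate(zip(fortran_flags, ceac_flags))
            if f != c]
```

Each disagreeing index would then be examined individually, as the report describes, to decide whether the discrepancy is an implementation difference or a genuine error.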

Page 11 of 27

3.0 DYNAMIC SOFTWARE VERIFICATION TEST

The Dynamic Software Verification Test (DSVT) is a real time exercise of the CPC application software and executive software with transient CPC input values read from a storage device. This test has two objectives:

(1) To verify the dynamic response of the integrated CPC software is consistent with that predicted by design analyses, and

(2) To supplement design documentation quality assurance, Phase I module tests, and Input Sweep Tests in assuring correct implementation of software modifications.

Further information concerning DSVT may be found in Reference 1.

3.1 DSVT Test Case Selection

Test cases for DSVT are selected to exercise dynamic portions of the CPC software with emphasis on those portions of the software that have been modified. The major modifications made by the Revision 4 changes are:

(1) CEAC Logic and Data Base changes to allow proper operation with plants containing 2-CEA subgroups.

(2) Replacement of the COSMO/W-3 based DNBR calculation with CETOP2 based on the TORC/CE-1 DNBR correlation. This change totally replaces the STATIC program and modifies the DNBR UPDATE calculation.

(3) Changes to curve-fitting routines, modelling core power distribution for more precise CEA configurations and the use of additional corrections and offsets to yield improved accuracies.

Page 12 of 27

(4) Algorithm simplifications in the POWER program yielding improved computer efficiency.

(5) Additional addressable constants to facilitate changing constants likely to vary during plant life, and to allow clearance of the CEAC snapshot buffer and rewriting of the entire CRT display.

For more detail on Revision 04 software modifications, see Reference 2.

DSVT requires that as a minimum [ ] cases be selected for test (Reference 1). These cases are from the Phase II test series (identified in Reference 1) and consist of [ ].

Because the changes to the program algorithms were significant, all [ ] of the DSVT test cases were executed on the CPC FORTRAN simulation code and on the Single Channel facility. This ensures that all dynamic portions of the CPC software are adequately exercised.

3.2 Generation of DSVT Acceptance Criteria

Acceptance criteria for DSVT are defined (in Reference 1) as a trip time and initial values of DNBR and LPD for each test case. These trip times and initial values are generated using the certified CPC FORTRAN simulation code. Processing uncertainty obtained during Input Sweep Testing is factored into the acceptance criteria. Trip times are also affected by program execution lengths. The minimum and maximum program execution lengths for the Revision 04 software modifications were calculated and were used in DSVT. These execution lengths (in milliseconds) are listed below.

Page 13 of 27

Program    Minimum    Maximum

FLOW       [ ]        [ ]
UPDATE     [ ]        [ ]
POWER      [ ]        [ ]
STATIC     [ ]        [ ]

Each DSVT case was executed once with the minimum execution lengths and most conservative DNBR and LPD trip setpoints, and once with the maximum execution lengths and least conservative DNBR and LPD trip setpoints. This results in a bandwidth of DNBR and LPD trip times.

The final DSVT acceptance criteria bandwidths contain the effects of processing uncertainties and program execution lengths. The software DSVT program also includes a [ ] millisecond interrupt cycle in order to check for DNBR and LPD trip signals. This results in a [ ] millisecond interval limit on Trip Time resolution which is factored into the acceptance criteria. The following tables contain the final DSVT acceptance criteria for initial values and trip times of DNBR and LPD.
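A minimal sketch of how such an acceptance band could be assembled follows. The actual setpoints, execution lengths, and interrupt-cycle length are proprietary and redacted above, and treating the trip-poll resolution as a symmetric widening of the band is an assumption about how the resolution is "factored in"; names are illustrative.

```python
def dsvt_acceptance_band(fortran_min_trip, fortran_max_trip, poll_ms):
    # Widen the FORTRAN-generated trip-time bandwidth (seconds) by the
    # DSVT trip-poll resolution (milliseconds) on each side.
    margin = poll_ms / 1000.0
    return fortran_min_trip - margin, fortran_max_trip + margin

def trip_time_acceptable(measured_trip, band):
    # Accept a Single Channel trip time that falls inside the band.
    lo, hi = band
    return lo <= measured_trip <= hi
```

The lower edge corresponds to the minimum-execution-length, most-conservative-setpoint run, and the upper edge to the maximum-length, least-conservative run described above.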


Page 14 of 27

DNBR and LPD Initial Values (DNBR units and kw/ft, respectively)

Test Case    DNBR (Min.)    DNBR (Max.)    LPD (Min.)    LPD (Max.)

[ ]

Page 15 of 27

DNBR and LPD Initial Values (DNBR units and kw/ft, respectively) (Cont.)

Test Case    DNBR (Min.)    DNBR (Max.)    LPD (Min.)    LPD (Max.)

[ ]

Page 16 of 27

DNBR and LPD Trip Times (seconds)

Test Case    DNBR Trip (Min.)    DNBR Trip (Max.)    LPD Trip (Min.)    LPD Trip (Max.)

[ ]

Page 17 of 27

DNBR and LPD Trip Times (seconds) (Cont.)

Test Case    DNBR Trip (Min.)    DNBR Trip (Max.)    LPD Trip (Min.)    LPD Trip (Max.)

[ ]

Page 18 of 27

3.3 DSVT Test Results

The Dynamic Software Verification Test was executed on the Single Channel facility using the Revision 04 CPC software. The DSVT test results are contained below.

Test Case    DNBR (DNBR Units)    LPD (kw/ft)    DNBR Trip (sec)    LPD Trip (sec)

[ ]

Page 19 of 27

Test Case    DNBR (DNBR Units)    LPD (kw/ft)    DNBR Trip (sec)    LPD Trip (sec)

[ ]

Page 20 of 27

For some test cases, the initial values of DNBR or LPD, or the DNBR or LPD trip times, were outside those defined by the FORTRAN generated bandwidths. These cases were evaluated individually and there is no evidence of software errors. Thus, the objectives of DSVT have been met.

All the initial values of DNBR and LPD were within those defined by the FORTRAN generated bandwidths with the exception of [ ] cases. These were test cases with initial LPD values at [ ] kw/ft respectively, corresponding to [ ] of rated average power density. The processing uncertainties used to generate the bandwidths are valid only for normal power levels up to [ ] of rated average power. The magnitude of the differences between the FORTRAN value and the Single Channel value was extremely small when compared to their initial values. These differences are due to the differences in machine precision between the CPC hardware (i.e., the Interdata 7/16) and the CDC 7600 (on which the CPC FORTRAN simulator is executed) in conjunction with the processing uncertainty range mentioned above. In each of these cases, both the Single Channel and the FORTRAN initialized with a DNBR and LPD trip because of the magnitudes of the inputs.

Test cases [ ] were cases with the plant in the trip condition initially. In the FORTRAN simulation, one program execution cycle was required to generate a trip output; therefore the minimum and maximum trip times from the FORTRAN were [ ] seconds while the trip times from the Single Channel were [ ] seconds. Proper software initialization was verified for these cases by checking the FORTRAN outputs at time [ ] seconds and verifying that the FORTRAN did indeed calculate a trip condition.

Page 21 of 27

The LPD trip times in test cases [ ] and the DNBR trip times in test cases [ ] were [ ] outside of the FORTRAN generated bandwidth. Investigation of this anomaly revealed the following:

[ ]

It was concluded that the DSVT results did not indicate the existence of software error.

Page 22 of 27

4.0 LIVE INPUT SINGLE PARAMETER TESTING

The Live Input Single Parameter test program is a real-time exercise of the CPC/CEAC application and executive software, with transient CPC/CEAC input values generated from an external source and read through the CPC/CEAC input hardware. The objectives of this test are:

(1) To verify that the dynamic response of the integrated CPC/CEAC software and hardware is consistent with that predicted by design analyses,

(2) To supplement design documentation quality assurance, Phase I module tests, Input Sweep Tests, and DSVT testing in assuring correct implementation of software modifications, and

(3) To evaluate the integrated hardware/software system during operational modes approximating plant conditions.

4.1 LISP Test Case Selection

Reference 1 identifies the test cases to be used for LISP. These cases are the single variable dynamic transient test cases from the Phase II test series. These test cases, which are applicable to ANO-2 Cycle 2, consist of [ ].

4.2 Generation of LISP Acceptance Criteria

The acceptance criteria for LISP are based on trip times for the dynamic test cases.

Page 23 of 27

These cases are simulated in FORTRAN and contain the following adjustment components: [ ]

Program execution lengths used for LISP testing were the same as those for DSVT, with the addition of CEAC minimum and maximum execution lengths ([ ] msec, respectively).

The final acceptance criteria (generated by the FORTRAN code and adjusted for the above components) for LISP are contained in the following table.

Test Case    Minimum Trip Time (seconds)    Maximum Trip Time (seconds)

[ ]

4.3 LISP Test Results

The [ ] dynamic transients were executed on the Single Channel facility. The recorded trip times (in seconds) for each case are listed in the following table:

[ ]

Page 24 of 27

All recorded trip times meet the final acceptance criteria for LISP.
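That check — every recorded trip time falling inside its case's minimum/maximum acceptance window — can be sketched as follows (illustrative structure only; the actual case identifiers and times are redacted):

```python
def lisp_results_pass(recorded, criteria):
    # recorded: {case_id: trip_time_seconds} from the Single Channel runs.
    # criteria: {case_id: (min_trip, max_trip)} from the FORTRAN code.
    # True only if every case trips within its acceptance window.
    return all(criteria[c][0] <= t <= criteria[c][1]
               for c, t in recorded.items())
```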

Major aspects of the operator's module operation, particularly the point ID verification and addressable constant range limits, were tested. As part of the testing, the CPC and CEAC Point ID tables were checked to assure that the Point IDs displayed on the operator's module are the same as those listed in the Point ID tables. During the check of the CPC Point ID table, [ ]

The aspects of automated reentry of addressable constants were tested and found to be acceptable.

Page 25 of 27

5.0 PHASE II TEST RESULTS SUMMARY

The Phase II software verification tests have been performed as required in Reference 1. In all cases, the test results fell within the acceptance criteria, except those cases discussed in Chapter 3. These cases were analyzed and results indicated that they were not due to software or FORTRAN errors. Based on these results, the Phase II test objectives have been successfully achieved. It is concluded that the CPC and CEAC software modifications described as Revision 04 have been properly integrated with the CPC and CEAC software and system hardware and that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses.

Page 26 of 27

6.0 REFERENCES

1. CPC Protection Algorithm Software Change Procedure, CEN-39(A)-P, Revision 02, December 21, 1978.

2. CPC/CEAC Software Modifications for Arkansas Nuclear One - Unit 2, CEN-143(A)-P, Revision 00, December 1980.

Page 27 of 27