UF/NRE Project ID: QA-1    UFTR QUALITY ASSURANCE DOCUMENT    Revision 0    Copy 1    Vol. 1    Page 1 of 28

Project Title: UFTR DIGITAL CONTROL SYSTEM UPGRADE

UFTR-QA1-06.1, Software Test Plan - SIVAT Test

Prepared by: Prof. Ed Dugan    (Signature)    Date:
Reviewed by: Dr. Gabriel Ghita    (Signature)    Date:
Approved by: Prof. Alireza Haghighat    (Signature)    Date:




UF/NRE UFTR    Prepared by: Name / Date / Initials    Reviewed by: Name / Date / Initials    QA-1, UFTR-QA1-06.1, Revision 0, Copy 1, Vol. 1, Page 2 of 28

THE DISTRIBUTION LIST OF THE DOCUMENT

No. Name Affiliation Signature Date

1.

2.

3.

4.

5.

6.


THE LIST OF THE REVISED PAGES OF THE DOCUMENT

Revision No.    Reviewed by    Approved by    Modified Pages    Date


TABLE OF CONTENTS

1. Purpose and Scope .......... 6
2. Reference .......... 7
2.1 UFTR Documents .......... 7
2.2 AREVA NP Inc. Documents .......... 7
2.3 Industry Standards .......... 7
2.4 NRC Documents .......... 7
3. Abbreviations and Acronyms .......... 8
4. Test Items .......... 9
4.1 Functions to be Tested .......... 9
4.2 Features to be Tested .......... 10
4.3 Features Not to be Tested .......... 10
5. Approach .......... 11
5.1 Test Specification .......... 11
5.2 Test Procedure .......... 11
5.3 Test Preparation .......... 11
5.4 Test Execution .......... 12
5.4.1 Test of Input Submodules .......... 12
5.4.2 Test of I&C Functions .......... 12
5.4.3 Test of Output Submodules .......... 13
5.5 Comprehensiveness .......... 13
6. Item Pass/Fail Criteria .......... 14
7. Suspension Criteria and Resumption Requirements .......... 15
8. Test Deliverables .......... 16
8.1 Test Specification .......... 16
8.2 Test Procedure .......... 18
8.3 Test Log .......... 19
8.4 Test Incident Report .......... 20
8.5 Test Summary Report .......... 20
9. Testing Tasks .......... 22
10. Environmental Needs .......... 23


10.1 Hardware .......... 23
10.2 Software .......... 23
10.2.1 Operating System(s) .......... 23
10.2.2 Communications Software .......... 23
10.2.3 Security .......... 23
10.2.4 Tools .......... 23
11. Responsibilities .......... 25
11.1 Project Coordinator .......... 25
11.2 Software Development Group .......... 25
11.3 IV&V Team .......... 25
12. Staffing and Training .......... 26
12.1 Staffing .......... 26
12.2 Training .......... 26
13. Schedule .......... 27
14. Risk and Contingencies .......... 28


1. Purpose and Scope

This document provides the SIVAT Test Plan for the UFTR Reactor Protection System (RPS) application software. SIVAT testing is performed prior to Factory Acceptance Testing (FAT) in order to verify that the functional requirements of the Software Requirements Specification (SRS) and the software design in the Software Design Description (SDD) are properly implemented in the SPACE application. For more information about SIVAT, refer to the SIVAT User Manual, /12/.

The introduction of the test plan shall provide an overview of the entire testing process. It shall at a minimum list the following plan objectives:

1. To define the software test items to be tested.
2. To detail the approach required to prepare for and conduct SIVAT testing.
3. To define the resources needed to perform the testing.
4. To communicate to all responsible parties the tasks that they are to perform and the schedule to be followed in performing the tasks.
5. To define the test tools and environment needed to conduct SIVAT testing.
6. To describe the acceptance (pass/fail) criteria for the Software Test Items being tested.

The form and content of this document follow IEEE Std 829-1983, /14/, endorsed by RG 1.170, /16/. The methods for planning, preparation, execution, and evaluation of the SIVAT test follow IEEE Std 1008-1987, /15/, endorsed by RG 1.171, /17/. This plan and additional SIVAT testing documents shall comply with the UFTR "Quality Assurance Program (QAP)," /1/, UFTR "Conduct of Quality Assurance," /2/, UFTR "Quality Assurance Project Plan (QAPP)," /3/, UFTR "Software Quality Assurance Plan (SQAP)," /4/, UFTR "Software Configuration Management Plan (SCMP)," /5/, and UFTR "Software Verification and Validation Plan (SVVP)," /6/, as applicable.


2. Reference

2.1 UFTR Documents

/1/ UFTR-QAP, "Quality Assurance Program (QAP)"

/2/ UFTR-QAP-01-P, "Conduct of Quality Assurance"

/3/ UFTR QA1-QAPP, "Quality Assurance Project Plan (QAPP)"

/4/ UFTR-QA1-01, "Software Quality Assurance Plan (SQAP)"

/5/ UFTR-QA1-02, "Software Configuration Management Plan (SCMP)"

/6/ UFTR-QA1-03, "Software Verification and Validation Plan (SVVP)"

/7/ UFTR-QA1-05, "Software Safety Plan (SSP)"

/8/ UFTR-QA1-109, "Software Library and Control"

/9/ UFTR-QA1-100, "Functional Requirements Specification (FRS)"

/10/ UFTR Technical Specifications

2.2 AREVA NP Inc. Documents

/11/ AREVA NP Inc., 43-10272, "Software Program Manual, TELEPERM XS Safety Systems"

/12/ AREVA NP Inc., 01-5044046-01, "TELEPERM XS SIVAT-TXS Simulation Based Validation Tool User Manual TXS-1047-76-2.1 (Version 1.5.0 and Higher)"

/13/ AREVA NP Inc., 38-9033245-000, "Safety Evaluation by the Office of Nuclear Reactor Regulation, Siemens Power Corporation Topical Report EMF-2110(NP), 'TELEPERM XS: A Digital Reactor Protection System,' Project No. 702"

2.3 Industry Standards

/14/ IEEE Std 829-1983, "IEEE Standard for Software Test Documentation"

/15/ IEEE Std 1008-1987, "IEEE Standard for Software Unit Testing"

2.4 NRC Documents

/16/ Regulatory Guide 1.170, Rev. 0, September 1997, "Software Test Documentation for Digital Computer Software Used in Safety Systems of Nuclear Power Plants"

/17/ Regulatory Guide 1.171, Rev. 0, September 1997, "Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants"


3. Abbreviations and Acronyms

CATS-SDE - Code Adaptation Tool for SDE Simulation Environment
CRC - Cyclic Redundancy Check
FAT - Factory Acceptance Test
AREVA GmbH - AREVA Germany Division
ID - Identification
I/O - Input/Output
LSSS - Limited Safety System Settings
MSI - Monitoring and Service Interface
NRC - Nuclear Regulatory Commission
Oracle - Database Management System Used by SPACE
PC - Personal Computer
RCS - Reactor Coolant System
RPS - Reactor Protection System
RTE - Run Time Environment
SDD - Software Design Description
SDE - Simulation Development Environment (used by SIVAT)
SER - Safety Evaluation Report
SIVAT - Simulation Based Validation Tool
SPACE - Specification And Coding Environment (TXS engineering system)
SRS - Software Requirements Specification
TXS - TELEPERM XS
V&V - Verification & Validation


4. Test Items

The Test Item is the UFTR RPS Application Software that is compiled and linked using the SIVAT simulation environment of the TELEPERM XS (TXS) system. The Test Item is created using the SIVAT tool CATS-SDE, /12/. Sources for this simulation environment are the SPACE project database, the simulation software (SDE), and the results of the code generation.

All software modules that make up the UFTR RPS Application Software will be tested. SIVAT testing as described in this plan will verify the correct implementation of all the functions and requirements described in the SRS. Additionally, SIVAT testing will verify that the software design in the SDD was properly implemented in the SPACE project database as shown in the Application Software Code document.

The SPACE project database (i.e., Application Software) is identified by a version number and release date as described in UFTR "Software Library and Control," /8/. The code configuration is documented and checked through the use of Cyclic Redundancy Check (CRC) checksums of the database tables.
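The CRC-based configuration check described above can be illustrated with a short sketch. This is not the actual SPACE/TXS tooling: the table names and the use of CRC-32 over exported table bytes are assumptions for illustration only.

```python
import zlib

def crc32_hex(data: bytes) -> str:
    """Render a CRC-32 checksum in the 8-digit hexadecimal form
    commonly used when documenting a code configuration."""
    return f"{zlib.crc32(data) & 0xFFFFFFFF:08X}"

def configuration_record(tables: dict) -> dict:
    """Map each exported database table (hypothetical names) to its
    checksum so the configuration can be documented and re-checked."""
    return {name: crc32_hex(content) for name, content in tables.items()}
```

Re-computing the record for a checked-out database and comparing it against the documented record detects any change to the table contents.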

4.1 Functions to be Tested

The following RPS Application Software functions adapted from the Limited Safety System Settings (LSSS) shall be tested per UFTR "Functional Requirements Specification (FRS)," /9/:

1. RPS Trip #1: Reactor period < 3 sec
2. RPS Trip #2: Reactor power > 119% of full power
3. RPS Trip #3: Loss of chamber high voltage (> 10%)
4. RPS Trip #4: Loss of electrical power to control console
5. RPS Trip #5: Primary cooling system
   - Loss of primary pump power
   - Low water level in core (< 42.5")
   - No outlet flow
   - Low inlet water flow (< 41 gpm)
6. RPS Trip #6: Secondary cooling system (> 1 kW)
   - Loss of flow (well water ≤ 60 gpm)
   - Loss of secondary well pump power
7. RPS Trip #7: High primary coolant inlet temperature (≥ 99°F)
8. RPS Trip #8: High primary coolant outlet temperature (> 155°F)
9. RPS Trip #9: Shield tank low water level (6" below established normal level)
10. RPS Trip #10: Ventilation system
    - Loss of power to stack dilution fan
    - Loss of power to core vent fan


Justification for the Limited Safety System Settings (LSSS) is established in the UFTR Technical Specifications, /10/. The LSSS are established from operating experience and safety considerations.

NOTE: The Functions listed above include any associated supporting software modules.

4.2 Features to be Tested

The Application Software functionality that is specified in the UFTR RPS SDD will be tested to determine whether software elements (for example: modules, submodules, and functions) correctly implement software requirements. As a minimum, the criteria for this determination are:

1. Compliance with functional requirements.
2. Performance at boundaries, interfaces, and under stress and error conditions.

In addition to the proper functionality of the Application Software according to the UFTR RPS SDD, the following characteristics must be verified:

1. Signals to output boards must have no fault status at all times, even under error and stress conditions.
2. Test results must be verified from the start of the test until the completion of the test in order to verify that no unexpected intermediate results are present.
3. Race conditions must be handled in order to ensure spurious alarms are not generated by the software.

NOTE: UFTR-specific RPS Application Software versions that have been fully tested with SIVAT do not need to be fully retested when the Application Software version is updated. The Application Software will only be tested for differences in logic between the two versions. Signal names, clear text descriptions, parameter changes, and Function Diagram name changes of the version update can be verified via visual inspection.

4.3 Features Not to be Tested

The testing of TXS system software components (Operating System, I/O Drivers, Communication Software, RTE, and Function Blocks) is not within the scope of this test. These system software components are considered "reusable" software and have been tested and qualified by AREVA NP GmbH. These system software components were previously deemed acceptable by the NRC within the scope described in the TXS SER (Safety Evaluation Report), /13/.


5. Approach

5.1 Test Specification

The test personnel will use the software design defined in the SDD to prepare the Test Specifications for each Input, Function, and Output Module or Submodule. The Test Specifications shall ensure that all functionality as defined in the SDD is tested. This shall be accomplished by specifying test cases that test the combinational logic of the individual software modules and submodules (Input, Function, and Output). To ensure that the boundaries and interfaces of the software modules are fully tested, overlapping tests shall be used.

The Test Specifications are to consist of test designs and test cases that contain the detailed instructions for testing as well as the features or test case acceptance criteria (i.e., Expected Results) to be employed during the testing effort.

The inputs, tasks, and outputs of SIVAT testing shall be determined, refined, and designed in this phase. The format and outline of the Test Specifications are described in Section 8.1.

5.2 Test Procedure

The Test Procedures shall contain test scripts that implement the test cases defined in the Test Specifications. These test scripts shall be created by utilizing the SDD and the Application Software Code document. For further information regarding the form and function of the commands that make up the test scripts, refer to the SIVAT User Manual, /12/.

The Test Procedure shall also verify the version/revision of the UFTR RPS Application Software, SIVAT software, and test scripts that are to be tested. Verification shall be done by comparing the CRC checksums of the versions entered into the Software Library against the version specified in the Test Procedure.

The format and outline of the Test Procedure is described in Section 8.2.
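This version verification amounts to comparing recorded checksums. A minimal sketch, assuming the Software Library and Test Procedure records are simple name-to-checksum maps (the actual record formats are not specified here):

```python
def verify_version(library_record: dict, procedure_record: dict) -> list:
    """Return the items whose checksum in the Software Library does not match
    the checksum specified in the Test Procedure; items absent from the
    library also count as mismatches."""
    return [item for item, expected in sorted(procedure_record.items())
            if library_record.get(item) != expected]
```

An empty result means the checked-out software and scripts match the versions the Test Procedure calls for; any returned item would block the test run until resolved.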

5.3 Test Preparation

The Test Engineer shall check out the following software from the Software Library (UFTR "Software Library and Control," /8/):

- The UFTR RPS Application Software
- The SIVAT test scripts (that correspond to the specified Test Procedure)

The Test Engineer shall perform a pre-job brief.

SIVAT testing shall be performed after the approved software design has been implemented in the SPACE project database and the Application Software has been generated for the SIVAT simulation without any error. Code generation is performed for two reasons:

1. The code generator will check the SPACE application software database for syntax completeness and testability.


2. The code generator will prepare the code for SIVAT simulation.

5.4 Test Execution

The test tasks should be executed in the sequence of:

1. Input Submodule Tests

2. Function Module Tests

3. Output Submodule Tests

NOTE: The Signal Monitoring Submodules shall be tested together with their corresponding functions and modules.

In the Test Procedures, the SIVAT simulation commands shall be used to form scripts. These scripts are to be used to completely automate the test execution by using the StartSim.tcl configuration file within the SIVAT tool. This file can be modified by the user to specify the order in which the scripts are to be run. The location and names of the script files must be entered into the StartSim.tcl file. The StartSim.tcl file is automatically called when the simulator is started. SIVAT runs the test scripts and then terminates the simulation. For further information, refer to the TXS Simulation Based Validation Tool (SIVAT) User Manual, /12/.

The automated test scripts shall generate data files capturing the results of the test runs. The data files should be entered into the Software Library in accordance with UFTR "Software Library and Control," /8/. The data contained in the data files shall then be plotted using the SIVAT plot conversion tool. These plots are reviewed against the expected results, and discrepancies shall be logged for disposition by the IV&V Group and shall be captured in the Test Incident Report (see Section 8.4). Additionally, the Test Logs (see Section 8.3) and a summary of the executed tests, including discrepancies affecting the software design and implementation, shall be documented in the Test Summary Report (see Section 8.5).
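The review of the captured results against the expected values can be sketched as a tolerance comparison. The trace layout (sample time mapped to signal value) is a hypothetical simplification of the SIVAT data files, not their actual format:

```python
def find_discrepancies(trace: dict, expected: dict, tol: float = 0.0) -> list:
    """Compare a recorded signal trace against expected results and return
    (time, actual, expected) tuples that disagree beyond the tolerance.
    Every expected sample is checked, so unexpected intermediate values
    between the start and the end of the test are caught as well."""
    discrepancies = []
    for t in sorted(expected):
        got = trace.get(t)
        if got is None or abs(got - expected[t]) > tol:
            discrepancies.append((t, got, expected[t]))
    return discrepancies
```

Each returned tuple corresponds to an entry that would be logged for IV&V disposition in the Test Incident Report.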

5.4.1 Test of Input Submodules

Input Submodules read the input information from the field device(s), perform signal validation and failure annunciation functions, and distribute the input signals to the I&C Function(s).

These tests verify the correct data is sent to the related I&C Function(s).

5.4.2 Test of I&C Functions

I&C Functions obtain data from the Input Submodule(s), then perform the logic functions and send actuation signals to the Output Submodule(s) or actuation device(s).

These tests verify the implemented logic and the correct output to the annunciators, actuators, and to the related Output Submodule(s).


5.4.3 Test of Output Submodules

Output Submodules obtain the outputs from the I&C Functions and then send the actuation signals to the actuation devices. Output Submodules also process checkback signals from the field device and provide component test logic.

These tests verify the implemented logic, the actuation of the field device, and the correct status of the actuated field devices.

5.5 Comprehensiveness

The SIVAT Test Plan shall ensure that the functional requirements of the software design detailed in the SDD are properly implemented in the SPACE application. The comprehensiveness of the testing effort shall ensure that all functionality as defined in the SDD is tested.


6. Item Pass/Fail Criteria

The Test Item shall be considered successfully passed when the results of the test match the predicted results described in the Test Specification without unexpected intermediate results.

Any Test Item containing unexpected results can still be considered successfully passed if:

1. The disposition of the unexpected result determines that the UFTR Application Software is functioning correctly.
2. The disposition/justification of the item is documented and preserved in the Test Incident Report for future reference.

Under these conditions, a retest of the item shall not be necessary.

The Test Item shall be considered failed if the test script has a syntax error that prevents the script from running, or if the test script or the Test Specification is found to be in error (i.e., the results of the test do not match the predicted results described in the Test Specification). Any errors encountered while performing the test shall be documented in the Test Log and the Test Incident Report.
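These criteria reduce to a small decision rule, sketched below. The status labels and argument names are illustrative, not taken from the plan:

```python
def judge_test_item(script_ran: bool, matches_expected: bool,
                    disposition_confirms_correct: bool = False) -> str:
    """Apply the pass/fail criteria: fail when the script could not run,
    pass on a clean match, and pass without retest only when a documented
    disposition (Test Incident Report) shows the software behaved correctly."""
    if not script_ran:
        return "FAIL"  # syntax or script error prevented execution
    if matches_expected:
        return "PASS"  # results match the Test Specification
    return "PASS" if disposition_confirms_correct else "FAIL"
```

The default of `disposition_confirms_correct=False` mirrors the plan's conservatism: an unexpected result fails unless a documented disposition justifies it.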


7. Suspension Criteria and Resumption Requirements

If a discrepancy is found during test execution, the error shall be documented in the Test Log and the Test Incident Report, and where possible the automation is continued. A disposition of the discrepancies logged shall determine whether the discrepancy affects the Test Specification, Test Procedure, SDD, or the Application Software.

If a discrepancy is found while comparing the plot data to the expected results, the discrepancy shall be recorded and dispositioned by the Hardware & Testing and Development Groups. The discrepancy shall be recorded in the Test Incident Report, and the disposition of the results continues.

When a discrepancy is detected that affects the SDD or the Application Software, an "Open Item" shall be created and corrected following the methods described in UFTR-QAP-01-P, "Conduct of Quality Assurance," /2/.

During the review of the test results, all discrepancies shall be recorded in the Test Incident Report as described in Section 8.4.

Test reruns shall start after the changes to the SDD and Application Software have been implemented and the Test Specifications and Test Procedures have been updated to the new design. Test reruns shall be performed on all sections of the Test Specification deemed necessary and recorded in the Test Incident Report and the Test Summary Report.


8. Test Deliverables

The following types of documents shall be generated:

1. Test Specifications
2. Test Procedures
3. Test Log (attached to Test Summary Report)
4. Test Incident Report
5. Test Summary Report

All documents shall be prepared, reviewed, and approved in accordance with the UFTR QAP, /1/, and UFTR-QAP-01-P, "Conduct of Quality Assurance," /2/.

8.1 Test Specification

The Test Specification shall specify refinements of the test approach and identify the features to be tested by creating test cases. It may also be produced for subsets of the complete UFTR RPS Application Software.

The outline and content of the Test Specification shall have the following structure:

- Test Specification Identifier
The Test Specification shall be identified by a UFTR Document ID number.

- Features to Be Tested

Identify the components of the Test Item and describe the features and combinations of features that are the object of this design specification. Other features may be exercised but need not be identified. For each feature or feature combination, a reference to its associated requirements in the requirements specification or design specification should be included.

- Approach Refinements
Specify refinements to the approach described in the Test Plan. Include specific test techniques to be used. The method of analyzing test results shall be identified. Specify the results of any analysis that provides a rationale for test case selection. Summarize the common attributes of any test cases. This may include:

* Input constraints that must be true for every input in the set of associated test cases
* Shared environmental needs
* Shared special procedural requirements
* Shared case dependencies

- Test Identification

List the identifier of each test case associated with this design.


- Pass/Fail Criteria

Specify the criteria to be used to determine whether the feature or feature combination has passed or failed.

- Test Items
Identify the components of the Test Item and features to be exercised by this test case. For each item, consider supplying references to the following Test Item documentation:

* Software Design Description
* Application Software Code document

- Input Specifications
Specify each input required to execute the test case. Some of the inputs will be specified by value (with tolerances where appropriate) while others will be specified by name. Specify all required relationships between inputs.

- Output Specifications
Specify all the outputs required of the Test Items (i.e., expected results). Provide the exact value (with tolerances where appropriate) for each required output.

- Environmental Needs

* Hardware
Specify the characteristics and configurations of the hardware required to execute this test case.

* Software
Specify the system and application software required to execute this test case. This may include software such as operating systems, compilers, simulators, and test tools. In addition, test items may interact with application software.

* Other
Specify any other requirements such as unique facility needs or specially trained personnel.

- Special Procedural Requirements
Describe any special constraints on the Test Procedures that execute the test case. These constraints may involve special set up, operator intervention, output determination procedures, and special wrap up.

- Intercase Dependencies

List the identifiers of test cases that must be executed prior to this test case. Summarize the nature of the dependencies.
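Such dependencies impose a required execution order: a valid run order is a topological ordering of the dependency graph. A minimal sketch of deriving that order (in Python, with hypothetical test case identifiers):

```python
from graphlib import TopologicalSorter

# Hypothetical intercase dependency map: each test case lists the
# cases that must be executed before it can run.
depends_on = {
    "TC-003": {"TC-001", "TC-002"},  # TC-003 runs after TC-001 and TC-002
    "TC-002": {"TC-001"},
    "TC-001": set(),
}

# static_order() yields an execution order that respects every dependency;
# it raises CycleError if the declared dependencies are contradictory.
run_order = list(TopologicalSorter(depends_on).static_order())
print(run_order)  # -> ['TC-001', 'TC-002', 'TC-003']
```

Recording dependencies explicitly, as required above, also makes it possible to detect circular dependencies before any test execution begins.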


8.2 Test Procedure

Test Procedures shall consist of steps identified by the respective Test Specification. The outline and content of the Test Procedure shall have the following structure:

- Test Procedure Identifier

The Test Procedure shall be identified by a UFTR Document ID number.

- Purpose

Describe the purpose of the Test Procedure.

- Special Requirements

Identify any special requirements that are necessary for the execution of the Test Procedure.

- Procedure Steps

The procedure steps shall be documented as part of the attached test script.

" LogDescribe any special methods or formats for logging the results of the test

execution, the incidents observed, and any other events pertinent to the test.

* Set Up
Describe the sequence of actions necessary to prepare for the execution of the procedure.

" StartDescribe the actions necessary to begin execution of the procedure.

" ProceedDescribe any actions necessary during the execution of the procedure.

* Measure
Describe how the test measurements will be made.

" Shutdown

Describe the actions necessary to suspend testing, when unscheduled events

dictate.

* Restart
Identify any procedural restart points and describe the actions necessary to restart the procedure at each of these points.

* Stop
Describe the steps necessary to bring execution to an orderly halt.

" Wrap Up

Describe the actions necessary to restore the environment.

" ContingenciesDescribe the actions necessary to deal with anomalous events that may occur

during execution.


- Attachment

For each Test Procedure, the test scripts shall be attached. The StartSim.tcl configuration file (see Section 8.3) shall also be attached to each Test Procedure. The test scripts shall be written using the SIVAT command syntax and shall contain the following information:

" Version identification of the test script* Test purpose (including Functions/Modules tested)

8.3 Test Log

The Test Log shall provide a chronological record of relevant details about the execution of tests. The Test Log shall be attached to the Test Summary Report. The outline and content of the Test Log shall have the following structure:

- Description

Information that applies to all entries in the log, except as specifically noted in another log entry, shall be noted here. The test log shall:

" Identify the items being tested, including their version/revision levels.

" Identify the attributes of the environment under which testing is conducted, byincluding facility identification, hardware being used, and system software used.

- Activity and Event Entries

For each event, including the beginning and end of activities, record the occurrence date and time along with the identity of the author. The following information shall be considered:

" Execution DescriptionRecord the identifier of the Test Procedure being executed and supply a referenceto its specification. Record all personnel present during the pre-job brief and testexecution including testers, operators, and observers. Also indicate the function ofeach individual.

" Procedure ResultsFor each execution, record the visually observable results (for example, errormessages generated, aborts, and requests for operator action). Also record thelocation of any output. Record the successful or unsuccessful execution of thetest.

* Environmental Information
Record any environmental conditions specific to this entry (for example, hardware substitutions).

* Anomalous Events
Record what happened before and after unexpected events. Record the circumstances surrounding the inability to begin execution of a Test Procedure or the failure to complete a Test Procedure.

* Incident Report Identifiers
Record the identifier of each Test Incident Report, whenever one is generated.
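The chronological record described above can be illustrated with a minimal sketch. This is a generic Python illustration only; the entry fields, author roles, and identifiers ("TP-xxx", "IR-xxx") are hypothetical placeholders, and an actual Test Log would follow the controlled UFTR format:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LogEntry:
    """One activity/event entry: author, description, and timestamp."""
    author: str
    event: str
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

log: list[LogEntry] = []

def record(author, event):
    """Append a timestamped entry; appending preserves chronological order."""
    entry = LogEntry(author, event)
    log.append(entry)
    return entry

record("Test Engineer", "Started execution of Test Procedure TP-xxx")
record("Test Engineer", "Observed error message; see Incident Report IR-xxx")

for e in log:
    print(e.timestamp.isoformat(), e.author, e.event)
```

Because every entry carries its occurrence time and author, the log can later be replayed to reconstruct exactly what happened during a test session, which is what the Test Summary Report relies on.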

8.4 Test Incident Report

The Test Incident Report shall document any event that occurs during the testing process that requires investigation. The outline and content of the Test Incident Report shall have the following information:

- Test Incident Report Identifier

The Test Incident Report shall be identified by a UFTR Document ID number.

- Summary

Summarize the incident. Identify the Test Items involved, indicating their version/revision level. References to the appropriate Test Specifications or Test Procedures shall be supplied.

- Incident Description

Provide a description of the incident. Provide the name of the person who identified the incident and the date on which the incident was identified. The incident description shall use the discrepancy resolution process defined in Section 5.4 of this document.

- Impact

If known, indicate what impact this incident will have on Test Plans, Test Specifications, Test Procedures, SDD, or Application Software.

8.5 Test Summary Report

The Test Summary Report shall contain test summary information. Some or all of the content of a section may be included by reference to the document that contains the information. The outline and content of the Test Summary Report shall have the following structure for each test run:

- Test Summary Report Identifier

The Test Summary Report shall be identified by a UFTR Document ID number.

- Test Summary

Summarize the evaluation of the test items. Identify the items tested, indicating their version/revision level. Indicate the environment in which testing activities took place. For each test item, provide references to the following documents if they exist:

" Test Plans" Test Specifications

" Test Procedures" Test Incident Reports


* Test Logs

- Variances

Report any variances of the Test Items from their design specifications. Indicate any variances from the Test Plan, Test Specifications, or Test Procedures. Specify the reason for each variance.

- Comprehensive Assessment

Evaluate the comprehensiveness of the testing process against the comprehensiveness criteria in the Test Plan, if it exists. Identify features or feature combinations that were not sufficiently tested and explain the reasons.

- Summary of Results

Summarize the results of the testing. Identify all resolved incidents and summarize their resolutions (i.e., reference to the Test Incident Report). Identify all unresolved incidents.

- Evaluation

Provide an overall evaluation of each test item, including its limitations. The evaluation must be based on the test results and the item-level pass/fail criteria. An estimate of failure risk may be included.

- Summary of Activities

Summarize the major testing activities and events, referencing the Test Log.


9. Testing Tasks

The following steps shall be executed in sequence to verify that the functional requirements of the SRS and the software design in the SDD are properly implemented in the SPACE application:

" Prepare the Test Specification (with expected results) in accordance with the softwarespecification documents.

" Prepare the Test Procedure (test scripts) in accordance with the software specificationdocuments.

" Checkout applicable Software (i.e., Application Software & test scripts) from theSoftware Library.

" Complete pre-job brief prior to running the Test Procedures." Run the SIVAT Test Procedures for the appropriate Test Item." Plot the test results using the SIVAT plot conversion tool* Compare the test results against the expected results listed in the Test Specification to

check for implementation correctness.

If discrepancies are identified by the Independent Verification & Validation (IV&V) Group, the steps described in Section 7 shall be followed for corrective actions.

The procedures for change control are defined in UFTR SCMP, /5/.


10. Environmental Needs

The physical characteristics of the hardware shall be provided in sufficient detail to ensure that the simulation tool (SIVAT) and the Application Software tests can be executed as desired. Also specify the level of security that must be provided for the test facilities, system software, and proprietary components such as software, data, and hardware. Identify any other testing needs. Identify the source for all needs that are not currently available to the test group. Identify any special test tools or supporting software packages needed.

10.1 Hardware

The computers used for the PC clients/Engineering Servers shall have the following minimum performance capabilities:

" Minimum clock frequency of the processor: 600M1-lz• Minimum graphics card/monitor resolution: 1280 x 1024 pixels" Minimum memory capacity: 512 MB" Minimum hard disk capacity: 20GB" Monitor, Mouse (PS/2), Keyboard (PS/2) ports• USB port

10.2 Software

10.2.1 Operating System(s)

For the Engineering Server, the operating system shall be SuSE Linux 8.2.2 or higher. For the PC clients, the operating system shall be able to support a network configuration and use network connection interface software to connect to the Engineering Server.

10.2.2 Communications Software

For the PC client, network connection interface software (i.e., Exceed 8.0.0.0 or higher) is required for multiple users to gain remote access to the Engineering Server.

10.2.3 Security

A password-protected login to the Engineering Server is used by the Test Engineer to perform the test. Software versions are controlled during the test by using the process defined in UFTR "Software Library and Control," /8/.

10.2.4 Tools

The following software tools are required to be installed on the Engineering Server prior to starting testing:

1. Oracle Database version 1.3.1
2. SPACE version 3.0.7a
3. SIVAT version 1.5.3


4. SDE version 1.2.3


11. Responsibilities

The overall project team organization and responsibilities for the design activities for the RPS project are defined in UFTR QAPP, /3/. The individuals and groups identified in the following sections will be involved in SIVAT testing.

11.1 Project Coordinator

The Project Coordinator controls the overall testing process and makes all decisions relevant to project personnel and assignments (including delegation of responsibilities). The Project Coordinator also acts as the interface to other groups for deviation management and documentation version control.

11.2 Software Development Group

The Software Development Group is responsible for the complete software design and the software code generation using the TXS engineering tool SPACE. The Software Development Group is also responsible for the identification and control of software Configuration Items. All modifications to the software design and software code, in all phases, are performed by this group.

11.3 IV&V Team

The IV&V Team uses the SIVAT simulation tool for software testing in order to enhance the quality of the software design prior to the FAT effort.

The Test Administrator is designated by the Project Coordinator to oversee the testing process of each Test Item. This includes preparing the pre-job brief and answering any questions the Test Engineer has during the execution of the Test Item.

The Test Engineer(s) prepares and executes the test. The Test Engineer(s) also prepares the data, in the form of plots, which are reviewed independently for identification of incidents. The IV&V Group is responsible for creating the Test Incident Report and the Test Summary Report.

The IV&V Team is responsible for the independent V&V during the performance of all phases of the software life cycle described in UFTR SVVP, /6/, including this plan.


12. Staffing and Training

12.1 Staffing

The following staff is needed to carry out this testing project:

1. Project Coordinator

2. Test Administrator

3. Test Engineer(s)

Per the Regulatory Guide 1.171, /17/, requirement (noted below), the IV&V personnel should be independent from the Software Development Group. This means that the design engineer(s) shall not be a participant in the Test Plan, Test Specifications, or Test Summary Report development unless an independent review of these documents is performed. Also, the design engineer(s) shall not be involved in the testing unless an independent review of the test results is performed.

12.2 Training

The Test Engineer(s) shall be trained in the use of the SPACE system and the SIVAT testing tool.

The test personnel shall receive the following training prior to performing Application Software testing:

1. Software Configuration Management Plan, /5/

2. Software Verification and Validation Plan, /6/

3. Software Library and Control, /8/

4. Software Safety Plan, /7/

5. Open Item Process

6. TXS Software

7. Quality Assurance Training


13. Schedule

The SIVAT Test Plan and implementing activities are governed by UFTR QAPP, /3/, and SVVP, /6/.


14. Risk and Contingencies

SIVAT testing may not begin if there are Design Open Items to be resolved, unless a conscious project decision is made by Project Management. This project decision must be identified and recorded in the Test Log with sign-offs by Project Management.

Design Open Items that are known at the time of the test start are to be recorded in the Test Log and are to be clearly identified in the Test Summary Report. Actions must be taken to ensure that these Design Open Items are resolved and tested before FAT commences.