SCA Test, Evaluation and Certification
Document WINNF-12-P-0005
Version V1.0.0
25 July 2016
SCA T&E WG
SCA Test Lab
WINNF-12-P-0005-V1.0.0
Copyright © 2016 The Software Defined Radio Forum Inc Page i
All Rights Reserved
TERMS, CONDITIONS & NOTICES
This document has been prepared by the SCA Test, Evaluation, and Certification Work Group to
assist The Software Defined Radio Forum Inc. (or its successors or assigns, hereafter “the Forum”).
It may be amended or withdrawn at a later time and it is not binding on any member of the Forum
or of the SCA Test, Evaluation, and Certification Work Group.
Contributors to this document that have submitted copyrighted materials (the Submission) to the
Forum for use in this document retain copyright ownership of their original work, while at the
same time granting the Forum a non-exclusive, irrevocable, worldwide, perpetual, royalty-free
license under the Submitter’s copyrights in the Submission to reproduce, distribute, publish,
display, perform, and create derivative works of the Submission based on that original work for
the purpose of developing this document under the Forum's own copyright.
Permission is granted to the Forum’s participants to copy any portion of this document for
legitimate purposes of the Forum. Copying for monetary gain or for other non-Forum related
purposes is prohibited.
THIS DOCUMENT IS BEING OFFERED WITHOUT ANY WARRANTY WHATSOEVER,
AND IN PARTICULAR, ANY WARRANTY OF NON-INFRINGEMENT IS EXPRESSLY
DISCLAIMED. ANY USE OF THIS SPECIFICATION SHALL BE MADE ENTIRELY AT
THE IMPLEMENTER'S OWN RISK, AND NEITHER THE FORUM, NOR ANY OF ITS
MEMBERS OR SUBMITTERS, SHALL HAVE ANY LIABILITY WHATSOEVER TO ANY
IMPLEMENTER OR THIRD PARTY FOR ANY DAMAGES OF ANY NATURE
WHATSOEVER, DIRECTLY OR INDIRECTLY, ARISING FROM THE USE OF THIS
DOCUMENT.
Recipients of this document are requested to submit, with their comments, notification of any
relevant patent claims or other intellectual property rights of which they may be aware that might
be infringed by any implementation of the specification set forth in this document, and to provide
supporting documentation.
This document was developed following the Forum's policy on restricted or controlled information
(Policy 009) to ensure that the document can be shared openly with other member
organizations around the world. Additional Information on this policy can be found here:
http://www.wirelessinnovation.org/page/Policies_and_Procedures.
Although this document contains no restricted or controlled information, the specific
implementation of concepts contained herein may be controlled under the laws of the country of
origin for that implementation. Readers are encouraged, therefore, to consult with a cognizant
authority prior to any further development.
Wireless Innovation Forum™ and SDR Forum™ are trademarks of the Software Defined Radio
Forum Inc.
Table of Contents
TERMS, CONDITIONS & NOTICES ..... i
Executive Summary ..... iv
1. Introduction ..... 1
1.1 Introduction to the SCA Specification ..... 1
1.2 Purpose of an SCA Test Lab ..... 1
1.3 Comment on “Compliance” vs. “Verification” vs. “Validation” ..... 2
1.4 SCA Test Lab Ecosystem ..... 3
1.5 Relationship to “SCA Test and Certification Guide …” (April 2009) ..... 4
1.6 References ..... 5
1.6.1 US Government Documents References ..... 5
1.6.2 Other References ..... 5
2. Organizing and Operating Requirements ..... 6
2.1 Test Labs Accreditation ..... 6
2.2 Network of Test Labs ..... 6
2.3 SCA Test Labs ..... 7
2.3.1 Test Lab #1: (Government Test Lab) ..... 7
2.3.2 Test Lab #2: (Radio Provider Test Lab) ..... 7
2.3.3 Test Lab #3 (Independent Test Lab) ..... 7
2.4 Transparency ..... 9
2.5 Integrity and the Protection of Intellectual Property ..... 9
2.6 Acquisition of Test Procedures, Tools and Platforms ..... 9
3. SCA Test Lab Procedures & Deliverables ..... 11
3.1 Evaluating Compliance ..... 11
3.2 SCA Test Procedures ..... 11
3.2.1 Methods of Testing SCA Requirements ..... 11
3.3 Tool Selection ..... 13
3.3.1 Tools that allow new rules or customizability ..... 14
3.3.2 Tools that automatically generate code ..... 14
3.4 Test Coordination with Developers ..... 14
3.5 SCA Test Lab Report ..... 15
3.6 Meaning of SCA Compliant ..... 16
4. Business Case Considerations ..... 17
4.1 Business Income ..... 17
4.2 Business Expenses ..... 17
4.3 Business Stakeholders ..... 18
4.4 Suppliers to the SCA Test Lab ..... 18
4.5 Other Business Relationships of the SCA Test Lab ..... 19
5. Key Takeaways ..... 20
6. Acronyms ..... 22
7. Appendix A - Test Lab Requirements ..... 24
8. Appendix B - Test Lab Best Practices ..... 26
9. Appendix C – Items to be Submitted to Test Lab for Evaluation ..... 27
10. Appendix D – Contents of a Test Lab Report ..... 28
List of Figures
Figure 1 – SCA Ecosystem with SCA Test Lab Highlighted ..... 2
List of Tables
Table 1: US Government Docs References Table ..... 5
Table 2: Other References Table ..... 5
Table 3: SCA Test Laboratories ..... 8
Table 4: Table of Acronyms ..... 22
Executive Summary
This document provides detailed information on the role, organization, and responsibilities for an
SCA Test Lab. This information is intended to benefit those interested in understanding the finer
details of establishing an SCA Test Lab or the function of an SCA Test Lab within the broader
SCA ecosystem. This document is not specific to any version of the SCA, but is written with
awareness of compliance requirements for SCA 2.2 and 2.2.2[A], and the SCA 4.0 and 4.1
specifications.
In particular, this report provides analysis and recommendations pertaining to:
o The establishment and understanding of requirements for organizing and operating an SCA Test Lab, including a description of the variety of SCA Test Labs
o The establishment of verification and validation procedures for testing SCA requirements and producing deliverables, including compliance reports
o The business case relating to costs, funding, development, and plausible customers of an SCA Test Lab
The goal of this document is to provide organizations that need an SCA Test Lab with the knowledge and expectations they can use to establish a test lab business specializing in testing SCA-based SDRs. Additionally, this report should provide the reader with an understanding of the requirements for establishing meaningful, reproducible verification and validation test methodologies, the business and marketing considerations needed to continue as an ongoing enterprise, and the characteristics of a team dedicated to its execution. For the broader SCA community, this document should provide a starting point for addressing the need to establish a certification entity for third-party waveform software.
SCA Test, Evaluation and Certification
1. Introduction
1.1 Introduction to the SCA Specification
The Software Communications Architecture (SCA) specification and derived supporting SCA
Standards provide an architecture framework intended to assist in the development of Software
Defined Radio (SDR) communication systems. Compliance with the SCA ensures that a
deployment platform will provide a well-defined set of capabilities and simplifies the process of
porting waveform application software across radio platforms.
1.2 Purpose of an SCA Test Lab
An SCA Test Lab is a credible and accepted test facility that provides “controlled” conditions to
perform SCA compliance verification and (optionally) interoperability and portability validation
tests on SDR waveform applications and operating environments. By being accredited, the SCA
Test Lab is trusted to provide timely, accurate, reproducible test results. To achieve this goal, the
SCA Test Lab is expected to either develop or license, adopt, and then employ test procedures and
test tools that have been recognized by an Accreditation Body.
Through the results of verification and validation tests on SCA-based SDRs, the SCA Test Lab
generates and compiles their official test findings. Upon completion of a Test Report, the SCA
Test Lab interacts with the Certification Body to ensure that all key stakeholders acknowledge,
and are brought into agreement about, the conclusions of the Test Report. Additionally, the SCA
Test Lab interacts with the Specification Body and Standards Definition Body on clarifications to
acceptance criteria and with Test Suite Developers on automated software applications to cover
various requirements and opportunities to refine or improve Test Procedures. Interaction with the
Definition Body and the Accreditation Body ensures that the SCA Test Lab is using valid tests for
the SCA requirements being tested for compliance.
Figure 1 illustrates the interactions of the SCA Ecosystem with the SCA Test Lab highlighted.
Figure 1 – SCA Ecosystem with SCA Test Lab Highlighted
1.3 Comment on “Compliance” vs. “Verification” vs. “Validation”
The primary purpose of an SCA Test Lab is to test compliance of SCA waveform application and
operating environment source and resulting binary code with the SCA specification. Compliance
testing is a form of verification. In the software engineering sense, verification refers to the process
of determining whether or not a software system meets a specification. In contrast, validation
refers to the process of determining whether or not the software system achieves its intended result.
One of the intended goals of the SCA is to facilitate some measure of portability of software
applications across platforms. The SCA specification itself does not provide a guarantee of
portability. In the context of the SCA, any testing that an SCA Test Lab does to empirically
determine the portability of an application is properly referred to as validation testing. While a test
lab may do other forms of validation testing (e.g., determine if an SDR system meets various
operational requirements), those functions are outside the scope of this document.
1.4 SCA Test Lab Ecosystem
Stakeholder: Relationship Description

Test Lab: Perform SCA verification and validation tests
Accreditation Body: Approve the Test Lab and the tools and procedures used by the Test Lab
Certification Body: Receive and certify test results generated by the Test Lab
Test Suite Developer: Develop test tools and procedures which can be adopted or licensed by the Test Lab
Radio Provider: Provide radio artifacts for testing (source code, binary, hardware, documentation, etc.)
Definition Body: Define SCA Standards which the Test Lab should verify
Specification Body: Provide guidance on interpreting the SCA specification
Accreditation Body: Grant accreditation to the Test Labs
Radio Provider: Collaborate with the Test Lab on the interpretation of SCA test results
Test Suite Developer: Tools may be accredited independent of their adoption by an SCA Test Lab
Definition Body: Define SCA Standards used to accredit an SCA Test Lab
Certification Body: Grant SCA certification of SCA Compliance
Radio Provider: Receive SCA certification or reason for non-compliance
Test Suite Developer: Develop test procedures and test tools
Radio Provider: Adopt or license test tools and procedures for independent use during radio development
3rd Party Developer: Develop components to be incorporated into test tools and procedures
Definition Body: Provide standards that form the basis for SCA verification tests
Radio Provider: Provide a completed, certified radio
Procurement Entity: Receive the completed, certified radio
3rd Party Developer: Develop components to be incorporated into radios
Definition Body: Provide standards for SCA developers
Definition Body: Develop standards, interpret the SCA specification, and provide the certification criteria
Specification Body: Receive the SCA specification as input for developing SCA compliance criteria
3rd Party Developer: Develop components to be incorporated into 3rd party components
Specification Body: Develop the SCA specification
3rd Party Developer: Develop components to be incorporated by other stakeholders
Procurement Entity: Define the functionality of and purchase an SCA radio
1.5 Relationship to “SCA Test and Certification Guide …” (April 2009)
The “SCA Test and Certification Guide for SDRs based on SCA Part 1: SCA,” approved in April
2009, provides an overview of the roles and interactions of stakeholders in the SCA certification
process. The material in Section 1.4 summarizes the descriptions from that document. The April
2009 document describes what it means for a radio to be certified, steps in preparing for
certification, and a high-level description of the certification process. Unless explicitly stated
herein, it should be assumed that the material in this document is consistent with, and builds upon,
this prior work.
In contrast to the previous work, the purpose of this document is to provide guidance on the process
of establishing an SCA Test Lab – one element of the broader SCA certification process. This
document provides more detail on the organization and operation of an SCA Test Lab and defines
in more detail what an SCA compliance and (optionally) validation test process should look like.
1.6 References
1.6.1 US Government Documents References
Table 1: US Government Docs References Table
JTNC Released References:
1. Joint Program Executive Office (JPEO) Joint Tactical Radio System (JTRS) (2006). Software Communication Architecture Specification, v2.2.2.
2. Joint Program Executive Office (JPEO) Joint Tactical Radio System (JTRS), JTRS Test and Evaluation Laboratory (JTEL) SCA 2.2.2 OE Requirements List version 2.2 Release Notes, November 4, 2010, JTEL-SCAv2.2.2-OEReqts-v2.2.pdf.
3. Joint Program Executive Office (JPEO) Joint Tactical Radio System (JTRS), JTRS Test and Evaluation Laboratory (JTEL) SCA 2.2.2 Application Requirements List version 2.2 Release Notes, July 8, 2010, sca_2_2_2_application_requirements_list_v2.2.pdf.
4. Joint Tactical Networking Center (JTNC), The Complete SCA 4.1 Specification, http://www.public.navy.mil/jtnc/sca
1.6.2 Other References
Table 2: Other References Table
Other References:
1. ISO/IEC 17025:2005, General Requirements for the Competence of Testing and Calibration Laboratories.
2. ISO/IEC 17000:2005, Conformity Assessment: Vocabulary and General Principles.
3. The International Traffic in Arms Regulations (ITAR); available at http://www.pmddtc.state.gov/regulations_laws/itar.html
4. SCA 2.2.2 Endorsement, SDRF-08-R-0006-V0.6.0; Document of the SDR Forum.
5. "Certification Guides" (SDRF-08-P-0007-V1.0.0): Test and Certification Guide for SDRs based on SCA, Part 1: SCA.
2. Organizing and Operating Requirements
2.1 Test Labs Accreditation
The Accreditation Body has the responsibility to officially endorse that a Test Lab has the authority
to carry out SDR assessments of SCA requirements. For the purposes of this description, the
Accreditation Body shall refer to the collective body that issues endorsements, even if separate
endorsements from one or more sources are involved. It is expected that an SCA Test Lab will have
received all necessary endorsements prior to conducting any test activity and that these
endorsements will be available for review by all parties who interact with the Test Lab.
The Accreditation Body assesses the Test Lab to ensure that it is capable of conducting SDR
testing. This includes an assessment of the Test Lab’s capability and practices with respect to the
test activity and may also include assessment of the Test Lab as a business entity capable of
operating in a professional and responsible manner (e.g., capable of safeguarding proprietary
information provided by 3rd-party developers). Accreditation also recognizes that the Test Lab will
issue a test report assessment for each SDR under test.
Under the scenario where the Test Lab is a National Lab, the endorsement and accreditation can
be issued by the nation’s respective national authorities. Accreditation should follow any
accreditation guidelines provided by the Definition Body, where practical, to ensure that the Test
Lab will support the harmonized certification guidelines (certification criteria, etc.) the Definition
Body may have established. In this case, it may be that the Accreditation Body provides an
endorsement of the test capability and the national authority then provides an additional
endorsement of the Test Lab as a business entity authorized to operate in that nation. This latter endorsement may also serve to limit the Test Labs that may conduct test activities for that nation.
It is more than feasible that a National Test Lab can be accredited by an industry-accepted
Accreditation Body from one or more other nations. With the accreditation, the respective and
participating nations accept and confirm the certificates and test reports issued by the respective
Test Lab. It may also be that a National Test Lab is permitted to outsource some testing, with final
certification to be provided by the Test Lab endorsed by the respective nation.
2.2 Network of Test Labs
Different Test Labs may include the capability to test and certify (as determined by the
Accreditation Body) one or more of the following groups of criteria:
o SCA conformance of Operating Environment (OE)
o SCA conformance of Applications (APP)
o Application Programming Interface (API) criteria
o Platform execution performance/interoperability/portability criteria
This document envisions the establishment of multiple Test Labs, each providing a regional,
national, or international capability. The existence of an available, relevant, accredited Test Lab
should be considered a precondition of SDR development. SDR Test Labs consist of government
test labs, industrial labs, and labs of international organizations with cross-accreditations.
Each of the Test Labs may draw upon a common set of test methodologies, test procedures, and
test tools. Sharing would serve to ensure a higher level of portability (and possibly interoperability)
across all SDR Test Labs. The expectation is that common test methodologies, test procedures and
test tools will be augmented and supplemented by nation-specific test procedures for unique
capabilities of nation-specific SDRs. It is, however, assumed that the Accreditation Body may set
limits on the extent to which Test Labs may share information about process or use of test tools.
For example, a national or international organization may prevent the sharing of nation- or
organization-specific processes.
2.3 SCA Test Labs
The Test Lab is a facility that provides controlled conditions to achieve reproducible test results
and has the credibility to perform tests deemed necessary to certify compliance with the SCA. The
Test Lab shall have accreditation for performing such tests. It will use test procedures and test
tools developed by the Test Suite Developer. The Test Lab also interacts with the Definition Body
on clarifications and refinements of the Test Procedure and/or requirements.
Test Labs are accredited by the Accreditation Body. The Test Lab takes a radio and/or its source code, executes the test procedures, and determines the compliance of the radio against the Test Procedures.
The Test Lab may be a government entity, an independent entity or the testing laboratory of a
Radio Provider. The Test Lab will provide feedback to the Test Developer on the Test Procedures.
The Test Lab will be responsible for maintaining its ongoing accreditation as a Test Lab. The Test
Lab supplies a Compliance Test Report to the Certification Body; it does not provide the
certification.
2.3.1 Test Lab #1: (Government Test Lab)
This is a Test Lab that is controlled by a national agency. It can provide test services for
radios/software developed as part of procurement or independently developed.
2.3.2 Test Lab #2: (Radio Provider Test Lab)
This is a Test Lab that is an organization within a Radio Provider. It can provide test services for
radios/software developed as part of procurement or independently developed. A Test Lab
operated by a radio provider supports the Self-Evaluation for SCA Compliance model. It is not
expected that a Radio Provider would perform validation testing for another Radio Provider.
2.3.3 Test Lab #3 (Independent Test Lab)
This is a Test Lab that is an independent organization, not part of a government or Radio Provider.
It can provide test services for radios/software developed independently or as part of procurement.
Table 3: SCA Test Laboratories
Government Test Lab (Government TL)
o Goals: Evaluate a radio or SW component for SCA Compliance.
o Tasks: Establish a lab to perform validation tests on the SCA Specification according to the Test Procedure, obtain Test Lab accreditation, execute the Test Procedure, and generate a Validation Test Report.
o Customers: Procurement Entity, Radio Providers, 3rd Party SW Developer, or Certification Body.
o Suppliers: Specification Body, Definition Body, Accreditation Body, or Test Suite Developer.
o Finance Flow: In-flow from the Procurement Entity, Radio Provider, or 3rd Party SW Developer; out-flow to the Accreditation Body or Test Suite Developer.
o Information & Material Flow: In-flow of the Standards Specification, Accreditation Criteria, Accreditation Certification, Test Suite, or Radio and/or SW to be validated; out-flow of the Validation Test Report.

Radio Provider Test Lab (Radio Provider TL)
o Goals: Provide Self-Evaluation capabilities for radio or SW component SCA Compliance.
o Tasks: Establish a lab to perform validation tests on the SCA Specification according to the Test Procedure, obtain Test Lab accreditation, execute the Test Procedure, and generate a Validation Test Report.
o Customers: Procurement Entity, 3rd Party SW Developer, or Certification Body.
o Suppliers: Specification Body, Definition Body, Accreditation Body, or Test Suite Developer.
o Finance Flow: In-flow from the Procurement Entity or 3rd Party SW Developer; out-flow to the Accreditation Body or Test Suite Developer.
o Information & Material Flow: In-flow of the Standards Specification, Accreditation Criteria, Accreditation Certification, Test Suite, or Radio and/or SW to be validated; out-flow of the Validation Test Report.

Independent Test Lab (Independent TL)
o Goals: Provide evaluation capabilities to external entities for radio or SW component SCA Compliance.
o Tasks: Establish a lab to perform validation tests on the SCA Specification according to the Test Procedure, obtain Test Lab accreditation, execute the Test Procedure, and generate a Validation Test Report.
o Customers: Procurement Entity, 3rd Party SW Developer, Radio Provider, or Certification Body.
o Suppliers: Specification Body, Definition Body, Accreditation Body, or Test Suite Developer.
o Finance Flow: In-flow from the Procurement Entity, 3rd Party SW Developer, or Radio Provider; out-flow to the Accreditation Body or Test Suite Developer.
o Information & Material Flow: In-flow of the Standards Specification, Accreditation Criteria, Accreditation Certification, Test Procedure, Test Report Forms, Test Tools, or Radio and/or SW to be validated; out-flow of the Validation Test Report.
2.4 Transparency
An SCA Test Lab is expected to use known and publicly released documents for the verification
of compliance with SCA requirements. An SCA Test Lab may use unpublished tests to verify known SCA requirements, but in all cases, the SCA Test Lab should be able to trace violations back to public documents.
For the purposes of this section, “public” refers to specifications and other test criteria that are known in common to all of the stakeholders in the system being tested. This may include specifications that are not known to the public at large (e.g., requirements that are specific to a particular national agency).
2.5 Integrity and the Protection of Intellectual Property
An SCA Test Lab has a duty to protect the intellectual property embodied in systems under test.
This requires written policies and procedures to safeguard that property. Policies can vary by SCA Test
Lab, but should include, at a minimum, explicit prohibitions from releasing information about a
system under test to any entity that is not a party to the test being conducted. Information that must
be safeguarded includes, but should not be considered to be strictly limited to: source code, system
specifications, associated documentation, and the outcomes of any testing conducted by the SCA
Test Lab.
It is reasonable for customers of an SCA Test Lab to expect that the protection of intellectual
property will be part of their test agreement and that these protections will be a precondition for
releasing any materials related to the product to be tested to the SCA Test Lab.
2.6 Acquisition of Test Procedures, Tools and Platforms
Automated test tools provide the benefit of efficiently and effectively conducting SCA compliance
testing for the SCA Test Lab. Utilizing Test Suite tools is an effective way to lower the cost of conducting SCA compliance testing. It is widely understood that automated Test Suite tools reduce
the amount of test time, reduce errors, and support the aim of reproducibility of test results.
Therefore, the SCA Test Lab is encouraged to incorporate the use of SCA compliance Test Suite
tools (software and platforms).
It is the responsibility of the Test Suite Developers to design and develop automated SCA Test
Suite tool applications for industry. Under this scenario, it is the Definition Body that develops the requirements for the software Test Suite, which must align with the SCA requirements.
Test Suite Developers may acquire test software and/or platforms from third-party developers.
Any third-party software tool must be approved by the Accreditation Body before it can be used
for SCA compliance testing. Approval by the Accreditation Body requires formal certification that
the tool correctly tests the SCA requirements within its scope. It is expected that certification of a
test tool or test platform will be conducted by comparing the test output with one or more reference
outputs, either from constructed test cases or prior test runs from other certified tools or platforms.
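The reference-comparison approach described above can be sketched in a few lines of Python. This is an illustrative assumption only: the "requirement ID, verdict" file format and the function names are invented for the sketch and are not drawn from the SCA specification or any accreditation standard.

```python
# Hypothetical sketch: certify a candidate test tool by comparing its
# per-requirement verdicts against a reference ("golden") output produced
# by an already-certified tool. File format is an illustrative assumption.

def load_results(path):
    """Parse 'requirement_id,verdict' lines into a dict, skipping blanks/comments."""
    results = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            req_id, verdict = line.split(",", 1)
            results[req_id.strip()] = verdict.strip()
    return results

def compare_to_reference(candidate_path, reference_path):
    """Return a list of discrepancies between a candidate run and the reference run."""
    candidate = load_results(candidate_path)
    reference = load_results(reference_path)
    discrepancies = []
    for req_id, expected in reference.items():
        actual = candidate.get(req_id)
        if actual is None:
            discrepancies.append(f"{req_id}: missing from candidate output")
        elif actual != expected:
            discrepancies.append(f"{req_id}: expected {expected}, got {actual}")
    return discrepancies  # empty list means the candidate matched the reference
```

In practice an Accreditation Body would define the authoritative output format and tolerance rules; the point of the sketch is only that tool certification reduces to a reproducible comparison against reference outputs.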
Certain test suites, such as JTAP, allow for the inclusion of custom implementations. These provide an example of how the Test Suite Developer may implement the test methods required by a specific certification criterion that deviates from the criteria already implemented in the tool. The Test Suite Developer should check the feasibility of implementing such a specialized test mechanism in the existing tool, and the implementation of these custom test methods should also be validated by the Definition Body.
Runtime test platforms should be able to support the basic elements of the Operating Environment,
e.g., CORBA communication, file system, or Naming Service. They should have sufficient
resources to be able to host more than one waveform application simultaneously and provide for
communication of modulated data between waveforms.
Appendix A provides a list of SCA Test Lab Requirements.
Appendix B provides a summary of recommended SCA Test Lab Best Practices.
3. SCA Test Lab Procedures & Deliverables
3.1 Evaluating Compliance
Compliance to the SCA Specification requires that an SDR product meet all applicable SCA
requirements identified within the scope of the specification. SDR products are submitted to the
Test Lab Body for compliance verification. Results of that compliance verification are submitted
to the Certification Body for evaluation.
Appendix C provides a list of items to be submitted to an SCA Test Lab for evaluation.
3.2 SCA Test Procedures
The Test Lab will use resources and/or solutions that enable the Test Lab to properly verify and
validate compliance of an SDR to the SCA.
The Test Lab can use Test Procedures: detailed test descriptions that provide the basic instructions for verifying and validating specific SCA requirements. The development of these tools and procedures is outside the scope of this document.
The Certification Criteria, derived from the applicable version of the SCA standard, will be the
basis for the Test Procedures. Test procedures and test tools that define the verification and
validation of SCA requirements must be defined and developed.
It is at the discretion of the SCA Test Lab and the Accreditation Body whether test procedures will be made available, in part or in whole, to the vendors submitting products for testing. Making test procedures available helps vendors address potential SCA compliance issues before submission, but holding back certain procedures discourages vendor overfitting to particular tests and may, in the long run, be better for the integrity of the overall test process.
3.2.1 Methods of Testing SCA Requirements
An SCA requirement is the basic building block of the certification testing process. Ideally, an
SCA requirement should be a single testable fact about an SDR product and a test for an SCA
requirement should produce a pass or fail result.
The test process attached to each requirement should be categorized according to each of three (3)
broad criteria:
Type of test: Static or Dynamic
Static – A static test is a test that can be run on the SDR product artifacts alone and does not require
the product to be built (compiled) or executed. This class of tests includes analysis and/or
inspection of source code, IDL, XML descriptor files, and documentation.
Dynamic – A dynamic test is a test that requires the software to be built (compiled) and executed
either on an actual radio platform or in a test harness.
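To make the static/dynamic distinction concrete, the following is a minimal sketch of a static test: it inspects an XML descriptor artifact without building or executing the product. The element and attribute names here are illustrative assumptions, not drawn from the SCA specification.

```python
# Sketch of a static test: analyze a descriptor artifact as text/XML only.
# The structural rule checked (every 'property' carries an 'id') is a
# made-up example of the kind of rule such a test might enforce.
import xml.etree.ElementTree as ET

def check_descriptor(xml_text: str) -> tuple[bool, list[str]]:
    """Return (passed, findings) for a single descriptor artifact."""
    findings = []
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as exc:
        return False, [f"not well-formed XML: {exc}"]
    for prop in root.iter("property"):
        if "id" not in prop.attrib:
            findings.append("property element missing 'id' attribute")
    return not findings, findings

sample = "<softpkg><property id='freq'/><property/></softpkg>"
ok, findings = check_descriptor(sample)
print(ok, findings)
```

A dynamic test of the same product, by contrast, would require compiling the software and observing its behavior on a radio platform or in a test harness.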
Level of automation: Automated, Partially Automated, or Manual
Automated – An automated test is one that is fully machine reproducible given some (saved)
environment configuration.
Partially Automated – A partially automated test is one that contains both an automated and a
manual component. The manual component may occur before and/or after the automated
component. Partially automated tests may be further characterized by the level (percentage) of
automation.
Manual – A manual test is a test that requires the execution of a human procedure with little or no automation (beyond possibly using a generic COTS tool, e.g., grep).
Type of pass result: Necessary, Sufficient, Both, or Neither
Necessary – A necessary result means that the requirement does not hold if the test fails. A result
of pass does not mean that the requirement is verified, it only indicates that proof of non-
compliance could not be found. A necessary test may generate false-negative results – results that
indicate a pass when the result should be fail.
Sufficient – A sufficient result means that the requirement holds if the test passes. A result of fail
does not mean that the requirement is not satisfied, it only means that a proof of compliance could
not be found. Sufficient tests are often coupled with pre-conditions on how the product must be
developed (e.g., a coding standard that makes certain properties transparent). A sufficient test may
generate false-positive results – results that indicate a fail when the result should be a pass.
Both – A necessary and sufficient test means that the requirement holds if and only if the test
passes. This is the strongest possible type of test. These tests generate neither false-positive nor
false-negative results.
Neither – A test that is neither necessary nor sufficient produces results that must be manually
post-processed to make a pass/fail determination. These tests are capable of generating both false-
positive and false-negative results. Often, a “neither” test does not attempt to provide any
determination of compliance and only collects information for further inspection or analysis.
Note that for manual tests, this classification does not take into account the potential for human
error. Also note that for partially automated tests, it may make sense to characterize the parts of
the test separately. For example, the initial (automated) portion of the test may be sufficient, but
after manual post-inspection all of the possible false-positive results may be eliminated, making
the process both necessary and sufficient.
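The three-axis categorization above can be recorded as a small data structure attached to each test process. The class and identifier names below are illustrative assumptions; the axis values mirror Section 3.2.1.

```python
# One way to record the three-axis categorization of a test process.
from dataclasses import dataclass
from enum import Enum

class TestType(Enum):
    STATIC = "static"
    DYNAMIC = "dynamic"

class Automation(Enum):
    AUTOMATED = "automated"
    PARTIAL = "partially automated"
    MANUAL = "manual"

class PassResult(Enum):
    NECESSARY = "necessary"
    SUFFICIENT = "sufficient"
    BOTH = "both"
    NEITHER = "neither"

@dataclass(frozen=True)
class TestCharacterization:
    requirement_id: str
    test_type: TestType
    automation: Automation
    pass_result: PassResult

# Example: a fully automated static descriptor check that can only
# demonstrate non-compliance (necessary, not sufficient).
xml_check = TestCharacterization("SCA-XML-001", TestType.STATIC,
                                 Automation.AUTOMATED, PassResult.NECESSARY)
```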
For necessary tests, a test A is said to subsume a test B if a pass result for A implies a pass result
for B. For sufficient tests, a test A is said to subsume a test B if a pass result for B implies a pass
result for A. Notwithstanding factors such as test execution time, if test A subsumes test B, then
test A can be considered a “better” or “more accurate” test for the requirement.
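For necessary tests, subsumption can be pictured by modeling each test as the set of defect classes it is able to detect: a necessary test passes exactly when it detects no defect, so "a pass for A implies a pass for B" holds whenever every defect B can detect, A can detect too. The sketch below illustrates this; the defect-class names are made up for illustration.

```python
# Model each necessary test by the set of defect classes it can detect.
# Test A subsumes test B when B's detectable defects are a subset of A's:
# if A finds nothing (A passes), B cannot find anything either (B passes).

def subsumes_necessary(detects_a: set[str], detects_b: set[str]) -> bool:
    """True if necessary test A subsumes necessary test B."""
    return detects_b <= detects_a

deep_scan = {"missing-port", "bad-property-id", "unconnected-uses-port"}
quick_scan = {"missing-port", "bad-property-id"}

assert subsumes_necessary(deep_scan, quick_scan)      # deep_scan is "better"
assert not subsumes_necessary(quick_scan, deep_scan)
```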
In an ideal world, all SCA requirements would be written so as to admit an automated (static or
dynamic) necessary and sufficient test. In practice, this is typically impossible for many needed
requirements without imposing draconian development standards. For example, many problems
related to the possible execution of code are undecidable – it is provable that no static analysis can
test them perfectly in all cases and no dynamic test tool can be assured of having tested all possible
execution paths.
As a guideline, static tests are more likely to be sufficient and dynamic tests are more likely to be necessary. This is because static tests can review an over-approximation of all possible program paths (including some that cannot be realized), while dynamic tests exercise only some subset of all possible executions. Exceptions to these guidelines do exist, however; for example, some partially automated static tests search for necessary code constructs. This suggests that some requirements may benefit from the application of multiple tests that provide overlapping capability. In total, overlapping tests will usually lead to better overall test coverage for a requirement.
Beyond these broad characterizations, it is generally useful for each test to explain the result
characterization with examples or an enumeration of the ways in which the test can produce an
imperfect result. This information can be used to inform the Standards Body of how to sharpen a
requirement and can also be used to produce better tests in the future.
3.3 Tool Selection
It is anticipated that the success of the SCA will lead to an ecosystem of SCA tool vendors. For SCA compliance testing, these tools may include both stand-alone products (presumably covering a specific set of SCA requirements and applicable to an identifiable subset of SCA product classes) and integrated tools applicable to products developed within a particular development environment.
For individual SCA requirements, the material in Section 3.2 on what makes one test better than
another (test quality) provides one of the criteria for selecting tools. Other criteria to be considered
by a Test Lab when selecting among accredited tools for assembling an SCA Test Lab include, but
should not be limited to:
Cost (both up-front and maintenance)
Support (direct technical support and likelihood of ongoing maintenance and upgrades)
Ease-of-use (both initial training to proficiency and day-to-day)
Reliability
Platform requirements (including versatility of platforms)
Openness (dependence on closed or proprietary modules)
Scope (may be a preference for fewer tools that cover more requirements)
Applicability (range of products to which the tool can be applied)
3.3.1 Tools that allow new rules or customizability
An SCA Test Lab may employ tools that include functionality for defining new tests. This
functionality may be present in the form of a rule-definition or scripting language or through
common rule patterns that can be instantiated in new ways. For SCA compliance testing, the
specific rule, script, or pattern instantiation should be evaluated and accredited independent of the
tool (or tool platform) itself. SCA Test Labs should not have the authority, on their own, to define
and accept new tests and SCA Test Labs should only use customizations or customization
processes that have been approved by the Accreditation Body. It is expected and encouraged that
approved customizations be shared among SCA Test Labs.
3.3.2 Tools that automatically generate code
An SCA Test Lab should also anticipate the growing use of vendor-side development tools that
automatically generate large portions of SCA code. In this model, visual models of an SCA product
translate to SCA scaffolding code (including, for example, setup of device properties, port
connections, and startup and shutdown code) along with placeholders for vendor-supplied business
logic.
While it is beyond the scope of this document, it is foreseeable that an Accreditation Body may accredit a tool that automatically generates code and designate such code as compliant by construction. In this case, compliance testing in the context of an SCA Test Lab may be limited to verifying that the provided code was, in fact, generated from the accredited code generation tool and then separately testing the business logic. Alternatively, the SCA Test Lab may enlist accredited tools that verify the SCA compliance of the code-generating artifacts (e.g., visual models) used by the accredited code generation tools.
Both of these scenarios lead toward a relationship between the Developer and an SCA Test Lab
that permits something other than strict separation between development and compliance testing.
Section 3.4 elaborates on possible alternate relationship models.
3.4 Test Coordination with Developers
SCA Test Labs are encouraged to engage with developers at the earliest possible point in the
software development process. This early engagement will help identify potential obstacles or
areas of misunderstanding before the beginning of the formal (final) SCA certification process.
We identify three progressively more advanced forms of developer-coordinated compliance
testing:
1. Pre-Testing: A vendor uses tools known to, and used by, the Test Lab to develop and test
their software. Test results are thus, presumably, known to the vendor prior to submission
for formal compliance certification.
2. Pre-Certification: The vendor uses tools accredited by the Test Lab and submits the output
of those tools for review by the Test Lab as proof of compliance.
3. Self-Certification: The vendor uses tools accredited by the certification authority and
directly publishes the product and test results from those tools as proof of compliance.
The possibility of pre-certification, and ultimately self-certification, requires the certification of
tools and the means to establish trust that the certified tools have been configured and executed in
accordance with published test procedures.
The prime difficulty of vendor-assisted pre-certification is retaining the high confidence in the
results that stems from the use of an independent certification authority in the first place. One
possible way to increase this confidence is by using code-signing techniques similar to those used
for cryptographic applications. The checking software would produce a checksum as part of the
report that is unique to the combination of test software, test configuration, test invocation options,
SCA XML project files, IDL, source code and test results. The vendor would deliver the test
results, with included checksum, which could then be quickly checked for validity using a simple
“result checker” utility against the other artifacts. If the test results or any of the artifacts are
altered, the checksum will not match. If the checksum matches, the result can be treated as
authoritative for those software artifacts and test configuration.
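One way the described checksum scheme could be realized is with a keyed digest (HMAC), so that only the approved test software, which holds the key, can produce a valid checksum over the artifacts and results. This is a sketch under assumptions: the key handling, artifact layout, and function names below are illustrative, not a prescribed mechanism.

```python
# Sketch of the checksum scheme: a keyed digest over test artifacts and
# results. Because only the accredited test software holds LAB_KEY, a
# vendor cannot recompute a valid checksum after altering anything.
import hashlib
import hmac

LAB_KEY = b"held-only-by-accredited-test-software"  # illustrative placeholder

def seal_results(artifacts: dict[str, bytes], results: bytes) -> str:
    """Digest XML/IDL/source artifacts plus results into one checksum."""
    mac = hmac.new(LAB_KEY, digestmod=hashlib.sha256)
    for name in sorted(artifacts):      # stable order across runs
        mac.update(name.encode())
        mac.update(artifacts[name])
    mac.update(results)
    return mac.hexdigest()

def check_results(artifacts: dict[str, bytes], results: bytes,
                  checksum: str) -> bool:
    """The simple 'result checker': recompute and compare."""
    return hmac.compare_digest(seal_results(artifacts, results), checksum)

artifacts = {"app.xml": b"<softpkg/>", "app.idl": b"interface App {};"}
report = b"all applicable requirements: pass"
seal = seal_results(artifacts, report)
assert check_results(artifacts, report, seal)            # untouched: valid
assert not check_results(artifacts, b"tampered", seal)   # altered: invalid
```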
Self-certification of compliance with automatically testable SCA requirements would require that
the “result checker” be available to the software consumer. In cases where the vendor is unwilling
to make source code available (such as for intellectual property reasons), the test result would have
to be linked to the binary image. A separate assurance that the binary is the product of the tested
source code would then have to be held in escrow by a trusted entity. Ultimately, the security of
any signing approach depends on the integrity of the checksum. The vendor should not be able to
manually calculate the checksum for any given code base and set of test results; only the test
software approved by the Test Lab should be able to do this.
3.5 SCA Test Lab Report
The end product of an SCA Test Lab compliance test event is a Test Report. The Test Report
should be provided back to the developer and, in the case of a recommendation for compliance
certification, also passed on to the Certification Body. In cases where compliance is not
recommended, the Test Report should include the most precise information available to assist the
vendor in identifying and remedying the identified points of non-compliance.
In assembling an SCA Test Lab Report, every effort should be made to ensure that the contents and conclusions of the report are reproducible. It is anticipated that, in many cases, testing will be an iterative process, and it is essential that repeated tests yield consistent results. A preference for automated tests can assist with this, but in all cases the Test Lab Report should include all of the information necessary to reconstruct the conditions of the test. This includes a complete record of what was submitted, all test platform and test configuration information, and an identification of anyone who performed manual inspection tasks along with their notes.
Appendix D provides suggestions for the contents of an SCA Test Lab Report.
3.6 Meaning of SCA Compliant
Once an accredited SCA Test Lab submits a test report to the Certification Body for review, the Certification Body has the authority, based on its review of the test report, to designate the corresponding product as "SCA Compliant." It is expected that:
o A designation of SCA Compliant will be accompanied by a specific version of the SCA specification.
o A designation of SCA Compliant should follow only after no deviations from the relevant version of the SCA specification are found using the most complete, accredited methodology available to the Test Lab.
o While a Test Lab may take into consideration waivers for certain requirements or aspects of requirements, these waivers should be judged by the Test Lab and affirmed by the Certification Body as not fundamentally impacting alignment with the intent of the SCA.
o Except by SCA version number and possibly by domain profile within the scope of the relevant specification, a designation of SCA Compliant should not be qualified in any other way.
In some cases, where substantial waivers are required, a designation of SCA Compliant with
Waivers can be used.
4. Business Case Considerations
The value of SCA-based test, evaluation and certification of products is primarily focused on
providing assurance of compliance with established standards as a foundation for enabling
waveform portability and, as a consequence, a widespread opportunity for interoperable
communications. Users and procurement authorities have confidence that the products and systems
acquired to accomplish mission operations will meet expectations. Users also increase their awareness of, and receive positive feedback on, the versatility of the hardware, now proven able to deploy different waveforms in accordance with the SDR paradigm of portability. Radio
providers are assured that their products meet the requirements of the customer community for
both current and future communications needs. In addition, radio providers can potentially
leverage certified implementations across multiple platforms to reduce aggregate development
costs and time-to-market.
4.1 Business Income
A large part of the Test Lab's financial support is expected to originate from the fees it charges its customers for the services the Test Lab provides. These fees allow the Test Lab to provide efficient and effective testing of Software Defined Radios, generate efficient status reports and results of its testing, involve all key stakeholders, and consistently set new test guidelines and requirements that propagate throughout the world.
It is expected that the fees charged to customers will depend on the type and level of service provided. Services may range from simple consulting on test guidelines and opinions on compliance to full SCA testing that includes multiple iterations of testing, test reports, consultations with all stakeholders, and submission of test opinions to Certification Authorities. In each case, it is important that the Test Lab provide clear expectations on the nature and parameters of the service to be provided. Fees generated from these services are expected to be the dominant source of income for the Test Lab.
4.2 Business Expenses
An SCA Test Lab contains a secured area to conduct its testing. Within the secured test area, it is expected that facilities will include many test benches with computers, network equipment, analyzers, spectrum analyzers, RF-shielded test boxes, projectors, small and large monitors for conducting meetings, radio test mounts, power strips and cords, and many other hardware items to support testing.
Test Engineers will also utilize software applications to automate and replicate testing. This
consists of SCA software to verify XML formats, SCA software to verify SCA requirements,
software to complete quick algorithm searches, software to conduct different verification and
validation test techniques, software to automate its reporting requirements, software to augment
its requirements management, and many other software applications to promote efficiency for the
primary benefit of delivering high-quality and accurate test status reports.
Along with the expenses of hardware equipment and software applications, it is expected that the SCA Test Lab will need a support staff that mixes seasoned software engineers with test backgrounds, test engineers with SCA understanding, software configuration management personnel, support personnel, and different levels of managers to support the SCA Test Lab.
An SCA Test Lab will also incur expenses from traveling to and from the customer's test site and/or test event. This could include both domestic and international travel. It is also likely that expenses will include periodic training needed to provide an ongoing and thorough understanding of the SCA for its staff and team. These requirements and activities represent the dominant costs of operating the SCA Test Lab.
4.3 Business Stakeholders
There is a diverse set of stakeholders for SCA test, evaluation, and certification, including:
o Product and system users (the mission communicators).
o Governments and associated procurement authorities (those who specify the
requirements and procure the products and systems).
o Radio Providers (developers and manufacturers of SCA-based products and systems).
o Third Party software developers (e.g., Applications or middleware providers).
o Test Suite Developers (support development, manufacturing and testing).
o Others (i.e., independent test and certification organizations).
Though each of these stakeholders has a particular perspective and set of objectives, at a very high level they all share a similar set of requirements:
o Low cost
o Schedule (Time-to-Market)
o Performance
o Security
In more detail this leads to the following strong requirements on the certification process:
o Portability is improved significantly compared with non-standardized solutions
o Certification costs are minimized
o Time delay for the products to hit the market is tolerable
o Industry’s intellectual property is protected
o National security is maintained
o National sovereignty is maintained
o Enough flexibility is given to handle tailored solutions
o Transparency is ensured in the certification process
4.4 Suppliers to the SCA Test Lab
The SCA Test Lab operates in an ecosystem and is inter-dependent on other organizations that
impact the business case of the lab. Those inter-dependencies are as follows:
Test Suite Developer – Test Suite Developers design, develop and sell test software
applications that either augment the SCA Test Lab's test efforts or fill an important
task of the SCA Test Lab. Test Labs must balance the cost of licensing test suites
against the cost of developing and maintaining custom solutions.
Definitions Body – The Definitions Body provides the SCA Test Lab with the SCA requirements that must be evaluated. A Test Lab may choose to provide input to the Definitions Body based on past test experience, but this is not strictly required. Updates to the SCA requirements may necessitate development or adoption of new test processes on the part of the SCA Test Lab.
Specification Body – The Specification Body provides the SCA Test Lab with the required
SCA specification. A Test Lab may choose to provide input to the Specification Body
based on past test experience, but this is not strictly required. Updates to the SCA
specification may necessitate further training or updated accreditations on the part of
the SCA Test Lab.
Accreditation Body – For the SCA Test Lab to have proper test guidelines, a separate organization should periodically review its test processes to ensure they are accurate and consistent with the requirements they are attempting to fulfill. This body ensures the SCA Test Lab is conducting its testing accurately. It provides an independent third-party opinion of the SCA Test Lab's processes. Accreditations issued by the Accreditation Body may be time-limited, and the Accreditation Body may charge a fee for renewal and/or updates driven by the continued evolution of the SCA.
Radio Provider – The Radio Provider is the customer of the SCA Test Lab. The Radio
Provider pays fees for services that the Test Lab is then obligated to provide.
Certification Body – The Certification Body is the arbiter of the test process conducted by
the Test Lab. The Certification Body thus sets the standards by which the Test Lab
must operate – this includes defining the content and rigor of test reports. Changes
in standards by the Certification Body can impact the manner in which the SCA Test
Lab conducts testing.
4.5 Other Business Relationships of the SCA Test Lab
The SCA Test Lab should be intimately involved with the broader effort to promote, foster,
expand, and proliferate the value proposition of the SCA and the necessity of providing quality
test services to SCA adopters. Test Labs are encouraged to maintain a membership in the Wireless
Innovation Forum and other organizations that support the Software Communications Architecture
in the international arena, such as the Institute of Electrical and Electronics Engineers (IEEE). Maintaining these relationships ensures that the SCA Test Lab team and staff have a forum for sharing experiences and will remain cognizant of current trends and conditions in the marketplace.
5. Key Takeaways
[Section 1: Introduction] An SCA Test Lab is a credible and accepted test facility that
provides “controlled” conditions to perform SCA compliance verification and (optionally)
interoperability and portability validation tests on SDR waveform applications and
operating environments. Through the results of verification and validation tests on SCA-
based SDRs, the SCA Test Lab generates and compiles its official test findings into a Test Report.
[Section 1: Introduction] The primary purpose of an SCA Test Lab is to test compliance
of SCA waveform application and operating environment source and resulting binary code
with the SCA specification. Compliance testing is a form of verification. In contrast,
validation refers to the process of determining whether or not the software system achieves
its intended result. While a test lab may perform forms of validation testing (e.g., determine
if an SDR system meets various operational requirements), those functions are outside the
scope of this document.
[Section 2: Organizing and Operating Requirements] The Accreditation Body has the
responsibility to officially endorse that a Test Lab has the authority to carry out SDR
assessments of SCA requirements.
[Section 2: Organizing and Operating Requirements] An SCA Test Lab can be
organized and operated by a government, radio provider, or independent entity. In all cases,
the Test Lab supplies a Compliance Test Report to the Certification Body; it does not
provide the certification.
[Section 2: Organizing and Operating Requirements] An SCA Test Lab is expected to
use known and publicly released documents for the verification of compliance with SCA
requirements. In some cases, an SCA Test Lab may use tools or procedures specific to
national requirements (e.g., closed APIs), but these should still be documented and
accessible to relevant parties.
[Section 2: Organizing and Operating Requirements] An SCA Test Lab has a duty to protect the intellectual property embodied in systems under test. This requires written policies and procedures to safeguard that property.
[Section 2: Organizing and Operating Requirements] Test Suite Developers design and
develop automated SCA Test Suite tool applications for industry. Test Suite Developers
may acquire test software and/or platforms from 3rd-Party Developers. Any third-party
software tool must be approved by the Accreditation Body before it can be used for SCA
compliance testing.
[Section 3: SCA Test Lab Procedures and Deliverables] The test process attached to
each requirement should be categorized according to each of three (3) broad criteria:
o Type of test: Static or Dynamic
o Level of automation: Automated, Partially Automated, or Manual
o Type of pass result: Necessary, Sufficient, Both, or Neither
Because of differences in test coverage and type of result, overlapping tests may be
beneficial for some requirements.
[Section 3: SCA Test Lab Procedures and Deliverables] SCA Test Labs are encouraged
to engage with developers at the earliest possible point in the software development
process. This document identifies three progressively more advanced forms of developer-
coordinated compliance testing: pre-testing, pre-certification, and self-certification.
[Section 3: SCA Test Lab Procedures and Deliverables] In assembling an SCA Test
Lab Report, every effort should be made to ensure that the contents and conclusions of the
report are reproducible.
[Section 3: SCA Test Lab Procedures and Deliverables] Except by SCA version number
and possibly by domain profile within the scope of the relevant specification, a designation
of SCA Compliant should not be qualified in any other way.
[Section 4: Business Case Considerations] A large part of the SCA Test Lab financial
support is expected to originate from the fees it charges its customers for the services the
Test Lab provides.
[Section 4: Business Case Considerations] The SCA Test Lab operates in an ecosystem
and is inter-dependent on other organizations that impact the business case of the lab.
[Section 4: Business Case Considerations] The SCA Test Lab should be intimately
involved with the broader effort to promote, foster, expand, and proliferate the value
proposition of the SCA and the necessity of providing quality test services to SCA adopters.
6. Acronyms
Table 4: Table of Acronyms
Acronym Definition
API Application Programming Interface
CF Core Framework
CORBA Common Object Request Broker Architecture
COTS Commercial Off The Shelf
IDL Interface Description Language
IEC International Electrotechnical Commission
ISO International Organization for Standardization
ITAR International Traffic in Arms Regulations
JPEO Joint Program Executive Office
JTAP JTRS Test Application
JTEL JTRS Test and Evaluation Laboratory
JTNC Joint Tactical Networking Center
JTRS Joint Tactical Radio System
OE Operating Environment
OS Operating System
RF Radio Frequency
SCA Software Communications Architecture
SDR Software Defined Radio
SDRF Software Defined Radio Forum
SW Software
T&E Test and Evaluation
TL Test Lab
US United States
WG Working Group
WInnF Wireless Innovation Forum
WF Waveform
XML eXtensible Markup Language
7. Appendix A - Test Lab Requirements
This section generalizes the requirements and responsibilities of an SCA Test Lab. The items are provided as an enumeration of "shall" statements in the form of high-level descriptions. Where necessary, the description of an item is accompanied by a clarifying rationale.
The SCA Test Lab Shall:
1. Verify and validate SCA requirements from one or more of the following groups:
a. The SCA Operating Environment (OE)
i. The Operating System as it pertains to the SCA (POSIX)
ii. The CORBA Middleware
iii. The Core Framework (CF)
b. The SCA Application Waveform (WF)
c. The SCA Application Programming Interface (API)
2. Verify that each SCA-based SDR is SCA compliant. This is defined as:
a. SCA Compliant: the product passes 100% of testable SCA Requirements (including or excluding SCA APIs as per the test service agreement)
b. SCA Compliant with Waivers: waivers for all identified deficiencies have been formally requested and approved by all stakeholders
c. SCA Non-Compliant: requirements have been found deficient and no approved waiver applies
3. Maintain an accredited, reproducible, unambiguous test procedure for each individual SCA
requirement to be tested, terminating in a pass, pass-with-waiver, or fail conclusion for each
applicable test. The procedure may be some combination of:
a. Automated static analysis tests
b. Runtime tests (actual radio or test harness)
c. Manual inspection of code or documentation artifacts
4. Have developed, licensed, or purchased the hardware and software tools necessary to
implement the test procedure for each individual SCA test.
5. Have developed a procedure for determining relevant SCA tests, and scheduling and applying
those tests.
6. Have a clear reporting format that includes the essential elements of the test environment and
results (see Appendix D)
7. Have policies and procedures in place to safeguard intellectual property under test
8. Use known, publicly released documents for the verification of SCA requirements
9. Apply authorized SCA Test Lab copyrights, trademarks, and/or unique test markings to all
released documents distributed to stakeholders and throughout the SDR Community.
10. Interface with all stakeholders to ensure clear and concise verification and validation of
test procedures
11. Interface with the SDR Community to ensure clear direction and understanding of the SCA
Test Lab's charter
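As an illustration of the compliance categories in requirement 2 and the pass/pass-with-waiver/fail conclusions in requirement 3, the classification logic can be sketched as follows. This is not part of the requirements; the function name, status labels, and result encoding are assumptions for this sketch only:

```python
from enum import Enum

class ComplianceStatus(Enum):
    COMPLIANT = "SCA Compliant"
    COMPLIANT_WITH_WAIVERS = "SCA Compliant with waivers"
    NON_COMPLIANT = "SCA Non-Compliant"

def compliance_status(results, approved_waivers):
    """Classify an overall test run.

    results: dict mapping requirement ID -> 'pass' or 'fail'
             (only testable requirements appear here).
    approved_waivers: set of requirement IDs whose deficiencies were
                      formally requested and approved by all stakeholders.
    """
    failed = {req for req, outcome in results.items() if outcome != "pass"}
    if not failed:
        # 100% of testable requirements passed
        return ComplianceStatus.COMPLIANT
    if failed <= approved_waivers:
        # every deficiency is covered by an approved waiver
        return ComplianceStatus.COMPLIANT_WITH_WAIVERS
    # at least one deficiency has no approved waiver
    return ComplianceStatus.NON_COMPLIANT
```

The set-containment check (`failed <= approved_waivers`) makes the pass-with-waiver condition explicit: a single unwaived failure drops the result to Non-Compliant.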
8. Appendix B - Test Lab Best Practices
This section describes responsibilities (“best practices”) the SCA Test Lab should consider adding
to its charter. These items are not considered requirements for an SCA Test Lab.
The SCA Test Lab should:
1. Encourage developers to engage with the Test Lab early in the development process.
2. Use software test applications developed by the Test Suite Developers
3. Advertise automated test methodologies to clients, the SDR Community, and other
organizations
4. Use testing methodologies that generate reproducible test reports
5. Consider remote testing of target platforms
a. The SCA Operating Environment (OE)
i. The Operating System as it pertains to the SCA (POSIX)
ii. The CORBA Middleware
iii. The Core Framework (CF)
b. The SCA Application Waveform (WF)
c. The SCA Application Programming Interface (API)
6. Staff the test team with software engineers
7. Encourage and promote business practices that are clear and concise, including all test
methodologies, software test releases, business plans, and other items.
9. Appendix C – Items to be Submitted to Test Lab for Evaluation
Depending on the verification and validation tests to be performed, an SCA Test Lab should
determine which items the SCA Developer should submit to facilitate evaluation. Items to consider
for submission include:
• Source code for the software application
o C/C++ source code
o CORBA IDL code (if applicable)
o SCA XML domain profile files
• Build environment for the software application
o Source compiler, IDL compiler, supporting tools
o Build script that includes sequence information and build flags (e.g., -D and -I options)
o Self-test for post-build validation (to confirm a complete and correct build)
• Test hardware platform for the software application
o Appropriate connections to external test drivers
• Design documentation
o Applicable version of the SCA
o Applicable units of functionality
o List of components and component interfaces
• User documentation
• List of SCA waivers granted and/or requested (if any)
• List of known issues (if any)
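A Test Lab receiving these items may want to pre-screen a submission before scheduling formal evaluation, for example by confirming that the submitted SCA XML domain profile files at least parse as well-formed XML. The following is a minimal sketch of such a pre-screening step, not a prescribed procedure; the function name and result encoding are assumptions:

```python
import xml.etree.ElementTree as ET
from pathlib import Path

def prescreen_domain_profiles(paths):
    """Check that each submitted domain profile file parses as XML.

    Returns a dict mapping each file path to True (well-formed) or a
    short error message describing why it could not be parsed.
    Full validation against the SCA DTDs/schemas would follow later.
    """
    report = {}
    for p in paths:
        try:
            ET.parse(p)  # well-formedness check only
            report[str(p)] = True
        except ET.ParseError as exc:
            report[str(p)] = f"XML parse error: {exc}"
        except OSError as exc:
            report[str(p)] = f"I/O error: {exc}"
    return report
```

Running this over a submission catches missing or truncated files early, before any build-environment or runtime testing begins.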
10. Appendix D – Contents of a Test Lab Report
An SCA Test Lab report should contain all of the information necessary to achieve three goals:
1. To convey the conditions under which the test was performed
2. To convey the results of the testing performed, with error reports sufficient to allow the
customer to isolate the root cause of any problems
3. To allow, if necessary, the test to be reproduced
To achieve these goals a test report should contain all of the following information:
• Title of the report
• Unique report reference number
• Product being tested, with version number or other identification sufficient to uniquely
identify the specific software being tested
• Name(s) of the engineer(s) performing the testing
• Date and time the testing was performed
• Comments (if applicable) on the purpose of testing
• Description of the test environment, including (as necessary) for each test tool:
o Version of the test tool
o Hardware platform
o Operating system and version of the test platform
o Any configuration files or configuration settings used by the test platform
o Command line (if applicable) used to invoke the tool
• Requirements in scope of testing (for each tool), including:
o References to the SCA sections to which the requirements pertain
o References to documentation describing what the test does or does not cover
relevant to the requirement
o References to where, and under what license, the test tool may be obtained
• Record of any waivers or test variances (special conditions) applied in testing
• Status of each requirement in the test scope (e.g., Pass/Fail/Not Applicable/Not Testable)
• Summary of test coverage
• Summary of requirement status
• Test engineer comments on requirement status, possibly including:
o Reason for the result
o Suggested steps for mitigation/repair
o Process to resubmit for testing
• Wall-clock time to complete the test
• Recommendations to the Certification Body on acceptance or further action(s)
• Recipients/classification of the test report
• Signature of the test engineer(s) certifying the correctness and completeness of the report
• Contact information for the performing test lab
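To make reports reproducible and machine-comparable across test runs, a lab could assemble the fields above as structured data before rendering the human-readable report. The following sketch shows one possible shape, assuming JSON as the interchange format; the function name, field names, and summary encoding are illustrative assumptions, not part of this document's requirements:

```python
import json
from datetime import datetime, timezone

def build_report(product, version, engineers, results, environment,
                 reference="TL-0000", waivers=(), comments=""):
    """Assemble a minimal machine-readable test report.

    results: dict mapping requirement ID -> one of
             'Pass', 'Fail', 'Not Applicable', 'Not Testable'.
    environment: dict describing tool versions, OS, command lines, etc.
    """
    tested = [r for r, s in results.items() if s in ("Pass", "Fail")]
    return {
        "reference": reference,
        "product": {"name": product, "version": version},
        "engineers": list(engineers),
        "date": datetime.now(timezone.utc).isoformat(),
        "environment": environment,
        "waivers": list(waivers),
        "results": results,
        "summary": {
            "coverage": f"{len(tested)}/{len(results)} requirements tested",
            "passed": sum(1 for s in results.values() if s == "Pass"),
            "failed": sum(1 for s in results.values() if s == "Fail"),
        },
        "comments": comments,
    }
```

Because the structure is plain JSON, two reports for the same product version can be diffed directly, which supports the reproducibility goal stated above.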