VFNL ATLAS to NIS Interfaces - Detailed Test Plan v0.1

Vodafone Netherlands Terminal NL Wave 3

Detailed Test Plan


Author: Surajit Bagchi, Project Manager - Vodafone ([email protected])
Status: Initial Draft
Status Date: 22-Dec-2014

Information Security Level 2 - Sensitive. Proprietary and Confidential Information of Vodafone and IBM.


DOCUMENT CONTROL

Modification History
Version  Author(s)       Date Released  Notes
0.1      Surajit Bagchi  22 Dec 2014    Initial draft

Review History
Version Reviewed  Reviewer                Date Reviewed  Notes
0.1               [email protected]

Approval History
Version Approved  Approver  Date Approved  Notes

Additional Distribution List
Recipient          Date Received
Asim K Chatterjee


CONTENT

Document Control
  Modification History
  Review History
  Approval History
  Additional Distribution List
Content
1 Introduction
  1.1 Overview
  1.2 Identification
  1.3 References
  1.4 Notations and Abbreviations
2 Background
  2.1 High Level Data Model
3 Test Objectives
4 Test Scope
  4.1 Out of Scope
5 Assumptions
6 Test Strategy
  6.1 High Level Overview of System Test Approach
  6.2 Test Goals
  6.3 Types of Testing
  6.4 Test Approach - System Testing
  6.5 Risk Assessment
  6.6 Entry / Exit Criteria
    6.6.1 System Testing
7 Test Plan
  7.1 Test Planning Workshops
  7.2 Roles and Responsibilities
  7.3 Test Schedule
8 Test Environment Build Strategy
  8.1 Test Environment
  8.2 Test Data Strategy
  8.3 Test Tools
  8.4 Configuration Management
9 Test Management & Reporting Procedures
  9.1 Test Management
    9.1.1 Defect Tracking/Management Procedures
    9.1.2 Change Management Procedures
    9.1.3 Progress Tracking Procedures
  9.2 Reporting
    9.2.1 Test Progress Reporting
10 Test Deliverables
  10.1 System Test Cases
  10.2 Traceability Matrix
  10.3 Test Result Report
  10.4 QA Summary Report
  10.5 System Test End Report


1 INTRODUCTION

1.1 Overview
For the Local Integration Layer (LIL/ODS), the existing Article Master Stock interface IF0161 from 3TM to LIL will be enhanced due to business process changes, the decision to use the EAN number as an additional article number, the Thuiskopieheffing (copyright levy), and technical issues experienced in the current interface flows (e.g. not being able to correctly identify deactivated articles). In addition, some pass-through interfaces between 3TM and AXY-RS/Dynafix via LIL are covered.

1.2 Identification
The purpose of this document is to create a complete and consistent detailed test plan that provides high level testing guidance and insight into the following levels of tests:

System Testing/SIT - Verify the correctness of the new functionalities implemented and check whether the system as a whole, with all modules, works as expected with the available test data.

Vodafone BICC UAT Support: Support will be provided during UAT as per the scope mentioned in the UAT test strategy provided by BICC.

This document focuses mainly on system testing details, e.g. approach and strategy.

1.3 References
This document is based on and refers to the following documents:

Sr. No.  Title                                                     Version  Date
1        VF_NL HLD Terminal NL Wave 3 ver.1.0.doc                  1.0      21-Nov-2014
2        VF_NL_BI Detailed Design - Terminal_NL_Wave_3_V1.10.doc   1.10     14-Dec-2014

1.4 Notations and Abbreviations
Please refer to the table below for the notations and abbreviations used in this document:

Abbreviation/Notation  Description
HLD                    High Level Design
LLD                    Low Level Design
MM                     Material Master
LIL                    Local Integration Layer


2 BACKGROUND

2.1 High Level Data Model
Below is the high level data model diagram for the project:


3 TEST OBJECTIVES

Testing is the systematic search for defects in all project deliverables. It is the process of examining an output of a process under consideration, comparing the results against a set of pre-determined expectations, and dealing with the variances.

The main test objectives are:

- Identify and publish the scope and substance of the testing effort
- Identify and publish risks to the successful completion of the project
- Verify functional correctness from a design point of view
- Verify functional correctness from system behaviour
- Test product robustness and stability
- Report on the status of the product regarding the tests and measures


4 TEST SCOPE
The scope of this document concerns the testing of source data processing coming from different source systems. This comprises:

- System Test/SIT: Verify the test goals mentioned in this document.
- VF BICC UAT Support: Support will be provided during UAT based on the scope of IBM-BI.

The scope of UAT support will be as defined in the UAT strategy document provided by VF BICC.

Test Deliverables – for System Test:

Sr. No.  Deliverable Name    Remark
1        Detail test plan    This document will contain the test scope, test approach and all details on the different test activities.
2        System test cases   Will contain the functionality covered, steps to follow, expected output and input test data.
3        System Test Result  Will contain the execution status of each test case.
4        QA Summary Report   Will contain the daily execution status report along with defect details. At the end of each level of testing IBM will provide the overall project health along with a Go/No-Go suggestion for the next level.

4.1 Out of Scope
The out of scope items for testing are mentioned below:

- Article (Item) interfaces from LIL to ATLAS are out of scope.
- The interface from LIL to downstream systems containing the Article Stock information is out of scope.
- Any test goal that is not mentioned in this document.
- Infrastructure support/system administration is out of scope. Vodafone Contracting Company and Supplier Contracting Company will jointly initiate and ensure in-time systems access.
- Test data creation - IBM will not create any test data manually. IBM will receive all required test data from the source systems, as mentioned in the detailed design document, prior to system testing to cover all possible test scenarios.
- Performance and automation testing.
- Verification/validation of any functionality not mentioned in the detailed design document.
- Non-functional testing.


5 ASSUMPTIONS

The assumptions for performing the test activities are mentioned below:

No.  Assumption                                                                               Comment
1    IBM will receive test data from stock interface IF0161 (from 3TM) to LIL before system test starts.
2    Connectivity should be established between 3TM and LIL.
3    During the SIT/VF BICC UAT phase, IBM-BI will provide support from 9 AM to 6 PM IST.
4    The test environment will be ready with test data before system test starts.


6 TEST STRATEGY

This section describes the approach and methodologies to plan, organize and manage the testing. The test strategy is business driven and aims at:

- finding major defects as early as possible
- making the critical path of the testing phase as short as possible

Detail Test Plan: Describes the test approach, procedure and planning for test activities. Specifies all the test activities to be performed for verifying the functionalities.

Test Designing

Test Case Identification:

Test goals need to be identified based on detailed design document.

Test cases need to be created based on test goals.

Requirements traceability matrix will be created to trace the test coverage between test goals and test cases.

Test Environment Requirement:

Environment details (Unix box name, database name) for each test level will be provided in QA Summary Report document.

Test Execution: Initiation of test execution will be based on the fulfilment of the entry criteria mentioned in this document. All identified system test cases will be executed as part of the system test.

Success or failure of any test function/feature will be determined based on the difference between the expected result and the actual observation. Defects will be logged for failed test cases.

Test Reporting: This is the process of reporting the progress of testing activities at each test level, such as newly identified test cases and deviations from the baseline (test design), as well as the progress of execution and the results of the tests, including issues found, root cause identification and known issues. A daily progress report will be sent during system test execution.

6.1 High Level Overview of System Test Approach

Connectivity establishment:

Connectivity verification from 3TM (interface IF0161) to LIL

Source file Validation:

Verification of source file IF0161_O_<YYYYMMDDHHMMSS>.txt (a minimal naming check is sketched at the end of this overview)


Verification of mapping at LIL/ODS as mentioned in detailed design document

Data Loading:

Verify data loading as mentioned in the detailed design document. Data validation once data loading is completed - randomly selected data will be validated.

Mapping:

Validate attribute-level mapping between source and target data as per the mapping sheet mentioned in the detailed design document.
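The two validation steps above can be illustrated with short sketches. First, a minimal Python sketch of the source file naming check in the LIL landing directory; the directory path is a placeholder assumption, and only the IF0161_O_<YYYYMMDDHHMMSS>.txt pattern comes from this plan.

```python
import os
import re
from datetime import datetime

# Assumed landing directory on the LIL test server (placeholder path).
LANDING_DIR = "/data/lil/landing/if0161"

# File name pattern from section 6.1: IF0161_O_<YYYYMMDDHHMMSS>.txt
FILE_PATTERN = re.compile(r"^IF0161_O_(\d{14})\.txt$")

def find_valid_source_files(landing_dir: str) -> list[str]:
    """Return source files whose names match the expected pattern and
    whose embedded timestamp parses as a valid date/time."""
    valid = []
    for name in os.listdir(landing_dir):
        match = FILE_PATTERN.match(name)
        if not match:
            continue
        try:
            datetime.strptime(match.group(1), "%Y%m%d%H%M%S")
        except ValueError:
            continue  # name has the right shape but the timestamp is not a real date
        valid.append(name)
    return valid

if __name__ == "__main__":
    files = find_valid_source_files(LANDING_DIR)
    if files:
        print(f"Connectivity check OK: {len(files)} IF0161 file(s) found")
    else:
        print("No valid IF0161 source file found in the landing directory")
```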
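Second, a hedged sketch of the attribute-level mapping validation with random sampling. The field names, the semicolon delimiter and the key field are illustrative assumptions, not values from the design document, and the target rows are assumed to have already been fetched from the target table (for example via a Teradata query) into a dict keyed on the same article number.

```python
import csv
import random

# Hypothetical mapping sheet: source field -> target column.
# The real attribute names come from the detailed design document.
FIELD_MAPPING = {
    "ARTICLE_NO": "article_number",
    "EAN_NO": "ean_number",
    "STOCK_QTY": "stock_quantity",
}

def load_source_records(path: str, key_field: str = "ARTICLE_NO") -> dict:
    """Read the delimited IF0161 source file into a dict keyed on the article number."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f, delimiter=";")
        return {row[key_field]: row for row in reader}

def validate_mapping(source: dict, target: dict, sample_size: int = 50) -> list[str]:
    """Compare a random sample of source records against target rows
    attribute by attribute; return a list of mismatch descriptions."""
    mismatches = []
    sample_keys = random.sample(list(source), min(sample_size, len(source)))
    for key in sample_keys:
        src_row = source[key]
        tgt_row = target.get(key)
        if tgt_row is None:
            mismatches.append(f"{key}: missing in target")
            continue
        for src_field, tgt_col in FIELD_MAPPING.items():
            if src_row[src_field].strip() != str(tgt_row[tgt_col]).strip():
                mismatches.append(
                    f"{key}.{src_field}: '{src_row[src_field]}' != '{tgt_row[tgt_col]}'"
                )
    return mismatches

# Example usage (file name and target data are illustrative):
# source = load_source_records("IF0161_O_20141229093000.txt")
# issues = validate_mapping(source, target_rows_keyed_by_article)
```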

6.2 Test Goals
The following are the test goals based on business functionality:

6.3 Types of Testing
This section defines the types of testing, which relate to business functions (Functional Types) or to structural functions (Non-functional Types). These test types will be mapped to the various tests which will prove system compliance.

Functional Types of Test
The following is a list of all functional types of tests which will be performed for the project:

- System Test/SIT: Test goals mentioned in section 6.2
- VF BICC UAT Support

Support will be provided during UAT phase. Details of support will be defined in BICC UAT plan document.

Non-functional Types of Test
NA

6.4 Test Approach – System Testing

1. Connectivity Test
   Approach: Verify whether source data is coming from 3TM to LIL in the landing directory as mentioned in the detailed design document.
   Dependency: Connectivity should be set up between the 3TM interface and LIL.

2. Interface Test
   Approach: Validation of each source data set based on the latest detailed design document.
   Dependency: Source data should be available prior to system test.

3. Data Loading
   Approach: Verify all details related to data loading as mentioned in the detailed design document.
   Dependency: Source data should be available prior to system test.

4. Mapping Validation
   Approach: Attribute-level validation from source to target based on the mapping sheet mentioned in the detailed design document.
   Dependency: Updated mapping sheet should be available.


6.5 Risk Assessment
This section comprises a detailed risk assessment of the test strategy, driven by the business decisions, business requirements and technical requirements of the solution under development.

Sr. No.  Description of Risk & Impact                          Rank (H/M/L)  Mitigation Strategy & Status
1        Test data will not be available prior to system test  M             Communicate with the VF test manager and the source system in case test data is not received in time.

6.6 Entry / Exit Criteria
This section defines the entry and exit criteria for the System Test.

6.6.1 System Testing

Entry Criteria

Coding and unit testing completed successfully

Unit test result and all open issues are documented and available

All test hardware platforms are identified, installed, configured and verified - ready for testing, e.g. the LIL test server and the Teradata server.

All applicable standard software tools including testing tools must have been successfully installed.

All required accesses are granted for each team member, e.g. Teradata access for the testing team.

All documentation and design of the architecture must be made available.

Test cases and test plans are reviewed and signed off by VF test manager

A separate test environment must be available.

Required test data is available in test environment.

Exit Criteria

- All system test cases are executed and the execution status is recorded.
- At least 95% of the executed test cases are passed, out of which all critical and high priority test cases are passed. Go-ahead with failed test cases will be decided based on mutual agreement between the IBM Test Manager and the VF Test Manager (a sketch of how these thresholds could be checked follows this list).
- All open defects are reviewed and agreed by Vodafone for go-ahead to the next test level.
- IBM has delivered a final and signed-off ST End Report (or QA report) to Vodafone, ensuring that no blocking defect is open and a maximum of 5 severe defects are in open status.
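As referenced above, the following is a minimal sketch of how these exit thresholds could be evaluated from the recorded execution status and the defect tracker. The status, priority and field names are assumptions, although the severity names match the definitions in section 9.1.1.

```python
def evaluate_exit_criteria(test_cases: list[dict], defects: list[dict]) -> dict:
    """Check the system test exit criteria: >= 95% of executed cases passed,
    all critical/high priority cases passed, no open blocking defects,
    and at most 5 open severe defects."""
    executed = [tc for tc in test_cases if tc["status"] in ("Passed", "Failed")]
    passed = [tc for tc in executed if tc["status"] == "Passed"]
    pass_rate = len(passed) / len(executed) if executed else 0.0

    critical_failed = [
        tc for tc in executed
        if tc["priority"] in ("Critical", "High") and tc["status"] != "Passed"
    ]
    open_blocking = [d for d in defects if d["severity"] == "Blocking" and d["status"] == "Open"]
    open_severe = [d for d in defects if d["severity"] == "Severe" and d["status"] == "Open"]

    return {
        "pass_rate_ok": pass_rate >= 0.95,
        "critical_high_ok": not critical_failed,
        "blocking_ok": not open_blocking,
        "severe_ok": len(open_severe) <= 5,
        "pass_rate_percent": round(pass_rate * 100, 1),
    }
```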


7 TEST PLAN

7.1 Test Planning Workshops
After the initial version of this document is distributed and made available to all stakeholders, the document will be reviewed. Once review feedback is received from all stakeholders, the document will be modified and sent out again for review. This process will be completed by the defined milestone date, once all stakeholders have agreed to the test planning described in this document.

7.2 Roles and Responsibilities
The following table summarizes the testing responsibilities.

IBM Project Manager
- Review, provide updates and maintain the detail test plan
- Manage test plan execution
- 1st point of contact for issue resolution during all testing phases

IBM Test Lead
Support the Project Manager/Test Manager in the following areas:
- Prepare/modify detailed test plans
- Define/identify test scope
- Manage and maintain test cases/scripts
- Manage test execution and monitor defects
- Manage and monitor test progress during all testing phases
- Report test progress to the VFNL test manager and the IBM PM

Testers
- Identification and understanding of test requirements
- Test case/test script preparation and review
- Preparation/identification of test data
- Verify the test environment
- Test execution
- Log defects and verify fixes once available
- Provide traceability of test cases to requirements to ensure proper test coverage
- Report test case/script creation/execution status to the Test Lead/Test Manager
- Prepare the Test Result Report

VF Test Manager
- Review and approve test deliverables
- Co-ordinate with other teams, along with the IBM Test Manager/Test Lead, to complete all levels of testing
- 1st level contact point for issue resolution raised by IBM (as far as not IBM internal)


VF BICC Team
- Create the UAT test plan with required support from each vendor
- Discuss and agree with IBM regarding the support required for UAT environment setup, test data identification/creation, test execution etc.
- Share the UAT test scenarios that are within IBM scope

7.3 Test Schedule
Below are the major testing milestones for this project. Any change in the timeline will result in modification of the test milestones below.

Sl#  Milestone                      Planned Start Date  Planned End Date  Comment
1    Detailed Test Plan Creation    8-Dec-2014          22-Dec-2014
2    System Test Case Creation      10-Dec-2014         24-Dec-2014
3    System Test/SIT Execution      29-Dec-2014         9-Jan-2015
4    UAT Support                    12-Jan-2015         30-Jan-2015
5    IBM Stability Test             03-Feb-2015         06-Mar-2015
6    Roll Out & Go Live Phase       09-Mar-2015         13-Mar-2015
7    Babysit/Stabilisation Period   14-Mar-2015         10-Apr-2015


8 TEST ENVIRONMENT BUILD STRATEGY

8.1 Test Environment
This section defines test environment requirements so that the necessary hardware, software, tools and other infrastructure components can be assembled to create the test environment.

System Test/SIT will be performed in a standalone system test environment connected to the FileExpress server to receive source data from the different source systems.

VF BICC UAT support will be provided in UAT environment created and maintained by IBM (will be done based on UAT plan provided by VF BICC).

8.2 Test Data Strategy
Test data will be received from the 3TM source system as mentioned in the detailed design document.

8.3 Test Tools

Activities       Tool             Version
Test Management  Microsoft Excel  Excel 2003
Defect Tracking  Microsoft Excel  Excel 2003
Telnet           PuTTY, WinSCP    PuTTY v0.60
DBMS tools       SQL scripts      NA

8.4 Configuration Management
The build management process must be followed at all times. Promotion of code from the development environment to the test environment will not be automatic. A process, which includes the use of configuration management, will be created to manage code promotions. Email notification will be sent to the test leads before code can be promoted to the test environment.

During System Test, the configuration management team will be restricted from promoting any changes or defect correction to the test environment without approval from the test manager.

Builds will be delivered with unique names or numbers. Release notes defining the defect fixes included in the build will be delivered to the test team prior to the installation of each build into the test environment.

Environment changes will not be accepted. If there is a need for any of these changes it will need to be handled via the Project Change control process. Environment changes include changes to the version of the OS, User access control or network control devices which may impact the interface testing.

No separate configuration management tool will be used for version controlling of test assets.


9 TEST MANAGEMENT & REPORTING PROCEDURES

9.1 Test Management
Each phase of testing will be coordinated by the team lead of the phase's responsible group. The VF test manager will be the escalation point for IBM-BI and will also track and monitor the progress of testing and all issues related to testing.

9.1.1 Defect Tracking/Management Procedures
Defects will be logged and tracked through Microsoft Excel.

Purpose:
- Effectively manage the monitoring and resolution of defects found in deliverables during the project lifecycle.
- Ensure that all defects are resolved with minimum impact to the project.
- Ensure that audit trails of defects are maintained and analyzed.
- Capture the information needed in order to perform root cause analysis at the project level, to identify opportunities for improving the quality of both the work product and the process with which it was created.

Definition of Defect: A defect is an aspect of a work product or a deliverable that does not meet its intended purpose. In contrast to a nonconformity, a defect can be viewed as an inadequacy with respect to needs or general expectations. All defects that are found during work product inspections and testing will be logged in the defect tracker.
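Since defects are tracked in an Excel sheet, the following is only a rough sketch of appending a defect record to a tracker file, using a CSV file as a stand-in for the Excel workbook; the column names are hypothetical and not taken from the project's actual tracker template.

```python
import csv
from datetime import date

# Hypothetical columns for the defect tracker sheet; the actual template
# used by the project team may differ.
TRACKER_FIELDS = [
    "defect_id", "summary", "severity", "priority",
    "status", "raised_by", "raised_on", "assigned_to",
]

def log_defect(tracker_path: str, defect: dict) -> None:
    """Append one defect record to the tracker file, writing the
    header row first if the file is new or empty."""
    try:
        with open(tracker_path, newline="") as f:
            has_header = bool(f.readline().strip())
    except FileNotFoundError:
        has_header = False

    with open(tracker_path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=TRACKER_FIELDS)
        if not has_header:
            writer.writeheader()
        writer.writerow(defect)

if __name__ == "__main__":
    # Example entry with illustrative values only.
    log_defect("defect_tracker.csv", {
        "defect_id": "D-001",
        "summary": "Deactivated article not flagged in target table",
        "severity": "Severe",
        "priority": "High",
        "status": "Open",
        "raised_by": "tester",
        "raised_on": date.today().isoformat(),
        "assigned_to": "",
    })
```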


Defect Workflow:

This procedure ends when defects have been closed and any analysis of trends and root cause has been completed.


The defect workflow diagram can be summarised as follows: a defect is logged in the defect tracker and verified by the development team. If the defect is not accepted, it is closed. If it is accepted, a priority is assigned and it is decided whether a fix is needed; if no fix is needed, the defect is deferred. Otherwise a developer is assigned to fix it, the defect is fixed and sent for verification, and a tester is assigned to verify the fix. If verification does not fail, the defect is closed; if it fails, the defect is returned to the development team.
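To make the decision points above concrete, here is a small, hypothetical sketch of the same workflow expressed as allowed status transitions; the status names are illustrative, and the assumption that a failed verification sends the defect back to the developer is an interpretation of the diagram rather than something stated explicitly.

```python
# Hypothetical status names derived from the workflow description above.
ALLOWED_TRANSITIONS = {
    "Logged":          {"Rejected", "Accepted"},   # verified by dev team: accepted or not
    "Accepted":        {"Deferred", "Assigned"},   # need to fix? no -> deferred, yes -> assign developer
    "Assigned":        {"Fixed"},                  # developer fixes the defect
    "Fixed":           {"In Verification"},        # sent for verification, tester assigned
    "In Verification": {"Assigned", "Closed"},     # verification failed -> back to fix, else close
    "Rejected":        {"Closed"},
    "Deferred":        set(),
    "Closed":          set(),
}

def move_defect(current: str, new: str) -> str:
    """Return the new status if the transition is allowed, otherwise raise."""
    if new not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError(f"Illegal transition: {current} -> {new}")
    return new

# Example: a fixed defect whose verification passes can be closed.
print(move_defect("In Verification", "Closed"))   # "Closed"
```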


Overview of the definitions of defect severity:

Defect Severity  Severity Definition

Blocking         Catastrophic defect with complete loss of a service. Work cannot reasonably continue, the operation is mission critical to the business and there is no workaround. In general, a blocking defect would prevent the product from being released or would involve high cost, e.g. a defect that causes the system to crash.

Severe           Defect results in severe loss of a service. A workaround may exist but its use is unsatisfactory in the long term.

Minor            Defect causes failure of non-critical aspects of the system. There is a reasonably satisfactory workaround.

Informational    Defect of minor significance. A workaround exists or, if not, the impairment is slight, e.g. a button is slightly off centre on a data screen.

9.1.2 Change Management Procedures
The change management process will be followed as per the current IBM process.

9.1.3 Progress Tracking Procedures
The purpose of this procedure is to keep test progress consistent with the plan and to raise a flag if there is any risk of deviation. Test progress and risks are tracked against the plan, and decisions are then made to ensure progress stays within the plan or to update the plan.

9.2 Reporting

9.2.1 Test Progress Reporting
The relevant process reports for this test project are:

Test Result Report - This report will describe the status of each system test case, with its steps, as passed or failed, along with associated defects. This report will be sent to VF within a week of system test completion. This is applicable only for system testing.

QA Summary Report - This report will contain the summary of test case execution status, the percentage of test cases executed vs passed/failed, a defect summary report, risks and mitigation plan, and overall project status. This will be sent to VF at regular intervals during each test phase.

System Test End Report - Includes a summary of the test execution report, a summary of the test result report and a summary of open defects, and is signed off by the IBM project manager stating that IBM is confident (or not) to proceed to the SIT. The report is delivered to the VF Test Manager before the start of the SIT. This is the same document as the QA Summary Report.



10 TEST DELIVERABLES

10.1 System Test Cases
System test cases will be documented and tracked through Excel. Test cases will be maintained as a separate document.

10.2 Traceability Matrix
The traceability matrix maps the test cases to the test goals and vice versa. The traceability matrix will be maintained in the QA Summary Report along with the test coverage.
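As an illustration only, here is a short sketch of how the goal-to-test-case traceability and the coverage check could be derived from the test case list; the TG-xx and TC-xx identifiers and the field names are hypothetical.

```python
from collections import defaultdict

def build_traceability_matrix(test_cases: list[dict]) -> dict[str, list[str]]:
    """Map each test goal ID to the list of test case IDs that cover it.
    Each test case is expected to carry an 'id' and a list of 'goal_ids'."""
    matrix = defaultdict(list)
    for tc in test_cases:
        for goal_id in tc["goal_ids"]:
            matrix[goal_id].append(tc["id"])
    return dict(matrix)

def uncovered_goals(all_goal_ids: list[str], matrix: dict[str, list[str]]) -> list[str]:
    """Return the test goals that have no covering test case."""
    return [g for g in all_goal_ids if g not in matrix]

# Example usage with hypothetical identifiers:
cases = [
    {"id": "TC-01", "goal_ids": ["TG-01", "TG-02"]},
    {"id": "TC-02", "goal_ids": ["TG-02"]},
]
matrix = build_traceability_matrix(cases)
print(uncovered_goals(["TG-01", "TG-02", "TG-03"], matrix))   # ['TG-03']
```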

10.3 Test Result Report
The test result report will be created for system testing only. This report contains the detailed execution status of each test case along with associated defects. The test result report will be maintained as a separate document.

10.4 QA Summary Report
The QA Summary Report will contain the day-to-day execution status at each test level. This report includes:

- Test case execution summary report
- Percentage of test cases passed/failed report
- Defect summary report
- Risk and mitigation plan
- Overall project health based on the status of the above reports

10.5 System Test End Report
This is similar to the QA Summary Report, captured at the end of the system test. This report will contain all the details mentioned in the QA Summary Report along with IBM's advice to go/no-go for the next level of testing.
