QUALITY ASSESSMENT PROCESS DEFINITION
Description:
This document defines the process of assessing the quality of the organization’s software
development process. The purpose of this assessment is to review and evaluate the execution of
an ongoing, software development process in order to verify that it is carried out according to the
Project Management Process Definition (with its supporting documents) and that this Definition
is adequate to provide the required degree of product quality. Thus, the objectives of this process
are, primarily, to discover ways to improve the Definition itself for subsequent projects and,
secondarily, to provide management with recommendations on ways to modify the conduct of
the project in order to maintain conformance with the Definition.
Entry Conditions:
1) The Project Management Process has produced a Software Development Process Plan.
Input Summary:
1) Software Development Process Plan organized according to the project model.
2) Project Schedule Plan from the Project Scheduling Process.
3) Project Tracking Plan from the Project Management Process.
4) Project Quality Assessment Plan.
5) Project Quality Assessment Plan with data collection scheduled according to the Project
Schedule Plan.
6) Monitor Data Definition Document.
7) Monitor Data History Archive.
8) Data and reports reflecting project performance and product quality from the software
development, tracking and scheduling processes.
Implementation Conditions:
1) The organization’s management appoints a Process Quality Assessment Board that does not
include members of the acceptance testing and software project teams (including their direct
management).
2) Create a Project Quality Assessment Plan (or use an existing one) that is correlated with the
specific phases in the Software Development Process Plan and the Project Scheduling and
Tracking processes according to the following conditions:
2.1) It is recommended that this Plan have the form shown in the Project Quality Assessment
Plan document.
2.2) It is essential that the Project Quality Assessment Plan contain an approved Process
Quality Assessment Plan for the Project Scheduling and Tracking Processes and for each
sub process listed in the Software Development Process Plan.
2.3) It is essential that each of these Process Quality Assessment Plans describe in detail the
quality assessment and reporting activities to be carried out by the Process Quality
Assessment Board and the software project staff during the life of the corresponding
process.
2.4) It is essential that each of these Process Quality Assessment Plans require a written
Quality Assessment Report.
2.5) It is recommended that the content of each Process Quality Assessment Report provide
criteria for assessing the items listed in the Process Quality Report Content form.
3) Inform the Project Scheduling team about the Project Quality Assessment Plan so that it can
schedule the data collection activities the Plan requires. This produces a scheduled version of
the Project Quality Assessment Plan.
4) Schedule the activities of the Process Quality Assessment Board in conformity to the
scheduled version of the Project Quality Assessment Plan received from the Project
Scheduling Process. Inform the Project Scheduling and Tracking teams of the Board’s
schedule.
It is essential that the Process Quality Assessment Board perform the following activities:
1) Carry out the steps in the Process Quality Assessment Plan, making use of the data in the
Monitor Data History Archive.
2) Analyze the managerial structure that influences the quality of the software for
appropriateness and proper function, and ascertain whether or not the positions within the
structure have clearly defined tasks and responsibilities.
3) Monitor adherence to all standards and practices and communicate its observations about
the degree of adherence to the project management. This requires evaluating the artifacts
produced during the project.
4) Examine project review and audit procedures to determine their adequacy and ascertain the
extent to which the project adheres to them.
5) Review all plans, procedures, cases and reports relating to testing (unit, integration, system
and acceptance) to determine their adequacy and the project's adherence to them.
6) Review the change-handling procedures for adequacy and monitor the project for compliance
to them. “Adequacy” here means that change orders are tracked from initiation to completion
and that the causes of problems giving rise to changes are eliminated.
7) Review the methods and facilities used to maintain, store, secure and register controlled
versions of artifacts (e.g. software) to determine whether or not they are adequate and are
employed correctly.
8) Check software acquired from external suppliers against project standards and determine to
what extent each subcontractor adheres to a documented software engineering process that
meets the standards required by the organization's Software Engineering Process.
9) Determine to what extent all project staff have the necessary skills for the positions to which
they are assigned. In case of deficiencies, recommend to the project management the
training necessary for the staff to acquire these skills.
10) Record in the Monitor Data History Archive the information required by the Monitor Data
Definition Document concerning the Board’s activities.
11) Submit its final report and terminate its work only when all of the required post-delivery
reviews have been completed and evaluated.
It is recommended that the Process Quality Assessment Board:
1) Monitor the execution of the change-control activities of the project and examine trends in the
occurrence of problems.
2) Evaluate the appropriateness of the methods, tools and techniques employed in the project
and monitor their correct application throughout the project.
3) Monitor the risk assessment and management activities of the project and advise the project
management on appropriate methods for carrying out these activities.
Output Summary:
1) Project Quality Assessment Plan correlated with the Software Development Plan and the
Project Schedule Plan.
2) Quality Assessment Reports.
3) Updated Monitor Data History Archive.
Exit Conditions:
All reports of the Process Quality Assessment Board have been submitted to the software
development team management and the data required by the Monitor Data Definition Document
has been recorded in the organization’s Monitor Data History Archive.
Notes: None.
1. Initial Project Schedule Plan Process Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the Initial Project Schedule Plan
Process. This assessment process has three primary goals:
1) Review the process that produces the Initial Project Schedule Plan of the software product
from the Objectives Document, Product Specifications Document, Monitor Data History
Archives, High Level Design Document, and Software Development Process Definition.
2) Generate the data required by the Monitor Data Definition Document concerning the
quality of this process; these data are also used in preparing the Quality Assessment
Report on Initial Project Schedule Plan.
3) Write the Quality Assessment Report on Initial Project Schedule Plan, which will contain
recommendations on ways to improve this process and its management in the current project
and ways to modify the process in the organization’s other projects.
Entry Conditions:
1) The Product Specifications Document has been initiated.
2) The High Level Design Document has been initiated.
3) The Near-Term Schedule Plan has been initiated.
4) The Initial Project Schedule Plan has been initiated.
Input Summary:
1) The Objectives Document.
2) Available Product Specifications Draft.
3) Available High Level Design Draft.
4) Documentation Standards Document.
5) Project Schedule Process Definition.
6) Change Control Process Definition.
7) Monitor Data Definition.
8) Monitor Data History Archive.
9) Documents from reviews conducted during Initial Project Schedule Plan.
10) Kickoff Initial Project Schedule Planning Meeting Minutes.
11) Full Review Meeting Minutes.
12) Project Checklist.
13) Near-Term Project Schedule Plan.
14) Overview Project Schedule Plan.
15) Activity Responsibility Matrix.
16) Gantt charts for owners of activities.
17) Initial Project Schedule Plan.
18) Quality Assessment Schedule Plan.
Implementation Conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) Initial Project Schedule Plan.
1.2) Near-term Project Schedule Plan.
1.3) Project Checklist.
1.4) Activity Responsibility Matrix.
1.5) Overview Project Schedule Plan.
1.6) Gantt charts for owners of activities.
1.7) Quality Assessment Schedule Plan.
1.8) All documents containing data on management and document quality
generated by the Initial Project Schedule Plan.
2) Evaluate the process that produces the Initial Project Schedule Plan, including its
management component, using the relevant data in the Monitor Data History Archive as
standards.
2.1) Evaluate implementation of change control process for Initial Project
Schedule Plan phase, using the following forms as the basis for the evaluation:
2.1.1) change control form,
2.1.2) change proposal status form,
2.1.3) implementation schedule document.
3) Complete the Quality Assessment Report on Initial Project Schedule Plan.
4) Record the quality assessment data required by the Monitor Data Definition concerning the
Initial Project Schedule Plan.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on Initial Project Schedule Plan.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the Initial
Project Schedule Plan Process.
3) The Quality Assessment Report on Initial Project Schedule Plan is available to
management.
Notes: None.
QUALITY ASSESSMENT REPORT ON INITIAL PROJECT SCHEDULE PLAN
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the process that
produces the Initial Project Schedule Plan. The questionnaire below provides evaluation criteria
to facilitate evaluating this process. Any other evaluation reporting information may be
appended to it in order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
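The scoring arithmetic described above (a per-question percentage, then a normalized total) can be sketched in a few lines. The function names and the example condition counts below are illustrative only; they are not part of the process definition:

```python
# Sketch of the questionnaire scoring rule; names and numbers are illustrative.

def question_score(satisfied: int, total: int) -> float:
    """Score for one question: the percent of its evaluation
    conditions for which supporting evidence exists."""
    return 100.0 * satisfied / total

def summary_score(scores: list[float]) -> float:
    """TOTAL SCORE: sum of the question scores divided by the
    total possible sum (100 points per question)."""
    return sum(scores) / (100.0 * len(scores))

# Example: one question with 3 of 4 conditions satisfied, another with 2 of 2.
scores = [question_score(3, 4), question_score(2, 2)]
print(scores)                 # [75.0, 100.0]
print(summary_score(scores))  # 0.875
```

A negative response to any condition simply lowers that question's percentage; the required written description of deviations, costs, and reasons is recorded separately in the Recommendations section.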
Evaluation Criteria: Score:
1) Management:
1.1) Was the Initial Project Schedule Plan process appropriately initiated relative to the
other phases of the software development process? ______
1.2) Were activities prioritized adequately? ______
1.3) Were the resources (including personnel) adequately and appropriately allocated?
______
1.4) Were the tasks clearly defined and assigned an owner?
______
1.5) Was the number of meetings held insufficient for completing the Initial Project
Schedule Plan process in a satisfactory manner? ______
1.6) Were too many meetings needed to complete the Initial Project Schedule Plan process
in a satisfactory manner? ______
1.7) Were meetings properly prepared, conducted and reported? ______
1.8) Were costs and scheduling monitored? ______
1.9) Were management approvals given appropriately and in a timely fashion?
______
2) Scheduling Management:
2.1) Did all scheduled phases actually conform to the schedule? If not, describe the
deviations, reasons and costs.
______
2.2) Was the number of deviations from the Initial Project Schedule Plan at an acceptable
level? ______
2.3) Were activity responsibilities correctly allocated? ______
2.4) Were all activity owners able to carry out their responsibilities adequately? ______
3) Documentation:
3.1) Were the Entry Conditions for the Initial Project Schedule Plan process satisfied prior to
commencing the process? ______
3.2) Does the Input Summary for the Initial Project Schedule Plan process include all needed
and/or relevant input? ______
3.3) Were the Implementation Conditions of the Initial Project Schedule Plan process all
satisfied? ______
3.4) Were additional implementation conditions needed for the Initial Project Schedule
Plan process? ______
3.5) Does the Output Summary of the Initial Project Schedule Plan process include
all needed and/or relevant output? ______
3.6) Were the Exit conditions for the Initial Project Schedule Plan process satisfied
no later than when the Initial Project Schedule Plan was approved?
______
4) Documentation Standards:
4.1) Does the Initial Project Schedule Plan Process Definition Document satisfy the
conditions in the Documentation Standards?
______
5) Metrics:
5.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported?
5.1.1) The numerical data required for the Initial Project Schedule Plan Process in the
Monitor Data Definition
______
6) Reviews and Audits:
6.1) Were the following reviews and inspections performed satisfactorily by the
Initial Project Schedule Plan team?
6.1.1) Examination of the Objectives document ______
6.1.2) Examination of the Requirements document.
______
6.1.3) Examination of available Specifications document draft ______
6.1.4) Examination of available High Level Design document draft ______
6.2) Did the final draft of the Initial Project Schedule Plan meet approval,
and if not, was there closure on all exceptions to the approval conditions?
______
6.3) Was the approved version of the Initial Project Schedule Plan archived under change
control for subsequent update? ______
7) Standards:
7.1) Did the Initial Project Schedule Plan and forms and documents produced for the
Initial Project Schedule Plan meet the standards contained in the Initial Project
Schedule Process Definition?
______
8) Change Control:
8.1) Were the Change Control Standards for Documents adhered to?
______
8.2) Was there an excessive number of changes to the Initial Project Schedule Plan?
______
8.3) Was the Schedule Change Plan adhered to? ______
9) Initial Project Schedule Plan Control:
9.1) Was the Initial Project Schedule Plan Process Definition adhered to? ______
10) Media and Hardware Control:
10.1) Were all documents generated during the Initial Project Schedule Plan process
maintained on the appropriate media according to the Media Control Standards? ______
11) Records:
11.1) Were all documents generated during the Initial Project Schedule Plan process
recorded appropriately and made accessible to the project staff? ______
12) Training:
12.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the Initial Project Schedule Plan process? ______
12.2) Was adequate training about the Initial Project Schedule Plan process provided to the
staff prior to its being needed? ______
Summary Evaluation:
TOTAL SCORE (sum of the scores/total possible sum)
___________
=====================================================================
Recommendations:
QUESTION ID COMMENTS
2. Final Project Schedule Plan Process Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the Final Project Schedule Plan
Process. This assessment process has three primary goals:
1) Review the process that produces the Final Project Schedule Plan of the software product
from the Objectives Document, Product Specifications Document, Monitor Data History
Archives, High Level Design Document, and Software Development Process Definition.
2) Generate the data required by the Monitor Data Definition Document concerning the
quality of this process; these data are also used in preparing the Quality Assessment Report
on Final Project Schedule Plan.
3) Write the Quality Assessment Report on Final Project Schedule Plan, which will contain
recommendations on ways to improve this process and its management in the current project
and ways to modify the process in the organization’s other projects.
Entry Conditions:
1) The Product Specifications Document has been initiated.
2) The High Level Design Document has been initiated.
3) The Final Project Schedule Plan has been initiated.
4) The Initial Project Schedule Plan has been approved.
Input Summary:
1) The Objectives Document.
2) The Product Specifications Document.
3) The High Level Design Document.
4) Documentation Standards Document.
5) Project Schedule Process Definition.
6) Initial Project Schedule Plan.
7) Change Control Process Definition.
8) Monitor Data Definition.
9) Monitor Data History Archive.
10) Documents from reviews conducted during Final Project Schedule Plan.
11) Kickoff Final Project Schedule Planning Meeting Minutes.
12) Full Review Meeting Minutes.
13) Overview Project Schedule Plan.
14) Project Checklist.
15) Activity Responsibility Matrix.
16) Gantt charts for owners of activities.
17) Final Project Schedule Plan.
18) Quality Assessment Schedule Plan.
Implementation Conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) Final Project Schedule Plan.
1.2) Overview Project Schedule Plan.
1.3) Project Checklist.
1.4) Activity Responsibility Matrix.
1.5) Gantt charts for owners of activities.
1.6) Quality Assessment Schedule Plan.
1.7) All documents containing data on management and document quality generated
by the Final Project Schedule Plan.
2) Evaluate the process that produces the Final Project Schedule Plan, including its management
component, using the relevant data in the Monitor Data History Archive as standards.
2.1) Evaluate implementation of the change control process for Final Project
Schedule Plan phase, using the following forms as the basis for the evaluation:
2.1.1) change control form,
2.1.2) change proposal status form,
2.1.3) implementation schedule document.
3) Complete the Quality Assessment Report on Final Project Schedule Plan.
4) Record the quality assessment data required by the Monitor Data Definition concerning the
Final Project Schedule Plan.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on Final Project Schedule Plan.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the Final
Project Schedule Plan Process.
3) The Quality Assessment Report on Final Project Schedule Plan is available to
management.
Notes: None.
QUALITY ASSESSMENT REPORT ON FINAL PROJECT SCHEDULE PLAN
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the process that
produces the Final Project Schedule Plan. The questionnaire below provides evaluation criteria
to facilitate evaluating this process. Any other evaluation reporting information may be
appended to it in order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score:
1) Management:
1.1) Was the Final Project Schedule Plan process appropriately initiated relative to the
other phases of the software development process? ______
1.2) Were activities prioritized adequately? ______
1.3) Were the resources (including personnel) adequately and appropriately allocated?
______
1.4) Were the tasks clearly defined and assigned an owner?
______
1.5) Was the number of meetings held insufficient for completing the Final Project
Schedule Plan process in a satisfactory manner? ______
1.6) Were too many meetings needed to complete the Final Project Schedule Plan process
in a satisfactory manner? ______
1.7) Were meetings properly prepared, conducted and reported? ______
1.8) Were costs and scheduling monitored? ______
1.9) Were management approvals given appropriately and in a timely fashion?
______
2) Scheduling Management:
2.1) Did all scheduled phases actually conform to the schedule? If not, describe the
deviations, reasons and costs.
______
2.2) Was the number of deviations from the Final Project Schedule Plan at an acceptable
level? ______
2.3) Were activity responsibilities correctly allocated? ______
2.4) Were all activity owners able to carry out their responsibilities adequately? ______
3) Documentation:
3.1) Were the Entry Conditions for the Final Project Schedule Plan process satisfied prior to
commencing the process? ______
3.2) Does the Input Summary for the Final Project Schedule Plan process include all needed
and/or relevant input? ______
3.3) Were the Implementation Conditions of the Final Project Schedule Plan process all
satisfied? ______
3.4) Were additional implementation conditions needed for the Final Project Schedule
Plan process? ______
3.5) Does the Output Summary of the Final Project Schedule Plan process include
all needed and/or relevant output? ______
3.6) Were the Exit conditions for the Final Project Schedule Plan process satisfied
no later than when the Final Project Schedule Plan was approved? ______
4) Documentation Standards:
4.1) Does the Final Project Schedule Plan Process Definition Document satisfy the
conditions in the Documentation Standards?
______
5) Metrics:
5.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported?
5.1.1) The numerical data required for the Final Project Schedule Plan Process in the
Monitor Data Definition
______
6) Reviews and Audits:
6.1) Were the following reviews and inspections performed satisfactorily by the
Final Project Schedule Plan team?
6.1.1) Examination of the Objectives document ______
6.1.2) Examination of the Requirements document.
______
6.1.3) Examination of the Specifications document ______
6.1.4) Examination of the High Level Design document. ______
6.1.5) Examination of the Initial Project Schedule Plan
______
6.2) Did the final draft of the Final Project Schedule Plan meet approval,
and if not, was there closure on all exceptions to the approval conditions?
______
6.3) Was the approved version of the Final Project Schedule Plan archived under change
control for subsequent update? ______
7) Standards:
7.1) Did the Final Project Schedule Plan and forms and documents produced for the
Final Project Schedule Plan meet the standards contained in the Final Project
Schedule Process Definition? ______
8) Change Control:
8.1) Were the Change Control Standards for Documents adhered to?
______
8.2) Was there an excessive number of changes to the Final Project Schedule Plan? ______
8.3) Was the Schedule Change Plan adhered to? ______
9) Final Project Schedule Plan Control:
9.1) Was the Final Project Schedule Plan Process Definition adhered to? ______
10) Media and Hardware Control:
10.1) Were all documents generated during the Final Project Schedule Plan process
maintained on the appropriate media according to the Media Control Standards?
______
11) Records:
11.1) Were all documents generated during the Final Project Schedule Plan process
recorded appropriately and made accessible to the project staff? ______
12) Training:
12.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the Final Project Schedule Plan process?
______
12.2) Was adequate training about the Final Project Schedule Plan process provided to the
staff prior to its being needed? ______
Summary Evaluation:
TOTAL SCORE (sum of the scores/total possible sum)
___________
=====================================================================
Recommendations:
QUESTION ID COMMENTS
3. Project Tracking Plan Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the Project Tracking Process. This
assessment process has three primary goals:
1) Review the process that produces the Project Tracking Document of the software product
from the requirements document.
2) Generate the data required by the Monitor Data Definition concerning the quality of this
process; they are also for use in preparing the Quality Assessment Report on Project
Tracking Process.
3) Write the Quality Assessment Report on Project Tracking Process, which will contain
recommendations on ways to improve this process and its management in the current project
and ways to modify the process in the organization's other projects.
Entry Conditions:
Availability of the following artifacts:
1) Requirements Document.
2) Objectives Document.
3) Specifications Document.
4) Documentation Standards.
5) Project Scheduling Plan for this phase.
6) Change Control Process Definition Document.
7) Monitor Data Definition Document.
8) Monitor Data History Archive.
Input Summary:
1) Requirements Document.
2) Objectives Document.
3) Specifications Document.
4) Documentation and Media Control Standards.
5) Project Scheduling Plan for this phase.
6) Change Control Process Definition.
7) Monitor Data Definition.
8) Monitor Data History Archive.
9) Documents from reviews of the Project Tracking process.
10) Data from change control, scheduling and tracking activities.
11) Project Tracking Plan Quality Assessment Process Definition.
Implementation conditions:
1) It is essential to perform the following steps:
1.1) Evaluate the following documents using the relevant data in the Monitor Data
History Archive as standards for comparison:
1.1.1) All documentation generated in the Project Tracking Team training.
1.1.2) All documentation generated in the Project Tracking Team meetings.
1.1.3) All documentation generated in the Project Tracking Team work
escalation meetings.
1.1.4) Project Tracking Action Log.
1.1.5) All documents containing data on management quality and
document quality generated by the Project Tracking Process.
1.2) Evaluate the process that produces the Project Tracking Document, including its
management component, using the relevant data in the Monitor Data History
Archive as standards. This includes evaluating the activities in the Tracking,
Scheduling and Change Control Processes.
1.3) Complete the Quality Assessment Report on Project Tracking Process.
1.4) Record the quality assessment data required by the Monitor Data Definition
concerning the Project Tracking Process.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on Project Tracking Process.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the
Project Tracking Process.
3) The Quality Assessment Report on Project Tracking Process is available to management.
Notes: None
QUALITY ASSESSMENT REPORT ON PROJECT TRACKING PROCESS
Purpose:
This document reports the Process Quality Assessment Board's evaluation of the process that
produces the Project Tracking Document. The questionnaire below provides evaluation criteria
to facilitate evaluating this process. Any other evaluation reporting information may be
appended to it in order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score:
1) Management:
1.1) Was the Project Tracking process appropriately scheduled and initiated relative
to the other phases of the software development process?
______
1.2) Was there an acceptable project schedule plan for the preparation of the
Project Tracking Document? ______
1.3) Were activities prioritized adequately? ______
1.4) Did the process actually conform to the schedule? If not, describe the
deviations, reasons and costs. ______
1.5) Were the resources (including personnel) adequately and appropriately
allocated? ______
1.6) Were the tasks clearly defined and assigned an owner?
______
1.7) Were meetings properly prepared, conducted and reported? ______
1.8) Was there satisfactory Project Tracking of the process according to the Project Tracking
Plan? ______
1.9) Were the risks related to the process adequately estimated at the beginning of the
process? ______
1.10) Were costs and scheduling monitored? ______
1.11) Were appropriate actions taken when deviation from plans occurred? ______
1.12) Were the risks that became reality adequately compensated for? ______
2) Documentation:
2.1) Were the Entry conditions for the process satisfied prior to commencing the
process? ______
2.2) Were the Implementation conditions all satisfied?
______
2.3) Were the Exit conditions for the process satisfied no later than when the
Project Tracking Document was approved? ______
2.4) Does the Project Tracking Document address adequately each of the items in
the Requirements for the Project Tracking Document? ______
3) Documentation Standards:
3.1) Does the Project Tracking Document satisfy the conditions in the Documentation
standards of the project? ______
4) Metrics:
4.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported?
4.1.1) The numerical data required for the Project Tracking Process in the Monitor Data
Definition.
4.1.2) The numerical data pertaining to the product quality required in the Quality
Control Plan of the Project Tracking Process. ______
5) Reviews and Audits:
5.1) Were the following reviews and audits performed satisfactorily by the software
project team?
5.1.1) Examination of the Requirements Document.
5.1.2) Reviews of the preliminary drafts of the Project Tracking Document.
5.1.3) Review of the final draft Project Tracking Document.
______
5.2) Did the final draft of the Project Tracking Document meet approval, and if
not, was there closure on all exceptions to the approval conditions? ______
5.3) Was the approved version of the Project Tracking Document archived under
change control for subsequent update?
______
6) Standards:
6.1) Was there an adequate Quality Control Plan for this process? ______
6.2) Were the stipulations in the Quality Control Plan complied with?
______
7) Tests:
7.1) Were adequate plans made for testing and controlling the quality of the
Project Tracking Document? ______
7.2) Were all test and quality control plans adhered to in developing the
Project Tracking Document? ______
8) Change control:
8.1) Were the Change Control Standards for Documents adhered to?
______
9) Media and Hardware Control:
9.1) Were all documents generated during the process maintained on the appropriate
media according to the Media Control Standards?
______
10) Records:
10.1) Were all documents generated during the process recorded appropriately and
made accessible to the project staff?
______
11) Training:
11.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the process? ______
11.2) Was adequate training about the process provided to the staff prior to its need?
______
Summary Evaluation:
TOTAL SCORE (sum of the scores / total possible sum)
___________
================================
Recommendations:
QUESTION ID COMMENTS
4. Product Requirements Process Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the Product Requirements Process.
This assessment process has three primary goals:
1) Review the process that produces the requirements of the software product from the
user request.
2) Generate the data required by the Monitor Data Definition concerning the quality of this
process; they are also for use in preparing the Quality Assessment Report on Product
Requirements.
3) Write the Quality Assessment Report on Product Requirements, which will contain
recommendations on ways to improve this process and its management in the current
project and ways to modify the process in the organization's other projects.
Entry Conditions:
Availability of the following artifacts:
1) User Request
2) Documentation Standards
3) Project Tracking Plan for this phase
4) Project Scheduling Plan for this phase
5) Change Control Process Definition Document
6) Monitor Data Definition Document
7) Monitor Data History Archive
Input Summary:
1) User Request
2) Documentation and Media Control Standards
3) Project Tracking Plan for this phase
4) Project Scheduling Plan for this phase
5) Change Control Process Definition
6) Monitor Data Definition
7) Monitor Data History Archive
8) Documents from reviews of the Product Requirements process
9) Data from change control, scheduling and tracking activities
10) Product Requirements Document
Implementation conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) Product Requirements Document
1.2) All documents containing data on management quality and document quality
generated by the Product Requirements Process
2) Evaluate the process that produces the Product Requirements Document, including its
management component, using the relevant data in the Monitor Data History Archive as
standards. This includes evaluating the activities in the Tracking, Scheduling and Change
Control Processes.
3) Complete the Quality Assessment Report on Product Requirements.
4) Record the quality assessment data required by the Monitor Data Definition concerning the
Product Requirements Process.
Output Summary:
1) Updated Monitor Data History Archive
2) Quality Assessment Report on Product Requirements
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the
Product Requirements Process.
3) The Quality Assessment Report on Product Requirements is available to management.
Notes: None
5. Product Objectives Process Quality Assessment Plan
Description:
This document defines the process of assessing the quality of the Product Objectives Process.
This assessment process has three primary goals:
1) Review the process that produces the objectives of the software product from the
requirements specification.
2) Generate the data required by the Monitor Data Definition concerning the quality of this
process; they are also for use in preparing the Quality Assessment Report on Product
Objectives.
3) Write the Quality Assessment Report on Product Objectives, which will contain
recommendations on ways to improve this process and its management in the current project
and ways to modify the process in the organization’s other projects.
Entry Conditions:
Availability of the following artifacts:
1) Software Requirements Specification
2) Documentation Standards
3) Project Tracking Plan for this phase
4) Project Scheduling Plan for this phase
5) Change Control Process Definition Document
6) Monitor Data Definition Document
7) Monitor Data History Archive
Input Summary:
1) Software Requirements Specification
2) Documentation and Media Control Standards
3) Project Tracking Plan for this phase
4) Project Scheduling Plan for this phase
5) Change Control Process Definition
6) Monitor Data Definition
7) Monitor Data History Archive
8) Documents from reviews of the Product Objectives Process
9) Data from Change Control, Scheduling and Tracking Activities
10) Product Objectives Document
Implementation Conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) Product Objectives Document
1.2) All documents containing data on management quality and document quality generated
by the Product Objectives Process
2) Evaluate the process that produces the Product Objectives Document, including its
management component, using the relevant data in the Monitor Data History Archive as
standards. This includes evaluating the activities in the Tracking, Scheduling and
Change Control Processes.
3) Complete the Quality Assessment Report on Product Objectives.
4) Record the quality assessment data required by the Monitor Data Definition concerning the
Product Objectives Process.
Output Summary:
1) Updated Monitor Data History Archive
2) Quality Assessment Report on Product Objectives
Exit Conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required Quality Assessment data on the
Product Objectives Process.
3) The Quality Assessment Report on Product Objectives is available to management.
Note: None
QUALITY ASSESSMENT REPORT ON OBJECTIVES PROCESS
Purpose:
This document reports the Process Quality Assessment Board's evaluation of the process that
produces the Objectives Document. The questionnaire below provides evaluation criteria to
facilitate evaluating this process. Any other evaluation reporting information may be appended
to it in order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
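The scoring rules above are simple arithmetic: each question's score is the percentage of its evaluation conditions satisfied, and the total score is the sum of the question scores divided by the total possible sum. The following Python sketch illustrates this computation; the function names and the example data are illustrative assumptions, not part of this process definition.

```python
# Illustrative sketch of the questionnaire scoring rules (names and
# example data are hypothetical, not part of the process definition).

def question_score(conditions_satisfied: int, conditions_total: int) -> float:
    """Score for one question: percent of its conditions satisfied (0-100)."""
    return 100.0 * conditions_satisfied / conditions_total

def total_score(question_scores: list[float]) -> float:
    """Total score: sum of the scores over the total possible sum, as a percent."""
    return 100.0 * sum(question_scores) / (100.0 * len(question_scores))

# Example: three questions with 4 of 4, 2 of 4 and 3 of 3 conditions satisfied.
scores = [question_score(4, 4), question_score(2, 4), question_score(3, 3)]
print([round(s) for s in scores])  # [100, 50, 100]
print(round(total_score(scores)))  # 83
```

A negative condition (score below 100) would additionally require, per the instructions, a description of the deviation, its cost, and its apparent reasons in the Recommendations section.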
Evaluation Criteria: Score
1) Management:
1.1) Was the objectives process appropriately scheduled and initiated relative
to the other phases of the software development process? ______
1.2) Was there an acceptable project schedule plan for the preparation of the
Objectives Document? ______
1.3) Were activities prioritized adequately? ______
1.4) Did the process actually conform to the schedule? If not, describe the deviations,
reasons and costs. ______
1.5) Were the resources (including personnel) adequately and appropriately allocated?
______
1.6) Were the tasks clearly defined and assigned an owner? ______
1.7) Were meetings properly prepared, conducted and reported? ______
1.8) Was there satisfactory tracking of the process according to the Project Tracking Plan?
______
1.9) Were the risks related to the process adequately estimated at the beginning of the
process?
______
1.10) Were costs and scheduling monitored? ______
1.11) Were appropriate actions taken when deviation from plans occurred? ______
1.12) Were the risks that became reality adequately compensated for? ______
2) Documentation:
2.1) Were the Entry conditions for the process satisfied prior to commencing the process?
______
2.2) Were the Implementation conditions all satisfied? ______
2.3) Were the Exit conditions for the process satisfied no later than when the
Objectives Document was approved? ______
2.4) Does the Objectives Document address adequately each of the items in
the Requirements for the Objectives Document? ______
3) Documentation Standards:
3.1) Does the Objectives Document satisfy the conditions in the Documentation
standards of the project? ______
4) Metrics:
4.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported?
4.1.1) The numerical data required for the Objectives Process in the Monitor Data
Definition.
4.1.2) The numerical data pertaining to the product quality required in the Quality
Control Plan of the Objectives Process. ______
5) Reviews and Audits:
5.1) Were the following reviews and audits performed satisfactorily by the software
project team?
5.1.1) Examination of the Requirements Document.
5.1.2) Reviews of the preliminary drafts of the Objectives Document.
5.1.3) Review of the final draft Objectives Document. ______
5.2) Did the final draft of the Objectives Document meet approval, and if
not, was there closure on all exceptions to the approval conditions?
______
5.3) Was the approved version of the Objectives Document archived under
change control for subsequent update? ______
6) Standards:
6.1) Was there an adequate Quality Control Plan for this process?
______
6.2) Were the stipulations in the Quality Control Plan complied with?
______
7) Tests:
7.1) Were adequate plans made for testing and controlling the quality of the
Objectives Document? ______
7.2) Were all test and quality control plans adhered to in developing the
Objectives Document?
______
8) Change control:
8.1) Were the Change Control Standards for Documents adhered to?
______
9) Media and Hardware Control:
9.1) Were all documents generated during the process maintained on the appropriate
media according to the Media Control Standards?
______
10) Records:
10.1) Were all documents generated during the process recorded appropriately and
made accessible to the project staff? ______
11) Training:
11.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the process? ______
11.2) Was adequate training about the process provided to the staff prior to its need? ______
Summary Evaluation:
TOTAL SCORE (sum of the scores / total possible sum) ___________
=====================================================================
Recommendations:
QUESTION ID COMMENTS
6. Product Prototype Process Quality Assessment Plan
Description:
This document defines the process of assessing the quality of the Product Prototype Process. This
assessment process has three primary goals:
1) Review the process that produces the Prototype of the software product from the user request.
2) Generate the data required by the Monitor Data Definition concerning the quality of this
process; they are also for use in preparing the Quality Assessment Report on Product
Prototype.
3) Write the Quality Assessment Report on Product Prototype, which will contain
recommendations on ways to improve this process and its management in the current project
and ways to modify the process in the organization’s other projects.
Entry Conditions:
Availability of the following artifacts:
1) User Request
2) Documentation Standards
3) Project Tracking Plan for this phase
4) Project Scheduling Plan for this phase
5) Change Control Process Definition Document
6) Monitor Data Definition Document
7) Monitor Data History Archive
Input Summary:
1) User Request
2) Documentation and Media Control Standards
3) Project Tracking Plan for this phase
4) Project Scheduling Plan for this phase
5) Change Control Process Definition
6) Monitor Data Definition
7) Monitor Data History Archive
8) Documents from reviews of the Product Prototype Process
9) Data from Change Control, Scheduling and Tracking Activities
10) Product Prototype Document
Implementation Conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) Product Prototype Document
1.2) All documents containing data on management quality and document quality generated
by the Product Prototype Process
2) Evaluate the process that produces the Product Prototype Document, including its
management component, using the relevant data in the Monitor Data History Archive
as standards. This includes evaluating the activities in the Tracking, Scheduling and
Change Control Processes.
3) Complete the Quality Assessment Report on Product Prototype.
4) Record the quality assessment data required by the Monitor Data Definition concerning the
Product Prototype Process.
Output Summary:
1) Updated Monitor Data History Archive
2) Quality Assessment Report on Product Prototype
Exit Conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required Quality Assessment data on the
Product Prototype Process.
3) The Quality Assessment Report on Product Prototype is available to management.
Note: None
QUALITY ASSESSMENT REPORT ON PROTOTYPING
Purpose:
This document reports the Process Quality Assessment Board's evaluation of the process that
produces the Prototype Document. The questionnaire below provides evaluation criteria to
facilitate evaluating this process. Any other evaluation reporting information may be appended
to it in order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score
1) Management:
1.1) Was the prototyping process appropriately scheduled and initiated relative
to the other phases of the software development process?
______
1.2) Was there an acceptable project schedule plan for the preparation of the
Prototype Document? ______
1.3) Were activities prioritized adequately?
______
1.4) Did the process actually conform to the schedule? If not, describe the
deviations, reasons and costs. ______
1.5) Were the resources (including personnel) adequately and appropriately
allocated? ______
1.6) Were the tasks clearly defined and assigned an owner? ______
1.7) Were meetings properly prepared, conducted and reported? ______
1.8) Was there satisfactory tracking of the process according to the Project Tracking
Plan? ______
1.9) Were the risks related to the process adequately estimated at the beginning of the
process? ______
1.10) Were costs and scheduling monitored? ______
1.11) Were appropriate actions taken when deviation from plans occurred? ______
1.12) Were the risks that became reality adequately compensated for? ______
2) Documentation:
2.1) Were the Entry conditions for the process satisfied prior to commencing the
process? ______
2.2) Were the Implementation conditions all satisfied? ______
2.3) Were the Exit conditions for the process satisfied no later than when the
Prototype Document was approved? ______
2.4) Does the Prototype Document address adequately each of the items in
the Requirements for the Prototype Document? ______
3) Documentation Standards:
3.1) Does the Prototype Document satisfy the conditions in the Documentation
standards of the project? ______
4) Metrics:
4.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported?
4.1.1) The numerical data required for the Prototype Process in the Monitor Data
Definition.
4.1.2) The numerical data pertaining to the product quality required in the Quality
Control Plan of the Prototype Process.
______
5) Reviews and Audits:
5.1) Were the following reviews and audits performed satisfactorily by the software
project team?
5.1.1) Examination of the Requirements Document.
5.1.2) Reviews of the preliminary drafts of the Prototype Document.
5.1.3) Review of the final draft Prototype Document. ______
5.2) Did the final draft of the Prototype Document meet approval, and if
not, was there closure on all exceptions to the approval conditions? ______
5.3) Was the approved version of the Prototype Document archived under
change control for subsequent update? ______
6) Standards:
6.1) Was there an adequate Quality Control Plan for this process? ______
6.2) Were the stipulations in the Quality Control Plan complied with? ______
7) Tests:
7.1) Were adequate plans made for testing and controlling the quality of the
Prototype Document? ______
7.2) Were all test and quality control plans adhered to in developing the
Prototype Document? ______
8) Change control:
8.1) Were the Change Control Standards for Documents adhered to?
______
9) Media and Hardware Control:
9.1) Were all documents generated during the process maintained on the appropriate
media according to the Media Control Standards? ______
10) Records:
10.1) Were all documents generated during the process recorded appropriately and
made accessible to the project staff? ______
11) Training:
11.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the process? ______
11.2) Was adequate training about the process provided to the staff prior to its need? ______
Summary Evaluation:
TOTAL SCORE (sum of the scores / total possible sum) ___________
=====================================================================
Recommendations:
QUESTION ID COMMENTS
7. Specifications Process Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the Specifications Process. This
assessment process has three primary goals:
1) Review the process that produces the Specifications of the software product from the
requirements document.
2) Generate the data required by the Monitor Data Definition concerning the quality of the
Specifications process; they are also for use in preparing the Quality Assessment Report on
the Specifications Process.
3) Write the Quality Assessment Report on the Specifications Process, which will contain
recommendations on ways to improve this process and its management in the current project
and ways to modify the process in the organization's other projects.
Entry Conditions:
Availability of the following artifacts:
1) Requirements Document.
2) Documentation Standards.
3) Project Tracking Plan for the Specifications phase.
4) Project Scheduling Plan for the Specifications phase.
5) Change Control Process Definition Document.
6) Monitor Data Definition Document.
7) Monitor Data History Archive.
Input Summary:
1) Requirements Document.
2) Documentation and Media Control Standards.
3) Project Tracking Plan for the Specifications phase.
4) Project Scheduling Plan for the Specifications phase.
5) Change Control Process Definition.
6) Monitor Data Definition.
7) Monitor Data History Archive.
8) Documents from reviews of the Specifications process.
9) Data from change control, scheduling and tracking activities.
10) Specifications Document.
Implementation conditions:
1) It is essential to perform the following steps:
1.1) Evaluate the following documents using the relevant data in the Monitor Data
History Archive as standards for comparison:
1.1.1) Specifications Document.
1.1.2) All documents containing data on management quality and document
quality generated by the Specifications Process.
1.2) Evaluate the process that produces the Specifications Document, including its
management component, using the relevant data in the Monitor Data History
Archive as standards. This includes evaluating the activities in the Tracking,
Scheduling and Change Control Processes.
1.3) Complete the Quality Assessment Report on the Specifications Process.
1.4) Record the quality assessment data required by the Monitor Data Definition
concerning the Specifications Process.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on the Specifications Process.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the
Specifications Process.
3) The Quality Assessment Report on the Specifications Process is available to management.
Notes: None
QUALITY ASSESSMENT REPORT ON SPECIFICATIONS PROCESS
Purpose:
This document reports the Process Quality Assessment Board's evaluation of the process that
produces the Specifications Document. The questionnaire below provides evaluation criteria to
facilitate evaluating this process. Any other evaluation reporting information may be appended
to it in order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score
1) Management:
1.1) Was the Specifications process appropriately scheduled and initiated relative
to the other phases of the software development process? ______
1.2) Was there an acceptable project schedule plan for the preparation of the
Specifications Document? ______
1.3) Were activities prioritized adequately? ______
1.4) Did the process actually conform to the schedule? If not, describe the
deviations, reasons and costs.
______
1.5) Were the resources (including personnel) adequately and appropriately
allocated? ______
1.6) Were the tasks clearly defined and assigned an owner? ______
1.7) Were meetings properly prepared, conducted and reported?
______
1.8) Was there satisfactory tracking of the process according to the Project Tracking
Plan? ______
1.9) Were the risks related to the process adequately estimated at the beginning of the
process? ______
1.10) Were costs and scheduling monitored? ______
1.11) Were appropriate actions taken when deviation from plans occurred? ______
1.12) Were the risks that became reality adequately compensated for? ______
2) Documentation:
2.1) Were the Entry conditions for the process satisfied prior to commencing the
process? ______
2.2) Were the Implementation conditions all satisfied? ______
2.3) Were the Exit conditions for the process satisfied no later than when the
Specifications Document was approved? ______
2.4) Does the Specifications Document address adequately each of the items in
the Requirements for the Specifications Document? ______
3) Documentation Standards:
3.1) Does the Specifications Document satisfy the conditions in the Documentation
standards of the project? ______
4) Metrics:
4.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported?
4.1.1) The numerical data required for the Specifications Process in the Monitor Data
Definition.
4.1.2) The numerical data pertaining to the product quality required in the Quality
Control Plan of the Specifications Process.
______
5) Reviews and Audits:
5.1) Were the following reviews and audits performed satisfactorily by the software
project team?
5.1.1) Examination of the Requirements Document.
5.1.2) Reviews of the preliminary drafts of the Specifications Document.
5.1.3) Review of the final draft Specifications Document.
______
5.2) Did the final draft of the Specifications Document meet approval, and if
not, was there closure on all exceptions to the approval conditions? ______
5.3) Was the approved version of the Specifications Document archived under
change control for subsequent update? ______
6) Standards:
6.1) Was there an adequate Quality Control Plan for this process? ______
6.2) Were the stipulations in the Quality Control Plan complied with?
______
7) Tests:
7.1) Were adequate plans made for testing and controlling the quality of the
Specifications Document? ______
7.2) Were all test and quality control plans adhered to in developing the
Specifications Document? ______
8) Change control:
8.1) Were the Change Control Standards for Documents adhered to?
______
9) Media and Hardware Control:
9.1) Were all documents generated during the process maintained on the appropriate
media according to the Media Control Standards? ______
10) Records:
10.1) Were all documents generated during the process recorded appropriately and
made accessible to the project staff? ______
11) Training:
11.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the process?
______
11.2) Was adequate training about the process provided to the staff prior to its need?
______
Summary Evaluation:
TOTAL SCORE (sum of the scores / total possible sum) ___________
================================
Recommendations:
QUESTION ID COMMENTS
8. High Level Design Process Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the High Level Design Process.
This assessment process has three primary goals:
1) Review the process that produces the High Level Design of the software product from the
requirements document.
2) Generate the data required by the Monitor Data Definition concerning the quality of this
process; they are also for use in preparing the Quality Assessment Report on High Level
Design Process.
3) Write the Quality Assessment Report on the High Level Design Process, which will contain
recommendations on ways to improve this process and its management in the current project
and ways to modify the process in the organization's other projects.
Entry Conditions:
Availability of the following artifacts:
1) Requirements Document.
2) Documentation Standards.
3) Project Tracking Plan for High Level Design.
4) Project Scheduling Plan for High Level Design.
5) Change Control Process Definition Document.
6) Monitor Data Definition Document.
7) Monitor Data History Archive.
Input Summary:
1) Requirements Document.
2) Documentation and Media Control Standards.
3) Project Tracking Plan for High Level Design.
4) Project Scheduling Plan for High Level Design.
5) Change Control Process Definition.
6) Monitor Data Definition.
7) Monitor Data History Archive.
8) Documents from reviews of the High Level Design process.
9) Data from change control, scheduling and tracking activities.
10) High Level Design Document.
Implementation conditions:
1) It is essential to perform the following steps:
1.1) Evaluate the following documents using the relevant data in the Monitor Data
History Archive as standards for comparison:
1.1.1) High Level Design Document.
1.1.2) All documents containing data on management quality and document
quality generated by the High Level Design Process.
1.2) Evaluate the process that produces the High Level Design Document, including
its management component, using the relevant data in the Monitor Data History
Archive as standards. This includes evaluating the activities in the Tracking,
Scheduling and Change Control Processes.
1.3) Complete the Quality Assessment Report on the High Level Design Process.
1.4) Record the quality assessment data required by the Monitor Data Definition
concerning the High Level Design Process.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on High Level Design Process.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the High
Level Design Process.
3) The Quality Assessment Report on the High Level Design Process is available to
management.
Notes: None
QUALITY ASSESSMENT REPORT ON THE HIGH LEVEL DESIGN PROCESS
Purpose:
This document reports the Process Quality Assessment Board's evaluation of the process that
produces the High Level Design Document. The questionnaire below provides evaluation
criteria to facilitate evaluating this process. Any other evaluation reporting information may be
appended to it in order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score
1) Management:
1.1) Was the High Level Design process appropriately scheduled and initiated relative
to the other phases of the software development process?
______
1.2) Was there an acceptable project schedule plan for the preparation of the
High Level Design Document? ______
1.3) Were activities prioritized adequately?
______
1.4) Did the process actually conform to the schedule? If not, describe the
deviations, reasons and costs. ______
1.5) Were the resources (including personnel) adequately and appropriately
allocated? ______
1.6) Were the tasks clearly defined and assigned an owner? ______
1.7) Were meetings properly prepared, conducted and reported? ______
1.8) Was there satisfactory tracking of the process according to the Project Tracking
Plan? ______
1.9) Were the risks related to the process adequately estimated at the beginning of the
process? _____
1.10) Were costs and scheduling monitored? _____
1.11) Were appropriate actions taken when deviation from plans occurred? ______
1.12) Were the risks that became reality adequately compensated for? ______
2) Documentation:
2.1) Were the Entry conditions for the process satisfied prior to commencing the
process? ______
2.2) Were the Implementation conditions all satisfied? ______
2.3) Were the Exit conditions for the process satisfied no later than when the
High Level Design Document was approved? ______
2.4) Does the High Level Design Document address adequately each of the items in
the Requirements for the High Level Design Document? ______
3) Documentation Standards:
3.1) Does the High Level Design Document satisfy the conditions in the Documentation
standards of the project? ______
4) Metrics:
4.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported?
4.1.1) The numerical data required for the High Level Design Process in the Monitor
Data Definition.
4.1.2) The numerical data pertaining to the product quality required in the Quality
Control Plan of the High Level Design Process. ______
5) Reviews and Audits:
5.1) Were the following reviews and audits performed satisfactorily by the software
project team?
5.1.1) Examination of the Requirements Document.
5.1.2) Reviews of the preliminary drafts of the High Level Design Document.
5.1.3) Review of the final draft of the High Level Design Document.
______
5.2) Did the final draft of the High Level Design Document meet approval, and if
not, was there closure on all exceptions to the approval conditions?
______
5.3) Was the approved version of the High Level Design Document archived under
change control for subsequent update?
______
6) Standards:
6.1) Was there an adequate Quality Control Plan for this process? ______
6.2) Were the stipulations in the Quality Control Plan complied with? ______
7) Tests:
7.1) Were adequate plans made for testing and controlling the quality of the
High Level Design Document?
______
7.2) Were all test and quality control plans adhered to in developing the
High Level Design Document? _____
8) Change control:
8.1) Were the Change Control Standards for Documents adhered to?
______
9) Media and Hardware Control:
9.1) Were all documents generated during the process maintained on the appropriate
media according to the Media Control Standards?
______
10) Records:
10.1) Were all documents generated during the process recorded appropriately and
made accessible to the project staff? ______
11) Training:
11.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the process?
______
11.2) Was adequate training about the process provided to the staff prior to its need?
______
Summary Evaluation:
TOTAL SCORE (sum of the scores / total possible sum) ___________
================================
Recommendations:
QUESTION ID COMMENTS
9. Publications Content Plans Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the Publications Content Plans
Process. This assessment process has three primary goals.
1) Review the process that produces the Publications Content Plans of the software product
from the Objectives Document and Product Specifications Document.
2) Generate the data required by the Monitor Data Definition Document concerning the
quality of this process; they are also for use in preparing the Quality Assessment Report
on Publications Content Plans.
3) Write the Quality Assessment Report on Publications Content Plans, which will contain
recommendations on ways to improve this process and its management in the current project
and ways to modify the process in the organization’s other projects.
Entry Conditions:
1) The Product Specifications Document has been initiated.
2) The Quality Plan and the Final Project Schedule Plan have been initiated.
Input Summary:
1) The Objectives Document, particularly the section listing publications requirements.
2) The Product Specifications Document.
3) Documentation Standards Document.
4) Project Tracking Plan for Publications Content Plans phase.
5) Project Scheduling Plan for Publications Content Plans phase.
6) Change Control Process Definition.
7) Monitor Data Definition.
8) Monitor Data History Archive.
9) Documents from reviews of Publications Content Plans.
10) Documents from Publications Content Plans reviews by Quality Control.
11) Documents from Usability Walkthroughs.
12) Documents and data from change control, scheduling and tracking activities.
Implementation Conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) Publications Content Plans.
1.2) Documents from reviews of Publications Content Plans.
1.3) Documents from Publications Content Plans reviews by Quality Control.
1.4) Documents from Usability Walkthroughs.
1.5) All documents containing data on management and document quality generated by the
Publications Content Plans phase.
2) Evaluate the process that produces the Publications Content Plans, including its
management component, using the relevant data in the Monitor Data History
Archive as standards.
2.1) Evaluate implementation of scheduling for the phase, using the following forms
as the basis for the evaluation:
2.1.1) Kickoff Meeting minutes,
2.1.2) full review meeting minutes,
2.1.3) overview project schedule plan form,
2.1.4) activity responsibility matrix form,
2.1.5) project checklist form,
2.1.6) Gantt charts for Publications Content Plans phase.
2.2) Evaluate implementation of tracking for the phase, using the following forms
as the basis for the evaluation:
2.2.1) PTT meeting minutes,
2.2.2) work and escalation meeting minutes,
2.2.3) project checklist form,
2.2.4) Action Log form,
2.2.5) action log progress form.
2.3) Evaluate implementation of the change control process for the Publications Content
Plans phase, using the following forms as the basis for the evaluation:
2.3.1) change control form,
2.3.2) change proposal status form,
2.3.3) implementation schedule document.
3) Complete the Quality Assessment Report on Publications Content Plans.
4) Record the quality assessment data required by the Monitor Data definition concerning the
Publications Content Plans phase.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on Publications Content Plans.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the
Publications Content Plans Process.
3) The Quality Assessment Report on Publications Content Plans is available to management.
Notes: None.
QUALITY ASSESSMENT REPORT ON PUBLICATIONS CONTENT PLANS
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the process that
produces the Publications Content Plans. The questionnaire below provides evaluation criteria to
facilitate evaluating this process. Any other evaluation reporting information may be appended
to it in order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score:
1) Management:
1.1) Was the Publications Content Plans process appropriately scheduled and initiated
relative to the other phases of the software development process? ______
1.2) Was there an acceptable project schedule plan for the preparation of the Publications
Content Plans? ______
1.3) Were activities prioritized adequately? ______
1.4) Did the process actually conform to the schedule? If not, describe the deviations,
reasons and costs.
______
1.5) Were the resources (including personnel) adequately and appropriately allocated?
______
1.6) Were the tasks clearly defined and assigned an owner?
______
1.7) Was the number of meetings held insufficient for completing the Publications
Content Plans process in a satisfactory manner? ______
1.8) Were too many meetings needed to complete the Publications Content Plans process
in a satisfactory manner? ______
1.9) Were meetings properly prepared, conducted and reported? ______
1.10) Were the risks related to the process adequately estimated at the beginning of the
process? ______
1.11) Were costs and scheduling monitored? ______
1.12) Was the number of deviations from the plans at an acceptable level? ______
1.13) Were appropriate actions taken when deviation from plans occurred?
______
1.14) Were the risks that became reality adequately compensated for? ______
1.15) Were management approvals given appropriately and in a timely fashion? ______
2) Documentation:
2.1) Were the Entry Conditions for the Publications Content Plans process satisfied prior to
commencing the process? _____
2.2) Does the Input Summary for the Publications Content Plans process include all needed
and/or relevant input?
______
2.3) Were the Implementation Conditions of the Publications Content Plans process all
satisfied? ______
2.4) Were additional implementation conditions needed for the Publications Content
Plans process? ______
2.5) Does the Output Summary of the Publications Content Plans process include
all needed and/or relevant output? ______
2.6) Were the Exit conditions for the Publications Content Plans process satisfied
no later than when the Publication Content Plans were approved? ______
3) Documentation Standards:
3.1) Does the Publications Content Plans Process Definition Document satisfy the conditions in
the Documentation Standards?
______
4) Metrics:
4.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported?
4.1.1) The numerical data required for the Publications Content Plans Process in the
Monitor Data Definition
______
4.1.2) The numerical data pertaining to the product quality required in the
Quality Control Plan of the Publications Content Plans Process ______
5) Reviews and Audits:
5.1) Were the following reviews and inspections performed satisfactorily by the
software project team?
5.1.1) Examination of the Objectives document ______
5.1.2) Examination of the Requirements document.
______
5.1.3) Reviews of the Publications Content Plans by the Review Committee ______
5.1.4) Reviews of the Publications Content Plans with the Customer
______
5.1.5) Inspections of the Publications Content Plans by the Quality Control reviewers.
______
5.2) Did the final draft of the Publications Content Plans Document meet approval,
and if not, was there closure on all exceptions to the approval conditions?
______
5.3) Was the approved version of the Publications Content Plans archived under change
control for subsequent update? ______
6) Tests:
6.1) Were adequate plans made for the First and Second Usability Tests? ______
6.2) Were Usability test plans adhered to in developing the Publications Content
Plans document? ______
6.3) Were Usability test results made available to all project members who needed
them? ______
6.4) Were Usability test results made available in a timely fashion? ______
7) Standards:
7.1) Was there an adequate Quality Control Plan for the Publications Content Plans process?
______
7.2) Were the stipulations listed in the Publications Content Plans Quality Control Plan
complied with? ______
7.3) Was the Publications Content Plans Quality Control Plan adhered to? ______
8) Change Control:
8.1) Were the Change Control Standards for Documents adhered to?
______
8.2) Were there an excessive number of changes to the Publications Content Plans? ______
9) Media and Hardware Control:
9.1) Were all documents generated during the Publications Content Plans process
maintained
on the appropriate media according to the Media Control Standards? ______
10) Records:
10.1) Were all documents generated during the Publications Content Plans process
recorded appropriately and made accessible to the project staff? ______
11) Training:
11.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the Publications Content Plans process?
______
11.2) Was adequate training about the Publications Content Plans process provided to
the staff prior to its being needed?
______
Summary Evaluation:
TOTAL SCORE (sum of the scores/total possible sum)
___________
=====================================================================
Recommendations:
QUESTION ID COMMENTS
10. Test Plan Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the Test Plan Process. This
assessment process has three primary goals:
1) Review the process that produces the Test Plan.
2) Generate the data required by the Monitor Data Definition Document concerning the
quality of this process; they are also for use in preparing the Quality Assessment Report on
Test Plan.
3) Write the Quality Assessment Report on Test Plan, which will contain recommendations on
ways to improve this process and its management in the current project and ways to modify
the process in the organization’s other projects.
Entry Conditions:
Availability of the following artifacts:
1) Documentation and Media Standards.
2) Project Tracking Plan for the Test Plan phase.
3) Project Scheduling Plan for the Test Plan phase.
4) Change Control Process Definition.
5) Monitor Data Definition.
6) Monitor Data History Archive.
Input Summary:
1) Documentation and Media Standards Document.
2) Project Tracking Plan Document for Test Plan phase.
3) Project Scheduling Plan Document for Test Plan phase.
4) Change Control Process Definition.
5) Monitor Data Definition.
6) Monitor Data History Archive.
7) Documents from reviews of Test Plan.
8) Documents from Test Plan inspections by Quality Control.
9) Documents and data from change control, scheduling and tracking activities.
Implementation Conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) Test Plan.
1.2) Documents from reviews of Test Plan.
1.3) Documents from Test Plan inspections by Quality Control.
1.4) All documents containing data on management and document quality
generated by the Test Plan phase.
2) Evaluate the process that produces the Test Plan, using the relevant data in the Monitor
Data History Archive as standards.
3) Complete the Quality Assessment Report on Test Plan.
4) Record the quality assessment data required by the Monitor Data definition concerning the
Test Plan phase.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on Test Plan.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the Test
Plan Process.
3) The Quality Assessment Report on Test Plan is available to management.
Notes: None.
QUALITY ASSESSMENT REPORT ON TEST PLAN
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the process that
produces the Test plan. The questionnaire below provides evaluation criteria to facilitate
evaluating this process. Any other evaluation reporting information may be appended to it in
order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score:
1) Management:
1.1) Was the test plan process appropriately scheduled and initiated relative to the other
phases of the software development process? ______
1.2) Was there an acceptable project schedule plan for the preparation of the Test plan?
______
1.3) Were activities prioritized adequately? ______
1.4) Did the process actually conform to the schedule? If not, describe the deviations,
reasons and costs.
______
1.5) Were the resources (including personnel) adequately and appropriately allocated?
______
1.6) Were the tasks clearly defined and assigned an owner?
______
1.7) Was the number of meetings held insufficient for completing the Test plan process
in a satisfactory manner? ______
1.8) Were too many meetings needed to complete the Test plan process in a satisfactory
manner? ______
1.9) Were meetings properly prepared, conducted and reported? ______
1.10) Were the risks related to the process adequately estimated at the beginning of the
process? ______
1.11) Were costs and scheduling monitored? ______
1.12) Was the number of deviations from the plans at an acceptable level? ______
1.13) Were appropriate actions taken when deviation from plans occurred?
______
1.14) Were the risks that became reality adequately compensated for? ______
1.15) Were management approvals given appropriately and in a timely fashion? ______
2) Documentation:
2.1) Were the Entry Conditions for the Test plan process satisfied prior to commencing the
process? ______
2.2) Does the Input Summary for the Test plan process include all needed and/or relevant
input? ______
2.3) Were the Implementation Conditions of the Test plan process all satisfied? ______
2.4) Were additional implementation conditions needed for the Test plan process?
______
2.5) Does the Output Summary of the Test plan process include all needed and/or relevant
output? ______
2.6) Were the Exit conditions for the Test plan process satisfied no later than when the
System testing was completed?
______
3) Documentation Standards:
3.1) Does the Test plan Process Definition Document satisfy the conditions in the
Documentation Standards? ______
4) Test plan Documentation Standards:
4.1) Are the Test plan Documentation standards sufficient to clearly and fully document the
test plan? ______
5) Metrics:
5.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported? ______
5.1.1) The numerical data required for the Test plan Process in the Monitor Data
Definition ______
5.1.2) The numerical data pertaining to the product quality required in the
Quality Control Plan of the Test plan Process ______
6) Reviews and Inspections:
6.1) Were the following reviews and inspections performed satisfactorily by the
software project team? ______
6.1.1) Examination of the low level design document ______
6.1.2) Reviews of the test plan by the Review Committee
______
6.1.3) Inspections of the test plan by the Quality Control reviewers.
______
7) Standards:
7.1) Was there an adequate Quality Control Plan for the Test plan process?
______
7.2) Were the stipulations listed in the Test plan Quality Control Plan complied with?
______
7.3) Was the Test plan Quality Control Plan adhered to? ______
8) Change Control:
8.1) Were the Change Control Standards for Documents adhered to?
______
8.2) Were there an excessive number of changes to the Test plan?
______
9) Test plan Control:
9.1) Was the Test plan Process Definition adhered to?
______
10) Media and Hardware Control:
10.1) Were all documents generated during the Test plan process maintained on the
appropriate media according to the Media Control Standards? ______
11) Supplier Control:
11.1) Did the supplier adhere to the Test plan Process Definition Document in producing
the Test plan? ______
11.2) Was the Test plan Quality Control Plan adhered to? ______
12) Records:
12.1) Were all documents generated during the Test plan process recorded appropriately and
made accessible to the project staff?
______
13) Training:
13.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the Test plan process?
______
13.2) Was adequate training about the Test plan process provided to the staff prior to
its being needed? ______
Summary Evaluation:
TOTAL SCORE (sum of the scores/total possible sum) ___________
=====================================================================
Recommendations:
QUESTION ID COMMENTS
11. Low Level Design Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the Low Level Design Process.
This assessment process has three primary goals:
1) Review the process that produces the Low Level Design of the software product from the
requirements document.
2) Generate the data required by the Monitor Data Definition concerning the quality of the Low
Level Design process; they are also for use in preparing the Quality Assessment Report on
Low Level Design Process.
3) Write the Quality Assessment Report on the Low Level Design Process, which will contain
recommendations on ways to improve this process and its management in the current project
and ways to modify the process in the organization's other projects.
Entry Conditions:
Availability of the following artifacts:
1) Requirements Document.
2) Documentation Standards.
3) Project Tracking Plan for the Low Level Design phase.
4) Project Scheduling Plan for the Low Level Design phase.
5) Change Control Process Definition Document.
6) Monitor Data Definition Document.
7) Monitor Data History Archive.
Input Summary:
1) Requirements Document.
2) Documentation and Media Control Standards.
3) Project Tracking Plan for the Low Level Design phase.
4) Project Scheduling Plan for the Low Level Design phase.
5) Change Control Process Definition.
6) Monitor Data Definition.
7) Monitor Data History Archive.
8) Documents from reviews of the Low Level Design process.
9) Data from change control, scheduling and tracking activities.
10) Low Level Design Document.
Implementation conditions:
1) It is essential to perform the following steps:
1.1) Evaluate the following documents using the relevant data in the Monitor Data
History Archive as standards for comparison:
1.1.1) Low Level Design Document.
1.1.2) All documents containing data on management quality and document
quality generated by the Low Level Design Process.
1.2) Evaluate the process that produces the Low Level Design Document, including
its management component, using the relevant data in the Monitor Data History
Archive as standards. This includes evaluating the activities in the Tracking,
Scheduling and Change Control Processes.
1.3) Complete the Quality Assessment Report on the Low Level Design Process.
1.4) Record the quality assessment data required by the Monitor Data Definition
concerning the Low Level Design Process.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on the Low Level Design Process.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the Low
Level Design Process.
3) The Quality Assessment Report on the Low Level Design Process is available to
management.
Notes: None
QUALITY ASSESSMENT REPORT ON LOW LEVEL DESIGN PROCESS
Purpose:
This document reports the Process Quality Assessment Board's evaluation of the process that
produces the Low Level Design Document. The questionnaire below provides evaluation criteria
to facilitate evaluating this process. Any other evaluation reporting information may be
appended to it in order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score
1) Management:
1.1) Was the Low Level Design process appropriately scheduled and initiated relative
to the other phases of the software development process?
______
1.2) Was there an acceptable project schedule plan for the preparation of the
Low Level Design Document? ______
1.3) Were activities prioritized adequately? ______
1.4) Did the process actually conform to the schedule? If not, describe the
deviations, reasons and costs. ______
1.5) Were the resources (including personnel) adequately and appropriately
allocated? ______
1.6) Were the tasks clearly defined and assigned an owner?
______
1.7) Were meetings properly prepared, conducted and reported? ______
1.8) Was there satisfactory tracking of the process according to the Project Tracking
Plan? ______
1.9) Were the risks related to the process adequately estimated at the beginning of the
process? ______
1.10) Were costs and scheduling monitored? ______
1.11) Were appropriate actions taken when deviation from plans occurred? ______
1.12) Were the risks that became reality adequately compensated for? ______
2) Documentation:
2.1) Were the Entry conditions for the process satisfied prior to commencing the
process? ______
2.2) Were the Implementation conditions all satisfied? ______
2.3) Were the Exit conditions for the process satisfied no later than when the
Low Level Design Document was approved? ______
2.4) Does the Low Level Design Document address adequately each of the items in
the Requirements for the Low Level Design Document?
______
3) Documentation Standards:
3.1) Does the Low Level Design Document satisfy the conditions in the Documentation
standards of the project? ______
4) Metrics:
4.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported?
4.1.1) The numerical data required for the Low Level Design Process in the Monitor
Data Definition.
4.1.2) The numerical data pertaining to the product quality required in the Quality
Control Plan of the Low Level Design Process.
______
5) Reviews and Audits:
5.1) Were the following reviews and audits performed satisfactorily by the software
project team?
5.1.1) Examination of the Requirements Document.
5.1.2) Reviews of the preliminary drafts of the Low Level Design Document.
5.1.3) Review of the final draft Low Level Design Document. ______
5.2) Did the final draft of the Low Level Design Document meet approval, and if
not, was there closure on all exceptions to the approval conditions?
______
5.3) Was the approved version of the Low Level Design Document archived under
change control for subsequent update?
______
6) Standards:
6.1) Was there an adequate Quality Control Plan for this process? ______
6.2) Were the stipulations in the Quality Control Plan complied with? ______
7) Tests:
7.1) Were adequate plans made for testing and controlling the quality of the
Low Level Design Document? ______
7.2) Were all test and quality control plans adhered to in developing the
Low Level Design Document? ______
8) Change control:
8.1) Were the Change Control Standards for Documents adhered to?
______
9) Media and Hardware Control:
9.1) Were all documents generated during the process maintained on the appropriate
media according to the Media Control Standards? ______
10) Records:
10.1) Were all documents generated during the process recorded appropriately and
made accessible to the project staff? ______
11) Training:
11.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the process? ______
11.2) Was adequate training about the process provided to the staff prior to its need?
______
Summary Evaluation:
TOTAL SCORE (sum of the scores / total possible sum) ___________
================================
Recommendations:
QUESTION ID COMMENTS
12. Code Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the Code Process. This
assessment process has three primary goals:
1) Review the process that produces the code of the software product from the low level design
document.
2) Generate the data required by the Monitor Data Definition Document concerning the
quality of this process; they are also for use in preparing the Quality Assessment Report on
Code.
3) Write the Quality Assessment Report on Code, which will contain recommendations on
ways to improve this process and its management in the current project and ways to
modify the process in the organization’s other projects.
Entry Conditions:
1) The low level design has been initiated.
2) The Quality Plan and the Final Project Schedule Plan have been initiated.
Input Summary:
1) The Low Level Design Document.
2) Documentation Standards Document.
3) Project Tracking Plan for Code phase.
4) Project Scheduling Plan for Code phase.
5) Change Control Process Definition.
6) Monitor Data Definition.
7) Monitor Data History Archive.
8) Documents from reviews of code.
9) Documents from code inspections by Quality Control.
10) Documents and data from change control, scheduling and tracking activities.
Implementation Conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) Code.
1.2) Documents from reviews of code
1.3) Documents from code inspections by Quality Control.
1.4) All documents containing data on management and document quality generated
by the Code phase.
2) Evaluate the process that produces the Code, including its management component, using the
relevant data in the Monitor Data History Archive as standards.
2.1) Evaluate implementation of scheduling for the Code phase, using the following forms as
the basis for the evaluation:
2.1.1) kickoff meeting minutes,
2.1.2) full review meeting minutes,
2.1.3) overview project schedule plan form,
2.1.4) activity responsibility matrix form,
2.1.5) project checklist form,
2.1.6) Gantt charts for the Code phase.
2.2) Evaluate implementation of tracking for the Code phase, using the following forms as the
basis for the evaluation:
2.2.1) PTT meeting minutes,
2.2.2) work and escalation meeting minutes,
2.2.3) project checklist form,
2.2.4) action log form,
2.2.5) action log progress form.
2.3) Evaluate implementation of the change control process for the Code phase, using the
following forms as the basis for the evaluation:
2.3.1) change control form,
2.3.2) change proposal status form,
2.3.3) implementation schedule document.
3) Complete the Quality Assessment Report on Code.
4) Record the quality assessment data required by the Monitor Data definition concerning the
Code phase.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on Code.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the Code
Process.
3) The Quality Assessment Report on Code is available to management.
Notes: None.
QUALITY ASSESSMENT REPORT ON CODE
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the process that
produces the Code. The questionnaire below provides evaluation criteria to facilitate evaluating
this process. Any other evaluation reporting information may be appended to it in order to
complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
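As an illustration only, the scoring arithmetic described above can be sketched in a few lines of code. The helper names below are hypothetical and not part of the report form; they simply restate the rules: each question's score is the percentage of its evaluation conditions satisfied, and the total score is the sum of the question scores divided by the total possible sum.

```python
# Hypothetical sketch of the scoring rules above (not part of the official form).

def question_score(satisfied: int, total: int) -> float:
    """Score for one question: percent of its evaluation conditions satisfied."""
    return 100.0 * satisfied / total

def total_score(question_scores: list) -> float:
    """Overall score: sum of the scores divided by the total possible sum
    (100 points possible per question)."""
    return sum(question_scores) / (100.0 * len(question_scores))

# Example: three questions with 4/4, 1/2, and 3/4 conditions satisfied.
scores = [question_score(4, 4), question_score(1, 2), question_score(3, 4)]
overall = total_score(scores)   # (100 + 50 + 75) / 300 = 0.75
```

In this sketch a question with no evidence for any condition scores 0, and the overall result is a fraction between 0 and 1, consistent with the "sum of the scores / total possible sum" entry in the Summary Evaluation.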
Evaluation Criteria: Score:
1) Management:
1.1) Was the code process appropriately scheduled and initiated relative to the other
phases of the software development process? ______
1.2) Was there an acceptable project schedule plan for the preparation of the Code? ______
1.3) Were activities prioritized adequately? ______
1.4) Did the process actually conform to the schedule? If not, describe the deviations,
reasons and costs.
______
1.5) Were the resources (including personnel) adequately and appropriately allocated?
______
1.6) Were the tasks clearly defined and each assigned an owner?
______
1.7) Were too few meetings held to complete the Code process in a
satisfactory manner? ______
1.8) Were too many meetings needed to complete the Code process in a satisfactory
manner? ______
1.9) Were meetings properly prepared, conducted and reported? ______
1.10) Were the risks related to the process adequately estimated at the beginning of the
process? ______
1.11) Were costs and scheduling monitored? ______
1.12) Was the number of deviations from the plans at an acceptable level?
______
1.13) Were appropriate actions taken when deviation from plans occurred?
______
1.14) Were the risks that became reality adequately compensated for? ______
1.15) Were management approvals given appropriately and in a timely fashion? ______
2) Documentation:
2.1) Were the Entry Conditions for the Code process satisfied prior to commencing the
process? ______
2.2) Does the Input Summary for the Code process include all needed and/or relevant
input? ______
2.3) Were the Implementation Conditions of the Code process all satisfied?
______
2.4) Were additional implementation conditions needed for the Code process?
______
2.5) Does the Output Summary of the Code process include all needed and/or relevant
output? ______
2.6) Were the Exit conditions for the Code process satisfied no later than when the
System testing was completed?
______
3) Documentation Standards:
3.1) Does the Code Process Definition Document satisfy the conditions in the
Documentation Standards? ______
4) Coding Standards:
4.1) Are the Code Standards sufficient to produce acceptable code? ______
5) Code Documentation Standards:
5.1) Are the Code Documentation standards sufficient to clearly and fully document the
code? ______
6) Metrics:
6.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported? ______
6.1.1) The numerical data required for the Code Process in the Monitor Data
Definition ______
6.1.2) The numerical data pertaining to the product quality required in the
Quality Control Plan of the Code Process
______
7) Reviews and Inspections:
7.1) Were the following reviews and inspections performed satisfactorily by the
software project team? ______
7.1.1) Examination of the low level design document ______
7.1.2) Reviews of the code by the Review Committee
______
7.1.3) Inspections of the code by the Quality Control reviewers. ______
8) Standards:
8.1) Was there an adequate Quality Control Plan for the Code process? ______
8.2) Were the stipulations listed in the Code Quality Control Plan complied with?
______
8.3) Was the Code Quality Control Plan adhered to?
______
9) Change Control:
9.1) Were the Change Control Standards for Documents adhered to?
______
9.2) Were there an excessive number of changes to the Code?
______
10) Tools:
10.1) If programming tools were used, were they appropriate and did they produce
code that met the Coding Standards? ______
11) Code Control:
11.1) Was the Code Process Definition adhered to?
______
12) Media and Hardware Control:
12.1) Were all documents generated during the Code process maintained on the
appropriate media according to the Media Control Standards? ______
13) Supplier Control:
13.1) Did the supplier adhere to the Code Process Definition Document in producing
Code? ______
13.2) Was the Code Quality Control Plan adhered to? ______
13.3) Did the code meet all conditions in the Coding Standards and Code Document
Standards? ______
14) Records:
14.1) Were all documents generated during the Code process recorded appropriately and
made accessible to the project staff? ______
15) Training:
15.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the Code process? ______
15.2) Was adequate training about the Code process provided to the staff prior to its
being needed? ______
Summary Evaluation:
TOTAL SCORE (sum of the scores/total possible sum)
___________
=====================================================================
Recommendations:
QUESTION ID COMMENTS
13. Unit Test Plan Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the Unit Test. This assessment
process has three primary goals:
1) Review the Unit Test.
2) Generate the data required by the Monitor Data Definition Document concerning the
quality of this process; these data are also used in preparing the Quality Assessment
Report on Unit Test.
3) Write the Quality Assessment Report on Unit Test, which will contain recommendations on
ways to improve this process and its management in the current project and ways to modify
the process in the organization’s other projects.
Entry Conditions:
1) Documentation and Media Standards.
2) Project Tracking Plan for Unit Test phase.
3) Project Scheduling Plan for Unit Test phase.
4) Change Control Process.
5) Monitor Data.
6) Monitor Data History Archive.
7) Unit Test Plan for Unit Test phase.
Input Summary:
1) Documentation and Media Standards Document.
2) Project Tracking Plan Document for Unit Test phase.
3) Project Scheduling Plan Document for Unit Test phase.
4) Change Control Process Definition.
5) Monitor Data Definition.
6) Monitor Data History Archive.
7) Documents from reviews of Unit Test.
8) Documents from Unit Test inspections by Quality Control.
9) Documents and data from change control, scheduling and tracking activities.
10) Unit Test Plan Document for Unit Test phase.
Implementation Conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) Unit Test.
1.2) Documents from reviews of Unit Test.
1.3) Documents from Unit Test inspections by Quality Control.
1.4) All documents containing data on management and document quality generated
by the Unit Test phase.
2) Evaluate the Unit Test, using the relevant data in the Monitor Data History Archive as
standards.
3) Complete the Quality Assessment Report on Unit Test.
4) Record the quality assessment data required by the Monitor Data definition concerning the
Unit Test phase.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on Unit Test.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the Unit
Test Process.
3) The Quality Assessment Report on Unit Test is available to management.
Notes: None.
QUALITY ASSESSMENT REPORT ON UNIT TEST
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the Unit test. The
questionnaire below provides evaluation criteria to facilitate evaluating this process. Any other
evaluation reporting information may be appended to it in order to complete the Quality
Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score:
1) Management:
1.1) Was the unit test appropriately scheduled and initiated relative to the other
phases of the software development process? ______
1.2) Was there an acceptable project schedule plan for the preparation of the Unit test?
______
1.3) Were activities prioritized adequately? ______
1.4) Did the process actually conform to the schedule? If not, describe the deviations,
reasons and costs.
______
1.5) Were the resources (including personnel) adequately and appropriately allocated?
______
1.6) Were the tasks clearly defined and each assigned an owner?
______
1.7) Were too few meetings held to complete the Unit test in a
satisfactory manner? ______
1.8) Were too many meetings needed to complete the Unit test in a satisfactory
manner? ______
1.9) Were meetings properly prepared, conducted and reported? ______
1.10) Were the risks related to the process adequately estimated at the beginning of the
process? ______
1.11) Were costs and scheduling monitored? ______
1.12) Was the number of deviations from the plans at an acceptable level?
______
1.13) Were appropriate actions taken when deviation from plans occurred?
______
1.14) Were the risks that became reality adequately compensated for? ______
1.15) Were management approvals given appropriately and in a timely fashion? ______
2) Documentation:
2.1) Were the Entry Conditions for the Unit test satisfied prior to commencing the
process? ______
2.2) Does the Input Summary for the Unit test include all needed and/or relevant
input? ______
2.3) Were the Implementation Conditions of the Unit test all satisfied?
______
2.4) Were additional implementation conditions needed for the Unit test? ______
2.5) Does the Output Summary of the Unit test include all needed and/or relevant
output? ______
2.6) Were the Exit conditions for the Unit test satisfied no later than when the
System testing was completed?
______
3) Documentation Standards:
3.1) Does the Unit test Definition Document satisfy the conditions in the
Documentation Standards? ______
4) Unit test Documentation Standards:
4.1) Are the Unit test Documentation standards sufficient to clearly and fully document the
unit test?
______
5) Metrics:
5.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported? ______
5.1.1) The numerical data required for the Unit test in the Monitor Data
Definition ______
5.1.2) The numerical data pertaining to the product quality required in the
Quality Control Plan of the Unit test ______
6) Reviews and Inspections:
6.1) Were the following reviews and inspections performed satisfactorily by the
software project team? ______
6.1.1) Examination of the low level design document ______
6.1.2) Reviews of the unit test by the Review Committee
______
6.1.3) Inspections of the unit test by the Quality Control reviewers.
______
7) Standards:
7.1) Was there an adequate Quality Control Plan for the Unit test? ______
7.2) Were the stipulations listed in the Unit test Quality Control Plan complied with?
______
7.3) Was the Unit test Quality Control Plan adhered to?
______
8) Change Control:
8.1) Were the Change Control Standards for Documents adhered to?
______
8.2) Were there an excessive number of changes to the Unit test?
______
9) Unit test Control:
9.1) Was the Unit test Definition adhered to? ______
10) Media and Hardware Control:
10.1) Were all documents generated during the Unit test maintained on the
appropriate media according to the Media Control Standards? ______
11) Supplier Control:
11.1) Did the supplier adhere to the Unit test Definition Document in producing
Unit test? ______
11.2) Was the Unit test Quality Control Plan adhered to? ______
12) Records:
12.1) Were all documents generated during the Unit test recorded appropriately and made
accessible to the project staff? ______
13) Training:
13.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the Unit test? ______
13.2) Was adequate training about the Unit test provided to the staff prior to its
being needed? ______
Summary Evaluation:
TOTAL SCORE (sum of the scores/total possible sum)
___________
=====================================================================
Recommendations:
QUESTION ID COMMENTS
14. Function Test Plan Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the Function Test. This
assessment process has three primary goals:
1) Review the Function Test.
2) Generate the data required by the Monitor Data Definition Document concerning the
quality of this process; these data are also used in preparing the Quality Assessment
Report on Function Test.
3) Write the Quality Assessment Report on Function Test, which will contain recommendations
on ways to improve this process and its management in the current project and ways to
modify the process in the organization’s other projects.
Entry Conditions:
1) Documentation and Media Standards.
2) Project Tracking Plan for Function Test phase.
3) Project Scheduling Plan for Function Test phase.
4) Change Control Process.
5) Monitor Data.
6) Monitor Data History Archive.
7) Function Test Plan for Function Test phase.
Input Summary:
1) Documentation and Media Standards Document.
2) Project Tracking Plan Document for Function Test phase.
3) Project Scheduling Plan Document for Function Test phase.
4) Change Control Process Definition.
5) Monitor Data Definition.
6) Monitor Data History Archive.
7) Documents from reviews of Function Test.
8) Documents from Function Test inspections by Quality Control.
9) Documents and data from change control, scheduling and tracking activities.
10) Function Test Plan Document for Function Test phase.
Implementation Conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) Function Test.
1.2) Documents from reviews of Function Test.
1.3) Documents from Function Test inspections by Quality Control.
1.4) All documents containing data on management and document quality generated
by the Function Test phase.
2) Evaluate the Function Test, using the relevant data in the Monitor Data History Archive as
standards.
3) Complete the Quality Assessment Report on Function Test.
4) Record the quality assessment data required by the Monitor Data definition concerning the
Function Test phase.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on Function Test.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the
Function Test Process.
3) The Quality Assessment Report on Function Test is available to management.
Notes: None.
QUALITY ASSESSMENT REPORT ON FUNCTION TEST
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the Function test.
The questionnaire below provides evaluation criteria to facilitate evaluating this process. Any
other evaluation reporting information may be appended to it in order to complete the Quality
Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score:
1) Management:
1.1) Was the function test appropriately scheduled and initiated relative to the other
phases of the software development process? ______
1.2) Was there an acceptable project schedule plan for the preparation of the Function test?
______
1.3) Were activities prioritized adequately? ______
1.4) Did the process actually conform to the schedule? If not, describe the deviations,
reasons and costs.
______
1.5) Were the resources (including personnel) adequately and appropriately allocated?
______
1.6) Were the tasks clearly defined and each assigned an owner?
______
1.7) Were too few meetings held to complete the Function test in a
satisfactory manner? ______
1.8) Were too many meetings needed to complete the Function test in a satisfactory
manner? ______
1.9) Were meetings properly prepared, conducted and reported? ______
1.10) Were the risks related to the process adequately estimated at the beginning of the
process? ______
1.11) Were costs and scheduling monitored? ______
1.12) Was the number of deviations from the plans at an acceptable level?
______
1.13) Were appropriate actions taken when deviation from plans occurred?
______
1.14) Were the risks that became reality adequately compensated for? ______
1.15) Were management approvals given appropriately and in a timely fashion? ______
2) Documentation:
2.1) Were the Entry Conditions for the Function test satisfied prior to commencing the
process? ______
2.2) Does the Input Summary for the Function test include all needed and/or relevant
input? ______
2.3) Were the Implementation Conditions of the Function test all satisfied?
______
2.4) Were additional implementation conditions needed for the Function test?
______
2.5) Does the Output Summary of the Function test include all needed and/or relevant
output? ______
2.6) Were the Exit conditions for the Function test satisfied no later than when the
System testing was completed?
______
3) Documentation Standards:
3.1) Does the Function test Definition Document satisfy the conditions in the
Documentation Standards? ______
4) Function test Documentation Standards:
4.1) Are the Function test Documentation standards sufficient to clearly and fully
document the function test?
______
5) Metrics:
5.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported? ______
5.1.1) The numerical data required for the Function test in the Monitor Data
Definition ______
5.1.2) The numerical data pertaining to the product quality required in the
Quality Control Plan of the Function test
______
6) Reviews and Inspections:
6.1) Were the following reviews and inspections performed satisfactorily by the
software project team? ______
6.1.1) Examination of the low level design document ______
6.1.2) Reviews of the function test by the Review Committee
______
6.1.3) Inspections of the function test by the Quality Control reviewers.
______
7) Standards:
7.1) Was there an adequate Quality Control Plan for the Function test?
______
7.2) Were the stipulations listed in the Function test Quality Control Plan complied with?
______
7.3) Was the Function test Quality Control Plan adhered to? ______
8) Change Control:
8.1) Were the Change Control Standards for Documents adhered to?
______
8.2) Were there an excessive number of changes to the Function test? ______
9) Function test Control:
9.1) Was the Function test Definition adhered to? ______
10) Media and Hardware Control:
10.1) Were all documents generated during the Function test maintained on the
appropriate media according to the Media Control Standards? ______
11) Supplier Control:
11.1) Did the supplier adhere to the Function test Definition Document in producing
Function test? ______
11.2) Was the Function test Quality Control Plan adhered to? ______
12) Records:
12.1) Were all documents generated during the Function test recorded appropriately and
made accessible to the project staff? ______
13) Training:
13.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the Function test? ______
13.2) Was adequate training about the Function test provided to the staff prior to its
being needed? ______
Summary Evaluation:
TOTAL SCORE (sum of the scores/total possible sum)
___________
=====================================================================
Recommendations:
QUESTION ID COMMENTS
15. Component Test Plan Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the Component Test. This
assessment process has three primary goals:
1) Review the Component Test.
2) Generate the data required by the Monitor Data Definition Document concerning the
quality of this process; these data are also used in preparing the Quality Assessment
Report on Component Test.
3) Write the Quality Assessment Report on Component Test, which will contain
recommendations on ways to improve this process and its management in the current
project and ways to modify the process in the organization’s other projects.
Entry Conditions:
1) Documentation and Media Standards.
2) Project Tracking Plan for Component Test phase.
3) Project Scheduling Plan for Component Test phase.
4) Change Control Process.
5) Monitor Data.
6) Monitor Data History Archive.
7) Component Test Plan for Component Test phase.
Input Summary:
1) Documentation and Media Standards Document.
2) Project Tracking Plan Document for Component Test phase.
3) Project Scheduling Plan Document for Component Test phase.
4) Change Control Process Definition.
5) Monitor Data Definition.
6) Monitor Data History Archive.
7) Documents from reviews of Component Test.
8) Documents from Component Test inspections by Quality Control.
9) Documents and data from change control, scheduling and tracking activities.
10) Component Test Plan Document for Component Test phase.
Implementation Conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) Component Test.
1.2) Documents from reviews of Component Test.
1.3) Documents from Component Test inspections by Quality Control.
1.4) All documents containing data on management and document quality generated
by the Component Test phase.
2) Evaluate the Component Test, using the relevant data in the Monitor Data History Archive as
standards.
3) Complete the Quality Assessment Report on Component Test.
4) Record the quality assessment data required by the Monitor Data definition concerning the
Component Test phase.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on Component Test.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the
Component Test Process.
3) The Quality Assessment Report on Component Test is available to management.
Notes: None.
QUALITY ASSESSMENT REPORT ON COMPONENT TEST
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the Component
test. The questionnaire below provides evaluation criteria to facilitate evaluating this process.
Any other evaluation reporting information may be appended to it in order to complete the
Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score:
1) Management:
1.1) Was the component test appropriately scheduled and initiated relative to the other
phases of the software development process? ______
1.2) Was there an acceptable project schedule plan for the preparation of the Component test?
______
1.3) Were activities prioritized adequately? ______
1.4) Did the process actually conform to the schedule? If not, describe the deviations,
reasons and costs.
______
1.5) Were the resources (including personnel) adequately and appropriately allocated?
______
1.6) Were the tasks clearly defined and each assigned an owner?
______
1.7) Were too few meetings held to complete the Component test in a
satisfactory manner? ______
1.8) Were too many meetings needed to complete the Component test in a satisfactory
manner? ______
1.9) Were meetings properly prepared, conducted and reported? ______
1.10) Were the risks related to the process adequately estimated at the beginning of the
process? ______
1.11) Were costs and scheduling monitored? ______
1.12) Was the number of deviations from the plans at an acceptable level?
______
1.13) Were appropriate actions taken when deviation from plans occurred?
______
1.14) Were the risks that became reality adequately compensated for? ______
1.15) Were management approvals given appropriately and in a timely fashion? ______
2) Documentation:
2.1) Were the Entry Conditions for the Component test satisfied prior to commencing the
process? ______
2.2) Does the Input Summary for the Component test include all needed and/or relevant
input? ______
2.3) Were the Implementation Conditions of the Component test all satisfied?
______
2.4) Were additional implementation conditions needed for the Component test?
______
2.5) Does the Output Summary of the Component test include all needed and/or relevant
output? ______
2.6) Were the Exit conditions for the Component test satisfied no later than when the
System testing was completed?
______
3) Documentation Standards:
3.1) Does the Component test Definition Document satisfy the conditions in the
Documentation Standards? ______
4) Component test Documentation Standards:
4.1) Are the Component test Documentation standards sufficient to clearly and fully
document the component test?
______
5) Metrics:
5.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported? ______
5.1.1) The numerical data required for the Component test in the Monitor Data
Definition ______
5.1.2) The numerical data pertaining to the product quality required in the
Quality Control Plan of the Component test
______
6) Reviews and Inspections:
6.1) Were the following reviews and inspections performed satisfactorily by the
software project team? ______
6.1.1) Examination of the low level design document ______
6.1.2) Reviews of the component test by the Review Committee
______
6.1.3) Inspections of the component test by the Quality Control reviewers.
______
7) Standards:
7.1) Was there an adequate Quality Control Plan for the Component test? ______
7.2) Were the stipulations listed in the Component test Quality Control Plan complied with?
______
7.3) Was the Component test Quality Control Plan adhered to? ______
8) Change Control:
8.1) Were the Change Control Standards for Documents adhered to?
______
8.2) Were there an excessive number of changes to the Component test?
______
9) Component test Control:
9.1) Was the Component test Definition adhered to? ______
10) Media and Hardware Control:
10.1) Were all documents generated during the Component test maintained on the
appropriate media according to the Media Control Standards? ______
11) Supplier Control:
11.1) Did the supplier adhere to the Component test Definition Document in producing
Component test? ______
11.2) Was the Component test Quality Control Plan adhered to? ______
12) Records:
12.1) Were all documents generated during the Component test recorded appropriately and
made accessible to the project staff?
______
13) Training:
13.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the Component test? ______
13.2) Was adequate training about the Component test provided to the staff prior to its
being needed? ______
Summary Evaluation:
TOTAL SCORE (sum of the scores/total possible sum)
___________
=====================================================================
Recommendations:
QUESTION ID COMMENTS
16. System Test Plan Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the System Test. This assessment
process has three primary goals:
1) Review the System Test.
2) Generate the data required by the Monitor Data Definition Document concerning the
quality of this process; these data are also used in preparing the Quality Assessment
Report on System Test.
3) Write the Quality Assessment Report on System Test, which will contain recommendations
on ways to improve this process and its management in the current project and ways to
modify the process in the organization’s other projects.
Entry Conditions:
1) Documentation and Media Standards.
2) Project Tracking Plan for System Test phase.
3) Project Scheduling Plan for System Test phase.
4) Change Control Process.
5) Monitor Data.
6) Monitor Data History Archive.
7) System Test Plan for System Test phase.
Input Summary:
1) Documentation and Media Standards Document.
2) Project Tracking Plan Document for System Test phase.
3) Project Scheduling Plan Document for System Test phase.
4) Change Control Process Definition.
5) Monitor Data Definition.
6) Monitor Data History Archive.
7) Documents from reviews of System Test.
8) Documents from System Test inspections by Quality Control.
9) Documents and data from change control, scheduling and tracking activities.
10) System Test Plan Document for System Test phase.
Implementation Conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) System Test.
1.2) Documents from reviews of System Test.
1.3) Documents from System Test inspections by Quality Control.
1.4) All documents containing data on management and document quality generated
by the System Test phase.
2) Evaluate the System Test, using the relevant data in the Monitor Data History Archive as
standards.
3) Complete the Quality Assessment Report on System Test.
4) Record the quality assessment data required by the Monitor Data Definition concerning the
System Test phase.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on System Test.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the
System Test Process.
3) The Quality Assessment Report on System Test is available to management.
Notes: None.
QUALITY ASSESSMENT REPORT ON SYSTEM TEST
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the System test.
The questionnaire below provides evaluation criteria to facilitate evaluating this process. Any
other evaluation reporting information may be appended to it in order to complete the Quality
Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
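The scoring rule described above can be sketched in code. The following is an illustrative sketch only, not part of the process definition; the function names, data structures, and the two-question example are assumptions introduced here. It shows a question score as the percent of conditions satisfied (with no credit for a condition lacking evidence) and the TOTAL SCORE as the sum of question scores over the total possible sum.

```python
# Illustrative sketch of the questionnaire scoring rule (assumed names).
# A condition earns credit only when it is satisfied AND evidence exists.

def question_score(conditions):
    """conditions: list of (satisfied, has_evidence) boolean pairs.
    Returns the percent of conditions credited for this question."""
    if not conditions:
        return 0.0
    credited = sum(1 for satisfied, evidence in conditions
                   if satisfied and evidence)  # no evidence, no credit
    return 100.0 * credited / len(conditions)

def total_score(question_scores):
    """TOTAL SCORE: sum of the scores over the total possible sum,
    expressed as a percentage."""
    possible = 100.0 * len(question_scores)
    return 100.0 * sum(question_scores) / possible if possible else 0.0

# Hypothetical example: question 1.1 has 4 conditions, 3 credited -> 75%;
# question 1.2 has 2 conditions, one lacking evidence -> 50%.
q11 = question_score([(True, True), (True, True), (True, True), (False, False)])
q12 = question_score([(True, True), (True, False)])
overall = total_score([q11, q12])  # (75 + 50) / 200 * 100 = 62.5
```

The evidence check reflects the instruction that an affirmative response must be backed by evidence before any positive score is assigned for that condition.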
Evaluation Criteria: Score:
1) Management:
1.1) Was the system test appropriately scheduled and initiated relative to the other
phases of the software development process? ______
1.2) Was there an acceptable project schedule plan for the preparation of the System test?
______
1.3) Were activities prioritized adequately? ______
1.4) Did the process actually conform to the schedule? If not, describe the deviations,
reasons and costs.
______
1.5) Were the resources (including personnel) adequately and appropriately allocated?
______
1.6) Were the tasks clearly defined and assigned an owner?
______
1.7) Were too few meetings held to complete the System test in a satisfactory
manner? ______
1.8) Were too many meetings needed to complete the System test in a satisfactory
manner? ______
1.9) Were meetings properly prepared, conducted and reported? ______
1.10) Were the risks related to the process adequately estimated at the beginning of the
process? ______
1.11) Were costs and scheduling monitored? ______
1.12) Was the number of deviations from the plans at an acceptable level?
______
1.13) Were appropriate actions taken when deviation from plans occurred?
______
1.14) Were the risks that became reality adequately compensated for? ______
1.15) Were management approvals given appropriately and in a timely fashion? ______
2) Documentation:
2.1) Were the Entry Conditions for the System test satisfied prior to commencing the
process? ______
2.2) Does the Input Summary for the System test include all needed and/or relevant
input? ______
2.3) Were the Implementation Conditions of the System test all satisfied? ______
2.4) Were additional implementation conditions needed for the System test? ______
2.5) Does the Output Summary of the System test include all needed and/or relevant
output? ______
2.6) Were the Exit conditions for the System test satisfied no later than when the System
testing was completed?
______
3) Documentation Standards:
3.1) Does the System test Definition Document satisfy the conditions in the
Documentation Standards? ______
4) System test Documentation Standards:
4.1) Are the System test Documentation standards sufficient to clearly and fully
document the system test? ______
5) Metrics:
5.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported? ______
5.1.1) The numerical data required for the System test in the Monitor Data
Definition ______
5.1.2) The numerical data pertaining to the product quality required in the
Quality Control Plan of the System test ______
6) Reviews and Inspections:
6.1) Were the following reviews and inspections performed satisfactorily by the
software project team? ______
6.1.1) Examination of the low level design document ______
6.1.2) Reviews of the system test by the Review Committee
______
6.1.3) Inspections of the system test by the Quality Control reviewers. ______
7) Standards:
7.1) Was there an adequate Quality Control Plan for the System test? ______
7.2) Were the stipulations listed in the System test Quality Control Plan complied with?
______
7.3) Was the System test Quality Control Plan adhered to?
______
8) Change Control:
8.1) Were the Change Control Standards for Documents adhered to?
______
8.2) Was there an excessive number of changes to the System test? ______
9) System test Control:
9.1) Was the System test Definition adhered to? ______
10) Media and Hardware Control:
10.1) Were all documents generated during the System test maintained on the
appropriate media according to the Media Control Standards? ______
11) Supplier Control:
11.1) Did the supplier adhere to the System test Definition Document in producing
the System test? ______
11.2) Was the System test Quality Control Plan adhered to? ______
12) Records:
12.1) Were all documents generated during the System test recorded appropriately and
made accessible to the project staff? ______
13) Training:
13.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the System test? ______
13.2) Was adequate training about the System test provided to the staff prior to its
being needed? ______
Summary Evaluation:
TOTAL SCORE (sum of the scores/total possible sum)
___________
=====================================================================
Recommendations:
QUESTION ID COMMENTS
17. First Draft Publications Plan Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the first-draft publications process.
This assessment process has three primary goals:
1) Review the process that produces the first draft of the product’s publications, which is
made available for review by groups within the project.
2) Generate the data required by the Monitor Data Definition concerning the quality of this
process; these data are also used in preparing the Quality Assessment Report on First-Draft
Publications.
3) Write the Quality Assessment Report on First-Draft Publications, which will contain
recommendations on ways to improve this process and its management in the current project
and ways to modify the process in the organization’s other projects.
Entry Conditions:
Availability of the following artifacts:
1) Documentation Standards.
2) Publication Contents Plan.
3) Objectives Document.
4) Test Plans.
5) Low Level Design Plan.
6) Code.
7) Unit and Function test Results.
8) Preliminary Component Test Results.
9) Project Tracking Plan for this phase.
10) Project Scheduling Plan for this phase.
11) Change Control Process Definition Document.
12) Monitor Data Definition Document.
13) Monitor Data History Archive.
Input Summary:
1) Documentation and Media Control Standards.
2) Publication Contents Plan.
3) Project Tracking Plan for this phase.
4) Project Scheduling Plan for this phase.
5) Change Control Process Definition.
6) Monitor Data Definition.
7) Monitor Data History Archive.
8) Documents from reviews of the First-Draft Publications process.
9) Data from change control, scheduling and tracking activities.
10) The Publications First Draft.
Implementation conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) First-Draft Publications.
1.2) All documents containing data on management quality and document quality generated
by the First-Draft Publications Process.
2) Evaluate the process that produces the First-Draft Publications, including its management
component, using the relevant data in Monitor Data History Archive and Publication Content
Plan as standards. This includes evaluating the activities in the Tracking, Scheduling and
Change Control Processes.
3) Complete the Quality Assessment Report on First-Draft Publications.
4) Record the quality assessment data required by the Monitor Data Definition concerning the
First-Draft Publications Process.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on First-Draft Publications.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required assessment data on the First-Draft
Publications Process.
3) The Quality Assessment Report on First-Draft Publications is available to management.
Notes: None.
QUALITY ASSESSMENT REPORT ON FIRST DRAFT PUBLICATIONS
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the process that
produces the First Draft Publications. The questionnaire below provides evaluation criteria to
facilitate evaluating this process. Any other evaluation reporting information may be appended to
it in order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations in
the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score
1) Management:
1.1) Was the first draft publications process appropriately scheduled and initiated relative to
the other phases of the software development process? _____
1.2) Was there an acceptable project schedule plan for the preparation of the First Draft
Publications?
_____
1.3) Were activities prioritized adequately? _____
1.4) Did the process actually conform to the schedule? If not, describe the deviations,
reasons and costs. _____
1.5) Were the resources (including personnel) adequately and appropriately allocated?
_____
1.6) Were the tasks clearly defined and assigned an owner? _____
1.7) Were the meetings properly prepared, conducted and reported?
_____
1.8) Was there satisfactory tracking of the process according to the Project Tracking Plan?
_____
1.9) Were the risks related to the process adequately estimated at the beginning of the
process?
_____
1.10) Were costs and scheduling monitored? _____
1.11) Were appropriate actions taken when deviation from plans occurred? _____
1.12) Were the risks that became reality adequately compensated for?
_____
2) Documentation:
2.1) Were the Entry conditions for the process satisfied prior to commencing the process?
_____
2.2) Were the Implementation conditions all satisfied? _____
2.3) Were the Exit conditions for the process satisfied no later than when the First Draft
Publications was approved? _____
2.4) Does the First Draft Publications cover aspects of the software, such as instructions,
manuals, code etc. adequately?
_____
3) Documentation Standards:
3.1) Does the First Draft Publications satisfy the conditions in the Documentation Standards
of the project? _____
4) Metrics:
4.1) Were the following Quality Assessment monitoring data satisfactorily collected and
reported? _____
4.1.1) The numerical data required for the First Draft Publications Process in the
Monitor Data Definition _____
4.1.2) The numerical data pertaining to the product quality required in the Quality
Control Plan of the First Draft Publications Process. _____
5) Reviews and Audits:
5.1) Were the following reviews and audits performed satisfactorily by the software project
team?
5.1.1) Examination of the Publications Content Plan.
5.1.2) Reviews of the preliminary drafts of the First Draft Publications.
5.1.3) Review of the final draft of the First Draft Publications.
_____
6) Standards:
6.1) Was there an adequate Quality Control Plan for this process? _____
6.2) Were the stipulations in the Quality Control Plan complied with?
_____
7) Tests:
7.1) Were adequate plans made for testing and controlling the quality of the First Draft
Publications? _____
7.2) Were all test and quality control plans adhered to in developing the First Draft
Publications? _____
8) Change control:
8.1) Were the Change Control Standards for Documents adhered to? _____
9) Media and Hardware Control:
9.1) Were all documents generated during the process maintained on the appropriate media
according to the Media Control Standards? _____
10) Records:
10.1) Were all documents generated during the process recorded appropriately and made
accessible to the project staff? _____
11) Training:
11.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the process? _____
11.2) Was adequate training about the process provided to the staff prior to its being needed? _____
Summary Evaluation:
TOTAL SCORE (sum of the scores /total possible sum) ________________
=====================================================================
Recommendations:
QUESTION ID COMMENTS
18. Second Draft Publications Plan Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the second-draft publications
process. This assessment process has three primary goals:
1) Review the process that produces the second draft of the product’s publications, which is
made available for review by groups within the project.
2) Generate the data required by the Monitor Data Definition concerning the quality of this
process; these data are also used in preparing the Quality Assessment Report on Second-Draft
Publications.
3) Write the Quality Assessment Report on Second-Draft Publications, which will contain
recommendations on ways to improve this process and its management in the current project
and ways to modify the process in the organization’s other projects.
Entry Conditions:
Availability of the following artifacts:
1) Documentation Standards.
2) Publications Content Plan.
3) First-Draft Publications.
4) Objectives Document.
5) Test Plans.
6) Low Level Design Plan.
7) Code.
8) Unit, Function, and Component test Results.
9) Preliminary System Test Results.
10) Project Tracking Plan for this phase.
11) Project Scheduling Plan for this phase.
12) Change Control Process Definition Document.
13) Monitor Data Definition Document.
14) Monitor Data History Archive.
Input Summary:
1) Documentation and Media Control Standards.
2) Publications Content Plan.
3) First-Draft Publications.
4) Project Tracking Plan for this phase.
5) Project Scheduling Plan for this phase.
6) Change Control Process Definition.
7) Monitor Data Definition.
8) Monitor Data History Archive.
9) Documents from reviews of the Second-Draft Publications process.
10) Data from change control, scheduling and tracking activities.
11) The Publications Second Draft.
Implementation conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) Second-Draft Publications.
1.2) All documents containing data on management quality and document quality generated
by the Second-Draft Publications Process.
2) Evaluate the process that produces the Second-Draft Publications, including its management
component, using the relevant data in Monitor Data History Archive and Publication Content
Plan as standards. This includes evaluating the activities in the Tracking, Scheduling and
Change Control Processes.
3) Complete the Quality Assessment Report on Second-Draft Publications.
4) Record the quality assessment data required by the Monitor Data Definition concerning the
Second-Draft Publications Process.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on Second-Draft Publications.
Exit conditions:
1) Assessment will continue into the Maintenance Phase until feedback is collected.
2) The Monitor Data History Archive contains the required assessment data on the Second-Draft
Publications Process.
3) The Quality Assessment Report on Second-Draft Publications is available to management.
Notes: None
QUALITY ASSESSMENT REPORT ON SECOND DRAFT PUBLICATIONS
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the process that
produces the Second Draft Publications. The questionnaire below provides evaluation criteria to
facilitate evaluating this process. Any other evaluation reporting information may be appended to
it in order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations in
the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score
1) Management:
1.1) Was the second draft publications process appropriately scheduled and initiated relative
to the other phases of the software development process? _____
1.2) Was there an acceptable project schedule plan for the preparation of the Second Draft
Publications? _____
1.3) Were activities prioritized adequately?
_____
1.4) Did the process actually conform to the schedule? If not, describe the deviations,
reasons and costs.
_____
1.5) Were the resources (including personnel) adequately and appropriately allocated?
_____
1.6) Were the tasks clearly defined and assigned an owner? _____
1.7) Were the meetings properly prepared, conducted and reported? _____
1.8) Was there satisfactory tracking of the process according to the Project Tracking Plan?
_____
1.9) Were the risks related to the process adequately estimated at the beginning of the
process?
_____
1.10) Were costs and scheduling monitored?
_____
1.11) Were appropriate actions taken when deviation from plans occurred? _____
1.12) Were the risks that became reality adequately compensated for?
_____
2) Documentation:
2.1) Were the Entry conditions for the process satisfied prior to commencing the process?
_____
2.2) Were the Implementation conditions all satisfied? _____
2.3) Were the Exit conditions for the process satisfied no later than when the Second Draft
Publications was approved?
_____
2.4) Does the Second Draft Publications cover aspects of the software, such as instructions,
manuals, code etc. adequately? _____
3) Documentation Standards:
3.1) Does the Second Draft Publications satisfy the conditions in the Documentation
Standards of the project? _____
4) Metrics:
4.1) Were the following Quality Assessment monitoring data satisfactorily collected and
reported? _____
4.1.1) The numerical data required for the Second Draft Publications Process in the
Monitor Data Definition _____
4.1.2) The numerical data pertaining to the product quality required in the Quality
Control Plan of the Second Draft Publications Process. _____
5) Reviews and Audits:
5.1) Were the following reviews and audits performed satisfactorily by the software project
team?
5.1.1) Examination of the Publications Content Plan.
5.1.2) Reviews of the preliminary drafts of the Second Draft Publications.
5.1.3) Review of the final draft of the Second Draft Publications. _____
6) Standards:
6.1) Was there an adequate Quality Control Plan for this process? _____
6.2) Were the stipulations in the Quality Control Plan complied with? _____
7) Tests:
7.1) Were adequate plans made for testing and controlling the quality of the Second Draft
Publications? _____
7.2) Were all test and quality control plans adhered to in developing the Second Draft
Publications? _____
8) Change control:
8.1) Were the Change Control Standards for Documents adhered to?
_____
9) Media and Hardware Control:
9.1) Were all documents generated during the process maintained on the appropriate media
according to the Media Control Standards? _____
10) Records:
10.1) Were all documents generated during the process recorded appropriately and made
accessible to the project staff? _____
11) Training:
11.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the process? _____
11.2) Was adequate training about the process provided to the staff prior to its being needed? _____
Summary Evaluation:
TOTAL SCORE (sum of the scores /total possible sum) _________
=====================================================================
Recommendations:
QUESTION ID COMMENTS
19. Acceptance Test Plan Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the Acceptance Test. This
assessment process has three primary goals:
1) Review the Acceptance Test.
2) Generate the data required by the Monitor Data Definition Document concerning the
quality of this process; these data are also used in preparing the Quality Assessment
Report on Acceptance Test.
3) Write the Quality Assessment Report on Acceptance Test, which will contain
recommendations on ways to improve this process and its management in the current
project and ways to modify the process in the organization’s other projects.
Entry Conditions:
1) Documentation and Media Standards.
2) Project Tracking Plan for Acceptance Test phase.
3) Project Scheduling Plan for Acceptance Test phase.
4) Change Control Process.
5) Monitor Data.
6) Monitor Data History Archive.
7) Acceptance Test Plan for Acceptance Test phase.
Input Summary:
1) Documentation and Media Standards Document.
2) Project Tracking Plan Document for Acceptance Test phase.
3) Project Scheduling Plan Document for Acceptance Test phase.
4) Change Control Process Definition.
5) Monitor Data Definition.
6) Monitor Data History Archive.
7) Documents from reviews of Acceptance Test.
8) Documents from Acceptance Test inspections by Quality Control.
9) Documents and data from change control, scheduling and tracking activities.
10) Acceptance Test Plan Document for Acceptance Test phase.
Implementation Conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) Acceptance Test.
1.2) Documents from reviews of Acceptance Test.
1.3) Documents from Acceptance Test inspections by Quality Control.
1.4) All documents containing data on management and document quality generated
by the Acceptance Test phase.
2) Evaluate the Acceptance Test, using the relevant data in the Monitor Data History Archive
as standards.
3) Complete the Quality Assessment Report on Acceptance Test.
4) Record the quality assessment data required by the Monitor Data Definition concerning the
Acceptance Test phase.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on Acceptance Test.
Exit conditions:
1) The complete software system product has passed Acceptance Testing.
2) The Monitor Data History Archive contains the required quality assessment data on the
Acceptance Test Process.
3) The Quality Assessment Report on Acceptance Test is available to management.
Notes: None.
QUALITY ASSESSMENT REPORT ON ACCEPTANCE TEST
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the Acceptance
test. The questionnaire below provides evaluation criteria to facilitate evaluating this process.
Any other evaluation reporting information may be appended to it in order to complete the
Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score:
1) Management:
1.1) Was the acceptance test appropriately scheduled and initiated relative to the other
phases of the software development process? ______
1.2) Was there an acceptable project schedule plan for the preparation of the Acceptance test?
______
1.3) Were activities prioritized adequately? ______
1.4) Did the process actually conform to the schedule? If not, describe the deviations,
reasons and costs.
______
1.5) Were the resources (including personnel) adequately and appropriately allocated?
______
1.6) Were the tasks clearly defined and assigned an owner?
______
1.7) Were too few meetings held to complete the Acceptance test in a
satisfactory manner? ______
1.8) Were too many meetings needed to complete the Acceptance test in a satisfactory
manner? ______
1.9) Were meetings properly prepared, conducted and reported? ______
1.10) Were the risks related to the process adequately estimated at the beginning of the
process? ______
1.11) Were costs and scheduling monitored? ______
1.12) Was the number of deviations from the plans at an acceptable level?
______
1.13) Were appropriate actions taken when deviation from plans occurred?
______
1.14) Were the risks that became reality adequately compensated for? ______
1.15) Were management approvals given appropriately and in a timely fashion? ______
2) Documentation:
2.1) Were the Entry Conditions for the Acceptance test satisfied prior to commencing the
process? ______
2.2) Does the Input Summary for the Acceptance test include all needed and/or relevant
input? ______
2.3) Were the Implementation Conditions of the Acceptance test all satisfied?
______
2.4) Were additional implementation conditions needed for the Acceptance test? ______
2.5) Does the Output Summary of the Acceptance test include all needed and/or relevant
output? ______
2.6) Were the Exit conditions for the Acceptance test satisfied no later than when the
Acceptance testing was completed? ______
3) Documentation Standards:
3.1) Does the Acceptance test Definition Document satisfy the conditions in the
Documentation Standards? ______
4) Acceptance test Documentation Standards:
4.1) Are the Acceptance test Documentation standards sufficient to clearly and fully
document the acceptance test?
______
5) Metrics:
5.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported? ______
5.1.1) The numerical data required for the Acceptance test in the Monitor Data
Definition ______
5.1.2) The numerical data pertaining to the product quality required in the
Quality Control Plan of the Acceptance test ______
6) Reviews and Inspections:
6.1) Were the following reviews and inspections performed satisfactorily by the
software project team? ______
6.1.1) Examination of the low level design document ______
6.1.2) Reviews of the acceptance test by the Review Committee
______
6.1.3) Inspections of the acceptance test by the Quality Control reviewers.
______
7) Standards:
7.1) Was there an adequate Quality Control Plan for the Acceptance test? ______
7.2) Were the stipulations listed in the Acceptance test Quality Control Plan complied with?
______
7.3) Was the Acceptance test Quality Control Plan adhered to? ______
8) Change Control:
8.1) Were the Change Control Standards for Documents adhered to?
______
8.2) Was there an excessive number of changes to the Acceptance test?
______
9) Acceptance test Control:
9.1) Was the Acceptance test Definition adhered to? ______
10) Media and Hardware Control:
10.1) Were all documents generated during the Acceptance test maintained on the
appropriate media according to the Media Control Standards? ______
11) Supplier Control:
11.1) Did the supplier adhere to the Acceptance test Definition Document in producing
the Acceptance test? ______
11.2) Was the Acceptance test Quality Control Plan adhered to? ______
12) Records:
12.1) Were all documents generated during the Acceptance test recorded appropriately and
made accessible to the project staff?
______
13) Training:
13.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the Acceptance test? ______
13.2) Was adequate training about the Acceptance test provided to the staff prior to its
being needed? ______
Summary Evaluation:
TOTAL SCORE (sum of the scores/total possible sum)
___________
=====================================================================
Recommendations:
QUESTION ID COMMENTS
20. Packaging Plan Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the packaging process. This
assessment process has three primary goals:
1) Review the process that packages the product.
2) Generate the data required by the Monitor Data Definition concerning the quality of this
process; this data is also used in preparing the Quality Assessment Report on Packaging.
3) Write the Quality Assessment Report on Packaging, which will contain recommendations on
ways to improve this process and its management in the current project and ways to modify
the process in the organization’s other projects.
Entry Conditions:
Availability of the following artifacts:
1) Approved final publications.
2) Approved software.
3) Project Tracking Plan for this phase.
4) Project Scheduling Plan for this phase.
5) Change Control Process Definition Document.
6) Monitor Data Definition Document.
7) Monitor Data History Archive.
Input Summary:
1) Approved publications and software.
2) Project Tracking Plan for this phase.
3) Project Scheduling Plan for this phase.
4) Change Control Process Definition.
5) Monitor Data Definition.
6) Monitor Data History Archive.
7) Documents from reviews of the Packaging process.
8) Data from change control, scheduling and tracking activities.
Implementation conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) All documents containing data on management quality and packaging quality generated
by the Packaging Process.
2) Evaluate the Packaging process, including its management component, using the relevant data
in the Monitor Data History Archive as standards. This includes evaluating the activities in the
Tracking, Scheduling and Change Control Processes.
3) Complete the Quality Assessment Report on Packaging.
4) Record the quality assessment data required by the Monitor Data Definition concerning the
Packaging Process.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on Packaging.
Exit conditions:
1) The product has been withdrawn from the market.
2) The Monitor Data History Archive contains the required assessment data on the Packaging
Process.
3) The Quality Assessment Report on Packaging is available to management.
Note: None
QUALITY ASSESSMENT REPORT ON PACKAGING
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the process that
packages the product for delivery to a customer. The questionnaire below provides evaluation
criteria to facilitate evaluating this process. Any other evaluation reporting information may be
appended to it in order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations in
the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
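The scoring rule described above can be sketched as a small calculation. This is a minimal illustration only, not part of the defined process, and the function names are hypothetical: each question's score is the percentage of its evaluation conditions satisfied, and the total score is the sum of the question scores divided by the total possible sum.

```python
# Hypothetical sketch of the questionnaire scoring rule (names are illustrative;
# the process definition does not prescribe any tool or implementation).

def question_score(conditions_satisfied: int, conditions_total: int) -> float:
    """Score for one question: percent of its evaluation conditions satisfied."""
    if conditions_total == 0:
        return 0.0
    return 100.0 * conditions_satisfied / conditions_total

def total_score(question_scores: list[float]) -> float:
    """TOTAL SCORE: sum of the scores divided by the total possible sum
    (100 points possible per question)."""
    if not question_scores:
        return 0.0
    return sum(question_scores) / (100.0 * len(question_scores))

# Example: three questions with 2/2, 1/4, and 3/3 conditions satisfied.
scores = [question_score(2, 2), question_score(1, 4), question_score(3, 3)]
print(scores)               # [100.0, 25.0, 100.0]
print(total_score(scores))  # 0.75
```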
Evaluation Criteria: Score
1) Management:
1.1) Was the packaging process appropriately scheduled and initiated relative to
the other phases of the software development process? _____
1.2) Was there an acceptable project schedule plan for packaging the product? _____
1.3) Were activities prioritized adequately? _____
1.4) Did the process actually conform to the schedule? If not, describe the deviations,
reasons and costs. _____
1.5) Were the resources (including personnel) adequately and appropriately allocated? _____
1.6) Were the tasks clearly defined and assigned an owner? _____
1.7) Were the meetings properly prepared, conducted and reported? _____
1.8) Was there satisfactory tracking of the process according to the Project Tracking Plan?
_____
1.9) Were the risks related to the process adequately estimated at the beginning of the
process?
_____
1.10) Were costs and scheduling monitored? _____
1.11) Were appropriate actions taken when deviation from plans occurred? _____
1.12) Were the risks that became reality adequately compensated for?
_____
2) Processing:
2.1) Were the Entry conditions for the process satisfied prior to commencing the process?
_____
2.2) Were the Implementation conditions all satisfied?
_____
2.3) Were the Exit conditions for the process satisfied no later than when the Packaging
Process was approved? _____
2.4) Does the Packaging Process adequately comply with the Federal Fair Packaging and
Labeling Act? _____
3) Metrics:
3.1) Were the following Quality Assessment monitoring data satisfactorily collected and
reported?
3.1.1) The numerical data required for the Packaging Process in the Monitor
Data Definition?
3.1.2) The numerical data pertaining to the product quality required in the Quality
Control Plan of the Packaging Process. _____
4) Reviews and Audits:
4.1) Were the following reviews and audits performed satisfactorily by the software project
team?
4.1.1) Review of the Packaging Process during quality control by members of the
quality control in the Packaging Process? _____
5) Standards:
5.1) Was there an adequate Quality Control Plan for this process?
_____
5.2) Were the stipulations in the Quality Control Plan complied with? _____
6) Tests:
6.1) Were adequate plans made for testing and controlling the quality of the Packaging
Process? _____
6.2) Were all test and quality control plans adhered to in developing the Packaging
Process? _____
7) Change control:
7.1) Were the Change Control Standards for the process adhered to? _____
8) Training:
8.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the process? _____
8.2) Was adequate training about the process provided to the staff prior to its need? _____
Summary Evaluation:
TOTAL SCORE (sum of the scores /total possible sum) ________
===============================
Recommendations:
QUESTION ID COMMENTS
21. Delivery Plan Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the delivery process. This
assessment process has three primary goals:
1) Review the process that delivers the packaged product to a customer.
2) Generate the data required by the Monitor Data Definition concerning the quality of this
process; this data is also used in preparing the Quality Assessment Report on Delivery.
3) Write the Quality Assessment Report on Delivery, which will contain recommendations on
ways to improve this process and its management in the current project and ways to modify
the process in the organization’s other projects.
Entry Conditions:
Availability of the following artifacts:
1) Packaged product.
2) Project Tracking Plan for this phase.
3) Project Scheduling Plan for this phase.
4) Delivery Process Definition Document.
5) Monitor Data Definition Document.
6) Monitor Data History Archive.
Input Summary:
1) Packaged product.
2) Project Tracking Plan for this phase.
3) Project Scheduling Plan for this phase.
4) Change Control Process Definition.
5) Monitor Data Definition.
6) Monitor Data History Archive.
7) Documents from reviews of the Delivery process.
8) Data from change control, scheduling and tracking activities.
Implementation conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) All documents containing data on management quality and delivery quality generated by
the Delivery Process.
2) Evaluate the Delivery process, including its management component, using the relevant data
in the Monitor Data History Archive as standards. This includes evaluating the activities in the
Tracking, Scheduling and Change Control Processes.
3) Complete the Quality Assessment Report on Delivery.
4) Record the quality assessment data required by the Monitor Data Definition concerning the
Delivery Process.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on Delivery.
Exit conditions:
1) The product has been withdrawn from the market.
2) The Monitor Data History Archive contains the required assessment data on the Delivery
Process.
3) The Quality Assessment Report on Delivery is available to management.
Note: None
QUALITY ASSESSMENT REPORT ON DELIVERY
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the process that
delivers the product to a customer. The questionnaire below provides evaluation criteria to
facilitate evaluating this process. Any other evaluation reporting information may be appended to
it in order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations in
the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score
1) Management:
1.1) Was the delivery process appropriately scheduled and initiated relative to
the other phases of the software development process?
_____
1.2) Was there an acceptable project schedule plan for delivery of the product? _____
1.3) Were activities prioritized adequately? _____
1.4) Did the process actually conform to the schedule? If not, describe the deviations,
reasons and costs. _____
1.5) Were the resources (including personnel) adequately and appropriately allocated? _____
1.6) Were the tasks clearly defined and assigned an owner? _____
1.7) Were the meetings properly prepared, conducted and reported? _____
1.8) Was there satisfactory tracking of the process according to the Project Tracking Plan?
_____
1.9) Were the risks related to the process adequately estimated at the beginning of the
process?
_____
1.10) Were costs and scheduling monitored?
_____
1.11) Were appropriate actions taken when deviation from plans occurred?
_____
1.12) Were the risks that became reality adequately compensated for?
_____
2) Processing:
2.1) Were the Entry conditions for the process satisfied prior to commencing the process?
_____
2.2) Were the Implementation conditions all satisfied?
_____
2.3) Were the Exit conditions for the process satisfied no later than when the Delivery
Process was approved? _____
2.4) Does the Delivery Process adequately cover aspects of the process such as courier
choice and costs? _____
3) Metrics:
3.1) Were the following Quality Assessment monitoring data satisfactorily collected and
reported?
3.1.1) The numerical data required for the Delivery Process in the Monitor
Data Definition?
3.1.2) The numerical data pertaining to the product quality required in the Quality
Control Plan of the Delivery Process. _____
4) Reviews and Audits:
4.1) Were the following reviews and audits performed satisfactorily by the software project
team?
4.1.1) Review of the Delivery Process during quality control by members of the
quality control in the Delivery Process? _____
5) Standards:
5.1) Was there an adequate Quality Control Plan for this process? _____
5.2) Were the stipulations in the Quality Control Plan complied with? _____
6) Tests:
6.1) Were adequate plans made for testing and controlling the quality of the Delivery
Process? _____
6.2) Were all test and quality control plans adhered to in developing the Delivery
Process? _____
7) Change control:
7.1) Were the Change Control Standards for the process adhered to? _____
8) Training:
8.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the process? _____
8.2) Was adequate training about the process provided to the staff prior to its need? _____
Summary Evaluation:
TOTAL SCORE (sum of the scores /total possible sum) ________
===============================
Recommendations:
QUESTION ID COMMENTS
22. Maintenance Plan Quality Assessment Process Definition
Description:
This document defines the process of assessing the quality of the maintenance process. This
assessment process has three primary goals:
1) Review the process that maintains the delivered product.
2) Generate the data required by the Monitor Data Definition concerning the quality of this
process; this data is also used in preparing the Quality Assessment Report on Maintenance.
3) Write the Quality Assessment Report on Maintenance, which will contain recommendations
on ways to improve this process and its management in the current project and ways to
modify the process in the organization’s other projects.
Entry Conditions:
Availability of the following artifacts:
1) Project Tracking Plan for this phase.
2) Project Scheduling Plan for this phase.
3) Change Control Process Definition Document.
4) Monitor Data Definition Document.
5) Monitor Data History Archive.
6) Maintenance Process Definition.
Input Summary:
1) Feedback from the end users.
2) Project Tracking Plan for this phase.
3) Project Scheduling Plan for this phase.
4) Change Control Process Definition.
5) Monitor Data Definition.
6) Monitor Data History Archive.
7) Documents from reviews of the Maintenance process.
8) Data from change control, scheduling and tracking activities.
Implementation conditions:
It is essential to perform the following steps:
1) Evaluate the following documents using the relevant data in the Monitor Data History
Archive as standards for comparison:
1.1) All documents containing data on management quality and maintenance quality
generated by the Maintenance Process.
2) Evaluate the Maintenance process, including its management component, using the relevant
data in the Monitor Data History Archive as standards. This includes evaluating the activities in
the Tracking, Scheduling and Change Control Processes.
3) Complete the Quality Assessment Report on Maintenance.
4) Record the quality assessment data required by the Monitor Data Definition concerning the
Maintenance Process.
Output Summary:
1) Updated Monitor Data History Archive.
2) Quality Assessment Report on Maintenance.
Exit conditions:
1) The product has been withdrawn from the market.
2) The Monitor Data History Archive contains the required assessment data on the Maintenance
Process.
3) The Quality Assessment Report on Maintenance is available to management.
Note: None
QUALITY ASSESSMENT REPORT ON MAINTENANCE
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the process that
maintains the product. The questionnaire below provides evaluation criteria to facilitate
evaluating this process. Any other evaluation reporting information may be appended to it in
order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations in
the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score
1) Management:
1.1) Was the maintenance process appropriately scheduled and initiated relative to
the other phases of the software development process? _____
1.2) Was there an acceptable project schedule plan for maintenance of the
product? _____
1.3) Were activities prioritized adequately? _____
1.4) Did the process actually conform to the schedule? If not, describe the deviations,
reasons and costs. _____
1.5) Were the resources (including personnel) adequately and appropriately allocated? _____
1.6) Were the tasks clearly defined and assigned an owner?
_____
1.7) Were the meetings properly prepared, conducted and reported?
_____
1.8) Was there satisfactory tracking of the process according to the Project Tracking Plan?
_____
1.9) Were the risks related to the process adequately estimated at the beginning of the
process?
_____
1.10) Were costs and scheduling monitored?
_____
1.11) Were appropriate actions taken when deviation from plans occurred?
_____
1.12) Were the risks that became reality adequately compensated for?
_____
2) Processing:
2.1) Were the Entry conditions for the process satisfied prior to commencing the process?
_____
2.2) Were the Implementation conditions all satisfied?
_____
2.3) Were the Exit conditions for the process satisfied no later than when the Maintenance
Process was approved? _____
2.4) Does the Maintenance Process adequately apply the algorithm for software
maintenance?
_____
3) Metrics:
3.1) Were the following Quality Assessment monitoring data satisfactorily collected and
reported?
3.1.1) The numerical data required for the Maintenance Process in the Monitor
Data Definition?
3.1.2) The numerical data pertaining to the product quality required in the Quality
Control Plan of the Maintenance Process. _____
4) Reviews and Audits:
4.1) Were the following reviews and audits performed satisfactorily by the software project
team?
4.1.1) Review of the Maintenance Process during quality control by members of the
quality control in the Maintenance Process? _____
5) Standards:
5.1) Was there an adequate Quality Control Plan for this process?
_____
5.2) Were the stipulations in the Quality Control Plan complied with?
_____
5.3) Were the standards for code testing complied with?
_____
6) Tests:
6.1) Were adequate plans made for testing and controlling the quality of the Maintenance
Process? _____
6.2) Were all test and quality control plans adhered to in developing the Maintenance
Process? _____
7) Change control:
7.1) Were the Change Control Standards for the process adhered to? _____
8) Training:
8.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the process? _____
8.2) Was adequate training about the process provided to the staff prior to its need? _____
Summary Evaluation:
TOTAL SCORE (sum of the scores /total possible sum) _________
===============================
Recommendations:
QUESTION ID COMMENTS
QUALITY ASSESSMENT REPORT ON QUALITY CONTROL PLAN
Purpose:
This document reports the Process Quality Assessment Board’s evaluation of the process that
produces the Quality Control Plan. The questionnaire below provides evaluation criteria to
facilitate evaluating this process. Any other evaluation reporting information may be appended
to it in order to complete the Quality Assessment Report.
Instructions:
Carefully evaluate the process activities and outcomes in answering each of the questions below.
The response to a question may include evaluating compliance with several conditions. Each
score is to be the percent of the total conditions satisfied for the particular question. Evidence for
an affirmative response on an evaluation condition must exist in order for a positive score to be
assigned for that condition. If a response to a condition is negative, describe the deviations from
acceptability, the additional costs due to the deviations, and the apparent reasons for the
deviations. Summarize the results of the evaluations for each question by recording a percentage
score in the column on the right of the page. Please record all comments and recommendations
in the Recommendation section, associating each comment with the identifying number of the
corresponding question. Evaluation of additional factors may be included.
Evaluation Criteria: Score:
1) Management:
1.1) Was the Quality Control Plan process appropriately initiated relative to the
other phases of the software development process? ______
1.2) Were activities prioritized adequately? ______
1.3) Were the resources (including personnel) adequately and appropriately allocated?
______
1.4) Were the tasks clearly defined and assigned an owner?
______
1.5) Were meetings properly prepared, conducted and reported? ______
1.7) Were costs and scheduling monitored? ______
1.8) Were Quality Control approvals for each phase given appropriately and in a timely
fashion? ______
1.9) Were management approvals given appropriately and in a timely fashion?
______
1.10) Was there satisfactory quality control of the requirements process according to
the Quality Control Plan? ______
1.11) Was there satisfactory quality control of the objectives process according to
the Quality Control Plan? ______
1.12) Was there satisfactory quality control of the specifications process according to
the Quality Control Plan? ______
1.13) Was there satisfactory quality control of the high-level design process according to
the Quality Control Plan? ______
1.14) Was there satisfactory quality control of the low-level design process according to
the Quality Control Plan? ______
1.15) Was there satisfactory quality control of the unit test plan process according to
the Quality Control Plan? ______
1.16) Was there satisfactory quality control of the unit test process according to
the Quality Control Plan? ______
1.17) Was there satisfactory quality control of the function test plan process according to
the Quality Control Plan? ______
1.18) Was there satisfactory quality control of the function test process according to
the Quality Control Plan? ______
1.19) Was there satisfactory quality control of the component test plan process according to
the Quality Control Plan? ______
1.20) Was there satisfactory quality control of the component test process according to
the Quality Control Plan? ______
1.21) Was there satisfactory quality control of the system test plan process according to
the Quality Control Plan? ______
1.22) Was there satisfactory quality control of the system test process according to
the Quality Control Plan? ______
1.23) Was there satisfactory quality control of the acceptance test plan process according to
the Quality Control Plan? ______
1.24) Was there satisfactory quality control of the acceptance test process according to
the Quality Control Plan? ______
1.25) Was there satisfactory quality control of the code process according to
the Quality Control Plan? ______
1.26) Was there satisfactory quality control of the publication contents plan process
according to the Quality Control Plan?
______
1.27) Was there satisfactory quality control of the first draft publications process according
to the Quality Control Plan? ______
1.28) Was there satisfactory quality control of the second draft publications process
according to the Quality Control Plan?
______
1.29) Was there satisfactory quality control of the packaging process according to
the Quality Control Plan? ______
1.30) Was there satisfactory quality control of the delivery process according to
the Quality Control Plan? ______
1.31) Was there satisfactory quality control of the maintenance process according to
the Quality Control Plan? ______
2) Documentation:
2.1) Were the Entry Conditions for the Quality Control Plan for each phase satisfied prior to
commencing the process? ______
2.2) Does the Input Summary for the Quality Control Plan for each phase include all needed
and/or relevant input? ______
2.3) Were the Implementation Conditions of the Quality Control Plan for each phase all
satisfied? ______
2.4) Were additional implementation conditions needed for any of the phases in the Quality
Control Plan?
______
2.5) Does the Output Summary of the Quality Control Plan for each phase include
all needed and/or relevant output? ______
2.6) Were the Exit conditions for the Quality Control Plan satisfied no later than when the
Quality Control Plan was approved?
______
3) Documentation Standards:
3.1) Does the Quality Control Plan Document satisfy the conditions in the Documentation
Standards?
______
4) Metrics:
4.1) Were the following Quality Assessment monitoring data satisfactorily collected
and reported?
4.1.1) The numerical data required for the Quality Control Plan Process in the
Monitor Data Definition
______
5) Reviews and Audits:
5.1) Were the following reviews and inspections performed satisfactorily by the
Quality Control Plan team?
5.1.1) Quality Control Review of the Requirements document
______
5.1.2) Quality Control Review of the Objectives document. ______
5.1.3) Quality Control Review of the Specifications document ______
5.1.4) Quality Control Review of the High Level Design document. ______
5.1.5) Quality Control Review of the Low Level Design document ______
5.1.6) Quality Control Review of the unit test plan document
______
5.1.7) Technical review of the unit test ______
5.1.8) Quality Control Review of the function test plan document ______
5.1.9) Technical review of the function test ______
5.1.10) Quality Control Review of the component test plan document ______
5.1.11) Technical review of the component test
______
5.1.12) Quality Control Review of the system test plan document ______
5.1.13) Technical review of the system test ______
5.1.14) Quality Control Review of the acceptance test plan document ______
5.1.15) Technical review of the acceptance test ______
5.1.16) Technical review of the code ______
5.1.17) Quality Control Review of the publication contents plans
______
5.1.18) Quality Control Review of the first draft publications
______
5.1.19) Quality Control Review of the second draft publications
______
5.1.20) Quality Control Review of weekly test reports from software product tests
______
5.1.21) Quality Control Review of daily reports issued to packaging manager during
packaging phase
______
5.1.22) Quality Control Review of daily reports issued to delivery management during
delivery phase
______
5.1.23) Quality Control Review of weekly reports issued to maintenance management
during maintenance phase
______
5.2) Did the final draft of the Quality Control Plan meet approval,
and if not, was there closure on all exceptions to the approval conditions?
______
5.3) Was the approved version of the Quality Control Plan archived under change
control for subsequent update? ______
6) Standards:
6.1) Did the Quality Control Plan and forms and documents produced for the
Quality Control Plan meet the standards contained in the Quality Control
Process Definition? ______
6.2) Are the defect removal goals in the Quality Control Plan sufficient to produce
a software product of the desired quality? ______
7) Change Control:
7.1) Were the Change Control Standards for Documents adhered to? ______
7.2) Were there an excessive number of changes to the Quality Control Plan? ______
8) Quality Control Plan Control:
8.1) Was the Quality Control Plan Process Definition adhered to? ______
9) Media and Hardware Control:
9.1) Were all documents generated during the Quality Control Plan process maintained
on the appropriate media according to the Media Control Standards? ______
10) Records:
10.1) Were all documents generated during the Quality Control Plan process
recorded appropriately and made accessible to the project staff? ______
10.2) Were all documents available on a timely basis?
______
11) Training:
11.1) Were the training needs of the staff for the process adequately estimated by the
beginning of the Quality Control Plan process? ______
11.2) Was adequate training about the Quality Control Plan process provided to the
staff prior to its being needed? ______
Summary Evaluation:
TOTAL SCORE (sum of the scores/total possible sum)
___________
=====================================================================
Recommendations:
QUESTION ID COMMENTS