Project ID 604674 FITMAN – Future Internet Technologies for MANufacturing
23/08/2013 Deliverable D2.1 – v1.00
D2.1 – FITMAN Verification & Validation
Method and Criteria
Document Owner: Fenareti Lampathaki, Panagiotis Kokkinakos, Christina Bompa, Dimitris
Panopoulos, Iosif Alvertis, Sotiris Koussouris, Dimitris Askounis (NTUA)
Contributors: Outi Kettunen, Iris Karvonen, Kim Jansson (VTT), Guy Doumeingts, B.
Carsalade, M. Ravelomanantsoa (I-VLab), Giacomo Tavola (POLIMI)
Dissemination: Public
Contributing to: WP2, T2.1 FITMAN V&V Generic Method and Criteria Identification
Date: 23/08/2013
Revision: v1.00
VERSION HISTORY
VERSION DATE NOTES AND COMMENTS
0.1 19/04/2013 Initial table of contents and assignments
0.2 17/05/2013 Initial draft Sections 1, 2 and 3
0.3 20/05/2013 Updated draft circulated to the WP2 mailing list including draft Sections 1, 2 and 3 from NTUA and 5 from VTT.
0.4 10/06/2013 Updated draft circulated to the WP2 mailing list including revised Sections 1, 2 and 3 from NTUA and 5 from VTT.
0.5 26/06/2013 Updated draft circulated to the WP2 mailing list including revised Sections 1, 2 and 3 from NTUA, 4 from POLIMI and NTUA and 5 from VTT.
0.6 09/07/2013 Updated draft circulated to the WP2 mailing list including Executive Summary and Section 6 from NTUA, revised section 4.1 from POLIMI and 5 from VTT.
0.7 23/07/2013 Updated draft circulated for internal peer review including updates in Section 3 and 4 and adding section 5.3 from NTUA.
0.8 02/08/2013 Updated draft prepared by NTUA to address the comments received from internal peer review by IT Innovation and POLIMI.
1.0 23/08/2013 Version submitted to the European Commission
DELIVERABLE PEER REVIEW SUMMARY
1. Comment: An overall well-structured and readable report with good scientific quality. There is a good possibility to exploit its contents in PHASE III projects and/or in following phases of the FITMAN trials (after the project completion).
Response: No action needed.

2. Comment: Minor changes are very occasionally required in the text to enhance the English language used (in cases of ambiguity or difficulty to read) or the formatting (in cases of intermittent inline use of bold font or of tabular format).

3. Comment: The SotA activity is very well addressed in line with the document scope. Additional guidelines for the actual implementation of the methodology could be beneficial for the real FITMAN trials.
Response: More evidence of the actual implementation is expected in T2.3-T2.4.

4. Comment: The tables in Section 2.3.3.2 (which classify the approaches retrieved from the literature) should be condensed.
Response: Answered in Section 2.4.

5. Comment: In Section 2.3.5, the selected review of Phase I projects and the different emphasis on their results should be explained.
Response: Answered in Section 2.3.5.

6. Comment: Possible ambiguity (mainly in terms of phrasing) in the scope of step P-2 needs to be reviewed and addressed (in Section 3.2.2.2).

7. Comment: With regard to the crowdsourcing aspects of the FITMAN V&V method, the expectations from the FITMAN trials should be clarified, with the background review telling them exactly what they need to know.
Response: Answered in Section 3.2.

8. Comment: Guidelines for data consolidation and benchmarking could be provided.
Response: Answered – initial guidelines are provided in Section 5.4.2.

9. Comment: The extent to which STEEP is considered and expressed in (intangible) criteria such as Quality, Safety and Sustainability should be explained.
Response: Answered in Section 4.1 – trial criteria are also expected to be updated as the project evolves.

10. Comment: Additional FITMAN tasks that are supposed to use the V&V method (e.g. at some points it seems there should be a reference to Task 3.2 and maybe Task 3.6) should be considered for inclusion in Section 1.1, where the relations between WP2 and other WPs are described.

11. Comment: The alignment with D2.2 in terms of glossary and concepts should be further stressed. At the same time, “practical examples” to communicate key concepts should be considered for inclusion (in Annex I).
12. Comment: Partner contributions are not obviously identified. It would help to include a brief section in the introduction explaining who did what.
Response: Answered in the Version History Table.

13. Comment: Acronyms and abbreviations are used very extensively. Adding a table of acronyms and abbreviations would help for completeness and clarity reasons.
Table of Contents
EXECUTIVE SUMMARY ................................................................................................................................... 5
1. INTRODUCTION ....................................................................................................................................... 9
1.1. Purpose and Scope .......................................................................... 9
1.2. Methodological Approach ............................................................... 11
1.3. Structure of the Document .............................................................. 14
2. V&V LANDSCAPE ANALYSIS .............................................................................................................. 15
2.1. Definitions and Terms ..................................................................... 15
2.2. V&V Standards ............................................................................... 17
2.3. V&V Approaches State of the Art ................................................... 18
2.4. Key Take-aways .............................................................................. 62
3. FITMAN V&V METHODOLOGICAL FOUNDATIONS .................................................................... 65
3.1. Roles’ Profiling ............................................................................... 65
3.2. FITMAN V&V Methodology .......................................................... 66
3.3. FITMAN V&V Methodology Mapping to the Reference Architecture ... 79
4. FITMAN V&V CRITERIA ELABORATION........................................................................................ 81
4.1. Business Validation Criteria ............................................................ 81
4.2. IT Verification & Validation Criteria ............................................... 89
5. GUIDELINES FOR THE FITMAN V&V METHOD APPLICATION ............................................... 98
5.1. Timing Perspective .......................................................................... 98
5.2. V&V Checklist per stakeholder ..................................................... 101
5.3. V&V Decalogue for the Trials ....................................................... 104
5.4. Recommendations for the V&V Package ....................................... 107
6. CONCLUSIONS & NEXT STEPS ......................................................................................................... 111
7. ANNEX I: DEFINITIONS AND TERMS ............................................................................................. 114
8. ANNEX II: ACRONYMS AND ABBREVIATIONS ........................................................................... 120
9. ANNEX III: REFERENCES................................................................................................................... 122
Executive Summary
FITMAN constitutes an FI-PPP phase II project aspiring to provide the FI-PPP with a set of
industry-led use case trials in the Smart, Digital and Virtual Factories of the Future, in order
to test and assess the suitability, openness and flexibility of FI-WARE Generic Enablers. In
this context, the deliverable at hand aims at developing a holistic Verification and Validation
(V&V) framework while simultaneously covering both the technical and business evaluation
aspects of FITMAN needs. Its main focus lies on verifying and validating both the FI-WARE
generic and FITMAN specific enablers in terms of satisfying the FITMAN technical
requirements as well as the business requirements of the Smart-Digital-Virtual use case trials. In
addition, the evaluation and assessment criteria to be used in all Use Case Trials, taking into
account STEEP (social-technological-economical-environmental-political) aspects and future
business benefits, were identified and elaborated. Special attention was paid to ensuring that the
proposed FITMAN V&V methodology takes both IT and business aspects into account and
provides a general, extensible framework which addresses all the aforementioned FITMAN
needs and aspects.
To this end, the present Deliverable D2.1 describes an all-inclusive framework for verifying,
validating and evaluating a software product from its conception to its final release and
implementation in real-life trial settings. From a business perspective, the methodology is
general enough to be applied beyond the FITMAN trial use cases and goes hand-in-hand with
the Business Performance Indicators (BPI) Method for FITMAN elaborated in D2.2 (“FITMAN
Verification & Validation, Business and Technical Indicators Definition”). In essence, D2.1
provides the general method, along with techniques for each step and recommendations for the
trials; it is then up to each trial and development team to streamline the method according to
their own needs and requirements, taking into account the guidelines provided.
The FITMAN V&V methodology introduces a new and innovative way of performing V&V
activities in various ways by:
Bringing together the best of the agile and waterfall software engineering philosophies,
which until now have essentially been treated as contradictory trends and schools of
thought.
Infusing a crowd assessment mentality within the V&V activities, under each
methodological step from code verification to business validation.
Assessing both the software itself and the “fit for purpose” requirements of the trials, and
also evaluating the software’s added value for business performance (in collaboration
with D2.2).
Attempting a categorization and classification of a number of tools and techniques for
which, in most cases, there was no clear distinction between V&V and Evaluation aspects.
In order to develop a consolidated FITMAN V&V method, identify the relevant IT and
business criteria and provide the use case trials with useful guidelines for applying the
proposed method, a meticulous methodological approach was put into effect, as follows:
1. The preparatory procedure
In order to define the scope of the FITMAN V&V method, an in-depth study and analysis of
the FITMAN DoW and of the use case trial descriptions were performed. FITMAN's needs,
aspects and aspirations were identified, elaborated and taken into account so as to develop,
from scratch, a method suited to their requirements.
Having determined the general scope of FITMAN, a state-of-play analysis was conducted,
involving the elaboration of V&V and Evaluation definitions (in collaboration with D2.2) and
the study of 9 standards, 2 software development and V&V philosophies, and 110 techniques
(and their tools) in 15 V&V categories. A set of facets was defined within a classification
scheme, under which 53 approaches were mapped. 11 techniques were described in depth and
the public results of the FI-PPP Phase I projects were analysed. In addition to the above desk
research approach, particular emphasis was laid on crowd engagement methods in V&V with
a view to eventually incorporating a crowd assessment notion to the envisaged methodology.
A set of insightful conclusions was extracted during this procedure:
There has been active, ongoing research in the field of V&V over the years, with V&V
processes closely coupled with almost any business activity. The literature review
provides a variety of research and applied approaches to be studied and taken into
consideration.
A plethora of tools and techniques are available, but without a clear distinction between
V&V and Evaluation. Thus, although techniques and tools were easily identified, it was
challenging to classify them under the V&V and evaluation aspects they cover.
No “universal” or holistic V&V approach covers every potential need, with most of
the identified and analysed methodologies/approaches being neither mature nor widely
established.
The waterfall model prevails in software V&V, as most resources referring to agile
V&V activities appear in blogs and online sources rather than in rigorous academic
work presented in papers.
Crowdsourcing aspects cannot easily be located in existing/established V&V and/or
evaluation approaches, as these procedures are “closed” to specific stakeholders who
are typically involved in a highly structured way (e.g. through questionnaires).
Concerning the V&V notion in FI-PPP Phase I projects:
o Formal validation processes and feasibility/ acceptability analysis of GEs and SEs
are partially embedded in Phase I Projects
o Implicit verification aspects are put forward by Phase I Trial Projects
o There is a focus on evaluation and business validation spanning multiple domains
in a fit-for-purpose manner
o Lessons learnt and recommendations for the Phase II projects are released, and
o Limited involvement of stakeholders in terms of actual feedback and validation is
presented
Finally, reaching consensus on a common glossary for FITMAN V&V activities was a
milestone that emerged from the V&V-specific (in D2.1) and BPI-relevant (in D2.2)
activities.
2. The proposed FITMAN V&V Methodology
Upon a thorough examination of the V&V endeavours and trends, the FITMAN V&V
methodology development was initiated. As already explained, FITMAN introduces a new
and innovative V&V method combining the agile and waterfall concepts and characteristics.
The FITMAN V&V method also incorporates business validation and evaluation aspects
and spans both the IT and the business perspective.
In order to develop the FITMAN V&V method, the major stakeholders had to be identified
and their roles and involvement defined. These roles are divided into three layers: pure IT
roles (mainly, if not exclusively, involved in IT V&V activities), higher-level business roles
(who assess the product in terms of the business requirements) and a wider community
constituting the external environment for crowd assessment purposes.
The developed V&V method is divided into two perspectives: the product specific and the
domain/ trial specific one. The trial specific perspective assesses whether the IT and business
requirements and domain’s needs are met while the product-specific perspective describes
how to verify and validate the product (i.e. the Generic Enabler (GE), the Specific Enabler
(SE) or the Trial Solution Component (TSC)) during its development. The method is
elaborated step by step, featuring, apart from the general description of the procedure, (a) the
potential techniques to be employed (as extracted from the state-of-the-art analysis previously
conducted), and (b) the potential crowd engagement methods to be applied. For each step of the
FITMAN V&V method, one technique retrieved from the international bibliography and the
practical approaches is recommended, while alternative techniques are also proposed in case
they are considered more appropriate for some trials or the development team is more familiar
with them. In detail, the V&V method steps (illustrated by a brief sketch after the list) include:
I. Business Validation (T-2) to assess whether the overall trial solution eventually
offers sufficient added value to the trial.
o Recommended Technique: Simplified ECOGRAI Methodology, as defined in
D2.2.
II. Trial Solution Validation (T-1) to guarantee that the overall trial solution satisfies
intended use and user needs.
o Recommended Technique: User Acceptance Testing
III. Product Validation (P-5) to examine whether the product satisfies intended use and
user needs.
o Recommended Technique: Black Box Testing for Validation
IV. Release Verification (P-4) to determine whether the requirements of the final product
release are met.
o Recommended Technique: Regression Testing
V. Backlog Verification (P-3) to determine whether the requirements of the product after
each sprint are met.
o Recommended Technique: Regression Testing
VI. Model Verification (P-2) to coordinate the alignment between design and
requirements, as well as between design and code.
o Recommended Technique: Traceability Analysis
VII. Code Verification (P-1) to ensure functionality, correctness, reliability, and
robustness of code.
o Recommended Technique: White Box Testing
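To make the distinction between these product-level steps more concrete, the following minimal sketch is provided purely for illustration; it is not part of the official FITMAN method, and the function, requirement and values used are hypothetical. In Python, it contrasts a white-box test designed from the code structure (Code Verification, P-1) with a black-box test designed only from a stated requirement (Product Validation, P-5); re-running such a suite after each sprint and before a release corresponds to the Regression Testing recommended for P-3 and P-4.

# Illustrative sketch only: a hypothetical enabler function and two kinds of tests.

def discount_percent(order_total, is_returning_customer):
    """Hypothetical business rule: 5% discount above 1000 EUR, plus 2% for returning customers."""
    percent = 0
    if order_total > 1000:            # branch 1
        percent += 5
    if is_returning_customer:         # branch 2
        percent += 2
    return percent

# White Box Testing (Code Verification, P-1): test cases are derived from the code structure,
# aiming to exercise every branch of the implementation.
def test_white_box_branch_coverage():
    assert discount_percent(500, False) == 0    # neither branch taken
    assert discount_percent(1500, False) == 5   # branch 1 only
    assert discount_percent(500, True) == 2     # branch 2 only
    assert discount_percent(1500, True) == 7    # both branches

# Black Box Testing for Validation (P-5): test cases are derived from the stated requirement,
# without looking at the implementation ("does it satisfy the intended use?").
def test_black_box_requirement():
    # Illustrative requirement: "A returning customer ordering above 1000 EUR gets a 7% discount."
    assert discount_percent(1200, True) == 7

if __name__ == "__main__":
    test_white_box_branch_coverage()
    test_black_box_requirement()
    print("All illustrative V&V checks passed.")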
It needs to be noted that some of the above activities (such as Code Verification and Model
Verification) are anticipated to be less intense in a Use Case project like FITMAN, yet they
are defined in the V&V methodology for completeness. Ultimately, the application of the
V&V method consolidates the techniques followed per step and is self-certified in the V&V
Package.
Upon releasing the draft methodology, iterative feedback cycles and contributions among the
FITMAN project partners led to finalizing the V&V methodology.
3. Business and IT criteria elaboration
In parallel with the formation of the methodology, the business and IT criteria were elaborated,
both to serve as part of the validation aspects of the V&V methodology and to prepare the
ground for elaborating the IT and Business Performance Indicators (BPIs) in D2.2.
The business validation criteria are categorized into two sets: the generic criteria based on
SCOR and the specific criteria based on the trials' business requirements. Indicative categories
of business generic criteria concern customer reliability, agility, responsiveness, cost and
assets, while indicative categories of business specific criteria involve cost, sustainability,
innovation, efficiency, flexibility and quality. IT V&V criteria involve FITMAN-relevant
categories (including openness and versatility) and more generic categories like functionality,
maintainability, usability, reliability, efficiency, and portability based on the ISO 9126
standard.
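Purely as an informal illustration (the category names below are those listed above, but the structure and the example trial entry are hypothetical and do not constitute the normative FITMAN criteria catalogue), such a two-tier set of criteria could be captured as simple structured data, making it easy to instantiate per trial and to link each criterion to the indicators elaborated in D2.2:

# Hypothetical sketch of a two-tier criteria catalogue; not the official FITMAN list.
criteria_catalogue = {
    "business": {
        # Generic business criteria categories, based on SCOR (as listed in the text above).
        "generic": ["customer reliability", "agility", "responsiveness", "cost", "assets"],
        # Trial-specific business criteria categories (as listed in the text above).
        "specific": ["cost", "sustainability", "innovation", "efficiency", "flexibility", "quality"],
    },
    "it": {
        # FITMAN-relevant IT categories plus generic ISO/IEC 9126 quality characteristics.
        "fitman_specific": ["openness", "versatility"],
        "iso_9126": ["functionality", "maintainability", "usability",
                     "reliability", "efficiency", "portability"],
    },
}

# A trial would instantiate a generic category with its own concrete criterion and later
# measure it through one or more indicators (see D2.2); all names below are illustrative.
trial_criterion_example = {
    "category": "responsiveness",
    "trial_statement": "Order status information reaches the customer faster than today.",
    "indicators": ["average order-status update latency"],
}

print(len(criteria_catalogue["business"]["generic"]), "generic business criteria categories")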
4. Guidelines
Following the development of the V&V methodology, the trials’ guidelines were prepared
with a view to providing concrete directions to the trials on how to implement the
methodology in their applications. Timing perspectives, V&V checklists and
recommendations are provided in order to help stakeholders perform the V&V activities
properly.
In particular, the guidelines provided cover the relationships/dependencies and timing
constraints of the FITMAN V&V Method development and application: each step is matched
with the task in which it is performed, its relationships and dependencies, its timing
constraints and the month by which it is to be completed. For each stakeholder, a list of items
to be checked is defined, along with the related timing and responsibility aspects:
The potential functional requirements for the package identification include:
o Integrated information package: The FITMAN V&V package should describe
all the information regarding the criteria, methodologies and indicators.
o Support for trial-specific instantiation: The FITMAN V&V package is used to
instantiate the generic criteria and indicators for a specific trial.
o V&V assessment support and guidance: The FITMAN V&V package should
guide the users through the different phases of V&V assessment.
o Data input and collection: Different indicators require different kinds of data
which are collected from different types of sources.
o Aggregation and analysis of data to calculate indicators (a minimal illustrative
sketch is given after this list).
o Trial result visualization in order to offer the potential to understand and
interpret the trial results.
o Documentation: The FITMAN V&V package should document all the raw
data, query etc. results and the process.
o Assessment status indication, allowing for the identification of the status (the
phases completed or currently active) of the trial assessment either individually
per trial or consolidated at FITMAN level.
o Search functions: The V&V package could include a search function for searching
and identifying information across all FITMAN V&V assessments.
Other recommendations for the FITMAN V&V package include: a simple, self-
explanatory, easy-to-use tool; ease of adoption and instantiation; on-line and/or off-
line usage; Assessment data openness; Modular Implementation/ Operation.
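Purely as an informal illustration of the data collection, aggregation and status-tracking requirements listed above (the indicator, data and status names are hypothetical; the actual V&V Assessment Package is specified in T2.3), a trial-level indicator calculation and status record might look as follows:

# Hypothetical sketch of indicator aggregation and assessment-status tracking for one trial.
from statistics import mean

# Data input and collection: raw measurements gathered from different (illustrative) sources.
raw_data = {
    "order_processing_time_min": [42, 38, 35, 33],   # e.g. extracted from a production system log
    "user_satisfaction_score": [3.8, 4.1, 4.3],       # e.g. collected through questionnaires
}

# Aggregation and analysis of data to calculate indicators.
indicators = {
    "average_order_processing_time_min": mean(raw_data["order_processing_time_min"]),
    "average_user_satisfaction_score": round(mean(raw_data["user_satisfaction_score"]), 2),
}

# Assessment status indication: which V&V phases are completed or currently active for this trial.
assessment_status = {
    "trial": "Illustrative Trial",
    "phases_completed": ["P-1 Code Verification", "P-3 Backlog Verification"],
    "phase_active": "T-1 Trial Solution Validation",
}

# Documentation and trial result visualisation would build on structures such as these.
print(indicators)
print(assessment_status)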
In addition, in order to infuse a very practical and applicable philosophy of the V&V steps to
all stakeholders, a “V&V decalogue” has been constructed including the following
recommendations:
I. Have a deep knowledge of the requirements and the ends sought in each trial.
II. Infuse a V&V culture to your trial.
III. Create a concrete V&V plan to be followed in each step, tailored to your real needs and
competences.
IV. Engage the required stakeholders from the very beginning, assigning them appropriate
responsibilities and decision power.
V. Tap the power of data to effectively conduct V&V activities.
VI. Proceed with each V&V step on the basis of at least 1 recommended technique.
VII. Keep the balance between business and technical V&V aspects.
VIII. Invest in crowd-sourcing techniques aligned with the trial philosophy.
IX. Keep a complete V&V log and document in detail the V&V findings.
X. Treat V&V as a continuous, iterative procedure.
5. Conclusions & Lessons Learnt
Finally, the present report concludes by analysing the emerging perspectives and lessons
learnt, reporting the conclusions derived from the work performed and documented in the
deliverable, and providing directions and recommendations for the next steps. These
indicatively concern the initialization of the FITMAN V&V Assessment Package and the
necessary adaptations and updates to the V&V method during the operation of the project's Trials.
1. Introduction
1.1. Purpose and Scope
Nowadays, Europe is undergoing a major crisis and is called upon to support growth in an
ever-changing environment while simultaneously providing lasting solutions, so as to emerge
stronger from the crisis. The Future Internet Public-Private Partnership Programme (FI-PPP)
(FUTURE INTERNET PPP 2013), in line with the Europe 2020 strategy (European Commission,
Europe 2020 2013) and its flagship initiatives, aims at addressing these challenges
and boosting innovation. “The main goal is to advance a shared vision for harmonised
European-scale technology platforms and their implementation, as well as the integration and
harmonisation of the relevant policy, legal, political and regulatory frameworks” (FUTURE
INTERNET PPP 2013). Following the Phase I projects of the FI-PPP, the FITMAN project
(Future Internet Technologies for MANufacturing industries) aspires to contribute to shaping
the future of the manufacturing domain in line with the Future Internet challenges.
In particular, according to the FITMAN Description of Work (DoW): “The mission of the
FITMAN project is to provide the FI-PPP with a set of industry-led use case trials in the
Smart, Digital and Virtual Factories of the Future domains, in order to test and assess the
suitability, openness and flexibility of FI-WARE Generic Enablers, this way contributing to
the social-technological-economical-environmental-political (STEEP) sustainability of EU
Manufacturing Industries.”
In this context, the work of FITMAN Task 2.1 is to provide a general, yet holistic Verification
and Validation (V&V) framework, which can be applied to all FITMAN’s needs. Based on
the FITMAN DoW, “verification involves reviewing, inspecting, testing, checking, auditing,
or otherwise establishing and documenting whether the enabler specifications meet the Use
Case Trial specified requirements. The validation is the process of evaluating and assessing
that the FI-WARE generic and FITMAN specific enablers satisfy the requirements of Smart-
Digital-Virtual use case trials”. Such a framework also concerns three different aspects:
1. Verifying that the FI-WARE generic and FITMAN specific enablers satisfy the
technological, platform and architectural integration requirements imposed
2. Validating that the FI-WARE generic and FITMAN specific enablers satisfy the
requirements of Smart-Digital-Virtual use case trials, and
3. Identifying the evaluation and assessment criteria to be used in all Use Case Trials,
taking into account sustainability (STEEP) and future business benefits
Thus, the deliverable at hand aims at developing a FITMAN V&V methodology
(incorporating also the aspect of Evaluation) so as to provide the project’s experimentation
sites and test applications (FITMAN Trials) with a structured and clear set of guidelines for
testing, verifying, validating and assessing the FI-WARE core platform and the respective
Generic Enablers (GEs). To this end, the purpose of this report is:
To identify international standards addressing V&V aspects
To extensively review and classify existing V&V approaches/ methodologies along
multiple dimensions relevant to the FITMAN context
To study the results of the FI-PPP Phase I projects with the purpose of investigating
their V&V aspects, lessons learnt and recommendations towards FI-PPP Phase II.
To investigate the potential application of crowd-sourcing techniques in V&V.
To elaborate an integrated, holistic FITMAN V&V methodology, incorporating and
adapting to FITMAN's needs both aspects from previously identified and analysed
approaches/methodologies and new, innovative aspects related to the agile philosophy
and crowdsourcing directions.
To explain the relation of the FITMAN V&V methodology with the FITMAN
Reference Architecture.
To propose specific Business and IT evaluation criteria that will form the basis for
the validation aspects of the V&V method and for the definition of Business and
Technical Indicators in Task 2.2.
To provide a set of guidelines for the application of the FITMAN V&V methodology
by the trials.
To initiate the list of recommendations for the V&V Package that will be developed in
Task 2.3.
As also stated before, the report at hand is released in the context of Task 2.1: FITMAN V&V
Generic Method and Criteria Identification and falls within the scope of Work Package 2:
FITMAN Verification & Validation Method, which has as an overall purpose to develop a
method for the evaluation and assessment of the FITMAN Trials, through:
The identification and integration of existing V&V methods
The description of functional and non-functional technical indicators for evaluating
openness and versatility of FI-WARE in FITMAN trials
The description of business indicators for evaluating the business benefits in the trial
after the adoption of FI-WARE Generic Enablers (GEs)
The integration of technical and business indicators in a generic V&V assessment
package for FI-WARE evaluation in manufacturing smart-digital-virtual factories of
the future
The instantiation of the generic V&V package into the chosen Use Case Trials and
application domains.
In general, Tasks T2.1 (FITMAN V&V Generic Method and Criteria Identification) and T2.2
(FITMAN V&V Business and Technical Indicators Definition) proceed in parallel and give
input to the forthcoming tasks T2.3 (FITMAN V&V Assessment Package) and T2.4
(Instantiation of V&V Assessment Package per Use Case Trial), as depicted in Figure 1-1.
The main points of interaction between T2.1 and T2.2 are the following:
From a methodological perspective, Task 2.1 defines the FITMAN V&V methodology
in which business validation is an integral part. Task 2.2 complements the approach by
defining in detail the methodology regarding how business and technical indicators are
selected.
From a FITMAN application perspective, Task 2.1 initiates the evaluation and
assessment criteria to be used in all Use Case Trials. Task 2.2 builds on such criteria to
describe functional and non-functional technical indicators for evaluating openness
and versatility of FI-WARE in FITMAN trials, as well as to define business indicators
for evaluating the business benefits in the trials.
Figure 1-1 Contribution of T2.1 to the overall FITMAN concept
WP2 plays an instrumental role in FITMAN through a number of interactions with design and
development activities (in WP1 and WP3), trial-related activities (in WP4-WP6), and
assessment and evaluation activities (in WP7-WP9) as reflected in Figure 1-2.
Figure 1-2 WPs and Tasks interrelations with WP2
1.2. Methodological Approach
In order to conceptualize, develop and adapt the proper FITMAN V&V Generic Method and
consequently provide the corresponding criteria and guidelines for its application, a two-way
methodological approach is followed. A top-down approach has been combined with a
bottom-up approach in order to facilitate both the integration of different abstraction levels in
a tight timeframe and the incorporation of business and STEEP sustainability evaluation
aspects. The top-down strand starts from the V&V state-of-the-art analysis and the elaboration
of the V&V baseline method, while the bottom-up strand departs from the tasks performed in
support of the trials' needs, so as to ensure the applicability of the proposed V&V method.
In detail, with the main objective to study the V&V landscape and define the FITMAN V&V
methodology, the work carried out for the present deliverable (in constant alignment with the
activities of T2.2) bears the following steps:
0. Defining the scope
In the context of this initial and preparatory step, the scope of the FITMAN V&V
methodology (and, hence, the scope of the deliverable at hand) is clarified, taking into
account the FITMAN DoW. The main dimensions of the FITMAN V&V methodology are
identified, in order to provide the initial input to the following steps.
1. Analysing the state-of-play
During this step, an extensive state-of-the-art analysis takes place; firstly, in order to
confirm the existence of the recognised dimensions and document that they are actually
needed, and secondly, in order to examine as many relevant aspects as possible and form
a complete picture of the existing V&V and evaluation landscape.
In the context of this analysis, relevant initiatives are looked for and analysed, including:
Both finished and on-going research and/ or applied projects
V&V and Evaluation approaches/ methodologies
Techniques regarding V&V and evaluation
Tools regarding V&V and evaluation
Both business and IT criteria regarding V&V and evaluation
Existing V&V and evaluation standards etc.
During this process, various sources are consulted, including standardization organisations
(e.g. IEEE (IEEE 2013) and ISO (ISO 2013)), electronic publications (e.g. IEEE, Elsevier
(ELSEVIER 2013), Springer (Springer 2013)), academic search engines (e.g. Scopus
(Scopus 2013), ISI Web of Knowledge (Thomson Reuters 2013), Google Scholar (Google
Scholar 2013)), general search engines (e.g. Google (Google 2013), CORDIS (Cordis
2013)), etc.
In order to depict the results of the current step, a proper V&V classification schema is
constructed: apart from a core set of profiling metadata for each approach/methodology/
case, a set of facets relevant to verification, validation and evaluation aspects and to the
applicability to manufacturing settings is identified. Then, the approaches retrieved from
the bibliographic research are mapped against the aforementioned V&V Classification Schema in
order to directly compare their features and capabilities.
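As a purely illustrative sketch of such a facet-based mapping (the facet names, values and approaches below are invented for illustration; the actual schema and the mapping of the retrieved approaches are presented later in Section 2.3), an approach could be profiled and directly compared as follows:

# Hypothetical sketch of a facet-based V&V classification schema; all names are illustrative.
schema_facets = {
    "covers_verification": {True, False},
    "covers_validation": {True, False},
    "covers_evaluation": {True, False},
    "applicable_to_manufacturing": {True, False},
    "maturity": {"research", "applied", "established"},
}

# Core profiling metadata plus facet values for two fictitious approaches.
approaches = [
    {"name": "Approach A", "source": "journal paper",
     "facets": {"covers_verification": True, "covers_validation": True,
                "covers_evaluation": False, "applicable_to_manufacturing": True,
                "maturity": "applied"}},
    {"name": "Approach B", "source": "project deliverable",
     "facets": {"covers_verification": False, "covers_validation": True,
                "covers_evaluation": True, "applicable_to_manufacturing": False,
                "maturity": "research"}},
]

def compare(facet):
    """Directly compare all mapped approaches on a single facet."""
    return {a["name"]: a["facets"][facet] for a in approaches}

print(compare("covers_evaluation"))   # {'Approach A': False, 'Approach B': True}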
It needs to be noted that business performance indicators techniques that were retrieved
during the V&V state-of the art analysis are analysed in detail in the context of Task T2.2.
2. Drafting a common glossary
Moving the SotA Analysis one step further, definitions and terms relevant to Verification,
Validation and Evaluation are thoroughly debated in collaboration with task T2.2 in an effort
to clarify the concepts and define some associated concrete examples. Differences among
the terms (as well as possible overlaps) were identified, so as to help FITMAN adopt a proper
glossary applicable to the project's scope and needs, after taking into consideration a wide
range of reference sources.
3. Initiating the V&V Methodology
In the context of this step, the various phases of the FITMAN V&V methodology, as well as
the relations and interdependencies amongst them (providing all necessary inputs, outputs,
etc.), are explicitly defined and reported. In addition, the various actors that will participate in
(one or more of) the various phases of the FITMAN V&V methodology are identified and
each stakeholder’s point(s) of interaction and role is described.
The rationale for selecting a baseline method and enriching it with aspects from additional
state-of-the-art methods in the various phases is also determined. Each of the aforementioned
phases is described in detail including all steps necessary, recommendations for potential
application of specific V&V techniques and tools, as well as stakeholder engagement
approaches.
4. Aligning the draft V&V Methodology with the FITMAN Reference Architecture
The purpose of this step is to provide a mapping between the project’s V&V phases/ steps and
the FITMAN reference architecture (although in a draft state at the moment). The purpose of
this exercise is to provide the FITMAN Trials a proper V&V methodology in order to help
them review their business case according to the appropriate levels of the FITMAN reference
architecture (Future Internet Cloud, Business Ecosystem, Individual Factory/ Enterprise) it
falls into.
5. Iteratively updating the V&V Methodology
In order to formalize and finalize the draft V&V methodology that was prepared in Step 3, a
set of iterative updates took place to ensure alignment of the validation steps of the V&V
method with Task 2.2 that defined in detail how business and technical indicators are selected.
Feedback from every project partner provided in the FITMAN WP2 physical meeting (that
took place on May 21st and 22nd, 2013 in Helsinki, Finland), the WP2 teleconferences or via
email was iteratively incorporated to ensure that the proposed V&V method meets the
FITMAN expectations from a scientific and application perspective.
6. Elaborating on the Business and IT Criteria
The main purpose of this step is to define all necessary criteria that will accompany the
FITMAN V&V methodology application at trial level. Both generic and trial-specific criteria
are identified, elaborated and reported for the validation steps.
FITMAN Generic Criteria are the horizontal, baseline criteria which concern the Verification
& Validation activities for all possible stakeholders (across all manufacturing trials) and are
divided into two main categories:
Business Criteria, defined as the criteria relevant to business aspects.
IT Criteria, defined as the criteria relevant to IT aspects.
On the other hand, FITMAN Trial-Specific criteria are the criteria extracted from the
particular needs, specifications and aspects of each FITMAN trial based on their business
objectives as expressed in Task T1.1.
On the basis of such criteria extracted for the V&V method, the proper Business Performance
Indicators and Technical Indicators are identified, iteratively aligned and adopted in Task
T2.2.
7. Preparing Trials’ Guidelines
Taking into account that the FITMAN V&V method is inevitably compiled as an integrated
methodological framework from a rather scientific perspective, a set of practical guidelines
needs to be provided, in order to assist all actors involved in the FITMAN V&V activities to
perform their task(s) in an efficient and effective way. Practical guidelines regarding the
timing perspective of each of the methodology’s steps as well as a set of structured checklists
and a “V&V Decalogue” are also provided to the relevant actors, in order to help them
organize their work and pay attention to all critical success aspects.
8. Analysing the emerging perspectives and lessons learnt
On the basis of the experience gained through the previous steps (e.g. the detailed V&V and
Evaluation state-of-the-art analysis, the iterative methodology updates etc.), a set of
conclusions and key take-aways emerge regarding both the concept of V&V and Evaluation,
as well as the most relevant underlying methodologies upon which the FITMAN V&V
method is built.
In summary, Figure 1-3 depicts the methodological approach followed in T2.1.
Figure 1-3 Methodological Approach
1.3. Structure of the Document
The structure of the document is as follows:
Section 2 investigates the current state of the art (e.g. approaches/methodologies,
standards, FI-PPP projects) in Verification and Validation (V&V) approaches,
classifies them along multiple dimensions through a properly developed classification
schema, examines their relevance to the FITMAN needs and concludes with the
lessons learnt and the emerging key take-aways.
In Section 3, the integrated methodological approach that will be followed for the
FITMAN Verification & Validation activities will be presented and described in
detail.
In Section 4, all necessary (business and IT) criteria that accompany the FITMAN
V&V approach in manufacturing settings (i.e. both generic and trial-specific) will be
identified and presented.
Section 5 provides a set of concrete practical guidelines towards all actors involved in
the FITMAN V&V methodology application.
The conclusions deriving from the work performed and documented in the deliverable
at hand, as well as directions and recommendations for next steps will be reported in
the final Section 6.
2. V&V Landscape Analysis
2.1. Definitions and Terms
In general, the international literature is replete with various Validation and Verification
definitions. FITMAN adopts the ANSI/IEEE Std 1012-2012 (IEEE 2012) definition of V&V,
as presented in the following table:
Table 2-1: Verification and Validation Definition
Verification – Definition: The process of providing objective evidence that the software and its associated products conform to requirements (e.g., for correctness, completeness, consistency, accuracy) for all life cycle activities during each life cycle process (acquisition, supply, development, operation, and maintenance); satisfy standards, practices, and conventions during life cycle processes; and successfully complete each life cycle activity and satisfy all the criteria for initiating succeeding life cycle activities (e.g., building the software correctly).
Validation – Definition: The process of providing evidence that the software and its associated products satisfy system requirements allocated to software at the end of each life cycle activity, solve the right problem (e.g., correctly model physical laws, implement business rules, use the proper system assumptions), and satisfy intended use and user needs.
Verification – Question: Are we building the product right?
Validation – Question: Are we building the right product?
Verification – Objective: To ensure that the product is being built according to the requirements and design specifications.
Validation – Objective: To ensure that the product actually meets the user's needs, the specifications were correct in the first place and the product fulfils its intended use when placed in its intended environment.
Indicative alternative definitions of V&V (ESA Board for Software Standardisation and
Control 1995) are provided in the following paragraphs. Although there are no significant
deviations from the meaning of the terms as defined in the IEEE standard, they are provided
for a better understanding of the differences in meaning between the terms 'Validation' and
'Verification'.
Verification is defined as:
act of reviewing, inspecting, testing, checking, auditing, or otherwise establishing and
documenting whether items, processes, services or documents conform to specified
requirements (American National Standard 1978)
process of evaluating a system or component to determine whether the products of a
given development phase satisfy the conditions imposed at the start of the phase
(ANSI/IEEE 1990)
formal proof of program correctness (ANSI/IEEE 1990)
Validation is defined as:
process of evaluating a system or component during or at the end of the development
process to determine whether it satisfies specified requirements (Tran 1999)
Confirmation by examination and through provision of objective evidence that the
requirements for a specific intended use or application have been fulfilled
(International Organization for Standardization 2008).
It has to be mentioned that in the literature, both Validation and Verification are often
considered part of the more generic concept of evaluation. However, evaluation is typically
broader in scope than Validation and Verification, as it also involves answering the
question 'Does the software do something of sufficient value?' (Beacham 2010). The
consideration of whether this software approach is better than any other (traditional) software
approach is also often put on the table. In this context, it is thus important, apart from
determining that the right product is built in the right way, to demonstrate that the product has
a positive benefit for its users, allowing them to do something they could not do before, or to
do something better or faster than they could before. In some cases, a system works correctly
and does what the users want, but is still – for several reasons – not of significant value to its
users. In order to handle this case in FITMAN and
to be able to assess whether the produced outcomes are of real/significant value, the FITMAN
V&V methodology extends the typical V&V concept by including the usage of specific
Business Performance Indicators, through which the business benefits resulting from the
application of the FITMAN trial solutions can be identified and measured.
The following table presents a set of additional terms and definitions adopted in FITMAN
which are relevant to D2.1. The complete glossary of terms adopted for the V&V activities in
alignment with T2.2 is available in Annex I.
Table 2-2: V&V-relevant Terminology
Backlog: A backlog is a set of tasks that must be finished before code can be released. During a product backlog grooming session, the development team works with business owners to prioritize a list of work that is backlogged and break the list into user stories for future use (SearchSoftwareQuality 2010).
Criterion: A standard on which a judgment or decision, both at business and at IT level, may be based. Each criterion should be clearly defined to avoid ambiguity in understanding and prioritizing the differing views that affect a decision or an assessment.
Indicators: Individual quantitative and/or qualitative measures that provide answers as to whether the V&V criteria are met. A complete set of indicators is required in order to assess the extent to which objectives have been achieved on the basis of the defined criteria. Quantitative indicators relate to changes in numbers (e.g., the number of users), whereas qualitative indicators relate to changes in perception (e.g., the opinion of users). Indicators need to be based on a clear idea of which stakeholder group(s) they will be applied to, and to be feasible both technically and financially.
Key Performance Indicators (KPI): Performance Indicators which play an important role in defining the performance of a system (enterprise or trial). Companies usually use the expression Key Performance Indicators (KPIs) at the strategic level, because these PIs evaluate the global situation.
Key Success Factors: Internal or external elements that an organization can control to reach its objectives. They can relate to products (quality), organization (skill), customers (satisfaction) etc.
Objective: Desired outcome/result to be achieved by a company or a part of a company (in the case of FITMAN, by each trial) within a specified time frame and by utilising the available resources. Objectives are defined for each company according to its functions and services and need to be coherent so that the global performance is improved. Each objective must contribute to the achievement of the global objectives.
Performance Indicators: A set of quantifiable measures that a company or industry uses to estimate the efficiency of an action variable or a decision variable, in comparison to the achievement of an objective defined for a system (for example, to evaluate performance on the way to meeting its strategic and operational goals).
Process: An enterprise, as a production system, is constituted by interactive, interdependent and interconnected processes, which are themselves composed of activities.
Sprint: A sprint is a set period of time during which specific work has to be completed and made ready for review (SearchSoftwareQuality 2010). It is an iteration which, after its completion, results in a product version release.
2.2. V&V Standards
In recent years, a plethora of standards referring to V&V processes, as well as to the
evaluation of systems and software, has been released under the auspices of international
standardization organizations. The following are among the most prominent ones:
ANSI/IEEE Std 1012-2012 - IEEE Standard for System and Software Verification and
Validation (IEEE 2012): This standard deals with verification and validation (V&V)
processes and applies to systems, software, and hardware being developed,
maintained, or reused. It was published in May 2012 and is still active. It constitutes a
revision/ descendant of the ANSI IEEE Std 1012-1998 and the ANSI IEEE Std 1012-
2004.
IEEE Std 1059-1993 - IEEE Guide for Software Verification and Validation Plans
(IEEE 1993): This standard specifies the required content of a Software Verification
and Validation Plan (SVVP) and recommends approaches to Verification and Validation
(V&V) planning. It was published in 1993 and is now withdrawn.
ESA PSS-05-0 ESA Software Engineering Standards (ESA Board for Software
Standardisation and Control 1991): This standard derives from the European Space
Agency (ESA), modified to remove ESA-specific terminology. This generalised
version was produced through ESA’s Technology Transfer Programme. It is active
and was first published in 1991, followed by a number of revisions. Among others, it
deals with software verification, validation and quality assurance.
ISO/IEC 25000 series - Systems and software Quality Requirements and Evaluation
(SQuaRE) (International Organization for Standardization 2005): This standard series
addresses software product quality requirements specification, measurement and
evaluation. Quoting from the standard itself “The general goal of creating the
SQuaRE set of International Standards is to move to a logically organized, enriched
and unified series covering two main processes: software quality requirements
specification and software quality evaluation, supported by a software quality
measurement process”. It was introduced in 2005 and is still active.
ISO/IEC 9126 series - Software engineering - Product quality (International
Organization for Standardization 2001): ISO/IEC 9126 is an international standard for the
evaluation of software quality. The fundamental objective of this standard is to
address some of the well-known human biases that can adversely affect the delivery
and perception of a software development project. ISO/IEC 9126 tries to develop a
common understanding of a project's objectives and goals. It was published in 2001.
ISO/IEC 14102:2008 Information technology - Guideline for the evaluation and
selection of CASE tools (International Organization for Standardization 2008): This
standard defines both a set of processes and a structured set of CASE tool
characteristics for use in the technical evaluation and the ultimate selection of a CASE
tool. It follows the software product evaluation model defined in ISO/IEC 14598-
5:1998. It was published in 2008 and is still active.
ISO/IEC 14756:1999 Information technology- Measurement and rating of
performance of computer-based software systems (International Organization for
Standardization 1999): This standard addresses issues on computer software, software
engineering techniques, data processing, performance testing, quality assurance,
conformity, programming languages etc. It was published in January 1999 and is still
active.
ISO/IEC 29155:2011 Systems and software engineering -- Information technology
project performance benchmarking framework (International Organization for
Standardization 2011): This standard identifies a framework which consists of
activities and components that are necessary to successfully identify, define, select,
apply, and improve benchmarking for information technology (IT) project
performance. It also provides definitions for IT project performance benchmarking
terms. It was published in December 2011 and is still active.
Software Engineering Body of Knowledge (SWEBOK - ISO/IEC TR 19759:2005)
(International Organization for Standardization 2005): this guide to the software
engineering body of knowledge (SWEBOK) identifies and describes generally
accepted knowledge about software engineering (divided into 10 knowledge areas). The
guide was published in 2005 (Version 3) and has remained active ever since.
2.3. V&V Approaches State of the Art
In order to gain a thorough and complete view of the existing V&V approaches/
methodologies (as foreseen in the project's DoW), extensive desk research has to take
place. An indicative, yet not exhaustive, list of sources utilised to this end includes:
publications in journals and conferences, books, general search engines and search engines
oriented towards academia and/or research (Scopus, Web of Science etc.), reports, and
standards.
2.3.1. Agile Vs. Waterfall Philosophy in V&V
In the process of reviewing the state-of-the-art in verification and validation methodologies,
especially when dealing with software development, the reader comes across two main
streams: the waterfall model and agile software development.
On the one hand, the waterfall model (Winston 1970) constitutes a step-by-step process; it is
sequential and highly structured. Indicative steps of the waterfall software development model
include Requirements Identification/ Specification, Design, Implementation/ Coding and
Integration. The waterfall model is well-established, has many supporting arguments but has
also received criticism, especially during the last decades. The effort to verify and validate
software developed following the waterfall model has led to one of the most (if not the most)
well-known general approaches for software verification and validation: the V-model
(Waterfall Model 2013), as depicted in the following figure.
Figure 2-1: Example of Traditional V-model (SRM Technologies 2013)
The V-model practically constitutes an extension of the aforementioned waterfall model; in
fact, its left branch is essentially the waterfall model, while its right branch includes
approaches/techniques for the verification and validation of the software under development.
Unit Testing, Integration Testing, System Testing, Operational Testing and Acceptance Testing
form an indicative, yet not exhaustive, list of such verification and validation techniques.
On the other hand, agile software development (Larman 2004) has been gaining more and
more momentum over the last few years. In contrast to the waterfall model, agile software
development is based on an iterative and incremental workflow, aiming at rapid, collaborative
and flexible end-product development. The main idea behind agile software development is
breaking the main project into a number of relatively small increments, each one
developed in short, efficient and effective iterations.
Unlike the V-model, in agile software development, there is no special procedure (or set of
procedures) dedicated to verification and validation. Each individual iteration involves
working in all functions: planning, requirements analysis, design, coding, unit testing,
acceptance testing etc. At the end of the iteration, a working product (regardless of its size and
importance) is demonstrated to stakeholders who provide the necessary feedback. Although
there are voices suggesting that agile is inappropriate for products of a sequential nature, agile
software development is an efficient and effective approach, accompanied by adaptability
and predictability (Boehm and Turner 2004).
Figure 2-2: Example of Agile Software Development (Navy Health Portal 2012)
Due to the broad scope of the desk research and the anticipated heterogeneity of the
available findings/results, it was considered of great value to identify a set of criteria (some
generic, some specific), aiming to facilitate the procedure of recording and classifying all
recognised V&V approaches/methodologies/applications. The aforementioned criteria are
presented in the next section of the document at hand.
2.3.2. Typical V&V Techniques Used in V&V activities
There is a great variety of techniques available for performing V&V activities and tasks. In
the following table, a number of techniques have been categorized according to the
Verification and Validation aspects they cover and are further classified in line with the
SWEBOK Guide 2004 (International Organization for Standardization 2005). It needs to be
noted that some techniques may belong to more than one classification category, but each is
placed under the category which is considered most representative.
Table 2-3: V&V Techniques
Classification Techniques Verification Validation
Review and/ or Revision of Requirements
Program Proving x
Prototype Execution x
Symbolic execution x
Execution Profiling x
Model Validation x
User Interface Analysis x
Requirement Reviews x x
Traceability analysis x x
Consistency Checking x x
Inspection x x
Walk-through x x
Management reviews x x
Technical reviews x x
Audits x x
Desk Checking x x
Quality measurement by metrics x x
Code based Techniques
White box testing x
Model Checking/ Algorithmic analysis x
Reference models for code-based testing (flow graph, call graph) x x
Graph Based Analysis x x
Coverage Based Testing x x
Construction Quality Techniques
Unit testing x
Code stepping x
Use of assertions (Inductive Assertions, Assertion Checking) x
Debugging x
Integration testing (Bottom Up Testing, Top-Down Testing) x x
Test-first development (Acceptance Test-Driven Development, Test-Driven Development) x x
Submodel Testing x x
System Testing x x
Software Engineer’s Intuition and Experience based Techniques
Ad hoc (e.g. Formalized techniques/ descriptions) x
Exploratory testing x x
Fault based Techniques
Error guessing x
Mutation testing x
Objective Based Techniques
Complexity-Based Testing/ Complexity Analysis x
Control flow analysis x
Performance Testing (Load testing, Stress testing, Endurance/ soak testing, Spike testing) x
Configuration testing
Isolation testing
Field Testing
Back-to-back testing x
User acceptance testing or Beta Testing x
Usability and human computer interaction testing x
Functional testing x
Correctness testing x
Visualisation x
Reliability and evaluation achievement x x
Regression Testing x x
Alpha Testing x x
Installation Testing x x
Security testing x x
Recovery testing x x
Path Analysis x x
Input/ Output Behaviour based Techniques
Black box testing x
Software Design Quality Analysis and Evaluation
Simulation and prototyping (Performance simulation, Feasibility prototype) x x
Face Validation x
Software design reviews (Architecture reviews, Design reviews, Scenario-based techniques) x x
Fault-tree analysis x x
Automated cross-checking x x
Specification-based Techniques
Equivalence partitioning x
Pairwise testing x
Boundary-value analysis x
Random testing x
Decision table x x
Finite-state machine-based x x
Testing from formal specifications x x
Usage-based Techniques
Operational profile x
User observation heuristics x
Statistical testing/ Statistical Techniques x
Induction, Inference, and Logical Deduction x
Usage Based Testing x x
Software Reliability Engineered x x
Proof of correctness x x
Selecting and Combining Techniques
Functional and structural x
Deterministic vs. random x
Nature of Application based Techniques
Object-oriented testing x x
Component-based testing x x
Web-based testing
Testing of concurrent programs x x
GUI testing x x
Protocol conformance testing
Testing of distributed systems x x
Testing of real-time systems x x
Safety Analysis Techniques
Fault tree analysis x x
Failure mode effect and criticality analysis (FMECA) x x
Business Validation: Assessment of the newly developed software in terms of performance in meeting and contributing to business requirements
Analytic Modelling x
Simulation x
Technology Readiness Level (TRL) x
Manufacturing Readiness Level x
DEA Model x
Standard Processes (e.g. SOPs) x
Workforce (e.g. Cross-Training) x
Visual Management (e.g. Lean Tracking) x
Quality Assurance (e.g. Pareto Inspection) x
Workplace Organisation (e.g. 5S) x
Material Flow (e.g. MTM Analysis) x
Lot Sizing (e.g. KANBAN Systems) x
Continuous Improvement (e.g. PDCA) x
ZIP Regression x
ANOMR/ANOMRV Chart x
P.E.R.T./C.P.M. x
GRAI Grid x
Balanced scorecard x
ECOGRAI Methodology for decisional models x
Manufacturing and Value Chains’ SCOR-VCOR methods x
Evaluation: Selection and/ or prioritization of software solutions to meet specific requirements
COTS component selection x
DESMET Methodology x
Weighted Scoring Method x
Delphi x
Nominal Group x
ANP/AHP x
Fuzzy Logic x
MACBETH x
Fuzzy AHP x
Fuzzy TOPSIS x
In the next paragraphs, the most commonly used techniques, considered most relevant to the
FITMAN concepts, are described. It needs to be noted that techniques like BSC (Balanced
Scorecard), ECOGRAI, TOPP SYSTEM and ENAPS are analysed in the context of
Deliverable D2.2 and are thus not included in this deliverable.
2.3.2.1 Inspection
Goal Inspection constitutes the most common review technique (Kan 2004)
(Rico 2013).
In terms of verification:
The goal of inspection is to identify software defects in order to
eliminate them and detect anomalies and faults in software
documentation.
In terms of validation:
The ultimate goal is that all the actors involved in the inspection
tasks will consent to a work product and approve it for use in the
project.
Description The inspection technique was first introduced by Michael Fagan in the
mid-1970s and has been later extended and modified (Fagan 1986). The
inspection process may consist of multiple steps so as to perform the
inspection tasks and segregate the inspection functions as follows (IEEE
2004):
a) Inspection planning: Planned by the moderator.
b) Product overview: The work product background is described by the
author.
c) Inspection preparation: Inspectors examine the work product to identify
possible defects.
d) Examination meeting: The reader reads through the work product, part by part, and each
inspector points out the defects for every part.
e) Defect rework: The author modifies the work product according to the
action plans extracted from the inspection meeting.
f) Resolution follow-up: The changes by the author are reviewed and
checked to make sure everything is correct.
An inspection is performed by a small team of peer developers. The team
members undertake separate roles to find, classify, report, and analyse
defects in the product. These roles usually are the author, the moderator,
the reader, the recorder/scribe and the inspector. Each type of inspection is
specifically defined by its intended purpose, required entry criteria, defect
classification, checklists, exit criteria, designated participants, and its
preparation and examination procedures. Inspections do not debate
engineering judgments, suggest corrections, or educate project members;
they detect anomalies and problems and verify their resolution by the
author (IEEE 2004)
Offerings Rapid detection of anomalies and faults in documents in both
natural language and also in more formalized descriptions.
When the kind of anomaly or fault is well known and simple to
detect, readers with less experience than the authors may be used.
(International Atomic Energy Agency 1999)
More effective and efficient than many, if not most lifecycle
framework V&V techniques.
Inspections are an order of magnitude more effective and efficient
than testing, minimizing the need for testing-intensive V&V.
(Rico 2013)
Restrictions The readers should have sufficient expertise.
Known types of fault may be well identified but difficult to
discover.
(International Atomic Energy Agency 1999)
Disadvantages Depending on the readers’ expertise, the technique may be limited
to the discovery of known predictable faults and would not then
cover all aspects of the documents.
The technique is best used to check a limited volume of
documentation. It should be adapted for inspection of a large
volume of documentation produced by a development team.
(International Atomic Energy Agency 1999)
Tools Although the inspection process depends mostly on human activity and is not
automated or significantly tool assisted (International Atomic Energy Agency 1999), a list
of inspection tools, based on (Laitenberger and DeBaud 2000), is presented below:
ICICLE (Brothers, Sembugamoorthy and Muller 1990)
Scrutiny (Gintell, Houde and McKenney 1995)
CSRS (Johnson and Tjahjono 1993)
InspecQ (Knight and Myers 1993)
ASSIST (Macdonald 1997)
CSI/ CAIS (Mashayekhi, et al. 1993)
InspectA (Murphy and Miller 1997)
Hypercode (Perpich, et al. 1997)
ASIA (Stein, et al. 1997)
Perspectives Taking into consideration its wide range of applications in V&V approaches,
and having studied its appropriateness for application in the framework of future internet
services for the manufacturing domain, the inspection technique is considered a suitable
candidate technique for the current work.
2.3.2.2 Walk-through
Goal Walkthrough is an informal review technique, in which development
personnel lead others through a structured examination of a product.
(IEEE 2004)
In terms of verification, the goals of walk-through are:
To present the documents both within and outside the software discipline, in order to
gather information regarding the topic under documentation.
To explain the document or transfer knowledge, and to evaluate its contents.
To achieve a common understanding and to gather feedback.
Check completeness and consistency of documents resulting from
a creative activity (e.g. design) or requiring a high level of
justification (e.g. test plans);
(ISTQB GUIDE in STATIC TECHNIQUES n.d.)
Reveal faults in all phases of program development. The method
must be adapted for documents issued from a team
(International Atomic Energy Agency 1999)
In terms of validation
To give an overview of the totality of activities users perform, and
to gain an appreciation of how the software will be used.
(Perry 2006)
To validate the system’s behaviour simulated in the scenario
(Maiden, et al. 1999)
To examine and discuss the validity of the proposed solutions
(ISTQB GUIDE in STATIC TECHNIQUES n.d.)
Description The major roles involved in performing the walkthrough tasks are the
following:
Leader: conducts the walkthrough meetings and handles
administrative tasks so as to ensure that the process will be
conducted orderly
Recorder: notes all anomalies, decisions, and action items
identified during the meetings.
Author: mainly presents the software product in a step-by-step manner during the
walk-through meetings; in addition, he/she may also be responsible for completing most
action items.
Often the same person undertakes both the leader’s and the author’s roles.
With these actors involved, the walkthrough process can be briefly described as follows: the
author describes the artefact to be reviewed to the reviewers during the meeting; the
reviewers present comments, possible defects and improvement suggestions to the author;
the recorder notes all defects and suggestions raised during the walkthrough meeting; based
on the reviewers’ comments, the author performs any necessary rework of the work product;
the recorder then prepares the minutes of the meeting and sends them to the relevant
stakeholders; and the leader monitors the overall walkthrough activities, including the
follow-up of commitments against action items, as per the defined company process or
responsibilities for conducting reviews. (soft-engineering.com 2010)
Offerings Walk-through can be organized in a short time.
It is best used to check documents resulting from a creative
activity.
It can be applied effectively to design documents or documents
requiring a high level of justification, such as test plans.
The freedom allowed to the discussion (the only constraint is to
discuss all the documents) permits coverage of all aspects of the
documents and a check for completeness, consistency and
adequacy.
Walk-through does not require extensive preparation.
The whole process can be concentrated in a few days.
It also serves the purpose of sharing information among the
development team.
It can be helpful in training newcomers.
(International Atomic Energy Agency 1999)
Consciousness raising: senior people get new ideas and insights from junior people
Enables observation of different approaches to software analysis, design, and
implementation
"Peer pressure" improves quality
Promotes backup and continuity: reduces the risk of discontinuity and "useless code", since
several people become familiar with parts of the software that they may not have otherwise
seen
(s-qa.blogspot 2013)
Restrictions Walk-throughs must be foreseen and planned by the development
team, in order to ensure the availability of readers and authors.
(International Atomic Energy Agency 1999)
Disadvantages The walk-through team should involve persons independent of the
development team, to avoid bias in their criticism.
As the preparation time is short, some critical parts may be missed
or not sufficiently considered.
The technique is best used to check documents issued by a single
author, and it should be adapted to tackle documents issued by a
team.
(International Atomic Energy Agency 1999)
Tools The walkthrough process is totally dependent on human activity and not
automated or tool assisted. (International Atomic Energy Agency 1999)
Perspectives Walkthrough is an easy-to-apply and quite common V&V technique that is
considered useful and has strong potential to be implemented in manufacturing ICT
projects.
2.3.2.3 Traceability analysis/ assessment
Goal Software traceability is the ability to relate artefacts created during the
development of a software system to describe the system from different
perspectives and levels of abstraction with each other, the stakeholders
that have contributed to the creation of the artefacts, and the rationale that
explains the form of the artefacts. (Spanoudakis and Zisman 2005)
The goal, in terms of both verification and validation, is to verify for those artefacts that the
requirements have been exhaustively taken into account, by checking their trace in the
resulting documentation, and to verify that all critical requirements have been identified and
included throughout the development life cycle, i.e. from requirements analysis to final
tests. (International Atomic Energy Agency 1999)
Description Traceability analysis may be carried out by a single person. It requires an
in depth understanding of the input requirements and also of the method or
technique used to develop the target documentation. When necessary,
traceability analysis may involve a step to confirm the requirements in
order to be certain to trace the actual requirements and not merely chapters
or paragraphs of the input documentation.
A knowledge of specific probabilistic reliability allocation techniques may
be necessary to carry out the traceability analysis of reliability
requirements, and similar expertise may also be needed for other non-
functional requirements. The analysis results in a document justifying that
all the requirements have been taken into account properly. The document
will identify, requirement by requirement, which part or component, or
chapter of the documentation, fulfils it. The results are usually
summarized in the form of a matrix of input requirements versus elements
of the target documentation. The matrix can also be extended to include
the source code. (International Atomic Energy Agency 1999)
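To illustrate the idea of a traceability matrix in code, the following minimal sketch (in Python, using entirely hypothetical requirement identifiers and artefact names) shows how trace links captured during the analysis could be turned into a requirements-versus-artefacts matrix and how untraced requirements could be flagged; it is purely illustrative and not part of the referenced guidance.

requirements = ["R-01", "R-02", "R-03"]          # hypothetical requirement IDs

# Trace links captured during the analysis: requirement -> artefacts that fulfil it.
trace_links = {
    "R-01": ["Design 3.1", "Code: parser module", "Test TC-07"],
    "R-02": ["Design 4.2"],
    # "R-03" intentionally has no recorded trace.
}

# Build the requirements-versus-artefacts matrix and flag untraced requirements.
artefacts = sorted({a for links in trace_links.values() for a in links})
matrix = {r: {a: a in trace_links.get(r, []) for a in artefacts} for r in requirements}
untraced = [r for r in requirements if not trace_links.get(r)]

print("Untraced requirements:", untraced)        # -> ['R-03']
for r in requirements:
    print(r, " ".join("x" if matrix[r][a] else "." for a in artefacts))

In practice, such a matrix would typically be maintained in a requirements management tool (such as those listed under Tools below) rather than in ad hoc scripts.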
Offerings Confirms the links between the documents and, if applied during
development, strengthens those links
(International Atomic Energy Agency 1999)
Customer satisfaction. Customer can check and follow the product
development progress according to their requirements
A higher level of details for the software engineers helps in impact
analysis
(Torkar, et al. 2012)
Coverage analysis is easier to execute: since all requirements are traced to higher-level
requirements, it can be verified that all requirements are satisfied.
A better design is the result of requirements tracing, as "best" designs are possible when
complete information is available to the architect.
Fewer code reworks are another benefit, as tracing enables the team to catch potential
problems earlier in the process.
Change management is improved: when a requirement is changed, the entire "trace" can be
reviewed for its impact on the application.
All these result in a shorter development cycle and more on-time launches.
(Stout)
Restrictions A person must be available with a deep understanding of the
requirements and of the way the target documents have been
developed.
Traceability analysis is made easier if the input requirements can
be simply referred to without ambiguity.
Some predefined rules should be followed; for example, one
requirement per sentence or numbering of requirements allows a
simple trace reference method.
(International Atomic Energy Agency 1999)
Disadvantages Traceability analysis is easier to apply for functional requirements.
Non-functional requirements, such as safety, security and
maintainability requirements, are less simple. These may be
fulfilled by a certain structure or by a co-operative behaviour of
the components. For these, the traceability matrix may often
usefully be accompanied by a case by case justification.
(International Atomic Energy Agency 1999)
Requirements traceability was perceived to offer no immediate
benefits and hindered the main development process. This view
was expressed most strongly by development managers and project
leaders.
(Arkley and Riddle 2005)
It takes extra time and effort to keep up with the tracing of
requirements, as well as increased maintenance effort, especially in
an evolving system that requires numerous changes for future
releases.
When the traceability process is established, a certain amount of
change management needs to be employed, especially with regard
to developers, who need to be convinced that the minor daily effort
of requirements tracing will have payoffs in the future.
The tendency to not know when to stop tracing: some tracing may
go “so deep” that the effort is counterproductive.
(Stout)
Tools Although establishing the links is a process driven by human activities that cannot be
automated, tools exist which are able to capture input documents, target documents and the
traceability links between them. It may be assumed that other tools can also be used, e.g.
hypertext and electronic documentation tools. (International Atomic Energy Agency 1999)
According to (Stout), such requirements traceability tools include the following:
AnalystStudio
Caliber-RM
CORE
CRADLE/REQ
DOORS
DOORS/ERS
DOORSrequireIT
GMARC
icCONCEPT
IrqA
ITraceSE
RDT
RequisitePro
RIMS
RTM
RTS
SLATE
Systems Engineer
Tofs
Tracer
Vital Link
XTie-RT
Perspectives Traceability analysis is widely utilised and incorporated as a requirements
review technique and has proven its added value and effectiveness in many applications and
approaches. Thus, it is considered highly appropriate and useful for V&V activities
regarding ICT projects in the manufacturing domain.
2.3.2.4 Model Checking
Goal Model checking is mainly a verification technique and the term model
checking designates a collection of techniques for the automatic analysis
of reactive systems. The goal is to find subtle errors in the design of
safety-critical systems that often elude conventional simulation and testing
techniques. (Merz 2001)
Description Software model checking is the algorithmic analysis of programs to prove
properties of their executions. It traces its roots to logic and theorem
proving, both to provide the conceptual framework in which to formalize
the fundamental questions and to provide algorithmic procedures for the
analysis of logical questions. (JHALA)
The inputs to a model checker are a (usually finite-state) description of the
system to be analysed and a number of properties, often expressed as
formulas of temporal logic, that are expected to hold of the system. The
model checker either confirms that the properties hold or reports that they
are violated. In the latter case, it provides a counterexample: a run that
violates the property. Such a run can provide valuable feedback and points
to design errors.
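As a purely illustrative sketch of the idea (not of any specific model checker), the following Python fragment explores the finite state space of a hypothetical two-process model by breadth-first search and checks the safety property "both processes are never simultaneously in their critical section"; because the toy model uses no locking, the search returns a counterexample run.

from collections import deque

def successors(state):
    # Non-deterministic transitions of each process; no locking on purpose.
    pc1, pc2 = state
    step = {"idle": "trying", "trying": "critical", "critical": "idle"}
    yield (step[pc1], pc2)
    yield (pc1, step[pc2])

def violates(state):
    # Safety property: both processes must never be critical at the same time.
    return state == ("critical", "critical")

def check(initial):
    # Breadth-first exploration; returns a counterexample path or None.
    queue, parent = deque([initial]), {initial: None}
    while queue:
        state = queue.popleft()
        if violates(state):
            path = []
            while state is not None:
                path.append(state)
                state = parent[state]
            return list(reversed(path))          # run that violates the property
        for nxt in successors(state):
            if nxt not in parent:
                parent[nxt] = state
                queue.append(nxt)
    return None                                   # property holds for the model

print(check(("idle", "idle")))

Real model checkers (see Tools below) work on far richer modelling languages and temporal logics, but the underlying principle of exhaustive state-space exploration with counterexample generation is the same.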
Offerings It has been proven cost-effective
It integrates well with conventional design methods
(Merz 2001)
Restrictions The object of analysis is always an abstract model of the system
Standard procedures such as code reviews are necessary to ensure
that the abstract model adequately reflects the behaviour of the
concrete system in order for the properties of interest to be
established or falsified.
(Merz 2001)
Disadvantages Quite frequently, available resources only permit the analysis of a rather
coarse model of the system. A positive verdict from the model checker is then of limited
value, because bugs may well be hidden by the simplifications that had to be applied to the
model.
Counter-examples may be due to modelling artefacts and no longer
correspond to actual system runs
(Merz 2001)
Tools AlPiNA: (AlPiNA : an Algebraic Petri Net Analyzer)
BLAST Berkeley Lazy Abstraction Software Verification Tool
(Beyer, Thomas, et al. 2007)
CADP (Garavel, Langy and Mateescuz 2001)
CHESS (Microsoft Research)
CHIC Checker for Interface Compatibility
CPAchecker (Beyer and Stahlbauer 2013)
ÉCLAIR (bugseng)
FDR2 (Roscoe n.d.)
ISP (Vakkalanka, et al. 2008)
Java Pathfinder (JPF)
LTSmin (FMT Formal Methods & Tools)
Markov Reward Model Checker (MRMC)
McErlang (McErlang)
mCRL2 (Hansen, et al. 2010)
MoonWalker (MoonWalker)
NuSMV (NuSMV)
PAT (PAT: Process Analysis Toolkit)
Prism (Kwiatkowska, Norman and Parker 2002)
REDLIB (sourceforge)
Roméo (Roméo - A tool for Time Petri Nets analysis)
SMART Model checker (smart)
SPIN (Verifying Multi-threaded Software with Spin)
TAPAs (TAPAs)
TAPAAL (TAPAAL)
TLA+ (microsoft research)
UPPAAL (UPPAAL)
Vereofy (Vereofy / TU-Dresden)
Perspectives The wide range of applications, the usefulness and the great availability of
information and tools for model checking make this technique a candidate for incorporation
in a V&V methodology that concerns manufacturing software services.
2.3.2.5 User Acceptance or Beta Testing
Goal User Acceptance Testing (UAT) is the last phase of the software testing
process and is mainly a validation technique. During UAT, actual software
users test the software to make sure it can handle required tasks in real
world scenarios, according to specifications. UAT is also known as Beta
testing, application testing or end user testing (techopedia)
Description UAT directly involves the intended users of the software. UAT can be
implemented by making software available for a free beta trial on the
Internet or through an in-house testing team comprised of actual software
users.
Following are the steps involved in in-house UAT:
Planning: The UAT strategy is outlined during the planning step.
Designing test cases: Test cases are designed to cover all the
functional scenarios of the software in real-world usage. They are
designed in a simple language and manner to make the test process
easier for the testers.
Selection of testing team: The testing team is comprised of real
world end-users.
Executing test cases and documenting: The testing team executes
the designed test cases. Sometimes it also executes some relevant
random tests. All bugs are logged in a testing document with
relevant comments.
Bug fixing: Responding to the bugs found by the testing team, the
software development team makes final adjustments to the code to
make the software bug-free.
Sign-off: When all bugs have been fixed, the testing team indicates
acceptance of the software application. This shows that the
application meets user requirements and is ready to be rolled out in
the market. (techopedia)
Offerings It helps demonstrate that required business functions are operating
in a manner suited to real-world circumstances and usage.
(techopedia)
The satisfaction of the clients is increased, as they are more confident that the requirements
are met, without the ‘fear’ of how the product will behave in the real environment and of
whether a critical issue will arise when least expected.
The quality criteria of the product are defined in the early phase of development/
implementation.
Vendors have improved communication, both with the clients and inside the team, as the
requirements definition is improved through the acceptance tests and signed off by the
client.
The engineering team faces less pressure during implementation and a lower risk of
post-implementation live fixes.
Stakeholders use the information gathered through UAT to better
understand the target audience’s needs
(Benefits of User Acceptance Testing (UAT))
Restrictions All other forms of testing are carried out by the engineering team
to ensure that the system works technically, according to their
understanding of the business requirements. UAT should be
carried out by the users or business consultants to determine
whether the software fits their use or not.
(Benefits of User Acceptance Testing (UAT))
Disadvantages Automated test case generation at acceptance test level is not
usually possible.
(ESA Board for Software Standardisation and Control 1995)
It can be expensive to manage resource deficits in order to conduct proper user acceptance
tests.
(Benefits of User Acceptance Testing (UAT))
Tools Cucumber, a BDD acceptance test framework
RSpec
Shoulda
(Behaviour Driven Development with Cucumber, RSpec, and
Shoulda n.d.)
Fabasoft app.test for automated acceptance tests (Fabasoft 2013)
FitNesse, a fork of Fit (FitNesse)
Framework for Integrated Test (Fit) (Fit: Framework for Integrated
Test)
iMacros (iOpus)
ItsNat Java Ajax web framework with built-in, server based,
functional web testing capabilities. (ItsNat)
Ranorex (Ranorex)
Robot Framework (RobotFramework)
Selenium (SeleniumHQ)
Test Automation FX (TestAutomationFX)
Watir (watir)
Perspectives Performing the UAT procedure before releasing new software is critical and
essential. In the same spirit, manufacturing ICT projects shall incorporate UAT before
introducing their newly developed software to the market.
2.3.2.6 Alpha Testing
Goal Alpha testing is the first test of newly developed hardware or software in a
laboratory setting.
In terms of verification:
The main purpose of alpha testing is to refine the software product
by finding (and fixing) the bugs that were not discovered through
previous tests.
(techopedia)
In terms of validation
When the first round of bugs has been fixed, the product goes into
beta test with actual users. For custom software, the customer may
be invited into the vendor's facilities for an alpha test to ensure the
client's vision has been interpreted properly by the developer.
(pcmag.com)
Description Alpha testing makes use of prototypes to test the software before its beta stage.
However, the software cannot be expected to possess the complete functionality for which it
is designed, especially at this stage. This is because alpha testing is usually conducted to
ensure that the software being developed provides all necessary core functions, accepts all
inputs and gives the expected outputs.
Alpha tests are conducted in the software developer’s offices or on designated systems, so
that the developers can monitor the test and record any errors or bugs that may be present.
Thus, some of the most complex code is developed during the alpha testing stage.
Furthermore, the project manager handling the testing process will need to discuss with the
software developer the possibilities of integrating the results procured from the alpha
testing process into future software design plans, so that potential future problems can be
avoided.
Before consumers receive the software system, it should be ensured that it
provides plenty of improvements in terms of stability and robustness.
(Exforsys Inc)
Offerings Alpha tests allow the developers to see the software working in an actual,
practical setting.
Consumers provide the necessary feedback about the problems they encountered and the
things they expect once the software is launched in the market, giving the end-users’
perspective so that software developers can address any possible issues.
Alpha testing provides better insights about the software’s reliability and robustness at its
early stages.
Alpha testing provides early detection of design issues and errors, preventing potential
problems in the future.
(Exforsys Inc)
Restrictions The number of consumers who test the software should be strictly limited,
because the software is not yet ready for a commercial launch at this point.
The users who would test the software must be aware of the
software’s limited functionality so that they can create unbiased
and accurate reviews of the software’s problems and other design
issues.
Software engineers should take the responsibility of educating all users about the limited
capability of the software at this point. In addition, they should also include an advisory
letter along with the software package that mentions the limited capability of the software.
Customers should also understand that they will most likely experience frustrating problems
if they run the alpha-stage software as if it were a finished product.
Since alpha testing makes use of undeveloped prototypes, this type
of testing is only good for the early stages of the software. It is
almost impossible to conduct an in-depth reliability test or an
installation test, and even documentation tests at this point,
because the software is still undeveloped. Therefore, it is very
difficult to address these concerns accurately at the alpha test
stage.
It is also very critical to provide foolproof security and safety for the user system.
(Exforsys Inc)
Disadvantages Consumers who test the product during the alpha stage may end up
dissatisfied with the results. Even when they are provided with a notice about its
developmental stage, users may still feel that the software given to them was unable to meet
their needs and requirements.
At times, some of the feedback received from consumers during the alpha release can also
be accurate, because they encountered some sort of unsatisfactory or bad experience while
using the software.
(Exforsys Inc)
Tools The tools are to some extent common with the beta testing tools, as the only
difference between the two tests is that alpha testing is employed as a form of internal
acceptance testing before the software goes to beta testing.
Perspectives Prior to releasing new software to a considerable number of users, the alpha
testing procedure is critical before proceeding to UAT, so as to ensure that the newly
developed software is ready to be given to a wider audience. Thus, manufacturing ICT
projects shall also perform alpha tests before releasing new software.
2.3.2.7 White box testing
Goal The goal of white box testing, which refers mainly to the verification
aspect of V&V procedure, is
To find program bugs, referring them to corresponding parts of the
specification, and
To check systematically that all corresponding requirements are
met, and that there is no unreachable source code.
(International Atomic Energy Agency 1999)
Description The tester chooses inputs to exercise paths through the code and determines the
appropriate outputs. Programming know-how and implementation knowledge are essential.
White box testing is testing beyond the user interface and into the nitty-gritty of a system.
(Software Testing Fundamentals)
The chosen part of the program is executed together with a test data set.
This must be chosen from a global test data set which refers to the relevant
section of the design specification or is provided by automated test data
set generation. Appropriate test sets should be chosen to detect program
faults systematically. The paths through the code should be systematically
traced. This can be done using code instrumentation. The instrumentation
of source code can be assisted by various available compilers and testing
tools. When the test routine is first run, all counters implemented via code
instrumentation are reset. Starting the program module with a relevant
data test set, certain counters will be incremented. Thus, a corresponding
trace can be shown for that test set. The test routine is repeated until a
specified set of structural elements of the program, such as a sequence of
statements, all branches or a proportion of paths, has been covered. Thus,
it is possible to show a certain test coverage measure or metric.
(International Atomic Energy Agency 1999)
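The following minimal sketch (in Python, with a hypothetical function classify()) illustrates the instrumentation idea described above: counters attached to each branch are reset, test data sets are executed, and a simple coverage measure is derived from the counter values; it is an illustration only, not a substitute for the instrumentation support of compilers or testing tools.

# Branch counters implemented by hand; real tools insert such probes automatically.
branch_hits = {"negative": 0, "zero": 0, "positive": 0}

def classify(x):
    if x < 0:
        branch_hits["negative"] += 1
        return "negative"
    elif x == 0:
        branch_hits["zero"] += 1
        return "zero"
    else:
        branch_hits["positive"] += 1
        return "positive"

# Test data sets chosen from the code structure so that every branch is exercised.
for value in (-5, 0, 7):
    classify(value)

covered = sum(1 for hits in branch_hits.values() if hits > 0)
print(f"branch coverage: {covered}/{len(branch_hits)}", branch_hits)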
Offerings The software module can be thoroughly, but not necessarily
exhaustively, tested.
Automation is possible to minimize the effort required to complete
the whole test routine.
Certain test coverage metrics can be produced.
It can be shown that there is no dead, unused or unreachable
source code within the software module.
Relevant test data sets can be chosen out of the global test set
according to source code documents.
By looking at counter values, it is possible to mark a certain trace.
This is of importance for analysing traces for software modules
containing WHILE or CYCLE constructs.
(International Atomic Energy Agency 1999)
As the tester has knowledge of the source code, it becomes very
easy to find out which type of data can help in testing the
application effectively.
It helps in optimizing the code.
Extra lines of code, which could otherwise introduce hidden defects, can be removed.
Due to the tester's knowledge about the code, maximum coverage
is attained during test scenario writing.
(tutorialspoint)
Testing can be commenced at an earlier stage.
Testing is more thorough, with the possibility of covering most
paths.
(Software Testing Fundamentals)
Restrictions The specification of the program requirements must be well
structured.
Relevant test data sets should be generated, and automatic test set
generation may be needed for effective testing.
(International Atomic Energy Agency 1999)
Disadvantages White box testing can only show that there is no unreachable
source code hidden in the software module, but there is no real
possibility of detecting faults or inconsistencies within the
requirement specification.
With increasing complexity of software modules, the number of
test runs increases more than exponentially
Normally, 100% path coverage is not feasible owing to the huge
number of paths even in a simple program. (International Atomic
Energy Agency 1999) Sometimes it is impossible to look into
every nook and corner to find out hidden errors that may create
problems as many paths will go untested. (tutorialspoint)
The correct outputs, as success criteria, can sometimes be difficult
to determine in advance.
(International Atomic Energy Agency 1999)
Since tests can be very complex, highly skilled resources with thorough knowledge of
programming and implementation are required (Software Testing Fundamentals); because a
skilled tester is needed to perform white box testing, costs are increased. (tutorialspoint)
White box testing is difficult to maintain, as the use of specialized tools such as code
analysers and debugging tools is required. (tutorialspoint)
Test script maintenance can be a burden if the implementation
changes too frequently.
Since this method of testing is closely tied to the application being tested, tools to cater to
every kind of implementation/platform may not be readily available.
(Software Testing Fundamentals)
Tools LDRA Testbed™
ADA Test™
(International Atomic Energy Agency 1999)
The following is a non-exhaustive classification of white box testing tools; under each class,
plenty of tools are available (White Box Testing):
Code Complexity Tools
Code Coverage Tools
Code Beautifier Tools
Memory Testing and Profiling Tools
Mock Frameworks Tools
Model Based Testing Tools
Performance Testing Tools
Source Code Static Analysis Tools
Unit Testing Tools
Perspectives White Box Testing constitutes a whole class which includes Unit Testing,
Integration testing, Regression testing (L. Williams 2006) and other
testing techniques useful and appropriate for applications that combine the
future internet and manufacturing aspects.
2.3.2.8 Black box testing
Goal Black box testing, which refers mainly to the validation aspect of V&V
procedure, attempts to find errors in the external behaviour of the code in
the following categories:
incorrect or missing functionality,
interface errors,
errors in data structures used by interfaces,
behaviour or performance errors, and
initialization and termination errors.
(L. Williams 2006)
In that way black box testing aims at checking whether there are any
program errors in meeting particular aspects of the given specification and
whether all specified requirements have been met. (International Atomic
Energy Agency 1999)
Description A test data set is derived solely from the specification and then the chosen
software module is executed. The correctness of the software module is
verified by testing against the specification.
Functional black box testing covers:
—Equivalence partitioning;
—Boundary value analysis;
—Cause–effect graphing.
(International Atomic Energy Agency 1999)
With black box testing, the software tester does not (or should not) have
access to the source code itself. The code is considered to be a “big black
box” to the tester who can’t see inside the box. The tester knows only that
information can be input into the black box, and the black box will send
something back out. Based on the requirements knowledge, the tester
knows what to expect the black box to send out and tests to make sure the
black box sends out what it’s supposed to send out.
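As a minimal illustration of specification-based test design, the sketch below (Python, with a hypothetical specification "is_valid_age returns True for integer ages from 18 to 65 inclusive") derives test cases using equivalence partitioning and boundary value analysis from that specification alone; the body of the function is shown only so that the example runs, and would not be consulted by a black box tester.

def is_valid_age(age):
    # Implementation under test; its internals are irrelevant to the test design.
    return 18 <= age <= 65

# One representative value per equivalence partition: below, inside and above the valid range.
partition_cases = {10: False, 40: True, 90: False}
# Values on and immediately around the partition boundaries.
boundary_cases = {17: False, 18: True, 65: True, 66: False}

for age, expected in {**partition_cases, **boundary_cases}.items():
    assert is_valid_age(age) == expected, f"specification violated for age {age}"
print("all specification-derived test cases passed")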
Offerings The software module is normally tested in its final form on
hardware that is the same as or similar to that in which the
program will run.
Information is obtained if any specified requirement is missing.
Demonstration that the chosen software module meets all specified
requirements is possible.
Black box testing helps to show whether or not the software
module fulfils specified requirements.
(International Atomic Energy Agency 1999)
Efficient when used on large systems or large code segments
(Code Project) (tutorialspoint)
Since the tester and developer are independent of each other,
testing is balanced and unprejudiced.
Tester can be non-technical. (Code Project) Large numbers of
moderately skilled testers can test the application with no
knowledge of implementation, programming language or operating
systems. (tutorialspoint)
Tests will be done from an end user's point of view, because the
end user should accept the system.
Testing helps to identify vagueness and contradictions in
functional specifications.
Test cases can be designed as soon as the functional specifications
are complete.
(Code Project)
Code Access not required.
Clearly separates user's perspective from the developer's
perspective through visibly defined roles.
(tutorialspoint)
Restrictions The specification of the software module must be well structured.
(International Atomic Energy Agency 1999)
Disadvantages It is very difficult to define relevant test data sets comprehensively
enough to detect errors with maximum probability.
This test method only detects non-conformance and it cannot be
used to prove the correctness of a software module.
The method is applied very late in the software life cycle.
(International Atomic Energy Agency 1999)
Test cases are challenging and difficult to design without having
clear functional specifications. (Code Project) (tutorialspoint)
It is difficult to identify tricky inputs if the test cases are not
developed based on specifications.
It is difficult to identify all possible inputs in limited testing time.
As a result, writing test cases may be slow and difficult.
There are chances of having unidentified paths during the testing
process.
There is a high probability of repeating tests already performed by
the programmer.
(Code Project)
Limited Coverage since only a selected number of test scenarios
are actually performed.
Inefficient testing, due to the fact that the tester only has limited
knowledge about an application.
Blind Coverage, since the tester cannot target specific code
segments or error prone areas.
Tools Typhon (nccgroup)
MatriXay (DBAPPSecurity)
NGSSQuirreL (nccgroup)
Watchfire AppScan (IBM)
Cenzic Hailstorm (CENZIC)
Burp Intruder (Portswigger)
Acunetix Web Vulnerability Scanner (acunetix)
WebSleuth - http://www.sandsprite.com (sandsprite)
NT Objectives NTOSpider (NTObjectives n.d.)
Fortify Pen Testing Team Tool (fortifysoftware)
Sandsprite Web Sleuth (sandnsprite)
MaxPatrol Security Scanner (Positive Technologies)
N-Stalker Web Application Security Scanner (N-Stalker)
Perspectives Black Box Testing is a general class that incorporates Integration testing,
Functional and System testing, Acceptance testing, Regression testing,
Beta testing (L. Williams 2006) and other testing techniques which are
useful for validating software. Thus, the black box testing approach is considered
appropriate to be taken into account for manufacturing software applications.
2.3.2.9 Unit testing
Goal The primary goal of unit testing, which is only a verification technique, is
to take the smallest piece of testable software in the application, isolate it
from the remainder of the code, and determine whether it behaves exactly
as you expect. (MSDN)
Description Unit testing is a software development process in which the smallest
testable parts of an application, called units, are individually and
independently scrutinized for proper operation. (searchsoftwarequality)
Each unit is tested separately before integrating them into modules to test
the interfaces between modules. (MSDN). Unit testing is often automated
but it can also be done manually. This testing mode is a component of
Extreme Programming (XP), a pragmatic method of software
development that takes a meticulous approach to building a product by
means of continual testing and revision. (searchsoftwarequality)
Offerings Unit testing has proven its value in that a large percentage of
defects are identified during its use.
(MSDN)
Unit testing involves only those characteristics that are vital to the
performance of the unit under test. This encourages developers to
modify the source code without immediate concerns about how
such changes might affect the functioning of other units or the
program as a whole.
Once all of the units in a program have been found to be working
in the most efficient and error-free manner possible, larger
components of the program can be evaluated by means of
integration testing.
(searchsoftwarequality)
Restrictions Rigorous documentation must be maintained.
(searchsoftwarequality)
Disadvantages Unit testing can be time-consuming and tedious. It demands
patience and thoroughness on the part of the development team.
Unit testing must be done with an awareness that it may not be
possible to test a unit for every input scenario that will occur when
the program is run in a real-world environment.
(searchsoftwarequality)
Tools There is a vast variety of unit test frameworks, depending on the programming
language used.
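For illustration, the following minimal sketch uses Python's built-in unittest framework to test a hypothetical smallest unit, discount(), in isolation from the rest of an application; the function and its expected values are assumptions made up for the example.

import unittest

def discount(price, percent):
    # Unit under test: apply a percentage discount to a price.
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountTest(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(discount(200.0, 25), 150.0)

    def test_zero_discount_returns_original_price(self):
        self.assertEqual(discount(99.99, 0), 99.99)

    def test_invalid_percentage_is_rejected(self):
        with self.assertRaises(ValueError):
            discount(100.0, 150)

if __name__ == "__main__":
    unittest.main()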
Perspectives Unit testing is essential before proceeding to integration, in order to detect as
many defects as possible in each unit separately. Unit testing can be performed under both
the agile and the waterfall schools and, in general, under most development approaches. In
that way, unit testing is quite useful, fully applicable and a strong candidate for involvement
in verifying manufacturing software services.
2.3.2.10 Integration testing
Goal Integration testing is a software testing methodology, which aims at
testing individual software components or units of code (techopedia)
The goal of integration testing is
As a validation technique to provide feedback as a user of the
integrated components, and
As a verification technique to verify interaction between various
software components and detect interface defects.
Description The IEEE defines integration testing as ‘testing in which software
components are combined and tested to evaluate the interaction between
them’.
Component integration testing is an indispensable testing phase. Software
components can be incorporated into a system as units, or multiple
components may work together to provide system functionality. A
component, whether integrated individually or with other components,
requires integration testing, i.e. the testing of component interactions as
envisaged in the actual usage environment before actual deployment.
The testing of component deployment can be further divided into two sub-
phases. In the first sub phase, the component can be tested as integrated in
an environment made up of stubs that roughly implement the components
as envisaged by the specifications. In this manner it is possible to check at
an early stage whether the component correctly interacts with the ‘ideal’
environment. In the second sub-phase the real integration between several
chosen components is verified. To achieve this, the interactions among the
actual implementations of the architectural components during the
execution of some test cases are monitored, and undesired interactions can
possibly be detected.
It is worth noting that potential mismatches discovered by the component
user during integration testing are not, in general, ‘bugs’ in the
implementation. Rather, they highlight the non-conformance between the
expected component and the component being tested (and hence the need
to look for other components).
A particular case of integration testing is when a real component comes
equipped with the developer’s test suite, and the assembler re-executes
those tests in their environment. They can possibly include test cases that,
although not relevant to the user’s specific purposes, might still be useful
in evaluating the behaviour of the component under the user’s envisaged
conditions. (Rehman, et al. 2007)
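The stub-based sub-phase described above can be illustrated with the following minimal Python sketch, in which a hypothetical OrderService component is integrated against a stub that roughly implements a specified inventory interface; the names and behaviour are invented for the example and merely show how interactions across the interface can be checked before the real component is integrated.

class InventoryStub:
    # Stub roughly implementing the inventory interface as specified.
    def __init__(self, stock):
        self.stock = stock
        self.calls = []                           # record interactions for later checking

    def reserve(self, item, quantity):
        self.calls.append((item, quantity))
        return self.stock.get(item, 0) >= quantity

class OrderService:
    # Component under integration test.
    def __init__(self, inventory):
        self.inventory = inventory

    def place_order(self, item, quantity):
        return "accepted" if self.inventory.reserve(item, quantity) else "rejected"

# Integration test: verify the interaction between the component and its environment.
stub = InventoryStub(stock={"bolt": 10})
service = OrderService(stub)
assert service.place_order("bolt", 4) == "accepted"
assert service.place_order("bolt", 50) == "rejected"
assert stub.calls == [("bolt", 4), ("bolt", 50)]  # interface used as expected
print("component interacts with the stubbed environment as expected")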
Offerings Essential in obtaining an acceptable level of reliability.
These tests guarantee that the ‘intentions’ of the developer are respected in the final
environment, and their execution generally leads to a more comprehensive evaluation.
(Rehman, et al. 2007)
Restrictions Integration tests need to be written to help determine if the
interfaces between modules are working as defined in the design
specifications. (Rakitin, Software Verification and Validation for Practitioners and
Managers 2001)
Disadvantages Automated test case generation at integration test level is not
usually possible. (ESA Board for Software Standardisation and Control 1995)
Tools ObjectPlanner, ObjectSoftware Inc.
GUI
Cantata
Autotest
Cantata++
(Pohjolainen 2002)
Perspectives As integration testing ensures that the software components are interacting
and operating properly, it is considered essential in the V&V procedures
and in that context in the manufacturing ICT applications as well.
2.3.2.11 System testing
Goal The goal of system testing, which is mainly a validation technique, is simply to test the
system as a whole and to compare the software against the software requirements.
Description System testing is performed by the component user when all the various
components are integrated and the entire system is ready to run. System
testing encompasses load and performance testing, so as to test the
functionality of the whole system. The emphasis of system testing is not
restricted to the components undergoing testing. Rather, these components
are tested within the context of the system, in order to assess the behaviour
and performance of the whole system as a black box. (Rehman, et al.
2007)
Offerings In the long term, it can assist in staying within budget and meeting deadlines.
(Chaira)
Restrictions The more documentation is produced, in terms of documented errors and
detailed test plans, the quicker the product can be finished with fewer errors or bugs.
(Chaira)
Disadvantages Automated test case generation at system test level is not usually possible.
(ESA Board for Software Standardisation and Control 1995)
The time taken to complete this testing depends on how many errors or bugs appear during
the testing phase; this stage can therefore take a long time to complete.
(Chaira)
Tools System testing includes all the following types of tests, and under each test plenty of
tools are available:
Graphical user interface testing
Usability testing
Software performance testing
Compatibility testing
Exception handling
Load testing
Volume testing
Stress testing
Security testing
Scalability testing
Sanity testing
Smoke testing
Exploratory testing
Ad hoc testing
Regression testing
Installation testing
Maintenance testing
Recovery testing and failover testing.
Accessibility testing
Perspectives System tests assess whether the system as a whole is compliant with the
software requirements. This is an indispensable step before releasing any software product;
thus, system testing is appropriate in the context of the current work.
2.3.3. V&V Approaches
2.3.3.1 V&V Classification Schema
As mentioned in the previous paragraphs, a V&V reporting and classification schema has been
developed and applied in the context of the FITMAN V&V state-of-the-art analysis. A set of
appropriate facets, relevant to verification, validation and evaluation aspects and to the
applicability to manufacturing settings, has been identified; these facets are described below:
Coverage (Holistic/ Partial) indicating whether the reported approach/ methodology/
application refers to the whole FITMAN V&V lifecycle (verification/ validation/
evaluation), or deals with a subset of it. The possible values of this facet are: Holistic/
V&V/ Verification (Ver)/ Validation (Val)/ Evaluation (Ev).
Baseline that aims to indicate whether the approach/ methodology/ application under
review utilizes either the waterfall development model or the agile development model
as a baseline. The possible values are Waterfall/ Agile/ Hybrid.
Analysis Level aiming to provide information on how analytical the approach/
methodology/ application under review appears to be. The possible values of this facet
are High/ Medium/ Low. The assessment is based on the values of the five (5)
following facets, namely: Roles’ Identification, Stakeholders’ Engagement, Definition
of Criteria, Techniques Applied, and Tool(s) Involvement.
Roles’ Identification indicating whether the approach/ methodology/ application under
study identifies and reports different stakeholders’ profiles (e.g. customer,
development team, product owner). The possible values are Yes/ No.
Stakeholders’ Engagement: This facet goes one step further than the previous one by
indicating the proposed method for directly engaging (some of) the stakeholders, if any
(e.g. requirements identification from the end-customer via a requirements’ description
template). The possible values of this facet are Open (e.g. through a Crowdsourcing
Method)/ Restricted (to a specific audience)/ Not available.
The exact method applied should be provided/ reported in summary.
Definition of Criteria: This classification facet aims to report whether specific/
appropriate criteria are proposed for the approach/ methodology/ application under
review. The possible values are Yes/ No.
Techniques Applied that identifies and reports whether the approach/ methodology/
application under review proposes/ applies any specific techniques for its
implementation. The possible values include every specific methodology described in
the following section.
Tool(s) Involvement with the purpose of identifying whether the approach/
methodology/ application under review reports and proposes specific IT (or of other
nature) tool(s) aiding its implementation. The possible values are Fully/ Partially/ No.
Applied, denoting whether the approach/ methodology/ application under review has
been applied to a real-life case/ scenario, regardless of its nature/ domain. This piece
of information is particularly important, as information on actual application may
indicate the success and appropriateness (or, on the other hand, the unsuitability) of
a specific V&V approach. The possible values are Yes/ No. In case of a positive
answer, a link to the trial case should be provided/ reported.
Application Area(s), utilised to indicate the application area of the reported approach.
It has a broad spectrum of possible values, including Software, Hardware, Supply
Chains, Documentation etc.
Support from Community to report whether the approach/ methodology/ application
under review has active support from any type of community (e.g. open source
community). The possible values are Yes/ No. In case of a positive answer, the
community supporting the approach/ methodology/ application will be reported.
Maturity with the scope to identify and report the level of maturity of the approach/
methodology/ application under analysis. The possible values are Inception/ Traction/
Hyper-Growth/ Maturity/ Decline.
Relation to Manufacturing referring to whether the reported approach/ methodology/
application presents relation to manufacturing procedures or not. The possible values
are Yes/ No/ Possibly.
FITMAN Assessment: Based on the values of all the aforementioned facets, the value
of this facet indicates whether the reported approach/ methodology/ application has
the potential to add value to the FITMAN V&V methodology. The possible values of
this criterion are High/ Medium/ Low.
Last but not least, one extra cell will be provided in order to report the source of all previous
information (e.g. website, standard, book, white paper). An illustrative encoding of the overall schema is sketched below.
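To make the classification schema more tangible, the following minimal sketch (illustrative only; the class, field and function names are ours and are not part of the schema itself) shows how a single record of the mapping tables below could be encoded, including a simple heuristic for deriving the Analysis Level from the five facets mentioned above. The majority-style scoring rule is an assumption made for illustration; the deliverable does not prescribe an exact formula.

from dataclasses import dataclass
from enum import Enum
from typing import List, Optional

class Coverage(Enum):
    HOLISTIC = "Holistic"
    VV = "V&V"
    VERIFICATION = "Ver"
    VALIDATION = "Val"
    EVALUATION = "Ev"

class Level(Enum):
    HIGH = "High"
    MEDIUM = "Medium"
    LOW = "Low"

@dataclass
class VVClassificationRecord:
    """One row of the V&V Approaches' Mapping, following the facets above."""
    title: str
    coverage: Coverage                  # Holistic / V&V / Ver / Val / Ev
    baseline: str                       # Waterfall / Agile / Hybrid
    roles_identification: bool          # Yes / No
    stakeholders_engagement: str        # Open / Restricted / Not available
    definition_of_criteria: bool        # Yes / No
    techniques_applied: List[str]       # e.g. ["Inspection", "Walk-Through"]
    tools_involvement: str              # Fully / Partially / No
    applied: bool                       # applied to a real-life case?
    application_areas: List[str]        # Software, Hardware, Supply Chains, ...
    support_from_community: bool        # Yes / No
    maturity: str                       # Inception / Traction / Hyper-Growth / Maturity / Decline
    relation_to_manufacturing: str      # Yes / No / Possibly
    fitman_assessment: Level            # High / Medium / Low
    source: Optional[str] = None        # website, standard, book, white paper

    def analysis_level(self) -> Level:
        """Derive the Analysis Level from the five analysis-related facets.
        The simple count-based rule below is assumed for illustration only."""
        score = sum([
            self.roles_identification,
            self.stakeholders_engagement != "Not available",
            self.definition_of_criteria,
            bool(self.techniques_applied),
            self.tools_involvement != "No",
        ])
        return Level.HIGH if score >= 4 else Level.MEDIUM if score >= 2 else Level.LOW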
2.3.3.2 V&V Approaches Mapping
In this section, the approaches/ methodologies/ applications retrieved from bibliographic
research in multiple online and conventional sources are mapped against the V&V
Classification Schema (the criteria defined and described in the previous section) in order to
directly compare their features and capabilities.
Title | Analysis Level | Roles' Identification | Stakeholders' Engagement | Definition of Proposed Criteria | Techniques Applied | Tool(s) Involvement
Guideline for V&V Embedded
Software Systems Medium Yes No Yes
Static Analysis, Simulation, Model Checking,
Symbolic Execution, Inspection, Walk-Through,
Testing
No
Verification, Validation and
Testing of Computer Software Low No No No
Walk-Through, Inspection, Review, Proof-of-
Correctness, Simulation, Cause-Effect Graphing,
Coverage-Based Testing, Complexity-Based Testing,
Symbolic Execution
Yes – Automated Correctness Proving
Software Evaluation: Criteria-
based Assessment Medium No
Yes -
Questionnaire Yes No No
V&V of SW Related to Nuclear
Power Plant Instrumentation and
Control
High No No Yes No
Yes – Compilers, CAD Tools,
Symbolic Execution Tools
(MALPAS, SPADE), based on
Programming Languages (e.g. ADA,
OCCAM)
Expert System Verification and
Validation: A Survey and
Tutorial
Medium No No Yes No
Yes – Knowledge Acquisition Tools,
Conflict Resolution Tools (OPS5),
Graph-based (COVER)
Concepts of Model Verification
and Validation Medium No No No No No
CHARMY: A Framework for
Designing and Verifying
Architectural Specifications
High No No No No Yes – CHARMY Tool, Wright,
LTSA
A Methodology for V&V of
Models and Simulations:
Acquirers’ View Point
Medium No No Yes
Face Validation, Inspection, Review, Walk-Through,
User Interface Analysis, Traceability Assessment,
Sensitivity Analysis, Comparison Testing, Functional
Testing
No
Verifying Architectural Design
Rules of the Flight Software
Product Line
Medium No No No N/A Yes – Code Verification (SAVE),
Rules Verification (RPA, Grep)
Formal Methods for Verification
and Validation of partial
specifications: A Case Study
Low No Yes -
Questionnaire No N/A No
Guide to Software Verification and Validation High Yes No No Walk-Through, Audit, Testing (Unit, Integration, System, Acceptance), Inspection Yes – Reviewing, Tracing, Proof, Testing
Software Verification and
Validation Medium No No Yes
Testing (Specification-based, Model-based, Program-
based, Usage-based)
Yes – Test Generation Tools,
Mutation Testing (e.g. Mothra)
Verification and Validation of
Simulation Models Medium No No No Comparison, Testing No
Software Verification and
Validation for Practitioners and
Managers
High Yes Yes -
Checklists Yes
Inspection, Testing (Unit, Integration, Validation,
Alpha and beta, Acceptance), Review
Yes – Configuration Management
Tools, Automated Test Tools,
Reliability Modelling Tools,
Complexity Analysis Tools
Software Verification and
Validation: An Engineering and
Scientific Approach
Medium Yes No No Testing (Integration, System, Acceptance) No
Verification and Validation in
Scientific Computing High No No
Yes -
Mathematical Inspection, Testing (Unit, Component, System)
Yes – Diagnostic/Debugging Tools,
Simulation Tools, Automatic Static
Analysers
Automated Software Testing Low No No No Testing (Unit, Integration, System) Yes – Automated Test Tools
A Formal Verification and
Validation Approach for Real-
Time Databases
Medium No No No
Audit, Inspection, Face Validity, Walk-Through,
Review, Turing, Testing (Turing, Correctness Proof,
Basis-Path, Execution, Regression, Assertion, Bottom-
Up, Top-Down, Visualisation, Field, Functional,
Stress, Structural, Acceptance, Alpha)
No
Verification, Validation and
Testing of Engineered Systems High Yes No Yes
Inspection, Walk-Through, Audit, Peer Review,
Testing
Yes – Simulation Tools, System
Design Tools, Testing Tools,
Statistical Analysis Tools, ARM Tool,
QuARS Tool, CAE Tools
DMO Verification and
Validation Manual High Yes No Yes
Test, Simulation, Modelling, Demonstration,
Inspection, Experiments, Trials, Audit Walk-Through
Yes – CORE, DOORS, RDT, ADO,
RANTEAA, DTRIALS, T-Plan
More testers – The effect of
crowd size and time restriction in
software testing
Medium No
Yes –
Graphical User
Interface
Yes – Measures
(Effectiveness,
Validity)
Testing N/A
Crowd sourced software Testing Low Yes Yes - Testers No Yes (Crowd Testing) N/A
uTest Medium No Yes - Testers No Yes (Crowd Testing – Functional, Security, Load, Localisation, Usability) uTest
The tailoring process in the
German V-Model Low No No Yes - Conditions N/A N/A
Validation of Agent-Based
Simulation through Human
Computation: An Example of
Crowd Simulation
Low No No Yes Yes – Crowd Simulation N/A
Evaluation Methods in Research High Yes N/A Yes Yes (Document study, Focus group, Interview,
Observation, Questionnaire) No
Fundamentals of Performance
Evaluation of Computer and
Telecommunication Systems
Medium No No No Yes (Analytic Modelling, Simulation, Measurement
and Testing)
Yes – Benchmark programs (WebTP,
NpBench, Probing, GRA, NetBench
etc.)
Impact Evaluation in Practice Medium No No Yes – Evaluation
questions N/A No
A Framework to Evaluate the
Informatization Level Medium Yes No Yes N/A No
A Balanced Analytic Approach to
Strategic Electronic Commerce
Decisions: A Framework of the
Evaluation Method
Medium Yes Yes – Pairwise
Comparisons Yes ANP No
Evaluation Based on Critical
Systems Heuristics Low Yes Yes No No No
Systems Evaluation: Methods,
Models, and Applications Medium Yes Yes No ANP, Delphi, Nominal Group, Brainstorming No
Evaluating the operational
performance of manufacturing
enterprises: an evaluation
framework and application to
Pakistani industry
Low No No Yes N/A No
Strategic Evaluation of Advanced
Manufacturing Technology Medium No No Yes N/A No
Manufacturing evaluation system based on AHP/ANP approach for wafer fabricating industry Yes Yes Yes ANP/AHP No
A comprehensive methodology
for manufacturing system
evaluation and comparison
Medium No No Yes N/A No
An evaluation methodology for
hotel electronic channels of
distribution
Medium No No Yes Delphi No
A modelling and evaluation
methodology for E-Commerce
enabled BPR
Low No No Yes GRAI Grid No
Strategic Evaluation of
Manufacturing Technologies Medium No No Yes Technology Readiness Level (TRL) N/A
Performance and Evaluation of
Manufacturing Systems Medium Yes No Yes N/A No
Manufacturing Performance:
Evaluation and Determinants Medium No No Yes DEA Model No
Evaluating Manufacturing
System Design and Performance
with the Manufacturing System
Design Decomposition Approach
Medium No No Yes MSDD No
Lean Manufacturing
Performance – Metrics and
Evaluation
Low No No Yes N/A No
A systematic approach to
evaluate the process
improvement in Lean
Manufacturing Organizations
Medium No No Yes
Standard Processes (e.g. SOPs), Work Force (e.g.
Cross-Training), Visual Management (e.g. Lean
Tracking), Quality Assurance (e.g. Pareto Inspection),
Workplace Organisation (e.g. 5S), Material Flow (e.g.
MTM Analysis), Lot Sizing (e.g. KANBAN Systems),
Continuous Improvement (e.g. PDCA)
See “Techniques Applied”
Quality-driven workforce performance evaluation based on robust regression and ANOMR/ANOMRV chart High No No Yes ZIP Regression, ANOMR/ANOMRV Chart N/A
A methodology for effective
implementation of lean strategies
and its performance evaluation
in manufacturing organizations
High No No No CPM Measurement Tools
Lean performance evaluation of
manufacturing systems: A
dynamic and innovative
approach
Low No No Yes (Metrics) Fuzzy Logic N/A
(Value, Risk)-based performance
evaluation of manufacturing
processes
Medium No No Yes MACBETH N/A
Novel methodologies and a
comparative study for
manufacturing systems
performance evaluations
Medium No No Yes Fuzzy AHP (AHP comparison matrix with default
pairwise evaluation of factors using fuzzy numbers) N/A
A combined methodology for
supplier selection and
performance evaluation
Medium No No Yes Fuzzy AHP, Fuzzy TOPSIS N/A
A methodology for evaluation of
metal forming system design and
performance via CAE simulation
High No No Yes CAE Simulation N/A
Performance evaluation of
reconfigurable manufacturing
systems via holonic architecture
and the analytic network process
High Yes No Yes AHP for RMS No
Performance evaluation of
manufacturing systems based on
dependability management
indicators—case study: chemical
industry
Medium No No Yes (Indicators) PCA, DEA, AHP No
Table 2-4: V&V Approaches’ Mapping based on the first set of Criteria
Title | Coverage | Baseline | Applied | Appl. Area(s) | Support from Community | Maturity | Relation to Manufacturing | Source | FITMAN Assessment
V&V of SW Related to Nuclear
Power Plant Instrumentation and
Control
V&V Waterfall
Yes – Nuclear Power
Plants in various
Countries
Software N/A Possibly
(International
Atomic Energy
Agency 1999)
Medium
Guideline for V&V Embedded
Software Systems V&V Waterfall No Software N/A Traction No (ITEA 2001) Medium
Verification, Validation and Testing
of Computer Software V&V Waterfall No Software N/A Inception No
(Richards, Branstad
and Cherniavsky
1962)
Low
Software Evaluation: Criteria-based
Assessment Val Hybrid N/A Software N/A Inception No
(Jackson, Crouch
and Baxter 2011) Medium
Expert System Verification and
Validation: A Survey and Tutorial Holistic Waterfall N/A Expert Systems N/A Traction No
(O'Keefe and
O'Leary 1993) Low
Concepts of Model Verification and
Validation V&V N/A N/A Models N/A Inception No
(Thacker, et al.
2004) Low
CHARMY: A Framework for
Designing and Verifying
Architectural Specifications
Ver N/A Yes Specs (Charmy
2013) Traction Possibly
(Pelliccione,
Inverardi and
Muccini 2009)
Medium
A Methodology for V&V of Models
and Simulations: Acquirers’ View
Point
V&V Waterfall
Yes – Middle East
Technical University
Project
Models, Simulations N/A Inception Possibly (Molyer 2001) Medium
Verifying Architectural Design Rules
of the Flight Software Product Line Ver N/A N/A Design Rules N/A Inception Possibly
(Ganesan, et al.
2009) Low
Formal Methods for Verification and
Validation of partial specifications:
A Case Study
V&V N/A
Yes – Space station
communication bus
verifications
Specs N/A Traction No (Easterbrook and
Callahan 1998) Low
Guide to Software Verification and Validation V&V Waterfall N/A Software N/A Inception Possibly (ESA Board for Software Standardisation and Control 1995) High
Software Verification and Validation V&V Waterfall N/A Software N/A Inception No (Kung and Zhu
2008) Low
Verification and Validation of
Simulation Models V&V N/A N/A Simulation Models N/A Inception Possibly N/A Low
Software Verification and Validation
for Practitioners and Managers V&V Waterfall N/A
Software, Models,
Simulation N/A Traction Possibly
(Rakitin, Software
Verificaiton and
Validation for
Practitioners and
Managers 2001)
High
Software Verification and
Validation: An Engineering and
Scientific Approach
V&V Waterfall N/A Software N/A Traction Possibly (Fisher 2006) Medium
Verification and Validation in
Scientific Computing Ver Hybrid
Yes – Various Test
Cases Software, Code N/A Traction Possibly
(Oberkampf and
Roy 2010) Medium
Automated Software Testing V&V N/A N/A Software N/A Inception Possibly (Dasso and Funes
2006) Low
A Formal Verification and
Validation Approach for Real-Time
Databases
V&V N/A N/A Databases N/A Inception No (Dasso and Funes
2006) Low
Verification, Validation and Testing
of Engineered Systems V&V Hybrid N/A Software, Systems N/A Inception Possibly (Engel 2010) High
DMO Verification and Validation
Manual V&V Waterfall N/A Capability Solutions N/A Inception Possibly (DMO 2004) Medium
More testers – The effect of crowd
size and time restriction in software
testing
V&V N/A
Yes – Software
Quality Assurance
Experiment Case
Software N/A Inception Possibly (Mäntylä and
Itkonen 2013) Medium
Crowd sourced software testing V&V N/A N/A Software N/A Traction Possibly (Sridharan 2013) Medium
uTest V&V N/A Yes Web, Desktop and
Mobile Applications
Yes –
Commercial
Product
Maturity Possibly (uTest 2013) Medium
The tailoring process in the German V-Model V&V Waterfall N/A Software, IT projects N/A Traction Possibly (Plogert 1996) Medium
Validation of Agent-Based
Simulation through Human
Computation: An Example of Crowd
Simulation
Val N/A Yes – Test Case Agent-based
Simulation N/A Inception Possibly (Xing, et al. 2012) Low
Evaluation Methods in Research Ev N/A Yes – Case studies General research N/A Traction Possibly (Bennett 2003) Medium
Fundamentals of Performance
Evaluation of Computer and
Telecommunication Systems
Ev N/A N/A
Computer and
Telecommunication
Systems
N/A Traction Possibly (Obaidat and
Boudriga 2010) Low
Impact Evaluation in Practice Ev N/A N/A Policy Decisions N/A Inception No (Gertler, et al.
2011) Low
A Framework to Evaluate the
Informatization Level Ev N/A N/A Various N/A Inception Possibly
(Van Grembergen
2001) Low
A Balanced Analytic Approach to
Strategic Electronic Commerce
Decisions: A Framework of the
Evaluation Method
Ev N/A N/A eCommerce N/A Inception Possibly (Van Grembergen
2001) Medium
Evaluation Based on Critical
Systems Heuristics Ev N/A Yes – Case studies Various N/A Traction No
(Williams and
Imam 2007) Low
Systems Evaluation: Methods,
Models, and Applications Ev N/A N/A Various N/A Traction Possibly (Liu, et al. 2011) Medium
Evaluating the operational
performance of manufacturing
enterprises: an evaluation framework
and application to Pakistani industry
Ev N/A Yes – Pakistani
Industry Industry N/A Traction Yes (Sarmad 1985) Low
Strategic Evaluation of Advanced
Manufacturing Technology Ev N/A Yes – Case Studies Manufacturing N/A Inception Yes (Kakati 1997) Medium
Manufacturing evaluation system
based on AHP/ANP approach for
wafer fabricating industry
Ev N/A Yes – Trial Case Manufacturing N/A Inception Yes (Yanga, Chuangb
and Huangc 2009) High
A comprehensive methodology for manufacturing system evaluation and comparison Ev N/A Yes – Trial Case Manufacturing Systems N/A Inception Yes (Troxler and Blank 1989) Medium
An evaluation methodology for hotel
electronic channels of distribution Ev N/A Yes – Trial Case
Electronic Distribution
Channel N/A Inception No
(O’Connora and
Frewb 2004) Low
A modelling and evaluation
methodology for E-Commerce
enabled BPR
Ev N/A Yes – Trial Case BPR N/A Inception No
(Tatsiopoulos,
Panayiotou and
Ponisa 2002)
Low
Strategic Evaluation of
Manufacturing Technologies Ev N/A Yes – Trial Case
Manufacturing
Technologies N/A Inception Yes
(Reinhart,
Schindler and
Krebs 2011)
Medium
Performance and Evaluation of
Manufacturing Systems Ev N/A
Yes – Use Case
Scenario Manufacturing N/A Inception Yes (Hon 2005) Medium
Manufacturing Performance:
Evaluation and Determinants Ev N/A
Yes – Use Case
Scenario Manufacturing N/A Inception Yes
(Leachman, Pegels
and Shin 2005) Medium
Evaluating Manufacturing System
Design and Performance with the
Manufacturing System Design
Decomposition Approach
Ev N/A N/A Manufacturing N/A Inception Yes (Cochran and
Dobbs 2003) Medium
Lean Manufacturing Performance –
Metrics and Evaluation Ev N/A N/A Manufacturing N/A Inception Yes (Andreeva 2009) Low
A systematic approach to evaluate
the process improvement in Lean
Manufacturing Organizations
Ev N/A Yes – Use Case
Scenario Manufacturing N/A Inception Yes
(Amin and Karim
2011) Medium
Quality-driven workforce
performance evaluation based on
robust regression and
ANOMR/ANOMRV chart
Ev N/A Yes – Real-world
Case Study
Workforce
Performance
Evaluation
N/A Inception Yes (Wang, Dessureault
and Liu 2013) Medium
A methodology for effective
implementation of lean strategies
and its performance evaluation in
manufacturing organizations
Ev N/A
Yes - Real Life
Manufacturing
Environment
Manufacturing N/A Inception Yes (Karim and Arif-
Uz-Zaman 2013) Medium
Lean performance evaluation of
manufacturing systems: A dynamic
and innovative approach
Ev N/A N/A Manufacturing
Systems N/A Inception Yes
(Behrouzi and Yew
Wong 2011) Low
(Value, Risk)-based performance
evaluation of manufacturing
processes
Ev N/A Yes – Case Study Manufacturing
Processes N/A Inception Yes (Shah, et al. 2012) Medium
Novel methodologies and a
comparative study for manufacturing
systems performance evaluations
Ev N/A Yes – Test Case Manufacturing
Systems N/A Inception Yes
(Göleça and Taşkın
2007) Medium
A combined methodology for
supplier selection and performance
evaluation
Ev N/A Yes – Car Factory Supplier Performance
Evaluation N/A Inception Yes
(Zeydana, Çolpanb
and Çobanoğlu
2011)
Medium
A methodology for evaluation of
metal forming system design and
performance via CAE simulation
Ev N/A Yes – Case Study System Design N/A Inception Possibly (Fua, et al. 2007) Medium
Performance evaluation of
reconfigurable manufacturing
systems via holonic architecture and
the analytic network process
Ev N/A Yes – Case Study
Reconfigurable
Manufacturing
Systems
N/A Inception Yes (Abdia and Labibb
2010) High
Performance evaluation of
manufacturing systems based on
dependability management
indicators—case study: chemical
industry
Ev N/A Yes – Chemical
Industry Case Study
Manufacturing
Systems N/A Inception Yes
(Rezaie,
Dehghanbaghi and
Ebrahimipour
2009)
Medium
Table 2-5: V&V Approaches’ Mapping based on the second set of Criteria
2.3.4. Community-based V&V Insights
Until a few years ago, software verification and validation were typically considered
“in-house” processes of software development, since the only stakeholders involved were the
development team and the customer who ordered the software. However, with the
evolution of the internet and of social networks, the software V&V landscape has recently
been transformed, since a crowdsourcing dimension has been added to several aspects of
V&V and especially to performing the different types of tests inherently involved in a V&V
approach. Nevertheless, the crowdsourcing aspect still remains limited and not well
established, despite the huge potential opened up by the new ways of engaging wider
audiences.
From a validation perspective, the need for involving the crowd emerges from getting early
feedback and opinions from a broader pool of the customers that will eventually use the end product.
From a verification perspective, the need for involving the crowd is related both to the fact
that it is virtually impossible to test the vast number of devices and configurations that
software (especially web-based software) can run on today, and to the fact that robust testing is
time consuming: ensuring that every possible permutation and combination of features,
localizations, and platforms works as intended is nearly impossible.
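As a simple illustration of why exhaustive verification quickly becomes intractable, the short sketch below multiplies out the configuration space that a web-based application would have to be tested against. The figures are purely hypothetical and are not taken from any FITMAN trial.

# Hypothetical figures chosen only to illustrate the combinatorial explosion.
devices = 50          # handsets, tablets, desktops ...
browsers = 6          # browser families
os_versions = 10      # operating system versions
localisations = 20    # language / locale packs
feature_toggles = 15  # independent on/off features

configurations = devices * browsers * os_versions * localisations
feature_combinations = 2 ** feature_toggles

total_test_points = configurations * feature_combinations
print(f"{configurations:,} platform configurations")      # 60,000
print(f"{feature_combinations:,} feature combinations")   # 32,768
print(f"{total_test_points:,} configuration/feature pairs to cover exhaustively")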
In essence, the idea of taking advantage of the crowd for verifying or validating software
components and applications is not entirely new; techniques like beta-testing are based on
the feedback received from groups of users who have access to early versions of the software
under assessment/ testing. However, thanks to the development of the internet, and especially
of Web 2.0, crowdsourced V&V processes have become approaches to be considered for
several types of software, including not only web services and tools but also software that has
to be downloaded and installed in specific environments in order to be tested. The role
of Web 2.0 is mostly related to the interaction of users/ testers, who may build on the results of
other users and take the V&V activities one step further. By creating groups either of selected
specialists or of potential end-users, an interactive V&V process can be initiated and
managed, allowing the inclusion of several different large-scale testing techniques in the
verification and validation of a software product. At the same time, several platforms have
become available on the internet, allowing the instantiation of virtual test beds for a software
product and the participation of the public or of selected communities in the V&V processes.
Usually, when speaking of a commercial product or of software for industrial or business usage,
the “testers” are explicitly selected, either individually or as members of a specific community,
since offering versions of the software to the general public is a difficult endeavour due to the
associated privacy and security issues.
The main reason for choosing to verify or validate a product through a crowdsourcing
approach is to overcome the biases that affect most developers and managers around the
world: for a particular expert, it is nearly impossible to imagine and look beyond the
knowledge he/ she has acquired, i.e. the set of concepts, beliefs and scenarios that he/ she
knows or predicts. As a result, it is particularly challenging to think outside the box and
conceive the various ways a typical end user would use particular software (Sridharan 2013).
The members of a software development team usually follow V&V procedures and conduct a
series of tests that they consider representative and adequate for capturing the set of end-
user scenarios for how the software would be used. However, any expert would assert that it
is impossible to capture the complete set of scenarios that an end user may throw at a software
system. As a result, critical elements are usually not verified and validated to their full
extent, which could lead to software malfunctioning, production system crashes,
customer escalations etc. Crowdsourced testing, and in general V&V involving the crowd
(referring either to specific communities’ members or to the general public), circumvents all
these headaches by bringing a comprehensive set of code coverage mechanisms and end-user
scenarios into the design and development stages of software engineering, during which the
cost of modification is meagre. This results in identifying critical use cases early on and
providing for those contingencies, which reduces software maintenance costs later on, during
and after productive deployment. Besides progressive code coverage, quality and depth of
software testing across various vital software modules are achieved, which ultimately results in
higher code quality, among other benefits (Sridharan 2013).
A very interesting approach involving crowdsourcing in the V&V process, especially as far as
its verification aspects are concerned, is the Crowd Sourced Formal Verification (CSFV) program
(DARPA 2013). The approach proposed and followed by DARPA focuses on the application
of a formal verification approach based on crowdsourcing. It builds on the fact
that formal verification has been too costly to apply beyond small, critical software
components. The Crowd Sourced Formal Verification program makes formal program
verification more cost-effective by reducing the skill set required for verification. The
approach followed is to transform verification into a more accessible task by creating a game
that reflects the code model and is intuitively understandable. Completion of the game
effectively helps the program verification tool complete the corresponding formal program
verification proof.
Another approach involving the crowd (however, one more focused on the selection of specific
community members) in software V&V is uTest (uTest 2013), a service offering software
developers a complete platform for performing community-based verification and software
testing tasks. The service offers over-the-air distribution of the software to
selected members of a community and provides the means for collecting bugs, as well as the
detailed results of several different types of tests (functional, security, load, usability,
localization, etc.).
Finally, while crowdsourced V&V offers tremendous opportunities to software
development teams and companies, there are a few critical criteria which need to be addressed
in order to successfully integrate crowdsourcing into V&V.
The three most important identified criteria are the following (Passbrains Projects 2012):
Crowd selection (see the sketch after this list)
- Profiles: The profiles of the people who will participate in the process have to be
selected on the basis of knowledge, experience and demographic criteria.
- Professional background, skills and qualifications: Depending on the V&V step,
certain domain-specific knowledge, skills and qualifications would be beneficial or
even required.
Managed Process. Community-based V&V cannot be implemented without applying a
managed process, including proper planning, crowd engagement and support, as well as
review and consolidation of results.
Cloud-based engagement platform. All the tools needed during the V&V activities should
be easily accessible, ideally through one single platform. The platform should provide the
functionality to manage the entire crowd engagement process, from registration and
selection to rewarding (if applicable).
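A minimal sketch of the crowd-selection step, assuming a hypothetical in-house registry of candidate testers, is shown below. The field names, thresholds and filtering rule are ours and are not part of the criteria above; they only illustrate how the profile, professional-background and demographic criteria could be applied in practice.

from dataclasses import dataclass
from typing import List

@dataclass
class CandidateTester:
    name: str
    domains: List[str]          # domain knowledge, e.g. ["manufacturing"]
    skills: List[str]           # e.g. ["functional testing", "usability"]
    country: str                # demographic criterion
    years_experience: int

def select_crowd(candidates: List[CandidateTester],
                 required_domain: str,
                 required_skills: List[str],
                 min_experience: int = 2) -> List[CandidateTester]:
    """Filter registered testers against profile, professional-background and
    domain-knowledge criteria; the thresholds are illustrative placeholders."""
    return [
        c for c in candidates
        if required_domain in c.domains
        and any(skill in c.skills for skill in required_skills)
        and c.years_experience >= min_experience
    ]

# Usage sketch: pick testers with manufacturing knowledge and usability skills.
pool = [
    CandidateTester("Tester A", ["manufacturing"], ["usability", "functional testing"], "DE", 5),
    CandidateTester("Tester B", ["retail"], ["security testing"], "FR", 3),
]
selected = select_crowd(pool, "manufacturing", ["usability"])
print([c.name for c in selected])   # ['Tester A']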
2.3.5. V&V in Other FI-PPP Projects
As depicted in the following figure, several projects of different nature to cover the FI-PPP
programme needs have been funded ranging from infrastructure building projects to use case
projects that aim at covering a variety of usage areas by incorporating trials of advanced
Future-Internet-based services and applications.
Figure 2-3: FI-PPP Programme Architecture
In summary, the Phase 1 projects’ main goal was to lay the technology foundation of the
programme, define "use case scenarios" in different industry sectors, make an inventory of
available (public) infrastructures via capacity building and support the whole Programme,
while the Phase 2 projects aim at developing use case pilots and platforms and setting up
infrastructures (European Commission). In this context, the Phase 2 projects shall validate the
openness and versatility of the FI-WARE Core Platform.
The following table presents the key findings of selected Phase 1 projects whose approaches
for the validation, assessment and/ or evaluation of their results were publicly available at the
time this review was conducted (April 2013). The results have been elaborated to the degree
to which they were considered relevant for the FITMAN V&V activities.
Table 2-6: FI-PPP Projects Analysis
Project | Approach | Reference | Relevance to the concept of FITMAN Methodology
FINEST FINEST proposes a twofold business assessment
approach which is deployed before proceeding to
real coding and aims at validating whether the FINEST
collaboration platform covers both the non-domain
and the domain-related aspects. Though the
proposed assessment approach is simply paper-based,
the findings and the contribution of the FINEST
platform to the domain shall be evaluated by the
real transport and logistics stakeholders, who will or
will not accept the platform solution and change
their way of doing business.
In order to assess the FINEST solution and the
extent to which it covers the transport and
logistics domain requirements, an end-to-end
assessment scenario, a snapshot of a common
business process, is described and used for
(FINEST:
D1.5 Business
assessment of
technological
solutions
2012)
Medium
assessing how the solution provided supports these
business operations and for deriving requirements
from the actors involved in a supply chain. Within
the scenario each of the actors has to fulfil several
tasks and perform certain actions determined by the
tasks. The requirements and demands of the domain
and their reflection in the proposed FINEST
platform are elaborated through a questions and
answer (Q&A) process. The results of this
evaluation are documented in an assessment table.
The assessment table results are used to evaluate
the level of support provided by the technical
solution. This analysis led to a qualitative “fit for
purpose” grade. Finally, a determination is made as
to the level of fulfilment achieved by the technical
solution in meeting the identified needs.
The functional evaluation described and performed
in this context examines, on the one hand, how technical solutions
will be accessible to end users and how they can be
utilized (non-domain specific) and, on the other
hand, how domain business activities can be carried
out; however, the true proof of whether the design lives
up to its current appearance will come when the
platform and its modules are implemented in a test
environment in Phase II of the project.
In the context of evaluating the use case scenarios
and providing the experimentation plan for large-scale
trials, FINEST, after an extensive state-of-the-art
analysis concerning KPIs in supply chains,
performance measurement in transport and
logistics, reference projects and relevant theoretical
frameworks, proposes a generic FINEST
performance evaluation framework and describes
the envisioned experimentation and evaluation
process. The evaluation framework concerns the
evaluation of the performance improvement for
each one of five scenarios.
The evaluation methodology, which focuses on two
dimensions, user acceptance and performance
improvement, aims at defining the business
improvement, the degree of improvement and the effects
of FINEST capabilities by examining specific use
case scenarios under defined metrics.
Additionally, FINEST provides the use case
scenarios in Phase 2 of the FI-PPP programme with a
template of experiment specifications. The aim is to
help describe the scenarios from the
perspective of the business user, via a set of
events/ steps to be performed during a test session.
Finally, a final use case specification and a detailed
Phase II experimentation plan for the post-FINEST
phase of conducting early trials are provided,
including experimentation specifications and
evaluation methodologies for each of the selected
(FINEST:
D2.4 Initial
experimentatio
n specification
and evaluation
methodologies
for selected
use case
scenarios
2012)
(FINEST:
D2.5 Final use
case
specification
and phase 2
experimentatio
n plan 2012)
High
use case scenarios (along with possibilities for
extending the set of scenarios with ones from other
FI PPP use cases). This experimentation will be
conducted by the project cSpace, a continuation of
two FI PPP phase I projects: FINEST and Smart-
Agrifood. The proposed evaluation and
experimentation framework incorporates User
Acceptance Testing, Solution Evaluation and
Benefit Analysis aspects.
In the context of the FINEST work on the
Phase 2 planning for FI-PPP Alignment, FINEST
presents the validation results concerning the usage
of the GEs available on the FI-WARE test bed
for specific elements of the FINEST solution. GE
validation in FINEST has occurred as part of
designing and prototypically implementing the FINEST
platform components and core modules in the form of
prototypes and proof-of-concepts.
The FINEST project has assessed and validated 24
out of the 30 Generic Enablers that have been
provided within the first release of FI-WARE:
6 GEs are used within the proof-of-concept
implementations, 8 GEs have been taken into
consideration for the technical design and
specifications, and another 10 are considered within
the phase 2 implementation plans.
(D9.3 Report
and Phase 2
Plan for FI
PPP
Alignment
2013)
High
FINSENSY
FINSENSY emphasizes the necessity of a first level
of experimentation of the domain specific enablers
(DSEs) defined in the different scenario work packages
before starting early trials in phase II. This early
experimentation would result in a more successful
trial in phase II. According to FINSENSY’s
approach, the evaluation of the practicability of the
different DSEs will be done by means of
simulation and experimentation with real
capabilities. Thus, in order to perform the feasibility
study, a questionnaire was prepared and applied to
the DSEs, and the results extracted provided important
criteria as to why the 5 DSEs were selected for further
evaluation and experimentation. Once the
questionnaire was finished, the TELOS (Technical,
Economical, Legal, Operational and Scheduling)
methodology was performed on the DSEs as the
second step of the feasibility study. The TELOS
methodology is usually applied prior to
the start of a project; the
Technical, Economical, Legal, Operational and
Scheduling preconditions of the project are evaluated
to judge its feasibility. Thus, the TELOS
methodology was applied to each of the DSEs.
These DSEs have then been analysed to find how
they could be experimented with in one or several of the
already existing laboratory facilities in the FINSENY
consortium, considering both real tests and simulated
(FINSENSY:
D8.2
Experiments
and Evaluation
2012)
Low
ones, in order to have a first test of the practicability
of the DSEs for Phase II of the FI-PPP programme,
which focuses on early trials.
FI-CONTENT
FI-CONTENT provides a common methodology to
describe and assess the SEs as well as a validation
methodology. The first methodology aims at
identifying the FI-CONTENT specific enablers,
as compared to the GEs of FI-WARE, by defining a
set of prioritization criteria to rank the different
enablers, describing each of them, applying the
prioritization criteria, and obtaining the final table
which exhibits the most critical ones. These criteria
are the relevance to the use case, the state-of-the-art
maturity, the technical difficulty of achievement and
the technical difficulty of integration.
The latter methodology provides a validation plan
which describes how each Content Area has been
validated, through demonstrators and proof-of-concept
implementations of key functionalities
served by a single enabler or a combination of
enablers.
The critical specific enablers were validated in two
steps. The first one occurred during the first year
review, where a subset of enablers was
demonstrated. Then, a workshop was organized
with the aim of demonstrating the prototypes of the
FI-CONTENT; about one hundred people
visited the twelve stands set up and provided useful
feedback. All the FI-CONTENT critical specific
enablers identified have been validated during this
workshop.
(FI-
CONTENT:
D1.9: Final
publishable
summary
report 2013)
Medium
Instant
Mobility
In the context of the Instant Mobility project, in
order to assess the social acceptability of a
generalization of demand-driven multi-modal
transportation services based on real-time traveller
localization and advanced preferences, Instant
Mobility introduces a research methodology based
on the technology acceptance model (TAM) and
socio-economic variables that influence ICT
adoption.
In order to test which factors mainly determine the
intention to use Instant Mobility services,
questionnaires electronically self-administered in
the four partner cities of the project were used. Two
groups of assumptions were defined. The first one
aimed to identify which factors mainly influence
the citizen travellers’ behavioural intention towards
the acceptability of Instant Mobility services, while
the main objective of the second group of assumptions
was to compare the four populations of citizen
travellers in terms of socio-economic profile, real daily travel,
preferences and real behaviour, and the
influence of these on their behavioural intention
towards the acceptability of Instant Mobility services.
(Instant
Mobility: D6.4
Instant
Mobility
multimodal
services
acceptability -
final report
2013)
Low
Smart
Agri-
Food
Smart Agrifood presents the validation results
extracted from a usage-driven design and evaluation
process that was conceptualised by a model
labelled the V7 model. The aforementioned model
incorporates seven steps which portray two types of
efforts, i.e. expert-based design tasks and different
design- and evaluation-oriented interactions with
end users. The repeated end-user evaluations during
the entire development process were accomplished
through recurrent design workshops, national
panels, on-the-spot meetings or phone calls, interviews
and focus groups, in order to elicit detailed data on the
information needs and acquire useful feedback.
Finally, evaluation is performed so as to assess the
economic, environmental and social aspects as
well as the evolution path prescribed in terms of
extensibility, flexibility, scalability and portability.
(Smart Agri-
Food: D200.4
Smart
Farming: Final
Assessment
Report 2011)
High
Additionally, in order to provide a report on the
validation activities, architectural requirements and
system specifications of the project’s pilots, Smart
Agrifood performs backward traceability and
validation to ensure that the provided functionalities
indeed address the functional requirements posed.
A crucial part of system validation is
verification through backward traceability, i.e. tracing back the
initial user requirements and identifying whether they
have been met or not. Towards this end, Smart
Agri-Food extracts the requirements tables and tries to
validate whether the requirements have been
encapsulated in the architecture design and finally
implemented (and to what extent) in the pilots.
(Smart Agri-
Food: D200.3
Final Report
on Validation
Activities,
Architectural
Requirements
and Detailed
System
Specification
2011)
High
Smart Agrifood also provides a report on the
validation activities that have been carried out in 2
of the project’s pilot cases. One of Smart Agrifood’s
approaches involved active stakeholders and consumers
testing the system via workshops. In order to assess
viability and compliance with their expectations,
identify their needs and requirements and then test
and validate the app mock-ups and the final app, the
project involved a panel of 15-20 consumers in
the whole process (its conception, development and
evaluation) through interactive and open
discussions. In that way, the results of the
workshops would help ensure the feasibility of
an open deployment of the pilot.
Additionally, the validation concerned the
comparison of functional requirements and
developed modules and the examination of the
extent of the adaptation of scenarios.
The other pilot’s validation approach was to use a
dummy product and data sets for system testing,
which represents a fully realistic scenario since the
system does not distinguish the dummy product from
a real one.
(Smart Agri-
Food: D400.3
Report on
validation
activities and
Detailed
Specification
Revision n.d.)
Medium
Smart Agrifood outlines the most appropriate
scenarios for trial specifications in FI-PPP Phase II
and assesses the feasibility of Generic Enablers to
realise the trials. Generic Enablers were also tested
in isolation, i.e. not in the context of a pilot
implementation.
The end-to-end scenarios are tested in terms of both
assessing the realisation and usability of the
prototypes and assessing the feasibility of the Core
Platform and individual Generic Enablers. The
evaluation approach also includes feedback from the
community during different phases: before
the pilots’ implementation, during the development
of the pilots and at the end of the project.
(Smart Agri-
Food: D500.6
Feasibility
Assessment
2013)
High
In a nutshell, the Phase 1 projects have presented several approaches regarding the
feasibility, acceptability and validation of the FI-WARE GEs and their own SEs, as well as the
evaluation and validation of each project’s trial results. Phase I projects typically
performed validation activities within their own scope, but they have also offered
direct propositions and suggestions for the Phase II projects. FITMAN takes into
account the results of the Phase I projects and aspires to collaborate with the parallel, ongoing
Phase II projects concerning the V&V aspect.
From a software assessment perspective, the verification aspect was restricted, with references
made only to the extent to which the product was being built according to the requirements and
design specifications during the coding activities. Significant importance was attached to the
concept of “fit for purpose” evaluation/ validation, where a set of frameworks was introduced
in order to assess the project solutions offered in the domain. The offerings and advancements
made by the deployed solutions were estimated, with the stakeholders involved playing major
roles in the evaluation procedures by providing comments and opinions in most approaches.
In more detail, the phase 1 projects presented the following features:
Formal validation processes and feasibility/ acceptability analysis of GEs and SEs
partially embedded in Phase I Projects
Phase 1 projects approach the GE and SE validation as follows: FINEST validates and
assesses the FI-WARE GEs as part of designing and implementing the FINEST platform
components and core modules in the form of prototypes and proof-of-concepts. FINSENSY
performs a feasibility study of domain specific enablers through a questionnaire and the
TELOS (Technical, Economical, Legal, Operational and Scheduling) methodology. On the other
hand, FI-CONTENT provides a common methodology to describe and assess the SEs as well
as a validation methodology.
Implicit verification aspects put forward by Phase I Trial Projects
An overall verification mentality is absent from Phase I projects. Yet there is evidence of
implicit verification aspects in the projects, which denotes its necessity. For example, FINSENSY
presents an indirect notion of verification during each DSE experiment, and Smart Agrifood
performs traceability analysis to trace the requirements, which implies verification in addition
to the validation concept.
Focus on evaluation and business validation spanning multiple domains in a fit-for-
purpose manner
The phase 1 projects have presented a variety of approaches to validate and/ or evaluate the
solutions they offer, yet typically (and with a few exceptions) they do not build on the
plethora of V&V techniques available in the international bibliography. FINEST presents a fit-
for-purpose framework to assess the domain and generic (non-domain) requirements covered,
a performance evaluation framework to assess, through defined metrics and KPIs,
the added value of the project’s solution to the use case scenarios, a template of experiment
specification, and a detailed experimentation and evaluation plan. Instant Mobility
introduces a research methodology based on the technology acceptance model (TAM) and
socio-economic variables that influence ICT adoption in order to assess the acceptability of
its services. Smart Agri-Food performs validation activities assessing its pilots’ viability
and compliance with users’ expectations and presents validation results extracted from a
usage-driven design and evaluation process that was conceptualized by a model
named the V7 model. Such an evaluation in the Smart Agri-Food context assesses the
economic, environmental and social aspects as well as the evolution path prescribed in terms
of extensibility, flexibility, scalability and portability. Smart Agri-Food is also implementing
requirements traceability activities to validate its pilots.
Lessons learnt and recommendations for the Phase II projects released
In accordance with their work plan, the FI-PPP Phase I projects have accumulated their
experience, results and use cases in subsequent reports with the aim of contributing to
successful trials in the next phase. For example, the FINEST project provides Phase II
projects with a detailed experimentation and evaluation plan and with a first GE validation.
Limited involvement of stakeholders in terms of feedback and validation
End users and active stakeholders played a significant role in most of the
aforementioned approaches. However, no crowdsourcing technique has been applied, and the
stakeholder engagement is restricted to ex post validation, feedback and acceptance of
deployed results rather than actual brainstorming and active contribution during the
intermediate steps. FINEST involved active stakeholders in an end-to-end scenario, initially to
assess and derive the requirements and later to participate in an experimentation procedure, in
terms of assessing and extracting conclusions from the way actors in the scenarios were
interacting and doing business using the FINEST platform. FI-CONTENT also organized a
workshop with the aim of demonstrating the FI-CONTENT prototypes, and about one
hundred people provided useful feedback; all the FI-CONTENT critical specific enablers
identified have been validated during this workshop. Instant Mobility circulated online
questionnaires that were self-administered in the four partner cities of the project to acquire
useful insights and feedback. Smart Agri-Food involved consumers to test the system via
workshops and open and interactive discussions, and to elicit detailed needs and acquire
useful feedback through recurrent design workshops, national panels, on-the-spot meetings or
phone calls, interviews and focus groups.
2.4. Key Take-aways
Based on the detailed V&V and Evaluation state-of-the-art analysis that took place in the
previous Sections (2.3.3.1 to 2.3.3.5), a set of emerging conclusions and key take-aways
regarding both the concept of V&V and Evaluation as well as the most relevant underlying
methodologies on which the FITMAN V&V method should be based are presented and
debated.
1. Active On-going Research on the fields of V&V throughout the years
Verification and Validation are processes closely coupled with almost any business activity.
Especially when dealing with manufacturing, which constitutes an active and constantly
evolving domain of large financial investment, V&V of every relevant aspect (e.g. software,
hardware, documentation etc.) plays a crucial role.
In general, research on V&V has left a significant legacy over the years. It is indicative that,
in the context of the FITMAN V&V state-of-the-art analysis, the approaches analysed range from
methodologies more than 30 years old to publications from the most recent months of 2013.
From a manufacturing perspective though, V&V approaches dedicated to this specific
application domain are limited in the research material retrieved. In most cases, existing V&V
approaches, typically viewed from the business evaluation perspective and proposed in their
own context (e.g. machinery verification, machine software), are applied in manufacturing
settings, which denotes a tendency to adopt V&V results from software engineering.
From the state-of-the-art analysis, it also proved challenging to retrieve holistic approaches
spanning the whole FITMAN V&V lifecycle (verification, validation and
evaluation). The overwhelming majority of approaches/ methodologies/ cases focus on either
verifying and validating or evaluating the product in question. It needs to be clarified that
there are quite a number of academic publications that view evaluation as part of a broader
validation phase and not as a clearly independent procedure, which has been taken on board in
the FITMAN V&V activities.
2. A Plethora of Tools and Techniques without clear distinction between V&V and
Evaluation
There is a heavily diversified range of V&V techniques reflected in the international
bibliography. Each approach/ methodology utilizes a set of at least 2-3 techniques, thus
creating a huge pool of underlying techniques. Nevertheless, there are techniques that occur
frequently, are widely accepted as capable of dealing with particular aspects of the V&V
lifecycle, and are analysed in order to be adopted in the context of the FITMAN V&V
methodology (see Section 2.3.2).
From a tool deployment perspective, there is contradictory evidence in V&V: on the one
hand, certain approaches do not involve any tools for V&V activities, while on the other hand,
approaches that aim at automating the V&V procedures exploit at least 3-5 tools.
As the V&V Approaches Mapping (in Section 2.3.3) also demonstrates, there is often no clear
distinction of tools and techniques between V&V and Evaluation approaches:
Techniques that can be utilized as part of V&V can be also utilized as part of
evaluation (e.g. testing, observation).
Techniques that can be utilized as part of an evaluation methodology are, at the same
time, subject to verification, validation and testing.
For example, while simulation constitutes a widely utilized tool for evaluation (with the
system being simulated in laboratory conditions and the evaluation eventually taking place
based on simulation results), there is, at the same time, active research on how to efficiently
and effectively verify and validate simulation techniques themselves.
3. No “Universal” or Holistic V&V Approach covering any potential need
Diverse needs and backgrounds (e.g. manufacturing of different products, different
manufacturing procedures, different manufacturing scales etc.) have resulted in highly
differentiated approaches regarding verification, validation and evaluation. There is no
“universal” or “commonly accepted” methodology that can act as the ‘bible’ for any V&V
endeavour; instead, fit-for-purpose approaches that exploit a wide set of readily available
techniques are proposed. In addition, most of the identified and analysed methodologies/
approaches are not mature and widely established; the diversified needs of each separate
(manufacturing) procedure typically lead to the conceptualization of new targeted
approaches instead of reusing and adapting existing ones.
4. Prevalence of the Waterfall model in software V&V
As can easily be concluded from the analysis presented in Section 2.3.3, most of the
existing V&V methodologies/ approaches are based on the waterfall model of software
development. The clear, structured and sequential philosophy it follows, from requirements
analysis and software design to implementation and testing, eventually acts as a catalyst that
drives corresponding V&V approaches such as the V-model. On the other hand, the agile model
of software engineering follows a different mentality that renders V&V more challenging, not
only due to the inherent time constraints, but also because of the limited attention and interest
of the scientific community. Currently, most of the resources referring to agile V&V activities
appear in blogs and online resources rather than in rigorous academic work presented in papers.
5. Limited Crowdsourcing Aspects
As experience in V&V so far indicates, V&V and Evaluation methodologies are usually
“closed” procedures. Specific stakeholders are typically involved in a highly structured way
(e.g. via questionnaires) in the validation and evaluation activities. Engaging different
stakeholders in an open manner is not only uncommon practice; crowdsourcing
aspects can hardly be located in existing/ established V&V and/ or evaluation approaches,
as Section 2.3.4 debates. However, technology for crowdsourcing purposes is already
advanced and easy to use, and could be put into practice for V&V purposes with notable
outcomes.
3. FITMAN V&V Methodological Foundations
After reviewing, analysing and presenting existing and well-established V&V approaches/
methodologies, section 3 presents the FITMAN holistic V&V methodological approach that
aims at contributing to:
Verifying and assessing the appropriateness of generic enablers (GEs)
Verifying and validating the specific enablers and trial specific components under
development during the project
Validating and evaluating the complete solution which will be developed for each
Trial in the framework of the project
3.1. Roles’ Profiling
The FITMAN V&V methodology described in the following section requires various
actors to be involved in order to perform the FITMAN V&V activities. The roles described
and the following profiling approach constitute a proposal which also helps to understand the
major responsibilities emerging from the FITMAN V&V method. The different teams are free
to define their own roles, which should nevertheless be compatible with the ones identified
below. The FITMAN V&V actors are depicted in Figure 3-1 and defined as follows:
Trial Team: FITMAN’s core stakeholders are the 11 trials. Trials’ representatives are
the customers, who state the business needs and translate those needs into trial visions.
Trial Supporting Team: The Trial Supporting Team is responsible for ensuring that
the integrated solution delivered to a trial is in line with the trial's vision.
The Sprint Master, the Product Owner and the Development Team compose the self-
organizing and self-accountable Sprint Team.
Sprint Master: In the literature, a Sprint Master is often referred to as a Scrum Master (Rising
and Norman 2000), (Marcal, Soares and Furtado and Belchior 2007), Agile Master
(Leffingwell 2010), (Leffingwell 2013), Coach/ Team Coach (Dubinsky and Hazzan
2004), (Derby and Larsen 2007), Servant-Leader etc. FITMAN introduces and adopts
the Sprint Master's role: an integral part of the sprint team who ensures the
achievement of each sprint's goals, helps protect the team from distractions and
clears impediments.
In the context of FITMAN the Sprint Master is held accountable for:
o The communication of the development team with the outside world and vice versa,
ensuring that the team is not distracted and is focused on the tasks at hand to
deliver value in a short time-box.
o Following the rules of agile and ensuring that each sprint/ iteration and the whole
process are performed as planned and in line with the methodological
directions
o The protection, mentoring and guidance of the Development Team
Product Owner: The Product Owner (Leffingwell 2013), (Kettunen 2009) is the
Sprint Team member who serves as the customers' representative and is responsible for
creating the product backlog in a way that customer needs are met through technically
robust solutions.
The role of product owner in the context of FITMAN is associated with:
o Ensuring the sprint team delivers value to the trials and that the trial business
needs are addressed with the Product Backlog
o Defining, creating and updating the prioritized feature list and user stories of
the product backlog
Development Team: The Development Team is in charge of delivering shippable
extended product versions per sprint and of covering the necessary expertise. It consists
of software analysts, programmers and testers. In view of the FITMAN V&V oriented
scope, the Development Team is also responsible for the verification and testing
activities.
Finally, FITMAN aspires to collaborate with the other FI-PPP projects to bilaterally exchange
views on its concept and the overall FI-PPP experiences. Engaging a considerable
number of experts from Phase I and II projects makes the need to move towards a community-based
V&V methodology even more evident. Thus, the FI-PPP stakeholders (including
Phase I and II projects) and the General Public are also envisaged as actors in the external
environment for the FITMAN V&V activities.
In Figure 3-1, the external and the internal actors in the V&V activities are clearly depicted.
On the one hand, the internal environment involves the FITMAN stakeholders, i.e. the project
partners, who perform the V&V activities at two distinct levels: the almost purely IT roles in
the sprint team are accountable for verification activities, while higher-level, business-related
responsibilities concern the trials, which validate the outcomes, with the trial supporting team
acting as the intermediary and communication node between the sprint team and the
trials. On the other hand, the external environment involves a wider community engagement.
It needs to be noted that, at the time when the deliverable at hand was written, all these roles
were organized in teams under the Generic Enablers Task Forces, Specific Enablers Task
Forces and Digital-Virtual-Smart Factory Task Forces, which include an appropriate mix of
expertise and background in accordance with the actors involved in the V&V
activities.
Figure 3-1 Actors involved in the various phases of the FITMAN V&V Methodology
3.2. FITMAN V&V Methodology
Integrating key aspects from different existing V&V approaches and adapting them to the
needs of FITMAN, while incorporating new aspects whenever necessary, the FITMAN V&V
methodology bears two main phases, for the interconnection of which the FITMAN platform
plays an instrumental role:
V&V Methodology Phase I: Product Specific (P)
V&V Methodology Phase II: Trial Specific (T)
The methodology is driven by two key trends: the blending of the agile philosophy for software
development with best practices from the waterfall philosophy, and the crowd phenomenon of
Web 2.0. In Figure 3-2, distinct activities that will be carried out in FITMAN are depicted in
the blue-coloured boxes, while the steps of the FITMAN V&V Methodology are visualized as
green-coloured arrows, which, beginning from the top, include:
I. Business Validation (T-2) to assess whether the overall trial solution eventually
offers sufficient added value to the trial.
II. Trial Solution Validation (T-1) to guarantee that the overall trial solution
satisfies intended use and user needs.
III. Product Validation (P-5) to examine whether the product satisfies intended use
and user needs.
IV. Release Verification (P-4) to determine whether the requirements of the final
product release are met.
V. Backlog Verification (P-3) to determine whether the requirements of the product
after each sprint are met.
VI. Model Verification (P-2) to coordinate the alignment between design and
requirements, as well as between design and code.
VII. Code Verification (P-1) to ensure functionality, correctness, reliability, and
robustness of code.
It needs to be noted that it is anticipated that some of the above activities will be less intense
in a Use Case project like FITMAN (such as Code Verification and Model Verification), yet
they are defined in the V&V methodology for completeness purposes.
In the following paragraphs, for each step of the FITMAN V&V method, one technique
drawn from the international bibliography and from practical approaches is recommended,
while alternative techniques are also proposed in case they are considered more appropriate
for some trials. The selection of the most appropriate techniques to be combined shall be
based on certain selection criteria and shall be made by weighing the strengths and weaknesses of
each technique against those criteria. The four criteria most commonly used in industry
(L. Lobo & J.D. Arthur 2005), which are proposed to be applied in FITMAN as well, are the
following:
Available Personnel: Selection based on the number of people that have to get
involved in the V&V team in order to apply the technique and their expertise
o The Available Personnel is the most widely used criterion in the selection of
V&V techniques since some of the techniques may require the team
performing the verification and validation activities to have more members
than those available per trial.
Time: Selection based on the time needed to apply the V&V technique
Cost: Selection based on expenses involved in applying the V&V technique
o The time and cost criteria aim to lead to the selection of techniques that allow
the V&V activities to be conducted in the available timeframe as well as within
the available budget, since some techniques may incur significant costs when
applied.
Completeness: Selection based on the coverage of V&V objectives
o The completeness criterion is used in order to make sure that all the necessary
aspects are examined, either by selecting a sole technique or by applying two
or more techniques per trial.
An additional, and probably even more crucial, set of criteria for the selection of the appropriate V&V
technique comprises the degree of familiarity and expertise the responsible V&V team demonstrates
for a technique, as well as the motivation, perceived value and sponsorship of the V&V activities.
Eventually, the application of the V&V method consolidates the techniques followed per step
and is self-certified in the FITMAN V&V Package.
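As a purely illustrative aid, the following sketch shows how a trial team might weigh candidate techniques against the four criteria; the weights, the candidate techniques listed and their scores are hypothetical assumptions and are not prescribed by the FITMAN methodology.

```python
# Illustrative sketch only: weighing candidate V&V techniques against the four
# selection criteria. Weights, techniques and scores below are hypothetical
# examples and are not prescribed by the FITMAN methodology.

CRITERIA_WEIGHTS = {
    "available_personnel": 0.35,  # most widely used criterion
    "time": 0.25,
    "cost": 0.20,
    "completeness": 0.20,
}

# Scores from 1 (poor fit for the trial) to 5 (good fit for the trial).
CANDIDATE_TECHNIQUES = {
    "User Acceptance Testing": {"available_personnel": 4, "time": 3, "cost": 4, "completeness": 4},
    "Inspection": {"available_personnel": 3, "time": 4, "cost": 5, "completeness": 3},
    "Traceability Assessment": {"available_personnel": 5, "time": 4, "cost": 4, "completeness": 3},
}

def weighted_score(scores):
    """Combine the per-criterion scores into a single weighted value."""
    return sum(CRITERIA_WEIGHTS[criterion] * value for criterion, value in scores.items())

# Rank the candidates; the trial team would review the ranking rather than apply it blindly.
for technique, scores in sorted(CANDIDATE_TECHNIQUES.items(),
                                key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{technique}: {weighted_score(scores):.2f}")
```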
With regard to crowdsourcing the V&V activities, a set of optional approaches and platforms
to engage experts and the public in each V&V step is proposed, and it remains up to the trials to
decide whether, and which, approach they will adopt.
Figure 3-2: FITMAN V&V High-level Perspective
3.2.1. Domain/ Trial Specific Perspective
The domain / trial perspective of the V&V activities bears the Business Validation (T-2) and
Trial Solution Validation (T-1) steps as elaborated in the next paragraphs.
Figure 3-3: FITMAN V&V Method from a domain / trial perspective
3.2.1.1 Step T-1: Trial Solution Validation
Trial solution validation refers to the validation of the FITMAN integrated solution,
developed and installed at each trial on the basis of the scenario that has been defined for the
specific trial. This step includes the validation of each trial-specific solution from a technical
and functional point of view only, since the business scope is validated in step T-2,
which follows.
The validation of the trial solution takes place after the integration of the Generic Enablers,
Specific Enablers and Trial Specific Components into a complete package (software solution),
in the scope of covering the full set of the technical, functional and non-functional
requirements defined for each trial. Such a trial solution validation is based on the technical
indicators defined in D2.2.
The purpose of the validation activities in this step is to demonstrate that the integrated
solution correctly implements the functions defined in the technical requirements
specification of the trial case. This requires testing of the solution to determine that the system
performs all specified functions completely, correctly, consistently and accurately. During the
validation process, the interfaces with any other systems and, of course, with the users
(admins, power users, normal users, etc.) are evaluated with respect to completeness,
correctness, consistency and accuracy (International Atomic Energy Agency 1999).
Validation must also assess whether technical performance parameters, critical functions and
other system requirements have been achieved.
Practically, trial solution validation involves answering whether the solution covers all the
technical requirements as set by the trial (on a functional and technical level) and, at the same
time, whether the integration of the individual components has led to the expected
functionality. In this context, the validation refers to the examination of the characteristics of
the trial solution and to the confirmation of the required functionality of the software, in terms
of operating in the way defined in each trial scenario and producing the expected
results (but not covering the impact of these results, which is the focus point of the T-2
Business Validation step).
For performing the T-1 validation step, the most suitable validation techniques for each case
should be selected and initiated from the ones examined in Chapter 2. While several techniques
can be selected and combined per case, the most usual approach to be applied is 'User
Acceptance Testing / Beta Testing'. In this context, the first validation action is to prepare a
testing plan describing the validation strategy to be applied. Additionally, the plan includes a
complete set of test cases designed to cover all the functional scenarios of
the solution under validation. These test cases are described in a way that users who will be
selected as testers will be able to easily understand them and carry them through, without
having specific IT knowledge. This implies that the testing at this step is indeed a validation
activity, as it is preferably taken over by ordinary users (mostly end-users of the product) and
not by IT experts (which points more in the direction of verification).
During the validation procedures, the testing team performs the tests described in the testing
plan and documents all the findings in a structured way defined by the plan. Additionally the
trial employees and the external users may perform additional non-prescribed tests on the
basis of their experience. When all tests are completed, the test engineers collect the testing
documents from the whole team and produce an accumulated ‘validation results’ document,
describing in a structured way all the findings of the validation, focusing on the functional or
technical problems identified.
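To keep the testing plan entries and the accumulated 'validation results' document in a consistent, machine-processable shape, a minimal sketch such as the following could be used; the field names are illustrative assumptions and not a structure mandated by the methodology.

```python
# Minimal sketch of a structured UAT test case record and of the consolidation of
# all testers' findings into one 'validation results' summary. Field names are
# illustrative assumptions and not mandated by the FITMAN methodology.
from dataclasses import dataclass
from typing import List

@dataclass
class TestCase:
    case_id: str
    scenario: str           # functional scenario of the trial solution covered
    steps: List[str]        # described so that non-IT testers can carry them through
    expected_result: str
    actual_result: str = ""
    passed: bool = False
    finding: str = ""       # functional or technical problem identified, if any

def consolidate(test_cases: List[TestCase]) -> dict:
    """Produce the accumulated 'validation results' summary from all testers."""
    failed = [tc for tc in test_cases if not tc.passed]
    return {
        "total_cases": len(test_cases),
        "passed": len(test_cases) - len(failed),
        "failed": len(failed),
        "findings": [{"case": tc.case_id, "problem": tc.finding} for tc in failed],
    }
```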
While the aforementioned process refers to ‘User Acceptance Testing / Beta Testing’, almost
any validation technique that can be selected follows a similar process. Based on this, in the
framework of FITMAN trial solutions, validation is recommended to combine ‘User
Acceptance Testing/ Beta Testing’ with other possible validation techniques, described in
Chapter 2.
No matter which combination of validation techniques is selected, the complete solution
for each trial shall, in any case, be examined and tested under the real operating conditions of
the trial, using real data, or – if this is not possible – by using a simulated environment with
test (preferably historical) data and under operating conditions as close as possible to the real
industrial environment. The reason behind this is that the scope of the validation is not
limited to examining whether the software quality objectives are met but extends to
providing evidence that the trial solution is suitable for the trial, i.e. that it operates as it should and
produces what is expected according to the operating scenario defined.
Under those terms, the validation of the trial solution does not involve thorough testing of
every technical aspect of the integrated software but depends on the trial scenario, which
guides the validation process, since the main point is to verify that the solution is suitable for
performing what is expected for the trial. In case, of course, the trial solution is not fully
aligned with the trial's vision and requirements as described in the trial scenario, the scope of the
validation is to ensure that any missing or misunderstood requirements (both functional and
non-functional) are revealed and that any inconsistencies become clear. Given the fact that
each component of the solution has been verified and validated in earlier stages, any
inconsistencies may be a result either of an unsuitable integration of the components into a
complete trial-specific solution or of an unsuitable selection of components in the first place. In
any case, the scope of the trial solution validation is to identify such a
problem and to reveal what has to be amended in order to end up with trial-specific solutions
whose characteristics, quality and operation fit the needs of the trials as described in
the digital, virtual and smart factories scenarios of FITMAN.
Recommended Technique: User Acceptance Testing
Potential Techniques to be employed and combined (based on the trial’s needs): Beta
Testing, Black Box Testing, Inspection, Traceability Assessment.
FITMAN Stakeholders: The stakeholders engaged in the Validation of the Trial Solutions
are the Trial Solution Owner and the Trial Team. The Trial Solution Owner, being the main
contributor behind the development and integration of the solution, is responsible for
supporting the Trial Team to conduct all necessary tests, including trial operation (either by
using real or simulated data) of the solution. At the same time, the Trial Team collects users'
feedback in order to proceed to modifications of the software, whether the identified problems
are related to the integration of the components or to an incomplete decomposition of the trial's
needs.
Optional Crowd Engagement Method: Trial solution validation and, in general, the
validation of any software installed in a manufacturing enterprise constitute in most cases an
in-house process. Typically this process is implemented by a testing team, as described
above, which examines the software under simulated or real-life circumstances. However, the
possibility of extending the boundaries of the validation beyond the Trial Team, towards a
crowd-oriented validation (engaging, for example, staff from other departments of the
enterprise, partners who interconnect to the IT systems of the enterprise, or even the broader
public in case the software provides a public interface), is an interesting option to be examined
in the framework of FITMAN, in particular with the help of enterprise collaboration
environments.
Enablers Concerned: Trial Solution Validation refers to each integrated FITMAN solution
installed on each trial, which may incorporate all types of enablers (GEs, SEs and TSCs).
Although in this step the enablers are not individually tested (as this was part of V&V step
P-5), the interactions between them and the way they are integrated in order to form a solution
are examined and assessed thoroughly, a procedure that is core to the validation process
and that indirectly also assesses the usefulness of the generic and specific enablers and their
potential to interoperate for the production of added-value results.
3.2.1.2 Step T-2: Business Validation
Business Validation is the final and perhaps the most crucial step of the FITMAN V&V process,
since it answers the question whether the FITMAN trial solutions offer sufficient value to the
trials against their stated business requirements. Business validation entails demonstrating that
the software developed has a clear benefit to the trials, allowing them to operate more
efficiently (usually in terms of cost, time or quality) than before, or supporting them in doing
something they could not do before. So, while trial solution validation (Step T-1) ensures that
the software has been developed and operates correctly and according to the trial scenario, it
is the business validation step that examines whether the delivered solution is of real value for the
trial.
In this direction, the scope of this step is strictly business oriented, since all technical aspects
have already been examined and assessed in the previous stages of the V&V process. This is why,
if the results of the business validation are not considered successful, the solution has to
be examined not in terms of how it has been developed, but in terms of re-defining the specifications
on the basis of which the individual components (GEs, SEs, TSCs) have been developed (or
selected) and integrated.
In general, Business Validation takes place against the trials' business requirements on the
basis of specific business performance objectives. In order to measure the accomplishment of
the business objectives, a set of key performance indicators is defined for each trial, providing
the means to compare the efficiency of the operation of the trial before and after the application
of the FITMAN trial solution. Proceeding to the business validation of the trials therefore
requires defining specific key performance indicators for each trial and comparing the values
of those indicators before and after the application of the FITMAN trial solution. In other
terms, since the indicators reflect the performance of the trials as industrial systems against
specific objectives identified per trial, the scope of the validation comes down to comparing the
performance of the trials' industrial systems with and without the FITMAN solutions.
The prerequisites for proceeding to this business validation are:
i. the definition of the most appropriate key performance indicators per trial
ii. the calculation of the values of the indicators by collecting the required data before and
after the application of the FITMAN trial solutions
iii. the appropriate data access and availability from the trials' side
As far as requirement (i) is concerned, the definition of the indicators can be implemented by
applying several methodologies and techniques. For FITMAN, the approach considered most
appropriate to be applied, as described and documented in detail in D2.2, is a simplified
version of the ECOGRAI methodology. An outline of the proposed method is the following:
in order to define the key performance indicators based on ECOGRAI, a detailed analysis of
the trial cases is required to identify the Decision Centres of each trial. Using this analysis,
the objectives linked to each Decision Centre are identified and afterwards linked to sets of
Drivers for each Decision Centre. The next phase of the methodology leads to the definition
(on the basis of the defined objectives and drivers) of the Performance Indicators as well as of
the parameters linked to each one, such as the data to be collected, the required data processing,
representation types etc. A detailed description of the proposed approach for the definition of
Performance Indicators, outlined above at a high level, can be found in Deliverable D2.2.
As far as the second requirement (ii) is concerned, it refers to the calculation of the values of
the indicators in order to produce measurable results for the business validation of the trial
cases. It is thus proposed that all required data be collected in three phases:
Before the installation of the FITMAN solution in each trial
A short period after the installation of the solution at the trial, in order to confirm that
the values of the performance indicators are moving towards expected directions (e.g.
time decreasing, quality increasing etc.)
At a specifically defined time for each trial, a few months after the installation of the
trial solution, in order to measure whether specific targets for the business indicators
values are met. The exact timeframe depends both on the production lifecycle of each
trial as well as on the overall available time in order to perform the validation in the
framework of the project – this is why a validation timeframe shall be individually
defined for each trial at the beginning of its V&V process.
The calculation of indicators' values right after the installation of the FITMAN solution is
mostly considered indicative, since usually some time is needed for any change to be
reflected in the business indicators' values. However, the idea behind this is to examine the
changes in the values of some explicitly selected indicators which are expected to be directly
influenced by the operation of the FITMAN solution in the trial. Examining the direction
towards which the values of those indicators are moving is important in order to
diagnose and fix possible configuration errors early, or to fine-tune parameters which could
have an impact on the performance of the industrial system.
The second stage is the most crucial, since it involves measuring the performance of the trial
by examining the defined performance indicators and checking whether they have reached the
specified target values. Obviously, the business validation of a trial solution is considered
successful when the specified targets are met. However, the business validation can also be
considered successful in the case that specific changes in the internal or external environment
of the trial lead to the need to re-examine and amend the target values. In this case, the
performance of the system, for assessing the effectiveness of the trial solution, is measured
against the updated values in order to reduce the chance of meeting or not meeting the targets
"by accident" and to be able to validate not only the implementation of the solution but also
the requirements initially defined for each trial. It has to be noted that, depending on the
validation timeframe which will be defined for each trial, this validation stage could be
repeated two or more times in order to confirm that the results of applying the software
solution lead to lasting improvements. Whether validation will be an iterative process will
eventually be decided after setting the exact validation timeframe of each trial.
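The comparison of indicator values across the measurement phases can be kept very simple in practice; the sketch below is a hypothetical illustration (indicator names, directions and target values are invented for the example) of checking both the early trend and the final target.

```python
# Hypothetical sketch of the business validation comparison: indicator values are
# collected before installation, shortly after it and at the trial-specific
# validation point, and are then checked against the target values. The indicator
# names, directions and figures below are invented for illustration only.

INDICATORS = {
    # name: (direction, baseline, early_value, final_value, target)
    "order_fulfilment_cycle_time_days": ("decrease", 12.0, 11.5, 9.0, 10.0),
    "first_pass_quality_rate_pct": ("increase", 91.0, 91.5, 95.5, 95.0),
}

def early_trend_ok(direction, baseline, early):
    """Early, indicative check: is the value at least moving in the expected direction?"""
    return early < baseline if direction == "decrease" else early > baseline

def target_met(direction, final, target):
    """Final check against the target value agreed for the trial."""
    return final <= target if direction == "decrease" else final >= target

for name, (direction, baseline, early, final, target) in INDICATORS.items():
    trend = "expected direction" if early_trend_ok(direction, baseline, early) else "unexpected direction"
    outcome = "target met" if target_met(direction, final, target) else "target not met"
    print(f"{name}: early trend {trend}, final outcome {outcome}")
```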
Recommended Technique: Simplified ECOGRAI Methodology as defined in FITMAN
Deliverable D2.2
Additional Potential Techniques to be employed (based on the trial's needs): Balanced
Scorecard, Manufacturing and Value Chains' SCOR-VCOR methods
FITMAN Stakeholders: The users of each Trial (forming the Trial Team) are the
stakeholders mainly involved in this step (together with the Trial Supporting Team when
required), since business validation takes place after the trial solution starts operating in the
real environment of each trial. The rest of the stakeholders involved in the specification and
development of the trial solution do not normally participate in the operation of the trial or of
the installed system. While they provide any support requested by the end users, they do not
directly participate in the business evaluation process. However, they may be involved in the
analysis of the results of the business validation, especially in the case that targets are not met,
in order either to identify what could be wrong with the implementation of the solution or to
correlate the validation results with changes to the environment of the trial that are considered
important in terms of redefining the expected performance outcomes.
Optional Crowd Engagement Method: The engagement of a large number of stakeholders
in the business validation step is instrumental in perceiving whether the trial solution offers added
value and clear benefits to the trials in this last step of the integrated V&V method.
In alignment with previous steps of the V&V methodology, crowd engagement with
stakeholders from any tier (e.g. employees, customers, collaborators) and of every
background (e.g. IT, business) can be realized through open collaboration and innovation
platforms in the factory used to post comments and ideas (e.g. Yammer (Yammer 2013), Jive (Jive
2013), etc.); online deliberation and feedback tools (like Uservoice (Uservoice 2013),
LinkedIn (LinkedIn 2013)); physical and online workshops; or even traditional online survey
tools (like SurveyMonkey (SurveyMonkey 2013)).
Additional crowd-engagement directions to follow involve gathering insights from
discussions in forums and the social media and applying opinion mining and sentiment
analysis techniques. Another approach, based on the rising concept of gamification (S.
Deterding et al. 2011), is to develop a game-like application, following widely accepted
gamification principles and mechanics, that simulates the trial solution and focuses on the
sectors and processes influenced. By examining the behaviour of the "players" and by getting
their feedback over a larger horizon (after the solution is publicly released), the Trial Team
can draw very useful conclusions that enhance the results of the validation performed by
applying common/ conventional validation techniques.
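A minimal, lexicon-based sketch of how such opinion mining could be applied to crowd feedback is given below; a real deployment would rely on an established sentiment analysis library or service, and the word lists and sample messages here are purely illustrative.

```python
# Minimal, lexicon-based sentiment sketch for crowd feedback gathered from forums
# or social media. A real deployment would use an established opinion-mining
# library or service; the word lists and sample messages are purely illustrative.

POSITIVE = {"useful", "faster", "intuitive", "reliable", "improved"}
NEGATIVE = {"slow", "confusing", "broken", "unreliable", "crashes"}

def polarity(message: str) -> int:
    """Crude polarity score: positive word hits minus negative word hits."""
    words = {word.strip(".,!?").lower() for word in message.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

feedback = [
    "The new dashboard is intuitive and much faster than before",
    "The mobile client is slow and crashes on login",
]
scores = [polarity(message) for message in feedback]
print("average polarity of the collected feedback:", sum(scores) / len(scores))
```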
Enablers Concerned: None, as this step does not investigate software developments but is
rather focused on the implications of the developed software for the business operations of the
trials.
3.2.2. Product Specific Perspective
Depending on the nature of the product, the product specific perspective of the V&V
activities bears some or all steps ranging from Product Validation (P-5) to Release
Verification (P-4), Backlog Verification (P-3), Model Verification (P-2) and Code
Verification (P-1), as elaborated in the next paragraphs.
Figure 3-4: FITMAN V&V Method from a product specific perspective
3.2.2.1 Step P-1: Code Verification
Code Verification aims at analysing single units of code (both separately and with regard to
their dependencies across collections of code units and programs) and proving (typically in a
mathematical, formal manner) that they will function as intended. With a view to identifying
defects, errors and problems in code, Code Verification evaluates the correctness and
improves the overall product's quality. During this step, compliance with good coding
practices also needs to be verified in order to meet the CISQ (Consortium for IT Software
Quality) software characteristics that include Reliability, Performance Efficiency, Security,
and Maintainability (CISQ Technical Work Groups 2012).
In alignment with the software testing directions for dynamic verification of a program’s
behaviour on a finite set of test cases, suitably selected from the usually infinite executions
domain, against the expected behaviour (International Organization for Standardization
2005), Code Verification is characterized by the following features:
Adopting the agile pattern in terms of strategy, in that Code Verification constitutes an
integral part of each iterative software development phase («sprint») and runs in
parallel with the coding activities.
Ranging from reviewing and inspecting the source code (static) to running specific test
cases (dynamic) to reveal as much potential for failure as possible and produce
repeatable results on specified computer hardware, operating systems, and compilers.
The main focus of this activity is to identify and eliminate programming and
implementation errors within the software (software quality assurance) and to verify
the correctness of the numerical algorithms that are implemented in the code
(numerical algorithm verification).
Given the inherently complex nature of both Future Internet and Manufacturing and the short
time-frame, it is recommended that Automated Development Tests are primarily employed
(with a set of additional techniques for correctness proof to the extent possible).
Finally, Code Verification requires access to the code being tested and is carried out with the
support of configuration management and debugging tools. Typically, it is partially
accomplished by using software quality assurance procedures. Additionally, it has to be
mentioned that most development environments incorporate code analysis tools for checking
coding practices and, to this end, the decision of exactly which code verification tools shall be
used per case belongs to the development team (especially since it depends on the type and
the technical characteristics of the component under development).
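As a hedged illustration of what an automated development test might look like for a newly developed component, the following sketch uses a hypothetical helper function of a Specific Enabler; any xUnit-style framework or the tooling of the chosen development environment could equally be used.

```python
# Illustrative sketch of an automated development test executed within each sprint.
# The function under test (a hypothetical Specific Enabler helper) and the test
# cases are invented for the example; any xUnit-style framework could be used.
import unittest

def batch_completion_rate(completed: int, planned: int) -> float:
    """Hypothetical SE helper: share of planned production orders that were completed."""
    if planned <= 0:
        raise ValueError("planned must be positive")
    return completed / planned

class BatchCompletionRateTest(unittest.TestCase):
    def test_nominal_case(self):
        self.assertAlmostEqual(batch_completion_rate(45, 50), 0.9)

    def test_rejects_non_positive_plan(self):
        # White-box style check that exercises the guard clause in the implementation.
        with self.assertRaises(ValueError):
            batch_completion_rate(10, 0)

if __name__ == "__main__":
    unittest.main()
```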
Recommended Technique: White Box Testing
Additional Potential Techniques to be employed (based on the trial’s needs):
Walkthrough
FITMAN Stakeholders: In the code verification phase, the active stakeholders are the
members of the Development Team that is held accountable for running the tests.
Optional Crowd Engagement Method: Code Verification is envisaged as an automated
procedure without involving any stakeholder, other than the Development Team.
Enablers concerned: Since the coding refers to developing new software, only the Specific
Enablers (SEs) and the Trial Specific Components (TSCs) which are aimed to be developed
fall in scope of this phase.
3.2.2.2 Step P-2: Model Verification
Model Verification refers to the process of determining that the product implementation
accurately represents its conceptual design which, in turn, is built according to the
requirements. In this direction, Model Verification ensures that the specification is complete,
that the look and feel and functionalities are according to the requirements and design
documentation, and that mistakes have not been made when eventually implementing the code
based on the models.
In contrast to the expected outcome of the model verification process in the international
bibliography, typically defined as the quantified level of agreement between experimental
data and model prediction, as well as the predictive accuracy of the model, Model
Verification in FITMAN is foreseen in two sub-steps, fulfilling two purposes in a timely manner:
To ensure that the model representation is according to the requirements as soon as the
Design and Visual Modelling phase is completed. This phase can be also related to
Model-based testing as an abstract (formal) representation of the software under test or
of its requirements is used for validating requirements, checking their consistency, and
generating test cases focused on the behavioural aspect of the software. In practice, the
ideas as captured in visual mock-ups and diagrams can be quickly demonstrated
(within the Development Team or externally) and feedback can be gathered before actually
proceeding to the coding activities. A set of readily available tools for fully interactive
visual prototyping can be exploited to this end.
To accumulate evidence of the product correctness or accuracy for its intended use in
comparison with the visual models and requirements. When the Continuous
Integration and Refactoring activities have ended and since the product
implementation from the technical perspective of the code has been already verified in
the Code Verification phase, the purpose of this step is to ensure that the code is
compliant with the underlying specification and design.
During the Model Verification phase, acceptance testing checks the product behaviour against
the requirements that have been expressed as user stories of the Sprint Backlog. By ensuring
that models are actually defined according to the requirements set and that they have
accurately driven the code implementation, product credibility is eventually built on the basis
of models.
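A very small sketch of the kind of traceability check this step relies on is given below; the user story, model element and code unit identifiers are hypothetical, and in practice such traces would typically be maintained in the backlog or modelling tool rather than in ad-hoc code.

```python
# Minimal traceability-analysis sketch: verify that every user story of the sprint
# backlog is traced to at least one model element and to at least one code unit.
# The identifiers below are hypothetical examples.

user_stories = {"US-01", "US-02", "US-03"}

model_trace = {   # user story -> model elements (mock-ups, diagrams)
    "US-01": {"UC-Login", "SEQ-Login"},
    "US-02": {"UC-OrderStatus"},
}
code_trace = {    # user story -> implementing code units
    "US-01": {"auth/login.py"},
    "US-02": {"orders/status.py"},
    "US-03": {"reports/export.py"},
}

untraced_to_model = {story for story in user_stories if not model_trace.get(story)}
untraced_to_code = {story for story in user_stories if not code_trace.get(story)}

print("stories without a model element:", untraced_to_model or "none")
print("stories without a code unit:", untraced_to_code or "none")
```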
In this step, it becomes evident that explaining the product through diagrams (whether these
are visual mock-ups or structural and behavioural diagrams) can make the development
team focus on its different aspects and discover potential problems early. Even if the engaged
audience does not understand the details of the model, or the product, the developer may
become aware of bugs simply by studying the models carefully and trying to explain how the
product works.
The key components of Model Verification (Utting and Legeard 2007) are: the notation
used for representing the model of the software; the test strategy or algorithm for test case
generation; and the supporting infrastructure for the test execution, including the evaluation of
the expected outputs.
It needs to be noted that, depending on the nature of the product, (fully or partially automated)
model checking may also be applied, as it consists of a systematically exhaustive exploration
of the product's states and transitions through mathematical models, exploiting smart
and domain-specific abstraction techniques. Another approach is deductive verification, which
foresees generating from the system and its specifications (and possibly other annotations) a
collection of mathematical proof obligations, the truth of which implies conformance of the
system to its specification, and discharging these obligations using either interactive theorem
provers, automatic theorem provers, or SMT solvers.
Recommended Technique: Traceability Analysis
Additional Potential Techniques to be employed (based on the trial’s needs):
Walkthrough
FITMAN Stakeholders: The FITMAN stakeholders that have an active role in Model
Verification are the Development Team and the Sprint Master. The development team is
responsible for the design and visual modelling as well as for the continuous, efficient and
effective integration and refactoring of the modular components (based on the model
verification results), while the Sprint Master (besides his/ her responsibility of coordinating
the development team and keeping it focused on the target) bears the responsibility of
evaluating the model verification's results and providing the Development Team with the
necessary feedback/ directions to be incorporated during this or the forthcoming sprints.
Optional Crowd Engagement Method: Innovators (defined as the early adopters of a
software product) can be engaged in the model verification phase following a twofold
approach:
The developed models can be presented (in the form of low-level mock-ups) in
tentative (and short-lasting) physical or online workshops, where end-users'
representatives with sufficient technical background would provide their valuable
feedback. Readily available prototyping applications, depending on the nature of the
system to be developed (e.g. for mobile applications, AppTaster (AppTaster 2013) or
proto.io (Proto.io 2013)), could also be exploited to this end.
The developed models can be provided openly to all interested stakeholders (e.g.
through a wiki, a deliberation platform or other appropriate social channels like
LinkedIn), together with a call for innovative user stories applying to the models.
Through this engagement procedure the development team and the sprint master could
retrieve valuable feedback on model applications that were not foreseen, as well as on possible
gaps in the implementation sequence.
Enablers Concerned: Since model verification refers to the software development
procedures, the enablers concerned in this specific step are the Specific Enablers (SEs) and the
Trial Specific Components (TSCs) that are envisioned to be developed.
3.2.2.3 Steps P-3 & P-4: Backlog & Release Verification
Backlog and Release Verification constitute the final verification steps for each sprint and
release, correspondingly, through a number of tests of various natures. Both Verification steps
need to ensure that the product developed is in compliance with the requirements reflected in
the user stories of the Product Backlog.
Since each sprint incrementally builds on the code created by the previous sprints, Backlog
Verification, on the one hand, aims at ensuring that the product’s behaviour remains
unchanged. To this end, regression testing as the “selective retesting of a system or
component to verify that modifications have not caused unintended effects and that the
system or component still complies with its specified requirements”, according to
IEEE/ISO/IEC 24765:2009 Systems and Software Engineering Vocabulary, is instrumental.
Regression testing tools that support the re-execution of a test suite after the software product
has been modified can be applied in this step.
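A minimal sketch of flagging regressions between two runs of the test suite is shown below; the test case identifiers and outcomes are illustrative only, and in practice the comparison would usually be handled by the regression testing tool or the continuous integration environment itself.

```python
# Minimal regression-testing sketch for backlog verification: the full suite is
# re-executed after the sprint and compared with the previous run to flag
# unintended effects. Test identifiers and outcomes are illustrative only.

previous_run = {"TC-001": "pass", "TC-002": "pass", "TC-003": "pass"}
current_run = {"TC-001": "pass", "TC-002": "fail", "TC-003": "pass",
               "TC-004": "pass"}  # TC-004 covers the new sprint's user story

regressions = [case for case, outcome in current_run.items()
               if outcome == "fail" and previous_run.get(case) == "pass"]

if regressions:
    print("Regression(s) introduced in this sprint:", regressions)
else:
    print("No regressions: previously passing behaviour is unchanged.")
```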
Release Verification, on the other hand, takes place at the last step of all sprints and is followed by
the official product release; thus, it is of high importance to execute the appropriate system
tests in order to achieve the required result. At this phase, when the majority of functional
failures should already have been identified and resolved, system testing compares the system
against the non-functional product requirements, such as security, performance, accuracy, and
reliability. External interfaces to other products and services are also evaluated at this level.
Recommended Technique: Regression Testing
Additional Potential Techniques to be employed (based on the product’s needs): Alpha
Testing
FITMAN Stakeholders: The stakeholders that have an active role in FITMAN V&V backlog
verification are the Development Team, the Sprint Master and the Product Owner. The
development team is responsible for the execution of all the necessary tests, while the Sprint
Master (besides his/ her responsibility of coordinating the development team and keeping it
focused on the target) bears the responsibility of evaluating the backlog verification's results
and providing the Development Team with the necessary feedback/ directions for any
corrections/ refinements. The Product Owner verifies, at a high level, that all necessary
acceptance tests have been executed successfully and approves the product release.
Optional Crowd Engagement Method: During the backlog and release verification, the
effective engagement of the so-called crowd (in terms of other FI-PPP participants or the
general public) becomes even more crucial at two distinct phases:
Ex-ante crowdsourcing for brainstorming and bringing collective intelligence to the
product features. When prioritizing the user stories and the features that will be
implemented in each sprint or overall in the product backlog, users can provide an
external, more neutral perspective on which are the killer features, with the help of
online deliberation and feedback tools like UserVoice or even LinkedIn.
Ex-post crowdsourcing for testing purposes after a product release is available. With
the help of dedicated IT tools (e.g. uTest (uTest 2013)), FITMAN could call for a
large number of testers, aiming towards testing all critical aspects of the product
release (e.g. security, load stability etc.). The organization of hackathons that provide
proper incentives for application developers to develop their own solutions on top of
the FI-WARE and FITMAN results could also be instrumental in attracting attention
and verifying a product.
Enablers Concerned: The components under development, Specific Enablers (SEs) and
Trial Specific Components (TSCs), are subject to Backlog Verification, as part of each
sprint’s activities. Release Verification, though, refers to a higher level of verification
compared to the previous steps of the FITMAN V&V Methodology; thus all categories of
enablers (including Generic Enablers (GEs)) are involved in this specific step.
3.2.2.4 Step P-5: Product Validation
Product validation takes place immediately after the release of a product or right after it
becomes available for testing and trial use. It refers to the process of performing the necessary
number of integrated product/ system operational tests (of various natures) against sets of
requirements deriving from the trials.
As far as the FI-WARE Generic Enablers are concerned, the basis for the validation of each
GE is the consolidated set of the functional and non-functional requirements of all the trials.
Since the GEs are already developed and cannot be altered or updated in the framework of
FITMAN, their validation is the key for definitely deciding and confirming whether each
enabler is going to be used in one or more of the trials. Since the validation of the GEs takes
place against the relevant (in terms of specs and functionality) requirements of the trials, the
result of the validation depends on each examined trial. A GE may be appropriate for one trial
and have positive validation results, while at the same time this GE may not pass the
validation tests when examined for other trials that have different operational requirements.
For the Specific Enablers, as well as for the Trial Specific Components, their validation is
related to their development, as it constitutes the last part of the FITMAN development
approach. Those components are developed by gathering and analysing the
requirements of the trials, so their validation is performed on the same basis right after the release
verification of the “final” product.
Product Validation includes a review of the operational and technical characteristics of each
product, as well as system acceptance tests, in order to confirm that the component is fully in
accordance with the requirements of each one of the trials in which it will be integrated (according to
the initial selection of enablers). So, for the SEs and TSCs, acceptance testing should target
validating both the user stories and their implementations.
Finally, it needs to be noted that during the product validation step, the technical indicators
defined in D2.2 can be exploited.
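As a hedged illustration of black-box validation, the sketch below exercises a released component only through its public interface and checks the observed behaviour against a trial requirement; the endpoint, payload and expected values are hypothetical examples, not part of any FITMAN specification.

```python
# Illustrative black-box validation sketch: the released product is exercised only
# through its public interface and the observed behaviour is checked against a
# trial requirement. Endpoint, payload and expected values are hypothetical.
import json
import urllib.request

TRIAL_REQUIREMENT = "The enabler shall return the current status of a production order"
ENDPOINT = "http://localhost:8080/orders/4711/status"   # hypothetical deployment

def requirement_satisfied() -> bool:
    with urllib.request.urlopen(ENDPOINT, timeout=5) as response:
        status_code = response.status
        payload = json.loads(response.read().decode("utf-8"))
    # Only the externally visible contract is inspected (no access to the code).
    return status_code == 200 and payload.get("status") in {"planned", "in_progress", "completed"}

if __name__ == "__main__":
    print(TRIAL_REQUIREMENT, "->", "satisfied" if requirement_satisfied() else "not satisfied")
```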
Recommended Technique: Black Box Testing for Validation
Potential Techniques to be employed (based on the product’s needs): System Testing,
User Acceptance Testing, Inspection
FITMAN Stakeholders: The stakeholder mainly engaged in FITMAN V&V product
validation is the Product Owner. As the main contributor behind the product definition and
requirements, the Product Owner is responsible for conducting all necessary acceptance tests
in order to assure the correctness and appropriateness of the product in both a technological and a
business/ conceptual manner. The Trial Supporting Team also validates the product to ensure
it will individually cover the needs foreseen for the trial.
Optional Crowd Engagement Method: During the Product Validation step, the importance
of engaging a larger audience increases. Following the paradigm of backlog and release
verification, multiple crowdsourcing-like techniques can be employed:
Online deliberation and feedback tools (like Uservoice, IdeaScale (IdeaScale 2013) and
LinkedIn) empower users to speak up and help understand whether the product covers their
needs and requirements.
Open calls for testing scenarios / trials; many organisations/ projects at the FI-PPP
level have experience relevant to FITMAN and could provide the product
owner with a large number of appropriate and high-value scenarios.
Workshops and specific working groups aiming at inspecting the developed product at a
high level and reporting (either in a structured or unstructured manner) any
issues they come up with could provide valuable results.
More traditional online survey tools (like SurveyMonkey) may gather structured
viewpoints on a product.
For constant validation purposes through social media, specific hashtags can also be allocated
to each product. By monitoring the discussions and tweets posted with the hashtag(s),
concrete feedback can be gathered in real time.
A main difference in comparison to the previous phase is that, in product validation,
participants without a specific technical/ IT background could provide valuable insights.
Enablers Concerned: Product validation deals with the official release of the final product as
far as all SEs and TSCs are concerned, and with the validation of the GEs as soon as they are
available to be tried against the requirements of each trial. The stability, correctness and
appropriateness of the examined product have to be assessed in this step, while at the same
time its levels of openness and versatility will be evaluated, given that those
criteria are considered extremely important for selecting the FITMAN components to be
integrated in the trial solutions.
3.3. FITMAN V&V Methodology Mapping to the Reference Architecture
The draft Reference Architecture for next generation FI-based enterprise systems and
applications that appears in the DoW builds on the concept that next generation Enterprise
Systems will be collaborative and interoperable by design and will mostly serve the
needs of an ecosystem of enterprises rather than the specific requirements of a single
manufacturing enterprise or OEM. The purpose of this architecture is to provide the FITMAN
Trials with basic guidelines in order to help them review their business case's needs according to
three layers: Future Internet Cloud, Business Ecosystem, and Individual Factory/ Enterprise.
In an effort to identify where V&V aspects are embedded in such a Reference Architecture, a
preliminary mapping between specific phases/ steps of the FITMAN V&V methodology and
the FITMAN reference architecture has been conducted as depicted in the following table:
The Future Internet Cloud level of the FITMAN Reference Architecture maps to the
procedures regarding the FI-WARE Generic Enablers (GEs). To this end, the relevant
V&V steps include:
o Release Verification (P-4)
o Product Validation (P-5)
The Business Ecosystem level of the FITMAN Reference Architecture maps to
FITMAN Specific Enablers (SEs), as it aims towards covering needs of specific
business ecosystems. Thus, regarding the FITMAN V&V Methodology, the V&V
steps under consideration include:
o Code Verification (P-1)
o Model Verification (P-2)
o Backlog Verification (P-3)
o Release Verification (P-4)
o Product Validation (P-5)
The Individual Factory/ Enterprise level of the FITMAN Reference Architecture maps
to the procedures regarding the FITMAN Trial Specific Components (TSCs), as it
aims towards covering needs of individual businesses/ enterprises. With reference to
the FITMAN V&V Methodology, the following steps are considered:
o Code Verification (P-1)
o Model Verification (P-2)
o Backlog Verification (P-3)
o Release Verification (P-4)
o Product Validation (P-5)
o Trial Solution Validation (T-1)
o Business Validation (T-2)
It needs to be noted that (X) denotes steps that may be omitted depending on whether a
component is developed from scratch. The steps T-1 and T-2 of the trial specific perspective
are conducted once per trial.
Table 3-5: FITMAN V&V method and Reference Architecture

Reference Architecture Level   | P-1 | P-2 | P-3 | P-4 | P-5 | T-1 | T-2 | Enablers Concerned
Future Internet Cloud          |  -  |  -  |  -  |  X  |  X  |  -  |  -  | GEs
Business Ecosystem             | (X) | (X) | (X) |  X  |  X  |  -  |  -  | SEs
Individual Factory/ Enterprise | (X) | (X) | (X) |  X  |  X  |  X  |  X  | TSCs

(P-1: Code Verification, P-2: Model Verification, P-3: Backlog Verification, P-4: Release Verification, P-5: Product Validation, T-1: Trial Solution Validation, T-2: Business Validation)
4. FITMAN V&V Criteria Elaboration
In order to perform the FITMAN Verification and Validation tasks, and together with defining the
approaches that shall be initiated according to the FITMAN V&V Methodology in Section 3,
several sets of criteria must be defined spanning all the verification and validation steps
described in the methodology. The FITMAN V&V criteria can be divided into two
categories, the Business Validation criteria that accompany the FITMAN V&V approach in
manufacturing settings and the IT/ Technical V&V criteria, and are expected to be updated as
the project evolves.
4.1. Business Validation Criteria
4.1.1. Generic Criteria
This section aims to provide the general business criteria (defined as the criteria relevant to
business aspects) of the integrated FITMAN V&V methodology. They have been mainly
derived from the SCOR methodology approach (Supply Chain Council Aug 2010), to ensure
consistency in their definition, avoid overlaps or gaps, and provide standard measurement
and assessment metrics.
Another advantage of adopting a reference model like SCOR is that it ensures a fair coverage
of Business Criteria over all Business & Logistic Processes without inhomogeneous
distributions. For each specific trial, if needed, specific criteria will be defined to highlight
peculiar aspects of that trial, as elicited by the Business Requirements expressed in FITMAN
task T1.1 during the definition of Business Requirements.
Selected criteria are divided into two major families: internal and external. The first class
focuses on the effectiveness of serving customer expectations, the second on running
processes efficiently and optimizing economic and financial results.
The focus is mainly on tangible and measurable criteria that can eventually and consistently be
instantiated into a set of Performance Indicators, but other intangible criteria (such as reputation, work
environment and sentiment, etc.) are also increasingly taken into consideration nowadays
(Paul 2011), although not emphasized in this section (as specific liaisons with STEEP will be
created in FITMAN Deliverable D2.2).
In particular, in the specific criteria section, specific attention has been paid to what is
expressed in terms of specific business objectives for some of the trials (ref. D1.1-FITMAN
Use Case Scenarios).
Table 4-1: Generic Criteria for Business Validation
Criteria Category Criterion
Customer Reliability
The Reliability category addresses the
ability to perform tasks as expected.
Reliability focuses on the predictability of
the outcome of a process. Typical criteria
for the reliability attribute include: On-time,
the right quantity, the right quality.
External
Customer Order Fulfillment Quality
The ability to deliver an order with complete
and accurate documentation and no delivery
damage of what the customer wants in terms
of quality.
Customer Order Fulfillment Quantity
The ability to deliver orders in which all of the
items are received by the customer in the
quantities committed, within the committed terms
Documentation Accuracy
The ability to deliver an order with accurate
documentation supporting the order,
including packing slips, bills of lading,
invoices, etc.
Customer Order Fulfillment Cycle Time
The ability to consistently fulfill
customer orders. For each individual order,
this time starts from the order receipt and
ends with customer acceptance of the order.
Source Cycle Time
The ability to consistently fulfill
the sourcing process (including
subcontracting)
Responsiveness The Responsiveness category describes the
speed at which tasks are performed.
Responsiveness addresses repeated speed of
doing business. Example criteria are cycle
time metrics. Responsiveness is a customer
focused attribute.
External
Make Cycle Time
The ability to consistently fulfill
the production process based on the internal
standard times and market expectations
(including fabrication and assembly). It also
includes Planning and Design time.
Flexibility
The ability to manage an unplanned
sustainable increase in quantities delivered.
Note: the reference values are usually
provided from benchmarking analysis.
The new operating level needs to be
achieved without a significant increase of
cost per unit. Component metrics (Upside
Source Flexibility, Upside Make Flexibility,
etc.) can be improved in parallel and, as a
result, this calculation requires the result to
be the least amount of time needed to achieve the
desired outcome.
Adaptability
The ability to increase or decrease the
quantity delivered over a certain period of
time. Note: Flexibility refers to the ability to
manage a peak of demand, while
Adaptability refers to the behaviour of the
system over a certain period of time (e.g.
switching from a Level Production approach
to a Chase Production approach)
Agility The Agility category describes the ability to
respond to external influences; the ability to
change. External influences include: Non-
planned increases or decreases in demand,
suppliers or partners going out of business,
natural disasters, acts of (cyber) terrorism,
availability of financial tools (the economy),
labor issues. The SCOR performance
indicators include Flexibility and
Adaptability. Agility is a customer focused
attribute.
External
Supply Chain Value at Risk (VAR)
It indicates how much the complete Supply
Chain (including production) is resilient to
the occurring of Unfavorable events (Risks)
It considers the probability of risk events
and the impact of the events for all the
supply chain functions (e.g. Plan, Source,
Make, Deliver and Return). It indicates how
much the Supply Chain is robust.
Total Supply Chain Management Costs
It refers to the efficiency level of costs
associated with the SCOR Level 2 processes
to Plan, Source, Make, Deliver, and Return.
Project ID 604674 FITMAN – Future Internet Technologies for MANufacturing
23/08/2013 Deliverable D2.1 – v1.00
FITMAN Consortium Dissemination: Public 83/130
Criteria Category Criterion
Note – The associated values of indicators
are usually referred to AS-IS vs To-Be
situation or based on benchmarks vs
competitors.
Cost of Goods Sold
It refers to the efficiency level of costs
associated with buying raw materials and
producing finished goods. This cost includes
direct costs (labor, materials) and indirect
costs (overhead) and it reflects the efficiency
of the production process.
Cost
The Cost category describes the cost of
operating the process. Typical cost includes
labor cost, material cost, transportation cost.
Cost is an internal focused attribute and
indicates how much the process is efficient
in carrying out its activities.
Internal
Cash-to-Cash Cycle Time
The ability to optimize the time it takes for
an investment made to flow back into a
company after it has been spent for raw
materials. For services, this represents the
time from the point where a company pays
for the resources consumed in the
performance of a service to the time that the
company received payment from the
customer for those services.
Inventory Optimization
The capability to optimize the level of
inventory achieving a trade-off among
service level to customer, optimal production
pace and inventory holding
Assets
The Asset Management Efficiency
(‘Assets’) category describes the ability to
efficiently utilize assets. Asset management
strategies in supply chain include inventory
reduction and in source vs. outsource. It
encompasses as well an optimized
utilization of production resources
(machineries and operators)
Internal
Sales Receivable Optimization
Represents the efficiency of the selling
process from the financial standpoint
including the Receivable payment terms.
Payable Optimization
Represents the efficiency of the sourcing
process from the financial standpoint
including the payable terms. Note:
appropriate management of supply process
(e.g. with a JIT technique) improves the
performance.
Supply Chain Fixed Assets Optimization
Supply Chain Fixed Assets optimization
reflects the return an organization receives
on its invested capital in supply chain fixed
assets. This includes the fixed assets used in
Plan, Source, Make, Deliver, and Return.
Working Capital Optimization
Working capital Optimization reflects the
magnitude of investment relative to a
company’s working capital position verses
the revenue generated from a supply chain.
Needs to consider accounts receivable,
accounts payable, inventory, supply chain
revenue, cost of goods sold and supply chain
Project ID 604674 FITMAN – Future Internet Technologies for MANufacturing
23/08/2013 Deliverable D2.1 – v1.00
FITMAN Consortium Dissemination: Public 84/130
Criteria Category Criterion
management costs.
Note: In the previous table, criteria reflect values provided by benchmarking analysis to define the improvement from the AS-IS to the TO-BE situation. These values will be defined in collaboration with the Trial stakeholders based on their specific objectives and expectations, and will be set when the actual indicators are instantiated.
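For illustration only, the following minimal sketch (in Python, with hypothetical names and purely illustrative values) shows how an AS-IS vs TO-BE improvement and a comparison against a benchmark reference could be computed once the actual indicator values have been agreed with the Trial stakeholders; it is not part of the FITMAN method itself.

def relative_improvement(as_is: float, to_be: float, higher_is_better: bool = True) -> float:
    # Improvement from the AS-IS to the TO-BE value, as a fraction of the AS-IS value.
    if as_is == 0:
        raise ValueError("AS-IS value must be non-zero")
    change = (to_be - as_is) / abs(as_is)
    return change if higher_is_better else -change

def meets_benchmark(to_be: float, benchmark: float, higher_is_better: bool = True) -> bool:
    # True if the TO-BE value reaches the benchmark reference value.
    return to_be >= benchmark if higher_is_better else to_be <= benchmark

# Example with a cycle-time indicator (lower is better); values purely illustrative.
print(relative_improvement(12.0, 9.0, higher_is_better=False))   # 0.25, i.e. a 25% improvement
print(meets_benchmark(9.0, 10.0, higher_is_better=False))        # True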
4.1.2. Specific Criteria
During the elicitation of business requirements in T1.1, the FITMAN Trials were requested (for each of their business scenarios) to specify their Business Objectives along with their description, impact and effect in value, according to the following dimensions, in alignment with the European FP7 project ComVantage:
Table 4-2: Effect Dimensions [Source: ComVantage Project]
- Cost: ICT capabilities can contribute to cost reduction in various ways: automation and standardization of tasks, qualifying workers faster and better, improving inventory and stock management, and integrating suppliers into their customers' ICT.
- Efficiency: ICT capabilities can contribute to improved efficiency in various ways: improving the availability and efficiency of machines, enabling easy access of maintenance employees to relevant information, facilitating the identification and analysis of problems, shortening the response time to malfunctions, optimizing the scheduling of machine activity, and automating decisions regarding maintenance and production activities.
- Quality: ICT capabilities can contribute to improved quality in various ways: supporting the creation of the characteristics of the product or service, understanding the requirements that the product or service should fulfil, increasing the fit to the specification of the product or service (decreasing errors), improving the communication of the parties involved in the creation and delivery of the product or service, making information about the customer's order available from initiation to completion, facilitating transparency and early identification of deviations from the desired outcome, adapting to the changing needs of customers and the constraints of suppliers, facilitating a better understanding of customers' needs, and generating insight into the customer's tacit as well as explicit needs and requirements.
- Flexibility: ICT capabilities can contribute to improved flexibility through product and service customization, to adapt to market changes and customers' preferred options at the design stage.
- Innovation: ICT capabilities can contribute to improved innovation in various ways: common learning and knowledge sharing, and facilitating information, integration and accessibility among supply chain partners.
- Sustainability: ICT capabilities can contribute to improved sustainability by reducing the carbon footprint in three ways: reduced paper usage, reduced energy consumption and reduced fuel consumption in distribution.
Trials have expressed their specific expectations and objectives as a combination of such criteria. Based on these results, available in interim form (final draft of D1.1, dated 19 July 2013), a definition of specific criteria has been attempted for the cases where the generic criteria defined in Section 4.1.1 do not apply. A set of specific criteria has been identified, matching the corresponding objectives of specific Trials, as shown in the following table.
Table 4-3: Trial-specific business objectives and related impacts

T1 – VW
- Objective: Reduction of the time needed for the assessment of product-related inquiries. Impact: Reduced product development time; reduced time to market. Criteria: Make Cycle Time (G).
- Objective: Reduction of the costs spent for the assessment of product-related inquiries. Impact: Reduced cost impact. Criteria: Supply Chain Fixed Assets Optimization (G).
- Objective: Reduction of the costs spent for the management of the Machinery Repository. Impact: Reduced cost impact. Criteria: Supply Chain Fixed Assets Optimization (G).
- Objective: Immediate accessibility of experts' knowledge about production equipment. Impact: Reduced cost impact. Criteria: Supply Chain Fixed Assets Optimization (G).

T2 – TRW
- Objective: Effective and consistent prevention strategy. Impact: To design a complete and coherent prevention strategy. Criteria: Working Environment Safety (S).
- Objective: Optimization of prevention costs. Impact: The investment in the prevention strategy will be exploited. Criteria: Working Capital Optimization (G).
- Objective: Reduction of accidents and incidents. Impact: The workers, technicians and coordinators will have real information and the instructions they have to follow in order to ensure that the risks disappear or that the consequences are minimized. Criteria: Working Environment Safety (S).
- Objective: Increase of productivity. Impact: The improvement of the health and safety of the workers in the factory and the reduction of dangerous situations will directly affect productivity. Criteria: Working Capital Optimization (G).

T3 – Agusta
- Objective: Reduction of the time needed to prepare the final version of the logbook and relevant data. Impact: Improvement of the actual procedure. Criteria: Make Cycle Time (G).
- Objective: Improvement of H/C Configuration Control. Impact: Improvement of the actual procedure. Criteria: Customer Order Fulfilment Cycle Time; Working Capital Optimization (G).
- Objective: Support and improvement of FOD prevention management. Impact: Improvement of the actual procedure. Criteria: Working Environment Safety (S).
- Objective: Improvement of the tools tracking management. Impact: Improvement of the actual procedure. Criteria: Fixed Assets Optimization (S).

T4 – Whirlpool
- Objective: Improve the communication effectiveness along the help chain organization. Impact: The help chain can react faster and more effectively, thus reducing wasted time. Criteria: Make Cycle Time (G).
- Objective: Improve the effectiveness of decision makers along their role in the help chain. Impact: Workers will be more involved and aware in taking the right decision at the right time. Criteria: People Empowerment (S).

T5 – Piacenza
- Objective: Better exploitation of internal and external production capacity. Impact: Improved ROI in production infrastructure, better service and shorter time to market, higher customer satisfaction. Criteria: Supply Chain Fixed Assets Optimization (G).
- Objective: Improved monitoring of production capacity. Impact: Increased data update and accuracy, improved efficacy of MES and ERP to manage production. Criteria: Supply Chain Fixed Assets Optimization (G).

T6 – APR
- Objective: Decrease the order processing time. Impact: Enhance the capacity of order treatment. Criteria: Customer Order Fulfilment Cycle Time (G).
- Objective: Improve the quality of the relationship. Impact: Reduce the redundancy of information verification in the validation of customer orders. Criteria: Documentation Accuracy (G).
- Objective: Improve price competitiveness. Impact: 5% gain on purchases. Criteria: Total Supply Chain Management Costs (G).
- Objective: Improve the contracting facilities with partners. Impact: Reduction of the negotiation steps (time) within the consultation process in procurement. Criteria: Source Cycle Time (G).

T7 – Consulgal
- Objective: Improve the readability of the concreting zones with the combination of visual and textual information. Impact: Impact on the way the concreting operation is performed and managed. Criteria: Concreting Operation Time (S).
- Objective: Decrease the time for access to information on concreting operations and eventually help in quick decision making. Impact: This will help in keeping all the stakeholders well informed and in taking immediate actions as necessary. Criteria: Make Cycle Time (G).
- Objective: Improve the methodology for performing test activities, recording of test results and analysis of the test results. Impact: The new platform will allow for automatic analysis (statistical and deviation) of the test results, thus decreasing the repetitive workload for standard operations. Criteria: Make Cycle Time (G); Customer Order Fulfilment Quality (G).
- Objective: Reduction in the use of paper. Impact: Achieved with the automation of the business scenario under consideration. Criteria: Documentation Accuracy (S).
- Objective: Ensuring secure access. Impact: Improve the existing authentication and authorization mechanism. Criteria: Working Environment Safety (S).

T8 – TANet
- Objective: Faster, more cost-effective and more reliable platform and service maintenance. Impact: Reduced cost of maintenance and improved service to the end user and its clients. Criteria: Cost of Goods Sold (G).
- Objective: Agile, evolving service offering, adaptable to technology developments. Impact: Higher-value service, offering progressively more competitiveness to end-user clusters and VOs. Criteria: Market Share Increase (S).

T9 – COMPlus
- Objective: Reduction of the development time by about 50%. Impact: Products are developed faster and can therefore be brought faster to market maturity. Criteria: Make Cycle Time (G).
- Objective: Reduction of mistakes and errors. Impact: Reduction of the costs and the development time. Criteria: Customer Order Fulfilment Quality (G).
- Objective: Improvement of Front-end Loading. Impact: Products are developed faster and can therefore be brought faster to market maturity. Criteria: Make Cycle Time (G).

T10 – SEGEM-MacBo
- Objective: Reduce quoting lead time and increase accuracy. Impact: The quality of pricing, in terms of lead time and cost assessment, is a core performance factor of SEGEM in its market. Criteria: Customer Order Fulfilment Quality (G).
- Objective: Reduce the Call for Tender cycle. Impact: Dealing with the most efficient suppliers is a critical activity in SEGEM's engineering business. Criteria: Source Cycle Time (G).
- Objective: Automate the data management process. Impact: Provide the company with internal and external interoperability for collaborative business. Criteria: Interoperability of mission-critical systems (IT Criteria).

T11 – AIDIMA
- Objective: Facilitate the detection and initial analysis of home trends for further product design and development. Impact: The expected impact of discovering Weak Signals relates to the automation and improvement of the effectiveness of the trends analysis process. Criteria: Market Share Increase (S).
- Objective: Identify customers' latent demands and suggestions: delve where end-user opinions are. Impact: Designers will have access to real customer needs, having a very positive impact on the enhancement of product quality and the demand response skills of product designers. Criteria: Market Share Increase (S).
- Objective: Increase sales and obtain customer satisfaction by using all results obtained in UC1, i.e. all weak signals encountered, in combination with the BD of materials. Impact: Innovation that yields a huge improvement that will allow designers to face, in the best possible way, current market demands. Criteria: Market Share Increase (S).

Legend: G = Generic criterion, S = Trial-specific criterion.
It needs to be noted that most of the objectives expressed by the Trials are covered by the generic business criteria identified above, which are based on the SCOR methodology.
A set of specific criteria has been identified, matching objectives of specific Trials as highlighted above:
Safety / Health in the Working Environment: The capability of the solution to significantly impact the working and environmental conditions in order to minimize casualties and improve working conditions in terms of health and ergonomics.
People Motivation and Empowerment: The solution will create the conditions for people to be motivated in contributing to work, bringing new ideas, having a positive attitude, etc.
Market Positioning and Market Share: The solution will enable the company to play a more active role in the market, gaining visibility and reputation based on its ability to provide innovative, good quality, well designed, sustainable products.
Quality: Alignment with general or industry-specific standards for defining the compliance of a product or process with recognized specifications or procedures, based on de-facto market definitions or statutory legislation.
These criteria will be further elaborated as an outcome of the STEEP analysis in D2.2.
4.2. IT Verification & Validation Criteria
From an IT perspective, the V&V criteria are categorized into six categories in alignment with the ISO 9126 standard (International Organization for Standardization 2001), complemented by a group of FITMAN-relevant criteria. It is important to mention that, although the categories derive from the aforementioned standard, the criteria chosen are not limited to those proposed by ISO 9126; other individual V&V criteria have also been selected and elaborated.
Table 4-4: IT Criteria

Criteria Category: FITMAN-relevant
Criteria:
- Openness: Ensuring that specific people groups may access the software for free with specified rights (depending on the level of openness).
- Versatility: The ability to deliver consistently high performance in a very broad set of application domains, from server workloads to desktop computing and embedded systems.

Criteria Category: Functionality
The capability of the software to provide functions which meet the stated and implied needs of users under the specified conditions of usage.
Criteria:
- Correctness: The degree to which a system is free from defects in its specification, design and implementation.
- Interoperability: The capability of the software to interact with other systems.
- Security: The capability of the software to prevent unintended access and resist deliberate attacks intended to gain unauthorized access to confidential information.

Criteria Category: Reliability
The capability of the software product to maintain a specified level of performance when used under specified conditions.
Criteria:
- Software Maturity: The capability of the software to avoid failure as a result of faults in the software.
- Fault Tolerance: The capability of the software to maintain a specified level of performance in case of software faults or of infringement of its specified interface.
- Recoverability: The capability of the software to re-establish its level of performance and recover the data directly affected in the case of a failure.

Criteria Category: Usability
The capability of the software product to be understood, learned, used and attractive to the user, when used under specified conditions.
Criteria:
- Understandability: The capability of the software product to enable the user to understand whether the software is suitable, and how it can be used for particular tasks and conditions of use.
- Ease of Learning (Learnability): The capability of the software product to enable the user to learn its applications.
- Operability: The capability of the software product to enable the user to operate and control it.
- Communicativeness: The degree to which the software is designed in accordance with the communicative characteristics of users.
- Attractiveness: The capability of the software product to be liked by the user.

Criteria Category: Efficiency
The capability of the software product to provide appropriate performance, relative to the amount of resources used, under stated conditions.
Criteria:
- Time Behaviour: The capability of the software to provide appropriate response and processing times and throughput rates when performing its function under stated conditions (e.g. latency).
- Resource Behaviour: The capability of the software to use appropriate resources in an appropriate time when the software performs its function under stated conditions.

Criteria Category: Portability
The capability of the software product to be transferred from one environment to another.
Criteria:
- Adaptability: The capability of the software to be modified for different specified environments without applying actions or means other than those provided for this purpose for the software considered.
- Installability: The capability of the software to be installed in a specified environment.
- Coexistence: The capability of the software to coexist with other independent software in a common environment sharing common resources.
- Replaceability: The capability of the software to be used in place of other specified software in the environment of that software.
- Hardware Independence: The degree to which the software is dependent on the underlying hardware.

Criteria Category: Maintainability
A set of attributes that bear on the effort needed to make specified modifications.
Criteria:
- Analysability: The capability of the software product to be diagnosed for deficiencies or causes of failures in the software, or for the parts to be modified to be identified.
- Changeability: The capability of the software product to enable a specified modification to be implemented.
- Stability: The capability of the software to minimize unexpected effects from modifications of the software.
- Testability: The capability of the software product to enable modified software to be validated.
- Code Consistency: The level of use of uniform design, implementation techniques and notation.
- Traceability: The ability to link software components to requirements.
The criteria are elaborated further in the following paragraphs.
FITMAN-relevant
Criterion ID SC1
Criterion Name Openness
Definition Ensuring that specific people groups may access the software for
free with specified rights (depending on the level of openness)
V&V Step(s) P-4, P-5, T-1
Generic/Trial
Specific
Generic
Preconditions -
Reference (Rakitin 2001), (Fahmy, et al. 2012)
Criterion ID SC2
Criterion Name Versatility
Definition The ability to deliver consistently high performance in a very
broad set of application domains, from server workloads, to
desktop computing and embedded systems
V&V Step(s) P-4, P-5, T-1
Generic/Trial
Specific
Generic
Preconditions -
Reference (Rakitin 2001), (Fahmy, et al. 2012)
Functionality (F)
The capability of the software to provide functions which meet the stated and implied needs
of users under the specified conditions of usage
Criterion ID F1
Criterion Name Correctness
Definition The degree to which a system is free from defects in its
specification, design, and implementation.
V&V Step(s) P-1, P-2, P-3, P-4, P-5
Generic/Trial
Specific
Generic
Preconditions Testing techniques applied by the development team in order to
find possible defects
Reference (Rakitin 2001), (Meyer 1997), (Wilson 2009)
Criterion ID F2
Criterion Name Interoperability
Definition The capability of the software to interact with other systems
V&V Step(s) P-1, P-3, P-4, P-5, T-1
Generic/Trial
Specific
Generic
Preconditions
Reference (Rakitin 2001), (Fahmy, et al. 2012)
Criterion ID F3
Criterion Name Security
Definition The capability of the software to prevent unintended access and
resist deliberate attacks intended to gain unauthorized access to
confidential information
V&V Step(s) P-4, P-5, T-1
Generic/Trial
Specific
Generic
Preconditions Implementation of specific tests concerning security of the
software and of its components
Reference (Rakitin 2001), (Fahmy, et al. 2012)
Reliability (R)
The capability of the software product to maintain a specified level of performance when used
under specified conditions.
Criterion ID R1
Criterion Name Software Maturity
Definition The capability of the software to avoid failure as a result of faults
in the software
V&V Step(s) P-2, P-3, P-4
Generic/Trial
Specific
Generic
Preconditions
Reference (Fahmy, et al. 2012)
Criterion ID R2
Criterion Name Fault Tolerance
Definition The capability of the software to maintain a specified level of
performance in case of software faults or of infringement of its
specified interface
V&V Step(s) P-2, P-3, P-4
Generic/Trial
Specific
Generic
Preconditions -
Reference (Fahmy, et al. 2012)
Criterion ID R3
Criterion Name Recoverability
Definition The capability of the software to reestablish its level of
performance and recover the data directly affected in the case of a
failure
V&V Step(s) P-2, P-3, P-4
Generic/Trial
Specific
Generic
Preconditions
Reference (Fahmy, et al. 2012)
Usability (U)
The capability of the software product to be understood, learned, used and attractive to the
user, when used under specified conditions
Criterion ID U1
Criterion Name Understandability
Definition The capability of the software product to enable the user to
understand whether the software is suitable, and how it can be
used for particular tasks and conditions of use.
V&V Step(s) P-5, T-1
Generic/Trial
Specific
Generic
Preconditions
Reference (Fahmy, et al. 2012), (SQRL 2013)
Criterion ID U2
Criterion Name Ease of learning (learnability)
Definition The capability of the software product to enable the user to learn
its applications.
V&V Step(s) P-5, T-1
Generic/Trial
Specific
Generic
Preconditions
Reference (Fahmy, et al. 2012), (SQRL 2013)
Criterion ID U3
Criterion Name Operability
Definition The capability of the software product to enable the user to operate
and control it
V&V Step(s) T-1
Generic/Trial
Specific
Trial Specific
Preconditions Validation by end users
Reference (Fahmy, et al. 2012), (SQRL 2013)
Criterion ID U4
Criterion Name Communicativeness
Definition The degree to which software is designed in accordance with the
communicative characteristics of users
V&V Step(s) T-1
Generic/Trial
Specific
Trial Specific
Preconditions
Reference (Fahmy, et al. 2012), (SQRL 2013)
Criterion ID U5
Criterion Name Attractiveness
Definition The capability of the software product to be liked by the user.
V&V Step(s) P-5, T-1
Generic/Trial
Specific
Both Generic and Trial Specific
Preconditions
Reference (Fahmy, et al. 2012), (SQRL 2013)
Efficiency (E)
The capability of the software product to provide appropriate performance, relative to the
amount of resources used, under stated conditions
Criterion ID E1
Criterion Name Time Behaviour
Definition The capability of the software to provide appropriate response and
processing times and throughput rates when performing its
function under stated conditions
V&V Step(s) P-3, P-4, P-5, T-1
Generic/Trial
Specific
Both Generic and Trial Specific
Preconditions Measurement of processing times and throughput rates
Reference (Fahmy, et al. 2012)
Criterion ID E2
Criterion Name Resource Behavior
Definition The capability of the software to use appropriate resources in an
appropriate time when the software performs its function under
stated condition
V&V Step(s) P-3, P-4, P-5, T-1
Generic/Trial
Specific
Both Generic and Trial Specific
Preconditions
Reference (International Organization for Standardization 2001)
Portability (P)
The capability of the software product to be transferred from one environment to another
Criterion ID P1
Criterion Name Adaptability
Definition The capability of the software to be modified for different
specified environments without applying actions or means other
than those provided for this purpose for the software considered.
V&V Step(s) P-5, T-1
Generic/Trial
Specific
Both Generic and Trial Specific
Preconditions
Reference (Rakitin 2001), (Fahmy, et al. 2012)
Criterion ID P2
Criterion Name Installability
Definition The capability of the software to be installed in a specified
environment.
V&V Step(s) T-1
Generic/Trial
Specific
Generic
Preconditions
Reference (Rakitin 2001)
Criterion ID P3
Criterion Name Coexistence
Definition The capability of the software to coexist with other independent
software in a common environment sharing common resources
V&V Step(s) T-1
Generic/Trial
Specific
Generic
Preconditions
Reference (Rakitin 2001)
Criterion ID P4
Criterion Name Replaceability
Definition The capability of the software to be used in place of other
specified software in the environment of that software.
V&V Step(s) P-5, T-1
Generic/Trial
Specific
Both Generic and Trial Specific
Preconditions
Reference (Rakitin 2001)
Criterion ID P5
Criterion Name Hardware independence
Definition Degree to which the software is dependent on the underlying
hardware
V&V Step(s) P-3, P-4, P-5
Generic/Trial
Specific
Generic
Preconditions
Reference (Rakitin 2001), (Fahmy, et al. 2012)
Maintainability (M)
A set of attributes that bear on the effort needed to make specified modifications
Criterion ID M1
Criterion Name Analysability
Definition The capability of the software product to be diagnosed for
deficiencies or causes of failures in the software or for the parts to
be modified to be identified
V&V Step(s) P-1, P-2, P-3, P-4, P-5
Generic/Trial
Specific
Generic
Preconditions
Reference (International Organization for Standardization 2001)
Criterion ID M2
Criterion Name Changeability
Definition The capability of the software product to enable a specified
modification to be implemented
V&V Step(s) P-4, P-5, T-1
Generic/Trial
Specific
Preconditions
Reference (International Organization for Standardization 2001)
Criterion ID M3
Criterion Name Stability
Definition The capability of the software to minimize unexpected effects
from modifications of the software
V&V Step(s) P-4, P-5, T-1
Generic/Trial
Specific
Generic
Preconditions
Reference (International Organization for Standardization 2001)
Criterion ID M4
Criterion Name Testability
Definition The capability of the software product to enable modified software
to be validated
V&V Step(s) P-4, P-5
Generic/Trial
Specific
Generic
Preconditions
Reference (International Organization for Standardization 2001)
Criterion ID M5
Criterion Name Code Consistency
Definition Level of use of uniform design, implementation techniques and
notation
V&V Step(s) P-1, P-2, P-3, P-4, P-5
Generic/Trial
Specific
Generic
Preconditions
Reference (International Organization for Standardization 2001)
Criterion ID M6
Criterion Name Traceability
Definition Ability to link software components to requirements
V&V Step(s) P-5, T-1
Generic/Trial
Specific
Both generic & trial specific
Preconditions Access to the individual components of the software
Reference (Fahmy, et al. 2012)
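The criterion records above follow a uniform layout (ID, name, definition, applicable V&V steps, generic or trial-specific scope, preconditions, references). As an illustration only, the following Python sketch (hypothetical structure, with example entries taken from the records above) shows one way such records could be encoded as a small machine-readable catalogue, e.g. when preparing the V&V assessment package; the actual format of the package will be defined in D2.3.

from dataclasses import dataclass, field

@dataclass
class Criterion:
    criterion_id: str           # e.g. "F2"
    name: str                   # e.g. "Interoperability"
    definition: str
    vv_steps: list[str]         # e.g. ["P-1", "P-3", "P-4", "P-5", "T-1"]
    scope: str                  # "Generic", "Trial Specific" or "Both"
    preconditions: str = ""
    references: list[str] = field(default_factory=list)

CATALOGUE = [
    Criterion("F2", "Interoperability",
              "The capability of the software to interact with other systems",
              ["P-1", "P-3", "P-4", "P-5", "T-1"], "Generic",
              references=["Rakitin 2001", "Fahmy et al. 2012"]),
    Criterion("U3", "Operability",
              "The capability of the software product to enable the user to operate and control it",
              ["T-1"], "Trial Specific",
              preconditions="Validation by end users",
              references=["Fahmy et al. 2012", "SQRL 2013"]),
]

def criteria_for_step(step: str) -> list[Criterion]:
    # All criteria applicable to a given V&V step, e.g. "T-1".
    return [c for c in CATALOGUE if step in c.vv_steps]

print([c.criterion_id for c in criteria_for_step("T-1")])   # ['F2', 'U3']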
5. Guidelines for the FITMAN V&V Method Application
Section 5 attempts to provide more concrete, practical guidelines for all actors involved in the application of the FITMAN V&V methodology.
5.1. Timing Perspective
According to Section 3.2, the application of the FITMAN methodology is divided into two main phases:
V&V Methodology Phase I: Product Specific; Steps P1-P5 (optional)
Assessment target: Software development
V&V Methodology Phase II: Trial Specific; Steps T1-T2
Assessment target: Use Case trials, technical and business assessment
Figure 5.1 gives an overview of the overall timing and relationships to the development and
usage of the FITMAN V&V Method. In addition to the two phases, the overall timing
perspective involves also the following activities:
FITMAN V&V Methodology development and experimentation
Preparation for V&V Assessment
Smart/ Digital/ Virtual Factory Trials experimentation and assessment
Reporting and consolidation of results
The overview of the timing therefore contains the following partly overlapping activities and
phases:
Project Months PM1-PM6 Development of the FITMAN V&V Method (Duration of
WP2)
Project Months PM1-PM4 Selection of GEs; Phase I, Steps P-4-P-5 (optional)
Project Months PM5-PM8 V&V Methodology Phase I: Product specific assessment of
FITMAN software development (optional); Steps P-1- P-5
Project Months PM7-PM12 Preparation for V&V assessment (to be used for training and
roll out); Technical & Business Indicators at
Smart/Digital/Virtual Factory level
Project Months PM13-PM21 Trials experimentation, application of FITMAN V&V
Method (Assessment and Evaluation of Use Case trials;
Trial-specific assessment, Phase II), Steps T-1- T-2
Project Months PM22-PM24 Reporting (Consolidation of results)
The phases are illustrated at the bottom of Figure 5-1 (yellow lines in the Gantt chart).
Figure 5-1: FITMAN V&V Method application timings
[Gantt chart covering project months PM1-PM24 (April to March), showing the FITMAN tasks and their results (T1.1 Consolidation of Use Case Scenarios and Business Requirements; T1.2 Trials IT Requirements for FI-WARE; T1.3 FI-WARE Generic Enablers final selection; T2.1 V&V Generic Method and Criteria Identification; T2.2 Business and Technical Indicators Definition; T2.3 V&V Assessment Package; T2.4 Instantiation of the V&V Assessment Package per Use Case Trial; T3.2 Planning and Preparation of the Trial Business Cases; T3.3 Design & Development of FITMAN Specific Enablers; T4.1/T5.1/T6.1 Instantiation of the FITMAN System; T4.4/T5.4/T6.4 Measuring indicators and Governance; WP7 Lessons learned, recommendations, best practices), together with the FITMAN V&V Method application activities: development of the V&V Method, V&V Methodology Phase I (product-specific), preparation for the V&V assessment (training and experimentation), V&V Methodology Phase II (trial-specific use case assessment) and reporting (consolidation of results).]
The following table contains the relationships/ dependencies and timing constraints of the
FITMAN V&V Method development and application:
Table 5-2: Relationships and Constraints of the FITMAN V&V Method
- Performed in T1.3: Assessment/selection of GEs (V&V Methodology Phase I, steps P-4 to P-5). Relationships/needs: FITMAN V&V methodology. Timing constraint: a tight timeframe with the WP2 deadlines. Deadline: PM4.
- Performed in T2.1: Consolidated FITMAN V&V Methodology and Criteria. Relationships/needs: needed for indicator definition (T2.2). Timing constraint: in parallel with T2.2. Deadline: end of PM3.
- Performed in T2.2: Use of the ECOGRAI methodology for KPI definition. Relationships/needs: UC Trial business objectives from T1.1. Timing constraint: Task 1.1 also finishes at PM3. Deadline: PM3.
- Performed in T2.2: Selection of generic business indicators. Relationships/needs: in alignment with the Business Criteria. Timing constraint: in parallel with T2.1. Deadline: end of PM3.
- Performed in T2.2: Selection of generic technical indicators. Relationships/needs: in alignment with the IT Criteria. Timing constraint: in parallel with T2.1. Deadline: end of PM3.
- Performed in T2.2: Experimentation with the FITMAN V&V Methodology to test the KPI methodology. Relationships/needs: at least one Trial from each of the Smart, Digital and Virtual business UCs. Timing constraint: Tasks 1.1 and 2.1 also finish at PM3. Deadline: end of PM3.
- Performed in T2.3: Consolidated Generic Assessment Package. Relationships/needs: FITMAN V&V Methodology; Business & Technical indicators. Timing constraint: dependent on T2.1 and T2.2 results. Deadline: end of PM4.
- Performed in T2.4: Instantiation of Assessment Packages; definition of UC-specific technical and business indicators. Relationships/needs: Generic Assessment Package; UC Trial business objectives; Business & Technical indicators. Timing constraint: dependent on T2.3 results. Deadline: end of PM6.
- Performed in T3.3: V&V Methodology Phase I: assessment of FITMAN software SE and TSC specific development (Steps P-1 to P-5). Relationships/needs: optional. Timing constraint: very short time for SE and TSC development/adaptation. Deadline: end of PM8.
- Performed in WP4, WP5, WP6: FITMAN Technical & Business indicators for Smart/Digital/Virtual Factory. Relationships/needs: WP2 V&V assessment method, V&V package and V&V instantiation for trials. Timing constraint: tight schedule. Deadline: PM8.
- Not assigned: Building preparedness for UC trials; training of UC human resources. Relationships/needs: FITMAN V&V Methodology; UC-specific Assessment Packages. Timing constraint: ready by PM12 when measurement starts. Deadline: PM12.
- Performed in WP4, WP5, WP6: V&V Methodology Phase II: Use Case Trial-specific assessment (Steps T-1 to T-2). Relationships/needs: use case-specific instantiation of indicators and Assessment Packages. Deadline: PM21.
- Performed in WP7: Lessons learnt; results consolidation and reporting. Relationships/needs: WP4, WP5, WP6 assessment. Deadline: PM24.
Recommendation
The FITMAN V&V Method and Assessment Packages (WP2) will be ready by PM6; the indicators at Smart, Digital and Virtual Factory level are defined by PM8, while the actual measurement starts at PM13. The time between the finalization of WP2 and the start of the measurement activities should be used efficiently to increase the preparedness of the UC trial human resources for the assessment and measurement. In this context, to facilitate the overall FITMAN activities, it is recommended that Task 2.4 is prolonged until PM12 (at least) to take care of building preparedness for the UC trials and training of the UC human resources.
5.2. V&V Checklist per stakeholder
This section lists the groups of stakeholders and the individual stakeholders that are involved in the V&V assessment process. For each stakeholder, a list of items to be checked is defined; for each item in the checklist, the timing and responsibility are also defined. The content, i.e. the items on the checklist, depends on and takes input from the methodologies defined in the previous chapters.
For the industrial partners, the checklist will be an integrated part of the assessment package. The checklist is instantiated for each use case trial in Task 2.4. For other stakeholders, the checklist will be a tool for managing the V&V assessment process.
5.2.1. Stakeholders involved in the V&V assessment process
The stakeholders of the FITMAN V&V process have been defined through roles profiling
according to the chosen FITMAN V&V methodology. The stakeholders, i.e. the actors in the
assessment are the following:
Table 5-3: Stakeholders of the FITMAN V&V Method

Trials
Definition: Smart Factory trial partners: TRW Automotive, Whirlpool, COMPlus, Piacenza. Digital Factory trial partners: Volkswagen, Consulgal, Agusta Westland, AIDIMA. Virtual Factory trial partners: Geoloc, TANet, A.P.R.
Role description: FITMAN's core stakeholders are the 11 trials. The trials' representatives are the customers, who state the business needs and translate those needs into trial visions. They are also the main actors in assessing the business benefits of the solutions.

Trial Solution Owners
Definition: Support organisations for each trial. Smart Factory: Innovalia, Politecnico di Milano, Softeco, Fraunhofer Institute. Digital Factory: Fraunhofer IPK, TXT e-Solutions, Universitat Politecnica de Valencia, Uninova. Virtual Factory: University Lumiere Lyon 2, Coventry University, University Bordeaux 1.
Role description: The Trial Solution Owner is responsible for ensuring that the integrated solution delivered to a trial is in alignment with the trial's vision.

Sprint Master
Definition: Software developers; SE provider; Trial Specific Component provider; interoperability service providers; platform & infrastructure providers; WP3.
Role description: In the context of FITMAN, the Sprint Master is held accountable for: the communication of the development team inside-out and vice versa, ensuring that the team is not distracted and is focused on the tasks at hand to deliver value in a short time-box; following the rules of agile and ensuring that each sprint/iteration and the whole process are performed as planned and in line with the methodological directions; the protection, mentoring and guidance of the Development Team.

Product Owner
Role description: The role of Product Owner in the context of FITMAN is associated with: ensuring that the sprint team delivers value to the trials and that the trial business needs are addressed with the Product Backlog; defining, creating and updating the prioritized feature list and user stories of the product backlog.

Development Team
Role description: The Development Team is in charge of delivering shippable extended product versions per sprint. In view of the FITMAN V&V-oriented scope, the Development Team is also responsible for the Phase I (development phase) verification and testing activities.

External stakeholders / community-based view
Definition: FI-PPP stakeholders (including Phase I and II projects) and the general public.
Role description: FITMAN aspires to collaborate with the other FI-PPP projects. Thus, the FI-PPP stakeholders (including Phase I and II projects) and the general public are also potential actors which can support the crowd-based V&V approach. This activity is of course dependent on the interest of the FI-PPP stakeholders.
In addition to the above-mentioned stakeholders, the V&V Assessment Support (for the methodology, indicators, assessment package and instantiation of the assessment package) can also be seen as a stakeholder of the FITMAN V&V assessment.
5.2.2. Checklists
The checklists include a general part and a stakeholder specific part. They are also divided
into V&V phases which are the following:
Phase I of the V&V Methodology: Product Specific
Phase II of the V&V Methodology: Trial Specific
The checklists are also divided into V&V Methodology steps where necessary. The steps are:
Code verification - P1
Model verification - P2
Backlog verification - P3
Release Verification - P4
Product validation - P5
Trial solution validation - T1
Business Validation - T2
An example of a Phase I “Product Specific” checklist for trial solution owners concerning
GEs is presented in the following table:
Table 5-4: Phase I "Product Specific" Checklist
Checklist for trial solution owners, Phase I, Steps P-1 to P-5 / GEs, SEs, TSCs
- Item/Topic: GE selection. Detailed action: selection of the GEs to be used in the trial. Timing: Months 1-4. Responsible: Trial solution owner.
- Item/Topic: Phase I V&V decision for GEs. Detailed action: decision about the application of the Phase I assessment (optional). Timing: Month 3. Responsible: Trial solution owner.
- Item/Topic: FI-WARE platform instantiation for FITMAN. Detailed action: GE application and validation; steps P-4 to P-5. Timing: Months 5-6. Responsible: Trial solution owner, development team.
- Item/Topic: Phase I V&V decision for SEs. Detailed action: decision about the application of the Phase I assessment and the steps included (optional). Timing: Month 6. Responsible: Trial solution owner.
- Item/Topic: Design and development of SEs. Detailed action: SE development and Phase I V&V steps P-1 to P-5. Timing: Months 7-8. Responsible: Development team, trial solution owner.
- Item/Topic: Crowd inclusion in the V&V. Detailed action: decide if and how to include the crowd. Timing: Months 7-8. Responsible: Development team, trial solution owner.
An example of a Phase II “Trial Specific” & Steps “Trial solution validation” and “Business
validation” checklist for trial industrial partners is presented in the following table:
Table 5-5: Phase II "Trial Specific" & Steps "Trial solution validation" and "Business validation" Checklist
Checklist for trial industrial partners, Phase II, Steps T-1 to T-2
- Item/Topic: Select persons. Detailed action: the involvement of end users is preliminarily defined and the persons are selected. Timing: Month 4. Responsible: Trial support organization.
- Item/Topic: FITMAN V&V instantiation. Detailed action: first selection of business and technical indicators, based on the FITMAN KPI methodology (in D2.2); input to the Trial handbook. Timing: Month 4. Responsible: Trial support organization.
- Item/Topic: Trial-specific indicator definition. Detailed action: final indicator definition; Trial handbook update. Timing: Month 8. Responsible: Trial support organization.
- Item/Topic: Extension of the main assessment group. Detailed action: persons from department X (R&D, purchasing, sales, after sales, training etc., according to the trial focus); decision on the type and method of crowd involvement. Timing: Month 8. Responsible: Trial support organization.
- Item/Topic: Guidance of the assessment group. Detailed action: guidance and training as needed. Timing: Months 8-12. Responsible: Trial support organization.
- Item/Topic: V&V assessment Phase II. Detailed action: run the experimentation with the trials; collect data; end-user and stakeholder involvement in the assessment. Timing: Months 13-21. Responsible: Trial support organization.
- Item/Topic: Reporting. Detailed action: consolidation of the assessment results; input to the Trial handbook. Timing: Months 21-24. Responsible: Trial support organization.
The checklists will be further updated and completed in line with the methodology updates and according to the Trial handbook. This will be done in the V&V assessment package (D2.3).
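As an illustration only, the checklist items above could also be captured in a simple structured form when the checklists are instantiated per trial in Task 2.4. The following Python sketch (hypothetical structure, with example entries abridged from Table 5-5) shows one possible encoding with the timing, responsible actor and completion status per item.

from dataclasses import dataclass

@dataclass
class ChecklistItem:
    topic: str            # Item / Topic
    action: str           # Detailed action
    timing: str           # e.g. "Month 13-21"
    responsible: str      # e.g. "Trial support organization"
    done: bool = False    # the tick column

phase2_checklist = [
    ChecklistItem("FITMAN V&V instantiation",
                  "First selection of business and technical indicators; input to Trial handbook",
                  "Month 4", "Trial support organization"),
    ChecklistItem("V&V assessment Phase II",
                  "Run the experimentation with the trials and collect data",
                  "Month 13-21", "Trial support organization"),
]

def open_items(checklist: list[ChecklistItem]) -> list[str]:
    # Topics of all items that have not yet been completed.
    return [item.topic for item in checklist if not item.done]

phase2_checklist[0].done = True
print(open_items(phase2_checklist))   # ['V&V assessment Phase II']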
5.3. V&V Decalogue for the Trials
On the basis of the trial scenarios as studied and elaborated by the project team and as
reflected in the interaction with their stakeholders, a set of trial implications has been derived.
Such implications have been formulated into the following concrete recommendations (“the
decalogue of V&V”) that should be taken into account by the trials, their support teams and
generally all stakeholders when initiating similar endeavours.
This set of recommendations is addressed not only to the trials, but also to the development teams, which should all work together in a collaborative manner towards delivering effective, value-adding applications and systems that advance manufacturing solutions. With this audience in mind, the complete set of recommendations (characterized as the "decalogue") is presented with the aim of infusing a very practical and applicable philosophy of the V&V steps into all stakeholders. It is crucial for all of them to understand and acknowledge all recommendations for a complete trial. Such a mutual understanding will allow more fruitful collaborations in the future and more result-oriented activities, where both parties will be able to comprehend the V&V requirements and the work carried out by each set of actors.
As such, the recommendations presented below refer to all steps of the V&V method and target the various V&V teams.
Trial Recommendation 1. Have a deep knowledge of the requirements and the ends
sought in each trial.
Within each trial’s environment, business and technical requirements represent what must be
delivered to provide value. In order to proceed to effective verification and validation
activities, all stakeholders involved need to be familiar with the requirements that have been
expressed as well as with the objectives pursued with each trial solution and product.
In this direction, concrete requirements and specifications have a crucial role in V&V,
especially when demonstrating the following attributes (Rakitin, Software Verification and Validation for Practitioners and Managers 2001): Unambiguous (one and only one interpretation
of each requirement); Complete; Verifiable (existing process through which a human or a
machine can verify that the component correctly implements the stated requirements);
Consistent; Modifiable; Traceable; and Usable.
Trial Recommendation 2. Infuse a V&V culture into your trial.
In order to attain the maximum impact within a trial, a V&V-rooted culture needs to be
fostered. All stakeholders need to become aware of the potential benefits through appropriate
communication and dissemination activities that will contribute to viewing the V&V activities
under a new perspective, especially taking into account that they are typically considered as
time and effort consuming. Successful V&V activities for a trial solution eventually lead to
significant added value and benefits for a trial that would be impossible to get otherwise, thus
stakeholders with hands-on experience on V&V are often the best ambassadors of this culture
in an organization.
Trial Recommendation 3. Create a concrete V&V plan to be followed in each step,
tailored to your real needs and competences.
Since building a generic V&V plan to cover all aspects for each trial and product is
impossible and customization according to the specific needs is the most recommended way
to go, concrete plans should be collectively developed. Such plans should adhere to the
concepts and principles of the techniques applied in each step of the V&V method, while
being characterized as feasible for the constraints and limitations of the trials. It is expected
that a V&V plan should be periodically revised to incorporate feedback received by the
stakeholders in each expected iteration of the trial solution.
Trial Recommendation 4. Engage from the very beginning the required stakeholders,
assigning them with appropriate responsibilities and decision power.
In order to consider a V&V step implementation successful, a wide range of stakeholders needs to be involved at various engagement levels: from active, everyday engagement to merely periodic feedback. Involving all stakeholders in the appropriate roles based on their expertise and background is a time-consuming task that should not be underestimated. To this end, the necessary roles in each V&V step need to be explicitly outlined and the associated responsibilities should be put into effect from the very beginning.
Trial Recommendation 5. Tap the power of data to effectively conduct V&V activities.
No matter how well-defined or detailed a technique is, high-quality data represent the "holy grail" of V&V. Particular attention thus needs to be given to collecting, filtering, curating and intelligently tapping data available from multiple sources, e.g. through the enterprise back-office systems, data destined for commercial purposes, or even data available from open data initiatives. As current validation cases typically struggle to cope with too much or too little data of the appropriate level and frequency to calculate the appropriate indicators, reliable data sources need to be foreseen from the very beginning and incorporated in a real-time manner to allow for pragmatically informed judgments.
Trial Recommendation 6. Proceed with each V&V step on the basis of at least one recommended technique.
Based on the detailed guidelines for each step of the FITMAN V&V method available in the FITMAN V&V package, each trial is in a position to proceed with the required verification and validation activities. Trials should generally not skip any step of the V&V method that is required for the trial solution under development, and they should follow the right order of steps and not halt at any intermediate step until they reach the Business Validation step. Taking into account multiple criteria (such as the availability and expertise of the personnel), each trial should also select an appropriate mixture of techniques to be applied.
It needs to be noted that self-declaration of conformity to a technique may be applied along the steps of the V&V method, especially for Release Verification (P-4), Backlog Verification (P-3), Model Verification (P-2) and Code Verification (P-1).
Trial Recommendation 7. Keep the balance between business and technical V&V
aspects.
In the scope of FITMAN, achieving the business intentions of the manufacturing enterprises that comprise the trials goes hand in hand with the need to assess the openness, versatility and applicability of a set of software products (either the FI-WARE Generic Enablers or the FITMAN Specific Enablers and Trial Specific Components). The technical aspects may seem limited to high-level verification and validation activities, yet their importance in examining whether the product satisfies its intended use and user needs, and in determining whether the requirements of the final product release are met, should not be underestimated. Ensuring that the verification and validation activities balance the inherent business and IT perspectives is thus a non-trivial task that should be undertaken by the FITMAN stakeholders, trials and V&V teams.
Trial Recommendation 8. Invest in crowd-sourcing techniques aligned with the trial philosophy.
Capitalizing on the technology readiness and the experience gained in the Web 2.0 era, engaging the crowd has the potential to add significant value to the V&V activities. Interacting with a large number of stakeholders during verification and validation can indeed help in thinking "out of the box" and in identifying potential problems, misunderstandings and malfunctions that have not been projected or thought of. In addition, crowd-sourced V&V can also aid the wider adoption and dissemination of the trial solution within the organization.
Trial Recommendation 9. Keep a complete V&V log and document in detail the V&V
findings.
In order to evaluate and ameliorate the underlying V&V methodology itself, it is important to
communicate the actual/ real-life results of the verified/ validated/ evaluated trial solutions/
Specific Enablers/ Generic Enablers back to the teams engaged in the V&V process. In
addition, a detailed documentation of the findings/ results can act like a guide towards
detecting the causes of potential malfunctions, as well as avoiding the same mistakes in future
similar initiatives.
Trial Recommendation 10. Treat V&V as a continuous, iterative procedure.
The V&V activities should be treated as an intangible, continuously exploitable asset of the organisation under a long-term perspective. They should not represent a one-off effort before releasing a trial solution, but rather the commitment of the trial to measure a set of Key Performance Indicators over a certain period of time.
5.4. Recommendations for the V&V Package
5.4.1. Objectives and definition of FITMAN V&V package
The FITMAN DoW includes the following description: “FITMAN Verification & Validation
Assessment Package includes all the necessary and optional indicators and their metadata:
link to criteria, descriptions and guidance etc. The package supports the application of the
indicators, documentation and visualisation of the assessment results, and context-based
comparison with other use case trials. The package will be self-explanatory, easy to use and a
documented tool and it can easily be instantiated for each respective Use Case Trial in Task
2.4. The layers of the architecture; the single Enterprise level, the Ecosystem level and the FI
level will be used. “
The package is developed in WP2, T2.3 and instantiated for each of the Use Case Trial in
T2.4:
“The Use Case Trial scope and specific requirements and environment conditions will affect
the each instantiation of the package. In addition to the necessary indicators relevant to all
the optional indicators are reviewed and selected. The Instantiation of Package will be
provided to the Use Case Trial to support and document the assessment.” (DoW)
Thus, the objective of the FITMAN V&V Assessment Package is to offer holistic support to the FITMAN trials in the V&V assessment and indicator measurement. It is recommended that this be the main focus of the FITMAN V&V package: the package will focus on Phase II, i.e. trial-specific technical and business indicators. It will not cover the (optional) software development Phase I V&V activities described in this deliverable. The following definitions are proposed:
The FITMAN generic V&V package consolidates all the generic information regarding the FITMAN criteria, methodologies and guidelines, and technical and business indicators, and supports the FITMAN trials in the FITMAN V&V assessment process and result analysis.
Instantiation of the FITMAN V&V package means the selection of the criteria,
metrics and methodologies for a specific FITMAN trial and configuration of the
package to apply the selection.
The deliverables of WP1 (and the Trial handbook as an unofficial, live document
being drafted and updated during the FITMAN implementation) will have a role in the
instantiation. This will be better reflected after the first deliverables have been
developed. These definitions will be further updated after the delivery of the V&V
assessment package (M4).
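To make the notion of instantiation more concrete, the following sketch (Python, with hypothetical field names and purely illustrative placeholder values) shows one way a trial-specific selection of criteria and indicators from the generic package could be recorded; the actual structure of the package and of its instantiation will be defined in D2.3 and T2.4.

# Illustrative, hypothetical instantiation record for one trial; the enabler and
# indicator entries below are placeholders, not actual FITMAN selections.
trial_instantiation = {
    "trial": "T4 - Whirlpool",
    "objectives": ["Improve the communication effectiveness along the help chain"],
    "generic_enablers": ["<GE name>"],      # to be filled per trial
    "specific_enablers": ["<SE name>"],
    "criteria": ["Make Cycle Time", "People Empowerment"],
    "indicators": [
        {"name": "<indicator name>", "unit": "<unit>", "target": None},
    ],
}

def selected_indicator_names(instantiation: dict) -> list[str]:
    # Names of the indicators selected for this trial instantiation.
    return [ind["name"] for ind in instantiation["indicators"]]

print(selected_indicator_names(trial_instantiation))   # ['<indicator name>']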
5.4.2. Recommendations for FITMAN V&V package functions
This chapter gives recommendations for the FITMAN V&V package functions. The recommendations have been developed in the timeframe of D2.1, that is, in parallel with the development of the FITMAN V&V methodology and the KPI methodology. Because of this, and because of the very tight timeframe, not all the recommendations presented in this chapter will necessarily be implemented in the V&V assessment package. The decisions will be made in D2.3 (M4).
In this chapter, it is recommended that the FITMAN V&V assessment package focus on the trials, i.e. the Phase II assessment: the evaluation of business and technical indicators. Thus the background assumption here is that the FITMAN V&V assessment package covers Phase I (the software development phase methods) only partially, namely during Step P-4: Release Verification and Step P-5: Product Validation. Phase I is optional and in many cases needs to be
kept to a minimum because of the tight schedule and because the software is not developed from scratch but rather adapted from existing components; the degree to which Phase I will be covered will therefore be decided in due time.
To offer holistic support to the FITMAN trials, the following potential functional requirements for the package have been identified (draft):
1. Integrated information package
2. Support for trial-specific instantiation
3. V&V assessment support and guidance
4. Data input and collection
5. Aggregation and analysis of data to calculate indicators
6. Trial result visualization
7. Documentation
8. Assessment status indication individually per trial and consolidated at FITMAN level.
9. Search functions
The functions are described in more detail below:
1. Integrated package of generic information: The FITMAN V&V package should
describe all the information regarding the criteria, methodologies and indicators as
defined in tasks T2.1 and T2.2. This includes:
- definition of FITMAN V&V glossary
- definition and description of all the FITMAN criteria
- definition and description of all indicators and metrics (performance indicators,
technical, non-technical indicators, qualitative and quantitative indicators),
including the formula for calculation, used units, data requirements and example
of use.
- description and guidance of the methodologies for the applicable steps of the V&V
method
- description and guidance of the methodologies for collecting the information and
calculating the indicators; including the stakeholder participation, number of
evaluators, methods for data collection, assessment scope and temporal
requirements (as-is, to-be, target values…). Different indicators may have different
methodologies.
2. Support for trial-specific instantiation. The FITMAN V&V package is used to
instantiate the generic criteria and indicators for a specific trial. This includes:
- setting up the instantiated V&V package with trial basic description: trial name,
objectives, list of used GEs and SEs (whenever needed)
- selection of criteria from a pre-defined set, adding new criteria if needed
- offering the generic indicators for selection: partly based on the selected criteria, while some indicators are offered to all trials
- option for adding new trial-specific indicators, including indicator description
- saving the trial-specific selection for further usage
- setting up target values for the indicators (option)
- presentation of the selected indicators on the “trial dashboard”
3. V&V assessment support and guidance: The FITMAN V&V package should guide the
users through the different phases of V&V assessment (Phase II). This means offering
the indicator-specific or generic methodologies, depending on the selected criteria and
indicators, offering links to guidance material etc. Potentially also the V&V process
could be visualized with links to the tasks and material of each phase.
4. Data input and collection; potential crowd inclusion. Different indicators require
different kinds of data which are collected from different types of sources. The
FITMAN V&V package should support the collection and input of data. (Currently
the indicators have not been defined; thus the data requirements are not yet known).
There are different options for data input/ collection:
- methods for the input of existing data (collected elsewhere), which is simply fed into the FITMAN V&V package
- data based on measuring in trials; manual input of performance values etc.
- expert opinions and assessment; collection of the same data items from several
experts
- questionnaires (for example web queries) or other evaluations based on crowd
sourcing.
5. Aggregation and analysis of data to calculate indicators (a minimal aggregation sketch is given after this list). The aggregation is needed:
- to calculate the indicators from raw data using the defined formula
- to aggregate values given by different experts/ evaluators at the trial level to
receive the mean value, median and distribution (also crowd approach possible)
- to analyse the values against target values, or their development/ change over time (if possible)
- overall consolidation: to aggregate values of the same items and scope from
different trials measuring the same indicators, for example for same GEs etc.
6. Trial result visualization: The objective of visualization is to offer the possibility to
understand and interpret the trial results, also through comparison:
- At the trial level the V&V assessment results should be visualized and allow
comparison for example between different user/ expert groups, different time
points or to target values. The visualization may depend on indicator type.
- At the FITMAN level the visualization should support comparison between trials,
trial groups (smart, digital, virtual), architecture levels or between GEs.
7. Documentation. The FITMAN V&V package should document all the raw data, the query results etc., and the process itself: which methods were used, who participated in the assessment, when it was performed etc., in order to allow tracing back.
8. Assessment status indication: This allows the identification of the status (the phases
completed or currently active) of the trial assessment both to support the trial
management and the use case management at the FITMAN level:
- The trial dashboard should visualize the current state of the trial assessment: which
phases have been completed, which stakeholders have completed their assessment
etc. (depending on the phases and methodology).
- The FITMAN level dashboard should give an overview of the V&V status of all the trials and offer links to the trial dashboards. It should also offer
consolidation and comparison functionalities for benchmarking among trials.
9. Search functions. The V&V package could include a search function to search all
FITMAN V&V assessment information to identify for example: which trials have
assessed which GEs, which trials have used specific indicators, which trials have
reached their target values etc.
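As initial guidance for function 5 above (and for the benchmarking mentioned in function 8), the following minimal sketch illustrates how values collected from several evaluators for one indicator could be aggregated into a mean, a median and a distribution, and then compared against an optional target value. The indicator, evaluator names, scores and target are purely hypothetical; the actual formulas and methodologies will be defined in D2.2 and D2.3.

from statistics import mean, median

# Hypothetical scores for one qualitative indicator (1-5 scale), collected from
# several evaluators of a single trial; the real data requirements come from D2.2.
scores = {"evaluator_1": 4, "evaluator_2": 3, "evaluator_3": 5, "evaluator_4": 4}
target = 4.0

values = list(scores.values())
aggregated = {
    "mean": mean(values),                                   # trial-level mean value
    "median": median(values),                               # trial-level median
    "distribution": {v: values.count(v) for v in sorted(set(values))},
    "target_reached": mean(values) >= target,               # comparison against the target
}
print(aggregated)   # mean 4, median 4.0, distribution {3: 1, 4: 2, 5: 1}, target reached

The same aggregation could be repeated per trial and then consolidated across trials measuring the same indicator (for example for the same GEs), which corresponds to the overall consolidation mentioned above.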
The requirements above are recommendations and will, in the future, be checked again against the defined assessment methodologies. It is clear that not all functions can be implemented within the short timeframe, so the most important ones need to be selected. This also depends on how the V&V package is implemented.
5.4.3. Other recommendations
The FITMAN V&V package should support the V&V Trial specific assessment (Phase II)
and potentially part of the V&V Product specific assessment (Phase I, Steps P-4 and P-5) in
all FITMAN trials. Thus it should be easy for the end users to use and should not impose additional workload for the assessment.
1. Simple, self-explanatory, easy to use tool: The package should be as simple as
possible to use. It should show clearly the different V&V sections (technical, non-
technical, requirement verification, performance). Guidance material, for example
through an example, should be available.
2. Easy to take into use and instantiate: The tool should not require any software installations, nor depend on a specific non-standard environment. From this point of view either a web-based or an Excel-based tool is recommended.
3. On-line and/or off-line usage: On the one hand, a web-based tool has advantages: it is available on any machine with an Internet connection, it can collect all the results into a common storage and thus does not require additional transfer of data from one computer to another. On the other hand, the requirement of an Internet connection is a restriction, as the Internet might not always be available in the trial assessment situation.
4. Assessment data openness. This is a question that needs to be discussed and answered
in T2.3. It is also dependent on the data contents in the Trial handbooks and how open
they are within the FITMAN consortium or outside FITMAN. Full openness of data
(both raw data and indicators and metadata) will probably not be accepted. The
indicator level results are public through deliverables D4.3-4; D5.3-4 and D6.3-4.
5. Modular Implementation/ Operation. As also mentioned in the previous sections, the project's Trials may not have the same needs at the same points in time; it is also possible that main parts of the FITMAN V&V Methodology might not be utilised/ implemented at all within the timeframe of the project. Thus, the ability to implement modular parts of the methodology/ package (without dependence on the previous/ next ones) can be considered an important prerequisite.
6. Conclusions & Next Steps
The purpose of the deliverable at hand (D2.1 - FITMAN Verification & Validation Method
and Criteria) has been to provide FITMAN with the appropriate methodology in order to:
1. Verify that the FI-WARE generic and FITMAN specific enablers (as well as Trial
Specific Components) satisfy the technological, platform and architectural integration
requirements imposed
2. Validate that the FI-WARE generic and FITMAN specific enablers (as well as Trial
Specific Components) satisfy the requirements of Smart-Digital-Virtual use case trials,
and
3. Identify the evaluation and assessment criteria to be used in all Use Case Trials, taking
into account sustainability (STEEP) and future business benefits
According to the methodology set for this report, the following results have been achieved:
Extensive analysis of the underlying state of the art in verification, validation and
evaluation in order to provide recommendations and key take-aways that can be
summarized along the following lines:
o KT-A.1: Active On-going Research on the fields of V&V throughout the years
o KT-A.2: A Plethora of Tools and Techniques without clear distinction between
V&V and Evaluation
o KT-A.3: No “Universal” or Holistic V&V Approach covering any potential need
o KT-A.4: Prevalence of the Waterfall model in software V&V
o KT-A.5: Limited Crowdsourcing Aspects
In addition, a state of the art analysis in definitions and terms relevant to the
methodology’s scope was also conducted to reach consensus on a common glossary.
Formulation of an integrated V&V methodology in order to address the needs of all
eleven (11) FITMAN Trials both from an IT and a Business perspective. The
methodology formulation included work on:
o Roles’ identification featuring: Trial Team, Trial Supporting Team, Product
Owner, Development Team, Sprint Master, Sprint Team.
o V&V Steps’ identification and reporting, including mapping between
individual steps and stakeholders’ categories, proposition of appropriate
techniques and/ or tools, as well as of innovative crowd engagement methods
for each step.
o Mapping of the FITMAN V&V Methodology to the FITMAN reference
architecture. The steps following the formulation of the integrated FITMAN
V&V methodology are closely coupled with the GEs, SEs and TSCs that will
be implemented, validated and evaluated in the course of the project’s Trials.
The correspondence between each process in the lifetime of the GEs/ SEs/ TSCs and the developed V&V Methodology has been described in detail and has to be followed in order to lead to pragmatic and useful results.
The V&V method is summarized in the following table; a small illustrative test sketch is provided after the table.
Activity | Perspective | Recommended Technique | FITMAN Stakeholders Addressed | Crowd Engagement Method | Enablers Concerned
Code Verification | Product Specific | White Box Testing | Development Team | - | SEs, TSCs
Model Verification | Product Specific | Traceability Analysis | Development Team, Sprint Master | Physical or online workshops; Readily-available prototyping applications; Social deliberation platforms | SEs, TSCs
Backlog & Release Verification | Product Specific | Regression Testing | Development Team, Sprint Master, Product Owner | Deliberation and feedback tools for ex-ante crowdsourcing; Dedicated IT tools for ex-post crowdsourcing | GEs, SEs, TSCs
Product Validation | Product Specific | Black Box Testing for Validation | Product Owner | Online deliberation and feedback tools; Open calls for testing scenarios/ trials; Workshops and specific working groups; Traditional online survey tools; Social networks | GEs, SEs, TSCs
Trial Solution Validation | Trial Specific | User Acceptance Testing | Trial Solution Owner, Trial Team | Enterprise collaboration environments; Traditional methods (like questionnaires) | Integrated Trial Solution
Business Validation | Trial Specific | Simplified ECOGRAI Methodology for decisional models | Trial Team | Open collaboration and innovation platforms; Online deliberation and feedback tools; Physical and online workshops; Traditional online survey tools; Game-like applications; Social networks | Integrated Trial Solution
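To illustrate the kind of concrete artefact behind product-specific techniques such as White Box Testing and Regression Testing in the table above, the sketch below shows a minimal automated test in Python. The function under test and its expected behaviour are purely hypothetical and merely stand in for a GE/ SE/ TSC function; the actual test suites will be defined by the Development Teams of the trials.

import unittest

def compute_throughput(tasks_completed, hours):
    # Hypothetical helper standing in for a component function under verification.
    if hours <= 0:
        raise ValueError("hours must be positive")
    return tasks_completed / hours

class ThroughputRegressionTest(unittest.TestCase):
    # Re-running such a suite after every change is the essence of regression testing.
    def test_nominal_value(self):
        self.assertAlmostEqual(compute_throughput(120, 8), 15.0)

    def test_invalid_duration_is_rejected(self):
        with self.assertRaises(ValueError):
            compute_throughput(120, 0)

if __name__ == "__main__":
    unittest.main()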
Definition of several sets of criteria covering all the verification and validation steps
described in the methodology formulated in the previous step. The V&V criteria of
FITMAN (described in detail via a properly designed description template) were also
divided into two categories: Business Validation criteria and IT/ Technical V&V
criteria. Such criteria are coupled with the methodology steps and constitute the basis for extracting proper indicators in the context of the D2.2 work. The categorisation and the whole concept support the extraction of BPIs and cover business generic and specific aspects as well as IT aspects. Business generic criteria categories include, indicatively, customer reliability, agility etc., while the trial-oriented specific criteria concern cost, innovation, sustainability and so on. The IT criteria are also thoroughly studied and classified into categories, such as FITMAN-relevant ones (including openness and versatility) and generic ones (maintainability, usability etc.).
Definition and reporting of a set of concrete, practical guidelines towards all actors
involved in the FITMAN V&V methodology application. The guidelines aimed
towards:
o Addressing the timing perspective of the methodology application
o Providing an indicative (yet not exhaustive) V&V Checklist per stakeholder
o Defining the “V&V Decalogue” for trials with guidance for the application of
the FITMAN V&V method.
o Elaborating on general recommendations of various natures for the FITMAN
V&V package.
With regard to future steps, the primary imminent step upon the completion of the FITMAN
V&V Methodology will be the formulation of the FITMAN V&V Assessment Package. This
package will consolidate the developed FITMAN V&V Methodology, the assessment criteria,
as well as the identified IT and business indicators. It will constitute the main tool to be
utilised by the project’s Trials, as it will be parameterised and instantiated for each respective
Trial.
Based on the results deriving from the application of the FITMAN V&V Methodology, the
project’s Trials will update their operation in order to ameliorate/ fine-tune the projected/
achieved results. In addition, the methodology itself might be updated/ adapted in order to fit possible specialised needs that were not foreseen in the early steps. The FITMAN team will closely monitor these specific procedures and provide additional recommendations and support where considered necessary.
Additionally, upon completion of the V&V Methodology application, an overall/ sum-up analysis could be derived (e.g. in the form of a "best practices package"). This analysis could incorporate lessons learnt on methods/ approaches/ tools (or combinations thereof), either at the trial level or at a generic one.
Last but not least, the FITMAN V&V Methodology was itself developed in an agile and dynamic way. Thus, if it is intended in the future to serve a purpose different from the one specified for the needs of the FITMAN Trials, the methodology could be properly altered/ adapted in order to fit the differentiated needs.
7. Annex I: Definitions and Terms
Key Term Definition
Criterion A benchmark, standard, or yardstick against which
accomplishment, conformance, performance, and suitability
of an individual, alternative, activity, product, or plan, as well
as of risk-reward ratio is measured (Business Dictionary
2013)
A standard on which a judgment or decision both at business
and at IT level may be based. Each criterion should be clearly
defined to avoid ambiguity in understanding and prioritizing
the differing views that affect a decision or an assessment
Often there are several potential Action or Decision Variables to reach an objective. A criterion is an entity which allows one to choose the right DV or AV.
A function to optimize, allowing one to choose the DV for reaching the goal (Ducq 1999)
FITMAN scope:
When referring to Verification and Validation the term
criterion refers to the diverse parameters that have to be
examined and assessed during the V&V process.
Examples:
‘Stability’ is a typical IT criterion to be examined when
considering software Verification.
‘Inventory Optimization’ is a typical business criterion,
when considering business Validation.
When referring to the Business Performance Indicators methodology, a ‘criterion’ allows one to choose the right variable for measuring the performance of a system.
Example:
In a workshop, the objective is to smooth the load of the machines. Consider there are four potential DVs/AVs: to subcontract the load, to hire new employees, to negotiate the delivery date with the customer, or to ask the employees to work overtime. In such a case a criterion could be: ‘choose the most economical solution’.
V&V Indicator State of the art:
Quantitative or qualitative factor or variable that provides a
simple and reliable means to measure achievement, to reflect
the changes connected to an intervention, or to help assess the
performance of a development actor.
An estimate or evaluation of specified attributes derived from
a model with respect to defined information needs
(International Organization for Standardization 2007)
A measure that is derived from other measures using an
analysis model as measurement approach (Standardization
2000)
FITMAN scope:
Individual quantitative and/ or qualitative measures that provide answers as to whether the V&V criteria are met. A complete set of
indicators is required in order to assess the extent to which
objectives have been achieved on the basis of the defined criteria.
Quantitative indicators relate to changes in numbers (e.g., the
number of users), whereas qualitative indicators relate to changes
in perception (e.g., the opinion of users). Indicators need to be
based on a clear idea of which stakeholder group(s) they will be
applied to, and to be feasible both technically and financially.
Example:
‘Throughput’ (a measure of the estimated number of tasks that
can be performed over a unit of time) is a typical indicator used
in software V&V
Key Performance
Indicators (KPIs)
State of the art:
KPIs show what needs to be done in an internal operative
perspective. These indicators focus on the parts of an
organization’s performance that are the most critical to
success, both at present and in the future. A good KPI affects a number of critical success factors. It also affects other KPIs in a positive manner (Parmenter 2007)
A Key Performance Indicator is an indicator which allows one to define the performance of a system at the strategic level.
FITMAN scope:
A Key Performance Indicator (KPI) refers to a Performance
Indicator that evaluates the global situation. KPIs can be used to
evaluate defined Key Success Factors.
Example:
‘Number of defective products sold in Year N/ N-1’
Key Success Factors State of the art:
They are the strategic elements that an organization must
control to ensure its permanence in the competition. They
focus on the changes that the organization must follow
according to the change of the external environment (Iribarne
2006)
They represent the stakes in success especially with the
clientele (Garibaldi 2001)
FITMAN scope:
The Key Success Factors (KSF) are internal or external elements
that an organization can control to reach its objectives. They can
relate to products (quality), organization (skill), customers
(satisfaction) etc.
Example:
‘Quality’ is a Key success factor for customers’ satisfaction
Mission State of the art:
A Mission of an organization characterizes and identifies the
reasons for existence (Camillus and Beall 1997)
A Mission captures the overriding purpose of an organization
in line with the values and expectations of stakeholders and
should typically answer the questions: “what business are we
in?” (Johnson and Tjahjono 1993) and “what is our business
for?” (Drucker 1973)
A Mission is the nature of the function or task that a company must assume or achieve as an organizational entity (e.g. a service delivery activity) (Iribarne 2006)
A Mission is the primary business or purpose of an
organization. It describes what an organization does, for
whom, and its benefit. The mission of an organization is not a
time-bound objective (Gates 2010)
FITMAN scope:
All aforementioned definitions refer to the nature of the purpose,
the activity or function of an organization which justifies its existence. Generally, the mission is invariant throughout the life cycle of the organization.
The mission can be defined as the primary business or purpose of an
organization. It describes what an organization does, for whom,
and its benefit.
Objectives State of the art:
The objectives reflect the mission of the organization and the
finalities which concretize the mission (Meleze 1972)
An objective is the result or the target that the system (Enterprise or Trial) controlled by the decision-maker has to achieve (Marcotte 1995)
An objective translates the intention of the decision-maker, in
a given decision-making frame, to pass from the state of
existing performance to the state of wished performance for
the controlled system. Thus, it constitutes a proactive
representation of the performance to be reached (Krohmm
1997)
FITMAN scope:
Objectives allow the definition of the results that the global company, or a part of the company (the trials), must reach. In fact, for a company, a set of objectives will be defined according to the functions, the processes or the services of the organization. In such a case, it is very important to check the coherence of the various objectives so that the global performance is improved. Each objective must contribute to the achievement of the global objectives.
Examples:
‘Improved end product quality’
‘Reduction of production cost’
‘Improvement of the image of the company’
Performance State of the art:
Performances in the enterprise are the results of actions which
contribute to reach the strategic objectives (Lorino 1996)
It results from the animation of a dynamic of generalized progress, relying on the deployment of an indicator system associated with objectives and levers, enabling one in a continual and systematic way to apprehend the situation of the moment, to visualize the layers and to light the way to be traversed and
finally to evolve while measuring the progress made (Compétitivite 1997)
Performance is what the organization hires one to do, and do well; only actions which can be scaled, i.e. measured, are
considered to constitute performance (Campbell, et al. 1993)
The ability of an entity, such as a person, group or
organization, to produce results in relation to specific and
determined objectives (Laitinen 2002), (Lebas and Euske
2004)
FITMAN scope:
Performance is associated with objectives, actions and results to
make a system evolve and achieve its objectives. The
performance of an organization (enterprise, system, trial)
measures the evolution of this organisation towards its objectives,
under the influence of external or internal factors.
Performance Indicators State of the art:
A quantified data item which measures the effectiveness and/or
efficiency of all or part of a process or system in comparison
to a standard or a plan or a determined objective and accepted
in the frame of an enterprise strategy (Association Française
de Gestion Industrielle 1992)
Information that must help an actor, an individual or, more often, a group of actors, to define actions towards the achievement of an objective and must allow them to evaluate the results
(Lorino 1996)
A PI (Performance Indicator) is a quantified data item which measures the efficiency of the action variables or the decision variables of the decision makers and the degree of achievement of their objectives in the frame of the strategy of the enterprise (Doumeingts 1998)
An indicator which anticipates and measures condition changes or specific situations (Carneiro 2005)
FITMAN scope:
Three elements are very important to define a PI. The first element is the system (Enterprise, Trials) in which the PIs will be defined. The second element is the objective assigned to this system, and the last element is the action variable (or decision variable) to reach the objective.
A Performance Indicator can be defined as a quantified data item which measures the efficiency of action variables or decision variables, in the frame of achieving the objectives defined for this system.
Process A process is a set of correlated or interactive activities which transform input elements into output elements (International
Organization for Standardization 2008)
FITMAN scope:
An enterprise, as a production system, is constituted by interactive, interdependent and interconnected processes, which are
themselves composed of activities which allow the system to reach its objectives.
Validation State of the art:
The process of providing evidence that the software and its
associated products satisfy system requirements allocated to
software at the end of each life cycle activity, solve the right
problem (e.g., correctly model physical laws, implement
business rules, use the proper system assumptions), and
satisfy intended use and user needs (IEEE 2012)
Process of evaluating a system or component during or at the
end of the development process to determine whether it
satisfies specified requirements (Tran 1999)
Confirmation by examination and through provision of
objective evidence that the requirements for a specific
intended use or application have been fulfilled (International
Organization for Standardization 2008)
FITMAN scope:
FITMAN adopts the IEEE definition as far as validation is concerned. In this context, the objective of validation is to ensure that the product actually meets the user's needs, that the specifications were correct in the first place, and that the product fulfills its intended
use when placed in its intended environment.
Verification State of the art:
The process of providing objective evidence that the software
and its associated products conform to requirements (e.g., for
correctness, completeness, consistency, accuracy) for all life
cycle activities during each life cycle process (acquisition,
supply, development, operation, and maintenance); satisfy
standards, practices, and conventions during life cycle
processes; and successfully complete each life cycle activity
and satisfy all the criteria for initiating succeeding life cycle
activities (IEEE 2012)
Act of reviewing, inspecting, testing, checking, auditing, or
otherwise establishing and documenting whether items,
processes, services or documents conform to specified
requirements (American National Standard 1978)
Process of evaluating a system or component to determine
whether the products of a given development phase satisfy
the conditions imposed at the start of the phase (ANSI/IEEE
1990)
Formal proof of program correctness (ANSI/IEEE 1990)
FITMAN scope:
FITMAN adopts the IEEE definition as far as verification is concerned. In this context, the objective of verification is to ensure that the product is being built according to the
requirements and design specifications.
Vision State of the art:
A view of a realistic, credible, attractive future for the
organization, a condition that is better in some important way
than what now exists (Bennis and Nanus 1997)
It defines the position expected by the organization in the long term, for a well-determined time horizon that needs to be specified and quantified (Iribarne 2006)
An ideal that an organization intends to pursue. It links the
organization to the future by articulating instantiations of
successful execution of the mission. It might, in fact, describe
what can be achieved in a broader environment if the
organization and others are successful in achieving their
individual missions (Gates 2010)
FITMAN scope:
The vision is an ideal future state that the managerial staff imagines for the organization, according to its activity and taking into account the opportunities and threats of the environment.
8. Annex II: Acronyms and Abbreviations
Abbreviation/ Acronym Definition
AV Action Variable
AHP Analytic Hierarchy Process
ANP Analytic Network Process
ANSI American National Standards Institute
BDD Behavior-driven development
BPI Business Performance Indicator
BSC Balanced Scorecard
C.P.M. Critical Path Method
CSFV Crowd Source Formal Verification
DEA Model Data Envelopment Analysis Model
DoW Description of Work
DV Decision Variable
EC European Commission
ENAPS European Network for Advanced Performance Studies
ESA European Space Agency
EU European Union
FI-PPP Future Internet Public Private Partnership
FMECA Failure mode effect and criticality analysis
GE Generic Enabler
GUI Graphical User Interface
ICT Information and Communication Technology
IEC International Electrotechnical Commission
IEEE Institute of Electrical and Electronics Engineers
ISO International Organization for Standardization
IT Information Technology
KPI Key Performance Indicator
KSF Key Success Factors
KT-A Key Take-Away
MTM Methods-Time Measurement
P.E.R.T. Program Evaluation and Review Technique
PDCA Plan – Do – Check - Act
PI Performance Indicator
PM Project Month
QA Quality Assurance
SCOR Supply Chain Operations Reference
SE Specific Enabler
SMT Satisfiability Modulo Theories
SOP Standard Operating Procedure
SotA State of the Art
SQuaRE Systems and software Quality Requirements and Evaluation
STEEP Social – Technological – Economical – Environmental -
Political
SWEBOK Software Engineering Body of Knowledge
TAM Technology Acceptance Model
TELOS Technical, Economical, Legal, Operational and Scheduling
TOPSIS Technique for Order of Preference by Similarity to Ideal
Solution
TRL Technology Readiness Level
TSC Trial Solution Component
UAT User Acceptance Testing
V&V Verification and Validation
VAR Value at Risk
VCOR Value Chain Operations Reference Model
WP Work Package
XP Extreme Programming
9. Annex III: References
Abdia, R., and A. Labibb. 1 August 2010.
http://www.tandfonline.com/doi/abs/10.1080/00207543.2010.520989.
acunetix. n.d. http://www.acunetix.com/.
AlPiNA : an Algebraic Petri Net Analyzer. n.d. http://alpina.unige.ch/.
American National Standard. “ANSI/ASQC Standard A3-1978 "Quality Systems
Terminology".” 1978.
Amin, Md. Al, and M. Azharul. Karim. 2011. http://eprints.qut.edu.au/43991/.
Andreeva, N. 2009. http://www.tu-sofia.bg/faculties/mf/adp/nntk_files/konf-
09/Materials/NAPRAVLENIE_8/2_N.Andreeva.pdf.
ANSI/IEEE. IEEE Standard Glossary of Software Engineering Terminology, ANSI/IEEE Std
610.12-1990. 1990.
AppTaster. 2013. https://itunes.apple.com/us/app/apptaster-play-mockups-
prototypes/id518977767?mt=8 (accessed July 9, 2013).
Arkley, P., and S. Riddle. “Overcoming the traceability benefit problem.” 13th IEEE
International Conference on Requirements Engineering. 2005. 385 - 389.
Association Française de Gestion Industrielle. Evaluer pour évoluer, les indicateurs de
performance au service du pilotage industriel. Ouvrage collectif AFGI, 1992.
Beacham, Dr NIGEL. MSc Project in Advanced Computer Science, Advanced Information
Systems, or E-Commerce. University of Aberdeen, 2010.
“Behaviour Driven Development with Cucumber, RSpec, and Shoulda.” www.slideshare.com.
n.d. http://www.slideshare.net/drmanitoba/behaviour-driven-development-with-
cucumber-rspec-and-shoulda (accessed 5 2013).
Behrouzi, F., and K. Yew Wong. 2011.
http://www.sciencedirect.com/science/article/pii/S1877050910004400.
Benefits of User Acceptance Testing (UAT). n.d.
http://www.meritsolutions.com/meritmatters/index.php?/archives/454-Benefits-of-
User-Acceptance-Testing-UAT.html.
Bennett, J. 2003.
http://www.google.gr/books?hl=el&lr=&id=9thMtRRFzeAC&oi=fnd&pg=PP5&dq=
Evaluation+Methods+in+Research+bennett&ots=tsoYssOxwR&sig=BZZlhN1F0Er-
3Tr8_MfBaP_Vrnc&redir_esc=y.
Bennis, W., and B. Nanus. Leadership. New York, 1997.
Beyer, Dirk, and Andreas Stahlbauer. BDD-Based Software Model Checking. University of
Passau, Germany, 2013.
Beyer, Dirk, Henzinger A. Thomas, Jhala Ranjit, and Rupak Majumdar. “The software model
checker Blast.” International Journal on Software Tools for Technology Transfer 9,
no. 5-6 (2007): 505-525.
Boehm , B., and R. Turner. Balancing Agility and Discipline: A Guide for the Perplexed.
Vols. ISBN 0-321-18612-5 Appendix A, pages 165–194. Boston: MA: Addison-
Wesley, 2004.
Brothers, L., V. Sembugamoorthy, and M. Muller. “ICICLE: Groupware for code inspection.”
Proceedings of the ACM Conference on Computer Supported Cooperative Work.
1990. 169-181.
bugseng. n.d. http://bugseng.com/products/eclair.
Business Dictionary. 2013. http://www.businessdictionary.com/definition/evaluation-
criteria.html.
Camillus, J., and D.R. Beall. “Shifting the strategic management paradigm.” European
management journal Vol.15, N°1, 1997: 1-7.
Campbell, J.P., R.A. Mc Cloy, S.H. Oppler, and Sager C.E. “A theory of performance.” In
Personnel selection in organizations, by E. Schmitt and W.C. Borman, 35–70. San
Francisco, 1993.
Carneiro, M.F.S. Indicadores: a maneira de se acompanhar progresso e medir sucesso de
programas e projetos. Mundo Project Management, 2005.
CENZIC. n.d. http://www.cenzic.com/products/desktop/index.html.
Chaira, Talveer. Integration and System Testing. n.d.
Charmy, Project. 8 June 2013. http://www.di.univaq.it/charmy/.
CISQ Technical Work Groups. CISQ Specifications for Automated Quality Characteristic
Measures. OMG, 2012.
Cochran, D., and D. Dobbs. 6 September 2003.
http://systemdesigninstitute.com/pdf/paper09.pdf.
Code Project. n.d. http://www.codeproject.com/Articles/5579/Black-Box-Testing-Its-
Advantages-and-Disadvantages.
Compétitivite, CPC - Club Production et. Les indicateurs de performance. Ministère de
l’Industrie de la Poste et des Télécommunications - Edition Londez Conseil, 1997.
ComVantage. Collaborative Manufacturing Network for Competitive Advantage . n.d.
http://www.comvantage.eu/results-publications/public-deriverables/.
Cordis. 2013. http://cordis.europa.eu/ (accessed July 9, 2013).
D9.3 Report and Phase 2 Plan for FI PPP Alignment. 2013.
DARPA. CROWD SOURCED FORMAL VERIFICATION. 1 6 2013.
http://www.darpa.mil/Our_Work/I2O/Programs/Crowd_Sourced_Formal_Verification
_(CSFV).aspx.
Dasso, A., and A. Funes. July 2006. http://www.igi-global.com/book/verification-validation-
testing-software-engineering/1027.
DBAPPSecurity. n.d. http://www.dbappsecurity.com/.
Derby, Esther, and Diana Larsen. Agile Retrospectives: Making Good Teams Great!
Washington, D.C., 2007.
DMO. “Defence Materiel Organisation.” 2004.
http://rudderconsultingcomau.melbourneitwebsites.com/files/vv_manual_v1.1.pdf.
Doumeingts, G. “GIM: GRAI Integrated Method.” Document interne au LAP/GRAI, 1998.
Drucker, P. F. Management. Oxford: Butterworth-Heinemann, 1973.
Dubinsky, Yael , and Orit Hazzan. “Roles in Agile Software Development Teams.” Extreme
Programming and Agile Processes in Software Engineering (Springer-Verlag) 3092
(2004): 157-165.
Ducq, Y. “Contribution à une méthodologie d’analyse de la cohérence des Systèmes de
Production dans le cadre du Modèle GRAI.” Thèse de doctorat en Productique
Université de Bordeaux 1-Février, Bordeaux, 1999.
Easterbrook, S., and J. Callahan. 1998.
https://www.google.gr/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&ved
=0CDMQFjAA&url=http%3A%2F%2Fciteseerx.ist.psu.edu%2Fviewdoc%2Fdownlo
ad%3Fdoi%3D10.1.1.80.8099%26rep%3Drep1%26type%3Dpdf&ei=xqqcUdGMF62
w4QTAu4CwBw&usg=AFQjCNGz7AMJGsobBH40-j1uLlq88dz.
ELSEVIER. 2013. http://www.elsevier.com/ (accessed July 9, 2013).
Engel, A. 15 June 2010. http://www.amazon.com/Verification-Validation-Engineered-
Engineering-Management/dp/047052751X.
ESA Board for Software Standardisation an Control. Guide to software verification and
validation. Netherlands: ESA Publications Division, 1995.
ESA Board for Software Standardisation and Control. ESA Software Engineering Standards.
Netherlands: ESA Publications Division, 1991.
European Commission. DIGITAL AGENDA FOR EUROPE: A EUROPE 2020 INITIATIVE.
n.d. https://ec.europa.eu/digital-agenda/en/future-internet-public-private-partnership.
European Commission, Europe 2020. 2013. http://ec.europa.eu/europe2020/index_en.htm.
Exforsys Inc. n.d. http://www.exforsys.com/tutorials/testing-types/alpha-testing-advantages-
limitation.html.
“Fabasoft.” 5 2013. http://www.fabasoft.com/en/fabasoft-apptest.html.
Fagan, Michael E. “Advances in software inspections.” Software Engineering, IEEE
Transactions SE-12, no. 7 (1986): 744 - 751.
Fahmy, Syahrul, Nurul Haslinda, Wan Roslina, and Ziti Fariha. “Evaluating the Quality of
Software in e-Book Using the ISO 9126 Model.” International Journal of Control and
Automation 5, no. 2 (2012).
FI-CONTENT: D1.9: Final publishable summary report. 2013.
FINEST: D1.5 Business assessment of technological solutions. 2012.
FINEST: D2.4 Initial experimentation specification and evaluation methodologies for
selected use case scenarios. 2012.
FINEST: D2.5 Final use case specification and phase 2 experimentation plan. 2012.
FINSENSY: D8.2 Experiments and Evaluation. 2012.
Fisher, M. 27 November 2006. http://www.amazon.com/Software-Verification-Validation-
Engineering-Scientific/dp/0387327258.
Fit: Framework for Integrated Test. n.d. http://fit.c2.com/.
FITMAN. D2.2 – FITMAN Verification & Validation. 2013.
FitNesse. n.d. http://fitnesse.org/FrontPage.
FMT Formal Methods & Tools. n.d. http://fmt.cs.utwente.nl/tools/ltsmin/.
fortifysoftware. n.d. http://www.fortifysoftware.com/products/tester.
Fua, M.W., M.S. Yonga, K.K. Tonga, and T. Muramatsua. 22 February 2007.
http://www.tandfonline.com/doi/abs/10.1080/00207540500337643#.UbYBDPn0GSo.
FUTURE INTERNET PPP. 2013. http://www.fi-ppp.eu/.
Ganesan, D., M. Lindvall, C. Ackermann, D. McComas, and M. Bartholomew. 2009.
http://www.few.vu.nl/~rkrikhaa/adam/architecture_analysis_splc09.pdf.
Garavel, Hubert, Frederic Langy, and Radu Mateescuz. An overview of CADP 2001. 2001.
Garibaldi, G. L’analyse stratégique. Editions d’Organisation, 2001.
Gates, L.P. “Strategic Planning with Critical Success Factors and Future Scenarios: An
Integrated Strategic Planning Framework.” TECHNICAL REPORT, CMU/SEI-2010-
TR-037 ESC-TR-2010-102, 2010.
Gertler, P., S. Martinez, P. Premand, L. Rawlings, and C. Vermeersch. 2011.
http://siteresources.worldbank.org/EXTHDOFFICE/Resources/5485726-
1295455628620/Impact_Evaluation_in_Practice.pdf.
Gintell, J., M. Houde, and R. McKenney. “Lessons learned by building and using scrutiny, a
collaborative software inspection system.” Seventh International Workshop on
Computer-Aided Software Engineering. 1995. 350–357.
Göleça, A., and H. Taşkın. 1 December 2007.
http://www.sciencedirect.com/science/article/pii/S0020025507003258.
Google. 2013. http://www.google.gr/ (accessed July 9, 2013).
Google Scholar. 2013. http://scholar.google.com/ (accessed July 9, 2013).
Hansen, Helle Hvid , Jeroen Ketema, Bas Luttik, MohammadReza Mousavi, and Jaco van de
Pol. “Towards model checking executable UML specifications in mCRL2.”
Innovations in Systems and Software Engineering (Springer-Verlag) 6, no. 1-2 (2010):
83-90.
Hon, K.K.B. 2005. http://www.sciencedirect.com/science/article/pii/S0007850607600237.
IBM. n.d. http://www-03.ibm.com/software/products/us/en/subcategory/SWI10.
IBM Rational Software. n.d. http://www-01.ibm.com/software/rational/.
IdeaScale. 2013. http://ideascale.com/ (accessed July 9, 2013).
IEEE. 2013. http://www.ieee.org/index.html (accessed July 9, 2013).
IEEE. “1012-2012 - IEEE Standard for System and Software Verification and Validation.”
2012.
IEEE. “IEEE Guide for Software Verification and Validation Plans.” 1993.
—. Standard for Software Verification and Validation. 2004.
Instant Mobility: D6.4 Instant Mobility multimodal services acceptability - final report. 2013.
International Atomic Energy Agency. 1999. http://www-
pub.iaea.org/mtcd/publications/pdf/trs384_scr.pdf.
International Atomic Energy Agency. “Verification and Validation of Software Related to
Nuclear Power Plant Instrumentation and Control.” TECHNICAL REPORTS SERIES
No. 384, Vienna, 1999.
International Organization for Standardization. 01 August 2007.
http://homes.ieu.edu.tr/~kkurtel/Documents/ISO%2015939_2007.pdf (accessed July
29, 2013).
—. Information technology- Guideline for the evaluation and selection of CASE tools. 2008.
—. Information technology- Measurement and rating of performance of computer-based
software systems. 1999.
International Organization for Standardization. “ISO 9001.” 2008.
—. “ISO TR 19759:2005.” 2005.
http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=33
897.
—. “ISO/IEC 9126 Software engineering -- Product quality.” 2001.
—. Software Engineering- Software product Quality Requirements and Evaluation (SQuaRE).
2005.
—. Systems and software engineering- Information technology project performance
benchmarking framework- Part 1: Concepts and definitions. 2011.
iOpus. n.d. http://www.iopus.com/imacros/web-testing.htm.
Iribarne, P. Les tableaux de bord de la performance: comment les concevoir, les aligner et les
déployer sur les Facteurs Clés de Performance. Dunod, 2006.
ISO. 2013. http://www.iso.org/iso/home.html (accessed July 9, 2013).
ISTQB GUIDE in STATIC TECHNIQUES. ISTQB Exam Certification. n.d.
http://istqbexamcertification.com/what-is-walkthrough-in-software-testing/.
ITEA, Information Technology for European Advancement. December 2001.
http://www.dess-itea.org/deliverables/ITEA-DESS-D162-V01P.pdf.
ItsNat. n.d. http://itsnat.sourceforge.net/?_page=overview.
Jackson, M., S. Crouch, and R. Baxter. November 2011.
http://software.ac.uk/sites/default/files/SSI-SoftwareEvaluationCriteria.pdf.
JHALA, RANJIT. Software Model Checking. n.d.
Jive. 2013. http://www.jivesoftware.com/ (accessed July 9, 2013).
Johnson, P.M., and D. Tjahjono. “Improving software quality through computer supported
collaborative review. .” 19th International Conference on Software Engineering.
1993. 61–76.
JPF. n.d. http://babelfish.arc.nasa.gov/trac/jpf.
Kakati, M. 20 November 1997.
http://www.sciencedirect.com/science/article/pii/S0925527397001151.
Kan, Stephen H. Metrics and Models in Software Quality Engineering. 2004.
Karim, A., and K. Arif-Uz-Zaman. 2013.
http://www.emeraldinsight.com/journals.htm?articleid=17077476.
Kettunen, Petri. “Adopting key lessons from agile manufacturing to agile software.”
Technovation (Elsevier) 29 (2009): 408–422.
Knight, J.C., and E.A. Myers. “An improved inspection technique.” Commun. ACM 36, no.
11 (1993): 51–61.
Krohmm, H. Contribution à l’étude de la cohérence dans la décomposition des objectifs dans
le modèle GRAI. Bordeaux: LAP/GRAI Bordeaux1 University, 1997.
Kung, D., and H. Zhu. 15 January 2008.
http://cms.brookes.ac.uk/staff/HongZhu/Publications/swvv3.pdf.
Kwiatkowska, Marta , Gethin Norman, and David Parker. “PRISM: Probabilistic Symbolic
Model Checker.” Computer Performance Evaluation: Modelling Techniques and
Tools 2324 (2002): 200-204.
Lobo, L., and J.D. Arthur. Effective Requirements Generation: Synchronizing Early Verification & Validation, Methods and Method Selection Criteria. arXiv preprint cs/0503004, 2005.
Laitenberger, Oliver, and Jean-Marc DeBaud. “An encompassing life cycle centric survey of
software inspection.” Journal of Systems and Software (Elsevier) 50, no. 1 (2000): 5-
31.
Laitinen, E. “A dynamic performance measurement system: evidence from small Finnish
technology companies.” Scandinavian Journal of Management - Vol.18, 2002: 65-99.
Larman, Craig . Agile and Iterative Development: A Manager's Guide. 2004.
Leachman, C., C. Pegels, and S.K. Shin. 2005.
http://www.emeraldinsight.com/journals.htm?articleid=1513343.
Lebas, M., and K. Euske. “A Conceptual and Operational of Performance.” In Business
Performance Measurement: Theory and Practice, by A. Neely. Cambridge, UK:
Cambridge University Press, 2004.
Leffingwell, Dean. Agile Software Requirements: Lean Requirements Practices for Teams,
Programs, and the Enterprise. 2010.
—. “Scale Agiled Framework.” scaledagileframework.com. 2013.
LinkedIn. 2013. http://www.linkedin.com/ (accessed July 9, 2013).
Liu, S., N. Xie, C. Yuan, and Z. Fang. 20 December 2011. http://www.amazon.com/Systems-
Evaluation-Applications-Prediction-Decision-Making/dp/1420088467.
Lorino, P. Méthodes et Pratique de la performance. 1996.
Macdonald, F. “Assist v1.1 User Manual.” Technical Report RR-96-199 [EFoCS-22-96]
Empirical Foundations of Computer Science, (EFoCS), University of Strathclyde, UK,
1997.
Maiden, Neil, Shailey Minocha, Alistair Sutcliffe, Darrel Manuel, and Michele Ryan. “A co-
operative scenario based approach to acquisition and validation of system
requirements: How exceptions can help!” Interacting with Computers 11, no. 6
(1999): 645–664.
Mäntylä, M., and J. Itkonen. June 2013.
http://www.sciencedirect.com/science/article/pii/S095058491200239X.
Marcal, Ana Sofia C., Felipe S. Soares, and Arnaldo D. Furtado and Belchior. “Mapping
CMMI Project Management Process Areas to SCRUM Practices.” Proceedings of the
31st IEEE Software Engineering Workshop. 2007. 10.
Marcotte, F. Contribution à la modélisation des systèmes de production: extension du modèle
GRAI. Thèse de doctorat - Spécialité Productique, Université de Bordeaux 1, 1995.
Mashayekhi, V., J.M. Drake, W.T. Tsai, and J. Riedl. “Distributed, collaborative software
inspection.” IEEE Software, 1993: 66-75.
McErlang. n.d. https://babel.ls.fi.upm.es/trac/McErlang/.
Meleze, J. L’analyse modulaire des Systèmes de Gestion. Paris: édition Hommes et
Techniques, 1972.
Merz, Stephan. “Model Checking: A Tutorial Overview.” Modeling and Verification of
Parallel Processes 2067 (2001): 3-38.
Meyer, Bertrand. Object Oriented Software Construction. 2nd. Prentice-Hall, 1997.
microsoft research. n.d. http://research.microsoft.com/en-us/projects/tla_tools/.
Microsoft Research. CHESS: Find and Reproduce Heisenbugs in Concurrent Programs. n.d.
http://research.microsoft.com/en-us/projects/CHESS/.
Molyer, O. 12 November 2001. http://ftp.rta.nato.int/public//PubFullText/RTO/MP/RTO-MP-
073///MP-073-20.pdf.
MoonWalker. n.d. http://fmt.ewi.utwente.nl/tools/moonwalker/.
MRMC. n.d. http://www.mrmc-tool.org/trac/.
MSDN. n.d. http://msdn.microsoft.com/en-us/library/aa292197(v=vs.71).aspx.
Murphy, P., and J. Miller. “A process for asynchronous software inspection.” Eighth
International Workshop on Software Technology and Engineering Practice. 1997. 96–
104.
Navy Health Portal. http://ciosp3.us/. 30 March 2012. http://ciosp3.us/is-agile-development-
good-for-health-it/ (accessed 07 08, 2013).
nccgroup. n.d. http://www.nccgroup.com/en/our-services/security-testing-audit-
compliance/information-security-software/#.UapoxkB7LSh.
N-Stalker. n.d. http://www.nstalker.com/.
NTObjectives. n.d. http://www.ntobjectives.com/security-software/ntospider-application-
security-scanner/.
NuSMV. n.d. http://nusmv.fbk.eu/.
O’Connora, P., and A. Frewb. June 2004.
http://www.sciencedirect.com/science/article/pii/S0278431903000975.
Obaidat, M., and N. Boudriga. January 2010.
http://eu.wiley.com/WileyCDA/WileyTitle/productCd-0471269832.html.
Oberkampf, W., and C. Roy. November 2010.
http://www.cambridge.org/us/knowledge/isbn/item4027658/?site_locale=en_US.
O'Keefe, R., and D. O'Leary. February 1993.
http://link.springer.com/article/10.1007%2FBF00849196.
Parmenter, D. Key Performance Indicators: Developing, Implementing and Using Winning KPIs. Hoboken, New Jersey: John Wiley & Sons, 2007.
Passbrains Projects . 2012. http://www.passbrains.com/blog/let-the-crowd-run-your-betatests/.
PAT: Process Analysis Toolkit. n.d. http://www.comp.nus.edu.sg/~pat/.
Paul, John. “Supply Chain Operations Reference-model (SCOR).” 2011.
pcmag.com. n.d. http://www.pcmag.com/encyclopedia/term/37674/alpha-test.
Pelliccione, P., P. Inverardi, and H. Muccini. May-June 2009.
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=4711062&abstractAccess=no
&userType=inst.
Perpich, J., D. Perry, A. Porter, L. Votta, and M. Wade. “Anywhere, anytime code
inspections: using the web to remove inspection bottlenecks in large-scale software
development .” Proceedings of the 19th International Conference on Software
Engineering. 1997. 14-21.
Perry, William E. Effective Methods. 2006.
Plogert, K. 1996. http://ac.els-cdn.com/S1383762196000471/1-s2.0-S1383762196000471-
main.pdf?_tid=e320870c-d7f7-11e2-bd76-
00000aab0f01&acdnat=1371547224_8c0d2d63bd11eb5a5d41ed4191e13e03.
Pohjolainen, Pentti. SOFTWARE TESTING TOOLS. 2002.
Portswigger. n.d. http://portswigger.net/intruder.
Positive Technologies. n.d. http://www.maxpatrol.com/.
Proto.io. 2013. http://proto.io/ (accessed July 9, 2013).
Rakitin, Steven R. Software Verification and Validation for Practitioners and Managers. Boston, London, 2001.
Ranorex. n.d. http://www.ranorex.com/.
Rehman, Muhammad Jaffar-ur, Fakhra Jabeen, Antonia Bertolino, and Andrea Polini.
“Testing software components for integration: a survey of issues and techniques.”
SOFTWARE TESTING, VERIFICATION AND RELIABILITY 17, no. 2 (2007): 95-
133.
Reinhart, G., S. Schindler, and P. Krebs. 2011. http://link.springer.com/chapter/10.1007/978-
3-642-19692-8_31.
Rezaie, K., M. Dehghanbaghi, and V. Ebrahimipour. July 2009.
http://link.springer.com/article/10.1007%2Fs00170-008-1726-8.
Richards, A., M. Branstad, and V. Cherniavsky. June 1962.
https://www.cs.drexel.edu/~jhk39/teaching/cs576su06/week1Readings/adrion.pdf.
Rico, David F. “V & V Lifecycle Methodologies.” 2013.
http://www.stickyminds.com/sitewide.asp?Function=edetail&ObjectType=ART&Obj
ectId=3287#top.
Rising, Linda , and Janoff S. Norman . “The Scrum Software Development Process for Small
Teams.” IEEE SOFTWARE, 2000: 32.
RobotFramework. n.d. http://robotframework.org/.
Roméo - A tool for Time Petri Nets analysis. n.d. http://romeo.rts-software.org/.
Roscoe, Bill. FDR in the 21st Century Local search in model checking. n.d.
Deterding, S., et al. “Gamification: Using game-design elements in non-gaming contexts.” Proceedings of the 2011 Annual Conference Extended Abstracts on Human Factors in Computing Systems. ACM, 2011.
sandsprite. n.d. http://sandsprite.com/Sleuth/.
sandsprite. n.d. http://www.sandsprite.com/tools.html.
Sarmad, K. 1985.
http://www.thefreelibrary.com/Evaluating+the+operational+performance+of+manufac
turing+enterprises%3A...-a0186275774.
Scopus. 2013. http://www.scopus.com/home.url (accessed July 9, 2013).
searchsoftwarequality. n.d. http://searchsoftwarequality.techtarget.com/definition/unit-testing.
SearchSoftwareQuality. 2010. http://searchsoftwarequality.techtarget.com/definition/Scrum-
sprint.
SeleniumHQ. n.d. http://docs.seleniumhq.org/.
Shah, L., A. Etienne, A. Siadat, and F.B. Vernadat. 2012.
http://sam.ensam.eu/bitstream/handle/10985/6295/2012-02-
21%20Value%20Risk%20-
%20Based%20Performance%20Evaluation%20of%20Manufacturing%20Processes%
20%282%29.pdf?sequence=1.
smart. n.d. http://www.cs.ucr.edu/~ciardo/SMART/.
Smart Agri-Food: D200.3 Final Report on Validation Activities, Architectural Requirements
and Detailed System Specification. 2011.
Smart Agri-Food: D200.4 Smart Farming: Final Assessment Report. 2011.
Smart Agri-Food: D400.3 Report on validation activities and Detailed Specification Revision.
n.d.
Smart Agri-Food: D500.6 Feasibility Assessment . 2013.
soft-engineering.com. “Software Engineering, Walkthrough & Inspection.” soft-engineering.
2010. http://soft-engineering.blogspot.gr/2010/12/walkthrough-inspection.html.
Software Testing Fundamentals. n.d. http://softwaretestingfundamentals.com/white-box-
testing/.
sourceforge. n.d. http://sourceforge.net/projects/redlib/.
Spanoudakis, George, and Andrea Zisman. “Software Traceability: A Roadmap.” Handbook
of Software Engineering and Knowledge Engineering III (2005).
Springer. 2013. http://www.springer.com/?SGWID=5-102-0-0-0 (accessed July 9, 2013).
s-qa.blogspot. “what are benefits of walkthroughs?” s-qa.blogspot. 2013. http://s-
qa.blogspot.gr/2009/03/what-are-benefits-of-walkthroughs.html.
SQRL. Software Metrics for Control and Quality Assurance. McMaster University, 2013.
Sridharan, Mithun. Crowd sourced software testing. 2 6 2013.
http://it.toolbox.com/blogs/mithuns-memoirs/crowd-sourced-software-testing-52763.
SRM Technologies. Embedded 360. 2013.
http://www.embedded360.com/execution_approach/traditional_model.htm.
Standardization, International Organization for. 01 February 2000. http://webstore.iec.ch/p-
preview/info_isoiec14598-2%7Bed1.0%7Den.pdf (accessed July 29, 2013).
Stein, M., J. Riedl, S. Harner, and V. Mashayekhi. “A case study of distributed, asynchronous
software inspection.” 19th International Conference on Software Engineering. IEEE
Computer Society Press, 1997. 107–117.
Stout, Glenn A. Requirements Traceability and the Effect on the System Development
Lifecycle (SDLC). n.d.
Supply Chain Council. SCOR Supply Chain Operations Reference Model V10. Aug 2010.
SurveyMonkey. 2013. http://www.surveymonkey.com/ (accessed July 9, 2013).
Fahmy, Syahrul, et al. “Evaluating the Quality of Software in e-Book Using the ISO 9126
Model.” International Journal of Control and Automation 5, no. 2 (June 2012): 115-122.
TAPAAL. n.d. http://www.tapaal.net/.
TAPAs. n.d. http://rap.dsi.unifi.it/tapas/en/index.htm.
Tatsiopoulos, I., N. Panayiotou, and S. Ponis. September 2002.
http://www.sciencedirect.com/science/article/pii/S0166361502000623.
techopedia. n.d. http://www.techopedia.com/definition/7751/integration-testing.
techopedia. n.d. http://www.techopedia.com/definition/3887/user-acceptance-testing-uat.
techopedia. n.d. http://www.techopedia.com/definition/5935/alpha-test.
TestAutomationFX. n.d. http://www.testautomationfx.com/.
Thacker, B., S. Doebling, F. Hemez, M. Anderson, J. Pepin, and E. Rodriguez. October 2004.
http://www.ltas-vis.ulg.ac.be/cmsms/uploads/File/LosAlamos_VerificationValidation.pdf.
Thomson Reuters. 2013. http://wokinfo.com/ (accessed July 9, 2013).
Torkar, Richard, Tony Gorschek, Robert Feldt, Uzair Akbar Raja, and Kashif Kamran.
“Requirements traceability state-of-the-art: A systematic review and industry case
study.” Int. Journal of Software Engineering and Knowledge Engineering (IJSEKE)
22, no. 3 (2012): 1-49.
Tran, Eushiuan. Verification/Validation/Certification. Carnegie Mellon University, 1999.
Troxler, J., and L. Blank. 1989.
http://www.sciencedirect.com/science/article/pii/0278612589900393.
tutorialspoint. n.d. http://www.tutorialspoint.com/software_testing/testing_methods.htm.
UPPAAL. n.d. http://www.uppaal.org/.
Uservoice. 2013. https://www.uservoice.com/ (accessed July 9, 2013).
uTest. 2013. http://www.utest.com/ (accessed July 9, 2013).
—. Where the best test. 2013. http://www.utest.com/.
Utting, M., and B. Legeard. Practical Model-Based Testing: A Tools Approach. Morgan
Kaufmann, 2007.
Vakkalanka, Sarvani, Subodh Sharma, Ganesh Gopalakrishnan, and Robert M. Kirby. ISP: A Tool
for Model Checking MPI Programs. Salt Lake City, Utah, USA, 2008.
Van Grembergen, W. July 2001. http://www.igi-global.com/book/information-technology-evaluation-methods-management/590.
Vereofy / TU-Dresden. n.d. http://www.vereofy.de/.
Verifying Multi-threaded Software with Spin. n.d. http://spinroot.com/spin/whatispin.html.
Wang, Z., S. Dessureault, and J. Liu. 1 June 2013.
http://www.scopus.com/record/display.url;jsessionid=F3F1C959BABECBF9A46DEF4F11DB81E0.CnvicAmOODVwpVrjSeqQ?eid=2-s2.0-84875383336&origin=resultslist&sort=plf-f&src=s&st1=manufacturing+evaluation+methodology&sid=CD7DF779A7653F3A7A78D4F80B670786.iqs8TDG0Wy6B.
Waterfall Model. 2013. http://www.waterfall-model.com/v-model-waterfall-model/.
watir. n.d. http://watir.com/.
White Box Testing. n.d. http://www.whiteboxtest.com/Tools.php.
Williams, B., and I. Imam. 2007.
http://users.actrix.co.nz/bobwill/SYSTEMS%20CONCEPTS%20IN%20EVALUATION%20Handout.pdf.
Williams, Laurie. Testing Overview and Black-Box Testing Techniques. 2006.
Wilson, Matthew. “Quality Matters: Correctness, Robustness and Reliability.” ACCU
Professionalism in Programming. 2009. http://accu.org/index.php/journals/1585
(accessed May 2013).
Royce, Winston W. “Managing the Development of Large Software Systems.” 1970.
Xing, P., M. Lees, H. Nan, and V. Viswanathan. 2012.
http://link.springer.com/chapter/10.1007%2F978-3-642-28400-7_7.
Yammer. 2013. https://www.yammer.com/ (accessed July 9, 2013).
Yang, C., S. Chuang, and R. Huang. October 2009.
http://www.sciencedirect.com/science/article/pii/S0957417409002760.
Zeydan, M., C. Çolpan, and C. Çobanoğlu. March 2011.
http://www.sciencedirect.com/science/article/pii/S0957417410008602.