8/8/2019 An Approach to Characterize a Software Process
An Approach to Characterize a Software Process
Markus Suula, Timo Mäkinen, Timo Varkoi
Tampere University of Technology, Pori, Finland
Abstract--Process improvement is a cyclic activity consisting of several phases. According to the Quality Improvement Paradigm, the first phase is characterization. This paper presents a methodology developed for characterizing a software process. The methods include software process assessment and modeling. Using an assessment driven process modeling methodology we can create a descriptive model that characterizes the current process, and a prescriptive model containing improvement suggestions. The selected processes are reviewed and the process components are gathered according to a software engineering process meta-model. The SPICE assessment model is utilized in the process reviews. In a case study the methodology is applied to characterize the software process of a small enterprise. The current process is defined and its process capability and product quality are determined. The enterprise's process guide, the produced work products and the project's workload are analyzed. This paper reports the experiences of using the methodology and presents a way to analyze the implications caused by the differences between the existing process guidance and the actually implemented process.
I. INTRODUCTION
It is widely accepted that the quality of a software product is largely determined by the quality of the process that is used to develop and maintain it [10]. Process improvement means understanding existing processes and changing these processes to increase product quality and/or reduce costs and development time [9]. In this paper, we discuss approaches to achieving an understanding of processes. The discussion is based on a case study of a small software enterprise.
A process is a collection of activities that takes one or more kinds of input and creates an output that is of value to the customer [10]. The activities can be grouped into workflows, as in a widely known software development methodology, the Rational Unified Process (RUP) [4]. RUP is an iterative and incremental process divided into four phases: 1. inception, 2. elaboration, 3. construction, 4. transition. Fig. 1 depicts the architecture of a RUP-like imaginary software process and how the different activities are distributed over the process.
According to the Quality Improvement Paradigm (QIP) [1], process improvement is a cyclic activity consisting of six phases (see Fig. 2). The first phase is characterization, for understanding the current status of processes. The Product Focused Software Process Improvement methodology (PROFES) [7] divides the characterization phase into four steps: 1. verify commitment, 2. identify product quality needs, 3. determine current product quality, and 4. determine current process capability.
Figure 1. Process Architecture of an Iterative Software Process
Figure 2. Process improvement cycle according to the Quality Improvement
Paradigm [1]
Process capability is a characterization of the ability of a process to meet current or projected business goals [2]. It is determined through a systematic assessment and analysis of selected processes within an organization against a target capability. During a process assessment an organizational unit's processes are evaluated against a process assessment model, such as the International Standard ISO/IEC 15504-5 [3] or the Capability Maturity Model Integration (CMMI) [8]. The process assessment models contain reference goals and practices, and the result of an assessment denotes how well the operations of an organizational unit match the goals and practices described by an assessment model.
PICMET 2009 Proceedings, August 2-6, Portland, Oregon USA 2009 PICMET
The context of our study is a small software enterprise that undertook a project to build a monitoring and reporting system for an industrial customer. Our aim was to experiment with an approach for Assessment Driven Process Modeling (ADPM) [5], which we are developing to support the fourth step of the Characterize phase in PROFES. The enterprise's interest was to get practical experience with certain software tools for process management, which it considered in the first place for its own use, and in the second place as part of its business. To fulfill both objectives, we planned a quasi-experiment in which ADPM and the software tools were utilized.
During the study we noticed that the state of the software project was not very strong. There were remarkable overruns, and it seemed that, despite the existence of standard processes and a project plan, the project moved ahead on an ad hoc basis. Although the outcome of an assessment or of process modeling characterizes the current software practices of an organizational unit, it does not describe how well the results of the practices meet the business goals. An assessment as a means to determine process capability, as defined in ISO/IEC 15504 [2], does not imply that the business goals are met.
In this paper we propose an approach to characterize process capability according to its definition in the International Standard ISO/IEC 15504-1 [2]. The inspiration for the illustration comes from the architecture of RUP (Fig. 1). The capability of the current process of a small software enterprise is characterized using the approach.
The second section introduces the research context: the enterprise, its process guide and the studied project. The third section describes the methods used in the study. Results of the study are presented in the fourth section. The fifth section summarizes the study.
II. RESEARCH CONTEXT
The software enterprise where the study was carried out has a defined standard process called the Standard Process Set (SPS). The performed process was defined by studying one of the enterprise's projects. One of our team members also worked in the studied project as a programmer, so we could get in-depth information about the project and the performed process.
A. The Enterprise
The enterprise was founded in Finland in 1994. Its wide product repertoire included products for information management systems and industrial solutions. The enterprise became an Advanced IBM Business Partner in 1998, so it was natural that the products were based on IBM technologies. Process improvement has always been part of the enterprise's strategy, and Software Process Improvement (SPI) is based on the ISO/IEC 15504 standard (SPICE) [3]. Its customer projects follow the waterfall model. Nowadays the enterprise has reduced its product repertoire and has concentrated on production monitoring and reporting systems (PMRS) and enterprise content management (ECM). The products are based on IBM technologies, e.g. DB2, DB2 Content Manager, WebSphere Portal and Lotus Domino. The enterprise has expanded its market area to Scandinavia, the United Kingdom and the Baltic countries. SPI activities are considered crucial for achieving growth in the international market.
B. Standard Process Set
In 2004 a study [6] was carried out in the enterprise to improve its processes towards ISO/IEC 15504 capability level 2, where a process is performed and managed. The planning of the study was based on an analysis of the latest process capability assessments, conducted in 2001-2002. The main focus was on the management practices, so the management process group had the first priority for improvement. Second priority was given to the customer-supplier process group and third priority to the engineering process group. Based on the assessment report from 2001, the enterprise had recognized and listed ten individual software processes. Those ten processes were used to create a standard process for the enterprise. During the process improvement study the number of processes was increased to nineteen to complete the SPS.
The idea of the SPS is that it is customized to the needs of a project. The nineteen processes of the SPS are divided into five groups. The SPS is a folder structure where each of the processes has its own folder containing the process content. Table 1 lists the SPS processes and the process-related content grouped into four groups: 1. Guidance, 2. Checklists, 3. Templates, 4. Misc. The group Misc consists of pictures, tables and other miscellaneous content. Roles are in alphabetical order and the abbreviations used are explained in Table 2.
Table 1 shows how much support the SPS provides for the processes. The engineering process group has the most documents (30) but the fewest documents per process - on average three (3) documents per process. It should be noted that 40 per cent of the ENG documents belong to the TS process and that the test plan document template is the same for the IT, FT and ST processes. Project management is the most supported process, with sixteen (16) documents. It is clear that the prioritization made when establishing the SPS was realized.
TABLE 1. THE COMPONENTS OF STANDARD PROCESS SET
Standard Process Set component             Guidance Checklists Templates Misc. Roles
Customer-Supplier process group CUS
 SM SALES AND MARKETING                       3        1         5        8    CD, MD, S
 DL DELIVERY                                  0        1         2        0    MD, PM, Sec
 CR CUSTOMER REQUIREMENT SPECIFICATION        0        1         1        0    CD, D, PM
 PS PRODUCTION SERVICE                        0        1         1        0    PM, SM
 CUS sum                                      3        4         9        8
Engineering process group ENG
 FS FUNCTIONAL SPECIFICATION                  1        1         2        0    CD, D, PM
 TS TECHNICAL SPECIFICATION                   3        2         6        1    CD, D, PM
 SI SOFTWARE IMPLEMENTATION                   0        1         0        0    P, PM
 MT MODULE TESTING                            0        1         1        0    D, P, PM
 IT INTEGRATION TESTING                       0        1         1        0    D, P, PM
 FT FUNCTIONAL TESTING                        0        1         2        0    D, PM
 ST SYSTEM TESTING                            0        1         1        0    D, P, PM
 RT REQUIREMENT TESTING                       0        1         0        0    P, PM
 PL PILOTING                                  0        1         0        0    P, PM, TE
 MS MAINTENANCE SERVICE                       0        2         0        0    MD, PM, SM
 ENG sum                                      4       12        13        1
Support process group SUP
 DN DOCUMENTATION                             4        1         3        2    DD
 QS QUALITY ASSURANCE                         0        1         1        0    CD, MD, PM
 CM CONFIGURATION MANAGEMENT                  0        1         1        0    CD, MD, PM
 SUP sum                                      4        3         5        2
Management process group MAN
 PM PROJECT MANAGEMENT                        1        1         9        5    CD, MD, PM
Organization process group ORG
 PE PROCESS ESTABLISHMENT                     3        1         1        2    MD
SPS sum                                      15       21        37       18
TABLE 2. ROLE ABBREVIATIONS
Abbreviation Role
CD Chief Designer
D Designer
DD Document Developer
MD Managing Director
P Programmer
PM Project Manager
S Sales
Sec Secretary
SM Service Manager
TE Test Engineer
C. The Project
In the studied project a client had ordered a production monitoring and reporting system (PMRS). The PMRS was designed to integrate with the client's existing information system. The system was implemented with a new tool that the enterprise had no experience with. Requirement elicitation started in April 2006, and the client approved the customer requirement specification in June 2006. The implementation project began in the summer of 2007, and the system was estimated to be released in September 2007.
The project was divided into ten phases which were planned to be carried out according to the waterfall model. In most of the phases a major document was to be created (Table 3). In addition, a final report was planned to be written by the project manager at the end of the project. The project team consisted of the following roles: project manager, requirements engineer, chief designer, designer, production environment designer, specialist in the customer's industry, programmer, and tester. One person could have multiple roles; e.g. the project manager was also a specialist, requirements engineer and tester.
There were two major releases in the project: the first in October 2007, which was an implementation review, and the second in November 2007, which was supposed to be the final release. The client was not satisfied with either the first release in October or the release in November. Changes were made until October 2008, when the client accepted release 1.0.
TABLE 3. PROJECT PHASES, PHASE DOCUMENTS AND ACCOUNTABLES FOR CREATING THE DOCUMENTS
Project phase Document Accountable
(0. Requirement elicitation) Customer requirement specification Project manager,
Chief designer,
Designer
1. Project kick-off and planning Project plan Project manager
2. Analysis Functional specification Chief designer,
Project manager
3. Design Technical specification Chief designer, Designer
4. Implementation and unit testing - -
5. Integration testing in production environment Integration testing plan Chief designer, Designer
6. System testing in production environment System testing plan Project manager, Designer
7. Approval test by the supplier Approval test report Project manager
8. Training and deployment - -
9. Approval test by the client - -
10. Maintenance - -
It was stated in the project plan that the project should follow the SPS. It was declared that the project manager is responsible for ensuring that the SPS processes are followed. The documentation process was emphasized in the project plan: every project phase and document should be reviewed or inspected internally. The change management part of the project plan stated that, depending on a change request's magnitude and impact on the system, decisions were made either internally (small changes that had a minor effect on the schedule or workload) or between the supplier and the client (major changes that increase the workload significantly).
The project team recorded their work into the enterprise's work hour tracking system by filling in a form illustrated in Fig. 3. Process and task are selected from a static list which is the same for all projects, i.e. the task is not project specific. The description field can be used to specify what exactly was done. The tracking system cannot be used to track the progress of individual, project-specific tasks.
Figure 3. The Work Hour Form
Project-specific tasks are kept in a spreadsheet task list containing the following information about a task: state of the task, target/module, task description, criticality and urgency. The state of the task was color coded according to its readiness percentage, so that red meant 0-25% readiness, orange 25-50%, yellow 50-75% and green 75-100%. Criticality and urgency were numeric values from zero to five.
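The color coding can be sketched as a simple mapping. This is a minimal illustration, not the enterprise's actual spreadsheet logic; since the source ranges overlap at their boundaries, assigning the values 25, 50 and 75 to the lower band is an assumption.

```python
def state_color(readiness: int) -> str:
    """Map a task's readiness percentage (0-100) to its task-list color."""
    if not 0 <= readiness <= 100:
        raise ValueError("readiness must be between 0 and 100")
    if readiness <= 25:
        return "red"      # 0-25% ready
    if readiness <= 50:
        return "orange"   # 25-50% ready
    if readiness <= 75:
        return "yellow"   # 50-75% ready
    return "green"        # 75-100% ready
```
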
III. METHODS
The study was carried out by taking part in the enterprise's activities. In this study the enterprise's process guide and the work products and workload of the sample project were analyzed. Information about the sample project was gathered in process information gathering workshops attended by both the project team and the modeling team. The chosen processes were reviewed and process components were gathered according to the software engineering process meta-model. The process components were tasks, roles, work products and tools. Problems and notices were also gathered. Previously assessed processes were chosen to be reviewed utilizing the SPICE standard. The modeling was based on an assessment driven process modeling methodology [5].
In this study the product was the project itself, not the system that the project produced. The product quality was determined by comparing the project team members' work hour data to the project plan's work breakdown structure (WBS), which was a Gantt chart. The work hour data was gathered from the work hour tracking system and transferred into Microsoft Excel, which was used to create diagrams. The diagrams were scaled and combined with the open source software Paint.NET, using the RUP process architecture diagram (Fig. 1) as a template. Releases and the project schedule were also added to the diagram.
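The aggregation behind such a workload diagram can be sketched as follows. This is a minimal illustration in Python rather than the Excel workflow the study actually used; the record format and the sample figures are assumptions.

```python
from collections import defaultdict

def workload_by_month(records):
    """Sum work hours per month and category from (month, category, hours) records."""
    totals = defaultdict(lambda: defaultdict(float))
    for month, category, hours in records:
        totals[month][category] += hours
    return {month: dict(cats) for month, cats in totals.items()}

# Hypothetical work hour entries, as might be exported from a tracking system.
records = [
    ("2007-09", "Implementation", 120.0),
    ("2007-09", "Testing", 8.0),
    ("2007-10", "Implementation", 90.0),
    ("2007-10", "Testing", 40.0),
]
chart_data = workload_by_month(records)  # input for a stacked monthly chart
```
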
Process Modeling
Preparation
In the start-up session we introduced the modeling team and the ADPM process to the software enterprise, and the goals and phases of the ADPM process were described in general. We proposed a set of processes and capability levels to be reviewed, which was approved by the enterprise's management. Next we scheduled the process review workshops. The modeling team was also authorized to access the work products of the project.
Execution
The performed process was determined by reviewing selected processes of the project, namely ENG.1 Requirements Elicitation, ENG.4 Software Requirements Analysis, ENG.5 Software Design, ENG.6 Software Construction, ENG.8 Software Testing and MAN.3 Project Management. The processes were selected according to the previous assessments. The processes were reviewed in six three-hour workshops. The enterprise's project team was invited to every workshop, but only the key persons in the process to be reviewed were required to attend. There were two persons in the modeling team: a chairman and a secretary. The chairman led the conversation and drew preliminary models which were projected onto the wall with an overhead projector. The models were drawn with the OmniGraffle tool. The symbols used in the models were similar to the Eclipse Process Framework (EPF) symbols. The secretary wrote down notes about the conversation.
The workshops followed the process presented in Fig. 4. The agenda of each workshop consisted of two or three SPICE processes, of which one was chosen at the beginning of the workshop to be reviewed. The process review proceeded one base practice at a time. Each base practice was divided into tasks. Then the work products (inputs and outputs), roles, subtasks, tools, notices and problems related to the tasks were gathered.
Figure 4. State Diagram of a Workshop
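The information gathered per task in these workshops can be sketched as a simple record. This is an illustrative structure only, assembled from the component list above; it is not tooling that was used in the study, and the example values are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class ReviewedTask:
    """One task of a base practice, with the components gathered in a workshop."""
    name: str
    inputs: list = field(default_factory=list)    # input work products
    outputs: list = field(default_factory=list)   # output work products
    roles: list = field(default_factory=list)
    subtasks: list = field(default_factory=list)
    tools: list = field(default_factory=list)
    notices: list = field(default_factory=list)
    problems: list = field(default_factory=list)

# Hypothetical example entry from a review of ENG.6.BP2 Develop Software Units.
task = ReviewedTask(name="Develop software units",
                    roles=["Programmer"], tools=["IDE"],
                    problems=["specification outdated"])
```
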
In the first workshop we reviewed the process ENG.6 Software Construction. The review started with the extensive base practice ENG.6.BP2 Develop Software Units, which took the whole three hours of the first workshop. In the second workshop the review of the ENG.6 process was continued. At the beginning the modeling team presented the models created in the first workshop and the project team commented on them. Comments were written down and the models were altered accordingly. In the second workshop the base practices ENG.6.BP1 Develop Unit Verification Procedures, ENG.6.BP4 Verify Software Components and ENG.6.BP3 Ensure Consistency were reviewed. The processes ENG.4 Software Requirements Analysis and ENG.8 Software Testing were on the agenda of the third workshop. Time ran out before the ENG.8 process was reviewed, but the ENG.4 process was completed. In the fourth workshop the process ENG.1 Requirements Elicitation was reviewed. In the fifth workshop the previously missed or superficially reviewed base practices were revised thoroughly. Process attribute PA 2.2 Work Product Management, with emphasis on the ENG.6 process, was reviewed in the sixth workshop.
Finalization
The processes ENG.5 Software Design, ENG.8 Software Testing and MAN.3 Project Management were not reviewed because of a lack of time, but these processes were discussed during the reviews of the other processes. Project management was discussed in every workshop, but especially in the sixth workshop. The modeling produced two types of models: 1. descriptive models, 2. prescriptive models. The descriptive models describe the performed process of the studied project - how things were done. The prescriptive models were derived from the descriptive models by adding components that could solve the detected problems. The components of the solution were primarily based on the SPS, but where the SPS could not offer a solution, the SPICE standard and other sources were used.
IV. RESULTS
The implementation project under study took twelve months more than planned, and the planned work hours were exceeded by 300 per cent. Fig. 5 shows the workload of the implementation project grouped into seven categories: 1. Requirements, 2. Analysis & Design, 3. Implementation, 4. Testing, 5. Delivery, 6. Project Management, and 7. Others. The group Others covers the hours from the following processes: documentation, sales and marketing, and administration. The studied project consists of two separate subprojects: the requirement elicitation project and the implementation project. The requirement elicitation project is not considered in Fig. 5.
Figure 5. Workload chart
Based on Fig. 5, the system delivered in October 2007 was almost untested. The amount of testing starts to increase after the first release. After the release in February 2008 there are two peaks. These peaks are so-called implementation rushes: tasks piled up because the project team was no longer fully available for the project. The project was planned to follow the waterfall model, but it became an iterative project.
We discovered that the requirements changed considerably during the project, which explains the multiple releases and why the amount of project management increases after the project schedule was exceeded. The customer requirement specification (CR) was valid throughout the project. The problem was in the functional specification (FS). The FS did not sufficiently translate the "whats" of the CR into "hows"; e.g. there was a requirement which stated that the system must be easy to use, and this requirement was not clarified in the FS. The customer based many requirement change requests on this usability requirement. There was practically no analysis & design after the first release, even though there were a lot of requirement changes and much implementation. The FS was not updated according to the changes and it quickly became obsolete. The SPS does not offer a change management practice, but it does require constant maintenance of the documents.
A system test plan (ST) was created based on the first version of the FS, so the ST also became obsolete quickly. All the testing in the project was exploratory and unguided. According to Sommerville [9], in the waterfall approach the project cost, i.e. the working hour distribution, is 15% specification, 25% design, 20% development and 40% integration and testing, while in iterative development it is 10% specification, 60% iterative development and 30% system testing. In the studied project testing had a proportion of ca. 10% (Fig. 6). The insufficient testing can be assumed to be one of the key factors behind the large amount of implementation work. It is said that 50% of implementation work is rework for fixing defects, so in the studied project roughly one fifth of the costs is rework. There is no way to completely eliminate defects, but it is important to remember that the later a defect is detected, the more expensive it is to fix.
Figure 6. Workload Distribution
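The rework estimate above can be checked with a small calculation. The 40% implementation share used here is an assumption read off the workload distribution in Fig. 6, not a figure stated explicitly in the text.

```python
implementation_share = 0.40  # assumed share of total effort spent on implementation
rework_fraction = 0.50       # "50% of implementation work is rework"

# Rework as a share of total project cost: 0.40 * 0.50 = 0.20, i.e. one fifth.
rework_share = implementation_share * rework_fraction
print(f"Rework is about {rework_share:.0%} of total project cost")
```
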
V. CONCLUSION AND FUTURE WORK
The study indicated that neither the process guide nor the project plan was followed in the studied project. Many of the problems in the studied project could have been avoided by following the process guide. The studied project can be considered failed because its workload was 300 per cent more than planned and the deadline was exceeded by a year. The biggest problems in the studied project were a constant flood of changes and the lack of a change management system. It was discovered that the process guide needs to be updated, and then utilized properly.
The results of this study were presented to the project managers, chief designers and management of the enterprise. The case study motivated the management to seriously invest in SPI activities. The enterprise has now hired a part-time quality manager. The management and the project managers found the produced workload chart (Fig. 5) informative and useful. The project managers said that similar charts could help to follow the progress of a project if charts representing the current status of the project were constantly available or could be automatically generated. The project managers also stated that it is a problem that they cannot follow the progress of individual tasks. These issues initiated a project to improve the work hour tracking system in the enterprise.
REFERENCES
[1] Basili, V.R., Caldiera, G., Rombach, H.D.; The Experience Factory, Encyclopedia of Software Engineering, John Wiley & Sons, pp. 469-476, 1994.
[2] ISO/IEC 15504-1, Information Technology - Process Assessment - Part 1: Concepts and vocabulary, 2004.
[3] ISO/IEC 15504-5, Information Technology - Process Assessment - Part 5: An Exemplar Process Assessment Model, 2006.
[4] Jacobson, I., Booch, G., Rumbaugh, J.; The Unified Software Development Process, Addison-Wesley, 1998.
[5] Mäkinen, T., Varkoi, T.; Assessment Driven Process Modeling for Software Process Improvement, Proceedings of the PICMET'08 Conference, Cape Town, South Africa, 6 p, 27-31 July 2008.
[6] Nurkkala, R.; Improving Project Management in a Small Software Organization, Master's Thesis, Tampere University of Technology, Department of Information Technology, 73 p, 2004.
[7] PROFES, PROFES User Manual, Final Version, Retrieved 20.10.2008 from the World Wide Web: http://virtual.vtt.fi/virtual/proj1/profes/UserManual.html
[8] Software Engineering Institute, CMMI for Development (CMMI-DEV) Version 1.2 (CMU/SEI-2006-TR-008), Software Engineering Institute, Carnegie Mellon University, August 2006.
[9] Sommerville, I.; Software Engineering, Seventh Edition, Pearson Education Limited, 2004.
[10] Zahran, S.; Software Process Improvement, Pearson Education, 1998.