
SOFTWARE PROCESS IMPROVEMENT AND PRACTICE
Softw. Process Improve. Pract. 2002; 7: 3–15 (DOI: 10.1002/spip.150)

Implementing Software Process Improvement: An Empirical Study

Research Section
Tracy Hall,*,† Austen Rainer and Nathan Baddoo
Computer Science Department, University of Hertfordshire, Hatfield, AL10 9AB, UK

In this paper we present survey data characterizing the implementation of SPI in 85 UK companies. We aim to provide SPI managers with more understanding of the critical success factors of implementing SPI. We present an analysis of the critical implementation factors identified in published case studies. We use a questionnaire to measure the use of these factors in ‘typical’ software companies. We found that many companies use SPI but the effectiveness of SPI implementation is variable. Many companies inadequately resource SPI and fail to evaluate the impact of SPI. On the other hand, companies show a good appreciation of the human factors associated with implementing SPI. Copyright 2002 John Wiley & Sons, Ltd.

KEY WORDS: software process improvement; implementation; empirical; survey

1. INTRODUCTION

In this paper we present survey data characterizing the implementation of software process improvement (SPI) in 85 UK companies. Our data identify the critical factors that managers can use to more effectively control the implementation of SPI. The aim of our work is to develop a maturity-based framework that SPI managers can use to assess the readiness of companies to accelerate the implementation of SPI (Wilson et al. 2002).

SPI has become a popular approach to delivering improvements in software products (Humphrey 1989). Many companies have either a formal or informal SPI programme based on one of the fashionable SPI models. Generally, SPI is reported to deliver significant benefits (Humphrey et al.

∗Correspondence to: Tracy Hall, Computer Science Department, University of Hertfordshire, Hatfield, AL10 9AB, UK
†E-mail: [email protected]
Contract/grant sponsor: UK Engineering and Physical Science Research Council; contract/grant number: EPSRC GR/L91962.


1991, Johnson 1994). Research shows high maturity companies benefit from increased product quality, improved customer satisfaction and reduced risk (Herbsleb and Goldenson 1996, Humphrey et al. 1991). Some companies report impressive SPI returns:

‘The [Space Shuttle] project reported a 300% improvement in productivity and two orders of magnitude reduction in defect rates.’ (Paulk et al. 1994, p. 100).

Despite this, many companies continue to report:

• low process maturity (Paulk and Chrissis 2000);
• difficulty getting SPI off the ground (Wilson et al. 2000);
• failure to maintain SPI in the longer term (Paulish and Carleton 1994).

In this paper we explore how ‘typical’ software companies are meeting the challenge of implementing SPI. We particularly explore the non-technical, people-management factors that may explain why


many companies are not achieving high process maturity. Indeed, as DeMarco and Lister say:

‘If you find yourself concentrating on the technology rather than the sociology, you are like the vaudeville character who loses his keys on a dark street and looks for them on the adjacent street because, as he explains, ‘The light is better there’.’ (DeMarco and Lister 1987, p. 6).

Other empirical studies on SPI confirm the importance of people factors. Kaltio and Kinnula’s (2000) work on deploying defined software processes found that people factors were most important, and a recent study of process assessment found a relationship between high process maturity and high staff morale (Herbsleb and Goldenson 1996). We focus on the organizational, management and human factors associated with implementing SPI in industry. These are factors that have been said to be neglected by software engineering research (McDermid and Bennett 1999).

Our work complements the extensive SPI work undertaken by the Software Engineering Institute (SEI) as exemplified by the CMM (Humphrey 1989). CMM identifies the software engineering activities that companies should have in place at various stages in their maturity. By contrast, our work focuses on how companies can best implement these activities. This is an area increasingly identified as critical to SPI success (Humphrey 1998), as Kaltio and Kinnula say:

‘. . . capitalisation of organisational learning is through process deployment, without deployment process assets are little more than shelfware.’ (Kaltio and Kinnula 2000).

Indeed, studies show that 67% of SPI managers want guidance on how to implement SPI activities, rather than what SPI activities to implement (Herbsleb and Goldenson 1996).

The work we present here is unusual. Most SPI work is based on single case studies. Such work has been criticized for being company-specific and therefore potentially unrepresentative (El Emam and Briand 1997, Herbsleb and Goldenson 1996). We not only present data from 85 companies, but data collected by us as an impartial third party.

In Section 2 of the paper we identify the critical implementation factors that have been presented in the literature. In Section 3 we describe more fully our study methods and outline the companies in the study. In Section 4 we present our survey results. We go on to discuss these results in Section 5 and summarize and conclude in Section 6. We briefly explain our plans for future work in Section 7.

2. BACKGROUND

Appendix 1 shows that since the inception of CMM there has been a tremendous increase in SPI publications during the last ten years. We present the key implementation factors that have been reported in this literature.

2.1. Human Factors

‘all personnel must be interested in process, as the success of software projects depends on process’ (Komiyama et al. 2000).

This slightly unconventional angle emphasizes the inextricable relationship between SPI and people. Indeed, many of the case studies reported in the literature consider human factors critical to SPI success. Small companies are reported to be particularly vulnerable to human factors, as they usually depend heavily on key individuals (Horvat et al. 2000). Reports identify a variety of human factors that must be handled effectively.

2.1.1. SPI Leaders
Companies report that SPI managers contribute significantly to SPI success. Stelzer and Mellis (1998) analysed organizational factors in SPI and identified the importance of change agents and opinion leaders in managing SPI. Herbsleb and Goldenson (1996) identified the need to appoint highly respected people to SPI. This view is supported by the experience of Hughes Aircraft, where software process expertise is considered an essential precursor to SPI success (Humphrey et al. 1991).

2.1.2. Management Commitment
Almost every SPI publication emphasizes the criticality of gaining and maintaining management support for SPI. Indeed, as long ago as 1991 management commitment is reported to have played


a major role in the success of SPI at Hughes Aircraft (Humphrey et al. 1991). However, management commitment is reported to have many strands. As well as being generally important (Paulish and Carleton 1994), commitment must be visible (Pitterman 2000) and consistent (Butler 1997). Other reports also comment on the importance of ‘managing’ (Stelzer and Mellis 1998) and ‘monitoring’ (Goldenson and Herbsleb 1996) the improvement project itself.

Reports of SPI at Schlumberger demonstrate the value of good management. They found that when a manager with high SPI commitment was replaced by one with less commitment, previous process improvements were lost (Wohlwend and Rosenbaum 1994).

2.1.3. Staff Involvement
Most SPI publications report on the importance of involving development staff in SPI (Stelzer and Mellis 1998, Goldenson and Herbsleb 1995). Case studies report that without developer involvement, cynicism can develop (Herbsleb and Goldenson 1996). Generally, SPI publications support existing ideas on developer empowerment and involvement (Bach 1995). For example, the need to generate a culture of process ownership is emphasized (Pitterman 2000), as is the need to value SPI as ‘real work’ (Butler 1997). The success of process improvement in the Space Shuttle Project is largely attributed to driving SPI from the bottom up:

‘staff are the most effective locus of process management, since they are the closest to it and are best able to interpret process data’ (Paulk et al. 1994).

2.2. Organizational Factors

A variety of organizational factors are reported to impact on SPI. Even in 1993, Kitson and Masters’ (1993) study of 59 companies revealed that half faced important organizational problems which were outside the scope of SPI. For example, they found that ineffective organizational structures and the after-effects of previous failed initiatives damaged SPI. Furthermore, cultural problems feature strongly as barriers to SPI success. For example, organizational politics and ‘turf guarding’ are said to have a negative impact on SPI (Herbsleb and Goldenson 1996).

2.2.1. Communication
The theme of effective communication recurs in the SPI literature (Stelzer and Mellis 1998). Effective lines of communication are said to benefit SPI; for example, communication gives opportunities for sharing best practice (Stelzer and Mellis 1998). One of the features of successful SPI is said to be multi-channelled communications. For example, in the Space Shuttle Project (Paulk et al. 1994) feedback and communication points are designed into many aspects of SPI. Furthermore, Schlumberger reports that SPI actually generates better communication between and within departments (Wohlwend and Rosenbaum 1994).

2.2.2. Resources
All case studies emphasize the importance, and difficulty, of adequately resourcing SPI. Nokia reports that it is necessary to assign resources to the deployment of new processes (Kaltio and Kinnula 2000). Siemens report that dedicated SPI resources are essential (Paulish and Carleton 1994). Many SPI commentators say that insufficient staff time and resources are a major impediment to SPI success. Herbsleb and Goldenson (1996) found that 77% of companies consider that SPI suffers from lack of resources. Furthermore, SPI costs are reported to be a major disincentive to small companies as costs are disproportionate to the size of the organization (Brodman and Johnson 1994).

2.3. Implementation Factors

Nokia describes a process that was never performed because of poor implementation: its documentation was difficult to access, outdated and lacked coverage (Kaltio and Kinnula 2000). This experience emphasizes the importance of getting implementation right. Many implementation factors are reported in the literature as critical to SPI success.

2.3.1. SPI Infrastructure
Case studies report a variety of ways to deploy SPI effort. Curtis (2000) found that each of his three case study companies had central SPI steering groups supported by local improvement staff. The Space Shuttle Project has a sophisticated approach to organizing SPI (Paulk et al. 1994). ‘Control boards’ consisting of a multi-functional team provide points of co-ordination on crucial issues. Management responsibilities have also been delegated to control


boards and a determined effort made to develop managers from within the project’s existing staff.

2.3.2. Setting Objectives
Another recurring theme relates to companies setting explicit goals and objectives for SPI. Curtis (2000) found that each of the three high maturity companies he analysed ‘had quantifiable business targets and managed their pursuit of these targets empirically’. Other commentators emphasize the need to set relevant and realistic objectives (Stelzer and Mellis 1998) that are clearly stated and well understood (Herbsleb and Goldenson 1996). Furthermore, the Oklahoma City Air Logistics Centre found that priorities and goals must be highly tailored to the needs of the particular organization (Butler 1997). Reports also comment on the need to relate goals and objectives to formal process definitions (Pitterman 2000) and action plans (Humphrey et al. 1991).

2.3.3. Tailoring SPI
A recurring theme in the SPI literature is the need to tailor SPI to particular company requirements (Stelzer and Mellis 1998, Butler 1997, Kaltio and Kinnula 2000). Related to this, SPI must also be tailored to the appropriate maturity level: high maturity processes must not be implemented in low maturity projects (Kaltio and Kinnula 2000).

2.3.4. Evaluation
Companies also report on the importance of evaluating SPI progress. Curtis (2000) found that each of his three case studies ‘demonstrated improvements on multiple criteria’, including ‘time to market’. Reports from the Space Shuttle Project also say that ‘process improvement must be applied with constant attention to its impact on results’ (Paulk et al. 1994). Corning Inc. measured SPI success using product quality and project slippage (Johnson 1994).

3. THE STUDY METHODS

3.1. Survey Design

We used the implementation factors reported in the SPI literature to design a questionnaire measuring the use of these factors in the UK software industry. The questionnaire was designed according to classical questionnaire design principles (Berdie and Anderson 1974) and can be viewed

at [http://homepages.feis.herts.ac.uk/~pppgroup/questionnaire.html]. The aim was to gather quantitative and qualitative data characterizing the implementation of process improvement in software companies.

We identified a target sample of software companies using public domain information, e.g. relevant mailing lists and conference attendance lists. Between November 1999 and February 2000 we sent questionnaires to 1000 software companies, posting them to software process improvement managers. We used standard approaches to following up non-respondents and generated 200 responses. We used only those questionnaires where companies had a software development function and where an attempt had been made at SPI. This left 85 fully relevant questionnaires. Although our response rate may appear low, a response rate of 20 per cent is considered acceptable. Furthermore, we know of no other detailed data characterizing SPI in the UK software industry, and so we consider responses from 85 companies to be good.

3.2. Companies in the Study

The tables in Appendix 2 provide some basic demographic data characterizing the companies who responded to the questionnaire. The tables show that a range of companies are represented in our sample. The sample has the following main characteristics:

• a balance of UK and multinational companies;
• a range of software development function sizes;
• a varied profile of company ages, with most companies having been established over ten years ago and many over 20 years ago;
• a very high proportion of ISO9001 certified companies (this is probably a reflection of our targeting techniques rather than being representative of the industry as a whole);
• a balanced representation of application areas, with a well balanced split between bespoke and commercial development activities;
• a broad focus in software development effort, with development, maintenance, support and operations complementing a small amount of consultancy.

Only 15 of the 85 companies in our sample said they had undergone formal CMM assessment. Maturity is one characteristic of the companies that we are


interested in, and so we emulated Herbsleb and Goldenson (1996) and asked the other 70 companies to informally estimate their CMM level (see Table A7 in Appendix 2).

Our sample contains very few companies beyond level 2. This is probably typical, as Paulk and Chrissis (2000), in their survey of high maturity organizations, refer to only 44 level 4 organizations and 27 level 5 organizations in the world (though they say that there are probably more).

4. FINDINGS

4.1. Initiating SPI

Table 1 shows that a large proportion of companies in our sample have attempted SPI. Furthermore, Table 2 shows that the vast majority of these have been formal rather than informal approaches to SPI (in the questionnaire we defined formal as SPI supported by documentation).

Table 3 shows that SPI is firmly established in many companies. Only a fifth of companies say SPI is less than two years old and 35% say it has been in place for more than five years. Table 3 also suggests that the initiation of SPI may have started to decline, as further analysis of the data shows that only 9% of SPI programmes have been set up in the last year.

In total, 72% of respondents told us that SPI in their company had clearly stated objectives. Table 4

Table 1. SPI across companies

Has your company tried to improve its software development process?

          Frequency   Percentage
Yes       79          93
No        6           7
Total     85          100

Table 2. Formality of SPI

How formal is your company’s approach to software process improvement?

           Frequency   Percentage
Formal     65          83
Informal   13          17
Total      78          100
Missing    7

Table 3. Age of SPI

How long has your company’s process improvement programme been in operation?

                    Frequency   Percentage
0–2 years           16          21
3–5 years           34          44
More than 5 years   27          35
Total               78          100
Missing             7

Table 4. Objectives for SPI

Why did your company embark on process improvement? (Respondents could select more than one motivation)

                                Frequency   Percentage
Improve software quality        67          79
Reducing development costs      61          72
Shorten development times       61          72
Increase productivity           47          55
Improve management visibility   38          45
Meet client requirement         22          26
For marketing purposes          18          21

identifies particular SPI objectives and shows that the most popular reasons for introducing SPI are:

• improving software quality;
• reducing costs;
• reducing timescales.

Table 4 also shows that relatively few companies introduced SPI because of client requirements or for marketing purposes.

4.2. The Design of SPI Programmes

To understand the way companies have approached SPI, we asked respondents to comment on various aspects of SPI design promoted in the SPI literature. First we asked whether respondents agreed that particular aspects of SPI design were important, and second, we asked them to assess whether their company uses these SPI design principles.

Table 5 shows that almost all respondents believed it was important to:

• gain senior management commitment to SPI;
• tailor SPI to the needs of the particular organization;
• align SPI goals with organizational goals.


Table 5. Essential components of SPI

                                                         Agreement with statement   Score out of five on company implementation (1 = low; 5 = high)
It is important. . .                                     Percentage                 Median   Mode
for senior management to be committed to SPI             93                         4        5
to tailor SPI to the needs of particular companies       92                         4        4
that the goals of SPI are congruent with company goals   89                         4        4
to make realistic assessments of SPI paybacks            63                         2        1
to research SPI before developing an SPI programme       60                         3        2

Furthermore, Table 5 shows that respondents judged their organizations relatively highly for achieving these aims. Respondents scored organizations lower for making realistic assessments of paybacks, despite this factor being valued as important by 63% of respondents. Table 5 also shows mixed views and scores on the importance of doing research on SPI.
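Tables 5 and 10 summarize 1 to 5 self-assessment ratings using medians and modes, which suit ordinal data better than means. As a minimal sketch of how such Likert-style responses could be summarized (the ratings below are illustrative, not taken from the survey):

from statistics import median, multimode

# Illustrative 1-5 ratings for a single questionnaire statement
# (e.g. company implementation of senior management commitment).
# These values are hypothetical, not survey responses.
ratings = [5, 4, 4, 5, 3, 4, 2, 5, 4, 4]

print("median:", median(ratings))     # 4.0 for this sample
print("mode:  ", multimode(ratings))  # [4] for this sample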

Table 6 shows that less than half of the companies had published an SPI implementation plan. Of the companies with no implementation plan, further analysis shows that many had an informal approach to SPI. However, of the 65 companies identified in Table 2 as having a formal approach to SPI, 30 had no implementation plan.

Table 6. SPI implementation plan

Did your company publish a SPI implementation plan?

             Frequency   Percentage
Yes          35          46
No           39          51
Don’t know   2           3
Total        76          100
Missing      9

Table 7. Introduction of SPI

How was the improvement programme introduced?

                             Frequency   Percentage
Phased, through projects     23          30
Phased, through teams        4           5
Phased, through SPI issues   28          36
Throughout the company       16          21
Don’t know                   6           7
Total                        77          100
Missing                      8

Table 7 shows variable approaches to implementing SPI. The most popular implementation was to phase the introduction of SPI through projects. This is, perhaps, no surprise given the project-based nature of most companies. However, a large proportion of companies phased SPI via SPI issues. A fifth of companies implemented SPI throughout the company in one go.

4.3. SPI Personnel

Table 8 outlines the number of SPI staff in our sample of companies. On average, each company has three full-time SPI staff. However, two companies report 15 SPI staff. Further analysis shows that there is a relationship between software engineering effort and SPI effort. For example, all of the companies who have no dedicated SPI staff are small companies (less than 100 software staff). Furthermore, 85% of all the small companies in this sample report less than five dedicated SPI staff; 66% of companies employing more than 500 software staff report between five and 15 SPI staff. These results suggest that the direct costs of SPI are relatively small. However, this relatively small

Table 8. SPI staff effort

How many people are dedicated to SPI in your company?

Full-time equivalents   Frequency   Percentage
0                       18          26
1–4                     37          53
5–10                    12          17
More than 10            3           4
Total                   70          100
Missing                 15


dedicated SPI resource may explain the results presented in Table 10, where respondents say that SPI is inadequately funded.

SPI infrastructure varied between companies. Only 25 companies had a central SPI resource (a Software Engineering Process Group (SEPG) or a Software Process Action Team (SPAT)). Only 39% of SPI teams were independent of the development function. However, 51% of respondents said that clear responsibilities were assigned to SPI teams.

4.4. People Issues in SPI

We asked companies to comment on various human issues in SPI. We first asked about the use of specific human factors, and then about the impact of use. Table 9 summarizes responses.

Table 9 shows that the use of experienced staff in SPI seems to be the most important factor. Other important factors include:

• internal leadership;
• process ownership;
• executive support.

Table 9 also shows that 59% of companies use external consultants in SPI, with only 25% reporting high benefit from them.

Table 10 shows the factors that respondents considered important to establishing buy-in to SPI.

How well respected SPI staff were emerged as an important factor, and one that companies also scored highly on (this may be a biased answer as respondents were mostly SPI managers). Participation by developers in SPI also emerged as important, and was another factor on which companies scored highly. However, companies scored less well on adequately resourcing SPI and on selling SPI to developers.

Table 11 shows that respondents believe that, on the whole, developers have responded positively to SPI. Only 26% of companies report a less than satisfactory reception from developers. This indicates that developers are more receptive to SPI than conventional wisdom suggests. Furthermore, we found

Table 11. Developers’ response to SPI

How have developers generally responded to SPI?

                    Frequency   Percentage
Very enthusiastic   9           12
Satisfactory        45          59
Some interest       12          16
Indifference        8           10
Don’t know          2           3
Total               76          100
Missing             9

Table 9. Human factors in SPI

                       Use of factor (percentage)   Benefit of using factor (percentage)
                       No    Yes                    DK   None   Low   High
Experienced staff      1     99                     4    1      24    71
Internal leadership    10    90                     7    10     30    53
Process ownership      10    89                     8    10     34    48
Executive support      16    84                     12   6      38    44
External consultants   41    59                     6    39     31    25

Table 10. Establishing buy-in to SPI

                                           Agreement with statement   Score on company implementation (1 = low; 5 = high)
It is important that SPI. . .              Percentage                 Median   Mode
teams consist of highly respected staff    95                         4        4
involves developers                        91                         4        5
is properly resourced                      89                         3        3
is ‘sold’ to developers from the start     86                         3        3
training is provided                       76                         3        4


significant relationships (using the chi square statistic) between positive responses from developers and the following factors: feedback being provided to them, highly respected SPI staff, and SPI training being provided. We found no relationship between developer enthusiasm and company maturity.
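The relationships above, and those reported in Section 5, were tested with the chi square statistic applied to cross-tabulated questionnaire responses. As a minimal sketch of how such a test of independence can be run (the contingency counts below are hypothetical, and the use of scipy is an assumption; the paper does not describe the tooling used):

from scipy.stats import chi2_contingency

# Hypothetical cross-tabulation of questionnaire responses:
# rows = developer response to SPI, columns = whether feedback was provided.
# These counts are illustrative only, not the survey data.
observed = [
    [30, 10],  # positive response: feedback provided, no feedback
    [8,  12],  # negative response: feedback provided, no feedback
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p-value = {p:.4f}")
if p < 0.05:
    print("Association is significant at the 5% level")

A test of this kind indicates an association between two categorical variables; it says nothing about direction or cause, which is consistent with the cautious wording used in these findings.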

4.5. Evaluating SPI

Table 12 shows that while 68% of companies consider SPI to have been successful, 23% say it has been less than successful. In fact, only 16% of companies chose the highest success rating. On the other hand, Table 13 shows that SPI has delivered benefit to management in most companies. Our further

Table 12. Success of SPI

How successful has SPI been in your company?

                        Frequency   Percentage
Not successful          2           3
Marginally successful   15          20
Successful              40          52
Very successful         12          16
Not able to assess      8           10
Total                   77          100
Missing                 8

Table 13. Management value

Has SPI delivered benefits to management?

             Frequency   Percentage
Yes          54          70
No           9           12
Don’t know   14          18
Total        77          100
Missing      8

analysis of the companies reporting success with SPI showed that resourcing was the only factor with a significant relationship (using the chi square statistic) to SPI success. We found no other significant relationships with SPI success.

Table 14 shows that less than a third of companies have evaluated the impact of SPI. However, Table 15 shows that more than half of the companies measure SPI effort. This suggests that while companies are keen to measure costs, they are less keen to measure other aspects of SPI. Furthermore, given the relatively small amount of evaluation data available, it is difficult to understand how companies are assessing SPI success.

Of the 22 respondents (26%) who said their company did measure the impact of SPI, Table 16 shows

Table 14. Evaluation of SPI

Has your company evaluated the impact of SPI?

             Frequency   Percentage
Yes          22          29
No           46          63
Don’t know   7           9
Total        75          100
Missing      10

Table 15. Effort in SPI

Does your company measure SPI effort?

             Frequency   Percentage
Yes          42          54
No           35          45
Don’t know   1           1
Total        78          100
Missing      7

Table 16. Evaluation factors (all figures are percentages of the 22 companies collecting evaluation data)

Evaluation measures     Use of measure    Benefit of use
                        No    Yes         None   Low   High   DK
Development time        4     96          0      27    64     9
Development costs       5     95          0      22    72     6
Customer satisfaction   7     93          4      20    68     8
Post-delivery defects   11    89          8      28    64     0
Defect density          22    78          15     30    55     0


the factors that were tracked. These factors generally correspond to the motivators for SPI that are presented in Table 4.

4.6. SPI and CMM Maturity

Table 17 indicates a relationship between maturity and SPI success (significant at the 5% level). Higher maturity companies are more likely to consider SPI successful than low maturity companies. This may be rather a tautology but nevertheless reconfirms the importance of process maturity. Table 18 shows that there is no clear relationship between maturity and the length of time companies have been using SPI. This is a useful result, as it suggests that the acceleration of maturity is not necessarily linear.

5. DISCUSSION OF RESULTS

Generally, our results reveal that companies have been using SPI over a relatively long period of time. Despite this, comparatively few companies in

our study report high process maturity. Companies in our sample do not seem to be accelerating SPI as quickly as has been reported elsewhere (Hayes and Zubrow 1995). This is probably because the case studies reported in the literature describe leading-edge SPI efforts. Companies in our study are probably more typical of the industry overall. Our findings substantiate Curtis’ view (Curtis 2000) that the most successful companies have now mastered SPI, but the rest of the industry is yet to catch up. Furthermore, our results suggest that there is still a long way to go for the majority of companies. On the other hand, our results may be linked to Paulk et al.’s findings on the time lapse between implementing SPI and generating benefit: ‘In some cases process changes took up to two years to demonstrate results.’ (Paulk et al. 1994, p. 98).

Our findings suggest that the biggest impediment to SPI success is inadequate resourcing. Respondents told us that resourcing was important to SPI success, but that companies were failing to provide adequate resources. Furthermore, we found that resources were the only single factor directly

Table 17. CMM maturity and SPI success

CMM maturity levels     How successful has SPI been?
(formal and informal)   Less successful            More successful
                        Frequency   Percentage     Frequency   Percentage
1                       9           41             3           8
2                       9           41             19          50
3                       2           9              8           21
4                       0           0              2           5
5                       0           0              0           0
DK                      2           9              6           13
Total                   22          100            38          100

Table 18. CMM maturity and SPI age

CMM levels              How long has SPI been in operation?
(formal and informal)   Less than 2 years          3–5 years                  More than 5 years
                        Frequency   Percentage     Frequency   Percentage     Frequency   Percentage
1                       4           33             6           23             2           9
2                       6           50             12          46             10          43
3                       2           22             2           8              6           27
4                       0           0              1           4              1           4
5                       0           0              0           0              0           0
DK                      0           0              5           19             3           13
Total                   12          100            26          100            22          100


related to SPI success. Perversely, resources were the only consistently collected measurement data. This indicates that companies were very interested in monitoring resources. This may explain why respondents said that inadequate resources were provided.

Companies were also generally ineffective at evaluating the impact of SPI. SPI effort was generally not focused and performance was not systematically assessed. The only aspect of SPI that seemed to be evaluated was costs. However, this result may be related to the overall low maturity of the sample.

Companies did not value background SPI research particularly highly. Linked to this is our finding that, on the whole, companies did not place a particularly high value on the input of external consultants. Companies seemed happiest relying on internal process expertise. This may be linked to the importance companies place on tailoring SPI to the particular needs of the company.

The quality of internal SPI staff emerged as a fundamentally important aspect of SPI success. Respondents felt that the use of experienced people in SPI teams was critically important to SPI success, and that developer buy-in to SPI was dependent on respect for SPI staff. Our findings support experiences from the Space Shuttle case study (Paulk et al. 1994), where SPI was found to be implemented most successfully by experienced people.

Overall, our findings indicate that companies had a good understanding of the human factors associated with SPI. Most companies involved developers in SPI and understood the value of communicating to developers about SPI. Furthermore, companies also seemed to understand the importance of management commitment to SPI and most seemed to be successfully demonstrating such commitment. Overall, our findings show that developers were responding relatively positively to SPI and that developers were most positive about SPI when they received plenty of feedback on SPI.

6. SUMMARY AND CONCLUSIONS

Our findings generally indicate that SPI is progressing in the UK software industry. We show that SPI activity is widespread and that management benefit

is generated by SPI. Furthermore, companies generally consider SPI reasonably successful. On the other hand, companies in our sample are not maturing as quickly as those described in some of the experiences reported in the literature.

Overall, companies show a good understanding of the human issues related to SPI implementation. Many of the implementation factors cited in the literature are being addressed. In particular, companies show a good appreciation of the importance of involving developers in SPI and demonstrating senior management commitment to SPI. Companies report relatively high developer enthusiasm for SPI. Our results show companies are performing less well on adequately resourcing SPI. We show a statistically significant relationship between resources and SPI success. This confirms the experiences of companies cited in the literature.

In addition, although most companies claim to have objectives for SPI, many are failing to adequately track and evaluate SPI.

We also confirm the importance of high quality SPI staff. Our study provides corroborating evidence of the relationship between highly respected SPI staff and SPI success. This relationship has been identified in a number of the SPI case studies published in the literature.

Overall, our findings show steady SPI progress in the UK software industry, progress that may be accelerated by improvement in the few weak implementation areas we identify.

7. FUTURE WORK

In conjunction with the questionnaire data we present here, we have also collected detailed case study data from 13 UK software companies. We plan to analyse this data to further investigate some of the issues uncovered by the questionnaire data. Our ultimate aim is to construct and validate a maturity-based model of SPI implementation factors.

APPENDIX 1: A TREND ANALYSIS OF PROCESS IMPROVEMENT PUBLICATIONS

Figure A1 shows a plot of the total number of papers published against the number of papers published with the phrase ‘process improvement’


Figure A1. Number of papers published versus number of papers published with the phrase ‘process improvement’ in their title or abstract. (We were unable to obtain various publications for 1995. This partially explains the dip in SPI publications for that year.)

in their title or abstract. The software engineering publications we analysed were: IEEE Software; IEEE Transactions on Software Engineering; Communications of the ACM; Journal of Systems and Software; Proceedings of the IEEE International Conference on Software Engineering; Empirical Software Engineering Journal (began publishing in 1996); Software Process Improvement and Practice Journal (began publishing in 1996).

APPENDIX 2: DEMOGRAPHIC DATA FROM QUESTIONNAIRE RESPONSES

Table A1. Scope of company

                Frequency   Percentage
Multinational   46          55
UK-based        38          45
Total           84          100
Missing         1

Table A2. Size of development effort

Staff numbers   Frequency   Percentage
0–25            38          45
26–100          22          26
101+            23          27
DK              1           1
Total           84          100
Missing         1

Table A3. Age of company

Years     Frequency   Percentage
0–5       2           2
6–10      12          14
11–20     34          40
20+       37          43
Total     85          100
Missing   0

Table A4. ISO certification status

          Frequency   Percentage
Yes       77          95
No        4           5
Total     81          100
Missing   4

Table A5. Effort areas (respondents could choose more than one effort area)

                      Frequency   Percentage
System development    58          68
System maintenance    32          38
User support          33          39
Computer operations   5           6
Consultancy           23          27


Table A6. Application areas (respondents could choose more than one application)

                      Frequency   Percentage
Bespoke systems       64          75
Commercial packages   44          52
Safety critical       24          28
Data processing       45          53
Business systems      54          63
Systems software      37          44
Telecommunications    34          40

Table A7. CMM maturity

CMM maturity levels   Formal assessment          Informal assessment
                      Frequency   Percentage     Frequency   Percentage
1                     5           33             13          21
2                     3           20             28          46
3                     5           33             10          16
4                     2           13             2           3
5                     0           0              0           0
DK                    0           0              8           13
Total                 15          100            61          100

ACKNOWLEDGEMENTS

We are sincerely grateful to all the companies and practitioners (who, for reasons of confidentiality, must remain anonymous) for their participation in this project. We are also grateful to David Wilson from the University of Technology, Sydney, for his contribution to this project. The project is funded by the UK’s Engineering and Physical Science Research Council, under grant number EPSRC GR/L91962.

REFERENCES

Bach J. 1995. Enough about process: what we need are heroes. IEEE Software 12(2): 96–98.

Berdie DR, Anderson JF. 1974. Questionnaires: Design and Use. The Scarecrow Press: Metuchen.

Brodman JG, Johnson DL. 1994. What small businesses and small organizations say about the CMM. 16th International Conference on Software Engineering, May 16–21.

Butler KL. 1997. Process lessons learned while reaching level 4. CrossTalk May: 1–6.

Curtis B. 2000. The global pursuit of process maturity. IEEE Software July/August: 76–78.

DeMarco T, Lister T. 1987. Peopleware: Productive Projects and Teams. Dorset House Publishing.

El Emam K, Briand L. 1997. Costs and benefits of software process improvement. Technical Report ISERN-97-12, Fraunhofer Institute for Experimental Software Engineering.

Goldenson DR, Herbsleb JD. 1995. After the appraisal: a systematic survey of process improvement. CMU/SEI-95-TR-009, Software Engineering Institute, Carnegie Mellon University.

Hayes W, Zubrow D. 1995. Moving on up: data and experience doing CMM-based process improvement. Technical Report CMU/SEI-95-TR-008, Software Engineering Institute.

Herbsleb JD, Goldenson DR. 1996. A systematic survey of CMM experience and results. 18th International Conference on Software Engineering (ICSE), Berlin, Germany, 25–29 March, 323–330.

Humphrey WS. 1989. Managing the Software Process. Addison-Wesley.

Humphrey WS, Snyder TR, Willis RR. 1991. Software process improvement at Hughes Aircraft. IEEE Software 8(4): 11–23.

Humphrey WS. 1998. Why don’t they practice what we preach? Annals of Software Engineering 6: 201–222.

Horvat RV, Rozeman I, Gyorkos J. 2000. Managing the complexity of SPI in small companies. Software Process: Improvement and Practice 5: 45–54.

Johnson A. 1994. Software process improvement experience in the DP/MIS function. In Proceedings of the IEEE International Conference on Software Engineering.

Kaltio T, Kinnula A. 2000. Deploying the defined software process. Software Process: Improvement and Practice 5: 65–83.

Kitson DH, Masters SM. 1993. An analysis of SEI software assessment results. In Proceedings of the IEEE International Conference on Software Engineering.

Komiyama T, Sunazuka T, Koyama S. 2000. Software process assessment and improvement in NEC: current status and future direction. Software Process: Improvement and Practice 5: 31–43.

McDermid JA, Bennett KH. 1999. Software engineering research: a critical appraisal. IEE Proceedings – Software 146(4): 179–186.


Paulish DJ, Carleton AD. 1994. Case studies of software-process-improvement measurement. Computer 27(9): 50–57.

Paulk MC, Chrissis MB. 2000. The November 1999 High Maturity Workshop. Software Engineering Institute, Carnegie Mellon University.

Paulk MC, Weber CV, Curtis B, Chrissis MB (eds). 1994. A high-maturity example: Space Shuttle onboard software. In The Capability Maturity Model: Guidelines for Improving the Software Process. Addison-Wesley: Harlow, UK.

Pitterman B. 2000. Telcordia Technologies: the journey to high maturity. IEEE Software 17(4): 89–96.

Stelzer D, Mellis W. 1998. Success factors of organizational change in software process improvement. Software Process: Improvement and Practice 4(4): 227–250.

Wilson D, Hall T, Baddoo N. 2000. The software process improvement paradox. In Approaches to Quality Management, Chadwick D, Hawkins C, King G, Ross M, Staples G (eds). British Computer Society Publication: BCS, UK; 97–107.

Wilson D, Hall T, Baddoo N. 2002. A framework for evaluation and prediction of software process improvement success. Journal of Systems & Software (to appear).

Wohlwend H, Rosenbaum S. 1994. Schlumberger’s software improvement program. IEEE Transactions on Software Engineering 20(11): 833–839.
