IWSM 2014: The Importance of Benchmarking (John Ogilvie & Harold van Heeringen)

The importance of benchmarking software projects
Using function points and historical data to improve organizational success
Harold van Heeringen, ISBSG president; John Ogilvie, CEO ISBSG

DESCRIPTION

IWSM Presentation

TRANSCRIPT

1. Harold van Heeringen, ISBSG president; John Ogilvie, CEO ISBSG

2. Topics
- Benchmarking
- The software project industry
- Comparing apples to apples
- Three cases from experience: reality check of a proposal; competitiveness analysis; supplier performance measurement
- Other uses of the ISBSG data
- Data submission initiative

3. Benchmarking (Wikipedia)
Benchmarking is the process of comparing one's business processes and performance metrics to industry bests or best practices from other industries. Benchmarking is used to measure performance using a specific indicator (cost per unit of measure, productivity per unit of measure, cycle time of x per unit of measure or defects per unit of measure), resulting in a metric of performance that is then compared to others. This then allows organizations to develop plans on how to make improvements or adapt specific best practices, usually with the aim of increasing some aspect of performance. Benchmarking may be a one-off event, but is often treated as a continuous process in which organizations continually seek to improve their practices.

4. Where are we now?
Even the most detailed navigation map of an area is useless if you don't know where you are.

5. Informed decisions
Senior management of IT departments and organizations need to make decisions based on where they are and where they want to go. Benchmarking is about determining where you are compared to relevant peers, in order to make informed decisions. But how do you measure and determine where you are?

6. Software project industry
Low performance-metrics maturity:
- Few performance measurement processes implemented
- Few benchmarking processes implemented
Most organizations don't know how good or how bad they are at delivering or maintaining software. These organizations are not able to assess their competitive position, nor to make informed strategic decisions to improve it.

7. But...
Best-in-class organizations deliver software up to 30 times more productively than worst-in-class organizations:
- High productivity, high quality
- More functionality for the users against lower costs: value
- Shorter time to market: competitive advantage
Worst-in-class organizations will find themselves in trouble in an increasingly competitive market:
- Outperformed by the competition
- Internal IT departments get outsourced
- Commercial software houses fail to win new contracts
It is important to know where you stand. Benchmarking is essential!

8. Difficulty: low industry maturity
How do you measure metrics like productivity, quality and time to market in such a way that a meaningful comparison is possible? Comparing apples to apples.

9. Software is not easy to compare.

10. Functional Size Measurement
Function Point Analysis (NESMA, IFPUG or COSMIC):
- Measures the size of the functional user requirements in function points
- ISO standards: objective, independent, verifiable, repeatable
- Strong relation between functional size and the project effort needed
What to do with the results?
- Project effort/duration/cost estimation
- Benchmarking/performance measurement
- Use in Request for Proposal management (answer price-per-FP questions)
What about historical data?
- Company data (preferred for estimation)
- Industry data (necessary for external benchmarking)

11. Historical data: the ISBSG repositories
- International Software Benchmarking Standards Group
- Independent and not-for-profit
- Grows and exploits two repositories of software data: new development projects and enhancements (> 7,000 projects); maintenance and support (> 1,100 applications)
- Everybody can submit project data: DCQ on the site; anonymous; free benchmark report in return; new rewards to be added soon!
(A minimal sketch of such a productivity comparison follows below.)
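To make the "where do you stand" comparison concrete, here is a minimal sketch, in Python, of positioning one project's Project Delivery Rate (PDR, hours per function point) against benchmark percentiles. The helper functions and the project figures are illustrative assumptions, not ISBSG tooling; the percentile values mirror those quoted in Case 1 below, and in practice they would come from a filtered extract of the ISBSG repository.

```python
# Minimal sketch: position a project's productivity (PDR, hours per
# function point) against benchmark percentiles. All values are
# illustrative; real percentiles would come from a filtered ISBSG extract.

def pdr(effort_hours: float, size_fp: float) -> float:
    """Project Delivery Rate = effort in hours per function point."""
    return effort_hours / size_fp

def position(value: float, percentiles: dict[int, float]) -> str:
    """Return the first benchmark percentile the value falls at or below
    (a lower PDR is better, so a low percentile means high productivity)."""
    for p in sorted(percentiles):
        if value <= percentiles[p]:
            return f"at or below the {p}th percentile"
    return f"above the {max(percentiles)}th percentile"

# Hypothetical benchmark percentiles for comparable projects (h/FP).
benchmark = {10: 4.3, 25: 6.2, 50: 8.9, 75: 12.9, 90: 19.8}

project_pdr = pdr(effort_hours=5500, size_fp=850)   # ~6.5 h/FP
print(f"PDR = {project_pdr:.1f} h/FP, {position(project_pdr, benchmark)}")
```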
12. Three cases from my experience
- Case 1: Telecom project, reality check on the expert estimate
- Case 2: Assessing the competitive position of an organization
- Case 3: Supplier performance measurement

13. Case 1
- A telecom company wished to develop a new Java system for the maintenance of subscription types.
- A team of experts studied the requirements documents and filled in the WBS-based estimation calculation (bottom-up estimate).
- They decided that an estimate of 5,500 hours and a duration of 6 months should be feasible.
- The project manager decided not to take the experts at their word and wished to carry out a reality check.

14. ISBSG reality check: effort
An estimated FPA comes up with the expected size: min 550 FP, likely 850 FP, max 1,300 FP. Selecting the most relevant projects in the ISBSG D&E repository (N = 89) gives the following PDR distribution (h/FP): min 3.2; 10th percentile 4.3; 25th percentile 6.2; median 8.9; 75th percentile 12.9; 90th percentile 19.8; max 34.2.

Effort (hours) per size scenario:

                         550 FP    850 FP    1,300 FP
25th percentile PDR       3,410     5,270       8,060
Median PDR                4,895     7,565      11,570
75th percentile PDR       7,095    10,965      16,770

5,500 hours seems optimistic.

15. ISBSG reality check: duration
The same analysis is possible. In addition, formulas have been published in the Practical Project Estimation book. For instance, table C-2.2, project duration estimated from software size only:

Duration = C * Size^E1, with C = 0.507 and E1 = 0.429 taken from the table.

For a functional size of 550 FP this gives a duration of 7.6 elapsed months.

                     550 FP    850 FP    1,300 FP
Duration (months)       7.6       9.2        11.0

16. Result
- The expert estimate was assessed as optimistic.
- Adjusted estimate: effort 8,000 hours; duration 10 months.
- This turned out to be quite accurate!
- The project manager now always carries out a reality check and is spreading the word.
(The sketch after Case 3 below reproduces both reality-check calculations.)

17. Case 2
Senior management of a software company wondered how competitive they were when it comes to productivity. Many bids for projects were lost and they wished to improve, especially in their Microsoft .NET department. Analysis of the bids by the department showed the following figures: number of bids 23; average PDR in bid 16.3 h/FP; average size 230 FP; average team size 6 FTE. The ISBSG data analysis (N = 35) showed the following PDR distribution (h/FP): min 3.2; 10th percentile 3.8; 25th percentile 5.9; median 7.6; 75th percentile 12.9; 90th percentile 18.9; max 34.2.

18. Case 2 (continued)
Analysis of the bid phase showed a number of issues:
- Estimates were extremely pessimistic due to severe penalties in case of overruns.
- In a number of stages, risk surcharges were added.
- They wished to work in a fixed team of 6 FTE, but the ISBSG data shows that the project size was usually too small for this team size to be efficient.
Because of the knowledge that the department's bids were not market average (or better), the bid process was redesigned, making the company more successful!

19. Case 3
- An organization outsourced all of their IT work to one supplier.
- Therefore, the competition was gone and the supplier could potentially charge whatever they wished.
- The ISBSG data was used to construct the productivity targets for the supplier.
- Bonus/malus arrangements were agreed upon based on these targets.

20. Case 3: supplier targets
[Charts: monthly trends from January to October 2013 of actual versus target PDR (h/FP), PCR (EUR/FP) and defects per 1,000 FP.]
The supplier is measured continuously and has yet to meet its targets for the first time. The organization is happy that the trends show improvement and they feel in control.
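The two Case 1 calculations lend themselves to a short script. The Python sketch below simply multiplies the size scenarios by the PDR percentiles quoted on slide 14 and applies the size-only duration formula the deck cites from ISBSG's Practical Project Estimation (Duration = C * Size^E1 with C = 0.507 and E1 = 0.429); it is an illustration of the arithmetic, not an official ISBSG calculator.

```python
# Sketch of the Case 1 reality check: effort ranges from benchmark PDR
# percentiles, and duration from the size-only formula cited from
# ISBSG's "Practical Project Estimation" (C = 0.507, E1 = 0.429).

# Size scenarios from the estimated FPA (function points).
sizes_fp = {"min": 550, "likely": 850, "max": 1300}

# PDR percentiles (h/FP) for the selected ISBSG D&E projects, as on slide 14.
pdr_percentiles = {"P25": 6.2, "median": 8.9, "P75": 12.9}

def duration_months(size_fp: float, c: float = 0.507, e1: float = 0.429) -> float:
    """Duration (elapsed months) estimated from software size only."""
    return c * size_fp ** e1

for label, size in sizes_fp.items():
    efforts = {p: size * rate for p, rate in pdr_percentiles.items()}
    effort_str = ", ".join(f"{p}: {h:,.0f} h" for p, h in efforts.items())
    print(f"{label:>6} ({size} FP): {effort_str}; "
          f"duration = {duration_months(size):.1f} months")

# The expert estimate of 5,500 hours for the likely size (850 FP) implies
# roughly 6.5 h/FP, close to the 25th percentile of the benchmark, which
# is why it was flagged as optimistic.
```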
21. Other uses of the ISBSG data
- Vendor selection, based on productivity, speed or quality metrics compared to the industry.
- Definition of SLAs (or other KPIs) based on industry-average performance.
- Establishing a baseline from which to measure future improvement.
- Explaining to the client/business that a project was carried out in a better-than-average way, while the client may perceive otherwise.

22. Analysis of the data
Analyze the difference in productivity or quality between two (or more) types of projects:
- Traditional vs. agile
- Outsourced vs. in-house
- Government vs. non-government
- Single site vs. multi-site
- Reuse vs. no reuse
- Etcetera
ISBSG special analysis reports are free of charge for Nesma members.

23. Special reports
- Impact of Software Size on Productivity
- Government and Non-Government Software Project Performance
- ISBSG Software Industry Performance report
- The Performance of Business Application, Real-Time and Component Software Projects
- Estimates: How Accurate Are They?
- Planning Projects: Role Percentages
- Team Size Impact on Productivity
- Manage Your M&S Environment: What to Expect?
- Many more

24. Country report
12 countries, each with more than 40 projects submitted: Australia, Brazil, Canada, Denmark, Finland, France, India, Italy, Japan, the Netherlands, the UK and the USA.

25. IFPUG/NESMA FPA data (median size, effort and duration per country)

Country               N (max)   Size (FP)   Effort (h)   Duration (months)
Australia                 624        140        2,054          6.3
Brazil                     74        253        2,047          7
Canada                     94        244        3,207          8
Denmark                   167        253        3,476         10
France                    464        145        1,843          7
India                     129        283        2,794          6
Italy                      18        247        3,706          9
Japan                     777        280        2,108          4.2
Netherlands (IFPUG)        26        326        3,988          6
Netherlands (NESMA)       153        192        1,576          3.5
UK                         42        268        1,932          5
United States           1,435        100          865          3.9
Total                   4,003
Median                               250        2,081          6.15

26. IFPUG/NESMA FPA metrics

Country               PDR    Speed   Manpower DR   Defect density
Australia             14.1      25        5.4            5.9
Brazil                15.2      32        2.8            0
Canada                15.4      26        3.3           14.3
Denmark               14.3      26        3.2           13.5
France                13.2      19        2.5            0
India                  8.6      63        6.3            0
Italy                  9.3      32        5.4            -
Japan                  6.9      63       13.4           17.1
Netherlands (IFPUG)    9.6      48        9.8            2.9
Netherlands (NESMA)    6.9      61        2.8            5.2
UK                     4.1      50       23.6           55.4
United States         11.6      25       18.4            0.5
Median                10.6      32        5.4            5.2

PDR: hours per FP. Speed: FP per calendar month. Manpower DR: FP per month per person. Defect density: defects per 1,000 FP delivered. (A small sketch showing how these four metrics are derived appears at the end of this transcript.)

27. PDR (h/FP) per country
[Chart: hours per FP per country, with the median PDR indicated.]

28. Delivery speed per country
[Chart: FP per calendar month per country, with the median speed indicated.]

29. Manpower delivery rate per country
[Chart: FP per month per person per country, with the median manpower DR indicated.]

30. Defect density per country
[Chart: defects delivered per 1,000 FP per country, with the median defect density indicated.]

31. We need data!
Everybody wants to use data, but nobody wants to submit data. Why not? Is it hard? Is there a risk? Is the reward not big enough? Does it take too much time? Are there any factors preventing you? Let's discuss! WWW.ISBSG.ORG

32. Data collection initiatives
Now: COSMIC initiative! Concise DCQ, with rewards for submitting data(1):
- Free benchmark report
- Free report: The Performance of Business Application, Real-Time and Component Software Projects (March 2012)
- Amazon coupons up to 100 USD
- ISBSG coupons up to 140 USD
- 20 ISBSG portal credits
Anonymity guaranteed!
(1) Check the eligibility criteria at the ISBSG booth.

33. IWSM-Mensura event offer
IWSM participants get a 20% discount; use the code at check-out. Please check out the ISBSG booth here at IWSM! NB: Nesma members can download the ISBSG special analysis reports for free.

34. Thank you!
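As a closing illustration, the four country-report metrics defined under slide 26 can be derived from a handful of project fields. The Python sketch below is an assumption-laden example: the ProjectRecord fields and the sample values are invented for illustration and are not ISBSG repository field names.

```python
# Sketch of the four country-report metrics defined on slide 26.
# The dataclass fields and the sample project are illustrative; they are
# not ISBSG repository field names.
from dataclasses import dataclass

@dataclass
class ProjectRecord:
    size_fp: float          # functional size in function points
    effort_hours: float     # total project effort in hours
    duration_months: float  # elapsed calendar months
    team_size: float        # average number of full-time team members
    defects: int            # defects found after delivery

    @property
    def pdr(self) -> float:
        """Project Delivery Rate: hours per function point."""
        return self.effort_hours / self.size_fp

    @property
    def speed(self) -> float:
        """Delivery speed: function points per calendar month."""
        return self.size_fp / self.duration_months

    @property
    def manpower_dr(self) -> float:
        """Manpower Delivery Rate: FP per month per person."""
        return self.speed / self.team_size

    @property
    def defect_density(self) -> float:
        """Defects per 1,000 delivered function points."""
        return 1000 * self.defects / self.size_fp

# Sample project with invented figures close to the repository medians.
p = ProjectRecord(size_fp=250, effort_hours=2081, duration_months=6.2,
                  team_size=4, defects=2)
print(f"PDR {p.pdr:.1f} h/FP, speed {p.speed:.0f} FP/month, "
      f"manpower DR {p.manpower_dr:.1f} FP/month/person, "
      f"defect density {p.defect_density:.0f}/1000 FP")
```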