Recent trends and activities around Software Process Improvement


  • Recent trends and activities around Software Process Improvement

    Reidar Conradi
    Dept. of Computer and Information Science
    Norwegian University of Science and Technology, Trondheim
    www.idi.ntnu.no/~conradi, [email protected]
    Tel +47 73.593444, Fax +47 73.594466

  • Overview
    1. My recent background
    2. Some current SPI projects and activities
    3. Three promising new technologies:
       3.1 Incremental/COTS-based development
       3.2 Internet/mobile technologies
       3.3 Inspections (for UML)
    4. Knowledge management and experience bases
    5. SPI frameworks and strategies
    6. SFF-Fornebu
    7. Empirical software engineering
    8. Future SPI challenges

  • 1. My recent background
    - ESPRIT software engineering projects in 1990-1999: reuse, configurations, re-engineering, process modeling, ...
    - National SPI projects: SPIQ 1997-99, PROFIT 2000-02.
    - Sabbatical stay at Univ. Maryland and Politecnico di Milano, 1999-2000.
    - Visits and contact with IESE, Kaiserslautern.
    - Workshops in 2000: EWSPT, PROFES, Learning Software Organizations, and FEAST.
    - Discussions at the ISERN research network meeting (Basili/Rombach and colleagues), Oct. 2000, Hawaii.
    - Some contact with Q-Labs, Univ. Lund, and VTT.

  • 2a. PITAC: President's IT Advisory Committee
    Final recommendation Feb. 1999, www.hpcc.gov:
    - Software -- "no surprise" software, empirical work
    - Scalable Software Infrastructure -- too fragile
    - High End Research
    - High End Applications
    - Socioeconomic Factors
    Double funds for basic research in IT, to 2 bill. $.
    NSF has initiated the first ITR projects, 100 mill. $, 50 projects accepted out of 1500 applications!
    First projects started Sept. 2000, including CeBASE.

  • 2b. Univ. Maryland / Prof. Basili
    World famous for its SPI work with NASA-SEL in 1976-2000:
    - Quality Improvement Paradigm (QIP)
    - Experience Factory / Experience Bases
    - Goal/Question/Metric (GQM) method
    - Inspection techniques (perspective-based reading)
    - Empirical methods (contextual factors)
    - Some COTS work
    Projects with NASA, Motorola, DaimlerChrysler, Telcordia, Lucent, a consortium of local SMEs, etc.
    Univ. Maryland and its new Fraunhofer institute FC-MD: 20 researchers/PhD students.
    I worked on experience bases, OO reading, and COTS.
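    As an illustration only (the concrete goal, questions and metrics below are invented, not taken from the NASA-SEL programme), a GQM measurement goal can be sketched as a small Python structure:

      # Illustrative GQM (Goal/Question/Metric) structure.
      # Goal, questions and metrics are invented for this sketch.
      gqm = {
          "goal": ("Analyze design inspections in order to reduce defect "
                   "slip-through, from the viewpoint of the project manager"),
          "questions": {
              "Q1": "What fraction of defects is found before function test?",
              "Q2": "How much effort is spent per defect found in inspections?",
          },
          "metrics": {
              "Q1": ["defects found per phase", "defects reported in field use"],
              "Q2": ["inspection effort (person-hours)", "defects found in inspections"],
          },
      }

      # Print each question together with the metrics that answer it.
      for q, text in gqm["questions"].items():
          print(q, text, "->", ", ".join(gqm["metrics"][q]))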

  • 2c. NASA Software Engineering Laboratory
    - Goddard Space Flight Center: 270 software developers (of 9000 people). Unmanned spacecraft, geophysical data.
    - Project size: 30 person-years over two years. Technology: FORTRAN, later Ada and C++.
    - NASA-SEL: cooperation between NASA, CSC, and Univ. Maryland. Tracked 150 projects, testing different technologies.
    - Results in 1976-96: defects in operation reduced to 1/4, 2x productivity, more complex projects.
    - Most important technology: software reuse via domain-specific frameworks -- from 25% to 80% reuse.
    - Egoless approach: measuring/learning embedded in the culture.

  • 2d. CeBASE project
    - New CeBASE (Center for Empirically Based Software Engineering) NSF-PITAC project from 15 Sept. 2000, www.cebase.org.
    - V. Basili, B. Boehm (USC), U. Nebraska, U. Tennessee.
    - 1.2 mill. $ per year for two years + three more years:
      - Inspections (Maryland)
      - COTS (USC)
      - Empirical methods (all)
      - Experience bases, with contents (U. Nebraska)
      - Software engineering education (U. Tennessee)
    - Slogan: FAD-based => empirically-based => science-based => engineering-based.
    - Memorandum of Understanding: U. Maryland & NTNU/UiO.

  • 2e. IESE, Kaiserslautern, Prof. Rombach
    - IESE: Fraunhofer Institute, Germany, applied research. Area: Experimental Software Engineering.
    - Established in 1995, made permanent in 1998. Close to 100 persons, ca. 40% government funded.
    - Areas:
      - SPI and software quality
      - Software product lines and reuse
      - Inspections
      - Experience bases using case-based reasoning
      - Software engineering education: SW Academy
    - Sister institute in Maryland from 1998.
    - NTNU PhD student Dingsøyr: at IESE, Spring 2001.

  • 2f. ISERN, Int'l Software Engineering Research Network
    - Club around Basili/Rombach from 1992, 20 members.
    - NTNU a member since 1997; UiO will apply.
    - Meetings: Oct. 2000 in Hawaii, Aug. 2001 in Glasgow.
    - Topics:
      - Inspections, e.g. for OO (Maryland, Vienna, NTNU)
      - Technology transfer
      - Empirical methods, practical experiences
      - Repeatable experiments: planned for OO design
      - Shared repository for experiments (Conradi ed.)
    - ESERNET: network-of-excellence proposal in the EU, Nov. 2000 (NTNU participates).

  • 3.1a Incremental and component-based development
    - Lead times are shrinking: one Internet year corresponds to a "dog year" (7 years) in other businesses.
    - The strict waterfall model no longer works:
      - Incremental/iterative approaches, with a continuous dialog with the customer, reduce risk. Ex.: RUP, Genova.
      - Component-based approaches, reusing parts of existing solutions or ready-made components (COTS).
    - (Re)negotiating requirements if a needed component is not yet available, but increased dependency on other actors (higher risk).
    - Different work style: need models to reason about and balance e.g. time/quality trade-offs, market-value models.

  • 3.1b Barry Boehm's COCOMO-II
    Incremental development using COTS. Initial, iterative steps e.g. to scale and refocus the project.

    Traditional situation                             | Current trends
    --------------------------------------------------|---------------------------------------------------
    Clear separation between requirements and design  | Everything connected
    Stable requirements                               | Rapid requirements change (50% change per 3 years)
    Technical solutions are separate                  | COTS influence requirements
    Mostly in-house and controllable parts            | No control over COTS evolution
    >2-year projects                                  | Ever-decreasing lead times
    Repeatable, mature models                         | Adaptable process models

  • 3.1c INCO project: Incremental and Component-based Development
    - Newly accepted NFR basic R&D project, 2001-2004. UiO and NTNU. 2 PhD students and one postdoc.
    - Need a revolution in project models -- the waterfall is dead, Internet time; extend RUP and similar models. Requirements <=> design: a negotiation!
    - Component-based development, using Components-Off-The-Shelf (COTS) -- but how to manage the risks?
    - Reuse through mature product lines?
    - Empirical studies towards the Norwegian IT industry.
    - Proposers: Dag Sjøberg, Univ. Oslo (coordinator) and Reidar Conradi, NTNU.

  • 3.2a New network technologies: a revolution in software engineering applications and solutions!
    - Application side: 300 mill. mobile phones in Jan. 2000, 3 bill. in 2005, and 10% with powerful PCs/PDAs?
    - Entry ticket to become a global (virtual?) company: reduced by a factor of 100 in recent years.
    - Enormous dynamics: 3C convergence, liberalization.
    - New work modes: RAD, nomadic, 100% on-line.
    - Technology side: a deluge -- WWW, IP, Java, CORBA, XML, applets / servlets / displets, agents (= active HTML pages?).
    - E.g. XML used for sending phone bills to the bank, XML/Java tools to make graphical editors (see the sketch below).
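    Purely as an illustration of the kind of XML data exchange mentioned above (the element names and values are invented; no real telecom or banking format is implied), a tiny "phone bill" document can be built with Python's standard library:

      # Hypothetical sketch of an XML phone-bill document (invented schema).
      import xml.etree.ElementTree as ET

      bill = ET.Element("phoneBill", attrib={"subscriber": "4773593444"})
      ET.SubElement(bill, "period").text = "2000-11"
      call = ET.SubElement(bill, "call", attrib={"durationSeconds": "120"})
      ET.SubElement(call, "cost", attrib={"currency": "NOK"}).text = "2.40"

      # Serialize to a string that could be sent on to the bank's system.
      print(ET.tostring(bill, encoding="unicode"))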

  • 3.2b MOWAHS project: Mobile Work Across Heterogeneous Systems
    - Newly accepted NFR basic R&D project, 2001-2004. IDI SU and DB groups. 2 PhD students and one postdoc.
    - Software infrastructure for mobile agents that can move across fixed and portable PCs, PDAs, etc.
    - Novel transaction models to regulate access and updates to partly shared data.
    - Try-out for software engineering and teaching.
    - Internationalization; cooperation with Telenor and others?
    - Proposers: Reidar Conradi (coordinator) and Mads Nygård, NTNU.

  • 3.3a Inspections
    - Introduced by Michael Fagan at IBM, 1976; refined by Tom Gilb.
    - Established gains: finds 2/3 of defects at 1/3 of the price, saves 15-20% of total effort.
    - Still challenges:
      - Perspective-based reading: different checklists for different readers (customer, designer, tester) to find complementary defects (see the sketch below).
      - Inspections vs. testing: which defects are found by which technique; cf. Cleanroom.
      - Longitudinal studies: can we find defect-prone modules?
      - Root-Cause Analysis: follow-up, prevention?
      - OO reading techniques: for UML, for architecture?
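    As a rough sketch of perspective-based reading (the checklist questions below are invented, not actual Fagan or NASA-SEL checklists), each reader role gets its own list of questions to apply to the same document:

      # Invented per-perspective checklists; real ones are derived from the
      # artifact type and the organization's defect history.
      checklists = {
          "customer": [
              "Does every use case trace back to a stated customer need?",
              "Are performance and availability requirements quantified?",
          ],
          "designer": [
              "Is every interface in the class diagram defined exactly once?",
              "Are error situations covered by the state diagrams?",
          ],
          "tester": [
              "Can each requirement be checked by a concrete test case?",
              "Are boundary values and illegal inputs specified?",
          ],
      }

      def questions_for(role):
          """Return the checklist a reader with the given perspective applies."""
          return checklists.get(role, [])

      print(questions_for("tester"))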

  • 3.3b Ericsson data, 1995-97
    Study 1: Cost-effectiveness of inspections and testing

    Activity                    | Defects [#] | Effort [h] | Of total defects [%] | Effort spent to find one defect [h:m] | Estimated effort saved by early defect removal [h]
    ----------------------------|-------------|------------|----------------------|---------------------------------------|---------------------------------------------------
    Inspection reading (design) | 928         | 786.8      | 61.78                | 00:51                                 | 8200
    Inspection meeting (design) | 29          | 375.7      | 1.92                 | 12:57                                 |
    Desk check                  | 404         | 1257.0     | 26.91                | 03:07                                 | -
    Function test               | 89          | 7000.0     | 5.92                 | 78:39                                 | -
    System test                 | 17          | -          | 1.13                 | -                                     | -
    Field use                   | 35          | -          | 2.33                 | -                                     | -
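    As a quick sanity check on the table above (a sketch only, using the tabulated defect and effort figures), the "effort spent to find one defect" column can be recomputed as effort divided by number of defects:

      # Recompute effort per defect [h:m] from the Ericsson study table above.
      # Rows without reported effort (system test, field use) are omitted.
      rows = [
          ("Inspection reading (design)", 928, 786.8),
          ("Inspection meeting (design)", 29, 375.7),
          ("Desk check", 404, 1257.0),
          ("Function test", 89, 7000.0),
      ]

      for activity, defects, effort_h in rows:
          per_defect = effort_h / defects                    # hours per defect
          h, m = int(per_defect), round((per_defect - int(per_defect)) * 60)
          print(f"{activity:30s} {h:02d}:{m:02d}")           # 00:51, 12:57, 03:07, 78:39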

  • 3.3c Ericsson data, 1995-97 -- Study 1: number of defects in field use versus number of states in a module
    [Scatter plot of "Defects found in field use" against "No. of states" per module.]
    The number of system failures increases with the number of states (i.e. with complexity).

  • 3.3d OO Reading of UML Documents
    Software artifacts, with seven new OO Reading Techniques (OORTs) indicated:
    [Diagram: Requirements Descriptions, Use Cases, Requirements Specification/Analysis, High-Level Design, Class Diagrams, Class Descriptions, State Diagrams, Sequence Diagrams; OORT-1 to OORT-7 marked as horizontal or vertical reading between these artifacts.]

  • 3.3e OO Reading of UML (contd.)
    - How to read graphical diagrams, not linear text?
    - Horizontal reading (OORT-1 to -4) within a phase captures mainly inconsistencies.
    - Vertical reading (OORT-5 to -7) across phases captures mainly omissions and wrong facts.
    - Easy to spot simple defects; what about deeper comments wrt. architecture and maintenance?
    - Tried at Maryland and NTNU. Current emphasis on simple defects. Needs refinement and extension before industrial use.

  • 4a. Knowledge Management, Experience Bases
    Systematize your own experiences as reusable knowledge.
    Problems:
    - Few representative observations
    - Few hard data; a mix of quantitative and qualitative data
    - Reference point/baseline often missing
    - Vulnerable to short-term project pressure, even cancellations -- as in SPI and reuse
    Learning organization: apply methods from knowledge engineering and/or the social sciences?
    Novel investment models, reward systems?

  • 4b. NASA Experience Factory

  • 4c. The Knowledge Spiral

  • 4d. Externalization and Internalization
    Knowledge: tacit (work skills) and explicit (written).
    The Knowledge Spiral [Nonaka90]:
    1. Externalization (tacit => explicit): write a post-mortem.
    2. Combination (explicit => more explicit): generalize post-mortems.
    3. Internalization (explicit => tacit): attend a course.
    4. Socialization (tacit => tacit): integrate into daily work.
    Challenge: achieve 3-4 (real changes), not only 1-2 (piles of models or lessons learned); similar in reuse -- avoid white elephants.

  • 4e. Attitudes to quality-system routines -- study of 23 software people in 5 companies, NTNU student project, Spring 1999

    Are routines an effective medium for knowledge transfer?

    Who           | No | Both | Yes
    --------------|----|------|----
    13 developers | 6  | 7    | 0
    10 managers   | 1  | 2    | 7

    Degree of involvement in the development of routines:

    Who        | Low | High
    -----------|-----|-----
    Developers | 9   | 4
    Managers   | 3   | 7

  • 4f. Four success factors for Software Experience Bases
    Based on eight studies, half in Norway (presented at ICSE 2000 and PROFES 2000):
    - F1: Stability -- for the organization and a local champion.
    - F2: Cultural change -- becoming a learning organization, sharing in an egoless way; rewards?
    - F3: Business relevance -- no fancy database/UI etc.; integrate into work processes using normal documents.
    - F4: Incremental approach -- no big bang; e.g. start slowly with a low-key web system?
    => Treat accumulated knowledge like other investments!

  • 5a. General SPI observations
    Usual assumption: Quality(Process) => Quality(Product).

    Choose a proper SPI framework, strategy, and plan -- given:
    - Process refinement (TQM: stable, large companies) vs. product innovation (RUP, DSDM: Internet start-ups).
    - Disciplined work (testing) vs. creative work (design).
    - Procurer-oriented (CMM for risk reduction) vs. customer-oriented (TQM for product quality).

    Have we given the wrong medicine? Ex.: Nokia at CMM level 1, Motorola at level 5 -- who is best?

  • 5b. SPI framework selection
    - Different emphasis in current SPI frameworks -- TQM, CMM, BOOTSTRAP, SPICE (ISO/IEC 15504), ISO 9001, EQM, QIP, SPIQ.
    - How to select the right one?
    - Proposed NTNU taxonomy for SPI framework characterization, with five main categories: General, Process, Organization, Quality, Result (see the table below).

  • Category     | Characteristic             | CMM v1.1                                                               | EF/QIP/GQM
    -------------|----------------------------|------------------------------------------------------------------------|------------------------------------------------
    General      | Geographic origin/spread   | U.S./World                                                             | U.S./World
                 | Scientific origin          | TQM, SPC                                                               | Partly TQM
                 | Development/stability      | Since 1986                                                             | Since 1976
                 | Popularity                 | Top (esp. in U.S.)                                                     | Medium
                 | Software specific          | Yes                                                                    | Yes
                 | Prescriptive/descriptive   | Both                                                                   | Descriptive
                 | Adaptability               | Limited                                                                | Yes
    Process      | Assessment                 | Org. maturity                                                          | None
                 | Assessor                   | Internal and external                                                  | NA
                 | Process improvement method | IDEAL                                                                  | QIP
                 | Improvement initiation     | Top-down                                                               | Iterative bottom-up
                 | Improvement focus          | Management processes                                                   | Experience reuse
                 | Analysis techniques        | Assessment questionnaires                                              | GQM
    Organization | Actors/roles/stakeholders  | Management                                                             | Experience factory, project organization
                 | Organization size          | Large                                                                  | All
                 | Coherence                  | Internal                                                               | Internal
    Quality      | Quality perspective        | Management                                                             | All
                 | Progression                | Staged                                                                 | Continuous
                 | Causal relation            | F(Key process areas) => F(Maturity level) => Q(Process) => Q(Product) | F(Experience reuse) => Q(Process) => Q(Product)
                 | Comparative                | Yes, maturity level                                                    | No
    Result       | Goal                       | Process improvement, supplier capability determination                | Organization specific
                 | Process artifacts          | Process documentation, assessment result                              | Experience packages, GQM models
                 | Certification              | No                                                                     | No
                 | Implementation cost        | -                                                                      | -
                 | Validation                 | Surveys and case studies                                               | Experimental and case studies


  • 5c. Causal Relations in SPI Frameworks
    Process quality is difficult to determine:
    - Quality indicators
    - Multi-factor problem: F(Comparison method), F(SPI framework), F(Quality indicator), Quality(Process) => Quality(Product)
    - The comparison method may itself influence quality:

    SPI framework         | Causal relation
    ----------------------|-----------------------------------------------------------------------------------
    TQM                   | Not applicable
    CMM                   | F(Key process areas) => F(Maturity level) => Quality(Process) => Quality(Product)
    ISO 9001              | F(Quality elements) => F(Certification) => Quality(Process) => Quality(Product)
    ISO/IEC 15504 (SPICE) | F(Process attributes) => F(Capability level) => Quality(Process) => Quality(Product)
    QIP/GQM/EF            | F(Experience reuse) => Quality(Process) => Quality(Product)
    SPIQ                  | F(Experience reuse) => Quality(Process) => Quality(Product)

  • 6a. SFF-Fornebu: Center of Excellence in Software Engineering (Senter for Fremragende Forskning)
    - Proposed by Ifi/UiO (Dag Sjøberg) and IDI/NTNU (Reidar Conradi), July 1999.
    - Accepted as part of SFF-Fornebu, per Nov. 2000.
    - Decentralized: Fornebu, Oslo, Trondheim.
    - Budget of 16 mill. NOK per year (?).
    - 25 teachers, researchers and PhD students; 30-40 MSc students.

  • 6b. SFF and cooperating partners
    [Diagram of partners:]
    - SFFSE (UiO/NTNU)
    - SINTEF
    - DnV, Telenor, ...
    - >20 Norwegian companies, partly in PROFIT/DAIM
    - NTNU Trondheim
    - Int'l contacts: ISERN network, other projects

  • 6c. SFF: Scientific Profile
    Six themes. Research method: model construction and subsequent validation in industry, among students, and through international cooperation.

    Theme                                                          | Responsible(s)
    ---------------------------------------------------------------|--------------------------
    Object-oriented development and maintenance                    | D. Sjøberg
    Incremental and component-based development                    | R. Conradi, D. Sjøberg
    Methods to achieve quality of Web-based, distributed systems   | L. Jaccheri
    Estimation, planning and risk evaluation of software projects  | M. Jørgensen
    General software process improvement and quality work          | R. Conradi, T. Skramstad
    Empirical software engineering                                 | M. Jørgensen, T. Stålhane

  • 6d. SFF: Industry is our lab!
    Both IDI at NTNU and Ifi at UiO have had an industrial focus for some time. We expect that SFF-Fornebu will offer even better possibilities for industrial cooperation.
    Since 1993 we have published the following papers, about 1/3 of them based on international or industrial cooperation (recomputed in the sketch below):

    Institution | Total number of papers | Papers with int'l colleagues | Papers with nat'l/industrial colleagues
    ------------|------------------------|------------------------------|----------------------------------------
    NTNU        | 150                    | 30                           | 10
    UiO         | 60                     | 17                           | 13
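    As a small check (using only the figures in the table above), the share of papers written in cooperation -- international or national/industrial -- comes out at about one third:

      # Fraction of papers written with international or national/industrial
      # colleagues, taken from the table above.
      total = 150 + 60                  # NTNU + UiO papers since 1993
      coop = (30 + 10) + (17 + 13)      # papers with int'l + nat'l/industrial co-authors
      print(f"{coop}/{total} = {coop / total:.2f}")   # 70/210 = 0.33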

  • 6e. SFF: Published papers from industrial coop., June 2000

  • 6f. SFF: Int'l cooperating partners, with some candidates for guest researchers etc.
    1. Professor Chunnian Liu, dr.ing. (NTH 1983), Beijing Polytechnic University, P.R. China. Area: software engineering, process support, distributed systems.
    2. Professor Alfonso Fuggetta, Politecnico di Milano, Italy. Area: software engineering, software architecture, feature engineering, middleware.
    3. Professor dr. Claes Wohlin, Telecom, Lund Techn. Univ., Sweden. Area: software engineering, testing, requirements analysis, software process improvement (SPI), metrics.
    4. Professor dr. Hans-Dieter Rombach, Univ. Kaiserslautern and Fraunhofer Institute for Experimental Software Engineering (IESE), director of IESE. IESE areas: SPI and software quality, reuse, component-based software engineering, metrics, empirical studies, experience bases.
    5. Professors Victor R. Basili and Marvin Zelkowitz, Univ. of Maryland, with a sister institute of IESE in software engineering (FC-MD). Area: SPI and software quality, inspection techniques, COTS, experience bases, metrics, empirical studies.
    6. Associate Professor Lionel C. Briand, Ph.D. (Paris, France), Carleton Univ. (Ottawa). Area: inspections and testing of OO software, software quality assurance and control, project planning and risk analysis, technology evaluation, experimental software engineering.
    7. Professor Ray Welland, Head of the Computing Science Department, University of Glasgow. Area: software engineering, Web application development, software tools, design methods.
    8. Professor Malcolm Atkinson, University of Glasgow. Area: persistent programming, language design, (distributed) information systems, software engineering.

  • 7. Empirical Software Engineering
    - Not yet a mature scientific/engineering discipline. Recent criticism that we are a "stone-age" field with scant scientific methods [Tichy98] [Zelkowitz98].
    - Much less practical validation in our scientific papers than in other science/engineering fields -- "ghostware" both in software development and in research?
    - But very fast-moving technology and markets, no time to stabilize: how (and when) to measure, assess, reflect, and learn? How (and when) to practice SPI?
    - Need tradition and training in empirical methods: => Empirical Software Engineering!

  • 8a. Future SPI challenges
    - SPI frameworks must allow flexible strategies:
      - both refinement and innovation
      - both traditional technologies and network ones
      - both evolutionary and revolutionary changes of varying size
    - Need a systematic, empirical approach.
    - SPI cost/benefits: need novel market-value and amortization models; balance short- and long-term aspects.
    - SPI assumes cultural change -- need expertise from the social sciences.
    - SPI is about learning -- not control as in QA.
    - Have a soft start with experience bases, e.g. on the web.

  • 8b. PROFIT national SPI project, 2000-02
    Emphasis on three factors:
    1. SPI under change/uncertainty
    2. Knowledge management and learning organizations
    3. Novel software technologies
    These factors need to be combined!
    - Cooperation between research (SINTEF, NTNU, UiO) and ca. 10 companies.
    - 9 ongoing pilot projects in SPI.
    - Technology clusters between companies for:
      - Experience bases for better estimation
      - SPI in dot-com companies
      - Model-based development and process control

  • 8c. On Quality and Change
    - Half of the IT products/services of five years from now do not yet exist -- and neither do the companies.
    - The world's three biggest companies -- Cisco, Microsoft, Oracle -- are all less than 25 years old.
    - "The only constant thing is change" -- freely after Benjamin Disraeli.
    - "Quality: everybody wants it, but it is impossible to define and futile to measure" -- freely after Barbara Kitchenham.
    - SPI challenge: how to manage change and quality?