TRANSCRIPT
3/22/01 ©USC-CSE 1
University of Southern California, Center for Software Engineering (USC-CSE)
Developing and Integrating Software/System Product, Process, Property, and Success Models
Barry Boehm, USC-CSE
TAMU, SMU Presentations
March 22, 23, 2001
http://sunset.usc.edu
Outline
• Developing Software/System Models at USC-CSE
  – Process Models
  – Product Models
  – Property Models
  – Success Models
• Integrating Software/System Models
• Conclusions
• Further information
Process Model Research at USC-CSE
• Spiral Model extensions
  – WinWin spiral model
  – Life cycle anchor points
• WinWin Negotiation Model
  – IPT, JAD guidelines
  – Groupware support tools
• Object-oriented processes with COTS, reuse
• RAD-SAIV process model
• System dynamics models
Spiral Model Experience
• Where do objectives, constraints, alternatives come from?
  - WinWin extensions
• Lack of intermediate milestones
  - Anchor points: LCO, LCA, IOC
  - Concurrent-engineering spirals between anchor points
The WinWin Spiral Model
1. Identify next-level stakeholders
2. Identify stakeholders' win conditions
3. Reconcile win conditions; establish next-level objectives, constraints, alternatives
4. Evaluate product and process alternatives; resolve risks
5. Define next level of product and process, including partitions
6. Validate product and process definitions
7. Review, commitment
(Steps 1-3 are the WinWin extensions; steps 4-7 carry over from the original spiral.)
Life Cycle Anchor Points
• Common system/software stakeholder commitment points
  – Defined in concert with Government, industry affiliates
  – Coordinated with Rational's Unified Software Development Process
• Life Cycle Objectives (LCO)
  – Stakeholders' commitment to support system architecting
  – Like getting engaged
• Life Cycle Architecture (LCA)
  – Stakeholders' commitment to support full life cycle
  – Like getting married
• Initial Operational Capability (IOC)
  – Stakeholders' commitment to support operations
  – Like having your first child
Win Win Spiral Anchor Points
(Risk-driven level of detail for each element)
*WWWWWHH: Why, What, When, Who, Where, How, How Much

Milestone element: Definition of Operational Concept
  LCO: • Top-level system objectives and scope
         - System boundary
         - Environment parameters and assumptions
         - Evolution parameters
       • Operational concept
         - Operations and maintenance scenarios and parameters
         - Organizational life-cycle responsibilities (stakeholders)
  LCA: • Elaboration of system objectives and scope of increment
       • Elaboration of operational concept by increment

Milestone element: Definition of System Requirements
  LCO: • Top-level functions, interfaces, quality attribute levels, including:
         - Growth vectors and priorities
         - Prototypes
       • Stakeholders' concurrence on essentials
  LCA: • Elaboration of functions, interfaces, quality attributes, and prototypes by increment
         - Identification of TBDs (to-be-determined items)
       • Stakeholders' concurrence on their priority concerns

Milestone element: Definition of System and Software Architecture
  LCO: • Top-level definition of at least one feasible architecture
         - Physical and logical elements and relationships
         - Choices of COTS and reusable software elements
       • Identification of infeasible architecture options
  LCA: • Choice of architecture and elaboration by increment
         - Physical and logical components, connectors, configurations, constraints
         - COTS, reuse choices
         - Domain-architecture and architectural style choices
       • Architecture evolution parameters

Milestone element: Definition of Life-Cycle Plan
  LCO: • Identification of life-cycle stakeholders
         - Users, customers, developers, maintainers, interoperators, general public, others
       • Identification of life-cycle process model
         - Top-level stages, increments
       • Top-level WWWWWHH* by stage
  LCA: • Elaboration of WWWWWHH* for Initial Operational Capability (IOC)
         - Partial elaboration, identification of key TBDs for later increments

Milestone element: Feasibility Rationale
  LCO: • Assurance of consistency among elements above
         - Via analysis, measurement, prototyping, simulation, etc.
         - Business case analysis for requirements, feasible architectures
  LCA: • Assurance of consistency among elements above
       • All major risks resolved or covered by risk management plan

Milestone element: System Prototype(s)
  LCO: • Exercise key usage scenarios
       • Resolve critical risks
  LCA: • Exercise range of usage scenarios
       • Resolve major outstanding risks
Anchor Points and Rational USDP Phases
[Diagram: the Engineering stage spans the Inception (feasibility iterations) and Elaboration (architecture iterations) phases; the Manufacturing stage spans Construction (usable iterations) and Transition (product releases). Each phase includes management, requirements (REQ), design (DES), implementation (IMP), and deployment (DEP) workflows. LCO falls at the Inception/Elaboration boundary, LCA at Elaboration/Construction, and IOC at Construction/Transition. (Source: Rational Software Corporation)]
WinWin Negotiation Model
[Diagram: four artifact types - Win Condition, Issue, Option, and Agreement - each with rationale and attachments. An Issue involves Win Conditions; an Option addresses an Issue; an Agreement adopts an Option and covers Win Conditions.]
• Win-Win equilibrium
  – All Win Conditions covered by Agreements
  – No outstanding Issues
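A minimal sketch of this artifact model and the equilibrium test - an assumed data layout for illustration, not the actual WinWin tool schema:

```python
# Sketch of the WinWin negotiation artifacts (assumed layout) and the
# win-win equilibrium test: every win condition covered by some
# agreement, and no unresolved issues.

from dataclasses import dataclass, field

@dataclass
class Artifact:
    name: str
    rationale: str = ""
    attachments: list = field(default_factory=list)

@dataclass
class Issue(Artifact):
    involves: list = field(default_factory=list)   # win-condition names
    resolved: bool = False

@dataclass
class Agreement(Artifact):
    covers: list = field(default_factory=list)     # win-condition names

def in_equilibrium(win_conditions, issues, agreements):
    """True iff all win conditions are covered and no issue is open."""
    covered = {name for a in agreements for name in a.covers}
    all_covered = all(wc.name in covered for wc in win_conditions)
    no_open_issues = all(i.resolved for i in issues)
    return all_covered and no_open_issues

wcs = [Artifact("fast search"), Artifact("low cost")]
ags = [Agreement("use COTS engine", covers=["fast search", "low cost"])]
ok = in_equilibrium(wcs, [], ags)   # equilibrium: covered, no open issues
```

Adding an unresolved `Issue` breaks the equilibrium until an `Option` is adopted and the issue marked resolved.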
EasyWinWin Activities
(Now a commercial product from GroupSystems.com)

Activity                                  "ThinkLet"            Tool
Elaborate domain taxonomy                 Could-be/Should-be    GroupOutliner
Brainstorm stakeholder interests          Free Brainstorming    Electronic Brainstorming
Converge on win conditions                FastFocus             Categorizer
Capture domain language                   TermCapture           Topic Commenter
Prioritize win conditions                 MultiCriteria         Alternative Analysis
Elaborate conflicts, constraints, issues  CrowBar, MultiPass    GroupOutliner
Elaborate options                         MultiPass             GroupOutliner
Negotiate agreements                      MultiPass             GroupOutliner
Converge on Win Conditions (FastFocus)
[Screenshot of the FastFocus tool in use]
Product Model Research at USC-CSE
• Guidelines for LCO, LCA product deliverables
  – Operational Concept Description: shared vision, domain model, use-case scenarios
  – Requirements Description: priorities, evolution requirements, WinWin traceability
  – Architecture Description: extended UML, COTS integration
  – Feasibility Rationale: assurance of consistency, feasibility, risk resolution
• Architecture research
Architecture Research
• SAAGE (Software Architecture Analysis, Generation, and Evolution)
• Architecture style and view integration – UML view integration
Software Architecture Analysis, Generation, and Evolution (SAAGE)
• Research focus:
  – Style-based application design and implementation
  – Component- and connector-based architectural composition
  – Architecture modeling and analysis
  – Architecture-based software evolution
  – Consistent refinement of architecture into design
  – System generation and configuration management
• Supported by an integrated toolset composed of in-house and third-party tools
• Experimental application to mobile wireless networks
SAAGE Overview
• Integrated environment for transforming C2-style architectures into UML
View Integration Model (UML/Analyzer Tool)
• Implements a generic view integration model
• Describes and identifies causes of architectural and design mismatches across UML views
• Is integrated with Rational Rose®
• Complements pure view comparison with transformation (abstraction, translation, and unification) and mapping techniques
• Currently supports class, object, sequence, collaboration, state, and C2 diagrams
• Uses mismatch rules and transformation rules
Consistency between Architecture and Design
[Diagram: a cargo-routing example showing (1) a C2 architecture represented in UML - components such as Warehouse, CargoRouter, VehicleDeliveryPort, Clock, and GraphicsBinding joined by the ClockConn, RouterConn, and GraphicsConn connectors, with numbered request/notification messages; (2) the corresponding class design as a UML class diagram (Vehicle, Warehouse, CargoRouter, DeliveryPort, GraphicsBinding, and related classes); and (3) automatically derived views relating the two.]
Property Model Research at USC-CSE
• Constructive Cost Model (COCOMO) II
• COCOMO II extensions
  – COCOTS: COTS integration
  – COQUALMO: delivered defect density
  – CORADMO: rapid development cost and schedule
  – COPSEMO: phase/activity distribution
  – COPROMO: productivity improvement analysis
  – COSYMO: system engineering
• Property tradeoff assistance: QARCC, S-COST
COCOMO II Model Stages
[Diagram: the "cone of uncertainty" across phases and milestones (Feasibility; Concept of Operation; Plans and Rqts.; Rqts. Spec.; Product Design; Product Design Spec.; Detail Design; Detail Design Spec.; Devel. and Test; Accepted Software). The relative size range narrows from 4x / 0.25x at feasibility through 2x / 0.5x and 1.5x / 0.67x to 1.25x / 0.8x as the project progresses. Three model stages apply successively: Applications Composition (3 parameters), Early Design (13 parameters), and Post-Architecture (23 parameters).]
Early Design and Post-Architecture Models

  Effort = (Environment Multipliers) x Size ^ (Process Scale Factors)
  Schedule = Multiplier x Effort ^ (Process Scale Factors)

• Environment: product, platform, people, project factors
• Size: nonlinear reuse and volatility effects
• Process: constraint, risk/architecture, team, maturity factors
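These equations can be sketched in their COCOMO II.2000 form, using the published nominal calibration constants (A = 2.94, B = 0.91, C = 3.67, D = 0.28); the driver ratings in the example call are illustrative placeholders, not a real project's ratings:

```python
# Sketch of the COCOMO II.2000 effort and schedule equations with the
# published nominal calibration constants. Driver values below are
# placeholders for illustration.

from math import prod

A, B, C, D = 2.94, 0.91, 3.67, 0.28

def cocomo_ii(ksloc, scale_factors, effort_multipliers):
    """Return (effort in person-months, schedule in calendar months).

    scale_factors: the five process scale-factor ratings
        (PREC, FLEX, RESL, TEAM, PMAT).
    effort_multipliers: environment cost-driver ratings, each 1.0
        at nominal.
    """
    E = B + 0.01 * sum(scale_factors)        # diseconomy-of-scale exponent
    pm = A * ksloc ** E * prod(effort_multipliers)
    tdev = C * pm ** (D + 0.2 * (E - B))     # schedule equation
    return pm, tdev

# All-nominal 100-KSLOC project (3.72 per scale factor is used here
# purely as an illustrative mid-range rating).
pm, tdev = cocomo_ii(100, [3.72] * 5, [1.0] * 17)
```

With all multipliers nominal, effort scales superlinearly in size whenever the scale-factor sum is positive (E > B).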
COCOMO II.2000 Productivity Ranges
[Bar chart: productivity range (roughly 1.0-2.4) for each driver, from largest to smallest: Product Complexity (CPLX), Analyst Capability (ACAP), Programmer Capability (PCAP), Time Constraint (TIME), Personnel Continuity (PCON), Required Software Reliability (RELY), Documentation Match to Life Cycle Needs (DOCU), Multi-Site Development (SITE), Applications Experience (AEXP), Platform Volatility (PVOL), Use of Software Tools (TOOL), Storage Constraint (STOR), Process Maturity (PMAT), Language and Tools Experience (LTEX), Required Development Schedule (SCED), Data Base Size (DATA), Platform Experience (PEXP), Architecture and Risk Resolution (RESL), Precedentedness (PREC), Develop for Reuse (RUSE), Team Cohesion (TEAM), Development Flexibility (FLEX). Scale-factor ranges computed at 10, 100, and 1000 KSLOC.]
COCOMO II Estimation Accuracy
Percentage of sample projects within 30% of actuals (without / with calibration to data source)

              COCOMO81    COCOMOII.1997    COCOMOII.2000
# Projects    63          83               161
Effort        81%         52% / 64%        75% / 80%
Schedule      65%         61% / 62%        72% / 81%
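The accuracy figures above are PRED(.30)-style statistics: the fraction of projects whose estimate falls within 30% of actuals. A minimal sketch:

```python
# PRED(level) accuracy statistic: fraction of estimates whose relative
# error against actuals is within the given level (0.30 in the table).

def pred(estimates, actuals, level=0.30):
    within = sum(1 for est, act in zip(estimates, actuals)
                 if abs(est - act) / act <= level)
    return within / len(actuals)

# Three of four estimates are within 30% of actuals.
score = pred([90, 100, 160, 55], [100, 100, 100, 50])   # 0.75
```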
Process Maturity (PMAT) Effects
- Effort reduction per maturity level, 100-KDSI project
- Normalized for effects of other variables
• Clark Ph.D. dissertation (112 projects)
  – Research model: 12-23% per level
  – COCOMO II subsets: 9-29% per level
• COCOMO II.1999 (161 projects)
  – 4-11% per level
• PMAT's positive contribution is statistically significant
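These per-level percentages compound multiplicatively across maturity levels; a small worked example (the 10% rate is an assumed value within the COCOMO II.1999 range above, not a calibrated figure):

```python
# Compounding a constant per-level effort reduction across maturity
# levels. The 10% rate is illustrative, not a calibrated value.

def effort_after_levels(base_effort, reduction_per_level, levels_gained):
    return base_effort * (1.0 - reduction_per_level) ** levels_gained

# A 1000 person-month project moving up 4 maturity levels at 10%/level:
pm = effort_after_levels(1000, 0.10, 4)   # 1000 * 0.9**4 = 656.1
```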
COCOMO vs. COCOTS Cost Sources
[Chart: staffing over time, contrasting the cost sources covered by COCOMO with those covered by COCOTS.]
Integrated COQUALMO
[Diagram: COCOMO II takes a software size estimate plus software product, process, computer, and personnel attributes, and produces a software development effort, cost, and schedule estimate. COQUALMO's Defect Introduction Model feeds its Defect Removal Model, which, given defect removal capability levels, outputs the number of residual defects and the defect density per unit of size.]
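The defect flow in the diagram can be sketched as introduction rates filtered by removal activities; the rates and removal fractions below are invented placeholders, not COQUALMO's calibrated baselines:

```python
# Toy sketch of the COQUALMO defect flow: defects introduced per
# artifact type, then filtered by successive removal activities.
# All numeric rates here are made-up placeholders.

INTRODUCED_PER_KSLOC = {"requirements": 10, "design": 20, "code": 30}

def residual_defects(ksloc, removal_fractions):
    """removal_fractions maps artifact type to a list of fractions of
    remaining defects each removal activity (e.g. automated analysis,
    peer reviews, execution testing) eliminates."""
    total = 0.0
    for artifact, rate in INTRODUCED_PER_KSLOC.items():
        remaining = rate * ksloc
        for f in removal_fractions.get(artifact, []):
            remaining *= (1.0 - f)
        total += remaining
    return total

# 100 KSLOC, each artifact scrubbed by three activities at 40/50/60%:
r = residual_defects(100, {a: [0.4, 0.5, 0.6] for a in INTRODUCED_PER_KSLOC})
```

With no removal activities the model degenerates to the raw introduction rates, which is the diagram's "Defect Introduction Model" output.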
COCOMO II RAD Extension (CORADMO)
[Diagram: COCOMO II cost drivers (except SCED) produce a baseline effort and schedule; COPSEMO distributes effort and schedule by stage; CORADMO then applies the RAD drivers RVHL, DPRS, CLAB, RESL, PPOS, and RCAP, plus language level and experience, to produce RAD effort and schedule by stage.]
COCOMO II Book Table of Contents
Boehm, Abts, Brown, Chulani, Clark, Horowitz, Madachy, Reifer, and Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
1. Introduction  2. Model Definition  3. Application Examples  4. Calibration  5. Emerging Extensions  6. Future Trends  Appendices*
  – Assumptions, data forms, user's manual, CD content
CD: video tutorials, USC COCOMO II 2000, commercial tool demos, manuals, data forms, web site links, Affiliate forms
Success Model Research at USC-CSE
• Stakeholder win-win (Theory W)
  – WinWin negotiation model and tools
  – Two Cultures and expectations management
• Success-model profiles and model clashes
Unmet Expectations Problems
• LCO success condition
  – Describes at least one feasible architecture
  – Satisfies requirements within cost/schedule/resource constraints
  – Viable, cost-effective business case
  – Stakeholder concurrence on key system parameters
• Projects that failed LCO criteria
  - 1996: 4 out of 16 (25%)
  - 1997: 4 out of 15 (27%)
Why?
Requirements and Expectations: Domain Model Clashes
•Easy/hard things for software people
“If you can do queries with all those ands, ors, synonyms, data ranges, etc., it should be easy to do natural language queries.”
“If you can scan the document and digitize the text, it should be easy to digitize the figures too.”
•Easy/hard things for librarians
“It was nice that you could add this access feature, but it overly (centralizes, decentralizes) control of our intellectual property rights.”
“It was nice that you could extend the system to serve the medical people, but they haven’t agreed to live with our usage guidelines.”
1998 Simplifier/Complicator Experiment
• Identify application simplifiers and complicators
  – For each digital library sub-domain
  – For both developers and clients
• Provide these, with explanations, to developers and clients
  – Highlight relation to risk management
• Homework exercise to analyze simplifiers and complicators
  – For two of the upcoming digital library projects
• Evaluate effect on LCO review failure rate
Example S&C's
(Items 1, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 20, 31, 32, 35, 36, 37, 39)

Type of application: Multimedia Archive
[Simple block diagram: queries and update notifications flow between users and a Catalog of MM asset info, backed by an MM Archive receiving MM asset updates.]
Simplifiers:
  · Use standard query languages
  · Use standard or COTS search engine
  · Uniform media formats
Complicators:
  · Natural language processing
  · Automated cataloging or indexing
  · Digitizing large archives
  · Digitizing complex or fragile artifacts
  · Automated annotation/description of, or assignment of meanings to, digital assets
  · Integration of legacy systems
  · Rapid access to large archives
  · Access to heterogeneous media collections
The Results
• Projects that failed LCO criteria
- 1996: 4 out of 16 (25%)
- 1997: 4 out of 15 (27%)
- 1998: 1 out of 19 (5%)
- 1999: 1 out of 22 (5%)
- 2000: 2 out of 20 (10%)
• Post-1997 failures due to non-S&C causes
– Team cohesion, client outages
The Model-Clash Spider Web: MasterNet
[Diagram: spider-web chart of the model clashes in the MasterNet project.]
Outline
• Developing Software/System Models at USC-CSE
  – Process, Product, Property, and Success Models
• Integrating Software/System Models
  – Motivation: model clashes
  – Model-Based (System) Architecting and Software Engineering (MBASE)
• Examples of successful MBASE use
  – Over 50 digital library projects
  – TRW CCPDS-R project
• Early Adopters
• Conclusions
• Further information
Understanding the Tar Pit: Model Clashes
• Model clash: an incompatibility among the underlying assumptions of a set of models
  – Often unrecognized
  – Produces conflicts, confusion, mistrust, frustration, rework, throwaway systems
Examples of Model Clashes
• Design-to-schedule process, unprioritized requirements, and tightly coupled architecture
• COTS-driven product and waterfall process
• Risk-based process and spec-based progress payments
• Evolutionary development without life-cycle architecture
• Golden Rule and stakeholder win-win
• Spec-based process and IKIWISI success model
  – "I'll know it when I see it"
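One way to make such a catalog operational is to screen a project's chosen models against known-incompatible pairs; a minimal sketch, where the clash list simply mirrors the examples above rather than any exhaustive catalog:

```python
# Sketch: screen a project's chosen models against a catalog of
# known-incompatible model pairs (the catalog below just encodes
# a few of the slide's examples).

from itertools import combinations

CLASHES = {
    frozenset({"COTS-driven product", "waterfall process"}),
    frozenset({"risk-based process", "spec-based progress payments"}),
    frozenset({"evolutionary development", "no life-cycle architecture"}),
    frozenset({"spec-based process", "IKIWISI success model"}),
}

def find_clashes(chosen_models):
    """Return every known-incompatible pair among the chosen models."""
    return [tuple(sorted(pair))
            for pair in (frozenset(p) for p in combinations(chosen_models, 2))
            if pair in CLASHES]

hits = find_clashes(["COTS-driven product", "waterfall process",
                     "stakeholder win-win"])
```

An empty result means no *known* clash, not that the model combination is safe; unrecognized clashes are exactly the problem the slide describes.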
MBASE Integration Framework
[Diagram: the four model classes and their linkages. Process models (life-cycle anchor points, risk management, key practices), success models (business case, IKIWISI, stakeholder win-win), property models (cost, schedule, performance, reliability), and product models (domain model, requirements, architecture, code, documentation) are tied together through planning and control, milestone content, evaluation and analysis, process entry/exit criteria, and product evaluation criteria.]
MBASE Process Framework
[Diagram: stakeholders identify and prioritize success models, which guide progress in selecting process models (e.g., the WinWin spiral process), set context for domain/environment models, and determine the relevance of property models. Property models provide parameters for, and evaluations of, product models. Conceptual product models (e.g., the Life Cycle Architecture package) are reified, via intermediate product models (IPM1 ... IPMn), into reified product models (e.g., the plan in the LCA package) that serve and satisfy the stakeholders; process models impose constraints on these refinements and enable satisficing among the models.]
Success Models Drive Other Model Choices

Success model: Demo agent-based e-commerce system at COMDEX in 9 months
  Key stakeholders: entrepreneurs, venture capitalists, customers
  Key property models: schedule estimation
  Process model: design-to-schedule
  Product model: domain constrained by schedule; architected for ease in dropping features to meet schedule

Success model: Safe air traffic control system
  Key stakeholders: controllers, government agencies, developers
  Key property models: safety models
  Process model: initial spiral to risk-manage COTS, etc.; final waterfall to verify safety provisions
  Product model: architected for fault tolerance, ease of safety verification
MBASE Electronic Process Guide (1)
MBASE Electronic Process Guide (2)
Outline
• Developing Software/System Models at USC-CSE
  – Process, Product, Property, and Success Models
• Integrating Software/System Models
  – Motivation: model clashes
  – Model-Based (System) Architecting and Software Engineering (MBASE)
• Examples of successful MBASE use
  – Over 100 digital library projects
  – TRW CCPDS-R project
• MBASE, CeBASE, and CMMI
• Conclusions
• Further information
The Challenge
• 15 digital library applications
  – 2-sentence problem statements
  – Librarian clients
• 86 graduate students
  – 30% with industry experience
  – Largely unfamiliar with each other and with library operations
• Develop LCA packages in 11 weeks
• Re-form teams from 30 continuing students
• Develop IOC packages in 12 more weeks
  – Including 2-week beta test and transition
Problem Statement #4: Medieval Manuscripts
Ruth Wallach, Reference Center, Doheny Memorial Library

I am interested in the problem of scanning medieval manuscripts in such a way that a researcher would be able both to read the content and to study the scribe's hand, special markings, etc. A related issue is that of transmitting such images over the network.
MBASE Model Integration: LCO Stage
[Diagram: the domain model determines the WinWin taxonomy and identifies stakeholders and their primary win conditions, which initialize the WinWin negotiation model; environment models identify frequent risks and architecture options. Negotiation - exercising IKIWISI prototypes, property models, and the process framework - iterates to feasibility and consistency among WinWin agreements, viable architecture options, an updated concept of operation, life-cycle plan elements, and outstanding LCO risks. These determine the content of the Life Cycle Objectives (LCO) package (basic concept of operation, requirements description, LCO rationale); the anchor point model serves as the package's table of contents, determines its exit criteria, and validates its readiness.]
MBASE Laboratory
• 15 software engineering projects/year
  - 5-person USC digital library applications
• Rapidly developing successful applications
  - Multimedia, virtual assistants, data acquisition
• Integrating models and tools
  - DARPA-EDCS architecture and WinWin tools
  - Rational Rose, Unified Modeling Language
• Rapidly improving artifact integration
  - 1996 integrated specs, plans: 160 pages
  - 1997 integrated specs, plans: 110 pages
• Results transitioning to early adopters
• Ultimate goal: model-integrated software engineering agents
Case Study: CCPDS-R Project Overview

Characteristic            CCPDS-R
Domain                    Ground-based C3 development
Size/language             1.15M SLOC Ada
Average number of people  75
Schedule                  75 months
Process/standards         DOD-STD-2167A, iterative development
Environment               Rational host, DEC host, DEC VMS targets
Contractor                TRW
Customer                  USAF
Current status            Delivered on-budget, on-schedule

(Source: Rational Software Corporation)
CCPDS-R MBASE Models
• Success models
  – Reinterpreted DOD-STD-2167A; users involved
  – Award fee flowdown to performers
• Product models
  – Domain model and architecture
  – Message-passing middleware (UNAS)
• Process models
  – Ada process model and toolset
  – Incremental builds; early delivery
• Property models
  – COCOMO cost & schedule
  – UNAS-based performance modeling
  – Extensive progress and quality metrics tools
Common Subsystem Macroprocess
[Diagram: development life cycle over months 0-25 across the Inception, Elaboration, and Construction phases. A competitive design phase (architectural prototypes, planning, requirements analysis) precedes contract award; architecture iterations culminate in an architecture baseline under change control, followed by release iterations and early delivery of an "alpha" capability to the user. Milestones SSR, IPDR, PDR, and CDR are marked, with the LCO and LCA anchor points overlaid. (Source: Rational Software Corporation)]
MBASE, CeBASE, and CMMI
• Model-Based (System) Architecting and Software Engineering (MBASE)
  – Extension of the WinWin spiral model
  – Avoids process/product/property/success model clashes
  – Provides project-oriented guidelines
• Center for Empirically-Based Software Engineering (CeBASE)
  – Led by USC, U. Maryland
  – Sponsored by NSF, others
  – Empirical software data collection and analysis
  – Integrates MBASE, Experience Factory, and GMQM into the CeBASE Method
    • Goal-Model-Question-Metric Method
    • Integrated organization/portfolio/project guidelines
• The CeBASE Method implements the Capability Maturity Model Integration (CMMI) and more
  – Parts of People CMM, but light on Acquisition CMM
Center for Empirically-Based Software Engineering (CeBASE) Strategic Vision
[Diagram: the strategic framework - strategic process: Experience Factory; tailoring guidelines: Goal-Model-Question-Metric; tactical process: model integration (MBASE) and the WinWin spiral - draws on empirical methods, both quantitative (experimental; observational analysis; parametric models; dynamic models; Pareto 80-20 relationships) and qualitative (ethnographic; surveys and assessments; critical success factors; root cause analysis). These feed an experience base (context; results) holding project and context attributes, empirical results and references, implications and recommended practices, and experience feedback comments.]
• Initial foci: COTS-based systems; defect reduction
Integrated GMQM-MBASE Experience Factory
[Diagram: parallel feedback loops at the organization/portfolio level (EF-GMQM) and the project level (MBASE). Each level forms a shared vision (value propositions and stakeholder values; current situation; improvement goals and priorities; scope and results chain; value/business case models), then plans (strategy elements or LCO/LCA and IOC/Transition/Support packages with ops concept, prototypes, requirements, architecture, life-cycle plan, rationale, design, increment plans, and quality plans; evaluation criteria/questions; progress metrics; experience base), then monitors and controls (monitor environment and update models; implement plans; evaluate progress with respect to goals, models, and plans; determine and apply corrective actions; update the experience base). Mismatches between progress, plans, and goals - shortfalls, opportunities, risks - flow back as scoping, planning, and monitoring-and-control context between the two levels.
LCO: Life Cycle Objectives; LCA: Life Cycle Architecture; IOC: Initial Operational Capability; GMQM: Goal-Model-Question-Metric paradigm; MBASE: Model-Based (System) Architecting and Software Engineering. Applies to the organization's and projects' people, processes, and products.]
COCOMO II Experience Factory: IV
[Flowchart: system objectives (functionality, performance, quality) and corporate parameters (tools, processes, reuse) feed COCOMO 2.0, which produces cost, schedule, and risk estimates; if unacceptable, rescope. Otherwise execute the project to the next milestone and compare milestone results against milestone expectations; if they diverge, revise milestones, plans, and resources (yielding revised expectations) and re-estimate. At each pass, accumulate COCOMO 2.0 calibration data, recalibrate COCOMO 2.0, and evaluate corporate software improvement strategies to produce improved corporate parameters. When done, end.]
CeBASE Method Coverage of CMMI - I
• Process Management
  – Organizational Process Focus: 100+
  – Organizational Process Definition: 100+
  – Organizational Training: 100-
  – Organizational Process Performance: 100-
  – Organizational Innovation and Deployment: 100+
• Project Management
  – Project Planning: 100
  – Project Monitoring and Control: 100+
  – Supplier Agreement Management: 50-
  – Integrated Project Management: 100-
  – Risk Management: 100
  – Integrated Teaming: 100
  – Quantitative Project Management: 70-
CeBASE Method Coverage of CMMI - II
• Engineering
  – Requirements Management: 100
  – Requirements Development: 100
  – Technical Solution: 60+
  – Product Integration: 70+
  – Verification: 70-
  – Validation: 80+
• Support
  – Configuration Management: 70-
  – Process and Product Quality Assurance: 70-
  – Measurement and Analysis: 100-
  – Decision Analysis and Resolution: 100-
  – Organizational Environment for Integration: 80-
  – Causal Analysis and Resolution: 100
Conclusions
• Research on several classes of models is synergistic
  – COCOMO II, WinWin, spiral extensions, and architecture research each helped the others
• MBASE is proving successful in avoiding model clashes
• USC-CSE Affiliate support has been essential
USC-CSE Affiliates (35)
• Commercial industry (18)
  – Automobile Club of Southern California, C-Bridge, DaimlerChrysler, EDS, Fidelity Group, Galorath, GroupSystems.com, Hughes, IBM, Lucent, Marotz, Microsoft, Motorola, Price Systems, Rational, Sun, Telcordia, Xerox
• Aerospace industry (9)
  – Boeing, Draper Labs, GDE Systems, Litton, Lockheed Martin, Northrop Grumman, Raytheon, SAIC, TRW
• Government (3)
  – FAA, US Army Research Labs, US Army TACOM
• FFRDCs and consortia (4)
  – Aerospace, JPL, SEI, SPC
• International (1)
  – Chung-Ang U. (Korea)
Further Information
V. Basili, G. Caldiera, and H. Rombach, "The Experience Factory" and "The Goal Question Metric Approach," in J. Marciniak (ed.), Encyclopedia of Software Engineering, Wiley, 1994.
B. Boehm, C. Abts, A.W. Brown, S. Chulani, B. Clark, E. Horowitz, R. Madachy, D. Reifer, and B. Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
B. Boehm and D. Port, "Escaping the Software Tar Pit: Model Clashes and How to Avoid Them," ACM Software Engineering Notes, January 1999.
B. Boehm et al., "Using the WinWin Spiral Model: A Case Study," IEEE Computer, July 1998, pp. 33-44.
R. van Solingen and E. Berghout, The Goal/Question/Metric Method, McGraw-Hill, 1999.
COCOMO II, MBASE items: http://sunset.usc.edu
CeBASE items: http://www.cebase.org