UMD CMSC 735 - Building Parametric Models


Slide 1: Building Parametric Models
Barry Boehm, [email protected]
USC-CSE, September 29, 2003

Slide 2: Outline
• Range of software engineering parametric models and forms
• Goals: Model success criteria
• 8-step model development process
  – Examples from COCOMO family of models
• Conclusions

Slide 3: Range of SE Parametric Models
• Outcome = f(Outcome-driver parameters)
• Most frequent outcome families
  – Throughput, response time; workload
  – Reliability, defect density; usage
  – Project cost, schedule; sizing
  – Other costs: facilities, equipment, services, licenses, installation, training
  – Benefits: sales, profits, operational savings
  – Return on investment = (Benefits - Costs) / Costs

Slide 4: Parametric Model Forms
• Analogy: Outcome = f(previous outcome, differences)
  – Example: yesterday's weather
• Unit cost: Outcome = f(unit costs, unit quantities)
  – Example: computing equipment
• Activity-based: Outcome = f(activity levels, durations)
  – Examples: operational cost savings, training costs
• Relationship-based: Outcome = f(parametric relationships)
  – Examples: queuing models, size and productivity cost models (a code sketch of this form follows slide 10)

Slide 5: Goals: Model Success Criteria
• Scope: Covers the desired range of situations?
• Granularity: Level of detail sufficient for needs?
• Accuracy: Estimates close to actuals?
• Objectivity: Inputs repeatable across estimators?
• Calibratability: Sufficient calibration data available?
• Constructiveness: Helps to understand the job to be done?
• Ease of use: Parameters easy to understand and specify?
• Prospectiveness: Parameter values knowable early?
• Parsimony: Avoids unnecessary parameters and features?
• Stability: Small input changes mean small output changes?
• Interoperability: Easy to compare with related models?

Slide 6: Outline
• Range of software engineering parametric models and forms
• Goals: Model success criteria
• 8-step model development process
  – Example from COCOMO family of models
• Conclusions

Slide 7: USC-CSE Modeling Methodology
Step 1: Determine model needs
Step 2: Analyze existing literature
Step 3: Perform behavioral analyses
Step 4: Define relative significance, data, ratings
Step 5: Perform expert-judgment Delphi assessment; formulate a priori model
Step 6: Gather project data
Step 7: Determine Bayesian a-posteriori model
Step 8: Gather more data; refine model
(Concurrency and feedback among steps implied.)

Slide 8: Step 1: Determine Model Needs
• Similar to software requirements determination
  – Identify success-critical stakeholders
    • Decision-makers, users, data providers
  – Identify their model needs (win conditions)
  – Identify their ability to provide inputs, calibration data
  – Negotiate best achievable (win-win) model capabilities
• Prioritize capabilities for incremental development
• Use the model success criteria as a checklist

Slide 9: Major Decision Situations Helped by COCOMO II
• Software investment decisions
  – When to develop, reuse, or purchase
  – What legacy software to modify or phase out
• Setting project budgets and schedules
• Negotiating cost/schedule/performance tradeoffs
• Making software risk management decisions
• Making software improvement decisions
  – Reuse, tools, process maturity, outsourcing

Slide 10: Step 2: Analyze Existing Literature
• Understand the underlying phenomenology
  – Sources of cost, defects, etc.
• Identify promising or unsuccessful model forms
  – Linear, discontinuous software cost models
  – Model forms may vary by source of cost, defects, etc.
  – Invalid assumptions (e.g., queuing models)
• Identify the most promising outcome-driver parameters
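To make the relationship-based form from slide 4 concrete, here is a minimal Python sketch of a size-and-productivity cost model of the general COCOMO II shape, Effort = A · Size^E · ∏ EM_i. The constant A = 2.94, the exponent E = 1.10, and the driver names and values are illustrative placeholders, not calibrated model values.

```python
from math import prod

def estimate_effort_pm(ksloc: float,
                       effort_multipliers: dict[str, float],
                       a: float = 2.94,    # placeholder constant, not calibrated
                       e: float = 1.10) -> float:  # placeholder exponent
    """Effort in person-months: A * Size^E * product of cost-driver multipliers."""
    return a * ksloc ** e * prod(effort_multipliers.values())

# Hypothetical cost-driver settings (names and values are illustrative only):
drivers = {"required_reliability": 1.10,  # above-nominal reliability raises effort
           "team_capability": 0.85}       # above-nominal team lowers effort
print(round(estimate_effort_pm(50.0, drivers), 1))  # -> 203.2 person-months
```

The multiplicative form is what makes such models "constructive" in the slide-5 sense: each driver's contribution to the estimate is visible and separable.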
Slide 11: Nonlinear Reuse Effects
[Figure: cost fraction (0-1.0) vs. fraction of code modified (0-1.0), from data on 2,954 NASA modules [Selby, 1988]. The observed cost fractions (0.046, 0.55, 0.70, 1.0) rise steeply above the usual linear assumption: even code reused without modification costs about 4.6% of new development, and modest modifications drive costs up disproportionately.]

Slide 12: Reuse Cost Increment for Software Understanding
The software understanding (SU) increment to equivalent SLOC (ESLOC) depends on three attributes of the reused code, each rated Very Low to Very High (a code sketch using these increments follows slide 15):

Very Low (SU increment to ESLOC: 50)
– Structure: Very low cohesion, high coupling, spaghetti code.
– Application clarity: No match between program and application world views.
– Self-descriptiveness: Obscure code; documentation missing, obscure, or obsolete.

Low (SU increment to ESLOC: 40)
– Structure: Moderately low cohesion, high coupling.
– Application clarity: Some correlation between program and application.
– Self-descriptiveness: Some code commentary and headers; some useful documentation.

Nominal (SU increment to ESLOC: 30)
– Structure: Reasonably well-structured; some weak areas.
– Application clarity: Moderate correlation between program and application.
– Self-descriptiveness: Moderate level of code commentary, headers, documentation.

High (SU increment to ESLOC: 20)
– Structure: High cohesion, low coupling.
– Application clarity: Good correlation between program and application.
– Self-descriptiveness: Good code commentary and headers; useful documentation; some weak areas.

Very High (SU increment to ESLOC: 10)
– Structure: Strong modularity, information hiding in data/control structures.
– Application clarity: Clear match between program and application world views.
– Self-descriptiveness: Self-descriptive code; documentation up-to-date, well-organized, with design rationale.

Slide 13: Step 3: Perform Behavioral Analysis
Behavior differences across required reliability levels:

Very Low rating
– Requirements and product design: Little detail; many TBDs; little verification; minimal QA, CM, draft user manual, test plans; minimal PDR.
– Integration and test: No test procedures; many requirements untested; minimal QA, CM; minimal stress and off-nominal tests; minimal as-built documentation.

Very High rating
– Requirements and product design: Detailed verification, QA, CM, standards, PDR, documentation; IV&V interface; very detailed test plans and procedures.
– Integration and test: Very detailed test procedures, QA, CM, standards, documentation; very extensive stress and off-nominal tests; IV&V interface.

Slide 14: USC-CSE Modeling Methodology (repeat of the eight-step diagram from slide 7; concurrency and feedback implied)

Slide 15: Step 4: Relative Significance: COSYSMO
Delphi instructions: Rate each factor H, M, or L depending on its relatively high, medium, or low influence on system engineering effort. Use an equal number of H's, M's, and L's.

Results (N = 6); numeric values from the slide: 3.0, 2.5, 2.3, 1.5, 1.7, 1.7, 1.5, 1.5, 2.7, 2.7, 3.0, 2.0, 1.5, 2.0, 1.3.

Application factors:
– Requirements understanding: H
– Architecture understanding: M-H
– Level of service rqts. criticality, difficulty: L-H
– Legacy transition complexity: L-M
– COTS assessment complexity: L-M
– Platform difficulty: L-H
– Required business process reengineering: L-M
– Ops. concept understanding: TBD (N=H)
– TBD
Team factors: L- …
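The SU increments in the slide-12 table feed an equivalent-SLOC reuse calculation that captures the nonlinear reuse effects of slide 11. Below is a sketch following the published COCOMO II reuse model (AAF = 0.4·DM + 0.3·CM + 0.3·IM, with separate forms for AAF at or below 50 versus above 50); the constants are quoted from that model as a sketch and should be checked against a current model definition, and the example inputs are hypothetical.

```python
def equivalent_sloc(adapted_sloc: float,
                    dm: float,    # % of design modified (0-100)
                    cm: float,    # % of code modified (0-100)
                    im: float,    # % of integration effort required (0-100)
                    su: float,    # SU increment (10-50, from the slide-12 table)
                    unfm: float,  # unfamiliarity: 0.0 (familiar) to 1.0 (new)
                    aa: float = 0.0) -> float:  # assessment/assimilation (0-8)
    """Convert adapted code into equivalent new SLOC (COCOMO II reuse model sketch)."""
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im  # adaptation adjustment factor
    if aaf <= 50:
        aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    else:
        aam = (aa + aaf + su * unfm) / 100
    return adapted_sloc * aam

# Hypothetical example: 10,000 adapted SLOC, lightly modified, nominal
# understanding (SU = 30), moderately unfamiliar team (UNFM = 0.6):
print(equivalent_sloc(10_000, dm=10, cm=15, im=20, su=30, unfm=0.6))
# -> 1972.0 equivalent new SLOC
```

Note how SU and UNFM multiply the modification fraction rather than adding a flat cost: hard-to-understand code makes every percent of modification more expensive, which is exactly the nonlinearity in Selby's data.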

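Step 7 of the methodology combines the step-5 expert-judgment (a priori) model with step-6 project data. Here is a minimal sketch of the underlying idea, assuming a simple normal-normal Bayesian update on a single parameter, with each estimate weighted by its precision (inverse variance); COCOMO II's actual calibration applies a multivariate version of this to its regression coefficients, and the numbers in the example are hypothetical.

```python
def bayesian_posterior(prior_mean: float, prior_var: float,
                       data_mean: float, data_var: float) -> tuple[float, float]:
    """Precision-weighted combination of an a priori and a data-driven estimate."""
    w_prior = 1.0 / prior_var  # more precise estimates get more weight
    w_data = 1.0 / data_var
    post_mean = (w_prior * prior_mean + w_data * data_mean) / (w_prior + w_data)
    post_var = 1.0 / (w_prior + w_data)
    return post_mean, post_var

# Hypothetical example: a Delphi panel puts a cost driver's productivity range
# at 1.4 (variance 0.04); noisy project data suggest 1.8 (variance 0.16).
# The posterior leans toward the more precise expert estimate.
mean, var = bayesian_posterior(1.4, 0.04, 1.8, 0.16)
print(round(mean, 2), round(var, 3))  # -> 1.48 0.032
```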
