ODU CS 350 - Lecture Notes


CS 350, slide set 10
M. Overstreet
Old Dominion University
Fall 2005

Reading
- TSP text, ch. 9 and 10.
- Remember, you are supposed to have read the chapter on your role from ch. 11-15, and ch. 16, 17, and 18.

Deadlines, Guidelines - 1
- Project: due Monday, Dec. 12
- Submit to [email protected]
- I must be able to determine who did what.
  • The name (or names) of who did it must be included with each item.
- I must be able to determine what works.
  • Include actual output whenever appropriate.
- See the web site checklist section for a complete list of due dates.

Additional Form
- Additional form: submission checklist
- A list of everything submitted
  • Based on the submissions checklist on the web
- When it was submitted
- Identification of who completed each item submitted (including forms)

Project evaluation - 1
- I must be able to determine:
- Who did what.
  • Put names in documents (code, forms, test reports, etc.)
  • If no name, no credit.
- What process steps were completed.
  • Will rely on forms.
- For each module's reviews and inspections
  • Will rely on forms.
- What code actually works.
  • Will rely on files produced during testing, so make sure these are included with modules.
- How much testing was performed.
  • Will rely on test reports.

Project evaluation - 2
- Individual project grades are determined as follows:
  • Your peer evaluations: 15%
  • My impression of your contributions: ±15%
  • Forms: 35%
  • Evidence of execution: 30% (note that without execution, some forms must be incomplete)
  • Quality/completeness of materials: 10%
- As I go through each group's submissions, I will record what you've done.
Your grade will be based on that list.

Project code suggestions
- Most errors come from misunderstanding of requirements.
- These types of errors should be identified in inspections.

Testing: selected case studies
- Remember: it is hard to get this kind of data.
- Magellan spacecraft (1989-1994) to Venus
  • 22 KLOC – this is small
  • 186 defects found in system test, 42 of them critical
  • Only 1 critical defect found in the 1st year of testing
  • Project a success, but there were several software-related emergencies
- Galileo spacecraft (1989 launch, 1995 arrival)
  • Testing took 6 years
  • The final 10 critical defects were found after 288 weeks of testing

PSP/TSP approach
- Find most defects before integration testing, during:
  • Reviews (requirements, HLD, DLD, test plans, code)
  • Inspections
  • Unit testing
- Each of these activities is expensive, but testing is worse.
- TSP goal: use testing to confirm that code is high quality.
  • May need to return low-quality code for rework or scrapping.
- Data show a strong relationship between defects found in testing and defects found by customers.

Build and integration strategies: big bang
- Build & test all pieces separately, then put them all together at the end and see what happens.
- Out of favor.
- Debugging all the pieces at the same time makes it harder to identify the real causes of problems.
- Industry experience: 10 defects/KLOC
  • All-too-typical: a system with 30,000 defects

B & I strategies: one subsystem at a time
- Design the system so that it can be implemented in steps, with each step useful.
- First test a minimal system
  • After its components have been tested
- Add one component at a time
  • Defects are more likely to come from the new parts.
- Not all systems lend themselves to this approach.

B & I strategies: add clusters
- If the system has components with dependencies among them, it may be necessary to add clusters of interacting components.

B & I strategies: top down
- Top-down integration: integrate top-level components first
  • With lower-level components stubbed as necessary
- May identify integration issues earlier than other approaches.
- I suggest this approach for this project.
- Write the top-level routines first when feasible.
  • They call stubbed functions.
- As modules become available, they replace the stubbed versions.

Typical testing goals
- Show the system provides all specified functions
  • Does what it is supposed to do
- Show the system meets stated quality goals
  • MTBF, for example
- Show the system works under stressful conditions
  • Doesn't do "bad" things when other systems (e.g., power) fail, the network overloads, or the disk is full
- In reality, schedule/budget considerations may limit testing to only the most frequent or critical behaviors.

Test log includes:
- Date, start and end time of tests
- Name of tester
- Which tests were run
- What code & configuration was tested
- Number of defects found
- Test results
- Other pertinent information
  • Special tools, system configuration, operator actions
- See the sample test log, pg. 172

Documentation - 1
- Probably needs another course.
- Must write from the perspective of the user of the documentation:
  • Other programmers on the team
  • Future maintenance programmers
  • Installers
  • Managers
  • Users
- Better to hire English majors to write documentation?
  • It may be easier to teach them the computing part than to teach technical geeks how to write well.

Documentation - 2
- Developers often do a poor job.
- Even when proofing, omissions (what you forgot to tell the reader) often go undetected, since the writer already knows them.
  • A student just finished an MS thesis on software metrics of open-source code and did not explain what KDSI meant until the end of the thesis! I missed it too!
- Guidelines: include
  • A glossary to define special terms
  • A detailed table of contents
  • A detailed index
  • Sections on error messages, recovery procedures, and troubleshooting procedures

Testing script
- Covered in the text.

Postmortem
- Why?
- We're still learning how to do this.
- Organization goal: learn from this project to improve the next one.
  • We shouldn't keep making the same mistakes.
- Individual goal: make you a more valuable employee.
  • Update your personal checklists, etc.

Postmortem script
- We'll skip it; it works better after 3 cycles.
- We will discuss it in class; be ready to tell me:
  • Where the process worked and where it did not
  • How did actual performance compare with expected?
  • Where did your team do well? Where not?

PIP objectives
- While the project is fresh, record good ideas on process improvements.
- Implicit goal: be skeptical about TSP as the solution to all software problems.
  • Each organization and problem domain probably has its own unique problems; one size does not fit all.
- But a request: be tolerant. Learn from others' experience; don't reject too quickly.

Peer evaluations
- Use the form from the text.
- It includes your impression of who
  • had the hardest role (% sum to 100)
  • had the most work (% sum to 100)
- You must use team member names and roles (unlike the form).
- Written
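The defect-density figures in the Magellan case study above can be sanity-checked with quick arithmetic; the result lands near the "10 defects/KLOC" industry experience cited for big-bang integration:

```python
# Defect density = defects found / thousands of lines of code (KLOC).
# Figures from the Magellan case study above.
magellan_defects = 186
magellan_kloc = 22
density = magellan_defects / magellan_kloc
print(f"Magellan system-test density: {density:.1f} defects/KLOC")
# prints about 8.5 defects/KLOC, the same order as the ~10 defects/KLOC industry figure
```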
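The suggested top-down build strategy above can be sketched in code. A minimal Python sketch (all module and function names here are hypothetical, not from the project): the top-level routine is written first and calls stubs, and each stub is later replaced by the real component as it becomes available.

```python
# Top-down integration sketch: the top-level routine exists from day one,
# with lower-level components stubbed as necessary.

def parse_input(raw):
    # Stub: replaced when the real parser module is available.
    return {"raw": raw}

def compute_result(record):
    # Stub: returns a fixed value so the driver can run end to end.
    return 0

def format_report(value):
    # Stub: minimal placeholder formatting.
    return f"result: {value}"

def main(raw):
    # Top-level routine: its control flow can be integration-tested
    # immediately, even while every callee is still a stub.
    record = parse_input(raw)
    value = compute_result(record)
    return format_report(value)

print(main("sample input"))  # prints "result: 0" while everything is stubbed
```

Replacing `compute_result` with the real module changes nothing in `main`, which is why integration issues in the top-level control flow surface early.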
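The test-log fields listed above map naturally onto a simple append-only log. A sketch in Python, with the file name and field names chosen for illustration (the text's sample log on pg. 172 is the authoritative format):

```python
import csv
from datetime import datetime

# One CSV row per test run, mirroring the fields the slide lists.
FIELDS = ["date", "start", "end", "tester", "tests_run",
          "code_config", "defects_found", "results", "notes"]

def log_test_run(path, **entry):
    """Append one test-log entry, writing the header first if the file is new."""
    try:
        with open(path) as f:
            new_file = f.read(1) == ""
    except FileNotFoundError:
        new_file = True
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(entry)

log_test_run("test_log.csv",
             date=datetime(2005, 12, 5).date().isoformat(),
             start="09:00", end="09:45",
             tester="A. Student",
             tests_run="unit tests for parser module",
             code_config="rev 12, debug build",
             defects_found=2,
             results="1 failure in a boundary case",
             notes="disk-full case not yet covered")
```

A log like this also makes the "Evidence of execution" portion of the project grade easy to demonstrate, since every run leaves a dated, named record.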

