Toronto CSC 302 - Lecture 15 - Verification and Validation

University of Toronto, Department of Computer Science
© 2008 Steve Easterbrook. This presentation is available free for non-commercial use with attribution under a Creative Commons license.

Lecture 15: Verification and Validation
- Refresher: definitions of V&V
- V&V strategies
- Independent V&V
- Quality Assurance

Refresher: V&V

Validation: "Are we building the right system?"
- Does our problem statement accurately capture the real problem?
- Did we account for the needs of all the stakeholders?

Verification: "Are we building the system right?"
- Does our design meet the spec?
- Does our implementation meet the spec?
- Does the delivered system do what we said it would do?
- Are our requirements models consistent with one another?

(Diagram: Problem Situation, Problem Statement, Implementation Statement and System, connected by validation and verification relationships.)

Verification

Traditional approaches to verification:
- experiment with the program (testing)
- reason about the program (static verification)
- inspect the program (reviews)

Model-based verification:
- Do the use cases satisfy the requirements? (goal analysis)
- Does the code correspond to the model? (consistency checking)
- Does the class model satisfy the use cases? (robustness analysis)

Basic Cross-Checks for UML

Use case diagrams:
- Does each use case have a user?
- Does each user have at least one use case?
- Is each use case documented, using sequence diagrams or equivalent?

Class diagrams:
- Does the class diagram capture all the classes mentioned in other diagrams?
- Does every class have methods to get/set its attributes?

Sequence diagrams:
- Is each class in the class diagram?
- Can each message be sent?
  - Is there an association connecting sender and receiver classes on the class diagram?
  - Is there a method call in the sending class for each sent message?
  - Is there a method call in the receiving class for each received message?
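
These sequence-diagram cross-checks are mechanical enough to script against a tool's model repository. The sketch below is only an illustration, not part of the lecture: it assumes a hypothetical in-memory representation of the class diagram (ClassBox) and of sequence-diagram messages (Message), both invented here, and flags messages whose receiving class is missing from the class diagram, has no matching method, or has no association with the sender.

```python
# A minimal sketch (not from the lecture) of the sequence-vs-class-diagram
# cross-check: "is there a method in the receiving class for each message,
# and an association between sender and receiver?"
# ClassBox and Message are hypothetical stand-ins for however a UML tool
# actually exposes its model.

from dataclasses import dataclass, field

@dataclass
class ClassBox:
    name: str
    attributes: list = field(default_factory=list)
    methods: list = field(default_factory=list)
    associations: set = field(default_factory=set)   # names of associated classes

@dataclass
class Message:
    sender: str      # class of the sending object
    receiver: str    # class of the receiving object
    operation: str   # message name, expected to match a method on the receiver

def cross_check(classes: dict[str, ClassBox], messages: list[Message]) -> list[str]:
    """Return a list of human-readable inconsistencies."""
    problems = []
    for m in messages:
        if m.receiver not in classes:
            problems.append(f"{m.receiver} appears in a sequence diagram "
                            f"but not in the class diagram")
            continue
        if m.operation not in classes[m.receiver].methods:
            problems.append(f"message '{m.operation}' has no matching method "
                            f"in class {m.receiver}")
        if m.sender in classes and m.receiver not in classes[m.sender].associations:
            problems.append(f"no association between {m.sender} and {m.receiver} "
                            f"for message '{m.operation}'")
    return problems

# Tiny usage example with made-up model elements:
classes = {
    "Customer": ClassBox("Customer", methods=["getName"], associations={"Order"}),
    "Order":    ClassBox("Order", methods=["addItem", "total"], associations={"Customer"}),
}
messages = [Message("Customer", "Order", "addItem"),
            Message("Customer", "Order", "cancel")]   # 'cancel' has no method -> flagged
print("\n".join(cross_check(classes, messages)))
```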

Understanding Validation

The validation cycle: prior knowledge (e.g. customer feedback) → observe (what is wrong with the current system?) → model (describe/explain the observed problems) → design (invent a better system) → intervene (replace the old system).

Note the similarity with the process of scientific investigation: requirements models are theories about the world, and designs are tests of those theories. The corresponding steps are: initial hypotheses → look for anomalies (what can't the current theory explain?) → create/refine a better theory → design experiments to test the new theory → carry out the experiments (manipulate the variables).

Validation Techniques

(Same cycle as above, annotated with techniques: build a prototype and get users to try it (what is wrong with the prototype?); inspect the model, analyze the model, or run a model-checking tool (what is wrong with the model?).)

Prototyping

- Presentation prototypes: explain, demonstrate and inform, then throw away. E.g. used for proof of concept, explaining design features, etc.
- Exploratory prototypes: used to determine problems, elicit needs, clarify goals, and compare design options. Informal, unstructured, and thrown away.
- Breadboards or experimental prototypes: explore technical feasibility; test suitability of a technology. Typically no user/customer involvement.
- Evolutionary prototypes (e.g. "operational prototypes", "pilot systems"): development seen as a continuous process of adapting the system; the "prototype" is an early deliverable, to be continually improved.

Usability Testing

- Real users try out the system (or prototype):
  - choose representative tasks
  - choose representative users
  - observe what problems they encounter
- How many users? 3-5 users gives the best return on investment.

Model Analysis

Verification: "Is the model well-formed?"
- Are the parts of the model consistent with one another?

Validation:
- Animation of the model on small examples
- Formal challenges: "if the model is correct, then the following property should hold..."
- 'What if' questions: reasoning about the consequences of particular requirements, about the effect of possible changes, and about questions such as "will the system ever do the following..."
- State exploration: e.g. use model checking to find traces that satisfy some property

Model Checkers

Model checkers check properties expressed in temporal logic. Temporal logic adds modal operators to first-order predicate logic, e.g.:
- □p: p is true now and always (in the future)
- ◇p: p is true eventually (in the future)
- □(p ⇒ ◇q): each p is eventually followed by a q

The model may be:
- the program itself (each statement is a 'state')
- an abstraction of the program
- a model of the specifications
- a model of the requirements

A model checker searches all paths in the state space, with lots of techniques for reducing the size of the search.

Model checking does not guarantee correctness:
- it only tells you about the properties you ask about
- it may not be able to search the entire state space (too big!)
- ...but it is (generally) more practical than proofs of correctness.
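
To make the state-exploration idea concrete, here is a toy sketch of what an explicit-state model checker does in the simplest case; it is my illustration, not material from the lecture. It enumerates every reachable state of a small hand-written transition system (two processes sharing a lock, an invented example) and checks a □p-style invariant, returning a counterexample trace if the property fails. Real checkers such as SPIN or NuSMV support full temporal logic and apply the state-space reduction techniques the slide alludes to; none of that is attempted here.

```python
# A toy explicit-state "model checker" (illustration only, not from the lecture).
# It exhaustively searches the state space of a hand-written model of two
# processes sharing a lock, and checks a □p-style invariant:
#   "never are both processes in their critical section at the same time."

from collections import deque

# A state is (proc0_location, proc1_location, lock_free).
INITIAL = ("idle", "idle", True)

def successors(state):
    """All states reachable in one step (interleaving the two processes)."""
    locs, lock_free = list(state[:2]), state[2]
    for i in (0, 1):
        loc = locs[i]
        if loc == "idle":                      # process starts competing
            yield _update(locs, lock_free, i, "waiting")
        elif loc == "waiting" and lock_free:   # acquire the lock
            yield _update(locs, False, i, "critical")
        elif loc == "critical":                # release the lock
            yield _update(locs, True, i, "idle")

def _update(locs, lock_free, i, new_loc):
    new = list(locs)
    new[i] = new_loc
    return (new[0], new[1], lock_free)

def mutual_exclusion(state):
    return not (state[0] == "critical" and state[1] == "critical")

def check_invariant(initial, prop):
    """Breadth-first search of all reachable states; return a violating trace or None."""
    frontier = deque([[initial]])
    seen = {initial}
    while frontier:
        path = frontier.popleft()
        if not prop(path[-1]):
            return path                        # counterexample: path to a bad state
        for nxt in successors(path[-1]):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None

trace = check_invariant(INITIAL, mutual_exclusion)
print("property holds on all reachable states" if trace is None
      else "counterexample trace: " + " -> ".join(map(str, trace)))
```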

