U of U CS 5780 - Lecture 18 - Introduction to Verification

CS/ECE 5780/6780: Embedded System Design
John Regehr
Lecture 18: Introduction to Verification

What is verification?
- Verification: a process that determines whether the design conforms to the specification.
- It answers the question, "Does the design do what is intended?"
- It is more than a testbench.
- Applies to both hardware and software.

Why do verification as early as possible?
- Bugs become much more costly the longer they go unfound.
- Bugs found at the unit level can often be fixed cheaply.
- Bugs found at the system level may affect time-to-market.
- Bugs found after fabrication require an expensive respin.
- Bugs found by customers can result in potentially company-crushing recalls and a bad reputation.

Who does verification?
- Designers may begin the process.
- Verification engineers manage and complete the process.
- Verification engineers may outnumber designers two to one.
- Verification cost may dominate the overall cost of a project.
- Is this good or bad?

Verification
"Validation is so complex that even though it consumes the most computational resources and time, it is still the weakest link in the design process. Ensuring functional correctness is the most difficult part of designing a hardware system."
-- Serdar Tasiran and Kurt Keutzer

Why is verification hard?
- Getting a detailed specification of correct behavior may be very difficult.
- Proving conformance between the system and its specification is usually impossible.
- For example, the Windows XP specification may say "the system is secure from network intrusion."
- Verification tools may be buggy.
- Often we settle for weak forms of verification: the system seems to conform to the specification in the cases we looked at.

Common types of verification
- Functional verification.
  - Simulation.
    - Directed tests.
    - Constrained pseudorandom tests.
  - Formal methods.
    - Model checking.
    - Equivalence checking.
    - Automated theorem proving.
- Timing verification.
- Performance verification.

Functional Verification Approaches
- Functional verification is primarily done via simulation.
- Black-box.
- White-box.
- Gray-box.

Black-box Verification
- The verifier has access to inputs, outputs, and the device function.
- Given a set of inputs, the verifier checks for correct outputs.
- To fully verify a black box you must show that the function is correct for all combinations of inputs.
- Full verification via black-box testing is impractical for any real design.

White-box Verification
- The verifier has access to (and uses) internal signals during verification.
- This is common during block-level verification.

Gray-box Verification
- The verifier has access to (and uses) a limited number of internal signals during verification.
- This is the reality for most verification.
- Predicting the correct output value without viewing an internal signal is often difficult.
- Knowing the architecture of the DUT (device under test) enables you to write better tests.

The Heart of Verification
- Does the verification cover all possible inputs?
- How can the verifier detect a failure? (A minimal self-checking directed test in C is sketched below.)
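To make the directed-test and black-box ideas concrete, here is a minimal sketch in C. The DUT is hypothetical (an 8-bit saturating adder, sat_add8, not anything from the lecture); in a real flow the block would be RTL driven through a simulator, but the black-box idea is the same: drive inputs, compare outputs against known-good values, and know nothing about the internals.

/*
 * Minimal sketch of a self-checking directed test, assuming a hypothetical
 * DUT: an 8-bit saturating adder (sat_add8).  A C model stands in for the
 * simulated hardware block.
 */
#include <stdint.h>
#include <stdio.h>

/* Hypothetical device under test: clamps the sum to 255 instead of wrapping. */
static uint8_t sat_add8(uint8_t a, uint8_t b)
{
    uint16_t sum = (uint16_t)a + (uint16_t)b;
    return (sum > 0xFF) ? 0xFF : (uint8_t)sum;
}

struct vector { uint8_t a, b, expected; };

int main(void)
{
    /* Directed vectors: ordinary cases plus the corner cases a designer
     * might miss (zero operands, the exact saturation boundary, overflow). */
    static const struct vector tests[] = {
        {   0,   0,   0 },
        {   1,   2,   3 },
        { 100, 155, 255 },   /* lands exactly on the saturation boundary */
        { 200, 100, 255 },   /* must saturate, not wrap around to 44     */
        { 255, 255, 255 },
    };
    int failures = 0;

    for (size_t i = 0; i < sizeof tests / sizeof tests[0]; i++) {
        uint8_t got = sat_add8(tests[i].a, tests[i].b);
        if (got != tests[i].expected) {
            printf("FAIL: sat_add8(%d, %d) = %d, expected %d\n",
                   tests[i].a, tests[i].b, got, tests[i].expected);
            failures++;
        }
    }
    printf("%s: %d failure(s)\n", failures ? "FAIL" : "PASS", failures);
    return failures ? 1 : 0;
}

The checking is the important part: the test bench decides pass or fail on its own and reports it, rather than relying on someone to inspect waveforms or printed output by hand.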
Three Commandments of Simulation
- Thou shalt stress thine logic harder than it will ever be stressed again.
- Thou shalt place checking upon all things.
- Thou shalt not move onto a higher platform until the bug rate has dropped off.

Independent Verification is Key
- The verification engineer should not participate in the logic design of the DUT.
- Designers may not think of all failing scenarios.
- Verification engineers have a different perspective on the design.
- Verification engineers must understand the function, but not the implementation.

Verification Do's
- Talk to the designer to understand the function of the design.
- Think of situations the designer neglected to consider.
- Focus on the corner cases or exotic scenarios.
- Focus on concurrent events.
- Try everything that isn't explicitly forbidden.
- Think about all the pieces of the design you need to verify.
- Talk to designers regarding the interface to your DUT.

Verification Don'ts
- Don't take the designer's word for anything.
- Don't weaken your test plan in order to meet a time schedule.

A Typical Verification Flow
- Devise a potential bug.
- Write a test bench to expose the bug.
- Run the simulator with the test bench.
- Check the simulation result.
- If the test has uncovered a bug:
  - Verify the bug.
  - Work with the designer to fix the bug.

Automating the Flow
- To be practical, this flow needs to be automated.
- Devising spots for potential bugs is best not automated.
- Determining which parts of the design have been explored can be automated... we will get to that later.
- Writing test benches to expose bugs can be automated via constrained pseudorandom testing.

Constrained Pseudorandom Testing
- Produces test cases that would be difficult to generate by hand.
- Facilitates the generation of combinations of concurrent events that would be difficult to explicitly devise.
- Often done using a specialized language like Specman's e.
- Good verification engineers are still needed to efficiently drive the constraints and develop the tests.
- How do you check the correctness of a pseudorandom test? (A C sketch using a reference model appears after these notes.)
  - Reference models.
  - Assertions.

Reference Models
- An abstraction of the design implementation.
- Should be fast, correct, and represent all the design details.
- In practice, reference models are fast and correct yet lack detail.
- The simulator and the reference model use the same stimulus.
- The final result of each is compared.
- It can be useful to do intermediate comparisons for long tests.

Assertions
- Two types:
  - Built-in checkers.
  - Post-processing checkers.
- Built-in checkers are always running during every simulation (a C sketch of one appears after these notes).
- Anyone can add them and everyone reaps the benefits.
- Built-in checkers can increase simulation time.
- Complex (computationally intensive) checkers can be moved to post-processing and only run on selected tests.

Coverage Analysis
- Constrained pseudorandom tests cover many cases.
- Which cases do they cover?
- How many more vectors should I run?
- Coverage analysis provides metrics to answer these questions (a C sketch of a simple coverage metric appears after these notes).

Coverage Metrics
- Serve two main purposes:
  - Act as heuristic measures that quantify verification completeness.
  - Identify inadequately exercised design aspects and guide future input stimulus generation.

Theory vs. Reality
- Theory: increasing coverage increases confidence in the design's correctness.
- Reality: at best, there is an intuitive connection between coverage metrics and bugs.
- Design errors are more difficult to characterize.
- A formal error model for design bugs hasn't been found.

Utility
- Inputs guided by coverage information commonly detect more bugs.
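The constrained pseudorandom testing and reference model ideas above can be made concrete with the following C sketch. It reuses the hypothetical sat_add8 DUT from the directed-test sketch; the particular constraint (biasing operands toward the saturation boundary), the bin of 100000 vectors, and the fixed seed are illustrative choices, not anything prescribed by the lecture.

/*
 * Sketch of a constrained pseudorandom test checked against a reference
 * model, assuming the same hypothetical sat_add8 DUT as before.
 */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical device under test. */
static uint8_t sat_add8(uint8_t a, uint8_t b)
{
    uint16_t sum = (uint16_t)a + (uint16_t)b;
    return (sum > 0xFF) ? 0xFF : (uint8_t)sum;
}

/* Reference model: independently written, wide arithmetic, no
 * implementation tricks.  The test is only as good as this model. */
static unsigned ref_sat_add8(unsigned a, unsigned b)
{
    unsigned sum = a + b;
    return (sum > 255u) ? 255u : sum;
}

/* Constrained stimulus: half the time a fully random operand, half the
 * time one within 8 of the 0/255 corners, where bugs tend to hide. */
static uint8_t constrained_operand(void)
{
    if (rand() & 1)
        return (uint8_t)(rand() & 0xFF);
    return (rand() & 1) ? (uint8_t)(255 - (rand() & 0x7))
                        : (uint8_t)(rand() & 0x7);
}

int main(void)
{
    srand(1);                       /* fixed seed => reproducible test */
    int failures = 0;

    for (int i = 0; i < 100000; i++) {
        uint8_t a = constrained_operand();
        uint8_t b = constrained_operand();
        uint8_t got = sat_add8(a, b);
        unsigned expected = ref_sat_add8(a, b);

        if (got != expected) {
            printf("FAIL: sat_add8(%d, %d) = %d, reference says %u\n",
                   a, b, got, expected);
            failures++;
        }
    }
    printf("%s after 100000 pseudorandom vectors, %d failure(s)\n",
           failures ? "FAIL" : "PASS", failures);
    return failures ? 1 : 0;
}

The structure is the point: stimulus generation, the DUT, and the reference model are independent pieces, and the comparison is automatic, so millions of vectors can be run without anyone predicting each expected output by hand.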

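The built-in checkers described in the Assertions section could look like the following. The DUT here is a hypothetical 8-entry ring-buffer FIFO model, and the invariants in fifo_check are assumptions made for this sketch; the point is that the checker runs on every operation of every simulation, so any test that touches the FIFO reaps the benefit.

/*
 * Sketch of a built-in checker, assuming a hypothetical 8-entry
 * ring-buffer FIFO model as the DUT.
 */
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

#define FIFO_DEPTH 8

struct fifo {
    uint8_t  data[FIFO_DEPTH];
    unsigned head, tail, count;      /* head: next pop, tail: next push */
};

/* Built-in checker: structural invariants that must hold after every step. */
static void fifo_check(const struct fifo *f)
{
    assert(f->count <= FIFO_DEPTH);
    assert(f->head < FIFO_DEPTH && f->tail < FIFO_DEPTH);
    /* head, tail, and count must be mutually consistent */
    assert((f->head + f->count) % FIFO_DEPTH == f->tail);
}

static int fifo_push(struct fifo *f, uint8_t v)
{
    if (f->count == FIFO_DEPTH)
        return 0;                    /* full: refuse the push */
    f->data[f->tail] = v;
    f->tail = (f->tail + 1) % FIFO_DEPTH;
    f->count++;
    fifo_check(f);                   /* checker fires on every operation */
    return 1;
}

static int fifo_pop(struct fifo *f, uint8_t *v)
{
    if (f->count == 0)
        return 0;                    /* empty: refuse the pop */
    *v = f->data[f->head];
    f->head = (f->head + 1) % FIFO_DEPTH;
    f->count--;
    fifo_check(f);
    return 1;
}

int main(void)
{
    struct fifo f = { .head = 0, .tail = 0, .count = 0 };
    uint8_t v;

    for (uint8_t i = 0; i < 20; i++)
        fifo_push(&f, i);            /* pushes beyond 8 are refused */
    while (fifo_pop(&f, &v))
        printf("popped %d\n", v);
    puts("built-in checks held for the whole run");
    return 0;
}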

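Finally, a minimal sketch of the coverage-analysis idea: tally how often each interesting class of input (a coverage "bin") is actually exercised by the stimulus. The bins chosen here for the hypothetical sat_add8 stimulus are assumptions made for the sketch; picking meaningful bins is exactly the judgment the coverage-metrics slides leave to the verification engineer.

/*
 * Sketch of a simple functional coverage metric for pseudorandom
 * sat_add8 stimulus: count hits per coverage bin and report which
 * bins were never exercised.
 */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

enum bin { BIN_NO_OVERFLOW, BIN_EXACT_BOUNDARY, BIN_OVERFLOW, BIN_BOTH_MAX, NUM_BINS };

static const char *bin_name[NUM_BINS] = {
    "sum < 255", "sum == 255", "sum > 255", "both operands 255"
};

static unsigned long hits[NUM_BINS];

/* Classify one stimulus vector into its coverage bin(s). */
static void record_coverage(uint8_t a, uint8_t b)
{
    unsigned sum = (unsigned)a + (unsigned)b;
    if (sum < 255)  hits[BIN_NO_OVERFLOW]++;
    if (sum == 255) hits[BIN_EXACT_BOUNDARY]++;
    if (sum > 255)  hits[BIN_OVERFLOW]++;
    if (a == 255 && b == 255) hits[BIN_BOTH_MAX]++;
}

int main(void)
{
    srand(1);
    for (int i = 0; i < 100000; i++) {
        uint8_t a = (uint8_t)(rand() & 0xFF);   /* unconstrained stimulus */
        uint8_t b = (uint8_t)(rand() & 0xFF);
        record_coverage(a, b);
    }

    int covered = 0;
    for (int bin = 0; bin < NUM_BINS; bin++) {
        printf("%-20s %lu hits\n", bin_name[bin], hits[bin]);
        if (hits[bin] > 0)
            covered++;
    }
    printf("coverage: %d/%d bins hit\n", covered, NUM_BINS);
    return 0;
}

With unconstrained stimulus, the rare "both operands 255" bin may remain unhit even after 100000 vectors; that kind of gap is exactly what coverage analysis exposes, and it is the signal that would push the stimulus toward a constrained generator like the one sketched earlier.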