MSU CSE 470 - Testing

Outline
● Terminology
● Types of errors
● Dealing with errors
● Quality assurance vs. Testing
● Component Testing
  ■ Unit testing
  ■ Integration testing
● Testing Strategy
● Design Patterns & Testing
● System testing
  ■ Function testing
  ■ Structure testing
  ■ Performance testing
  ■ Acceptance testing
  ■ Installation testing

Terminology
● Reliability: The measure of success with which the observed behavior of a system conforms to some specification of its behavior.
● Failure: Any deviation of the observed behavior from the specified behavior.
● Error: The system is in a state such that further processing by the system will lead to a failure.
● Fault (bug): The mechanical or algorithmic cause of an error.
There are many different types of errors and different ways of dealing with them.

What is this?
● Erroneous state ("error")
● Algorithmic fault
● Mechanical fault

How do we deal with errors and faults?
● Verification?
● Modular redundancy?
● Declaring the bug as a feature?
● Patching?
● Testing?

Examples of Faults and Errors
● Faults in the interface specification
  ■ Mismatch between what the client needs and what the server offers
  ■ Mismatch between requirements and implementation
● Algorithmic faults
  ■ Missing initialization
  ■ Branching errors (too soon, too late)
  ■ Missing test for nil
● Mechanical faults (very hard to find)
  ■ Documentation does not match actual conditions or operating procedures
● Errors
  ■ Stress or overload errors
  ■ Capacity or boundary errors
  ■ Timing errors
  ■ Throughput or performance errors
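To make the terminology concrete, here is a small Python sketch (not from the slides; the average_rating function and its inputs are hypothetical) of an algorithmic fault of the "missing test for nil" kind: the fault is the missing check in the code, the erroneous state arises when an unexpected None value reaches that code, and the failure is the observable deviation from the specified behavior.

def average_rating(ratings):
    # Fault (bug): missing test for nil/None -- the code assumes a non-empty list.
    return sum(ratings) / len(ratings)

try:
    average_rating(None)  # erroneous state: None reaches code that expects a list
except TypeError as failure:
    # Failure: the observed behavior (an exception) deviates from the specified behavior.
    print("failure observed:", failure)

def average_rating_fixed(ratings):
    # Removing the fault: add the missing test for nil/empty input.
    if not ratings:
        return 0.0
    return sum(ratings) / len(ratings)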
Dealing with Errors
● Verification:
  ■ Assumes a hypothetical environment that does not match the real environment
  ■ Proof might be buggy (omits important constraints; simply wrong)
● Modular redundancy:
  ■ Expensive
● Declaring a bug to be a "feature"
  ■ Bad practice
● Patching
  ■ Slows down performance
● Testing (this lecture)
  ■ Testing is never good enough

Another View on How to Deal with Errors
● Error prevention (before the system is released):
  ■ Use good programming methodology to reduce complexity
  ■ Use version control to prevent an inconsistent system
  ■ Apply verification to prevent algorithmic bugs
● Error detection (while the system is running):
  ■ Testing: Create failures in a planned way
  ■ Debugging: Start with an unplanned failure
  ■ Monitoring: Deliver information about the system's state; find performance bugs
● Error recovery (recover from a failure once the system is released):
  ■ Database systems (atomic transactions)
  ■ Modular redundancy
  ■ Recovery blocks

Some Observations
● It is impossible to completely test any nontrivial module or any system
  ■ Theoretical limitations: the halting problem
  ■ Practical limitations: prohibitive in time and cost
● Testing can only show the presence of bugs, not their absence (Dijkstra)

Testing takes creativity
● Testing is often viewed as dirty work.
● To develop an effective test, one must have:
  ◆ Detailed understanding of the system
  ◆ Knowledge of the testing techniques
  ◆ Skill to apply these techniques in an effective and efficient manner
● Testing is done best by independent testers
  ■ We often develop a certain mental attitude that the program should work in a certain way when in fact it does not.
● Programmers often stick to the data set that makes the program work
  ■ "Don't mess up my code!"
● A program often does not work when tried by somebody else.
  ■ Don't let this be the end user.

Testing Activities
[Diagram: subsystem code is unit tested (against the system design document) into tested subsystems; tested subsystems are combined in an integration test into integrated subsystems; a functional test against the requirements analysis document and the user manual yields a functioning system. All of these tests are performed by the developers.]

Testing Activities (cont'd)
[Diagram: the functioning system then goes through a performance test against the global requirements (producing a validated system), an acceptance test against the client's understanding of the requirements (producing an accepted system), and an installation test in the user environment (producing a usable system and, finally, a system in use). Performance tests are run by developers, acceptance tests by the client, and installation tests by the user.]

Fault Handling Techniques
[Diagram: taxonomy of fault handling — fault avoidance (design methodology, reviews, verification, configuration management), fault detection (testing: component, integration, and system testing; debugging: correctness and performance debugging), and fault tolerance (atomic transactions, modular redundancy).]

Quality Assurance encompasses Testing
[Diagram: quality assurance adds usability testing (scenario, prototype, and product testing) and reviews (walkthroughs and inspections) to the fault avoidance, fault detection, and fault tolerance techniques above.]

Component Testing
● Unit testing:
  ■ Individual subsystem
  ■ Carried out by developers
  ■ Goal: Confirm that the subsystem is correctly coded and carries out the intended functionality
● Integration testing:
  ■ Groups of subsystems (collections of classes) and eventually the entire system
  ■ Carried out by developers
  ■ Goal: Test the interfaces among the subsystems
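A concrete illustration of unit testing (a sketch, not part of the lecture; the BankAccount class and its methods are hypothetical): the developer exercises one subsystem in isolation using Python's unittest module. An integration test would instead combine several tested classes and exercise the interfaces between them.

import unittest

class BankAccount:
    # Hypothetical subsystem under test.
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount

class BankAccountUnitTest(unittest.TestCase):
    # Unit test: confirms that the individual subsystem is correctly coded
    # and carries out its intended functionality.
    def test_deposit_increases_balance(self):
        account = BankAccount()
        account.deposit(50)
        self.assertEqual(account.balance, 50)

    def test_rejects_non_positive_deposit(self):
        account = BankAccount()
        with self.assertRaises(ValueError):
            account.deposit(0)

if __name__ == "__main__":
    unittest.main()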
System Testing
● System testing:
  ■ The entire system
  ■ Carried out by developers
  ■ Goal: Determine if the system meets the requirements (functional and global)
● Acceptance testing:
  ■ Evaluates the system delivered by developers
  ■ Carried out by the client; may involve executing typical transactions on site on a trial basis
  ■ Goal: Demonstrate that the system meets customer requirements and is ready to use
● Implementation (coding) and testing go hand in hand

Unit Testing
● Informal:
  ■ Incremental coding
● Static analysis:
  ■ Hand execution: reading the source code
  ■ Walk-through (informal presentation to others)
  ■ Code inspection (formal presentation to others)
  ■ Automated tools checking for
    ◆ syntactic and semantic errors
    ◆ departure from coding standards
● Dynamic analysis:
  ■ Black-box testing (test the input/output behavior)
  ■ White-box testing (test the internal logic of the subsystem or object)
  ■ Data-structure based testing (data types determine test cases)

Black-box Testing
● Focus: I/O behavior. If, for any given input, we can predict the output, then the module passes the test.
  ■ Almost always impossible to generate all possible inputs ("test cases")
● Goal: Reduce the number of test cases by equivalence partitioning:
  ■ Divide input conditions into equivalence classes
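A brief sketch of equivalence partitioning (not from the slides; the classify_age function and its categories are hypothetical): rather than trying every possible input, the input domain is divided into equivalence classes and one representative per class is tested, along with the boundaries between classes.

import unittest

def classify_age(age):
    # Hypothetical module under test: maps an age to a ticket category.
    if age < 0:
        raise ValueError("age cannot be negative")
    if age < 18:
        return "minor"
    if age < 65:
        return "adult"
    return "senior"

class BlackBoxAgeTest(unittest.TestCase):
    def test_one_representative_per_equivalence_class(self):
        self.assertEqual(classify_age(10), "minor")   # class: 0 <= age < 18
        self.assertEqual(classify_age(40), "adult")   # class: 18 <= age < 65
        self.assertEqual(classify_age(80), "senior")  # class: age >= 65
        with self.assertRaises(ValueError):
            classify_age(-1)                          # invalid class: age < 0

    def test_boundaries_between_classes(self):
        self.assertEqual(classify_age(0), "minor")
        self.assertEqual(classify_age(18), "adult")
        self.assertEqual(classify_age(65), "senior")

if __name__ == "__main__":
    unittest.main()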

