Common Mistakes and How to Avoid Them
CSE 567M, Washington University in St. Louis
Raj Jain
Washington University in Saint Louis
Saint Louis, MO
[email protected]
These slides are available on-line at:
http://www.cse.wustl.edu/~jain/cse567-06/
©2006 Raj Jain

Overview
- Common Mistakes in Evaluation
- Checklist for Avoiding Common Mistakes
- A Systematic Approach to Performance Evaluation
- Case Study: Remote Pipes vs RPC

Common Mistakes in Evaluation
1. No Goals
   - No general-purpose model
   - Goals ⇒ techniques, metrics, workload
   - Not trivial
2. Biased Goals
   - "To show that OUR system is better than THEIRS"
   - Analysts = jury
3. Unsystematic Approach
4. Analysis Without Understanding the Problem
5. Incorrect Performance Metrics
6. Unrepresentative Workload
7. Wrong Evaluation Technique

Common Mistakes (Cont)
8. Overlook Important Parameters
9. Ignore Significant Factors
10. Inappropriate Experimental Design
11. Inappropriate Level of Detail
12. No Analysis
13. Erroneous Analysis
14. No Sensitivity Analysis
15. Ignoring Errors in Input
16. Improper Treatment of Outliers
17. Assuming No Change in the Future
18. Ignoring Variability
19. Too Complex Analysis

Common Mistakes (Cont)
20. Improper Presentation of Results
21. Ignoring Social Aspects
22. Omitting Assumptions and Limitations

Checklist for Avoiding Common Mistakes
1. Is the system correctly defined and the goals clearly stated?
2. Are the goals stated in an unbiased manner?
3. Have all the steps of the analysis been followed systematically?
4. Is the problem clearly understood before analyzing it?
5. Are the performance metrics relevant for this problem?
6. Is the workload correct for this problem?
7. Is the evaluation technique appropriate?
8. Is the list of parameters that affect performance complete?
9. Have all parameters that affect performance been chosen as factors to be varied?

Checklist (Cont)
10. Is the experimental design efficient in terms of time and results?
11. Is the level of detail proper?
12. Is the measured data presented with analysis and interpretation?
13. Is the analysis statistically correct?
14. Has the sensitivity analysis been done?
15. Would errors in the input cause an insignificant change in the results?
16. Have the outliers in the input or output been treated properly?
17. Have the future changes in the system and workload been modeled?
18. Has the variance of input been taken into account?

Checklist (Cont)
19. Has the variance of the results been analyzed?
20. Is the analysis easy to explain?
21. Is the presentation style suitable for its audience?
22. Have the results been presented graphically as much as possible?
23. Are the assumptions and limitations of the analysis clearly documented?

A Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Metrics
4. List Parameters
5. Select Factors to Study
6. Select Evaluation Technique
7. Select Workload
8. Design Experiments
9. Analyze and Interpret Data
10. Present Results
Repeat.

Case Study: Remote Pipes vs RPC
- System Definition: two computers (a client and a server) connected by a network; the channel between them is either a remote pipe or a remote procedure call.
- Services: small data transfer or large data transfer.

Case Study (Cont)
- Metrics:
  - Correct operation only; errors and failures are not considered.
  - Rate, time, and resources consumed per service.
  - Resources = client, server, network.
- This leads to:
  - Elapsed time per call.
  - Maximum call rate per unit of time, or equivalently, the time required to complete a block of n successive calls.
  - Local CPU time per call.
  - Remote CPU time per call.
  - Number of bytes sent on the link per call.

Case Study (Cont)
- System Parameters:
  - Speed of the local CPU.
  - Speed of the remote CPU.
  - Speed of the network.
  - Operating system overhead for interfacing with the channels.
  - Operating system overhead for interfacing with the networks.
  - Reliability of the network, which affects the number of retransmissions required.

Case Study (Cont)
- Workload Parameters:
  - Time between successive calls.
  - Number and sizes of the call parameters.
  - Number and sizes of the results.
  - Type of channel.
  - Other loads on the local and remote CPUs.
  - Other loads on the network.

Case Study (Cont)
- Factors:
  - Type of channel: remote pipes and remote procedure calls.
  - Size of the network: short distance and long distance.
  - Sizes of the call parameters: small and large.
  - Number n of consecutive calls (the block size): 1, 2, 4, 8, 16, 32, …, 512, and 1024.
- Note:
  - Fixed: type of CPUs and operating systems.
  - Retransmissions due to network errors are ignored.
  - Measurements are taken under no other load on the hosts and the network.

Case Study (Cont)
- Evaluation Technique:
  - Prototypes implemented ⇒ measurements.
  - Analytical modeling is used for validation.
- Workload:
  - A synthetic program generating the specified types of channel requests.
  - Null channel requests ⇒ resources used in monitoring and logging.
- Experimental Design:
  - A full factorial experimental design with 2³ × 11 = 88 experiments will be used (see the design sketch below).

Case Study (Cont)
- Data Analysis:
  - Analysis of variance (ANOVA) for the first three factors (see the analysis sketch below).
  - Regression for the number n of successive calls.
- Data Presentation:
  - The final results will be plotted as a function of the block size n.
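To make the experimental design concrete, here is a minimal sketch in Python (a language chosen for this write-up, not the original study) that enumerates all 2³ × 11 = 88 factor combinations. Only the factors and their levels come from the slides; the level names and the measurement step are illustrative placeholders.

# A sketch of the full factorial design described above: every combination
# of channel type, network distance, parameter size, and block size n.
# Level names and the loop body are hypothetical placeholders.
from itertools import product

channels = ["remote_pipe", "rpc"]          # type of channel
distances = ["short", "long"]              # size of the network
param_sizes = ["small", "large"]           # sizes of the call parameters
block_sizes = [2 ** k for k in range(11)]  # n = 1, 2, 4, ..., 1024

design = list(product(channels, distances, param_sizes, block_sizes))
assert len(design) == 2 ** 3 * 11 == 88    # 2^3 x 11 = 88 experiments

for channel, distance, param_size, n in design:
    # Placeholder: drive the synthetic workload for this combination and
    # record the metrics listed earlier (elapsed time, CPU times, bytes).
    print(channel, distance, param_size, n)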
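The planned analysis could be sketched as follows, assuming the 88 measurements have been collected into a table. The file name "measurements.csv" and the column names (elapsed_ms, channel, distance, param_size, block_size) are hypothetical, and pandas/statsmodels stand in for whatever statistics package the analyst actually uses.

# A sketch of the planned analysis: ANOVA over the three categorical
# factors, then a regression on the block size n. All names below
# (file and columns) are assumptions, not from the original study.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("measurements.csv")  # hypothetical: one row per experiment

# ANOVA for the first three factors, including their interactions.
anova_fit = smf.ols(
    "elapsed_ms ~ C(channel) * C(distance) * C(param_size)", data=df
).fit()
print(sm.stats.anova_lm(anova_fit, typ=2))

# Regression for the number n of successive calls; the log2 transform is
# one plausible choice, matching the 1, 2, 4, ..., 1024 level spacing.
df["log_n"] = np.log2(df["block_size"])
print(smf.ols("elapsed_ms ~ log_n", data=df).fit().params)

The interaction terms in the ANOVA would show whether a channel's advantage depends on distance or parameter size, which is exactly the kind of question the factorial design is meant to answer.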
Summary
- The analysis technique, metrics, and workloads depend upon the goal of the study.
- Metrics are based on services provided by the system.

