U of I CS 511 - DB Benchmarking

Lecture 12: DB Benchmarking
Oct 4, 2006 - ChengXiang Zhai
(Most slides are adapted from Kevin Chang's lecture slides.)

Why benchmarking?
- What does it measure? Price, functionality, and performance.
- The three most important aspects of a DBMS are functionality, price, and performance.
- Performance is hard to figure out: what to implement, and how to implement it.
- Performance is hard to compare: response time, throughput, cost, ease of use, maintenance.

Before the Wisconsin Benchmarks
- Vendors quoted performance numbers for marketing, but none were published or verified, and the figures were generally not comparable.
- Big customers could afford benchmarking competitions using their real target applications, but these were difficult and confusing without a standard procedure, and vendors were only as serious as necessary to make the sale.
- Vendors had little incentive to publish their performance numbers because they were often embarrassing [TBH91].
- These efforts contributed little to moving the state of the art forward.

When customers do their own
- "A customer will typically involve one or more DB vendors and a hardware vendor in this process. These organizations will not encourage the customer to conduct more thorough and detailed tests, because such tests take longer and are more likely to uncover problems that might kill the sale. The customer will be encouraged to hurry the testing process and make the selection." [TBH91]
- "A customer-defined benchmark will present many opportunities for debate over interpretation. Both managers and technicians will be involved in rulings that require fundamental tradeoffs between realism, fairness, and expense. A complex benchmark will leave managers with the feeling that Solomon had it easy." [TBH91]

When DB vendors do their own
- "They like to set up and perform preliminary testing in private, bring the customer in to witness the test, and then get the customer out quickly before anything can go wrong." [TBH91]

The Wisconsin Benchmarks
- The Wisconsin benchmarks changed all that, around 1981-1983.
- The benchmark: a synthesized data set (the WISC database) whose attributes control various parameters: selectivity, number of duplicate tuples, number of aggregate groups.
- A set of 32 single-user, complex SQL queries: selections, joins, projections, aggregates, updates. (A sketch of a Wisconsin-style relation and queries appears after the results below.)

Wisconsin Benchmarking Results
- Several named major vendors were benchmarked:
  - INGRES (both the university version and the commercial version)
  - IDM (Intelligent Database Machine) of Britton Lee, with and without the DAC (database accelerator)
  - DIRECT, a multiprocessor DB machine from Wisconsin
  - ORACLE
- DeWitt (then an assistant professor) vs. Ellison:
- "The relative poor performance of ORACLE made it apparent that the system had some fairly serious problems that needed correction, for ORACLE was typically a factor of 5 slower than INGRES and the IDM 500 on most selection queries."
- "In retrospect, the reasons for this popularity were only partially due to its technical quality. The primary reason for its success was that it was the first evaluation containing impartial measures of real products." (DeWitt, [TBH91])
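
The slides only name the query classes; the snippet below is a minimal sketch of what a Wisconsin-style workload looks like in practice: a synthetic relation whose columns (unique1, unique2, hundred) make selectivity, join size, and the number of aggregate groups controllable, plus one representative query per class. The schema is simplified, and the table sizes, SQLite backend, and exact queries are assumptions for illustration, not the official WISC database or the original 32 queries.

import random
import sqlite3

N = 1000  # tuples per relation (the original WISC relations had 10,000)

def load_wisc_relation(cur, name, n):
    # unique1: unique values in random order; unique2: unique, sequential;
    # hundred: 100 distinct values, controlling selectivity and group counts
    cur.execute(f"CREATE TABLE {name} (unique1 INTEGER, unique2 INTEGER, "
                "hundred INTEGER, stringu1 TEXT)")
    unique1 = list(range(n))
    random.shuffle(unique1)
    rows = [(u1, u2, u1 % 100, f"s{u1:07d}") for u2, u1 in enumerate(unique1)]
    cur.executemany(f"INSERT INTO {name} VALUES (?, ?, ?, ?)", rows)

con = sqlite3.connect(":memory:")
cur = con.cursor()
for rel in ("tenktup1", "tenktup2"):
    load_wisc_relation(cur, rel, N)
con.commit()

# Selection with 1% selectivity (the predicate range controls selectivity).
cur.execute("SELECT * FROM tenktup1 WHERE unique2 BETWEEN 0 AND ?", (N // 100 - 1,))
print("1% selection:", len(cur.fetchall()), "tuples")

# Join of the two relations on their unique key, restricted on one side.
cur.execute("SELECT COUNT(*) FROM tenktup1 t1 JOIN tenktup2 t2 "
            "ON t1.unique1 = t2.unique1 WHERE t2.unique2 < ?", (N // 10,))
print("join result size:", cur.fetchone()[0])

# Projection with duplicate elimination, and an aggregate with 100 groups.
cur.execute("SELECT DISTINCT hundred FROM tenktup1")
print("distinct 'hundred' values:", len(cur.fetchall()))
cur.execute("SELECT hundred, MIN(unique1) FROM tenktup1 GROUP BY hundred")
print("aggregate groups:", len(cur.fetchall()))

# Update through the (logical) key attribute.
cur.execute("UPDATE tenktup1 SET stringu1 = 'updated' WHERE unique1 = 42")
con.commit()
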
Consequently
- Angry vendors: one angry vendor called the author's boss and demanded a "recount" (a recoding or remeasuring); vendors published their own numbers for the query set, and began to patch problems in their DBMSs and use the Wisconsin benchmarks for regression testing.
- The "DeWitt clause" appeared in most software license agreements: customers can't publish performance numbers.
- DB gurus criticized the shortcomings: the synthesized relations are hard to scale (make larger), and the workload is not "real".
- Customers began to demand Wisconsin benchmark results.

Benchmark Wars Followed
- "Benchmark wars start if someone loses an important or visible benchmark evaluation. The loser reruns it using regional specialists and gets new and winning numbers. Then the opponent reruns it using his regional specialists, and of course gets even better numbers. The loser then reruns it using some one-star gurus. This progression can continue all the way to five-star gurus. At a certain point, a special version of the system is employed, with promises that the enhanced performance features will be included in the next regular release." [TBH91]

The Long-term Effects of the WB
- Vendors equalized their performance on the Wisconsin benchmark queries, across vendors and from release to release.
- Gurus thought long and hard about the characteristics of a good DB benchmark (and they are still thinking).
- Vendors started to learn how to cheat on benchmarks; customers and gurus began to think about how to stop the cheating.

The WB Shortcomings
- Not realistic: the queries were of interest for the authors' parallel platform but did not reflect the OLTP systems of the day (e.g., banks).
- Not multiuser.
- System price wasn't factored in.
- The data set is hard to scale up (make larger): 2 MB, 10,000 tuples; systems will grow, so should benchmarks.
- Successors (the "Anon. et al." paper, i.e., DebitCredit or TP1, and the TPC benchmarks TPC-A, TPC-B, TPC-C) addressed these shortcomings by measuring concurrent TPS.

The Anon. et al. Paper
- Jim Gray, 1984: an early version was distributed to professionals in academia and industry for comments, and it was published as "Anon. et al." to suggest a group effort.
- The benchmark tests:
  - an interactive OLTP emulation, DebitCredit, modeled after the actual state of Bank of America in the early 1970s;
  - two batch tests that stress I/O: Scan (scans and updates 1,000 records) and Sort (a disk sort of one million records).

What's Good About DebitCredit?
- Why was it popular and influential?

The DebitCredit Benchmark
- Relevance: it modeled bank OLTP.
- Simplicity: one deposit transaction over the ABTH files (account, branch, teller, and history log); a sketch of the transaction appears at the end of this preview.
- Scalability: the DB size scales up as the TPS rating does; for each TPS, 100k accounts, 10 branches, and 100 tellers (e.g., 10 TPS means 1,000k A, 100 B, and 1,000 T).
- Comparability: system requirements (95% of transactions must complete within a 1-second response time); configuration control (instead of specifying equivalent configurations, use cost as the normalization factor); simple summary metrics: TPS and $/TPS (a sketch of the scaling rule and metrics also appears at the end of this preview).

Who Uses Benchmarks Today?
- DBMS vendors: to tune and tweak ...
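
The slides describe the DebitCredit transaction only at the level of its four files; the sketch below shows one plausible shape of that deposit transaction over Account, Branch, Teller, and History tables, using SQLite purely for illustration. The schema, column names, and driver code are assumptions, not the Anon. et al. specification.

import sqlite3
import time

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE account (aid INTEGER PRIMARY KEY, bid INTEGER, balance INTEGER);
    CREATE TABLE branch  (bid INTEGER PRIMARY KEY, balance INTEGER);
    CREATE TABLE teller  (tid INTEGER PRIMARY KEY, bid INTEGER, balance INTEGER);
    CREATE TABLE history (tid INTEGER, bid INTEGER, aid INTEGER,
                          delta INTEGER, ts REAL);
    INSERT INTO branch  VALUES (1, 0);
    INSERT INTO teller  VALUES (1, 1, 0);
    INSERT INTO account VALUES (1, 1, 0);
""")

def debit_credit(con, aid, tid, bid, delta):
    """One deposit: adjust the A, T, and B balances and append to the history log."""
    with con:  # a single atomic transaction: commit on success, roll back on error
        cur = con.cursor()
        cur.execute("UPDATE account SET balance = balance + ? WHERE aid = ?", (delta, aid))
        cur.execute("UPDATE teller  SET balance = balance + ? WHERE tid = ?", (delta, tid))
        cur.execute("UPDATE branch  SET balance = balance + ? WHERE bid = ?", (delta, bid))
        cur.execute("INSERT INTO history VALUES (?, ?, ?, ?, ?)",
                    (tid, bid, aid, delta, time.time()))
        cur.execute("SELECT balance FROM account WHERE aid = ?", (aid,))
        new_balance = cur.fetchone()[0]  # the value reported back for the deposit
    return new_balance

print("new account balance:", debit_credit(con, aid=1, tid=1, bid=1, delta=100))

In the benchmark as described on the slide, 95% of these transactions must complete within a one-second response time at the claimed TPS rating.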

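The scaling rule and summary metrics on the DebitCredit slide reduce to simple arithmetic; the short sketch below makes them explicit. The system price and per-transaction latencies are invented numbers used only to show how the scaled database size, the 95%-within-one-second check, and the $/TPS figure would be computed.

def scaled_db_size(tps):
    """DebitCredit scaling rule: per rated TPS, 100k accounts, 10 branches, 100 tellers."""
    return {"accounts": 100_000 * tps, "branches": 10 * tps, "tellers": 100 * tps}

# e.g. a 10 TPS system needs 1,000,000 accounts, 100 branches, and 1,000 tellers
print(scaled_db_size(10))

# Invented measurements: per-transaction response times (seconds) observed while
# the system sustained 10 transactions per second, on a $450,000 configuration.
latencies = [0.2, 0.4, 0.3, 0.9, 1.4, 0.5, 0.6, 0.3, 0.8, 0.7]
measured_tps = 10
system_price = 450_000

within_1s = sum(1 for t in latencies if t <= 1.0) / len(latencies)
print(f"transactions within 1 sec: {within_1s:.0%}")  # DebitCredit requires >= 95%
print(f"price/performance: ${system_price / measured_tps:,.0f} per TPS")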
