UW-Madison ECE 539 - ECE 539 Lecture Notes


Artificial Neural Network Prediction of Major League Baseball Teams' Winning Percentages
Scott Wiese, ECE 539, Professor Hu

Outline: Motivation, Goals, Data Collection, Neural Network Selection, Preprocessing, Testing, Testing Results, Preliminary Conclusions, Classification Testing, Classification Results, Conclusions

Motivation
- Current trends in managing player personnel focus heavily on statistics to weigh future production against potential salaries.
- Used to determine whether or not to sign specific players.
- Used to determine whether current players are overpaid.

Motivation (continued)
- It is claimed that statistics can be a valid predictor of both a player's and a team's production.
- It is claimed that one season, 162 games, is a long enough trial period that statistics can predict a team's winning percentage.

Goals
- Can I develop an artificial neural network that, given a team's statistics for a season, accurately predicts that team's winning percentage?

Data Collection
- Collected 3 years of data for all 30 Major League Baseball teams.
- Gathered from the statistical database available on www.MLB.com.
- 74 statistics besides winning percentage were gathered.

Neural Network Selection
- Back-propagation-trained multilayer perceptron (MLP).
- Excellent at analyzing large feature sets.
- Supervised training.
- Good at classification problems.

Preprocessing
- Normalized each feature vector.
- Used singular value decomposition (SVD) to emphasize the most important features.

Testing
- Wanted to determine which MLP configuration would best predict winning percentage.
- Baseline MLP: 1 hidden layer, 1 hidden neuron.
- Tested MLPs: 1 through 5 hidden layers, with 1, 3, or 5 hidden neurons in all layers.
- (A code sketch of this preprocessing and configuration sweep follows below.)
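The slides do not include any code, so the following is only a minimal sketch of how the preprocessing and configuration sweep could be set up, not the author's implementation. The assumptions are mine: scikit-learn's MLPRegressor stands in for the back-propagation-trained MLP, random arrays stand in for the 3 seasons x 30 teams x 74 statistics, and projecting onto the leading singular directions is one plausible reading of "used SVD to emphasize the most important features".

```python
# Minimal sketch of the preprocessing and MLP configuration sweep (assumptions noted above).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

def preprocess(X, n_components=10):
    """Z-score each feature, then project onto the leading singular directions."""
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T          # keep only the strongest directions

def success_rate(model, X_test, y_test, tol=0.15):
    """Fraction of teams predicted within +/- tol of the true winning percentage."""
    return float(np.mean(np.abs(model.predict(X_test) - y_test) <= tol))

rng = np.random.default_rng(0)
X = rng.normal(size=(90, 74))                # placeholder: 3 seasons x 30 teams, 74 stats
y = rng.uniform(0.3, 0.7, size=90)           # placeholder winning percentages

X_proc = preprocess(X)
X_tr, X_te, y_tr, y_te = train_test_split(X_proc, y, test_size=0.3, random_state=0)

results = {}
for n_layers in range(1, 5):                 # 1-4 hidden layers, the range in the results tables
    for n_neurons in (1, 3, 5):              # 1, 3, or 5 neurons in every hidden layer
        mlp = MLPRegressor(hidden_layer_sizes=(n_neurons,) * n_layers,
                           max_iter=5000, random_state=0)
        mlp.fit(X_tr, y_tr)
        results[(n_layers, n_neurons)] = success_rate(mlp, X_te, y_te)

best = max(results, key=results.get)
print(f"best: {best[0]} hidden layer(s), {best[1]} neuron(s), "
      f"{100 * results[best]:.1f}% within +/- 0.15")
```

The projection step is a design choice on my part; the slides only say SVD was used to emphasize important features, so a feature-weighting scheme based on the singular values would be an equally plausible interpretation.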
Testing Results

Average success rates (%), exact matching:

             1 hidden layer   2 hidden layers   3 hidden layers   4 hidden layers
1 neuron          33.33            56.67             50                60
3 neurons         45.56            35.56             41.11             31.11
5 neurons         32.22            45.56             43.33             40

[Chart: Average Success Rates - Exact Matching; % success rate vs. number of hidden layers for 1-, 3-, and 5-neuron configurations.]

Testing (continued)
- Now that the 4-hidden-layer, 1-hidden-neuron network is known to perform best, test it again against the baseline with new data.
- Success is defined as a predicted winning percentage within +/- 0.15 of the actual value.

Testing Results (final)

Final Testing   Trial 1   Trial 2   Trial 3   Mean
Baseline         23.33     16.67     33.33    24.44
Best MLP         73.33     43.33     26.67    47.78

- The best MLP's performance is almost twice as good as the baseline's.

Preliminary Conclusions
- The advanced MLP structure is better at predicting a team's winning percentage.
- Unfortunately, the success rate is still under 50% given a 0.15 error bound.
- Can classification work better?

Classification Testing
- Classify teams into 3 groups:
  - Division winners (> .590)
  - Winning teams (.500 < x < .589)
  - Losing teams (< .500)
- Same process as above.
- (A sketch of this binning and classifier setup appears after the Conclusions.)

Classification Results

Average success rates (%):

             1 hidden layer   2 hidden layers   3 hidden layers   4 hidden layers
1 neuron         55.5556          61.1111           56.6667           53.3333
3 neurons        58.8889          61.1111           66.6667           57.7778
5 neurons        66.6667          62.2222           73.3333           60

- 3 hidden layers with 5 hidden neurons is best.

[Chart: Average Success Rates - Classification; % success rate vs. number of hidden layers for 1-, 3-, and 5-neuron configurations.]

Classification Results (continued)
- Again, now that the best advanced network is known, test it against the baseline with more data.

Final Testing   Trial 1   Trial 2   Trial 3   Mean
Baseline        66.6667    63.333    60        63.333
Best MLP        66.6667    63.333    63.3333   64.444

- Negligible difference between the two networks, even though there was nearly a 50% improvement in the original trial.

Conclusions
- The advanced network is better at pure prediction than the baseline.
- Still a very moderate success rate given the error bounds.
- Classification results are very promising.
- Shows that statistics are important in separating teams'
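As a companion to the sketch above, here is one plausible way the three-way classification setup under "Classification Testing" could look. The thresholds come from the slides; scikit-learn's MLPClassifier and the placeholder data are my assumptions, not the original implementation.

```python
# Minimal sketch of the three-way classification setup (assumptions noted above).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def label_team(win_pct):
    """Map a winning percentage to one of the three groups from the slides."""
    if win_pct > 0.590:
        return 2        # division winners (> .590)
    elif win_pct >= 0.500:
        return 1        # winning teams (roughly .500-.589 per the slides)
    else:
        return 0        # losing teams (< .500)

rng = np.random.default_rng(0)
X = rng.normal(size=(90, 74))                    # placeholder: 3 seasons x 30 teams, 74 stats
y_pct = rng.uniform(0.3, 0.7, size=90)           # placeholder winning percentages
y = np.array([label_team(p) for p in y_pct])     # three-class targets

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Best configuration reported in the slides: 3 hidden layers of 5 neurons each.
clf = MLPClassifier(hidden_layer_sizes=(5, 5, 5), max_iter=5000, random_state=0)
clf.fit(X_tr, y_tr)
print(f"classification success rate: {100 * clf.score(X_te, y_te):.1f}%")
```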

