ECE/CS/ME 539 Artificial Neural Networks
Final Project

A Comparison of a Learning Decision Tree and a 2-Layer Back-Propagation Neural Network on Classifying a Car Purchase, Using a 2-Layer Back-Propagation Neural Network Constructed in Java

Steve Ludwig
12-19-03

Introduction/Motivation
- Studied decision learning trees
- They serve the same purpose as pattern-classifying BP neural networks
- Wanted to compare/contrast the two approaches using identical data
- Built my own 2-layer back-propagation neural network in Java with customizable attributes

Data
- The learning tree uses text-based attributes/values
- It constructs a 'tree' with nodes as attributes
- Leaf nodes classify an example as positive or negative
- Had to convert the values to numeric form for the BP neural net, e.g. acceptable case = 1, unacceptable case = 0
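The text-to-numeric conversion described above could be sketched as below. This is a minimal illustration only; the class and label names are hypothetical, and the project's actual encoding of the car data set may have differed.

```java
import java.util.Map;

public class AttributeEncoder {
    // Hypothetical mapping from text class labels to numeric targets;
    // the original project may have used different labels or codes.
    private static final Map<String, Double> CLASS_CODES = Map.of(
        "acceptable", 1.0,
        "unacceptable", 0.0
    );

    // Convert a text-based class label to the numeric target the BP net expects.
    public static double encodeClass(String label) {
        Double code = CLASS_CODES.get(label);
        if (code == null) {
            throw new IllegalArgumentException("Unknown label: " + label);
        }
        return code;
    }
}
```

The same idea extends to the input attributes themselves, e.g. mapping each text value of an attribute to a number before feeding it to the network.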
- Could customize the neural net's parameters
- Tried different learning rates, epochs, and permutations of the train set (to avoid overfitting)

Results
- The neural net and the learning tree had almost identical test-set classification rates
  - Learning tree = 95.789 %
  - BP neural net = 95.105 %
- The learning tree runs much faster and is always consistent
- The neural net was only consistent when the train set was not permuted

Conclusions
- The learning tree works faster, has great accuracy, and can use text-based attributes
- The BP neural net has more flexibility and can be modified to work better (e.g. more hidden layers), while still achieving a good classification rate
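The 2-layer back-propagation network with the customizable parameters mentioned above (learning rate, epochs, permutation of the train set) could be sketched roughly as follows. This is an illustrative reconstruction under standard sigmoid/gradient-descent assumptions, not the project's actual code; all names are invented.

```java
import java.util.Random;

// Minimal 2-layer (one hidden layer) back-propagation network sketch.
// Parameter names (learningRate, epochs, shuffle) are assumptions standing in
// for the customizable attributes the original Java implementation exposed.
public class BpNet {
    private final double[][] w1; // input -> hidden weights
    private final double[] w2;   // hidden -> single-output weights
    private final Random rng = new Random(0);

    public BpNet(int nIn, int nHidden) {
        w1 = new double[nHidden][nIn];
        w2 = new double[nHidden];
        for (int h = 0; h < nHidden; h++) {
            w2[h] = rng.nextDouble() - 0.5;
            for (int i = 0; i < nIn; i++) w1[h][i] = rng.nextDouble() - 0.5;
        }
    }

    private static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    private double hiddenOut(double[] x, int h) {
        double s = 0.0;
        for (int i = 0; i < x.length; i++) s += w1[h][i] * x[i];
        return sigmoid(s);
    }

    public double predict(double[] x) {
        double out = 0.0;
        for (int h = 0; h < w1.length; h++) out += w2[h] * hiddenOut(x, h);
        return sigmoid(out);
    }

    // One training run: optionally permute the sample order each epoch,
    // as the project tried when studying overfitting.
    public void train(double[][] xs, double[] ys,
                      double learningRate, int epochs, boolean shuffle) {
        int[] order = new int[xs.length];
        for (int i = 0; i < order.length; i++) order[i] = i;
        for (int e = 0; e < epochs; e++) {
            if (shuffle) { // Fisher-Yates permutation of the train set
                for (int i = order.length - 1; i > 0; i--) {
                    int j = rng.nextInt(i + 1);
                    int t = order[i]; order[i] = order[j]; order[j] = t;
                }
            }
            for (int k : order) {
                double[] x = xs[k];
                double[] h = new double[w1.length];
                double net = 0.0;
                for (int j = 0; j < h.length; j++) { h[j] = hiddenOut(x, j); net += w2[j] * h[j]; }
                double y = sigmoid(net);
                double deltaOut = (ys[k] - y) * y * (1 - y); // output-layer error term
                for (int j = 0; j < h.length; j++) {
                    // back-propagated hidden error uses the pre-update w2[j]
                    double deltaH = deltaOut * w2[j] * h[j] * (1 - h[j]);
                    w2[j] += learningRate * deltaOut * h[j];
                    for (int i = 0; i < x.length; i++) w1[j][i] += learningRate * deltaH * x[i];
                }
            }
        }
    }
}
```

Usage would be along the lines of `new BpNet(nInputs, nHidden).train(xs, ys, 0.5, 2000, true)` followed by `predict` on the test set; the learning rate, epoch count, and shuffle flag are the knobs the experiments above varied.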