LECTURE NOTES
Professor Anita Wasilewska
NEURAL NETWORKS

Contents: Neural Networks Classifier Introduction; Data Normalization; Example of Max-Min Normalization; Decimal Scaling Normalization; Neural Network; Neural Network Learning; A Multilayer Feed-Forward (MLFF) Neural Network; MLFF Network Input; MLFF Network Topology; Classification by Backpropagation; Steps in the Backpropagation Algorithm; A Neuron: a Hidden or Output Unit j; Step Three: Propagate the Inputs Forward; Backpropagation Formulas; Step 4: Backpropagate the Error; Step 5: Update Weights and Biases; Terminating Conditions; Example of Backpropagation; Net Input and Output Calculation; Calculation of Error at Each Node; Calculation of Weights and Bias Updating; Some Facts to Be Remembered; Advanced Features of Neural Networks (to be covered by student presentations): Training with Subsets, Modular Neural Networks, Evolving Network Architectures, Constructive vs. Destructive Algorithms, Faster Convergence.

Neural Networks Classifier: Introduction
– INPUT: classification data, i.e.
it contains a classification (class) attribute.
– We also say that the class label is known for all data.
– DATA is divided, as in any classification problem, into TRAINING and TEST data sets.

Neural Networks Classifier
– ALL DATA must be normalized, i.e. all values of attributes in the dataset have to be mapped into the interval [0,1] or [-1,1].
TWO BASIC normalization techniques:
– Max-Min normalization, and
– Decimal Scaling normalization.

Data Normalization
• Max-Min Normalization performs a linear transformation on the original data.
• Given an attribute A, we denote by minA and maxA the minimum and maximum values of the attribute A.
• Max-Min normalization maps a value v of A to v' in the range [new_minA, new_maxA] as follows:

v' = ((v - minA) / (maxA - minA)) x (new_maxA - new_minA) + new_minA

Example: we want to normalize data to the range of the interval [-1,1]. We put new_maxA = 1, new_minA = -1.
In general, to normalize within the interval [a,b], we put new_maxA = b, new_minA = a.

Example of Max-Min Normalization
Example: we want to normalize data to the range of the interval [0,1]. We put new_maxA = 1, new_minA = 0.
Say maxA was 100 and minA was 20 (that is, the maximum and minimum values for the attribute A).
Now, if v = 40 (i.e. for this particular pattern the attribute value is 40), v' is calculated as
v' = (40 - 20) x (1 - 0) / (100 - 20) + 0 = 20/80 = 0.25.

Decimal Scaling Normalization
Normalization by decimal scaling normalizes by moving the decimal point of the values of attribute A. A value v of A is normalized to v' by computing

v' = v / 10^j

where j is the smallest integer such that max|v'| < 1.
Example: the values of A range from -986 to 917.
Max|v| = 986, so j = 3, and v = -986 normalizes to v' = -986/1000 = -0.986.

Neural Network
• A Neural Network is a set of connected INPUT/OUTPUT UNITS, where each connection has a WEIGHT associated with it.
• Neural Network learning is also called CONNECTIONIST learning, due to the connections between units.
• It is a case of SUPERVISED, INDUCTIVE, or CLASSIFICATION learning.

Neural Network
• A Neural Network learns by adjusting the weights so as to be able to correctly classify the training data and hence, after the testing phase, to classify unknown data.
• A Neural Network needs a long time for training.
• A Neural Network has a high tolerance to noisy and incomplete data.

Neural Network Learning
• Learning is performed by the backpropagation algorithm.
• The inputs are fed simultaneously into the input layer.
• The weighted outputs of these units are, in turn, fed simultaneously into "neuron-like" units known as a hidden layer.
• The hidden layer's weighted outputs can be input to another hidden layer, and so on.
• The number of hidden layers is arbitrary, but in practice usually one or two are used.
• The weighted outputs of the last hidden layer are inputs to the units making up the output layer.

A Multilayer Feed-Forward (MLFF) Neural Network
[Figure: a fully connected MLFF network. Input nodes receive the input vector (record xi); hidden nodes with outputs Oj are reached through weights wij; output nodes with outputs Ok, reached through weights wjk, form the output vector (the classes).]

MLFF Neural Network
• The units in the hidden layers and the output layer are sometimes referred to as neurons, due to their symbolic biological basis, or as output units.
• The multilayer neural network shown above has two layers of output units.
• Therefore, we say that it is a two-layer neural network.

MLFF Neural Network
• A network containing two hidden layers is called a three-layer neural network, and so on.
• The network is feed-forward in that none of the weights cycles back to an input unit or to an output unit of a previous layer.

MLFF Network Input
• INPUT: records without the class attribute, with normalized attribute values. We call this an input vector.
• INPUT VECTOR: X = {x1, x2, ..., xn}, where n is the number of (non-class) attributes.

MLFF Network Topology
• INPUT LAYER – there are as many nodes as non-class attributes, i.e. as the length of the input vector.
• HIDDEN LAYER – the number of nodes in the hidden layer and the number of hidden layers depend on the implementation. Hidden-node outputs: Oj, j = 1, 2, ..., #hidden nodes.

MLFF Network Topology
• OUTPUT LAYER – corresponds to the class attribute.
• There are as many nodes as classes (values of the class attribute): Ok, k = 1, 2, ..., #classes.
• The network is fully connected, i.e. each unit provides input to each unit in the next forward layer.

Classification by Backpropagation
• Backpropagation is a neural network learning algorithm.
• It learns by iteratively processing a set of training data (samples), comparing the network's classification of each record (sample) with the actual known class label (classification).

Classification by Backpropagation
• For each training sample, the weights are modified so as to minimize the mean
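The Max-Min normalization formula above can be sketched in Python as follows (a minimal illustration; the function name and default range are my own, and the sample values reuse the worked example with minA = 20, maxA = 100):

```python
def max_min_normalize(v, min_a, max_a, new_min=0.0, new_max=1.0):
    """Linearly map a value v from [min_a, max_a] onto [new_min, new_max]."""
    return (v - min_a) / (max_a - min_a) * (new_max - new_min) + new_min

# The worked example above: minA = 20, maxA = 100, v = 40, target range [0, 1].
print(max_min_normalize(40, 20, 100))         # 0.25
# The same value normalized to [-1, 1] instead (new_minA = -1, new_maxA = 1).
print(max_min_normalize(40, 20, 100, -1, 1))  # -0.5
```

Note that maxA always maps to new_maxA and minA to new_minA, so the transformation preserves the relative ordering and spacing of the values.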

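Decimal scaling normalization can likewise be sketched (the function names are my own; the example reuses the attribute range from -986 to 917 discussed above):

```python
def decimal_scaling_factor(values):
    """Smallest j such that every value divided by 10**j has absolute value < 1."""
    j = 0
    while max(abs(v) for v in values) / 10 ** j >= 1:
        j += 1
    return j

def decimal_scale(values):
    """Normalize all values of an attribute by moving the decimal point j places."""
    j = decimal_scaling_factor(values)
    return [v / 10 ** j for v in values]

values = [-986, 345, 917]              # attribute A ranges from -986 to 917
print(decimal_scaling_factor(values))  # 3
print(decimal_scale(values))           # [-0.986, 0.345, 0.917]
```

Because j is chosen from the maximum absolute value over the whole attribute, every normalized value lands strictly inside (-1, 1).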

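The forward pass through a fully connected MLFF network, as described in the Neural Network Learning slides, can be sketched as follows. This is an illustrative sketch only: the layer sizes, weight values, and the choice of a sigmoid activation are assumptions, and the backward (error-propagation) pass is covered separately in the notes.

```python
import math

def sigmoid(x):
    """Logistic activation: squashes the net input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def layer_forward(inputs, weights, biases):
    """One fully connected layer: O_j = sigmoid(sum_i w_ij * O_i + theta_j)."""
    return [sigmoid(sum(w * o for w, o in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

def mlff_forward(x, layers):
    """Feed the input vector forward through each (weights, biases) layer in turn."""
    out = x
    for weights, biases in layers:
        out = layer_forward(out, weights, biases)
    return out

# A tiny illustrative network: 3 input nodes -> 2 hidden units -> 1 output unit.
hidden = ([[0.2, -0.3, 0.4], [0.1, 0.1, -0.2]], [0.1, 0.2])  # (w_ij rows, biases)
output = ([[0.3, -0.2]], [0.1])                              # (w_jk rows, biases)
x = [1.0, 0.0, 1.0]  # a normalized input record
print(mlff_forward(x, [hidden, output]))
```

Each unit in a layer receives the outputs of every unit in the previous layer, matching the fully connected topology described above.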