Questions 11: Feed-Forward Neural Networks
Roman Belavkin, Middlesex University
BIS3226

Question 1

Below is a diagram of a single artificial neuron (unit):

[Figure 1: a single unit with three inputs x1, x2, x3, weights w1, w2, w3, weighted sum v and output y]

The node has three inputs x = (x1, x2, x3) that receive only binary signals (either 0 or 1). How many different input patterns can this node receive? What if the node had four inputs? Five? Can you give a formula that computes the number of binary input patterns for a given number of inputs?

Answer: For three inputs the number of combinations of 0 and 1 is 8:

x1: 0 1 0 1 0 1 0 1
x2: 0 0 1 1 0 0 1 1
x3: 0 0 0 0 1 1 1 1

and for four inputs the number of combinations is 16:

x1 x2 x3 x4
0  0  0  0
1  0  0  0
0  1  0  0
1  1  0  0
0  0  1  0
1  0  1  0
0  1  1  0
1  1  1  0
0  0  0  1
1  0  0  1
0  1  0  1
1  1  0  1
0  0  1  1
1  0  1  1
0  1  1  1
1  1  1  1

You may check that for five inputs the number of combinations will be 32. Note that 8 = 2^3, 16 = 2^4 and 32 = 2^5 for three, four and five inputs. Thus, the formula for the number of binary input patterns is 2^n, where n is the number of inputs.

Question 2

Consider the unit shown in Figure 1. Suppose that the weights corresponding to the three inputs have the following values:

w1 = 2,  w2 = -4,  w3 = 1

and the activation of the unit is given by the step function:

φ(v) = 1 if v ≥ 0, and 0 otherwise.

Calculate what the output value y of the unit will be for each of the following input patterns:

Pattern: P1  P2  P3  P4
x1:       1   0   1   1
x2:       0   1   0   1
x3:       0   1   1   1

Answer: To find the output value y for each pattern we have to:

a) Calculate the weighted sum v = Σ_i wi xi = w1 x1 + w2 x2 + w3 x3.
b) Apply the activation function φ to v.

The calculations for each input pattern are:

P1: v = 2(1) - 4(0) + 1(0) =  2,   y = φ(2)  = 1
P2: v = 2(0) - 4(1) + 1(1) = -3,   y = φ(-3) = 0
P3: v = 2(1) - 4(0) + 1(1) =  3,   y = φ(3)  = 1
P4: v = 2(1) - 4(1) + 1(1) = -1,   y = φ(-1) = 0

Question 3

Logical operators (i.e. NOT, AND, OR, XOR, etc.) are the building blocks of any computational device. Logical functions return only two possible values, true or false, based on the truth or falsehood of their arguments. For example,
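Both answers above are easy to check mechanically. The following is a minimal Python sketch (the function names are illustrative, not from the text); it assumes the step activation fires for v ≥ 0 and the Question 2 weights (2, -4, 1), with the minus sign on w2 restored:

```python
from itertools import product

def num_patterns(n):
    # Each of n binary inputs doubles the number of patterns: 2**n
    return 2 ** n

def phi(v):
    # Step activation from Question 2: 1 if v >= 0, else 0
    return 1 if v >= 0 else 0

def unit_output(weights, inputs):
    # Weighted sum followed by the step activation
    v = sum(w * x for w, x in zip(weights, inputs))
    return phi(v)

# Question 1: count and enumerate all binary patterns for 3 inputs
print(num_patterns(3), num_patterns(4), num_patterns(5))  # -> 8 16 32
print(list(product([0, 1], repeat=3)))

# Question 2: outputs for patterns P1..P4 with weights (2, -4, 1)
patterns = [(1, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 1)]
print([unit_output((2, -4, 1), p) for p in patterns])  # -> [1, 0, 1, 0]
```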
operator AND returns true only when all its arguments are true; otherwise, if any of the arguments is false, it returns false. If we denote truth by 1 and false by 0, then the logical function AND can be represented by the following table:

x1  x2  x1 AND x2
0   0   0
1   0   0
0   1   0
1   1   1

This function can be implemented by a single unit with two inputs:

[Figure: a single unit with inputs x1, x2, weights w1, w2 and output y]

if the weights are w1 = 1 and w2 = 1 and the activation function is:

φ(v) = 1 if v ≥ 2, and 0 otherwise.

Note that the threshold level is 2 (v ≥ 2).

a) Test how the neural AND function works.

Answer:

P1: v = 1(0) + 1(0) = 0,   y = φ(0) = 0
P2: v = 1(1) + 1(0) = 1,   y = φ(1) = 0
P3: v = 1(0) + 1(1) = 1,   y = φ(1) = 0
P4: v = 1(1) + 1(1) = 2,   y = φ(2) = 1

b) Suggest how to change either the weights or the threshold level of this single unit in order to implement the logical OR function (true when at least one of the arguments is true):

x1  x2  x1 OR x2
0   0   0
1   0   1
0   1   1
1   1   1

Answer: One solution is to increase the weights of the unit: w1 = 2 and w2 = 2:

P1: v = 2(0) + 2(0) = 0,   y = φ(0) = 0
P2: v = 2(1) + 2(0) = 2,   y = φ(2) = 1
P3: v = 2(0) + 2(1) = 2,   y = φ(2) = 1
P4: v = 2(1) + 2(1) = 4,   y = φ(4) = 1

Alternatively, we could reduce the threshold to 1:

φ(v) = 1 if v ≥ 1, and 0 otherwise.

c) The XOR function (exclusive or) returns true only when one of the arguments is true and the other is false. Otherwise, it always returns false. This can be represented by the following table:

x1  x2  x1 XOR x2
0   0   0
1   0   1
0   1   1
1   1   0

Do you think it is possible to implement this function using a single unit? A network of several units?

Answer: This is a difficult question, and it puzzled scientists for some time, because it is in fact impossible to implement the XOR function either by a single unit or by a single-layer feed-forward network (a single-layer perceptron). This was known as the XOR problem. The solution was found using a feed-forward network with a hidden layer. The XOR network uses two hidden nodes and one output node.

Question 4

The following diagram represents a feed-forward neural network with one hidden layer:

[Figure: input nodes 1 and 2, hidden nodes 3 and 4, output nodes 5 and 6]

A weight on
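The threshold units above, and a two-layer XOR network, can be sketched in Python. The AND and OR weights and thresholds are the ones from the text; the XOR weights at the end are a classic textbook choice and an assumption, since this answer sheet does not list them:

```python
from itertools import product

def threshold_unit(weights, theta, inputs):
    # Fire (output 1) when the weighted sum reaches the threshold theta
    v = sum(w * x for w, x in zip(weights, inputs))
    return 1 if v >= theta else 0

# AND: w1 = w2 = 1 with threshold 2; OR: w1 = w2 = 2 with the same threshold.
for x in product([0, 1], repeat=2):
    print(x, threshold_unit((1, 1), 2, x), threshold_unit((2, 2), 2, x))

def xor_net(x1, x2):
    # Two hidden units feed one output unit; these particular weights are
    # an assumption (a common textbook choice), not taken from the text.
    h1 = threshold_unit((1, 1), 1, (x1, x2))     # OR of the inputs
    h2 = threshold_unit((-1, -1), -1, (x1, x2))  # NAND of the inputs
    return threshold_unit((1, 1), 2, (h1, h2))   # AND of the hidden outputs

print([xor_net(x1, x2) for x1, x2 in product([0, 1], repeat=2)])  # -> [0, 1, 1, 0]
```

The hidden layer is what makes XOR possible: OR and NAND are each linearly separable, and their conjunction is exactly XOR.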
the connection between nodes i and j is denoted by wij; for example, w13 is the weight on the connection between nodes 1 and 3. The following table lists all the weights in the network:

w13 = -2,  w23 = 3,  w14 = 4,  w24 = -1
w35 = 1,   w45 = -1, w36 = -1, w46 = 1

Each of the nodes 3, 4, 5 and 6 uses the following activation function:

φ(v) = 1 if v > 0, and 0 otherwise,

where v denotes the weighted sum of a node. Each of the input nodes (1 and 2) can only receive binary values (either 0 or 1). Calculate the output of the network (y5 and y6) for each of the input patterns:

Pattern  Node 1  Node 2
P1       0       0
P2       1       0
P3       0       1
P4       1       1

Answer: In order to find the output of the network, it is necessary to calculate the weighted sums of hidden nodes 3 and 4:

v3 = w13 x1 + w23 x2,    v4 = w14 x1 + w24 x2.

Then find the outputs from the hidden nodes using the activation function:

y3 = φ(v3),    y4 = φ(v4).

Use the outputs of the hidden nodes, y3 and y4, as the input values to the output layer (nodes 5 and 6), and find the weighted sums of the output nodes:

v5 = w35 y3 + w45 y4,    v6 = w36 y3 + w46 y4.

Finally, find the outputs from nodes 5 and 6: y5 = φ(v5) and y6 = φ(v6).
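The forward-pass procedure just described can be sketched in Python. The signed weight values below are an assumption (the scanned text appears to have dropped minus signs), so the printed outputs hold only under that assumption:

```python
def phi(v):
    # Step activation for nodes 3-6: fires only for a strictly positive sum
    return 1 if v > 0 else 0

# Weights as read from the table; the minus signs are an assumption,
# reconstructed because the scan seems to have dropped them.
w13, w23, w14, w24 = -2, 3, 4, -1
w35, w45, w36, w46 = 1, -1, -1, 1

def forward(x1, x2):
    # Hidden layer: weighted sums of nodes 3 and 4, then the activation
    y3 = phi(w13 * x1 + w23 * x2)
    y4 = phi(w14 * x1 + w24 * x2)
    # Output layer: the hidden outputs y3, y4 become the inputs of nodes 5 and 6
    y5 = phi(w35 * y3 + w45 * y4)
    y6 = phi(w36 * y3 + w46 * y4)
    return y5, y6

for pattern in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print(pattern, forward(*pattern))
```

With these weights, nodes 5 and 6 respond to complementary input patterns, which is why the network needs two output nodes.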