Back-Propagation Example

Question 1

[Figure: a two-layer feed-forward network with inputs x1, x2, weights w1, w2, w3, w4, w5, hidden-layer node values V, V1, V2, V3, and outputs O1, O2, O3.]

The above neural network has two layers (one hidden layer), two inputs, and three outputs. (There are NO bias connections.) All nodes compute the sigmoid function with beta = 1.

A.1 Give explicit expressions for the values of all nodes in forward propagation when the network is given the input x1 = 3, x2 = 9, with the desired output y1 = 1, y2 = 0, y3 = 1. Your answer should be in terms of the old weights w1, w2, w3, w4, w5. You may use the notation S(.) instead of explicitly computing sigmoid values.

Answer

V =
V1 =
V2 =
V3 =
O1 =
O2 =
O3 =

A.2 Give explicit expressions for how the weights change by back propagation when the network is given the same example as above. Use a learning rate of 0.1. Your answer should be in terms of the old weights w1, w2, w3, w4, w5 and the node values V, V1, V2, V3, O1, O2, O3 that were computed in A.1. You may use the notation S(.) instead of explicitly computing sigmoid values. You may use temporary variables in your answer, but make sure that they are defined in terms of the above variables.

Answer

I am using the following temporary variables in my answer:

new w1 =
new w2 =
new w3 =
new w4 =
new w5 =
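The forward pass and weight update asked for above can be sketched numerically. Since the figure is not fully recoverable, the code below assumes one plausible reading of it: a single hidden node V = S(w1*x1 + w2*x2) that feeds the three outputs through w3, w4, w5 (the initial weight values are hypothetical, chosen only for illustration). It trains on the given example x1 = 3, x2 = 9, y = (1, 0, 1) with a squared-error loss and learning rate 0.1.

```python
import math

def sigmoid(x):
    # Logistic sigmoid S(x) = 1 / (1 + e^(-beta*x)) with beta = 1.
    return 1.0 / (1.0 + math.exp(-x))

def forward(w, x1, x2):
    # Assumed topology (hypothetical reading of the figure): one hidden
    # node V fed by w1, w2; outputs O1, O2, O3 fed by w3, w4, w5.
    w1, w2, w3, w4, w5 = w
    V = sigmoid(w1 * x1 + w2 * x2)
    O = [sigmoid(w3 * V), sigmoid(w4 * V), sigmoid(w5 * V)]
    return V, O

def backprop_step(w, x1, x2, y, eta=0.1):
    # One gradient-descent update on E = 1/2 * sum_i (y_i - O_i)^2,
    # using the sigmoid derivative S'(z) = S(z) * (1 - S(z)).
    w1, w2, w3, w4, w5 = w
    V, O = forward(w, x1, x2)
    # Output-layer deltas: delta_i = (O_i - y_i) * O_i * (1 - O_i).
    d = [(O[i] - y[i]) * O[i] * (1 - O[i]) for i in range(3)]
    # Hidden-layer delta: output deltas propagated back through w3..w5.
    dV = (d[0] * w3 + d[1] * w4 + d[2] * w5) * V * (1 - V)
    # new w = old w - eta * (dE / dw), as in the "new w" blanks above.
    return [
        w1 - eta * dV * x1,
        w2 - eta * dV * x2,
        w3 - eta * d[0] * V,
        w4 - eta * d[1] * V,
        w5 - eta * d[2] * V,
    ]

def error(w, x1, x2, y):
    _, O = forward(w, x1, x2)
    return 0.5 * sum((y[i] - O[i]) ** 2 for i in range(3))

w = [0.1, -0.2, 0.3, 0.4, -0.5]  # hypothetical initial weights
x1, x2, y = 3.0, 9.0, [1.0, 0.0, 1.0]
w_new = backprop_step(w, x1, x2, y, eta=0.1)
```

With a small learning rate, a single update moves each weight opposite its error gradient, so the squared error on this example decreases.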