Back-Propagation Example

Question 1

[Figure: a two-layer sigmoid network with inputs 1, x1, x2, internal nodes V1, V2, V3, weights w1, w2, w3, w4, w5, and output O.]

The above neural network has 2 layers with nodes that compute the sigmoid function. The value of w4 is 1 and cannot be changed. Write an explicit expression for how back propagation (applied to minimize the least squares error function) changes the values of w1, w2, w3, w5 when the algorithm is given the example x1 = 1, x2 = -1, with the desired response y = 0. Assume that η = 0.1, β = 1, and that the current values of the weights are: w1 = -1, w2 = 2, w3 = 1, w4 = 1, and w5 = 2. If the sigmoid argument is nonzero you may use S(·) instead of explicitly computing sigmoid values.

Answer:
new w1 =
new w2 =
new w3 =
new w5 =
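As a worked sketch of what the question asks for, the update rules can be computed numerically. Note the assumptions here, since the figure did not survive extraction: the topology is taken to be a single hidden sigmoid unit V fed by x1, x2, and a bias input 1 through w1, w2, w3, with the output unit O fed by the bias (through the fixed weight w4) and by V (through w5); the missing symbol before "= 0.1" is read as the learning rate η; and β = 1 is the sigmoid slope. If the actual wiring in the figure differs, the same chain-rule pattern applies with the inputs rearranged.

```python
import math

def S(z, beta=1.0):
    """Sigmoid activation S(z) = 1 / (1 + exp(-beta * z)); beta = 1 here."""
    return 1.0 / (1.0 + math.exp(-beta * z))

# Values given in the question
eta = 0.1                                  # learning rate (assumed symbol: eta)
x1, x2, y = 1.0, -1.0, 0.0
w1, w2, w3, w4, w5 = -1.0, 2.0, 1.0, 1.0, 2.0

# Forward pass (assumed topology, see lead-in)
v = S(w1 * x1 + w2 * x2 + w3 * 1.0)        # hidden unit: net input = -1 - 2 + 1 = -2
o = S(w4 * 1.0 + w5 * v)                   # output unit

# Backward pass for E = (1/2) * (y - o)^2
delta_o = (o - y) * o * (1.0 - o)          # output error term
delta_h = delta_o * w5 * v * (1.0 - v)     # hidden error term (w4 is frozen)

# Gradient-descent updates: w <- w - eta * dE/dw
new_w5 = w5 - eta * delta_o * v
new_w1 = w1 - eta * delta_h * x1
new_w2 = w2 - eta * delta_h * x2
new_w3 = w3 - eta * delta_h * 1.0          # bias input is the constant 1

print(new_w1, new_w2, new_w3, new_w5)
```

With these numbers the output o exceeds the target y = 0, so the error terms are positive: w5, w1, and w3 decrease slightly, while w2 increases slightly because its input x2 = -1 flips the sign of the gradient.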