Back-Propagation Example Solutions

Question 1

[Network diagram: inputs are a bias unit (fixed at 1), x1, and x2; hidden nodes V1 and V2; output node O = V3. First-layer weights: w1 (bias to V1), w2 (x1 to V1), w3 (x2 to V2). Second-layer weights: w4 (V1 to output), w5 (V2 to output).]

The above neural network has 2 layers, with nodes that compute the sigmoid function. The value of w4 is 1 and cannot be changed. Write an explicit expression for how back propagation (applied to minimize the least-squares error function) changes the values of w1, w2, w3, w5 when the algorithm is given the example x1 = 1, x2 = −1 with the desired response y = 0. Assume that η = 0.1, β = 1, and that the current values of the weights are w1 = −1, w2 = 2, w3 = 1, w4 = 1, w5 = 2. If the sigmoid argument is nonzero you may use S(·) instead of explicitly computing sigmoid values.

Answer:

Forward propagation:
h1 = w1·1 + w2·x1 = −1 + 2 = 1,          V1 = S(1)
h2 = w3·x2 = −1,                          V2 = S(−1)
h3 = w4·V1 + w5·V2 = S(1) + 2S(−1),      V3 = S(S(1) + 2S(−1)),   O = V3

Back propagation:
δ3 = 2βV3(1 − V3)(y − O) = −2V3²(1 − V3)
δ2 = 2βV2(1 − V2)·w5·δ3 = 4V2(1 − V2)δ3
δ1 = 2βV1(1 − V1)·w4·δ3 = 2V1(1 − V1)δ3

Weight updates (new w = w + η·δ·input, since each δ here is the negative gradient with respect to the node's net input):
new w1 = w1 + ηδ1·1 = −1 + 0.1δ1
new w2 = w2 + ηδ1·x1 = 2 + 0.1δ1
new w3 = w3 + ηδ2·x2 = 1 − 0.1δ2
new w5 = w5 + ηδ3·V2 = 2 + 0.1δ3V2
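The derivation above can be checked numerically. The sketch below assumes the topology implied by the forward-propagation equations (w1: bias to V1, w2: x1 to V1, w3: x2 to V2, w4: V1 to output, w5: V2 to output) and a sigmoid S(h) = 1/(1 + e^(−2βh)), whose derivative 2βS(1 − S) matches the factor of 2β in the delta expressions; variable names are illustrative, not from the original.

```python
import math

def S(h, beta=1.0):
    # Sigmoid with gain beta; its derivative is 2*beta*S*(1 - S),
    # which matches the 2*beta factor in the delta expressions above.
    return 1.0 / (1.0 + math.exp(-2.0 * beta * h))

# Given example and parameters
x1, x2, y = 1.0, -1.0, 0.0
eta, beta = 0.1, 1.0
w1, w2, w3, w4, w5 = -1.0, 2.0, 1.0, 1.0, 2.0

# Forward propagation (topology inferred from the solution's equations)
h1 = w1 * 1.0 + w2 * x1            # = -1 + 2 = 1
V1 = S(h1, beta)                   # S(1)
h2 = w3 * x2                       # = -1
V2 = S(h2, beta)                   # S(-1)
h3 = w4 * V1 + w5 * V2             # = S(1) + 2 S(-1)
V3 = S(h3, beta)
O = V3

# Back propagation: each delta is the negative gradient of the
# squared error (y - O)^2 with respect to the node's net input.
d3 = 2 * beta * V3 * (1 - V3) * (y - O)    # = -2 V3^2 (1 - V3)
d2 = 2 * beta * V2 * (1 - V2) * w5 * d3    # = 4 V2 (1 - V2) d3
d1 = 2 * beta * V1 * (1 - V1) * w4 * d3    # = 2 V1 (1 - V1) d3

# Weight updates: w <- w + eta * delta * input (w4 is frozen at 1)
w1_new = w1 + eta * d1 * 1.0       # -1 + 0.1 d1
w2_new = w2 + eta * d1 * x1        #  2 + 0.1 d1
w3_new = w3 + eta * d2 * x2        #  1 - 0.1 d2
w5_new = w5 + eta * d3 * V2        #  2 + 0.1 d3 V2

print(w1_new, w2_new, w3_new, w5_new)
```

Since the target output is y = 0 and O > 0, δ3 is negative, so w1, w2, and w5 decrease while w3 (whose input x2 = −1 flips the sign) increases, exactly as the signs in the update expressions predict.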