Consider a fully connected 1-hidden layer neural network with two hidden neurons, three inputs and a single output
Posted: Fri May 20, 2022 11:59 am
Consider a fully connected neural network with one hidden layer of two neurons, three inputs, and a single output, using the sigmoid activation function for all neurons. All weights are initialized to 2 and all biases to 0. Assume the output should equal 1 for the input x1 = 4, x2 = 0, x3 = -5.

a) Draw a diagram of the above neural network and label all inputs and the output. Assign appropriate symbols to any intermediate variables. (5 Marks)

b) Show how the backpropagation algorithm would alter the values of all the weights and biases when gradient descent is applied with the above training example. Use the squared loss function and a learning rate of 0.5. Give the values of the input, output, and all intermediate variables for the forward pass, and give the expressions and values of all relevant partial derivatives during backpropagation. (15 Marks)
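One pass of part (b) can be sketched numerically as follows. This is a minimal sketch, not a marked solution: it assumes the squared loss is L = (1/2)(y - t)^2, and the symbols W1, b1, z, h (hidden layer), W2, b2, zo, y (output) are my own naming choices for the intermediate variables the question asks you to define.

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# Setup from the question: weights = 2, biases = 0, target t = 1, learning rate 0.5.
x = [4.0, 0.0, -5.0]
t = 1.0
eta = 0.5
W1 = [[2.0, 2.0, 2.0], [2.0, 2.0, 2.0]]  # 2 hidden neurons x 3 inputs
b1 = [0.0, 0.0]
W2 = [2.0, 2.0]                          # output weights for h1, h2
b2 = 0.0

# Forward pass: both hidden neurons are identical by symmetry.
z = [sum(W1[j][i] * x[i] for i in range(3)) + b1[j] for j in range(2)]  # z_j = -2
h = [sigmoid(zj) for zj in z]                                           # h_j ~ 0.1192
zo = sum(W2[j] * h[j] for j in range(2)) + b2                           # ~ 0.4768
y = sigmoid(zo)                                                         # ~ 0.6170
L = 0.5 * (y - t) ** 2   # assumed form of the squared loss

# Backward pass: delta terms are dL/d(pre-activation).
delta_o = (y - t) * y * (1 - y)                  # dL/dzo, via sigmoid'(zo) = y(1-y)
grad_W2 = [delta_o * h[j] for j in range(2)]     # dL/dW2_j
grad_b2 = delta_o
delta_h = [delta_o * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
grad_W1 = [[delta_h[j] * x[i] for i in range(3)] for j in range(2)]
grad_b1 = [delta_h[j] for j in range(2)]

# Gradient-descent update: param <- param - eta * gradient.
W2 = [W2[j] - eta * grad_W2[j] for j in range(2)]
b2 = b2 - eta * grad_b2
W1 = [[W1[j][i] - eta * grad_W1[j][i] for i in range(3)] for j in range(2)]
b1 = [b1[j] - eta * grad_b1[j] for j in range(2)]

print("forward:", z, h, zo, y, L)
print("updated:", W1, b1, W2, b2)
```

Note that the weight on x2 never changes, since its gradient is proportional to the input value x2 = 0, and both hidden neurons receive identical updates because they start from identical weights.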