Consider a fully connected 1-hidden layer neural network with two hidden neurons, three inputs and a single output that

Posted: Fri May 20, 2022 11:59 am
by answerhappygod
Consider a fully connected 1-hidden-layer neural network with two hidden neurons, three inputs, and a single output that uses the sigmoid activation function for all neurons. All weights are initialized to 2 and all biases are initialized to 0. Assume that the output should be equal to 1 for the input x1 = 4, x2 = 0, x3 = -5.

a) Draw a diagram of the above neural network and label all inputs and the output. Assign appropriate symbols to any intermediate variables. (5 Marks)

b) Show how the backpropagation algorithm would alter the values of all the weights and biases when gradient descent is applied with the above training example. Use the squared loss function and a learning rate of 0.5. Give the values of the input, output, and any intermediate variables for the forward pass, and give the expressions and values of all relevant partial derivatives during backpropagation. (15 Marks)
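To sanity-check the hand calculation for part b), here is a minimal NumPy sketch of one forward pass and one gradient-descent step for exactly this setup. The variable names (W1, b1, W2, b2, z1, h, z2, y_hat) are my own choices, not symbols from the question, and I assume the squared loss with a 1/2 factor, L = 0.5 * (y_hat - y)^2; if your course defines it without the 1/2, the gradients (and hence the weight changes) double.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# --- Initialisation as stated in the question ---
x  = np.array([4.0, 0.0, -5.0])   # inputs x1, x2, x3
y  = 1.0                          # target output
W1 = np.full((2, 3), 2.0)         # hidden-layer weights (2 neurons x 3 inputs)
b1 = np.zeros(2)                  # hidden-layer biases
W2 = np.full((1, 2), 2.0)         # output-layer weights (1 output x 2 hidden)
b2 = np.zeros(1)                  # output-layer bias
lr = 0.5                          # learning rate

# --- Forward pass ---
z1    = W1 @ x + b1               # hidden pre-activations
h     = sigmoid(z1)               # hidden activations h1, h2
z2    = W2 @ h + b2               # output pre-activation
y_hat = sigmoid(z2)               # network output
loss  = 0.5 * (y_hat - y) ** 2    # assumed squared-loss convention

# --- Backward pass (chain rule), sigmoid'(z) = sigmoid(z) * (1 - sigmoid(z)) ---
delta2 = (y_hat - y) * y_hat * (1 - y_hat)   # dL/dz2
dW2    = np.outer(delta2, h)                 # dL/dW2
db2    = delta2                              # dL/db2
delta1 = (W2.T @ delta2) * h * (1 - h)       # dL/dz1
dW1    = np.outer(delta1, x)                 # dL/dW1
db1    = delta1                              # dL/db1

# --- Gradient-descent update ---
W1 -= lr * dW1
b1 -= lr * db1
W2 -= lr * dW2
b2 -= lr * db2

print("z1 =", z1, "h =", h)
print("z2 =", z2, "y_hat =", y_hat, "loss =", loss)
print("updated W1 =\n", W1)
print("updated b1 =", b1)
print("updated W2 =", W2, "updated b2 =", b2)

Running this should report the forward-pass values you would also get by hand: both hidden pre-activations are z = 2(4) + 2(0) + 2(-5) = -2, so h1 = h2 = sigmoid(-2) ≈ 0.119, the output pre-activation is about 0.477, and the output is sigmoid(0.477) ≈ 0.617, which is then pushed toward the target of 1 by the update.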