Consider a fully connected 1-hidden layer neural network with two hidden neurons, three inputs and a single output that

Post by answerhappygod »

Consider a fully connected 1-hidden-layer neural network with two hidden neurons, three inputs, and a single output that uses the sigmoid activation function for all neurons. All weights are initialized to 2 and all biases to 0. Assume the output should equal 1 for the input x1 = 4, x2 = 0, x3 = -5.

a) Draw a diagram of the above neural network and label all inputs and the output. Assign appropriate symbols to any intermediate variables. (5 Marks)

b) Show how the backpropagation algorithm would alter the values of all the weights and biases when gradient descent is applied with the above training example. Use the squared loss function and a learning rate of 0.5. Give the values of the input, output, and any intermediate variables for the forward pass, and give the expressions and values of all relevant partial derivatives during backpropagation. (15 Marks)
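The forward pass and the backpropagation update described in part (b) can be sketched numerically. The following is a minimal NumPy sketch under the stated initialization (all weights 2, all biases 0, sigmoid everywhere, learning rate 0.5); it assumes the squared loss L = (y - t)^2 without the conventional 1/2 factor, since the problem only says "squared loss" — if your course uses L = (y - t)^2 / 2, halve the gradients accordingly.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Initialization as given in the problem: weights = 2, biases = 0
W1 = np.full((2, 3), 2.0)   # hidden-layer weights (2 neurons x 3 inputs)
b1 = np.zeros(2)            # hidden-layer biases
W2 = np.full((1, 2), 2.0)   # output-layer weights (1 output x 2 hidden)
b2 = np.zeros(1)            # output-layer bias

x = np.array([4.0, 0.0, -5.0])   # training input
t = 1.0                          # target output
lr = 0.5                         # learning rate

# Forward pass
z1 = W1 @ x + b1    # hidden pre-activations: each is 2*4 + 2*0 + 2*(-5) = -2
a1 = sigmoid(z1)    # hidden activations: sigmoid(-2)
z2 = W2 @ a1 + b2   # output pre-activation
y = sigmoid(z2)     # network output

# Backward pass, assuming L = (y - t)^2 (no 1/2 factor)
dL_dy  = 2.0 * (y - t)
dL_dz2 = dL_dy * y * (1.0 - y)       # sigmoid'(z) = sigmoid(z)(1 - sigmoid(z))
dL_dW2 = np.outer(dL_dz2, a1)
dL_db2 = dL_dz2
dL_da1 = W2.T @ dL_dz2
dL_dz1 = dL_da1 * a1 * (1.0 - a1)
dL_dW1 = np.outer(dL_dz1, x)
dL_db1 = dL_dz1

# Gradient-descent update
W1 -= lr * dL_dW1
b1 -= lr * dL_db1
W2 -= lr * dL_dW2
b2 -= lr * dL_db2
```

Because both hidden neurons see identical weights and the same input, their pre-activations are equal (z1 = [-2, -2]), so their activations, gradients, and updated weights remain identical after the step — a useful sanity check when writing out the partial derivatives by hand. Note also that the weights on x2 do not change, since x2 = 0 makes the corresponding gradient entries zero.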