Objectives
• Implement a 2-layer network in Python.
• Experiment with the response of the network when setting different parameter values.
Posted: Mon May 23, 2022 11:03 am
What is a Multilayer Network?

A network with more than one layer is called a multilayer network.

[Figure: a three-layer network. The input p (R x 1) feeds the first layer; each layer k has its own weight matrix Wᵏ, bias vector bᵏ, net input nᵏ, transfer function fᵏ, and output aᵏ.]

a¹ = f¹(W¹p + b¹)
a² = f²(W²a¹ + b²)
a³ = f³(W³a² + b³)
a³ = f³(W³f²(W²f¹(W¹p + b¹) + b²) + b³)

In a multilayer network, the first layer has R inputs and S¹ neurons. Therefore, W¹ is (S¹ x R), because we have R inputs for each neuron. The bias b¹ is an (S¹ x 1) vector, because we have one bias per neuron. n¹ and a¹ are also (S¹ x 1) vectors, because we have one output for each neuron.

The second layer has S¹ inputs and S² neurons. Therefore, W² is (S² x S¹), because we have S¹ inputs for each neuron. The bias b² is an (S² x 1) vector, because we have one bias per neuron. n² and a² are also (S² x 1) vectors, because we have one output for each neuron.

The third layer has X inputs and X neurons. Therefore, W³ is (X x X), because we have X inputs for each neuron. The bias b³ is an (X x 1) vector, because we have one bias per neuron. n³ and a³ are also (X x 1) vectors, because we have one output for each neuron.

In this lab, we will implement a two-layer network, depicted in the figure below. We will start by implementing each layer separately, then visualize the network output when we change the values of W and b.

[Figure: a two-layer network. A log-sigmoid layer (weights w¹₁,₁ and w¹₂,₁, biases b¹₁ and b¹₂) is followed by a linear layer (weights w²₁,₁ and w²₁,₂, bias b²).]

a¹ = logsig(W¹p + b¹)
a² = purelin(W²a¹ + b²)
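The dimension rules above can be sanity-checked with a short NumPy sketch. The sizes R = 3, S1 = 4, S2 = 2 and the random values are arbitrary choices for illustration, not part of the lab:

```python
import numpy as np

# Assumed sizes for illustration: R = 3 inputs, S1 = 4 and S2 = 2 neurons.
R, S1, S2 = 3, 4, 2
rng = np.random.default_rng(0)

W1 = rng.standard_normal((S1, R))   # (S1 x R): R inputs per layer-1 neuron
b1 = rng.standard_normal((S1, 1))   # (S1 x 1): one bias per neuron
W2 = rng.standard_normal((S2, S1))  # (S2 x S1): S1 inputs per layer-2 neuron
b2 = rng.standard_normal((S2, 1))   # (S2 x 1)

p  = rng.standard_normal((R, 1))            # input column vector, (R x 1)
a1 = 1.0 / (1.0 + np.exp(-(W1 @ p + b1)))   # logsig layer output, (S1 x 1)
a2 = W2 @ a1 + b2                           # purelin layer output, (S2 x 1)
print(a1.shape, a2.shape)                   # prints: (4, 1) (2, 1)
```

Each output has one entry per neuron in its layer, matching the (Sᵏ x 1) shapes stated above.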
Q1. Implement the first layer and name it "logsiglayer". The function takes the following parameters: W¹, p and b¹. W¹ is the weight matrix, p is the input and b¹ is the bias. The layer computes the output, given the input, using the following formula. [2 marks]

a¹ = logsig(W¹p + b¹)

Q2. Implement the second layer and name it "linearlayer". The function takes the following parameters: W², a¹ and b². W² is the weight matrix, a¹ is the input and b² is the bias. The layer computes the output, given the input, using the following formula. [1 mark]

a² = purelin(W²a¹ + b²)
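A minimal sketch of the two layer functions, assuming NumPy arrays with inputs as column vectors. The names logsiglayer and linearlayer are as required; the argument order (W, input, b) is one reasonable choice, not mandated by the handout:

```python
import numpy as np

def logsiglayer(W, p, b):
    """First layer: a1 = logsig(W1 p + b1), with logsig(n) = 1 / (1 + e^-n)."""
    n = W @ p + b                      # net input, shape (S1 x 1)
    return 1.0 / (1.0 + np.exp(-n))    # elementwise log-sigmoid

def linearlayer(W, a, b):
    """Second layer: a2 = purelin(W2 a1 + b2); purelin is the identity."""
    return W @ a + b
```

For example, with the Q3 values (W¹ = [[10], [10]], b¹ = [[-10], [10]], W² = [[1, 1]], b² = 0) and input p = 0, the chained layers give logsig(-10) + logsig(10) ≈ 1.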
Q3. Let the network have the following values: w¹₁,₁ = 10, w¹₂,₁ = 10, b¹₁ = -10, b¹₂ = 10, w²₁,₁ = 1, w²₁,₂ = 1, b² = 0 (here w¹₂,₁ denotes the element in row 2, column 1 of W¹). Plot the output of the network. To do so, follow the steps below:
1. Set the weight and bias values as given above.
2. Let p (the input) be between -2 and 2.
3. Pass the input p to layer 1, to get a¹ (the output of the first layer).
4. Pass a¹ to layer 2, to get a² (the output of the second layer).
5. Plot p vs a². [2 marks]

Next, verify the network output by hand. To do so, follow the steps below:
1. Use the weight and bias values as given previously.
2. For each value of p in the table below, compute by hand the following:
   a. a¹ = logsig(W¹p + b¹)
   b. a² = purelin(W²a¹ + b²)
3. Report the values you got by hand and from your Python code in the table below. [2 marks]

p  | a² (from code) | a² (by hand)
-2 | 0              | ?
-1 | 0.5            | ?
0  | 1              | ?
1  | 1.5            | ?
2  | 2              | ?

Q4. Test the effect of changing b¹₁, and plot the results. To do so, follow the steps below:
1. Set the weight and bias values as given previously.
2. Let b¹₁ be between 0 ≤ b¹₁ ≤ 20.
3. For each value of b¹₁, repeat:
   a. Let p (the input) be between -2 and 2.
   b. Pass the input p to layer 1, to get a¹ (the output of the first layer).
   c. Pass a¹ to layer 2, to get a² (the output of the second layer).
   d. Plot p vs a².
4. What changes does b¹₁ have on the network output? [2 marks]
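The Q3 steps might be sketched as below, assuming NumPy and Matplotlib. The 401-point input grid and the output file name network_response.png are arbitrary choices:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")    # non-interactive backend so the script also runs headless
import matplotlib.pyplot as plt

# Q3 parameter values (layer 1: log-sigmoid, layer 2: linear)
W1 = np.array([[10.0], [10.0]])
b1 = np.array([[-10.0], [10.0]])
W2 = np.array([[1.0, 1.0]])
b2 = np.array([[0.0]])

p  = np.linspace(-2, 2, 401).reshape(1, -1)   # inputs as a (1 x 401) row
a1 = 1.0 / (1.0 + np.exp(-(W1 @ p + b1)))     # logsig layer, (2 x 401)
a2 = W2 @ a1 + b2                             # purelin layer, (1 x 401)

plt.plot(p.ravel(), a2.ravel())
plt.xlabel("p")
plt.ylabel("a2")
plt.savefig("network_response.png")           # hypothetical output file name

# Values for the comparison table; prints approximately:
# -2 0.0 / -1 0.5 / 0 1.0 / 1 1.5 / 2 2.0
for pv in (-2, -1, 0, 1, 2):
    a2v = (W2 @ (1.0 / (1.0 + np.exp(-(W1 * pv + b1)))) + b2).item()
    print(pv, round(a2v, 4))
```

Hand calculation agrees: for example at p = -1, a² = logsig(10·(-1) - 10) + logsig(10·(-1) + 10) = logsig(-20) + logsig(0) ≈ 0 + 0.5 = 0.5.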
Q5. Repeat the same for b¹₂. What changes does b¹₂ have on the network output? [2 marks]

Q6. Test the effect of changing w¹₁,₁, and plot the results. To do so, follow the steps below:
1. Set the weight and bias values as given previously.
2. Let w¹₁,₁ be between -1 ≤ w¹₁,₁ ≤ 1.
3. For each value of w¹₁,₁, repeat:
   a. Let p (the input) be between -2 and 2.
   b. Pass the input p to layer 1, to get a¹ (the output of the first layer).
   c. Pass a¹ to layer 2, to get a² (the output of the second layer).
   d. Plot p vs a².
4. What changes does w¹₁,₁ have on the network output? [2 marks]

Q7. Repeat the same for w¹₂,₁. What changes does w¹₂,₁ have on the network output? [2 marks]
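The sweep questions all share one loop structure. A sketch for the b¹₁ sweep of Q4 is below (assuming NumPy/Matplotlib; the 5-value grid and the file name b11_sweep.png are arbitrary choices); for the later questions, change which element varies while the rest keep their Q3 values:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")    # non-interactive backend so the script also runs headless
import matplotlib.pyplot as plt

def network(p, W1, b1, W2, b2):
    """Two-layer forward pass: logsig layer followed by purelin layer."""
    a1 = 1.0 / (1.0 + np.exp(-(W1 @ p + b1)))
    return W2 @ a1 + b2

p  = np.linspace(-2, 2, 401).reshape(1, -1)
W2 = np.array([[1.0, 1.0]])
b2 = np.array([[0.0]])

for b11 in np.linspace(0, 20, 5):      # sweep 0 <= b1_1 <= 20
    W1 = np.array([[10.0], [10.0]])
    b1 = np.array([[b11], [10.0]])     # only b1_1 varies; everything else stays fixed
    a2 = network(p, W1, b1, W2, b2)
    plt.plot(p.ravel(), a2.ravel(), label=f"b1_1 = {b11:g}")

plt.xlabel("p")
plt.ylabel("a2")
plt.legend()
plt.savefig("b11_sweep.png")           # hypothetical output file name
```

As a check on what the plots should show: the first neuron's sigmoid transitions where its net input is zero, i.e. at p = -b¹₁/10, so increasing b¹₁ slides that step to the left.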