Hey, please help me with this assignment. I have added images of the given code for question 2.
Neural Networks (152118003), 2021-2022 Spring Term, Term Project Question Set #2 (each question is equally weighted)

1) In Problem Set #1 you expanded a Python code example to implement a simple Neural Network from scratch. This time, implement the same one using Keras. You are required to:
- build a simple Neural Network model of the same size
- use the same type of train and test data
- present the training and testing error

2) In the Jupyter notebook provided to you, implement the types of NNs according to the instructions given as comments in the notebook. In general, you are required to build NNs with the network structures given below.

1st: A Sequential NN with the following configuration:
- Fully connected layer with 25 hidden neurons and ReLU
- Fully connected layer with 25 hidden neurons and ReLU
- Softmax output

2nd: A CNN built as a Sequential model:
- 2D Conv, 32@5x5 with ReLU, stride 1, padding "same", followed by 2x2 max pooling
- 2D Conv, 64@5x5 with ReLU, stride 1, padding "same", followed by 2x2 max pooling
- 2D Conv, 128@5x5 with ReLU, stride 1, padding "same", followed by 2x2 max pooling
- Flatten
- Dense layer with 100 neurons and ReLU
- Output layer with 10 neurons and Softmax
Keras Question 2

[0]
from tensorflow import keras
import matplotlib.pyplot as plt
import numpy as np

Load MNIST dataset

[0]
# Model / data parameters
NUM_ROWS = 28
NUM_COLS = 28
num_classes = 10

# LOAD the MNIST data from Keras, split between train and test sets
# Reshape data
# Categorically encode labels
# Have the first 10000 images as validation data and leave the rest as training

# Make sure the data sizes match your output
shape of training data (50000, 784)
shape of training labels (50000, 10)
shape of validation data (10000, 784)
shape of validation labels (10000, 10)
shape of test data (10000, 784)
shape of test labels (10000, 10)
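A minimal sketch of what this data-preparation cell could look like, assuming the standard keras.datasets.mnist loader and the hypothetical variable names x_train/y_train, x_val/y_val, x_test/y_test (the 0-255 to 0-1 scaling is a common extra step, not something the prompt requires):

from tensorflow import keras

NUM_ROWS, NUM_COLS = 28, 28
num_classes = 10

# Load MNIST from Keras, already split between train and test sets
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()

# Reshape 28x28 images into flat 784-vectors and scale pixel values to [0, 1]
x_train = x_train.reshape(-1, NUM_ROWS * NUM_COLS).astype("float32") / 255.0
x_test = x_test.reshape(-1, NUM_ROWS * NUM_COLS).astype("float32") / 255.0

# Categorically encode labels (one-hot)
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)

# First 10000 training images become validation data, the rest stays as training
x_val, y_val = x_train[:10000], y_train[:10000]
x_train, y_train = x_train[10000:], y_train[10000:]

print("shape of training data", x_train.shape)      # (50000, 784)
print("shape of training labels", y_train.shape)    # (50000, 10)
print("shape of validation data", x_val.shape)      # (10000, 784)
print("shape of validation labels", y_val.shape)    # (10000, 10)
print("shape of test data", x_test.shape)           # (10000, 784)
print("shape of test labels", y_test.shape)         # (10000, 10)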
[0] # Plot an image like below from the training set
<matplotlib.image.AxesImage at 0x7f8d0ec8e4e0>
(output: a 28x28 MNIST digit rendered with matplotlib)

Build and train a Sequential Model

[0] # Load the necessary Keras modules for building a sequential model

[0] # Build the model first

[0] # Configure the model with categorical_crossentropy and a stochastic gradient descent optimizer, and observe
    # metrics as accuracy
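A sketch of how these cells could be filled in, continuing with the variable names assumed above. The two Dense(25) layers with separate Activation layers are chosen so the parameter counts match the model.summary() output shown further below:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Activation
import matplotlib.pyplot as plt

# Plot one training image (reshape the flat 784-vector back to 28x28 for imshow)
plt.imshow(x_train[0].reshape(NUM_ROWS, NUM_COLS), cmap="gray")
plt.show()

# Build the model: two fully connected layers of 25 ReLU neurons, softmax output
model = Sequential()
model.add(Dense(25, input_shape=(NUM_ROWS * NUM_COLS,)))
model.add(Activation("relu"))
model.add(Dense(25))
model.add(Activation("relu"))
model.add(Dense(num_classes))
model.add(Activation("softmax"))

# Configure with categorical cross-entropy, SGD optimizer and accuracy as the metric
model.compile(loss="categorical_crossentropy", optimizer="sgd", metrics=["accuracy"])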
[0] # Make sure the following command generates the same output like below in your code
model.summary()

Model: "sequential_7"
Layer (type)                 Output Shape     Param #
=====================================================
dense_16 (Dense)             (None, 25)       19625
activation_10 (Activation)   (None, 25)       0
dense_17 (Dense)             (None, 25)       650
activation_11 (Activation)   (None, 25)       0
dense_18 (Dense)             (None, 10)       260
activation_12 (Activation)   (None, 10)       0
=====================================================
Total params: 20,535
Trainable params: 20,535
Non-trainable params: 0

[0] # Train your model with the training and validation datasets and have only 1 epoch with a batch size of 32

Train on 50000 samples, validate on 10000 samples
Epoch 1/1
50000/50000 [==============================] - 5s 93us/step - loss: 0.9214 - accuracy: 0.7391 - val_loss: 0.4456 - val_accuracy: 0.8739
<keras.callbacks.callbacks.History at 0x7f8d0e6cdc18>

[0] # Test your NN

10000/10000 [==============================] - 1s 57us/step
[0.4455595921754837, 0.8738999962806702]
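The training and test cells could then look roughly like this, continuing with the same assumed variables (the exact loss and accuracy numbers will of course differ between runs):

# Train for a single epoch with batch size 32, validating on the held-out 10000 images
model.fit(x_train, y_train,
          batch_size=32,
          epochs=1,
          validation_data=(x_val, y_val))

# Evaluate on the test set; returns [test loss, test accuracy]
score = model.evaluate(x_test, y_test)
print(score)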
Plot losses with a third-party library which will help you generate interactive performance metrics. Make sure you have the callback function invoked with the variable PlotLossesCallback below.

Using the package https://github.com/stared/livelossplot
Install using the command pip install livelossplot

[0] pip install livelossplot

Requirement already satisfied: livelossplot in /usr/local/lib/python3.6/dist-packages (0.5.0)

[0] !ls

logs  sample_data

[0] from livelossplot.keras import PlotLossesCallback

[0] # Configure again the same model with categorical_crossentropy and a stochastic gradient descent optimizer, and observe
    # metrics as accuracy
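A sketch of how the callback could be prepared, using the import path the notebook itself shows for livelossplot 0.5.0 (newer livelossplot releases import the equivalent callback as PlotLossesKeras from the top-level package instead):

from livelossplot.keras import PlotLossesCallback

# Re-compile the same model (fresh SGD optimizer state) before retraining
model.compile(loss="categorical_crossentropy", optimizer="sgd", metrics=["accuracy"])

# Callback instance that redraws the loss/accuracy plots after every epoch
plot_losses = PlotLossesCallback()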
[0] # This time make sure you train and validate your network with an epoch size of 10 and a batch_size of 32 and
    # make sure you are able to generate the similar plots given below

(output: live plots of the log-loss (cost function) and accuracy for training and validation across the 10 epochs)
Log-loss (cost function):
  training   (min: 0.181, max: 0.374)
  validation (min: 0.192, max: 0.329)
accuracy:
  training   (min: 0.894, max: 0.949)
  validation (min: 0.911, max: 0.946)

[0] # Test the NN with the test data

10000/10000 [==============================] - 1s 55us/step
[0.1770324898339808, 0.9478999972343445]
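A sketch of the 10-epoch training run and the final test, again assuming the variables (x_train, x_val, x_test, plot_losses) introduced in the earlier sketches:

# Train for 10 epochs with batch size 32, letting the callback redraw the plots each epoch
model.fit(x_train, y_train,
          batch_size=32,
          epochs=10,
          validation_data=(x_val, y_val),
          callbacks=[plot_losses])

# Test the NN with the test data; returns [test loss, test accuracy]
print(model.evaluate(x_test, y_test))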
Convolutional Neural Network for MNIST

[0] # Load the necessary modules from Keras that will be used in CNN training

[0] # Build your model according to the details given in the Word document

[0] # Compile the network with categorical_crossentropy, sgd optimizer and accuracy metrics
model.compile(loss="categorical_crossentropy", optimizer="sgd", metrics=["accuracy"])

[0] # Train the network with both training and validation data with an epoch size of 5 and batch size of 32
    # Make sure you can generate the plots

(output: live plots of the log-loss (cost function) and accuracy for training and validation across the 5 epochs)
Log-loss (cost function):
  training   (min: 0.048, max: 0.560, cur: 0.048)
  validation (min: 0.057, max: 0.122, cur: 0.060)
accuracy:
  training   (min: 0.829, max: 0.984, cur: 0.984)
  validation (min: 0.962, max: 0.983, cur: 0.983)
<keras.callbacks.callbacks.History at 0x7f8d1e03f5c0>
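Finally, a sketch of the CNN cells following the layer list in the question sheet. The filter order 32, 64, 128 is my reading of the diagram; variable names continue from the earlier sketches, and the flat 784-vectors are reshaped back to 28x28x1 images for the convolutions:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense
from livelossplot.keras import PlotLossesCallback

# The convolutions expect image tensors, so reshape the flat vectors back to 28x28x1
x_train_cnn = x_train.reshape(-1, NUM_ROWS, NUM_COLS, 1)
x_val_cnn = x_val.reshape(-1, NUM_ROWS, NUM_COLS, 1)

# Three 5x5 conv blocks (32, 64, 128 filters, stride 1, "same" padding, ReLU),
# each followed by 2x2 max pooling, then Flatten -> Dense(100, ReLU) -> 10-way softmax
model = Sequential([
    Conv2D(32, (5, 5), strides=1, padding="same", activation="relu",
           input_shape=(NUM_ROWS, NUM_COLS, 1)),
    MaxPooling2D((2, 2)),
    Conv2D(64, (5, 5), strides=1, padding="same", activation="relu"),
    MaxPooling2D((2, 2)),
    Conv2D(128, (5, 5), strides=1, padding="same", activation="relu"),
    MaxPooling2D((2, 2)),
    Flatten(),
    Dense(100, activation="relu"),
    Dense(num_classes, activation="softmax"),
])

# Compile with categorical cross-entropy, SGD and accuracy, as in the cell above
model.compile(loss="categorical_crossentropy", optimizer="sgd", metrics=["accuracy"])

# Train for 5 epochs with batch size 32 and the live loss plot callback
model.fit(x_train_cnn, y_train,
          batch_size=32,
          epochs=5,
          validation_data=(x_val_cnn, y_val),
          callbacks=[PlotLossesCallback()])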