import numpy as np; class LinearReg(object): def __init__(self, indim=1, outdim=1): # initialize the parameters first.

import numpy as np


class LinearReg(object):
    def __init__(self, indim=1, outdim=1):
        # initialize the parameters first:
        # the weight matrix has shape [indim + 1, outdim] (the extra row is the bias).
        self.indim = indim
        self.outdim = outdim
        self.W = np.zeros((indim + 1, outdim))

    def fit(self, X, T):
        # implement .fit() using the simple least-squares closed-form solution:
        #   W = (Phi^T Phi)^(-1) Phi^T T, where Phi = [1, X] is the extended feature matrix
        # HINT: extend the input features before fitting to them;
        #       compute the weight matrix of shape [indim + 1, outdim].
        Phi = np.hstack([np.ones((X.shape[0], 1)), X])    # extended features, shape [N, indim + 1]
        # Solve the normal equations rather than forming the matrix inverse explicitly.
        self.W = np.linalg.solve(Phi.T @ Phi, Phi.T @ T)

    def predict(self, X):
        # implement .predict() using the parameters learned by .fit()
        Phi = np.hstack([np.ones((X.shape[0], 1)), X])    # same feature extension as in .fit()
        return Phi @ self.W
Introduction

In this code challenge, we will implement a linear regression model. The code structure we provide supports multivariate linear regression, where the target has multiple variables (outdim > 1), but you may simply assume outdim = 1 in this code challenge.

Single-Variate Linear Regression

Recall that in single-variate linear regression the closed-form solution is given as

$$\mathbf{w}_{\mathrm{LS}} = (\Phi^{\top}\Phi)^{-1}\Phi^{\top}\mathbf{t},$$

where, in our case,

$$\Phi = [\mathbf{1}, X] = \begin{bmatrix} 1 & \mathbf{x}_1^{\top} \\ 1 & \mathbf{x}_2^{\top} \\ \vdots & \vdots \\ 1 & \mathbf{x}_N^{\top} \end{bmatrix}, \qquad \mathbf{t} = \begin{bmatrix} t_1 \\ t_2 \\ \vdots \\ t_N \end{bmatrix}$$

are the feature matrix and the target vector, respectively.
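To make the formula concrete, here is a minimal NumPy sketch of the closed-form solution for a single target variable. The toy data and the variable names (Phi, t, w_ls) are illustrative only and are not part of the provided skeleton.

import numpy as np

# Toy single-variate data: N = 5 samples, one input feature each, exactly linear (t = 2x + 1).
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0]])   # shape [N, indim] with indim = 1
t = np.array([1.0, 3.0, 5.0, 7.0, 9.0])             # targets t_1, ..., t_N

# Phi = [1, X]: prepend a column of ones so the bias is learned as the first weight.
Phi = np.hstack([np.ones((X.shape[0], 1)), X])      # shape [N, 1 + indim]

# w_LS = (Phi^T Phi)^{-1} Phi^T t, written exactly as in the formula above.
w_ls = np.linalg.inv(Phi.T @ Phi) @ Phi.T @ t
print(w_ls)                                          # recovers intercept 1 and slope 2 (up to floating point)

In practice np.linalg.solve or np.linalg.lstsq is preferable to forming the inverse explicitly; np.linalg.inv is used here only to mirror the displayed formula.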
Multi-Variate Linear Regression

In multivariate linear regression we have to predict, instead of a single target variable, a target vector comprised of multiple variables. But don't worry: doing multivariate linear regression is equivalent to doing single-variate linear regression independently multiple times. In other words, we can run a single-variate regression for the first dimension of the target vector, then for the second, and so on up to the last dimension.

How can we state this in the language of mathematics? Suppose the input feature matrix is defined as in the single-variate case, and all the target vectors are stacked into the target matrix:

$$\Phi = [\mathbf{1}, X] = \begin{bmatrix} 1 & \mathbf{x}_1^{\top} \\ 1 & \mathbf{x}_2^{\top} \\ \vdots & \vdots \\ 1 & \mathbf{x}_N^{\top} \end{bmatrix} \in \mathbb{R}^{N \times (1 + \mathrm{indim})}, \qquad T = \begin{bmatrix} \mathbf{t}_1^{\top} \\ \mathbf{t}_2^{\top} \\ \vdots \\ \mathbf{t}_N^{\top} \end{bmatrix} \in \mathbb{R}^{N \times \mathrm{outdim}},$$

where indim denotes the dimension of the input data and outdim denotes the dimension of the target vector. Luckily, the closed-form solution for multivariate linear regression looks exactly the same as the case we covered during the lecture:

$$W_{\mathrm{LS}} = (\Phi^{\top}\Phi)^{-1}\Phi^{\top} T.$$
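The equivalence claimed above is easy to check numerically: solving the normal equations once with the full target matrix T gives the same W as solving them separately for each target column and stacking the results. The sketch below is only an illustration; the random data and array shapes are assumptions made for the demo.

import numpy as np

rng = np.random.default_rng(0)
N, indim, outdim = 50, 3, 2

X = rng.normal(size=(N, indim))                      # input features, one row per sample
T = rng.normal(size=(N, outdim))                     # target matrix, one row per sample

Phi = np.hstack([np.ones((N, 1)), X])                # extended features, shape [N, 1 + indim]

# Multivariate fit: one solve with the whole target matrix.
W_joint = np.linalg.solve(Phi.T @ Phi, Phi.T @ T)    # shape [1 + indim, outdim]

# Single-variate fits: one solve per target column, columns stacked afterwards.
W_cols = np.column_stack(
    [np.linalg.solve(Phi.T @ Phi, Phi.T @ T[:, k]) for k in range(outdim)]
)

print(np.allclose(W_joint, W_cols))                  # True: both routes give the same W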
Instruction

Here is a list of the functions you need to implement (a short usage sketch follows the list):

• LinearReg.fit(X, T): fit the data and learn the parameter matrix W. Available unit test: TestFit
• LinearReg.predict(X): predict the target vector using the learned parameter matrix W. Available unit test: TestPredict
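Assuming fit() and predict() are completed along the lines of the skeleton above (feature extension plus the closed-form solution, with the learned weights stored in an attribute such as W), a quick self-check before running the provided unit tests might look like the following. The synthetic data, the attribute name W, and the noiseless setup are assumptions for illustration only; TestFit and TestPredict remain the authoritative checks.

import numpy as np

rng = np.random.default_rng(1)
N, indim, outdim = 100, 2, 1

# Noiseless synthetic data: T = Phi @ W_true, so least squares should recover W_true.
X = rng.normal(size=(N, indim))
W_true = np.array([[0.5], [2.0], [-1.0]])            # shape [1 + indim, outdim]
T = np.hstack([np.ones((N, 1)), X]) @ W_true

model = LinearReg(indim=indim, outdim=outdim)
model.fit(X, T)

print(model.W.shape)                                 # (3, 1), i.e. [indim + 1, outdim]
print(np.allclose(model.W, W_true))                  # True on noiseless data
print(np.allclose(model.predict(X), T))              # True on noiseless data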