
Exercise 4: General Linear Regression with Regularisation (10+10+10+10+10 credits)

Posted: Sun Oct 03, 2021 11:04 am
by answerhappygod
Exercise 4: General Linear Regression with Regularisation (10+10+10+10+10 credits)

Let $A \in \mathbb{R}^{N \times N}$ and $B \in \mathbb{R}^{D \times D}$ be symmetric, positive definite matrices. From the lectures, we can use symmetric positive definite matrices to define corresponding inner products, as shown below. From the previous question, we can also define norms using these inner products:

$\langle x, y \rangle_A := x^\top A y, \qquad \|x\|_A := \sqrt{\langle x, x \rangle_A},$
$\langle x, y \rangle_B := x^\top B y, \qquad \|x\|_B := \sqrt{\langle x, x \rangle_B}.$

Suppose we are performing linear regression with a training set $\{(x_1, y_1), \dots, (x_N, y_N)\}$, where for each $i$, $x_i \in \mathbb{R}^D$ and $y_i \in \mathbb{R}$. We define the matrix $X = [x_1, \dots, x_N]^\top \in \mathbb{R}^{N \times D}$ and the vector $y = [y_1, \dots, y_N]^\top \in \mathbb{R}^N$.

We would like to find $\theta \in \mathbb{R}^D$ and $c \in \mathbb{R}$ such that $y \approx X\theta + c\mathbf{1}$, where the error is measured using $\|\cdot\|_A$. We avoid overfitting by adding a weighted regularisation term, measured using $\|\cdot\|_B$. We define the loss function with regulariser:

$\mathcal{L}_{A,B,y,X}(\theta, c) = \tfrac{1}{2}\,\|y - X\theta - c\mathbf{1}\|_A^2 + \tfrac{1}{2}\,\|\theta\|_B^2.$

For the sake of brevity we write $\mathcal{L}(\theta, c)$ for $\mathcal{L}_{A,B,y,X}(\theta, c)$.

For this question:
- A matrix is symmetric positive definite if it is both symmetric and positive definite.
- You may use (without proof) the property that a symmetric positive definite matrix is invertible.
- We assume that there are sufficiently many non-redundant data points for $X$ to be full rank. In particular, you may assume that the null space of $X$ is trivial (that is, the only solution to $Xz = 0$ is the trivial solution $z = 0$).

1. Find the gradient $\nabla_\theta \mathcal{L}(\theta, c)$.
2. Set $\nabla_\theta \mathcal{L}(\theta, c) = 0$ and solve for $\theta$. If you need to invert a matrix to solve for $\theta$, you should prove the inverse exists.
3. Find the gradient $\nabla_c \mathcal{L}(\theta, c)$. (We now compute the gradient with respect to $c$.)
4. Set $\nabla_c \mathcal{L}(\theta, c) = 0$ and solve for $c$. If you need to invert a matrix to solve for $c$, you should prove the inverse exists.
5. Show that if we set $A = I$, $c = 0$, $B = \lambda I$, where $\lambda \in \mathbb{R}$, your answer for 4.2 agrees with the analytic solution for the standard least squares regression problem with L2 regularisation, given by $\theta = (X^\top X + \lambda I)^{-1} X^\top y$.
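As a sanity check on the derivation the exercise asks for, here is a minimal NumPy sketch. It assumes the closed form one obtains by setting $\nabla_\theta \mathcal{L} = 0$, namely $\theta = (X^\top A X + B)^{-1} X^\top A (y - c\mathbf{1})$ (this formula is my working, not given in the post), and verifies numerically that with $A = I$, $c = 0$, $B = \lambda I$ it reduces to the standard ridge solution quoted in part 5.

```python
import numpy as np

# Hedged sketch: candidate closed-form minimiser of
#   L(theta, c) = 1/2 ||y - X theta - c 1||_A^2 + 1/2 ||theta||_B^2,
# obtained by setting the gradient w.r.t. theta to zero:
#   (X^T A X + B) theta = X^T A (y - c 1).
# X^T A X is positive semidefinite and B is positive definite,
# so their sum is positive definite and hence invertible.

def solve_theta(X, y, A, B, c):
    """Solve (X^T A X + B) theta = X^T A (y - c*1) for theta."""
    ones = np.ones(len(y))
    return np.linalg.solve(X.T @ A @ X + B, X.T @ A @ (y - c * ones))

rng = np.random.default_rng(0)
N, D = 50, 3
X = rng.normal(size=(N, D))
y = rng.normal(size=N)

# Part 5 check: A = I, c = 0, B = lam * I should recover the
# standard ridge solution theta = (X^T X + lam I)^{-1} X^T y.
lam = 0.7
theta_general = solve_theta(X, y, np.eye(N), lam * np.eye(D), c=0.0)
theta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(D), X.T @ y)
print(np.allclose(theta_general, theta_ridge))  # True
```

Using `np.linalg.solve` rather than forming the explicit inverse is both numerically safer and closer in spirit to the exercise, which only requires proving that the inverse exists.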