In this problem, we will investigate minimizing the training objective for a Support Vector Machine (with margin loss).

The training objective for the Support Vector Machine (with margin loss) can be seen as optimizing a balance between the average hinge loss over the examples and a regularization term that tries to keep the parameters small (increase the margin). This balance is set by the regularization parameter $\lambda > 0$. Here we only consider the case without the offset parameter (setting it to zero), so that the training objective is given by

$$J(\theta) = \frac{1}{n} \sum_{i=1}^{n} \mathrm{Loss}_h\!\big(y^{(i)}\,(\theta \cdot x^{(i)})\big) + \frac{\lambda}{2} \|\theta\|^2 \qquad (4.3)$$

where the hinge loss is given by

$$\mathrm{Loss}_h\!\big(y\,(\theta \cdot x)\big) = \max\{0,\ 1 - y\,(\theta \cdot x)\}. \qquad (4.4)$$

Note: For all of the exercises on this page, assume that $n = 1$, where $n$ is the number of training examples, and $x = x^{(1)}$, $y = y^{(1)}$, so that

$$\hat{\theta} = \operatorname*{argmin}_{\theta} \Big[ \mathrm{Loss}_h\!\big(y\,(\theta \cdot x)\big) + \frac{\lambda}{2}\|\theta\|^2 \Big].$$
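Since the exercises assume $n = 1$, the minimizer can be worked out in closed form. A sketch, using only the definitions above:

```latex
% With n = 1 the objective is
\[
  J(\theta) = \max\{0,\ 1 - y\,(\theta \cdot x)\} + \frac{\lambda}{2}\|\theta\|^2 .
\]
% Any component of \theta orthogonal to x only grows the norm without
% affecting the hinge term, so write \theta = c\,y\,x with c \ge 0
% (and note y^2 = 1):
\[
  f(c) = \max\{0,\ 1 - c\|x\|^2\} + \frac{\lambda}{2}\,c^2\|x\|^2 .
\]
% In the region c\|x\|^2 < 1, setting f'(c) = -\|x\|^2 + \lambda c\|x\|^2 = 0
% gives c = 1/\lambda, which lies in that region only if \|x\|^2 < \lambda.
% Otherwise the minimum sits at the kink c = 1/\|x\|^2, i.e.
\[
  \hat{\theta} = \frac{y\,x}{\|x\|^2}
  \quad\text{when } \|x\|^2 \ge \lambda,
  \qquad\text{where the hinge loss is exactly } 0 .
\]
```

This is why the exercise's hint asks you to show that the hinge loss vanishes at the optimum: whenever $\|x\|^2 \ge \lambda$ (as with $\lambda = 0.5$ and a typical unit-scale $x$), the minimizer sits exactly at the margin boundary.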

Minimizing Loss - Numerical Example (1)
2 points possible (graded)

Consider minimizing the above objective function for the following numerical example: $\lambda = 0.5$, $y = 1$, and a given input point $x$ (its value appeared as an image in the original post and did not survive extraction). Note that this is a classification problem where points lie in a two-dimensional space, hence $\hat{\theta}$ is a two-dimensional vector. Let $\hat{\theta} = [\hat{\theta}_1, \hat{\theta}_2]$, where $\hat{\theta}_1, \hat{\theta}_2$ are the first and second components of $\hat{\theta}$, respectively. Solve for $\hat{\theta}_1, \hat{\theta}_2$.

Hint: For the above example, show that $\mathrm{Loss}_h\!\big(y\,(\hat{\theta} \cdot x)\big) = 0$ (the hinge loss is nonnegative, so showing it is $\le 0$ is the same as showing it equals $0$).

$\hat{\theta}_1 = $ ___
$\hat{\theta}_2 = $ ___
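One way to check an answer to this kind of exercise numerically is subgradient descent on the $n = 1$ objective. A minimal sketch, assuming $x = [1, 0]$ (an illustrative choice on my part; the exercise's actual $x$ is given in the original figure), with $\lambda = 0.5$ and $y = 1$ as stated:

```python
# Minimize max(0, 1 - y*(theta . x)) + (lam/2)*||theta||^2 for one example.
# NOTE: x = [1, 0] is an assumed value for illustration only; substitute
# the x given in the actual problem.
lam = 0.5
y = 1.0
x = [1.0, 0.0]

def objective(theta):
    """Hinge loss plus (lam/2)*||theta||^2 for a single example (n = 1)."""
    margin = y * (theta[0] * x[0] + theta[1] * x[1])
    return max(0.0, 1.0 - margin) + (lam / 2.0) * (theta[0] ** 2 + theta[1] ** 2)

def subgradient(theta):
    """One valid subgradient of the objective at theta."""
    margin = y * (theta[0] * x[0] + theta[1] * x[1])
    g = [lam * theta[0], lam * theta[1]]
    if 1.0 - margin > 0.0:        # hinge term is active
        g[0] -= y * x[0]
        g[1] -= y * x[1]
    return g

# Subgradient descent with diminishing step size 1/t.
theta = [0.0, 0.0]
for t in range(1, 5001):
    g = subgradient(theta)
    theta = [theta[0] - g[0] / t, theta[1] - g[1] / t]

print([round(v, 3) for v in theta], round(objective(theta), 3))
```

For this assumed $x$, the iterates approach $\hat{\theta} = [1, 0]$ with hinge loss $0$, consistent with the closed-form reasoning above ($\hat{\theta} = y\,x/\|x\|^2$ when $\|x\|^2 \ge \lambda$); the exercise's numbers will differ if its $x$ differs.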