Consider an iterative optimization procedure of a differentiable cost function $f(x)$, $x \in \mathbb{R}^n$.

**Question 1.** Which one of the following three inequalities is valid for a smooth convex function $f$ (with smoothness constant $L$)?

A. $f(x_{t+1}) - f(x_t) \le \nabla f(x_t)^\top (x_{t+1} - x_t) + \frac{L}{2}\lVert x_{t+1} - x_t \rVert^2$

B. $f(x_{t+1}) - f(x_t) \le \nabla f(x_t)^\top (x_{t+1} - x_t) - \frac{L}{2}\lVert x_{t+1} - x_t \rVert^2$

C. $f(x_{t+1}) - f(x_t) \le \nabla f(x_t)^\top (x_t - x_{t+1}) + \frac{L}{2}\lVert x_{t+1} - x_t \rVert^2$

**Question 2.** Which one of the following three inequalities is valid for a strongly convex function $f$ (with parameter $\mu$ and minimizer $x^*$)?

A. $f(x_t) - f(x^*) \ge \nabla f(x_t)^\top (x_t - x^*) + \frac{\mu}{2}\lVert x_t - x^* \rVert^2$

B. $f(x_t) - f(x^*) \le \nabla f(x_t)^\top (x_t - x^*) - \frac{\mu}{2}\lVert x_t - x^* \rVert^2$

C. $f(x_t) - f(x^*) \le \nabla f(x_t)^\top (x_t - x^*) + \frac{\mu}{2}\lVert x_t - x^* \rVert^2$

**Question 3.** Decide whether the following statements are true or false.

1. A linear program is always convex.
2. Stochastic gradient descent can be viewed as using an unbiased estimate of the gradient at each step.
3. If $f: \mathbb{R}^n \to \mathbb{R}$ is strongly convex with parameter $\mu > 0$, then $f$ is strictly convex and has a unique global minimum.
4. Let $C$ be a convex set. Consider the projection of $x$ onto $C$; $x$ projects to a unique element of $C$.
5. Consider $f(x) = \max\{f_1(x), f_2(x)\}$ for $f_1, f_2: \mathbb{R}^n \to \mathbb{R}$ convex and differentiable. At a point where $f_1(x) = f_2(x)$, a subgradient $g$ is any point on the line segment between $\nabla f_1(x)$ and $\nabla f_2(x)$.
6. Given $f(x)$ and $g(x)$ as convex functions, the composite $f \circ g(x)$, a.k.a. $f(g(x))$, is a convex function.
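Not part of the original question, but the two families of inequalities above can be sanity-checked numerically on a quadratic $f(x) = \frac{1}{2}x^\top A x$, where the smoothness constant $L$ and the strong-convexity parameter $\mu$ are the largest and smallest eigenvalues of $A$. A minimal sketch (all variable names are my own):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
A = M @ M.T + np.eye(3)            # symmetric positive definite
L = np.linalg.eigvalsh(A).max()    # smoothness constant
mu = np.linalg.eigvalsh(A).min()   # strong-convexity parameter

f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

x_t = rng.standard_normal(3)
x_next = rng.standard_normal(3)
x_star = np.zeros(3)               # unique minimizer of this quadratic

# Smoothness (descent lemma):
# f(x_{t+1}) - f(x_t) <= grad f(x_t)^T (x_{t+1} - x_t) + (L/2) ||x_{t+1} - x_t||^2
lhs = f(x_next) - f(x_t)
rhs = grad(x_t) @ (x_next - x_t) + 0.5 * L * np.sum((x_next - x_t) ** 2)
assert lhs <= rhs + 1e-12

# Strong convexity:
# f(x_t) - f(x*) <= grad f(x_t)^T (x_t - x*) - (mu/2) ||x_t - x*||^2
lhs2 = f(x_t) - f(x_star)
rhs2 = grad(x_t) @ (x_t - x_star) - 0.5 * mu * np.sum((x_t - x_star) ** 2)
assert lhs2 <= rhs2 + 1e-12
```

Both assertions hold for any choice of points, since for a quadratic the second-order remainder is exactly $\frac{1}{2}(y-x)^\top A (y-x)$, which is bounded between $\frac{\mu}{2}\lVert y-x\rVert^2$ and $\frac{L}{2}\lVert y-x\rVert^2$.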
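Statement 2 (the unbiasedness of the stochastic gradient) can likewise be illustrated numerically. Assuming a finite-sum objective $f(x) = \frac{1}{n}\sum_i f_i(x)$ with $f_i(x) = \frac{1}{2}(a_i^\top x)^2$ (a hypothetical example of mine, not from the post), sampling an index $i$ uniformly gives $\mathbb{E}[\nabla f_i(x)] = \nabla f(x)$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 50, 4
a = rng.standard_normal((n, d))    # per-sample data; f_i(x) = 0.5 * (a_i^T x)^2
x = rng.standard_normal(d)

# Full gradient: (1/n) sum_i a_i (a_i^T x)
full_grad = (a * (a @ x)[:, None]).mean(axis=0)

# Average of many single-sample stochastic gradients grad f_i(x),
# with i drawn uniformly at random.
idx = rng.integers(0, n, 200_000)
avg_stoch = (a[idx] * (a[idx] @ x)[:, None]).mean(axis=0)

# By the law of large numbers the empirical mean of the stochastic
# gradients approaches the full gradient.
deviation = np.linalg.norm(avg_stoch - full_grad)
```

With 200,000 samples the deviation is small relative to the gradient's scale, consistent with the stochastic gradient being an unbiased estimator.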