Optimization Methods for Large-Scale Systems
Lecture, four hours; outside study, eight hours. Requisite: course 236B. First-order algorithms for convex optimization: subgradient method, conjugate gradient method, proximal gradient and accelerated proximal gradient methods, block coordinate descent. Decomposition of large-scale optimization problems. Augmented Lagrangian method and alternating direction method of multipliers. Monotone operators and operator-splitting algorithms. Second-order algorithms: inexact Newton methods, interior-point algorithms for conic optimization. Letter grading.
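As a concrete illustration of one of the first-order methods listed above, the following is a minimal sketch of the proximal gradient method applied to an l1-regularized least-squares (lasso) problem. The problem data, step size rule, and iteration count are illustrative assumptions, not course material.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, num_iters=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with the proximal gradient method.

    Uses the constant step size t = 1/L, where L = ||A||_2^2 is the Lipschitz
    constant of the gradient of the smooth term.
    """
    L = np.linalg.norm(A, 2) ** 2
    t = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)                    # gradient of the smooth part
        x = soft_threshold(x - t * grad, t * lam)   # prox step on the l1 part
    return x

# Illustrative usage on random data (assumed, not taken from the course).
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 50))
x_true = np.zeros(50)
x_true[:5] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = proximal_gradient_lasso(A, b, lam=0.1)
```

Replacing the gradient step with an extrapolated (momentum) step yields the accelerated proximal gradient method also listed in the description.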