Lecture, four hours; outside study, eight hours. Requisite: course 236B. First-order algorithms for convex optimization: subgradient method, conjugate gradient method, proximal gradient and accelerated proximal gradient methods, block coordinate descent. Decomposition of large-scale optimization problems. Augmented Lagrangian method and alternating direction method of multipliers. Monotone operators and operator-splitting algorithms. Second-order algorithms: inexact Newton methods, interior-point algorithms for conic optimization. Letter grading.
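The description lists the proximal gradient method among the first-order algorithms covered. As a minimal illustration only (not course material, and the problem, variable names, and parameters below are illustrative assumptions), here is a sketch of proximal gradient descent applied to the lasso problem, minimize (1/2)||Ax - b||^2 + lam*||x||_1, where the proximal operator of the l1 term is soft-thresholding:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, iters=500):
    # Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 by proximal gradient descent.
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient of the smooth part
    t = 1.0 / L                          # fixed step size
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)         # gradient of the smooth quadratic term
        x = soft_threshold(x - t * grad, t * lam)   # gradient step, then prox step
    return x

# Small synthetic instance: sparse ground truth, light noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = proximal_gradient_lasso(A, b, lam=0.1)
```

The accelerated variant taught in the course adds a momentum (extrapolation) step between iterations; the sketch above is the plain, unaccelerated method.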

Review Summary

Clarity: N/A
Organization: N/A
Time: N/A
Overall: N/A

Course

Instructor: Lieven Vandenberghe
Previously taught: 22S, 20S, 19S
Formerly offered as: EL ENGR 236C