6.253 Convex Analysis and Optimization
Covers the core analytical issues of continuous optimization, duality, and saddle point theory, developed from a handful of unifying principles that can be easily visualized and readily understood. Discusses in detail the mathematical theory of convex sets and functions, which is the basis for an intuitive, highly visual, geometrical approach to the subject. The treatment of convex optimization algorithms focuses on large-scale problems drawn from several types of applications, such as resource allocation and machine learning. Includes batch and incremental subgradient, cutting plane, proximal, and bundle methods.
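As a flavor of the algorithms listed above, here is a minimal sketch of the batch subgradient method on an illustrative nonsmooth convex problem (the problem data, step-size rule, and function names are my own choices, not taken from the course):

```python
import numpy as np

def subgradient_method(A, b, x0, steps=500):
    """Minimize f(x) = max_i (A[i] @ x + b[i]), a piecewise linear convex
    function, via x_{k+1} = x_k - alpha_k * g_k, where g_k is a subgradient
    of f at x_k and alpha_k = 1/(k+1) is a diminishing step size."""
    x = x0.astype(float)
    best_x, best_f = x.copy(), float(np.max(A @ x + b))
    for k in range(steps):
        vals = A @ x + b
        i = int(np.argmax(vals))   # index of an active piece at x
        g = A[i]                   # gradient of that piece is a subgradient of f
        x = x - g / (k + 1)        # diminishing step size 1/(k+1)
        f = float(np.max(A @ x + b))
        if f < best_f:             # the subgradient method is not a descent
            best_x = x.copy()      # method, so track the best iterate seen
            best_f = f
    return best_x, best_f

# f(x) = max(x1, -x1, x2, -x2) = max(|x1|, |x2|), minimized at the origin
A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.zeros(4)
x_star, f_star = subgradient_method(A, b, np.array([3.0, -2.0]))
```

With the diminishing step sizes above, the best function value found approaches the optimal value 0, even though individual iterates may move away from the minimizer on some steps.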
6.253 will be offered this semester (Spring 2018). It is taught by D. P. Bertsekas.
This is a graduate-level class that counts for a total of 12 credits.
You can find more information via this search: http://www.google.com/search?&q=MIT+%2B+6.253&btnG=Google+Search&inurl=https
© Copyright 2015 Yasyf Mohamedali