Optimization

(CS 395T)

This class covers linear programming and convex optimization. These are fundamental conceptual and algorithmic building blocks for applications across science and engineering. Indeed, any time a problem can be cast as maximizing or minimizing an objective subject to constraints, the next step is to use a method from linear or convex optimization. Covered topics include: formulation and geometry of LPs; duality and min-max; primal and dual algorithms for solving LPs; second-order cone programming (SOCP) and semidefinite programming (SDP); unconstrained convex optimization and its algorithms (gradient descent and Newton's method); constrained convex optimization and duality; variants of gradient descent (stochastic, subgradient, etc.) and their rates of convergence; and momentum methods.
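As a flavor of what "casting a problem as an LP" looks like, here is a hypothetical toy instance (not drawn from the course materials): maximize x + y subject to x + 2y ≤ 4, 3x + y ≤ 6, x, y ≥ 0. A fact covered in the course's geometry-of-LPs unit is that an LP optimum, when it exists, is attained at a vertex of the feasible polytope, so for a tiny 2-D instance we can simply check the vertices:

```python
# Toy LP (illustrative only): maximize x + y subject to
#   x + 2y <= 4,  3x + y <= 6,  x >= 0,  y >= 0.
# Vertices of the feasible region (intersections of active constraints):
vertices = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0), (1.6, 1.2)]

def feasible(x, y):
    # Small tolerance guards against floating-point round-off.
    return x + 2*y <= 4 + 1e-9 and 3*x + y <= 6 + 1e-9 and x >= 0 and y >= 0

# An LP optimum lies at a vertex, so scan vertices for the best objective.
best = max((x + y, (x, y)) for (x, y) in vertices if feasible(x, y))
print(best)  # optimum ~2.8 at the vertex (1.6, 1.2)
```

Real solvers (simplex, interior-point) find the optimal vertex without enumerating them all, which is what makes LP tractable in high dimensions.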

Syllabus

  • Convex sets, convex functions, convex programs (1 week)
  • Linear Programs (LPs), Geometry of LPs, Duality in LPs (1 week)
  • Weak duality, Strong duality, Complementary slackness (1 week)
  • LP duality: Robust linear programming, Two-person zero-sum games, Max-flow min-cut (1 week)
  • Semidefinite programming, Duality in convex programs, Strong duality (1 week)
  • Duality and Sensitivity, KKT Conditions, Convex Duality Examples: Maximum Entropy (1 week)
  • Convex Duality: SVMs and the Kernel Trick, Convex conjugates, Gradient descent (1 week)
  • Line search, Gradient Descent: Convergence rate and step size, Gradient descent and strong convexity (1 week)
  • Frank-Wolfe method, Coordinate descent, Subgradients (1 week)
  • Subgradient descent, Proximal gradient descent, Newton's method (1 week)
  • Newton's method convergence, Quasi-Newton methods, Barrier method (1 week)
  • Accelerated gradient descent, Stochastic gradient descent (SGD), Mini-batch SGD, Variance reduction in SGD (1 week)
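Gradient descent, the workhorse algorithm in the second half of the syllabus, can be sketched in a few lines. This is an illustrative example on a hand-picked quadratic, not course material: minimize f(x) = (x − 3)², whose gradient is f′(x) = 2(x − 3), by repeatedly stepping against the gradient.

```python
# Minimal gradient descent sketch on f(x) = (x - 3)^2.
def gradient_descent(grad, x0, step=0.1, iters=100):
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)  # move opposite the gradient direction
    return x

# f'(x) = 2(x - 3); iterates contract toward the minimizer x = 3.
x_star = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_star)  # close to 3 after 100 iterations
```

The course analyzes exactly how the step size and properties like strong convexity govern this contraction, i.e. the algorithm's rate of convergence.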

Course Category

Theory Course

Course Availability

Fall 2020

Take the Next Step

Advance your computer science career with UT Austin's Master of Computer Science Online.