University of Chicago

Spring 2013

This course is a continuation of Stat 309 and Stat 310. We will discuss topics that we did not have a chance to cover in these two earlier courses. However, it is not necessary to have taken Stat 309 or Stat 310 before taking this course.

In Stat 310, we focused almost exclusively on optimization
*theory*. The first part of this course will be on optimization
*algorithms*. These will cover both convex and general nonlinear
optimization. The algorithms will include those for
unconstrained optimization (line search methods, trust region methods,
quasi-Newton methods, nonlinear conjugate gradient methods) and those for
constrained optimization (augmented Lagrangian method, penalty methods,
interior point methods). Time permitting, we will also discuss first-order
methods, stochastic gradient descent methods, and derivative-free
methods.
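To give a flavor of the line search methods mentioned above, here is a minimal sketch of gradient descent with a backtracking (Armijo) line search. It is written in Python rather than the Matlab used in the problem sets, and all function names and parameter values are illustrative, not taken from the course codes.

```python
import numpy as np

def backtracking_gradient_descent(f, grad, x0, alpha=0.3, beta=0.5,
                                  tol=1e-8, max_iter=500):
    """Gradient descent with a backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t = 1.0
        # Shrink the step until the sufficient-decrease (Armijo) condition holds:
        # f(x - t g) <= f(x) - alpha * t * ||g||^2.
        while f(x - t * g) > f(x) - alpha * t * (g @ g):
            t *= beta
        x = x - t * g
    return x

# Minimize a simple convex quadratic f(x) = (1/2) x'Qx - b'x,
# whose minimizer is the solution of Qx = b.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ Q @ x - b @ x
grad = lambda x: Q @ x - b

x_star = backtracking_gradient_descent(f, grad, np.zeros(2))
print(x_star)
```

The quasi-Newton and nonlinear conjugate gradient methods in the course replace the steepest-descent direction `-g` with better search directions but reuse the same line search idea.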

We will see that the most expensive step in these optimization
algorithms is invariably the solution of large systems of linear
equations *A***x** = **b**, where *A* is often a
sparse matrix. This brings us to the subject matter of the second part of
this course.
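To illustrate the point, here is a small sketch (in Python with SciPy, rather than the Matlab used in the problem sets; the test matrix and solver choices are illustrative) comparing a direct sparse factorization with an iterative conjugate gradient solve of a sparse system *A***x** = **b**.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# A sparse symmetric positive definite system: the 1-D discrete Laplacian
# (tridiagonal), a typical stand-in for the Hessian or KKT systems that
# arise inside Newton-type optimization steps.
n = 100
A = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# Direct solve via sparse LU factorization (analogous to Matlab's
# backslash applied to a sparse matrix).
x_direct = spla.spsolve(A, b)

# Iterative solve: conjugate gradient touches A only through
# matrix-vector products, so it exploits sparsity automatically.
x_cg, info = spla.cg(A, b)

print(info, np.linalg.norm(A @ x_cg - b))
```

For very large, very sparse *A*, iterative methods like conjugate gradient are often the only practical option, which is why they occupy the second part of the course.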

- 06/09/13: Homework 6 posted.

- 05/24/13: Homeworks 4 & 5 posted.

- 05/23/13: No lecture on Tue, Jun 04. Make-up lecture Thu, Jun 06, 2:00–4:00pm in Eckhart 202.

- 05/16/13: Make-up lecture Wed, May 22, 3:00–4:00pm in Eckhart 117, 4:00–5:00pm in Eckhart 133.

- 05/15/13: Homework 3 posted.

- 05/08/13: Office hours next Mon, May 13, 2:30–4:00pm.

- 05/03/13: Homework 2 posted.

- 04/23/13: Homework 1 posted.

- 04/10/13: Homework 0 posted.

- 04/03/13: No lectures on Thu, Apr 18 and Tue, Jun 04. Make-up lectures TBA.

- 04/02/13: Check back regularly for announcements.

**Location:** Eckhart Hall, Room 117

**Times:** 10:30–11:50am on Tue/Thu.

**Instructor:** Lek-Heng Lim

Office: Eckhart 122

`lekheng(at)galton.uchicago.edu`

Tel: (773) 702-4263

**Office hours:** Wed, 3:30–4:30pm.

**Course Assistant:** Stefano Castruccio

Office: Eckhart 8

`castruccio(at)galton.uchicago.edu`

**Office hours:** Wed, 2:30–3:30pm.

Problem sets will be assigned weekly and will be due the following week. Collaboration is permitted, but you must write up your own solutions.

- Problem Set 6: Do these eight problems (posted: Jun 09; due: Jun 24)

- Problem Set 5: Do these five problems (posted: May 24; due: Jun 6)

- Problem Set 4: Rewrite these direct Matlab implementations of the barrier method for SDP and the primal-dual method for LP (delete the lines for generating Figures 11.17, 11.18, and 11.21; use only the algorithmic parts) using CVX (which in turn calls SDPT3); compare the direct and CVX implementations for accuracy, speed, and robustness (posted: May 24; due: May 30)

- Problem Set 3: Do Exercises 10.5, 10.6, 10.15, 11.1, 11.2; you may start from these Matlab codes (posted: May 15; due: May 23)

- Problem Set 2: Redo Exercises 9.30 and 9.31 with BFGS updates and compare with the results obtained in Problem Set 1; initialize with different *H*_{0} (lazy, finite-difference, and exact). Do Exercises 9.12, 9.13, 9.18, 9.32, 10.2, 10.3; you may start from these Matlab codes (posted: May 3; due: May 14)

- Problem Set 1: Do Exercises 9.6, 9.8, 9.10, 9.30 (read 8.3 in extra exercises first), 9.31 (read 8.4 in extra exercises first); you may start from these Matlab codes (posted: Apr 23; due: May 2)

- Problem Set 0: Install CVX and run the code segment. Do Exercises 3.2, 3.3, 3.9(c), 3.24, 4.1, 5.4 in extra exercises (posted: Apr 10; due: Apr 23)

**Bug report** on the problem sets or the solutions:
`lekheng(at)galton.uchicago.edu`

- C. G. Broyden, "On the discovery of the 'good Broyden' method," *Math. Program.*, **87** (2000), no. 2, pp. 209–213.

- Yu. Nesterov, "How to advance in structural convex optimization," *Optima*, **78** (2008), pp. 2–5.

- Course homepages from Winter 2013, Winter 2012, and Winter 2011.

**Grade composition:** No in-class examination. Grade based entirely
on six take-home problem sets.

Boyd & Vandenberghe is freely available online. Nocedal & Wright is freely accessible with your CNetID.

- S. Boyd, L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004.

- G. Golub, C. Van Loan, Matrix Computations, 4th Ed., Johns Hopkins University Press, 2013.

- J. Nocedal, S. Wright, Numerical Optimization, 2nd Ed., Springer, 2006.