Convex Optimization (PhD-level course)

Acknowledgment: I would like to thank Stephen Boyd (Stanford University) and Lieven Vandenberghe (UCLA) for their course material on convex optimization used in my lectures.

Course description

The course provides doctoral students with the tools and training for recognizing and solving convex optimization problems arising in their research areas. We start with the basic notions of convex analysis (convex sets, convex functions, and convex optimization problems) and introduce some fundamental concepts (optimality conditions, duality theory, theorems of alternatives, and perturbation theory). Then, we study in more detail several standard convex problems (linear and quadratic programs, semidefinite programming, geometric programming, and minimax problems) that are often encountered in practice. Finally, we discuss computationally efficient numerical methods (e.g., interior-point methods) used in practice for solving convex optimization problems. Applications to signal processing, statistics and machine learning, and communication engineering are discussed.

The course is based on the following textbook: S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004.

A brief, insightful overview of convex optimization given by Stephen Boyd can be found here.

Lectures

Monday, 13:00 – 15:00, via Zoom. Our first lecture will be on the 19th of April 2021.

Credits

The course is divided into three modules, as follows:

  • Fundamentals of convex optimization (Chapters 1-6 from the textbook and homework exercises HW.1-9): 5 ECTS (equivalent to 5 hp)

  • Advanced topics in convex optimization (Chapters 7-10 from the textbook and homework exercises from HW.10 onward): 3 ECTS

  • Project: up to 4 ECTS (the exact number will be decided later, as it may also depend on the complexity of the selected problem)

Requirements

Weekly homework will be assigned each Monday and is due the following Monday at 23:59. You may miss the deadline twice during the entire course, but by no more than one week each time. Beyond this, no late homework will be accepted. The solutions must be scanned (you can use, e.g., your phone's camera for that), converted into a single pdf file, and returned via e-mail before the deadline. The e-mail's subject must be “Convex Optimization HWxx” (xx is the homework index, e.g., HW01, HW02 …).

  • The returned solutions must be handwritten, and each step of the mathematical derivations must be well justified. We will deduct points for unclearly written solutions even if you reach the correct answer. Whenever possible, you are also expected to draw simple figures to illustrate the geometrical insight (this is NOT meant to replace the analytic derivation, only to complement it). For exercises involving programming and/or plotting curves in Matlab, please also attach the scripts (they must contain enough comments to be easily understood). Each question is graded on a scale of 0, 1, 2.

  • You are encouraged to form small groups and discuss the homework exercises together. However, each student is expected to write up and return the solutions independently. Everyone should return only the solutions that (s)he was able to solve independently after the group discussion (returning solutions copied from colleagues or other sources is considered cheating).

  • We are aware that hints, answers, or even complete solutions to some homework exercises can be found online. However, you are strongly encouraged to first try solving the homework exercises without looking at the solutions. Note also that quite often the online solutions are just skeletons meant to help you find the right answer; they need to be substantially complemented with your own explanations. What we expect from you are crystal-clear solutions in which you carefully justify each step of your derivations in your own words. Solutions highly correlated with those returned by other colleagues or available online will not be graded.

Reading assignments: before each lecture, you must read the chapter to be covered from the textbook; you are also encouraged to watch the corresponding YouTube lecture(s) in advance. Please try to keep up with the reading assignments; otherwise, it may be difficult to keep pace with the lectures.

Final project: choose an appropriate problem from your research area and solve it via convex optimization. You are encouraged to form small teams of 2-3 students per project (4 students per project may be acceptable for an unusually complex project). The formal project requirements are:

  • Project proposal (1 page + references) describing the team, selected topic/problem and some initial ideas about the approach; Due: May 31, 2021.

  • Midterm progress report (4 pages + references) must contain a clear problem statement and initial results; Due: August 31, 2021.

  • Peer review: another project team reviews your progress report and provides you with constructive feedback (1 page); Due: September 30, 2021.

  • Final report (6 pages + references) must be prepared according to these guidelines; Due: October 31, 2021.

Regarding the selected problem, please note that since the course is on convex optimization, the selected problem is supposed to be solvable via convex optimization methods. This does not necessarily mean that you must restrict yourself to convex problems; if you select a non-convex one, you are expected to propose a relaxation, approximation, or simplification that leads to a convex problem (or a series of convex problems) that you will solve via standard convex optimization techniques. You are encouraged to use CVX to solve the optimization problems (see the minimal sketch below), but you are allowed to use any other convex solver, too. However, solutions via non-convex solvers (e.g., MILP solvers) do not fall within the scope of this course.
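
To give a flavor of what this looks like in practice, here is a minimal CVX sketch, assuming CVX is installed in Matlab; the problem data A and b, the dimensions, and the box constraints are randomly generated placeholders, not a course problem:

    % Minimal CVX sketch: bound-constrained least squares.
    % All data below are illustrative placeholders.
    m = 20; n = 10;
    A = randn(m, n); b = randn(m, 1);   % random problem instance
    cvx_begin
        variable x(n)                   % optimization variable
        minimize( norm(A*x - b) )       % convex objective
        subject to
            x >= -1;                    % box constraints
            x <= 1;
    cvx_end
    disp(cvx_optval)                    % optimal value computed by CVX

CVX first checks that the problem complies with its disciplined convex programming ruleset and then hands it to a numerical solver; cvx_optval holds the computed optimal value.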

To pass the course, students must have returned:

  1. sufficiently correct solutions to the homework exercises, and

  2. an adequate (i.e., technically correct and readable) final project.

Prerequisites

The course is designed for students holding a Master of Science degree that includes basic knowledge of mathematics (especially linear algebra and matrix manipulation; see Appendix A of the textbook) and programming (you will use Matlab to write simple scripts). Previous background in numerical computing, probability theory, and statistics is helpful but not required.

Announcements

You are supposed to keep an eye on this page, as every announcement/assignment posted here will be assumed to have been communicated to you.

  • Any e-mail concerning this course must contain the keywords “Convex Optimization” in the e-mail's subject (e.g., Convex Optimization - question about XYZ, Convex Optimization HW01, etc.). This will ensure that your e-mail is answered in a timely manner.

  • The extra slides with GP applications can be downloaded from here.

  • Here are a few clarifications regarding the project: The midterm progress report must contain a clear mathematical statement of the selected problem (objective function, optimization variables, constraints, etc.) and (if possible) some initial results, or at least some initial thoughts on how you will solve the selected problem (e.g., using CVX or your own implementation of the solver). This information is required for you to receive meaningful feedback from your colleagues or instructor (see the standard-form template after this list).

  • There will be no lecture on Monday, the 11th of October 2021.

  • Student feedback is critical for improving the quality of the course. Please take a moment to fill in the course evaluation form (also available in pdf format here) and return it via e-mail. If you prefer to return it anonymously, you can use this gmail account. Your comments are highly appreciated.
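
As a reference for the project clarification above, the textbook's standard form (Section 4.2) is a convenient template for the required problem statement; the symbols below are generic placeholders to be replaced by your own objective, constraints, and data:

    \begin{array}{ll}
    \mbox{minimize}   & f_0(x) \\
    \mbox{subject to} & f_i(x) \leq 0, \quad i = 1, \ldots, m, \\
                      & Ax = b,
    \end{array}

where x is the optimization variable and f_0, ..., f_m are convex functions; state explicitly which quantities are optimization variables and which are problem data.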

Homework

  • HW1: 2.2, 2.5, 2.6, 2.7, 2.8, 2.9 a&b, 2.10 a, 2.12, 2.31, and 2.32 from the textbook. Due: May 03, 2021 (23:59).

  • HW2: 3.2, 3.5, 3.6, 3.13, 3.15, 3.18, 3.20 b&c, and 3.21 b from the textbook. Due: May 10, 2021 (23:59).

  • HW3: 3.22 a&c, 3.23 b, 3.24, 3.36 a,b,d, 3.39 a&c, and 3.49 a,c,d from the textbook. Due: May 17, 2021 (23:59).

  • HW4: 4.1, 4.3, 4.6, 4.8, and 4.11 from the textbook. Due: May 24, 2021 (23:59).

  • HW5: 4.12, 4.16, 4.19 b, 4.24, 4.25 (draw a figure and justify all steps), 4.26 a, and 4.33 a&b from the textbook. Due: May 31, 2021 (23:59).

  • HW6: 4.42, 4.57, 4.59, and 4.62 from the textbook. Due: June 7, 2021 (23:59).

  • HW7: 5.1, 5.3, 5.4, 5.5, 5.9, 5.12, and 5.13 from the textbook. Due: June 14, 2021 (23:59).

  • HW8: 5.14, 5.19 a&b, 5.20, 5.21, 5.22 (note that you have to draw a sketch of the sets, not only answer the questions), 5.27, 5.28 (you have to explain how you obtain the certificate), 5.31, 5.32, 5.39, 5.43. Furthermore, to help you stay in practice, you are allowed (actually encouraged) to also solve and return exercises from past homework. Due: August 23, 2021 (23:59).

  • HW9: no new homework is assigned due to the midterm project progress report deadline, but you can still solve and return exercises from past homework until August 30, 2021 (23:59).

  • HW10: 6.2 (just stating the solution is not enough, you have to argue your solution; also draw a figure in two dimensions), 6.5 (show also how the problem can be solved via bisection, i.e., write a Matlab script and provide a plot for a randomly generated instance), 6.8a and A5.10 from the online additional exercises. Due: September 6, 2021 (23:59).

  • HW11: 7.1 (just stating the solution is not enough, you have to provide the derivation), 7.8 and 8.8. Due: September 13, 2021 (23:59).

  • HW12: 8.17 (note: just stating the affine invariance condition is not enough, you must show that the condition actually holds), 8.18 and A7.6 from the online additional exercises. Due: September 20, 2021 (23:59).

  • HW13: A7.6 (if you have not returned it last time), A7.7 and A10.2 from the online additional exercises. Due: September 27, 2021 (23:59).

  • HW14: 9.30 from the textbook (you must explain your solution in detail: derive the mathematical expressions for the gradient, Hessian, descent direction, etc., describe the stopping criterion, backtracking parameters, etc., and use plenty of comments in your source; see the generic sketch after this list). Due: October 4, 2021 (23:59). Obs: to clarify the difference between consulting the solution manual (which is allowed) and copying the solution (which is cheating), please note that a solution is fine if: (1) you understood and can explain each and every step; (2) you wrote it by yourself without looking at the solution while writing it (e.g., you wrote it the next day after checking the solution); and (3) you are not ashamed if your PhD adviser sees your solution side by side with the solution manual.

  • HW15: 10.1 a, 10.6 and 10.15 from the textbook. Due: October 18, 2021 (23:59). The following exercises are optional but highly recommended: 9.1 (it is a good idea to solve this one before 10.1) and 9.27 (you can test your idea by modifying the solution of 9.30 from last homework).

  • HW16: A9.5 from the online additional exercises. Due: November 1, 2021 (23:59).
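
As referenced in HW14, here is a minimal, self-contained Matlab sketch of Newton's method with backtracking line search, in the spirit of Algorithms 9.2 and 9.5 from the textbook; the test function, starting point, and parameter values are illustrative choices only, not the data of exercise 9.30:

    % Smooth convex test function:
    % f(x) = exp(x1+3*x2-0.1) + exp(x1-3*x2-0.1) + exp(-x1-0.1)
    e = @(x) [exp(x(1)+3*x(2)-0.1); exp(x(1)-3*x(2)-0.1); exp(-x(1)-0.1)];
    f = @(x) sum(e(x));                          % objective
    g = @(x) [[1 1 -1]*e(x); [3 -3 0]*e(x)];     % gradient
    H = @(x) [[1 1 1]*e(x), [3 -3 0]*e(x); ...   % Hessian
              [3 -3 0]*e(x), [9 9 0]*e(x)];
    x = [-1; 1];                     % starting point
    alpha = 0.1; beta = 0.7;         % backtracking parameters
    epsilon = 1e-8;                  % tolerance on the Newton decrement
    for iter = 1:50
        gx = g(x);
        dx = -(H(x) \ gx);           % Newton step: solve H*dx = -gx
        lam2 = -gx'*dx;              % squared Newton decrement
        if lam2/2 <= epsilon, break; end    % stopping criterion
        t = 1;                       % backtracking line search
        while f(x + t*dx) > f(x) + alpha*t*gx'*dx
            t = beta*t;
        end
        x = x + t*dx;
    end

The same skeleton carries over to equality-constrained minimization (Chapter 10) by computing the Newton step from the KKT system instead of H*dx = -gx.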

Reading assignments

  • by April 26, 2021: chapter 2 (Convex sets) and Appendix A (Mathematical background) from the textbook.

  • by May 03, 2021: from chapter 3 (Convex functions) sections 3.1 and 3.2; if you still feel uncomfortable with basic linear algebra and calculus, please review Appendix A once more.

  • by May 10, 2021: from chapter 3 (Convex functions) sections 3.3 – 3.6.

  • by May 17, 2021: from chapter 4 (Convex optimization problems) sections 4.1 – 4.3 and skim through CVX Users’ Guide (ignore the duality related sections).

  • by May 24, 2021: from chapter 4 (Convex optimization problems) sections 4.4 – 4.5 and continue skimming through CVX Users’ Guide.

  • by May 31, 2021: from chapter 4 (Convex optimization problems) sections 4.6 – 4.7 and CVX Users’ Guide (everything, except the duality related sections).

  • by June 7, 2021: from chapter 5 (Duality) sections 5.1 – 5.4.

  • by June 14, 2021: from chapter 5 (Duality) sections 5.5 – 5.9 and duality related sections from CVX Users’ Guide.

  • by August 23, 2021: chapter 6 (Approximation and fitting).

  • by August 30, 2021: chapter 7 (Statistical estimation).

  • by September 6, 2021: chapter 8 (Geometric problems).

  • by September 13, 2021: Appendix C (Numerical linear algebra background) section C.1.

  • by September 20, 2021: Appendix C (Numerical linear algebra background) sections C.2 – C.5.

  • by September 27, 2021: chapter 9 (Unconstrained minimization).

  • by October 4, 2021: chapter 10 (Equality constrained minimization).

  • by October 18, 2021: chapter 11 (Interior-point methods).

Online supporting materials

The textbook, the lecture slides, and a complete set of video lectures are all freely available online.

If you need to refresh your linear algebra knowledge please check:

If you need to refresh your multivariable calculus knowledge please check:

What's next? For more advanced topics in (convex) optimization please check:

  • EE364b - Convex Optimization II course (originally developed by Stephen Boyd) given at Stanford; a complete set of video lectures, slides, lecture notes, homework exercises, etc. can be found here.

A series of 3 talks on optimization, given by Stephen Boyd at the Machine Learning Summer School 2015 can be found here: part 1, part 2, part 3.

Optional references

For numerical linear algebra please check:

If you want to extend your knowledge about mathematical optimization you can also check:

Here are a few more references (mostly) focused on communications and signal processing applications:

  • Y. C. Eldar, T. Davidson, A. B. Gershman, and Z.-Q. Luo, Guest Editors, “Special Issue on Convex Optimization Methods for Signal Processing,” IEEE Journal of Selected Topics in Signal Processing, vol. 1, no. 4, pp. 537-735, Dec. 2007.

  • Y. C. Eldar, Z.-Q. Luo, W. K. Ma, D. P. Palomar, and N. D. Sidiropoulos, Guest Editors, “Special Issue on Convex Optimization in Signal Processing,” IEEE Signal Processing Magazine, vol. 27, no. 3, pp. 19-127, May 2010.

  • M. Chiang, S. H. Low, Z.-Q. Luo, N. B. Shroff, and W. Yu, Guest Editors, “Special Issue on Nonlinear Optimization of Communication Systems,” IEEE Journal on Selected Areas in Communications, vol. 24, no. 8, pp. 1421-1661, Aug. 2006.

  • M. Chiang, S. H. Low, A. R. Calderbank, and J. C. Doyle, “Layering as optimization decomposition: A mathematical theory of network architectures,” Proceedings of the IEEE, vol. 95, no. 1, pp. 255-312, Jan. 2007.

  • A. Wiesel, Y. C. Eldar, and S. Shamai, “Linear precoding via conic optimization for fixed MIMO receivers," IEEE Transactions on Signal Processing, vol. 54, no. 1, pp. 161-176, Jan. 2006.

  • M. Bengtsson and B. Ottersten, “Optimal and suboptimal transmit beamforming," in Handbook of Antennas in Wireless Communications, L. C. Godara, Ed. CRC Press, Boca Raton, FL, 2001.

  • M. Bengtsson and B. Ottersten, “Optimal downlink beamforming using semidefinite optimization,” in Proc. Annual Allerton Conf. Commun., Cont., Computing, Urbana-Champaign, IL, Sept. 22-24, 1999, pp. 987-996.

  • D. P. Palomar, A Unified Framework for Communications through MIMO Channels, Ph.D. thesis, Department of Signal Theory and Communications, Technical University of Catalonia, Barcelona, Spain, May 2003.

  • S. A. Vorobyov, A. B. Gershman, and Z. Q. Luo, “Robust adaptive beamforming using worst-case performance optimization: A solution to the signal mismatch problem," IEEE Transactions on Signal Processing, vol. 51, no. 2, pp. 313-324, Feb. 2003.

  • Z. Q. Luo, “Applications of convex optimization in signal processing and digital communication,” Mathematical Programming, ser. B, vol. 97, pp. 177–207, 2003.

  • Z. Q. Luo and W. Yu, “An introduction to convex optimization for communications and signal processing,” IEEE Journal on Selected Areas in Communications, vol. 24, no. 8, pp. 1426–1438, Aug. 2006.

  • M. S. Lobo, L. Vandenberghe, S. Boyd, and H. Lebret, “Applications of second-order cone programming,” Linear Algebra and its Applications, vol. 284, pp. 193–228, Nov. 1998.

  • S. Boyd, S. J. Kim, L. Vandenberghe, and A. Hassibi, “A tutorial on geometric programming,” Optimization and Engineering, vol. 8, no. 1, pp. 67–127, Mar. 2007.

  • M. Chiang, “Geometric programming for communication systems,” Foundations and Trends in Communications and Information Theory, vol. 2, no. 1–2, pp. 1–154, Jul. 2005.

  • W. Yu, W. Rhee, S. Boyd, and J. M. Cioffi, “Iterative water-filling for Gaussian vector multiple-access channels,” IEEE Transactions on Information Theory, vol. 50, no. 1, pp. 145-152, Jan. 2004.

  • T. M. Cover and M. Chiang, “Duality between channel capacity and rate distortion with two-sided state information,” IEEE Transactions on Information Theory, vol. 48, no. 6, pp. 1629-1638, June 2002.