====== Convex and Nonlinear Numerical Optimization (MM25) ======

===== Content =====

The lecture is organized into two parts.

**Part 1** is devoted to convex analysis and programming, which play a key role in computational data analysis and also provide the basis for nonconvex problems (Part 2).
Keywords: smooth and nonsmooth convex analysis, conjugation and duality, conic programs, operator splitting, deterministic and stochastic convex optimization.

**Part 2** is devoted to nonconvex problems with additional structure that enables the design of convergent algorithms.
Keywords: elementary Riemannian manifolds, retractions, nonpositive curvature, Riemannian means, Kurdyka-Łojasiewicz property and global proximal optimization.

Basic problems from machine learning and computational data analysis illustrate the application of these concepts.


===== Organization =====

==== Place & Time ====
   * **Lecture**: Tuesday and Friday, 11-13, in seminar room 6 in the Mathematikon (INF 205)
   * **Exercise class**: Thursday, 9-11, in seminar room 7 in the Mathematikon (INF 205); the first exercise class will be on October 24.

==== Language ====
English or German, as the audience requests.

==== Target Audience ====
Students of mathematics and scientific computing who are interested in numerical optimization, with a focus on applications to data analysis and machine learning.

==== Prerequisites ====
Mandatory undergraduate courses on analysis and linear algebra.

==== Registration ====
If you wish to attend the lecture and the exercises, please sign up via [[https://muesli.mathi.uni-heidelberg.de/|MÜSLI]].

==== Exercises ====
Each week there will be an exercise sheet you can work on voluntarily. The exercises will not be collected or corrected, but the solutions will be presented in the exercise class.

Some exercise sheets contain (voluntary) programming exercises, which will also be discussed in the exercise class.
We recommend solving the programming exercises with Python and numpy.
A basic understanding of Python and numpy should be sufficient for most exercises.
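
To give an idea of the level of Python and numpy assumed, here is a minimal sketch in the spirit of the programming exercises: gradient descent with a constant step size on a smooth convex quadratic. The matrix, vector, and iteration count are made up for this illustration and do not come from any exercise sheet.

```python
import numpy as np

# Minimize the smooth convex quadratic f(x) = 0.5 * x^T A x - b^T x
# by gradient descent with constant step size 1/L, where L is the
# largest eigenvalue of A (the Lipschitz constant of grad f).
A = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite
b = np.array([1.0, 1.0])

L = np.linalg.eigvalsh(A).max()   # Lipschitz constant of the gradient
x = np.zeros(2)                   # starting point
for _ in range(200):
    grad = A @ x - b              # gradient of f at x
    x = x - grad / L              # descent step

# The minimizer solves A x = b; compare with the direct solution.
print(np.allclose(x, np.linalg.solve(A, b), atol=1e-6))  # → True
```

Familiarity with numpy arrays, matrix–vector products (`@`), and simple loops like this should suffice for most of the programming exercises.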

==== Using Mathematica ====
Go to the [[https://www.wolfram.com/programming-lab/|Wolfram Programming Lab]] and click on the orange button.
This brings you to a tutorial notebook. With the File menu in the light grey bar you can create an empty notebook.
Paste the code from the code files below into the notebook and execute it with Shift+Enter.

===== Lecture Notes =====
You need to log in to access the lecture notes.

{{ :teaching:ft1920:vl:convex:files:toc.pdf |Table of Contents (Jan 6)}} \\
{{ :teaching:ft1920:vl:convex:files:introduction.pdf |Introduction (update: Oct 18)}} \\
{{ :teaching:ft1920:vl:convex:files:literature.pdf |Literature}} \\
{{ :teaching:ft1920:vl:convex:files:preliminaries-svd.pdf |Preliminaries: SVD}} \\
{{ :teaching:ft1920:vl:convex:files:smoothconvexfunctions.pdf |Smooth Convex Functions}} \\
{{ :teaching:ft1920:vl:convex:files:nonsmoothconvex.pdf |Nonsmooth Convex Functions, Convex Sets, Optimality (update: Nov 25)}} \\
{{ :teaching:ft1920:vl:convex:files:nonexpansiveoperators.pdf |Nonexpansive Operators (update: Nov 25)}} \\
{{ :teaching:ft1920:vl:convex:files:coalgorithms-1.pdf |Convex Optimisation Algorithms 1}} \\
{{ :teaching:ft1920:vl:convex:files:coalgorithms-2.pdf |Convex Optimisation Algorithms 2}} \\
{{ :teaching:ft1920:vl:convex:files:conjugationduality.pdf |Conjugation, Duality}} \\
{{ :teaching:ft1920:vl:convex:files:nonconvex.pdf |Nonconvex Optimization}}

===== Exercise Sheets =====
You need to log in to access the exercise sheets.

  - {{teaching:ft1920:vl:convex:files:uebungsblatt1.pdf | Exercise Sheet 1}}
  - {{teaching:ft1920:vl:convex:files:uebungsblatt2.pdf | Exercise Sheet 2}}
  - {{teaching:ft1920:vl:convex:files:uebungsblatt3.pdf | Exercise Sheet 3}}
  - {{teaching:ft1920:vl:convex:files:uebungsblatt4.pdf | Exercise Sheet 4}}
  - {{teaching:ft1920:vl:convex:files:uebungsblatt5.pdf | Exercise Sheet 5}}
  - {{teaching:ft1920:vl:convex:files:uebungsblatt6.pdf | Exercise Sheet 6}}
  - {{teaching:ft1920:vl:convex:files:uebungsblatt7.pdf | Exercise Sheet 7}}
  - {{teaching:ft1920:vl:convex:files:uebungsblatt8.pdf | Exercise Sheet 8}}
      - {{teaching:ft1920:vl:convex:files:8_moreauenvelope.pdf | Mathematica code for visualizing the Moreau envelope}}
      - {{teaching:ft1920:vl:convex:files:8_sigmoid.pdf | Mathematica code for visualizing the sigmoid function}}
      - {{teaching:ft1920:vl:convex:files:8_logisticloss.pdf | Mathematica code for visualizing the logistic loss classifier}}
  - {{teaching:ft1920:vl:convex:files:uebungsblatt9.pdf | Exercise Sheet 9}}
      - {{teaching:ft1920:vl:convex:files:data_exercise_sheet_9.zip | Data}}
      - {{teaching:ft1920:vl:convex:files:9_StochasticGradientDescent.pdf | Mathematica code for visualizing stochastic gradient descent for a logistic loss classifier}}
  - {{teaching:ft1920:vl:convex:files:uebungsblatt10.pdf | Exercise Sheet 10}}
  - {{teaching:ft1920:vl:convex:files:uebungsblatt11.pdf | Exercise Sheet 11}}
  - {{teaching:ft1920:vl:convex:files:uebungsblatt12.pdf | Exercise Sheet 12}}
      - {{teaching:ft1920:vl:convex:files:data_exercise_sheet_12.zip | Data}}