Variational and Optimization Methods of Machine Learning and Data Science
Content
The lecture focuses on the following aspects of the overall theme:
(i) Use of probability measures throughout most parts of the lecture to represent data independently of specific formats;
(ii) introduction and discussion of variational formulations of increasing complexity for basic inference and learning tasks, expressed in terms of suitable functionals of probability distributions;
(iii) derivation, via variational approximations, of processes that realize data-driven inference and learning through neural networks and optimization.
Previous knowledge expected
Good knowledge of analysis and linear algebra;
basic concepts of convex analysis and optimization.
Objectives
The lecture emphasizes mathematical principles that help students to classify the methods of the field and to recognize their common and distinguishing properties. It complements the other core lectures of the master programme and prepares students for advanced follow-up lectures devoted to contemporary research topics in machine learning and data science.
Detailed Course Type
Core lecture of the master programme: Mathematical Methods of Machine Learning and Data Science
Further information
Keywords:
entropy, mutual information, Bregman divergences, information geometry, latent variable models, variational bounds, statistical influence functions, optimization in spaces of measures, generative models, online and active learning
- Lectures: Prof. Christoph Schnörr
- Exercises: Daniel Gonzalez, Jonas Cassel
- Language: English
- SWS: 4
- ECTS: 8
- Registration:
  - HeiCo – Lecture
  - HeiCo – Exercise
- Links to material and exercises: Lecture Notes, Exercises