===== (Master) Seminar: Score-based Generative Models for Machine Learning =====

==== Description of the Seminar ====

Score-based generative models have established state-of-the-art performance in many applications in recent years. The key idea of these models is to successively inject noise into the training data and then learn the reverse process in order to generate new samples. Training and sampling can be done independently: training can be carried out with noise-conditional score networks, while sampling can be achieved with several methods, such as Langevin Monte Carlo approaches, stochastic differential equations, ordinary differential equations, and combinations thereof. (An illustrative code sketch of these two steps is given at the end of this page.)

==== Organization ====

  * **Prerequisites:** Basic knowledge of probability theory and statistics
  * **Registration:** Via Müsli. [[https://muesli.mathi.uni-heidelberg.de/lecture/view/1757|Link]]
  * **First (organizational) meeting:** Calendar week 42. The specific day and time will be announced soon.
  * **Time and location:** Will be announced soon.

Further information on the seminar will be given in the first organizational meeting. For any specific questions you can contact [[:people | Daniel Gonzalez]].

==== Literature ====

  * **An introduction to deep generative modeling**, //Ruthotto, Lars and Haber, Eldad//, GAMM-Mitteilungen, Wiley Online Library (2021)
  * **A conceptual introduction to Markov chain Monte Carlo methods**, //Speagle, Joshua S.//, arXiv preprint (2019)
  * **Neural ordinary differential equations**, //Chen, Ricky T. Q. and Rubanova, Yulia and Bettencourt, Jesse and Duvenaud, David K.//, NeurIPS (2018)
  * **FFJORD: Free-form continuous dynamics for scalable reversible generative models**, //Grathwohl, Will and Chen, Ricky T. Q. and Bettencourt, Jesse and Sutskever, Ilya and Duvenaud, David//, arXiv preprint (2018)
  * **Diffusion models: A comprehensive survey of methods and applications**, //Yang, Ling and Zhang, Zhilong and Song, Yang and Hong, Shenda and Xu, Runsheng and Zhao, Yue and Shao, Yingxia and Zhang, Wentao and Cui, Bin and Yang, Ming-Hsuan//, arXiv preprint (2022)
  * **Applied stochastic differential equations**, //Särkkä, Simo and Solin, Arno//, Cambridge University Press (2019)
  * **Generative modeling by estimating gradients of the data distribution**, //Song, Yang and Ermon, Stefano//, NeurIPS (2019)
  * **Improved techniques for training score-based generative models**, //Song, Yang and Ermon, Stefano//, NeurIPS (2020)
  * **Score-based generative modeling through stochastic differential equations**, //Song, Yang and Sohl-Dickstein, Jascha and Kingma, Diederik P. and Kumar, Abhishek and Ermon, Stefano and Poole, Ben//, ICLR (2021)
  * **Gotta go fast when generating data with score-based models**, //Jolicoeur-Martineau, Alexia and Li, Ke and Piché-Taillefer, Rémi and Kachman, Tal and Mitliagkas, Ioannis//, arXiv preprint (2021)
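==== Illustrative Code Sketch ====

The following minimal PyTorch sketch (not part of the official seminar material) illustrates the two steps described above in the spirit of Song & Ermon (2019): a tiny noise-conditional score network is trained with denoising score matching on 2-D toy data, and new samples are then drawn with annealed Langevin dynamics. All names (''ScoreNet'', ''dsm_loss'', ''annealed_langevin'') and hyperparameters are illustrative assumptions, not a reference implementation.

<code python>
# Minimal sketch: denoising score matching + annealed Langevin sampling
# on 2-D toy data (cf. Song & Ermon, 2019). Names and hyperparameters
# below are illustrative assumptions.
import torch
import torch.nn as nn


class ScoreNet(nn.Module):
    """Tiny noise-conditional score network s_theta(x, sigma) for 2-D data."""

    def __init__(self, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, 2),
        )

    def forward(self, x, sigma):
        # Condition on the noise level by appending log(sigma) to the input.
        return self.net(torch.cat([x, sigma.log().unsqueeze(-1)], dim=-1))


def dsm_loss(model, x, sigmas):
    """Denoising score matching: for x_noisy = x + sigma * eps, the score of
    the Gaussian perturbation kernel is -(x_noisy - x) / sigma^2; the network
    should match it, with a sigma^2 weight to balance the noise levels."""
    sigma = sigmas[torch.randint(len(sigmas), (x.shape[0],))]
    x_noisy = x + sigma.unsqueeze(-1) * torch.randn_like(x)
    target = -(x_noisy - x) / sigma.unsqueeze(-1) ** 2
    return (((model(x_noisy, sigma) - target) ** 2).sum(-1) * sigma ** 2).mean()


@torch.no_grad()
def annealed_langevin(model, sigmas, n=1000, steps=100, eps=2e-5):
    """Sample by running Langevin dynamics at successively smaller sigma."""
    x = torch.randn(n, 2)
    for sigma in sigmas:                        # sigmas sorted large -> small
        alpha = eps * (sigma / sigmas[-1]) ** 2  # per-level step size
        for _ in range(steps):
            z = torch.randn_like(x)
            x = x + 0.5 * alpha * model(x, sigma.repeat(n)) + alpha.sqrt() * z
    return x


# Example usage on a two-component Gaussian mixture with a geometric schedule.
if __name__ == "__main__":
    data = torch.randn(5000, 2) * 0.3 \
        + torch.tensor([2.0, 0.0]) * (torch.rand(5000, 1) < 0.5).float()
    sigmas = torch.logspace(0, -2, 10)          # 1.0 down to 0.01
    model = ScoreNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(2000):
        opt.zero_grad()
        loss = dsm_loss(model, data[torch.randint(len(data), (256,))], sigmas)
        loss.backward()
        opt.step()
    samples = annealed_langevin(model, sigmas)
</code>

With the large-to-small noise schedule, early Langevin steps explore the space at a coarse scale and later steps refine the samples near the data manifold; this annealing is what makes the procedure practical and is discussed in the Song & Ermon papers listed above.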