===== (Master) Seminar: Score-based Generative Models for Machine Learning =====

==== Description of the Seminar ====

Score-based generative models have demonstrated state-of-the-art performance in numerous applications over recent years. The central idea behind these models is to gradually introduce noise into the training data and then learn the reverse process in order to generate new samples. The training and sampling procedures can be carried out independently: the learning phase relies on noise-conditional score networks, while sampling can be performed with various methods, including Langevin Monte Carlo approaches, stochastic differential equations, ordinary differential equations, and combinations thereof. We begin the seminar by reviewing generative models and the architectures most commonly used in current research.

==== Organization ====

  * **Prerequisites:** Basic knowledge of probability theory and statistics
  * **Registration:** Via Müsli. [[https://muesli.mathi.uni-heidelberg.de/lecture/view/1757|Link]]
  * **First (organizational) meeting:** Calendar week 42. The specific day and time will be announced soon.
  * **Time and Location:** To be announced.

Further information on the seminar will be given in the first organizational meeting. For any specific questions you can contact [[:people | Daniel Gonzalez]].
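To give a flavor of the Langevin Monte Carlo sampling mentioned in the description, here is a minimal toy sketch in Python. It samples from a standard Gaussian whose exact score (gradient of the log-density) is known in closed form; in an actual score-based model, the score function would instead be a trained noise-conditional network. All function names and parameter values here are illustrative choices, not part of any seminar material.

```python
import numpy as np

def score(x, mu=0.0, sigma=1.0):
    # Exact score of N(mu, sigma^2): grad_x log p(x) = -(x - mu) / sigma^2.
    # In score-based generative models this is replaced by a learned network.
    return -(x - mu) / sigma**2

def langevin_sample(n_particles=1000, n_steps=1000, step=0.01, seed=0):
    # Unadjusted Langevin dynamics:
    #   x <- x + (step / 2) * score(x) + sqrt(step) * z,  z ~ N(0, 1)
    rng = np.random.default_rng(seed)
    x = 5.0 * rng.normal(size=n_particles)  # initialize far from the target
    for _ in range(n_steps):
        x = x + 0.5 * step * score(x) + np.sqrt(step) * rng.normal(size=x.shape)
    return x

samples = langevin_sample()
print(samples.mean(), samples.std())  # both should be close to 0 and 1
```

After enough steps the particle cloud forgets its initialization and its empirical mean and standard deviation approach those of the target distribution (up to a small discretization bias of the unadjusted scheme).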
==== Literature ====

  * **An introduction to deep generative modeling**, //Ruthotto, Lars and Haber, Eldad//, GAMM-Mitteilungen, Wiley Online Library (2021)
  * **A conceptual introduction to Markov chain Monte Carlo methods**, //Speagle, Joshua S//, arXiv preprint (2019)
  * **Neural ordinary differential equations**, //Chen, Ricky TQ and Rubanova, Yulia and Bettencourt, Jesse and Duvenaud, David K//, NeurIPS (2018)
  * **FFJORD: Free-form continuous dynamics for scalable reversible generative models**, //Grathwohl, Will and Chen, Ricky TQ and Bettencourt, Jesse and Sutskever, Ilya and Duvenaud, David//, arXiv preprint (2018)
  * **Diffusion models: A comprehensive survey of methods and applications**, //Yang, Ling and Zhang, Zhilong and Song, Yang and Hong, Shenda and Xu, Runsheng and Zhao, Yue and Shao, Yingxia and Zhang, Wentao and Cui, Bin and Yang, Ming-Hsuan//, arXiv preprint (2022)
  * **Applied stochastic differential equations**, //Särkkä, Simo and Solin, Arno//, Cambridge University Press (2019)
  * **Generative modeling by estimating gradients of the data distribution**, //Song, Yang and Ermon, Stefano//, NeurIPS (2019)
  * **Improved techniques for training score-based generative models**, //Song, Yang and Ermon, Stefano//, NeurIPS (2020)
  * **Score-based generative modeling through stochastic differential equations**, //Song, Yang and Sohl-Dickstein, Jascha and Kingma, Diederik P and Kumar, Abhishek and Ermon, Stefano and Poole, Ben//, ICLR (2021)
  * **Gotta go fast when generating data with score-based models**, //Jolicoeur-Martineau, Alexia and Li, Ke and Piché-Taillefer, Rémi and Kachman, Tal and Mitliagkas, Ioannis//, arXiv preprint (2021)