====== Research ======
  
===== Variational Image Analysis on Manifolds and Metric Measure Spaces =====
  
**Scope.** We exploit basic statistical manifolds to devise variational models of low-level image analysis that exhibit favourable properties in comparison to established convex and non-convex models: smoothness, probabilistic interpretation, and efficiently converging parallel and sparse Riemannian numerical updates that scale up to large problem sizes.
  
The current focus is on the **//assignment manifold//** and image labeling, and on learning from image assignments in large-scale unsupervised scenarios, within the mathematical frameworks of information geometry and regularised optimal transport. A novel smooth dynamical system evolving on a statistical manifold, called the **//assignment flow//**, forms the basis of our work.
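
For orientation, the flow has the following schematic form (notation simplified; the precise definition of the similarity map and of the underlying product geometry is given in the papers cited below):

<code latex>
% Schematic form of the assignment flow (simplified notation):
% W_i(t) lies in the relative interior of the probability simplex and encodes
% the label assignment at node i of the underlying graph; S_i(W) is a
% similarity map obtained by geometric averaging over neighbourhoods.
\dot{W}_i = R_{W_i}\, S_i(W), \qquad
W_i(0) = \tfrac{1}{n}\mathbb{1}, \qquad i \in V,
\qquad\text{with}\qquad
R_p = \mathrm{Diag}(p) - p\, p^{\top} .
</code>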
  
**Mathematical aspects.** The assignment flow evolves non-locally for any data given on a graph. Variational aspects, extensions to continuous domains and scale separation are investigated. A preliminary step concerns a more classical //additive// variational formulation that provides a smooth geometric version of the continuous cut approach; the classical model is recalled below the reference for comparison.
  * [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/Savarino2019aa.pdf|A Variational Perspective on the Assignment Flow, SSVM 2019]]
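
For comparison, the classical convex relaxation of the multi-label continuous cut problem reads as follows; this is the standard form only, not the additive geometric formulation of the paper itself:

<code latex>
% Standard convex relaxation of the continuous multi-label cut problem:
% u(x) takes values in the probability simplex Delta_n, f(x) collects the
% pointwise label costs, TV denotes the total variation of each component.
\min_{u\colon \Omega \to \Delta_n}\;
\int_\Omega \langle u(x), f(x)\rangle \, dx
+ \lambda \sum_{j=1}^{n} \mathrm{TV}(u_j),
\qquad
\Delta_n = \{\, p \in \mathbb{R}^{n}_{\geq 0} : \langle \mathbb{1}, p\rangle = 1 \,\}.
</code>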
  
**Parameter learning.** We study how the weights for geometric diffusion that parametrise the adaptivity of the assignment flow can be learned from data. Symplectic integration ensures that discretisation and optimisation commute, i.e. differentiating the discretised flow yields the same gradient as discretising the adjoint of the continuous flow; a schematic form of the learning problem is given below the references. We currently investigate this approach in connection with more general objective functions.
  * [[https://arxiv.org/abs/1910.09976|Learning Adaptive Regularization for Image Labeling Using Geometric Assignment, preprint: arXiv:1910.09976]]
  * [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/Huhnerbein2019aa.pdf|Conference version, SSVM 2019]]
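
Schematically, and up to the parametrisation actually used in the papers, the learning task can be viewed as an ODE-constrained optimisation problem over the regularisation weights:

<code latex>
% Schematic ODE-constrained formulation (assumptions: Omega collects the
% weights parametrising the similarity map S; L is a loss that compares the
% assignment W(T), obtained by integrating the flow up to time T, with
% ground-truth assignments).
\min_{\Omega}\; \mathcal{L}\big(W(T;\Omega)\big)
\quad\text{subject to}\quad
\dot{W} = R_{W}\, S(W;\Omega),
\qquad W(0) = \mathbb{1}_{\mathcal{W}} .
</code>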
  
**Unsupervised label learning.** Our recent work concerns the emergence of labels in a completely unsupervised way by data //self//-assignment. The resulting unsupervised assignment flow has connections to low-rank matrix factorisation and discrete optimal mass transport that are explored in our current work.
  * [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/Zisler2019aa.pdf|Unsupervised Labeling by Geometric and Spatially Regularized Self-Assignment, SSVM 2019]]

We also extended the assignment flow to //unsupervised// scenarios, where label evolution on a feature manifold is performed simultaneously with label assignment to given data; a schematic picture of this coupling is sketched after the following references, which introduce the corresponding //unsupervised assignment flow//.
  * [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/Zern2019aa.pdf|Unsupervised Assignment Flow: Label Learning on Feature Manifolds by Spatially Regularized Geometric Assignment, preprint: arXiv:1904.10863]]
  * [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/gcpr2018.pdf|Unsupervised Label Learning on Manifolds by Spatially Regularized Geometric Assignment, GCPR 2018]]
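
Purely as an illustration of this coupling (the cited papers define the exact dynamics, normalisations and regularisation), label evolution and assignment may be pictured as two interacting flows:

<code latex>
% Illustrative coupled dynamics (not the exact formulation of the papers):
% W evolves on the assignment manifold, while the prototypes f*_j evolve on
% the feature manifold, driven by assignment-weighted Riemannian mean shifts
% towards the data points f_i.
\dot{W} = R_{W}\, S\big(W; \{f^{*}_{j}\}\big),
\qquad
\dot{f}^{*}_{j} = \sum_{i \in V} W_{ij}\, \exp^{-1}_{f^{*}_{j}}(f_{i}) .
</code>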

**Geometric numerical integration.** We conducted a comprehensive study of //geometric integration// techniques, including automatic step size adaptation, for computing the assignment flow numerically in a stable, efficient and parameter-free way; a minimal single-step sketch follows the references below.
  * [[https://iopscience.iop.org/article/10.1088/1361-6420/ab2772|Geometric Numerical Integration of the Assignment Flow, Inverse Problems, 2019]]
  * [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/Zeilmann2018aa.pdf|preprint: arXiv:1810.06970]]
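
A minimal sketch of a single explicit geometric Euler step on one probability simplex, assuming the similarity vector ''s'' for that node has already been computed (the cited works build higher-order and adaptive schemes on this update):

<code python>
import numpy as np

def geometric_euler_step(w, s, h):
    """One explicit geometric Euler step on the probability simplex.

    Multiplicative (lifting-map) update w <- w * exp(h*s) / <w, exp(h*s)>:
    it keeps the iterate strictly positive and normalised and is the basic
    building block of the geometric integration schemes referenced above.
    """
    v = w * np.exp(h * (s - s.max()))   # subtract max(s) for numerical stability
    return v / v.sum()

# tiny usage example with 3 labels (hypothetical numbers)
w = np.full(3, 1.0 / 3.0)               # barycentre initialisation
s = np.array([0.2, 1.5, 0.1])           # similarity vector for one node
for _ in range(10):                     # iterate the update with step size 0.5
    w = geometric_euler_step(w, s, h=0.5)
print(w, w.sum())                       # remains a probability vector
</code>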

**Evaluation of discrete graphical models.** We applied our approach to solve, in a novel way, the //MAP labeling problem// based on a given graphical model, by smoothly combining a geometric reformulation of the local polytope relaxation with rounding to an integral solution. Key ingredients are local //Wasserstein messages// that couple local assignment measures along edges; the underlying relaxation is recalled below the reference.

  * [[https://epubs.siam.org/doi/abs/10.1137/17M1150669|Image Labeling Based on Graphical Models Using Wasserstein Messages and Geometric Assignment, SIAM J. Imaging Sciences 11/2 (2018) 1317--1362]]
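
For reference, the local polytope relaxation of the MAP labeling problem that the geometric reformulation starts from (standard form; roughly speaking, the Wasserstein messages arise from the coupling of adjacent assignment measures through the edge terms):

<code latex>
% Standard local polytope (LP) relaxation of MAP inference on a graph (V,E):
% theta_i, theta_ij are unary and pairwise potentials, mu_i, mu_ij are local
% marginals; the constraints enforce consistency of edge and node marginals.
\min_{\mu \geq 0}\;
\sum_{i \in V} \langle \theta_i, \mu_i\rangle
+ \sum_{ij \in E} \langle \theta_{ij}, \mu_{ij}\rangle
\quad\text{s.t.}\quad
\mu_{ij}\mathbb{1} = \mu_i, \quad
\mu_{ij}^{\top}\mathbb{1} = \mu_j, \quad
\langle \mathbb{1}, \mu_i\rangle = 1 .
</code>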

**Kick-off paper** that introduces the basic approach:

  * [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/Astroem2017.pdf|Image Labeling by Assignment, J. Math. Imag. Vision 58/2 (2017) 211--238]]
  * [[http://www-rech.telecom-lille.fr/diff-cv2016/|Proceedings DIFF-CVML'16; Grenander best paper award]]
  * [[https://ipa.iwr.uni-heidelberg.de/dokuwiki/Papers/Astroem2016d.pdf|Proceedings ECCV'16]]

Johannes Berger, Andreas Neufeld, Florian Becker, Frank Lenzen, Christoph Schnörr
  
Details: [[https://ipa.iwr.uni-heidelberg.de/dokuwiki/Papers/Berger2015a.pdf|Paper]]
===== Partial Optimality in MAP-MRF =====