
===== Variational Image Analysis on Manifolds and Metric Measure Spaces =====

**Scope.** We exploit basic statistical manifolds to devise variational models of low-level image analysis that exhibit favorable properties in comparison to established convex and non-convex models: smoothness, probabilistic interpretation, and efficiently converging parallel and sparse Riemannian numerical updates that scale up to large problem sizes.

The current focus is on the **//assignment manifold//** and image labeling, and on learning from image assignments in large-scale unsupervised scenarios, within the mathematical frameworks of information geometry and regularized optimal transport. A novel smooth dynamical system evolving on a statistical manifold, called **//assignment flow//**, forms the basis of our work.
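The core dynamics can be sketched numerically. The following minimal NumPy example is illustrative only: the function name, the toy signal, the plain explicit Euler discretization, and all parameter values are our assumptions, not the implementation used in the papers. It combines a likelihood lift of the data term with geometric (log-domain) neighborhood averaging and a replicator-type update, so each row of the assignment matrix stays on the probability simplex.

```python
import numpy as np

def assignment_flow(D, neighborhoods, rho=0.1, h=0.2, steps=200):
    """Toy sketch of the assignment flow: replicator dynamics on a
    product of probability simplices, coupled by geometric averaging
    of likelihoods over local neighborhoods (explicit Euler steps)."""
    n, c = D.shape
    W = np.full((n, c), 1.0 / c)          # barycenter initialization
    L = np.exp(-D / rho)                  # data term lifted to the simplex
    L /= L.sum(axis=1, keepdims=True)
    for _ in range(steps):
        # likelihood map: multiplicative lift of the data term at W
        Lw = W * L
        Lw /= Lw.sum(axis=1, keepdims=True)
        # similarity map: geometric mean of likelihoods over each neighborhood
        logLw = np.log(Lw)
        S = np.exp(np.stack([logLw[nbh].mean(axis=0) for nbh in neighborhoods]))
        S /= S.sum(axis=1, keepdims=True)
        # replicator equation  W' = W * (S - <W, S>)
        W = W + h * W * (S - (W * S).sum(axis=1, keepdims=True))
        W = np.clip(W, 1e-15, None)
        W /= W.sum(axis=1, keepdims=True)
    return W

# toy 1-D signal with two labels {0, 1}
f = np.array([0.0, 0.05, 0.1, 0.9, 1.0, 0.95])
labels = np.array([0.0, 1.0])
D = (f[:, None] - labels[None, :]) ** 2   # squared feature distances
neighborhoods = [[max(i - 1, 0), i, min(i + 1, len(f) - 1)]
                 for i in range(len(f))]
W = assignment_flow(D, neighborhoods)
print(W.argmax(axis=1))                   # hard labeling after convergence
```

The replicator step drives each row toward a vertex of the simplex, so spatially regularized hard assignments emerge from the smooth flow without explicit rounding.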

**Mathematical aspects.** The assignment flow evolves non-locally for any data given on a graph. Geometric and variational aspects, extensions to continuous domains, scale separation and models of knowledge representation across the scales are investigated.

A preliminary extension from graphs to the continuous domain in the "zero-scale limit" (local interaction only) reveals the interplay between the underlying geometry and variational aspects.

* [[https://arxiv.org/abs/1910.07287|Continuous-Domain Assignment Flow, preprint arXiv:1910.07287]]

A more classical //additive// variational reformulation provides a smooth geometric version of the continuous cut approach.

* [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/Savarino2019aa.pdf|A Variational Perspective on the Assignment Flow, SSVM 2019]]

**Parameter learning.** We study how weights for geometric diffusion that parametrize the adaptivity of the assignment flow can be learned from data. Symplectic integration ensures the commutativity of discretization and optimization operations. Results reveal the steerability of the assignment flow and its potential for pattern //formation//.

* [[https://arxiv.org/abs/1910.09976|Learning Adaptive Regularization for Image Labeling Using Geometric Assignment, preprint arXiv:1910.09976]]

* [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/Huhnerbein2019aa.pdf|Conference version, SSVM 2019]]

**Unsupervised label learning.** Our recent work concerns the emergence of labels in a completely unsupervised way by data //self//-assignment. The resulting //self-assignment flow// has connections to low-rank matrix factorization and discrete optimal mass transport that are explored in our current work.

* [[https://arxiv.org/abs/1911.03472|Self-Assignment Flows for Unsupervised Data Labeling on Graphs, preprint arXiv:1911.03472]]

* [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/Zisler2019aa.pdf|Unsupervised Labeling by Geometric and Spatially Regularized Self-Assignment, SSVM 2019]]

We extended the assignment flow to //unsupervised// scenarios, where label evolution on a feature manifold is performed simultaneously with label assignment to given data. The following papers introduce the corresponding //unsupervised assignment flow//.

* [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/Zern2019aa.pdf|Unsupervised Assignment Flow: Label Learning on Feature Manifolds by Spatially Regularized Geometric Assignment, preprint arXiv:1904.10863]]

* [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/gcpr2018.pdf|Unsupervised Label Learning on Manifolds by Spatially Regularized Geometric Assignment, GCPR 2018]]

**Geometric numerical integration.** We conducted a comprehensive study of //geometric integration// techniques, including automatic step size adaptation, for numerically computing the assignment flow in a stable, efficient and parameter-free way.

* [[https://iopscience.iop.org/article/10.1088/1361-6420/ab2772|Geometric Numerical Integration of the Assignment Flow, Inverse Problems, 2019]]

* [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/Zeilmann2018aa.pdf|Preprint, arXiv:1810.06970]]
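The simplest member of this family of schemes can be sketched as follows; this is a hedged illustration, not the papers' implementation, and the function name and toy setup are our assumptions. The point of a geometric explicit Euler step is that the vector field is applied through a multiplicative (exponential-map) lift followed by normalization, so every iterate remains exactly on the probability simplex, which an additive Euler step would not guarantee.

```python
import numpy as np

def geometric_euler_step(W, S, h):
    """One explicit geometric Euler step for a simplex-valued flow:
    instead of adding h*S in the embedding space (which would leave
    the simplex), the update is a multiplicative lift of the vector
    field followed by row-wise normalization."""
    Wn = W * np.exp(h * S)
    return Wn / Wn.sum(axis=1, keepdims=True)

# single assignment vector over two labels; S favors label 0
W = np.array([[0.5, 0.5]])
S = np.array([[1.0, 0.0]])
for _ in range(10):
    W = geometric_euler_step(W, S, h=0.5)
```

The step size h appears only inside the exponential, so no projection or clipping is needed to preserve feasibility; step size adaptation then only trades accuracy against speed.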


**Evaluation of discrete graphical models.** We applied our approach to solve in a novel way the //MAP labeling problem// based on a given graphical model by smoothly combining a geometric reformulation of the local polytope relaxation with rounding to an integral solution. Key ingredients are local //Wasserstein messages// that couple local assignment measures along edges.


* [[https://epubs.siam.org/doi/abs/10.1137/17M1150669|Image Labeling Based on Graphical Models Using Wasserstein Messages and Geometric Assignment, SIAM J. on Imaging Sciences 11/2 (2018) 1317--1362]]


**Kick-off paper** that introduces the basic approach:


* [[https://ipa.math.uni-heidelberg.de/dokuwiki/Papers/Astroem2017.pdf|Image Labeling by Assignment, J. Math. Imag. Vision 58/2 (2017) 211--238]]

* [[http://www-rech.telecom-lille.fr/diff-cv2016/|Proceedings DIFF-CVML'16; Grenander best paper award]]

* [[https://ipa.iwr.uni-heidelberg.de/dokuwiki/Papers/Astroem2016d.pdf|Proceedings ECCV'16]]

===== Estimating Vehicle Ego-Motion and Piecewise Planar Scene Structure from Optical Flow in a Continuous Framework =====

We propose a variational approach for estimating egomotion and structure of a static scene from a pair of images recorded by a single moving camera. In our approach, the scene structure is described by a set of 3D planar surfaces, which are linked to a SLIC superpixel decomposition of the image domain. The continuously parametrized planes are determined along with the extrinsic camera parameters by jointly minimizing a non-convex smooth objective function that comprises a data term based on the pre-calculated optical flow between the input images and suitable priors on the scene variables.


===== Partial Optimality in MAP-MRF =====

We consider the energy minimization problem for undirected graphical models, also known as the MAP-inference problem for Markov random fields, which is NP-hard in general. We propose a novel polynomial-time algorithm to obtain a part of its optimal non-relaxed integral solution. For this task, we devise a novel pruning strategy that utilizes standard MAP-solvers as a subroutine. We show that our pruning strategy is in a certain sense theoretically optimal. Empirically, our method also outperforms previous approaches in terms of the number of persistently labeled variables. The method is very general, as it is applicable to models with arbitrary factors of arbitrary order and can employ any solver for the considered relaxed problem. Our method's runtime is determined by the runtime of the convex relaxation solver for the MAP-inference problem.

**Researchers**: Paul Swoboda, Bogdan Savchynskyy, Alexander Shekhovtsov, Jörg Hendrik Kappes, Christoph Schnörr\\

Details: [[https://ipa.iwr.uni-heidelberg.de/dokuwiki/Papers/Swoboda2016.pdf|Paper]]