Hierarchical Uncertainty Exploration via Feedforward Posterior Trees

NeurIPS 2024

¹Technion - Israel Institute of Technology   ²CISPA Helmholtz Center for Information Security

Abstract

When solving ill-posed inverse problems, one often desires to explore the space of potential solutions rather than be presented with a single plausible reconstruction. Valuable insights into these feasible solutions and their associated probabilities are embedded in the posterior distribution. However, when confronted with data of high dimensionality (such as images), visualizing this distribution becomes a formidable challenge, necessitating the application of effective summarization techniques before user examination. In this work, we introduce a new approach for visualizing posteriors across multiple levels of granularity using tree-valued predictions. Our method predicts a tree-valued hierarchical summarization of the posterior distribution for any input measurement, in a single forward pass of a neural network. We showcase the efficacy of our approach across diverse datasets and image restoration challenges, highlighting its prowess in uncertainty quantification and visualization. Our findings reveal that our method performs comparably to a baseline that hierarchically clusters samples from a diffusion-based posterior sampler, yet achieves this with orders of magnitude greater speed.
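The baseline mentioned above, hierarchically clustering samples drawn from a diffusion-based posterior sampler, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: `posterior_tree` and the simple k-means helper are hypothetical names, and the tree layout (node mean, probability mass, children) is assumed. Each node summarizes a subset of the samples by its mean, and its probability is the fraction of posterior samples it covers.

```python
import numpy as np

def kmeans(x, k, iters=20, seed=0):
    """Minimal k-means on row vectors (illustrative, not optimized)."""
    rng = np.random.default_rng(seed)
    centers = x[rng.choice(len(x), size=k, replace=False)]  # fancy indexing copies
    for _ in range(iters):
        # Assign each sample to its nearest center.
        labels = np.argmin(((x[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):  # keep the old center if a cluster empties
                centers[j] = x[labels == j].mean(axis=0)
    return centers, labels

def posterior_tree(samples, branching=2, depth=2):
    """Recursively cluster posterior samples into a tree of summaries.

    Each node holds the mean of its samples and the probability mass
    (fraction of all samples) that it represents.
    """
    def build(x, mass, d):
        node = {"mean": x.mean(axis=0), "prob": mass, "children": []}
        if d == 0 or len(x) < branching:
            return node
        _, labels = kmeans(x, branching)
        for j in range(branching):
            part = x[labels == j]
            if len(part):
                node["children"].append(build(part, mass * len(part) / len(x), d - 1))
        return node
    return build(samples, 1.0, depth)
```

In this sketch, the flattened samples would come from running a posterior sampler many times per measurement; the paper's contribution is to predict such a tree directly in one forward pass, avoiding both the repeated sampling and the clustering step.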


Paper

Hierarchical Uncertainty Exploration via Feedforward Posterior Trees
Elias Nehme, Rotem Mulayoff, Tomer Michaeli.

Bibtex

@article{nehme2024hierarchical,
  title={Hierarchical Uncertainty Exploration via Feedforward Posterior Trees},
  author={Nehme, Elias and Mulayoff, Rotem and Michaeli, Tomer},
  journal={arXiv preprint arXiv:2405.15719},
  year={2024}
}

More results and further discussion about our method can be found in the supplementary material (included in the paper).



Acknowledgements

This webpage was originally made by Matan Kleiner with the help of Hila Manor. The code for the original template can be found here.
Icons are taken from Font Awesome and Academicons.