Learning Likelihoods with Conditional Normalizing Flows

Preprint (arXiv), 2019

Abstract

Normalizing Flows (NFs) are able to model complicated distributions p(y) with strong inter-dimensional correlations and high multimodality by transforming a simple base density p(z) through an invertible neural network under the change of variables formula. Such behavior is desirable in multivariate structured prediction tasks, where handcrafted per-pixel loss-based methods inadequately capture strong correlations between output dimensions. We present a study of conditional normalizing flows (CNFs), a class of NFs in which the mapping from base density to output space is conditioned on an input x, to model conditional densities p(y|x). CNFs are efficient in sampling and inference; they can be trained with a likelihood-based objective; and, being generative flows, they do not suffer from mode collapse or training instabilities. We provide an effective method for training continuous CNFs on binary problems and, in particular, apply these CNFs to super-resolution and vessel segmentation tasks, demonstrating competitive performance on standard benchmark datasets in terms of likelihood and conventional metrics.
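As a rough illustration of the conditional change-of-variables objective described above (not the paper's architecture), the sketch below trains a toy conditional flow in PyTorch by maximizing log p(y|x) = log p(z) + log |det dz/dy|. The ConditionalAffineFlow class, its single affine layer, and all dimensions and hyperparameters are hypothetical simplifications; the paper uses deeper stacks of invertible layers.

    import math
    import torch
    import torch.nn as nn

    class ConditionalAffineFlow(nn.Module):
        """Toy conditional flow: y = mu(x) + exp(log_sigma(x)) * z, z ~ N(0, I).

        A single conditional affine layer; real CNFs stack many invertible
        layers (e.g. coupling layers) whose parameters depend on the input x.
        """
        def __init__(self, x_dim, y_dim, hidden=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(x_dim, hidden), nn.ReLU(),
                nn.Linear(hidden, 2 * y_dim),
            )

        def log_prob(self, y, x):
            # Invert the flow: z = (y - mu(x)) / sigma(x)
            mu, log_sigma = self.net(x).chunk(2, dim=-1)
            z = (y - mu) * torch.exp(-log_sigma)
            # Change of variables: log p(y|x) = log p(z) + log |det dz/dy|
            base_logp = -0.5 * (z ** 2 + math.log(2 * math.pi)).sum(dim=-1)
            log_det = -log_sigma.sum(dim=-1)
            return base_logp + log_det

        def sample(self, x):
            # Forward direction: push base samples z through the conditional map.
            mu, log_sigma = self.net(x).chunk(2, dim=-1)
            z = torch.randn_like(mu)
            return mu + torch.exp(log_sigma) * z

    # Likelihood-based training: minimize the negative conditional log-likelihood.
    flow = ConditionalAffineFlow(x_dim=8, y_dim=4)
    opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
    x, y = torch.randn(32, 8), torch.randn(32, 4)  # dummy conditioning inputs and targets
    loss = -flow.log_prob(y, x).mean()
    loss.backward()
    opt.step()

Because the same network is both invertible and conditioned on x, sampling and exact likelihood evaluation use one set of parameters, which is what allows the likelihood-based training objective mentioned in the abstract.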

  1. @misc{WinklerWHW2019,
      title = {Learning Likelihoods with Conditional Normalizing Flows},
      author = {Winkler, Christina and Worrall, Daniel E. and Hoogeboom, Emiel and Welling, Max},
      year = {2019},
      eprint = {1912.00042},
      archiveprefix = {arXiv},
      primaryclass = {cs.LG}
    }