
Variational Mixtures of Normalizing Flows

Pires, G.; Figueiredo, M. A. T.

Variational Mixtures of Normalizing Flows, Proc. European Symp. on Artificial Neural Networks - ESANN, Bruges, Belgium, October 2020.


Abstract
In the past few years, deep generative models, such as generative adversarial networks, variational autoencoders, and their variants, have seen wide adoption for the task of modelling complex data distributions. In spite of the outstanding sample quality achieved by these methods, they model the target distributions only implicitly, in the sense that the probability density functions they approximate are not explicitly accessible. This renders them unfit for tasks that require, for example, scoring new instances of data under the learned distributions. Normalizing flows overcome this limitation by leveraging the change-of-variables formula for probability density functions, and by using transformations designed to have tractable and cheaply computable Jacobians. Although flexible, this framework lacked (until recently published work) a way to introduce discrete structure, such as that found in mixtures, into the models it allows one to construct in an unsupervised setting. The present work overcomes this by using normalizing flows as components in a mixture model, and by devising a training procedure for such a model. This procedure is based on variational inference and uses a variational posterior parameterized by a neural network. As will become clear, the model naturally lends itself to (multimodal) density estimation, semi-supervised learning, and clustering. The proposed model is evaluated on two synthetic datasets, as well as on a real-world dataset.
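For concreteness, the two ingredients named in the abstract can be written out. The change-of-variables formula for an invertible flow f with base density p_Z, and the variational lower bound for a mixture of K flow components p_k with weights pi_k and variational posterior q_phi(k | x), take the following form (a sketch in generic notation, not reproduced from the paper; f, p_Z, pi_k and q_phi are illustrative symbols):

\log p_X(x) = \log p_Z\big(f(x)\big) + \log \left| \det \frac{\partial f(x)}{\partial x} \right|

\log \sum_{k=1}^{K} \pi_k \, p_k(x) \;\ge\; \sum_{k=1}^{K} q_\phi(k \mid x) \Big[ \log \pi_k + \log p_k(x) - \log q_\phi(k \mid x) \Big]

The second line is the standard evidence lower bound (ELBO) obtained from Jensen's inequality for a discrete latent component index k; maximizing it jointly trains the flows, the mixture weights, and the posterior network.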
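The following is a minimal PyTorch sketch of that objective, not the authors' implementation: it uses simple elementwise affine flows as components and evaluates the expectation over the K components exactly rather than by sampling. All names (AffineFlow, VariationalMixtureOfFlows, the posterior architecture, learning rate) are illustrative assumptions.

# A minimal sketch (not the authors' code) of a variational mixture of
# normalizing flows: K elementwise affine flows as components, a small
# neural network for q(k | x), and an exact sum over the component index k.
import math
import torch
import torch.nn as nn

class AffineFlow(nn.Module):
    # One component: the invertible map z = exp(log_s) * x + t,
    # whose log|det Jacobian| is simply sum(log_s).
    def __init__(self, dim):
        super().__init__()
        self.log_s = nn.Parameter(torch.zeros(dim))
        self.t = nn.Parameter(torch.randn(dim))  # random shift breaks symmetry

    def log_prob(self, x):
        # Change of variables: log p_k(x) = log N(f_k(x); 0, I) + log|det|.
        z = torch.exp(self.log_s) * x + self.t
        log_pz = -0.5 * (z ** 2).sum(-1) - 0.5 * z.shape[-1] * math.log(2 * math.pi)
        return log_pz + self.log_s.sum()

class VariationalMixtureOfFlows(nn.Module):
    def __init__(self, dim, num_components):
        super().__init__()
        self.flows = nn.ModuleList([AffineFlow(dim) for _ in range(num_components)])
        self.mix_logits = nn.Parameter(torch.zeros(num_components))  # mixture weights pi_k
        self.posterior = nn.Sequential(  # variational posterior q(k | x)
            nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, num_components))

    def elbo(self, x):
        log_q = torch.log_softmax(self.posterior(x), dim=-1)   # (N, K)
        log_pi = torch.log_softmax(self.mix_logits, dim=-1)    # (K,)
        log_px = torch.stack([f.log_prob(x) for f in self.flows], dim=-1)  # (N, K)
        # E_{q(k|x)}[log pi_k + log p_k(x) - log q(k|x)], summed exactly over k.
        return (log_q.exp() * (log_pi + log_px - log_q)).sum(-1).mean()

# Usage: maximize the ELBO (minimize its negative) by gradient descent.
model = VariationalMixtureOfFlows(dim=2, num_components=3)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
data = torch.randn(512, 2)  # placeholder data; the paper uses synthetic and real datasets
for _ in range(200):
    loss = -model.elbo(data)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

In the paper's setting the components would be more expressive flow architectures than the affine maps used here, but the objective has the same shape; with many components or expensive flows, one would sample k from q(k | x) instead of enumerating all K terms.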