Modelling heterogeneous distributions with an Uncountable Mixture of Asymmetric Laplacians

Axel Brando, Jose Antonio Rodriguez-Serrano, Jordi Vitrià, Alberto Rubio

Research output: Article in indexed journal › Conference article › Peer-reviewed

15 Citations (Scopus)

Abstract

In regression tasks, aleatoric uncertainty is commonly addressed by considering a parametric distribution of the output variable, which relies on strong assumptions such as symmetry, unimodality or a restricted functional shape. These assumptions are too limited in scenarios where complex shapes, strong skews or multiple modes are present. In this paper, we propose a generic deep learning framework that learns an Uncountable Mixture of Asymmetric Laplacians (UMAL), which allows us to estimate heterogeneous distributions of the output variable, and we show its connections to quantile regression. Despite having a fixed number of parameters, the model can be interpreted as an infinite mixture of components, which yields a flexible approximation for heterogeneous distributions. Beyond synthetic cases, we apply this model to room price forecasting and to the prediction of financial operations in personal bank accounts. We demonstrate that UMAL produces proper distributions, which allows us to extract richer insights and to sharpen decision-making.
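As a rough illustration of the idea summarised above, the sketch below shows how an asymmetric-Laplacian likelihood conditioned on a sampled asymmetry level τ can be marginalised by Monte Carlo to form an implicit mixture. This is a minimal sketch assuming a PyTorch setup; the names (UMALNet, umal_nll), the network architecture and the hyperparameters are illustrative assumptions, not the authors' released code.

```python
import math
import torch
import torch.nn as nn


def ald_log_prob(y, mu, b, tau):
    """Log-density of the asymmetric Laplace distribution ALD(mu, b, tau)."""
    u = (y - mu) / b
    # Pinball (quantile) loss: rho_tau(u) = u * (tau - 1[u < 0])
    rho = u * (tau - (u < 0).float())
    return torch.log(tau * (1.0 - tau)) - torch.log(b) - rho


class UMALNet(nn.Module):
    """Maps (x, tau) to the parameters (mu, b) of one asymmetric Laplacian."""

    def __init__(self, x_dim, hidden=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(x_dim + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),
        )

    def forward(self, x, tau):
        out = self.body(torch.cat([x, tau], dim=-1))
        mu = out[..., :1]
        b = nn.functional.softplus(out[..., 1:]) + 1e-6  # strictly positive scale
        return mu, b


def umal_nll(model, x, y, n_taus=16):
    """Negative log-likelihood with tau marginalised by Monte Carlo:
    log p(y|x) ~= log-mean-exp over tau of log ALD(y; mu(x,tau), b(x,tau), tau).
    Expects x of shape (n, x_dim) and y of shape (n, 1)."""
    n = x.shape[0]
    # Replicate each input once per sampled asymmetry level tau ~ Uniform(0, 1)
    taus = torch.rand(n, n_taus, 1)
    x_rep = x.unsqueeze(1).expand(n, n_taus, x.shape[-1])
    y_rep = y.unsqueeze(1).expand(n, n_taus, 1)
    mu, b = model(x_rep, taus)
    log_p = ald_log_prob(y_rep, mu, b, taus)              # (n, n_taus, 1)
    log_mix = torch.logsumexp(log_p, dim=1) - math.log(n_taus)  # uniform mixture over taus
    return -log_mix.mean()


# Hypothetical usage on dummy data
model = UMALNet(x_dim=1)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(256, 1), torch.randn(256, 1)
loss = umal_nll(model, x, y)
loss.backward()
opt.step()
```

Because each sampled τ contributes one asymmetric Laplacian component, increasing the number of τ samples at evaluation time approximates the uncountable mixture and yields a flexible, possibly skewed or multimodal predictive distribution without adding parameters.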

Original language: English
Journal: Advances in Neural Information Processing Systems
Volume: 32
Publication status: Published - 2019
Published externally: Yes
Event: 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019 - Vancouver, Canada
Duration: 8 Dec 2019 - 14 Dec 2019
