Template-Type: ReDIF-Paper 1.0
Author-Name: Hoogerheide, L.F.
Author-Name-Last: Hoogerheide
Author-Name-First: Lennart
Author-Name: Kaashoek, J.F.
Author-Name-Last: Kaashoek
Author-Name-First: Johan
Author-Name: van Dijk, H.K.
Author-Name-Last: van Dijk
Author-Name-First: Herman
Author-Person: pva325
Title: Functional approximations to posterior densities: a neural network approach to efficient sampling
Abstract: The performance of Monte Carlo integration methods like importance sampling or Markov chain Monte Carlo procedures greatly depends on the choice of the importance or candidate density. Usually, such a density has to be "close" to the target density in order to yield numerically accurate results with efficient sampling. Neural networks seem to be natural importance or candidate densities, as they have a universal approximation property and are easy to sample from. That is, conditional upon the specification of the neural network, sampling can be done either directly or using a Gibbs sampling technique, possibly using auxiliary variables. A key step in the proposed class of methods is the construction of a neural network that approximates the target density accurately. The methods are tested on a set of illustrative models, which include a mixture of normal distributions, a Bayesian instrumental variable regression problem with weak instruments and near-identification, and a two-regime growth model for US recessions and expansions. These examples involve experiments with non-standard, non-elliptical posterior distributions. The results indicate the feasibility of the neural network approach.
Creation-Date: 2002-12-31
File-URL: https://repub.eur.nl/pub/1727/EI-2002-48.pdf
File-Format: application/pdf
Series: RePEc:ems:eureir
Number: EI 2002-48
Classification-JEL: C11, C15, C45
Keywords: Bayesian inference, Markov chain Monte Carlo, importance sampling, neural networks
Handle: RePEc:ems:eureir:1727