Dijk, H.K. van (Herman)
http://repub.eur.nl/ppl/263/
List of Publications
RePub, Erasmus University Repository

Interactions between Eurozone and US Booms and Busts: A Bayesian Panel Markov-switching VAR Model
http://repub.eur.nl/pub/50275/
Wed, 11 Sep 2013 00:00:01 GMT<div>Billio, M.</div><div>Casarin, R.</div><div>Ravazzolo, F.</div><div>Dijk, H.K. van</div>
Interactions between the eurozone and US booms and busts and among major eurozone economies are analyzed by introducing a panel Markov-switching VAR model well suited to multi-country cyclical analysis. The model accommodates changes in low and high data frequencies and endogenous time-varying transition matrices of the country-specific Markov chains. The transition matrix of each Markov chain depends on its own past history and on the history of the other chains, thus allowing the interactions between cycles to be modeled. An endogenous common eurozone cycle is derived by aggregating country-specific cycles. The model is estimated using a simulation-based Bayesian approach in which an efficient multi-move strategy algorithm is defined to draw common time-varying Markov-switching chains. Our results show that the US and eurozone cycles are not fully synchronized over the 1991-2013 sample period, with evidence of more recessions in the eurozone. Shocks affect the US one quarter in advance of the eurozone, but spread very rapidly among economies. An increase in the number of eurozone countries in recession increases the probability that the US stays in recession, while the US recession indicator has a negative impact on the probability that eurozone countries stay in recession. Turning point analysis shows that the cycles of Germany, France and Italy are closer to the US cycle than those of other countries. Belgium, Spain and Germany provide more timely information on the aggregate recession than the Netherlands and France.
Posterior-Predictive Evidence on US Inflation using Extended Phillips Curve Models with Non-filtered Data
http://repub.eur.nl/pub/40586/
Tue, 16 Jul 2013 00:00:01 GMT<div>Basturk, N.</div><div>Cakmakli, C.</div><div>Ceyhan, P.</div><div>Dijk, H.K. van</div>
Changing time series properties of US inflation and economic activity, measured as marginal costs, are modeled within a set of extended Phillips Curve (PC) models. It is shown that mechanical removal or modeling of simple low frequency movements in the data may yield poor predictive results which depend on the model specification used. Basic PC models are extended to include structural time series models that describe typical time-varying patterns in levels and volatilities. Forward-looking as well as backward-looking expectation mechanisms for inflation are incorporated and their relative importance is evaluated. Survey data on expected inflation are introduced to strengthen the information in the likelihood. Simulation-based Bayesian techniques are used for the empirical analysis. No credible evidence is found of endogeneity and long-run stability between inflation and marginal costs. Backward-looking inflation expectations appear stronger than forward-looking ones. Levels and volatilities of inflation are estimated more precisely using rich PC models. Estimated inflation expectations track the observed long-run inflation from the survey data closely. The extended PC structures compare favorably with existing basic Bayesian Vector Autoregressive and Stochastic Volatility models in terms of fit and prediction. Tails of the complete predictive distributions indicate an increase in the probability of disinflation in recent years.
Dynamic econometric modeling and forecasting in the presence of instability
http://repub.eur.nl/pub/40160/
Tue, 30 Apr 2013 00:00:01 GMT<div>Timmermann, A.</div><div>Dijk, H.K. van</div>
Parallel Sequential Monte Carlo for
Efficient Density Combination:
The Deco Matlab Toolbox
http://repub.eur.nl/pub/39840/
Mon, 08 Apr 2013 00:00:01 GMT<div>Casarin, R.</div><div>Grassi, S.</div><div>Ravazzolo, F.</div><div>Dijk, H.K. van</div>
This paper presents the Matlab package DeCo (Density Combination), which is based on the paper by Billio et al. (2013), where a constructive Bayesian approach is presented for combining predictive densities originating from different models or other sources of information. The combination weights are time-varying and may depend on past predictive performance and other learning mechanisms. The core algorithm is the function DeCo, which applies banks of parallel Sequential Monte Carlo algorithms to filter the time-varying combination weights. The DeCo procedure has been implemented both for standard CPU computing and for Graphics Processing Unit (GPU) parallel computing. For the GPU implementation we use the Matlab parallel computing toolbox and show how to use general-purpose GPU computing almost effortlessly. The GPU implementation speeds up execution by up to seventy times compared to a standard Matlab implementation on a multicore CPU. We show the use of the package and the computational gain of the GPU version through some simulation experiments and empirical applications.
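DeCo's actual weight filtering uses banks of parallel Sequential Monte Carlo algorithms; as a deliberately simplified sketch of the underlying idea (time-varying combination weights driven by past predictive performance), the following Python fragment updates the weights on two Gaussian predictive densities via exponentially discounted log scores. All data, predictive means and the discount factor are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two Gaussian predictive densities for the same series: model 1 is well
# specified (mean 0), model 2 is biased (mean 1.5); both have unit variance.
T = 200
y = rng.normal(0.0, 1.0, T)
means = np.column_stack([np.zeros(T), np.full(T, 1.5)])
sd = 1.0

log_score = np.zeros(2)          # discounted cumulative log predictive scores
lam = 0.95                       # discount factor: a simple learning mechanism
weights = np.zeros((T, 2))

for t in range(T):
    # Combination weights implied by past performance (softmax of scores)
    w = np.exp(log_score - log_score.max())
    w /= w.sum()
    weights[t] = w
    # Log predictive score of each model for the new observation
    ls = -0.5 * np.log(2 * np.pi * sd**2) - 0.5 * (y[t] - means[t]) ** 2 / sd**2
    log_score = lam * log_score + ls

print(weights[-1])   # nearly all weight on the well specified model
```

The discounting makes the weights responsive to recent performance, which is the kind of learning mechanism the toolbox implements in a far richer, filtered form.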
Genome-wide analysis of macrosatellite repeat copy number variation in worldwide populations: Evidence for differences and commonalities in size distributions and size restrictions
http://repub.eur.nl/pub/40840/
Mon, 04 Mar 2013 00:00:01 GMT<div>Schaap, M.</div><div>Lemmers, R.J.L.F.</div><div>Maassen, R.</div><div>Vliet, P.J. van der</div><div>Hoogerheide, L.F.</div><div>Dijk, H.K. van</div><div>Baştürk, N.</div><div>Knijff, P. de</div><div>Maarel, S.M. van der</div>
Background: Macrosatellite repeats (MSRs), usually spanning hundreds of kilobases of genomic DNA, comprise a significant proportion of the human genome. Because of their highly polymorphic nature, MSRs represent an extreme example of copy number variation, but their structure and function are largely understudied. Here, we describe a detailed study of six autosomal and two X chromosomal MSRs among 270 HapMap individuals from Central Europe, Asia and Africa. Copy number variation, stability and genetic heterogeneity of the autosomal macrosatellite repeats RS447 (chromosome 4p), MSR5p (5p), FLJ40296 (13q), RNU2 (17q) and D4Z4 (4q and 10q) and X chromosomal DXZ4 and CT47 were investigated. Results: Repeat array size distribution analysis shows that all of these MSRs are highly polymorphic, with the most genetic variation among Africans and the least among Asians. A mitotic mutation rate of 0.4-2.2% was observed, exceeding meiotic mutation rates and possibly explaining the large size variability found for these MSRs. By means of a novel Bayesian approach, statistical support for a distinct multimodal rather than a uniform allele size distribution was detected in seven out of eight MSRs, with evidence for equidistant intervals between the modes. Conclusions: The multimodal distributions with evidence for equidistant intervals, in combination with the observation of MSR-specific constraints on minimum array size, suggest that MSRs are limited in their configurations and that deviations thereof may cause disease, as is the case for facioscapulohumeral muscular dystrophy. However, at present we cannot exclude that there are mechanistic constraints for MSRs that are not directly disease-related. This study represents the first comprehensive study of MSRs in different human populations by applying novel statistical methods and identifies commonalities and differences in their organization and function in the human genome.
Evidence on features of a DSGE business cycle model from Bayesian model averaging
http://repub.eur.nl/pub/38911/
Fri, 01 Feb 2013 00:00:01 GMT<div>Strachan, R.W.</div><div>Dijk, H.K. van</div>
The empirical support for features of a Dynamic Stochastic General Equilibrium model with two technology shocks is evaluated using Bayesian model averaging over vector autoregressions. The model features include equilibria, restrictions on long-run responses, a structural break of unknown date, and a range of lags and deterministic processes. We find support for a number of features implied by the economic model, and the evidence suggests a break in the entire model structure around 1984, after which technology shocks appear to account for all stochastic trends. Business cycle volatility seems more due to investment-specific technology shocks than neutral technology shocks.
A class of adaptive importance sampling weighted EM algorithms for efficient and robust posterior and predictive simulation
http://repub.eur.nl/pub/37738/
Sat, 01 Dec 2012 00:00:01 GMT<div>Hoogerheide, L.F.</div><div>Opschoor, A.</div><div>Dijk, H.K. van</div>
A class of adaptive sampling methods is introduced for efficient posterior and predictive simulation. The proposed methods are robust in the sense that they can handle target distributions that exhibit non-elliptical shapes such as multimodality and skewness. The basic method makes use of sequences of importance weighted Expectation Maximization steps in order to efficiently construct a mixture of Student-t densities that accurately approximates the target distribution (typically a posterior distribution, of which we only require a kernel), in the sense that the Kullback-Leibler divergence between target and mixture is minimized. We label this approach Mixture of t by Importance Sampling weighted Expectation Maximization (MitISEM). The constructed mixture is used as a candidate density for quick and reliable application of either Importance Sampling (IS) or the Metropolis-Hastings (MH) method. We also introduce three extensions of the basic MitISEM approach. First, we propose a method for applying MitISEM in a sequential manner, so that the candidate distribution for posterior simulation is cleverly updated when new data become available. Our results show that the computational effort is reduced enormously, while the quality of the approximation remains almost unchanged. This sequential approach can be combined with a tempering approach, which facilitates simulation from densities with multiple modes that are far apart. Second, we introduce a permutation-augmented MitISEM approach. This is useful for importance or Metropolis-Hastings sampling from posterior distributions in mixture models without the requirement of imposing identification restrictions on the parameters of the model's mixture regimes. Third, we propose a partial MitISEM approach, which aims at approximating the joint distribution by estimating a product of marginal and conditional distributions. This division can substantially reduce the dimension of the approximation problem, which facilitates the application of adaptive importance sampling for posterior simulation in more complex models with larger numbers of parameters. Our results indicate that the proposed methods can substantially reduce the computational burden in econometric models like DCC or mixture GARCH models and a mixture instrumental variables model.
Posterior-Predictive Evidence on US Inflation using Phillips Curve Models with Non-Filtered Time Series
http://repub.eur.nl/pub/38747/
Sat, 01 Dec 2012 00:00:01 GMT<div>Basturk, N.</div><div>Cakmakli, C.</div><div>Ceyhan, P.</div><div>Dijk, H.K. van</div>
Changing time series properties of US inflation and economic activity are analyzed within a class of extended Phillips Curve (PC) models. First, the misspecification effects of mechanical removal of low frequency movements of these series on posterior inference of a basic PC model are analyzed using a Bayesian simulation-based approach. Next, structural time series models that describe changing patterns in low and high frequencies and backward-looking as well as forward-looking inflation expectation mechanisms are incorporated in the class of extended PC models. Empirical results indicate that the proposed models compare favorably with existing Bayesian Vector Autoregressive and Stochastic Volatility models in terms of fit and predictive performance. Weak identification and dynamic persistence appear less important when time-varying dynamics of high and low frequencies are carefully modeled. Modeling inflation expectations using survey data and adding level shifts and stochastic volatility substantially improves in-sample fit and out-of-sample predictions. No evidence is found of a stable long-run cointegration relation between US inflation and marginal costs. Tails of the complete predictive distributions indicate an increase in the probability of disinflation in recent years.
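The structural time series components mentioned above can be illustrated in their simplest form: a local level model separating a slowly moving level from high frequency noise, filtered with the Kalman recursions. This is a generic sketch, not the authors' specification; the variances and the simulated series below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate "inflation" as a random-walk low-frequency level plus noise
T = 300
level = np.cumsum(rng.normal(0, 0.05, T))      # slowly moving level
y = level + rng.normal(0, 0.5, T)              # observed series

# Kalman filter for the local level model:
#   y_t = mu_t + eps_t,   mu_t = mu_{t-1} + eta_t
q, r = 0.05**2, 0.5**2                         # state and observation variances
mu, P = 0.0, 1.0                               # initial state mean and variance
filtered = np.empty(T)
for t in range(T):
    P = P + q                                  # predict step
    K = P / (P + r)                            # Kalman gain
    mu = mu + K * (y[t] - mu)                  # update with new observation
    P = (1 - K) * P
    filtered[t] = mu

# The filtered level tracks the true low-frequency component more closely
# than the raw observations do.
print(np.mean((filtered - level) ** 2), np.mean((y - level) ** 2))
```

In the paper's richer models the level is one of several components, alongside time-varying volatility and expectation terms, but the filtering logic is of this type.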
Time-varying Combinations of Predictive Densities using Nonlinear Filtering
http://repub.eur.nl/pub/38198/
Mon, 29 Oct 2012 00:00:01 GMT<div>Billio, M.</div><div>Casarin, R.</div><div>Ravazzolo, F.</div><div>Dijk, H.K. van</div>
We propose a Bayesian combination approach for multivariate predictive densities which relies upon a distributional state space representation of the combination weights. Several specifications of multivariate time-varying weights are introduced with a particular focus on weight dynamics driven by the past performance of the predictive densities and the use of learning mechanisms. In the proposed approach the model set can be incomplete, meaning that all models can be individually misspecified. A Sequential Monte Carlo method is proposed to approximate the filtering and predictive densities. The combination approach is assessed using statistical and utility-based performance measures for evaluating density forecasts. Simulation results indicate that, for a set of linear autoregressive models, the combination strategy is successful in selecting, with probability close to one, the true model when the model set is complete and it is able to detect parameter instability when the model set includes the true model that has generated subsamples of data. For the macro series we find that incompleteness of the models is relatively large in the 70's, the beginning of the 80's and during the recent financial crisis, and lower during the Great Moderation. With respect to returns of the S&P 500 series, we find that an investment strategy using a combination of predictions from professional forecasters and from a white noise model puts more weight on the white noise model in the beginning of the 90's and switches to giving more weight to the professional forecasts over time.
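The Sequential Monte Carlo step can be illustrated in a much simpler setting than the paper's weight filtering: a standard bootstrap particle filter for a linear Gaussian state space model. The model, sample sizes and noise levels below are arbitrary choices for the sketch, not quantities from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# State space: x_t = 0.9 x_{t-1} + w_t,  y_t = x_t + v_t
T, N = 100, 2000
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + rng.normal(0, 0.3)
    y[t] = x[t] + rng.normal(0, 0.5)

# Bootstrap particle filter
particles = rng.normal(0, 1, N)
est = np.zeros(T)
for t in range(1, T):
    particles = 0.9 * particles + rng.normal(0, 0.3, N)   # propagate
    logw = -0.5 * (y[t] - particles) ** 2 / 0.5**2        # likelihood weights
    w = np.exp(logw - logw.max())
    w /= w.sum()
    est[t] = np.sum(w * particles)                        # filtered mean
    idx = rng.choice(N, N, p=w)                           # multinomial resampling
    particles = particles[idx]

print(np.mean((est - x) ** 2))   # small filtering error
```

In the combination approach, the latent state is the vector of time-varying weights rather than a scalar, but the propagate/weight/resample cycle is the same.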
Bayesian Analysis of Instrumental Variable Models: Acceptance-Rejection within Direct Monte Carlo
http://repub.eur.nl/pub/37314/
Fri, 21 Sep 2012 00:00:01 GMT<div>Zellner, A.</div><div>Ando, T.</div><div>Basturk, N.</div><div>Dijk, H.K. van</div>
We discuss Bayesian inferential procedures within the family of instrumental variables regression models and focus on two issues: existence conditions for posterior moments of the parameters of interest under a flat prior and the potential of Direct Monte Carlo (DMC) approaches for efficient evaluation of such possibly highly non-elliptical posteriors. We show that, for the general case of m endogenous variables under a flat prior, posterior moments of order r exist for the coefficients reflecting the endogenous regressors' effect on the dependent variable, if the number of instruments is greater than m+r, even though there is an issue of local non-identification that causes non-elliptical shapes of the posterior. This stresses the need for efficient Monte Carlo integration methods. We introduce an extension of DMC that incorporates an acceptance-rejection sampling step within DMC. This Acceptance-Rejection within Direct Monte Carlo (ARDMC) method has the attractive property that the generated random drawings are independent, which greatly helps the fast convergence of simulation results and facilitates the evaluation of the numerical accuracy. The speed of ARDMC can easily be improved further by making use of parallelized computation on multiple core machines or computer clusters. We note that ARDMC is an analogue to the well-known 'Metropolis-Hastings within Gibbs' sampling in the sense that one 'more difficult' step is used within an 'easier' simulation method. We compare the ARDMC approach with the Gibbs sampler using simulated data and two empirical data sets, involving the settler mortality instrument of Acemoglu et al. (2001) and the father's education instrument used by Hoogerheide et al. (2012a).
Even without making use of parallelized computation, an efficiency gain is observed both under strong and weak instruments, where the gain can be enormous in the latter case.
The R Package MitISEM: Mixture of Student-t Distributions using Importance Sampling Weighted Expectation Maximization for Efficient and Robust Simulation
http://repub.eur.nl/pub/37313/
Thu, 20 Sep 2012 00:00:01 GMT<div>Basturk, N.</div><div>Hoogerheide, L.F.</div><div>Opschoor, A.</div><div>Dijk, H.K. van</div>
This paper presents the R package MitISEM, which provides an automatic and flexible method to approximate a non-elliptical target density using adaptive mixtures of Student-t densities, where only a kernel of the target density is required. The approximation can be used as a candidate density in Importance Sampling or Metropolis-Hastings methods for Bayesian inference on model parameters and probabilities. The package also provides an extended MitISEM algorithm, 'sequential MitISEM', which substantially decreases the computational time when the target density has to be approximated for increasing data samples. This occurs when the posterior distribution is updated with new observations and/or when one computes model probabilities using predictive likelihoods. We illustrate the MitISEM algorithm using three canonical statistical and econometric models that are characterized by several types of non-elliptical posterior shapes and that describe well-known data patterns in econometrics and finance. We show that the candidate distribution obtained by MitISEM outperforms those obtained by 'naive' approximations in terms of numerical efficiency. Further, the MitISEM approach can be used for Bayesian model comparison, using the predictive likelihoods.
Combination schemes for turning point predictions
http://repub.eur.nl/pub/37707/
Thu, 13 Sep 2012 00:00:01 GMT<div>Billio, M.</div><div>Casarin, R.</div><div>Ravazzolo, F.</div><div>Dijk, H.K. van</div>
We propose new forecast combination schemes for predicting turning points of business cycles. The proposed combination schemes are based on the forecasting performances of a given set of models with the aim of providing better turning point predictions. In particular, we consider predictions generated by autoregressive (AR) and Markov-switching AR models, which are commonly used for business cycle analysis. In order to account for parameter uncertainty we consider a Bayesian approach for both estimation and prediction and compare, in terms of statistical accuracy, the individual models and the combined turning point predictions for the United States and the Euro area business cycles.
Evidence on Features of a DSGE Business Cycle Model from Bayesian Model Averaging
http://repub.eur.nl/pub/32101/
Tue, 20 Mar 2012 00:00:01 GMT<div>Strachan, R.W.</div><div>Dijk, H.K. van</div>
The empirical support for features of a Dynamic Stochastic General Equilibrium model with two technology shocks is evaluated using Bayesian model averaging over vector autoregressions. The model features include equilibria, restrictions on long-run responses, a structural break of unknown date and a range of lags and deterministic processes. We find support for a number of features implied by the economic model and the evidence suggests a break in the entire model structure around 1984, after which technology shocks appear to account for all stochastic trends. Business cycle volatility seems more due to investment-specific technology shocks than neutral technology shocks.
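At its core, Bayesian model averaging weights model-specific predictions by posterior model probabilities derived from marginal likelihoods. A minimal sketch of that mechanism, using two simple Gaussian models with known parameters rather than the paper's vector autoregressions (all numbers are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Data generated from model 1 (mean 0); model 2 posits mean 1
y = rng.normal(0.0, 1.0, 50)
mus = np.array([0.0, 1.0])

# Log marginal likelihood of each (simple, fixed-parameter) model under N(mu, 1)
logml = np.array([
    -0.5 * len(y) * np.log(2 * np.pi) - 0.5 * np.sum((y - m) ** 2) for m in mus
])

# Posterior model probabilities under equal prior odds
p = np.exp(logml - logml.max())
p /= p.sum()

# Model-averaged one-step-ahead predictive mean
pred = np.sum(p * mus)
print(p, pred)
```

With unknown parameters the marginal likelihoods require integrating over the parameter priors, which is where the paper's averaging over vector autoregressions becomes computationally demanding.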
Forecast rationality tests based on multi-horizon bounds: Comment
http://repub.eur.nl/pub/32010/
Sun, 01 Jan 2012 00:00:01 GMT<div>Hoogerheide, L.F.</div><div>Ravazzolo, F.</div><div>Dijk, H.K. van</div>
Combining Predictive Densities using Nonlinear Filtering with Applications to US Economics Data
http://repub.eur.nl/pub/30684/
Wed, 30 Nov 2011 00:00:01 GMT<div>Billio, M.</div><div>Casarin, R.</div><div>Ravazzolo, F.</div><div>Dijk, H.K. van</div>
We propose a multivariate combination approach to prediction based on a distributional state space representation of the weights belonging to a set of Bayesian predictive densities which have been obtained from alternative models. Several specifications of multivariate time-varying weights are introduced with a particular focus on weight dynamics driven by the past performance of the predictive densities and the use of learning mechanisms. In the proposed approach the model set can be incomplete, meaning that all models are individually misspecified. The approach is assessed using statistical and utility-based performance measures for evaluating density forecasts of US macroeconomic time series and surveys of stock market prices. For the macro series we find that incompleteness of the models is relatively large in the 70's, the beginning of the 80's and during the recent financial crisis; structural changes like the Great Moderation are empirically identified by our model combination and the predicted probabilities of recession compare accurately with the NBER business cycle dating. Model weights have substantial uncertainty attached and neglecting this may seriously affect results. With respect to returns of the S&P 500 series, we find that an investment strategy using a combination of predictions from professional forecasters and from a white noise model puts more weight on the white noise model in the beginning of the 90's and switches to giving more weight to the left tail of the professional forecasts during the start of the financial crisis around 2008.
Instrumental Variables, Errors in Variables, and Simultaneous Equations Models: Applicability and Limitations of Direct Monte Carlo
http://repub.eur.nl/pub/26507/
Tue, 27 Sep 2011 00:00:01 GMT<div>Zellner, A.</div><div>Ando, T.</div><div>Basturk, N.</div><div>Hoogerheide, L.F.</div><div>Dijk, H.K. van</div>
A Direct Monte Carlo (DMC) approach is introduced for posterior simulation in the Instrumental Variables (IV) model with one possibly endogenous regressor, multiple instruments and Gaussian errors under a flat prior. This DMC method can also be applied in an IV model (with one or multiple instruments) under an informative prior for the endogenous regressor's effect. This DMC approach cannot be applied to more complex IV models or Simultaneous Equations Models with multiple endogenous regressors. An Approximate DMC (ADMC) approach is introduced that makes use of the proposed Hybrid Mixture Sampling (HMS) method, which facilitates Metropolis-Hastings (MH) or Importance Sampling from a proper marginal posterior density with highly non-elliptical shapes that tend to infinity for a point of singularity. After one has simulated from the irregularly shaped marginal distribution using the HMS method, one easily samples the other parameters from their conditional Student-t and Inverse-Wishart posteriors. An example illustrates the close approximation and high MH acceptance rate, while using a simple candidate distribution such as the Student-t may lead to an infinite variance of the Importance Sampling weights. The choice between the IV model and a simple linear model under the restriction of exogeneity may be based on predictive likelihoods, for which the efficient simulation of all model parameters may be quite useful. In future work the ADMC approach may be extended to more extensive IV models such as IV with non-Gaussian errors, panel IV, or probit/logit IV.
Backtesting Value-at-Risk using Forecasts for Multiple Horizons, a Comment on the Forecast Rationality Tests of A.J. Patton and A. Timmermann
http://repub.eur.nl/pub/26505/
Thu, 01 Sep 2011 00:00:01 GMT<div>Hoogerheide, L.F.</div><div>Ravazzolo, F.</div><div>Dijk, H.K. van</div>
Patton and Timmermann (2011, 'Forecast Rationality Tests Based on Multi-Horizon Bounds', Journal of Business & Economic Statistics, forthcoming) propose a set of useful tests for forecast rationality or optimality under squared error loss, including an easily implemented test based on a regression that only involves (long-horizon and short-horizon) forecasts and no observations on the target variable. We propose an extension, a simulation-based procedure that takes into account the presence of errors in parameter estimates. This procedure can also be applied in the field of 'backtesting' models for Value-at-Risk. Applications to simple AR and ARCH time series models show that its power in detecting certain misspecifications is larger than the power of well-known tests for correct Unconditional Coverage and Conditional Coverage.
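The multi-horizon bound exploited by Patton and Timmermann states that, under squared error loss, the mean squared error of an optimal forecast is weakly increasing in the horizon. A toy simulation for an AR(1) process (an illustration of the bound, not of the paper's simulation-based procedure) makes this concrete:

```python
import numpy as np

rng = np.random.default_rng(4)

# AR(1): y_t = 0.8 y_{t-1} + e_t; the optimal h-step forecast is 0.8**h * y_t
phi, T = 0.8, 20000
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + rng.normal()

def mse(h):
    f = phi**h * y[:-h]              # optimal h-step-ahead forecasts
    return np.mean((y[h:] - f) ** 2)

print(mse(1), mse(4))   # MSE grows with the horizon for optimal forecasts
```

A forecast whose MSE decreases with the horizon would violate rationality under squared error loss, which is the kind of misspecification the regression-based tests and the proposed simulation-based extension are designed to detect.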