Econometrics, Volume 5, Issue 4 (December 2017) – 11 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link, and use the free Adobe Reader to open them.
Article
Time-Varying Window Length for Correlation Forecasts
by Yoontae Jeon and Thomas H. McCurdy
Econometrics 2017, 5(4), 54; https://0-doi-org.brum.beds.ac.uk/10.3390/econometrics5040054 - 11 Dec 2017
Cited by 2 | Viewed by 7899
Abstract
Forecasting correlations between stocks and commodities is important for diversification across asset classes and other risk management decisions. Correlation forecasts are affected by model uncertainty, the sources of which can include uncertainty about changing fundamentals and associated parameters (model instability), structural breaks and nonlinearities due, for example, to regime switching. We use approaches that weight historical data according to their predictive content. Specifically, we estimate two alternative models, ‘time-varying weights’ and ‘time-varying window’, in order to maximize the value of past data for forecasting. Our empirical analyses reveal that these approaches provide superior forecasts to several benchmark models for forecasting correlations. Full article
(This article belongs to the Special Issue Volatility Modeling)
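As a rough sketch of the idea of adapting the estimation window to the data (not the authors' estimator: the window-selection rule, candidate windows and simulated series below are assumptions), the following Python snippet picks, at each date, the rolling-window length whose past correlation forecasts were closest to subsequently realized correlations.

```python
import numpy as np

def rolling_corr(x, y, window, t):
    """Sample correlation of the last `window` observations ending at t-1."""
    xs, ys = x[t - window:t], y[t - window:t]
    return np.corrcoef(xs, ys)[0, 1]

def time_varying_window_forecast(x, y, candidate_windows=(22, 66, 132, 264),
                                 eval_span=66, realized_span=22):
    """At each date t, choose the window whose past forecasts were closest (in
    squared error) to subsequently realized short-run correlations, then
    forecast with that window. Purely an illustrative selection rule."""
    T = len(x)
    start = max(candidate_windows) + eval_span + realized_span
    forecasts, chosen = [], []
    for t in range(start, T - realized_span):
        losses = []
        for w in candidate_windows:
            errs = []
            for s in range(t - eval_span, t):
                f = rolling_corr(x, y, w, s)                       # forecast made at s
                realized = rolling_corr(x, y, realized_span, s + realized_span)
                errs.append((f - realized) ** 2)
            losses.append(np.mean(errs))
        best = candidate_windows[int(np.argmin(losses))]
        chosen.append(best)
        forecasts.append(rolling_corr(x, y, best, t))
    return np.array(forecasts), np.array(chosen)

# Toy usage with simulated returns (assumed data; replace with stock/commodity returns)
rng = np.random.default_rng(0)
x = rng.standard_normal(800)
y = 0.3 * x + rng.standard_normal(800)
fc, win = time_varying_window_forecast(x, y)
print(fc[:3], win[:3])
```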

Article
Reducing Approximation Error in the Fourier Flexible Functional Form
by Tristan D. Skolrud
Econometrics 2017, 5(4), 53; https://0-doi-org.brum.beds.ac.uk/10.3390/econometrics5040053 - 04 Dec 2017
Cited by 1 | Viewed by 6882
Abstract
The Fourier Flexible form provides a global approximation to an unknown data generating process. In terms of limiting function specification error, this form is preferable to functional forms based on second-order Taylor series expansions. The Fourier Flexible form is a truncated Fourier series expansion appended to a second-order expansion in logarithms. By replacing the logarithmic expansion with a Box-Cox transformation, we show that the Fourier Flexible form can reduce approximation error by 25% on average in the tails of the data distribution. The new functional form allows for nested testing of a larger set of commonly implemented functional forms. Full article
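A minimal sketch of the functional form discussed here, under assumptions: a second-order expansion in a Box-Cox transform of the regressor (replacing the usual logarithms) is appended to a truncated Fourier series and fitted by least squares to a toy, univariate data generating process of our own choosing.

```python
import numpy as np

def box_cox(z, lam):
    """Box-Cox transform; reduces to log(z) as lam -> 0."""
    return np.log(z) if abs(lam) < 1e-8 else (z**lam - 1.0) / lam

def fourier_flexible_design(z, lam, n_freq=3):
    """Regressor matrix: second-order expansion in Box-Cox(z) plus a
    truncated Fourier series in z rescaled to (0, 2*pi)."""
    v = box_cox(z, lam)
    s = 2.0 * np.pi * (z - z.min()) / (z.max() - z.min() + 1e-12)
    cols = [np.ones_like(z), v, v**2]
    for j in range(1, n_freq + 1):
        cols += [np.cos(j * s), np.sin(j * s)]
    return np.column_stack(cols)

# Toy target: an unknown cost-type function (assumed for illustration only)
rng = np.random.default_rng(1)
z = np.sort(rng.uniform(0.5, 5.0, 400))
y = np.sqrt(z) + 0.3 * np.sin(2.0 * z) + 0.05 * rng.standard_normal(400)

for lam in (0.0, 0.5):                      # lam = 0 is the usual log-based form
    X = fourier_flexible_design(z, lam)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rmse = np.sqrt(np.mean((X @ beta - y) ** 2))
    print(f"lambda={lam:.1f}  in-sample RMSE={rmse:.4f}")
```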

Article
Synthetic Control and Inference
by Jinyong Hahn and Ruoyao Shi
Econometrics 2017, 5(4), 52; https://0-doi-org.brum.beds.ac.uk/10.3390/econometrics5040052 - 28 Nov 2017
Cited by 45 | Viewed by 15201
Abstract
We examine properties of permutation tests in the context of synthetic control. Permutation tests are frequently used methods of inference for synthetic control when the number of potential control units is small. We analyze the permutation tests from a repeated sampling perspective and show that the size of permutation tests may be distorted. Several alternative methods are discussed. Full article
(This article belongs to the Special Issue Recent Developments in Panel Data Methods)
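The kind of permutation ("placebo") test analyzed in the paper can be sketched as follows; the synthetic unit here is a simplified non-negative least-squares weighting of the donor pool rather than the full synthetic control estimator, and the panel is simulated, so all names and numbers are assumptions.

```python
import numpy as np
from scipy.optimize import nnls

def synthetic_gap(Y, unit, T0):
    """Gap between a unit and its 'synthetic' counterpart built from the other
    units, using non-negative pre-period least-squares weights (a simplified
    stand-in for the full synthetic control estimator)."""
    donors = [i for i in range(Y.shape[0]) if i != unit]
    w, _ = nnls(Y[donors, :T0].T, Y[unit, :T0])
    if w.sum() > 0:
        w = w / w.sum()                      # normalize to a weighted average
    return Y[unit] - w @ Y[donors]

def rmspe_ratio(gap, T0):
    pre = np.sqrt(np.mean(gap[:T0] ** 2))
    post = np.sqrt(np.mean(gap[T0:] ** 2))
    return post / max(pre, 1e-12)

def permutation_pvalue(Y, treated, T0):
    """In-space placebo test: rank the treated unit's post/pre RMSPE ratio
    among the ratios obtained by pretending each control unit was treated."""
    stats = np.array([rmspe_ratio(synthetic_gap(Y, u, T0), T0)
                      for u in range(Y.shape[0])])
    return np.mean(stats >= stats[treated]), stats

# Toy panel: 20 units, 40 periods, unit 0 "treated" at T0 = 30 (simulated data)
rng = np.random.default_rng(2)
Y = np.cumsum(rng.standard_normal((20, 40)) * 0.2, axis=1) + rng.normal(0, 1, (20, 1))
Y[0, 30:] += 1.5                             # add a treatment effect
pval, _ = permutation_pvalue(Y, treated=0, T0=30)
print("placebo permutation p-value:", pval)
```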
Article
Formula I(1) and I(2): Race Tracks for Likelihood Maximization Algorithms of I(1) and I(2) Cointegrated VAR Models
by Jurgen A. Doornik, Rocco Mosconi and Paolo Paruolo
Econometrics 2017, 5(4), 49; https://0-doi-org.brum.beds.ac.uk/10.3390/econometrics5040049 - 20 Nov 2017
Viewed by 8933
Abstract
This paper provides some test cases, called circuits, for the evaluation of Gaussian likelihood maximization algorithms of the cointegrated vector autoregressive model. Both I(1) and I(2) models are considered. The performance of algorithms is compared first in terms of effectiveness, defined as the ability to find the overall maximum. The next step is to compare their efficiency and reliability across experiments. The aim of the paper is to commence a collective learning project by the profession on the actual properties of algorithms for cointegrated vector autoregressive model estimation, in order to improve their quality and, as a consequence, also the reliability of empirical research. Full article
(This article belongs to the Special Issue Recent Developments in Cointegration)
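A minimal sketch of the kind of "race" the paper proposes, under assumptions: several general-purpose optimizers are run from many random starting values on a toy multimodal surface with a known optimum (not the paper's I(1)/I(2) likelihood circuits), and effectiveness is scored as the share of starts from which the overall optimum is found.

```python
import numpy as np
from scipy.optimize import minimize

# A toy multimodal objective standing in for a (negated) log-likelihood surface;
# the cointegrated VAR likelihoods of the paper's test cases are not reproduced.
def neg_loglik(theta):
    x, y = theta
    return 0.1 * (x**2 + y**2) - np.cos(2 * x) * np.cos(2 * y)

GLOBAL_MIN = neg_loglik(np.zeros(2))          # known optimum of the toy surface

def race(methods=("Nelder-Mead", "BFGS", "Powell"), n_starts=50, tol=1e-4):
    """Effectiveness: share of random starts from which each algorithm reaches
    the overall optimum. Efficiency: average number of function evaluations."""
    rng = np.random.default_rng(3)
    starts = rng.uniform(-4, 4, size=(n_starts, 2))
    for method in methods:
        hits, nfev = 0, 0
        for s in starts:
            res = minimize(neg_loglik, s, method=method)
            hits += res.fun < GLOBAL_MIN + tol
            nfev += res.nfev
        print(f"{method:12s}  effectiveness={hits / n_starts:.2f}  "
              f"avg. function evals={nfev / n_starts:.0f}")

race()
```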

Article
Business Time Sampling Scheme with Applications to Testing Semi-Martingale Hypothesis and Estimating Integrated Volatility
by Yingjie Dong and Yiu-Kuen Tse
Econometrics 2017, 5(4), 51; https://0-doi-org.brum.beds.ac.uk/10.3390/econometrics5040051 - 13 Nov 2017
Cited by 2 | Viewed by 6870
Abstract
We propose a new method to implement the Business Time Sampling (BTS) scheme for high-frequency financial data. We compute a time-transformation (TT) function using the intraday integrated volatility estimated by a jump-robust method. The BTS transactions are obtained using the inverse of the TT function. Using our sampled BTS transactions, we test the semi-martingale hypothesis of the stock log-price process and estimate the daily realized volatility. Our method improves the normality approximation of the standardized business-time return distribution. Our Monte Carlo results show that the integrated volatility estimates using our proposed sampling strategy provide a smaller root mean-squared error. Full article
(This article belongs to the Special Issue Volatility Modeling)
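A minimal sketch of business time sampling under assumptions: cumulative squared returns stand in for the jump-robust intraday integrated volatility estimator used in the paper, the cumulative curve is normalized to give the time-transformation (TT) function, and sampling points are obtained by inverting it at equally spaced business-time levels.

```python
import numpy as np

def business_time_sampling(times, prices, n_samples):
    """Sample observations so that (estimated) integrated volatility is roughly
    equal across sampling intervals. Cumulative squared returns are a crude
    proxy for integrated volatility; the paper uses a jump-robust estimator."""
    logp = np.log(prices)
    cumvol = np.concatenate([[0.0], np.cumsum(np.diff(logp) ** 2)])
    tt = cumvol / cumvol[-1]                      # time-transformation function on [0, 1]
    targets = np.linspace(0.0, 1.0, n_samples + 1)
    # Invert the TT function: for each target business-time level, take the
    # first calendar observation at which that level is reached.
    idx = np.searchsorted(tt, targets, side="left")
    idx = np.clip(idx, 0, len(times) - 1)
    return np.unique(idx)

# Toy intraday path: calm first half, volatile second half (simulated data)
rng = np.random.default_rng(4)
sig = np.where(np.arange(2000) < 1000, 0.0005, 0.002)
prices = 100 * np.exp(np.cumsum(sig * rng.standard_normal(2000)))
times = np.arange(2000)
idx = business_time_sampling(times, prices, n_samples=78)
print("first few BTS sampling points:", times[idx][:10])
print("spacing is tighter in the volatile half:",
      np.diff(times[idx])[:5], "...", np.diff(times[idx])[-5:])
```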

Article
Inequality and Poverty When Effort Matters
by Martin Ravallion
Econometrics 2017, 5(4), 50; https://0-doi-org.brum.beds.ac.uk/10.3390/econometrics5040050 - 06 Nov 2017
Cited by 5 | Viewed by 9426
Abstract
On the presumption that poorer people tend to work less, it is often claimed that standard measures of inequality and poverty are overestimates. The paper points to a number of reasons to question this claim. It is shown that, while the labor supplies of American adults have a positive income gradient, the heterogeneity in labor supplies generates considerable horizontal inequality. Using equivalent incomes to adjust for effort can reveal either higher or lower inequality depending on the measurement assumptions. With only a modest allowance for leisure as a basic need, the effort-adjusted poverty rate in terms of equivalent incomes rises. Full article
(This article belongs to the Special Issue Econometrics and Income Inequality)
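As a stylized illustration only (not the paper's measurement framework; the wage and hours distributions, the leisure-need allowance and the valuation rule below are all assumptions), the snippet compares a standard headcount poverty rate with one computed on equivalent incomes that value deviations of leisure from a basic leisure need at the wage. As the abstract stresses, whether the adjusted measure is higher or lower depends on such assumptions.

```python
import numpy as np

# Stylized comparison of a standard headcount poverty rate with one based on
# equivalent incomes. All numbers and the valuation rule are illustrative
# assumptions, not the paper's framework.
rng = np.random.default_rng(5)
n = 10_000
wage = np.exp(rng.normal(2.5, 0.6, n))                               # hourly wage (assumed)
hours = np.clip(rng.normal(35 + 3 * np.log(wage), 10, n), 0, 80)     # positive income gradient
income = wage * hours
leisure = 112 - hours                                                # weekly hours net of sleep (assumed)
leisure_need = 72                                                    # modest basic leisure allowance (assumed)

poverty_line = np.quantile(income, 0.20)                             # illustrative relative line
equiv_income = income + wage * (leisure - leisure_need)              # value leisure gaps at the wage

print("standard headcount:       ", np.mean(income < poverty_line))
print("effort-adjusted headcount:", np.mean(equiv_income < poverty_line))
```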

Article
Do Seasonal Adjustments Induce Noncausal Dynamics in Inflation Rates?
by Alain Hecq, Sean Telg and Lenard Lieb
Econometrics 2017, 5(4), 48; https://0-doi-org.brum.beds.ac.uk/10.3390/econometrics5040048 - 31 Oct 2017
Cited by 9 | Viewed by 7676
Abstract
This paper investigates the effect of seasonal adjustment filters on the identification of mixed causal-noncausal autoregressive models. By means of Monte Carlo simulations, we find that standard seasonal filters induce spurious autoregressive dynamics on white noise series, a phenomenon already documented in the literature. Using a symmetric argument, we show that those filters also generate a spurious noncausal component in the seasonally adjusted series, but preserve (although amplify) the existence of causal and noncausal relationships. This result has important implications for modelling economic time series driven by expectation relationships. We consider inflation data on the G7 countries to illustrate these results. Full article
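A minimal Monte Carlo sketch in the spirit of the first finding, under assumptions: a symmetric two-sided moving-average filter stands in for standard seasonal-adjustment filters, and a simple OLS AR(1) coefficient shows that the filtered white noise acquires spurious autoregressive dynamics.

```python
import numpy as np

def seasonal_ma_filter(x, period=12):
    """Symmetric two-sided moving-average filter (a simple linear stand-in for
    standard seasonal-adjustment filters such as X-11)."""
    w = np.ones(period + 1)
    w[0] = w[-1] = 0.5
    w /= w.sum()
    trend = np.convolve(x, w, mode="valid")       # centred 2x12 moving average
    return x[period // 2: period // 2 + len(trend)] - trend

def ar1_coef(x):
    """OLS estimate of a first-order autoregressive coefficient."""
    y, ylag = x[1:], x[:-1]
    return np.dot(ylag, y) / np.dot(ylag, ylag)

rng = np.random.default_rng(6)
coefs_raw, coefs_filtered = [], []
for _ in range(500):                              # Monte Carlo replications
    eps = rng.standard_normal(400)                # white noise: no dynamics at all
    coefs_raw.append(ar1_coef(eps))
    coefs_filtered.append(ar1_coef(seasonal_ma_filter(eps)))

print("mean AR(1) coefficient, raw white noise:    ", round(np.mean(coefs_raw), 3))
print("mean AR(1) coefficient, seasonally adjusted:", round(np.mean(coefs_filtered), 3))
```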

Article
Bayesian Analysis of Bubbles in Asset Prices
by Andras Fulop and Jun Yu
Econometrics 2017, 5(4), 47; https://0-doi-org.brum.beds.ac.uk/10.3390/econometrics5040047 - 23 Oct 2017
Cited by 5 | Viewed by 7975
Abstract
We develop a new model where the dynamic structure of the asset price, after the fundamental value is removed, is subject to two different regimes. One regime reflects the normal period where the asset price divided by the dividend is assumed to follow a mean-reverting process around a stochastic long run mean. The second regime reflects the bubble period with explosive behavior. Stochastic switches between the two regimes and non-constant probabilities of exit from the bubble regime are both allowed. A Bayesian learning approach is employed to jointly estimate the latent states and the model parameters in real time. An important feature of our Bayesian method is that we are able to deal with parameter uncertainty and, at the same time, to learn about the states and the parameters sequentially, allowing for real time model analysis. This feature is particularly useful for market surveillance. Analysis using simulated data reveals that our method has good power properties for detecting bubbles. Empirical analysis using price-dividend ratios of the S&P 500 highlights the advantages of our method. Full article
(This article belongs to the Special Issue Celebrated Econometricians: Peter Phillips)
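A minimal simulation sketch of the two-regime structure described here (parameter values are assumptions, and the paper's Bayesian learning of states and parameters is not reproduced): the log price-dividend ratio mean-reverts around a long-run level in the normal regime and is mildly explosive in the bubble regime, with Markov switching between the two.

```python
import numpy as np

def simulate_two_regime_pd(T=500, rho_normal=0.95, rho_bubble=1.03,
                           mu=3.0, p_enter=0.02, p_exit=0.10, sigma=0.05, seed=7):
    """Simulate a log price-dividend ratio that mean-reverts around mu in the
    normal regime and is mildly explosive in the bubble regime, with Markov
    switching between regimes. All parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    x = np.empty(T)
    regime = np.empty(T, dtype=int)                # 0 = normal, 1 = bubble
    x[0], regime[0] = mu, 0
    for t in range(1, T):
        if regime[t - 1] == 0:
            regime[t] = int(rng.random() < p_enter)
        else:
            regime[t] = int(rng.random() >= p_exit)
        if regime[t] == 0:
            x[t] = mu + rho_normal * (x[t - 1] - mu) + sigma * rng.standard_normal()
        else:
            x[t] = rho_bubble * x[t - 1] + sigma * rng.standard_normal()
    return x, regime

x, regime = simulate_two_regime_pd()
print("share of periods in the bubble regime:", round(regime.mean(), 3))
print("max log price-dividend ratio:", round(x.max(), 2))
```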

Editorial
An Interview with William A. Barnett
by Apostolos Serletis
Econometrics 2017, 5(4), 45; https://0-doi-org.brum.beds.ac.uk/10.3390/econometrics5040045 - 17 Oct 2017
Cited by 1 | Viewed by 7922
Abstract
William (Bill) Barnett is an eminent econometrician and macroeconomist. [...] Full article
Article
Non-Causality Due to Included Variables
by Umberto Triacca
Econometrics 2017, 5(4), 46; https://0-doi-org.brum.beds.ac.uk/10.3390/econometrics5040046 - 15 Oct 2017
Cited by 2 | Viewed by 5805
Abstract
The contribution of this paper is to investigate a particular form of lack of invariance of causality statements to changes in the conditioning information sets. Consider a discrete-time three-dimensional stochastic process z = (x, y₁, y₂). We want to study causality relationships between the variables in y = (y₁, y₂) and x. Suppose that, in a bivariate framework, we find that y₁ Granger causes x and y₂ Granger causes x, but these relationships vanish when the analysis is conducted in a trivariate framework. Thus, the causal links established in a bivariate setting seem to be spurious. Is this conclusion always correct? In this note, we show that the causal links in the bivariate framework might well not be ‘genuinely’ spurious: they could be reflecting causality from the vector y to x. Paradoxically, in this case, it is the non-causality in the trivariate system that is misleading. Full article
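The mechanics of comparing bivariate and trivariate Granger-causality conclusions can be sketched as follows; the simulated system is an arbitrary assumption that does not reproduce the paper's analytical construction, so the snippet only shows how the individual and joint tests are run on the two information sets.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Simulate an arbitrary three-variable system (an assumed example, not the
# paper's construction) and compare causality conclusions across bivariate
# and trivariate information sets.
rng = np.random.default_rng(8)
T = 600
y1, y2, x = np.zeros(T), np.zeros(T), np.zeros(T)
for t in range(1, T):
    y1[t] = 0.5 * y1[t - 1] + rng.standard_normal()
    y2[t] = 0.5 * y2[t - 1] + rng.standard_normal()
    x[t] = 0.3 * x[t - 1] + 0.4 * y1[t - 1] - 0.4 * y2[t - 1] + rng.standard_normal()
df = pd.DataFrame({"x": x, "y1": y1, "y2": y2})

# Bivariate information sets: does y1 (resp. y2) Granger-cause x?
for col in ("y1", "y2"):
    res = VAR(df[["x", col]]).fit(2)
    print(f"bivariate, {col} -> x: p =",
          round(res.test_causality("x", causing=[col], kind="f").pvalue, 4))

# Trivariate information set: individual and joint causality from y = (y1, y2)
res3 = VAR(df).fit(2)
for causing in (["y1"], ["y2"], ["y1", "y2"]):
    print(f"trivariate, {'+'.join(causing)} -> x: p =",
          round(res3.test_causality("x", causing=causing, kind="f").pvalue, 4))
```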
Article
Twenty-Two Years of Inflation Assessment and Forecasting Experience at the Bulletin of EU & US Inflation and Macroeconomic Analysis
by Antoni Espasa and Eva Senra
Econometrics 2017, 5(4), 44; https://0-doi-org.brum.beds.ac.uk/10.3390/econometrics5040044 - 06 Oct 2017
Cited by 1 | Viewed by 7705
Abstract
The Bulletin of EU & US Inflation and Macroeconomic Analysis (BIAM) is a monthly publication that has been reporting real-time analysis and forecasts for inflation and other macroeconomic aggregates for the Euro Area, the US and Spain since 1994. The BIAM inflation forecasting methodology rests on working with useful disaggregation schemes, using leading indicators when possible and applying outlier correction. The paper relates this methodology to corresponding topics in the literature and discusses the design of disaggregation schemes. It concludes that those schemes would be useful if they were formulated according to economic, institutional and statistical criteria aiming to end up with a set of components with very different statistical properties for which valid single-equation models could be built. The BIAM assessment, which derives from a new observation, is based on (a) an evaluation of the forecasting errors (innovations) at the components’ level, which provides information on the sectors they come from and allows, when required, for appropriate corrections in the specific models, and (b) an update of the path forecast with its corresponding fan chart. Finally, we show that BIAM real-time Euro Area inflation forecasts compare successfully with the consensus from the ECB Survey of Professional Forecasters, one and two years ahead. Full article
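A minimal sketch of the disaggregated ("bottom-up") forecasting idea, under assumptions: three simulated components with very different statistical properties are forecast with simple single-equation AR(1) models, the component forecasts are aggregated with fixed weights, and the result is compared with a direct forecast of the aggregate. The component names, weights and series are illustrative only.

```python
import numpy as np

def ar1_forecast(x, horizon=1):
    """Iterated forecast from an AR(1) with intercept, estimated by OLS."""
    y, ylag = x[1:], x[:-1]
    X = np.column_stack([np.ones_like(ylag), ylag])
    c, rho = np.linalg.lstsq(X, y, rcond=None)[0]
    f = x[-1]
    for _ in range(horizon):
        f = c + rho * f
    return f

# Toy monthly inflation components with different persistence (assumed series
# and weights, standing in for, e.g., core, energy and food components).
rng = np.random.default_rng(9)
T = 240
core = 0.15 + np.cumsum(rng.standard_normal(T)) * 0.01               # smooth, persistent
energy = 0.2 + rng.standard_normal(T) * 0.6                          # volatile, little memory
food = 0.1 + 0.5 * np.sin(np.arange(T) / 6) + rng.standard_normal(T) * 0.2
weights = np.array([0.7, 0.1, 0.2])
components = np.vstack([core, energy, food])
headline = weights @ components

# Direct forecast of the aggregate vs. aggregation of component forecasts
direct = ar1_forecast(headline)
bottom_up = weights @ np.array([ar1_forecast(c) for c in components])
print("direct aggregate forecast:      ", round(direct, 3))
print("aggregated component forecasts: ", round(bottom_up, 3))
```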
