Scholarly Works, Economics
Research articles, presentations, and other scholarship
Browsing Scholarly Works, Economics by Content Type "Working paper"
Now showing 1 - 8 of 8
- Beyond Optimal Forecasting. Ashley, Richard A. (Virginia Tech, 2006-11-04). While the conditional mean is known to provide the minimum mean square error (MSE) forecast – and hence is optimal under a squared-error loss function – it must often in practice be replaced by a noisy estimate when model parameters are estimated over a small sample. Here two results are obtained, both of which motivate the use of forecasts biased toward zero (shrinkage forecasts) in such settings. First, the noisy forecast with minimum MSE is shown to be a shrinkage forecast. Second, a condition is derived under which a shrinkage forecast stochastically dominates the unbiased forecast over the class of loss functions monotonic in the forecast error magnitude. The appropriate amount of shrinkage from either perspective depends on a noisiness parameter which must be estimated, however, so the actual reduction in expected losses from shrinkage forecasting is an empirical issue. Simulation results over forecasts from a large variety of multiple regression models indicate that feasible shrinkage forecasts typically do provide modest improvements in forecast MSE when the noise in the estimate of the conditional mean is substantial.
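The shrinkage idea in this abstract can be illustrated with a small simulation. This is only a sketch, not the paper's procedure: the variance values, the shrinkage factor k, and the Gaussian setup are all assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mse(forecasts, outcomes):
    """Mean squared forecast error."""
    return np.mean((forecasts - outcomes) ** 2)

# Many replications of forecasting a zero-mean target: the outcome is
# y = mu + u and the feasible forecast is f = mu + v, where v is the
# estimation noise in the fitted conditional mean (all variances assumed).
n = 100_000
mu = rng.normal(0.0, 1.0, n)       # true conditional mean across replications
y = mu + rng.normal(0.0, 1.0, n)   # realized outcomes
f = mu + rng.normal(0.0, 1.0, n)   # noisy but unbiased forecasts

# Shrinkage toward zero: with these variances the MSE-optimal factor is
# k = var(mu) / (var(mu) + var(v)) = 0.5.
k = 0.5
print(mse(f, y), mse(k * f, y))    # shrinkage lowers the MSE here
```

With substantial estimation noise, as here, the shrunken forecast beats the unbiased one; as the noise variance shrinks, the optimal k approaches 1 and the gain disappears.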
- An Elementary Method for Detecting and Modeling Regression Parameter Variation Across Frequencies With an Application to Testing the Permanent Income Hypothesis. Boon, Tan Hui; Ashley, Richard A. (Virginia Tech, 1997-03). A simple technique for directly testing the parameters of a time series regression model for instability across frequencies is presented. The method can be easily implemented in the time domain, so parameter instability across frequency bands can be conveniently detected and modeled in conjunction with other econometric features of the problem at hand, such as simultaneity, cointegration, missing observations, cross-equation restrictions, etc. The usefulness of the new technique is illustrated with an application to a cointegrated consumption-income regression model, yielding a straightforward test of the permanent income hypothesis.
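A loose time-domain analogue of this idea can be sketched as follows. This is not the paper's procedure: here the regressor is simply split into a moving-average "low-frequency" component and its remainder, and the window width, coefficients, and data-generating process are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500
# Regressor with both slow drift and high-frequency variation (illustrative).
x = rng.normal(size=T).cumsum() * 0.1 + rng.normal(size=T)

# Split x into a low-frequency component (centered moving average) and the
# remainder; the two components sum back to x, so the regression below is a
# pure reparameterization of the original one.
w = 9
kernel = np.ones(w) / w
x_low = np.convolve(x, kernel, mode="same")
x_high = x - x_low

# Generate y with the SAME coefficient at all frequencies (null is true).
beta = 2.0
y = beta * x + rng.normal(size=T)

# Regress y on the two components separately; a test of equality of the two
# slope coefficients is then a crude test of stability across frequency bands.
X = np.column_stack([np.ones(T), x_low, x_high])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # both slope estimates should be near beta = 2.0
```

Under parameter stability the two band coefficients estimate the same value; a significant difference between them would signal the kind of frequency dependence the paper tests for.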
- Identification of Coefficients in a Quadratic Moving Average Process Using the Generalized Method of Moments. Ashley, Richard A.; Patterson, Douglas M. (Virginia Tech, 2002-06-21). The output of a causal, stable, time-invariant nonlinear filter can be approximately represented by the linear and quadratic terms of a finite parameter Volterra series expansion. We call this representation the “quadratic nonlinear MA model” since it is the logical extension of the usual linear MA process. Where the actual generating mechanism for the data is fairly smooth, this quadratic MA model should provide a better approximation to the true dynamics than the two-state threshold autoregression and Markov switching models usually considered. As with linear MA processes, the nonlinear MA model coefficients can be estimated via least squares fitting, but it is essential to begin with a reasonably parsimonious model identification and non-arbitrary preliminary estimates for the parameters. In linear ARMA modeling these are derived from the sample correlogram and the sample partial correlogram, but these tools are confounded by nonlinearity in the generating mechanism. Here we obtain analytic expressions for the second and third order moments – the autocovariances and third order cumulants – of a quadratic MA process driven by i.i.d. symmetric innovations. These expressions allow us to identify the significant coefficients in the process by using GMM to obtain preliminary coefficient estimates and their concomitant estimated standard errors. The utility of the method for specifying nonlinear time series models is illustrated using artificially generated data.
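A minimal quadratic MA process can be simulated and its low-order moments checked directly. This sketch is not the paper's GMM identification step; the specific process, its coefficients, and the two moment formulas (obtained by direct expansion under symmetric i.i.d. innovations) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical quadratic MA process (illustrative coefficients):
#   y_t = e_t + theta * e_{t-1} + phi * e_{t-1} * e_{t-2},  e_t ~ i.i.d. N(0, 1)
theta, phi = 0.5, 0.6
n = 200_000
e = rng.normal(size=n + 2)
y = e[2:] + theta * e[1:-1] + phi * e[1:-1] * e[:-2]

# Moments implied by direct expansion, using E[e] = E[e^3] = 0, E[e^2] = 1:
#   E[y_t y_{t-1}]         = theta   (first autocovariance: linear part only)
#   E[y_t y_{t-1} y_{t-2}] = phi     (a third-order cumulant: quadratic part)
gamma1 = np.mean(y[1:] * y[:-1])
c3 = np.mean(y[2:] * y[1:-1] * y[:-2])
print(gamma1, c3)  # approximately 0.5 and 0.6
```

The point mirrors the abstract: the autocovariances pin down the linear coefficients, while third-order cumulants, which vanish for a linear Gaussian process, are what identify the quadratic ones.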
- International Evidence On The Oil Price-Real Output Relationship: Does Persistence Matter? Ashley, Richard A.; Tsang, Kwok Ping (Virginia Tech, 2013-08-28). The literature on the relationship between real output growth and the growth rate in the price of oil, including an allowance for asymmetry in the impact of oil prices on output, continues to evolve. Here we show that a new technique, which allows us to control for both this asymmetry and also for the persistence of oil price changes, yields results implying that such control is necessary for a statistically adequate specification of the relationship. The new technique also yields an estimated model for the relationship which is more economically interpretable. In particular, using quarterly data from 1976–2007 on each of six countries which are essentially net oil importers, we find that changes in the growth rate of oil prices which persist for more than four years have a large and statistically significant impact on future output growth, whereas less persistent changes (lasting more than one year but less than four years) have no significant impact on output growth. In contrast, ‘temporary’ fluctuations in the oil price growth rate – persisting for only a year or less – again have a large and statistically significant impact on output growth for most of these countries. The results for the single major net oil producer in our sample (Norway) are distinct in an interesting way.
- Non-nested Model Selection/Validation: Making Credible Postsample Inference Feasible. Ashley, Richard A. (Virginia Tech, 1995-04). Effective, credible inference with respect to the postsample forecasting performance of time series models is widely held to be infeasible. Consequently, the model selection and Granger-causality literatures have focused almost exclusively on in-sample tests, which can easily be biased by typical specification-search activity. Indeed, the postsample error series generated by competing models are typically cross-correlated, serially correlated, and not even clearly Gaussian; thus, postsample inference procedures are necessarily only asymptotically valid. As a result, a postsample period large enough to yield credible inferences is perceived to be too costly in terms of sample observations foregone. This paper describes a new, re-sampling based, approach to postsample inference which, by explicitly quantifying the inferential uncertainty caused by the limited length of the postsample period, makes it feasible to obtain credible postsample inferences using postsample periods of reasonable length. For a given target level of inferential precision – e.g., significance at the 5% level – this new approach also provides explicit estimates of both how strong the postsample forecasting efficiency evidence in favor of one of two models must be (for a given length postsample period) and how long a postsample period is necessary, if the evidence is of given strength. These results indicate that postsample model validation periods substantially longer than the 5 to 20 periods typically reserved in past studies are necessary in order to credibly detect 20%–30% MSE reductions. This approach also quantifies the inferential impact of different forecasting efficiency criterion choices – e.g., MSE vs. MAE vs. asymmetric criteria and the use of expected loss differentials (as in Diebold and Mariano (1994)) vs. ratios of expected losses.
The value of this new approach to postsample inference is illustrated using postsample forecasting error data from Ashley, Granger, and Schmalensee (1980), in which evidence was presented for unidirectional Granger-causation from fluctuations in aggregate U.S. consumption expenditures to fluctuations in U.S. aggregate expenditures on advertising.
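The flavor of resampling-based postsample inference can be sketched with a percentile bootstrap of the mean loss differential. This is a deliberate simplification: the error series, sample sizes, and i.i.d. resampling are assumptions, and the i.i.d. bootstrap ignores exactly the serial correlation and cross-correlation that the paper's procedure is designed to handle.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical postsample forecast errors from two competing models
# (illustrative data; model A is genuinely more accurate here).
n_post = 40
err_a = rng.normal(0.0, 1.0, n_post)
err_b = rng.normal(0.0, 1.3, n_post)

# Loss differential d_t = L(e_B) - L(e_A) under squared-error loss.
d = err_b ** 2 - err_a ** 2

# Percentile bootstrap of the mean loss differential: resample the
# differential series with replacement and ask how often the mean is <= 0.
B = 5000
means = np.array([rng.choice(d, size=n_post, replace=True).mean() for _ in range(B)])
p_value = np.mean(means <= 0.0)  # one-sided: evidence that model A beats model B
print(d.mean(), p_value)
```

Even with a genuine 30% accuracy gap, 40 postsample observations often yield an ambiguous p-value, which is the paper's point about how long a credible validation period needs to be.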
- A Reconsideration of Consistent Estimation of a Dynamic Panel Data Model in the Random Effects (Error Components) Framework. Ashley, Richard A. (Virginia Tech, 2010-04-19). It is widely believed that the inclusion of lagged dependent variables in a panel data model necessarily renders the Random Effects (RE) estimators, based on OLS applied to the quasi-differenced variables, inconsistent. It is shown here that this belief is incorrect under the usual assumption made in this context — i.e., that the other regressors are strictly exogenous. This result follows from the fact that lagged values of the deviation of the quasi-differenced dependent variable from its mean can be written as a weighted sum of the past values of the quasi-differenced model error term, and these quasi-differenced errors are serially uncorrelated by construction. The RE estimators are therefore consistent. Thus, since instrumental variables methods — e.g., Arellano and Bond (1991) — clearly provide less precise estimates, the RE estimates are preferable if a Hausman test is unable to reject the null hypothesis that the parameter estimates of interest from both methods are equal.
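The quasi-differencing transformation the abstract relies on can be verified numerically. This sketch assumes a balanced panel with known variance components (in practice they are estimated) and only checks the textbook property that quasi-demeaned composite errors are serially uncorrelated within units.

```python
import numpy as np

# Random-effects quasi-differencing (quasi-demeaning) for a balanced panel:
#   y*_it = y_it - lam * ybar_i,  lam = 1 - sqrt(sig_e2 / (sig_e2 + T * sig_u2))
def quasi_demean(x, lam):
    """x: (N, T) panel array; subtract lam times each unit's time mean."""
    return x - lam * x.mean(axis=1, keepdims=True)

rng = np.random.default_rng(4)
N, T = 500, 6
sig_u2, sig_e2 = 1.0, 1.0                        # assumed variance components
lam = 1.0 - np.sqrt(sig_e2 / (sig_e2 + T * sig_u2))

u = rng.normal(0.0, np.sqrt(sig_u2), (N, 1))     # unit-specific effects
eps = rng.normal(0.0, np.sqrt(sig_e2), (N, T))   # idiosyncratic errors
v = u + eps                                      # composite RE error

v_star = quasi_demean(v, lam)
# After quasi-demeaning with this lam, the composite errors are serially
# uncorrelated within units, the property the abstract's argument rests on.
within_corr = np.corrcoef(v_star[:, 0], v_star[:, 1])[0, 1]
print(lam, within_corr)  # within-unit correlation near zero
```

Because the transformed errors are serially uncorrelated by construction, lagged quasi-demeaned dependent variables are uncorrelated with the contemporaneous transformed error, which is the mechanism behind the consistency claim.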
- Sensitivity Analysis of OLS Multiple Regression Inference with Respect to Possible Linear Endogeneity in the Explanatory Variables. Ashley, Richard A.; Parmeter, Christopher F. (Virginia Tech, 2019-06-17). This work describes a versatile sensitivity analysis of OLS hypothesis test rejection p-values with respect to possible endogeneity in the explanatory variables of the usual k-variate linear multiple regression model which practitioners can readily deploy in their research. This sensitivity analysis is based on a derivation of the asymptotic distribution of the OLS parameter estimator, but extended in a particularly straightforward way to the case where some or all of the explanatory variables are endogenous to a specified degree — that is, where the population covariances of the explanatory variables with the model errors are given. In exchange for restricting attention to possible endogeneity which is solely linear in nature, no additional model assumptions must be made, beyond the usual ones for a model with stochastic regressors. In addition, we also use simulation methods to quantify the uncertainty in the sensitivity analysis results introduced by replacing the population variance-covariance matrix by its sample estimate. The usefulness of the analysis — as a 'screen' for potential endogeneity issues — is illustrated with an example from the empirical growth literature.
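The core mechanics, positing a covariance between regressor and error and computing the implied OLS bias, can be sketched in the one-regressor case. This is not the paper's full p-value sensitivity analysis; the data-generating process, coefficient values, and posited covariance are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
# Illustrative data: one regressor correlated with the model error.
n = 100_000
gamma_true = 0.4                       # Cov(x, eps): the degree of endogeneity
z = rng.normal(size=n)
eps = rng.normal(size=n)
x = z + gamma_true * eps               # Var(x) = 1 + gamma^2, Cov(x, eps) = gamma
beta = 1.5
y = beta * x + eps

# OLS is inconsistent here: plim(b) = beta + Cov(x, eps) / Var(x).
b_ols = np.cov(x, y)[0, 1] / np.var(x)

# Sensitivity analysis idea: posit a value for Cov(x, eps) and back out the
# implied bias-corrected estimate; scanning over posited covariances shows
# how fragile the OLS inference is to linear endogeneity of each degree.
gamma_posited = 0.4
b_corrected = b_ols - gamma_posited / np.var(x)
print(b_ols, b_corrected)  # b_ols biased upward; corrected value near beta = 1.5
```

Repeating the correction over a grid of posited covariances, and tracking when the hypothesis test flips, is the 'screen' the abstract describes, here reduced to its simplest scalar form.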
- Strategyproof Choice of Acts: Beyond Dictatorship. Bahel, Eric A.; Sprumont, Yves (Virginia Tech, 2017-05-20). We model uncertain social prospects as acts mapping states of nature to (public) outcomes. A social choice function (or SCF) assigns an act to every profile of subjective expected utility preferences over acts. A SCF is strategyproof if no agent ever has an incentive to misrepresent her beliefs about the states of nature or her valuation of the outcomes; it is ex-post efficient if the act selected at any given preference profile picks a Pareto-efficient outcome in every state of nature. We offer a complete characterization of all strategyproof and ex-post efficient SCFs. The chosen act must pick the most preferred outcome of some (possibly different) agent in every state of nature. The set of states in which an agent's top outcome is selected may vary with the reported belief profile; it is the union of all the states assigned to her by a collection of constant, bilaterally dictatorial, or bilaterally consensual assignment rules.