- Philosophy of Econometrics. Spanos, Aris (Routledge, 2021-10-12). The preceding quotation from Einstein’s reply to Robert Thornton, a young philosopher of science who began teaching physics at the university level in 1944, encapsulates succinctly the importance of examining the methodology, history, and philosophical foundations of different scientific fields to avoid missing the forest for the trees. The field of interest in the discussion that follows is modern econometrics, whose roots can be traced back to the early 20th century. The problem of induction, in the sense of justifying an inference from particular instances to realizations yet to be observed, has been bedeviling the philosophy of science since Hume’s discourse on the problem. Modern statistical inference, as a form of induction, is based on data that exhibit inherent chance regularity patterns. Model-based statistical induction differs from other forms of induction, such as induction by enumeration, in three crucial respects.
- Fortune Tellers: The Story of America's First Economic Forecasters [Book review]. Spanos, Aris (Cambridge University Press, 2015-11-12).
- Frequentist Probability. Spanos, Aris (2017). The primary objective of this article is to discuss a model-based frequentist interpretation that identifies the probability of an event with the limit of its relative frequency of occurrence. What differentiates the proposed interpretation from the traditional ones are several key features: (i) events and probabilities are defined in the context of a statistical model, (ii) it is anchored on the strong law of large numbers, (iii) it is justified on empirical grounds by validating the model assumptions vis-à-vis data, (iv) the “long-run” metaphor can be rendered operational by simple simulation based on the model, and (v) the link between probability and real-world phenomena is provided by viewing data as a “truly typical” realization of the stochastic mechanism defined by the model. This link constitutes a feature shared with the Kolmogorov complexity algorithmic perspective on probability, which provides a further justification for the proposed frequentist interpretation.
- The Pre-Eminence of Theory Versus the European CVAR Perspective in Macroeconometric Modeling. Spanos, Aris (Elsevier, 2008-01-01). The primary aim of the paper is to place current methodological discussions on empirical modeling contrasting the 'theory first' versus the 'data first' perspectives in the context of a broader methodological framework with a view to constructively appraise them. In particular, the paper focuses on Colander's argument in his paper 'Economists, Incentives, Judgement and Empirical Work' relating to the two different perspectives in Europe and the US that are currently dominating empirical macro-econometric modeling and delves deeper into their methodological/philosophical foundations. It is argued that the key to establishing a constructive dialogue between them is provided by a better understanding of the role of data in modern statistical inference, and how that relates to the centuries-old issue of the realisticness of economic theories.
- Testing for Structural Breaks and other forms of Non-stationarity: a Misspecification Perspective. Heracleous, Maria S.; Koutris, Andreas; Spanos, Aris (2008). In the 1980s and 1990s the issue of non-stationarity in economic time series has been discussed in the context of unit roots vs. mean trends in AR(p) models. More recently this perspective has been extended to include structural breaks. In this paper we take a much broader perspective by viewing the problem of changing parameters as one of misspecification testing due to the nonstationarity of the underlying process. The proposed misspecification testing procedure relies on resampling techniques to enhance the informational content of the observed data in an attempt to capture heterogeneity ‘locally’ using rolling window estimators of the primary moments of the stochastic process. The effectiveness of the testing procedure is assessed using extensive Monte Carlo simulations.
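The rolling-window idea in the abstract above can be illustrated with a minimal sketch (not the authors' actual procedure): estimate the first two moments of a series over a moving window and watch them drift when the process is nonstationary. The window size and the simulated mean shift are illustrative choices, not taken from the paper.

```python
import numpy as np

def rolling_moments(x, window):
    """Rolling-window estimates of the mean and variance of a series."""
    n = len(x)
    means = np.array([x[i:i + window].mean() for i in range(n - window + 1)])
    variances = np.array([x[i:i + window].var(ddof=1) for i in range(n - window + 1)])
    return means, variances

rng = np.random.default_rng(0)
# A series with a mean shift halfway through: a simple structural break
x = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(2.0, 1.0, 500)])
means, variances = rolling_moments(x, window=100)
# The rolling mean drifts from near 0 to near 2 as the window crosses the break
print(means[0], means[-1])
```

A full misspecification test would compare such local estimates against the constancy implied by the maintained model; the sketch only shows the ingredient the abstract describes.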
- Propagation of shocks in an input-output economy: Evidence from disaggregated prices. Luo, Shaowen; Villar, Daniel (Elsevier, 2023-07-01). Using disaggregated industry-level data, this paper empirically evaluates predictions for the cross-sectional price change distribution made by input-output models with sticky prices. The response of prices to shocks is found to be consistent with the price sensitivities predicted by the input-output model. Moreover, moments of the sectoral price change distribution vary over time in response to the evolution of the network structure. Finally, through a quantitative analysis, demand and supply shocks are disentangled during the pandemic period. Counterfactual analyses show that sectoral supply shocks, aggregate demand shocks and the production network structure contributed significantly to the inflation surge in 2021–2022.
- The price adjustment hazard function: Evidence from high inflation periods. Luo, Shaowen; Villar, Daniel (Elsevier, 2021-09-01). The price adjustment hazard function — the probability of a good's price changing as a function of its price misalignment — enables the examination of the relationship between price stickiness and monetary non-neutrality without specifying a micro-founded model, as discussed by Caballero and Engel (1993a, 2007). Using the micro data underlying the U.S. Consumer Price Index going back to the 1970s, we estimate the hazard function relying on empirical patterns from high and low inflation periods. We find that the relation between inflation and higher moments of the price change distribution is particularly informative for the shape of the hazard function. Our estimated hazard function is relatively flat with positive values at zero. It implies weak price selection and a high degree of monetary non-neutrality: about 60% of the degree implied by the Calvo model, and much higher than what menu cost models imply. In addition, our estimated function is asymmetric: price increases are considerably more likely to occur than price decreases of the same magnitude.
- The Skewness of the Price Change Distribution: A New Touchstone for Sticky Price Models. Luo, Shaowen; Villar, Daniel (Wiley, 2021-02-01). We present a new way of empirically evaluating various sticky price models that are used to assess the degree of monetary nonneutrality. While menu cost models uniformly predict that price change skewness and dispersion fall with inflation, in the Calvo model, both rise. However, the U.S. Consumer Price Index (CPI) data from the late 1970s onward show that skewness does not fall with inflation, while dispersion does. We present a random menu cost model that, with a menu cost distribution that has a strong Calvo flavor, can match the empirical patterns. The model exhibits much more monetary nonneutrality than existing menu cost models.
- A New Perspective on Impartial and Unbiased Apportionment. Hyman, Ross; Tideman, Nicolaus (Taylor & Francis, 2023-08-17). How to fairly apportion congressional seats to states has been debated for centuries. We present an alternative perspective on apportionment, centered not on states but on “families” of states: sets of states whose “divisor-method” quotas share the same integer part. We develop “impartial” and “unbiased” apportionment methods. Impartial methods apportion the same number of seats to families of states containing the same total population, whether a family consists of many small-population states or a few large-population states. Unbiased methods apportion seats so that if states are drawn repeatedly from the same distribution, the expected number of seats apportioned to each family equals the expected divisor-method quota for that family.
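For readers unfamiliar with divisor methods, a generic sketch may help (this is the standard D'Hondt/Jefferson highest-quotients rule, not the impartial or unbiased methods the paper proposes; the populations are made-up numbers).

```python
def divisor_apportionment(populations, seats, divisor):
    """Apportion seats by a divisor method: award each seat to the
    state with the largest remaining quotient pop / divisor(k)."""
    quotients = []
    for state, pop in populations.items():
        for k in range(seats):
            quotients.append((pop / divisor(k), state))
    quotients.sort(reverse=True)
    allocation = {state: 0 for state in populations}
    for _, state in quotients[:seats]:
        allocation[state] += 1
    return allocation

# Jefferson/D'Hondt divisors: 1, 2, 3, ...
dhondt = lambda k: k + 1
pops = {"A": 53000, "B": 24000, "C": 23000}
print(divisor_apportionment(pops, 10, dhondt))  # A: 6, B: 2, C: 2
```

Other divisor methods (Webster/Sainte-Laguë, etc.) differ only in the `divisor` sequence; the paper's family-based quotas build on exactly this kind of quotient.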
- Choosing Among the Variety of Proposed Voting Reforms. Tideman, Nicolaus (Springer, 2023-12-05). A wide variety of voting reforms are offered for consideration in this special issue. This paper draws connections among them and identifies the beliefs that make particular proposals more attractive than others.
- Revisiting the Large n (Sample Size) Problem: How to Avert Spurious Significance Results. Spanos, Aris (MDPI, 2023-12-05). Although large data sets are generally viewed as advantageous for their ability to provide more precise and reliable evidence, it is often overlooked that these benefits are contingent upon certain conditions being met. The primary condition is the approximate validity (statistical adequacy) of the probabilistic assumptions comprising the statistical model Mθ(x) applied to the data. In the case of a statistically adequate Mθ(x) and a given significance level α, as n increases, the power of a test increases, and the p-value decreases due to the inherent trade-off between type I and type II error probabilities in frequentist testing. This trade-off raises concerns about the reliability of declaring ‘statistical significance’ based on conventional significance levels when n is exceptionally large. To address this issue, the author proposes that a principled approach, in the form of post-data severity (SEV) evaluation, be employed. The SEV evaluation represents a post-data error probability that converts unduly data-specific ‘accept/reject H0 results’ into evidence either supporting or contradicting inferential claims regarding the parameters of interest. This approach offers a more nuanced and robust perspective in navigating the challenges posed by the large n problem.
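The large n problem described above is easy to reproduce: with a true effect that is substantively negligible but not exactly zero, a two-sided z-test's p-value collapses toward zero as n grows. This simulation is a generic illustration of the phenomenon, not the paper's SEV procedure; the effect size 0.02 and sample sizes are arbitrary choices.

```python
import math
import numpy as np

def z_test_pvalue(sample, mu0=0.0):
    """Two-sided z-test p-value for H0: mean = mu0, with estimated variance."""
    n = len(sample)
    z = (sample.mean() - mu0) / (sample.std(ddof=1) / math.sqrt(n))
    return math.erfc(abs(z) / math.sqrt(2.0))

rng = np.random.default_rng(42)
# True mean is 0.02: substantively negligible, yet not exactly the null value 0
for n in (100, 10_000, 1_000_000):
    p = z_test_pvalue(rng.normal(0.02, 1.0, n))
    print(n, p)  # the p-value shrinks as n grows, declaring 'significance'
```

A severity evaluation would ask, post-data, which discrepancies from the null are actually warranted by the data, rather than stopping at the vanishing p-value.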
- How the Post-Data Severity Converts Testing Results into Evidence for or against Pertinent Inferential Claims. Spanos, Aris (MDPI, 2024-01-22). The paper makes a case that the current discussions on replicability and the abuse of significance testing have overlooked a more general contributor to the untrustworthiness of published empirical evidence, which is the uninformed and recipe-like implementation of statistical modeling and inference. It is argued that this contributes to the untrustworthiness problem in several different ways, including [a] statistical misspecification, [b] unwarranted evidential interpretations of frequentist inference results, and [c] questionable modeling strategies that rely on curve-fitting. What is more, the alternative proposals to replace or modify frequentist testing, including [i] replacing p-values with observed confidence intervals and effect sizes, and [ii] redefining statistical significance, will not address the untrustworthiness of evidence problem since they are equally vulnerable to [a]–[c]. The paper calls for distinguishing between unduly data-dependent ‘statistical results’, such as a point estimate, a p-value, and accept/reject H0, and ‘evidence for or against inferential claims’. The post-data severity (SEV) evaluation of the accept/reject H0 results converts them into evidence for or against germane inferential claims. These claims can be used to address/elucidate several foundational issues, including (i) statistical vs. substantive significance, (ii) the large n problem, and (iii) the replicability of evidence. Also, the SEV perspective sheds light on the impertinence of the proposed alternatives [i]–[ii], and oppugns the alleged arbitrariness of framing H0 and H1, which is often exploited to undermine the credibility of frequentist testing.
- Women’s labour market participation and intimate partner violence in Ghana: A multilevel analysis. Owusu-Brown, Bernice (Virginia Tech, 2023-09-14). In recent decades, the capabilities approach has emerged as the most pertinent theoretical framework for elucidating development, well-being, and justice. By emphasizing the multifaceted nature of human well-being, the capability approach advocates a broader perspective of development beyond mere economic growth. It underscores the necessity of considering various dimensions that contribute to the enhancement of human lives by assigning importance to freedom. One prevalent form of freedom violation is intimate partner violence, which stems from historically unequal power dynamics between men and women, resulting in the subjugation and discrimination of women by men and hindering the full realization of their potential. This profound restriction of freedom not only violates their fundamental human rights but also jeopardizes their health and, consequently, obstructs their active engagement in national economic and social development. The capability approach prescribes women’s empowerment as a remedy for curbing violence, as reflected in both conventional economic and non-economic models. These models forecast that women's engagement in the labor market enhances their bargaining power, leading to a decrease in intimate partner violence. In conflict with these, however, are more pessimistic models suggesting that women who earn more than their partners via their labor market participation are at risk of experiencing increased partner violence. Conscious of this bi-causal relationship and accounting for the potential endogeneity, I set out to empirically investigate the direction of association of this relationship within the Ghanaian context. My key finding indicates that a woman's work status significantly increases her likelihood of becoming a victim of partner violence. I conclude that while there is a growing focus on creating job opportunities for women to foster gender equality and development, it is essential to consider and address the implications this may have on their safety and well-being.
- Matching markets with middlemen under transferable utility. Atay, Ata; Bahel, Eric; Solymosi, Tamas (Springer, 2023-03). This paper studies matching markets in the presence of middlemen. In our framework, a buyer-seller pair may either trade directly or use the services of a middleman; and a middleman may serve multiple buyer-seller pairs. For each such market, we examine the associated TU game. We first show that, in our context, an optimal matching can be obtained by considering the two-sided assignment market where each buyer-seller pair is allowed to use the mediation services of any middleman free of charge. Second, we prove that matching markets with middlemen are totally balanced: in particular, we show the existence of a buyer-optimal (seller-optimal) core allocation where each buyer (seller) receives her marginal contribution to the grand coalition. In general, the core does not exhibit a middleman-optimal allocation, not even when there are only two buyers and two sellers. However, we prove that in these small markets the maximum core payoff to each middleman is her marginal contribution. Finally, we establish the coincidence between the core and the set of competitive equilibrium payoff vectors.
- Rationally Inattentive Statistical Discrimination: Arrow Meets Phelps. Echenique, Federico; Li, Anqi (2022-12). When information acquisition is costly but flexible, a principal may rationally acquire information that favors a “majority” group over “minorities” unless the latter are strictly more productive than the former (the relative size of the groups plays no actual role). Majorities therefore face incentives to invest in being productive to the principal, whereas minorities are discouraged from such investments. The principal, in turn, focuses scarce attentional resources on majorities precisely because they are likely to invest. Our results have welfare and policy implications, as they add to the discussion of affirmative action, as well as the empirical literature on implicit bias and discrimination in performance evaluation.
- The Politics of Personalized News Aggregation. Hu, Lin; Li, Anqi; Segal, Ilya (2023). We study how personalized news aggregation for rationally inattentive voters (NARI) affects policy polarization. In a two-candidate electoral competition model, an attention-maximizing infomediary aggregates source data about candidates’ valence into easy-to-digest news. Voters decide whether to consume news, trading off the expected gain from improved expressive voting against the attention cost. NARI generates policy polarization even if candidates are office-motivated. Personalized news aggregation makes extreme voters the disciplining entity of policy polarization. The skewness of their signals helps sustain a high degree of policy polarization in equilibrium. Analysis of disciplining voters informs the equilibrium and welfare consequences of regulating infomediaries.