Browsing by Author "Spanos, Aris"
Now showing 1 - 20 of 53
- Accounting for Risk and Level of Service in the Design of Passing Sight Distances
  El Khoury, John (Virginia Tech, 2005-11-28)
  Current design methods in transportation engineering do not simultaneously address the levels of risk and service associated with the design and use of various highway geometric elements. Passing sight distance (PSD) is an example of a geometric element designed with no risk measures. PSD is provided to ensure the safety of passing maneuvers on two-lane roads. Many variables determine the minimum length required for a safe passing maneuver. These are random variables representing a wide range of human and vehicle characteristics, yet current PSD design practices replace them with single-value means in the calculation process, disregarding their inherent variations. The research focuses on three main objectives. The first is to derive a PSD distribution that accounts for the variations in the contributing parameters. Two models are devised for this purpose: a Monte-Carlo simulation model and a closed-form analytical estimation model. The results of the two models verify each other and differ by less than 5 percent. Using the PSD distribution, the reliability index of the current PSD criteria is assessed. The second objective is to attach risk indices to the various PSD lengths of the obtained distribution. A unique microscopic simulation is devised to replicate passing maneuvers on two-lane roads, and the simulation results are used to assess the risk of various PSD lengths for a specific design speed. The risk indices of the AASHTO Green Book and the MUTCD PSD standards are also obtained through simulation. With risk measures attached to the PSD lengths, a trade-off analysis between level of service and risk becomes feasible. The last objective applies Highway Capacity Manual concepts to assess the service measures of the different PSD lengths. The final trade-off analysis shows that, for a design speed of 50 mph, the AASHTO Green Book and the MUTCD standards overestimate the PSD requirements; the criteria can be reduced to 725 ft and still remain within an acceptable risk level.
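A minimal sketch of the Monte-Carlo idea described in the abstract above. The input distributions, the simplified kinematic decomposition of the passing maneuver, and all parameter values are illustrative assumptions rather than the dissertation's actual specification.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # number of simulated passing maneuvers

# Hypothetical input distributions (illustrative only; the dissertation's
# actual variables, distributions, and parameters are not reproduced here).
v_pass = rng.normal(50, 3, N) * 1.467         # passing-vehicle speed, mph -> ft/s
v_opp = rng.normal(50, 3, N) * 1.467          # opposing-vehicle speed, ft/s
t_react = rng.lognormal(np.log(1.0), 0.3, N)  # perception-reaction time, s
t_occupy = rng.normal(10.0, 1.5, N)           # time spent in the opposing lane, s
clearance = rng.normal(100, 20, N)            # clearance gap at completion, ft

# Simplified kinematic PSD: reaction distance + distance traveled in the
# opposing lane + clearance + distance covered by the opposing vehicle.
psd = v_pass * t_react + v_pass * t_occupy + clearance + v_opp * t_occupy

# The resulting empirical PSD distribution can be summarized by percentiles,
# which is the kind of output a reliability/risk analysis would start from.
for q in (50, 85, 95, 99):
    print(f"{q}th percentile PSD: {np.percentile(psd, q):,.0f} ft")
```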
- Advances in Applied Econometrics: Binary Discrete Choice Models, Artificial Neural Networks, and Asymmetries in the FAST Multistage Demand System
  Bergtold, Jason Scott (Virginia Tech, 2004-04-14)
  The dissertation examines advancements in the methods and techniques used in the field of econometrics. These advancements include: (i) a re-examination of the underlying statistical foundations of statistical models with binary dependent variables, (ii) the use of feed-forward backpropagation artificial neural networks for modeling dichotomous choice processes, and (iii) the estimation of unconditional demand elasticities using the flexible multistage demand system with asymmetric partitions and fixed effects across time. The first paper re-examines the underlying statistical foundations of statistical models with binary dependent variables using the probabilistic reduction approach. This re-examination leads to the development of the Bernoulli Regression Model, a family of statistical models arising from conditional Bernoulli distributions. The paper provides guidelines for specifying and estimating a Bernoulli Regression Model, as well as methods for generating and simulating conditional binary choice processes. Finally, the Multinomial Regression Model is presented as a direct extension. The second paper empirically compares the out-of-sample predictive capabilities of artificial neural networks to binary logit and probit models. To facilitate this comparison, the statistical foundations of dichotomous choice models and feed-forward backpropagation artificial neural networks (FFBANNs) are re-evaluated. Using contingent valuation survey data, the paper shows that FFBANNs provide an alternative to binary logit and probit models with linear index functions. Direct comparisons between the models showed that the FFBANNs performed marginally better than the logit and probit models on a number of within-sample and out-of-sample performance measures, but in the majority of cases these differences were not statistically significant. In addition, guidelines for modeling contingent valuation survey data and techniques for estimating median WTP measures using FFBANNs are examined. The third paper estimates a set of unconditional price and expenditure elasticities for 49 different processed food categories using scanner data and the flexible and symmetric translog (FAST) multistage demand system. Due to the use of panel data and the presence of heterogeneity across time, temporal fixed effects were incorporated into the model. Overall, estimated price elasticities are larger, in absolute terms, than previous estimates. The use of disaggregated product groupings, scanner data, and the estimation of unconditional elasticities likely accounts for these differences.
- Animating the EPR-Experiment: Reasoning from Error in the Search for Bell Violations
  Vasudevan, Anubav (Virginia Tech, 2004-08-29)
  When faced with Duhemian problems of underdetermination, scientific method suggests neither a circumvention of such difficulties via the uncritical acceptance of background assumptions, nor the employment of epistemically unsatisfying subjectivist models of rational retainment. Instead, scientists are challenged to attack problems of underdetermination 'head-on', through a careful analysis of the severity of the testing procedures responsible for the production and modeling of their anomalous data. Researchers faced with the task of explaining empirical anomalies employ a number of diverse and clever experimental techniques designed to cut through the Duhemian mists and account for potential sources of error that might weaken an otherwise warranted inference. In lieu of such progressive experimental procedures, scientists try to identify the actual inferential work that an existing experiment is capable of providing, so as to avoid ascribing to its output more discriminative power than it is rightfully due. We argue that the various strategies adopted by researchers involved in the testing of Bell's inequality are well represented by Mayo's error-statistical notion of scientific evidence. In particular, an acceptance of her stringent demand that the output of severe tests stand at the basis of rational inference helps to explain the methodological reactions expressed by scientists in response to the loopholes that plagued the early Bell experiments performed by Alain Aspect et al. At the same time, we argue as a counterpoint that these very reactions present a challenge for 'top-down' approaches to Duhem's problem.
- Bernoulli Regression Models: Revisiting the Specification of Statistical Models with Binary Dependent Variables
  Bergtold, Jason S.; Spanos, Aris; Onukwugha, Eberechukwu (Elsevier, 2010)
  The latent variable and generalized linear modelling approaches do not provide a systematic approach for modelling discrete choice observational data. Another alternative, the probabilistic reduction (PR) approach, provides a systematic way to specify such models that can yield reliable statistical and substantive inferences. The purpose of this paper is to re-examine the underlying probabilistic foundations of conditional statistical models with binary dependent variables using the PR approach. This leads to the development of the Bernoulli Regression Model, a family of statistical models, which includes the binary logistic regression model. The paper provides an explicit presentation of probabilistic model assumptions, guidance on model specification and estimation, and empirical application.
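To make the connection to the familiar special case concrete, here is a small, hedged sketch that simulates a conditional Bernoulli (binary choice) process and fits the binary logistic regression model by maximum likelihood. It is not the paper's general PR-based specification; the covariate and coefficient values are illustrative.

```python
import numpy as np
import statsmodels.api as sm

# Simulate a conditional Bernoulli process whose conditional mean is a
# logistic function of a single covariate (the binary logistic special case).
rng = np.random.default_rng(0)
n = 5_000
x = rng.normal(size=n)
beta0, beta1 = -0.5, 1.2                         # illustrative "true" coefficients
p = 1.0 / (1.0 + np.exp(-(beta0 + beta1 * x)))   # P(Y = 1 | X = x)
y = rng.binomial(1, p)                           # conditional Bernoulli draws

# Estimate the Bernoulli/logistic regression by maximum likelihood.
X = sm.add_constant(x)
fit = sm.Logit(y, X).fit(disp=False)
print(fit.params)  # should be close to (-0.5, 1.2) when the model is well specified
```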
- Confronting Theory with Data: the Case of DSGE Modeling
  Poudyal, Niraj (Virginia Tech, 2012-12-07)
  The primary objective of this study is to confront the DSGE model (Ireland, 2011) with data in an attempt to evaluate its empirical adequacy. The perspective used for this evaluation is based on unveiling the statistical model (a structural VAR) behind the DSGE model, with a view to testing its probabilistic assumptions vis-a-vis the data. It is shown that the implicit statistical model is seriously misspecified, and the information from mis-specification (M-S) testing is then used to respecify the original structural VAR in an attempt to achieve statistical adequacy. The latter provides a precondition for the reliability of any inference based on the statistical model. Once the statistical adequacy of the respecified model is secured through thorough M-S testing, inference procedures such as the likelihood-ratio test of the overidentifying restrictions, forecasting, and impulse response analysis are applied to the original DSGE model to evaluate its empirical adequacy. Finally, the same inferential procedure is applied to the CAPM model.
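A minimal sketch, on simulated stand-in data, of the residual-based probing behind mis-specification (M-S) testing of a VAR; the dissertation's actual variables, lag order, and full battery of M-S tests are not reproduced here.

```python
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.tsa.api import VAR
from statsmodels.stats.diagnostic import acorr_ljungbox

# Simulated stand-in data; in the dissertation these would be the observables
# implied by the DSGE model (e.g., output growth, inflation, the interest rate).
rng = np.random.default_rng(1)
levels = rng.normal(size=(300, 3)).cumsum(axis=0)
data = pd.DataFrame(levels, columns=["y1", "y2", "y3"]).diff().dropna()

# Fit the implicit statistical model: here a VAR(2), chosen only for illustration.
res = VAR(data).fit(2)
resid = pd.DataFrame(np.asarray(res.resid), columns=data.columns)

# Probe two probabilistic assumptions equation by equation:
# no residual autocorrelation (Ljung-Box) and Normal errors (Jarque-Bera).
for name in data.columns:
    lb_p = acorr_ljungbox(resid[name], lags=[10], return_df=True)["lb_pvalue"].iloc[0]
    jb_stat, jb_p = stats.jarque_bera(resid[name])
    print(f"{name}: Ljung-Box p = {lb_p:.3f}, Jarque-Bera p = {jb_p:.3f}")
```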
- The Econometrics of Piecewise Linear Budget Constraints With Skewed Error Distributions: An Application To Housing Demand In The Presence Of Capital Gains Taxation
  Yan, Zheng (Virginia Tech, 1999-07-23)
  This paper examines the extent to which thin markets, in conjunction with tax-induced kinks in the budget constraint, cause consumer demand to be skewed. To illustrate the principles I focus on the demand for owner-occupied housing. Housing units are indivisible and heterogeneous while tastes for housing are at least partly idiosyncratic, causing housing markets to be thin. In addition, prior to 1998, capital gains tax provisions introduced a sharp kink in the budget constraint of existing owner-occupiers in search of a new home: previous homeowners under age 55 paid no capital gains tax if they bought up, but were subject to capital gains tax if they bought down. I first characterize the economic conditions under which households err on the up or down side when choosing a home in the presence of a thin market and a kinked budget constraint. I then specify an empirical model that takes such effects into account. Results based on Monte Carlo experiments indicate that failing to allow for skewness in the demand for housing leads to biased estimates of the elasticities of demand when such skewness is actually present. In addition, estimates based on American Housing Survey data suggest that such bias is substantial: controlling for skewness reduces the price elasticity of demand among previous owner-occupiers from 1.6 to 0.3. Moreover, 58% of previous homeowners err on the up side while only 42% err on the down side. Thus, housing demand is skewed.
- Essays in Educational Economics and Industry Structure
  McLeod, Mark Alexander (Virginia Tech, 2003-07-14)
  My dissertation contains two separate components. One part is a theoretical examination of the effect of ownership structure on format choice in the radio industry. I use a Hotelling-type location model to study the effects of mergers in the radio industry. I find that common ownership of two radio stations results in format choices that are more similar than under competitive ownership, and also that the stations will advertise more if they are operated under common ownership. Welfare results are ambiguous, but there is evidence that total welfare might decrease as the result of a merger, with obvious policy implications for the Federal Trade Commission and the Antitrust Division of the Department of Justice, which evaluate and regulate mergers in all industries. The second component is an empirical study designed to assess the effectiveness of a mathematical tutorial that I authored in conjunction with colleagues in the Math department at Virginia Tech. I taught four large sections of Principles of Macroeconomics in the spring and fall of 2001. Each class met on MWF: two sections at 8 AM, one at 10:10 AM, and one at 1:25 PM. I required one of the sections (8 AM, Spring) to review the module and take a proficiency quiz to demonstrate their skill level in the basic math used in the Economics Principles course. The final course average is the dependent variable in a regression designed to discover which variables have explanatory power in determining performance in introductory economics. Besides exposure to the math module, I include other independent variables describing class time, semester, demographics, and effort. In addition, I collected qualitative information about the students' perceptions of the module's effectiveness and administration. I find that exposure to the math module does not have a significant effect on performance in the course. However, within the treatment group, there is a positive and significant effect of time spent using the module on performance. Also, being registered for an 8 AM section has a significant negative effect. Overall, student comments indicate a dislike for the module. Students report that they prefer learning math skills through lectures by the professor and use of textbooks.
- Essays on DSGE Models and Bayesian Estimation
  Kim, Jae-yoon (Virginia Tech, 2018-06-11)
  This thesis explores the theory and practice of sovereignty. I begin with a conceptual analysis of sovereignty, examining its theological roots in contrast with its later influence in contestations over political authority. Theological debates surrounding God’s sovereignty dealt not with the question of legitimacy, which would become important for political sovereignty, but instead with the limits of his ability. Read as an ontological capacity, sovereignty is coterminous with an existent’s activity in the world. As lived, this capacity is regularly limited by the ways in which space is produced via its representations, its symbols, and its practices. All collective appropriations of space have a nomos that characterizes their practice. Foucault’s account of “biopolitics” provides an account of how contemporary materiality is distributed, an account that can be supplemented by sociological typologies of how city space is typically produced. The collective biopolitical distribution of space expands the range of practices that representationally legibilize activity in the world, thereby expanding the conceptual limits of existents and what it means for them to act up to the borders of their capacity, i.e., to practice sovereignty. The desire for total authorial capacity expresses itself in relations of domination and subordination that never erase the fundamental precarity of subjects, even as these expressions seek to disguise it. I conclude with a close reading of narratives recounting the lives of residents in Chicago’s Englewood, reading their activity as practices of sovereignty which manifest variously as they master and produce space.
- Essays on Fertility and the Economy in Venezuela
  Maza Duerto, Octavio (Virginia Tech, 2003-01-31)
  The purpose of this dissertation is to explore the relationship between fertility rates and the economy in Venezuela. In particular, it looks at the extreme fluctuations in oil revenues, Venezuela's main source of income, in the 1970s and their impact on fertility. It uses the 1998 National Survey of Population and Family, collected in Venezuela by The Central Office of Statistics and Information, to estimate a Poisson model of total fertility by union cohort and to empirically investigate changes in childbearing patterns. The results show that, compared to the 1967-1968 union cohort, all subsequent union cohorts decline in total fertility in the first 14 years after entering the first union. This finding raises the possibility that the period fertility rate stall and reversal of the 1970s are not associated with rises in total fertility. Further, the simple two-period model of fertility timing developed for this study illustrates how females may change their fertility timing in response to temporary changes in income, either through changes in wages or changes in the amount of transfers. Also, the duration analysis reveals differences in childbearing patterns: the boom cohort seems to be at a higher risk of an additional child for lower parities, but at a lower risk for higher parities, when compared to the bust cohort. These differences seem to remain even after controlling for individual characteristics or secular changes between the two cohorts. This study is important because it highlights how sharp and short changes in the economic conditions faced by a Venezuelan household induce a change in the timing of births, thereby creating unexpected moves in the period fertility rates. Understanding the source of these moves can help to plan for them in the future.
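A hedged sketch of a Poisson model of total fertility by union cohort, using simulated illustrative data; the survey variables, cohort definitions, and controls used in the dissertation are not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: children ever born, with union-cohort dummies and one
# illustrative control (education), not the actual survey variables.
rng = np.random.default_rng(2)
n = 2_000
cohort = rng.choice(["1967-68", "1973-74", "1979-80"], size=n)
educ = rng.integers(0, 16, size=n)
mu = np.exp(1.0 - 0.05 * educ
            - 0.2 * (cohort == "1973-74")
            - 0.4 * (cohort == "1979-80"))
children = rng.poisson(mu)

X = pd.get_dummies(pd.DataFrame({"cohort": cohort}), drop_first=True).astype(float)
X["educ"] = educ
X = sm.add_constant(X)

# Poisson regression of total fertility on cohort; negative cohort coefficients
# correspond to lower expected parity relative to the 1967-68 baseline cohort.
fit = sm.GLM(children, X, family=sm.families.Poisson()).fit()
print(fit.summary().tables[1])
```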
- An examination of specification error in modern United States growth processes
  Rosenberry, Lisa A. (Virginia Tech, 1995)
  This dissertation involves an empirical reexamination of US growth with the purpose of explaining growth usually attributed to advances in productivity. First, retaining the assumption of exogenous technological progress, I attempt to improve upon existing empirical models through new functional-form assumptions. Next, I employ recent models of endogenous growth. Later chapters explore the issues of nonstationarity and international dependence. A significant generalization of the Gumbel Exponential distribution is developed and applied to the statistical modeling of economic growth. My chief objective is to characterize recent growth experience more accurately so that we may determine the most effective policy actions. Current empirical studies of growth behavior have concentrated on a cross-sectional approach. I believe, in addition, that much can be learned about individual growth processes through a time-series approach. This approach avoids many complicated issues in cross-sectional analysis, including changes in institutions within and between countries. Better understanding the nature of growth in a particular country and relating this process to other nations should yield valuable insight into the nature of growth, convergence, and divergence, and provide implications for public policy. Many empirical studies have downplayed the crucial issue of examining the data in order to find the most appropriate econometric model specification. Through misspecification testing, we can identify and avoid faulty assumptions. Instead of viewing our data set as uncooperative, we should value the rich information our data contain. If our usual specification assumptions are invalid, more information can be extracted from our series through the inclusion of additional variables or through a maximum likelihood approach based upon an alternative distribution. This is the approach I follow in reexamining commonly utilized US input and output series. Utilizing the statistical and graphical abilities of the computer packages GAUSS and MATLAB, I examine both graphically and analytically the validity of various assumptions about the underlying distributions of the data. With this approach, I can show that the Solow residual contains a great deal of additional information about the dynamic pattern of growth of macroeconomic aggregates.
- Experimental Knowledge in Cognitive Neuroscience: Evidence, Errors, and Inference
  Aktunc, Mahir Emrah (Virginia Tech, 2011-07-02)
  This is a work in the epistemology of functional neuroimaging (fNI); it applies the error-statistical (ES) philosophy to inferential problems in fNI in order to formulate and address them. This gives us a clear, accurate, and more complete understanding of what we can learn from fNI and how we can learn it. I review the works in the epistemology of fNI, which I group into two categories: the first consists of discussions of the theoretical significance of fNI findings, and the second discusses methodological difficulties of fNI. Both types of works have shortcomings; the first category has been too theory-centered in its approach, and the second has implicitly or explicitly adopted the assumption that the methodological difficulties of fNI cannot be satisfactorily addressed. In this dissertation, I address these shortcomings and show how and what kind of experimental knowledge fNI can reliably produce that would be theoretically significant. I take fMRI as a representative fNI procedure and discuss the history of its development. Two independent trajectories of research in physics and physiology eventually converged to give rise to fMRI. Thus, fMRI findings are laden with the theories of physics and physiology, and I propose that this creates a kind of useful theory-ladenness which allows for the representation of, and intervention in, the constructs of cognitive neuroscience. Duhemian challenges and problems of underdetermination are often raised to argue that fNI is of little, if any, epistemic value for psychology. I show how the ES notions of severe tests and error probabilities can be applied in epistemological analyses of fMRI. The result is that hemodynamic hypotheses can be severely tested in fMRI experiments, and I demonstrate how these hypotheses are theoretically significant and fuel the growth of experimental knowledge in cognitive neuroscience. Throughout this dissertation, I put the emphasis on the experimental knowledge we obtain from fNI and argue that this is the fruitful approach that enables us to see how fNI can contribute to psychology. In doing so, I offer an error-statistical epistemology of fNI, which I hope will be a significant contribution to the philosophy of psychology.
- Financial Liberalization, Competition and Sound Banking: Theoretical and Empirical Essays
  Chen, Xiaofen (Virginia Tech, 2001-07-31)
  Previous studies seem to agree that increased competition would cause riskier banking behavior. This dissertation shows that when competition intensifies, banks have greater incentives for screening loan applicants, and thus loan quality may improve. In addition, competition leads banks to rely less on collateral requirements. Hence, banks may be less vulnerable to asset price shocks. The empirical chapter finds evidence of loan quality improvement after the removal of cross-border entry restrictions in the EU. There is also evidence that banks' behavior across EU countries has converged.
- Fortune Tellers: The Story of America's First Economic Forecasters [Book review]
  Spanos, Aris (Cambridge University Press, 2015-11-12)
- Frequentist Probability
  Spanos, Aris (2017)
  The primary objective of this article is to discuss a model-based frequentist interpretation that identifies the probability of an event with the limit of its relative frequency of occurrence. What differentiates the proposed interpretation from the traditional ones are several key features: (i) events and probabilities are defined in the context of a statistical model, (ii) it is anchored on the strong law of large numbers, (iii) it is justified on empirical grounds by validating the model assumptions vis-à-vis the data, (iv) the “long-run” metaphor can be rendered operational by simple simulation based on the statistical model, and (v) the link between probability and real-world phenomena is provided by viewing the data as a “truly typical” realization of the stochastic mechanism defined by the statistical model. This link constitutes a feature shared with the Kolmogorov complexity algorithmic perspective on probability, which provides a further justification for the proposed frequentist interpretation.
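A small simulation in the spirit of point (iv) above, showing how the “long-run” metaphor can be rendered operational: for a simple Bernoulli model, the relative frequency of an event settles down to its probability as the number of trials grows. The model and the value of p are illustrative assumptions.

```python
import numpy as np

# Strong law of large numbers in action for a Bernoulli(p) statistical model:
# the relative frequency of the event {X = 1} converges to p.
rng = np.random.default_rng(3)
p = 0.3                                   # illustrative "true" probability
x = rng.binomial(1, p, size=1_000_000)
rel_freq = np.cumsum(x) / np.arange(1, x.size + 1)

for n in (10, 100, 10_000, 1_000_000):
    print(f"n = {n:>9,}: relative frequency = {rel_freq[n - 1]:.4f}")
```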
- Game Theoretic Models of Connectivity Among Internet Access Providers
  Badasyan, Narine (Virginia Tech, 2004-06-09)
  The Internet has a loosely hierarchical structure. At the top of the hierarchy are the backbones, also called Internet Access Providers (hereafter IAPs). The second layer of the hierarchy is comprised of Internet Service Providers (hereafter ISPs). At the bottom of the hierarchy are the end users: consumers, who browse the web, and websites. To provide access to the whole Internet, the providers must interconnect with each other and share their network infrastructure. Two main forms of interconnection have emerged — peering, under which the providers carry each other's traffic without any payments, and transit, under which the downstream provider pays the upstream provider a certain settlement payment for carrying its traffic. This dissertation develops three game theoretic models to describe the interconnection agreements among the providers and analyzes those models from two alternative perspectives: a purely non-cooperative game and a network perspective. The dissertation makes two original contributions. First, we model the formation of peering/transit contracts explicitly as a decision variable in a non-cooperative game, whereas the current literature does not employ such modeling techniques. Second, we apply network analysis to examine the interconnection decisions of the providers, which yields more realistic results. Chapter 1 provides a brief description of the Internet's history, architecture, and infrastructure, as well as the economic literature. In Chapter 2 we develop a model in which IAPs decide on private peering agreements, comparing the benefits of private peering relative to being connected only through National Access Points (hereafter NAPs). The model is formulated as a multistage game. Private peering agreements reduce congestion in the Internet and so improve the quality of IAPs. The results show that even though profits are lower with private peerings, due to large investments, the network where all the providers privately peer is the stable network. Chapter 3 discusses the interconnection arrangements among ISPs. Intra-backbone peering refers to peering between ISPs connected to the same backbone, whereas inter-backbone peering refers to peering between ISPs connected to different backbones. We formulate the model as a two-stage game. Peering affects profits through two channels: reduction of backbone congestion and the ability to send traffic circumventing congested backbones. The relative magnitude of these factors helps or hinders peering. In Chapter 4 we develop a game theoretic model to examine how providers decide whom they want to peer with and who has to pay transit. There is no regulation with regard to the interconnection policies of providers, though there is a general convention that providers peer if they perceive equal benefits from peering and have transit arrangements otherwise. The model discusses a set of conditions which determine the formation of peering and transit agreements. We argue that market forces determine the terms of interconnection, and there is no need for regulation to encourage peering. Moreover, a Pareto optimum is achieved under the transit arrangements.
- Generalized Principal Component Analysis
  Solat, Karo (Virginia Tech, 2018-06-05)
  The primary objective of this dissertation is to extend the classical Principal Components Analysis (PCA), aiming to reduce the dimensionality of a large number of Normal interrelated variables, in two directions. The first is to go beyond the static (contemporaneous or synchronous) covariance matrix among these interrelated variables to include certain forms of temporal (over time) dependence. The second direction takes the form of extending the PCA model beyond the Normal multivariate distribution to the Elliptically Symmetric family of distributions, which includes the Normal, the Student's t, the Laplace, and the Pearson type II distributions as special cases. The result of these extensions is called Generalized Principal Component Analysis (GPCA). The GPCA is illustrated using both Monte Carlo simulations and an empirical study, in an attempt to demonstrate the enhanced reliability of these more general factor models in the context of out-of-sample forecasting. The empirical study examines the predictive capacity of the GPCA method in the context of exchange rate forecasting, showing how the GPCA method dominates forecasts based on existing standard methods, including random walk models, with or without macroeconomic fundamentals.
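For orientation, a brief sketch of the classical PCA starting point that GPCA generalizes: an eigen-decomposition of the contemporaneous sample covariance matrix, applied here to heavy-tailed simulated data purely for illustration. This is not the GPCA estimator itself, which additionally handles temporal dependence and the wider elliptically symmetric family.

```python
import numpy as np

# Classical PCA via the eigen-decomposition of the sample covariance matrix.
# The heavy-tailed (Student's t) draws below are only illustrative.
rng = np.random.default_rng(4)
n, k = 1_000, 5
A = rng.normal(size=(k, k))
X = rng.standard_t(df=5, size=(n, k)) @ A      # correlated, heavy-tailed variables

S = np.cov(X, rowvar=False)                    # contemporaneous covariance matrix
eigvals, eigvecs = np.linalg.eigh(S)
order = np.argsort(eigvals)[::-1]              # order components by explained variance
explained = eigvals[order] / eigvals.sum()
scores = (X - X.mean(axis=0)) @ eigvecs[:, order]  # principal-component scores

print("share of variance per component:", np.round(explained, 3))
print("first two scores of the first two observations:\n", scores[:2, :2])
```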
- The Give and Take on Restaurant Tipping
  Parrett, Matthew Barton (Virginia Tech, 2003-09-17)
  This dissertation examines aspects of both the consumer (the "give") and the server (the "take") sides of restaurant tipping. On the consumer side, I address both why, and how much, people tip in restaurants. I also examine a policy issue related to the recent Supreme Court decision in United States v. Fior d'Italia. These issues are addressed via a combination of theoretical, empirical, and experimental analysis. On the server side, I use survey data collected from several restaurants to address the issue of labor market discrimination based on beauty. Specifically, do more attractive servers earn higher tips than less attractive servers? I argue that a tipping data set offers several advantages over data sets used in previous studies of the beauty wage gap. This dissertation was funded by a National Science Foundation Dissertation Enhancement Grant (NSF #427347).
- How the Post-Data Severity Converts Testing Results into Evidence for or against Pertinent Inferential Claims
  Spanos, Aris (MDPI, 2024-01-22)
  The paper makes a case that the current discussions on replicability and the abuse of significance testing have overlooked a more general contributor to the untrustworthiness of published empirical evidence, which is the uninformed and recipe-like implementation of statistical modeling and inference. It is argued that this contributes to the untrustworthiness problem in several different ways, including [a] statistical misspecification, [b] unwarranted evidential interpretations of frequentist inference results, and [c] questionable modeling strategies that rely on curve-fitting. What is more, the alternative proposals to replace or modify frequentist testing, including [i] replacing p-values with observed confidence intervals and effect sizes, and [ii] redefining statistical significance, will not address the untrustworthiness-of-evidence problem since they are equally vulnerable to [a]–[c]. The paper calls for distinguishing between unduly data-dependent ‘statistical results’, such as a point estimate, a p-value, and accept/reject H0, and ‘evidence for or against inferential claims’. The post-data severity (SEV) evaluation of the accept/reject H0 results converts them into evidence for or against germane inferential claims. These claims can be used to address/elucidate several foundational issues, including (i) statistical vs. substantive significance, (ii) the large n problem, and (iii) the replicability of evidence. Also, the SEV perspective sheds light on the impertinence of the proposed alternatives [i]–[iii], and oppugns [iii] the alleged arbitrariness of framing H0 and H1, which is often exploited to undermine the credibility of frequentist testing.
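A minimal numerical sketch of a post-data severity (SEV) evaluation for the simple one-sided Normal-mean test with known variance; the numbers are illustrative and are not taken from the paper's examples.

```python
import numpy as np
from scipy.stats import norm

# Simple Normal model, one-sided test H0: mu <= mu0 vs H1: mu > mu0.
mu0, sigma, n = 0.0, 1.0, 100
xbar = 0.25                                   # observed sample mean (illustrative)
d_obs = np.sqrt(n) * (xbar - mu0) / sigma     # observed test statistic (2.5 -> reject H0)

def severity_of_claim(mu1):
    """Post-data severity of the claim mu > mu1 after rejecting H0: the
    probability that d(X) would have been no larger than d_obs if mu = mu1."""
    return norm.cdf(d_obs - np.sqrt(n) * (mu1 - mu0) / sigma)

# High SEV supports the claim; low SEV warns the rejection does not warrant it.
for mu1 in (0.0, 0.1, 0.2, 0.3):
    print(f"SEV(mu > {mu1:.1f}) = {severity_of_claim(mu1):.3f}")
```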
- An Investigation into the Demand for Service Contracts
  Moore, Evan (Virginia Tech, 2002-08-13)
  This dissertation is an investigation into the determinants of demand for service contracts on new vehicles. In the first chapter, I characterize the consumer decision to buy a service contract with a discrete choice model. Hypotheses and conjectures are tested empirically using survey data from new vehicle buyers. The second chapter consists of the development and testing of an instrument for measuring attitudes toward uncertainty. This tool is useful in gauging aversion toward weak ambiguity. Finally, in the third chapter, I use additional survey and experimental data from new vehicle buyers to further differentiate between the factors that significantly affect the service contract purchase decision. A variety of uncertainty measures and their predictive powers are discussed. I would like to thank the John D. and Catherine T. MacArthur Foundation, Network on Preferences and Norms, for their generous financial support, which was indispensable to the completion of this research.
- Liquidity as a Latent Variable - An Application of the MIMIC Model
  Spanos, Aris (Blackwell, 1984-01-01)