Scholarly Works, Economics
Research articles, presentations, and other scholarship
Browsing Scholarly Works, Economics by Title
Now showing 1 - 20 of 89
- Amenities, affordability, and housing vouchers. Bieri, David S.; Dawkins, Casey J. (Blackwell Publishing Inc., 2018-06-19). Against the background of an emerging rental affordability crisis, we examine how the standard rule that households should not spend more than 30% of their income on housing expenditures leads to inefficiencies in the context of federal low-income housing policy. We quantify how the current practice of locally indexing individual rent subsidies in the Housing Choice Voucher (HCV) program regardless of quality-of-life conditions implicitly incentivizes recipients to live in high-amenity areas. We also assess a novel scenario for housing policy reform that adjusts subsidies by the amenity expenditures of low-income households, permitting national HCV program coverage to increase. © 2018 The Authors. Journal of Regional Science published by Wiley Periodicals, Inc.
- Application of stochastic choice modeling to policy analysis of public-goods - a case-study of air-quality improvements. Loehman, E.; De, V. H. (MIT Press, 1982)
- Assessing proxies of knowledge and difficulty with rubric-based instruments. Smith, Ben O.; Wooten, Jadrian (Wiley, 2023-09-28). The fields of psychometrics, economic education, and education have developed statistically valid methods of assessing knowledge and learning. These methods include item response theory, value-added learning models, and disaggregated learning. These methods, however, focus on multiple-choice or single-response assessments. Faculty and administrators routinely assess knowledge through papers, thesis presentations, or other demonstrations of knowledge assessed with rubric rows. This paper presents a statistical approach to estimating a proxy for student ability and rubric row difficulty. Moreover, we have developed software so that practitioners can more easily apply this method to their instruments. This approach can be used for researching education treatment effects, for measuring learning outcomes in practitioners' own classrooms, or for estimating knowledge in administrative assessment. As an example, we have applied these new methods to projects in a large Labor Economics course at a public university. An illustrative sketch of one such ability/difficulty decomposition appears after this list.
- Asset Prices under Random Risk Preferences. Tsang, Kwok Ping; Tserenjigmid, Gerelt (2016-12-05). We consider an overlapping-generations model with two types of investors: the stable investors have constant risk aversion, but the unstable investors have random levels of risk aversion across different generations. Investors are not sure about how risk averse future investors are. We show that (i) a small amount of randomness in the risk aversion or (ii) a small population of the unstable investors generates a large deviation from the fundamental price and high price volatility.
- Beamforming when the sound velocity is not precisely known. Hinich, Melvin J. (Acoustical Society of America, 1980-08). Beamforming is an integral part of most signal processing systems in active or passive sonars. The delays used to generate a beam are functions of the sound velocity, which depends on temperature, salinity, and pressure. There is a loss in array gain if the delays are incorrectly set. This will occur when the sound velocity in the water surrounding the hydrophones is different from the velocity that was used to set the delays. This paper makes two points: (1) fixed delay line sonars suffer a loss in gain when the true sound speed in the water is different from the velocity that is used to set the delays, and (2) there are signal processing techniques for two- or three-dimensional arrays that yield source bearings that are independent of the true sound velocity. These techniques require variable time delays, which can be realized using digital processing. An illustrative sketch of the gain loss from a delay mismatch appears after this list.
- Bearing estimation using a large towed array. Hinich, Melvin J.; Rule, William (Acoustical Society of America, 1975). When a towed array of hydrophones is significantly nonlinear due to bending, ordinary linear array beamforming gives a biased estimate of the true source bearing. By processing the array as a sequence of smaller-aperture subarrays and then computing the mean of the subarray bearings, the variation due to bending is reduced and a reasonably precise estimate is obtained if the average bending angle with respect to the nominal axis is small. The median and mean subarray bearings are analyzed for a theoretical statistical model and are tested using artificial data for various sinusoidal array geometries.
- Bearing estimation using a perturbed linear array. Hinich, Melvin J. (Acoustical Society of America, 1977-06). A linear hydrophone array which is towed in the ocean is subject to snakelike bending. If the array is processed as if it were truly linear, the author has shown that the bending causes a deflection of the measured bearing of a fixed source from its true bearing relative to the array. This deflection results from patterned perturbations in the true sensor positions along the nominal array axis. As the perturbation pattern changes with the flexing of the array, the source appears to move around. A probability model of the perturbations is used to develop a theoretical solution to the question of how the space-time information gathered by the array is best used to measure source bearing. The method used to reduce the bending-induced deflection of the bearing is to group the sensors into adjacent subarrays, process these subarrays over short time slices, average the subarray bearings for each time period, and then average these averages over time. This averaging method significantly improves the bearing accuracy when the array is bent according to the model.
- Bernoulli Regression Models: Revisiting the Specification of Statistical Models with Binary Dependent Variables. Bergtold, Jason S.; Spanos, Aris; Onukwugha, Eberechukwu (Elsevier, 2010). The latent variable and generalized linear modelling approaches do not provide a systematic approach for modelling discrete choice observational data. Another alternative, the probabilistic reduction (PR) approach, provides a systematic way to specify such models that can yield reliable statistical and substantive inferences. The purpose of this paper is to re-examine the underlying probabilistic foundations of conditional statistical models with binary dependent variables using the PR approach. This leads to the development of the Bernoulli Regression Model, a family of statistical models, which includes the binary logistic regression model. The paper provides an explicit presentation of probabilistic model assumptions, guidance on model specification and estimation, and an empirical application.
- Beyond Optimal Forecasting. Ashley, Richard A. (Virginia Tech, 2006-11-04). While the conditional mean is known to provide the minimum mean square error (MSE) forecast – and hence is optimal under a squared-error loss function – it must often in practice be replaced by a noisy estimate when model parameters are estimated over a small sample. Here two results are obtained, both of which motivate the use of forecasts biased toward zero (shrinkage forecasts) in such settings. First, the noisy forecast with minimum MSE is shown to be a shrinkage forecast. Second, a condition is derived under which a shrinkage forecast stochastically dominates the unbiased forecast over the class of loss functions monotonic in the forecast error magnitude. The appropriate amount of shrinkage from either perspective depends on a noisiness parameter which must be estimated, however, so the actual reduction in expected losses from shrinkage forecasting is an empirical issue. Simulation results over forecasts from a large variety of multiple regression models indicate that feasible shrinkage forecasts typically do provide modest improvements in forecast MSE when the noise in the estimate of the conditional mean is substantial. An illustrative simulation sketch appears after this list.
- ChatGPT has Aced the Test of Understanding in College Economics: Now What? Geerling, Wayne; Mateer, G. Dirk; Damodaran, Nikhil; Wooten, Jadrian (SAGE, 2023-04-08). The Test of Understanding in College Economics (TUCE) is a standardized test of economics knowledge administered in the United States which primarily targets principles-level understanding. We asked ChatGPT to complete the TUCE. ChatGPT ranked in the 91st percentile for Microeconomics and the 99th percentile for Macroeconomics when compared to students who take the TUCE exam at the end of their principles course. The results show that ChatGPT is capable of providing answers that exceed the mean responses of students across all institutions. The emergence of artificial intelligence presents a significant challenge to traditional assessment methods in higher education. An important implication of this finding is that educators will likely need to redesign their curriculum in at least one of the following three ways: reintroduce proctored, in-person assessments; augment learning with chatbots; and/or increase the prevalence of experiential learning projects that artificial intelligence struggles to replicate well.
- Choosing Among the Variety of Proposed Voting Reforms. Tideman, Nicolaus (Springer, 2023-12-05). A wide variety of voting reforms are offered for consideration in this special issue. This paper draws connections among them and identifies the beliefs that make particular proposals more attractive than others.
- Cloud-Sourcing: Using an Online Labor Force to Detect Clouds and Cloud Shadows in Landsat Images. Yu, Ling; Ball, Sheryl B.; Blinn, Christine E.; Moeltner, Klaus; Peery, Seth; Thomas, Valerie A.; Wynne, Randolph H. (MDPI, 2015-02-26). We recruit an online labor force through Amazon.com’s Mechanical Turk platform to identify clouds and cloud shadows in Landsat satellite images. We find that a large group of workers can be mobilized quickly and relatively inexpensively. Our results indicate that workers’ accuracy is insensitive to wage, but deteriorates with the complexity of images and with time-on-task. In most instances, human interpretation of cloud impacted area using a majority rule was more accurate than an automated algorithm (Fmask) commonly used to identify clouds and cloud shadows. However, cirrus-impacted pixels were better identified by Fmask than by human interpreters. Crowd-sourced interpretation of cloud impacted pixels appears to be a promising means by which to augment or potentially validate fully automated algorithms.
- Credible Granger-Causality Inference with Modest Sample Lengths: A Cross-Sample Validation Approach. Ashley, Richard A.; Tsang, Kwok Ping (MDPI, 2014-03-25). Credible Granger-causality analysis appears to require post-sample inference, as it is well known that in-sample fit can be a poor guide to actual forecasting effectiveness. However, post-sample model testing requires an often-consequential a priori partitioning of the data into an “in-sample” period – purportedly utilized only for model specification/estimation – and a “post-sample” period, purportedly utilized (only at the end of the analysis) for model validation/testing purposes. This partitioning is usually infeasible, however, with samples of modest length – e.g., T ≤ 150 – as is common in both quarterly data sets and in monthly data sets where institutional arrangements vary over time, simply because there is in such cases insufficient data available to credibly accomplish both purposes separately. A cross-sample validation (CSV) testing procedure is proposed below which both eliminates the aforementioned a priori partitioning and substantially ameliorates this power-versus-credibility predicament – preserving most of the power of in-sample testing (by utilizing all of the sample data in the test), while also retaining most of the credibility of post-sample testing (by always basing model forecasts on data not utilized in estimating that particular model’s coefficients). Simulations show that the price paid, in terms of power relative to the in-sample Granger-causality F test, is manageable. An illustrative application is given, to a re-analysis of the Engel and West [1] study of the causal relationship between macroeconomic fundamentals and the exchange rate; several of their conclusions are changed by our analysis. An illustrative sketch of the cross-fitting idea appears after this list.
- Demand for electricity in Virginia. Murray, M. P.; Spann, R.; Pulley, L.; Beauvais, E. (MIT Press, 1978)
- Deposit Competition, Interbank Market, and Bank Profit. Jiang, Bo; Tzavellas, Hector; Yang, Xiaoying (MDPI, 2022-04-20). In this paper, we study how the interbank market could impact deposit competition and bank profits. We first document two stylized facts: the net interbank funding ratio is negatively correlated with net interest margin (NIM), as well as with the cost-to-income ratio (CIR). To rationalize these two facts, we embed the interbank market into a BLP model framework. The model is calibrated using Chinese listed banks’ data. A counterfactual experiment reveals that shutting down the interbank market will lead to a decline in NIM and bank profits. Our results indicate that the interbank market can facilitate specialization and reduce the intensity of deposit competition.
- Economic models for TMDL assessment and implementation. Bosch, Darrell J.; Ogg, Clayton; Osei, Edward; Stoecker, Arthur L. (American Society of Agricultural and Biological Engineers, 2006). The TMDL assessment and implementation process is designed to achieve designated uses for water bodies, which are set by states based on criteria including perceived costs and benefits. Setting water quality goals based on designated uses and plans to achieve these goals have important implications for public welfare. Both treatment and damage costs should be considered in simultaneously determining the desired water quality goal and allocating pollution reductions among sources to achieve that goal. Assessing and implementing TMDL plans are complicated by uncertainties about pollution damages and stakeholder responses. Economic optimization or simulation models linked to water quality models allow water quality impacts and costs of TMDL standards to be assessed. Higher water quality thresholds may be reserved for watersheds with higher estimated benefits. Costs of achieving standards can be reduced by targeting reductions at pollution sources with the lowest costs of achieving reductions. Trading programs can help achieve efficient targeting of pollution reductions while distributing costs equitably. The effectiveness of economic models to assist in setting water quality goals and in TMDL program planning and implementation can be improved by using economic models to analyze costs and benefits of water quality improvements and to assist with pollution targeting and trading programs to minimize costs of reducing pollution. Multi-media impacts of pollution should be included within economic and environmental water quality models. Given uncertainties about benefits and costs of achieving TMDL standards, policymakers and program managers need to collect more data on stakeholder responses to TMDL programs as well as better monitoring data on pollutant levels and functioning of aquatic systems.
- Economics in a Crisis: A Cautious Approach to Being Relevant. Wooten, Jadrian; Al-Bahrani, Abdullah (Journal of Economics Teaching, 2021-01-01). While this may appear to be a good opportunity to bring real-life examples into the classroom and show how economics applies during a global pandemic, we advocate instead for a more cautious approach. One of the joys of teaching economics is that it can be applied to “everyday life,” but there are some moments in life where caution may be warranted.
- Economics of Squid Game. Wooten, Jadrian; Geerling, Wayne (Taylor & Francis, 2023)
- An Elementary Method for Detecting and Modeling Regression Parameter Variation Across Frequencies With an Application to Testing the Permanent Income Hypothesis. Boon, Tan Hui; Ashley, Richard A. (Virginia Tech, 1997-03). A simple technique for directly testing the parameters of a time series regression model for instability across frequencies is presented. The method can be easily implemented in the time domain, so parameter instability across frequency bands can be conveniently detected and modeled in conjunction with other econometric features of the problem at hand, such as simultaneity, cointegration, missing observations, cross-equation restrictions, etc. The usefulness of the new technique is illustrated with an application to a cointegrated consumption-income regression model, yielding a straightforward test of the permanent income hypothesis.
- Estimating bearing when the source is endfire to an array. Hinich, Melvin J. (Acoustical Society of America, 1979-03). Consider the problem of estimating the direction of arrival of a plane wave using a linear array of length L with M sensors. Let θ denote the direction with respect to the array axis, and assume that we know that θ ≥ 0, since the sign of θ is ambiguous given data from a linear array. The approximations to the rms errors of the maximum-likelihood and least-squares estimators of θ are proportional to |sin θ|⁻¹, and the Cramér-Rao bound also has this property. This seems to imply that these estimators have infinite variances as θ → 0. This is not the case, as I will now show.
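
For the Smith and Wooten rubric entry above, the following is a minimal sketch of one way to form student-ability and rubric-row-difficulty proxies from a complete score matrix: an additive two-way decomposition, which for balanced data coincides with a least-squares fit. The function name, the [0, 1] score scaling, and the example scores are assumptions for illustration; this is not the estimator or the software described in the paper.

```python
# Illustrative stand-in only, not the paper's estimator or software:
# decompose rubric scores into student-ability and rubric-row-difficulty proxies.
import numpy as np

def ability_difficulty(scores):
    """scores: (n_students, n_rows) array of rubric scores scaled to [0, 1].
    Fits score[i, j] ~ mu + ability[i] - difficulty[j]; for a complete
    (balanced) score matrix these row/column means equal the least-squares
    fit, with ability and difficulty each summing to zero."""
    mu = scores.mean()
    ability = scores.mean(axis=1) - mu      # student effects
    difficulty = mu - scores.mean(axis=0)   # harder rows get larger values
    return mu, ability, difficulty

if __name__ == "__main__":
    # Hypothetical 4 students x 3 rubric rows, scores as fraction of points.
    scores = np.array([[0.9, 0.7, 0.5],
                       [0.8, 0.6, 0.4],
                       [1.0, 0.9, 0.8],
                       [0.6, 0.5, 0.2]])
    mu, ability, difficulty = ability_difficulty(scores)
    print("grand mean        :", round(mu, 3))
    print("ability proxies   :", np.round(ability, 3))
    print("difficulty proxies:", np.round(difficulty, 3))
```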
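
For the Hinich (1980) beamforming entry above, the following is a minimal sketch, under simple narrowband plane-wave assumptions, of how a mismatch between the true sound speed and the speed used to set the delays erodes the mainlobe response of a conventional delay-and-sum line array. The function name and all numbers are hypothetical; the paper's treatment of two- and three-dimensional arrays is not reproduced.

```python
# Minimal sketch (not the paper's analysis): loss in conventional delay-and-sum
# array gain when steering delays are computed with an assumed sound speed
# c_assumed that differs from the true speed c_true.
import numpy as np

def mainlobe_response(m_sensors, spacing_m, freq_hz, bearing_rad,
                      c_true, c_assumed):
    """Normalized response of a uniform line array steered at the source
    bearing, but with delays set using c_assumed instead of c_true."""
    x = np.arange(m_sensors) * spacing_m            # sensor positions (m)
    tau_true = x * np.cos(bearing_rad) / c_true     # actual wavefront delays
    tau_used = x * np.cos(bearing_rad) / c_assumed  # delays the beamformer applies
    phase_err = 2.0 * np.pi * freq_hz * (tau_true - tau_used)
    return np.abs(np.mean(np.exp(1j * phase_err)))  # 1.0 means no gain loss

if __name__ == "__main__":
    # Hypothetical numbers for illustration only.
    for c_assumed in (1500.0, 1480.0, 1460.0):
        r = mainlobe_response(m_sensors=32, spacing_m=1.5, freq_hz=500.0,
                              bearing_rad=np.deg2rad(60.0),
                              c_true=1500.0, c_assumed=c_assumed)
        print(f"assumed c = {c_assumed:7.1f} m/s  ->  gain loss = "
              f"{-20 * np.log10(r):5.2f} dB")
```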
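
For the "Beyond Optimal Forecasting" entry above, the following is a minimal simulation sketch of the basic shrinkage point: when the conditional mean must be replaced by a noisy estimate, a forecast shrunk toward zero can have lower MSE than the unshrunk forecast. The variance magnitudes and the shrinkage factor used here are illustrative assumptions, not the paper's estimator of the noisiness parameter.

```python
# Illustrative assumptions only, not the paper's model: compare the MSE of an
# unshrunk forecast built from a noisy estimate of the conditional mean with
# the MSE of the same forecast shrunk toward zero.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
sigma_mean, sigma_noise, sigma_eps = 1.0, 0.8, 1.0     # hypothetical magnitudes

mu = rng.normal(0.0, sigma_mean, n)                    # true conditional means
mu_hat = mu + rng.normal(0.0, sigma_noise, n)          # noisy estimates of those means
y = mu + rng.normal(0.0, sigma_eps, n)                 # realized outcomes

lam = sigma_mean**2 / (sigma_mean**2 + sigma_noise**2)  # MSE-minimizing shrinkage here
mse_unshrunk = np.mean((y - mu_hat) ** 2)
mse_shrunk = np.mean((y - lam * mu_hat) ** 2)
print(f"MSE, unshrunk forecast        : {mse_unshrunk:.3f}")
print(f"MSE, shrunk (lambda = {lam:.2f}): {mse_shrunk:.3f}")
```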
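
For the cross-sample validation entry above, the following is a minimal sketch of the cross-fitting idea only: every forecast of y_t is based on coefficients estimated on the half of the sample that does not contain observation t, and the squared forecast errors of models with and without the lags of x can then be compared. The lag length, data-generating process, and two-fold scheme are assumptions for illustration; the paper's CSV test statistic and its inference procedure are not reproduced.

```python
# Sketch of the cross-fitting idea only (not the paper's CSV test statistic):
# estimate on one half of the sample, forecast the other half, and compare
# squared forecast errors with and without the lags of x.
import numpy as np

def lagged_design(y, x, p):
    """Rows: [1, y_{t-1..t-p}, x_{t-1..t-p}]; targets: y_t."""
    rows, targets = [], []
    for t in range(p, len(y)):
        rows.append(np.r_[1.0, y[t-p:t][::-1], x[t-p:t][::-1]])
        targets.append(y[t])
    return np.array(rows), np.array(targets)

def cross_fitted_sse(y, x, p=2, restricted=False):
    X, z = lagged_design(y, x, p)
    if restricted:                       # drop the lags of x (no Granger causality)
        X = X[:, :1 + p]
    half = len(z) // 2
    folds = [(slice(0, half), slice(half, None)),
             (slice(half, None), slice(0, half))]
    sse = 0.0
    for est, fcast in folds:             # estimate on one half, forecast the other
        beta, *_ = np.linalg.lstsq(X[est], z[est], rcond=None)
        sse += np.sum((z[fcast] - X[fcast] @ beta) ** 2)
    return sse

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    T = 150                              # a "modest sample length"
    x = rng.normal(size=T)
    y = np.zeros(T)
    for t in range(1, T):                # x genuinely Granger-causes y here
        y[t] = 0.5 * y[t-1] + 0.6 * x[t-1] + rng.normal()
    print(f"cross-fitted SSE without x lags: {cross_fitted_sse(y, x, restricted=True):.1f}")
    print(f"cross-fitted SSE with    x lags: {cross_fitted_sse(y, x, restricted=False):.1f}")
```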