Browsing by Author "Zobel, Christopher W."
Now showing 1 - 20 of 45
- Advancement of Using Portable Free Fall Penetrometers for Geotechnical Site Characterization of Energetic Sandy Nearshore Areas
Albatal, Ali Hefdhallah Ali (Virginia Tech, 2018-04-24)
Portable Free Fall Penetrometers (PFFPs) are lightweight tools used for rapid and economical characterization of surficial subaqueous sediments. PFFPs vary in weight, shape, and size, with options for using add-on units. The different configurations enable deployments in various environments and water depths, including the nearshore zone, where conventional methods are challenged by energetic hydrodynamics and limited navigable depth. Moreover, PFFPs offer an opportunity to reduce the high costs associated with conventional offshore geotechnical site investigation methods. These costs are often a major obstacle for small projects serving remote communities or testing novel renewable energy harvesting machines. However, PFFPs still face issues regarding data analysis and interpretation, particularly in energetic sandy nearshore areas, including a lack of data and accepted analysis methods for such environments. Therefore, the goal of this research was to advance data interpretation and sediment characterization methods using PFFPs, with emphasis on deployments in energetic nearshore environments. PFFP tests were conducted in the nearshore areas of Yakutat Bay, AK; Cannon Beach, AK; and the U.S. Army Corps of Engineers' Field Research Facility's beach in Duck, NC.
From the measurements, the research goal was addressed by: (1) introducing a methodology to create a regional sediment classification scheme utilizing the PFFP deceleration and pore pressure measurements, sediment traces on the probe upon retrieval, and previous literature; (2) investigating the effect of wave forcing on sediment behavior by correlating variations in sediment strength to wave climate, sandbar migration, and depth of closure, as well as identifying areas of significant sediment mobilization; and (3) estimating the relative density and friction angle of sand in energetic nearshore areas from PFFP measurements. For the latter, the field data were supported by vacuum triaxial tests and PFFP deployments under controlled laboratory conditions on sand samples prepared at different relative densities. The research outcomes address gaps in knowledge left by the limited number of studies investigating the geotechnical properties of sand in energetic nearshore areas. More specifically, the research contributes to the understanding of surficial sediment geotechnical properties in energetic nearshore areas and to the enhancement of sediment characterization and interpretation methods.
- Advances in Applied Econometrics: Binary Discrete Choice Models, Artificial Neural Networks, and Asymmetries in the FAST Multistage Demand System
Bergtold, Jason Scott (Virginia Tech, 2004-04-14)
The dissertation examines advancements in the methods and techniques used in the field of econometrics. These advancements include: (i) a re-examination of the underlying statistical foundations of statistical models with binary dependent variables; (ii) the use of feed-forward backpropagation artificial neural networks for modeling dichotomous choice processes; and (iii) the estimation of unconditional demand elasticities using the flexible multistage demand system with asymmetric partitions and fixed effects across time. The first paper re-examines the underlying statistical foundations of statistical models with binary dependent variables using the probabilistic reduction approach. This re-examination leads to the development of the Bernoulli Regression Model, a family of statistical models arising from conditional Bernoulli distributions. The paper provides guidelines for specifying and estimating a Bernoulli Regression Model, as well as methods for generating and simulating conditional binary choice processes. Finally, the Multinomial Regression Model is presented as a direct extension. The second paper empirically compares the out-of-sample predictive capabilities of artificial neural networks to binary logit and probit models. To facilitate this comparison, the statistical foundations of dichotomous choice models and feed-forward backpropagation artificial neural networks (FFBANNs) are re-evaluated. Using contingent valuation survey data, the paper shows that FFBANNs provide an alternative to the binary logit and probit models with linear index functions.
Direct comparisons between the models showed that the FFBANNs performed marginally better than the logit and probit models for a number of within-sample and out-of-sample performance measures, but in the majority of cases these differences were not statistically significant. In addition, guidelines for modeling contingent valuation survey data and techniques for estimating median WTP measures using FFBANNs are examined. The third paper estimates a set of unconditional price and expenditure elasticities for 49 different processed food categories using scanner data and the flexible and symmetric translog (FAST) multistage demand system. Due to the use of panel data and the presence of heterogeneity across time, temporal fixed effects were incorporated into the model. Overall, estimated price elasticities are larger, in absolute terms, than previous estimates. The use of disaggregated product groupings, scanner data, and the estimation of unconditional elasticities likely accounts for these differences.
- Advancing Emergency Department Efficiency, Infectious Disease Management at Mass Gatherings, and Self-Efficacy Through Data Science and Dynamic Modeling
Ba-Aoum, Mohammed Hassan (Virginia Tech, 2024-04-09)
This dissertation employs management systems engineering principles, data science, and industrial systems engineering techniques to address pressing challenges in emergency department (ED) efficiency, infectious disease management at mass gatherings, and student self-efficacy. It is structured into three essays, each contributing to a distinct domain of research, and utilizes industrial and systems engineering approaches to provide data-driven insights and recommend solutions. The first essay used data analytics and regression analysis to understand how patient length of stay (LOS) in EDs could be influenced by multi-level variables integrating patient, service, and organizational factors. The findings suggested that specific demographic variables, the complexity of service provided, and staff-related variables significantly impacted LOS, offering guidance for operational improvements and better resource allocation. The second essay utilized system dynamics simulations to develop a modified SEIR model for modeling infectious diseases during mass gatherings and assessing the effectiveness of commonly implemented policies. The results demonstrated the significant collective impact of interventions such as visitor limits, vaccination mandates, and mask wearing, emphasizing their role in preventing health crises. The third essay applied machine learning methods to predict student self-efficacy in Muslim societies, revealing the importance of socio-emotional traits, cognitive abilities, and regulatory competencies. It provided a basis for identifying students with varying levels of self-efficacy and developing tailored strategies to enhance their academic and personal success.
Collectively, these essays underscore the value of data-driven and evidence-based decision-making. The dissertation's broader impact lies in its contribution to optimizing healthcare operations, informing public health policy, and shaping educational strategies to be more culturally sensitive and psychologically informed. It provides a roadmap for future research and practical applications across the healthcare, public health, and education sectors, fostering advancements that could significantly benefit society.
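The modified SEIR approach used in the second essay can be sketched with a minimal compartmental simulation. The parameter values, gathering size, and the way the mask intervention scales the transmission rate below are illustrative assumptions, not the dissertation's calibrated model:

```python
# Minimal sketch: an SEIR model for a mass gathering, with a mask-wearing
# intervention that scales the transmission rate. All parameter values
# are illustrative assumptions, not taken from the dissertation.

def simulate_seir(days, n=100_000, beta=0.6, sigma=0.2, gamma=0.1,
                  mask_effect=0.0):
    """Euler-integrate SEIR with a 1-day step; mask_effect in [0, 1]."""
    s, e, i, r = n - 1.0, 0.0, 1.0, 0.0
    b = beta * (1.0 - mask_effect)   # intervention reduces transmission
    for _ in range(days):
        new_inf = b * s * i / n      # S -> E (new exposures)
        new_sympt = sigma * e        # E -> I (incubation ends)
        new_rec = gamma * i          # I -> R (recovery)
        s -= new_inf
        e += new_inf - new_sympt
        i += new_sympt - new_rec
        r += new_rec
    return s, e, i, r

baseline = simulate_seir(60)
masked = simulate_seir(60, mask_effect=0.5)
# Fewer people leave the susceptible pool when masks cut transmission.
print(baseline[0] < masked[0])
```

Policies such as visitor limits or vaccination mandates would enter the same way, by shrinking `n` or moving people directly into the recovered compartment.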
- An Agent-Based Distributed Decision Support System Framework for Mediated Negotiation
LoPinto, Frank Anthony (Virginia Tech, 2004-04-26)
Implementing an e-market for limited supply perishable asset (LiSPA) products is a problem at the intersection of online purchasing and distributed decision support systems (DistDSS). In this dissertation, we introduce and define LiSPA products, provide real-world examples, develop a framework for a distributed system to implement an e-market for LiSPA products, and provide proof-of-concept for the two major components of the framework. The DistDSS framework requires customers to instantiate agents that learn their preferences and evaluate products on their behalf. Accurately eliciting and modeling customer preferences in a quick and easy manner is a major hurdle for implementing this agent-based system. A methodology is developed for this problem using conjoint analysis and neural networks. The framework also contains a model component that is addressed in this work. The model component is presented as a mediator of customer negotiation that uses the agent-based preference models mentioned above and employs a linear programming model to maximize overall satisfaction of the total market.
- Analysis of Post-Sandy Single-Family Housing Market in Staten Island, New York
Borate, Aishwarya (Virginia Tech, 2018-11-13)
Recent hurricanes have made it clear that housing is the single greatest component of all losses in terms of economic value and buildings damaged. Housing damage resulting from floods has increased in the United States, despite local, state, and federal encouragement to mitigate flood hazards and regulate development in flood-prone areas (Atreya, 2013). The two primary causes of these increased costs are: (1) a rise in the occurrence and strength of extreme weather events, and (2) increased development and value of property in physically vulnerable areas. The overlap of these two factors resulted in tremendous losses of property in Staten Island and other coastal communities along the Atlantic Coast. Hurricane Sandy was a reminder of how vulnerable such areas can be. After Hurricane Sandy, damaged properties experienced higher than usual housing sales and changed property values. This research seeks to improve the current state of knowledge about housing markets following a major disaster by examining single-family housing sales and prices in Staten Island, New York. The housing price recovery rate was much slower for the properties that sustained damage, and the impacts lasted for at least four years after the storm. Researchers studying housing recovery have utilized a variety of indicators, such as financial characteristics, government policies, social parameters, damage, and housing characteristics, to capture the dimensions of recovery. In Sandy's case, damage was the major influencing parameter, and it completely changed the housing dynamics of the affected coastal regions. The housing market, in terms of damage, restoration, and recovery, is a fundamental indicator of disaster resilience. Every community is different, and so are the effects of disasters on residential markets.
This study clearly highlights this point and underscores the importance of using contextual methods and data sets in conducting the research.
- Assessing Nonprofit Websites: Developing an Evaluation Model
Kirk, Kristin Cherish (Virginia Tech, 2018-04-23)
Nonprofit organizations are pivotal actors in society, and their websites can play important roles in aiding organizations in their socially-beneficial missions by serving as a platform to present information, to interact with stakeholders and to perform online transactions. This dissertation analyzed nonprofit websites in the United States (U.S.) and in Thailand in a series of three articles. The first developed a website evaluative instrument, based on an e-commerce model, and applied it to nonprofit websites through a manual decoding process. That article's findings suggested that Thai websites are not considerably different than U.S. nonprofit websites, except more American websites offer online transactions. The second article analyzed two different types of nonprofits in Thailand using the same model to assess website development in an emerging market. That analysis suggested local Thai nonprofits' websites lagged significantly behind those of internationally connected nonprofit organizations in the country in the features they offered. The third article compared the adapted model employed in the second analysis, which used manual decoding for website examination, to a commercially available, automated evaluation service. That analysis highlighted the differences between the two assessment tools and found them to be complementary, but independently insufficient to ensure robust nonprofit website evaluation.
- Assessment of SWAT to Enable Development of Watershed Management Plans for Agricultural Dominated Systems under Data-Poor Conditions
Osorio Leyton, Javier Mauricio (Virginia Tech, 2012-05-02)
Modeling is an important tool in watershed management. In much of the world, data needed for modeling, both for model inputs and for model evaluation, are very limited or non-existent. The overall objective of this research was to enable development of watershed management plans for agriculture-dominated systems in situations where data are scarce. First, uncertainty of the SWAT model's outputs due to input parameters, specifically soils and high-resolution digital elevation models, which are likely to be lacking in data-poor environments, was quantified using Monte Carlo simulation. Two sources of soil parameter values (SSURGO and STATSGO) were investigated, as well as three levels of DEM resolution (10, 30, and 90 m). Uncertainty increased as the input data became coarser for individual soil parameters. The combination of SSURGO and the 30 m DEM proved to adequately balance the level of uncertainty and the quality of input datasets. Second, methods were developed to generate appropriate soils information and DEM resolution for data-poor environments. The soils map was generated based on lithology and slope class, while the soil attributes were generated by linking surface soil texture to soils characterized in the SWAT soils database. A 30 m resolution DEM was generated by resampling a 90 m DEM, the resolution that is readily available around the world, by direct projection using a cubic convolution method. The effect of the generated DEM and soils data on model predictions was evaluated in a data-rich environment. When all soil parameters were varied at the same time, predictions based on the derived soil map were comparable to the predictions based on the SSURGO map. Finally, the methodology was tested in a data-poor watershed in Bolivia.
The proposed methodologies for generating input data showed how available knowledge can be employed to generate data for modeling purposes and give the opportunity to incorporate uncertainty in the decision making process in data-poor environments.
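The Monte Carlo step in this entry can be illustrated with a toy stand-in for SWAT: sample an uncertain soil input from a range, push each sample through the model, and summarize the output spread. The curve-number-style runoff function and the parameter ranges below are illustrative assumptions, not SWAT itself:

```python
# Minimal sketch of Monte Carlo input-uncertainty analysis. The "model"
# here is a stand-in SCS-style runoff function, not SWAT; the retention
# ranges mimic a fine (SSURGO-like) vs. coarse (STATSGO-like) soil input.
import random
import statistics

def runoff(precip_mm, retention_mm):
    """SCS-style runoff: zero until rainfall exceeds initial abstraction."""
    ia = 0.2 * retention_mm
    if precip_mm <= ia:
        return 0.0
    return (precip_mm - ia) ** 2 / (precip_mm - ia + retention_mm)

def mc_spread(ret_lo, ret_hi, n=5000, precip=80.0, seed=1):
    """Std. dev. of model output when retention is sampled uniformly."""
    rng = random.Random(seed)
    outs = [runoff(precip, rng.uniform(ret_lo, ret_hi)) for _ in range(n)]
    return statistics.stdev(outs)

fine = mc_spread(45, 55)     # tighter input uncertainty
coarse = mc_spread(30, 70)   # coarser input data, wider uncertainty
print(fine < coarse)         # coarser inputs -> more output uncertainty
```

The same pattern, with SWAT in place of the toy function, reproduces the entry's finding that output uncertainty grows as soil and DEM inputs become coarser.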
- Automatic Modulation Classification and Blind Equalization for Cognitive Radios
Ramkumar, Barathram (Virginia Tech, 2011-07-28)
Cognitive Radio (CR) is an emerging wireless communications technology that addresses the inefficiency of current radio spectrum usage. CR also supports the evolution of existing wireless applications and the development of new civilian and military applications. In military and public safety applications, there is no information available about the signal present in a frequency band, and hence there is a need for a CR receiver to identify the modulation format employed in the signal. The automatic modulation classifier (AMC) is an important signal processing component that helps the CR in identifying the modulation format employed in the detected signal. AMC algorithms developed so far can classify only signals from a single user present in a frequency band. In a typical CR scenario, there is a possibility that more than one user is present in a frequency band, and hence it is necessary to develop an AMC that can classify signals from multiple users simultaneously. One of the main objectives of this dissertation is to develop robust multiuser AMCs for CR. It will be shown later that multiple antennas are required at the receiver for classifying multiple signals. The use of multiple antennas at the transmitter and receiver is known as a Multi Input Multi Output (MIMO) communication system. By using multiple antennas at the receiver, apart from classifying signals from multiple users, the CR can harness the advantages offered by classical MIMO communication techniques like higher data rate, reliability, and an extended coverage area. While MIMO CR will provide numerous benefits, there are some significant challenges in applying conventional MIMO theory to CR. In this dissertation, open problems in applying classical MIMO techniques to a CR scenario are addressed.
A blind equalizer is another important signal processing component that a CR must possess, since there are no training or pilot signals available in many applications. In a typical wireless communication environment the transmitted signals are subjected to noise and multipath fading. Multipath fading not only affects the performance of symbol detection by causing intersymbol interference (ISI) but also affects the performance of the AMC. The equalizer is a signal processing component that removes ISI from the received signal, thus improving the symbol detection performance. In a conventional wireless communication system, training or pilot sequences are usually available for designing the equalizer. When a training sequence is available, equalizer parameters are adapted by minimizing the well-known cost function called mean square error (MSE). When a training sequence is not available, blind equalization algorithms adapt the parameters of the blind equalizer by minimizing cost functions that exploit the higher order statistics of the received signal. These cost functions are non-convex, and hence the blind equalizer has the potential to converge to a local minimum. Convergence to a local minimum not only affects symbol detection performance but also affects the performance of the AMC. Robust blind equalizers can be designed if the performance of the AMC is also considered while adapting equalizer parameters. In this dissertation we also develop Single Input Single Output (SISO) and MIMO blind equalizers where the performance of the AMC is also considered while adapting the equalizer parameters.
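A standard member of the non-convex, higher-order-statistics cost-function family described above is the constant modulus criterion, adapted by the constant modulus algorithm (CMA). The sketch below is a generic CMA illustration, not the dissertation's equalizer; the BPSK source, two-tap channel, tap count, and step size are all assumptions:

```python
# Minimal sketch of blind equalization with the constant modulus
# algorithm (CMA): no training sequence, taps adapt by stochastic
# gradient descent on the dispersion (|y|^2 - R)^2. BPSK symbols and a
# toy two-tap multipath channel are illustrative assumptions.
import random

random.seed(7)
N, taps, mu, R = 4000, 7, 0.01, 1.0
symbols = [random.choice((-1.0, 1.0)) for _ in range(N)]

# Two-tap multipath channel introduces inter-symbol interference.
received = [symbols[k] + 0.5 * symbols[k - 1] for k in range(1, N)]

w = [0.0] * taps
w[taps // 2] = 1.0               # center-spike initialization
buf = [0.0] * taps
dispersion = []
for r in received:
    buf = [r] + buf[:-1]         # shift new sample into the delay line
    y = sum(wi * xi for wi, xi in zip(w, buf))
    dispersion.append((y * y - R) ** 2)   # CMA cost sample
    e = y * (y * y - R)                   # CMA error term
    w = [wi - mu * e * xi for wi, xi in zip(w, buf)]

early = sum(dispersion[:500]) / 500
late = sum(dispersion[-500:]) / 500
print(late < early)   # dispersion (and ISI) shrinks as taps adapt
```

The dissertation's contribution is to go beyond criteria like this one by also folding AMC performance into the adaptation, which this generic sketch does not do.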
- Consumer-Centric Innovation for Mobile Apps Empowered by Social Media Analytics
Qiao, Zhilei (Virginia Tech, 2018-06-20)
Due to the rapid development of Internet communication technologies (ICTs), an increasing number of social media platforms exist where consumers can exchange comments online about products and services that businesses offer. The existing literature has demonstrated that online user-generated content can significantly influence consumer behavior and increase sales. However, its impact on organizational operations has been primarily focused on marketing, with other areas understudied. Hence, there is a pressing need to design a research framework that explores the impact of online user-generated content on important organizational operations such as product innovation, customer relationship management, and operations management. Research efforts in this dissertation center on exploring the co-creation value of online consumer reviews, where consumers' demands influence firms' decision-making. The dissertation is composed of three studies. The first study finds empirical evidence that quality signals in online product reviews are predictors of the timing of firms' incremental innovation. Guided by the product differentiation theory, the second study examines how companies' innovation and marketing differentiation strategies influence app performance. The last study proposes a novel text analytics framework to discover different information types from user reviews. The research contributes theoretical and practical insights to consumer-centric innovation and social media analytics literature.
- A data-driven framework to support resilient and sustainable early design
Zaker Esteghamati, Mohsen (Virginia Tech, 2021-08-05)
Early design is the most critical stage to improve the resiliency and sustainability of buildings. An unaided early design follows the designer's accustomed domain of knowledge and cognitive biases. Given the inherent limitations of human decision-making, such a design process will only explore a small set of alternatives using limited criteria and, most likely, miss high-performing alternatives. Performance-based engineering (PBE) is a probabilistic approach to quantify building performance against natural hazards in terms of decision metrics such as repair cost and functionality loss. Therefore, PBE can remarkably improve early design by informing the designer regarding the possible consequences of different decisions. Incorporating PBE in early design is obstructed by several challenges, such as the time- and effort-intensiveness of performing rigorous PBE assessments, a specific skillset that might not be available, and the accrual of aleatoric uncertainty (associated with the innate randomness of physical system properties and surrounding environment conditions) and epistemic uncertainty (associated with the incomplete state of knowledge). In addition, a successful early design requires exploring a large number of alternatives, which, when compounded by PBE assessments, will significantly exhaust computational resources and pressure the project timeline. This dissertation proposes a framework to integrate prior knowledge and PBE assessments in early design. The primary workflow in the proposed framework develops a performance inventory to train statistical surrogate models using supervised learning algorithms. This performance inventory comprises PBE assessments consistent with building taxonomy and site, and is supported by a knowledge-based module.
The knowledge-based module organizes prior published PBE assessments as a relational database to supplement the performance inventory and aid early design exploration through knowledge-based surrogate models. Lastly, the developed knowledge-based and data-driven surrogate models are implemented in a sequential design exploration scheme to estimate the performance range for a given topology and building system. The proposed framework is then applied for mid-rise concrete office buildings in Charleston, South Carolina, where seismic vulnerability and environmental performance are linked to topology and design parameters.
- Decision Support for Casualty Triage in Emergency Response
Kamali, Behrooz (Virginia Tech, 2016-05-04)
Mass-casualty incidents (MCIs) cause a sudden increase in demand for medical resources in a region. The most important and challenging task in addressing an MCI is managing overwhelmed resources with the goal of increasing the total number of survivors. Currently, most of the decisions following an MCI are made in an ad-hoc manner or by following static guidelines that do not account for the amount of available resources or the number of casualties. The purpose of this dissertation is to introduce and analyze sophisticated service prioritization and resource allocation tools. These tools can be used to produce service order strategies that increase the overall number of survivors. Several models are proposed that account for the number and mix of casualties and the amount and type of resources available. The large number of elements involved in this problem makes the model very complex, and thus, in order to gain some insight into the structure of the optimal solutions, some of the proposed models are developed under simplifying assumptions. These assumptions include limitations on the number of casualty types, the handling of deaths, servers, and types of resources. Under these assumptions several characteristics of the optimal policies are identified, and optimal algorithms for various scenarios are developed. We also develop an integrated model that addresses service order, transportation, and hospital selection. A comprehensive set of computational results and comparisons with related work in the literature is provided in order to demonstrate the efficacy of the proposed methodologies.
- Decision support for long-range, community-based planning to mitigate against and recover from potential multiple disasters
Chacko, Josey; Rees, Loren P.; Zobel, Christopher W.; Rakes, Terry R.; Russell, Roberta S.; Ragsdale, Cliff T. (Elsevier, 2016-07-01)
This paper discusses a new mathematical model for community-driven disaster planning that is intended to help decision makers exploit the synergies resulting from simultaneously considering actions focusing on mitigation and efforts geared toward long-term recovery. The model is keyed on enabling long-term community resilience in the face of potential disasters of varying types, frequencies, and severities, and the approach’s highly iterative nature is facilitated by the model’s implementation in the context of a Decision Support System. Three examples from Mombasa, Kenya, East Africa, are discussed and compared in order to demonstrate the advantages of the new mathematical model over the current ad hoc mitigation and long-term recovery planning approaches that are typically used.
- Designing a framework to guide renewal engineering decision-making for water and wastewater pipelines
Maniar, Saumil Hiren (Virginia Tech, 2010-08-11)
Federal, state, and private organizations have an urgent need for renewal of water and wastewater pipelines. A pertinent gap remains in understanding the relationship between deteriorated host-pipe conditions and renewal product cost and performance. This work provides a Decision-Support System framework that supports the selection of water and wastewater pipeline renewal products. Various renewal products fit utility needs, and optimizing this process streamlines the decision-making for renewal product selection. The thesis classifies various factors for use in the renewal product decision-making process and provides the justification for using these factors in recommending a product. Pipeline problem definition, system causes, system requirements, and renewal product characteristics are the key decision-making areas controlling the recommendation of a renewal product. The Decision-Support System framework is developed as user-friendly Visual Basic forms using Microsoft tools and is evaluated against vendor information. The framework allows the user to edit product information needs, the factors affecting decision-making, and the classification of each factor, allowing for ease of modification, utilization, and collaborative understanding. An online hosting of the proposed framework would improve the accessibility and validity of the renewal decision-making process.
- A Deterministic Approach to Partitioning Neural Network Training Data for the Classification Problem
Smith, Gregory Edward (Virginia Tech, 2006-08-07)
The classification problem in discriminant analysis involves identifying a function that accurately classifies observations as originating from one of two or more mutually exclusive groups. Because no single classification technique works best for all problems, many different techniques have been developed. For business applications, neural networks have become the most commonly used classification technique and though they often outperform traditional statistical classification methods, their performance may be hindered because of failings in the use of training data. This problem can be exacerbated because of small data set size. In this dissertation, we identify and discuss a number of potential problems with typical random partitioning of neural network training data for the classification problem and introduce deterministic methods to partitioning that overcome these obstacles and improve classification accuracy on new validation data. A traditional statistical distance measure enables this deterministic partitioning. Heuristics for both the two-group classification problem and k-group classification problem are presented. We show that these heuristics result in generalizable neural network models that produce more accurate classification results, on average, than several commonly used classification techniques. In addition, we compare several two-group simulated and real-world data sets with respect to the interior and boundary positions of observations within their groups' convex polyhedrons. We show by example that projecting the interior points of simulated data to the boundary of their group polyhedrons generates convex shapes similar to real-world data group convex polyhedrons.
Our two-group deterministic partitioning heuristic is then applied to the repositioned simulated data, producing results superior to several commonly used classification techniques.
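The core idea of distance-based deterministic partitioning can be sketched as follows: rank each observation by its statistical distance from its group mean and place the outermost (boundary) points in the training set. The diagonal (variance-only) Mahalanobis distance and the 70/30 split below are simplifying assumptions, not the dissertation's exact heuristic:

```python
# Minimal sketch of deterministic training-data partitioning driven by a
# statistical distance measure. A diagonal Mahalanobis distance and a
# fixed 70/30 split are illustrative assumptions.
import statistics

def partition(group, train_frac=0.7):
    """Return (train, validation) lists for one group of 2-D points."""
    means = [statistics.mean(p[d] for p in group) for d in (0, 1)]
    vars_ = [statistics.variance(p[d] for p in group) for d in (0, 1)]

    def dist(p):  # diagonal Mahalanobis distance from the group mean
        return sum((p[d] - means[d]) ** 2 / vars_[d] for d in (0, 1))

    ranked = sorted(group, key=dist, reverse=True)  # boundary points first
    cut = int(round(train_frac * len(group)))
    return ranked[:cut], ranked[cut:]

group = [(0, 0), (1, 0), (0, 1), (5, 5), (-4, 3), (2, 2), (0.1, 0.2),
         (-0.2, 0.1), (3, -3), (1, 1)]
train, valid = partition(group)
print(len(train), len(valid))  # boundary points train, interior validates
```

Unlike random partitioning, this split is reproducible and guarantees the training set covers the group's convex boundary, which is the property the heuristics above exploit.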
- Development of Sustainable Traffic Control Principles for Self-Driving Vehicles: A Paradigm Shift Within the Framework of Social Justice
Mladenovic, Milos (Virginia Tech, 2014-08-22)
The development of commercial self-driving vehicle (SDV) technology has the potential for a paradigm shift in traffic control technology. Contrary to some previous research approaches, this research argues that, like any other technology, traffic control technology for SDVs should be developed with improved quality of life in mind, through a sustainable developmental approach. Consequently, this research emphasizes the social perspective of sustainability, considering its neglect in conventional control principles and the importance of behavioral considerations for accurately predicting impacts upon economic or environmental factors. The premise is that traffic control technology can affect the distribution of advantages and disadvantages in a society, and thus it requires a framework of social justice. The framework of social justice is inspired by John Rawls' Theory of Justice as fairness and tries to protect the inviolability of each user in the system. Consequently, the control objective is the distribution of delay per individual, considering, for example, that the effect of delay is not the same if a person is traveling to a grocery store as opposed to traveling to a hospital. The notion of social justice is developed as a priority system, with end-user responsibility, where the user is able to assign a specific Priority Level to each individual trip with an SDV. The selected Priority Level is used to determine the right-of-way for each self-driving vehicle at an intersection. As a supporting mechanism to the priority system, there is a structure of non-monetary Priority Credits. Rules for using Priority Credits are determined using knowledge from social science research and through empirical evaluation using surveys, interviews, and a web-based experiment.
In the physical space, the intersection control principle is developed as hierarchical self-organization, utilizing communication, sensing, and in-vehicle technological capabilities. This distributed control approach should enable robustness against failure, and scalability for future expansion. The control mechanism has been modeled as an agent-based system, allowing evaluation of effects upon safety and user delay. In conclusion, by reaching across multiple disciplines, this development provides the promise and the challenge for evolving SDV control technology. Future efforts for SDV technology development should continue to rely upon transparent public involvement and understanding of human decision-making.
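The priority-level mechanism described above can be sketched in a few lines: each vehicle bids a Priority Level for its current trip, the intersection serves higher priorities first, and selecting a high priority spends non-monetary credits. The credit price schedule and the arrival-order tie-break below are illustrative assumptions, not the dissertation's empirically derived rules:

```python
# Minimal sketch of priority-based right-of-way with non-monetary
# credits. The credit cost schedule and tie-break rule are assumptions.
from dataclasses import dataclass

@dataclass
class Vehicle:
    name: str
    priority: int       # chosen per trip (1 = routine, 5 = urgent)
    credits: int = 20
    arrival: int = 0

COST = {1: 0, 2: 1, 3: 3, 4: 6, 5: 10}   # assumed credit schedule

def serve_order(vehicles):
    """Grant right-of-way by priority, then first-come-first-served."""
    for v in vehicles:
        v.credits -= COST[v.priority]    # spend credits on the bid
    return sorted(vehicles, key=lambda v: (-v.priority, v.arrival))

cars = [Vehicle("grocery-run", 1, arrival=0),
        Vehicle("hospital-trip", 5, arrival=1),
        Vehicle("commute", 2, arrival=2)]
order = [v.name for v in serve_order(cars)]
print(order)  # the hospital trip crosses first despite arriving later
```

Because high priorities drain a finite credit balance, users have an incentive to reserve them for trips where delay genuinely matters, which is the fairness mechanism the entry describes.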
- Disruption Information, Network Topology and Supply Chain Resilience
Li, Yuhong (Virginia Tech, 2017-07-17)
This dissertation consists of three essays studying three closely related aspects of supply chain resilience. The first essay is "Value of Supply Disruption Information and Information Accuracy", in which we examine the factors that influence the value of supply disruption information, investigate how information accuracy influences this value, and provide managerial suggestions to practitioners. The study is motivated by the fact that fully accurate disruption information may be difficult and costly to obtain and inaccurate disruption information can decrease the financial benefit of prior knowledge and even lead to negative performance. We perform the analysis by adopting a newsvendor model. The results show that information accuracy, specifically information bias and information variance, plays an important role in determining the value of disruption information. However, this influence varies at different levels of disruption severity and resilience capacity. The second essay is "Quantifying Supply Chain Resilience: A Dynamic Approach", in which we provide a new type of quantitative framework for assessing network resilience. This framework includes three basic elements: robustness, recoverability and resilience, which can be assessed with respect to different performance measures. Then we present a comprehensive analysis on how network structure and other parameters influence these different elements. The results of this analysis clearly show that both researchers and practitioners should be aware of the possible tradeoffs among different aspects of supply chain resilience. The ability of the framework to support better decision making is then illustrated through a systemic analysis based on a real supply chain network.
The third essay is "Network Characteristics and Supply Chain Disruption Resilience", in which we investigate the relationships between network characteristics and supply chain resilience. In this work, we first show that investigating network characteristics can lead to a better understanding of supply chain resilience behaviors. Next, we select key characteristics that play a critical role in determining network resilience. We then construct regression and decision tree models of different supply chain resilience measures, which can be used to estimate supply chain network resilience given the key influential characteristics. Finally, we conduct a case study to examine the estimation accuracy.
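The newsvendor analysis underlying the first essay can be illustrated with the standard critical-fractile calculation. This is a minimal sketch assuming normally distributed demand; the parameter values are hypothetical and not taken from the dissertation:

```python
# Classic newsvendor order quantity: Q* = F^{-1}(cu / (cu + co)),
# where cu is the underage cost and co the overage cost.
from statistics import NormalDist

def newsvendor_quantity(mu, sigma, unit_cost, price, salvage=0.0):
    cu = price - unit_cost    # underage cost: margin lost per unit of unmet demand
    co = unit_cost - salvage  # overage cost: loss per unsold unit
    critical_ratio = cu / (cu + co)
    return NormalDist(mu, sigma).inv_cdf(critical_ratio)

# Hypothetical parameters: mean demand 100, std. dev. 20, cost 6, price 10.
# Critical ratio = 4/10 = 0.4, so Q* falls slightly below mean demand.
q = newsvendor_quantity(mu=100, sigma=20, unit_cost=6, price=10)
```

Disruption information would enter such a model by shifting the demand or supply distribution the firm plans against, which is where the accuracy of that information starts to matter.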
- Ensemble Learning Techniques for Structured and Unstructured Data. King, Michael Allen (Virginia Tech, 2015-04-01). This research provides an integrated approach of applying innovative ensemble learning techniques that has the potential to increase the overall accuracy of classification models. Actual structured and unstructured data sets from industry are utilized during the research process, analysis and subsequent model evaluations. The first research section addresses the consumer demand forecasting and daily capacity management requirements of a nationally recognized alpine ski resort in the state of Utah, in the United States of America. A basic econometric model is developed, and three classic predictive models are evaluated for effectiveness. These predictive models were subsequently used as input for four ensemble modeling techniques. Ensemble learning techniques are shown to be effective. The second research section discusses the opportunities and challenges faced by a leading firm providing sponsored search marketing services. The goal for sponsored search marketing campaigns is to create advertising campaigns that better attract and motivate a target market to purchase. This research develops a method for classifying profitable campaigns and maximizing overall campaign portfolio profits. Four traditional classifiers are utilized, along with four ensemble learning techniques, to build classifier models to identify profitable pay-per-click campaigns. A MetaCost ensemble configuration, having the ability to integrate unequal classification costs, produced the highest campaign portfolio profit. The third research section addresses the management challenges of online consumer reviews encountered by service industries and addresses how these textual reviews can be used for service improvements. A service improvement framework is introduced that integrates traditional text mining techniques and second-order feature derivation with ensemble learning techniques.
The concept of GLOW and SMOKE words is introduced and is shown to be an objective text analytic source of service defects or service accolades.
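The ensemble idea running through these studies can be illustrated with simple majority voting over base classifiers. This is a minimal stdlib sketch; the dissertation's techniques, such as MetaCost's cost-sensitive relabeling, are considerably richer, and the labels below are hypothetical:

```python
# Majority-vote ensembling: combine the label predictions of several
# base classifiers by taking the most common label per instance.
from collections import Counter

def majority_vote(predictions):
    """predictions: one list of labels per base classifier, aligned by instance."""
    combined = []
    for labels in zip(*predictions):  # one tuple of votes per instance
        combined.append(Counter(labels).most_common(1)[0][0])
    return combined

# Three hypothetical base classifiers' predictions on four campaigns:
base = [
    ["profit", "loss",   "profit", "loss"],
    ["profit", "profit", "profit", "loss"],
    ["loss",   "loss",   "profit", "profit"],
]
ensemble = majority_vote(base)  # -> ['profit', 'loss', 'profit', 'loss']
```

A cost-sensitive variant like MetaCost would instead weight the votes by misclassification cost, so that expensive errors (e.g., funding an unprofitable campaign) are penalized more heavily than cheap ones.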
- Evolving Technologies Shaping Public Transit. Epanty, Efon Mandong (Virginia Tech, 2024-02-01). The transit industry is changing rapidly due to technology, which in turn changes business models, ridership, travel patterns, and the transit workforce. As transit agencies introduce new technology systems, research is needed on how these systems impact demand for paratransit and on-demand mobility services. This research addresses this topic by studying the impact of technology on demand-responsive transportation and urban mobility. Over the past two decades, this sector has been transformed by cloud computing, machine learning, artificial intelligence, ridesharing, and mobility-on-demand. This dissertation explores the adoption of new technology by transit agencies and service providers, focusing on implementing app-based dynamic technologies for dispatching and scheduling demand-responsive transportation modes such as microtransit services, on-demand transit, and paratransit. Although studies on technological changes in other sectors have been conducted, public transit agencies need a more systematic approach to adopting new technology. Current literature on technology adoption in public transit focuses on the benefits and outcomes of technology adoption, with limited discussion of the challenges faced in adopting and implementing technologies. Comprehensive research on the emerging and evolving transit technological landscape is essential to bridge this gap. This research examines how transit agencies react to internal and external technological changes as their operational, tactical, and strategic operating conditions evolve. The aim is to enhance the current comprehension of the topic by providing a comprehensive overview of the technology adoption methodology and to offer practical planning and policy recommendations where possible. A mixed-methods approach was applied to explore the research questions.
Transit practitioners and managers in the Washington DC region were surveyed, and the analysis techniques employed included cross-tabulation and descriptive statistics. This dissertation focuses on gaining insight into adopting real-time dynamic dispatching and scheduling, on-demand transit, and microtransit technologies, including the opinions of transit practitioners and policymakers involved in facilitating technology adoption. Specifically, the study aims to: 1) understand the impact of adopting emerging paratransit technologies; 2) investigate on-demand transit system performance outcomes in terms of ridership, on-time performance, and operating costs, using a survey and expert interviews; and 3) investigate the use of a multicriteria decision-making approach to evaluate accessibility considerations in microtransit adoption planning and design strategies. The results suggest that current technology adoption approaches in transit can significantly enhance decision-making and transit outcomes while addressing the equity and accessibility needs of the community and maintaining coverage and route frequency. The Socio-Technical-Systems (STS) approach was applied to help understand the adoption of new technology in demand-responsive transit. This approach provides insights into technology, accessibility, decision-making, functionality, and interchangeability, enhancing our understanding of social complexity. Additionally, this research introduces a multi-level decision-making framework to measure service performance and provides insights into the impact of transportation technology on planning, policy, and decision-making processes.
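The multicriteria evaluation in the third aim can be sketched as a weighted-sum score, a common baseline among multicriteria decision-making methods. The criteria, weights, and scores below are hypothetical assumptions for illustration, not the dissertation's framework:

```python
# Weighted-sum multicriteria scoring: each option gets a normalized score
# per criterion in [0, 1]; the overall score is the weight-averaged sum.
def weighted_score(scores, weights):
    total_w = sum(weights.values())
    return sum(scores[c] * w for c, w in weights.items()) / total_w

# Hypothetical microtransit design options evaluated on three criteria:
weights  = {"accessibility": 0.4, "coverage": 0.3, "cost": 0.3}
option_a = {"accessibility": 0.9, "coverage": 0.6, "cost": 0.5}
option_b = {"accessibility": 0.5, "coverage": 0.8, "cost": 0.9}

score_a = weighted_score(option_a, weights)  # 0.36 + 0.18 + 0.15 = 0.69
score_b = weighted_score(option_b, weights)  # 0.20 + 0.24 + 0.27 = 0.71
```

Weighting accessibility heavily, as a transit agency might, can flip which option wins, which is exactly why the weighting step deserves stakeholder input.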
- Examining Electronic Markets in Which Intelligent Agents Are Used for Comparison Shopping and Dynamic Pricing. Hertweck, Bryan M. (Virginia Tech, 2005-09-08). Electronic commerce markets are becoming increasingly popular forums for commerce. As those markets mature, buyers and sellers will both vigorously seek techniques to improve their performance. The Internet lends itself to the use of agents to work on behalf of buyers and sellers. Through simulation, this research examines different implementations of buyers' agents (shopbots) and sellers' agents (pricebots) so that buyers, sellers, and agent builders can capitalize on the evolution of e-commerce technologies. Internet markets bring price visibility to a level beyond what is observed in traditional brick-and-mortar markets. Additionally, an online seller is able to update prices quickly and cheaply. Due to these facts, there are many pricing strategies that sellers can implement via pricebots to react to their environments. The best strategy for a particular seller depends on the characteristics of its marketplace. This research shows that the extent to which buyers use shopbots is a critical driver of the success of pricing strategies. When measuring profitability, the interaction between shopbot usage and seller strategy is very strong: what works well at low shopbot usage levels may perform poorly at high levels. If a seller is evaluating strategies based on sales volume, the choice may change. Additionally, as markets evolve and competitors change strategies, the choice of most effective counterstrategies may evolve as well. Sellers need to clearly define their goals and thoroughly understand their marketplace before choosing a pricing strategy. Just as sellers have choices to make in implementing pricebots, buyers have decisions to make with shopbots. In addition to the factors described above, the types of shopbots in use can actually affect the relative performance of pricing strategies.
This research also shows that varying shopbot implementations (specifically involving the use of a price memory component) can affect the prices that buyers ultimately pay - an especially important consideration for high-volume buyers. Modern technology permits software agents to employ artificial intelligence. This work demonstrates the potential of neural networks as a tool for pricebots. As discussed above, a seller's best strategy option can change as the behavior of the competition changes. Simulation can be used to evaluate a multitude of scenarios and determine what strategies work best under what conditions. This research shows that a neural network can be effectively implemented to classify the behavior of competitors and point to the best counterstrategy.
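The central interaction here, that a pricing strategy's payoff depends on how many buyers use shopbots, can be sketched with a toy market simulation. All parameters are hypothetical and the dissertation's agent-based simulation is far richer:

```python
# Toy shopbot/pricebot market: a fraction of buyers compare prices and buy
# from the cheapest seller; the rest pick a seller at random.
import random

def simulate(prices, shopbot_share, n_buyers=10_000, seed=1):
    random.seed(seed)
    revenue = [0.0] * len(prices)
    cheapest = min(range(len(prices)), key=lambda i: prices[i])
    for _ in range(n_buyers):
        if random.random() < shopbot_share:
            i = cheapest                       # shopbot buyer: lowest price wins
        else:
            i = random.randrange(len(prices))  # uninformed buyer: random seller
        revenue[i] += prices[i]
    return revenue

# Seller 0 undercuts (price 8 vs 10). Its advantage is modest when few
# buyers use shopbots, but dominant when most do.
low_usage  = simulate([8.0, 10.0, 10.0], shopbot_share=0.1)
high_usage = simulate([8.0, 10.0, 10.0], shopbot_share=0.9)
```

Even this minimal model reproduces the qualitative finding: undercutting only pays off decisively once shopbot penetration is high, so a seller choosing a strategy must first estimate how informed its buyers are.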
- Firms' Resilience to Supply Chain Disruptions. Baghersad, Milad (Virginia Tech, 2018-07-16). This dissertation consists of three papers related to firms' resilience to supply chain disruptions. The first paper seeks to evaluate the effects of supply chain disruptions on firms' performance by using a recent dataset of supply chain disruptions. To this end, we analyzed the operating and stock market performance of over 300 firms that experienced a supply chain disruption between 2005 and the end of 2014. The results show that supply chain disruptions are still associated with a significant decrease in operating income, return on sales, return on assets, and sales, as well as negative performance in total assets. Supply chain disruptions are also associated with a significant negative abnormal stock return on the day of the supply chain disruption announcements. These results are in line with previous findings in the literature. In the second paper, in order to provide a more detailed characterization of the negative impacts of disruptions on firms' performance, we develop three complementary measures of system loss: the initial loss due to the disruption, the maximum loss, and the total loss over time. Then, we utilize the contingent resource-based view to evaluate the moderating effects of operational slack and operational scope on the relationship between the severity of supply chain disruptions and the three complementary measures of system loss. We find that maintaining certain aspects of operational slack and broadening business scope can affect these different measures of loss in different ways, although these effects are contingent on the disruptions' severity. The third paper examines relationships between the origin of supply chain disruptions, firms' past experience, and the negative impacts of supply chain disruptions on firms' performance.
This third study shows that the impact of external and internal supply chain disruptions on firms' performance can be different when firms do and do not have past experience with similar events. For example, the results show that past experience significantly decreases initial loss, recovery time, and total loss over time experienced by firms after internal disruptions, although past experience may not decrease initial loss, recovery time, and total loss over time in the case of external disruptions.
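The three complementary loss measures described in the second paper can be sketched from a performance time series. This is a simplified illustration under assumed definitions (drop at the first post-disruption observation, largest drop below baseline, and accumulated shortfall until recovery); the dissertation's exact formulations may differ:

```python
# Three complementary loss measures from a post-disruption performance series.
def loss_measures(performance, baseline):
    shortfalls = [max(baseline - p, 0.0) for p in performance]
    initial_loss = shortfalls[0]       # immediate drop when the disruption hits
    maximum_loss = max(shortfalls)     # deepest point below the baseline
    total_loss = sum(shortfalls)       # area of the shortfall (unit time steps)
    return initial_loss, maximum_loss, total_loss

# Hypothetical performance recovering to a pre-disruption baseline of 100:
series = [80, 70, 85, 95, 100]
measures = loss_measures(series, baseline=100.0)  # -> (20.0, 30.0, 70.0)
```

Separating the three measures matters because, as the essay argues, a buffer like operational slack may shrink the initial drop without shortening the recovery, so a single loss number would hide the tradeoff.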