Browsing by Author "Ragsdale, Cliff T."
- An Agent-Based Distributed Decision Support System Framework for Mediated Negotiation
LoPinto, Frank Anthony (Virginia Tech, 2004-04-26)
Implementing an e-market for limited supply perishable asset (LiSPA) products is a problem at the intersection of online purchasing and distributed decision support systems (DistDSS). In this dissertation, we introduce and define LiSPA products, provide real-world examples, develop a framework for a distributed system to implement an e-market for LiSPA products, and provide proof-of-concept for the two major components of the framework. The DistDSS framework requires customers to instantiate agents that learn their preferences and evaluate products on their behalf. Accurately eliciting and modeling customer preferences in a quick and easy manner is a major hurdle for implementing this agent-based system. A methodology is developed for this problem using conjoint analysis and neural networks. The framework also contains a model component that is addressed in this work. The model component is presented as a mediator of customer negotiation that uses the agent-based preference models mentioned above and employs a linear programming model to maximize overall satisfaction of the total market.
- Automated extraction of product feedback from online reviews: Improving efficiency, value, and total yield
Goldberg, David Michael (Virginia Tech, 2019-04-25)
In recent years, the expansion of online media has presented firms with rich and voluminous new datasets with profound business applications. Among these, online reviews provide nuanced details on consumers' interactions with products. Analysis of these reviews has enormous potential, but the enormity of the data and the nature of unstructured text make mining these insights challenging and time-consuming. This paper presents three studies examining this problem and suggesting techniques for automated extraction of vital insights. The first study examines the problem of identifying mentions of safety hazards in online reviews. Discussions of hazards may have profound importance for firms and regulators as they seek to protect consumers. However, as most online reviews do not pertain to safety hazards, identifying this small portion of reviews is a challenging problem. Much of the literature in this domain focuses on selecting "smoke terms," or specific words and phrases closely associated with the mentions of safety hazards. We first examine and evaluate prior techniques to identify these reviews, which incorporate substantial human opinion in curating smoke terms and thus vary in their effectiveness. We propose a new automated method that utilizes a heuristic to curate smoke terms, and we find that this method is far more efficient than the human-driven techniques. Finally, we incorporate consumers' star ratings in our analysis, further improving prediction of safety hazard-related discussions. The second study examines the identification of consumer-sourced innovation ideas and opportunities from online reviews. We build upon a widely-accepted attribute mapping framework from the entrepreneurship literature for evaluating and comparing product attributes.
We first adapt this framework for use in the analysis of online reviews. Then, we develop analytical techniques based on smoke terms for automated identification of innovation opportunities mentioned in online reviews. These techniques can be used to profile products as to attributes that affect or have the potential to affect their competitive standing. In collaboration with a large countertop appliances manufacturer, we assess and validate the usefulness of these suggestions, tying together the theoretical value of the attribute mapping framework and the practical value of identifying innovation-related discussions in online reviews. The third study addresses safety hazard monitoring for use cases in which a higher yield of safety hazards detected is desirable. We note a trade-off between the efficiency of hazard techniques described in the first study and the depth of such techniques, as a high proportion of identified records refer to true hazards, but several important hazards may be undetected. We suggest several techniques for handling this trade-off, including alternate objective functions for heuristics and fuzzy term matching, which improve the total yield. We examine the efficacy of each of these techniques and contrast their merits with past techniques. Finally, we test the capability of these methods to generalize to online reviews across different product categories.
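The smoke-term approach described above, scoring each review against a curated list of hazard-associated words and phrases, can be sketched in a few lines. The terms, weights, and threshold below are illustrative stand-ins, not the curated lists or heuristic from the dissertation:

```python
# Minimal sketch of smoke-term scoring for flagging potential
# safety-hazard reviews. Term list and weights are invented examples.
SMOKE_TERMS = {"fire": 3.0, "burn": 3.0, "shock": 2.5, "spark": 2.0,
               "overheat": 2.0, "recall": 1.5, "injury": 2.5}

def smoke_score(review: str) -> float:
    """Sum the weights of smoke terms appearing in a review."""
    words = review.lower().split()
    return sum(SMOKE_TERMS.get(w, 0.0) for w in words)

def flag_reviews(reviews, threshold=2.0):
    """Return reviews whose smoke score meets the threshold."""
    return [r for r in reviews if smoke_score(r) >= threshold]

reviews = [
    "Great toaster, works perfectly every morning.",
    "The unit started to spark and overheat after a week.",
]
print(flag_reviews(reviews))  # flags only the second review
```

In practice a curation heuristic (or human expert) would select and weight the terms, and the fuzzy matching discussed in the third study would relax the exact word-match used here.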
- Behavioral Logistics and Fatigue Management in Vehicle Routing and Scheduling Problems
Bowden, Zachary E. (Virginia Tech, 2016-05-03)
The vehicle routing problem (VRP) is a classic optimization problem that aims to determine the optimal set of routes for a fleet of vehicles to meet the demands of a set of customers. The VRP has been studied for many decades and as such, there are many variants and extensions to the original problem. The research presented here focuses on two different types of vehicle routing and scheduling planning problems: car shipping and fatigue-aware scheduling. In addition to modeling and solving the car shipping problem, this research presents novel ways in which drivers can describe their route preferences in a decision support system. This work also introduces the first fatigue-aware vehicle scheduling problem, called the Truck Driver Scheduling Problem with Fatigue Management (TDSPFM). The TDSPFM is utilized to produce schedules that keep the drivers more alert than existing commercial vehicle regulations. Finally, this work analyzes the effect of the starting alertness level on driver alertness for the remainder of the work week and examines a critical shortcoming in existing regulations.
- Building a knowledge based simulation optimization system with discovery learning
Siochi, Fernando C. (Virginia Tech, 1995)
Simulation optimization is a developing research area whereby a set of input conditions is sought that produce a desirable output (or outputs) to a simulation model. Although many approaches to simulation optimization have been developed, the research area is by no means mature. This research makes three contributions in the area of simulation optimization. The first is fundamental in that it examines simulation outputs, called "response surfaces," and notes their behavior. In particular, both point and region estimates are studied for different response surfaces. Conclusions are developed that indicate when and where simulation-optimization techniques such as Response Surface Methodology should be applied. The second contribution provides assistance in selecting a region to begin a simulation-optimization search. The new method is based upon the artificial-intelligence technique of best-first search. Two examples of the method are given. The final contribution of this research expands upon the ideas by Crouch for building a "Learner" to improve heuristics in simulation over time. The particular case of parameter-modification learning is developed and illustrated by example. The dissertation concludes with limitations and suggestions for future work.
- Computational Studies in Multi-Criteria Scheduling and Optimization
Martin, Megan Wydick (Virginia Tech, 2017-08-11)
Multi-criteria scheduling provides the opportunity to create mathematical optimization models that are applicable to a diverse set of problem domains in the business world. This research addresses two different employee scheduling applications using multi-criteria objectives that present decision makers with trade-offs between global optimality and the level of disruption to current operating resources. Additionally, it investigates a scheduling problem from the product testing domain and proposes a heuristic solution technique for the problem that is shown to produce very high-quality solutions in short amounts of time. Chapter 2 addresses a grant administration workload-to-staff assignment problem that occurs in the Office of Research and Sponsored Programs at land-grant universities. We identify the optimal workload assignment plan, which differs considerably from the current state due to multiple reassignments. To achieve the optimal workload reassignment plan we demonstrate a technique to identify the n best reassignments from the current state that provide the greatest progress toward the utopian solution. Solving this problem over several values of n and plotting the results allows the decision maker to visualize the reassignments and the progress achieved toward the utopian balanced workload solution. Chapter 3 identifies a weekly schedule that seeks the most cost-effective set of coach-to-program assignments in a gymnastics facility. We identify the optimal assignment plan using an integer linear programming model. The optimal assignment plan differs greatly from the status quo; therefore, we utilize a similar approach from Chapter 2 and use a multiple objective optimization technique to identify the n best staff reassignments.
Again, the decision maker can visualize the trade-off between the number of reassignments and the resulting progress toward the utopian staffing cost solution and make an informed decision about the best number of reassignments. Chapter 4 focuses on product test scheduling in the presence of in-process and at-completion inspection constraints. Such testing arises in the context of the manufacture of products that must perform reliably in extreme environmental conditions. Each product receives a certification at the successful completion of a predetermined series of tests. Operational efficiency is enhanced by determining the optimal order and start times of tests so as to minimize the makespan while ensuring that technicians are available when needed to complete in-process and at-completion inspections. We first formulate a mixed-integer linear programming (MILP) model to identify the optimal solution to this problem using IBM ILOG CPLEX Interactive Optimizer 12.7. We also present a genetic algorithm (GA) solution that is implemented and solved in Microsoft Excel. Computational results are presented demonstrating the relative merits of the MILP and GA solution approaches across a number of scenarios.
- Decision support for long-range, community-based planning to mitigate against and recover from potential multiple disasters
Chacko, Josey; Rees, Loren P.; Zobel, Christopher W.; Rakes, Terry R.; Russell, Roberta S.; Ragsdale, Cliff T. (Elsevier, 2016-07-01)
This paper discusses a new mathematical model for community-driven disaster planning that is intended to help decision makers exploit the synergies resulting from simultaneously considering actions focusing on mitigation and efforts geared toward long-term recovery. The model is keyed on enabling long-term community resilience in the face of potential disasters of varying types, frequencies, and severities, and the approach’s highly iterative nature is facilitated by the model’s implementation in the context of a Decision Support System. Three examples from Mombasa, Kenya, East Africa, are discussed and compared in order to demonstrate the advantages of the new mathematical model over the current ad hoc mitigation and long-term recovery planning approaches that are typically used.
- A Decision Support System for the Electrical Power Districting Problem
Bergey, Paul K. (Virginia Tech, 2000-04-21)
Due to a variety of political, economic, and technological factors, many national electricity industries around the globe are transforming from non-competitive monopolies with centralized systems to decentralized operations with competitive business units. This process, commonly referred to as deregulation (or liberalization), is driven by the belief that a monopolistic industry fails to achieve economic efficiency for consumers over the long run. Deregulation has occurred in a number of industries such as aviation, natural gas, transportation, and telecommunications. The most recent movement involving the deregulation of the electricity marketplace is expected to yield consumer benefit as well. To facilitate deregulation of the electricity marketplace, competitive business units must be established to manage various functions and services independently. In addition, these business units must be given physical property rights for certain parts of the transmission and distribution network in order to provide reliable service and make effective business decisions. However, partitioning a physical power grid into economically viable districts involves many considerations. We refer to this complex problem as the electrical power districting problem. This research is intended to identify the necessary and fundamental characteristics to appropriately model and solve an electrical power districting problem. Specifically, the objectives of this research are five-fold. First, to identify the issues relevant to electrical power districting problems. Second, to investigate the similarities and differences of electrical power districting problems with other districting problems published in the research literature. Third, to develop and recommend an appropriate solution methodology for electrical power districting problems.
Fourth, to demonstrate the effectiveness of the proposed solution method for a specific case of electric power districting in the Republic of Ghana, with data provided by the World Bank. Finally, to develop a decision support system for the decision makers at the World Bank for solving Ghana's electrical power districting problem.
- Design and Application of Genetic Algorithms for the Multiple Traveling Salesperson Assignment Problem
Carter, Arthur E. (Virginia Tech, 2003-04-21)
The multiple traveling salesmen problem (MTSP) is an extension of the traveling salesman problem with many production and scheduling applications. The TSP has been well studied, including methods of solving the problem with genetic algorithms. The MTSP has also been studied and solved with GAs in the form of the vehicle-scheduling problem. This work presents a new modeling methodology for setting up the MTSP to be solved using a GA. The advantages of the new model are compared to existing models both mathematically and experimentally. The model is also used to model and solve a multi-line production problem in a spreadsheet environment. The new model proves itself to be an effective method to model the MTSP for solving with GAs. The MTSP concept is then used to model, and solve with a GA, the case in which one salesman makes many short tours to visit all the cities instead of one continuous trip. While this problem uses only one salesman, it can be modeled as a MTSP and has many applications for people who must visit many cities on a number of short trips. The method used effectively creates a schedule while considering all required constraints.
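One common GA encoding for the MTSP, not necessarily the dissertation's exact model, is a two-part chromosome: a permutation of the cities plus a list giving how many cities each salesperson visits. A minimal decode-and-evaluate sketch, with invented coordinates and a simple depot-to-depot distance fitness:

```python
# Sketch of a two-part MTSP chromosome: city permutation + tour sizes.
# City coordinates and the depot location are illustrative.
import math

CITIES = {0: (0, 0), 1: (1, 5), 2: (4, 1), 3: (6, 4), 4: (2, 2)}
DEPOT = (3, 3)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def decode(perm, sizes):
    """Split a city permutation into one tour per salesperson."""
    tours, i = [], 0
    for s in sizes:
        tours.append(perm[i:i + s])
        i += s
    return tours

def total_distance(perm, sizes):
    """Fitness: summed length of all depot-to-depot tours."""
    total = 0.0
    for tour in decode(perm, sizes):
        stops = [DEPOT] + [CITIES[c] for c in tour] + [DEPOT]
        total += sum(dist(stops[j], stops[j + 1])
                     for j in range(len(stops) - 1))
    return total

# Two salespeople: the first visits 3 cities, the second visits 2.
print(round(total_distance([1, 3, 0, 2, 4], [3, 2]), 2))
```

A GA would then apply crossover and mutation to both chromosome parts, which is straightforward to do in a spreadsheet environment as the dissertation describes.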
- A Deterministic Approach to Partitioning Neural Network Training Data for the Classification Problem
Smith, Gregory Edward (Virginia Tech, 2006-08-07)
The classification problem in discriminant analysis involves identifying a function that accurately classifies observations as originating from one of two or more mutually exclusive groups. Because no single classification technique works best for all problems, many different techniques have been developed. For business applications, neural networks have become the most commonly used classification technique and though they often outperform traditional statistical classification methods, their performance may be hindered because of failings in the use of training data. This problem can be exacerbated because of small data set size. In this dissertation, we identify and discuss a number of potential problems with typical random partitioning of neural network training data for the classification problem and introduce deterministic partitioning methods that overcome these obstacles and improve classification accuracy on new validation data. A traditional statistical distance measure enables this deterministic partitioning. Heuristics for both the two-group classification problem and k-group classification problem are presented. We show that these heuristics result in generalizable neural network models that produce more accurate classification results, on average, than several commonly used classification techniques. In addition, we compare several two-group simulated and real-world data sets with respect to the interior and boundary positions of observations within their groups' convex polyhedrons. We show by example that projecting the interior points of simulated data to the boundary of their group polyhedrons generates convex shapes similar to real-world data group convex polyhedrons.
Our two-group deterministic partitioning heuristic is then applied to the repositioned simulated data, producing results superior to several commonly used classification techniques.
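The core idea of deterministic partitioning, using a distance measure rather than random sampling to decide which observations train the network, can be sketched as follows. Plain Euclidean distance to the group centroid stands in here for the statistical distance measure the dissertation actually uses, and the split rule (most boundary-like points to training) is a simplification:

```python
# Sketch of deterministic train/validation partitioning for one group.
# Euclidean distance to the centroid is an illustrative stand-in for a
# statistical distance measure; the split rule is a simplification.
import math

def centroid(points):
    n = len(points)
    return tuple(sum(p[d] for p in points) / n
                 for d in range(len(points[0])))

def partition(points, train_frac=0.5):
    """Deterministically split a group's points: the farthest-from-
    centroid (boundary) points go to training, the rest to validation."""
    c = centroid(points)
    ranked = sorted(points, key=lambda p: -math.dist(p, c))
    k = int(len(ranked) * train_frac)
    return ranked[:k], ranked[k:]

group = [(0.0, 0.0), (0.1, 0.1), (2.0, 2.0), (-2.0, 1.5)]
train, valid = partition(group)
print(train, valid)
```

The point of such a rule is that every run of the experiment trains on the same, deliberately chosen observations, removing the variance that random partitioning injects into small data sets.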
- Ensemble Learning Techniques for Structured and Unstructured Data
King, Michael Allen (Virginia Tech, 2015-04-01)
This research provides an integrated approach of applying innovative ensemble learning techniques that has the potential to increase the overall accuracy of classification models. Actual structured and unstructured data sets from industry are utilized during the research process, analysis and subsequent model evaluations. The first research section addresses the consumer demand forecasting and daily capacity management requirements of a nationally recognized alpine ski resort in the state of Utah, in the United States of America. A basic econometric model is developed, and three classic predictive models are evaluated for effectiveness. These predictive models were subsequently used as input for four ensemble modeling techniques. Ensemble learning techniques are shown to be effective. The second research section discusses the opportunities and challenges faced by a leading firm providing sponsored search marketing services. The goal for sponsored search marketing campaigns is to create advertising campaigns that better attract and motivate a target market to purchase. This research develops a method for classifying profitable campaigns and maximizing overall campaign portfolio profits. Four traditional classifiers are utilized, along with four ensemble learning techniques, to build classifier models to identify profitable pay-per-click campaigns. A MetaCost ensemble configuration, having the ability to integrate unequal classification cost, produced the highest campaign portfolio profit. The third research section addresses the management challenges of online consumer reviews encountered by service industries and addresses how these textual reviews can be used for service improvements. A service improvement framework is introduced that integrates traditional text mining techniques and second order feature derivation with ensemble learning techniques.
The concept of GLOW and SMOKE words is introduced and is shown to be an objective, text-analytic means of identifying service defects or service accolades.
- Exploratory and Empirical Analysis of E-Marketplaces for Truck Transportation Services Procurement
Collignon, Stephane Eric (Virginia Tech, 2016-08-11)
In the late 1990s and early 2000s, academic literature considered electronic marketplaces as a game changer in truck transportation services procurement. Early enthusiasm was followed by skepticism regarding e-marketplaces' usefulness, and the popularity of e-marketplaces appeared to wane both in industry and in academic literature. However, recent sources argue that almost half of the freight currently transported by truck in the USA is subject to transactions conducted in e-marketplaces. This dissertation intends to fill a gap in the academic literature by showing that truck transportation e-marketplaces necessitate renewed dedicated research efforts, by exploring the strategies implemented by e-marketplaces in this specific industry and by linking these strategies to marketplaces' performance. First, transportation and non-transportation e-marketplaces are compared in chapter 2 with regard to their usage of mechanisms designed to generate trust among users. Results show that truck transportation e-marketplaces use these trust mechanisms differently than non-transportation e-marketplaces, which supports a call for research on e-marketplaces in the specific context of truck transportation services procurement. In chapter 3, a database inventorying the usage of 141 features by 208 e-marketplaces is then created to initiate the empirical exploration of these specific e-marketplaces. Thanks to that database, a new typology (a way of classifying objects based on several simultaneous classification criteria) is developed in chapter 4 that identifies three main truck transportation e-marketplace strategies (two of which are subdivided into two sub-strategies).
The typology provides a view of the state of the industry and puts into perspective the specificity of truck transportation e-marketplaces with regard to their structure along 11 dimensions known to the general e-marketplace literature. Finally, the link between e-marketplace strategies and performance is investigated in chapter 5. Performance is measured with three traffic metrics: number of unique visitors per day, number of page views per day, and website ranking. Results show that third-party-owned e-marketplaces that provide auction mechanisms with a fairly high level of user decision and transaction support are more successful than other e-marketplaces. This dissertation provides a picture of existing e-marketplaces for the procurement of truck transportation services, challenges components of existing theories and provides ground for further research.
- Improving Post-Disaster Recovery: Decision Support for Debris Disposal Operations
Fetter, Gary (Virginia Tech, 2010-03-31)
Disaster debris cleanup operations are commonly organized into two phases. During the first phase, the objective is to clear debris from evacuation and other important pathways to ensure access to the disaster-affected area. Practically, Phase 1 activities largely consist of pushing fallen trees, vehicles, and other debris blocking streets and highways to the curb. These activities begin immediately once the disaster has passed, with the goal of completion usually within 24 to 72 hours. In Phase 2 of debris removal, which is the focus of this study, completion can take months or years. Activities in this phase include organizing and managing curbside debris collection, reduction, recycling, and disposal operations (FEMA 2007). This dissertation research investigates methods for improving post-disaster debris cleanup operations—one of the most important and costly aspects of the least researched area of disaster operations management (Altay and Green 2006). The first objective is to identify the unique nature of the disaster debris cleanup problem and the important decisions faced by disaster debris coordinators. The second goal is to present three research projects that develop methods for assisting disaster management coordinators with debris cleanup operations. In the first project, which is the topic of Chapter 3, a facility location model is developed for addressing the problem of opening temporary disposal and storage reduction facilities, which are needed to ensure efficient and effective cleanup operations. In the second project, which is the topic of Chapter 4, a multiple objective mixed-integer linear programming model is developed to address the problem of assigning debris cleanup resources across the disaster-affected area at the onset of debris cleanup operations.
The third project and the focus of Chapter 5 addresses the problem of equitably controlling ongoing cleanup operations in real-time. A self-balancing CUSUM statistical process control chart is developed to assist disaster management coordinators with equitably allocating cleanup resources as information becomes available in real-time. All of the models in this dissertation are evaluated using data from debris cleanup operations in Chesapeake, Virginia, completed after Hurricane Isabel in 2003.
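A standard tabular CUSUM, the building block behind the real-time control chart described above, can be sketched as follows. The target, slack value k, decision limit h, and data are illustrative, and resetting after a signal is a simplification of the dissertation's self-balancing scheme:

```python
# Sketch of a two-sided tabular CUSUM for monitoring a cleanup-progress
# series. Parameters (target, k, h) and data are illustrative.
def cusum(series, target, k=0.5, h=4.0):
    """Return indices where the upper or lower CUSUM signals."""
    hi = lo = 0.0
    signals = []
    for i, x in enumerate(series):
        hi = max(0.0, hi + (x - target - k))   # drift above target
        lo = max(0.0, lo + (target - x - k))   # drift below target
        if hi > h or lo > h:
            signals.append(i)
            hi = lo = 0.0  # reset after a signal
    return signals

# Daily debris volume cleared in one zone (illustrative numbers).
progress = [10, 11, 9, 10, 15, 16, 17, 15, 10, 10]
print(cusum(progress, target=10))  # -> [4, 5, 6, 7]
```

In a self-balancing setting, a signal that one zone's cleanup rate has drifted away from the others would trigger a reallocation of crews toward the lagging zones.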
- Nonparametric metamodeling for simulation optimization
Keys, Anthony C. (Virginia Tech, 1995-04-19)
Optimization of simulation model performance requires finding the values of the model's controllable inputs that optimize a chosen model response. Responses are usually stochastic in nature, and the cost of simulation model runs is high. The literature suggests the use of metamodels to synthesize the response surface using sample data. In particular, nonparametric regression is proposed as a useful tool in the global optimization of a response surface. As the general simulation optimization problem is very difficult and requires expertise from a number of fields, there is a growing consensus in the literature that a knowledge-based approach to solving simulation optimization problems is required. This dissertation examines the relative performance of the principal nonparametric techniques, spline and kernel smoothing, and subsequently addresses the issues involved in implementing the techniques in a knowledge-based simulation optimization system. The dissertation consists of two parts. In the first part, a full factorial experiment is carried out to compare the performance of kernel and spline smoothing on a number of measures when modeling a varied set of surfaces using a range of small sample sizes. In the second part, nonparametric metamodeling techniques are placed in a taxonomy of stochastic search procedures for simulation optimization and a method for their implementation in a knowledge-based system is presented. A sequential design procedure is developed that allows spline smoothing to be used as a search technique. Throughout the dissertation, a two-input, single-response model is considered. Results from the experiment show that spline smoothing is superior to constant-bandwidth kernel smoothing in fitting the response. Kernel smoothing is shown to be more accurate in placing optima in X-space for sample sizes up to 36.
Inventory model examples are used to illustrate the results. The taxonomy implies that search procedures can be chosen initially using the parameters of the problem. A process that allows for selection of a search technique and its subsequent evaluation for further use or for substitution of another search technique is given. The success of a sequential design method for spline smooths in finding a global optimum is demonstrated using a bimodal response surface.
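The constant-bandwidth kernel smoothing compared above can be sketched as a Nadaraya-Watson estimator: predict the response at a point as a kernel-weighted average of nearby samples. The Gaussian kernel, sample data, and bandwidth below are illustrative, not the dissertation's experimental settings:

```python
# Sketch of a constant-bandwidth Nadaraya-Watson kernel smoother.
# Kernel choice, data, and bandwidth are illustrative.
import math

def kernel_smooth(xs, ys, x0, bandwidth=1.0):
    """Predict the response at x0 as a kernel-weighted mean of ys."""
    weights = [math.exp(-0.5 * ((x - x0) / bandwidth) ** 2) for x in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

# Noisy samples from an (unknown) response surface.
xs = [0, 1, 2, 3, 4, 5]
ys = [0.1, 0.9, 4.2, 8.8, 16.3, 24.9]
print(round(kernel_smooth(xs, ys, 2.5, bandwidth=0.8), 2))
```

Because the bandwidth is held constant everywhere, the fit can be poor where the surface curvature varies, which is one reason the experiment finds spline smoothing superior in fitting the response.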
- Online Review Analytics: New Methods for Discovering Key Product Quality and Service Concerns
Zaman, Nohel (Virginia Tech, 2019-07-09)
This dissertation aims to discover as well as categorize safety concern reports in online reviews by using key terms prevalent in sub-categories of safety concerns. It extends the literature on semi-automatic text classification methodology for monitoring and classifying product quality and service concerns. We develop various text classification methods for finding key concerns across a diverse set of product and service categories. Additionally, we generalize our results by testing the performance of our methodologies on online reviews collected from two different data sources (Amazon product reviews and Facebook hospital service reviews). Stakeholders such as product designers and safety regulators can use the semi-automatic classification procedure to subcategorize safety concerns by injury type and narrative type (Chapter 1). We enhance the text classification approach by proposing a Risk Assessment Model for quality management (QM) professionals, safety regulators, and product designers to allow them to estimate overall risk level of specific products by analyzing consumer-generated content in online reviews (Chapter 2). Monitoring and prioritizing the hazard risk levels of products will help the stakeholders to make appropriate actions on mitigating the risk of product safety. Lastly, the text classification approach discovers and ranks aspects of services that predict overall user satisfaction (Chapter 3). The key service terms are beneficial for healthcare providers to rapidly trace specific service concerns for improving the hospital services.
- Planning and Scheduling of Complex, High Value-Added Service Operations
White, Sheneeta Williams (Virginia Tech, 2009-07-27)
This research takes the initial steps of evaluating resource planning for service operations in which the client is a direct resource in the service system. First, this research examines the effects of client involvement on resource planning decisions when a service firm is faced with efficiency and quality considerations. We develop a non-linear, deterministic, single-stage planning model that allows for examination of trade-offs among client involvement, efficiency and quality. Policy recommendations give service firms better insights into setting workforce, client intensity, and service generation levels. Second, we examine the sensitivity of estimates of technology functions to data analysis and make policy recommendations to service providers on how to allocate resources when there are technology function uncertainties and uncontrollable inputs. Results show that resources are allocated to compensate for technology function uncertainties. Third, we gain insights as to how resource decisions are made for multiple stages and for multiple clients. We extrapolate theoretical findings from the single-stage planning study to determine resource allocations across multiple services and stages. Results show that when the dynamic program in the single-stage study is extended, there is a trade-off between the cost of capacity changes and profits across multiple stages.
- The Product Test Scheduling Problem
Ragsdale, Cliff T.; Martin, Megan; Fico, John; Cajica-Sierra, Carlos; Fetcenko, Richard (2022)
- Quantitative Decision Models for Humanitarian Logistics
Falasca, Mauro (Virginia Tech, 2009-08-20)
Humanitarian relief and aid organizations all over the world implement efforts aimed at recovering from disasters, reducing poverty and promoting human rights. The purpose of this dissertation is to develop a series of quantitative decision models to help address some of the challenges faced by humanitarian logistics. The first study discusses the development of a spreadsheet-based multicriteria scheduling model for a small development aid organization in a South American developing country. Development aid organizations plan and execute efforts that are primarily directed towards promoting human welfare. Because these organizations rely heavily on the use of volunteers to carry out their social mission, it is important that they manage their volunteer workforce efficiently. In this study, we demonstrate not only how the proposed model helps to reduce the number of unfilled shifts and to decrease total scheduling costs, but also how it helps to better satisfy the volunteers’ scheduling preferences, thus supporting long-term retention and effectiveness of the workforce. The purpose of the second study is to develop a decision model to assist in the management of humanitarian relief volunteers. One of the challenges faced by humanitarian organizations is that there are limited decision technologies that fit their needs; it has also been pointed out that these organizations experience coordination difficulties with volunteers willing to help. Even though employee workforce management models have been the topic of extensive research over the past decades, no work has focused on the problem of managing humanitarian relief volunteers. In this study, we discuss a series of principles from the field of volunteer management and develop a multicriteria optimization model to assist in the assignment of both individual volunteers and volunteer groups to tasks.
We present illustrative examples and analyze two complementary solution methodologies that incorporate the decision maker's preferences and knowledge and allow him or her to trade off conflicting objectives. The third study discusses the development of a decision model for the procurement of goods in humanitarian efforts. Despite the prevalence of procurement expenditures in humanitarian operations, procurement in this context has been discussed only qualitatively in the literature. In our paper, we introduce a two-stage decision model with recourse to improve the procurement of goods in humanitarian relief supply chains and present an illustrative example. Conclusions, limitations, and directions for future research are also discussed.
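The abstract's two-stage procurement model is not reproduced here, but the idea behind a model "with recourse" can be sketched in a few lines: commit to a first-stage order quantity before demand is known, then pay second-stage (recourse) costs, such as emergency purchases or holding of surplus stock, once a demand scenario is realized. The scenarios, probabilities, and unit costs below are invented for illustration, and the enumeration stands in for the stochastic-programming solver a real formulation would use.

```python
# Hypothetical demand scenarios (units needed, probability) and invented
# unit costs: pre-positioned stock is cheap, emergency recourse purchases
# are expensive, and leftover stock incurs a holding cost.
scenarios = [(100, 0.3), (150, 0.5), (200, 0.2)]
c_first, c_recourse, c_hold = 2.0, 5.0, 0.5

def expected_cost(q):
    # First-stage cost is fixed once q is chosen; second-stage (recourse)
    # cost depends on which demand scenario is realized.
    cost = c_first * q
    for demand, prob in scenarios:
        shortfall = max(demand - q, 0)   # covered by emergency purchase
        surplus = max(q - demand, 0)     # left in the warehouse
        cost += prob * (c_recourse * shortfall + c_hold * surplus)
    return cost

# Enumerate candidate first-stage order quantities on a coarse grid.
best_q = min(range(0, 251, 10), key=expected_cost)
print(best_q, expected_cost(best_q))   # -> 150 357.5
```

With these invented numbers the model hedges by ordering at the middle scenario: ordering less exposes the organization to expensive emergency purchases, ordering more wastes budget on holding costs.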
- The relationship between geographic proximity and strategic posture: a longitudinal study of the U.S. fiberoptics industry. Lamb, William G. (Virginia Tech, 1997-11-17) The purpose of this study is to investigate the implications of geographic location for firm strategy and for the competitive climate in emerging higher-technology industries. Hypotheses are generated based on concepts from institutional theory, transaction cost economics, economic geography, and strategic management. Specifically, tests are conducted to determine whether there is an association between establishments' geographic locations and the incidence of two collective strategies: strategic isomorphism and strategic complementarity. These tests are performed with respect to the U.S. fiberoptics industry at three-year intervals during the period 1976-1994. Tests are also performed (using 1994 data) to assess the influence that research institutes and economically dominant firms have on collective strategy formation. The study's summary finding is that, to date, there is little, if any, empirical support for an association between geographic location and strategic posture in the fiberoptics industry. While it is possible that the proposed phenomena do not occur in this industry, for all of the hypotheses there are several alternative explanations for the results. First, several of the findings suggest that too little time has elapsed for the proposed phenomena to be fully manifested in the fiberoptics industry. Second, some of the phenomena might be observable by changing sampling or measurement procedures. Third, certain characteristics of emerging higher-technology industries might affect the strength of some hypothesized relationships. Based on the findings of this study, a number of suggestions are offered for further studies of the subject.
- The Role of Computer and Internet Access in Business Students' Acceptance of E-Learning Technology. Henderson, Ronda Baskerville (Virginia Tech, 2005-06-20) This study was based on previous research that investigated the disparity, or gap, between those who have access to computers and the Internet and those who do not (Hoffman and Novak, 1998; NTIA, 1999b; Carey, Chisholm and Irwin, 2002; Vail, 2003; Zeliff, 2004; Glenn, 2005). The Technology Acceptance Model developed by Davis, Bagozzi, and Warshaw (1989) was used to investigate whether computer and Internet access influenced the acceptance of e-learning technology tools such as Blackboard and the Internet. Of the studies conducted concerning adoption of these technologies, few have addressed the extent to which college students accept these tools, and most have failed to consider computer access as a factor in computer technology acceptance. The E-Learning Technology Acceptance (ETA) survey instrument was administered to business students at two universities in North Carolina. Hierarchical regression was performed to test whether computer and Internet access explained variance above and beyond race and socioeconomic status. Regression analysis revealed that computer and Internet access affected the degree to which students expect Blackboard and the Internet to be easy to use. As a result, creating a technology assessment to be used by e-learning educators and students to measure the level of computer and Internet access was recommended. The analyses also revealed that computer and Internet access significantly affected students' attitudes toward using Blackboard and the Internet. Improving the level of technology access should be addressed to promote positive attitudes regarding e-learning tools. Additional findings revealed that socioeconomic status and race did influence computer ownership.
A suggestion for educators is to explore initiatives that assist low-income and minority students with obtaining home computers. Finally, the findings suggested that closing the digital divide is not enough to ensure technology acceptance among students. The researcher proposed that digital inclusion should be the goal of our society. Recommendations for further research included investigating other variables that may influence technology acceptance and computer and Internet access.
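The hierarchical regression the study describes enters predictor blocks in sequence and asks how much the model's R² improves when a later block is added over and above the earlier one. A minimal pure-Python sketch of that logic, using ordinary least squares via the normal equations; the SES, race, and access values below are invented for illustration, not the study's data:

```python
def solve(A, b):
    # Gaussian elimination with partial pivoting for a small dense system.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def r_squared(X, y):
    # OLS via the normal equations (X'X)b = X'y, then R^2 of the fit.
    k = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    beta = solve(XtX, Xty)
    yhat = [sum(b * xi for b, xi in zip(beta, row)) for row in X]
    ybar = sum(y) / len(y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Invented data: socioeconomic status, a race dummy, a home-access dummy,
# and a perceived-ease-of-use score built from them plus small noise.
ses    = [1, 2, 3, 4, 5, 6, 7, 8]
race   = [0, 1, 0, 1, 0, 1, 0, 1]
access = [0, 0, 1, 0, 1, 1, 1, 1]
noise  = [0.2, -0.1, 0.1, -0.2, 0.0, 0.1, -0.1, 0.2]
ease   = [2 + 0.5 * s + 1.0 * r + 3.0 * a + e
          for s, r, a, e in zip(ses, race, access, noise)]

base = [[1.0, s, r] for s, r in zip(ses, race)]        # block 1: demographics
full = [row + [a] for row, a in zip(base, access)]     # block 2: adds access

r2_base, r2_full = r_squared(base, ease), r_squared(full, ease)
print(f"R2 base={r2_base:.3f}  full={r2_full:.3f}  delta={r2_full - r2_base:.3f}")
```

A nontrivial increase in R² when the access block is entered is what "explained variance above and beyond race and socioeconomic status" means in the abstract.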
- A Spatial Decision Support System for Planning Broadband, Fixed Wireless Telecommunication Networks. Scheibe, Kevin Paul (Virginia Tech, 2004-08-05) Over the last two decades, wireless technology has become ubiquitous in the United States and other developed countries. Consumer devices such as AM/FM radios, cordless and cellular telephones, pagers, satellite televisions, garage door openers, and television channel changers are just some of the applications of wireless technology. More recently, wireless computer networking has seen increasing employment. A few reasons for this move toward wireless networking are improved electronic transmitters and receivers, reduced costs, simplified installation, and enhanced network expandability. This study explores the use of a Spatial Decision Support System (SDSS) for broadband fixed wireless connectivity to solve the wireless network planning problem. The objective of the study is to generate understanding of the planning inherent in a broadband, fixed wireless telecommunication network and to implement that knowledge in an SDSS. Intermediate steps toward this goal include solutions to both fixed wireless point-to-multipoint (PMP) and fixed wireless mesh networks, which are developed and incorporated into the SDSS. The spatial component of the DSS is a Geographic Information System (GIS), which displays visibility for specific tower locations. The SDSS proposed here incorporates the cost, revenue, and performance capabilities of a wireless technology applied to a given area. It encompasses the cost and range capabilities of wireless equipment, the customers' propensity to pay, the market penetration of a given service offering, the topology of the area in which the wireless service is proffered, and signal obstructions due to local geography. This research is both quantitative and qualitative in nature. Quantitatively, the wireless network planning problem may be formulated as integer programming (IP) problems.
The line-of-sight restriction imposed by several extant wireless technologies necessitates the incorporation of a GIS and the development of an SDSS to facilitate the symbiosis of the mathematics and the geography. The qualitative aspect of this research involves the consideration of planning guidelines for the general wireless planning problem. Methodologically, this requires a synthesis of the literature and insights gathered from using the SDSS described above in a what-if mode.
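The abstract's IP formulation is not given, but one common integer-programming form of tower planning is a set-covering model: choose a minimum-cost set of tower sites whose GIS-derived visibility sets jointly cover every customer location. The instance below is invented for illustration and solved by brute-force enumeration; a real planner would pass the same model to an IP solver, and the costs and visibility sets would come from equipment data and the GIS.

```python
from itertools import combinations

# Hypothetical instance: tower site -> (build cost, customer sites with
# line-of-sight visibility from that tower).
towers = {
    "A": (10, {1, 2, 3}),
    "B": (8,  {3, 4}),
    "C": (6,  {4, 5, 6}),
    "D": (12, {1, 5, 6}),
}
customers = {1, 2, 3, 4, 5, 6}

# Enumerate every subset of towers; keep the cheapest one that covers
# all customers (the set-covering objective).
best_cost, best_pick = float("inf"), None
for r in range(1, len(towers) + 1):
    for pick in combinations(towers, r):
        covered = set().union(*(towers[t][1] for t in pick))
        cost = sum(towers[t][0] for t in pick)
        if covered >= customers and cost < best_cost:
            best_cost, best_pick = cost, pick

print(best_pick, best_cost)   # -> ('A', 'C') 16
```

Here towers A and C together see all six customers at the lowest combined cost; line-of-sight obstructions from terrain would simply shrink the visibility sets the GIS supplies to the model.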