Browsing by Author "Rakes, Terry R."
Now showing 1 - 20 of 36
- Analysis of timber harvest scheduling under alternative levels of land aggregation: an application to a hypothetical Mexican forest ownership. Hernandez-Vazquez, Edgardo (Virginia Polytechnic Institute and State University, 1989). The problem of optimal land organization was approached via a general methodology to aggregate finely distinguished planning unit areas of an even-aged ponderosa pine forest in northwestern Mexico. Factor analysis was applied to eighteen timber inventory variables to produce four independent and meaningful constructs that explained 87% of the total variable set's variation. Next, each planning unit area was characterized by its factor scores, and a Euclidean-metric-based cluster analysis was applied. The resultant dendrogram's structure helped to define four levels of land aggregation that were evaluated under the same forest management policy. This policy simulated current Mexican forestry guidelines, such as replacement stand regimes based on maximum mean annual increment and area-volume constraints for timber harvest scheduling. The present-value-maximizing timber harvest schedule for each level of land organization was then found using LP Model 1 formulations. Results showed that timber harvesting net benefits varied between 1.3% and 7.0% across levels of land aggregation, a consequence of the biophysical homogeneity of the forest and the Mexican assumptions of prices and flat costs for overhead and planning. Theoretical considerations indicated that if overhead and planning costs were properly considered for every level of land aggregation, the study's methodology could show a greater present-value difference between alternative levels of land organization.
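As a minimal illustration of the aggregation step described above, the sketch below groups planning units by their factor scores using Euclidean-distance hierarchical clustering and cuts the resulting tree at several candidate aggregation levels. The factor scores here are synthetic placeholders, not the study's data.

```python
# Minimal sketch of the land-aggregation step: planning units described by
# factor scores are grouped by Euclidean-distance hierarchical clustering,
# and the tree is cut at several sizes to yield candidate aggregation levels.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
scores = rng.normal(size=(30, 4))      # 30 planning units x 4 factor scores (synthetic)

tree = linkage(scores, method="average", metric="euclidean")
for n_groups in (4, 8, 12, 30):        # four hypothetical aggregation levels
    labels = fcluster(tree, t=n_groups, criterion="maxclust")
    print(n_groups, "requested ->", len(set(labels)), "clusters formed")
```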
- Application of expert systems in landscape architecture. Kulkarni, Nitin Y. (Virginia Tech, 1989-07-20). The application of artificial intelligence (AI) has been a topic of interest among researchers for the past decade or more. Years of research into commercial applications of AI, the availability of hardware support, and the affordability of software and hardware have generated a great deal of interest in this field and brought the technology within the reach of microcomputer-based users. The commercial impact of AI is largely due to expert systems (ESs). ES technology is a collection of methods and techniques for constructing human-machine systems with specialized problem-solving expertise. This project explores the application of ESs in landscape architecture by developing a prototype ES and testing the implications of its use with designers working on a hypothetical problem in a studio environment. The development process helps to identify the typical difficulties of such an application, to uncover technical problems, and to identify areas needing further research. The project aims at building an ES that provides limited preliminary data and design guidelines to initialize the design process and keeps track of the most fundamental planning issues, thus acting as an expert and an assistant simultaneously. The idea is to explore the possibility of applying ESs to facilitate the design process so that designers may concentrate on other important aspects of design, including intuitive judgement about qualitative aspects.
- Automated extraction of product feedback from online reviews: Improving efficiency, value, and total yield. Goldberg, David Michael (Virginia Tech, 2019-04-25). In recent years, the expansion of online media has presented firms with rich and voluminous new datasets with profound business applications. Among these, online reviews provide nuanced details on consumers' interactions with products. Analysis of these reviews has enormous potential, but the sheer volume of the data and the nature of unstructured text make mining these insights challenging and time-consuming. This dissertation presents three studies examining this problem and suggesting techniques for automated extraction of vital insights. The first study examines the problem of identifying mentions of safety hazards in online reviews. Discussions of hazards may have profound importance for firms and regulators as they seek to protect consumers. However, as most online reviews do not pertain to safety hazards, identifying this small portion of reviews is a challenging problem. Much of the literature in this domain focuses on selecting "smoke terms," specific words and phrases closely associated with mentions of safety hazards. We first examine and evaluate prior techniques for identifying these reviews, which incorporate substantial human opinion in curating smoke terms and thus vary in their effectiveness. We propose a new automated method that uses a heuristic to curate smoke terms, and we find that this method is far more efficient than the human-driven techniques. Finally, we incorporate consumers' star ratings in our analysis, further improving prediction of safety hazard-related discussions. The second study examines the identification of consumer-sourced innovation ideas and opportunities from online reviews. We build upon a widely accepted attribute mapping framework from the entrepreneurship literature for evaluating and comparing product attributes. We first adapt this framework for use in the analysis of online reviews. Then, we develop analytical techniques based on smoke terms for automated identification of innovation opportunities mentioned in online reviews. These techniques can be used to profile products by the attributes that affect, or have the potential to affect, their competitive standing. In collaboration with a large countertop-appliances manufacturer, we assess and validate the usefulness of these suggestions, tying together the theoretical value of the attribute mapping framework and the practical value of identifying innovation-related discussions in online reviews. The third study addresses safety hazard monitoring for use cases in which a higher yield of detected safety hazards is desirable. We note a trade-off between the efficiency of the hazard-detection techniques described in the first study and their depth: a high proportion of identified records refer to true hazards, but several important hazards may go undetected. We suggest several techniques for handling this trade-off, including alternate objective functions for heuristics and fuzzy term matching, which improve the total yield. We examine the efficacy of each of these techniques and contrast their merits with past techniques. Finally, we test the capability of these methods to generalize to online reviews across different product categories.
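The abstract does not publish the dissertation's term lists, so the following is only a hypothetical sketch of the smoke-term idea: each review is scored by the hazard-associated terms it contains, and low star ratings raise the score, mirroring the incorporation of ratings described above.

```python
# Illustrative smoke-term scoring sketch; the term set and weighting rule
# are invented placeholders, not the dissertation's curated heuristic.
SMOKE_TERMS = {"fire", "burn", "shock", "overheat", "spark"}  # hypothetical

def hazard_score(review_text: str, star_rating: int) -> float:
    words = set(review_text.lower().split())
    term_hits = len(words & SMOKE_TERMS)         # count of smoke-term matches
    rating_weight = (6 - star_rating) / 5        # 1 star -> 1.0, 5 stars -> 0.2
    return term_hits * rating_weight

reviews = [("The cord began to spark and overheat", 1),
           ("Great toaster, works fine", 5)]
for text, stars in reviews:
    print(round(hazard_score(text, stars), 2), text)
```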
- Building a knowledge-based simulation optimization system with discovery learning. Siochi, Fernando C. (Virginia Tech, 1995). Simulation optimization is a developing research area whereby a set of input conditions is sought that produces a desirable output (or outputs) from a simulation model. Although many approaches to simulation optimization have been developed, the research area is by no means mature. This research makes three contributions to the area of simulation optimization. The first is fundamental: it examines simulation outputs, called "response surfaces," and notes their behavior. In particular, both point and region estimates are studied for different response surfaces, and conclusions are developed that indicate when and where simulation-optimization techniques such as response surface methodology should be applied. The second contribution provides assistance in selecting a region in which to begin a simulation-optimization search. The new method is based upon the artificial-intelligence approach of best-first search. Two examples of the method are given. The final contribution expands upon the ideas of Crouch for building a "Learner" to improve simulation heuristics over time. The particular case of parameter-modification learning is developed and illustrated by example. The dissertation concludes with limitations and suggestions for future work.
- Computational Studies in Multi-Criteria Scheduling and Optimization. Martin, Megan Wydick (Virginia Tech, 2017-08-11). Multi-criteria scheduling provides the opportunity to create mathematical optimization models that are applicable to a diverse set of problem domains in the business world. This research addresses two different employee scheduling applications using multi-criteria objectives that present decision makers with trade-offs between global optimality and the level of disruption to current operating resources. Additionally, it investigates a scheduling problem from the product-testing domain and proposes a heuristic solution technique that is shown to produce very high-quality solutions in short amounts of time. Chapter 2 addresses a grant administration workload-to-staff assignment problem that occurs in the Office of Research and Sponsored Programs at land-grant universities. We identify the optimal workload assignment plan, which differs considerably from the current state and would require multiple reassignments. To move toward the optimal plan, we demonstrate a technique to identify the n best reassignments from the current state that provide the greatest progress toward the utopian solution. Solving this problem over several values of n and plotting the results allows the decision maker to visualize the reassignments and the progress achieved toward the utopian balanced-workload solution. Chapter 3 identifies a weekly schedule that seeks the most cost-effective set of coach-to-program assignments in a gymnastics facility. We identify the optimal assignment plan using an integer linear programming model. The optimal assignment plan differs greatly from the status quo; therefore, we apply the approach of Chapter 2 and use a multiple-objective optimization technique to identify the n best staff reassignments. Again, the decision maker can visualize the trade-off between the number of reassignments and the resulting progress toward the utopian staffing-cost solution and make an informed decision about the best number of reassignments. Chapter 4 focuses on product test scheduling in the presence of in-process and at-completion inspection constraints. Such testing arises in the manufacture of products that must perform reliably in extreme environmental conditions. Each product receives a certification at the successful completion of a predetermined series of tests. Operational efficiency is enhanced by determining the optimal order and start times of tests so as to minimize the makespan while ensuring that technicians are available when needed to complete in-process and at-completion inspections. We first formulate a mixed-integer linear programming (MILP) model to identify the optimal solution to this problem using IBM ILOG CPLEX Interactive Optimizer 12.7. We also present a genetic algorithm (GA) solution that is implemented and solved in Microsoft Excel. Computational results are presented demonstrating the relative merits of the MILP and GA solution approaches across a number of scenarios.
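A toy version of the Chapter 4 genetic algorithm idea appears below; it is not the dissertation's Excel implementation. A chromosome is an ordering of tests, tests are dispatched in that order onto two parallel test stations, and fitness is the resulting makespan. The durations and all GA parameters are invented.

```python
# Toy permutation GA for test ordering: fitness is the makespan obtained by
# dispatching tests in chromosome order onto two parallel stations.
import random
random.seed(1)

DURATIONS = [7, 3, 9, 4, 6, 2, 8, 5]          # hypothetical test durations

def makespan(order):
    stations = [0.0, 0.0]                     # next-free time of each station
    for t in order:
        s = min(range(2), key=lambda i: stations[i])
        stations[s] += DURATIONS[t]
    return max(stations)

def crossover(a, b):                          # order crossover (OX)
    i, j = sorted(random.sample(range(len(a)), 2))
    middle = a[i:j]
    rest = [g for g in b if g not in middle]
    return rest[:i] + middle + rest[i:]

pop = [random.sample(range(8), 8) for _ in range(30)]
for _ in range(100):
    pop.sort(key=makespan)                    # elitist selection of the best 10
    parents = pop[:10]
    pop = parents + [crossover(*random.sample(parents, 2)) for _ in range(20)]
print("best makespan:", makespan(min(pop, key=makespan)))
```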
- Computer Network Routing with a Fuzzy Neural Network. Brande, Julia K. Jr. (Virginia Tech, 1997-11-07). The growing usage of computer networks is requiring improvements in network technologies and management techniques so users will receive high-quality service. As more individuals transmit data through a computer network, the quality of service received by the users begins to degrade. A major aspect of computer networks that is vital to quality of service is data routing. A more effective method for routing data through a computer network can assist with the new problems being encountered in today's growing networks. Effective routing algorithms use various techniques to determine the most appropriate route for transmitting data. Determining the best route through a wide area network (WAN) requires the routing algorithm to obtain information concerning all of the nodes, links, and devices present on the network. The most relevant routing information involves various measures that are often obtained in an imprecise or inaccurate manner, suggesting that fuzzy reasoning is a natural method to employ in an improved routing scheme. The neural network is a suitable accompaniment because it maintains the ability to learn in dynamic situations. Once the neural network is initially designed, any alterations in the routing environment can easily be learned by this adaptive artificial intelligence method. The capability to learn and adapt is essential in today's rapidly growing and changing computer networks. These two techniques, fuzzy reasoning and neural networks, when combined provide a very effective routing algorithm for computer networks. Computer simulation is employed to show that the new fuzzy routing algorithm outperforms the Shortest Path First (SPF) algorithm in most computer network situations. The benefits increase as the computer network migrates from a stable network to a more variable one. The advantages of applying this fuzzy routing algorithm are apparent when considering the dynamic nature of modern computer networks.
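To make the combination of fuzzy reasoning and routing concrete, here is a small sketch with invented membership functions and an invented four-node network: each link's delay and utilization are mapped through triangular memberships to a single crisp cost, and an ordinary shortest-path search runs over those costs. The dissertation's actual algorithm also involves a neural network, which is omitted here.

```python
# Sketch only: fuzzy "badness" memberships turn imprecise link measures into
# a crisp routing cost, then Dijkstra finds the cheapest path as usual.
import heapq

def tri(x, a, b, c):                      # triangular membership function
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def link_cost(delay_ms, utilization):
    bad = max(tri(delay_ms, 20, 60, 100), tri(utilization, 0.5, 0.9, 1.3))
    return 1.0 + 9.0 * bad                # crisp cost in [1, 10]

# adjacency: node -> list of (neighbor, delay_ms, utilization); invented data
NET = {"A": [("B", 30, 0.2), ("C", 10, 0.8)],
       "B": [("D", 15, 0.3)],
       "C": [("D", 40, 0.9)],
       "D": []}

def route(src, dst):
    dist, heap = {src: 0.0}, [(0.0, src, [src])]
    while heap:
        d, u, path = heapq.heappop(heap)
        if u == dst:
            return d, path
        for v, delay, util in NET[u]:
            nd = d + link_cost(delay, util)
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v, path + [v]))

print(route("A", "D"))                    # -> (4.25, ['A', 'B', 'D'])
```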
- A computer-based DSS for funds management in a large state university environment. Tyagi, Rajesh (Virginia Polytechnic Institute and State University, 1986). The comprehensive computerized decision support system developed in this research employs two techniques, computer modeling and goal programming, to assist top university financial officers in assessing the current status of funds sources and uses. The purpose of the DSS is to aid in reaching decisions concerning proposed projects and to allocate funds from sources to uses on an aggregate basis according to a rational set of prescribed procedures. The computer model provides fast and easy access to the database and permits the administrator to update the database as new information is received. Goal programming is used for modeling the allocation process since it provides a framework for the inclusion of multiple goals that may be conflicting and incommensurable. The goal programming model allocates funds from sources to uses based on a priority structure associated with the goals. The DSS, which runs interactively, performs a number of tasks that include selecting model parameters, formulating goals and the priority structure, and solving the GP model. It also provides on-line access to the database so that it may be updated as necessary. In addition, the DSS generates reports regarding funds allocation and goal achievements to allow analysis of the model results. The decision support system also provides a framework for experimentation with various goal and priority structures, thus facilitating what-if analyses. The user can also perform a sensitivity analysis by observing the effect of assigning different relative importance to a goal or set of goals.
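Below is a weighted goal-programming toy model in the spirit of the allocation process described above, with invented targets, weights, and budget: underachievement deviation variables are penalized according to goal priority, a simplification of the priority structure the DSS supports.

```python
# Weighted goal-programming sketch: two funding goals compete for a $100
# budget; underachievement deviations are penalized by priority weight.
from scipy.optimize import linprog

# variables: x1, x2, d1_minus, d1_plus, d2_minus, d2_plus (all >= 0)
c = [0, 0, 5, 0, 1, 0]                 # goal 1 underachievement weighted 5x
A_eq = [[1, 0, 1, -1, 0, 0],           # x1 + d1- - d1+ = 60  (goal 1 target)
        [0, 1, 0, 0, 1, -1]]           # x2 + d2- - d2+ = 50  (goal 2 target)
b_eq = [60, 50]
A_ub = [[1, 1, 0, 0, 0, 0]]            # x1 + x2 <= 100       (budget)
b_ub = [100]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
x1, x2 = res.x[:2]
print(f"allocate {x1:.0f} and {x2:.0f}; goal-2 shortfall {res.x[4]:.0f}")
```

With these numbers the higher-priority goal is met exactly (x1 = 60) and the remaining $40 goes to goal 2, leaving a shortfall of 10, exactly the trade-off a priority structure is meant to expose.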
- Cost-based shop control using artificial neural networks. Wiegmann, Lars (Virginia Tech, 1992). The production control system of a shop consists of three stages: due-date prediction, order release, and job dispatching. The literature has dealt thoroughly with the third stage, but there is a paucity of study on either of the first two stages or on the interaction between stages. This dissertation focuses on the first stage of production control, due-date prediction, by examining methodologies for improved prediction that go beyond either practitioner or published approaches. In particular, artificial neural networks and regression nonlinear in its variables are considered. In addition, interactive effects with the third stage, shop-floor dispatching, are taken into consideration. The dissertation conducts three basic studies. The first examines neural networks and regression nonlinear in its variables as alternatives to conventional due-date prediction. The second proposes a new cost-based criterion and prediction methodology that explicitly includes costs of earliness and tardiness directly in the forecast; these costs may differ in form and/or degree from each other. The third explores the benefit of tying together the first and third stages of production control. The studies are conducted by statistically analyzing data generated from simulated shops. Results of the first study conclude that both neural networks and regression nonlinear in its variables are significantly preferred to approaches advanced to date in the literature and in practice. Moreover, in the second study, it is found that the consequences of not using the cost-based criterion can be profound, particularly if a firm's cost function is asymmetric about the due date. Finally, it is discovered that the integrative, interactive methodology developed in the third study is significantly superior to the current non-integrative and non-interactive approaches. In particular, interactive neural network prediction is found to excel in the presence of asymmetric cost functions, whereas regression nonlinear in its variables is preferable under symmetric costs.
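One way to see why an asymmetric cost criterion changes the forecast is the newsvendor-style sketch below, with invented costs and a simulated flowtime distribution: when tardiness costs more per day than earliness, the expected-cost-minimizing due date is a quantile of the flowtime distribution rather than its mean. This illustrates the criterion's motivation only, not the dissertation's neural-network methodology.

```python
# Sketch of a cost-based due date under asymmetric linear costs: the
# expected-cost-minimizing quote is the ct/(ce+ct) quantile of flowtime.
import numpy as np

rng = np.random.default_rng(42)
flowtimes = rng.lognormal(mean=3.0, sigma=0.4, size=5000)  # simulated shop

ce, ct = 1.0, 4.0                     # tardiness four times as costly as earliness
due = np.quantile(flowtimes, ct / (ce + ct))

cost = np.mean(ce * np.maximum(due - flowtimes, 0)
               + ct * np.maximum(flowtimes - due, 0))
naive = np.mean(ce * np.maximum(flowtimes.mean() - flowtimes, 0)
                + ct * np.maximum(flowtimes - flowtimes.mean(), 0))
print(f"quoted due date {due:.1f}: cost {cost:.2f} vs mean-based {naive:.2f}")
```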
- Decision support for long-range, community-based planning to mitigate against and recover from potential multiple disasters. Chacko, Josey; Rees, Loren P.; Zobel, Christopher W.; Rakes, Terry R.; Russell, Roberta S.; Ragsdale, Cliff T. (Elsevier, 2016-07-01). This paper discusses a new mathematical model for community-driven disaster planning that is intended to help decision makers exploit the synergies resulting from simultaneously considering actions focusing on mitigation and efforts geared toward long-term recovery. The model is keyed on enabling long-term community resilience in the face of potential disasters of varying types, frequencies, and severities, and the approach's highly iterative nature is facilitated by the model's implementation in the context of a Decision Support System. Three examples from Mombasa, Kenya, East Africa, are discussed and compared in order to demonstrate the advantages of the new mathematical model over the current ad hoc mitigation and long-term recovery planning approaches that are typically used.
- Decision support systems design: a nursing scheduling application. Ceccucci, Wendy A. (Virginia Tech, 1994-01-28). The systems development life cycle (SDLC) has been the traditional method of decision support systems design. However, in the last decade several methodologies have been introduced to address the limitations arising in the use of the traditional method. These approaches include Courbon's iterative design, Keen's adaptive design, prototyping, and a number of mixed methodologies incorporating prototyping into the SDLC. Each of the previously established design methodologies has a number of differing characteristics that make it a more suitable strategy for certain environments. However, in some environments the current methodologies present certain limitations or unnecessary expenditures. These limitations suggest the need for an alternative methodology. This dissertation develops a new methodology, priority design, to meet this need. To determine which methodology would be most effective in a given situation, an analysis of the operating environment must be performed, addressing such issues as project complexity, project uncertainty, and limited user involvement. This dissertation develops a set of guidelines to assist in this analysis. For clarity, the guidelines are applied to three well-documented case studies. As an application of the priority design methodology, a decision support system for nurse scheduling is developed. The development of a useful DSS for nurse scheduling requires that projected staff requirements and issues of both coverage and differential assignment of personnel be addressed.
- The effect of outsourcing and situational characteristics on physical distribution transportation efficiency. Bienstock, Carol C. (Virginia Tech, 1994-10-06). This research examined the outsourcing decision for the logistics function of motor carrier transportation. A full factorial design was executed on a simulated transportation network to investigate how the efficiency of motor carrier transportation was affected by how it was structured (private/leased fleet versus contract carrier transportation) and the characteristics of the transportation activities. Transaction Cost Analysis (TCA) offered a useful theoretical framework for consideration of this make or buy decision by suggesting the independent variables of asset specificity, uncertainty, and frequency/volume. Seven two-part research hypotheses examined the relationships among the independent variables to gain a greater understanding of the factors which drive the make versus buy decision for motor carrier transportation. The major conclusions of this research are: 1) For the system modelled here, structure (private/leased versus contract carriers) and volume had the largest effects on transportation efficiency (mean shipment cost). 2) The results of this study indicated that there may be important factors within the nature of the "supplying" industry that impact the make or buy decision. This research provided strong support for TCA predictions and clearly demonstrated that TCA is a useful framework for understanding firms' make or buy decisions. Because of the nature of the transportation industry (the high level of competition and the lack of a small numbers bargaining situation), the hypotheses in this research clearly indicated that a "buy" rather than a "make" decision was the most efficient alternative; this result is exactly consistent with TCA predictions. 3) For the system modelled here, higher fixed and per-mile equipment leasing expenses (incurred in the operation of refrigerated trailers) caused refrigerated shipments to be more expensive than standard dry trailer shipments. That is, asset specificity (in this case, requirements for refrigerated trailer equipment) had a significant effect on shipment efficiency.
- Examining Electronic Markets in Which Intelligent Agents Are Used for Comparison Shopping and Dynamic Pricing. Hertweck, Bryan M. (Virginia Tech, 2005-09-08). Electronic commerce markets are becoming increasingly popular forums for commerce. As those markets mature, buyers and sellers will both vigorously seek techniques to improve their performance. The Internet lends itself to the use of agents to work on behalf of buyers and sellers. Through simulation, this research examines different implementations of buyers' agents (shopbots) and sellers' agents (pricebots) so that buyers, sellers, and agent builders can capitalize on the evolution of e-commerce technologies. Internet markets bring price visibility to a level beyond what is observed in traditional brick-and-mortar markets. Additionally, an online seller is able to update prices quickly and cheaply. Due to these facts, there are many pricing strategies that sellers can implement via pricebot to react to their environments. The best strategy for a particular seller is dependent on characteristics of its marketplace. This research shows that the extent to which buyers are using shopbots is a critical driver of the success of pricing strategies. When measuring profitability, the interaction between shopbot usage and seller strategy is very strong - what works well at low shopbot usage levels may perform poorly at high levels. If a seller is evaluating strategies based on sales volume, the choice may change. Additionally, as markets evolve and competitors change strategies, the choice of most effective counterstrategies may evolve as well. Sellers need to clearly define their goals and thoroughly understand their marketplace before choosing a pricing strategy. Just as sellers have choices to make in implementing pricebots, buyers have decisions to make with shopbots. In addition to the factors described above, the types of shopbots in use can actually affect the relative performance of pricing strategies. This research also shows that varying shopbot implementations (specifically involving the use of a price memory component) can affect the prices that buyers ultimately pay - an especially important consideration for high-volume buyers. Modern technology permits software agents to employ artificial intelligence. This work demonstrates the potential of neural networks as a tool for pricebots. As discussed above, a seller's best strategy option can change as the behavior of the competition changes. Simulation can be used to evaluate a multitude of scenarios and determine what strategies work best under what conditions. This research shows that a neural network can be effectively implemented to classify the behavior of competitors and point to the best counterstrategy.
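A toy market simulation in the spirit of this study, with all prices, strategies, and parameters invented: two pricebots serve a mix of shopbot and non-shopbot buyers, showing how the shopbot usage level drives the relative profitability of an undercutting strategy.

```python
# Toy pricebot market: shopbot buyers always find the lowest price, so the
# undercutting seller's advantage grows with the shopbot usage share.
import random
random.seed(5)

COST, N_BUYERS = 5.0, 1000

def simulate(shopbot_share):
    p_fixed = 10.0                               # rival posts a static price
    profit = {"undercut": 0.0, "fixed": 0.0}
    for _ in range(N_BUYERS):
        p_undercut = max(COST, p_fixed - 0.5)    # undercutting pricebot rule
        if random.random() < shopbot_share:      # shopbot buyer: lowest price wins
            seller, price = "undercut", p_undercut
        else:                                    # non-shopbot buyer: picks at random
            seller, price = random.choice(
                [("undercut", p_undercut), ("fixed", p_fixed)])
        profit[seller] += price - COST
    return {k: round(v) for k, v in profit.items()}

for share in (0.1, 0.5, 0.9):
    print(share, simulate(share))
```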
- Expert system applications in architecture. Karandikar, Swanandesh S. (Virginia Tech, 1989-07-24). This study proposes an Architectural Expert System (AES) to act as a design partner for architectural designers. Architectural designers face the very complex task of searching a solution space that is a labyrinth of several domains, ranging from the social to the cultural and from the aesthetic to the scientific. With the number of domains comes a corresponding number of domain experts. After progressing through tedious analytical procedures involving the physical principles in architecture, and applying the knowledge of experience, the experts are able to convert raw data into useful design guidelines. Research in the field of artificial intelligence has developed techniques which can capture such expertise in a computer program, which then emulates the expert. This technology is known as the expert system (ES). This study uses this technology to develop a system to aid architectural design. An AES model is derived from a literature review. As the nature of a system based on this model is complex and would require custom-built software, an alternative is developed based on the derived model. Based on this alternative, a prototype is developed for energy audit and energy conservation by capturing the expertise of an energy-conscious design expert. This prototype module is one component of the AES sub-system and provides an example for further modules. Various areas, including design, architecture, artificial intelligence and expert systems technology, and energy-conscious design and energy conservation, converge and become parts of this study.
- Expert systems for financial analysis of university auxiliary enterprises. McCart, Christina D. (Virginia Tech, 1991). An essential task of university administration is to monitor the financial position of its auxiliary enterprises. This is an ill-defined and complex task which often requires more administrative time and information than is available. Performing this task adequately requires a large amount of expertise to: (1) determine what constitutes reasonable performance, (2) define unacceptable levels of performance, and (3) suggest courses of action which will alleviate an unacceptable situation. Thorough analysis requires a substantial amount of an expert's time. The purpose of this research is to explore opportunities for enhancing the financial analysis of auxiliary enterprises through the use of expert systems. The research has included: (1) a comprehensive review of analytical techniques that can be used in financial position analysis, (2) a determination of the applicability of such techniques to auxiliary enterprises, and (3) an assessment of their amenability to expert system development. As part of this research, an expert system prototype was developed which addresses several of the above issues for one auxiliary enterprise at Virginia Polytechnic Institute and State University. It integrates the knowledge of an expert with both accounting data from the VPI & SU accounting system and other types of data from the auxiliary enterprise operation. The system provides a comprehensive, systematic analysis of the financial position of the Tailor Shop at VPI & SU, performed in much less time than would be required by an expert. As a result of the research conducted, it has been concluded that building such a system is possible and that it can provide significant benefits to a user. However, financial position analysis requires a substantial amount of data and numerical calculation, both of which demand large amounts of computer memory and computation. Therefore, designing an expert system to perform this task efficiently requires a package or language that makes efficient use of computer memory and the CPU.
- An exploration of the robustness of traditional regression analysis versus analysis using backpropagation networks. Markham, Ina Samanta (Virginia Tech, 1992). Research linking neural networks and statistics has been at two ends of a spectrum: either highly theoretical or application-specific. This research attempts to bridge the gap by exploring the robustness of regression analysis and backpropagation networks in conducting data analysis. Robustness is viewed as the degree to which a technique is insensitive to abnormalities in data sets, such as violations of assumptions. The central focus of regression analysis is the establishment of an equation that describes the relationship between the variables in a data set. This relationship is used primarily for the prediction of one variable based on the known values of the other variables. Certain assumptions have to be made regarding the data in order to obtain a tractable solution, and the failure of one or more of these assumptions results in poor prediction. The data sets in this research are characterized by the assumptions underlying linear regression: (a) sample size and error variance, (b) outliers, skewness, and kurtosis, (c) multicollinearity, and (d) nonlinearity and underspecification. With this characterization, the robustness of each technique is studied under what is, in effect, the relaxation of assumptions one at a time. The comparison between regression and backpropagation is made using the root mean square difference between the predicted output from each technique and the actual output.
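A small synthetic illustration of the comparison (not the dissertation's experimental designs): ordinary least squares and a modest backpropagation network are fit to a nonlinear relationship, a violation of the linearity assumption, and compared by root-mean-square error.

```python
# Sketch: compare OLS and a small backpropagation network on data that
# violate linearity, scoring each by root-mean-square error on the sample.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)
X = rng.uniform(-2, 2, size=(400, 1))
y = np.sin(2 * X[:, 0]) + rng.normal(scale=0.1, size=400)  # nonlinear truth

ols = LinearRegression().fit(X, y)
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                   random_state=0).fit(X, y)

for name, model in (("OLS", ols), ("backprop net", net)):
    rmse = np.sqrt(np.mean((model.predict(X) - y) ** 2))
    print(f"{name}: RMSE {rmse:.3f}")
```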
- Geometric-based reasoning system for project planning utilizing AI and CAD technologies. Morad, Ayman Ahmed (Virginia Tech, 1990). Traditional planning and scheduling techniques have played an important role in system analysis over the last three decades. They provide construction planners with mathematical models to simulate the construction process as an aid in planning and control of complex projects. Although these techniques have been widely used by the construction industry, they possess many limitations. Researchers and practitioners in the construction industry have followed two directions to overcome most of the limitations of current planning techniques. The first direction has concentrated on the utilization of state-of-the-art Computer Aided Design (CAD) and 3D computer modeling technology, with the objective of interactively generating and visually simulating the construction process on a graphics display. The second direction has been influenced by the potential of Artificial Intelligence (AI) technology to accomplish "automated planning"; this group has utilized knowledge-based and expert systems to automatically generate construction plans. The research proposed here presents a geometric-based reasoning system called KNOW-PLAN. The system integrates CAD and 3D computer modeling technology with AI technology to automatically generate and simulate construction plans, and it can therefore be classified as a third alternative in approaching the planning problem. The research seeks to utilize geometric data to provide dynamic sequencing for project planning, using object location and object interaction with other objects as the primary source of reasoning for the project plan. The interaction of objects is based on a classification of objects with relation to the connection types among them, the zones in which the objects are located, and the relationships between the classes with which the objects are associated. To accomplish the objectives of the research, an overall model called the KNOW-PLAN model has been formulated to demonstrate theoretically the feasibility of implementing such a model in real life. The implementation effort has concentrated on the development of the crucial components of the KNOW-PLAN model using advanced computer applications; the implementation at this level is referred to as the KNOW-PLAN prototype system.
- Improving Post-Disaster Recovery: Decision Support for Debris Disposal Operations. Fetter, Gary (Virginia Tech, 2010-03-31). Disaster debris cleanup operations are commonly organized into two phases. During the first phase, the objective is to clear debris from evacuation and other important pathways to ensure access to the disaster-affected area. Practically, Phase 1 activities largely consist of pushing fallen trees, vehicles, and other debris blocking streets and highways to the curb. These activities begin immediately once the disaster has passed, with the goal of completion usually within 24 to 72 hours. In Phase 2 of debris removal, which is the focus of this study, completion can take months or years. Activities in this phase include organizing and managing curbside debris collection, reduction, recycling, and disposal operations (FEMA 2007). This dissertation research investigates methods for improving post-disaster debris cleanup operations, one of the most important and costly aspects of the least researched area of disaster operations management (Altay and Green 2006). The first objective is to identify the unique nature of the disaster debris cleanup problem and the important decisions faced by disaster debris coordinators. The second goal is to present three research projects that develop methods for assisting disaster management coordinators with debris cleanup operations. In the first project, which is the topic of Chapter 3, a facility location model is developed for addressing the problem of opening temporary disposal and storage reduction facilities, which are needed to ensure efficient and effective cleanup operations. In the second project, which is the topic of Chapter 4, a multiple objective mixed-integer linear programming model is developed to address the problem of assigning debris cleanup resources across the disaster-affected area at the onset of debris cleanup operations. The third project and the focus of Chapter 5 addresses the problem of equitably controlling ongoing cleanup operations in real-time. A self-balancing CUSUM statistical process control chart is developed to assist disaster management coordinators with equitably allocating cleanup resources as information becomes available in real-time. All of the models in this dissertation are evaluated using data from debris cleanup operations in Chesapeake, Virginia, completed after Hurricane Isabel in 2003.
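A toy version of the Chapter 5 idea, with invented data and chart parameters: a tabular CUSUM tracks one zone's daily share of debris removed against its equitable share, and a signal on the lower side suggests shifting cleanup resources toward that zone.

```python
# Tabular CUSUM sketch for equitable resource control: the monitored
# statistic is a zone's daily share of debris removed versus its target.
import numpy as np

rng = np.random.default_rng(3)
target, k, h = 0.25, 0.02, 0.10        # equitable share, slack, decision limit
shares = np.clip(rng.normal(0.21, 0.03, size=20), 0, 1)  # zone runs underserved

c_plus = c_minus = 0.0
for day, s in enumerate(shares, 1):
    c_plus = max(0.0, c_plus + (s - target) - k)
    c_minus = max(0.0, c_minus + (target - s) - k)
    if c_minus > h:
        print(f"day {day}: zone under-resourced (C-={c_minus:.2f}); rebalance")
        c_plus = c_minus = 0.0          # reset after reallocating resources
```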
- A knowledge-based simulation optimization system with machine learning. Crouch, Ingrid W. M. (Virginia Tech, 1992-05-06). A knowledge-based system is formulated to guide the search strategy selection process in simulation optimization. This system includes a framework for machine learning which enhances the knowledge base and thereby improves the ability of the system to guide optimizations. Response surfaces (i.e., the response of a simulation model to all possible input combinations) are first classified based on estimates of various surface characteristics. Then heuristics are applied to choose the most appropriate search strategy. As the search is carried out and more information about the surface becomes available, the knowledge-based system reclassifies the response surface and, if appropriate, selects a different search strategy. Periodically the system's Learner is invoked to upgrade the knowledge base. Specifically, judgments are made to improve the heuristic knowledge (rules) in the knowledge base (i.e., rules are added, modified, or combined). The Learner makes these judgments using information from two sources. The first source is past experience -- all the information generated during previous simulation optimizations. The second source is results of experiments that the Learner performs to test hypotheses regarding rules in the knowledge base. The great benefits of simulation optimization (coupled with the high cost) have highlighted the need for efficient algorithms to guide the selection of search strategies. Earlier work in simulation optimization has led to the development of different search strategies for finding optimal-response-producing input levels. These strategies include response surface methodology, simulated annealing, random search, genetic algorithms, and single-factor search. Depending on the characteristics of the response surface (e.g., presence or absence of local optima, number of inputs, variance), some strategies can be more efficient and effective than others at finding an optimal solution. If the response surface were perfectly characterized, the most appropriate search strategy could, ideally, be immediately selected. However, characterization of the surface itself requires simulation runs. The knowledge-based system formulated here provides an effective approach to guiding search strategy selection in simulation optimization.
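The knowledge base itself is not reproduced in the abstract, so the rules below are invented placeholders; the sketch only shows the shape of the idea: estimated response-surface characteristics are matched against heuristic rules to select a search strategy.

```python
# Rule-based strategy selection sketch; these rules are hypothetical stand-ins
# for the system's knowledge base, shown only to illustrate the mechanism.
def choose_strategy(surface):
    if surface["n_inputs"] > 10:
        return "genetic algorithm"
    if surface["local_optima"] and surface["variance"] == "high":
        return "simulated annealing"
    if not surface["local_optima"]:
        return "response surface methodology"
    return "random search"

estimate = {"n_inputs": 3, "local_optima": True, "variance": "high"}
print(choose_strategy(estimate))       # -> simulated annealing
```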
- A Methodology for Characterization and Performance Analysis of Connection-Based Network Access Technologies. Novak, David C. (Virginia Tech, 2001-04-18). Network administration has become more difficult as the number of Internet users has grown and customer usage patterns have changed over time. Rapidly increasing subscriber bases, data-intensive applications (such as streaming audio and video), heavy Web browsing, and large file downloads require significant resources and may tax existing network bandwidth. Reliability and quality of service are becoming serious issues for service providers across the country. Due to the dynamic nature of the information technology (IT) sector in general, it is difficult to predict future network usage patterns, what types of applications may be available, and how these applications may be used over time. This research presents a methodology to facilitate capacity planning and to improve the evaluation of network performance for connection-based networks, using the Virginia Tech modem pool as a test bed. The research question is whether innovative business strategies can be employed, in lieu of or in addition to traditional management practices such as adding capacity, to improve the performance of a dialup network. Examples of such strategies, or business rules, include limiting the duration of an online session or limiting the number of times a given customer can dial into the pool in a specified time period. A complete network traffic characterization is conducted based on service-time and interarrival-time variables. A longitudinal analysis is performed to examine how traffic patterns have changed over time. Finally, a simulation model is utilized to examine how imposing different business rules during peak periods of operation can reduce the blocking probability and improve the overall level of service. The potential contribution of this research appears to be significant given the lack of existing literature.
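For the capacity side of this question, a standard Erlang-B calculation (with hypothetical traffic figures, not the modem pool's measured data) shows how blocking probability falls as modems are added, the baseline against which business rules such as session limits can be compared.

```python
# Erlang-B blocking probability for a pool of N modems offered 'a' erlangs
# of dial-up traffic, computed with the numerically stable recursion.
def erlang_b(servers: int, offered_erlangs: float) -> float:
    b = 1.0                            # B(0, a) = 1
    for n in range(1, servers + 1):
        b = offered_erlangs * b / (n + offered_erlangs * b)
    return b

offered = 180 * (22 / 60)              # 180 calls/hour, 22-minute mean session
for modems in (60, 70, 80):
    print(modems, "modems -> blocking", round(erlang_b(modems, offered), 3))
```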
- A Methodology for the Development of a Production Experience Database for Earthmoving Operations Using Automated Data Collection. Kannan, Govindan (Virginia Tech, 1999-05-26). Automated data acquisition has revolutionized the reliability of product design in recent years; a noteworthy example is the improvement in the design of aircraft through field data. This research proposes a similar improvement in the reliability of process design for earthmoving operations through automated field data acquisition. The segment of earthmoving operations addressed in this research is the truck-loader operation. The applicability of this research therefore extends to other industries involving truck operations, such as mining, agriculture, and forest logging, and is closely related to wheel-based earthmoving operations such as scrapers. The context of this research is the data collection needed to increase the validity of results obtained from analysis tools such as simulation, performance measures, and graphical representations of the variance in an activity's performance, together with the relation between operating conditions and that variance. Automated cycle-time data collection is facilitated by instrumented trucks, and the collection of information on operating conditions is facilitated by an image database and paper forms. The cycle-time data and the information on operating conditions are linked together to form the experience database. This research developed methods to extract, quantify, and understand the variation in each component of the earthmoving cycle, namely the load, haul and return, and dump activities. For the load activity, the simultaneous variation in payload and load time is illustrated through the development of a PLT (PayLoad Time) Map. Among the operating conditions, material type, load-area floor, space constraints, and shift are investigated. A dynamic normalization process, determining the ratio of actual travel time to expected travel time, is developed for the haul and return activities. The length of the haul road, the sequence of gear downshifts, and shift are investigated for their effect on travel time. The discussion of the dump activity is presented in qualitative form due to the lack of data. Each component is integrated within the framework of the experience database. The implementation aspects of developing and using the experience database are also described in detail, and the practical relevance of this study is highlighted using an example.
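A sketch of the dynamic normalization idea with made-up cycle records and an assumed average haul speed: each haul's actual travel time is divided by the expected time for its haul-road length, so cycles from different roads become directly comparable.

```python
# Travel-time normalization sketch: ratio = actual / expected travel time,
# with expected time derived from haul-road length and an assumed speed.
SPEED_M_PER_MIN = 400.0                              # assumed average haul speed
hauls = [(1200, 3.4), (1200, 2.9), (2000, 5.6), (800, 2.4)]  # (length m, minutes)

for length, actual in hauls:
    expected = length / SPEED_M_PER_MIN
    ratio = actual / expected
    flag = "slow" if ratio > 1.1 else "ok"           # hypothetical threshold
    print(f"haul {length:>4} m: ratio {ratio:.2f} ({flag})")
```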