Browsing by Author "Rees, Loren P."
Now showing 1 - 20 of 54
- An Agent-Based Distributed Decision Support System Framework for Mediated Negotiation. LoPinto, Frank Anthony (Virginia Tech, 2004-04-26). Implementing an e-market for limited supply perishable asset (LiSPA) products is a problem at the intersection of online purchasing and distributed decision support systems (DistDSS). In this dissertation, we introduce and define LiSPA products, provide real-world examples, develop a framework for a distributed system to implement an e-market for LiSPA products, and provide proof-of-concept for the two major components of the framework. The DistDSS framework requires customers to instantiate agents that learn their preferences and evaluate products on their behalf. Accurately eliciting and modeling customer preferences in a quick and easy manner is a major hurdle for implementing this agent-based system. A methodology is developed for this problem using conjoint analysis and neural networks. The framework also contains a model component that is addressed in this work. The model component is presented as a mediator of customer negotiation that uses the agent-based preference models mentioned above and employs a linear programming model to maximize overall satisfaction of the total market.
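To make the mediator's final step concrete, the sketch below sets up a small allocation LP of the kind the abstract describes: scarce perishable units are assigned to customers to maximize total predicted satisfaction. The satisfaction matrix, supply levels, and one-unit-per-customer rule are all invented for illustration; this is not the dissertation's actual formulation.

```python
# A minimal sketch (not the dissertation's model): a mediator LP allocating
# scarce units of a perishable product to maximize total predicted satisfaction.
import numpy as np
from scipy.optimize import linprog

# satisfaction[i][j]: hypothetical agent-predicted satisfaction of customer i
# receiving product variant j (in practice these would come from the learned
# preference models).
satisfaction = np.array([[0.9, 0.4],
                         [0.6, 0.8],
                         [0.3, 0.7]])
n_customers, n_products = satisfaction.shape
supply = [2, 1]  # limited supply of each variant

# Decision variables x[i, j] in [0, 1]; linprog minimizes, so negate to maximize.
c = -satisfaction.flatten()

# Supply constraints: sum_i x[i, j] <= supply[j]
A_supply = np.zeros((n_products, n_customers * n_products))
for j in range(n_products):
    for i in range(n_customers):
        A_supply[j, i * n_products + j] = 1.0

# Each customer receives at most one unit in total: sum_j x[i, j] <= 1
A_cust = np.zeros((n_customers, n_customers * n_products))
for i in range(n_customers):
    A_cust[i, i * n_products:(i + 1) * n_products] = 1.0

res = linprog(c,
              A_ub=np.vstack([A_supply, A_cust]),
              b_ub=np.concatenate([supply, np.ones(n_customers)]),
              bounds=[(0, 1)] * (n_customers * n_products))
print(res.x.reshape(n_customers, n_products))  # mediated allocation
```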
- Automated extraction of product feedback from online reviews: Improving efficiency, value, and total yield. Goldberg, David Michael (Virginia Tech, 2019-04-25). In recent years, the expansion of online media has presented firms with rich and voluminous new datasets with profound business applications. Among these, online reviews provide nuanced details on consumers' interactions with products. Analysis of these reviews has enormous potential, but the enormity of the data and the nature of unstructured text make mining these insights challenging and time-consuming. This work presents three studies examining this problem and suggesting techniques for automated extraction of vital insights. The first study examines the problem of identifying mentions of safety hazards in online reviews. Discussions of hazards may have profound importance for firms and regulators as they seek to protect consumers. However, as most online reviews do not pertain to safety hazards, identifying this small portion of reviews is a challenging problem. Much of the literature in this domain focuses on selecting "smoke terms," or specific words and phrases closely associated with mentions of safety hazards. We first examine and evaluate prior techniques to identify these reviews, which incorporate substantial human opinion in curating smoke terms and thus vary in their effectiveness. We propose a new automated method that utilizes a heuristic to curate smoke terms, and we find that this method is far more efficient than the human-driven techniques. Finally, we incorporate consumers' star ratings in our analysis, further improving prediction of safety hazard-related discussions. The second study examines the identification of consumer-sourced innovation ideas and opportunities from online reviews. We build upon a widely-accepted attribute mapping framework from the entrepreneurship literature for evaluating and comparing product attributes. We first adapt this framework for use in the analysis of online reviews. Then, we develop analytical techniques based on smoke terms for automated identification of innovation opportunities mentioned in online reviews. These techniques can be used to profile products as to attributes that affect or have the potential to affect their competitive standing. In collaboration with a large countertop appliances manufacturer, we assess and validate the usefulness of these suggestions, tying together the theoretical value of the attribute mapping framework and the practical value of identifying innovation-related discussions in online reviews. The third study addresses safety hazard monitoring for use cases in which a higher yield of detected safety hazards is desirable. We note a trade-off between the efficiency of the hazard-detection techniques described in the first study and their depth: a high proportion of identified records refer to true hazards, but several important hazards may go undetected. We suggest several techniques for handling this trade-off, including alternate objective functions for heuristics and fuzzy term matching, which improve the total yield. We examine the efficacy of each of these techniques and contrast their merits with past techniques. Finally, we test the capability of these methods to generalize to online reviews across different product categories.
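A minimal sketch of the smoke-term idea: score each review by the summed weights of matched terms and rank reviews for human follow-up. The terms and weights below are invented; the dissertation's curated lists and weighting heuristic are not reproduced here.

```python
# Hypothetical smoke terms and weights, for illustration only.
smoke_terms = {"caught fire": 5.0, "overheated": 4.0, "shock": 3.5,
               "burn": 3.0, "recall": 2.5}

def hazard_score(review: str) -> float:
    """Sum the weights of smoke terms appearing in a review."""
    text = review.lower()
    return sum(w for term, w in smoke_terms.items() if term in text)

reviews = [
    "The unit overheated and nearly caught fire after a week.",
    "Great color, arrived on time, works as expected.",
]
# Rank reviews so the likeliest hazard reports surface first for human review.
for r in sorted(reviews, key=hazard_score, reverse=True):
    print(f"{hazard_score(r):5.1f}  {r}")
```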
- Building a knowledge based simulation optimization system with discovery learning. Siochi, Fernando C. (Virginia Tech, 1995). Simulation optimization is a developing research area whereby one seeks a set of input conditions to a simulation model that produces a desirable output (or outputs). Although many approaches to simulation optimization have been developed, the research area is by no means mature. This research makes three contributions in the area of simulation optimization. The first is fundamental in that it examines simulation outputs, called "response surfaces," and notes their behavior. In particular, both point and region estimates are studied for different response surfaces. Conclusions are developed that indicate when and where simulation-optimization techniques such as Response Surface Methodology should be applied. The second contribution provides assistance in selecting a region in which to begin a simulation-optimization search. The new method is based upon best-first search, an approach from artificial intelligence. Two examples of the method are given. The final contribution of this research expands upon the ideas by Crouch for building a "Learner" to improve heuristics in simulation over time. The particular case of parameter-modification learning is developed and illustrated by example. The dissertation concludes with limitations and suggestions for future work.
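As a rough illustration of the second contribution, the sketch below uses best-first search to pick a promising starting region for a simulation-optimization search. The "simulation" is a noisy synthetic function, and the region grid, pilot-run scheme, and stopping rule are all invented.

```python
import heapq
import random

random.seed(4)

def simulate(x: float) -> float:  # stand-in for one noisy simulation run
    return -(x - 3.0) ** 2 + random.gauss(0.0, 0.5)

# Candidate regions: intervals of the input space, scored by a few pilot runs.
regions = [(lo, lo + 2.0) for lo in range(0, 10, 2)]

def score(region) -> float:
    lo, hi = region
    xs = [lo + (hi - lo) * k / 4 for k in range(5)]
    return sum(simulate(x) for x in xs) / len(xs)  # mean pilot response

# Best-first: always expand the highest-scoring region next, splitting it in
# half until the surviving interval is narrow enough to start a local search.
heap = [(-score(r), r) for r in regions]
heapq.heapify(heap)
while True:
    neg, (lo, hi) = heapq.heappop(heap)
    if hi - lo < 0.25:
        print(f"start the search near [{lo:.2f}, {hi:.2f}]")
        break
    mid = (lo + hi) / 2
    for sub in [(lo, mid), (mid, hi)]:
        heapq.heappush(heap, (-score(sub), sub))
```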
- Cellular manufacturing: applicability and system design. Leu, Yow-yuh (Virginia Tech, 1991-08-07). As competition has intensified, many American manufacturers have sought alternatives to rejuvenate their production systems. Cellular manufacturing systems have received considerable interest from both academics and practitioners. This research examines three major issues in cellular manufacturing that have not been adequately addressed: applicability, structural design, and operational design. Applicability, in this study, is concerned with discerning the circumstances in which cellular manufacturing is the system of choice. The methodology employed is simulation and two experimental studies are conducted. The objective of Experiment I, a 2 x 3 x 3 factorial design, is to investigate the role of setup time and move time on system performance and to gain insight into why and how one layout could outperform another. The results of Experiment I suggest that move time is a significant factor for job shops and that workload variation needs to be reduced if the performance of cellular manufacturing is to be improved. Experiment II evaluates the impact of setup time reduction and operational standardization on the performance of cellular manufacturing. The results of Experiment II suggest that cellular manufacturing is preferred if the following conditions exist: (1) well balanced workload, (2) standardized products, (3) standardized operations, and (4) setup times independent from processing times.
- Computer Network Routing with a Fuzzy Neural Network. Brande, Julia K. Jr. (Virginia Tech, 1997-11-07). The growing usage of computer networks is requiring improvements in network technologies and management techniques so users will receive high quality service. As more individuals transmit data through a computer network, the quality of service received by the users begins to degrade. A major aspect of computer networks that is vital to quality of service is data routing. A more effective method for routing data through a computer network can assist with the new problems being encountered with today's growing networks. Effective routing algorithms use various techniques to determine the most appropriate route for transmitting data. Determining the best route through a wide area network (WAN) requires the routing algorithm to obtain information concerning all of the nodes, links, and devices present on the network. The most relevant routing information involves various measures that are often obtained in an imprecise or inaccurate manner, thus suggesting that fuzzy reasoning is a natural method to employ in an improved routing scheme. The neural network is deemed a suitable accompaniment because it maintains the ability to learn in dynamic situations. Once the neural network is initially designed, any alterations in the computer routing environment can easily be learned by this adaptive artificial intelligence method. The capability to learn and adapt is essential in today's rapidly growing and changing computer networks. These techniques, fuzzy reasoning and neural networks, when combined provide a very effective routing algorithm for computer networks. Computer simulation is employed to demonstrate that the new fuzzy routing algorithm outperforms the Shortest Path First (SPF) algorithm in most computer network situations. The benefits increase as the computer network migrates from a stable network to a more variable one. The advantages of applying this fuzzy routing algorithm are apparent when considering the dynamic nature of modern computer networks.
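The sketch below illustrates the general idea of folding an imprecise link measure into route selection through a simple membership function, in contrast to a plain shortest-path cost. The graph, metrics, membership function, and cost blend are invented, and the dissertation's neural-network component is omitted entirely.

```python
import heapq

def congestion_membership(utilization: float) -> float:
    """Triangular membership for 'congested' (0 below 50% load, 1 at full load)."""
    return max(0.0, min(1.0, (utilization - 0.5) / 0.5))

# links[u] = list of (v, delay_ms, utilization); all values invented.
links = {
    "A": [("B", 10, 0.2), ("C", 5, 0.9)],
    "B": [("D", 10, 0.3)],
    "C": [("D", 5, 0.95)],
    "D": [],
}

def link_cost(delay: float, utilization: float) -> float:
    # Delay plus a fuzzy congestion penalty; plain SPF would use delay alone.
    return delay * (1.0 + 2.0 * congestion_membership(utilization))

def best_path(src: str, dst: str):
    """Dijkstra over the fuzzy-adjusted link costs."""
    heap, seen = [(0.0, src, [src])], set()
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for v, delay, util in links[node]:
            if v not in seen:
                heapq.heappush(heap, (cost + link_cost(delay, util), v, path + [v]))
    return float("inf"), []

print(best_path("A", "D"))  # avoids the nominally shorter but congested A-C-D
```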
- Consumer-Centric Innovation for Mobile Apps Empowered by Social Media Analytics. Qiao, Zhilei (Virginia Tech, 2018-06-20). Due to the rapid development of Internet communication technologies (ICTs), an increasing number of social media platforms exist where consumers can exchange comments online about products and services that businesses offer. The existing literature has demonstrated that online user-generated content can significantly influence consumer behavior and increase sales. However, research on its organizational impact has focused primarily on marketing, leaving other operational areas understudied. Hence, there is a pressing need for a research framework that explores the impact of online user-generated content on important organizational operations such as product innovation, customer relationship management, and operations management. Research efforts in this dissertation center on exploring the co-creation value of online consumer reviews, where consumers' demands influence firms' decision-making. The dissertation is composed of three studies. The first study finds empirical evidence that quality signals in online product reviews are predictors of the timing of firms' incremental innovation. Guided by product differentiation theory, the second study examines how companies' innovation and marketing differentiation strategies influence app performance. The last study proposes a novel text analytics framework to discover different information types from user reviews. The research contributes theoretical and practical insights to the consumer-centric innovation and social media analytics literature.
- Cost-based shop control using artificial neural networks. Wiegmann, Lars (Virginia Tech, 1992). The production control system of a shop consists of three stages: due-date prediction, order release, and job dispatching. The literature has dealt thoroughly with the third stage, but there is a paucity of study on either of the first two stages or on interaction between the stages. This dissertation focuses on the first stage of production control, due-date prediction, by examining methodologies for improved prediction that go beyond either practitioner or published approaches. In particular, artificial neural networks and regression nonlinear in its variables are considered. In addition, interactive effects with the third stage, shop-floor dispatching, are taken into consideration. The dissertation conducts three basic studies. The first examines neural networks and regression nonlinear in its variables as alternatives to conventional due-date prediction. The second proposes a new cost-based criterion and prediction methodology that explicitly includes costs of earliness and tardiness directly in the forecast; these costs may differ in form and/or degree from each other. And third, the benefit of tying together the first and third stages of production control is explored. The studies are conducted by statistically analyzing data generated from simulated shops. Results of the first study conclude that both neural networks and regression nonlinear in its variables are preferred significantly to approaches advanced to date in the literature and in practice. Moreover, in the second study, it is found that the consequences of not using the cost-based criterion can be profound, particularly if a firm's cost function is asymmetric about the due date. Finally, it is discovered that the integrative, interactive methodology developed in the third study is significantly superior to the current non-integrative and non-interactive approaches. In particular, interactive neural network prediction is found to excel in the presence of asymmetric cost functions, whereas regression nonlinear in its variables is preferable under symmetric costs.
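A small sketch of the cost-based criterion: rather than predicting mean flowtime, pick the due date that minimizes expected earliness/tardiness cost when the two cost rates differ. The flowtime distribution and cost rates below are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
flowtimes = rng.gamma(shape=4.0, scale=5.0, size=10_000)  # simulated job flowtimes

C_EARLY, C_TARDY = 1.0, 4.0  # asymmetric: tardiness hurts four times as much

def expected_cost(due_date: float) -> float:
    early = np.maximum(due_date - flowtimes, 0.0)
    tardy = np.maximum(flowtimes - due_date, 0.0)
    return float(np.mean(C_EARLY * early + C_TARDY * tardy))

candidates = np.linspace(flowtimes.min(), flowtimes.max(), 500)
best = candidates[np.argmin([expected_cost(d) for d in candidates])]
print(f"mean flowtime: {flowtimes.mean():.1f}, cost-minimizing due date: {best:.1f}")
# With tardiness costlier than earliness, the chosen due date sits well above
# the mean flowtime, analogous to a newsvendor critical-fractile solution.
```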
- Data Standardization and Machine Learning Models for Histopathology. Awaysheh, Abdullah Mamdouh (Virginia Tech, 2017-03-27). Machine learning can provide insight and support for a variety of decisions. In some areas of medicine, decision-support models are capable of assisting healthcare practitioners in making accurate diagnoses. In this work we explored the application of these techniques to distinguish between two diseases in veterinary medicine: inflammatory bowel disease (IBD) and alimentary lymphoma (ALA). Both disorders are common gastrointestinal (GI) diseases in humans and animals that share very similar clinical and pathological outcomes. Because of these similarities, distinguishing between these two diseases can sometimes be challenging. In order to identify patterns that may help with this differentiation, we retrospectively mined medical records from dogs and cats with histopathologically diagnosed GI diseases. Since the pathology report is the key conveyer of this information in the medical records, our first study focused on its information structure. Other groups have had a similar interest. In 2008, to help ensure consistent reporting, the World Small Animal Veterinary Association (WSAVA) GI International Standardization Group proposed standards for recording histopathological findings (HF) from GI biopsy samples. In our work, we extend the WSAVA efforts and propose an information model (composed of information structure and terminology mapped to the Systematized Nomenclature of Medicine - Clinical Terms) to be used when recording histopathological diagnoses (HDX, one or more HF from one or more tissues). Next, our aim was to identify free-text HF not currently expressed in the WSAVA format that may provide evidence for distinguishing between IBD and ALA in cats. As part of this work, we hypothesized that WSAVA-based structured reports would have higher classification accuracy for GI disorders than the unstructured free-text format. We trained machine learning models on 60 structured and, independently, 60 unstructured reports. Results show that the unstructured-information-based models, using two machine learning algorithms, achieved higher accuracy in predicting the diagnosis than the structured-information-based models, and some novel free-text features were identified for possible inclusion in the WSAVA reports. In our third study, we tested the use of machine learning algorithms to differentiate between IBD and ALA using complete blood count and serum chemistry data. Three models (using naïve Bayes, neural networks, and C4.5 decision trees) were trained and tested on laboratory results for 40 normal, 40 IBD, and 40 ALA cats. Diagnostic models achieved classification sensitivity ranging between 63% and 71%, with naïve Bayes and neural networks being superior. These models can provide another non-invasive diagnostic tool to assist with differentiating between IBD and ALA, and between diseased and non-diseased cats. We believe that relying on our information model for histopathological reporting can lead to a more complete, consistent, and computable knowledgebase in which machine learning algorithms can more efficiently identify these and other disease patterns.
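A minimal sketch of the third study's setup, with synthetic stand-ins for the laboratory values (the real work used complete blood count and serum chemistry results from 40 normal, 40 IBD, and 40 ALA cats and also compared neural networks and C4.5 trees):

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
# Invented feature vectors, e.g. [albumin, globulin, lymphocyte count].
X_normal = rng.normal([3.2, 3.0, 3.5], 0.3, size=(40, 3))
X_ibd    = rng.normal([2.8, 3.4, 2.8], 0.3, size=(40, 3))
X_ala    = rng.normal([2.5, 3.6, 2.2], 0.3, size=(40, 3))
X = np.vstack([X_normal, X_ibd, X_ala])
y = np.array(["normal"] * 40 + ["IBD"] * 40 + ["ALA"] * 40)

# One of the three model families mentioned above, evaluated by cross-validation.
scores = cross_val_score(GaussianNB(), X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```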
- Decision support for long-range, community-based planning to mitigate against and recover from potential multiple disasters. Chacko, Josey; Rees, Loren P.; Zobel, Christopher W.; Rakes, Terry R.; Russell, Roberta S.; Ragsdale, Cliff T. (Elsevier, 2016-07-01). This paper discusses a new mathematical model for community-driven disaster planning that is intended to help decision makers exploit the synergies resulting from simultaneously considering actions focusing on mitigation and efforts geared toward long-term recovery. The model is keyed on enabling long-term community resilience in the face of potential disasters of varying types, frequencies, and severities, and the approach’s highly iterative nature is facilitated by the model’s implementation in the context of a Decision Support System. Three examples from Mombasa, Kenya, East Africa, are discussed and compared in order to demonstrate the advantages of the new mathematical model over the current ad hoc mitigation and long-term recovery planning approaches that are typically used.
- A decision support system for integrated design analysis of a repairable item and its logistic support system. Reasor, Roderick J. (Virginia Tech, 1990). Design of a repairable item and its logistic support system requires consideration of several interrelated decision problems. These decision problems concern the variables, controllable by the design engineer and/or system manager, which affect system performance. This research develops a framework for integration of these decision problems and evaluation of system design tradeoffs. These design decision problems are represented in the model base of a decision support system (DSS). Interrelationships between decision problems are defined using data flow diagrams. Data flows within and between these decision problems are integrated in the DSS database. A simulation capability embedded in the DSS permits short-term, accelerated-time excursions into possible futures for decision-making purposes. Alternative system designs are evaluated using a multicriteria decision model which considers reliability, maintainability, availability, and life cycle costs. The logistic support system is modeled as a multilevel inventory system. These inventories include spare repairable items, spare parts, labor, maintenance equipment, and other support resources. Repairable item and logistic support system design decision problems affect the quantity and location of these inventories. Five decision problems identified by Moore [1986] were selected to demonstrate the utility of this framework. The selected decision problems are: 1) the equipment design problem; 2) the maintenance configuration problem; 3) the spare equipment problem; 4) the level of repair problem; and 5) the replacement policy problem. The framework developed supports integration of these decision problems throughout the item’s life cycle. A repairable item can be systematically divided into subelements until individual repairable components are identified. This systematic subdivision of the item produces an inverted, tree-like structure. This structure is used as the representational view of the DSS database. As the life cycle progresses and the item design becomes more detailed, the structure expands. The DSS database is designed to accommodate this expansion so that the framework can be used throughout the item’s life cycle. The initial fielding and the retirement of the repairable item population produce nonstationary demands on the logistics support system. A multistream model captures the nonstationary aspects of demand, eliminating the need for item-by-item tracking within the model. The framework developed is illustrated using a comprehensive case study. The case study addresses the design of a Side Loadable Warping Tug (SLWT) and its logistics support system. A population of SLWTs must be deployed to meet demands in two different operating environments. The SLWT is a component of the U.S. Navy’s Container Offloading and Transfer System (COTS).
- A Decision Support System for the Electrical Power Districting Problem. Bergey, Paul K. (Virginia Tech, 2000-04-21). Due to a variety of political, economic, and technological factors, many national electricity industries around the globe are transforming from non-competitive monopolies with centralized systems to decentralized operations with competitive business units. This process, commonly referred to as deregulation (or liberalization), is driven by the belief that a monopolistic industry fails to achieve economic efficiency for consumers over the long run. Deregulation has occurred in a number of industries such as aviation, natural gas, transportation, and telecommunications. The most recent movement involving the deregulation of the electricity marketplace is expected to yield consumer benefit as well. To facilitate deregulation of the electricity marketplace, competitive business units must be established to manage various functions and services independently. In addition, these business units must be given physical property rights for certain parts of the transmission and distribution network in order to provide reliable service and make effective business decisions. However, partitioning a physical power grid into economically viable districts involves many considerations. We refer to this complex problem as the electrical power districting problem. This research is intended to identify the necessary and fundamental characteristics to appropriately model and solve an electrical power districting problem. Specifically, the objectives of this research are five-fold. First, to identify the issues relevant to electrical power districting problems. Second, to investigate the similarities and differences of electrical power districting problems with other districting problems published in the research literature. Third, to develop and recommend an appropriate solution methodology for electrical power districting problems. Fourth, to demonstrate the effectiveness of the proposed solution method for a specific case of electric power districting in the Republic of Ghana, with data provided by the World Bank. Finally, to develop a decision support system for the decision makers at the World Bank for solving Ghana's electrical power districting problem.
- A decision support system for tuition and fee policy analysis. Greenwood, Allen G. (Virginia Polytechnic Institute and State University, 1984). Tuition and fees are a major source of income for colleges and universities and a major portion of the cost of a student's education. The university administration's task of making sound and effective tuition and fee policy decisions is becoming both more critical and more complex. This is a result of the increased reliance on student-generated tuition-and-fee income, the declining college-age student population, reductions in state and Federal funds, and escalating costs of operation. The comprehensive computerized decision support system (DSS) developed in this research enhances the administration's planning, decision-making, and policy-setting processes. It integrates data and reports with modeling and analysis in order to provide a systematic means for analyzing tuition and fee problems, at a detailed and sophisticated level, without the user having to be an expert in management science techniques or computers. The DSS with its embedded multi-year goal programming (GP) model allocates the university's revenue requirements to charges for individual student categories based on a set of user-defined objectives, constraints, and priorities. The system translates the mathematical programming model into a valuable decision-making aid by making it directly and readily accessible to the administration. The arduous tasks of model formulation and solution, the calculation of the model's parameter values, and the generation of a series of reports to document the results are performed by the system; the user is responsible for defining the problem framework, selecting the goals, setting the targets, establishing the priority structure, and assessing the solution. The DSS architecture is defined in terms of three highly integrated subsystems - dialog, data, and models - that provide the following functions: user/system interface, program integration, process control, data storage and handling, mathematical, statistical, and financial computations, as well as display, memory aid, and report generation. The software was developed using four programming languages/systems: EXEC 2, FORTRAN, IFPS, and LINDO. While the system was developed, tested, and implemented at Virginia Polytechnic Institute and State University, the concepts developed in this research are general enough to be applied to any public institution of higher education.
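A toy weighted goal program in the spirit described above: choose per-category charges to hit a revenue target while penalizing increases beyond a policy cap. All figures, goals, and weights are invented, and the actual model was multi-year with a user-set priority structure.

```python
import numpy as np
from scipy.optimize import linprog

enroll = np.array([8000, 2000])       # in-state, out-of-state headcount
current = np.array([1500.0, 3500.0])  # current charges per student
revenue_target = 8000 * 1700 + 2000 * 4000

# Variables: [t1, t2, d_rev_minus, d_rev_plus, d_inc1_plus, d_inc2_plus]
# Goal 1: enroll . t + d_rev_minus - d_rev_plus = revenue_target
# Goal 2: t_i - d_inci_plus <= 1.10 * current_i (d captures overshoot past 10%)
c = [0, 0, 5.0, 5.0, 1.0, 1.0]        # penalize revenue misses most heavily
A_eq = [[enroll[0], enroll[1], 1, -1, 0, 0]]
b_eq = [revenue_target]
A_ub = [[1, 0, 0, 0, -1, 0],
        [0, 1, 0, 0, 0, -1]]
b_ub = list(1.10 * current)
bounds = [(current[0], None), (current[1], None)] + [(0, None)] * 4

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
t = res.x[:2]
print(f"in-state charge: {t[0]:.0f}, out-of-state charge: {t[1]:.0f}")
```

The deviation variables are what make this a goal program rather than a plain LP: each goal can be missed, and the objective prices the misses according to the administration's priorities.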
- Decision support systems design: a nursing scheduling application. Ceccucci, Wendy A. (Virginia Tech, 1994-01-28). The systems development life cycle (SDLC) has been the traditional method of decision support systems design. However, in the last decade several methodologies have been introduced to address the limitations arising in the use of the traditional method. These approaches include Courban's iterative design, Keen's adaptive design, prototyping, and a number of mixed methodologies incorporating prototyping into the SDLC. Each of the previously established design methodologies has a number of differing characteristics that make each of them a more suitable strategy for certain environments. However, in some environments the current methodologies present certain limitations or unnecessary expenditures. These limitations suggest the need for an alternative methodology. This dissertation develops a new methodology, priority design, to meet this need. To determine what methodology would be most effective in a given situation, an analysis of the operating environment must be performed. Such issues as project complexity, project uncertainty, and limited user involvement must be addressed. This dissertation develops a set of guidelines to assist in this analysis. For clarity, the guidelines are applied to three well-documented case studies. As an application of the priority design methodology, a decision support system for nurse scheduling is developed. The development of a useful DSS for nurse scheduling requires that projected staff requirements and issues of both coverage and differential assignment of personnel be addressed.
- Design and Application of Genetic Algorithms for the Multiple Traveling Salesperson Assignment Problem. Carter, Arthur E. (Virginia Tech, 2003-04-21). The multiple traveling salesmen problem (MTSP) is an extension of the traveling salesman problem with many production and scheduling applications. The TSP has been well studied, including methods of solving the problem with genetic algorithms. The MTSP has also been studied and solved with GAs in the form of the vehicle-scheduling problem. This work presents a new modeling methodology for setting up the MTSP to be solved using a GA. The advantages of the new model are compared to existing models both mathematically and experimentally. The model is also used to model and solve a multi-line production problem in a spreadsheet environment. The new model proves itself to be an effective method of modeling the MTSP for solution with GAs. The concept of the MTSP is then used to model and solve, with a GA, the case in which one salesperson makes many short tours to visit all the cities rather than one continuous trip. While this problem uses only one salesperson, it can be modeled as an MTSP and has many applications for people who must visit many cities on a number of short trips. The method effectively creates a schedule while considering all required constraints.
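A minimal sketch of one common GA encoding for the MTSP: a chromosome holding a city permutation plus a vector giving how many cities each salesperson visits. The cities, depot, and mutation-only search below are invented and are not the dissertation's operators.

```python
import random

random.seed(3)
N_CITIES, N_SALES = 12, 3
coords = [(random.random(), random.random()) for _ in range(N_CITIES)]

def dist(a, b):
    return ((coords[a][0] - coords[b][0]) ** 2 +
            (coords[a][1] - coords[b][1]) ** 2) ** 0.5

def tour_cost(perm, sizes, depot=0):
    """Split the permutation into per-salesperson tours, all leaving a depot."""
    total, start = 0.0, 0
    for size in sizes:
        tour = [depot] + perm[start:start + size] + [depot]
        total += sum(dist(tour[i], tour[i + 1]) for i in range(len(tour) - 1))
        start += size
    return total

def random_solution():
    perm = list(range(1, N_CITIES))
    random.shuffle(perm)
    cuts = sorted(random.sample(range(1, len(perm)), N_SALES - 1))
    sizes = [b - a for a, b in zip([0] + cuts, cuts + [len(perm)])]
    return perm, sizes

def mutate(sol):
    # Swap two cities; for brevity only the permutation part is mutated.
    perm, sizes = list(sol[0]), list(sol[1])
    i, j = random.sample(range(len(perm)), 2)
    perm[i], perm[j] = perm[j], perm[i]
    return perm, sizes

best = min((random_solution() for _ in range(50)), key=lambda s: tour_cost(*s))
for _ in range(5000):  # simple (1+1) evolutionary search
    cand = mutate(best)
    if tour_cost(*cand) <= tour_cost(*best):
        best = cand
print(round(tour_cost(*best), 3), best[1])
```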
- Designing Massive 3-Dimensional Neural Networks with Chromosomal-Based Simulated Development. Schinazi, Robert Glen (Virginia Tech, 1995-12-04). A technique for designing and optimizing the next generation of smart process controllers has been developed in this dissertation. The literature review indicated that neural networks held the most promise for this application, yet fundamental limitations have prevented their introduction to commercial settings thus far. These limitations have been overcome through the enhancement of neural network theory. The approach taken in this research was to produce highly intelligent process control systems by accurately modeling the nervous structures of higher biological organisms. The mammalian cerebral cortex was selected as the primary model since it is the only computational element capable of interpreting complex patterns that develop over time. However, the choice of the mammalian cerebral cortex as the model introduced two new levels of network complexity. First, the cerebral cortex is a three-dimensional structure with extremely complicated patterns of interconnectivity. Second, the structure of the cerebral cortex can only be realized when thousands or millions of neurons are integrated into a massive-scale neural network. The neural networks developed in this research were designed around Hebbian adaptation, the only training technique proven by the literature review to be applicable to massive-scale networks. These design difficulties were resolved by modeling not only the cerebral cortex but also the process by which it develops and evolves in biological systems. To complete this model, an advanced genetic algorithm was produced, and a technique was developed to encode all functional and structural parameters that define the cerebral cortex into the artificial chromosome. The neural networks were designed by a cell growth simulation program that decoded the structural and functional information on the chromosome. The cell growth simulation program is capable of producing patterns of differentiation unique to any slight variation in the genetic parameters. These growth patterns are similar to patterns of cellular differentiation seen in biological systems. While the computational resources needed to implement a massive-scale neural network are beyond those available in existing computer systems, the technique has produced output lists which fully define the interconnections and functional characteristics of the neurons, thereby laying the foundation for their future use in process control.
- A Deterministic Approach to Partitioning Neural Network Training Data for the Classification Problem. Smith, Gregory Edward (Virginia Tech, 2006-08-07). The classification problem in discriminant analysis involves identifying a function that accurately classifies observations as originating from one of two or more mutually exclusive groups. Because no single classification technique works best for all problems, many different techniques have been developed. For business applications, neural networks have become the most commonly used classification technique and though they often outperform traditional statistical classification methods, their performance may be hindered because of failings in the use of training data. This problem can be exacerbated because of small data set size. In this dissertation, we identify and discuss a number of potential problems with typical random partitioning of neural network training data for the classification problem and introduce deterministic methods to partitioning that overcome these obstacles and improve classification accuracy on new validation data. A traditional statistical distance measure enables this deterministic partitioning. Heuristics for both the two-group classification problem and k-group classification problem are presented. We show that these heuristics result in generalizable neural network models that produce more accurate classification results, on average, than several commonly used classification techniques. In addition, we compare several two-group simulated and real-world data sets with respect to the interior and boundary positions of observations within their groups' convex polyhedrons. We show by example that projecting the interior points of simulated data to the boundary of their group polyhedrons generates convex shapes similar to real-world data group convex polyhedrons. Our two-group deterministic partitioning heuristic is then applied to the repositioned simulated data, producing results superior to several commonly used classification techniques.
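A sketch of distance-driven deterministic partitioning: score each observation's Mahalanobis distance to its group centroid and split on it. The specific rule below (interior points to validation, boundary points to training) is one plausible variant, not necessarily the dissertation's heuristic, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)
group = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], size=100)

mean = group.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(group, rowvar=False))
centered = group - mean
d2 = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)  # squared Mahalanobis

order = np.argsort(d2)             # interior observations first, boundary last
validation = group[order[:30]]     # innermost 30 observations
training = group[order[30:]]       # the rest, including the group boundary
print(training.shape, validation.shape)
```

Unlike a random split, this assignment is reproducible and guarantees the training set sees the group's boundary region, which is where classification errors concentrate.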
- Development and evaluation of methods for structured recording of heart murmur findings using SNOMED CT® post-coordination. Green, Julie Meadows (Virginia Tech, 2004-12-10). Objective: Structured recording of examination findings, such as heart murmurs, is important for effective retrieval and analysis of data. Our study proposes two models for post-coordinating murmur findings and evaluates their ability to record murmurs found in clinical records. Methods: Two models were proposed for post-coordinating murmur findings: the Concept-dependent Attributes model and the Interprets/Has interpretation model. A micro-nomenclature was created based on each model by using the subset and extension mechanisms provided for by the SNOMED-CT® framework. Within each micro-nomenclature a partonomy of cardiac cycle timing values was generated. In order for each model to be capable of representing clinical data, a mechanism for handling range values was developed. One hundred murmurs taken from clinical records were entered into two systems that were built based on each model to enter and display murmur data. Results: Both models were able to record all 100 murmur findings; both required the addition of the same number of concepts into their respective micro-nomenclatures. However, the Interprets/Has interpretation model required twice the storage space for recording murmurs. Conclusion: We found little difference in the requirements for implementation of either model. In fact, data stored using these models could be easily inter-converted. This will allow system developers to choose a model based on their own preferences. If at a later date a method is chosen for modeling within SNOMED-CT, the data can be converted to conform if necessary.
- Development of an Assortment Planning Model for Fashion Sensitive Products. Kang, Keang-Young (Virginia Tech, 1999-04-12). The purpose of this research is to develop an established assortment-planning model identifying procedures and activities for women's wear retail buyers. This research built three assortment-planning models: (a) a conceptual model based on a secondary data analysis, (b) a practical-use model based on interviews using a questionnaire and a set of activity cards, and (c) a suggested model based on the connection analysis of the previous two models. The Integrated DEFinition (IDEF) functional modeling method was used to describe the procedures and variables of the functional activities of assortment planning and to increase the consistency of the model-development process. The variables of functional activities were determined as input, mechanism, constraint, connection, and output based on the IDEF0 diagram format. Other research and pilot interviews confirmed the reliability of the methodology. Experts and interviewees validated the three models. The abstract level of the suggested assortment-planning model included the following concepts: (a) problem recognition, (b) information search, (c) qualitative evaluation, (d) quantitative evaluation, (e) product selection plan, and (f) sales plan.
- Development of an ontology of animals in context within the OBO Foundry framework from a SNOMED-CT extension and subset. Santamaria, Suzanne Lamar (Virginia Tech, 2012-04-25). Animal classification needs vary by use and application. The Linnaean taxonomy is an important animal classification scheme but does not portray key animal identifying information like sex, age group, physiologic stage, living environment, and role in production systems such as farms. Ontologies are created and used for defining, organizing, and classifying information in a domain to enable learning and sharing of information. This work develops an ontology of animal classes that form the basis for communication of animal identifying information among animal managers, medical professionals caring for animals, and biomedical researchers involved in disciplines as diverse as wildlife ecology and dairy science. The Animals in Context Ontology (ACO) was created from an extension and subset of the Systematized Nomenclature of Medicine - Clinical Terms (SNOMED-CT). The principles of the Open Biological and Biomedical Ontologies (OBO) Foundry were followed, and freely available tools were used. ACO includes normal development and physiologic animal classes as well as animal classes where humans have assigned the animal's role. ACO is interoperable with and includes classes from other OBO Foundry ontologies such as the Gene Ontology (GO). Meeting many of the OBO Foundry principles was straightforward, but difficulties were encountered with missing and problematic content in some of the OBO ontologies. Additions and corrections were submitted to four ontologies. Some information in ACO could not be represented formally because of inconsistency in husbandry practices. ACO classes are of interest to science, medicine, and agriculture, and can connect information between animal and human systems to enable knowledge discovery.
- Disruption Information, Network Topology and Supply Chain Resilience. Li, Yuhong (Virginia Tech, 2017-07-17). This dissertation consists of three essays studying three closely related aspects of supply chain resilience. The first essay is "Value of Supply Disruption Information and Information Accuracy", in which we examine the factors that influence the value of supply disruption information, investigate how information accuracy influences this value, and provide managerial suggestions to practitioners. The study is motivated by the fact that fully accurate disruption information may be difficult and costly to obtain and inaccurate disruption information can decrease the financial benefit of prior knowledge and even lead to negative performance. We perform the analysis by adopting a newsvendor model. The results show that information accuracy, specifically information bias and information variance, plays an important role in determining the value of disruption information. However, this influence varies at different levels of disruption severity and resilience capacity. The second essay is "Quantifying Supply Chain Resilience: A Dynamic Approach", in which we provide a new type of quantitative framework for assessing network resilience. This framework includes three basic elements: robustness, recoverability and resilience, which can be assessed with respect to different performance measures. Then we present a comprehensive analysis on how network structure and other parameters influence these different elements. The results of this analysis clearly show that both researchers and practitioners should be aware of the possible tradeoffs among different aspects of supply chain resilience. The ability of the framework to support better decision making is then illustrated through a systemic analysis based on a real supply chain network. The third essay is "Network Characteristics and Supply Chain Disruption Resilience", in which we investigate the relationships between network characteristics and supply chain resilience. In this work, we first prove that investigating network characteristics can lead to a better understanding of supply chain resilience behaviors. Later we select key characteristics that play a critical role in determining network resilience. We then construct the regression and decision tree models of different supply chain resilience measures, which can be used to estimate supply chain network resilience given the key influential characteristics. Finally, we conduct a case study to examine the estimation accuracy.
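For the first essay, a minimal newsvendor-style sketch of how the value of advance disruption information can be computed, and how signal accuracy enters through a Bayes update. Every parameter below (demand, margins, disruption probability, accuracy) is invented.

```python
import numpy as np

rng = np.random.default_rng(2)
price, cost = 10.0, 4.0
demand = rng.normal(100, 20, size=100_000).clip(min=0)
p_disrupt = 0.3   # chance the supplier delivers nothing (payment already sunk)
accuracy = 0.9    # P(signal "disrupt" | disruption) = P(signal "ok" | no disruption)

def expected_profit(q: float, p: float) -> float:
    sales = np.mean(np.minimum(q, demand))
    return (1 - p) * price * sales - cost * q  # order cost paid regardless

def best_q(p: float) -> float:
    qs = np.arange(0, 201)
    return max(qs, key=lambda q: expected_profit(q, p))

# Baseline: plan against the prior disruption probability only.
v_prior = expected_profit(best_q(p_disrupt), p_disrupt)

# With an imperfect advance signal: Bayes-update, replan, average over signals.
v_signal = 0.0
for signal_disrupt, like in [(True, accuracy), (False, 1 - accuracy)]:
    p_s = like * p_disrupt + (1 - like) * (1 - p_disrupt)  # P(this signal)
    post = like * p_disrupt / p_s                          # P(disrupt | signal)
    v_signal += p_s * expected_profit(best_q(post), post)

print(f"value of the (imperfect) signal: {v_signal - v_prior:.1f} per period")
```

Pushing `accuracy` toward 0.5 makes the posterior collapse to the prior and the information value vanish, which is the accuracy effect the essay studies.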