Browsing by Author "Taylor, G. Don"
- Collection-and-Delivery-Points: A Variation on a Location-Routing Problem. Savage, Laura Elizabeth (Virginia Tech, 2019-09-20). Missed deliveries are a major issue for package carriers and a source of great hassle for the customers. Either the carrier attempts to redeliver the package, incurring the additional expense of visiting the same house up to three times, or they leave the package on the doorstep, vulnerable to package thieves. In this dissertation, a system of collection-and-delivery-points (CDPs) is proposed to improve customer service and reduce carrier costs. A CDP is a place, either in an existing business or a new location, where the carrier drops any missed deliveries and the customers can pick up the packages at their convenience. To examine the viability of a CDP system in North America, a variation on a location-routing problem (LRP) was created, a mixed-integer programming model called the CDP-LRP. Unlike standard LRPs, the CDP-LRP takes into account both the delivery truck route distance and the direct customer travel to the CDPs. Also, the facilities being placed are not located at the beginning and end of the truck routes, but are stops along the routes. After testing, it became clear that, because of the size and complexity of the problem, the CDP-LRP cannot be solved exactly in a reasonable amount of time. Heuristics developed for the standard LRP cannot be applied to the CDP-LRP because of the differences between the models. Therefore, three heuristics were created to approximate the solution to the CDP-LRP, each with two different embedded modified vehicle routing problem (VRP) algorithms, the Clarke-Wright and the Sweep, modified to handle the additional restrictions imposed by the CDPs. The first, Replacement, is an improvement heuristic in which each closed CDP is tested as a replacement for each open CDP, and the move that creates the most savings is implemented. The second, OpenAll, begins with every CDP open and closes them one at a time, while the third, OpenOne, does the reverse, beginning with only one open CDP and opening the others one by one. In each case, a penalty is applied if the customer travel distance is too long. Each heuristic was tested for each possible number of open CDPs, and the least expensive solution was chosen as the best. Each heuristic and VRP algorithm combination was tested using three delivery failure rates and different data sets: three small data sets pulled from the VRP literature, and randomly generated clustered and uniformly distributed data sets with three different numbers of customers. OpenAll and OpenOne produced better solutions than Replacement in most instances, and the Sweep algorithm outperformed the Clarke-Wright in both solution quality and time in almost every test. To judge the quality of the heuristic solutions, the results were compared to the results of a simple locate-first, route-second sequential algorithm that represents the way the decision would commonly be made in industry today. The CDPs were located using a simple facility location model, then the delivery routes were created with the Sweep algorithm. These results were mixed: for the uniformly distributed data sets, if the customer travel penalty threshold and customer density are low enough, the heuristics outperform the sequential algorithm. For the clustered data sets, the sequential algorithm produces solutions as good as or slightly better than the heuristics, because the location of the potential CDPs inside the clusters means that the penalty has less impact, and the addition of more open CDPs has less effect on the delivery route distances. The heuristic solutions were also compared to a second value, the route costs incurred by the carrier in the current system of redeliveries, calculated by placing additional customers in the routes and running the Sweep algorithm, to judge the potential savings that could be realized by implementing a CDP system in North America. Though in some circumstances the current system is less expensive, depending on the geographic distribution of the customers and the delivery failure rate, in other circumstances the cost savings to the carrier could be as high as 27.1%. Though the decision of whether or not to set up a CDP system in an area would need to be made on a case-by-case basis, the results of this study suggest that such a system could be successful in North America.
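The embedded Sweep heuristic named above is a classic VRP construction. As a rough illustration only (a minimal sketch, not the dissertation's implementation, and omitting the CDP stops and the customer-travel penalty layered on top of it), the Sweep idea sorts customers by polar angle around the depot and starts a new route whenever vehicle capacity would be exceeded:

```python
import math

def sweep_routes(depot, customers, capacity):
    """Minimal Sweep construction: order customers by polar angle around
    the depot, then start a new route whenever adding the next customer
    would exceed vehicle capacity. (Illustrative only; the CDP-LRP
    heuristics add CDP stops and a customer-travel penalty on top of a
    construction like this.)"""
    def angle(c):
        return math.atan2(c["y"] - depot["y"], c["x"] - depot["x"])

    routes, current, load = [], [], 0
    for c in sorted(customers, key=angle):
        if current and load + c["demand"] > capacity:
            routes.append(current)   # close the current route
            current, load = [], 0
        current.append(c["id"])
        load += c["demand"]
    if current:
        routes.append(current)
    return routes

# toy instance: eight unit-demand customers on a circle, capacity-3 trucks
customers = [{"id": i, "x": math.cos(i), "y": math.sin(i), "demand": 1}
             for i in range(8)]
print(sweep_routes({"x": 0.0, "y": 0.0}, customers, capacity=3))
```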
- A Complex Adaptive Systems Analysis of Productive Efficiency. Dougherty, Francis Laverne (Virginia Tech, 2014-10-17). Linkages between Complex Adaptive Systems (CAS) thinking and efficiency analysis remain in their infancy. This research associates the basic building blocks of the CAS 'flocking' metaphor with the essential building-block concepts of Data Envelopment Analysis (DEA). Within the proposed framework, DEA "decision-making units" (DMUs) are represented as agents in the agent-based modeling (ABM) paradigm. Guided by simple rules, agent DMUs representing business units of a larger management system 'align' with one another to achieve mutual protection/risk reduction and 'cohere' with the most efficient DMUs among them to achieve the greatest possible efficiency in the least possible time. Analysis of the resulting patterns of behavior can provide policy insights that are both evidence-based and intuitive. This research introduces a consistent methodology, called here the Complex Adaptive Productive Efficiency Method (CAPEM), and employs it to bridge these domains. The research formalizes CAPEM mathematically and graphically, then conducts experimentation with the resulting CAPEM simulation using data from a sample of electric power plants obtained from Rungsuriyawiboon and Stefanou (2003). Using a CAS ABM simulation, it is found that the flocking rules (alignment, cohesion, and separation), taken individually and in selected combinations, increased the mean technical efficiency of the power plant population and correspondingly decreased the time to reach the frontier. These effects, however, were limited to a smaller than expected subset of the combinations of flocking factors. Finding even a limited subset of flocking rules that increased efficiency was sufficient to support the hypotheses and conclude that the flocking metaphor offers useful options to decision-makers for increasing the efficiency of management systems.
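As a toy illustration of flocking rules in an efficiency setting (loosely mirroring the metaphor above, with invented parameter values, and not the dissertation's CAPEM code), the sketch below lets each DMU 'cohere' toward the input mix of the most efficient DMUs and 'align' with the population average, then reports the resulting mean efficiency:

```python
import numpy as np

rng = np.random.default_rng(0)
inputs = rng.uniform(1.0, 2.0, size=(30, 2))   # DMU input usage (lower is better here)
outputs = np.ones(30)                           # equal outputs, for simplicity
COHESION, ALIGNMENT = 0.05, 0.02                # rule weights (invented values)

for step in range(100):
    efficiency = outputs / inputs.sum(axis=1)   # crude output/input efficiency score
    best = inputs[np.argsort(efficiency)[-5:]]  # the 5 most efficient DMUs
    flock_mean = inputs.mean(axis=0)
    # cohesion: drift toward the efficient DMUs' input mix
    inputs += COHESION * (best.mean(axis=0) - inputs)
    # alignment (loose analogy): drift toward the population average
    inputs += ALIGNMENT * (flock_mean - inputs)

print("mean efficiency after flocking:", (outputs / inputs.sum(axis=1)).mean())
```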
- Creating a Positive Departmental Climate at Virginia Tech: A Compendium of Successful Strategies. Finney, Jack W.; Finkielstein, Carla V.; Merola, Joseph S.; Puri, Ishwar; Taylor, G. Don; Van Aken, Eileen M.; Hyer, Patricia B.; Savelyeva, Tamara (Virginia Tech, 2008-05-05). “Creating a Positive Departmental Climate at Virginia Tech: A Compendium of Successful Strategies” was created as part of the AdvanceVT Departmental Climate Initiative (DCI). The Department Climate Committee collected policies and practices from a variety of sources to provide department chairs and heads with opportunities to learn about departmental issues at Virginia Tech, to understand more fully the ways in which these issues manifest themselves within departments, and to share both successful and unsuccessful strategies illustrative of the different approaches departments have taken toward promoting effective, efficient, and pleasant work environments.
- Decision Support for Casualty Triage in Emergency Response. Kamali, Behrooz (Virginia Tech, 2016-05-04). Mass-casualty incidents (MCIs) cause a sudden increase in demand for medical resources in a region. The most important and challenging task in addressing an MCI is managing overwhelmed resources with the goal of increasing the total number of survivors. Currently, most of the decisions following an MCI are made in an ad-hoc manner or by following static guidelines that do not account for the amount of available resources and the number of casualties. The purpose of this dissertation is to introduce and analyze sophisticated service prioritization and resource allocation tools. These tools can be used to produce service order strategies that increase the overall number of survivors. Several models are proposed that account for the number and mix of casualties and the amount and type of resources available. The large number of elements involved in this problem makes the model very complex, and thus, in order to gain some insight into the structure of the optimal solutions, some of the proposed models are developed under simplifying assumptions. These assumptions include limitations on the number of casualty types, the handling of deaths, servers, and types of resources. Under these assumptions, several characteristics of the optimal policies are identified, and optimal algorithms for various scenarios are developed. We also develop an integrated model that addresses service order, transportation, and hospital selection. A comprehensive set of computational results and comparisons with related work in the literature is provided to demonstrate the efficacy of the proposed methodologies.
- Design in the Modern Age: Investigating the Role of Complexity in the Performance of Collaborative Engineering Design Teams. Ambler, Nathaniel Palenaka (Virginia Tech, 2015-06-12). The world of engineering design finds itself at a crossroads. The technical and scientifically rooted tools that propelled humankind into the modern age are now insufficient, as evidenced by a growing number of failures to meet design expectations and to deliver value for users and society in general. In the empirical world, a growing consensus among design practitioners has emerged that engineering design efforts are becoming too unmanageable and too complex for existing design management systems and tools. One of the key difficulties of engineering design is the coordination and management of the underlying collaboration processes. Development efforts that focus on the design of complex artefacts, such as a satellite or information system, commonly require the interaction of hundreds to thousands of different disciplines. What makes these efforts and the related collaboration processes complex, from the perspective of many practitioners, is the strong degree of interdependency between design decisions, often made concurrently, across multiple designers who commonly reside in different organizational settings. Not only must a design account for and satisfice these dependencies, it must also remain acceptable to all design participants. Design in effect represents a coevolution between the problem definition and the solution, with a finalized design approach arising not from a repeatable series of mathematical optimizations but rather through the collective socio-technical design activities of a large collaboration of designers. Despite the importance of understanding design as a socio-technical decision-making entity, many existing design approaches ignore socio-technical issues, viewing them as either too imprecise or too difficult to consider. This research provides a performance measurement framework to explore these factors by investigating design as a socio-technical complex adaptive collaborative process between the designer, artefact, and user (DAU). The research implements this framework through an agent-based model, the Complex Adaptive Performance Evaluation Method for Collaboration Design (C2D). This approach allows a design management analyst to generate insights about potential design strategies and mechanisms as they relate to design complexity by examining the simulated performance of a design collaboration as it explores theoretical design fitness landscapes with various degrees of ruggedness.
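The abstract does not say how its fitness landscapes are generated; a common choice for landscapes with tunable ruggedness is Kauffman's NK model, sketched here under that assumption (N binary design decisions, each contributing a fitness that depends on itself and K neighboring decisions; larger K yields a more rugged landscape):

```python
import itertools, random

def nk_landscape(N, K, seed=0):
    """Kauffman-style NK landscape: the fitness contribution of decision i
    depends on bit i plus its K circular neighbors; increasing K raises
    the landscape's ruggedness (more local optima)."""
    rng = random.Random(seed)
    tables = [{bits: rng.random()
               for bits in itertools.product((0, 1), repeat=K + 1)}
              for _ in range(N)]

    def fitness(x):
        total = 0.0
        for i in range(N):
            neighborhood = tuple(x[(i + j) % N] for j in range(K + 1))
            total += tables[i][neighborhood]
        return total / N

    return fitness

f = nk_landscape(N=10, K=3)
rng = random.Random(1)
design = tuple(rng.choice((0, 1)) for _ in range(10))
print(f(design))
```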
- Designing Order Picking Systems for Distribution Centers. Parikh, Pratik J. (Virginia Tech, 2006-09-01). This research addresses decisions involved in the design of an order picking system in a distribution center. A distribution center (DC) in a logistics system is responsible for obtaining materials from different suppliers and assembling (or sorting) them to fulfill a number of different customer orders. Order picking, which is a key activity in a DC, refers to the operation through which items are retrieved from storage locations to fulfill customer orders. Several decisions are involved when designing an order picking system (OPS). These include the identification of the picking-area layout, configuration of the storage system, and determination of the storage policy, picking method, picking strategy, material handling system, pick-assist technology, etc. For a given set of these parameters, the best design depends on the objective function (e.g., maximizing throughput, minimizing cost) being optimized. The overall goal of this research is to develop a set of analytical models for OPS design. The idea is to help an OPS designer identify the best-performing alternatives out of a large number of possible alternatives. Such models will complement experience-based or simulation-based approaches, with the goal of improving the efficiency and efficacy of the design process. In this dissertation we focus on the following two key OPS design issues: configuration of the storage system and selection between batch and zone order picking strategies. Several factors that affect these decisions are identified in this dissertation; a common factor among them is picker blocking. We first develop models to estimate picker blocking (Contribution 1) and use the picker blocking estimates in addressing the two OPS design issues, presented as Contributions 2 and 3. In Contribution 1 we develop analytical models using discrete-time Markov chains to estimate pick-face blocking in wide-aisle OPSs. Pick-face blocking refers to the blocking experienced by a picker at a pick-face when another picker is already picking at that pick-face. We observe that for the case when pickers may pick only one item at a pick-face, similar to in-the-aisle blocking, pick-face blocking first increases with an increase in pick density and then decreases. Moreover, pick-face blocking increases with an increase in the number of pickers and the pick-to-walk-time ratio, while it decreases with an increase in the number of pick-faces. For the case when pickers may pick multiple items at a pick-face, pick-face blocking increases monotonically with an increase in pick density. These blocking estimates are used in addressing the two OPS design issues presented as Contributions 2 and 3. In Contribution 2 we address the issue of configuring the storage system for order picking. A storage system, typically comprised of racks, is used to store pallet-loads of various stock keeping units (SKUs); a SKU is a unique identifier of products or items that are stored in a DC. The design question we address is identifying the optimal height (i.e., number of storage levels), and thus length, of a one-pallet-deep storage system. We develop a cost-based optimization model in which the number of storage levels is the decision variable and satisfying system throughput is the constraint. The objective of the model is to minimize the system cost, which is comprised of the cost of labor and space. To estimate the cost of labor we first develop a travel-time model for a person-aboard storage/retrieval (S/R) machine performing Tchebyshev travel as it moves through the aisle. Using this travel-time model, we estimate the throughput of each picker, which helps us estimate the number of pickers required to satisfy the system throughput for a given number of storage levels. An estimate of the cost of space completes the total cost model. Results from an experimental study suggest that a low (in height) and long (in length) storage system tends to be optimal where there is a relatively low number of storage locations and a relatively high throughput requirement; this contrasts with the common industry perception that higher is better. The primary reason for this contrast is that industry practice does not consider picker blocking or the vertical travel of the S/R machine. On the other hand, results from the same optimization model suggest that a manual OPS should, in almost all situations, employ a high (in height) and short (in length) storage system, a result that is consistent with industry practice. This consistency is expected, as picker blocking and vertical travel, ignored in industry, are not factors in a manual OPS. In Contribution 3 we address the issue of selecting between batch and zone picking strategies. A picking strategy defines the manner in which the pickers navigate the picking aisles of a storage area to pick the required items. Our aim is to help the designer identify the least expensive picking strategy that meets the system throughput requirements. Consequently, we develop a cost model to estimate the system cost of a picking system that employs either a batch or a zone picking strategy. System cost includes the cost of pickers, equipment, imbalance, the sorting system, and packers. Although all elements are modeled, we highlight the development of models to estimate the costs of imbalance and the sorting system. Imbalance cost refers to the cost of fulfilling the left-over items (in customer orders) due to workload imbalance among pickers. To estimate the imbalance cost we develop order batching models, solving which helps identify the number of unfulfilled items. We also develop a comprehensive cost model to estimate the cost of an automated sorting system. To demonstrate the use of our models we present an illustrative example that compares a sort-while-pick batch picking system with a simultaneous zone picking system. To summarize, the overall goal of our research is to develop a set of analytical models to help the designer design order picking systems in a distribution center. In this research we focused on two key design issues and addressed them through analytical approaches. Our future research will focus on addressing other design issues and incorporating them in a decision support system.
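A note on the travel geometry in Contribution 2: Tchebyshev travel means the S/R machine moves horizontally and vertically at the same time, so the trip between two storage locations takes as long as the slower of the two axis movements. A minimal sketch (the speeds and units are invented for illustration):

```python
def tchebyshev_travel_time(loc_a, loc_b, h_speed=2.0, v_speed=0.5):
    """Travel time for a person-aboard S/R machine moving horizontally
    and vertically simultaneously: the trip is governed by whichever
    axis movement finishes last (speeds are illustrative, in storage
    sections per second)."""
    horizontal = abs(loc_a[0] - loc_b[0]) / h_speed
    vertical = abs(loc_a[1] - loc_b[1]) / v_speed
    return max(horizontal, vertical)

print(tchebyshev_travel_time((0, 0), (12, 4)))  # 8.0: the vertical leg dominates
```

This is also why the number of storage levels matters so much in the optimization: adding height lengthens the vertical leg, which can become the binding term in the max.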
- Extending the System Dynamics Toolbox to Address Policy Problems in Transportation and Health. Seyed Zadeh Sabounchi, Nasim (Virginia Tech, 2012-03-16). System dynamics can be a very useful tool for expanding the boundaries of one's mental models to better understand the underlying behavior of systems. But despite its utility, there remain challenges associated with system dynamics modeling, which the current research addresses by expanding the system dynamics modeling toolbox. The first challenge relates to imprecision or vagueness, for example, with respect to human perception and linguistic variables. The most common approach is to use table or graph functions to capture the inherent vagueness in these linguistic (qualitative) variables. Yet combining two or more table functions may lead to further complexity and, moreover, increased difficulty in analyzing the resulting behavior. As part of this research, we extend the system dynamics toolbox by applying fuzzy logic, and we select the problem of congestion pricing for mitigating traffic congestion to verify the effectiveness of our integration of fuzzy logic into system dynamics modeling. Another challenge in system dynamics modeling is defining proper prediction equations for variables on the basis of numerous studies. In particular, we focus on published equations in models for energy balance and weight change of individuals. For these models there is a need to define a single robust prediction equation for Basal Metabolic Rate (BMR), which is an element of the energy expenditure of the body. In our approach, we perform an extensive literature review to explore the relationship between BMR and different factors, including age, body composition, gender, and ethnicity. We find that there are many equations used to estimate BMR, especially for different demographic groups. Further, we find that these equations use different independent variables and, in a few cases, generate inconsistent conclusions. It follows that selecting a single equation for BMR can be quite difficult for purposes of modeling in a system dynamics context. Our approach involves conducting a meta-regression to summarize the available prediction equations and identify the most appropriate model for predicting BMR for different sub-populations. The results of this research could potentially lead to more precise predictions of body weight and enhanced policy interventions to help mitigate serious health issues such as obesity.
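As one concrete example of the many published prediction equations the abstract refers to (cited here purely for illustration; it is not necessarily the equation the dissertation's meta-regression recommends), the widely used Mifflin-St Jeor equation estimates BMR in kcal/day from weight W (kg), height H (cm), and age A (years):

```latex
\mathrm{BMR} = 10\,W + 6.25\,H - 5\,A +
\begin{cases}
  +5 & \text{for men} \\
  -161 & \text{for women}
\end{cases}
```

Competing equations such as Harris-Benedict regress on the same covariates with different coefficients, which is precisely the kind of inconsistency a meta-regression across sub-populations must reconcile.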
- Infrastructure Performance and Risk Assessment under Extreme Weather and Climate Change Conditions. Bhatkoti, Roma (Virginia Tech, 2016-07-19). This dissertation explores the impact of climate change and extreme weather events on critical infrastructures as defined by the US Department of Homeland Security. The focus is on two important critical infrastructure systems: water and transportation. Critical infrastructures are always at risk from threats such as terrorist attacks, natural disasters, faulty management practices, regulatory policies, and defective technologies and system designs. Measuring the performance and risks of critical infrastructures is complex due to their network, geographic, and dynamic characteristics and the multiplicity of stakeholders associated with them. Critical infrastructure systems in crowded urban and suburban areas like the Washington Metropolitan Area (WMA) are subject to increased risk from geographic proximity. Moreover, a changing climate is challenging the assumption of stationarity (the idea that natural systems fluctuate within an unchanging envelope of variability) that is the foundation of water resource engineering and planning. Within this context, this research uses concepts of systems engineering such as 'systems thinking' and 'system dynamics' to understand, analyze, model, simulate, and critically assess a critical infrastructure system's vulnerability to extreme natural events and climate change. In most cases, transportation infrastructure is designed to withstand either the most extreme or close to the most extreme event that would add abnormal stresses to a physical structure. The system may fail to perform as intended if the physical structure faces an event larger than what it was designed for. The results of the transportation study demonstrate that all categories of roadways are vulnerable to climate change and that the magnitude of bridge vulnerability to future climate change varies depending on which climate model projection is used. Results also show that urbanization and land use patterns affect the susceptibility of bridges to failure. Similarly, results of the water study indicate that the WMA water supply system may suffer from water shortages arising from future droughts, but climate change is expected to improve water supply reliability due to an upward trend in precipitation and streamflow.
- Integrated Process Planning and Scheduling for a Complex Job Shop Using a Proxy Based Local Search. Henry, Andrew Joseph (Virginia Tech, 2015-12-10). Within manufacturing systems, process planning and scheduling are two interrelated problems that are often treated independently. Process planning involves deciding which operations are required to produce a finished product and which resources will perform each operation. Scheduling involves deciding the sequence in which operations should be processed by each resource, where process planning decisions are known a priori. Integrating process planning and scheduling offers significant opportunities to reduce bottlenecks and improve plant performance, particularly for complex job shops. This research is motivated by the coating and laminating (C&L) system of a film manufacturing facility, where more than 1,000 product types are regularly produced each month. The C&L system can be described as a complex job shop with sequence-dependent setups, operation re-entry, minimum and maximum wait time constraints, and a due date performance measure. In addition to the complex scheduling environment, products produced in the C&L system have multiple feasible process plans. The C&L system experiences significant issues with schedule generation and due date performance. Thus, an integrated process planning and scheduling approach is needed to address large-scale industry problems. In this research, a novel proxy measure based local search (PBLS) approach is proposed to address integrated process planning and scheduling for a complex job shop. PBLS uses a proxy measure in conjunction with local search procedures to adjust process planning decisions with the goal of reducing total tardiness. A new dispatching heuristic, OU-MW, is developed to generate feasible schedules for complex job shop scheduling problems with maximum wait time constraints. A regression-based proxy approach, PBLS-R, and a neural network based proxy approach, PBLS-NN, are investigated. In each case, descriptive statistics about the active process plan set are used as independent variables in the model. The resulting proxy measure is used to evaluate the effect of process planning local search moves on the total tardiness objective. Using the proxy measure to guide a local search reduces the number of times a detailed schedule is generated, reducing overall runtime. In summary, the proxy measure based local search approach involves the following stages:
  - Generate a set of feasible schedules for a set of jobs in a complex job shop.
  - Evaluate the parameters and results of the schedules to establish a proxy measure that estimates the effect of process planning decisions on objective function performance.
  - Apply local search methods to improve upon feasible schedules.
  Both PBLS-R and PBLS-NN are integrated process planning and scheduling heuristics capable of addressing the challenges of the C&L problem. Both approaches show significant improvement in objective function performance when compared to local search guided by random walk. Finally, an optimal solution approach is applied to small data sets and the results are compared to those of PBLS-R and PBLS-NN. Although the proxy based local search approaches investigated do not guarantee optimality, they provide a significant improvement in computational time when compared to an optimal solution approach. The results suggest proxy based local search is an appealing approach for integrated process planning and scheduling in complex job shop environments where optimal solution approaches are not viable due to excessive computation time.
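The general pattern behind such a proxy-guided search can be sketched as follows (a schematic with placeholder features and a simple linear proxy, not the dissertation's PBLS-R or PBLS-NN implementation): fit a cheap model mapping process-plan features to scheduled tardiness, rank candidate moves with the proxy, and spend the expensive schedule construction only on the most promising candidate each iteration.

```python
import random
from sklearn.linear_model import LinearRegression

def proxy_guided_search(initial_plan, neighbors, features, evaluate, iters=200):
    """Generic proxy-based local search: learn a cheap regression proxy
    from (plan features -> evaluated tardiness) pairs, use it to rank
    candidate process-plan moves, and run the expensive scheduling
    evaluation on only the best-ranked candidate per iteration."""
    X, y = [features(initial_plan)], [evaluate(initial_plan)]
    best_plan, best_cost = initial_plan, y[0]
    for _ in range(iters):
        proxy = LinearRegression().fit(X, y)
        candidates = neighbors(best_plan)
        # rank moves by *predicted* tardiness instead of scheduling each one
        candidate = min(candidates, key=lambda p: proxy.predict([features(p)])[0])
        cost = evaluate(candidate)            # one expensive schedule build
        X.append(features(candidate)); y.append(cost)
        if cost < best_cost:
            best_plan, best_cost = candidate, cost
    return best_plan, best_cost

# toy demo: "plans" are numbers, evaluated cost is a quadratic bowl
result = proxy_guided_search(
    10.0,
    neighbors=lambda p: [p + random.uniform(-1, 1) for _ in range(5)],
    features=lambda p: [p, p * p],
    evaluate=lambda p: (p - 3) ** 2,
)
print(result)
```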
- A Phenomenological Approach to User-Centered Design: Conceptualizing the Technology Design Space to Assist Military Veterans with Community Reintegration. Haskins Lisle, Alice Catherine (Virginia Tech, 2017-10-17). The current best practices of user-centered design (UCD) may not be optimal for eliciting information from representative users from special populations. This research extended the traditional focus of elicitation approaches on user needs and context to include criteria describing the obstacles users encounter. Military veterans were selected for this research effort as representative users for a use case in technology design that addresses the difficulties associated with community reintegration. This work provides several contributions to the UCD field. First, different elicitation methods were compared by the depth and breadth of design space criteria elicited. Guidelines were generated for designer use of phenomenology in practice. Obstacles were added as an important facet of design, with corresponding grammar rules for construction. Finally, an algorithm was applied as a method for generating personas. Additionally, this dissertation contributes to the field of veteran research. Example contributions include a set of design space criteria for designers to consider when designing for veterans, and two veteran personas grounded in data procured from the analysis. This research effort was conducted in three phases: elicitation, first-cycle analysis, and second-cycle analysis. The elicitation process engaged 40 military veterans in an interview session and a design session. These sessions explored the lived experience of veterans as they reintegrate into communities, and gathered their ideas for technology to assist with veteran reintegration. The researchers who conducted first-cycle coding focused on categorizing the most important participant statements (meaning units) using a codebook; this analysis resulted in over 3,000 meaning units. Additionally, the meaning unit corpus was subjected to systematic second-cycle analyses, using standardized linguistic structures to generate design space criteria. In total, over 6,000 design space criteria were discovered, and these criteria were synthesized to create personas using a situated data mining (SDM) algorithm. Results suggest that the interview session was crucial for eliciting a higher quantity and broader coverage of design space criteria. It is recommended that designers conduct and analyze interviews that focus on understanding the lived experience of users (not on their technology ideas) as part of a UCD approach.
- Reorganize Your Blogs: Supporting Blog Re-visitation with Natural Language Processing and Visualization. Niu, Shuo; McCrickard, D. Scott; Stelter, Timothy L.; Dix, Alan; Taylor, G. Don (MDPI, 2019-10-07). Temporally connected personal blogs contain voluminous textual content, presenting challenges in re-visiting and reflecting on experiences. Other data repositories have benefited from natural language processing (NLP) and interactive visualizations (VIS) to support exploration, but little is known about how these techniques could be used with blogs to present experiences and support multimodal interaction, particularly for authors. This paper presents the effect of reorganization (restructuring the large blog set with NLP and presenting abstract topics with VIS) in supporting novel re-visitation experiences. BlogCloud, a blog re-visitation tool that reorganizes blog paragraphs around user-searched keywords, implements this reorganization together with similarity-based content grouping. Through a public use session with bloggers who wrote about extended hikes, we observed the effect of NLP-based reorganization in delivering novel re-visitation experiences. Findings suggest that the re-presented topics provide new reflection materials and re-visitation paths, enabling interaction with symbolic items in memory.
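A minimal illustration of keyword-centered, similarity-based grouping of the kind described above (using off-the-shelf TF-IDF and cosine similarity; BlogCloud's actual NLP pipeline is not detailed in this listing, so this is a generic stand-in):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

paragraphs = [
    "We hiked twelve miles before the storm rolled in.",
    "The trail food ran low so we resupplied in town.",
    "Storm clouds forced us off the ridge by noon.",
    "Town stops meant laundry, food, and a real bed.",
]
keyword = "storm"

# vectorize the blog paragraphs and the searched keyword in one vocabulary
vec = TfidfVectorizer().fit(paragraphs + [keyword])
sims = cosine_similarity(vec.transform([keyword]), vec.transform(paragraphs))[0]

# reorganize: rank paragraphs around the keyword, most related first
for score, para in sorted(zip(sims, paragraphs), reverse=True):
    print(f"{score:.2f}  {para}")
```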
- Resource Allocation and Process Improvement of Genetic Manufacturing Systems. Purdy, Gregory T. (Virginia Tech, 2016-11-21). Breakthroughs in molecular and synthetic biology through de novo gene synthesis are stimulating new vaccines, pharmaceutical applications, and functionalized biomaterials, and advancing knowledge of the function of cells. This evolution in biological processing motivates the study of a class of manufacturing systems, defined here as genetic manufacturing systems, that produce a final product with a genetic construct. Genetic manufacturing systems rely on rare molecular events for success, resulting in waste and repeated work during the deoxyribonucleic acid (DNA) fabrication process. Inspection and real-time monitoring strategies are possible mitigation tools, but it is unclear whether these techniques are cost efficient and value added for the successful creation of custom genetic constructs. This work investigates resource allocation strategies for DNA fabrication environments, with an emphasis on inspection allocation. The primary similarities and differences between traditional manufacturing systems and genetic manufacturing systems are described. A serial, multi-stage inspection allocation mathematical model is formulated for a genetic manufacturing system utilizing gene synthesis. Additionally, discrete event simulation is used to evaluate inspection strategies for a fragment synthesis process and a multiple fragment assembly operation. Results from the mathematical model and the discrete event simulation provide two approaches for determining appropriate inspection strategies with respect to the total cost or total flow time of the genetic manufacturing system.
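To give the flavor of a serial, multi-stage inspection allocation model (a deliberately simplified sketch with invented numbers, omitting the rework and flow-time considerations the dissertation models): inspecting after a stage costs money but scraps failed units before they consume downstream processing, so one can enumerate inspection policies and compare expected cost per good finished unit.

```python
from itertools import chain, combinations

# invented illustrative numbers for a three-stage serial DNA process
costs = [4.0, 6.0, 10.0]   # processing cost at each stage
p_ok = [0.7, 0.8, 0.9]     # probability a unit's construct survives each stage
c_inspect = 1.0            # cost of one inspection

def cost_per_good_unit(inspect_after):
    """Expected cost per good finished unit when inspection after stage i
    scraps failed units so they stop consuming downstream processing."""
    flowing, good = 1.0, 1.0   # fraction of starts still in line; P(flowing unit is good)
    total = 0.0
    for i, (c, p) in enumerate(zip(costs, p_ok)):
        total += flowing * c            # every unit still flowing is processed
        good *= p
        if i in inspect_after:
            total += flowing * c_inspect
            flowing *= good             # failed units are caught and scrapped here
            good = 1.0
    return total / (flowing * good)     # final yield is prod(p_ok) under any policy

stages = range(len(costs))
policies = chain.from_iterable(combinations(stages, k) for k in range(len(costs) + 1))
best = min(policies, key=lambda s: cost_per_good_unit(set(s)))
print("inspect after stages", best, "->", round(cost_per_good_unit(set(best)), 2))
```

With these numbers, inspecting after the cheap early stages pays for itself by protecting the expensive final stage, the basic trade-off any inspection allocation model formalizes.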
- Revenue Management in High-Density Urban Parking Districts: Modeling and Evaluation. Roper, Martha Annette (Virginia Tech, 2010-01-22). This thesis explores how revenue management (RM) principles would integrate into a parking system, and how advance reservations, coupled with dynamic pricing (based on booking limits), could be used to maximize parking revenue. Detailed here is a comprehensive RM strategy for the parking industry, and an integer programming formulation that maximizes parking revenue over a system of garages is presented. Furthermore, an intelligent parking reservation model is developed that uses an artificial neural network procedure for online reservation decision-making. Next, the work evaluates whether the implementation of a parking RM system in a dense urban parking district (thus avoiding the "trial-and-error" behaviors exhibited by drivers) mitigates urban congestion levels. To test this hypothesis, a parallel modeling structure was developed that uses a real-time decision-making model to either accept or reject requests for parking via a back-propagation neural network. Coupled with the real-time decision-making model is a micro-simulation model structure used to evaluate the policy's effects on network performance. It is clear from the results that the rate at which parkers renege is a primary determinant of the value of implementing RM. All other things being equal, the RM model in which the majority of parkers are directed to their precise parking spot via the most direct route is much more robust to the random elements within the network that can instigate extreme congestion. The thesis then moves from micro-evaluation to macro-evaluation by measuring the performance of the urban parking system from the perspective of the set of relevant stakeholders, using the hyperbolic DEA model within the context of the matrix DEA construct. The stakeholder models, including those of the provider, the user, and the community, have defined inputs/outputs to the hyperbolic DEA model, which allows for the inclusion of undesirable outputs such as network delay and the incidence of extreme congestion. Another key contribution of this work is identifying design issues for current and future dense urban parking districts. Clearly, the reneging rate and the tenacity of prospective parkers are key considerations in cases where an RM policy is not implemented.
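A toy version of the accept/reject decision model (the thesis uses a back-propagation neural network; here a small scikit-learn MLP stands in, and the request features and training labels are entirely invented for the sketch):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# invented request features: [projected occupancy, lead time, price tier]
X = rng.uniform(size=(500, 3))
# synthetic labels: accept unless the garage is nearly full AND the
# request arrives on short notice (a stand-in for a booking-limit policy)
y = ((X[:, 0] < 0.8) | (X[:, 1] > 0.5)).astype(int)

model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                      random_state=0).fit(X, y)

request = np.array([[0.9, 0.2, 0.5]])   # busy garage, imminent arrival
print("accept" if model.predict(request)[0] else "reject")
```

The point of embedding such a model in a parallel simulation, as the thesis does, is that each accept/reject decision can then be evaluated for its downstream effect on network congestion rather than on revenue alone.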
- Sampling Laws for Stochastically Constrained Simulation Optimization on Finite Sets. Hunter, Susan R. (Virginia Tech, 2011-09-23). Consider the context of selecting an optimal system from among a finite set of competing systems, based on a "stochastic" objective function and subject to multiple "stochastic" constraints. In this context, we characterize the asymptotically optimal sample allocation that maximizes the rate at which the probability of false selection tends to zero in two scenarios: first in the context of general light-tailed distributions, and second in the specific context in which the objective function and constraints may be observed together as multivariate normal random variates. In the context of general light-tailed distributions, we present the optimal allocation as the result of a concave maximization problem for which the optimal solution is the result of solving one of two nonlinear systems of equations. The first result of its kind, the optimal allocation is particularly easy to obtain in contexts where the underlying distributions are known or can be assumed, e.g., normal, Bernoulli. A consistent estimator for the optimal allocation and a corresponding sequential algorithm for implementation are provided. Various numerical examples demonstrate where and to what extent the proposed allocation differs from competing algorithms. In the context of multivariate normal distributions, we present an exact, asymptotically optimal allocation. This allocation is the result of a concave maximization problem in which there are at least as many constraints as there are suboptimal systems. Each constraint corresponding to a suboptimal system is a convex optimization problem. Thus the optimal allocation may easily be obtained in the context of a "small" number of systems, where the quantifier "small" depends on the available computing resources. A consistent estimator for the optimal allocation and a fully sequential algorithm, fit for implementation, are provided. The sequential algorithm performs significantly better than equal allocation in finite time across a variety of randomly generated problems. The results presented in the general and multivariate normal context provide the first foundation of exact asymptotically optimal sampling methods in the context of "stochastically" constrained simulation optimization on finite sets. Particularly, the general optimal allocation model is likely to be most useful when correlation between the objective and constraint estimators is low, but the data are non-normal. The multivariate normal optimal allocation model is likely to be useful when the multivariate normal assumption is reasonable or the correlation is high.
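In schematic form (notation introduced here for illustration, not taken verbatim from the dissertation), the allocation problem maximizes the worst-case decay rate of the false-selection probability over the sampling fractions:

```latex
\max_{\alpha \ge 0,\; \sum_k \alpha_k = 1} \;\; \min_{i \,\ne\, \text{best}} \; R_i(\alpha)
```

where \(\alpha_k\) is the fraction of the sampling budget given to system \(k\) and \(R_i(\alpha)\) is the large-deviations rate at which suboptimal system \(i\) causes a false selection. Introducing an auxiliary variable \(z\) with constraints \(z \le R_i(\alpha)\) yields the concave maximization with one constraint per suboptimal system described above.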
- Strategic Planning Models and Approaches to Improve Distribution Planning in the Industrial Gas Industry. Farrokhvar, Leily (Virginia Tech, 2016-05-04). The industrial gas industry represents a multi-billion dollar global market and provides essential product to the manufacturing and service organizations that drive the global economy. In this dissertation, we focus on improving distribution efficiency in the industrial gas industry by addressing the strategic-level problem of bulk tank allocation (BTA) while considering the effects of important operational issues. The BTA problem determines the preferred size of bulk tanks to assign to customer sites to minimize recurring gas distribution costs and initial tank installation costs. The BTA problem has a unique structure that includes a resource allocation problem and an underlying vehicle routing problem with split deliveries. In this dissertation, we provide an exact solution approach that solves the BTA problem to optimality and recommends tank allocations, provides a set of delivery routes, and determines delivery amounts to customers on each delivery route. The exact solution approach is based on a branch-and-price algorithm that solves problem instances with up to 40 customers in reasonable computational time. Due to the complexity of the problem and the size of industry-representative problems, the solution approaches published in the literature rely on heuristics that require a set of potential routes as input. In this research, we investigate and compare three alternative route generation algorithms using data sets from an industry partner; among them, a sweep-based heuristic was preferred for the data sets evaluated. The existing BTA solution approaches in the literature also assume a single bulk tank can be allocated at each customer site. While this assumption is valid for some customers due to space limitations, other customer sites may have the capability to accommodate multiple tanks. We propose two alternative mathematical models to explore the possibility and potential benefits of allocating multiple tanks at designated customer sites that have the capacity to accommodate more than one tank. In a case study with 20 customers, allowing multiple tank allocation yields a 13% reduction in total costs. In practice, industrial gas customer demands frequently vary by time period, so it is important to allocate tanks to effectively accommodate time-varying demand. We therefore develop a bulk tank allocation model for time-varying demand (BTATVD) that captures changing demands by period for each customer. Adding this time dimension increases complexity, so we present three decomposition-based solution approaches. In the first two approaches, the problem is decomposed and a restricted master problem is solved. For the third approach, a two-phase periodically restricting heuristic is developed. We evaluate the solution approaches using data sets provided by an industrial partner and solve problem instances with up to 200 customers. The results yield approximately 10% in total savings and 20% in distribution cost savings over a 7-year time horizon. The results of this research provide effective approaches to address a variety of distribution issues faced by the industrial gas industry, and the case study results demonstrate the potential improvements in distribution efficiency.
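The engine inside a branch-and-price scheme is column generation, which can be sketched generically (this is the standard pattern, not the dissertation's BTA-specific code; the master LP solver and pricing subproblem are assumed callables supplied by the modeler):

```python
def column_generation(master_solve, price, initial_columns, tol=1e-6):
    """Generic column-generation loop: solve a restricted master LP over
    the current set of columns (e.g., delivery routes), ask the pricing
    problem for a new column with negative reduced cost under the
    master's dual values, and stop when no improving column exists.
    `master_solve(columns) -> (solution, duals)` and
    `price(duals) -> (column, reduced_cost)` are problem-specific
    placeholders."""
    columns = list(initial_columns)
    while True:
        solution, duals = master_solve(columns)  # LP over current routes
        new_col, reduced_cost = price(duals)     # best route under the duals
        if reduced_cost >= -tol:                 # no improving route remains
            return solution, columns
        columns.append(new_col)
```

Branch-and-price wraps this loop inside a branch-and-bound tree so that integrality of the route-selection variables is eventually enforced.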
- Transportation Service Provider Collaboration Problem: Potential Benefits and Solution Approaches. Roesch, Robert Steven (Virginia Tech, 2017-02-28). Truck-based freight transportation continues to play a vital role in the delivery of goods in the United States. Despite its size and importance, the truck transportation industry continues to struggle with fulfilling transportation requests in an efficient and sustainable manner. One potential solution to alleviate many of the current industry problems is for transportation service providers (TSPs) to collaborate by sharing volume, resources, and facilities. This research introduces the Transportation Service Provider Collaboration Problem (TSP-CP) to demonstrate the benefits of using optimal freight routing and consolidation decisions for collaborating TSPs. A mathematical model for the TSP-CP is introduced to describe the problem in detail. Additionally, two separate adaptive large neighborhood search (ALNS) heuristics are developed to provide solutions to industry-representative problem instances. Finally, the benefits and insights achieved by enabling collaboration between TSPs using the TSP-CP are identified using industry-representative data sets derived from actual freight data provided by a freight pooling company that manages collaboration among TSPs. Carriers were chosen from the industry data to evaluate collaborative partnerships and to gain insights on the effects of partnership characteristics on overall benefit as well as on the benefits obtained by individual carriers. The computational results suggest that collaboration among TSPs offers the potential for substantial reductions in the total distance required to deliver all loads, in the number of miles traveled completely empty, and in the number of containers required for delivery, compared to individual performance. Additionally, collaboration increased delivery resource capacity utilization as measured by the percentage of weighted full miles. Detailed analysis of the results from the TSP-CP revealed new insights into the collaboration between full-truckload and less-than-truckload carriers that have not been quantified or highlighted in previous research. These insights included the effect that an individual carrier's type and size have on the amount of benefit received by each carrier. Finally, the results highlighted the importance of building collaborative partnerships that consider a carrier's geographic location.
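For readers unfamiliar with ALNS, its skeleton is compact (a minimal sketch of the general metaheuristic, not either of the dissertation's two TSP-CP heuristics; the destroy and repair operators, such as remove-random-loads and cheapest-reinsertion, are problem-specific placeholders):

```python
import math, random

def alns(initial, destroy_ops, repair_ops, cost, iters=1000, seed=0):
    """Minimal adaptive large neighborhood search: pick destroy/repair
    operators with probability proportional to adaptive weights, accept
    worse candidates under a cooling temperature, and reward operator
    pairs that discover a new best solution."""
    rng = random.Random(seed)
    w_destroy = [1.0] * len(destroy_ops)
    w_repair = [1.0] * len(repair_ops)
    current, current_cost = initial, cost(initial)
    best, best_cost = current, current_cost
    temperature = 1.0
    for _ in range(iters):
        d = rng.choices(range(len(destroy_ops)), weights=w_destroy)[0]
        r = rng.choices(range(len(repair_ops)), weights=w_repair)[0]
        candidate = repair_ops[r](destroy_ops[d](current))
        candidate_cost = cost(candidate)
        delta = candidate_cost - current_cost
        if delta < 0 or rng.random() < math.exp(-delta / temperature):
            current, current_cost = candidate, candidate_cost
        if candidate_cost < best_cost:
            best, best_cost = candidate, candidate_cost
            w_destroy[d] += 0.5   # reward operators that found a new best
            w_repair[r] += 0.5
        temperature *= 0.999      # gradually tighten acceptance
    return best, best_cost
```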
- Vehicle Routing for Emergency Evacuations. Pereira, Victor Caon (Virginia Tech, 2013-11-22). This dissertation introduces and analyzes the Bus Evacuation Problem (BEP), a unique Vehicle Routing Problem motivated both by its humanitarian significance and by the routing and scheduling challenges of planning transit-based, regional evacuations. First, a variant where evacuees arrive at constant, location-specific rates is introduced. In this problem, a fleet of capacitated buses must transport all evacuees to a depot/shelter such that the last scheduled pick-up and the end of the evacuee arrival process occur at a location-specific time. The problem seeks to minimize the evacuees' accumulated waiting time, restricts the number of pick-ups at each location, and exploits efficiencies from service choice and from allowing buses to unload evacuees at the depot multiple times. It is shown that, depending on the problem instance, increasing the maximum number of pick-ups allowed may reduce both the fleet size requirement and the evacuee waiting time, and that, past a certain threshold, there exists a range of values that guarantees an efficient usage of the available fleet and equitable reductions in waiting time across pick-up locations. Second, an extension of the Ritter (1967) Relaxation Algorithm, which exploits the inherent structure of problems with complicating variables and constraints, such as the aforementioned BEP variant, is presented. The modified algorithm allows problems with linear, integer, or mixed-integer subproblems and with linear or quadratic objective functions to be solved to optimality. Empirical studies demonstrate the algorithm's viability for solving large optimization problems. Finally, a two-stage stochastic formulation of the BEP is presented. This variant assumes that all evacuees are at the pick-up locations at the onset of the evacuation, that the set of possible demands is provided, and, more importantly, that the actual demands become known once buses visit the pick-up locations for the first time. The effects of exploratory visits (sampling) and symmetry are explored, and the resulting insights are used to develop an improved formulation of the problem. An iterative (dynamic) solution algorithm is proposed.
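To see why the spacing of pick-ups matters in the constant-arrival-rate variant, consider an illustrative derivation consistent with the abstract (ignoring bus capacity limits): if evacuees arrive at rate \(\lambda\) and a bus clears the whole queue at each visit, the queue grows linearly between visits, so the waiting time accumulated between consecutive pick-ups at times \(t_{j-1}\) and \(t_j\) is

```latex
W(t_{j-1}, t_j) \;=\; \int_{t_{j-1}}^{t_j} \lambda \,(t - t_{j-1})\, dt
\;=\; \frac{\lambda \,(t_j - t_{j-1})^2}{2}
```

Because the accumulated wait grows with the square of the gap between visits, permitting more, well-spread pick-ups reduces total waiting time, consistent with the computational findings summarized above.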