Browsing by Author "Triantis, Konstantinos P."
- Addressing the Reliability and Life Cycle Cost Analysis Problem for Technology and System Developers Early in the DoD System Development Process
  Pflanz, Mark (Virginia Tech, 2005-12-06)
  Early in the process of developing or upgrading new weapon systems, Department of Defense (DoD) system and technology developers are faced with decisions regarding which technologies are appropriate for inclusion into the conceptual design. To reduce risk and improve decision making, system and technology developers need a capability to assess the impact of technology reliability on the attributable Operating and Support (O&S) cost of the system. Early understanding of the reliability implications of potential technologies on system O&S cost will help make better informed decisions early in the system development timeline, prior to points of design lock-in. Using a Marine Corps case study and a system dynamics simulation model, this thesis examines the nature of the relationship between component reliability and attributable changes in O&S cost. This thesis also develops a potential analysis methodology repeatable for future use. The modeling results indicate that this relationship is best described as exponential decay, meaning that the savings in O&S cost per system mile is proportional for any fixed incremental change in component reliability. We find these results to be insensitive to changes in preventative maintenance policies, maintenance deferment ratios, and component replacement cost. We completed verification and validation using the case study and existing Marine Corps systems, finding good association between the modeling results and the actual system. This analysis is valuable to the system and technology developer by helping to answer the question: "how reliable is reliable enough in terms of O&S cost" when considering technologies with uncertainties in long-term performance.
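A minimal sketch of the exponential-decay relationship the abstract describes; the cost function and the parameters `c0` and `k` are hypothetical, not values from the thesis.

```python
import numpy as np

# Hypothetical exponential-decay relationship between component reliability
# (e.g., mean miles between failures) and attributable O&S cost per system
# mile. Parameters c0 and k are illustrative only, not taken from the thesis.
def os_cost_per_mile(reliability, c0=12.0, k=0.0008):
    """Attributable O&S cost per system mile as reliability improves."""
    return c0 * np.exp(-k * reliability)

# Each fixed increment in reliability cuts cost by the same proportion:
for r in (1000, 2000, 3000, 4000):
    print(f"reliability={r:5d}  cost/mile={os_cost_per_mile(r):6.2f}")
```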
- An advanced system for quantifying the effects of radiological releases following a major nuclear accident
  Burnfield, Daniel L. (Virginia Tech, 1994-04-15)
  Although the use of nuclear power has several advantages over the burning of fossil fuels, it also has several disadvantages. The inherent danger of a nuclear accident at a power plant is one of these disadvantages. Although the probability of an accident is very low in comparison to other risks we normally encounter, the consequences are significant. Thousands of local citizens could be exposed to radiation levels above normal background levels. It is the responsibility of the State to make the necessary decisions regarding the evacuation of its citizens. To make the best decision possible, it is necessary to obtain a large amount of information regarding the concentration of radionuclides being released and to quickly make projections of the ionizing-radiation exposure of the plant's neighbors.
- Advancing Emergency Department Efficiency, Infectious Disease Management at Mass Gatherings, and Self-Efficacy Through Data Science and Dynamic Modeling
  Ba-Aoum, Mohammed Hassan (Virginia Tech, 2024-04-09)
  This dissertation employs management systems engineering principles, data science, and industrial systems engineering techniques to address pressing challenges in emergency department (ED) efficiency, infectious disease management at mass gatherings, and student self-efficacy. It is structured into three essays, each contributing to a distinct domain of research, and utilizes industrial and systems engineering approaches to provide data-driven insights and recommend solutions. The first essay used data analytics and regression analysis to understand how patient length of stay (LOS) in EDs could be influenced by multi-level variables integrating patient, service, and organizational factors. The findings suggested that specific demographic variables, the complexity of service provided, and staff-related variables significantly impacted LOS, offering guidance for operational improvements and better resource allocation. The second essay utilized system dynamics simulations to develop a modified SEIR model for modeling infectious diseases during mass gatherings and assessing the effectiveness of commonly implemented policies. The results demonstrated the significant collective impact of interventions such as visitor limits, vaccination mandates, and mask wearing, emphasizing their role in preventing health crises. The third essay applied machine learning methods to predict student self-efficacy in Muslim societies, revealing the importance of socio-emotional traits, cognitive abilities, and regulatory competencies. It provided a basis for identifying students with varying levels of self-efficacy and developing tailored strategies to enhance their academic and personal success. Collectively, these essays underscore the value of data-driven and evidence-based decision-making. The dissertation's broader impact lies in its contribution to optimizing healthcare operations, informing public health policy, and shaping educational strategies to be more culturally sensitive and psychologically informed. It provides a roadmap for future research and practical applications across the healthcare, public health, and education sectors, fostering advancements that could significantly benefit society.
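A minimal sketch of the kind of SEIR-style compartmental model the second essay extends for mass gatherings; the parameters and the `intervention` scaling factor are hypothetical, not the dissertation's calibrated values.

```python
# Minimal SEIR sketch with simple Euler integration (hypothetical parameters).
# 'intervention' scales the transmission rate, standing in for policies such as
# visitor limits, vaccination mandates, or mask wearing mentioned in the abstract.
def simulate_seir(days=60, N=100_000, beta=0.6, sigma=0.2, gamma=0.1,
                  intervention=1.0, dt=0.1):
    S, E, I, R = N - 10.0, 0.0, 10.0, 0.0
    for _ in range(int(days / dt)):
        new_exposed    = intervention * beta * S * I / N
        new_infectious = sigma * E
        new_recovered  = gamma * I
        S -= new_exposed * dt
        E += (new_exposed - new_infectious) * dt
        I += (new_infectious - new_recovered) * dt
        R += new_recovered * dt
    return round(S), round(E), round(I), round(R)

print(simulate_seir(intervention=1.0))   # no controls
print(simulate_seir(intervention=0.5))   # combined interventions halve transmission
```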
- An Agent-based Model for Airline Evolution, Competition, and Airport Congestion
  Kim, Junhyuk (Virginia Tech, 2005-05-25)
  The air transportation system has grown significantly during the past few decades. The demand for air travel has increased tremendously as compared to the increase in supply. The air transportation system can be divided into four subsystems: airports, airlines, air traffic control, and passengers, each of them having different interests. These subsystems interact in a very complex way, resulting in various phenomena. On the airport side, there is excessive flight demand during peak hours that frequently exceeds airport capacity, resulting in serious flight delays. These delays incur costs to the airport, passengers, and airlines. The air traffic pattern is also affected by the characteristics of the air transportation network. The current network structure of most major airlines in the United States is a hub-and-spoke network. The airports are interested in reducing congestion, especially during peak times. The airlines act as direct demand to the airport and as the supplier to the passengers. They sometimes compete with other airlines on certain routes and sometimes they collaborate to maximize revenue. The flight schedule of airlines directly affects travel demand; a flight schedule that minimizes the schedule delay of passengers on direct and connecting flights will attract more passengers. The important factors affecting airline revenue include ticket price, departure times, frequency, and aircraft type operated on each route. The revenue generated by an airline also depends on the behavior of competing airlines and their flight schedules. The passengers choose their flight based on preferred departure times, offered ticket prices, and the willingness of airlines to minimize delay and cost. Hence, all subsystems of the air transportation system are interconnected, meaning that the strategy of each subsystem directly affects the performance of the other subsystems. This interaction between the subsystems makes it more difficult to analyze the air transportation system. Traditionally, an analytical top-down approach has been used to analyze the air transportation problem. In a top-down approach, a set of objectives is defined and each subsystem is fixed in the overall scheme. In a bottom-up approach, on the other hand, many issues are addressed simultaneously and each individual subsystem has greater autonomy to make decisions, communicate, and interact with the others to achieve its goals. Therefore, it seems more appropriate to approach the complex air traffic congestion and airline competition problems using a bottom-up approach. In this research, an agent-based model for the air transportation system has been developed. The developed model considers each subsystem as an independent type of agent that acts based on its local knowledge and its interaction with other agents. The focus of this research is to analyze air traffic congestion and airline competition in a hub-and-spoke network. The simulation model developed is based on evolutionary computation. It seems that the only way to analyze emergent phenomena (such as air traffic congestion) is through the development of simulation models that can simulate the behavior of each agent.
  In the agent-based model developed in this research, agents that represent airports can increase capacity or significantly change landing fee policy, while the agents that represent airlines continually learn and change their markets, fare structures, flight frequencies, and flight schedules. Such a bottom-up approach facilitates a better understanding of the complex nature of congestion and provides more insight into competition in air transportation, making it easier to understand, predict, and control the overall performance of the complex air transportation system.
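A minimal bottom-up sketch of the airport-airline interaction described above; the agent rules, fees, costs, and frequencies are hypothetical and far simpler than the dissertation's evolutionary agent-based model.

```python
import random

# Hypothetical bottom-up sketch: an airport agent raises its landing fee when
# scheduled flights exceed capacity; airline agents add or drop frequency based
# on per-flight profit. All rules and numbers are illustrative only.
class Airport:
    def __init__(self, capacity=50, landing_fee=800.0):
        self.capacity, self.landing_fee = capacity, landing_fee
    def update(self, scheduled_flights):
        if scheduled_flights > self.capacity:
            self.landing_fee *= 1.10     # congestion pushes the fee up
        else:
            self.landing_fee *= 0.98     # spare capacity lets it drift down

class Airline:
    def __init__(self, frequency, revenue_per_flight=15_000.0,
                 operating_cost_per_flight=14_000.0):
        self.frequency = frequency
        self.revenue, self.cost = revenue_per_flight, operating_cost_per_flight
    def update(self, landing_fee):
        profit_per_flight = self.revenue - self.cost - landing_fee
        self.frequency = max(1, self.frequency + (1 if profit_per_flight > 0 else -1))

random.seed(0)
hub = Airport()
airlines = [Airline(frequency=random.randint(15, 25)) for _ in range(3)]
for day in range(60):
    hub.update(sum(a.frequency for a in airlines))
    for a in airlines:
        a.update(hub.landing_fee)
print(round(hub.landing_fee, 1), [a.frequency for a in airlines])
```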
- Analysis of decision maker preferences
  Burkard, Anita M. (Virginia Tech, 1991-05-06)
  Decision making is required daily in our lives, whether it be selecting produce at the grocery store, deciding where to live or work, or designing a weapon system for military applications. Most decisions require the Decision Maker (DM) to examine multiple alternatives which most typically are defined by multiple, conflicting criteria. The objective is to select the alternative which minimizes the tradeoffs between attribute levels in order to determine which alternative is "best". This selection of "best" is based on the subjective viewpoint of the DM, that is, the DM's values and preferences directly influence his or her final alternative selection. A comprehensive analysis of the preferences of the DM in order to systematically structure a decision problem should invariably assist the DM in making the "best" choice from the list of available alternatives. This comprehensive analysis of decision maker preferences is the subject of this project/report.
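One common way to structure such preferences is a weighted additive value function over the conflicting criteria; the sketch below uses hypothetical weights, alternatives, and scores and does not reproduce the report's actual elicitation method.

```python
# One common way to structure decision-maker (DM) preferences: a weighted
# additive value function over normalized attribute scores. Weights,
# alternatives, and scores are hypothetical.
weights = {"cost": 0.40, "performance": 0.35, "schedule": 0.25}

alternatives = {
    "A": {"cost": 0.6, "performance": 0.9, "schedule": 0.5},
    "B": {"cost": 0.8, "performance": 0.6, "schedule": 0.7},
    "C": {"cost": 0.5, "performance": 0.7, "schedule": 0.9},
}

def overall_value(scores):
    # higher normalized score = better on that attribute
    return sum(weights[attr] * s for attr, s in scores.items())

for name, scores in alternatives.items():
    print(name, round(overall_value(scores), 3))
print("preferred alternative:",
      max(alternatives, key=lambda n: overall_value(alternatives[n])))
```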
- An Analysis of Fare Collection Costs on Heavy Rail and Bus Systems in the U.S.
  Plotnikov, Valeri (Virginia Tech, 2001-09-05)
  In this research, an effort is made to analyze the costs of fare collection on heavy rail and motorbus systems in the U.S. Since existing ticketing and fare collection (TFC) systems are major elements of transit infrastructure and there are several new alternative TFC technologies available on the market, the need to evaluate the performance of existing TFC systems arises. However, very little research has been done, so far, to assess the impacts of TFC technologies on capital and operating expenses in public transit. The two objectives of this research are: (1) to formulate a conceptual evaluation framework and a plan to assess the operating costs of existing TFC systems in transit and (2) to analyze the operating expenses associated with existing TFC systems on heavy rail and motorbus transit in the U.S. with the aid of the evaluation framework and plan. This research begins with a review of the current state of knowledge in the areas of transit TFC evaluation, the economics of public transit operations, and fare collection practices and technologies. It helps to determine the scope of work related to assessment of TFC operating costs on public transit and provides the basis for the development of a conceptual evaluation framework and an evaluation plan. Next, this research presents a systematic approach to define and describe alternative TFC systems and suggests that the major TFC system determinants are payment media, fare media, TFC equipment, and transit technology (mode). Following this is the development of measures of effectiveness to evaluate alternative TFC systems. These measures assess the cost-effectiveness and labor-intensiveness of TFC operations. The development of a TFC System Technology Index follows. This Index recognizes the fact that TFC systems may consist of different sets of TFC technologies, both traditional and innovative. Finally, this research presents statistical results that support the hypothesis that TFC operating costs are related to transit demand, transit technology (mode), and TFC technologies. These results further suggest that: (1) TFC operating costs per unlinked passenger trip on heavy rail systems are higher than on motorbus systems and (2) TFC operating costs per unlinked passenger trip tend to increase as the use of non-electronic fare media increases. Actions for further research are also recommended.
- Analysis of potential system improvements concepts for Sunday newspaper insert packaging
  Harris, Earl D. (Virginia Tech, 1994-04-04)
  The newspaper industry is undergoing changes to its revenue base that appear to be leading the industry away from its traditional mission. Throughout history newspapers have had a mission to research, print and deliver news. The better a newspaper was at presenting the news, the greater its circulation; and the greater its circulation, the more revenue it could generate from printed advertisements. Indeed, until the 1970's all of a newspaper's revenues were tied to its printing process. Beginning in the late 1960's, newspapers began to insert pre-printed advertisements into their products. This inserting has continued to grow steadily over the past twenty years. Recently the growth rate of advertising revenue generated by many major newspapers' internal printed advertising has become stagnant. Meanwhile newspaper inserting revenue continues to grow steadily and it appears that newspapers are facing the possible need to re-define their mission. This new mission appears to be: "to distribute varieties of desirable printed products to subscribers." The word "desirable" is key. Advertisers want to direct their information to market "niches"; correspondingly, the advertiser wants to reach only the reader segments most likely to respond positively to the advertisement, i.e. make a purchase. One important facility that gives newspapers a potential advantage over other distributors of printed matter is the newspaper's delivery system. Unlike products delivered through the mail, a newspaper's circulation organization can deliver printed material within as little as twenty-four hours' notice. This research will examine certain production process changes that could potentially take advantage of a modern newspaper's production and distribution strengths. The research considers trends in magazine production and models a number of production line scenarios that have the potential of enhancing revenue by inserting greater varieties of pre-prints into an individually addressed package. It lists a number of changes that would probably be required for such design upgrades and recommends future research that would lead to a possible major change in business strategy.
- The application of level of repair analysis to military electronics programs
  Godshall, R. N. (Virginia Tech, 1990-12-15)
  During the early stages of the acquisition cycle for military equipment, the question arises as to how the system will be maintained at an operationally capable level. The desired level of readiness must be balanced against the cost of system maintenance. The primary question to be answered is: what is the optimum (i.e., least cost) maintenance concept which will allow the system to meet its specified performance goals? In order to answer this question, one must compare the costs associated with the possible choices of maintenance alternatives. Level of Repair Analysis, or LORA for short, is an optimization technique used to determine the optimum level of repair for each component in the system. Premised on accurately addressing selected system life-cycle maintenance costs, the methodology, data requirements, and algorithms used to conduct a LORA are found in Military Standard 1390C, Level of Repair Analysis. LORA examines the feasibility of repair from both technical (non-economic) and economic standpoints. The optimization process will determine whether it is feasible to repair an item, and if so, where and how. While LORA is an excellent and necessary tool, like many analyses of its type, it has limitations. Like any analytical tool, one must understand both the strengths and weaknesses of the process to properly apply its techniques and interpret its results. Data collection and entry into a computer model can be done relatively easily. However, a thorough understanding of the LORA process is paramount to properly applying its techniques and interpreting its results. Therefore, the primary focus of this project and report is to illustrate the strengths and limitations of the LORA process by its application to a specific example. It is not intended to pass judgement on the LORA process as implemented and utilized by DOD. Rather, it is intended that this report present an overview of the LORA process and detail some of the nuances one could expect to encounter when performing a LORA. This report contains a description of the Level of Repair Analysis process and how this process was applied to a specific United States Department of Defense electronics system. This report documents the analysis (data and calculations used) to reach the repair and discard decisions for the NIXIE Signal Generator Engineering Change Kit for the AN/SLQ-25, a US Navy electronics system. This report differs from the one submitted under the contract in that it contains more theoretical detail on the overall LORA process and does not include any classified or proprietary data.
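A minimal repair-versus-discard comparison in the spirit of a level of repair analysis; the failure counts, prices, and alternative cost structures are hypothetical and far simpler than the MIL-STD-1390 models the report applies.

```python
# Minimal repair-versus-discard sketch: compare the support-period cost of
# repairing a failed item at different maintenance levels against discarding
# and buying a replacement. All numbers are hypothetical.
expected_failures = 120          # failures over the support period
unit_price = 4_000.0             # cost of a replacement item

alternatives = {
    "discard":            {"fixed": 0.0,       "per_failure": unit_price},
    "repair_at_depot":    {"fixed": 150_000.0, "per_failure": 900.0},
    "repair_at_org_level": {"fixed": 260_000.0, "per_failure": 400.0},
}

def life_cycle_cost(alt):
    # fixed = support equipment, training, spares pipeline; per_failure = labor/material
    return alt["fixed"] + expected_failures * alt["per_failure"]

for name, alt in alternatives.items():
    print(f"{name:20s} {life_cycle_cost(alt):>12,.0f}")
print("least-cost option:",
      min(alternatives, key=lambda n: life_cycle_cost(alternatives[n])))
```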
- An Approach to Real Time Adaptive Decision Making in Dynamic Distributed Systems
  Adams, Kevin Page (Virginia Tech, 2005-12-12)
  Efficient operation of a dynamic system requires (near) optimal real-time control decisions. Those decisions depend on a set of control parameters that change over time. Very often, the optimal decision can be made only with knowledge of future values of the control parameters. As a consequence, the decision process is heuristic in nature. The optimal decision can be determined only after the fact, once the uncertainty is removed. For some types of dynamic systems, the heuristic approach can be very effective. The basic premise is that the future values of control parameters can be predicted with sufficient accuracy. We can predict those values either from a good model of the system or from historical data. In many cases, a good model is not available; in that case, prediction using historical data is the only option. It is necessary to detect similarities with the current situation and extrapolate future values. In other words, we need to (quickly) identify patterns in historical data that match the current data pattern. Low sensitivity of the optimal solution is critical: small variations in data patterns should minimally affect the optimal solution. Resource allocation problems and other "discrete decision systems" are good examples of such systems. The main contribution of this work is a novel heuristic methodology that uses neural networks for classifying, learning and detecting changing patterns, as well as making (near) real-time decisions. We improve on existing approaches by providing a real-time adaptive approach that takes into account changes in system behavior with minimal operational delay without the need for an accurate model. The methodology is validated by extensive simulation and practical measurements. Two metrics are proposed to quantify the quality of control decisions, as well as a comparison to the optimal solution.
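A minimal sketch of the pattern-matching idea described above, using a nearest-neighbour lookup over historical windows purely as a stand-in; the dissertation's methodology uses neural networks, and the data below are hypothetical.

```python
import numpy as np

# Find historical windows that resemble the current window of control-parameter
# values and extrapolate the next value from what followed them. A
# nearest-neighbour lookup is used here only as a simple stand-in for the
# neural-network approach described in the abstract; the data are made up.
def predict_next(history, window=5, k=3):
    history = np.asarray(history, dtype=float)
    current = history[-window:]
    candidates = []
    for start in range(len(history) - window - 1):
        past = history[start:start + window]
        dist = np.linalg.norm(past - current)
        candidates.append((dist, history[start + window]))
    candidates.sort(key=lambda pair: pair[0])
    return np.mean([nxt for _, nxt in candidates[:k]])

load = [10, 12, 15, 14, 13, 11, 10, 12, 15, 14, 13, 11, 10, 12, 15, 14]
print(predict_next(load))   # extrapolated next control-parameter value
```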
- An Assessment Methodology for Emergency Vehicle Traffic Signal Priority Systems
  McHale, Gene Michael (Virginia Tech, 2002-02-26)
  Emergency vehicle traffic signal priority systems allow emergency vehicles such as fire and emergency medical vehicles to request and receive a green traffic signal indication when approaching an intersection. Such systems have been around for a number of years, however, there is little understanding of the costs and benefits of such systems once they are deployed. This research develops an improved method to assess the travel time impacts of emergency vehicle traffic signal priority systems for transportation planning analyses. The research investigates the current state of available methodologies used in assessing the costs and benefits of emergency vehicle traffic signal priority systems. The ITS Deployment Analysis System (IDAS) software is identified as a recently developed transportation planning tool with cost and benefit assessment capabilities for emergency vehicle traffic signal priority systems. The IDAS emergency vehicle traffic signal priority methodology is reviewed and recommendations are made to incorporate the estimation of non-emergency vehicle travel time impacts into the current methodology. To develop these improvements, a simulation analysis was performed to model an emergency vehicle traffic signal priority system under a variety of conditions. The simulation analysis was implemented using the CORSIM traffic simulation software as the tool. Results from the simulation analysis were used to make recommendations for enhancements to the IDAS emergency vehicle traffic signal priority methodology. These enhancements include the addition of non-emergency vehicle travel time impacts as a function of traffic volume on the transportation network. These impacts were relatively small and ranged from a 1.1% to 3.3% travel time increase for a one-hour analysis period to a 0.6% to 1.7% travel time increase for a two-hour analysis period. The enhanced methodology and a sample application of the methodology are presented as the results of this research. In addition, future research activities are identified to further improve assessment capabilities for emergency vehicle traffic signal priority systems.
- Assessment of Dynamic Maintenance Management
  Kothari, Vishal Pratap (Virginia Tech, 2004-12-10)
  Today's technological systems are expected to perform at very high standards throughout their operational phase. The cost associated with unavailability of these systems is very high, especially for defense systems or medical equipment that can directly affect human lives. The maintenance system plays an important role in achieving higher performance targets. In order to manage maintenance activities in a more informed and rational manner, it is very important to understand the inherently complex and dynamic structure of the system. Traditionally, maintenance policies are derived from the reliability characteristics of individual components or sub-systems. This research makes an attempt to understand the system at the forest level and suggest better maintenance policies for achieving higher availability and lower system degradation. The leverage is gained from the System Dynamics framework's ability to model complex systems and capture various feedback loops. The simulation results reveal that, with limited preventive maintenance capacity and within the given assumptions of the model, there exists an optimal preventive maintenance interval which is not the minimum. The simulation results also reflect that frequent preventive maintenance is required at higher load factors.
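A minimal sketch of why an optimal preventive-maintenance interval can exist under limited maintenance capacity; the downtime function and all parameters are hypothetical, not the thesis's system dynamics model.

```python
# Frequent PM consumes planned downtime, while infrequent PM lets
# degradation-driven failures accumulate, so total downtime can have an
# interior minimum. All numbers below are illustrative only.
def downtime_per_1000h(pm_interval_h, pm_duration=4.0, repair_duration=24.0,
                       base_failure_rate=0.001, degradation=0.00002):
    pm_events = 1000.0 / pm_interval_h
    # failure rate grows the longer a unit runs between PM actions
    failure_rate = base_failure_rate + degradation * pm_interval_h
    corrective_events = failure_rate * 1000.0
    return pm_events * pm_duration + corrective_events * repair_duration

intervals = range(50, 1001, 50)
best = min(intervals, key=downtime_per_1000h)
print("lowest-downtime PM interval (h):", best)          # an interior optimum
print("downtime at 50 h / best / 1000 h:",
      round(downtime_per_1000h(50), 1),
      round(downtime_per_1000h(best), 1),
      round(downtime_per_1000h(1000), 1))
```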
- An assessment of the quality management practices of a systems integration support organization with respect to the operations of a large-scale request for change (RFC) system
  Jobes, Gregory B. (Virginia Tech, 1992-05-15)
  Using a prototype Total Quality Management (TQM) assessment methodology, this project accomplished an assessment of the quality management practices of the General Electric Company Management and Data Systems Operations (M&DSO) Division, Systems Integration Program Department (SIPD) with respect to the operation and management of a large-scale Request for Change (RFC) system. In addition, guidelines were proposed for the planning, design, and implementation of a TQM system. These guidelines can be used by SIPD management if they choose to integrate a Total Quality Management system into the organization. An agenda of enhancement opportunities for quality management practices was identified as a result of the assessment. A complete description of the RFC system, a systems analysis of the RFC system, the assessment methodology, and the criteria used to evaluate SIPD's quality management practices are included.
- Capturing multi-stage fuzzy uncertainties in hybrid system dynamics and agent-based models for enhancing policy implementation in health systems research
  Liu, Shiyong; Triantis, Konstantinos P.; Zhao, Li; Wang, Youfa (PLOS, 2018-04-25)
  Background: In practical research, it was found that most people make health-related decisions based not on numerical data but on perceptions. Examples include the perceptions, and their corresponding linguistic values, of health risks such as smoking, syringe sharing, eating energy-dense food, and drinking sugar-sweetened beverages. To understand the mechanisms that affect the implementation of health-related interventions, we employ fuzzy variables to quantify linguistic variables in healthcare modeling, using an integrated system dynamics and agent-based model.
  Methodology: In a nonlinear, causally driven simulation environment governed by feedback loops, we mathematically demonstrate how interventions at an aggregate level affect the dynamics of linguistic variables that are captured by fuzzy agents, and how interactions among fuzzy agents, at the same time, affect the formation of different clusters (groups) that are targeted by specific interventions.
  Results: In this paper, we provide an innovative framework to capture multi-stage fuzzy uncertainties manifested among interacting heterogeneous agents (individuals) and intervention decisions that affect homogeneous agents (groups of individuals) in a hybrid model that combines an agent-based simulation model (ABM) and a system dynamics model (SDM). Having built the platform to incorporate high-dimension data in a hybrid ABM/SDM model, this paper demonstrates how one can obtain the state variable behaviors in the SDM and the corresponding values of linguistic variables in the ABM.
  Conclusions: This research provides a way to incorporate high-dimension data in a hybrid ABM/SDM model. It not only enriches the application of fuzzy set theory by capturing the dynamics of variables associated with interacting fuzzy agents that lead to aggregate behaviors, but also informs implementation research by enabling the incorporation of linguistic variables at both individual and institutional levels, which makes unstructured linguistic data meaningful and quantifiable in a simulation environment. This research can help practitioners and decision makers gain a better understanding of the dynamics and complexities of precision intervention in healthcare. It can aid the optimal allocation of resources to targeted group(s) and the achievement of maximum utility. As this technology matures, one can design policy flight simulators with which policy/intervention designers can test a variety of assumptions when they evaluate alternative interventions.
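A minimal sketch of representing a linguistic variable with fuzzy membership functions, in the spirit of the fuzzy agents described above; the triangular membership parameters are hypothetical.

```python
# Map a numeric perception onto linguistic terms via fuzzy membership degrees.
# The triangular membership parameters below are illustrative only.
def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def perceived_risk_memberships(x):
    # x: perceived health risk on a 0-10 scale (e.g., of smoking)
    return {
        "low":    triangular(x, -0.1, 0.0, 4.0),
        "medium": triangular(x, 2.0, 5.0, 8.0),
        "high":   triangular(x, 6.0, 10.0, 10.1),
    }

print(perceived_risk_memberships(3.0))  # partly 'low', partly 'medium'
print(perceived_risk_memberships(7.5))  # more 'high' than 'medium'
```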
- Cluster Algebra: A Query Language for Heterogeneous Databases
  Bhasker, Bharat; Egyhazy, Csaba J.; Triantis, Konstantinos P. (Department of Computer Science, Virginia Polytechnic Institute & State University, 1992)
  This report describes a query language based on algebra for heterogeneous databases. The database logic is used as a uniform framework for studying the heterogeneous databases. The data model based on the database logic is referred to as the cluster data model in this report. Generalized Structured Query Language (GSQL) is used for expressing ad-hoc queries over relational, hierarchical and network databases uniformly. For the purpose of query optimization, a query language that can express the primitive heterogeneous database operations is required. This report describes such a query language for the clusters (i.e., heterogeneous databases). The cluster algebra consists of (a) generalized relational operations such as selection, union, intersection, difference, semi-join, rename and cross-product; (b) modified relational operations such as normal projection and normal join; and (c) new operations such as normalize, embed, and unembed.
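A minimal sketch of generalized relational-style operations (selection, union, projection) over simple record sets, only to convey the flavour of the operations listed above; the report's cluster algebra over heterogeneous databases is considerably more general.

```python
# Toy record sets standing in for data drawn from different member databases;
# the operations below illustrate selection, union, and projection only.
employees_relational = [
    {"id": 1, "name": "Ada", "dept": "CS"},
    {"id": 2, "name": "Bob", "dept": "IE"},
]
employees_network = [
    {"id": 3, "name": "Cy", "dept": "CS"},
]

def select(records, predicate):
    return [r for r in records if predicate(r)]

def union(r1, r2):
    seen, out = set(), []
    for r in r1 + r2:
        key = tuple(sorted(r.items()))
        if key not in seen:
            seen.add(key)
            out.append(r)
    return out

def project(records, attrs):
    return [{a: r[a] for a in attrs} for r in records]

all_employees = union(employees_relational, employees_network)
print(project(select(all_employees, lambda r: r["dept"] == "CS"), ["name"]))
```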
- Commute Travel Changes and their Duration in Hurricane Sandy's Aftermath
  Kontou, Eleftheria (Virginia Tech, 2014-01-31)
  Hurricane Sandy struck the New York City-New Jersey region on October 29, 2012, with severe consequences for the transportation network, including both the road network and the transit system. This study used survey data from nearly 400 commuters in the New York City Metropolitan Area to determine the transportation disruptions and socio-demographic characteristics associated with travel changes and their duration for the home-to-work commute after Hurricane Sandy. Multi-variable binary logit modeling was used to examine mode shifting, cancelling the trip to work, route changing, and modifying departure time. Transit commuters were more likely to change modes, cancel the trip, and depart earlier. Women were less likely to change modes or depart later. Carpool restrictions encouraged mode changing and earlier departures. Delays/crowding increased the probability of route changes, canceled trips, and earlier departures. Durations of commute travel changes were modeled with accelerated failure time approaches (Weibull distribution). New Jersey Transit disruptions prolonged the time to return to the normal working schedule, the telecommuting time, and the duration of commuting-pattern alterations. Gasoline purchase restrictions extended commuting delays and the duration of alterations to normal commute patterns but decreased the duration of changes to working schedule and location. The mode used under normal commute conditions did not have an impact on the duration of the changes, even though it had a significant impact on the selected changes. The results underline the need for policy makers to account for mode-specific populations and lower income commuters during post-disaster recovery periods.
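A minimal binary-logit sketch of the kind of model estimated for each commute change; the coefficients are hypothetical, with signs chosen to mirror the abstract's findings rather than the paper's estimates.

```python
import math

# Binary-logit sketch for one commute change (switching modes). The intercept
# and coefficients are hypothetical; their signs follow the abstract (transit
# commuters more likely to switch, women less likely).
def prob_mode_change(transit_commuter, female, intercept=-1.0,
                     b_transit=1.2, b_female=-0.6):
    utility = intercept + b_transit * transit_commuter + b_female * female
    return 1.0 / (1.0 + math.exp(-utility))

print(prob_mode_change(transit_commuter=1, female=0))  # roughly 0.55
print(prob_mode_change(transit_commuter=0, female=1))  # roughly 0.17
```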
- A Complex Adaptive Systems Analysis of Productive Efficiency
  Dougherty, Francis Laverne (Virginia Tech, 2014-10-17)
  Linkages between Complex Adaptive Systems (CAS) thinking and efficiency analysis remain in their infancy. This research associates the basic building blocks of the CAS 'flocking' metaphor with the essential building block concepts of Data Envelopment Analysis (DEA). Within the proposed framework, DEA "decision-making units" (DMUs) are represented as agents in the agent-based modeling (ABM) paradigm. Guided by simple rules, agent DMUs representing business units of a larger management system 'align' with one another to achieve mutual protection/risk reduction and 'cohere' with the most efficient DMUs among them to achieve the greatest possible efficiency in the least possible time. Analysis of the resulting patterns of behavior can provide policy insights that are both evidence-based and intuitive. This research introduces a consistent methodology, called here the Complex Adaptive Productive Efficiency Method (CAPEM), and employs it to bridge these domains. The research formalizes CAPEM mathematically and graphically and then conducts experimentation using the resulting CAPEM simulation with data on a sample of electric power plants obtained from Rungsuriyawiboon and Stefanou (2003). Guided by these rules, individual agent DMUs (power plants) 'align' with one another to achieve mutual protection/risk reduction and 'cohere' with the most efficient DMUs among them to achieve the greatest possible efficiency in the least possible time. Using a CAS ABM simulation, it is found that the flocking rules (alignment, cohesion and separation), taken individually and in selected combinations, increased the mean technical efficiency of the power plant population and conversely decreased the time to reach the frontier. It is found, however, that these effects were limited to a smaller than expected sub-set of the combinations of the flocking factors. Finding even a limited sub-set of flocking rules that increased efficiency was sufficient to support the hypotheses and conclude that employing the flocking metaphor offers useful options to decision-makers for increasing the efficiency of management systems.
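A minimal sketch of the flocking metaphor applied to DMU agents, with hypothetical cohesion and alignment rates; it is not the CAPEM formulation.

```python
import random

# Each business-unit agent 'coheres' toward the most efficient peer it can see
# and 'aligns' with its neighbours' average direction of change. Update rules
# and rates are hypothetical, not the CAPEM model.
random.seed(1)
efficiency = [random.uniform(0.3, 0.9) for _ in range(12)]
velocity = [0.0] * len(efficiency)

for step in range(20):
    frontier = max(efficiency)
    mean_velocity = sum(velocity) / len(velocity)
    for i, e in enumerate(efficiency):
        cohesion = 0.10 * (frontier - e)              # move toward best practice
        alignment = 0.05 * (mean_velocity - velocity[i])
        velocity[i] += cohesion + alignment
        efficiency[i] = min(1.0, e + velocity[i])     # efficiency capped at the frontier

print(round(sum(efficiency) / len(efficiency), 3))    # mean efficiency rises toward 1.0
```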
- Computer integrated machining parameter selection in a job shop using expert systems and algorithms
  Gopalakrishnan, B. (Virginia Polytechnic Institute and State University, 1988)
  The research for this dissertation is focused on the selection of machining parameters for a job shop using expert systems and algorithms. The machining processes are analyzed in detail and rule based expert systems are developed for the analysis of process plans based on operation and work-material compatibility, the selection of machines, cutting tools, cutting fluids, and tool angles. Data base design is examined for this problem. Algorithms are developed to evaluate the selection of machines and cutting tools based on cost considerations. An algorithm for optimizing cutting conditions in turning operations has been developed. Data framework and evaluation procedures are developed for other machining operations involving different types of machines and tools.
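A minimal sketch of optimizing a turning cutting speed by minimizing cost per part with Taylor's tool-life equation, one classical formulation of the problem; all parameters are hypothetical and the dissertation's expert-system and algorithmic treatment is far richer.

```python
# Economic cutting-speed sketch using Taylor's tool-life equation V * T**n = C.
# Taylor constants, cost rates, and times below are illustrative only.
n, C = 0.25, 400.0              # Taylor exponent and constant (V in m/min)
machining_constant = 60.0       # geometry/feed term: cutting time = constant / V (min)
labor_overhead = 1.2            # $/min
tool_change_time = 2.0          # min per tool change
tool_cost = 8.0                 # $ per cutting edge

def cost_per_part(v):
    t_cut = machining_constant / v                 # cutting time per part
    tool_life = (C / v) ** (1.0 / n)               # Taylor tool life at speed v
    edges_used = t_cut / tool_life                 # fraction of an edge consumed per part
    return (labor_overhead * t_cut
            + edges_used * (labor_overhead * tool_change_time + tool_cost))

speeds = range(60, 301, 10)
best_v = min(speeds, key=cost_per_part)
print("least-cost cutting speed (m/min):", best_v,
      " cost/part:", round(cost_per_part(best_v), 2))
```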
- Cyrano: a meta model for federated database systems
  Dzikiewicz, Joseph (Virginia Tech, 1996-05-01)
  The emergence of new data models requires further research into federated database systems. A federated database system (FDBS) provides uniform access to multiple heterogeneous databases. Most FDBSs provide access only to the older data models such as relational, hierarchical, and network models. A federated system requires a meta data model. The meta model is a uniform data model through which users access data regardless of the data model of the data's native database. This dissertation examines the question of meta models for use in an FDBS that provides access to relational, object oriented, and rule based databases. This dissertation proposes Cyrano, a hybrid of object oriented and rule based data models. The dissertation demonstrates that Cyrano is suitable as a meta model by showing that Cyrano satisfies the following three criteria: 1) Cyrano fully supports relational, object oriented, and rule based member data models. 2) Cyrano provides sufficient capabilities to support integration of heterogeneous databases. 3) Cyrano can be implemented as the meta model of an operational FDBS. This dissertation describes four primary products of this research: 1) The dissertation presents Cyrano, a meta model designed as part of this research that supports both the older and the newer data models. Cyrano is an example of analytic object orientation, a conceptual approach that combines elements of object oriented and rule based data models. 2) The dissertation describes Roxanne, a proof-of-concept FDBS that uses Cyrano as its meta model. 3) The dissertation proposes a set of criteria for the evaluation of meta models and uses these criteria to demonstrate Cyrano's suitability as a meta model. 4) The dissertation presents an object oriented FDBS reference architecture suitable for use in describing and designing an FDBS.
- Design in the Modern Age: Investigating the Role of Complexity in the Performance of Collaborative Engineering Design Teams
  Ambler, Nathaniel Palenaka (Virginia Tech, 2015-06-12)
  The world of engineering design finds itself at a crossroads. The technical and scientifically rooted tools that propelled humankind into the modern age are now insufficient, as evidenced by a growing number of failures to meet design expectations and to deliver value for users and society in general. In the empirical world, a growing consensus among many design practitioners has emerged that engineering design efforts are becoming too unmanageable and too complex for existing design management systems and tools. One of the key difficulties of engineering design is the coordination and management of the underlying collaboration processes. Development efforts that focus on the design of complex artefacts, such as a satellite or information system, commonly require the interaction of hundreds to thousands of different disciplines. What makes these efforts and the related collaboration processes complex, from the perspective of many practitioners, is the strong degree of interdependency between design decision-making occurring, often concurrently, across multiple designers who commonly reside in different organizational settings. Not only must a design account for and satisfice these dependencies, but it must also remain acceptable to all design participants. Design in effect represents a coevolution between the problem definition and the solution, with a finalized design approach arising not from a repeatable series of mathematical optimizations but rather through the collective socio-technical design activities of a large collaboration of designers. Despite the importance of understanding design as a socio-technical decision-making entity, many existing design approaches ignore socio-technical issues and often view them as either too imprecise or too difficult to consider. This research provides a performance measurement framework to explore these factors by investigating design as a socio-technical complex adaptive collaborative process between the designer, artefact, and user (DAU). The research implements this framework through an agent-based model, the Complex Adaptive Performance Evaluation Method for Collaboration Design (C2D). This approach allows a design management analyst to generate insights about potential design strategies and mechanisms as they relate to design complexity by examining the simulated performance of a design collaboration as it explores theoretical design fitness landscapes with various degrees of ruggedness.
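An NK-style landscape is one common way to generate design fitness landscapes with tunable ruggedness; the sketch below uses it only to illustrate the landscape idea referenced above and is not the C2D model.

```python
import random

# NK-style landscape: K = interdependencies per design decision; larger K
# produces a more rugged landscape with more local optima. All settings are
# hypothetical and unrelated to the dissertation's experiments.
def make_nk_landscape(n, k, seed=0):
    rng = random.Random(seed)
    tables = [{} for _ in range(n)]
    def contribution(i, bits):
        key = tuple(bits[(i + j) % n] for j in range(k + 1))
        if key not in tables[i]:
            tables[i][key] = rng.random()   # lazily drawn, then memoized
        return tables[i][key]
    def fitness(bits):
        return sum(contribution(i, bits) for i in range(n)) / n
    return fitness

def hill_climb(fitness, n, steps=300, seed=1):
    rng = random.Random(seed)
    design = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(steps):
        trial = design[:]
        trial[rng.randrange(n)] ^= 1        # flip one design decision
        if fitness(trial) >= fitness(design):
            design = trial
    return fitness(design)

for k in (0, 2, 4):   # more interdependency -> more rugged landscape
    print(k, round(hill_climb(make_nk_landscape(n=10, k=k), n=10), 3))
```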
- The Design of an Urban Roadside Automatic Sprinkling System: Mitigation of PM2.5–10 in Ambient Air in Megacities
  Liu, Shiyong; Triantis, Konstantinos P.; Zhang, Lan (Hindawi, 2014-07-23)
  The objective of this research paper is to describe the system architecture for an urban roadside automatic mist-generating system. Its primary purpose is to mitigate particulate matter, especially PM2.5–10. In this paper, four graphs are provided to exhibit the constituent elements of this system. This paper also discusses the functional extensions of this system for alternative uses in civil engineering, which include winter road deicing and snow removal with added salt; clean-up of street dust; lowering the temperature of a "hot island" during the summer; adding humidity in an arid area; and suppressing flu virus in the winter season. The structure and function of this system are comprehensively discussed in this paper. This system is compared to existing and other proposed systems in terms of control options, efficiency, and primary functional issues. The unique design of the roadside automatic sprinkling system renders it a prominent option. Although there are no data available for this conceptual system, some expected qualitative and quantitative outcomes are provided and justified. The paper concludes with some potential research areas and challenges associated with this system architecture.