Browsing by Author "Mortveit, Henning S."
Now showing 1 - 20 of 20
- The Algebra of Systems Biology. Veliz-Cuba, Alan A. (Virginia Tech, 2010-07-05). In order to understand biochemical networks we need to know not only how their parts work but also how they interact with each other. The goal of systems biology is to look at biological systems as a whole to understand how interactions of the parts can give rise to complex dynamics. In order to do this efficiently, new techniques have to be developed. This work shows how tools from mathematics are suitable to study problems in systems biology such as modeling, dynamics prediction, reverse engineering and many others. The advantage of using mathematical tools is that a large body of theory, algorithms, and software is available. This work focuses on how algebra can contribute to answering questions arising from systems biology.
- Algebraic Methods for Modeling Gene Regulatory Networks. Murrugarra Tomairo, David M. (Virginia Tech, 2012-07-18). So-called discrete models have been successfully used in engineering and computational systems biology. This thesis discusses algebraic methods for modeling and analysis of gene regulatory networks within the discrete modeling context. The first chapter gives a background for discrete models and puts in context some of the main research problems that have been pursued in this field for the last fifty years. It also outlines the content of each subsequent chapter. The second chapter focuses on the problem of inferring dynamics from the structure (topology) of the network. It also discusses the characterization of the attractor structure of a network when a particular class of functions controls the nodes of the network. Chapters 3 and 4 focus on the study of multi-state nested canalyzing functions as biologically inspired functions and the characterization of their dynamics. Chapter 5 focuses on stochastic methods, specifically on the development of a stochastic modeling framework for discrete models. Stochastic discrete modeling is an alternative to well-known mathematical formalizations such as stochastic differential equations and Gillespie algorithm simulations. Within the discrete setting, a framework that incorporates propensity probabilities for activation and degradation is presented. This approach allows a finer analysis of discrete models and provides a natural setup for cell population simulations. Finally, Chapter 6 discusses future research directions inspired by the work presented here.
- Architectural Enhancements to Increase Trust in Cyber-Physical Systems Containing Untrusted Software and Hardware. Farag, Mohammed Morsy Naeem (Virginia Tech, 2012-09-17). Embedded electronics are widely employed in cyber-physical systems (CPSes), which tightly integrate and coordinate computational and physical elements. CPSes are extensively deployed in security-critical applications and nationwide infrastructure. Perimeter security approaches to preventing malware infiltration of CPSes are challenged by the complexity of modern embedded systems incorporating numerous heterogeneous and updatable components. Global supply chains and third-party hardware components, tools, and software limit the reach of design verification techniques and introduce security concerns about deliberate Trojan inclusions. As a consequence, skilled attacks against CPSes have demonstrated that these systems can be surreptitiously compromised. Existing run-time security approaches are not adequate to counter such threats because of either the impact on performance and cost, lack of scalability and generality, trust needed in global third parties, or significant changes required to the design flow. We present a protection scheme called Run-time Enhancement of Trusted Computing (RETC) to enhance trust in CPSes containing untrusted software and hardware. RETC is complementary to design-time verification approaches and serves as a last line of defense against the rising number of inexorable threats against CPSes. We target systems built using reconfigurable hardware to meet the flexibility and high-performance requirements of modern security protections. Security policies are derived from the system's physical characteristics and component operational specifications and translated into synthesizable hardware integrated into specific interfaces on a per-module or per-function basis.
The policy-based approach addresses many security challenges by decoupling policies from system-specific implementations and optimizations, and minimizes changes required to the design flow. Interface guards enable in-line monitoring and enforcement of critical system computations at run-time. Trust is only required in a small set of simple, self-contained, and verifiable guard components. Hardware trust anchors simultaneously address the performance, flexibility, developer productivity, and security requirements of contemporary CPSes. We apply RETC to several CPSes having common security challenges including: secure reconfiguration control in reconfigurable cognitive radio platforms, tolerating hardware Trojan threats in third-party IP cores, and preserving stability in process control systems. High-level architectures demonstrated with prototypes are presented for the selected applications. Implementation results illustrate RETC's efficiency in terms of the performance and overheads of the hardware trust anchors. Testbenches associated with the addressed threat models are generated and experimentally validated on a reconfigurable platform to establish the protection scheme's efficacy in thwarting the selected threats. This new approach significantly enhances trust in CPSes containing untrusted components without sacrificing cost or performance.
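The interface-guard idea in the entry above can be illustrated with a deliberately simplified software analogy (RETC itself realizes guards as synthesizable hardware; the class, policy limits, and example values below are our own, not the dissertation's design):

```python
# Software analogy of a per-interface guard: values crossing a module
# boundary are checked against a policy derived from physical operating
# limits, and violations are reported rather than passed through.
class InterfaceGuard:
    def __init__(self, low, high, on_violation):
        self.low, self.high = low, high
        self.on_violation = on_violation

    def check(self, value):
        """Return True if the value satisfies the policy, else report it."""
        if not (self.low <= value <= self.high):
            self.on_violation(value)   # e.g. block the command, raise an alarm
            return False
        return True

violations = []
# Hypothetical policy: an actuator set-point must stay within 0..100.
guard = InterfaceGuard(0, 100, violations.append)
ok1 = guard.check(42)    # within policy
ok2 = guard.check(250)   # an untrusted component attempts an unsafe command
```

In the hardware setting described above, the equivalent check sits in-line on the module interface, so the untrusted component never needs to be trusted itself.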
- Assessment of the Jones Act Waiver Process on Freight Transportation Networks Experiencing Disruption. Fialkoff, Marc Richard (Virginia Tech, 2017-10-27). In October 2012, Hurricane Sandy caused massive disruption and destruction to the Mid-Atlantic region of the United States. The intensity of the storm forced the Port of New York and New Jersey to close, diverting cargo to the Port of Norfolk in Virginia. Because the Jones Act restricts foreign vessels from moving between U.S. ports, the restriction on short sea shipping was viewed as a barrier to recovery. Much of the critical infrastructure resilience and security literature focuses on the "hardening" of physical infrastructure, but not the relationship between law, policy, and critical infrastructure. Traditional views of transportation systems do not adequately address questions of governance and behaviors that contribute to resilience. In contrast, the recent development of a System of Systems framework provides a conceptual framework to study the relationship of law and policy systems to the transportation systems they govern. Applying a System of Systems framework, this research analyzed the effect of relaxing the Jones Act on freight transportation networks experiencing a disruptive event. Using WebTRAGIS (Transportation Routing Analysis GIS), the results of the research demonstrate that relaxing the Jones Act marginally reduced highway truck traffic and left rail traffic volume unchanged in the aftermath of a disruption. The research also analyzed the Jones Act waiver process and the barriers posed by the legal process involved in administration and review of Jones Act waivers. Recommendations on improving the waiver process include greater agency coordination and formal rulemaking to ensure certainty in the waiver process. This research is the first to study the impact of the Jones Act on a multimodal freight transportation network.
Likewise, the use of the System of Systems framework to conceptualize the law and a critical infrastructure system such as transportation provides future opportunities for studying different sets of laws and policies on infrastructure. This research externalizes law and policy systems from the transportation systems they govern. This can provide policymakers and planners with an opportunity to understand the impact of law and policy on the infrastructure systems they govern.
- Complex situation analysis system that generates a social contact network, uses edge brokers and service brokers, and dynamically adds brokers (United States Patent and Trademark Office, 2013-04-16). A system for generating a representation of a situation is disclosed. The system comprises one or more computer-readable media including computer-executable instructions that are executable by one or more processors to implement a method of generating a representation of a situation. The method comprises receiving input data regarding a target population. The method further comprises constructing a synthetic data set including a synthetic population based on the input data. The synthetic population includes a plurality of synthetic entities. Each synthetic entity has a one-to-one correspondence with an entity in the target population. Each synthetic entity is assigned one or more attributes based on information included in the input data. The method further comprises receiving activity data for a plurality of entities in the target population.
- Computational Framework for Uncertainty Quantification, Sensitivity Analysis and Experimental Design of Network-based Computer Simulation Models. Wu, Sichao (Virginia Tech, 2017-08-29). When capturing a real-world, networked system using a simulation model, features are usually omitted or represented by probability distributions. Verification and validation (V and V) of such models is an inherent and fundamental challenge. Central to V and V, but also to model analysis and prediction, are uncertainty quantification (UQ), sensitivity analysis (SA) and design of experiments (DOE). In addition, network-based computer simulation models, as compared with models based on ordinary and partial differential equations (ODE and PDE), typically involve a significantly larger volume of more complex data. Efficient use of such models is challenging since it requires a broad set of skills ranging from domain expertise to in-depth knowledge including modeling, programming, algorithmics, high-performance computing, statistical analysis, and optimization. On top of this, the need to support reproducible experiments necessitates complete data tracking and management. Finally, the lack of standardization of simulation model configuration formats presents an extra challenge when developing technology intended to work across models. While there are tools and frameworks that address parts of the challenges above, to the best of our knowledge, none of them accomplishes all this in a model-independent and scientifically reproducible manner. In this dissertation, we present a computational framework called GENEUS that addresses these challenges. Specifically, it incorporates (i) a standardized model configuration format, (ii) a data flow management system with digital library functions helping to ensure scientific reproducibility, and (iii) a model-independent, expandable plugin-type library for efficiently conducting UQ/SA/DOE for network-based simulation models.
This framework has been applied to systems ranging from fundamental graph dynamical systems (GDSs) to large-scale socio-technical simulation models with a broad range of analyses such as UQ and parameter studies for various scenarios. Graph dynamical systems provide a theoretical framework for network-based simulation models and have been studied theoretically in this dissertation. This includes a broad range of stability and sensitivity analyses offering insights into how GDSs respond to perturbations of their key components. This stability-focused, structure-to-function theory was a motivator for the design and implementation of GENEUS. GENEUS, rooted in the framework of GDS, provides modelers, experimentalists, and research groups access to a variety of UQ/SA/DOE methods with robust and tested implementations, without requiring them to have detailed expertise in statistics, data management and computing. Even for research teams that have all these skills, GENEUS can significantly increase research productivity.
- Epidemiology Experimentation and Simulation Management through Scientific Digital Libraries. Leidig, Jonathan Paul (Virginia Tech, 2012-07-20). Advances in scientific data management, discovery, dissemination, and sharing are changing the manner in which scientific studies are being conducted and repurposed. Data-intensive scientific practices increasingly require data management related services not available in existing digital libraries. Complicating the issue are the diversity of functional requirements and content in scientific domains as well as scientists' lack of expertise in information and library sciences. Researchers that utilize simulation and experimentation systems need digital libraries to maintain datasets, input configurations, results, analyses, and related documents. A digital library may be integrated with simulation infrastructures to provide automated support for research components, e.g., simulation interfaces to models, data warehouses, simulation applications, computational resources, and storage systems. Managing and provisioning simulation content allows streamlined experimentation, collaboration, discovery, and content reuse within a simulation community. Formal definitions of this class of digital libraries provide a foundation for producing a software toolkit and the semi-automated generation of digital library instances. We present a generic, component-based SIMulation-supporting Digital Library (SimDL) framework. The framework is formally described and provides a deployable set of domain-free services, schema-based domain knowledge representations, and extensible lower and higher level service abstractions. Services in SimDL are specialized for semi-structured simulation content and large-scale data producing infrastructures, as exemplified in data storage, indexing, and retrieval service implementations.
Contributions to the scientific community include previously unavailable simulation-specific services, e.g., incentivizing public contributions, semi-automated content curating, and memoizing simulation-generated data products. The practicality of SimDL is demonstrated through several case studies in computational epidemiology and network science as well as performance evaluations.
- Generalizations of Threshold Graph Dynamical Systems. Kuhlman, Christopher James (Virginia Tech, 2013-05-02). Dynamics of social processes in populations, such as the spread of emotions, influence, language, mass movements, and warfare (often referred to individually and collectively as contagions), are increasingly studied because of their social, political, and economic impacts. Discrete dynamical systems (discrete in time and discrete in agent states) are often used to quantify contagion propagation in populations that are cast as graphs, where vertices represent agents and edges represent agent interactions. We refer to such formulations as graph dynamical systems. For social applications, threshold models are used extensively for agent state transition rules (i.e., for vertex functions). In its simplest form, each agent can be in one of two states (state 0 (1) means that an agent does not (does) possess a contagion), and an agent contracts a contagion if at least a threshold number of its distance-1 neighbors already possess it. The transition to state 0 is not permitted. In this study, we extend threshold models in three ways. First, we allow transitions to states 0 and 1, and we study the long-term dynamics of these bithreshold systems, wherein there are two distinct thresholds for each vertex; one governing each of the transitions to states 0 and 1. Second, we extend the model from a binary vertex state set to an arbitrary number r of states, and allow transitions between every pair of states. Third, we analyze a recent hierarchical model from the literature where inputs to vertex functions take into account subgraphs induced on the distance-1 neighbors of a vertex. We state, prove, and analyze conditions characterizing long-term dynamics of all of these models.
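The bithreshold update rule described in the entry above can be sketched in a few lines. The graph, threshold values, and function names below are illustrative assumptions, not the dissertation's code:

```python
# Synchronous bithreshold update on a graph: each vertex has an "up"
# threshold for the 0 -> 1 transition and a "down" threshold for 1 -> 0.
def bithreshold_step(adj, state, k_up, k_down):
    """One synchronous update step.

    adj    -- dict: vertex -> list of distance-1 neighbors
    state  -- dict: vertex -> 0 or 1
    k_up   -- dict: vertex -> threshold governing the 0 -> 1 transition
    k_down -- dict: vertex -> threshold governing the 1 -> 0 transition
    """
    new_state = {}
    for v, nbrs in adj.items():
        active = sum(state[u] for u in nbrs)
        if state[v] == 0:
            # adopt the contagion if enough neighbors already possess it
            new_state[v] = 1 if active >= k_up[v] else 0
        else:
            # drop the contagion if too few neighbors still possess it
            new_state[v] = 0 if active < k_down[v] else 1
    return new_state

# Tiny example: a 4-cycle where vertex 0 starts with the contagion.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
state = {0: 1, 1: 0, 2: 0, 3: 0}
k_up = {v: 1 for v in adj}    # one active neighbor suffices to adopt
k_down = {v: 1 for v in adj}  # drop the contagion if no neighbor is active
state = bithreshold_step(adj, state, k_up, k_down)
```

Setting `k_down` to 0 everywhere recovers the classical irreversible threshold model, since the transition back to state 0 then never fires.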
- High Performance Computational Social Science Modeling of Networked Populations. Kuhlman, Christopher J. (Virginia Tech, 2013-07-17). Dynamics of social processes in populations, such as the spread of emotions, influence, opinions, and mass movements (often referred to individually and collectively as contagions), are increasingly studied because of their economic, social, and political impacts. Moreover, multiple contagions may interact and hence studying their simultaneous evolution is important. Within the context of social media, large datasets involving many tens of millions of people are leading to new insights into human behavior, and these datasets continue to grow in size. Through social media, contagions can readily cross national boundaries, as evidenced by the 2011 Arab Spring. These and other observations guide our work. Our goal is to study contagion processes at scale with an approach that permits intricate descriptions of interactions among members of a population. Our contributions are a modeling environment to perform these computations and a set of approaches to predict contagion spread size and to block the spread of contagions. Since we represent populations as networks, we also provide insights into network structure effects, and present and analyze a new model of contagion dynamics that represents a person's behavior in repeatedly joining and withdrawing from collective action. We study variants of problems for different classes of social contagions, including those known as simple and complex contagions.
- Mathematical frameworks for quantitative network analysis. Bura, Cotiso Andrei (Virginia Tech, 2019-10-22). This thesis comprises three parts. The first part describes a novel framework for computing importance measures on graph vertices. The concept of a D-spectrum is introduced, based on vertex ranks within certain chains of nested sub-graphs. We show that the D-spectrum integrates the degree distribution and coreness information of the graph as two particular such chains. We prove that these spectra are realized as fixed points of certain monotone and contractive SDSs we call t-systems. Finally, we give a vertex deletion algorithm that efficiently computes D-spectra, and we illustrate their correlation with stochastic SIR-processes on real world networks. The second part deals with the topology of the intersection nerve for a bi-secondary structure, and its singular homology. A bi-secondary structure R is a combinatorial object that can be viewed as a collection of cycles (loops) of certain at most tetravalent planar graphs. Bi-secondary structures arise naturally in the study of RNA riboswitches - molecules that have an MFE binary structural degeneracy. We prove that this loop nerve complex has a Euclidean 3-space embedding characterized solely by H2(R), its second homology group. We show that this group is the only non-trivial one in the sequence and, furthermore, that it is free abelian. The third part further describes the features of the loop nerve. We identify certain disjoint objects in the structure of R which we call crossing components (CC). These are non-trivial connected components of a graph that captures a particular non-planar embedding of R. We show that each CC contributes a unique generator to H2(R) and thus the total number of these crossing components in fact equals the rank of the second homology group.
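One ingredient of the D-spectrum named in the entry above, the coreness (k-core number) of each vertex, can be computed by repeatedly deleting a minimum-degree vertex. This sketch and its example graph are our own illustration, not the thesis code:

```python
# Coreness via iterative minimum-degree deletion: the coreness of a vertex
# is the largest k such that the vertex belongs to a subgraph in which
# every vertex has degree at least k.
def coreness(adj):
    deg = {v: len(ns) for v, ns in adj.items()}
    adj = {v: set(ns) for v, ns in adj.items()}   # mutable working copy
    core = {}
    k = 0
    while deg:
        v = min(deg, key=deg.get)      # peel a minimum-degree vertex
        k = max(k, deg[v])             # the core level never decreases
        core[v] = k
        for u in adj[v]:               # detach v from its live neighbors
            adj[u].discard(v)
            deg[u] -= 1
        del deg[v], adj[v]
    return core

# A triangle with a pendant vertex: the triangle is the 2-core.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
core = coreness(adj)
```

The degree distribution and this coreness vector are, per the abstract, two particular chains that the D-spectrum integrates; the full D-spectrum construction is more general than this single measure.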
- Modeling, Analysis and Comparison of Large Scale Social Contact Networks on Epidemic Studies. Xia, Huadong (Virginia Tech, 2015-04-07). Social contact networks represent proximity relationships between individual agents. Such networks are useful in diverse applications, including epidemiology, wireless networking and urban resilience. The vertices of a social contact network represent individual agents (e.g. people). Time varying edges represent time varying proximity relationships. The networks are relational -- node and edge labels represent important demographic, spatial and temporal attributes. Synthesizing social contact networks that span large urban regions is challenging for several reasons including: spatial, temporal and relational variety of data sources, noisy and incomplete data, and privacy and confidentiality requirements. Moreover, the synthesized networks differ due to the data and methods used to synthesize them. This dissertation undertakes a systematic study of synthesizing urban scale social contact networks within the specific application context of computational epidemiology. It is motivated by three important questions: (i) How does one construct a realistic social contact network that is adaptable to different levels of data availability? (ii) How does one compare different versions of the network for a given region, and what are appropriate metrics when comparing the relational networks? (iii) When does a network have adequate structural detail for the specific application at hand? We study these questions by synthesizing three social contact networks for Delhi, India. Our case study suggests that we can iteratively improve the quality of a network by adapting to the best data sources available within a framework. The networks differ by the data and the models used. We carry out detailed comparative analyses of the networks.
The analysis has three components: (i) structure analysis that compares the structural properties of the networks, (ii) dynamics analysis that compares the epidemic dynamics on these networks and (iii) policy analysis that compares the efficacy of various interventions. We have proposed a framework to systematically analyze how details in networks impact epidemic dynamics over these networks. The results suggest that a combination of multi-level metrics instead of any individual one should be used to compare two networks. We further investigate the sensitivity of these models. The study reveals the details necessary for a particular class of control policies. Our methods are entirely general and can be applied to other areas of network science.
- A Multi-platform Job Submission System for Epidemiological Simulation Models. Mudgal, Kunal Rajendra (Virginia Tech, 2011-07-22). In the current era of computing, the emergence of middle-ware software has resulted in a high level of software abstraction, which has contributed to the diffusion of distributed computing. Distributed frameworks are systems composed of many computing resources that are coupled together to perform one or more tasks. In a scientific grid-like environment, there are many different applications which require vast computing resources for simulation, execution, or analysis. Some of the applications require faster results, while others rely on more resources for execution. A configurable job submission system helps in distributing the tasks among the resources depending on their preference, priority and usage in a distributed environment. We implemented a job submission system using JavaSpace which can be used for simulating jobs according to their priority. Two different types of brokers are used to monitor the JavaSpace. These brokers can either be used to compute results for tasks that require faster results using binaries installed on a stand-alone system or they can be used to submit the task to a BOINC system for computation. The flexibility to submit tasks as per their priority helps users to get faster results. Upon completion of the tasks, the brokers update the jobs and transfer the results for further processing, thus completing a cycle of retrieving a job, computing the data and transferring the results back to the user. The generic nature of the framework makes it very simple to add new services which can perform a variety of tasks, making the system highly modular and easily extensible. Multiple brokers doing the same or different tasks can run on the same or different systems allowing the users to make efficient use of their resources.
The brokers can be configured to detect the existing system and start monitoring jobs of different types. The framework can be used to transfer the results or detect any failures during execution and report them back to the user. The simple modular design and strong high-level JavaSpace API make it very easy to develop new systems for the framework.
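The priority-based job pickup described in the entry above can be sketched with an in-memory stand-in for the shared space. JavaSpaces is a Java technology; this Python toy, its class name, and the job names are our own illustration of the scheduling idea only:

```python
# A shared job space from which brokers repeatedly take the
# highest-priority job. A min-heap keyed on (priority, arrival order)
# gives priority dispatch with FIFO tie-breaking.
import heapq

class JobSpace:
    def __init__(self):
        self._heap = []
        self._counter = 0   # tie-breaker: FIFO order within a priority level

    def submit(self, priority, job):
        # lower number = higher priority, matching heapq's min-heap order
        heapq.heappush(self._heap, (priority, self._counter, job))
        self._counter += 1

    def take(self):
        """What a broker does each cycle: claim the most urgent job."""
        return heapq.heappop(self._heap)[2] if self._heap else None

space = JobSpace()
space.submit(2, "analysis")
space.submit(1, "urgent-simulation")
space.submit(2, "batch-run")
order = [space.take(), space.take(), space.take()]
```

In the thesis system the space is shared across machines and brokers additionally dispatch claimed jobs to local binaries or to BOINC; this sketch only captures the priority-ordered take operation.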
- Plane Permutations and their Applications to Graph Embeddings and Genome Rearrangements. Chen, Xiaofeng (Virginia Tech, 2017-04-27). Maps have been extensively studied and are important in many research fields. A map is a 2-cell embedding of a graph on an orientable surface. Motivated by a new way to read the information provided by the skeleton of a map, we introduce new objects called plane permutations. Plane permutations not only provide new insight into enumeration of maps and related graph embedding problems, but they also provide a powerful framework to study less related genome rearrangement problems. As a result, we refine and extend several existing results on enumeration of maps by counting plane permutations filtered by different criteria. In the spirit of the topological, graph-theoretical study of graph embeddings, we study the behavior of graph embeddings under local changes. We obtain a local version of the interpolation theorem, local genus distribution as well as an easy-to-check necessary condition for a given embedding to be of minimum genus. Applying the plane permutation paradigm to genome rearrangement problems, we present a unified simple framework to study transposition distances and block-interchange distances of permutations as well as reversal distances of signed permutations. The essential idea is associating a plane permutation to a given permutation or signed permutation to sort, and then applying the developed plane permutation theory.
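The connection between maps and permutations behind the entry above can be made concrete: a map can be encoded by a vertex-rotation permutation sigma on half-edges together with a fixed-point-free edge involution alpha, with faces recovered as the cycles of the composite permutation, and the genus following from Euler's formula V - E + F = 2 - 2g. The encoding conventions and examples below are our own illustration, not the dissertation's plane-permutation formalism:

```python
# Genus of a map from its permutation encoding on half-edges.
def cycles(perm):
    """Count the cycles of a permutation given as a dict."""
    seen, count = set(), 0
    for start in perm:
        if start not in seen:
            count += 1
            x = start
            while x not in seen:
                seen.add(x)
                x = perm[x]
    return count

def genus(sigma, alpha):
    V = cycles(sigma)                              # vertices = sigma-cycles
    E = len(alpha) // 2                            # each edge = two half-edges
    F = cycles({h: sigma[alpha[h]] for h in sigma})  # faces = cycles of sigma∘alpha
    return (2 - (V - E + F)) // 2                  # Euler: V - E + F = 2 - 2g

# A single loop drawn in the plane: one vertex, one edge, two faces.
sigma = {0: 1, 1: 0}
alpha = {0: 1, 1: 0}
g_plane = genus(sigma, alpha)

# One vertex with rotation (0 1 2 3) and edges pairing opposite half-edges:
# the standard one-vertex, two-edge embedding on the torus.
sigma2 = {0: 1, 1: 2, 2: 3, 3: 0}
alpha2 = {0: 2, 2: 0, 1: 3, 3: 1}
g_torus = genus(sigma2, alpha2)
```

Changing only the rotation sigma while keeping alpha fixed changes the embedding (and possibly the genus) of the same underlying graph, which is the kind of local change the abstract's interpolation results concern.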
- Register Transfer Level Simulation Acceleration via Hardware/Software Process Migration. Blumer, Aric David (Virginia Tech, 2007-10-15). The run-time reconfiguration of Field Programmable Gate Arrays (FPGAs) opens new avenues to hardware reuse. Through the use of process migration between hardware and software, an FPGA provides a parallel execution cache. Busy processes can be migrated into hardware-based, parallel processors, and idle processes can be migrated out, increasing the utilization of the hardware. The application of hardware/software process migration to the acceleration of Register Transfer Level (RTL) circuit simulation is developed and analyzed. RTL code can exhibit a form of locality of reference such that executing processes tend to be executed again. This property is termed executive temporal locality, and it can be exploited by migration systems to accelerate RTL simulation. In this dissertation, process migration is first formally modeled using Finite State Machines (FSMs). Programs, processes, migration realms, and the migration of process state within a realm are built upon FSMs. From this model, a taxonomy of migration realms is developed. Second, process migration is applied to the RTL simulation of digital circuits. The canonical form of an RTL process is defined, and transformations of HDL code are justified and demonstrated. These transformations allow a simulator to identify basic active units within the simulation and combine them to balance the load across a set of processors. Through the use of input monitors, executive locality of reference is identified and demonstrated on a set of six RTL designs. Finally, the implementation of a migration system is described which utilizes Virtual Machines (VMs) and Real Machines (RMs) in existing FPGAs. Empirical and algorithmic models are developed from the data collected from the implementation to evaluate the effect of optimizations and migration algorithms.
- Simulating the Spread of Malaria: A Cellular Automaton Based Mathematical Model & A Prototype Software Implementation. Merchant, Farid (Virginia Tech, 2007-02-05). Every year three million deaths are attributed to malaria, of which one-third are of children. Malaria is a vector-borne disease, where a mosquito acts as the vector that transmits the disease. In the last few years, computer simulation based models have been used effectively to study the vector population dynamics and control strategies of vector-borne diseases. Typically, these models use ordinary differential equations to simulate the spread of malaria. Although these models provide a powerful mechanism to study the spread of malaria, they have several shortcomings. The research in this thesis focuses on creating a simulation model based on the framework of cellular automata, which addresses many shortcomings of previous models. Cellular automata are dynamical systems, which are discrete in time and space. The implementation of the model proposed can easily be integrated with EpiSims/TRANSIMS. EpiSims is an epidemiological modeling tool for studying the spread of infectious diseases; it uses a social contact network from TRANSIMS (A Transport Analysis and Simulation System). Simulation results from the prototype implementation showed qualitatively correct results for vector densities, diffusion and epidemiological curves.
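A minimal cellular-automaton epidemic step, far simpler than the malaria model the entry above describes, still shows the discrete-in-time, discrete-in-space character of the approach. The SIR states, grid, and update rule here are our own illustration, not the thesis model:

```python
# One deterministic step of a toy SIR cellular automaton on a 2D grid:
# a Susceptible cell becomes Infected if any von Neumann neighbor is
# Infected, and an Infected cell Recovers after one time step.
S, I, R = 0, 1, 2

def ca_step(grid):
    rows, cols = len(grid), len(grid[0])
    nxt = [row[:] for row in grid]            # synchronous update: copy first
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == S:
                nbrs = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
                if any(0 <= rr < rows and 0 <= cc < cols and grid[rr][cc] == I
                       for rr, cc in nbrs):
                    nxt[r][c] = I             # infection diffuses to neighbors
            elif grid[r][c] == I:
                nxt[r][c] = R                 # recover after one step
    return nxt

grid = [[S, S, S],
        [S, I, S],
        [S, S, S]]
grid = ca_step(grid)
```

A vector-borne model like the one in the thesis would additionally track mosquito densities per cell and make the infection probabilistic; this sketch keeps only the spatial diffusion mechanism.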
- Stability in Graph Dynamical Systems. Mcnitt, Joseph Andrew (Virginia Tech, 2018-06-20). The underlying mathematical model of many simulation models is graph dynamical systems (GDS). This dynamical system, its implementation, and analyses of each are the focus of this paper. When using a simulation model to answer a research question, it is important to describe the underlying mathematical model in which we are operating for verification and validation. In this paper we discuss analyses commonly used in simulation models. These include sensitivity analyses and uncertainty quantification, which provide motivation for stability and structure-to-function research in GDS. We review various results in these areas, which contribute toward validation and computationally tractable analyses of our simulation model. We then present two new areas of research: stability of transient structure with respect to update order permutations, and an application of GDS in which a time-varying generalized cellular automaton is implemented as a simulation model.
- Stochastic Computer Model Calibration and Uncertainty Quantification. Fadikar, Arindam (Virginia Tech, 2019-07-24). This dissertation presents novel methodologies in the field of stochastic computer model calibration and uncertainty quantification. Simulation models are widely used in studying physical systems, which are often represented by a set of mathematical equations. Inference on the true physical system (unobserved or partially observed) is drawn based on observations from the corresponding computer simulation model. These computer models are calibrated based on limited ground truth observations in order to produce realistic predictions and associated uncertainties. A stochastic computer model differs from a traditional computer model in the sense that repeated execution results in different outcomes from a stochastic simulation. This additional uncertainty in the simulation model must be handled accordingly in any calibration setup. A Gaussian process (GP) emulator replaces the actual computer simulation when it is expensive to run and the budget is limited. However, a traditional GP interpolator models the mean and/or variance of the simulation output as a function of input. For a simulation where the marginal Gaussianity assumption is not appropriate, it does not suffice to emulate only the mean and/or variance. We present two different approaches addressing the non-Gaussian behavior of an emulator, by (1) incorporating quantile regression in GP for multivariate output, (2) approximating using a finite mixture of Gaussians. These emulators are also used to calibrate and make forward predictions in the context of an agent-based disease model which models the Ebola epidemic outbreak in 2014 in West Africa. The third approach employs a sequential scheme which periodically updates the uncertainty in the computer model input as data becomes available in an online fashion.
Unlike other two methods which use an emulator in place of the actual simulation, the sequential approach relies on repeated run of the actual, potentially expensive simulation.
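Why mean-and-variance emulation can fail for a stochastic simulator is easy to see with a toy example (the simulator below is an invented stand-in, not the dissertation's epidemic model): an output that is bimodal, such as an epidemic that either dies out or takes off, is badly summarized by its mean and variance alone.

```python
# Toy illustration only: this "simulator" is an assumption made up for the
# sketch, not the agent-based Ebola model from the dissertation.
import random
import statistics

def stochastic_simulator(rng):
    """Repeated runs give different outcomes: two well-separated modes."""
    mode = 1.0 if rng.random() < 0.5 else -1.0
    return rng.gauss(mode, 0.1)

rng = random.Random(0)
replicates = [stochastic_simulator(rng) for _ in range(500)]

mean = statistics.fmean(replicates)
stdev = statistics.stdev(replicates)

# The mean sits between the two modes, a value the simulator almost
# never actually produces:
near_mean = sum(abs(y - mean) < 0.5 for y in replicates) / len(replicates)
print(f"mean={mean:.2f} stdev={stdev:.2f} fraction near mean={near_mean:.2f}")
```

A mean/variance summary reports a typical output near 0 with large spread, while almost no run lands there. This is the motivation for the two remedies in the abstract: quantile regression within the GP, or a finite mixture of Gaussians, either of which can represent both modes.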
- Systems Uncertainty in Systems Biology & Gene Function PredictionFalin, Lee J. (Virginia Tech, 2011-03-22)The widespread use of high-throughput experimental assays designed to measure the entire complement of a cell's genes or gene products has led to vast stores of data which are extremely plentiful in terms of the number of items they can measure in a single sample, yet often sparse in the number of samples per experiment due to their high cost. This often leads to datasets where the number of treatment levels or time points sampled is limited, or where there are very small numbers of technical and/or biological replicates. If the goal is to use these data to infer network models, such sparse datasets can lead to under-determined systems. While model parameter variation and its effects on model robustness have been well studied, most of this work has accounted only for variation arising from measurement error. In contrast, little work has been done to isolate and quantify the parameter variation caused by uncertainty in the unmeasured regions of time course experiments. Here we introduce a novel algorithm to quantify the uncertainty in the unmeasured intervals between biological measurements taken across a set of quantitative treatments. The algorithm provides a probabilistic distribution of possible gene expression values within unmeasured intervals, based on a plausible biological constraint. We show how quantification of this uncertainty can be used to guide researchers in further data collection by identifying which samples would likely add the most information to the system under study. We also present an application of this method to isolate and quantify two distinct sources of model parameter variation. In the concluding chapter we discuss another source of uncertainty in systems biology, namely gene function prediction, and compare several algorithms designed for that purpose.
- Towards a calculus of biological networksReidys, Christian Michael; Mortveit, Henning S. (2002)In this paper we present a new framework for studying the dynamics of biological networks. A specific class of dynamical systems, Sequential Dynamical Systems (SDS), is introduced. These systems allow one to investigate the interplay between structural properties of the network and its phase space. We will show in detail how to find a reduced system that captures key features of a given system. This reduction is based on a special graph-theoretic relation between the two networks. We will study the reduction of SDS over n-cubes in detail and present several examples.
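The interplay between network structure and phase space that SDS capture can be sketched in a few lines. Every concrete choice below (a 4-cycle graph, the NOR local function, a fixed update order) is an illustrative assumption, not an example from the paper: we apply the local function at each vertex in order, then iterate the resulting system map to trace one phase-space orbit.

```python
# Minimal hypothetical SDS sketch: Boolean states on the vertices of a 4-cycle,
# the NOR local function at every vertex, applied sequentially in a fixed
# order. All specific choices are illustrative, not taken from the paper.

def nor(values):
    """Local NOR rule: returns 1 exactly when every input is 0."""
    return 1 if not any(values) else 0

def sds_map(adjacency, state, order):
    """One pass of the SDS map: update each vertex in `order`, sequentially,
    so later vertices see the already-updated states of earlier ones."""
    state = list(state)
    for v in order:
        state[v] = nor([state[v]] + [state[u] for u in adjacency[v]])
    return tuple(state)

# Circle graph on 4 vertices: i is adjacent to i-1 and i+1 (mod 4).
circle4 = {i: [(i - 1) % 4, (i + 1) % 4] for i in range(4)}
order = (0, 1, 2, 3)

# Trace the phase-space orbit of one initial state until a state repeats.
state, orbit = (0, 0, 0, 0), []
while state not in orbit:
    orbit.append(state)
    state = sds_map(circle4, state, order)
print(len(orbit), "states visited before re-entering at", state)
```

Here the orbit returns to its starting state after visiting 7 of the 16 possible states, i.e. the initial state lies on a periodic cycle of the phase space. Studying how such orbits change when the graph, the local functions, or the update order change is exactly the structure-to-dynamics question the SDS framework is built for.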
- Towards Support of Visual Analytics for Synthetic InformationAgashe, Aditya Vidyanand (Virginia Tech, 2015-09-15)This thesis describes a scalable system for visualizing and exploring global synthetic populations. The implementation described in this thesis addresses the following limitations of the existing Synthetic Information Viewer (SIV): (i) it adds the ability to support synthetic populations for the entire globe by resolving data inconsistencies, (ii) it introduces opportunities to explore and find patterns in the data, and (iii) it allows the addition of new synthetic population centers with minimal effort. We propose the following extensions to the system: (i) a Data Registry, an abstraction layer for handling the heterogeneity of data across countries and for adding new population centers for visualization, and (ii) a Visual Query Interface for exploring and analyzing patterns to gain insights. With these additions, our system is capable of visual exploration and querying of heterogeneous temporal, spatial, and social data for 14 countries with a total population of 830 million. The work in this thesis takes a step toward providing visual analytics capability for synthetic information. This system will assist urban planners, public health analysts, and any individuals interested in socially coupled systems by empowering them to make informed decisions through exploration of synthetic information.