Browsing by Author "Nance, Richard E."
Now showing 1 - 20 of 108
- The Abstraction Refinement Model and the Modification-Cost Problem
  Keller, Benjamin J.; Nance, Richard E. (Department of Computer Science, Virginia Polytechnic Institute & State University, 1992)
  A problem common to systems and software engineering is that of estimating the cost of making changes to a system. For system modifications that include changes to the design history of the system, this is the "modification-cost" problem. A solution to this problem is important to planning changes in large systems engineering projects. In this paper, a cost model based on the Abstraction Refinement Model (ARM) is proposed as a framework for deriving solutions to the modification-cost problem. The ARM is a characterization of software evolution that is also applicable to general systems. Modifications to systems and their design histories are described using the components of the ARM. The cost model is defined by functions on the ARM components. The derived solution is given by an abstract expression of the cost functions.
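As orientation for the entry above, the kind of cost expression such a model yields can be sketched abstractly; the notation here is ours, not the paper's.

```latex
% Hypothetical sketch (our notation, not the paper's): a modification m
% realized as a sequence of transformations t_1, ..., t_n applied over
% the design history has total cost
\[
  C(m) \;=\; \sum_{i=1}^{n} c(t_i),
\]
% and cost-aware planning would seek the cheapest realization of m:
\[
  C^{*}(m) \;=\; \min_{\langle t_1, \dots, t_n \rangle \,\vdash\, m} \; \sum_{i=1}^{n} c(t_i).
\]
```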
- Acquisition of an Interactive Computing System for Academic Use: A Case Study
  Heafner, John F.; Nance, Richard E. (Department of Computer Science, Virginia Polytechnic Institute & State University, 1981)
  The acquisition of a large-scale computer system is a complex and important task that universities face periodically. The large capital expenditures and the ever-expanding need for computing services ensure its increasing importance. Virginia Tech recently made such an acquisition. This paper describes the evaluation procedures leading to the acquisition, beginning with needs assessment and concluding with system selection. The acquisition of a computing system, in this instance a system primarily for interactive instructional support of undergraduates, is a decision that is subject to a variety of influences: technical, managerial, political, and personal. This paper describes the authors' attempts (as then Associate Director of the Computing Center and then Head of the Computer Science Department, respectively) to deal with these influences through the use of quantitative techniques, behavioral analysis, and common sense.
- Actor systems platform: design and implementation of the actor paradigm in a distributed object-oriented environment
  Joshi, Nandan (Virginia Tech, 1993-08-05)
  This project was undertaken as part of an effort to explore the design of object-oriented systems that are distributed, concurrent, real-time and/or embedded in nature. This work seeks to integrate the concurrency features of the actor model in a distributed, object-oriented environment, ESP. The integrated system, called the Actor Systems Platform (ASP), provides a platform for designing concurrent, distributed applications. The actor model provides a mechanism for expressing the inherent concurrency in an application. The concurrency in the application can be exploited by the distributed features available in ESP. The actor abstraction in ASP is provided by an application-level class hierarchy in ESP. The message passing semantics of the actor model are implemented by using special operator overloading in C++. Cboxes are implemented to provide a synchronization mechanism and a means of returning replies. In a concurrent system, simultaneous execution of an object's methods can cause its state to be inconsistent. This is prevented by providing a method locking mechanism using behavior sets. While integrating the concurrency features of the actor model in an object-oriented environment, differences were encountered between the invocation semantics of the actor model and those of inherited methods. The problem is investigated and a taxonomy of solutions is presented.
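A minimal sketch of the two mechanisms the abstract names, assuming nothing about the real ASP code: an overloaded C++ operator standing in for the asynchronous send, and a Cbox-like wrapper holding the eventual reply (here built on std::future; the names CBox and claim are ours).

```cpp
// Sketch only (not the ASP source): actor message sends expressed via
// operator overloading, with a Cbox-style placeholder for the reply.
#include <future>
#include <iostream>
#include <string>

// A Cbox-style reply holder: the sender continues and claims the value later.
template <typename T>
class CBox {
public:
    explicit CBox(std::future<T> f) : fut_(std::move(f)) {}
    T claim() { return fut_.get(); }  // blocks until the reply arrives
private:
    std::future<T> fut_;
};

struct Message { std::string selector; int argument; };

class Actor {
public:
    // Overloaded operator<< models the asynchronous "send": the message is
    // handled on another thread and a CBox carries back the reply.
    CBox<int> operator<<(const Message& m) {
        return CBox<int>(std::async(std::launch::async, [m] {
            return m.selector == "square" ? m.argument * m.argument : 0;
        }));
    }
};

int main() {
    Actor a;
    CBox<int> reply = a << Message{"square", 7};  // asynchronous send
    std::cout << reply.claim() << "\n";           // later: claim reply (49)
}
```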
- An Adaptive Time Window Algorithm for Large Scale Network Emulation
  Kodukula, Surya Ravikiran (Virginia Tech, 2002-01-25)
  With the continuing growth of the Internet and network protocols, there is a need for Protocol Development Environments. Simulation environments like ns and OPNET require protocol code to be rewritten in a discrete event model. Direct Code Execution Environments (DCEE) solve the verification and validation problems by supporting the execution of unmodified protocol code in a controlled environment. The Open Network Emulator (ONE) is a system supporting Direct Code Execution in a parallel environment, allowing unmodified protocol code to run on top of a parallel simulation layer capable of simulating complex network topologies. Traditional approaches to the problem of Parallel Discrete Event Simulation (PDES) broadly fall into two categories. Conservative approaches allow processing of events only after it has been asserted that the event handling would not result in a causality error. Optimistic approaches allow for causality errors and support means of restoring state, i.e., rollback. All standard approaches to the problem of PDES are either flawed by their assumption of existing event patterns in the system or cannot be applied to ONE due to their restricted analysis of simplified models like queues and Petri nets. The Adaptive Time Window algorithm is a bounded optimistic parallel simulation algorithm with the capability to change the degree of optimism with changes in the degree of causality in the network. The optimism at any instant is bounded by an amount of virtual time called the time window. The algorithm assumes efficient rollback capabilities supported by the 'Weaves' framework. The algorithm is reactive and responds to changes in the degree of causality in the system by adjusting the length of its time window. With sufficient history gathered, the algorithm adjusts to increasing causality in the system with a small time window (conservative approach) and increases to a higher value (optimistic approach) during idle periods. The problem of splitting the entire simulation run into time windows of arbitrary length such that the total number of rollbacks in the system is minimal is NP-complete. The Adaptive Time Window algorithm is compared against offline greedy approaches to this NP-complete problem, called Oracle Computations. The total number of rollbacks in the system and the total execution time for the Adaptive Time Window algorithm were comparable to those for the Oracle Computations.
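The adaptive control at the heart of such an algorithm can be suggested with a short sketch; the constants, names, and halving/growth policy below are ours, not the thesis's.

```cpp
// Sketch of the adaptive idea only: events are processed optimistically up
// to gvt + window; the window shrinks when rollbacks indicate rising
// causality and grows again when execution runs cleanly.
#include <algorithm>

struct WindowController {
    double window     = 100.0;   // current time window (virtual time)
    double min_window = 1.0;
    double max_window = 1000.0;

    // Called after each window of execution with the rollbacks observed.
    void adapt(int rollbacks) {
        if (rollbacks > 0)
            window = std::max(min_window, window / 2.0);  // be conservative
        else
            window = std::min(max_window, window * 1.5);  // be optimistic
    }

    // Events with timestamps below this bound may execute optimistically.
    double horizon(double gvt) const { return gvt + window; }
};
```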
- An algebraic model of software evolution
  Keller, Benjamin J. (Virginia Tech, 1990-09-05)
  A model of the software evolution process, called the Abstraction Refinement Model, is described which builds on the algebraic influence of the Laws of Programming and the transformational Draco Paradigm. The result is an algebraic structure consisting of the states of the software product (system descriptions) ordered by a relation of relative correctness, with transformations defined between the system descriptions. This structure is interpreted as the software evolution space, with the intended semantics given by a model combining axiomatic semantics and the Lindenbaum algebra of a first-order logic. Using this interpretation, software evolution can be represented as a sequence of transformations on system descriptions. The primary contributions of the characterization of software evolution are to the understanding of maintenance and its relationship to development. The influence of development on maintenance is shown as the transfer of a "descriptive context" for the software system. This context is used as an information source during maintenance and is progressively modified by maintenance activities. These activities are characterized by balanced forward and reverse transformations. The use of reverse transformations explains the role of reverse engineering in maintenance for information gathering and document reconstruction. Additionally, the form of maintenance affects the performance of the activity, with adaptive maintenance differing from corrective, perfective, and preventive maintenance. These factors contribute to the descriptive nature and utility of the Abstraction Refinement Model in defining methodologies for maintenance.
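In outline, and in our notation rather than the thesis's, the structure being described looks like this:

```latex
% Sketch (our notation): system descriptions D ordered by relative
% correctness, with transformations as arrows between descriptions.
\[
  t : d \to d', \qquad d \sqsubseteq d', \qquad d, d' \in D
\]
% An evolution history is then a composite of forward (refining) and
% reverse (abstracting) transformations:
\[
  d_0 \xrightarrow{\;t_1\;} d_1 \xrightarrow{\;t_2\;} \cdots \xrightarrow{\;t_n\;} d_n.
\]
```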
- Analysis and Evaluation of Methods for Activities in the Expanded Requirements Generation Model (x-RGM)
  Lobo, Lester Oscar (Virginia Tech, 2004-07-26)
  In recent years, the requirements engineering community has proposed a number of models for the generation of a well-formulated, complete set of requirements. However, these models are often highly abstract or narrowly focused, providing only pieces of structure and parts of guidance to the requirements generation process. Furthermore, many of the models fail to identify methods that can be employed to achieve the activity objectives. As a consequence of these problems, the requirements engineer lacks the necessary guidance to effectively apply the requirements generation process, resulting in the production of an inadequate set of requirements. To address these concerns, we propose the expanded Requirements Generation Model (x-RGM), which consists of activities at a more appropriate level of abstraction. This decomposition of the model ensures that the requirements engineer has a clear understanding of the activities involved in the requirements generation process. In addition, the objectives of all the activities defined by the x-RGM are identified and explicitly stated so that no assumptions are made about the goals of the activities involved in the generation of requirements. We also identify sets of methods that can be used during each activity to effectively achieve its objectives. The mapping of methods to activities guides the requirements engineer in selecting the appropriate techniques for a particular activity in the requirements engineering process. Furthermore, we prescribe small subsets of methods for each activity based on commonly used selection criteria such that the chosen criterion is optimized. This list of methods is created with the intention of simplifying the task of choosing methods for the activities defined by the x-RGM that best meet the selection criterion goal.
- Analysis of networks with dynamic topologies
  Moose, Robert Lewis (Virginia Polytechnic Institute and State University, 1987)
  Dynamic hierarchical networks represent an architectural strategy for employing adaptive behavior in applications sensitive to highly variable external demands or uncertain internal conditions. The characteristics of such architectures are described, and the significance of adaptive capability is discussed. The necessity for assessing cost/benefit tradeoffs leads to the use of queueing network models. The general model, a network of M/M/1 queues in a random environment, is introduced and then is simplified so that the links may be treated as isolated M/M/1 queues in a random environment. This treatment yields a formula for approximate mean network delay by combining matrix-geometric results (mean queue length and mean delay) for the individual links. Conditions under which the analytic model is considered valid are identified through comparison with a discrete event simulation model. Last, performance of the dynamic hierarchy is compared with that of the static hierarchy. This comparison establishes conditions for which the dynamic architecture enables performance equal or nearly equal to performance of the static architecture.
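For orientation, the classical single-queue results that the link-level model generalizes are standard; the matrix-geometric treatment in the thesis extends them to arrival and service rates modulated by a random environment.

```latex
% Standard M/M/1 background (not the thesis's matrix-geometric results):
% with arrival rate \lambda, service rate \mu, and utilization
% \rho = \lambda/\mu < 1, the mean queue length and mean delay are
\[
  L \;=\; \frac{\rho}{1-\rho}, \qquad
  W \;=\; \frac{1}{\mu - \lambda}.
\]
% In the random-environment model, \lambda and \mu switch with an
% environment process, and matrix-geometric methods supply L and W there.
```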
- Animations and Interactive Material for Improving the Effectiveness of Learning the Fundamentals of Computer Science
  Gilley, William (Virginia Tech, 2001-05-08)
  Due to the rapid proliferation of the World Wide Web (WWW) in recent years, many educators are now seeking to improve the effectiveness of their instruction by providing interactive, web-based course material to their students. The purpose of this thesis is to document a set of eight online learning modules created to improve the effectiveness of learning the fundamentals of Computer Science. The modules are as follows:
  • Algorithms - Definition and specification of algorithms, with a comparison and analysis of several sorting algorithms as examples.
  • Artificial Intelligence - Overview of current applications in this discipline.
  • Data Structures - Explanation of basic data structures, including an introduction to computer memory and pointers, and a comparison of logical and physical representations of commonly used data structures.
  • Machine Architecture - Explanation of data storage, gates and circuits, and the central processing unit.
  • Number Systems - Discussion of number representation and arithmetic in number systems other than the decimal number system, with a focus on binary numbers and binary arithmetic.
  • Operating Systems - Explanation of the purpose of operating systems and the major components that make up an operating system.
  • Programming Languages - Explanation of the fundamental concepts in procedural programming languages.
  • Software Engineering - Introduction to software life cycle models and an overview of the procedural and object-oriented paradigms.
  Each module consists of a set of lessons and review questions written in HyperText Markup Language (HTML). Embedded in these pages are various interactive components implemented as Flash animations or Java applets. The modules currently reside on the Computer Science courseware server of Virginia Polytechnic Institute and State University (Virginia Tech) and can be viewed at the following WWW site: http://courses.cs.vt.edu/csonline/
- Application of the Analytic Hierarchy Process to Complex System Design Evaluation
  Talbert, Michael L.; Balci, Osman; Nance, Richard E. (Department of Computer Science, Virginia Polytechnic Institute & State University, 1994-06-01)
  This paper examines the use of the Analytic Hierarchy Process (AHP) to weight indicators used in the evaluation of complex system designs that involve software, hardware, and humanware. Since such a comprehensive evaluation can easily include hundreds of system quality indicators, evaluators need a technique to ensure the identification and emphasis of salient indicators in the determination of the quality of the design. The AHP is a popular technique for determining relative worth among a set of elements. In the present work, we introduce the AHP with a simple example, then illustrate the application of the AHP to design evaluation using a subset of indicators from the human component of a system. We note in some detail issues which require added attention when applying the AHP to this domain. The issues include indicator selection, dealing with large numbers of indicators, incorporating group judgements, and conflict resolution. We found the AHP to be an effective tool for assigning criticality weights in indicator-based design evaluation, and propose elements of an environment in which the use of the AHP is easily incorporated.
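To indicate how the AHP produces weights from pairwise judgements, here is a small sketch using the common geometric-mean approximation of the principal eigenvector; the matrix and indicator count are illustrative, not taken from the paper.

```cpp
// Geometric-mean approximation of AHP priority weights from a reciprocal
// pairwise-comparison matrix (example values are ours, for illustration).
#include <cmath>
#include <iostream>
#include <vector>

std::vector<double> ahpWeights(const std::vector<std::vector<double>>& a) {
    const size_t n = a.size();
    std::vector<double> w(n);
    double total = 0.0;
    for (size_t i = 0; i < n; ++i) {
        double prod = 1.0;
        for (size_t j = 0; j < n; ++j) prod *= a[i][j];
        w[i] = std::pow(prod, 1.0 / n);  // geometric mean of row i
        total += w[i];
    }
    for (double& x : w) x /= total;      // normalize weights to sum to 1
    return w;
}

int main() {
    // Reciprocal matrix: indicator 0 judged 3x as important as 1, 5x as 2.
    std::vector<std::vector<double>> a = {
        {1.0,       3.0, 5.0},
        {1.0 / 3.0, 1.0, 2.0},
        {1.0 / 5.0, 0.5, 1.0}};
    for (double w : ahpWeights(a)) std::cout << w << "\n";
}
```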
- Applying software maintenance metrics in the object-oriented software development life cycle
  Li, Wei (Virginia Tech, 1992-09-05)
  Software complexity metrics have been studied in the procedural paradigm as a quantitative means of assessing the software development process as well as the quality of software products. Several studies have validated that various metrics are useful indicators of maintenance effort in the procedural paradigm. However, software complexity metrics have rarely been studied in the object-oriented paradigm. Very few complexity metrics have been proposed to measure object-oriented systems, and the proposed ones have not been validated. This research concentrates on several object-oriented software complexity metrics and the validation of these metrics with maintenance effort in two commercial systems. The results of an empirical study of the maintenance activities in the two commercial systems are also described. A metric instrumentation in an object-oriented software development framework is presented.
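Purely to suggest the flavor of such metrics, the toy score below combines class size, inheritance depth, and outbound coupling; the specific metrics validated in the thesis are not reproduced here, and this weighting is invented.

```cpp
// Toy illustration of object-oriented complexity measurement: more local
// methods, deeper inheritance, and heavier coupling all suggest higher
// maintenance effort. The weights and score are ours, not the thesis's.
#include <iostream>
#include <string>
#include <vector>

struct ClassSummary {
    std::string name;
    int methods;          // number of local methods
    int depth;            // depth in the inheritance tree
    int messagesSentOut;  // messages sent to other classes (coupling)
};

double complexityScore(const ClassSummary& c) {
    return c.methods + 2.0 * c.depth + 0.5 * c.messagesSentOut;
}

int main() {
    std::vector<ClassSummary> classes = {
        {"Parser", 24, 2, 40}, {"Token", 6, 1, 3}};
    for (const auto& c : classes)
        std::cout << c.name << ": " << complexityScore(c) << "\n";
}
```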
- Assessing software quality in Ada based products with the objectives, principles, attributes framework
  Bundy, Gary Neal (Virginia Tech, 1990-09-08)
  This thesis describes the results of a research effort focusing on the validation of a procedure for assessing the quality of an Ada-based product. Starting with the identification of crucial Ada constructs, this thesis outlines a seven-step process for defining metrics that support software quality assessment within a framework based on linkages among software engineering objectives, principles, and attributes. The thesis presents the impact of the use of crucial Ada constructs on the software engineering attributes and describes measurement approaches for assessing that impact. This thesis also outlines a planned research effort to develop an automated analyzer for the assessment of software quality in Ada-based products, and plans for validating the assessment procedure.
- The automated assessment of computer software documentation quality using the objectives/principles/attributes framework
  Dorsey, Edward Vernon (Virginia Tech, 1992-10-15)
  Since humans first put pen to paper, people have critically assessed written work; thus, the assessment of documents per se is not new. Only recently, however, has the issue of formalized document quality assessment become feasible. Enabled by the rapid progress in computing technology, the prospect of an automated, formalized system of quality assessment, based on the presence of certain attributes deemed essential to the quality of a document, is feasible. The existing Objectives/Principles/Attributes Framework, previously applied to code assessment, is modified to allow application to documentation quality assessment. An automated procedure for the assessment of software documentation quality and the development of a prototype documentation analyzer are described. A major shortcoming of the many quality metrics that are proposed in computer science is their lack of empirical validation. In pursuit of such necessary validation for the measures proposed within this thesis, a study is performed to determine the agreement of the measures rendered by Docalyze with those of human evaluators. This thesis demonstrates the applicability of a quality assessment framework to the documentation component of a software product. Further, the validity of a subset of the proposed metrics is demonstrated.
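The attribute-presence idea behind such an analyzer can be suggested in a few lines; the attributes, weights, and scoring rule here are invented for illustration and are not Docalyze's.

```cpp
// Sketch of attribute-based documentation scoring: each attribute deemed
// essential to quality gets a weight, the analyzer detects presence (1)
// or absence (0), and the weighted sum is the quality score.
#include <iostream>
#include <map>
#include <string>

int main() {
    std::map<std::string, double> weight = {
        {"purpose stated", 0.3}, {"terms defined", 0.3},
        {"sections cross-referenced", 0.2}, {"revision history", 0.2}};
    std::map<std::string, int> present = {
        {"purpose stated", 1}, {"terms defined", 1},
        {"sections cross-referenced", 0}, {"revision history", 1}};
    double score = 0.0;
    for (const auto& [attr, w] : weight) score += w * present[attr];
    std::cout << "quality score: " << score << "\n";  // prints 0.8
}
```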
- Capturing Requirements Meeting Customer Intent: A Methodological Approach
  Gröner, Markus K. (Virginia Tech, 2002-05-10)
  Product quality is directly related to how well that product meets the customer's needs and intents. It is paramount, therefore, to capture customer requirements correctly and succinctly. Unfortunately, most development models tend to avoid, or only vaguely define, the process by which requirements are generated. Other models rely on formalistic characterizations that require specialized training to understand. To address such drawbacks we introduce the Requirements Generation Model (RGM) that (a) decomposes the conventional "requirements analysis" phase into sub-phases which focus and refine requirements generation activities, (b) constrains and structures those activities, and (c) incorporates a monitoring methodology to assist in detecting and resolving deviations from process activities defined by the RGM. We present an empirical study of the RGM in an industrial setting, and results derived from this study that substantiate the effectiveness of the RGM in producing a better set of requirements.
- Communications Resource Allocation: Feasibility Assessment for Tactical Networking Applications
  Bernard, Jon Ashley (Virginia Tech, 2004-12-14)
  The research reported here offers a solution to the communications resource allocation problem. Unlike earlier approaches to this problem, we employ a time-sliced event model where messages are sent and received in a single time slice called an epoch. In addition, we also consider networks that contain relay nodes capable only of transferring messages. Consequently, network topologies can be considered where a given node is not directly connected to every other node and must use one or more relay nodes to get a message to some destination. The resulting architectures broaden the networks to be considered and enable the construction of more realistic communication scenarios. In this paper we modify the standard MCNF model by turning our focus to feasibility instead of optimality, in an effort to provide adequate and accurate decision support to communication network planners. Given a network configuration and message requirements, our goal is to determine if the proposed scenario is feasible in terms of the communication resources available. To meet this goal, three algorithms are presented that each solve the extended MCNF problem with varying degrees of accuracy and run-time requirements. Experimental results show that a large number of multi-variable interactions among input parameters play a key role in determining feasibility and predicting expected execution time. Several heuristics are presented that reduce run time dramatically, in some cases by a factor of 37. Each algorithm is tested on a range of inputs and compared to the others. Preliminary results indicate that the second of the three algorithms (APEA) offers the best balance of accuracy vs. execution time. In summary, the solutions presented here solve the resource allocation problem for message delivery in a way that enables evaluation of real-world communication scenarios.
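A crude sketch of the per-epoch feasibility question, with routing already fixed and all names ours; the thesis's algorithms, such as APEA, are considerably more involved.

```cpp
// Per-epoch capacity check: every message scheduled for this epoch must
// fit on its assigned link without exceeding that link's capacity.
// Illustrative only; real feasibility assessment must also choose routes.
#include <iostream>
#include <vector>

struct Message { int link; double size; };  // pre-routed onto one link

bool epochFeasible(std::vector<double> capacity,        // per-link, per epoch
                   const std::vector<Message>& msgs) {  // sent this epoch
    for (const Message& m : msgs) {
        capacity[m.link] -= m.size;  // consume link capacity
        if (capacity[m.link] < 0.0) return false;
    }
    return true;
}

int main() {
    std::vector<double> cap = {10.0, 5.0};
    std::vector<Message> epoch = {{0, 6.0}, {0, 3.0}, {1, 5.0}};
    std::cout << (epochFeasible(cap, epoch) ? "feasible" : "infeasible")
              << "\n";
}
```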
- A Comparison of Discrete Event Simulation Courses Based on a Small Sample Survey
  Nance, Richard E. (Department of Computer Science, Virginia Polytechnic Institute & State University, 1976)
  No abstract available.
- A Comparison of Selected Conceptual Frameworks for Simulation Modeling
  Derrick, Emory Joseph; Balci, Osman; Nance, Richard E. (Department of Computer Science, Virginia Polytechnic Institute & State University, 1989)
  The purpose of this paper is to compare 13 Conceptual Frameworks (CFs) selected from among several categories of applicability to discrete-event simulation modeling. Each CF is briefly reviewed to provide the background information required for the comparison. Based on the insights gained in applying the CFs to the modeling of a complex traffic intersection system, the CFs are compared with respect to their distinct characteristics and capabilities. Comparative comments are grouped according to the design guidance and implementation guidance features of the CFs. Conclusions highlight the inadequacies of the CFs and the importance of research in CF development.
- Conceptual frameworks for discrete event simulation modeling
  Derrick, Emory Joseph (Virginia Tech, 1988-08-05)
  This thesis examines those aspects of simulation with digital computers which concern the use of conceptual frameworks (CFs) for the design and implementation of a model. A literature review of CFs in common use is conducted. These CFs are applied to a complex modeling problem, a traffic intersection system. A comparative review of the CFs is given based upon the lessons learned from the above applications, and a taxonomy is developed. The research clarifies the differences that exist among the myriad of CFs in use today. In particular, the comparative review highlights the significant CF features that are necessary for successful model representation of discrete-event systems. The taxonomy provides a useful and meaningful classification of CFs and produces insights into the conceptual relationships that exist among them. The characteristics of CFs that are desired to enable the development of model specifications that are analyzable, domain independent, and fully translatable are identified. The roles of CFs are better understood and specific potential directions for future research are pinpointed.
- The Conical Methodology: A Framework for Simulation Model Development
  Nance, Richard E. (Department of Computer Science, Virginia Polytechnic Institute & State University, 1987)
  The Conical Methodology, intended for large discrete event simulation modeling, is reviewed from two perspectives. The designer perspective begins with the question: what is a methodology? From an answer to that question is framed an inquiry based on an objectives/principles/attributes linkage that has proved useful in evaluating software development methodologies. The user perspective addresses the role of a methodology vis-à-vis the software utilities (the tools) that comprise the environment. Principles of a methodology form the needs analysis by which the requirements for tool design can be derived. A comparison with software development methodologies and some applications of the Conical Methodology comprise the concluding summary.
- Contemplations of a Simulated Navel
  Nance, Richard E. (Department of Computer Science, Virginia Polytechnic Institute & State University, 1988)
  The Model Development Environment Project has the goal of defining the software utilities and the database support needed for creating, validating, and experimenting with complex simulation models. This project review, emphasizing the needs and explaining some of the guiding concepts and principles, serves to underscore key issues extending beyond discrete event simulation. An introspective summary presents an optimistic reaction to the fear that technically naive modelers might use the more sophisticated capabilities to produce catastrophic results.
- A data analysis software tool for the visual simulation environment
  Tuglu, Ali (Virginia Tech, 1995)
  The objective of the research described herein is to develop a prototype data analysis software tool integrated within the Visual Simulation Environment (VSE). The VSE is an integrated set of software tools that provide computer-aided assistance throughout the development life cycle of visual discrete-event simulation models. Simulation input and output data analyses are commonly needed in simulation studies. A software tool performing such data analysis is required within the VSE to provide automated support for the input data modeling and output data analysis phases of the model development life cycle. The VSE DataAnalyzer provides general statistics, histograms, confidence intervals, and randomness tests for the data sets. It can also create C modules for generating random variates based on a collected set of data. Furthermore, the VSE DataAnalyzer possesses basic file management, editing, printing, and formatting functionalities as well as a complete help feature. It has been used in a senior-level Simulation and Modeling course, and the feedback from the students has been positive. New functionalities can easily be added to the VSE DataAnalyzer due to its object-oriented software structure. The VSE DataAnalyzer is yet another software tool created to provide more comprehensive automated support throughout visual simulation model development.
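One common way such a tool turns a collected data set into a variate generator is to sample the data's empirical distribution; the sketch below illustrates that idea (the class and method names are ours, and the C modules the VSE DataAnalyzer emits may well work differently).

```cpp
// Empirical variate generation: sort the collected observations, then
// sample uniformly among them (the empirical inverse-CDF method).
#include <algorithm>
#include <iostream>
#include <random>
#include <vector>

class EmpiricalVariate {
public:
    explicit EmpiricalVariate(std::vector<double> data)
        : sorted_(std::move(data)) {
        std::sort(sorted_.begin(), sorted_.end());
    }
    double next(std::mt19937& rng) {
        std::uniform_real_distribution<double> u(0.0, 1.0);
        // Index into the sorted sample: the empirical inverse CDF.
        size_t i = static_cast<size_t>(u(rng) * sorted_.size());
        return sorted_[std::min(i, sorted_.size() - 1)];
    }
private:
    std::vector<double> sorted_;
};

int main() {
    std::mt19937 rng(42);
    EmpiricalVariate gen({2.1, 3.5, 3.7, 4.0, 8.2});
    for (int i = 0; i < 3; ++i) std::cout << gen.next(rng) << "\n";
}
```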