Browsing by Author "Sargent, Robert G."
- Analysis of Future Event Set Algorithms for Discrete Event Simulation
  McCormack, William M.; Sargent, Robert G. (Department of Computer Science, Virginia Polytechnic Institute & State University, 1980)
  This work reports new analytical and empirical results on the performance of algorithms for handling the future event set in discrete event simulation. These results provide clear insight into the factors affecting algorithm performance; evaluate the "hold" model, often used to study future event set algorithms; and determine the best algorithm(s) to use.
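  The "hold" model named in the abstract is a standard benchmark for future event set structures: repeatedly remove the earliest event and schedule one new event a random increment into the future, so the set size stays constant. As an editorial illustration only, not the authors' implementation, here is a minimal Python sketch of the hold operation against a binary-heap event set; the set size, hold count, and exponential increments are assumptions made for the example.

  ```python
  import heapq
  import random

  def hold_model(n_events=1000, n_holds=10_000, seed=42):
      """Exercise a heap-based future event set with the classic
      "hold" operation: pop the earliest event, then push a new
      event scheduled a random increment into the future. The
      constant event count mimics many discrete event simulations."""
      rng = random.Random(seed)
      fes = [rng.expovariate(1.0) for _ in range(n_events)]  # initial event times
      heapq.heapify(fes)
      for _ in range(n_holds):
          now = heapq.heappop(fes)                           # next-event extraction
          heapq.heappush(fes, now + rng.expovariate(1.0))    # schedule a successor
      return heapq.heappop(fes)  # earliest remaining event time
  ```

  Swapping the heap for another priority-queue structure (a sorted linked list, say, or a calendar queue) while keeping the same hold loop is how such algorithms are typically compared.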
- A Methodology for Validating Multivariate Response Simulation Models by Using Simultaneous Confidence Intervals
  Balci, Osman; Sargent, Robert G. (Department of Computer Science, Virginia Polytechnic Institute & State University, 1981)
  This paper deals with substantiating that a multivariate response self- or trace-driven simulation model, within its domain of applicability, possesses a satisfactory range of accuracy consistent with the intended application of the model. A methodology using simultaneous confidence intervals is developed to perform this substantiation with respect to the mean behavior of a simulation model that represents an observable system. A trade-off analysis can be performed and judgement decisions can be made as to what data collection budget to allocate, what data collection method to use, how many observations to collect on each of the model and system response variables, and what confidence level to choose for producing the range of accuracy with satisfactory lengths. The methodology is illustrated for self-driven steady-state and trace-driven terminating simulations.
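  The abstract does not pin down a particular construction for the simultaneous intervals, so the sketch below is a hedged illustration using a Bonferroni split of the overall confidence level across the k response variables, applied to paired model-minus-system differences; the function name, array shapes, and the Bonferroni choice are assumptions for the example, not necessarily the paper's methodology.

  ```python
  import numpy as np
  from scipy import stats

  def simultaneous_cis(model_obs, system_obs, overall_conf=0.90):
      """Bonferroni simultaneous confidence intervals for the mean
      differences between model and system responses.
      model_obs, system_obs: arrays of shape (n, k) holding n paired
      observations on k response variables."""
      diffs = np.asarray(model_obs) - np.asarray(system_obs)
      n, k = diffs.shape
      per_var_alpha = (1.0 - overall_conf) / k        # Bonferroni split across k variables
      t_crit = stats.t.ppf(1.0 - per_var_alpha / 2.0, df=n - 1)
      means = diffs.mean(axis=0)
      half = t_crit * diffs.std(axis=0, ddof=1) / np.sqrt(n)
      return means - half, means + half               # per-response lower, upper bounds
  ```

  An interval that excludes zero for some response variable flags a model/system discrepancy in that variable's mean, while the joint coverage of all k intervals is at least overall_conf.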
- Validation of Multivariate Response Simulation Models by Using Hotelling's Two-sample T^2 Test
  Balci, Osman; Sargent, Robert G. (Department of Computer Science, Virginia Polytechnic Institute & State University, 1981)
  A procedure is developed by using Hotelling's two-sample T^2 test to test the validity of a multivariate response simulation model that represents an observable system. The validity of the simulation model is tested with respect to the mean behavior under a given experimental frame. A trade-off analysis can be performed and judgement decisions can be made as to what data collection budget to allocate, what data collection method to use, how many observations to collect on each of the model and system response variables, and what model builder's risk to choose for testing the validity under a satisfactory model user's risk. The procedure for validation is illustrated for a simulation model that represents an M/M/1 queueing system with two performance measures of interest.
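  Hotelling's two-sample T^2 test itself is textbook material: it compares the mean vectors of two independent multivariate samples. As an illustrative sketch (the function name and array shapes are assumptions), the statistic and its exact F scaling can be computed as follows.

  ```python
  import numpy as np
  from scipy import stats

  def hotelling_two_sample_t2(x, y):
      """Hotelling's two-sample T^2 test for equality of mean vectors.
      x: (n1, k) system observations; y: (n2, k) model observations."""
      x, y = np.asarray(x, float), np.asarray(y, float)
      n1, k = x.shape
      n2 = y.shape[0]
      d = x.mean(axis=0) - y.mean(axis=0)
      # Pooled covariance estimate under a common-covariance assumption.
      s_pooled = ((n1 - 1) * np.cov(x, rowvar=False)
                  + (n2 - 1) * np.cov(y, rowvar=False)) / (n1 + n2 - 2)
      t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(s_pooled, d)
      # Exact scaling to an F statistic with (k, n1 + n2 - k - 1) degrees of freedom.
      f_stat = t2 * (n1 + n2 - k - 1) / (k * (n1 + n2 - 2))
      p_value = stats.f.sf(f_stat, k, n1 + n2 - k - 1)
      return t2, p_value
  ```

  Rejecting the null hypothesis of equal means corresponds to declaring the model invalid for the tested experimental frame; the significance level chosen is the model builder's risk the abstract refers to.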
- Validation of Multivariate Response Trace-driven Simulation Models
  Balci, Osman; Sargent, Robert G. (Department of Computer Science, Virginia Polytechnic Institute & State University, 1982)
  A procedure is developed by using Hotelling's one-sample T^2 test to test the validity of a multivariate response trace-driven simulation model that represents an observable system. The validity of the simulation model is tested with respect to the mean behavior under a given experimental frame. A procedure for cost-risk analysis for the one-sample T^2 test is developed. By using this procedure, a trade-off analysis can be performed and judgement decisions can be made as to what data collection budget to allocate, what data collection method to use, how many paired observations to collect on the model and system response variables, and what model builder's risk to choose for testing the validity under a satisfactory model user's risk. The procedure for validation and the cost-risk analysis are illustrated for a trace-driven simulation model that represents a time-sharing computer system with two performance measures of interest.
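  In the trace-driven case, the model and the system are fed the same input trace, so the observations pair naturally and the one-sample T^2 test applies to the vector of paired differences, testing whether its mean is zero. A minimal sketch of that textbook statistic follows; the function name and array shapes are assumptions for the example.

  ```python
  import numpy as np
  from scipy import stats

  def hotelling_paired_t2(model_obs, system_obs):
      """Hotelling's one-sample T^2 test on paired model/system
      differences, as arises when both are driven by the same trace.
      model_obs, system_obs: arrays of shape (n, k)."""
      d = np.asarray(model_obs, float) - np.asarray(system_obs, float)
      n, k = d.shape
      d_bar = d.mean(axis=0)
      s = np.cov(d, rowvar=False)                   # sample covariance of the differences
      t2 = n * d_bar @ np.linalg.solve(s, d_bar)    # tests H0: mean difference = 0
      f_stat = t2 * (n - k) / (k * (n - 1))         # exact F scaling, (k, n - k) d.f.
      p_value = stats.f.sf(f_stat, k, n - k)
      return t2, p_value
  ```

  The cost-risk analysis the abstract describes would then trade the number of paired observations n and the data collection budget against the achievable model builder's and model user's risks.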
- Verification and Validation: What Impact Should Project Size and Complexity Have on Attendant V&V Activities and Supporting Infrastructure?
  Arthur, James D.; Sargent, Robert G.; Dabney, James B.; Law, Averill M.; Morrison, John D. (Jack) (Department of Computer Science, Virginia Polytechnic Institute & State University, 1999-11-01)
  The size and complexity of Modeling and Simulation (M&S) applications continue to grow at a significant rate. The focus of this panel is to examine the impact that such growth should have on attendant Verification and Validation (V&V) activities. Two prominent considerations guiding the panel discussion are: (1) extending the current M&S development objectives to include quality characteristics such as maintainability, reliability, and reusability (the current modus operandi focuses primarily on correctness), and (2) recognizing the necessity and benefits of tailoring V&V activities commensurate with the size of the project, i.e., one size does not fit all. In this paper we provide six questions and four sets of responses to those questions. These questions and responses are intended to foster additional thought and discussion on topics crucial to the synthesis of quality M&S applications.