Browsing by Author "Perez, Victor M."
Now showing 1 - 4 of 4
- Homotopy methods for constraint relaxation in unilevel reliability based design optimization
Agarwal, Harish; Gano, Shawn E.; Renaud, John E.; Perez, Victor M.; Watson, Layne T. (Department of Computer Science, Virginia Polytechnic Institute & State University, 2007)
Reliability based design optimization is a methodology for finding optimized designs that are characterized by a low probability of failure. The main objective in reliability based design optimization is to minimize a merit function while satisfying the reliability constraints. The reliability constraints are constraints on the probability of failure corresponding to each of the failure modes of the system, or a single constraint on the system probability of failure. The probability of failure is usually estimated by performing a reliability analysis. During the last few years, a variety of different techniques have been developed for reliability based design optimization. Traditionally, these have been formulated as a double-loop (nested) optimization problem. The upper level optimization loop generally involves optimizing a merit function subject to reliability constraints, and the lower level optimization loop(s) compute the probabilities of failure corresponding to the failure mode(s) that govern the system failure. This formulation is, by nature, computationally intensive. A new efficient unilevel formulation for reliability based design optimization was developed by the authors in earlier studies. In this formulation, the lower level optimization (the evaluation of the reliability constraints in the double-loop formulation) was replaced by its corresponding first-order Karush-Kuhn-Tucker (KKT) necessary optimality conditions at the upper level optimization.
It was shown that the unilevel formulation is computationally equivalent to solving the original nested optimization if the lower level optimization is solved by numerically satisfying the KKT conditions (which is typically the case), and the two formulations are mathematically equivalent under constraint qualification and generalized convexity assumptions. In the unilevel formulation, the KKT conditions of the inner optimization for each probabilistic constraint evaluation are imposed at the system level as equality constraints. Most commercial optimizers are numerically unreliable when applied to problems with many equality constraints. In this investigation an optimization framework for reliability based design using the unilevel formulation is developed. Homotopy methods are used for constraint relaxation and to obtain a relaxed feasible design. A series of optimization problems is solved as the relaxed optimization problem is transformed via a homotopy to the original problem. A heuristic scheme is employed in this paper to update the homotopy parameter. The proposed algorithm is illustrated with example problems.
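The homotopy idea in this abstract can be sketched in a few lines. The following is a minimal illustration under stated assumptions, not the authors' formulation: a generic equality constraint h(x) = 0 is embedded in the family h_t(x) = h(x) - (1 - t) h(x0), so the starting design x0 is feasible at t = 0 and the original constraint is recovered at t = 1; each subproblem warm-starts the next. The merit function, constraint, and fixed t-schedule here are illustrative stand-ins (the paper uses a heuristic update for the homotopy parameter).

```python
# Hedged sketch of homotopy constraint relaxation (illustrative problem,
# not the paper's reliability constraints or parameter-update heuristic).
import numpy as np
from scipy.optimize import minimize

def f(x):                       # merit function to minimize
    return x[0]**2 + x[1]**2

def h(x):                       # original equality constraint h(x) = 0
    return x[0] * x[1] - 1.0

x = np.array([2.0, 2.0])        # relaxed feasible starting design
h0 = h(x)                       # residual at the start, absorbed at t = 0

for t in np.linspace(0.0, 1.0, 11):    # fixed schedule driving t -> 1
    cons = {"type": "eq", "fun": lambda x, t=t: h(x) - (1.0 - t) * h0}
    res = minimize(f, x, method="SLSQP", constraints=[cons])
    x = res.x                   # warm-start the next subproblem

print(x, h(x))                  # h(x) is ~0 once t reaches 1
```

Each intermediate subproblem is feasible near the previous solution, which is the practical benefit: the optimizer never faces the full set of hard equality constraints from an infeasible start.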
- NetEdit: A Collaborative Editor
Zaffer, Ali A.; Shaffer, Clifford A.; Ehrich, Roger W.; Perez, Victor M. (Department of Computer Science, Virginia Polytechnic Institute & State University, 2001)
We present a collaborative text editor named NetEdit. NetEdit uses a replicated architecture with processing and data distributed across all clients. Due to replication, the response time for local edits is quite close to that of a single-user editor. Clients do not need explicit awareness of other clients, since all communication is coordinated by a central server. As a result, NetEdit is quite scalable: the number of communication paths required grows linearly with the number of clients, compared to the quadratic growth of purely distributed systems. NetEdit uses an n-way synchronization algorithm derived from the synchronization protocol of the Jupiter collaboration system. Along with describing the editor, its architecture, and its synchronization algorithm, we present the results of a usability study that evaluated the collaboration awareness tools included in NetEdit.
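The linear-vs-quadratic scalability claim comes down to a standard graph-counting fact, sketched below; the function names are illustrative, and the numbers are textbook topology counts rather than measurements from the paper.

```python
# Communication-path counts behind the scalability claim: a star
# topology through a central server needs one path per client, while
# a fully connected peer-to-peer mesh needs one path per client pair.
def star_paths(n_clients):
    return n_clients                         # O(n): each client <-> server

def mesh_paths(n_clients):
    return n_clients * (n_clients - 1) // 2  # O(n^2): every client pair

for n in (2, 8, 32):
    print(n, star_paths(n), mesh_paths(n))   # gap widens quickly with n
```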
- Reduced Sampling for Construction of Quadratic Response Surface Approximations Using Adaptive Experimental Design
Perez, Victor M.; Renaud, John E.; Watson, Layne T. (Department of Computer Science, Virginia Polytechnic Institute & State University, 2007)
The purpose of this paper is to reduce the computational complexity per step from O(n^2) to O(n) for optimization based on quadratic surrogates, where n is the number of design variables. Applying nonlinear optimization strategies directly to complex multidisciplinary systems can be prohibitively expensive when the complexity of the simulation codes is large. Increasingly, response surface approximations, and specifically quadratic approximations, are being integrated with nonlinear optimizers in order to reduce the CPU time required for the optimization of complex multidisciplinary systems. For evaluation by the optimizer, response surface approximations provide a computationally inexpensive, lower fidelity representation of the system performance. The curse of dimensionality is a major drawback in the implementation of these approximations, as the amount of required data grows quadratically with the number n of design variables in the problem. In this paper a novel technique to reduce the magnitude of the sampling from O(n^2) to O(n) is presented. The technique uses prior information to approximate the eigenvectors of the Hessian matrix of the response surface approximation and only requires the eigenvalues to be computed by response surface techniques. The technique is implemented in a sequential approximate optimization algorithm and applied to engineering problems of variable size and characteristics. Results demonstrate that a reduction in the data required per step from O(n^2) to O(n) points can be accomplished without significantly compromising the performance of the optimization algorithm.
A reduction in the time (number of system analyses) required per step from O(n^2) to O(n) is significant, even more so as n increases. The novelty lies in how only O(n) system analyses can be used to approximate a Hessian matrix whose estimation normally requires O(n^2) system analyses.
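The eigenvector-reuse idea above can be sketched in a few lines. This is a hedged illustration of the general principle, not the paper's algorithm: assuming approximate eigenvector directions V are carried over from a previous step, only n directional curvatures (the eigenvalues) need to be sampled, via a second difference along each direction, costing 2n + 1 function evaluations instead of the O(n^2) a full Hessian estimate requires. The helper name and the toy quadratic are illustrative.

```python
# Hedged sketch of the O(n) Hessian idea: sample one curvature per
# assumed eigenvector direction, then reassemble H ~ V diag(lam) V^T.
import numpy as np

def estimate_hessian(f, x, V, h=1e-3):
    """Approximate the Hessian from directional second differences."""
    fx = f(x)
    lam = np.array([(f(x + h * v) - 2.0 * fx + f(x - h * v)) / h**2
                    for v in V.T])      # one curvature per direction
    return V @ np.diag(lam) @ V.T       # 2n + 1 evaluations total

# Toy check on a quadratic f(x) = x^T A x / 2, whose exact Hessian is A;
# here the assumed eigenvectors (the identity columns) happen to be exact.
A = np.diag([1.0, 4.0, 9.0])
f = lambda x: 0.5 * x @ A @ x
H = estimate_hessian(f, np.zeros(3), np.eye(3))
print(np.round(H, 3))
```

When the carried-over directions are only approximately the true eigenvectors, the reconstruction is correspondingly approximate, which is the trade-off the abstract reports as not significantly compromising the optimizer's performance.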
- Usability Inspection Report of iLumina
Shivakumar, Priya; Hartson, H. Rex; Perez, Victor M. (Department of Computer Science, Virginia Polytechnic Institute & State University, 2002-09-01)
iLumina is a digital library of sharable undergraduate teaching resource materials for science, mathematics, technology, and engineering, being developed by the University of North Carolina at Wilmington (UNCW), Collegis, Virginia Tech, Georgia State University, Grand Valley State, and The College of New Jersey. Types of iLumina resources include papers, tutorials, applets, presentations, visualizations, experiments, assignments, software, and exercises.