Browsing by Author "Trosset, Michael W."
Now showing 1 - 6 of 6
- Adjusting process count on demand for petascale global optimization
  Radcliffe, Nicholas R.; Watson, Layne T.; Sosonkina, Masha; Haftka, Raphael T.; Trosset, Michael W. (Department of Computer Science, Virginia Polytechnic Institute & State University, 2011)
  There are many challenges that need to be met before efficient and reliable computation at the petascale is possible. Many scientific and engineering codes running at the petascale are likely to be memory intensive, which makes thrashing a serious problem for many petascale applications. One way to overcome this challenge is to use a dynamic number of processes, so that the total amount of memory available for the computation can be increased on demand. This paper describes modifications made to the massively parallel global optimization code pVTdirect in order to allow for a dynamic number of processes. In particular, the modified version of the code monitors memory use and spawns new processes if the amount of available memory is determined to be insufficient. The primary design challenges are discussed, and performance results are presented and analyzed. (See the memory-triggered spawning sketch after this list.)
- Algorithm XXX: QNSTOP—Quasi-Newton Algorithm for Stochastic Optimization
  Amos, Brandon D.; Easterling, David R.; Watson, Layne T.; Thacker, William I.; Castle, Brent S.; Trosset, Michael W. (Department of Computer Science, Virginia Polytechnic Institute & State University, 2014-07-01)
  QNSTOP consists of serial and parallel (OpenMP) Fortran 2003 codes for the quasi-Newton stochastic optimization method of Castle and Trosset. For stochastic problems, convergence theory exists for the particular algorithmic choices and parameter values used in QNSTOP. Both the parallel driver subroutine, which offers several parallel decomposition strategies, and the serial driver subroutine can be used for stochastic optimization or deterministic global optimization, based on an input switch. QNSTOP is particularly effective for “noisy” deterministic problems, using only objective function values. Some performance data for computational systems biology problems are given. (See the simplified sampling-and-regression sketch after this list.)
- The Cost of Numerical Integration in Statistical Decision-theoretic Methods for Robust Design Optimization
  Kugele, Sean C.; Trosset, Michael W.; Watson, Layne T. (Department of Computer Science, Virginia Polytechnic Institute & State University, 2006)
  The Bayes principle from statistical decision theory provides a conceptual framework for quantifying uncertainties that arise in robust design optimization. The difficulty with exploiting this framework is computational, as it leads to objective and constraint functions that must be evaluated by numerical integration. Using a prototypical robust design optimization problem, this study explores the computational cost of multidimensional integration (computing expectation) and its interplay with optimization algorithms. It concludes that straightforward application of standard off-the-shelf optimization software to robust design is prohibitively expensive, necessitating adaptive strategies and the use of surrogates. (See the Monte Carlo expectation sketch after this list.)
- Mathematical Software for Multiobjective Optimization Problems
  Chang, Tyler Hunter (Virginia Tech, 2020-06-15)
  In this thesis, two distinct problems in data-driven computational science are considered. The main problem of interest is the multiobjective optimization problem, where the tradeoff surface (called the Pareto front) between multiple conflicting objectives must be approximated in order to identify designs that balance real-world tradeoffs. In order to solve multiobjective optimization problems that are derived from computationally expensive blackbox functions, such as engineering design optimization problems, several methodologies are combined, including surrogate modeling, trust region methods, and adaptive weighting. The result is a numerical software package that finds approximately Pareto optimal solutions that are evenly distributed across the Pareto front, using minimal cost function evaluations. The second problem of interest is the closely related problem of multivariate interpolation, where an unknown response surface representing an underlying phenomenon is approximated by finding a function that exactly matches available data. To solve the interpolation problem, a novel algorithm is proposed for computing only a sparse subset of the elements in the Delaunay triangulation, as needed to compute the Delaunay interpolant. For high-dimensional data, this reduces the time and space complexity of Delaunay interpolation from exponential time to polynomial time in practice. For each of the above problems, both serial and parallel implementations are described. Additionally, both solutions are demonstrated on real-world problems in computer system performance modeling. (See the Pareto-filter sketch after this list.)
- Note on the Effectiveness of Stochastic Optimization Algorithms for Robust Design
  Iyer, Manjula A.; Phillips, Rhonda D.; Trosset, Michael W.; Watson, Layne T. (Department of Computer Science, Virginia Polytechnic Institute & State University, 2008-05-01)
  Robust design optimization (RDO) uses statistical decision theory and optimization techniques to optimize a design over a range of uncertainty (introduced by the manufacturing process and unintended uses). Since engineering objective functions tend to be costly to evaluate and prohibitively expensive to integrate (as required within RDO), surrogates are introduced to allow the use of traditional optimization methods to find solutions. This paper explores the suitability of radically different (deterministic and stochastic) optimization methods to solve prototypical robust design problems. The algorithms include a genetic algorithm using a penalty function formulation, the simultaneous perturbation stochastic approximation (SPSA) method, and two gradient-based constrained nonlinear optimizers (method of feasible directions and sequential quadratic programming). The results show that the fully deterministic standard optimization algorithms are consistently more accurate, more likely to terminate at feasible points, and considerably less expensive than the fully nondeterministic algorithms. (See the minimal SPSA sketch after this list.)
- Parallel Deterministic and Stochastic Global Minimization of Functions with Very Many Minima
  Easterling, David R.; Watson, Layne T.; Madigan, Michael L.; Castle, Brent S.; Trosset, Michael W. (Department of Computer Science, Virginia Polytechnic Institute & State University, 2011)
  The optimization of three problems with high dimensionality and many local minima is investigated under five different optimization algorithms: DIRECT, simulated annealing, Spall’s SPSA algorithm, the KNITRO package, and QNSTOP, a new algorithm developed at Indiana University.
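
The entry on adjusting process count on demand describes monitoring memory use and spawning new processes when available memory runs low. The following is a minimal sketch of that idea only, using mpi4py and psutil in place of the Fortran/MPI machinery of pVTdirect; the threshold, the `worker.py` script name, and the function name `maybe_spawn_workers` are illustrative assumptions, not part of the paper.

```python
# Sketch: spawn additional MPI processes when free memory drops below a threshold.
# Assumption: mpi4py + psutil stand in for the Fortran/MPI code used by pVTdirect.
import psutil
from mpi4py import MPI

MIN_FREE_BYTES = 2 * 1024**3   # illustrative threshold: ask for help below 2 GiB free

def maybe_spawn_workers(worker_script="worker.py", nprocs=4):
    """Spawn extra MPI processes if available memory is insufficient."""
    free = psutil.virtual_memory().available
    if free >= MIN_FREE_BYTES:
        return None
    # MPI_Comm_spawn creates an intercommunicator linking parent and children;
    # the children can then take over part of the data to relieve memory pressure.
    intercomm = MPI.COMM_SELF.Spawn("python3", args=[worker_script], maxprocs=nprocs)
    return intercomm
```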
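The QNSTOP entry refers to a quasi-Newton stochastic optimization method driven only by (noisy) objective function values. The sketch below is a heavily simplified illustration of the underlying sampling-and-regression idea, not the QNSTOP algorithm or its Fortran 2003 code: it estimates a gradient by linear regression over samples in a ball and steps downhill. The function name `regression_gradient_step` and all parameter values are illustrative assumptions.

```python
# Simplified illustration: estimate the gradient of a noisy objective by linear
# regression over samples near the current iterate, then take a damped step.
import numpy as np

def regression_gradient_step(f, x, radius=0.5, nsamples=30, step=0.1, rng=None):
    """One iteration of a regression-based stochastic descent step (illustrative)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    d = len(x)
    X = x + radius * rng.uniform(-1.0, 1.0, size=(nsamples, d))   # design points
    y = np.array([f(xi) for xi in X])                             # noisy observations
    A = np.column_stack([np.ones(nsamples), X - x])               # intercept + centered coords
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)                  # least-squares fit
    grad = coef[1:]                                               # regression gradient estimate
    return x - step * grad / (np.linalg.norm(grad) + 1e-12)
```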
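The robust design entries center on objective functions that are expectations over uncertainty and must be evaluated by numerical integration. The sketch below shows one such setup under stated assumptions: a toy loss function, Gaussian manufacturing noise, and a fixed Monte Carlo sample (sample-average approximation) minimized with SciPy's Nelder-Mead. It is not the study's test problem; it only illustrates why every objective evaluation becomes an integral.

```python
# Sketch: a decision-theoretic robust objective, E[loss(x, U)], estimated by
# Monte Carlo over a fixed noise sample and handed to a derivative-free optimizer.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
U = rng.normal(scale=0.1, size=(2000, 2))   # fixed noise sample (common random numbers)

def loss(x, u):
    # Toy performance measure of design x under realized perturbation u (illustrative).
    return (x[0] + u[0] - 1.0) ** 2 + 10.0 * (x[1] + u[1]) ** 2

def expected_loss(x):
    # Each objective evaluation now averages thousands of loss evaluations.
    return np.mean([loss(x, u) for u in U])

result = minimize(expected_loss, x0=np.zeros(2), method="Nelder-Mead")
print(result.x)
```

Fixing the noise sample makes the averaged objective deterministic, which keeps a standard optimizer usable; resampling on every call would add Monte Carlo noise on top of the integration cost.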
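The multiobjective optimization thesis is organized around approximating the Pareto front. The sketch below shows only the basic bookkeeping behind that concept, a filter that keeps the nondominated points of a set of objective vectors; it is not the thesis software, and the function name `pareto_front` and the example data are illustrative.

```python
# Sketch: filter a set of objective vectors down to its nondominated (Pareto) subset.
import numpy as np

def pareto_front(F):
    """Return indices of nondominated rows of F (all objectives minimized)."""
    F = np.asarray(F)
    keep = []
    for i, fi in enumerate(F):
        # fi is dominated if some row is <= fi everywhere and < fi somewhere.
        dominated = np.any(np.all(F <= fi, axis=1) & np.any(F < fi, axis=1))
        if not dominated:
            keep.append(i)
    return keep

# Example: two conflicting objectives evaluated at random designs.
rng = np.random.default_rng(1)
X = rng.uniform(size=(100, 2))
F = np.column_stack([X[:, 0], 1.0 - np.sqrt(X[:, 0]) + X[:, 1]])
print(pareto_front(F))
```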
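Two of the entries above compare against simultaneous perturbation stochastic approximation (SPSA). The sketch below is a minimal textbook-style SPSA iteration for reference, not the implementation used in either paper; the gain-sequence constants and the quadratic test function are illustrative assumptions.

```python
# Sketch of basic SPSA: the gradient is approximated from two objective evaluations
# along one random +/-1 perturbation per iteration, with decaying gain sequences.
import numpy as np

def spsa(f, x0, iters=500, a=0.05, c=0.1, alpha=0.602, gamma=0.101, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        ak = a / k ** alpha                      # step-size gain
        ck = c / k ** gamma                      # perturbation gain
        delta = rng.choice([-1.0, 1.0], size=x.shape)
        # Simultaneous perturbation gradient estimate (two function evaluations).
        ghat = (f(x + ck * delta) - f(x - ck * delta)) / (2.0 * ck * delta)
        x = x - ak * ghat
    return x

print(spsa(lambda z: np.sum(z ** 2), np.ones(5)))
```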