Global Optimization Of Computationally Expensive Blackbox Problems Using Radial Basis Functions

Three derivative-free global optimization methods based on radial basis functions (RBFs) are developed for computationally expensive blackbox simulation models.

First, we develop a multistart global optimization method called SOMS (SurrOgate MultiStart). SOMS uses an RBF surrogate model to approximate the objective function, reducing the number of function evaluations needed to identify the most promising points from which each nonlinear programming local search is started. We show that SOMS detects any local minimum within a finite number of iterations almost surely. The numerical results show that SOMS compares favorably with alternative methods and that the surrogate approach saves a significant number of computationally expensive function evaluations.

In the second part of this work, we introduce PADS (PArallel Dynamic coordinate search with Surrogates), a surrogate-based global optimization framework for high-dimensional expensive blackbox functions. In each parallel iteration of PADS, multiple points are selected from a large set of candidate points generated by perturbing only a subset of the coordinates of the current best solution. The selected points are then evaluated in parallel with up to 16 processors. We show that PADS converges to the global optimum with probability 1. We develop two versions, PADS1 and PADS2, which use different underlying distributions to generate candidate points. Both find better solutions more efficiently than alternative methods, with PADS1 outperforming PADS2 on problems with up to 200 dimensions.

In the final part of this dissertation, we develop an effective new parallel surrogate global optimization method called SOP (Surrogate Optimization with Pareto center selection).
The search mechanism of SOP combines bi-objective optimization, tabu search, and surrogate-assisted local search, exploiting the information from already evaluated points to select a large number of new evaluation points. The newly selected points are evaluated in parallel, so a significant reduction in wall-clock time can be achieved. We give sufficient conditions for the almost sure convergence of SOP. The results of our numerical experiments show that SOP performs very well compared to alternative parallel surrogate model algorithms with 8 and 32 processors, obtaining superlinear speedup on some test problems.
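The RBF surrogate and coordinate-perturbation ideas common to these methods can be illustrated with a minimal sketch. This is not the dissertation's implementation: the cubic RBF with a linear polynomial tail, the perturbation probability, the step size, and all names are assumptions made for illustration.

```python
import numpy as np

def fit_cubic_rbf(X, y):
    """Cubic RBF interpolant with linear tail (an illustrative choice):
    s(x) = sum_i lam_i * ||x - x_i||^3 + b^T x + a."""
    n, d = X.shape
    # Pairwise-distance kernel matrix and polynomial block
    R = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2) ** 3
    P = np.hstack([X, np.ones((n, 1))])
    A = np.block([[R, P], [P.T, np.zeros((d + 1, d + 1))]])
    rhs = np.concatenate([y, np.zeros(d + 1)])
    coef = np.linalg.solve(A, rhs)
    lam, poly = coef[:n], coef[n:]

    def s(x):
        r = np.linalg.norm(X - x, axis=1) ** 3
        return r @ lam + x @ poly[:-1] + poly[-1]

    return s

rng = np.random.default_rng(0)
f = lambda x: np.sum(x ** 2)          # cheap stand-in for the expensive blackbox
X = rng.uniform(-2, 2, size=(20, 5))  # already-evaluated points
y = np.array([f(x) for x in X])
s = fit_cubic_rbf(X, y)               # surrogate replaces f for screening

best = X[np.argmin(y)]
# PADS-style candidates: perturb a random subset of coordinates of the best point
cands = np.tile(best, (500, 1))
mask = rng.random((500, 5)) < 0.4     # each coordinate perturbed with prob. 0.4 (illustrative)
cands += mask * rng.normal(0.0, 0.2, size=(500, 5))
# Screen the 500 candidates with the cheap surrogate; only the winner would be
# sent to the expensive blackbox for a true evaluation
next_x = cands[np.argmin([s(c) for c in cands])]
```

The surrogate interpolates the evaluated points exactly, so the expensive function is consulted only at the few candidates the surrogate deems promising; in the parallel methods (PADS, SOP), several such candidates would be selected per iteration and evaluated concurrently.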

Keywords

Computationally expensive functions; Surrogate optimization; Blackbox global optimization


Committee Chair

Shoemaker, Christine Ann

Committee Member

Frazier, Peter
Nussbaum, Michael

Degree Discipline

Applied Mathematics

Degree Name

Ph.D., Applied Mathematics

Degree Level

Doctor of Philosophy

Type

dissertation or thesis
