Uncertainty Considerations for Surrogate Functions for Constrained Grey-Box Optimization

Wednesday, November 16, 2016: 8:30 AM
Monterey I (Hotel Nikko San Francisco)
Fani Boukouvala1,2, Yannis A. Guzman1,2,3 and Christodoulos A. Floudas1,2, (1)Artie McFerrin Department of Chemical Engineering, Texas A&M University, College Station, TX, (2)Texas A&M Energy Institute, Texas A&M University, College Station, TX, (3)Department of Chemical and Biological Engineering, Princeton University, Princeton, NJ

Derivative-free optimization methods have attracted significant attention recently, due to the increasing interest in expensive multiscale simulations that cannot be optimized directly, as well as the abundance of data from systems that cannot be described by analytic first-principles models [1-2]. We have previously presented our framework, AlgoRithms for Global Optimization of coNstrAined grey-box compUTational problems (ARGONAUT) [3-4], which identifies an appropriate surrogate representation for each of the unknown correlations in the constraint set and the objective, and subsequently combines the surrogate equations with any known equations to form a non-convex nonlinear programming (NLP) problem. This hybrid NLP is iteratively solved to global optimality in order to identify new samples and update the surrogate models. ARGONAUT addresses many elements, ranging from optimal sampling, sample-set reduction, model identification, bound refinement, and variable selection to global optimization. Thus far, we have shown that this framework exhibits competitive performance on a set of high-dimensional constrained benchmark problems [3] and in a case study on the optimization of a pressure swing adsorption simulation [4]. In this work, we present improvements achieved by (a) incorporating the effects of the uncertainty introduced by sampling and surrogate model identification and (b) fully parallelizing our computational implementation to improve efficiency.
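The iterative sample-fit-optimize loop described above can be sketched in a few lines. This is a minimal, hypothetical one-dimensional illustration (a quadratic surrogate fit by least squares and a grid search in place of a global NLP solver), not the ARGONAUT implementation; all function names are assumptions.

```python
def expensive_blackbox(x):
    # Stand-in for an expensive simulation whose form the optimizer cannot see.
    return (x - 0.3) ** 2 + 0.1

def solve3(A, b):
    # Gaussian elimination with partial pivoting for a small linear system.
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def fit_quadratic(xs, ys):
    # Least-squares fit of y ~ a*x^2 + b*x + c via the normal equations.
    phi = [[x * x, x, 1.0] for x in xs]
    ata = [[sum(p[i] * p[j] for p in phi) for j in range(3)] for i in range(3)]
    aty = [sum(p[i] * y for p, y in zip(phi, ys)) for i in range(3)]
    return solve3(ata, aty)  # [a, b, c]

def surrogate_loop(f, lo, hi, n_init=5, iters=3):
    # Initial space-filling (here: uniform) design, then iterate:
    # fit surrogate -> minimize surrogate -> sample true function -> refit.
    xs = [lo + (hi - lo) * i / (n_init - 1) for i in range(n_init)]
    ys = [f(x) for x in xs]
    for _ in range(iters):
        a, b, c = fit_quadratic(xs, ys)
        grid = [lo + (hi - lo) * i / 200 for i in range(201)]
        x_new = min(grid, key=lambda x: a * x * x + b * x + c)
        xs.append(x_new)
        ys.append(f(x_new))  # one expensive evaluation at the surrogate minimizer
    best = min(range(len(ys)), key=ys.__getitem__)
    return xs[best], ys[best]
```

Because the test function here happens to lie in the surrogate class, the fit is exact and the loop locates the true minimizer at x = 0.3 almost immediately; the interesting (and harder) case, discussed next, is when the surrogate class does not contain the true function.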

One of the main components of any model-based derivative-free or black-box optimization method is the identification of good surrogate approximations. This remains one of the major challenges of optimization without analytical forms or accurate derivatives, since the uncertainty in the fitted parameters of the surrogate models, as well as the mismatch between the underlying unknown function and the fitted model, strongly affect the performance and reliability of any algorithm. Specifically, sampling plays a major role in the identification of the optimal parameters: given a different set of samples, the optimal parameters of a surrogate model can differ. Moreover, the actual form of the input-output model is unknown, which magnifies the variability observed in the surrogate model parameters across sample sets. It is important to develop surrogate models that provide good search directions and promising subregions within which the surrogates can be refined. The main challenge thus becomes identifying promising refined search spaces when the surrogate models cannot be entirely trusted as exact representations of the unknown functions.
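The sampling-induced parameter variability described above is easy to reproduce in a toy setting. The sketch below (illustrative assumptions throughout: a sine response, a linear surrogate class that deliberately cannot match it, six samples per design) refits the same surrogate form to many random sample sets and records the spread of the fitted slope.

```python
import math
import random

def unknown_response(x):
    # Stand-in for an unknown input-output relation; note it is NOT in the
    # surrogate model class below, so model mismatch is present by construction.
    return math.sin(3.0 * x)

def fit_line(xs, ys):
    # Ordinary least squares for the linear surrogate y ~ m*x + q.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    m = sxy / sxx
    return m, my - m * mx

# Refit the same surrogate form to 50 different random sample sets.
slopes = []
for seed in range(50):
    rng = random.Random(seed)
    xs = [rng.uniform(0.0, 1.0) for _ in range(6)]
    slopes.append(fit_line(xs, [unknown_response(x) for x in xs])[0])

# Nonzero spread: the "optimal" surrogate parameters depend on the sample set.
spread = max(slopes) - min(slopes)
```

The fitted slope varies substantially from design to design, which is exactly the variability that any surrogate-based search direction inherits.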

In this work, we investigate the effects of surrogate parameter uncertainty arising from two main sources: (a) the sampling of the search space and (b) the selection of the appropriate surrogate function. We show that, through the use of concepts from robust counterpart optimization [5-6], we can formulate a 'best-case' constrained grey-box representation, as well as a 'worst-case' constrained grey-box representation, which serve as probabilistic lower and upper bounds on the actual optimum, respectively. We illustrate this concept through several examples using a set of different surrogate function types (i.e., linear, polynomial, kriging, and radial basis functions). We also discuss how these uncertainty-based bounds are incorporated within our constrained grey-box optimization framework to improve reliability by mitigating the variability caused by the initial sampling stage. We present results from our improved ARGONAUT framework, which incorporates all of the above as well as improved variable and term selection methods. Finally, we show that efficient parallelization of the sampling and model identification stages allows us to solve problems with many variables and constraints in reduced computational running time.


1. Boukouvala, F., Misener, R., Floudas, C.A. (2016). Global Optimization Advances in Mixed-Integer Nonlinear Programming, MINLP, and Constrained Derivative-Free Optimization, CDFO. European Journal of Operational Research, 252(3), 701-727.

2. Floudas, C.A., Niziolek, A.M., Onel, O., Matthews, L.R. (2016). Multi-scale Systems Engineering for Energy and the Environment: Challenges and Opportunities. AIChE Journal, 62(3), 602-623.

3. Boukouvala, F., Hasan, M.M.F., Floudas, C.A. (2016). Global optimization of general constrained grey-box models: new method and its application to constrained PDEs for pressure swing adsorption. Journal of Global Optimization, DOI: 10.1007/s10898-015-0376-2.

4. Boukouvala, F., Floudas, C.A. (2016). ARGONAUT: Algorithms for Global Optimization of Constrained Grey-Box Computational Problems. Optimization Letters, DOI: 10.1007/s11590-016-1028-2.

5. Guzman, Y.A., Matthews, L.R., Floudas, C.A. (2016). New a priori and a posteriori probabilistic bounds for robust counterpart optimization: I. Unknown probability distributions. Computers & Chemical Engineering, 84, 568-598.

6. Li, Z., Tang, Q., Floudas, C.A. (2012). A Comparative Theoretical and Computational Study on Robust Counterpart Optimization: II. Probabilistic Guarantees on Constraint Satisfaction. Industrial & Engineering Chemistry Research, 51, 6769-6788.

Session: Advances in Optimization I
Group/Topical: Computing and Systems Technology Division