464725 Optimization of Constrained and Multidimensional Black-Box Problems Using Convex Hull Approximation and Single-Dimension Surrogate Model

Wednesday, November 16, 2016: 8:49 AM
Monterey I (Hotel Nikko San Francisco)
Ishan Bajaj and M. M. Faruque Hasan, Artie McFerrin Department of Chemical Engineering, Texas A&M University, College Station, TX

Many important optimization problems in science and engineering are black-box, i.e., the analytical forms of the objective function and constraints are hidden or unknown. Identifying the unknown feasible region and optimizing high-dimensional problems are the major challenges in black-box optimization. Evaluating black-box functions and their derivatives can be difficult, computationally expensive and unreliable, which limits the use of gradient-based optimization. One approach in derivative-free optimization is to replace expensive black-box functions with inexpensive surrogates through sampling and parameter estimation [1–6]. However, the accuracy of the surrogate models depends on the sampling set, the surrogate function type, the parameter estimation method, etc. These limitations often restrict the use of surrogate models to low-dimensional problems with few constraints, as the model size, complexity and number of parameters grow with the number of variables and constraints.

In the first part of the work, we present a data-driven approximation of the feasible region of constrained black-box problems without using surrogate models for the constraints. Data-driven construction of convex regions has recently been proposed [7]. In this work, we approximate a feasible region (convex or nonconvex, continuous or disjoint) as the region described by the convex hull of the feasible samples and their neighboring points, while subtracting the infeasible samples and their neighboring points from the convex hull. We represent each sample and its neighboring points using a Euclidean ball with the sample as its center and with the radius proportional to the minimum constraint satisfaction (if the sample is feasible) or violation (if the sample is infeasible). We solve a linear programming (LP) model to select the largest Euclidean balls while avoiding any intersection between feasible and infeasible balls. The accuracy of our convex hull-based approximation therefore increases as we obtain more samples. The design of experiments (DoE) for new samples is then posed as a packing problem, which is a quadratically constrained program (QCP) solved to optimality using a global solver. Unlike surrogate model-based approaches, where a surrogate model must be postulated and its parameters estimated for each constraint, our method exploits a single global parameter related to the size of the Euclidean balls, irrespective of the number of constraints.
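The ball-selection step can be sketched as a small LP. The following is an illustrative sketch only, not the authors' implementation: given feasible and infeasible samples, it maximizes the total radius subject to no feasible ball overlapping an infeasible one (r_i + r_j ≤ ||x_i − x_j|| for every feasible/infeasible pair). The function name and the toy data are assumptions for illustration.

```python
# Illustrative sketch (not the authors' formulation): select ball radii
# by LP so that feasible and infeasible balls never intersect.
import numpy as np
from scipy.optimize import linprog

def select_ball_radii(feasible, infeasible):
    """Maximize the summed radii of Euclidean balls centered at the
    samples, subject to r_i + r_j <= ||x_i - x_j|| for every
    feasible/infeasible pair of samples."""
    X = np.vstack([feasible, infeasible])
    nf, n = len(feasible), len(feasible) + len(infeasible)
    A, b = [], []
    for i in range(nf):                # feasible ball i
        for j in range(nf, n):         # vs. infeasible ball j
            row = np.zeros(n)
            row[i] = row[j] = 1.0      # r_i + r_j
            A.append(row)
            b.append(np.linalg.norm(X[i] - X[j]))
    # linprog minimizes, so negate the objective to maximize sum(r)
    res = linprog(c=-np.ones(n), A_ub=np.array(A), b_ub=np.array(b),
                  bounds=[(0, None)] * n)
    return res.x[:nf], res.x[nf:]      # feasible radii, infeasible radii

feas = np.array([[0.0, 0.0], [1.0, 0.0]])   # toy feasible samples
infeas = np.array([[0.0, 2.0]])             # toy infeasible sample
rf, ri = select_ball_radii(feas, infeas)
```

For these three points the optimum shrinks the infeasible ball to zero and grows each feasible ball to the distance separating it from the infeasible sample; the actual method additionally ties the radii to the measured constraint satisfaction or violation, which this sketch omits.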

In the second part of the work, we propose a novel approach to address high-dimensional black-box problems. Instead of constructing multivariate models, we project the original n-dimensional problem onto a single-dimensional problem. Specifically, we consider the problem of minimizing F(t), which geometrically can be interpreted as the projection of the original objective function f(xi), i = 1, …, n onto the t-space, subject to certain conditions imposed on the projection and on the relations between xi and t. We construct F(t) such that the minimum of F(t) also corresponds to the solution of the original problem. The overall problem can then be decomposed into the following steps: fit a single-dimension surrogate model for F(t), minimize F(t) to approximately find t*, find xi* corresponding to t*, and check a criticality measure for xi*. This procedure is repeated iteratively in a trust-region framework [8] to converge to an optimum t. The idea is extended to constrained problems by relating the objective function and constraint violations with t and applying a filter technique [9] to converge to local optima.
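The fit-minimize-map-back loop above can be caricatured in a few lines. The sketch below is a hedged illustration under assumed simplifications, not the authors' algorithm: it restricts f to a line x + t·d (a specific choice of the xi–t relation), fits a one-dimensional quadratic surrogate F(t) from samples, and minimizes it inside a trust region of radius delta. The function name, the fixed search direction, and the quadratic surrogate family are all assumptions made for illustration.

```python
# Hedged sketch of a single-dimension surrogate trust-region step,
# assuming the xi-t relation is a line through the current iterate.
import numpy as np

def one_dim_surrogate_step(f, x, direction, delta, n_samples=5):
    """Fit F(t) ~ a*t^2 + b*t + c to f restricted to the line
    x + t*direction, then minimize F(t) over |t| <= delta."""
    d = direction / np.linalg.norm(direction)
    ts = np.linspace(-delta, delta, n_samples)
    fs = np.array([f(x + t * d) for t in ts])       # sample f along the line
    a, b, c = np.polyfit(ts, fs, 2)                 # 1-D quadratic surrogate
    if a > 0:
        t_star = -b / (2 * a)                       # interior minimizer
    else:
        t_star = -delta if b > 0 else delta         # fall back to a boundary
    t_star = np.clip(t_star, -delta, delta)         # respect the trust region
    return x + t_star * d                           # map t* back to x-space

f = lambda x: np.sum((x - 1.0) ** 2)                # toy objective, min at (1, 1)
x = np.zeros(2)
for _ in range(10):                                 # repeated trust-region steps
    x = one_dim_surrogate_step(f, x, np.array([1.0, 1.0]), 0.5)
```

Because the toy objective is quadratic and the minimizer lies along the chosen direction, the iterates reach (1, 1) in a few steps; the actual method also updates the projection, the trust-region radius, and a criticality check, which this sketch omits.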

In the derivative-free optimization paradigm, these developments can be promising because of the use of (i) a global parameter to approximate the feasible region irrespective of the number of constraints, and (ii) a single-dimension surrogate model to approximate the objective function irrespective of the number of variables. The applicability of the methods described above will be demonstrated on both benchmark black-box problems and chemical engineering applications.


[1] Rios, L. M.; Sahinidis, N.V. Derivative-free optimization: a review of algorithms and comparison of software implementations. Journal of Global Optimization, 56(3):1247–1293, 2013.

[2] Boukouvala, F.; Misener, R.; Floudas, C. A. Global optimization advances in Mixed-Integer Nonlinear Programming, MINLP, and Constrained Derivative-Free Optimization, CDFO. European Journal of Operational Research, 252(3):701–727, 2016.

[3] Boukouvala, F.; Hasan, M. M. F.; Floudas, C. A. Global optimization of general constrained grey-box models: new method and its application to constrained PDEs for pressure swing adsorption. Journal of Global Optimization, 2015. DOI 10.1007/s10898-015-0376-2.

[4] Biegler, L. T.; Lang, Y.-D.; Lin, W.-J. Multi-scale optimization for process systems engineering. Computers & Chemical Engineering, 60(10):17–30, 2014.

[5] Boukouvala, F.; Ierapetritou, M. Surrogate-based optimization of expensive flowsheet modeling for continuous pharmaceutical manufacturing. Journal of Pharmaceutical Innovation, 8(2):131–145, 2013.

[6] Caballero, J. A.; Grossmann, I. E. An algorithm for the use of surrogate models in modular flowsheet optimization. AIChE Journal, 54(10):2633–2650, 2008.

[7] Zhang, Q.; Grossmann, I. E.; Sundaramoorthy, A.; Pinto, J. M. Data-driven construction of convex region surrogate models. Optimization and Engineering, 2015. DOI 10.1007/s11081-015-9288-8.

[8] Conn, A. R.; Scheinberg, K.; Vicente, L. N. Global convergence of general derivative-free trust-region algorithms to first- and second-order critical points. SIAM Journal on Optimization, 20(1):387–415, 2009.

[9] Fletcher, R.; Gould, N. I. M.; Leyffer, S.; Toint, P. L.; Wächter, A. Global convergence of a trust-region SQP-filter algorithm for general nonlinear programming. SIAM Journal on Optimization, 13(3):635–659, 2002.
