283753 ALAMO: Automatic Learning of Algebraic Models for Optimization

Tuesday, October 30, 2012: 8:30 AM
328 (Convention Center)
Alison Cozad1, Nick Sahinidis1 and David C. Miller2, (1)Chemical Engineering, Carnegie Mellon University, Pittsburgh, PA, (2)U.S. Department of Energy, National Energy Technology Laboratory, Pittsburgh, PA

We address a central problem in machine learning, namely that of learning an algebraic model from data obtained from simulations or experiments.  The problem arises naturally when a computationally intensive model must be replaced by a cheaper-to-compute surrogate or reduced-order model.  It also arises when one wishes to develop a theoretical model from experimental measurements.  We are interested in developing a technique that learns models that are (a) as accurate as possible and (b) as simple as possible.  Requirement (a) is obvious, while requirement (b) is driven by our desire to use the developed model in further studies, for instance by embedding it in a larger multi-scale model for optimization, simulation, or analysis.  Finally, we aim to achieve these goals without requiring an excessive number of simulations or experiments.

We present a methodology that aims to meet these requirements.  The proposed approach begins by building a low-complexity surrogate model.  The model is built with a best-subset technique that uses a mixed-integer linear programming formulation to consider a very large number of candidate functional components.  The model is then tested, exploited, and improved by using derivative-free optimization solvers to adaptively sample new simulation or experimental points.
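To make the best-subset step concrete, the following is a minimal illustrative sketch only: it enumerates subsets of candidate basis functions exhaustively and fits each by least squares, whereas ALAMO itself uses a mixed-integer linear programming formulation to search this space efficiently.  The basis functions, data, and size limit below are hypothetical.

```python
# Illustrative stand-in for best-subset surrogate fitting (exhaustive
# enumeration; ALAMO's actual search uses mixed-integer programming).
from itertools import combinations
import numpy as np

def fit_best_subset(X_basis, y, max_terms):
    """Choose the subset of candidate basis columns (up to max_terms)
    with the lowest squared error; ties favor the subset found first,
    i.e. the one with fewer terms."""
    n, p = X_basis.shape
    best = (np.inf, None, None)  # (error, column indices, coefficients)
    for k in range(1, max_terms + 1):          # smaller models first
        for cols in combinations(range(p), k):
            A = X_basis[:, cols]
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            err = float(np.sum((A @ coef - y) ** 2))
            if err < best[0] - 1e-9:           # require a clear improvement
                best = (err, cols, coef)
    return best

# Hypothetical candidate basis [x, x^2, sin(x)] on sampled points;
# the "simulation" output depends only on x^2.
x = np.linspace(0.0, 2.0, 20)
X = np.column_stack([x, x**2, np.sin(x)])
y = 3.0 * x**2
err, cols, coef = fit_best_subset(X, y, max_terms=2)
```

Because the loop visits smaller subsets first and only accepts a clearly better fit, the one-term model using x^2 is retained here, matching the stated preference for models that are both accurate and simple.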

Finally, we describe ALAMO, the computational implementation of the proposed methodology, along with extensive computational comparisons between ALAMO and a variety of sampling and model-fitting techniques, including Latin hypercube sampling, simple least squares regression, and the lasso.
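The lasso is one of the baselines named above.  As a rough illustration of that baseline only (not the comparison code behind these results), here is a minimal coordinate-descent lasso; the data, penalty value, and iteration count are hypothetical.

```python
# Minimal coordinate-descent lasso, shown only to illustrate the kind
# of baseline regression technique the comparisons refer to.
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator used in each lasso coordinate update."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Minimize (1/2n)||y - Xb||^2 + lam * ||b||_1 by coordinate descent."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n          # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]     # residual excluding term j
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

# Hypothetical toy data: only the first column actually matters.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2))
y = 2.0 * X[:, 0]
b = lasso_cd(X, y, lam=0.1)
```

The L1 penalty shrinks the irrelevant coefficient toward zero while keeping the relevant one large, which is why the lasso serves as a natural simplicity-seeking baseline for comparison.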

Extended Abstract: File Not Uploaded