MPC Performance Monitoring and Evaluation Principles

Thursday, October 20, 2011: 12:50 PM
103 F (Minneapolis Convention Center)
Luo Ji and James B. Rawlings, Chemical and Biological Engineering, University of Wisconsin-Madison, Madison, WI

Abstract

Over the past 30 years, model predictive control (MPC) has become a dominant control technology, and the number of MPC loops in the process and manufacturing industries has grown rapidly [5]. This widespread use naturally calls for performance assessment, monitoring, and analysis [1]. A control performance problem typically involves evaluating whether a single controller achieves its goals, rank-ordering the performance of multiple controllers, and carrying out the corresponding performance maintenance or troubleshooting. Surveys indicate, however, that only about one third of controllers provide an acceptable level of performance [2], and there is no well-established, systematic control theory or technology that enables MPC practitioners to routinely evaluate the operation of their MPC systems. Industrial practice therefore has a large and growing need for MPC controller performance monitoring and analysis.
    For control technologies other than MPC, several performance evaluation methods exist. The best known is the Harris index, or minimum variance (MV) benchmark [3], based on the premise that the smaller the variance of the output around its desired setpoint, the better the performance. In theory, an ideally controlled output retains only the variance contributed by the measurement noise, so comparing the actual plant output variance with this theoretical minimum provides a measure of control performance. Other benchmarks include the historical benchmark, the LQG (linear quadratic Gaussian) benchmark, and user-defined benchmarks [7, 6, 8]. These benchmarks are not well suited to MPC controllers, however, because of the complexity of MPC system models, the process noise entering the system dynamics, limitations in current algorithms, and the use of suboptimal MPC optimization. Even with the best current MPC algorithm, these complexities prevent the practical MPC output variance from approaching the theoretical minimum. Such benchmarks therefore set a performance standard that is unrealistic and unreachable from a practical viewpoint.
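The minimum variance benchmark described above can be estimated directly from closed-loop output data. A minimal sketch follows, assuming the common AR-regression approach: the part of the output that is still predictable beyond the process time delay is what a better controller could have removed, so the ratio of the unpredictable (residual) variance to the total variance gives a Harris-style index between 0 and 1. The function name, AR order, and regression formulation are illustrative choices, not from the abstract.

```python
import numpy as np

def harris_index(y, delay, order=10):
    """Estimate a Harris (minimum-variance) index from closed-loop output
    data with the setpoint removed.  Values near 1 indicate performance
    close to the minimum-variance benchmark; values near 0 indicate that
    much of the output variance was predictable and thus removable."""
    y = np.asarray(y, dtype=float)
    y = y - y.mean()
    n = len(y)
    # Regress y[t] on y[t-delay], ..., y[t-delay-order+1].  The residual
    # approximates the part of y that no controller could remove, because
    # feedback cannot act within the time delay.
    t0 = delay + order - 1
    X = np.column_stack([y[t0 - delay - k : n - delay - k]
                         for k in range(order)])
    Y = y[t0:]
    coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ coef
    return resid.var() / y.var()
```

For white-noise output (already at the minimum-variance bound) the index is near 1; for a strongly autocorrelated output it drops well below 1, flagging removable variance. As the abstract argues, this bound is rarely attainable for practical MPC loops, which motivates the simulation-based benchmark introduced next.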
    This presentation introduces a new strategy for MPC performance evaluation that exploits the system model information together with autocovariance least-squares (ALS) technology [4]. ALS provides estimates of the covariances of both the process and measurement noise from plant input/output data. An MPC simulation driven by these estimated noises then mimics the plant operating conditions; the resulting simulated performance represents the average achievable performance of a perfectly tuned MPC controller using current algorithms in the presence of the plant noises. To evaluate performance numerically, a key performance index (KPI) is defined as a quadratic function of the output and input tracking errors. Comparing the plant KPI with the simulated KPI indicates the current plant performance condition. Different KPI calculation methods are discussed for linear models in the unconstrained and constrained cases. Results verify that the simulated KPI converges to the plant KPI under perfect modeling; with process model or noise model mismatch it fails to converge, indicating that plant performance is worse than expected. Furthermore, for troubleshooting poorly performing systems, a performance evaluation scenario is proposed: four levels of simulated KPI¹ are calculated along with the plant KPI, and the difference between adjacent KPI levels indicates what prevents the current system from performing better and how to improve it.
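The KPI described above can be sketched as follows. The abstract states only that the KPI is a quadratic function of the output and input tracking errors; the specific time-averaged form, the weighting matrices Q and R, and the function name below are assumptions for illustration.

```python
import numpy as np

def quadratic_kpi(y, u, y_sp, u_sp, Q, R):
    """One plausible form of the KPI: the time-averaged quadratic cost
        KPI = (1/N) * sum_k [ e_y(k)' Q e_y(k) + e_u(k)' R e_u(k) ]
    with output tracking error e_y = y - y_sp and input tracking error
    e_u = u - u_sp.  Q, R and this exact form are assumptions; the
    abstract only specifies a quadratic function of the tracking errors."""
    ey = np.atleast_2d(y) - y_sp   # N x ny output tracking errors
    eu = np.atleast_2d(u) - u_sp   # N x nu input tracking errors
    stage = (np.einsum('ti,ij,tj->t', ey, Q, ey)
             + np.einsum('ti,ij,tj->t', eu, R, eu))
    return stage.mean()
```

Evaluating this index once on measured plant data and once on the ALS-driven closed-loop simulation yields the two numbers whose comparison, as described above, flags whether the plant is performing up to its achievable level.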
    Two industrial case studies are investigated. The first is a linear, unconstrained system, but a large nonzero-mean deterministic disturbance is detected in the output data. The expected performance indicated by the simulated KPI is much better than the actual plant performance, because the effect of the deterministic disturbance is not captured by the nominal simulated KPI. To address this problem, a deterministic-disturbance-affected KPI is defined and calculated using the sampled disturbance data, providing a more realistic performance standard.
    The second case study is a nonlinear, constrained system controlled with PI controllers. With the full system model available, both a simulated PI KPI and a simulated MPC KPI can be calculated. Comparing the simulated PI KPI with the plant KPI shows that the plant controller is not well tuned and indicates what performance could be expected with improved PI control, while comparing the simulated PI KPI with the simulated MPC KPI shows how much performance improvement could be expected if an MPC layer were installed to replace the PI layer. This information helps in deciding whether an MPC controller implementation is worthwhile.

References

[1] M. Bauer and I.K. Craig. Economic assessment of advanced process control–A survey and framework. J. Proc. Cont., 18(1):2–18, 2008.

[2] D.B. Ender. Process control performance: Not as good as you think. J. Proc. Cont., 40(10):180–190, 1993.

[3] Thomas J. Harris. Assessment of control loop performance. Can. J. Chem. Eng., 67:856–861, 1989.

[4] Brian J. Odelson, Murali R. Rajamani, and James B. Rawlings. A new autocovariance least-squares method for estimating noise covariances. Automatica, 42(2):303–308, February 2006.

[5] S. Joe Qin and Thomas A. Badgwell. A survey of industrial model predictive control technology. Control Eng. Prac., 11(7):733–764, 2003.

[6] S.J. Qin and J. Yu. Recent developments in multivariable controller performance monitoring. J. Proc. Cont., 17(3):221–227, 2007.

[7] J. Schafer and A. Cinar. Multivariable MPC system performance assessment, monitoring, and diagnosis. J. Proc. Cont., 14(2):113–129, 2004.

[8] S.L. Shah, R. Patwardhan, and B. Huang. Multivariable controller performance analysis: methods, applications and challenges. Chem. Proc. Cont., 98:190–207, 2002.

    ¹ The four simulated KPIs are: linear and unconstrained; linear and constrained; linear and constrained with identified deterministic disturbance; and nonlinear and constrained.

