Global dynamic optimization arises in many engineering applications, such as parameter estimation, global optimal control, and optimization-based worst-case uncertainty analysis. In branch-and-bound deterministic global optimization algorithms, a major computational bottleneck is generating appropriate lower bounds for the globally optimal objective value. These bounds are typically constructed using convex relaxations of the solutions of dynamic systems with respect to decision variables. Tighter convex relaxations thus translate into tighter lower bounds, which typically reduce the number of iterations required by branch-and-bound. Subgradients, as useful local sensitivities of convex relaxations, are typically required by nonsmooth optimization solvers to minimize these relaxations effectively. This thesis develops novel techniques for efficiently computing tight convex relaxations, along with corresponding subgradients, for the solutions of ordinary differential equations (ODEs), ultimately to improve the efficiency of deterministic global dynamic optimization.
Firstly, new bounding and comparison results for dynamic process models are developed; these results apply to a broader class of engineering models than previous results. They show for the first time that, in a state-of-the-art ODE relaxation framework, tighter enclosures of the original ODE system's right-hand side necessarily translate into enclosures of the state variables that are at least as tight, paving the way for new advances in bounding for global dynamic optimization.
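To make the notion of an enclosure concrete, the sketch below shows the generic differential-inequality bounding construction that such comparison results concern; it is only a schematic and does not reproduce the thesis's precise hypotheses or refinements. For a parametric ODE $\dot{x}(t) = f(t, p, x(t))$ with parameters $p \in P$, state bounds may be generated by integrating an auxiliary system of the form

```latex
% Schematic state-bounding system; in practice the min/max are overestimated by
% interval extensions of f, and tighter right-hand-side enclosures then yield
% state bounds that are at least as tight.
\begin{align*}
\dot{x}^{\mathrm{L}}_i(t) &= \min\{\, f_i(t,p,z) : p \in P,\; z \in [x^{\mathrm{L}}(t),\, x^{\mathrm{U}}(t)],\; z_i = x^{\mathrm{L}}_i(t) \,\}, \\
\dot{x}^{\mathrm{U}}_i(t) &= \max\{\, f_i(t,p,z) : p \in P,\; z \in [x^{\mathrm{L}}(t),\, x^{\mathrm{U}}(t)],\; z_i = x^{\mathrm{U}}_i(t) \,\},
\end{align*}
```

with $x^{\mathrm{L}}(t_0) \le x_0(p) \le x^{\mathrm{U}}(t_0)$ for all $p \in P$; under appropriate hypotheses, $x^{\mathrm{L}}(t) \le x(t,p) \le x^{\mathrm{U}}(t)$ then holds throughout the time horizon.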
Secondly, new convex relaxations are proposed for the solutions of ODE systems. These new relaxations are guaranteed to be at least as tight as state-of-the-art ODE relaxations. Unlike established ODE relaxation approaches, the new approach can employ any valid convex and concave relaxations of the original right-hand side, and tighter such relaxations necessarily yield ODE relaxations that are at least as tight. In a numerical case study, this added tightness is shown to improve computational efficiency in deterministic global dynamic optimization. The new ODE relaxation approach is then extended in several ways to tighten the resulting relaxations further.
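Schematically, ODE relaxation frameworks of this kind generate convex and concave state relaxations by integrating an auxiliary system whose right-hand side is assembled from relaxations of $f$; the sketch below shows only this generic structure, not the thesis's specific constructions or the modifications required to guarantee convexity.

```latex
% Schematic relaxation ODE system: u_i and o_i are built from convex and
% concave relaxations of f_i (flattening-type modifications omitted).
\begin{align*}
\dot{x}^{\mathrm{cv}}_i(t,p) &= u_i\bigl(t, p, x^{\mathrm{cv}}(t,p), x^{\mathrm{cc}}(t,p)\bigr), \\
\dot{x}^{\mathrm{cc}}_i(t,p) &= o_i\bigl(t, p, x^{\mathrm{cv}}(t,p), x^{\mathrm{cc}}(t,p)\bigr),
\end{align*}
```

with initial conditions chosen as convex and concave relaxations of $x_0(p)$; under appropriate hypotheses, $x^{\mathrm{cv}}(t,\cdot)$ and $x^{\mathrm{cc}}(t,\cdot)$ are then convex and concave relaxations of the parametric solution $x(t,\cdot)$ on $P$.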
Thirdly, new subgradient evaluation approaches are proposed for ODE relaxations. Unlike established approaches for computing valid subgradients of nonsmooth dynamic systems, the new approaches are compatible with reverse automatic differentiation (AD). It is shown for the first time that subgradients of dynamic convex relaxations can be computed via a modified adjoint ODE sensitivity system, which could speed up lower bounding in global dynamic optimization.
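For context, the sketch below implements classical adjoint sensitivities for a smooth parametric ODE, the baseline that an adjoint-based subgradient system for nonsmooth relaxations would generalize; it is an illustration under that assumption, not the thesis's modified adjoint system, and the names adjoint_gradient, dfdx, and dfdp are placeholders.

```python
# Classical adjoint ODE sensitivities for a smooth system (sketch only).
import numpy as np
from scipy.integrate import solve_ivp

def adjoint_gradient(f, dfdx, dfdp, x0, p, t0, tf, dgdx):
    """Gradient of g(x(tf)) w.r.t. p: one forward state solve, one backward adjoint solve."""
    # Forward pass: store a dense interpolant of the state trajectory x(t).
    fwd = solve_ivp(lambda t, x: f(t, x, p), (t0, tf), x0,
                    dense_output=True, rtol=1e-10, atol=1e-12)
    nx, npar = len(x0), len(p)

    # Backward pass: adjoint lam(t) and quadrature q(t) = integral_t^tf lam^T (df/dp) dtau.
    def backward_rhs(t, z):
        lam, x = z[:nx], fwd.sol(t)
        dlam = -dfdx(t, x, p).T @ lam        # lam' = -(df/dx)^T lam
        dq = -(dfdp(t, x, p).T @ lam)        # q'   = -lam^T df/dp, so q(t0) equals the integral
        return np.concatenate([dlam, dq])

    zT = np.concatenate([dgdx(fwd.sol(tf)), np.zeros(npar)])
    bwd = solve_ivp(backward_rhs, (tf, t0), zT, rtol=1e-10, atol=1e-12)
    return bwd.y[nx:, -1]                    # dg/dp, assuming x0 does not depend on p

# Example: x' = -p*x with g = x(tf); the analytic gradient is -tf*x0*exp(-p*tf) ~ -0.7358.
grad = adjoint_gradient(
    f=lambda t, x, p: np.array([-p[0] * x[0]]),
    dfdx=lambda t, x, p: np.array([[-p[0]]]),
    dfdp=lambda t, x, p: np.array([[-x[0]]]),
    x0=np.array([1.0]), p=np.array([0.5]), t0=0.0, tf=2.0,
    dgdx=lambda xT: np.array([1.0]))
print(grad)
```

The appeal of the adjoint (reverse) mode is that a single backward solve yields sensitivities with respect to all parameters at once, which is what makes reverse-AD compatibility attractive for lower bounding.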
Lastly, for situations in which convex relaxations are known to be correct but subgradients are unavailable (such as for certain ODE relaxations), a new approach is proposed for tractably constructing useful and correct affine underestimators and lower bounds of the convex relaxations, using only black-box sampling. No additional assumptions are required, and no subgradients need to be computed at any point. Under mild conditions, these new bounds are shown to converge rapidly to the original nonconvex function as the domain of interest shrinks. Variants of the new approach are presented to account for numerical error or noise in the sampling procedure. / Thesis / Doctor of Philosophy (PhD)
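As a minimal illustration of the sampling idea, the univariate sketch below constructs an affine underestimator of a convex relaxation from three black-box samples, with no subgradient information; it is a simplified scalar construction presented only for illustration, not the thesis's multivariate formulas or its noise-handling variants.

```python
# Affine underestimator of a convex relaxation fcv on [a, b] from three samples.
def affine_underestimator(fcv, a, b):
    """Return (affine underestimator, lower bound), valid whenever fcv is convex on [a, b]."""
    m = 0.5 * (a + b)
    fa, fm, fb = fcv(a), fcv(m), fcv(b)      # three black-box samples
    s = (fb - fa) / (b - a)                  # centered-difference slope at the midpoint
    c = 0.5 * (fa + fb - 2.0 * fm)           # convexity gap; c >= 0 when fcv is convex

    # By convexity of fcv on [a, b], fcv(x) >= fm + s*(x - m) - c for every x in [a, b].
    def underestimator(x):
        return fm + s * (x - m) - c

    # An affine function attains its minimum over [a, b] at an endpoint.
    lower_bound = min(underestimator(a), underestimator(b))
    return underestimator, lower_bound

# Example: for the convex function x**2 on [-1, 1], this yields the constant
# underestimator -1 and hence the lower bound -1.0.
g, lb = affine_underestimator(lambda x: x * x, -1.0, 1.0)
print(lb)
```

Because only function evaluations are needed, the same pattern applies when the relaxation is available purely as a black box, such as a numerically integrated ODE relaxation.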
Identifier | oai:union.ndltd.org:mcmaster.ca/oai:macsphere.mcmaster.ca:11375/26945
Date | January 2021 |
Creators | Song, Yingkai |
Contributors | Khan, Kamil, Chemical Engineering |
Source Sets | McMaster University |
Language | English |
Detected Language | English |
Type | Thesis |