Al-Baali, Mehiddin; Purnama, Anton; Grandinetti, Lucio (Eds.)
Second-order, Newton-like algorithms exhibit convergence properties superior to gradient-based or derivative-free optimization algorithms. However, deriving and computing second-order derivatives, which are needed for the Hessian-vector product in a Krylov iteration for the Newton step, is often nontrivial. Second-order adjoints provide a systematic and efficient tool for deriving second-derivative information. In this paper, we consider equality-constrained optimization problems in an infinite-dimensional setting. We phrase the optimization problem in a general Banach space framework and derive second-order sensitivities and second-order adjoints in a rigorous and general way. We apply the developed framework to a partial differential equation-constrained optimization problem.
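The abstract's central computational ingredient, a matrix-free Hessian-vector product driving a Krylov solve for the Newton step, can be illustrated in finite dimensions. The sketch below is an analogue under stated assumptions, not the paper's infinite-dimensional framework: JAX's forward-over-reverse automatic differentiation plays the role of the second-order adjoint, the objective f is a hypothetical stand-in (an unconstrained Rosenbrock-type function rather than an equality-constrained PDE problem), and globalization such as a line search is omitted for brevity.

import jax
import jax.numpy as jnp

def f(x):
    # Hypothetical smooth objective (chained Rosenbrock); stands in for the
    # reduced objective of a constrained problem.
    return jnp.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def hvp(x, v):
    # Hessian-vector product H(x) v without forming H: differentiate the
    # gradient in direction v (forward-over-reverse AD). This mirrors what a
    # second-order adjoint computation delivers in the PDE-constrained setting.
    return jax.jvp(jax.grad(f), (x,), (v,))[1]

def newton_cg_step(x, tol=1e-8, max_iter=100):
    # Solve H(x) p = -g(x) for the Newton step p by conjugate gradients,
    # using only Hessian-vector products (assumes H is positive definite
    # near the iterate, which holds here).
    g = jax.grad(f)(x)
    p = jnp.zeros_like(x)
    r = -g          # residual of H p = -g at p = 0
    d = r
    rs = r @ r
    for _ in range(max_iter):
        Hd = hvp(x, d)
        alpha = rs / (d @ Hd)
        p = p + alpha * d
        r = r - alpha * Hd
        rs_new = r @ r
        if jnp.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs) * d
        rs = rs_new
    return p

x = jnp.zeros(5)
for _ in range(30):
    x = x + newton_cg_step(x)
print(x)  # approaches the minimizer (all ones)

The design point the abstract emphasizes survives the translation: the Hessian is never assembled, only its action on a vector, which is what makes Newton-Krylov methods tractable for large discretized PDE-constrained problems.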