This work describes the basic equations and inequalities that form the necessary and sufficient optimality conditions of the calculus of variations and the theory of optimal control. Subjects addressed include developments in the investigation of optimality conditions, new classes of solutions, analytical and computational methods, and applications.
Table of Contents
Control Processes
Optimal Control Problem
Bounding and Solving Functions
Sufficient Conditions of Optimality
Some Special Optimal Control Problems
Direct Application of the Sufficient Optimality Conditions
Sufficient Optimality Conditions and Basic Equations of the Variational Calculus and Optimal Control Theory
Optimal Feedback Policy
Hamilton-Jacobi Method
Computer Methods for Successive Improvements in a Control Program
Dual Computational Algorithms for Solving and Optimizing Controllable Systems of Equations
Extension of the Class of Solving Functions
Theorems of Existence of Solving Functions
Optimal Control Under Conditions of Incomplete Information
". . .offers a different, but very interesting, view and treatment of the optimal process for dynamical systems. . . .highly recommended. . ..contains a valuable learning and reference resource, and many workout examples. "
---Applied Mechanics Reviews
". . .an excellent book. . . . . .has complete treatment of the theory of optimal control. . .. . . .highly recommend[ed]. . . "