This book is based on the Third Kingston Conference on Differential Games and Control Theory, held at the University of Rhode Island, June 5-8, 1978. It deals with both deterministic and stochastic systems and will be useful to researchers in applied mathematics.
Table of Contents
1. Necessary and Sufficient Conditions for Optimal Strategies in Impulsive Control
2. Guaranteed Ultimate Boundedness for a Class of Uncertain Linear Dynamical Systems
3. The Optimal Control of Semimartingales
4. Minimum Exit Probabilities and Differential Games
5. A Hybrid Relaxed-Lagrangian Second Order Condition for a Minimum
6. Dynamic Optimization in Markovian Queueing Systems
7. Two Person Zero Sum Differential Games with Incomplete Information: A Bayesian Model
8. Sensitivity Results in Optimization with Applications to Optimal Control Problems
9. Primal and Dual Game Theoretic Solutions with Applications to Decentralized Control
10. Some Remarks Concerning the Dual Control Problem
11. Application of the Generalized Strong Laws of Large Numbers to the Proper Choice of the Amplification Coefficients in Stochastic Approximation with Correlated Disturbances