State Estimation for Dynamic Systems presents the state of the art in this field and discusses a new method of state estimation. The method makes it possible to obtain optimal two-sided ellipsoidal bounds for reachable sets of linear and nonlinear control systems in discrete and continuous time. It allows the practical stability of dynamic systems subjected to disturbances to be analyzed, and two-sided estimates in optimal control and differential games to be obtained. The method described in the book also permits guaranteed state estimation (filtering) for dynamic systems in the presence of external disturbances and observation errors. Numerical algorithms for state estimation and optimal control, as well as a number of applications and examples, are presented. The book will be an excellent reference for researchers and engineers working in applied mathematics, control theory, and system analysis, and will also appeal to pure mathematicians, control engineers, and computer programmers.
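To give a flavor of the ellipsoidal-bounding idea, the following is a minimal illustrative sketch (not the book's own algorithm) for a discrete-time linear system x_{k+1} = A x_k + w_k with a bounded disturbance. It propagates an outer ellipsoid E(0, Q) = {x : xᵀQ⁻¹x ≤ 1} containing the reachable set, using the standard trace-minimal outer ellipsoid of the Minkowski sum of two ellipsoids; the matrices A, Q, and W are hypothetical example values.

```python
import numpy as np

def outer_ellipsoid_step(A, Q, W):
    """One bounding step for x_{k+1} = A x_k + w_k.

    Q: shape matrix of the current state ellipsoid E(0, Q),
    W: shape matrix of the disturbance bound E(0, W).
    Returns the shape matrix of the trace-minimal outer ellipsoid
    containing the Minkowski sum A*E(0, Q) + E(0, W), via
        Q' = (1 + 1/p) A Q A^T + (1 + p) W,
    with p chosen to minimize trace(Q').
    """
    AQA = A @ Q @ A.T
    p = np.sqrt(np.trace(AQA) / np.trace(W))
    return (1 + 1 / p) * AQA + (1 + p) * W

# Hypothetical example: discretized double integrator with small disturbances.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
Q = 0.01 * np.eye(2)   # initial-state uncertainty ellipsoid
W = 1e-4 * np.eye(2)   # per-step disturbance bound

for _ in range(50):
    Q = outer_ellipsoid_step(A, Q, W)
```

The shape matrix Q stays symmetric positive definite throughout, so the bound remains a valid ellipsoid; optimality criteria other than the trace (e.g. volume) lead to different choices of p, which is one of the trade-offs the book's two-sided (inner and outer) bounds address.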