
*Give Your Students the Proper Groundwork for Future Studies in Optimization*

**A First Course in Optimization** is designed for a one-semester course in optimization taken by advanced undergraduate and beginning graduate students in the mathematical sciences and engineering. It teaches students the basics of continuous optimization and helps them better understand the mathematics from previous courses.

The book focuses on general problems and the underlying theory. It introduces all the necessary mathematical tools and results. The text covers the fundamental problems of constrained and unconstrained optimization as well as linear and convex programming. It also presents basic iterative solution algorithms (such as gradient methods and the Newton–Raphson algorithm and its variants) and more general iterative optimization methods.
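As a taste of the iterative methods mentioned above, a one-dimensional Newton–Raphson minimizer can be sketched in a few lines. This is an illustrative sketch, not code from the book; the test function, starting point, and tolerance are all invented for the example:

```python
def newton_raphson_min(f_prime, f_double_prime, x0, tol=1e-10, max_iter=100):
    """Minimize a smooth function of one real variable by applying
    Newton-Raphson root-finding to its derivative:
        x_{k+1} = x_k - f'(x_k) / f''(x_k)
    Stops when the step size falls below tol."""
    x = x0
    for _ in range(max_iter):
        step = f_prime(x) / f_double_prime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Illustrative example: minimize f(x) = x^4 - 3x^2 + 2.
# Then f'(x) = 4x^3 - 6x and f''(x) = 12x^2 - 6,
# with local minima at x = +/- sqrt(3/2).
x_star = newton_raphson_min(lambda x: 4 * x**3 - 6 * x,
                            lambda x: 12 * x**2 - 6,
                            x0=2.0)
```

Starting from x0 = 2.0, the iteration converges quadratically to the nearby minimizer sqrt(3/2) ≈ 1.2247, which is the typical behavior near a point where the second derivative is positive.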

This text builds the foundation needed to understand continuous optimization. It prepares students to study advanced topics found in the author’s companion book, *Iterative Optimization in Inverse Problems*, including sequential unconstrained iterative optimization methods.

**Optimization without Calculus**

Chapter Summary

The Arithmetic Mean–Geometric Mean Inequality

An Application of the AGM Inequality: The Number *e*

Extending the AGM Inequality

Optimization Using the AGM Inequality

The Hölder and Minkowski Inequalities

Cauchy's Inequality

Optimizing Using Cauchy's Inequality

An Inner Product for Square Matrices

Discrete Allocation Problems

**Geometric Programming**

Chapter Summary

An Example of a GP Problem

Posynomials and the GP Problem

The Dual GP Problem

Solving the GP Problem

Solving the DGP Problem

Constrained Geometric Programming

**Basic Analysis**

Chapter Summary

Minima and Infima

Limits

Completeness

Continuity

Limsup and Liminf

Another View

Semi-Continuity

**Convex Sets**

Chapter Summary

The Geometry of Real Euclidean Space

A Bit of Topology

Convex Sets in R^J

More on Projections

Linear and Affine Operators on R^J

The Fundamental Theorems

Block-Matrix Notation

Theorems of the Alternative

Another Proof of Farkas' Lemma

Gordan's Theorem Revisited

**Vector Spaces and Matrices**

Chapter Summary

Vector Spaces

Basic Linear Algebra

LU and QR Factorization

The LU Factorization

**Linear Programming**

Chapter Summary

Primal and Dual

Converting a Problem to PS Form

Duality Theorems

The Basic Strong Duality Theorem

Another Proof

Proof of Gale's Strong Duality Theorem

Some Examples

The Simplex Method

Yet Another Proof

The Sherman–Morrison–Woodbury Identity

An Example of the Simplex Method

Another Example

Some Possible Difficulties

Topics for Projects

**Matrix Games and Optimization**

Chapter Summary

Two-Person Zero-Sum Games

Deterministic Solutions

Randomized Solutions

Symmetric Games

Positive Games

Example: The "Bluffing" Game

Learning the Game

Non-Constant-Sum Games

**Differentiation**

Chapter Summary

Directional Derivative

Partial Derivatives

Some Examples

Gâteaux Derivative

Fréchet Derivative

The Chain Rule

**Convex Functions**

Chapter Summary

Functions of a Single Real Variable

Functions of Several Real Variables

Sub-Differentials and Sub-Gradients

Sub-Gradients and Directional Derivatives

Functions and Operators

Convex Sets and Convex Functions

**Convex Programming**

Chapter Summary

The Primal Problem

From Constrained to Unconstrained

Saddle Points

The Karush–Kuhn–Tucker Theorem

On Existence of Lagrange Multipliers

The Problem of Equality Constraints

Two Examples

The Dual Problem

Nonnegative Least-Squares Solutions

An Example in Image Reconstruction

Solving the Dual Problem

Minimum One-Norm Solutions

**Iterative Optimization**

Chapter Summary

The Need for Iterative Methods

Optimizing Functions of a Single Real Variable

The Newton–Raphson Approach

Approximate Newton–Raphson Methods

Derivative-Free Methods

Rates of Convergence

Descent Methods

Optimizing Functions of Several Real Variables

Auxiliary-Function Methods

Projected Gradient-Descent Methods

Feasible-Point Methods

Quadratic Programming

Simulated Annealing

**Solving Systems of Linear Equations**

Chapter Summary

Arbitrary Systems of Linear Equations

Regularization

Nonnegative Systems of Linear Equations

Regularized SMART and EMML

Block-Iterative Methods

**Conjugate-Direction Methods**

Chapter Summary

Iterative Minimization

Quadratic Optimization

Conjugate Bases for R^J

The Conjugate Gradient Method

Krylov Subspaces

Extensions of the CGM

**Operators**

Chapter Summary

Operators

Contraction Operators

Orthogonal-Projection Operators

Two Useful Identities

Averaged Operators

Gradient Operators

Affine-Linear Operators

Paracontractive Operators

Matrix Norms

**Looking Ahead**

Chapter Summary

Sequential Unconstrained Minimization

Examples of SUM

Auxiliary-Function Methods

The SUMMA Class of AF Methods

Bibliography

Index

*Exercises appear at the end of each chapter.*

"This book, split in fifteen chapters, presents a thorough introduction to optimization for undergraduate and graduate courses in science and engineering and forms a useful toolkit of optimization techniques across several domains. … Throughout this very interesting book, a number of solved and unsolved exercises provide the reader with all the necessary tools to understand the techniques and their applicability in real-life problems."

—*Zentralblatt MATH* 1312