1st Edition

Optimality Conditions in Convex Optimization A Finite-Dimensional View

By Anulekha Dhara and Joydeep Dutta. Copyright 2012
    444 Pages 17 B/W Illustrations
    by CRC Press


    Optimality Conditions in Convex Optimization explores a central issue in the field of convex optimization: optimality conditions. It brings together the most important and recent results in this area that have been scattered in the literature, notably results from convex analysis that are essential to developing many of the results in this book yet are not usually found in conventional texts. Unlike other books on convex optimization, which typically discuss algorithms along with some basic theory, this book focuses solely on fundamental and advanced convex optimization theory.

    Although many of the results presented in the book can also be proved in infinite dimensions, the authors focus on finite dimensions to allow for much deeper results and a better understanding of the structures involved in a convex optimization problem. They address semi-infinite optimization problems; approximate solution concepts of convex optimization problems; and some classes of nonconvex problems that can be studied using the tools of convex analysis. They include examples wherever needed, provide details of major results, and discuss proofs of the main results.

    What Is Convex Optimization?
    Basic Concepts
    Smooth Convex Optimization

    Tools for Convex Optimization
    Convex Sets

    Convex Functions
    Subdifferential Calculus
    Conjugate Functions
    Epigraphical Properties of Conjugate Functions

    Basic Optimality Conditions using the Normal Cone
    Slater Constraint Qualification
    Abadie Constraint Qualification
    Convex Problems with Abstract Constraints
    Max-Function Approach
    Cone-Constrained Convex Programming

    Saddle Points, Optimality, and Duality
    Basic Saddle Point Theorem
    Affine Inequalities and Equalities and Saddle Point Condition
    Lagrangian Duality
    Fenchel Duality
    Equivalence between Lagrangian and Fenchel Duality: Magnanti’s Approach

    Enhanced Fritz John Optimality Conditions
    Enhanced Fritz John Conditions Using the Subdifferential
    Enhanced Fritz John Conditions under Restrictions
    Enhanced Fritz John Conditions in the Absence of Optimal Solution
    Enhanced Dual Fritz John Optimality Conditions

    Optimality without Constraint Qualification
    Geometric Optimality Condition: Smooth Case
    Geometric Optimality Condition: Nonsmooth Case
    Separable Sublinear Case

    Sequential Optimality Conditions and Generalized Constraint Qualification
    Sequential Optimality: Thibault’s Approach
    Fenchel Conjugates and Constraint Qualification
    Applications to Bilevel Programming Problems

    Representation of the Feasible Set and KKT Conditions
    Smooth Case
    Nonsmooth Case

    Weak Sharp Minima in Convex Optimization
    Weak Sharp Minima and Optimality

    Approximate Optimality Conditions
    ε-Subdifferential Approach
    Max-Function Approach
    ε-Saddle Point Approach
    Exact Penalization Approach
    Ekeland’s Variational Principle Approach
    Modified ε-KKT Conditions
    Duality-Based Approach to ε-Optimality

    Convex Semi-Infinite Optimization
    Sup-Function Approach
    Reduction Approach
    Lagrangian Regular Point
    Farkas–Minkowski Linearization
    Noncompact Scenario: An Alternate Approach

    Convexity in Nonconvex Optimization
    Maximization of a Convex Function
    Minimization of d.c. Functions



    Anulekha Dhara earned her Ph.D. at IIT Delhi and subsequently moved to IIT Kanpur for her post-doctoral studies. Currently, she is a post-doctoral fellow in mathematics at the University of Avignon, France. Her main area of interest is optimization theory.

    Joydeep Dutta is an Associate Professor of Mathematics at the Indian Institute of Technology (IIT) Kanpur. His main areas of interest are optimization theory and applications.

    "It discusses a number of major approaches to the subject, bringing together many results from the past thirty-five years into one handy volume. … Researchers in variational analysis should find this book to be a useful reference; for those new to convex optimization, it provides a very accessible entry point to the field. I have begun recommending it to graduate students who would like to learn about convex subdifferential calculus. … a valuable book, a most welcome addition to the optimization theory literature."
    —Doug Ward, Mathematical Reviews, January 2013