    244 Pages, 89 B/W Illustrations
    by Chapman & Hall

    A central problem in communications theory is determining the ultimate limit of data compression, a limit quantified by entropy. While differential entropy may appear to be a straightforward extension of the discrete case, it is in fact a subtler measure that requires more careful treatment.
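
    For orientation, the two quantities can be contrasted directly (the standard definitions, stated here for reference rather than quoted from the handbook): for a discrete random variable with probability mass function p_i, and for a continuous random variable with density f(x),

        H(X) = -\sum_i p_i \log p_i                              (discrete entropy)
        h(X) = -\int_{-\infty}^{\infty} f(x) \log f(x)\, dx      (differential entropy)

    Unlike its discrete counterpart, h(X) can be negative and changes under a change of variables, which is one reason the more careful treatment mentioned above is needed.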

    Handbook of Differential Entropy provides a comprehensive introduction to the subject for researchers and students in information theory. Unlike related books, this one brings together background material, derivations, and applications of differential entropy.

    The handbook first reviews probability theory, the essential building block for any discussion of entropy. The authors then carefully develop the concept of entropy, introducing both discrete and differential entropy. They present detailed derivations of differential entropy for numerous probability models, discuss the challenges of interpreting and deriving differential entropy, and show how differential entropy varies as a function of the model variance.
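
    As a simple illustration of that variance dependence (a standard result rather than one of the book's derivations): for a Gaussian random variable with variance \sigma^2,

        h(X) = \tfrac{1}{2} \ln\!\left(2 \pi e \sigma^2\right),

    which grows logarithmically with the variance and becomes negative once \sigma^2 < 1/(2\pi e).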

    Turning to applications, the book describes common parametric and nonparametric estimators of differential entropy along with their statistical properties. It then uses estimated differential entropy to determine radar pulse delays when the corrupting noise is non-Gaussian and to develop measures of coupling between the components of a dynamical system.
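
    The short Python sketch below illustrates the flavor of a nonparametric approach with a simple histogram plug-in estimate. It is an illustrative example only, assuming NumPy; the function name entropy_histogram, the bin count, and the Gaussian test case are choices made here, not anything specified in the handbook.

    import numpy as np

    def entropy_histogram(samples, bins=50):
        """Plug-in estimate of differential entropy (in nats) from a histogram."""
        counts, edges = np.histogram(samples, bins=bins)
        widths = np.diff(edges)
        p = counts / samples.size           # probability mass in each bin
        nz = p > 0                          # ignore empty bins
        # Approximate the density in bin i as p_i / width_i, then
        # h ~= -sum_i p_i * log(p_i / width_i)
        return -np.sum(p[nz] * np.log(p[nz] / widths[nz]))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        sigma = 2.0
        x = rng.normal(0.0, sigma, size=100_000)
        print("histogram estimate (nats):", entropy_histogram(x))
        print("exact Gaussian value (nats):", 0.5 * np.log(2 * np.pi * np.e * sigma**2))

    Comparing the estimate with the closed-form Gaussian value 1/2 ln(2 pi e sigma^2) gives a quick sanity check on the bin width and sample size.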

    Probability in Brief
    Probability Distributions
    Expectation and Moments
    Random Processes
    Probability Summary

    The Concept of Entropy
    Discrete Entropy
    Differential Entropy
    Interpretation of Differential Entropy
    Historical and Scientific Perspective

    Entropy for Discrete Probability Distributions

    Differential Entropies for Probability Distributions

    Differential Entropy as a Function of Variance

    Applications of Differential Entropy
    Estimation of Entropy
    Mutual Information
    Transfer Entropy

    Appendices
    Derivation of Maximum Entropy Distributions under Different Constraints
    Moments and Characteristic Function for the Sine Wave Distribution
    Moments, Mode, and Characteristic Function for the Mixed-Gaussian Distribution
    Derivation of Function L(α) Used in Derivation for Entropy of Mixed-Gaussian Distribution
    References to Formulae Used in This Text

    Bibliography

    Biography

    Joseph V. Michalowicz is a consultant with Sotera Defense Solutions. He retired from the U.S. Naval Research Laboratory as head of the Sensor and Data Processing Section in the Optical Sciences Division. He has published extensively in the areas of mathematical modeling, probability and statistics, signal detection, multispectral infrared sensors, and category theory. He received a Ph.D. in mathematics with a minor in electrical engineering from the Catholic University of America.

    Jonathan M. Nichols is a member of the Maritime Sensing Section in the Optical Sciences Division at the U.S. Naval Research Laboratory. His research interests include signal and image processing, parameter estimation, and the modeling and analysis of infrared imaging devices. He received a Ph.D. in mechanical engineering from Duke University.

    Frank Bucholtz is head of the Advanced Photonics Section at the U.S. Naval Research Laboratory. He has published in the areas of microwave signal processing and microwave photonics, fiber optic sensors, micro-optical devices, nonlinear dynamics and chaos, hyperspectral imaging systems, and information theory. His current research focuses on optical components for digital communications. He received a Ph.D. in physics from Brown University.