Information theory as applied to neural networks embodies parametric entities and conceptual bases pertinent to memory and information storage, information-theoretic cost functions, and neurocybernetics and self-organization. Existing studies cover the entropy and cybernetic aspects of neural information only sparsely.
Information-Theoretic Aspects of Neural Networks cohesively explores this burgeoning discipline.
Information-Theoretic Aspects of Neural Networks is an exceptional resource for engineers, scientists, and computer scientists working on artificial neural networks, as well as for biologists applying the concepts of communication theory and protocols to the functioning of the brain. The book explores new avenues in the field and creates a common platform for analyzing both the biological neural complex and artificial neural networks.
Table of Contents
Neural Complex: A Nonlinear CI System?
Neural Complex vis-a-vis Statistical Mechanics, Entropy, Thermodynamics and Information Theory
Neural Communication and Control in Information-Theoretic Plane
Neural Complexity: An Algorithmic Representation
Neural Information Dynamics
Semiotic Framework of Neural Information Processing
Genetic Algorithmic Based Depiction of Neural Information