Language: English
Description
The first comprehensive introduction to information theory, this text explores the work begun by Shannon and continued by McMillan, Feinstein, and Khinchin. Its rigorous treatment covers the entropy concept in probability theory, the fundamental theorems of information theory, ergodic sources, the martingale concept, anticipation and memory, and other subjects. 1957 edition.