The latest edition of this classic is updated with new problem sets and material
The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory.
All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.
The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references
Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications. An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.
Excerpts from the book
The initial questions treated by information theory lay in the areas of data compression and transmission. The answers are quantities such as entropy and mutual information, which are functions of the probability distributions that underlie the process of communication.
The Kolmogorov complexity of a binary string is defined as the length of the shortest computer program that prints out the string.
Entropy is the uncertainty of a single random variable.
A communication channel is a system in which the output depends probabilistically on its input.
The entropy H of a random variable is a lower bound on the average length of the shortest description of the random variable.
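The excerpt's claim that entropy lower-bounds average description length can be checked numerically. The following is an illustrative sketch (not from the book): it computes the Shannon entropy H(X) in bits and the average codeword length of a Huffman code, which meets the entropy exactly for a dyadic distribution.

```python
import heapq
import math

def entropy(p):
    """Shannon entropy H(X) = -sum p_i * log2(p_i), in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def huffman_avg_length(p):
    """Average codeword length of a binary Huffman code for distribution p.

    Each merge of the two least-probable nodes adds one bit to every symbol
    in the merged subtree, so the total weighted length equals the sum of
    the merged probabilities over all merges.
    """
    heap = list(p)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        total += a + b
        heapq.heappush(heap, a + b)
    return total

p = [0.5, 0.25, 0.125, 0.125]   # dyadic: all probabilities are powers of 2
print(entropy(p))               # 1.75 bits
print(huffman_avg_length(p))    # 1.75 bits -- the bound is achieved
```

For non-dyadic distributions the Huffman length strictly exceeds H(X) but stays within one bit of it, consistent with the source-coding theorem the book proves.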
This book is part of the Wiley Series in Telecommunications and Signal Processing (15 volumes). Other titles in the series include Fundamentals of Telecommunication Networks, Polynomial Signal Processing, Vector Space Projections, Digital Communication Receivers, and Introduction to Digital Mobile Communication.
While reading this book I came across a Zhihu thread asking, "Is information theory still worth studying?" After some thought: some of its results remain very important. Information entropy, for example, was used in ByteDance's ACL 2021 Best Paper Award work, and I was fortunate enough to discuss the research with the paper's first author. Information theory is probably even more important for the communications field. It is hard to imagine that decades ago...
谁家的鸡 2019-12-26 14:21:13
Beautifully written! The mathematics used in information theory is so simple, yet the conclusions are so elegant. Shannon really was brilliant.
无色无味气体 2019-07-22 17:56:49
Read a bit of it for my graduation project. I really should have learned some information theory as an undergraduate; it's one of my regrets.
叉叉小箭猪 2010-02-25 04:03:42
Brilliantly taught, just very terse.
银魔像卡恩 2011-06-11 07:39:05
The textbook for EE634 Information Theory. After a whole semester I had barely opened it; I just read other people's notes, and my homework was copied from the solutions manual... Still, I did quite well on the final exam, so apparently I have a talent for improvised proofs.
Hao 2022-02-02 16:39:04
I have been planning to learn information theory for about eight years now.