The latest edition of this classic is updated with new problem sets and material
The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory.
All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.
The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references
Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications. An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.
Original text excerpts
The initial questions treated by information theory lay in the areas of data compression and transmission. The answers are quantities such as entropy and mutual information, which are functions of the probability distributions that underlie the process of communication.
The Kolmogorov complexity of a binary string is defined as the length of the shortest computer program that prints out the string.
Entropy is the uncertainty of a single random variable.
A communication channel is a system in which the output depends probabilistically on its input.
The entropy H of a random variable is a lower bound on the average length of the shortest description of the random variable. (from Section 1.1, Preview of the Book)
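The excerpt's claim that entropy lower-bounds the average description length can be checked numerically. Below is a minimal sketch (not from the book; the distribution, the code lengths, and the function name are illustrative choices) that computes H(X) for a small dyadic distribution and compares it with the expected length of an optimal prefix code, which meets the bound exactly in this case.

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution given as a list of probabilities."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Illustrative dyadic distribution over four symbols (chosen for the example, not taken from the book).
p = [0.5, 0.25, 0.125, 0.125]

# Optimal prefix-code lengths for this distribution: l_i = -log2(p_i).
code_lengths = [1, 2, 3, 3]
avg_length = sum(pi * li for pi, li in zip(p, code_lengths))

print(f"H(X)           = {entropy(p):.3f} bits")  # 1.750
print(f"average length = {avg_length:.3f} bits")  # 1.750, meets the entropy lower bound
```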
Wiley Series in Telecommunications and Signal Processing (15 volumes in total).
Other titles in this series include Principles of Broadband Switching and Networking (Wiley Series in Telecommunications and Signal Processing), Vector Space Projections, Advances in Multiuser Detection, Polynomial Signal Processing, and Introduction to Digital Mobile Communication, among others.
However, for any probability distribution, we define a quantity called the entropy, which has many properties that agree with the intuitive notion of what a measure of information should be. This notion is extended to define mutual information, which is a measure of the amount of information one random variable contains about another. Entropy then becomes the self-information of a random variable. Mutual information is a special case of a more general quantity called relative entropy, which is a measure of the distance between two probability distributions.
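The excerpt describes mutual information as a special case of relative entropy. As a hedged illustration (again not from the book; the joint distribution and helper names are my own), the sketch below computes I(X;Y) as the Kullback-Leibler divergence D(p(x,y) || p(x)p(y)) between a small joint distribution and the product of its marginals.

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) in bits; p and q are dicts over the same outcomes."""
    return sum(p[x] * math.log2(p[x] / q[x]) for x in p if p[x] > 0)

# Illustrative joint distribution of two binary variables (chosen for the example).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions and their product.
px = {x: sum(v for (a, _), v in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(v for (_, b), v in joint.items() if b == y) for y in (0, 1)}
product = {(x, y): px[x] * py[y] for x in (0, 1) for y in (0, 1)}

# Mutual information as a special case of relative entropy: I(X;Y) = D(joint || product of marginals).
print(f"I(X;Y) = {relative_entropy(joint, product):.4f} bits")
```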
0 helpful · Creasy · 2013-02-28
Great book!
0 helpful · 银魔像卡恩 · 2011-06-11
The textbook for EE634 Information Theory. I barely opened the book all semester and just read other people's notes; my homework was copied from the reference solutions. Still, I did fairly well on the final exam, so apparently I have a knack for improvised proofs.
1 helpful · zchenah · 2015-06-20
Too wordy and scattered to finish; not "elements" enough.
0 helpful · 英子 · 2009-08-08
I only read Chapter 2, on the concept of entropy, and found it quite good. I can't speak for the other chapters.
4 helpful · Bayesian · 2012-04-08
A classic. But if you are studying information theory for statistical inference, I would sooner recommend MacKay's Information Theory, Inference and Learning Algorithms.
0 helpful · δεζη · 2021-03-20
(I read about a third of it.) It suits an information-theory beginner like me quite well. Back then I sat in on Prof. Geng's network information theory course and understood nothing, so I picked up this textbook after class to shore up the basics. In the end I still only scratched the surface [facepalm].
0 helpful · 一个榴莲三只鸡 · 2020-12-31
Over the summer I read the first nine chapters on and off while following an online course. Rereading in December, I could follow most of the earlier proofs, but the exercises still leave me lost.
0 helpful · 咸鱼呀 · 2020-08-09
I most likely marked the wrong edition; this is the one I actually read 😂. I still remember how hard it was to get through with my half-baked English.
0 helpful · 青宁 · 2020-06-21
Quite comprehensive, though some parts are not very friendly to information theory newcomers (for example, the joint AEP is simply asserted, with the conclusion given directly, which is a bit bewildering 😂).
0 helpful · shimmeringx · 2020-03-03
A bit scattered in places, but without question still the de facto standard textbook for information theory.