The initial questions treated by information theory lay in the areas of data compression and transmission. The answers are quantities such as entropy and mutual information, which are functions of the probability distributions that underlie the process of communication.
The Kolmogorov complexity of a binary string is defined as the length of the shortest computer program that prints out the string.
Entropy is the uncertainty of a single random variable.
A communication channel is a system in which the output depends probabilistically on its input.
The entropy H of a random variable is a lower bound on the average length of the shortest description of the random variable.
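The quoted definition of entropy as a lower bound on average description length can be illustrated with a minimal sketch (the helper name `entropy` is my own, not from the book):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) in bits: -sum(p * log2(p)) over the support."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin has maximal uncertainty for two outcomes: 1 bit,
# so no code can describe its outcomes in fewer than 1 bit on average.
print(entropy([0.5, 0.5]))  # 1.0

# A biased coin is less uncertain, so shorter average descriptions suffice.
print(entropy([0.9, 0.1]))
```

The biased coin's entropy is below 1 bit, which is why compressing long runs of its outcomes (e.g. with a Huffman or arithmetic code over blocks) can beat one bit per symbol.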
While reading this book I came across a Zhihu thread asking, "Is information theory still worth studying?" After some thought, I'd say some of its ideas remain very important — information entropy, for instance, was used in ByteDance's ACL 2021 Best Paper Award winner, and I was fortunate enough to discuss the work with its first author. That said, information theory probably matters most in the communications field.
1 useful Cannibal 2011-04-29 22:48:47
this is good
0 useful 英子 2010-06-06 15:35:19
I've read the chapter covering relative entropy and related concepts (Chapter 12) and found it very easy to follow — a reference book I really like.