Publisher: Cambridge University Press
Subtitle: The Logic of Science
Published: 2003-6-9
Pages: 753
Price: USD 110.00
Binding: Hardcover
ISBN: 9780521592710
Synopsis · · · · · ·
The standard rules of probability can be interpreted as uniquely valid principles in logic. In this book, E. T. Jaynes dispels the imaginary distinction between 'probability theory' and 'statistical inference', leaving a logical unity and simplicity, which provides greater technical power and flexibility in applications. This book goes beyond the conventional mathematics of probability theory, viewing the subject in a wider context. New results are discussed, along with applications of probability theory to a wide variety of problems in physics, mathematics, economics, chemistry and biology. It contains many exercises and problems, and is suitable for use as a textbook on graduate level courses involving data analysis. The material is aimed at readers who are already familiar with applied mathematics at an advanced undergraduate level or higher. The book will be of interest to scientists working in any area where inference from incomplete information is necessary.

The general central tendency of probability distributions towards this final form is now seen as a consequence of their maximum entropy properties. If a probability distribution is subjected to some transformation that discards information but leaves certain quantities invariant, then, under very general conditions, if the transformation is repeated, the distribution tends to the one with maximum entropy, subject to the constraints of those conserved quantities. This brings us to the term "central limit theorem", which we have derived as a special case of the phenomenon just noted – the behavior of probability distributions under repeated convolutions, which conserve first and second moments. This name was introduced by George Polya (1920), with the intention that the adjective "central" was to modify the noun "theorem"; i.e. it is the limit theorem which is central to probability theory. Almost universally, students today think that "central" modifies "limit", so that it is instead a theorem about a "central limit", whatever that means.
2014-09-01 12:45
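The mechanism in this passage can be checked numerically. A sketch of my own (not code from the book): under n-fold convolution the first two cumulants add, so mean and variance grow linearly with n, while the skewness decays like 1/√n and the distribution drifts toward the maximum-entropy Gaussian shape.

```python
import numpy as np

# A deliberately skewed starting distribution on {0, 1, 2, 3}.
p = np.array([0.7, 0.1, 0.1, 0.1])

def convolve_n(p, n):
    """Distribution of the sum of n i.i.d. draws (n-fold convolution)."""
    out = np.array([1.0])  # point mass at 0: identity element for convolution
    for _ in range(n):
        out = np.convolve(out, p)
    return out

def moments(q):
    """Mean, variance, and skewness of a distribution on {0, 1, ..., len(q)-1}."""
    x = np.arange(len(q))
    mean = (x * q).sum()
    var = ((x - mean) ** 2 * q).sum()
    skew = (((x - mean) ** 3) * q).sum() / var ** 1.5
    return mean, var, skew

for n in (1, 4, 16, 64):
    mean, var, skew = moments(convolve_n(p, n))
    print(f"n={n:3d}  mean={mean:6.2f}  var={var:6.2f}  skewness={skew:+.3f}")
```

Mean and variance come out exactly n times those of p (the conserved quantities Jaynes mentions), while the skewness shrinks by the factor √n, illustrating the loss of every feature except the first two moments.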
My own impression ... is that the mathematical results have outrun their interpretation and that some simple explanation of the force and meaning of the celebrated integral ... will one day be found ... which will at once render useless all the works hitherto written. – Augustus de Morgan (1838)
145 years after de Morgan's remark (1983), "Why have we for so long managed with normality assumptions?"
2014-09-01 10:50
The most ubiquitous reason for using the Gaussian sampling distribution is not that the error frequencies are known to be – or assumed to be – Gaussian, but rather because those frequencies are unknown. One sees what a totally different outlook this is than that of Feller and Barnard; "normality" was not an assumption of physical fact at all. It was a valid description of our state of knowledge. In most cases, had we done anything different, we would be making an unjustified, gratuitous assumption.
2014-09-01 10:42
We shall find, in the central limit theorem, still another strong justification for using Gaussian error distributions. But if the Gaussian law is nearly always a good representation of our state of knowledge about the errors in our specific data set, it follows that inferences made from it are nearly always the best ones that could have been made from the information that we actually have.
2014-09-01 10:35
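The "best inferences from the information we actually have" claim rests on the maximum-entropy property of the Gaussian: among all densities with a fixed variance, it has the largest differential entropy. A small numeric check of my own, using the standard closed-form entropies (textbook results, not code from the book):

```python
import math

# Closed-form differential entropies at a common variance sigma^2:
#   Gaussian:                      0.5 * ln(2*pi*e*sigma^2)
#   Uniform of width w = sqrt(12*sigma^2) (so var = w^2/12):  ln(w)
#   Laplace of scale b = sqrt(sigma^2/2) (so var = 2*b^2):    1 + ln(2*b)
sigma2 = 1.0
h_gauss = 0.5 * math.log(2 * math.pi * math.e * sigma2)
h_uniform = math.log(math.sqrt(12 * sigma2))
h_laplace = 1 + math.log(2 * math.sqrt(sigma2 / 2))

print(f"Gaussian {h_gauss:.4f}  Uniform {h_uniform:.4f}  Laplace {h_laplace:.4f}")
```

The Gaussian value (about 1.4189 nats at unit variance) exceeds both alternatives, so assuming it concedes the most ignorance consistent with the known mean and variance – which is exactly the sense in which it describes a state of knowledge rather than a physical fact.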

The actual science of logic is conversant at present only with things either certain, impossible, or entirely doubtful, none of which (fortunately) we have to reason on. Therefore the true logic for this world is the calculus of Probabilities, which takes account of the magnitude of the probability which is, or ought to be, in a reasonable man's mind. – James Clerk Maxwell (1850)
2013-07-31 19:59
和不秉烛 (La Liberté guidant le peuple)
Stefan Banach ([ˈstɛfan ˈbanax]; March 30, 1892 – August 31, 1945) was a Polish mathematician who worked in interwar Poland and in Soviet Ukraine. He is generally considered to have been one of the 20th century's most important and influential mathematicians. A self-taught mathematics prodigy, Banach was the founder of modern functional analysis and a founder of the Lwów School of Mathematics. Among his most prominent achievements was the 1932 book, Théorie des opérations linéaires (Theory of Linear Operations), the first monograph on the general theory of functional analysis. Notable mathematical concepts named after Banach include the Banach–Tarski paradox, the Hahn–Banach theorem, the Banach–Steinhaus theorem, the Banach–Mazur game, and the Banach space.
Of course, on publishing a new theorem, the mathematician will try very hard to invent an argument which uses only the first kind; but the reasoning process which led to the theorem in the first place almost always involves one of the weaker forms (based, for example, on following up conjectures suggested by analogies). The same idea is expressed in a remark of S. Banach (quoted by S. Ulam, 1957): Good mathematicians see analogies between theorems; great mathematicians see analogies between analogies.
2011-05-03 21:51
In our reasoning we depend very much on prior information to help us in evaluating the degree of plausibility in a new problem. This reasoning process goes on unconsciously, almost instantaneously, and we conceal how complicated it really is by calling it common sense.
2013-07-31 20:27

和不秉烛 (La Liberté guidant le peuple)
Many people are fond of saying, ‘They will never make a machine to replace the human mind – it does many things which no machine could ever do.’ A beautiful answer to this was given by J. von Neumann in a talk on computers given in Princeton in 1948, which the writer was privileged to attend. In reply to the canonical question from the audience (‘But of course, a mere machine can’t really think, can it?’), he said: You insist that there is something a machine cannot do. If you will tell me precisely what it is that a machine cannot do, then I can always make a machine which will do just that!
2011-05-04 09:50

Forum · · · · · ·
A place introducing fragments of the book's ideas – from 黠之大者chi – 3 replies – 2011-07-13
The English reprint of this book will soon be published by Posts & Telecom Press (Turing). – from 刘江 – 3 replies – 2015-09-07
Other editions of this book · · · · · · (all 2)
Posts & Telecom Press edition, 2009-4 / 125 people have read it
Recommended in these lists · · · · · · (all)
Books mentioned in 《暗时间》 (pn)
Books to read (TBONTB64)
Mathematics textbooks I consider good (rushui999)
[Personal] Reading list, Season 10-11 (伞保护协会)
Statistics, artificial intelligence, machine learning (要专注)
0 useful – revv – 2012-05-24
Explains deep and difficult theory in an engaging, approachable way. Highly recommended.
0 useful – 流年闲草 – 2015-02-15
A philosophical perspective on probability theory: profound, insightful, and beautiful.
0 useful – 夢の點滴 – 2010-08-20
This book is just too much... I got stuck as soon as I reached chapter four... (。_。)
0 useful – Seal Huang – 2011-02-01
Setting it aside for now to shore up the fundamentals...
0 useful – beren – 2016-10-11
Very much in the spirit of G. Polya. The preface mentions that the author died before the book was finished and, before his death, asked his editor to publish it for him. Even so, it is the weightiest book I have read this year and the one I read most carefully; having finished it, I still feel it deserves to be read again and again. Although it is broadly about Bayesian statistics, there is much more to be found in it: it touches on mathematics and quantum physics, and treats them close to first principles. Its critique of the frequentist school is the most extensive and systematic I have seen anywhere; the discussion of how unreliable p-values are, and the various paradoxes, are exciting to read. Even if you skip the formulas, it is still a wonderful read!
0 useful – Collin – 2014-10-12
You'd be surprised how motivated people are.
0 useful – Tzahd – 2012-07-22
Jaynes wanted to write too much and cover everything, so the book is thick, somewhat laboriously written, and somewhat heavy going. It is still a good book, because reading it lets you feel the weight of what one physicist pursued over a lifetime.