Publisher: Springer New York
Published: 2010-02-19
Pages: 608
Price: USD 109.00
Binding: Paperback
Series: Springer Texts in Statistics
ISBN: 9781441929785
Reading notes · · · · · ·
大刀 (persistent effort, constant progress)
Some inequalities that deserve care:

Covariance: $[\mathrm{Cov}(X_i, X_j)]^2 \leq \mathrm{Var}(X_i)\,\mathrm{Var}(X_j)$ for $i \neq j$.
Cauchy–Schwarz inequality: $[E(XY)]^2 \leq EX^2 \, EY^2$.
Hölder inequality: $E|XY| \leq (E|X|^p)^{\frac{1}{p}} (E|Y|^q)^{\frac{1}{q}}$, in which $p > 1$ and $\frac{1}{p} + \frac{1}{q} = 1$.
Lyapunov inequality: $(E|X|^r)^{\frac{1}{r}} \leq (E|X|^s)^{\frac{1}{s}}$, in which $1 \leq r \leq s$.
Minkowski inequality: $(E|X+Y|^p)^{\frac{1}{p}} \leq (E|X|^p)^{\frac{1}{p}} + (E|Y|^p)^{\frac{1}{p}}$, in which $p \geq 1$.
Jensen inequality: $f(EX) \leq Ef(X)$, in which $f$ is a convex function on an open convex set $A$ with $P(X \in A) = 1$.
Chebyshev inequality: $\varphi(t) P(|X| \geq t) \leq \int_{|X| \geq t} \varphi(X)\,dP \leq E\varphi(X)$, in which $\varphi$ is a nonnegative and nondecreasing function on $[0, \infty)$ with $\varphi(-t) = \varphi(t)$.

(1 reply) 2011-12-31 10:39
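These inequalities can be sanity-checked on sample moments (a small script of my own, not from the book): since the empirical distribution of a finite sample is itself a probability measure, the sample versions hold exactly. The choices $p = 3$, $q = 3/2$, and $t = 1.5$ below are arbitrary.

```python
import random

# Empirical check of the inequalities above on a random sample.
random.seed(0)
n = 10_000
x = [random.gauss(0, 1) for _ in range(n)]
y = [random.gauss(0, 1) for _ in range(n)]

def mean(v):
    return sum(v) / len(v)

# Cauchy-Schwarz: [E(XY)]^2 <= E(X^2) E(Y^2)
exy = mean([a * b for a, b in zip(x, y)])
assert exy ** 2 <= mean([a * a for a in x]) * mean([b * b for b in y])

# Holder with p = 3, q = 3/2: E|XY| <= (E|X|^p)^(1/p) (E|Y|^q)^(1/q)
p, q = 3.0, 1.5
lhs = mean([abs(a * b) for a, b in zip(x, y)])
rhs = (mean([abs(a) ** p for a in x]) ** (1 / p)
       * mean([abs(b) ** q for b in y]) ** (1 / q))
assert lhs <= rhs

# Jensen with the convex f(t) = t^2: f(EX) <= E f(X)
assert mean(x) ** 2 <= mean([a * a for a in x])

# Chebyshev with phi(t) = t^2: phi(t) P(|X| >= t) <= E phi(X)
t = 1.5
p_tail = mean([1.0 if abs(a) >= t else 0.0 for a in x])
assert t ** 2 * p_tail <= mean([a * a for a in x])
print("all empirical inequality checks passed")
```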
大刀 (persistent effort, constant progress)
In this subsection, the author Jun Shao gives a brief but sufficient introduction to measures, probability, and random elements. Measure theory has always given me trouble when reading papers, so I made up my mind to read this well-known book and write down some notes. I don't really want a PhD or anything; I just want to be clear about the things a master's student in Statistics should be clear about.

A rigorous and logically consistent definition of probability was given by Kolmogorov, one of the most famous mathematicians, in 1933. That fabulous work developed the measure-theoretic foundations of probability theory; we may call the result advanced probability, to distinguish it from classical probability. Let's work this out in more detail. First I introduce the definition of a $\sigma$-field. After that we can define measures on $\sigma$-fields and so obtain measure spaces. Then we give some examples of measure spaces, in particular the probability space. We will see that probability is essentially a measure on a particular space.

Let $\Omega$ be a set of elements of interest; we call $\Omega$ the outcome space. (In statistics, $\Omega$ usually represents the set of all possible outcomes of an experiment we are interested in.) A measure is a natural mathematical extension of the length, area, or volume of subsets of low-dimensional Euclidean space. We should be aware that not every subset of a space is measurable, so we collect the measurable subsets of the original space into a set of their own. That is the motivation for $\sigma$-fields. Here is the definition: let $F$ be a collection of subsets of $\Omega$. $F$ is called a $\sigma$-field if and only if it has the following properties:
(1) $\emptyset \in F$;
(2) if $A \in F$, then $A^c \in F$;
(3) if $A_i \in F$, $i = 1, 2, \ldots$, then $\bigcup A_i \in F$.
Definitions of this kind appear everywhere in measure theory: characterizing an object by its properties makes it easy to define new objects, and this style is part of what allowed measure theory to develop.

A pair $(\Omega, F)$ is called a measurable space. The elements of $F$ are called measurable sets in measure theory, or events in statistics. Let's look at a really popular $\sigma$-field, the Borel $\sigma$-field. On the real line $R$, let $C$ be the collection of all finite open intervals. Then $\sigma(C)$ is called the Borel $\sigma$-field and denoted by $B$; the elements of $B$ are called Borel sets. One can show that all intervals, whether open or closed, are Borel sets.

Now that we have $\sigma$-fields and measurable spaces, let's see how a measure is defined. Let $(\Omega, F)$ be a measurable space. A set function $\nu$ defined on $F$ is called a measure if and only if it has the following properties:
(1) $0 \leq \nu(A) \leq \infty$ for any $A \in F$;
(2) $\nu(\emptyset) = 0$;
(3) if $A_i \in F$, $i = 1, 2, \ldots$, and the $A_i$ are disjoint, then $\nu(\bigcup A_i) = \sum \nu(A_i)$.
The triple $(\Omega, F, \nu)$ is then called a measure space. When $\nu(\Omega) = 1$, $\nu$ is called a probability measure and usually denoted by $P$, and $(\Omega, F, P)$ is called a probability space. Lebesgue measure may be the most intuitive measure. There is a one-to-one correspondence between the set of all probability measures on $(R, B)$ and a set of functions on $R$; this is where we meet the cumulative distribution function (c.d.f.), a map from a probability measure to a function.
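The definitions above can be verified mechanically on a toy finite space (my own example, not from the book): when $\Omega$ is finite, countable unions reduce to finite ones, so the power set is a $\sigma$-field and normalized counting measure is a probability measure.

```python
from fractions import Fraction
from itertools import chain, combinations

# Toy probability space: Omega = {1,...,4}, F = the power set of Omega,
# P(A) = |A| / |Omega| (normalized counting measure).
Omega = frozenset({1, 2, 3, 4})
F = {frozenset(s) for s in chain.from_iterable(
    combinations(Omega, r) for r in range(len(Omega) + 1))}

# sigma-field axioms (countable unions reduce to finite ones here):
assert frozenset() in F                          # (1) empty set belongs to F
assert all(Omega - A in F for A in F)            # (2) closed under complements
assert all(A | B in F for A in F for B in F)     # (3) closed under unions

def P(A):
    return Fraction(len(A), len(Omega))

# measure axioms: nonnegativity, P(empty) = 0, additivity on disjoint sets,
# and P(Omega) = 1 makes P a probability measure.
assert all(0 <= P(A) <= 1 for A in F)
assert P(frozenset()) == 0 and P(Omega) == 1
A, B = frozenset({1}), frozenset({3, 4})
assert P(A | B) == P(A) + P(B)

# c.d.f. of the identity random variable X(w) = w: cdf(x) = P((-inf, x])
def cdf(x):
    return P(frozenset(w for w in Omega if w <= x))

assert cdf(0) == 0 and cdf(2) == Fraction(1, 2) and cdf(4) == 1
print("sigma-field, measure, and cdf checks all hold")
```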
Explicitly, $F(x) = P((-\infty, x])$. We can also extend Lebesgue measure to higher-dimensional spaces using Cartesian products; I only mention this here, and the reader should refer to page 5 of the book.

Now we have measure spaces and measures. Since $\Omega$ can be quite arbitrary, it is convenient to consider a mapping $f$ from $\Omega$ to a simpler space $\Lambda$. If $f^{-1}(G) \subset F$, then the mapping $f$ is called a measurable function from $(\Omega, F)$ to $(\Lambda, G)$. In probability theory a measurable function is called a random element; if it is measurable from $(\Omega, F)$ to $(R, B)$, it is called a random variable. $f^{-1}(G)$ is a sub-$\sigma$-field of $F$, denoted by $\sigma(f)$. So we have the definition of a random variable (random vectors can be defined similarly). The indicator function of a set $A \subset \Omega$ is the simplest measurable function, and $\sigma(I_A) = \{\emptyset, \Omega, A, A^c\}$ is a much smaller $\sigma$-field. The class of simple functions is obtained by taking $\phi(\omega) = \sum_{i=1}^k a_i I_{A_i}(\omega)$; we know that such a function is a Borel function. Proposition 1.4 on page 8 gives some relationships between Borel functions and measurable functions. There are many Borel functions; it is really hard to find a non-Borel function. If $\nu = P$ and $X$ is a random variable, then $P \circ X^{-1}$ is called the law or distribution of $X$ and is denoted by $P_X$. So far we have covered probability spaces, measures, and random variables in measure theory. That is the basis of advanced mathematical statistics. 2011-12-28 13:39
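The objects in this last paragraph are easy to make concrete on a toy finite space (again my own example, not from the book): the indicator $I_A$ generates the four-element $\sigma$-field $\{\emptyset, \Omega, A, A^c\}$, and the law $P_X = P \circ X^{-1}$ is computed by summing $P$ over preimages.

```python
from fractions import Fraction

# Toy example: a fair die, with the uniform probability P on Omega.
Omega = frozenset(range(1, 7))

def P(A):
    return Fraction(len(A), len(Omega))

A = frozenset({1, 2})

def I_A(w):
    """Indicator of A: the simplest measurable function."""
    return 1 if w in A else 0

# sigma(I_A) = {empty, Omega, A, A^c}: every preimage of a Borel set
# under I_A is one of these four sets.
sigma_I_A = {frozenset(), Omega, A, Omega - A}
preimage_of_1 = frozenset(w for w in Omega if I_A(w) == 1)
assert preimage_of_1 in sigma_I_A

# The law P_X = P o X^{-1} of the parity variable X(w) = w mod 2,
# obtained by summing P over each preimage X^{-1}({x}).
def X(w):
    return w % 2

law = {}
for w in Omega:
    law[X(w)] = law.get(X(w), 0) + P(frozenset({w}))

assert law == {0: Fraction(1, 2), 1: Fraction(1, 2)}
print("law of X:", law)
```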

Other editions of this book · · · · · · (all 6)
Recommended Douban lists · · · · · · (all)
 谁能告诉我统计学是什么？ ("Who can tell me what statistics is?") (azalea)
 Methodology & Software (Intercept)
 Measure-based Mathematical Statistics (NathanJ)
 待出售的200本宝贝 ("200 treasures for sale") (mlatfield)
 数学 ("Mathematics") (想望)
3 useful · [account deleted] · 2016-01-24
The companion solution manual is a real service to the community.
0 useful · Cat Helix · 2018-06-23
I used it as a review reference; having solutions to some of the exercises saved me in this course.
0 useful · mlatfield · 2014-01-29
There is also a companion volume, Mathematical Statistics: Exercises and Solutions.
1 useful · 科洛桑理工学院 · 2019-03-27
For anyone taking MATH 361 Theory of Statistical Inference: I think this book can be a lifesaver.