Introduction to Mathematical Statistics
From inside the book
Results 1-3 of 38
Page 12
... Proof. In Theorem 1, take A = ∅; we have P(∅) = 1 − P(𝒜) = 1 − 1 = 0, and the theorem is proved. Theorem 3. If A₁ and A₂ are subsets of 𝒜 such that A₁ ⊂ A₂, then P(A₁) ≤ P(A₂). Proof. Now A₂ = A₁ ∪ (A₁* ...
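The snippet breaks off mid-proof of Theorem 3. A standard completion of the argument, sketched in LaTeX under the assumption that A₁* denotes the complement of A₁ and that probability is additive over disjoint unions:

```latex
% A_2 splits into the disjoint pieces A_1 and A_1^* \cap A_2,
% so additivity of P gives the monotonicity claim.
\[
A_2 = A_1 \cup (A_1^{*} \cap A_2),
\qquad
A_1 \cap (A_1^{*} \cap A_2) = \varnothing ,
\]
\[
P(A_2) = P(A_1) + P(A_1^{*} \cap A_2) \ge P(A_1),
\]
since $P(A_1^{*} \cap A_2) \ge 0$.
```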
Page 47
... Proof. The proof is given when the random variable X is of the continuous type; but the proof can be adapted to the discrete case if we replace integrals by sums. Let A = {x; u(x) ≥ c} and let f(x) denote the p.d.f. of X. Then ...
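The excerpt stops just as the integral argument begins. Assuming u(X) is a nonnegative function and c > 0, the usual continuation (the Markov-type inequality this theorem establishes) runs as follows:

```latex
% With A = \{x : u(x) \ge c\}, restrict the integral to A,
% then bound u(x) below by c on A.
\[
E[u(X)] = \int_{-\infty}^{\infty} u(x)\, f(x)\, dx
        \;\ge\; \int_{A} u(x)\, f(x)\, dx
        \;\ge\; c \int_{A} f(x)\, dx
        \;=\; c\, P(A),
\]
\[
\text{hence}\qquad P[\,u(X) \ge c\,] \;\le\; \frac{E[u(X)]}{c}.
\]
```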
Page 72
... Proof. The stochastic independence of X₁ and X₂ implies that the joint p.d.f. of X₁ and X₂ is f₁(x₁)f₂(x₂). Thus, we have, by definition of mathematical expectation, in the continuous case, E[u(X₁)v(X₂)] = ∫ ...
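The double integral is cut off in the snippet. Under the stated factorization of the joint p.d.f., the computation it leads into is the standard one:

```latex
% Independence lets the double integral factor into a product
% of two single integrals, one in x_1 and one in x_2.
\[
E[u(X_1)v(X_2)]
  = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty}
      u(x_1)\, v(x_2)\, f_1(x_1)\, f_2(x_2)\, dx_1\, dx_2
\]
\[
  = \left[\int_{-\infty}^{\infty} u(x_1) f_1(x_1)\, dx_1\right]
    \left[\int_{-\infty}^{\infty} v(x_2) f_2(x_2)\, dx_2\right]
  = E[u(X_1)]\, E[v(X_2)].
\]
```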
Common terms and phrases
A₁, A₂, Accordingly, best critical region, c₁, cent, confidence interval, chi-square distribution, complete sufficient statistic, compute, conditional p.d.f., confidence interval, Consider, continuous type, critical region, decision function, defined, degrees of freedom, denote a random, discrete type, distribution function, distribution having p.d.f., Equation, Example, EXERCISES, function F(x), given H₁, hypothesis H, independent random variables, integral, joint p.d.f., k₁, Let X₁, Let Y₁, limiting distribution, marginal p.d.f., matrix, maximum likelihood, moment-generating function, mutually stochastically independent, noncentral, normal distribution, order statistics, p.d.f. of Y₁, P(A₁), p₁, Poisson distribution, positive integer, probability density functions, quadratic form, random experiment, random interval, random sample, random variables X₁, respectively, sample space, Show, significance level, simple hypothesis, statistic Y₁, stochastically independent random, sufficient statistic, theorem, unbiased statistic, variance σ², w₁, X₁, X₁ and X₂, X₂, Y₂, Z₁, zero elsewhere, μ₁, μ₂, Σ