Next: Assignment 7 Up: 22S:194 Statistical Inference II Previous: Assignment 6

Solutions

8.14
Use $ R = \{x: \sum x_{i} > c\}$. $ \alpha=0.01$ means

$\displaystyle 0.01 = P(\sum X_{i} > c\vert p=0.49) \approx P\left(Z > \frac{c-0.49}{\sqrt{0.49 \times 0.51}}\sqrt{n}\right)$    

So

$\displaystyle \frac{c-0.49}{\sqrt{0.49 \times 0.51}}\sqrt{n} = 2.326$    

$ \beta=0.99$ implies

$\displaystyle 0.99 = P(\sum X_{i} > c\vert p=0.51) \approx P\left(Z > \frac{c-0.51}{\sqrt{0.49 \times 0.51}}\sqrt{n}\right)$    

So

$\displaystyle \frac{c-0.51}{\sqrt{0.49 \times 0.51}}\sqrt{n} = -2.326$    

So

$\displaystyle c-0.49$ $\displaystyle = 2.326\sqrt{0.49 \times 0.51}\frac{1}{\sqrt{n}}$    
$\displaystyle c-0.51$ $\displaystyle = -2.326\sqrt{0.49 \times 0.51}\frac{1}{\sqrt{n}}$    

or

$\displaystyle \sqrt{n} \times 0.02$ $\displaystyle = 2 \times 2.326 \sqrt{0.49 \times 0.51}$    
$\displaystyle n$ $\displaystyle = (100)^{2} \times (2.326)^{2} \times 0.49 \times 0.51$    
  $\displaystyle = 13521$    
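As a quick arithmetic check (a sketch, not part of the original solution), the sample-size formula can be evaluated directly:

```python
# Numerical check of the 8.14 sample-size calculation.
# z = 2.326 is the upper 0.01 quantile of the standard normal.
import math

z = 2.326
s2 = 0.49 * 0.51            # Bernoulli variance near p = 1/2

# From c - 0.49 = z*s/sqrt(n) and c - 0.51 = -z*s/sqrt(n):
# sqrt(n) * 0.02 = 2*z*s, so n = (100)^2 * z^2 * 0.49 * 0.51.
n = (2 * z / 0.02) ** 2 * s2
print(math.ceil(n))         # rounds up to 13521
```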

8.17
For $ {X}_{1},\ldots,{X}_{n}$ $ i.i.d.$ Beta($ \mu,1$)

$\displaystyle L(\mu\vert x)$ $\displaystyle = \mu^{n} \prod x_{i}^{\mu-1}$    
  $\displaystyle = \mu^{n}e^{\mu\sum\log x_{i}} e^{-\sum\log x_{i}}$    

So

$\displaystyle \widehat{\mu}$ $\displaystyle = -\frac{n}{\sum \log x_{i}}$    
$\displaystyle L(\widehat{\mu}\vert x)$ $\displaystyle = \left(-\frac{n}{\sum\log x_{i}}\right)^{n}\exp\{-n-\sum\log x_{i}\}$    

and

$\displaystyle \Lambda(x)$ $\displaystyle = \frac{\left(-\frac{n+m}{\sum\log x_{i}+\sum\log y_{i}}\right)^{n+m}\exp\{-(n+m)-\sum\log x_{i}-\sum\log y_{i}\}}{\left(-\frac{n}{\sum\log x_{i}}\right)^{n}\exp\{-n-\sum\log x_{i}\} \left(-\frac{m}{\sum\log y_{i}}\right)^{m}\exp\{-m-\sum\log y_{i}\}}$    
  $\displaystyle = \frac{(n+m)^{n+m}}{n^{n}m^{m}}T^{n}(1-T)^{m}$    

where $ T = \sum\log x_{i}/\left(\sum\log x_{i}+\sum\log y_{i}\right)$.

So

$\displaystyle \{\Lambda < c\} = \{$$ T < c_{1}$ or $ T > c_{2}$$\displaystyle \}$    

for suitable $ c_{1}, c_{2}$.

Under $ H_{0}$, $ -\log X_{i},-\log Y_{i}$ are $ i.i.d.$ exponential, so $ T \sim$   Beta$ (n,m)$.

To find $ c_{1}$ and $ c_{2}$ either cheat and use equal tail probabilities (the right thing to do by symmetry if $ n=m$), or solve numerically.
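The numerical route can be sketched by Monte Carlo; the values of $ n$, $ m$, $ \alpha$, and the common null parameter below are illustrative assumptions, not from the problem:

```python
# Sketch for 8.17: under H0 the statistic
# T = sum(log x_i) / (sum(log x_i) + sum(log y_i)) is Beta(n, m),
# so c1, c2 can be taken as empirical equal-tail quantiles.
import random

random.seed(1)
n, m, alpha, mu0, reps = 5, 7, 0.05, 2.0, 200_000

ts = []
for _ in range(reps):
    # -log X_i are i.i.d. exponential with rate mu0 under H0
    a = sum(random.expovariate(mu0) for _ in range(n))
    b = sum(random.expovariate(mu0) for _ in range(m))
    ts.append(a / (a + b))

ts.sort()
c1 = ts[int(alpha / 2 * reps)]         # lower equal-tail cutoff
c2 = ts[int((1 - alpha / 2) * reps)]   # upper equal-tail cutoff
print(round(sum(ts) / reps, 3))        # close to the Beta mean n/(n+m)
```

With SciPy available, exact Beta($ n,m$) quantiles would replace the simulated ones.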

8.15

$\displaystyle L(\sigma^{2}\vert x)$ $\displaystyle = \frac{1}{(\sigma^{2})^{n/2}} \exp\left\{-\frac{1}{2\sigma^{2}}\sum x_{i}^{2}\right\}$    
$\displaystyle \frac{L(\sigma_{1}^{2})}{L(\sigma_{0}^{2})}$ $\displaystyle = \left(\frac{\sigma_{0}^{2}}{\sigma_{1}^{2}}\right)^{n/2} \exp\left\{\frac{1}{2}\sum x_{i}^{2} \left(\frac{1}{\sigma_{0}^{2}}-\frac{1}{\sigma_{1}^{2}}\right)\right\}$    

If $ \sigma_{1} > \sigma_{0}$ then this is increasing in $ \sum x_{i}^{2}$. So

$\displaystyle L(\sigma_{1}^{2})/L(\sigma_{0}^{2}) > k$    

for some $ k$ if and only if $ \sum X_{i}^{2} > c$ for some $ c$.

Under $ H_{0}:\sigma=\sigma_{0}$, $ \sum X_{i}^{2}/\sigma_{0}^{2} \sim \chi^{2}_{n}$, so

$\displaystyle c = \sigma_{0}^{2} \chi^{2}_{n,\alpha}$    
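A simulation sketch of this cutoff, with illustrative choices of $ n$, $ \alpha$, and $ \sigma_{0}$:

```python
# Sketch for 8.15: estimate c = sigma0^2 * chi2_{n,alpha} by simulating
# sum(X_i^2) under H0 and taking the empirical upper-alpha quantile.
import random

random.seed(2)
n, alpha, sigma0, reps = 10, 0.05, 1.0, 100_000

sums = sorted(sum(random.gauss(0.0, sigma0) ** 2 for _ in range(n))
              for _ in range(reps))
c = sums[int((1 - alpha) * reps)]   # empirical upper-alpha quantile
print(round(c, 1))                  # chi2_{10, 0.05} is about 18.3
```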

8.25
a.
For $ \theta_2 > \theta_1$

$\displaystyle \frac{g(x\vert\theta_2)}{g(x\vert\theta_1)} = \frac{\exp\left\{-\frac{(x-\theta_2)^2}{2\sigma^2}\right\}}{\exp\left\{-\frac{(x-\theta_1)^2}{2\sigma^2}\right\}} =$   const$\displaystyle \times \exp\{x(\theta_2 - \theta_1)/\sigma^2\}$    

This is increasing in $ x$ since $ \theta_2 - \theta_1 > 0$.
b.
For $ \theta_2 > \theta_1$

$\displaystyle \frac{g(x\vert\theta_2)}{g(x\vert\theta_1)} = \frac{\theta_2^x e^{-\theta_2}/ x!}{\theta_1^x e^{-\theta_1}/ x!} =$   const$\displaystyle \times \left(\frac{\theta_2}{\theta_1}\right)^x$    

which is increasing in $ x$ since $ \theta_2/\theta_1 > 1$.
c.
For $ \theta_2 > \theta_1$

$\displaystyle \frac{g(x\vert\theta_2)}{g(x\vert\theta_1)} = \frac{\binom{n}{x}\theta_2^x(1-\theta_2)^{n-x}} {\binom{n}{x}\theta_1^x(1-\theta_1)^{n-x}} =$   const$\displaystyle \times \left(\frac{\theta_2/(1-\theta_2)}{\theta_1/(1-\theta_1)}\right)^x$    

This is increasing in $ x$ since $ \theta/(1-\theta)$ is increasing in $ \theta$.
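The three MLR claims above can be spot-checked numerically; the parameter values below are illustrative assumptions:

```python
# Check that g(x|theta2)/g(x|theta1) is increasing in x for the
# normal (sigma = 1), Poisson, and binomial (n = 8) families of 8.25.
import math

def ratios(g, xs, t1, t2):
    return [g(x, t2) / g(x, t1) for x in xs]

norm = lambda x, t: math.exp(-(x - t) ** 2 / 2)
pois = lambda x, t: t ** x * math.exp(-t) / math.factorial(x)
binom = lambda x, t: math.comb(8, x) * t ** x * (1 - t) ** (8 - x)

for g, xs in [(norm, [i / 2 for i in range(-6, 7)]),
              (pois, range(10)),
              (binom, range(9))]:
    r = ratios(g, xs, 0.3, 0.6)
    assert all(a < b for a, b in zip(r, r[1:]))  # strictly increasing
print("all three families have MLR in x")
```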
8.28
a.

$\displaystyle \frac{f(x\vert\theta_{2})}{f(x\vert\theta_{1})} = e^{\theta_{1}-\theta_{2}} \frac{(1+e^{x-\theta_{1}})^{2}}{(1+e^{x-\theta_{2}})^{2}} =$   const$\displaystyle \times \left(\frac{e^{\theta_{1}}+e^{x}}{e^{\theta_{2}}+e^{x}}\right)^{2}$    

Let

$\displaystyle g(y)$ $\displaystyle = \frac{A+y}{B+y}$    
$\displaystyle g'(y)$ $\displaystyle = \frac{B+y-A-y}{(B+y)^{2}} = \frac{B-A}{(B+y)^{2}}$    

Then $ g'(y) \ge 0$ if $ B \ge A$. Taking $ A = e^{\theta_{1}}$, $ B = e^{\theta_{2}}$, and $ y = e^{x}$, we have $ B > A$ since $ \theta_{2} > \theta_{1}$, so the ratio is increasing in $ x$ and we have MLR in $ x$.
b.
Since the ratio is increasing in $ x$, the most powerful test is of the form $ R = \{x > c \}$. Now

$\displaystyle F_{\theta}(x) = 1-\frac{1}{1+e^{x-\theta}}$    

So for $ H_{0}: \theta = 0$, setting $ \alpha = 0.2 = P(X > c\vert\theta=0) = 1/(1+e^{c})$ gives

$\displaystyle 1+e^{c}$ $\displaystyle = 5$    
$\displaystyle e^{c}$ $\displaystyle = 4$    
$\displaystyle c$ $\displaystyle = \log 4 = 1.386$    

The power is

$\displaystyle \beta(1) = \frac{1}{1+e^{\log(4) -1}} = \frac{1}{1+4/e} = 0.405$    
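These two numbers are easy to verify directly:

```python
# Arithmetic check for 8.28(b): size-0.2 cutoff and power at theta = 1
# for the logistic model with F_theta(x) = 1 - 1/(1 + e^{x - theta}).
import math

alpha = 0.2
c = math.log(1 / alpha - 1)           # solves 1/(1 + e^c) = 0.2
power = 1 / (1 + math.exp(c - 1))     # beta(1) = 1 - F_1(c)
print(round(c, 3), round(power, 3))   # 1.386 0.405
```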

c.
Since we have MLR, the test is UMP. This is true for any $ \theta_{0}$. This only works for $ n=1$; otherwise there is no one-dimensional sufficient statistic.

8.33
a.

$\displaystyle P($$ Y_{1} > k$ or $ Y_{n} > 1$$\displaystyle \vert\theta=0) = P(Y_{1} > k\vert\theta=0) = (1-k)^{n} = \alpha$    

So $ k = 1-\alpha^{1/n}$.
b.

$\displaystyle \beta(\theta)$ $\displaystyle = P($$ Y_{n} > 1$ or $ Y_{1} > k$$\displaystyle \vert\theta)$    
  $\displaystyle = P(Y_{n} > 1\vert\theta) + P($$ Y_{1} > k$ and $ Y_{n} \le 1$$\displaystyle )$    
  $\displaystyle = \begin{cases}1 & \theta > 1\\ 1-(1-\theta)^{n} + (1-\max\{k,\theta\})^{n} & \theta \le 1 \end{cases}$    
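A small simulation (with illustrative choices of $ n$ and $ \alpha$) can confirm this power function:

```python
# Sketch for 8.33(b): the power of R = {Y1 > k or Yn > 1} for a
# U(theta, theta+1) sample, checked against simulation.
import random

random.seed(3)
n, alpha = 5, 0.1
k = 1 - alpha ** (1 / n)

def beta(theta):
    if theta > 1:
        return 1.0
    return 1 - (1 - theta) ** n + (1 - max(k, theta)) ** n

def beta_mc(theta, reps=100_000):
    hits = 0
    for _ in range(reps):
        ys = [random.uniform(theta, theta + 1) for _ in range(n)]
        if min(ys) > k or max(ys) > 1:
            hits += 1
    return hits / reps

for theta in (0.0, 0.3, 0.8):
    assert abs(beta(theta) - beta_mc(theta)) < 0.01
print(round(beta(0.0), 3))   # equals alpha, as part (a) requires
```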

c.

$\displaystyle f(x\vert\theta) = 1_{(\theta,\infty)}(Y_{1})1_{(-\infty,\theta+1)}(Y_{n})$    

Fix $ \theta' > 0$. Suppose $ k \le \theta'$. Then $ Y_{1} > \theta' \ge k$ with probability one, so $ \beta(\theta') = 1$.

Suppose $ k > \theta'$. Take $ k'=1$ in the Neyman-Pearson lemma, with $ \theta_{0} = 0$. Then

$\displaystyle f(x\vert\theta') < f(x\vert\theta_{0})$ $\displaystyle \Rightarrow$    $ 0 < Y_{1} < \theta' < k$, so $ x \not \in R$    
$\displaystyle f(x\vert\theta') > f(x\vert\theta_{0})$ $\displaystyle \Rightarrow$    $ 1 < Y_{n} < \theta'+1$, so $ x \in R$    

So $ R$ is an NP test for every $ \theta' > 0$, and hence $ R$ is UMP.

d.
The power is one for all $ n$ if $ \theta > 1$.


Luke Tierney 2003-05-04