
UNIVERSITY OF CALIFORNIA, SANTA BARBARA
Department of Statistics & Applied Probability
MATHEMATICAL STATISTICS QUALIFYING EXAM
Saturday June 15, 1996, 9:00-12:00

Answer any 5 out of the 9 questions. All questions carry equal weight. Students seeking a Ph.D. pass should answer at least two questions from Part B.

Part A: M.S. Questions

1. Let $Y_1, \ldots, Y_n$ be independent random variables with the Uniform distribution on $[0, 1]$.
(a) Find the PDF of $-\log_e\{(Y_1 Y_2)^2\}$.
(b) Let $Z_n = (\prod_{i=1}^n Y_i)^{1/n}$, the geometric mean of $Y_1, \ldots, Y_n$. Show that as $n \to \infty$, $Z_n \to c$ in probability for some constant $c$, and find $c$.
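For part (b), the weak law of large numbers applied to $\log Z_n$ gives $c = e^{-1}$, since $E(\log Y) = -1$ for $Y \sim U[0,1]$. A quick Monte Carlo sketch (seed and sample size are arbitrary choices) illustrates the convergence:

```python
import math
import random

random.seed(1)
n = 200_000
# 1 - random.random() lies in (0, 1], so the log is always defined;
# compute the geometric mean via the mean of logs to avoid underflow
mean_log = sum(math.log(1 - random.random()) for _ in range(n)) / n
z_n = math.exp(mean_log)

print(z_n, math.exp(-1))  # the two values should be close
assert abs(z_n - math.exp(-1)) < 0.01
```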

2. Suppose that $Y_1, \ldots, Y_n$ are independent random variables with the PDF $f(y \mid \theta)$, where $\theta$ is an unknown scalar parameter. If the decision rule is to estimate $\theta$ by the estimator $T = t(Y_1, \ldots, Y_n)$, then the loss is measured by $L(T, \theta)$ and the risk function is $R(T, \theta)$.
(a) If $\pi(\theta)$ is the prior PDF for $\theta$, then the Bayes risk of $T$ is $\int R(T, \theta)\,\pi(\theta)\,d\theta$. Show that the estimator which minimizes the Bayes risk minimizes $E\{L(T, \theta) \mid Y_1, \ldots, Y_n\}$.
(b) Let $f(y \mid \theta) = \theta\exp(-\theta y)$ for $y \geq 0$, $\pi(\theta) = \exp(-\theta)$ for $\theta \geq 0$, and $L(T, \theta) = (T - \theta)^2/\theta$.
(i) Obtain the posterior PDF of $\theta$ given $Y_1, \ldots, Y_n$ [it is proportional to $\theta^n \exp\{-\theta(\sum Y_j + 1)\}$].
(ii) Show that the estimator that minimizes the Bayes risk is $T = n/(\sum Y_j + 1)$.
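For 2(b)(ii), the posterior is Gamma with shape $n + 1$ and rate $\sum Y_j + 1$, and minimizing the weighted quadratic loss gives $T = 1/E(\theta^{-1} \mid \text{data})$. A small numerical sketch (the data values below are made up for illustration) checks that $T = n/(\sum Y_j + 1)$ minimizes the posterior expected loss on a grid:

```python
import math

# made-up data, purely for illustration
y = [0.8, 1.3, 0.2, 2.1, 0.5]
n, r = len(y), sum(y) + 1.0  # posterior: Gamma(shape n+1, rate r)

def expected_loss(t, grid_pts=20_000, upper=20.0):
    # posterior expected loss E[(t - theta)^2 / theta], trapezoid-style sum;
    # the unnormalized posterior density is theta^n * exp(-r * theta)
    h = upper / grid_pts
    num = den = 0.0
    for i in range(1, grid_pts + 1):
        th = i * h
        w = th**n * math.exp(-r * th)
        num += w * (t - th) ** 2 / th
        den += w
    return num / den

t_star = n / r
losses = {t: expected_loss(t) for t in (0.8 * t_star, t_star, 1.2 * t_star)}
assert losses[t_star] == min(losses.values())
```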

3. The independent random variables $X_1, X_2, \ldots, X_n$ each have density
$$f(x \mid \theta) = \frac{1}{(2\pi)^{1/2} x^{3/2}} \exp\left\{-\frac{(x - \theta)^2}{2\theta^2 x}\right\},$$
where $0 < x, \theta < \infty$.
(a) Express the PDF as an exponential family PDF.
(b) Calculate $E(X)$ and $\mathrm{Var}(X)$.
(c) We wish to test the null hypothesis $H_0: \theta = 1$ against the alternative hypothesis $H_A: \theta > 1$. Show that the uniformly most powerful test at level $\alpha$ rejects $H_0$ if $\sum_{j=1}^n X_j \geq c_\alpha$, and obtain an approximation for $c_\alpha$ that is suitable for large $n$.

4. Suppose that $Y_1, \ldots, Y_n$ are independent $N(0, \sigma^2)$ random variables.
(a) Show that $S = \sum Y_i^2$ is a sufficient statistic for $\sigma$. Is it a minimal sufficient statistic? Is it a complete sufficient statistic?
(b) Show that $n^{-1}S$ is the UMVU estimator of $\sigma^2$.
(c) Among all estimators of the form $cS$ for some constant $c$, show that $S/(n+2)$ has the minimum mean squared error.

5. Suppose that certain electronic components have independent lifetimes (measured in days) that are exponentially distributed with PDF $f(t \mid \theta) = (1/\theta)\exp(-t/\theta)$, $t \geq 0$. Five new components are put on test simultaneously. We observe that the first failure occurs immediately after 100 days, and nothing else is observed.
(a) What is the likelihood function of $\theta$?
(b) Obtain the maximum likelihood estimator of $\theta$.
(c) Find a 95% confidence interval for $\theta$.

Part B: Ph.D. Questions

6. Let $Y_1, \ldots, Y_n$ be independent random variables with the PDF
$$f(y \mid \theta, \kappa) = c(\kappa)\exp\{\kappa\cos(y - \theta)\}, \qquad 0 \leq y, \theta \leq 2\pi,$$
where $\kappa$ is a known positive constant and $c(\kappa)$ is a known function.
(a) Show that the MLE of $\theta$ is $\hat\theta = \tan^{-1}(\sum \sin Y_j / \sum \cos Y_j)$, chosen such that $\cos\hat\theta < 0$ iff $\sum \cos Y_j < 0$.
(b) Prove that $\hat\theta \to \theta$ in probability as $n \to \infty$. [Hint: it may help to prove that $E\{\sin(Y - \theta)\} = 0$, and to assume that $E\{\cos(Y - \theta)\} \neq 0$.]
(c) Calculate an approximation for the variance of $\hat\theta$.

7. $Y_1, \ldots, Y_n$ are i.i.d. from the uniform distribution on $[\theta, 2\theta]$, $\theta > 0$.
(a) Obtain the minimal sufficient statistic for $\theta$.
(b) Show that the maximum likelihood estimator of $\theta$ is $\frac{1}{2}Y_{(n)}$.
(c) Prove or disprove: the minimal sufficient statistic is complete.

8. A certain plant can appear in any one of three genotypes labelled AA, Aa and aa. In a random sample of $n$ of these plants, $x$ are of type AA, $y$ are of type Aa, and $z$ are of type aa. The genetic theory of random mating says that the ratios $E(X) : E(Y) : E(Z)$ are $\theta^2 : 2\theta(1 - \theta) : (1 - \theta)^2$, $0 \leq \theta \leq 1$.
(a) Write down the likelihood function for $\theta$ given $x, y, z$.
(b) We wish to test the hypothesis $H_0: \theta = \frac{1}{2}$ versus $H_A: \theta < \frac{1}{2}$. Derive the uniformly most powerful test, and show how to calculate the P-value.
(c) Explain how to test the goodness of fit of the genetic theory.
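In Question 5 the only observation is the minimum of five i.i.d. exponential lifetimes, which is itself exponential with mean $\theta/5$; the likelihood is therefore $L(\theta) = (5/\theta)e^{-500/\theta}$, maximized at $\hat\theta = 500$. A Monte Carlo sketch of the distributional claim (the true mean and the seed below are arbitrary choices):

```python
import random

random.seed(2)
theta = 300.0          # hypothetical true mean lifetime, for illustration
reps = 100_000
# average of the minimum of 5 exponential(mean theta) lifetimes;
# it should be close to theta / 5
m = sum(min(random.expovariate(1 / theta) for _ in range(5))
        for _ in range(reps)) / reps
assert abs(m - theta / 5) < 3.0   # theta/5 = 60
```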

9. An experiment to compare two treatments (A and B) is performed on $n$ subjects as follows. For the $j$th individual an initial response $X_j$ is measured, which is Poisson with mean $\lambda_j$. If $X_j \leq a$, then Treatment A is applied to the subject and the further response $Y_j$ is measured, where $Y_j$ is Poisson with mean $\phi\lambda_j$. However, if $X_j > a$ then Treatment B is applied and the further response $Y_j$ is measured, where $Y_j$ is Poisson with mean $\psi\lambda_j$. The experimental data are therefore paired outcomes $(x_1, y_1), \ldots, (x_n, y_n)$. All responses are mutually independent, conditional on the $\lambda_j$'s, which are all unknown. The parameters $\phi$ and $\psi$ are also unknown, and we are interested in comparing them.
(a) Write down the joint likelihood function for $\lambda_1, \ldots, \lambda_n, \phi, \psi$. What are the sufficient statistics for these parameters?
(b) Show that if $\phi = \psi$, then the likelihood function is
$$L(\lambda_1, \ldots, \lambda_n, \phi) = \prod_{j=1}^n \phi^{Y_j} \lambda_j^{X_j + Y_j} \exp\{-\lambda_j(1 + \phi)\}.$$
What are the sufficient statistics now?
(c) Write $\rho = \psi/\phi$, and consider testing the null hypothesis $H_0: \rho = 1$ versus $H_A: \rho > 1$. Show that the score test statistic is proportional to
$$\sum_{j: X_j > a} \left\{ Y_j - \frac{\sum_{k=1}^n Y_k}{\sum_{k=1}^n (X_k + Y_k)}\,(X_j + Y_j) \right\}.$$
[You need not standardize this, but do explain briefly how you would.]
(d) Explain in one or two sentences what part (b) has to do with testing $H_0$ in part (c).

Unused Questions

A. (a) Explain the use of complete sufficient statistics in minimum variance unbiased estimation.
(b) Consider independent random samples $X_1, X_2, \ldots, X_n \sim N(\theta_1, 1)$ and $Y_1, Y_2, \ldots, Y_m \sim N(\theta_2, 1)$. Both $\theta_1$ and $\theta_2$ are unknown, $-\infty < \theta_1, \theta_2 < \infty$. Derive the uniformly minimum variance unbiased estimator of $P(X_1 < Y_1)$.

B. Observations $y_1, y_2, \ldots, y_n$ are a random sample from the normal distribution with mean zero and variance $\theta$. The prior distribution of $\psi = \nu/\theta$ is chi-squared with $\lambda$ degrees of freedom, where $\nu$ and $\lambda$ are known constants.
(a) Find the prior means of $\theta$ and $\theta^{-1}$.
(b) Show that the posterior distribution of $(\nu + \sum_{i=1}^n y_i^2)/\theta$ is chi-squared with $\lambda + n$ degrees of freedom.
(c) Find the posterior means of $\theta$ and $\theta^{-1}$.
(d) Find an interval $[0, a)$ such that $P[\theta \in [0, a) \mid y_1, \ldots, y_n] = 1 - \alpha$.
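A sketch of the conjugacy calculation behind part (b) of Question B:

```latex
% Change of variables from \psi = \nu/\theta \sim \chi^2_\lambda gives the prior
\pi(\theta) \propto \theta^{-\lambda/2 - 1}\, e^{-\nu/(2\theta)}.
% The N(0, \theta) likelihood is
L(\theta) \propto \theta^{-n/2}\, e^{-\sum_{i=1}^n y_i^2/(2\theta)},
% so the posterior is
\pi(\theta \mid y) \propto \theta^{-(\lambda + n)/2 - 1}\,
  e^{-(\nu + \sum_{i=1}^n y_i^2)/(2\theta)},
% which is exactly the density induced by
% (\nu + \sum_{i=1}^n y_i^2)/\theta \sim \chi^2_{\lambda + n}.
```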

UNIVERSITY OF CALIFORNIA, SANTA BARBARA
Department of Statistics & Applied Probability
MATHEMATICAL STATISTICS QUALIFYING EXAM
Sunday September 21, 1997, 9:00-12:30

Answer FIVE questions. All questions carry equal weight.

1. Suppose that $X_{ij} \sim N(\mu_i, \sigma^2)$ for $i = 1, \ldots, k$ and $j = 1, \ldots, r$ with $r > 1$. The parameters $\mu_1, \ldots, \mu_k$ and $\sigma^2$ are all unknown. We want to find an accurate 95% confidence interval for the first mean, $\mu_1$. Define $\bar{X}_i = r^{-1}\sum_{j=1}^r X_{ij}$. Consider the interval $\bar{X}_1 \pm c\,\hat\sigma\,t_{d, 0.975}$, where $t_{df,p}$ denotes the $p$ quantile of the Student-$t$ distribution with $df$ degrees of freedom. Specify $c$, $d$, and the estimate $\hat\sigma^2$ so that the interval is an exact 95% interval for $\mu_1$.

2. Let $x_1, x_2, \ldots, x_n$ denote a random sample drawn from an exponential distribution with probability density function
$$f(x \mid \theta) = \theta\exp(-\theta x) \qquad (x > 0),$$
where $\theta$ has a gamma prior distribution with probability density function
$$\pi(\theta) = \frac{1}{\Gamma(g)}\,h^g\,\theta^{g-1}\exp(-h\theta) \qquad (\theta > 0),$$
with $g$ and $h$ known positive constants.

(a) Derive the posterior distribution of $\theta$ given the data. Explain briefly how you would construct a Bayesian confidence interval for $\theta$.
(b) Obtain the Bayes estimator $t$ of $\theta$ for the loss function $L(t, \theta) = (t - \theta)^2/\theta$.

3. Suppose that $X$ and $Y$ are independent Poisson random variables with means $\lambda$ and $\mu$. Derive the exact (uniformly most powerful unbiased) test of NH: $\lambda = \mu$ versus AH: $\lambda > \mu$.

4. Suppose that $(X, Y)$ has a bivariate normal distribution with mean vector 0, marginal variances equal to 1, and correlation $\rho$.
(a) Prove that
$$E(X \mid X \geq a) = \phi(a)/\bar\Phi(a), \qquad E(Y \mid X \geq a) = \rho\,\phi(a)/\bar\Phi(a),$$
where $\bar\Phi(a) = 1 - \Phi(a)$, with $\phi$ and $\Phi$ the PDF and CDF of the $N(0, 1)$ distribution.
(b) Prove that $E(X \mid X \geq a)$ is an increasing function of $a$. [You may quote properties of the conditional distribution of $Y$ given $X$ without proof.]

5. Consider a random sample of observations $x_1, x_2, \ldots, x_n$ from the two-parameter Weibull distribution with probability density function
$$f(x \mid \alpha, \beta) = \alpha\beta x^{\beta - 1}\exp(-\alpha x^\beta) \qquad (x > 0),$$
where $\alpha$ and $\beta$ are positive constants.
(a) Show that, if $\alpha$ is known, then the vector of order statistics $(X_{(1)}, X_{(2)}, \ldots, X_{(n)})$ is minimal sufficient for $\beta$.
(b) Show that, if $\beta$ is known, then there is a one-dimensional sufficient statistic for $\alpha$.
(c) Let $\alpha = 1$ in the Weibull probability density function given above. Obtain equations that determine (i) the maximum likelihood estimator for $\beta$, (ii) a method of moments estimator for $\beta$.

6. Suppose that $X_1 \sim N(\mu, \tau^2)$ and that $X_j = \mu + \rho(X_{j-1} - \mu) + \epsilon_j$ for $j = 2, \ldots, n$, where $\epsilon_2, \ldots, \epsilon_n$ are IID $N(0, \sigma^2)$ independent of $X_1$, and $\tau^2 = \sigma^2/(1 - \rho^2)$. These conditions guarantee that $X_j \sim N(\mu, \tau^2)$ for all $j$. Assume that $\rho$ and $\sigma^2$ are known, and note that $\mathrm{Cov}(X_j, X_k) = \rho^{|j-k|}\tau^2$.

(a) The joint PDF of $X_1, \ldots, X_n$ in this case may be written in the form
$$f_{X_1, \ldots, X_n}(x_1, \ldots, x_n \mid \mu) = f_{X_1}(x_1 \mid \mu)\prod_{j=2}^n f_{X_j \mid X_{j-1}}(x_j \mid x_{j-1}, \mu).$$
Develop an explicit expression for the likelihood function for $\mu$.
(b) Show that the MLE of $\mu$ is
$$\hat\mu = \frac{X_1 + (1 - \rho)(X_2 + \cdots + X_{n-1}) + X_n}{2 + (n - 2)(1 - \rho)}.$$
(c) Calculate the efficiency of $\bar{X} = n^{-1}\sum_{j=1}^n X_j$ relative to $\hat\mu$.

7. Suppose that $Y_1, \ldots, Y_n$ are IID with some unknown PDF $f(y)$. Two hypotheses about $f$ are NH: $f = f_0$ and AH: $f = f_A$, where both $f_0$ and $f_A$ are completely specified.
(a) State the Neyman-Pearson Lemma for testing NH versus AH.
(b) Suppose that $f_0(y) = \phi(y)$, the $N(0, 1)$ PDF, and that $f_A(y) = (2\tau)^{-1}\exp\{-|y|/\tau\}$, where $\tau$ is such that $\mathrm{Var}(Y) = 1$. Derive the most powerful size $\alpha$ test of NH versus AH, and show that the critical region can be expressed in the form $\{(y_1, \ldots, y_n) : \sum y_j^2 + a\sum|y_j| \geq c_\alpha\}$. (Specify $a$ numerically.)
(c) Obtain a formula for approximate numerical evaluation of $c_\alpha$ in (b).
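For 7(b), the Laplace density with unit variance has scale $\tau = 1/\sqrt{2}$, and the log likelihood ratio works out to an affine function of $\sum y_j^2 - 2\sqrt{2}\sum|y_j|$, consistent with $a = -2\sqrt{2}$. A quick numerical sketch (the sample values are arbitrary):

```python
import math

tau = 1 / math.sqrt(2)          # Laplace scale giving Var(Y) = 1

def log_lr(ys):
    # log { prod f_A(y) / prod f_0(y) } with f_0 = N(0,1), f_A = Laplace(tau)
    return sum(-math.log(2 * tau) - abs(y) / tau
               + 0.5 * math.log(2 * math.pi) + y * y / 2 for y in ys)

ys = [0.3, -1.7, 2.2, 0.0, -0.4]
n = len(ys)
const = n * (0.5 * math.log(2 * math.pi) - math.log(2 * tau))
stat = sum(y * y for y in ys) - 2 * math.sqrt(2) * sum(abs(y) for y in ys)
# the log likelihood ratio equals a constant plus half the statistic
assert abs(log_lr(ys) - (const + 0.5 * stat)) < 1e-9
```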

8. Suppose that $Y_1, \ldots, Y_n$ are independent and identically distributed with a distribution depending on the scalar parameter $\theta$. We are interested in point estimation and confidence intervals for $\theta$. The consistent estimator $\tilde\theta$ is defined implicitly through the estimating equation $\sum_{j=1}^n a(Y_j, \tilde\theta) = 0$.

(a) Derive the influence function for $\tilde\theta$, and use it to derive a formula for the variance of the asymptotic distribution of $n^{1/2}(\tilde\theta - \theta)$.
(b) If $\sum_{j=1}^n a(Y_j, \theta)$ is continuous and monotone decreasing with respect to $\theta$, verify that for any $d$
$$\Pr(\tilde\theta < \theta + d) = \Pr\Big\{\sum_{j=1}^n a(Y_j, \theta + d) < 0\Big\}.$$
Explain carefully how this identity can be used to calculate an approximate $1 - \alpha$ confidence interval for $\theta$.

UNIVERSITY OF CALIFORNIA, SANTA BARBARA
Department of Statistics & Applied Probability
MATHEMATICAL STATISTICS QUALIFYING EXAM
Thursday September 23, 1999, 9:00-12:00

Answer FIVE (5) questions. All questions carry equal weight.

1. Suppose that $X_1, \ldots, X_n, X_{n+1}$ are independent Bernoulli random variables with $\Pr(X_j = 1) = \theta = 1 - \Pr(X_j = 0)$, $0 < \theta < 1$. Only the values of $X_1, \ldots, X_n$ are observed, and not $X_{n+1}$. Prior information about $\theta$ is summarized by the prior PDF $\pi(\theta) = 2\theta$, $0 < \theta < 1$.
(a) Calculate the posterior PDF for $\theta$, given $X_1, \ldots, X_n$.
(b) Calculate $\Pr(X_{n+1} = 1 \mid X_1, \ldots, X_n)$.

2. Discuss, using appropriate brief examples, the advantages and disadvantages of maximum likelihood as a method of statistical estimation.

3. Suppose that $X_1, \ldots, X_n$ are independent Binomial random variables with index $m$ and rate $\theta$, i.e. $B(m, \theta)$.
(a) Show that $T = \sum X_j$ is complete and sufficient.
(b) Using the Lehmann-Scheffé Theorem or otherwise, find the Uniform Minimum Variance Unbiased Estimator for $q(\theta) = (1 - \theta)^m + m\theta(1 - \theta)^{m-1} = \Pr(X \leq 1)$.
(c) Say whether or not the estimator in (b) should attain the Cramér-Rao bound, and give a reason for your answer.

4. Suppose that $Y_1, \ldots, Y_n$ are independent and identically distributed with probability density $f(y \mid \theta)$, where $\theta$ is a scalar parameter. For estimation of $\theta$ by $T = t(Y_1, \ldots, Y_n)$, the loss function is $L(T, \theta)$, and $\theta$ has a prior distribution with density $\pi(\theta)$.
(a) Define the risk function, and the Bayes risk, for any estimator $T$.
(b) If $\theta \in \mathbb{R}$, $L(T, \theta) = (T - \theta)^4$, and the posterior density for $\theta$ is $f(\theta \mid Y_1, \ldots, Y_n) = g(\theta - \bar{Y})$, prove that the Bayes estimator for $\theta$ is $T = \bar{Y}$ if $g(z) = g(-z)$ for all $z$.

5. Suppose that the sequence of random variables $X_1, X_2, \ldots$ are independent Poisson with mean $\mu$.
(a) Set up the Wald sequential probability ratio test for testing $H_0: \mu = 1$ versus $H_1: \mu = 3$, with both error probabilities set at 0.2.
(b) In using the above test for the composite hypotheses $H_0: \mu \leq 1$ versus $H_1: \mu \geq 3$, find approximations to the power function $\pi(\mu)$ and the average sample number function $E_\mu(N)$.

6. Let $X_1, \ldots, X_m$ and $Y_1, \ldots, Y_n$ be random samples from two exponential distributions with unknown scale parameters $\lambda_1$ and $\lambda_2$ respectively, i.e. means $1/\lambda_1$ and $1/\lambda_2$.
(a) Show that the critical region of the Generalized Likelihood Ratio Test of $H_0: \lambda_1 = \lambda_2$ versus $H_1: \lambda_1 \neq \lambda_2$ depends only on the ratio $\bar{Y}/\bar{X}$. Explain how one would obtain the necessary critical values for the test.
(b) If $m = n = 1$, show that $S(X_1, Y_1) = \{(\lambda_1, \lambda_2) : \lambda_1 X_1 + \lambda_2 Y_1 \leq c\}$ is a confidence region for $(\lambda_1, \lambda_2)$ with confidence coefficient $1 - (1 + c)\exp(-c)$.

7. Suppose that $X_1, \ldots, X_4$ are independent $N(0, \sigma^2)$.
(a) Define $Y = \sum a_j X_j$, where $a_1, \ldots, a_4$ are real constants. If $Y^2$ and $Q = X_1X_2 - X_3X_4$ are independent, determine $a_1, \ldots, a_4$.
(b) Find the distribution of
$$Q = \frac{1}{2\sigma^2}(X_1^2 + X_3^2) + \frac{1}{\sigma^2}(X_2^2 + X_1X_3).$$

UNIVERSITY OF CALIFORNIA, SANTA BARBARA
Department of Statistics & Applied Probability
MATHEMATICAL STATISTICS QUALIFYING EXAM
Friday September 9 2005, 9:00-12:00+

Answer SIX (6) of the ten questions. All questions carry equal weight.

1. Suppose that $X$ is $N(\theta, 1)$, and define $\gamma = \theta^2$.
(i) Find a simple unbiased estimator for $\gamma$.
(ii) Show that the estimator in (i) is inadmissible under the squared-error loss function.
(iii) If $\theta$ has the $N(\mu, \tau^2)$ prior distribution, what is $E(\gamma \mid X)$?

2. Suppose that $X_{ij}$ are independent Poisson random variables for $i = 1, 2$ and $j = 1, 2$. The mean of $X_{ij}$ is $\mu_{ij}$, which can be represented by the loglinear model $\log\mu_{ij} = \lambda + \alpha_i + \beta_j$ with $\sum_i\alpha_i = \sum_j\beta_j = 0$. Find a suitable prediction for $X_{22}$ given one observation of the triple $(X_{11}, X_{12}, X_{21})$.

3. Suppose that $X_1, X_2, \ldots, X_n$ are IID with mean $\mu$ and that $Y_1, \ldots, Y_n$ are IID with mean $\nu$. Define $\theta = \mu/\nu$ and let $T = \bar{X}/\bar{Y}$ be chosen as estimator of $\theta$.
(i) Use the CLT and the Delta Method to obtain a Normal approximation for the distribution of $n^{1/2}(T - \theta)$.
(ii) It may be that the exact variance of $T$ does not exist. Give one example each of situations where this is (a) true, (b) untrue.

4. (i) Suppose that the random variables $\mathbf{X} = (X_1, X_2, \ldots, X_n)$ have joint distribution depending on parameter $\theta$, and that $S = s(\mathbf{X})$ is a complete sufficient statistic for $\theta$. If $E\{a(\mathbf{X}) \mid \theta\} = c(\theta)$ and $E\{b(S) \mid \theta\} = c(\theta)$ for all $\theta$, show that $E\{a(\mathbf{X}) \mid S\} = b(S)$.
(ii) If $X_1, X_2, \ldots, X_n$ are independent Poisson random variables, one statistic used to test the hypothesis $H$ that the $n$ means are equal is $T = \sum_{j=1}^n (X_j - \bar{X})^2/\bar{X}$, where $\bar{X} = n^{-1}\sum_{j=1}^n X_j$. Using part (i) or otherwise, show that $E(T \mid H) = n - 1$.

5. Suppose that $X_1, X_2, \ldots, X_n$ are IID random variables each with the PDF
$$f(x \mid \theta) = \exp\{-(x - \theta)\}, \qquad x > \theta.$$
(a) Show that this family has the monotone likelihood ratio (MLR) property in $T = X_{(1)}$.
(b) Write down the UMP level $\alpha$ test for testing $H_0: \theta = \theta_0$ versus $H_A: \theta > \theta_0$. Obtain the formula for the exact level $\alpha$ critical value.
(c) Derive the uniformly most accurate (UMA) lower $1 - \alpha$ confidence bound for $\theta$.
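For Question 5, the minimum $X_{(1)}$ of a shifted-exponential sample satisfies $X_{(1)} - \theta \sim \mathrm{Exp}(\text{mean } 1/n)$, which is what the exact critical value in (b) rests on. A simulation sketch (parameter values and seed are arbitrary):

```python
import random

random.seed(3)
theta, n, reps = 2.0, 10, 50_000
# minimum of n shifted-exponential draws, repeated many times
mins = [min(theta + random.expovariate(1.0) for _ in range(n))
        for _ in range(reps)]
avg_excess = sum(m - theta for m in mins) / reps
assert abs(avg_excess - 1 / n) < 0.005   # E(X_(1) - theta) = 1/n
```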


6. An experimenter wants to estimate the rate $\lambda$ of a homogeneous Poisson process of events. She decides to count the number of events $N$ in the time interval $[0, t]$, which will be Poisson with mean $\lambda t$, and then record the next $m$ inter-event times $Y_1, \ldots, Y_m$, which will be independent Exponential.
(i) Compute separately the MLEs $\hat\lambda_N$ and $\hat\lambda_Y$ based on $N$ and on $Y_1, \ldots, Y_m$, respectively.
(ii) Calculate the mean of $\hat\lambda_Y$.
(iii) Is it possible to obtain a best unbiased combination of $\hat\lambda_N$ and $\hat\lambda_Y$? Explain.

7. Suppose that $X_1, X_2, \ldots, X_n$ are IID random variables with density
$$f(x \mid \theta) = \begin{cases} \theta/x^2 & \text{for } x \geq \theta \\ 0 & \text{otherwise.} \end{cases}$$
Assume a prior distribution on $\theta$ that is uniform on $[0, 1]$.
(i) What is the posterior density of $\theta$?
(ii) What is the posterior mode of $\theta$?
(iii) Find the Bayes estimator of $\theta$ under squared-error loss.

8. For fixed constants $x_i$, $i = 1, \ldots, n$, the random responses $Y_i$ are given by $Y_i = \alpha + \beta x_i + Z_i$, where the $Z_i$'s are independent with means zero and variances $\sigma^2$. The parameters $\alpha$ and $\beta$ will be estimated by Least Squares. If the values of the $x_i$'s are constrained to the interval $[-c, c]$, find the choice of $x_1, \ldots, x_n$ that maximizes the accuracies of the two estimators. [You can take $n$ to be an even number if it helps.]

9. $X$ is a single non-negative integer random variable with PMF $p(x)$. Find the most powerful test of level 0.019 for testing $H_0: p = p_0$ versus $H_1: p = p_1$, where $p_0$ is the Poisson distribution with mean 1 and $p_1$ is the geometric distribution with mean 1 [i.e. $p_1(x) = (\frac{1}{2})^{x+1}$].

10. The pairs $(X_j, Y_j)$, $j = 1, \ldots, n$, are independent bivariate Normal with mean vector 0,
$$\mathrm{Var}(X_j) = \sigma_1^2, \qquad \mathrm{Var}(Y_j) = \sigma_2^2, \qquad \mathrm{Cov}(X_j, Y_j) = \rho\sigma_1\sigma_2,$$
with $\sigma_1, \sigma_2, \rho$ unknown. We wish to test the null hypothesis $H_0: \sigma_1 = \sigma_2$ versus $H_A: \sigma_2 > \sigma_1$.
(i) Transform $X_j, Y_j$ to $S_j = X_j + Y_j$, $D_j = X_j - Y_j$. What is the joint distribution of $(S_j, D_j)$?
(ii) Using the transformation in (i) or otherwise, construct the best test that you can for $H_0$ versus $H_A$.
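For Question 9, the Neyman-Pearson ratio $p_1(x)/p_0(x) = e\,x!/2^{x+1}$ is eventually increasing in $x$, and the level 0.019 appears to be chosen so that it equals $\Pr_0(X \geq 4)$ exactly under the Poisson(1) null. A quick numerical sketch:

```python
import math

def p0(x):       # Poisson(1) pmf
    return math.exp(-1) / math.factorial(x)

def p1(x):       # geometric pmf with mean 1
    return 0.5 ** (x + 1)

ratio = [p1(x) / p0(x) for x in range(12)]
level = 1 - sum(p0(x) for x in range(4))   # Pr_0(X >= 4)
print(round(level, 4))
assert abs(level - 0.019) < 0.001
# for x >= 4 the ratio exceeds every value it takes on {0, 1, 2, 3}
assert all(r > max(ratio[:4]) for r in ratio[4:])
```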

ABBREVIATIONS USED: IID = independent and identically distributed; CLT = Central Limit Theorem; PDF = probability density function; MLE = maximum likelihood estimator; PMF = probability mass function; CDF = cumulative (probability) distribution function; UMP = uniformly most powerful; UMA = uniformly most accurate.

University of California, Santa Barbara
Department of Statistics and Applied Probability
Mathematical Statistics Qualifying Exam
Sept. 24, 2007

Please answer any 5 of these 8 questions.

1. Suppose that we observe $n$ independent pairs $(X_i, Y_i)$ where $X_i \sim N(\alpha, 1)$ and then, conditionally, $Y_i \mid X_i \sim N(\beta X_i, 1)$.
(a) Calculate the correlation of $X$ and $Y$.
(b) Find a minimal sufficient statistic for estimating $(\alpha, \beta)$.
(c) Is this a complete statistic? Justify your response.

2. Suppose that $X_1, X_2, \ldots, X_n$ are independent and identically distributed random variables, each with uniform distribution on the interval $[\theta, 2\theta]$ with $\theta > 0$. Let $X_{(1)}$ and $X_{(n)}$ denote the smallest and largest of the $X_i$'s.
(a) Write down the likelihood function $L(\theta)$ for $\theta$ and sketch the graph of $L(\theta)$.
(b) Show that $T = X_{(n)}/2$ is the MLE of $\theta$.
(c) Verify that $T$ and $R = X_{(n)}/X_{(1)}$ are jointly minimal sufficient for $\theta$.
(d) Note that $1 \leq R \leq 2$. What would you conclude about $\theta$ if $R = 2$?

3. Independent lifetimes $Y_1, \ldots, Y_n$ have the common Gamma PDF $f(y \mid \alpha, \beta) = \beta^{-\alpha}y^{\alpha - 1}\exp[-y/\beta]/\Gamma(\alpha)$. The special case $\alpha = 1$ corresponds to the exponential distribution, and this is a null hypothesis to be tested against the alternative hypothesis that $\alpha > 1$.
(a) Obtain an expression for the log likelihood function of $(\alpha, \beta)$.
(b) Under the null hypothesis, the mean and standard deviation of $Y$ are equal, so a simple intuitive test might be based on the ratio of the sample mean and standard deviation, $W = \bar{Y}/s$. Show that $\sqrt{n}(W - 1)$ converges in distribution to a standard normal under the null.
(c) Use the result from (3b) to suggest an approximate size $\alpha$ critical region based on $W$.
(d) Why might you expect that the test based on $W$ is not the most powerful test available?

4. Let $X_1, \ldots, X_n$ be IID exponential with pdf
$$f(x \mid \theta) = \begin{cases} \theta e^{-\theta x}, & x > 0 \\ 0, & x \leq 0. \end{cases}$$
(a) Show that the joint pdf $f(\mathbf{x}; \theta)$ has the monotone likelihood ratio (MLR) property in $T = -\sum_{i=1}^n X_i$.
(b) Find a UMP test for testing $H_0: \theta \geq \theta_0$ versus $H_1: \theta < \theta_0$, and find explicitly the critical value needed to ensure a test of size $\alpha$.
(c) Find a $(1 - \alpha)$ uniformly most accurate one-sided upper confidence interval for $\theta$.
(d) If $\theta$ has the prior density
$$\pi(\theta) = \begin{cases} \delta\theta^{-2}, & \theta > \delta \\ 0, & \theta \leq \delta, \end{cases}$$
where $\delta > 0$ is given, find the posterior pdf of $\theta$ and calculate the $(1 - \alpha)$ highest posterior density (HPD) credible interval for $\theta$.

5. Suppose that the observations $Y_{ij}$ are generated by a mixture distribution that first generates $n$ independent exponential random variables $L_i$ that are not observed and all have mean $\beta$. Then for each $i$, there is a sample of $n_i$ independent Poisson random variables $Y_{i1}, Y_{i2}, \ldots, Y_{in_i}$ with mean $L_i$.
(a) Calculate a method of moments estimator of $\beta$.
(b) Find the joint pmf for the $Y_{ij}$.
(c) Calculate the MLE for $\beta$.

6. Let $X_1, \ldots, X_n$ be IID observations from a distribution with density
$$f(x \mid \theta) = \frac{1}{\theta}\,x^{(1/\theta) - 1} \qquad \text{for } 0 < x < 1 \text{ and } \theta > 0.$$
We want to estimate $\tau = e^{-\theta}$.
(a) Find the distribution of $W = -\log\left(\prod_{i=1}^n X_i\right)$.
(b) Show that
$$\sqrt{n}\left(\Big[\prod_{i=1}^n X_i\Big]^{1/n} - e^{-\theta}\right)$$
converges in distribution to a normal distribution with mean 0.
(c) Calculate the asymptotic variance of the estimator $\hat\tau = \left[\prod_{i=1}^n X_i\right]^{1/n}$.
7. If $X_1, \ldots, X_n$ are IID with common pmf
$$p(x; \theta) = \theta^{x-1}(1 + \theta^2)^{-\frac{x+1}{2}}, \qquad x = 1, 3, 5, \ldots \text{ and } \theta > 0,$$
(a) Verify that this belongs to an exponential family.
(b) Show that $T = \sum_{i=1}^n X_i$ is sufficient for $\theta$. Is it complete?
(c) Find the pmf of $Y_1 = (X_1 - 1)/2$. Calculate the mean and variance of $Y_1$.
(d) Using the result in (7c) or otherwise, find the UMVUE for $\theta^2$.

8. Let $X_1, \ldots, X_n, Y_1, \ldots, Y_n$ be independent normal random variables with
$$E(X_i) = \mu_1, \quad \mathrm{Var}(X_i) = 1, \quad E(Y_i) = \mu_2, \quad \mathrm{Var}(Y_i) = 1.$$
(a) Find the MLE's of $\mu_1$ and $\mu_2$.
(b) Find the GLRT of size $\alpha$ for the following tests:
i. $H_0: (\mu_1, \mu_2) = (0, 0)$ versus $H_1: (\mu_1, \mu_2) \neq (0, 0)$.
ii. $H_0: \mu_1 + \mu_2 = 0$ versus $H_1: \mu_1 + \mu_2 \neq 0$.
(c) Write down the appropriate large sample approximation to the distribution of the likelihood ratio in each of these tests.

UNIVERSITY OF CALIFORNIA, SANTA BARBARA
Department of Statistics and Applied Probability
Mathematical Statistics Qualifying Exam
Friday, September 22, 2006, 10:00AM-1:00PM

Answer any 5 out of the 8 Questions.

1. In life-testing and reliability, it is often assumed that the life-time $X$ follows a Weibull$(\gamma, \beta)$ distribution, with the Cumulative Distribution Function (cdf)
$$F(x) = 1 - \exp(-x^\gamma/\beta), \qquad x \geq 0.$$
Assuming that $\gamma$ is KNOWN,
(a) find a (i) method of moments estimator and (ii) maximum likelihood estimator (MLE) for $\beta$ based on an iid sample of size $n$.
(b) Discuss briefly the properties of these estimator(s), namely unbiasedness, consistency, sufficiency and efficiency (whether the Cramér-Rao bound is attained).
(c) Find also the MLE for $P(X > 2)$ when $X$ has the above cdf.

2. Suppose that $X_1, \ldots, X_n$ ($n \geq 2$) are independent copies of the Bernoulli random variable $X$ with $\Pr(X = 1) = p = 1 - \Pr(X = 0)$. Obtain the uniformly minimum variance unbiased estimate of $\theta = p^2$. Does this attain the Cramér-Rao bound? If the bound is at fault, are there more precise bounds?

3. Suppose that we have $n$ independent observations $U_1, \ldots, U_n$ from a uniform distribution on the interval $[0, \theta]$ for some unknown $\theta > 0$. If we have an exponential prior distribution on the parameter $\theta$ with mean 10, i.e., $\pi(\theta) = \frac{1}{10}e^{-\theta/10}$, then what is the posterior mode?

4. Suppose that the pair of random variables $(X, Y)$ has the joint density
$$f(x, y) = \frac{\Gamma(\alpha + \beta + \gamma)}{\Gamma(\alpha)\Gamma(\beta)\Gamma(\gamma)}\,x^{\alpha - 1}y^{\beta - 1}(1 - x - y)^{\gamma - 1}$$
for $x > 0$, $y > 0$, and $x + y < 1$.
(a) Find the joint density of $S = X + Y$ and $R = X/(X + Y)$.
(b) Assume that $\alpha$ and $\beta$ are known. From $n$ i.i.d. copies of the pair $(X_i, Y_i)$, suppose we want to estimate $\gamma$. (i) Show that $\prod_{i=1}^n (1 - S_i)$ is a complete sufficient statistic for estimating $\gamma$. (ii) Show that $\prod_{i=1}^n R_i$ is an ancillary statistic for estimating $\gamma$. What does Basu's theorem say in this context?

5. Let $X_1, \ldots, X_n$ be independent and identically distributed (i.i.d.) observations from the exponential pdf
$$p_\theta(x) = \begin{cases} \frac{1}{\theta}e^{-x/\theta}, & x > 0 \\ 0, & \text{otherwise,} \end{cases} \qquad \theta > 0.$$
(a) Explain if there is a UMPU test for testing $H_0: \theta = \theta_0$ versus the 2-sided alternative $H_A: \theta \neq \theta_0$. If there is, use this to find the Uniformly Most Accurate Unbiased confidence set of level $(1 - \alpha)$. What does "Unbiased" mean in the case of a confidence set?
(b) What is a good pivot in this case for constructing a confidence interval for $\theta$? Use this pivot to construct the shortest level $(1 - \alpha)$ confidence interval for $\theta$. (Hint: Recall this is a scale parameter family and use, if you can, the fact that $2n\bar{X}/\theta$ has a chi-square distribution.)
(c) Find the MLE of $\theta$ and the Fisher information $i(\theta)$ in a single observation. Using these, find a large-sample confidence interval for $\theta$.

6. The $2n$ random variables $X_{ij}$ are independent Poisson with means $\mu_{ij}$, $i = 1, \ldots, n$ and $j = 1, 2$. We believe that $\mu_{i1} = \alpha_i$, $\mu_{i2} = \beta\alpha_i$, and we want to test $H_0: \beta \leq 1$ versus $H_A: \beta > 1$.
(a) Show that $S = (X_{11} + X_{12}, \ldots, X_{n1} + X_{n2})$ is minimal sufficient for $\alpha = (\alpha_1, \ldots, \alpha_n)$ under $H_0$.
(b) Derive the joint distribution of $\{X_{ij}, i = 1, \ldots, n, j = 1, 2\}$ given $S = s$, and use this to derive the UMPU test of $H_0$ versus $H_A$.
(c) Outline an appropriate test of the model form, i.e. a test of whether or not the belief $\mu_{i1} = \alpha_i$, $\mu_{i2} = \beta\alpha_i$ is correct.

7. Find the Generalized Likelihood Ratio Test for testing $H_0: \sigma = \sigma_0$ versus the one-sided alternative $H_1: \sigma < \sigma_0$ based on a sample $X_1, X_2, \ldots, X_n$ from $N(\mu, \sigma^2)$ if (i) $\mu$ is known and (ii) $\mu$ is unknown. Do either of these tests result in a UMP or UMPU procedure?

8. Let $Z$ be an $n \times 1$ column vector with $N_n(0, I_n)$ distribution, where $I_n$ is the identity matrix. Consider matrices of the form $A = aI_n + bE_n$, where $a$ and $b \neq 0$ are scalar constants, and $E_n$ is the $n \times n$ matrix with all entries 1.

(a) Show that for matrices $A$ of the above form, $Q = Z'AZ$ has a chi-squared distribution if and only if either (i) $a = 1$ and $b = -(1/n)$ (call the resulting matrix $A_1$), OR (ii) $a = 0$ and $b = 1/n$ (call the resulting matrix $A_2$). Find the degrees of freedom of the chi-squared in each case.

(b) Show that $Q_1 = Z'A_1Z$ and $Q_2 = Z'A_2Z$ are independent using results on quadratic forms. Relate these quadratic forms to the sample mean and sample variance of the components of $Z$.
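The identities behind Question 8(b), namely $Z'A_1Z = \sum_i (Z_i - \bar{Z})^2$ and $Z'A_2Z = n\bar{Z}^2$, can be checked numerically (the vector values below are arbitrary):

```python
# verify Z'A1Z = sum (z_i - zbar)^2 and Z'A2Z = n * zbar^2
# for A1 = I - E/n and A2 = E/n, using plain lists (no numpy needed)
z = [0.4, -1.2, 2.5, 0.7]
n = len(z)
zbar = sum(z) / n

def quad(a, b):
    # Z' (a*I + b*E) Z = a * sum z_i^2 + b * (sum z_i)^2
    return a * sum(v * v for v in z) + b * sum(z) ** 2

q1 = quad(1.0, -1.0 / n)
q2 = quad(0.0, 1.0 / n)
assert abs(q1 - sum((v - zbar) ** 2 for v in z)) < 1e-12
assert abs(q2 - n * zbar ** 2) < 1e-12
assert abs((q1 + q2) - sum(v * v for v in z)) < 1e-12  # A1 + A2 = I
```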

Mathematical Statistics Qualifying Exam

September 8, 2010

Answer any 5 of these 8 questions. You may consult your textbook, but no notes. Please indicate on the front of your exam paper which 5 questions you have answered.

1. Suppose that we have $n$ independent observations from a geometric distribution with pmf
$$P_p\{X = k\} = p(1 - p)^{k-1} \qquad \text{for } k = 1, 2, 3, \ldots$$
(a) Find the MLE of $\theta = (1 - p)^4$.
(b) Find the MVUE of $\theta = (1 - p)^4$.

2. Suppose that $Y_1, \ldots, Y_n$ are independent $N(\theta, c^2\theta^2)$, where $c$ is a known constant and $-\infty < \theta < \infty$.
(a) Obtain formulas for the two statistics which form the minimal sufficient statistic $S$ for $\theta$.
(b) Prove that $S$ is not complete.
(c) What implications, if any, does the lack of completeness of $S$ have for estimation of $\theta$?
(d) Each component of $S$ can be used to generate a simple estimate of $\theta$. Which one is better? Explain your choice.

3. Observations $Y_1, \ldots, Y_n$ are a random sample from the normal distribution with mean zero and variance $\theta$. There is a prior distribution on $\theta$ where $\psi = \nu/\theta$ has a chi-squared distribution with $\lambda$ degrees of freedom, and $\nu$ and $\lambda$ are known constants.
(a) Find the prior means of $\theta$ and $\theta^{-1}$.
(b) Show that the posterior distribution of $(\nu + \sum_{i=1}^n Y_i^2)/\theta$ is chi-squared with $\lambda + n$ degrees of freedom.
(c) Find an interval of the form $[0, a)$ that is a $1 - \alpha$ credible set for $\theta$.

4. The random variables $Y_{ij}$ are independent Poisson with means $\mu_{ij}$, where $\mu_{ij} = \alpha_i\beta_j$ for $i = 1, 2$ and $j = 1, \ldots, n$. All parameters $\alpha_1, \alpha_2, \beta_1, \ldots, \beta_n$ are unknown. We want to test the null hypothesis that $\alpha_1 = \alpha_2$ versus the one-sided alternative $\alpha_1 < \alpha_2$. Obtain the most powerful exact test and give a formula for calculating the $P$-value (significance probability).

5. Suppose that $X_1, \ldots, X_n$ are independent trinomial with $\Pr(X_i = k) = p_k$ for $k = 1, 2, 3$. All parameters $p_1, p_2, p_3$ are unknown. We wish to test the null hypothesis NH: $p_1 = p_3$ versus the alternative hypothesis $p_1 > p_3$. Derive a good test and explain why it is good.

6. Let $X_1, X_2, \ldots, X_n$ be independent normal random variables with unknown parameters $E(X_i) = \mu_i$ and $\mathrm{Var}(X_i) = \sigma^2$ for $i = 1, 2, \ldots, n$. For $\mu = (\mu_1, \mu_2, \ldots, \mu_n)^T$, suppose that we know $c_j^T\mu = d_j$ for $j = 1, 2, \ldots, k$, where the $c_j$ and $d_j$ are given for $0 < k < n$ and $c_1, c_2, \ldots, c_k$ are linearly independent. What is the UMVUE of $\sigma^2$?

7. Let $X_1, X_2, \ldots, X_m$ be iid Uniform distributed on the interval $(0, \theta)$, and $Y_1, Y_2, \ldots, Y_n$ iid Uniform on $(0, \mu)$. Assume the $X_i$'s and $Y_i$'s are mutually independent.
(a) Derive the density for
$$U = \frac{\max_i X_i}{\max_i Y_i}.$$
(b) Suppose we wish to test the hypotheses $H_0: \theta \leq \mu$ vs. $H_1: \theta > \mu$. Consider the test which rejects $H_0$ when $U > c$ for some given critical value $c$. Show that the power of the test is monotonically increasing in $\rho = \theta/\mu$.
(c) If $m = 4$ and $n = 2$, find the critical value of the test in (b) so as to achieve significance level $\alpha = 1/24$.

8. Suppose that we have $n$ independent observations from a Laplace distribution
$$f(x_i \mid \theta) = \frac{1}{4\theta}\,e^{-|x_i|/(2\theta)}.$$
We hope to find a 95% confidence interval for $2\theta$.
(a) Construct a pivot that is a function of $\bar{x}$.
(b) Calculate an approximate 95% confidence interval for $2\theta$ based on the pivot from part (a).
(c) Construct another approximate 95% confidence interval that is a function of the median of the $x_i$.
(d) Which interval should we prefer? Why?

UNIVERSITY OF CALIFORNIA, SANTA BARBARA
Department of Statistics & Applied Probability
MATHEMATICAL STATISTICS QUALIFYING EXAM
Monday September 12 2011, 9:00-12:00

Answer 5 of the 9 questions. All questions carry equal weight. It is permitted to refer to the book Statistical Inference by Casella and Berger, but nothing else.

1. Suppose that $X_1, X_2, \ldots, X_n$ are independent bivariate Normal with mean vector $\mu$ and variance matrix $\sigma^2 I$, with $I$ the identity matrix and $\sigma^2$ known. We want to test the null hypothesis NH: $\mu^T\mu \leq 1$ versus AH: $\mu^T\mu > 1$.
(i) Show that $\bar{X}$ is minimal sufficient for $\mu$.
(ii) Derive an appropriate powerful test for NH versus AH.
(iii) Explain how you would evaluate the $P$-value of your test, exactly or approximately.

2. Suppose that $Y_1, \ldots, Y_n$ are independent with pdf
$$f(y \mid \theta, \kappa) = \frac{1}{2\pi c(\kappa)}\exp\{\kappa\cos(y - \theta)\},$$
where $0 \leq y, \theta < 2\pi$ and $\kappa \geq 0$. Both $\theta$ and $\kappa$ are unknown.
(i) Derive the maximum likelihood estimator $\hat\theta$ of $\theta$.
(ii) Obtain a formula for calculating an approximate variance of $\hat\theta$ that does not require knowing $c(\kappa)$.
(iii) What is the exact distribution of $\hat\theta$ when $\kappa = 0$? (Note: you are not expected to derive this exact distribution using density function calculations, but briefly try and justify your answer.)

3. Let $X_1, X_2, \ldots, X_n$ be independent Poisson with common mean $\theta$.
(i) The sample average $\bar{X}$ is the uniformly minimum variance unbiased estimator of $\theta$. Carefully describe two ways to show this, one requiring calculation of $\mathrm{Var}(\bar{X})$ and the other not.
(ii) The sample variance $S^2$ is also an unbiased estimate for $\theta$. Prove that (a) $E(S^2 \mid \bar{X}) = \bar{X}$ and (b) $\mathrm{Var}(S^2) > \mathrm{Var}(\bar{X})$.

4. Suppose that $X_1, X_2, \ldots, X_n$ are independent $N(\theta, \theta)$, i.e. with variance equal to mean, with $\theta > 0$ unknown. Describe two distinct pivotal quantities that can be used to obtain exact confidence intervals for $\theta$. Is it possible to say which interval is better?

5. The random variables $Y_1, Y_2, \ldots, Y_m$ are independent Poisson with different means $\theta_1, \ldots, \theta_m$. Those means are independently sampled from the Gamma distribution with probability density function
$$p(\theta \mid \alpha, \beta) = \frac{\theta^{\alpha - 1}\exp(-\theta/\beta)}{\Gamma(\alpha)\beta^\alpha}.$$
(i) Obtain the optimal Bayes estimates $T_1, \ldots, T_m$ of $\theta_1, \ldots, \theta_m$ with respect to the loss function
$$\sum_{i=1}^m (T_i - \theta_i)^2/\theta_i.$$
(ii) Specify exactly how you would apply these Bayes estimates if $\alpha$ and $\beta$ were unknown.

6. Suppose that we are going to observe $n$ independent Exponential failure times with mean $\theta = 1/\lambda$. For the single unknown parameter $\lambda$ we have the conjugate prior distribution $\pi(\lambda) \propto \lambda^a\exp(-b\lambda)$. The ordered failure times are $T_1 < T_2 < \cdots < T_n$. We want to predict $T_n$ as soon as the $(n-1)$st failure occurs.
(i) Show that $Z_1 = nT_1$ and $Z_j = (n - j + 1)(T_j - T_{j-1})$, $j = 2, \ldots, n$, are mutually independent Exponential with mean $\theta$.
(ii) Obtain a formula for the upper $1 - p$ prediction limit for the final failure time $T_n$ given $T_1 < T_2 < \cdots < T_{n-1}$.

7. Let $Y_{ij} = \alpha + \beta_i x_j + \epsilon_{ij}$ for $i = 1, \ldots, m$ and $j = 1, \ldots, n$. The random errors $\epsilon_{ij}$ are independent $N(0, \sigma^2)$.
(i) Obtain explicit formulae for the least squares estimates of $\alpha, \beta_1, \ldots, \beta_m$.
(ii) Explain carefully how to test whether or not the slope parameters are equal when $\sigma$ is unknown. [You may need the following matrix identity: if
$$M = \begin{pmatrix} A & b \\ b^T & c \end{pmatrix} \quad\text{then}\quad M^{-1} = \begin{pmatrix} A^{-1} + kdd^T & -kd \\ -kd^T & k \end{pmatrix},$$
where $c$ is scalar, $d = A^{-1}b$ and $k = (c - b^TA^{-1}b)^{-1}$.]

8. Let $Y_i = \alpha + \beta x_i^\gamma + \epsilon_i$ for $i = 1, \ldots, n$. The parameters $\alpha, \beta, \gamma$ are all unknown, and the random errors $\epsilon_i$ are independent $N(0, \sigma^2)$ where $\sigma$ is unknown. Obtain the score test for the null hypothesis that $\gamma = 1$ with alternative hypothesis that $\gamma < 1$.

9. The independent pairs $(X_j, Y_j)$, $j = 1, \ldots, n$, are non-negative integers with the $X$'s and $Y$'s having Poisson distributions with different unknown means, but not being independent necessarily. Rather, $X_j = U_j + W_j$ and $Y_j = V_j + W_j$, where the $U$'s, $V$'s and $W$'s are mutually independent Poisson, the means being respectively $\lambda$, $\mu$ and $\nu$, which are all unknown. We want to test the null hypothesis NH: $\nu = 0$, corresponding to independence of the $X$'s and $Y$'s, with one-sided alternative of positive dependence (i.e. $\nu > 0$).
(i) Write down the joint probability density function of $(X_j, Y_j)$.
(ii) What is the minimal sufficient statistic $S = s(X_1, Y_1, \ldots, X_n, Y_n)$ for $(\lambda, \mu)$ under NH?
(iii) Show that the score test statistic for testing NH is proportional to
$$T = \sum_{j=1}^n (X_j - \bar{X})(Y_j - \bar{Y})/(\bar{X}\bar{Y}).$$
(iv) Prove that $E(T \mid S, \text{NH}) = 0$.
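The spacings result in Question 6(i) (the Rényi representation of exponential order statistics) can be illustrated by simulation; the parameter values and seed below are arbitrary:

```python
import random

# Monte Carlo sketch: for exponential order statistics T_1 < ... < T_n,
# the normalized spacings Z_j = (n - j + 1)(T_j - T_{j-1}) (with T_0 = 0)
# should each be Exponential with the original mean theta
random.seed(4)
theta, n, reps = 1.5, 5, 40_000
sums = [0.0] * n
for _ in range(reps):
    t = sorted(random.expovariate(1 / theta) for _ in range(n))
    prev = 0.0
    for j in range(n):
        sums[j] += (n - j) * (t[j] - prev)  # 0-based j: coefficient n-j
        prev = t[j]
means = [s / reps for s in sums]
assert all(abs(m - theta) < 0.05 for m in means)
```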
