
Solutions Manual to
MATHEMATICAL STATISTICS:
Asymptotic Minimax Theory

Alexander Korostelev
Wayne State University, Detroit, MI 48202

Olga Korosteleva
California State University, Long Beach, CA 90840


Chapter 1
EXERCISE 1.1 To verify first that the representation holds, compute the second partial derivative of $\ln p(x,\theta)$ with respect to $\theta$. It is
\[
\frac{\partial^2 \ln p(x,\theta)}{\partial\theta^2}
= -\frac{1}{p^2(x,\theta)}\Big(\frac{\partial p(x,\theta)}{\partial\theta}\Big)^2
+ \frac{1}{p(x,\theta)}\,\frac{\partial^2 p(x,\theta)}{\partial\theta^2}
= -\Big(\frac{\partial \ln p(x,\theta)}{\partial\theta}\Big)^2
+ \frac{1}{p(x,\theta)}\,\frac{\partial^2 p(x,\theta)}{\partial\theta^2}.
\]
Multiplying by $p(x,\theta)$ and rearranging the terms produce the result,
\[
\Big(\frac{\partial \ln p(x,\theta)}{\partial\theta}\Big)^2 p(x,\theta)
= \frac{\partial^2 p(x,\theta)}{\partial\theta^2}
- \Big(\frac{\partial^2 \ln p(x,\theta)}{\partial\theta^2}\Big)\, p(x,\theta).
\]
Now, integrating both sides of this equality with respect to $x$, we obtain
\[
I_n(\theta) = n\int_{\mathbb{R}}\Big(\frac{\partial \ln p(x,\theta)}{\partial\theta}\Big)^2 p(x,\theta)\,dx
= n\,\frac{\partial^2}{\partial\theta^2}\int_{\mathbb{R}} p(x,\theta)\,dx
- n\int_{\mathbb{R}}\Big(\frac{\partial^2 \ln p(x,\theta)}{\partial\theta^2}\Big) p(x,\theta)\,dx
= 0 - n\,\mathbb{E}_\theta\Big[\frac{\partial^2 \ln p(X,\theta)}{\partial\theta^2}\Big]
= -\,n\,\mathbb{E}_\theta\Big[\frac{\partial^2 \ln p(X,\theta)}{\partial\theta^2}\Big].
\]

EXERCISE 1.2 The first step is to notice that $\hat\theta_n$ is an unbiased estimator of $\theta$. Indeed, $\mathbb{E}_\theta[\hat\theta_n] = \mathbb{E}_\theta\big[(1/n)\sum_{i=1}^n (X_i-\mu)^2\big] = \mathbb{E}_\theta[(X_1-\mu)^2] = \theta$.

Further, the log-likelihood function for the $N(\mu,\theta)$ distribution has the form
\[
\ln p(x,\theta) = -\tfrac12\ln(2\pi\theta) - \frac{(x-\mu)^2}{2\theta}.
\]
Therefore,
\[
\frac{\partial \ln p(x,\theta)}{\partial\theta} = -\frac{1}{2\theta} + \frac{(x-\mu)^2}{2\theta^2},
\quad\text{and}\quad
\frac{\partial^2 \ln p(x,\theta)}{\partial\theta^2} = \frac{1}{2\theta^2} - \frac{(x-\mu)^2}{\theta^3}.
\]
Applying the result of Exercise 1.1, we get
\[
I_n(\theta) = -\,n\,\mathbb{E}_\theta\Big[\frac{\partial^2 \ln p(X,\theta)}{\partial\theta^2}\Big]
= -\,n\,\mathbb{E}_\theta\Big[\frac{1}{2\theta^2} - \frac{(X-\mu)^2}{\theta^3}\Big]
= -\,n\Big[\frac{1}{2\theta^2} - \frac{\theta}{\theta^3}\Big]
= \frac{n}{2\theta^2}.
\]
Next, using the fact that $\sum_{i=1}^n (X_i-\mu)^2/\theta$ has a chi-squared distribution with $n$ degrees of freedom, and, hence, its variance equals $2n$, we arrive at
\[
\mathrm{Var}_\theta[\hat\theta_n] = \mathrm{Var}_\theta\Big[\frac1n\sum_{i=1}^n (X_i-\mu)^2\Big]
= \frac{\theta^2}{n^2}\cdot 2n = \frac{2\theta^2}{n} = \frac{1}{I_n(\theta)}.
\]
Thus, we have shown that $\hat\theta_n$ is an unbiased estimator of $\theta$ and that its variance attains the Cramér-Rao lower bound, that is, $\hat\theta_n$ is an efficient estimator of $\theta$.
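A minimal Monte Carlo sketch (not part of the original solution) that illustrates this conclusion: for simulated $N(\mu,\theta)$ samples, the estimator $\hat\theta_n=(1/n)\sum_i(X_i-\mu)^2$ is unbiased and its variance is close to $2\theta^2/n = 1/I_n(\theta)$. The parameter values, sample size, and seed below are arbitrary illustration choices.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, theta, n, reps = 2.0, 3.0, 50, 200_000   # arbitrary illustration values

# theta plays the role of the variance of the N(mu, theta) observations; mu is known
samples = rng.normal(mu, np.sqrt(theta), size=(reps, n))
theta_hat = np.mean((samples - mu) ** 2, axis=1)

print("mean of estimator :", theta_hat.mean())     # ~ theta  (unbiasedness)
print("variance          :", theta_hat.var())      # ~ 2*theta^2/n
print("Cramer-Rao bound  :", 2 * theta**2 / n)     # 1 / I_n(theta)
```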

EXERCISE 1.3 For the Bernoulli($\theta$) distribution,
\[
\ln p(x,\theta) = x\ln\theta + (1-x)\ln(1-\theta),
\]
thus,
\[
\frac{\partial \ln p(x,\theta)}{\partial\theta} = \frac{x}{\theta} - \frac{1-x}{1-\theta}
\quad\text{and}\quad
\frac{\partial^2 \ln p(x,\theta)}{\partial\theta^2} = -\frac{x}{\theta^2} - \frac{1-x}{(1-\theta)^2}.
\]
From here,
\[
I_n(\theta) = -\,n\,\mathbb{E}_\theta\Big[-\frac{X}{\theta^2} - \frac{1-X}{(1-\theta)^2}\Big]
= n\Big[\frac{\theta}{\theta^2} + \frac{1-\theta}{(1-\theta)^2}\Big]
= \frac{n}{\theta(1-\theta)}.
\]
On the other hand, $\mathbb{E}_\theta[\bar X_n] = \mathbb{E}_\theta[X] = \theta$ and $\mathrm{Var}_\theta[\bar X_n] = \mathrm{Var}_\theta[X]/n = \theta(1-\theta)/n = 1/I_n(\theta)$. Therefore $\hat\theta_n = \bar X_n$ is efficient.

EXERCISE 1.4 In the Poisson($\theta$) model,
\[
\ln p(x,\theta) = x\ln\theta - \theta - \ln x!,
\]
hence,
\[
\frac{\partial \ln p(x,\theta)}{\partial\theta} = \frac{x}{\theta} - 1
\quad\text{and}\quad
\frac{\partial^2 \ln p(x,\theta)}{\partial\theta^2} = -\frac{x}{\theta^2}.
\]
Thus,
\[
I_n(\theta) = -\,n\,\mathbb{E}_\theta\Big[-\frac{X}{\theta^2}\Big] = \frac{n}{\theta}.
\]
The estimate $\bar X_n$ is unbiased with the variance $\mathrm{Var}_\theta[\bar X_n] = \theta/n = 1/I_n(\theta)$, and therefore efficient.


EXERCISE 1.5 For the given exponential density,
\[
\ln p(x,\theta) = -\ln\theta - x/\theta,
\]
whence,
\[
\frac{\partial \ln p(x,\theta)}{\partial\theta} = -\frac1\theta + \frac{x}{\theta^2}
\quad\text{and}\quad
\frac{\partial^2 \ln p(x,\theta)}{\partial\theta^2} = \frac{1}{\theta^2} - \frac{2x}{\theta^3}.
\]
Therefore,
\[
I_n(\theta) = -\,n\,\mathbb{E}_\theta\Big[\frac{1}{\theta^2} - \frac{2X}{\theta^3}\Big]
= -\,n\Big[\frac{1}{\theta^2} - \frac{2\theta}{\theta^3}\Big] = \frac{n}{\theta^2}.
\]
Also, $\mathbb{E}_\theta[\bar X_n] = \theta$ and $\mathrm{Var}_\theta[\bar X_n] = \theta^2/n = 1/I_n(\theta)$. Hence efficiency holds.
EXERCISE 1.6 If $X_1,\dots,X_n$ are independent exponential random variables with the mean $1/\theta$, their sum $Y = \sum_{i=1}^n X_i$ has a gamma distribution with the density
\[
f_Y(y) = \frac{\theta^n y^{\,n-1} e^{-y\theta}}{\Gamma(n)}, \qquad y>0.
\]
Consequently,
\[
\mathbb{E}_\theta\Big[\frac{n}{Y}\Big]
= \frac{n}{\Gamma(n)}\int_0^\infty \frac1y\, y^{\,n-1}\theta^n e^{-y\theta}\,dy
= \frac{n\,\theta}{\Gamma(n)}\int_0^\infty y^{\,n-2}\theta^{\,n-1} e^{-y\theta}\,dy
= \frac{n\,\theta\,\Gamma(n-1)}{\Gamma(n)}
= \frac{n\,\theta\,(n-2)!}{(n-1)!}
= \frac{n\,\theta}{n-1}.
\]
Also,
\[
\mathrm{Var}_\theta\big[1/\bar X_n\big] = \mathrm{Var}_\theta\big[n/Y\big]
= n^2\Big(\mathbb{E}_\theta\big[1/Y^2\big] - \big(\mathbb{E}_\theta[1/Y]\big)^2\Big)
= n^2\Big[\frac{\theta^2\,\Gamma(n-2)}{\Gamma(n)} - \frac{\theta^2}{(n-1)^2}\Big]
= n^2\theta^2\Big[\frac{1}{(n-1)(n-2)} - \frac{1}{(n-1)^2}\Big]
= \frac{n^2\theta^2}{(n-1)^2(n-2)}.
\]

EXERCISE 1.7 The trick here is to notice the relation
\[
\frac{\partial \ln p_0(x-\theta)}{\partial\theta}
= \frac{1}{p_0(x-\theta)}\,\frac{\partial p_0(x-\theta)}{\partial\theta}
= -\,\frac{p_0'(x-\theta)}{p_0(x-\theta)}.
\]
Thus we can write
\[
I_n(\theta) = n\,\mathbb{E}_\theta\Big[\Big(\frac{p_0'(X-\theta)}{p_0(X-\theta)}\Big)^2\Big]
= n\int_{\mathbb{R}} \frac{(p_0'(y))^2}{p_0(y)}\,dy,
\]
which is a constant independent of $\theta$.
EXERCISE 1.8 Using the expression for the Fisher information derived in the previous exercise, we write
\[
I_n(\theta) = n\int_{-\pi/2}^{\pi/2} \frac{(p_0'(y))^2}{p_0(y)}\,dy
= n\int_{-\pi/2}^{\pi/2} \frac{\big(-C\,\alpha\,\cos^{\alpha-1}y\,\sin y\big)^2}{C\cos^{\alpha}y}\,dy
= n\,C\,\alpha^2\int_{-\pi/2}^{\pi/2} \sin^2 y\,\cos^{\alpha-2}y\,dy
\]
\[
= n\,C\,\alpha^2\int_{-\pi/2}^{\pi/2} (1-\cos^2 y)\,\cos^{\alpha-2}y\,dy
= n\,C\,\alpha^2\int_{-\pi/2}^{\pi/2} \big(\cos^{\alpha-2}y - \cos^{\alpha}y\big)\,dy.
\]
Here the first term is integrable if $\alpha-2 > -1$ (equivalently, $\alpha > 1$), while the second one is integrable if $\alpha > -1$. Therefore, the Fisher information exists when $\alpha > 1$.
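A small numerical sketch (not from the original solution) of the integral $\int_{-\pi/2}^{\pi/2}(p_0'(y))^2/p_0(y)\,dy$ for the density $p_0(y)=C\cos^\alpha y$: the approximation stabilizes for $\alpha>1$ and keeps growing as the grid is refined when $\alpha\le 1$. The grid sizes and the tested values of $\alpha$ are arbitrary choices.

```python
import numpy as np

def fisher_integral(alpha, m):
    """Midpoint-rule approximation of the integral of (p0')^2 / p0 over (-pi/2, pi/2)
    for p0(y) = C cos(y)^alpha, i.e. C * alpha^2 * int sin^2(y) cos^(alpha-2)(y) dy."""
    y = (np.arange(m) + 0.5) * np.pi / m - np.pi / 2        # interior midpoints
    h = np.pi / m
    C = 1.0 / np.sum(np.cos(y) ** alpha * h)                # normalizing constant of p0
    integrand = C * alpha**2 * np.sin(y) ** 2 * np.cos(y) ** (alpha - 2)
    return np.sum(integrand * h)

for alpha in (3.0, 1.5, 1.0, 0.5):
    vals = [fisher_integral(alpha, m) for m in (10_000, 100_000, 1_000_000)]
    print(f"alpha = {alpha}: " + " -> ".join(f"{v:.4f}" for v in vals))
# For alpha > 1 the three values agree; for alpha <= 1 they keep increasing (divergence).
```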




Chapter 2
EXERCISE 2.9 By Exercise 1.4, the Fisher information of the Poisson($\theta$) sample is $I_n(\theta) = n/\theta$. The joint distribution of the sample is
\[
p(X_1,\dots,X_n,\theta) = C_n\,\theta^{\sum_i X_i}\,e^{-n\theta},
\]
where $C_n = C_n(X_1,\dots,X_n)$ is the normalizing constant independent of $\theta$. As a function of $\theta$, this joint probability has the algebraic form of a gamma distribution. Thus, if we select the prior density to be a gamma density, $\pi(\theta) = C(\alpha,\beta)\,\theta^{\alpha-1}e^{-\beta\theta}$, $\theta>0$, for some positive $\alpha$ and $\beta$, then the weighted posterior density is also a gamma density,
\[
f(\theta\,|\,X_1,\dots,X_n) = I_n(\theta)\,C_n\,\theta^{\sum_i X_i}\,e^{-n\theta}\,C(\alpha,\beta)\,\theta^{\alpha-1}e^{-\beta\theta}
= \tilde C_n\,\theta^{\sum_i X_i+\alpha-2}\,e^{-(n+\beta)\theta}, \qquad \theta>0,
\]
where $\tilde C_n = \tilde C_n(X_1,\dots,X_n)$ is the normalizing constant. The expected value of the weighted posterior gamma distribution is equal to
\[
\int_0^\infty \theta\, f(\theta\,|\,X_1,\dots,X_n)\,d\theta = \frac{\sum_{i=1}^n X_i + \alpha - 1}{n+\beta}.
\]
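A short sketch (simulated data, arbitrary $\alpha$, $\beta$, and seed; not part of the original solution) comparing the closed-form weighted posterior mean $(\sum X_i+\alpha-1)/(n+\beta)$ with a direct grid evaluation of $\int \theta\, I_n(\theta)\,p(X|\theta)\,\pi(\theta)\,d\theta \big/ \int I_n(\theta)\,p(X|\theta)\,\pi(\theta)\,d\theta$.

```python
import numpy as np

rng = np.random.default_rng(1)
theta_true, n, alpha, beta = 2.5, 40, 3.0, 1.5        # arbitrary illustration values
x = rng.poisson(theta_true, size=n)
s = x.sum()

closed_form = (s + alpha - 1) / (n + beta)             # formula derived above

# brute-force evaluation of the weighted posterior mean on a grid
theta = np.linspace(1e-6, 15.0, 200_001)
log_w = (np.log(n) - np.log(theta)                     # weight I_n(theta) = n / theta
         + s * np.log(theta) - n * theta               # Poisson likelihood kernel
         + (alpha - 1) * np.log(theta) - beta * theta) # gamma prior kernel
w = np.exp(log_w - log_w.max())
numeric = np.sum(theta * w) / np.sum(w)

print("closed form:", closed_form)
print("grid value :", numeric)
```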

EXERCISE 2.10 As shown in Example 1.10, the Fisher information $I_n(\theta) = n/\sigma^2$. Thus, the weighted posterior distribution of $\theta$ can be found as follows:
\[
f(\theta\,|\,X_1,\dots,X_n) = C\,\frac{n}{\sigma^2}\,
\exp\Big\{-\frac{\sum_i X_i^2}{2\sigma^2} + \frac{2\theta\, n\bar X_n}{2\sigma^2} - \frac{n\theta^2}{2\sigma^2}
- \frac{\theta^2 - 2\theta\mu + \mu^2}{2\sigma_\theta^2}\Big\}
\]
\[
= C_1 \exp\Big\{-\frac12\Big[\theta^2\Big(\frac{n}{\sigma^2}+\frac{1}{\sigma_\theta^2}\Big)
- 2\theta\Big(\frac{n\bar X_n}{\sigma^2}+\frac{\mu}{\sigma_\theta^2}\Big)\Big]\Big\}
= C_2 \exp\Big\{-\frac12\Big(\frac{n}{\sigma^2}+\frac{1}{\sigma_\theta^2}\Big)
\Big(\theta - \frac{n\sigma_\theta^2\bar X_n + \mu\sigma^2}{n\sigma_\theta^2+\sigma^2}\Big)^2\Big\}.
\]
Here $C$, $C_1$, and $C_2$ are the appropriate normalizing constants. Thus, the weighted posterior mean is $(n\sigma_\theta^2\bar X_n + \mu\sigma^2)/(n\sigma_\theta^2+\sigma^2)$ and the variance is
\[
\Big(\frac{n}{\sigma^2}+\frac{1}{\sigma_\theta^2}\Big)^{-1} = \frac{\sigma^2\sigma_\theta^2}{n\sigma_\theta^2+\sigma^2}.
\]
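A minimal numerical sketch (simulated data; arbitrary values for $\mu$, $\sigma$, $\sigma_\theta$, $n$, and the seed; not part of the original solution) checking the posterior mean and variance formulas against a grid evaluation of the posterior; the constant weight $I_n(\theta)=n/\sigma^2$ cancels and does not affect the result.

```python
import numpy as np

rng = np.random.default_rng(2)
theta_true, sigma, mu, sigma_th, n = 1.0, 2.0, 0.0, 1.5, 30   # arbitrary values
x = rng.normal(theta_true, sigma, size=n)
xbar, sum_sq = x.mean(), np.sum(x**2)

# closed-form weighted posterior mean and variance derived above
mean_cf = (n * sigma_th**2 * xbar + mu * sigma**2) / (n * sigma_th**2 + sigma**2)
var_cf = sigma**2 * sigma_th**2 / (n * sigma_th**2 + sigma**2)

# grid evaluation of the (unnormalized) posterior density
theta = np.linspace(mean_cf - 8.0, mean_cf + 8.0, 400_001)
log_post = (-(sum_sq - 2 * theta * n * xbar + n * theta**2) / (2 * sigma**2)
            - (theta - mu) ** 2 / (2 * sigma_th**2))
w = np.exp(log_post - log_post.max())
mean_grid = np.sum(theta * w) / np.sum(w)
var_grid = np.sum((theta - mean_grid) ** 2 * w) / np.sum(w)

print("posterior mean:", mean_cf, "vs", mean_grid)
print("posterior var :", var_cf, "vs", var_grid)
```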
EXERCISE 2.11 First, we derive the Fisher information for the exponential model. We have
\[
\ln p(x,\theta) = \ln\theta - \theta x, \qquad
\frac{\partial \ln p(x,\theta)}{\partial\theta} = \frac1\theta - x,
\]
and
\[
\frac{\partial^2 \ln p(x,\theta)}{\partial\theta^2} = -\frac{1}{\theta^2}.
\]
Consequently,
\[
I_n(\theta) = -\,n\,\mathbb{E}_\theta\Big[-\frac{1}{\theta^2}\Big] = \frac{n}{\theta^2}.
\]
Further, the joint distribution of the sample is
\[
p(X_1,\dots,X_n,\theta) = \prod_{i=1}^n \theta\,e^{-\theta X_i}
= C_n\,\theta^{\,n}\,e^{-\theta\sum_i X_i},
\]
with the normalizing constant $C_n = C_n(X_1,\dots,X_n)$ independent of $\theta$. As a function of $\theta$, this joint probability belongs to the family of gamma distributions; hence, if we choose the conjugate prior to be a gamma distribution, $\pi(\theta) = C(\alpha,\beta)\,\theta^{\alpha-1}e^{-\beta\theta}$, $\theta>0$, with some $\alpha>0$ and $\beta>0$, then the weighted posterior is also a gamma density,
\[
f(\theta\,|\,X_1,\dots,X_n) = I_n(\theta)\,C_n\,\theta^{\,n}e^{-\theta\sum_i X_i}\,C(\alpha,\beta)\,\theta^{\alpha-1}e^{-\beta\theta}
= \tilde C_n\,\theta^{\,n+\alpha-3}\,e^{-(\sum_i X_i+\beta)\theta}, \qquad \theta>0,
\]
where $\tilde C_n$ is the normalizing constant. The corresponding weighted posterior mean of the gamma distribution is equal to
\[
\frac{n+\alpha-2}{\sum_{i=1}^n X_i+\beta}.
\]

EXERCISE 2.12 (i) The joint density of $n$ independent Bernoulli($\theta$) observations $X_1,\dots,X_n$ is
\[
p(X_1,\dots,X_n,\theta) = \theta^{\sum_i X_i}(1-\theta)^{\,n-\sum_i X_i}.
\]
Using the conjugate prior $\pi(\theta) = C\,[\theta(1-\theta)]^{\sqrt n/2-1}$, we obtain the non-weighted posterior density
\[
f(\theta\,|\,X_1,\dots,X_n) = C\,\theta^{\sum_i X_i+\sqrt n/2-1}(1-\theta)^{\,n-\sum_i X_i+\sqrt n/2-1},
\]
which is a beta density with the mean
\[
\theta_n^* = \frac{\sum_i X_i+\sqrt n/2}{\big(\sum_i X_i+\sqrt n/2\big)+\big(n-\sum_i X_i+\sqrt n/2\big)}
= \frac{\sum_i X_i+\sqrt n/2}{n+\sqrt n}.
\]
(ii) The variance of $\theta_n^*$ is
\[
\mathrm{Var}_\theta[\theta_n^*] = \frac{n\,\theta(1-\theta)}{(n+\sqrt n)^2},
\]
and the bias equals
\[
b_n(\theta,\theta_n^*) = \mathbb{E}_\theta[\theta_n^*] - \theta
= \frac{n\theta+\sqrt n/2}{n+\sqrt n} - \theta = \frac{\sqrt n/2-\sqrt n\,\theta}{n+\sqrt n}.
\]
Consequently, the non-normalized quadratic risk of $\theta_n^*$ is
\[
\mathbb{E}_\theta\big[(\theta_n^*-\theta)^2\big] = \mathrm{Var}_\theta[\theta_n^*] + b_n^2(\theta,\theta_n^*)
= \frac{n\theta(1-\theta) + n\,(1/2-\theta)^2}{(n+\sqrt n)^2}
= \frac{n/4}{(n+\sqrt n)^2} = \frac{1}{4\,(1+\sqrt n)^2}.
\]
(iii) Let $t_n = t_n(X_1,\dots,X_n)$ be the Bayes estimator with respect to a non-normalized risk function
\[
R_n(\theta,\hat\theta_n,w) = \mathbb{E}_\theta\big[w(\hat\theta_n-\theta)\big].
\]
The statement and the proof of Theorem 2.7 remain exactly the same if the non-normalized risk and the corresponding Bayes estimator are used. Since $\theta_n^*$ is the Bayes estimator for a constant non-normalized risk, it is minimax.
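A brief numerical sketch (exact computation over the binomial distribution; arbitrary $n$ and grid of $\theta$ values; not part of the original solution) confirming that the quadratic risk of $\theta_n^* = (\sum X_i + \sqrt n/2)/(n+\sqrt n)$ does not depend on $\theta$ and equals $1/(4(1+\sqrt n)^2)$.

```python
from math import comb, sqrt

def exact_risk(theta, n):
    """E_theta[(theta_n^* - theta)^2], summing over S = sum of X_i ~ Binomial(n, theta)."""
    risk = 0.0
    for s in range(n + 1):
        prob = comb(n, s) * theta**s * (1 - theta) ** (n - s)
        est = (s + sqrt(n) / 2) / (n + sqrt(n))
        risk += prob * (est - theta) ** 2
    return risk

n = 25
print("claimed constant risk:", 1 / (4 * (1 + sqrt(n)) ** 2))
for theta in (0.1, 0.3, 0.5, 0.9):
    print(f"theta = {theta}: exact risk = {exact_risk(theta, n):.8f}")
```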
EXERCISE 2.13 In Example 2.1, let $\alpha = \beta = 1 + 1/b$. Then the Bayes estimator assumes the form
\[
t_n(b) = \frac{\sum_i X_i + 1/b}{n+2/b},
\]
where the $X_i$'s are independent Bernoulli($\theta$) random variables. The normalized quadratic risk of $t_n(b)$ is equal to
\[
R_n(\theta,t_n(b),w) = \mathbb{E}_\theta\Big[\big(\sqrt{I_n(\theta)}\,(t_n(b)-\theta)\big)^2\Big]
= I_n(\theta)\Big[\mathrm{Var}_\theta[t_n(b)] + b_n^2(\theta,t_n(b))\Big]
\]
\[
= I_n(\theta)\Big[\frac{n\,\mathrm{Var}_\theta[X_1]}{(n+2/b)^2}
+ \Big(\frac{n\,\mathbb{E}_\theta[X_1]+1/b}{n+2/b}-\theta\Big)^2\Big]
= \frac{n}{\theta(1-\theta)}\Big[\frac{n\theta(1-\theta)}{(n+2/b)^2}
+ \Big(\frac{n\theta+1/b}{n+2/b}-\theta\Big)^2\Big]
\]
\[
= \frac{n}{\theta(1-\theta)}\Big[\frac{n\theta(1-\theta)}{(n+2/b)^2}
+ \frac{(1-2\theta)^2}{b^2(n+2/b)^2}\Big]
\;\longrightarrow\; \frac{n}{\theta(1-\theta)}\cdot\frac{n\theta(1-\theta)}{n^2} = 1
\quad\text{as } b\to\infty.
\]
Thus, by Theorem 2.8, the minimax lower bound is equal to 1. The normalized quadratic risk of $\bar X_n = \lim_{b\to\infty} t_n(b)$ is derived as
\[
R_n(\theta,\bar X_n,w) = \mathbb{E}_\theta\Big[\big(\sqrt{I_n(\theta)}\,(\bar X_n-\theta)\big)^2\Big]
= I_n(\theta)\,\mathrm{Var}_\theta[\bar X_n]
= \frac{n}{\theta(1-\theta)}\cdot\frac{\theta(1-\theta)}{n} = 1.
\]
That is, it attains the minimax lower bound, and hence $\bar X_n$ is minimax.


Chapter 3
EXERCISE 3.14 Let $X \sim \mathrm{Binomial}(n,\theta^2)$. Then
\[
\mathbb{E}_\theta\big[\,\big|\sqrt{X/n}-\theta\big|\,\big]
= \mathbb{E}_\theta\Big[\frac{|X/n-\theta^2|}{\sqrt{X/n}+\theta}\Big]
\le \frac1\theta\,\mathbb{E}_\theta\big[\,|X/n-\theta^2|\,\big]
\le \frac1\theta\sqrt{\mathbb{E}_\theta\big[(X/n-\theta^2)^2\big]}
\quad\text{(by the Cauchy-Schwarz inequality)}
\]
\[
= \frac1\theta\sqrt{\frac{\theta^2(1-\theta^2)}{n}} = \sqrt{\frac{1-\theta^2}{n}} \le \frac{1}{\sqrt n}.
\]

EXERCISE 3.15 First we show that the Hodges estimator $\hat\theta_n$ is asymptotically unbiased. To this end write
\[
\mathbb{E}_\theta[\hat\theta_n-\theta] = \mathbb{E}_\theta[\hat\theta_n-\bar X_n+\bar X_n-\theta]
= \mathbb{E}_\theta[\hat\theta_n-\bar X_n]
= \mathbb{E}_\theta\big[-\bar X_n\,\mathbb{I}(|\bar X_n|<n^{-1/4})\big]
\le n^{-1/4}\to 0 \quad\text{as } n\to\infty.
\]
Next consider the case $\theta\ne 0$. We will check that
\[
\lim_{n\to\infty}\mathbb{E}_\theta\big[n\,(\hat\theta_n-\theta)^2\big] = 1.
\]
Firstly, we show that
\[
\mathbb{E}_\theta\big[n\,(\hat\theta_n-\bar X_n)^2\big]\to 0 \quad\text{as } n\to\infty.
\]
Indeed,
\[
\mathbb{E}_\theta\big[n\,(\hat\theta_n-\bar X_n)^2\big]
= n\,\mathbb{E}_\theta\big[\bar X_n^2\,\mathbb{I}(|\bar X_n|<n^{-1/4})\big]
= \int_{-n^{1/4}}^{\,n^{1/4}} u^2\,\frac{1}{\sqrt{2\pi}}\,e^{-(u-\theta n^{1/2})^2/2}\,du.
\]
Here we made the substitution $u = z + \theta n^{1/2}$, where $z = \sqrt n\,(\bar X_n-\theta)$ is a standard normal variable. Now, since $|u|<n^{1/4}$, the exponent can be bounded from above as follows
\[
-(u-\theta n^{1/2})^2/2 = -u^2/2 + u\,\theta\, n^{1/2} - \theta^2 n/2
\le -u^2/2 + |\theta|\,n^{3/4} - \theta^2 n/2,
\]
and, thus, for all sufficiently large $n$, the above integral admits the upper bound
\[
n^{1/2}\,e^{\,|\theta|\,n^{3/4}-\theta^2 n/2}\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}}\,e^{-u^2/2}\,du
\le e^{-\theta^2 n/4}\to 0 \quad\text{as } n\to\infty.
\]
Further, we use the Cauchy-Schwarz inequality to write
\[
\mathbb{E}_\theta\big[n\,(\hat\theta_n-\theta)^2\big]
= \mathbb{E}_\theta\big[n\,(\hat\theta_n-\bar X_n+\bar X_n-\theta)^2\big]
= \mathbb{E}_\theta\big[n\,(\hat\theta_n-\bar X_n)^2\big]
+ 2\,\mathbb{E}_\theta\big[n\,(\hat\theta_n-\bar X_n)(\bar X_n-\theta)\big]
+ \mathbb{E}_\theta\big[n\,(\bar X_n-\theta)^2\big]
\]
\[
\le \mathbb{E}_\theta\big[n\,(\hat\theta_n-\bar X_n)^2\big]
+ 2\Big\{\mathbb{E}_\theta\big[n\,(\hat\theta_n-\bar X_n)^2\big]\Big\}^{1/2}
\Big\{\mathbb{E}_\theta\big[n\,(\bar X_n-\theta)^2\big]\Big\}^{1/2}
+ \mathbb{E}_\theta\big[n\,(\bar X_n-\theta)^2\big]
\to 0 + 0 + 1 = 1 \quad\text{as } n\to\infty.
\]
The same argument with the minus sign gives the matching lower bound, hence the limit equals 1.

Consider now the case $\theta = 0$. We will verify that
\[
\lim_{n\to\infty}\mathbb{E}_\theta\big[n\,\hat\theta_n^{\,2}\big] = 0.
\]
Indeed, under $\theta=0$ the variable $Z = \sqrt n\,\bar X_n$ is standard normal, so that
\[
\mathbb{E}_0\big[n\,\hat\theta_n^{\,2}\big]
= \mathbb{E}\big[Z^2\,\mathbb{I}(|Z|\ge n^{1/4})\big]\to 0 \quad\text{as } n\to\infty.
\]
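A small simulation sketch (assuming $N(\theta,1)$ observations; arbitrary sample sizes and seed; not part of the original solution) illustrating the two regimes: for $\theta\neq 0$ the normalized risk $n\,\mathbb{E}_\theta[(\hat\theta_n-\theta)^2]$ approaches 1, while at $\theta=0$ it approaches 0 (superefficiency).

```python
import numpy as np

rng = np.random.default_rng(3)

def hodges_risk(theta, n, reps=300_000):
    """Monte Carlo estimate of n * E[(hat{theta}_n - theta)^2] for the Hodges estimator,
    using the exact distribution of the sample mean, X_bar ~ N(theta, 1/n)."""
    xbar = rng.normal(theta, 1.0 / np.sqrt(n), size=reps)
    hodges = np.where(np.abs(xbar) >= n ** (-0.25), xbar, 0.0)
    return n * np.mean((hodges - theta) ** 2)

for n in (100, 1_000, 10_000):
    print(f"n = {n:6d}: theta = 1 -> {hodges_risk(1.0, n):.3f},  theta = 0 -> {hodges_risk(0.0, n):.4f}")
```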

EXERCISE 3.16 The following lower bound holds:
\[
\sup_{\theta\in\Theta}\mathbb{E}_\theta\big[I_n(\theta)(\hat\theta_n-\theta)^2\big]
\ge n I_*\max_{\theta\in\{\theta_0,\theta_1\}}\mathbb{E}_\theta\big[(\hat\theta_n-\theta)^2\big]
\quad\text{(by (3.8))}
\]
\[
\ge \frac{n I_*}{2}\,\mathbb{E}_{\theta_0}\Big[\Big((\hat\theta_n-\theta_0)^2
+ (\hat\theta_n-\theta_1)^2\exp\{\Delta L_n(\theta_0,\theta_1)\}\Big)\,
\mathbb{I}\big(\Delta L_n(\theta_0,\theta_1)\ge z_0\big)\Big]
\]
\[
\ge \frac{n I_*}{2}\,\mathbb{E}_{\theta_0}\Big[\Big((\hat\theta_n-\theta_0)^2\exp\{z_0\}
+ (\hat\theta_n-\theta_1)^2\exp\{z_0\}\Big)\,
\mathbb{I}\big(\Delta L_n(\theta_0,\theta_1)\ge z_0\big)\Big]
= \frac{n I_*}{2}\exp\{z_0\}\,\mathbb{E}_{\theta_0}\Big[\Big((\hat\theta_n-\theta_0)^2
+ (\hat\theta_n-\theta_1)^2\Big)\,\mathbb{I}\big(\Delta L_n(\theta_0,\theta_1)\ge z_0\big)\Big],
\]
since $\exp\{-z_0\}>1$ because $z_0$ is assumed negative. Further,
\[
\ge \frac{n I_*}{2}\exp\{z_0\}\,\frac{(\theta_1-\theta_0)^2}{2}\,
\mathbb{P}_{\theta_0}\big(\Delta L_n(\theta_0,\theta_1)\ge z_0\big)
= \frac{n I_*\,p_0\exp\{z_0\}}{4}\Big(\frac{1}{\sqrt{n I_*}}\Big)^2
= \frac{p_0}{4}\exp\{z_0\},
\]
where $p_0 = \mathbb{P}_{\theta_0}\big(\Delta L_n(\theta_0,\theta_1)\ge z_0\big)$ and $\theta_1 = \theta_0 + 1/\sqrt{n I_*}$.


EXERCISE 3.17 First we show that the inequality stated in the hint is valid. For any $x$ it is necessarily true that either $|x|\ge 1/2$ or $|x-1|\ge 1/2$, because, if the contrary holds, then $-1/2<x<1/2$ and $-1/2<1-x<1/2$ imply that $1 = x+(1-x) < 1/2+1/2 = 1$, which is false.
Further, since $w(x)=w(-x)$ we may assume that $x>0$. And suppose that $x\ge 1/2$ (as opposed to the case $x-1\ge 1/2$). In view of the facts that the loss function $w$ is everywhere nonnegative and is increasing on the positive half-axis, we have
\[
w(x)+w(x-1) \ge w(x) \ge w(1/2).
\]
Next, using the argument identical to that in Exercise 3.16, we obtain
\[
\sup_{\theta}\mathbb{E}_\theta\big[w\big(\sqrt n\,(\hat\theta_n-\theta)\big)\big]
\ge \frac{\exp\{z_0\}}{4}\,\mathbb{E}_{\theta_0}\Big[\Big(w\big(\sqrt n\,(\hat\theta_n-\theta_0)\big)
+ w\big(\sqrt n\,(\hat\theta_n-\theta_1)\big)\Big)\,\mathbb{I}\big(\Delta L_n(\theta_0,\theta_1)\ge z_0\big)\Big].
\]
Now recall that $\theta_1 = \theta_0 + 1/\sqrt n$ and use the inequality proved earlier to continue: with $x = \sqrt n\,(\hat\theta_n-\theta_0)$ we have $x-1 = \sqrt n\,(\hat\theta_n-\theta_1)$, so the right-hand side is bounded from below by $\tfrac14\,w(1/2)\exp\{z_0\}\,\mathbb{P}_{\theta_0}\big(\Delta L_n(\theta_0,\theta_1)\ge z_0\big)$, a positive constant.

EXERCISE 3.18 It suffices to prove the assertion (3.14) for an indicator function, that is, for the bounded loss function $w(u) = \mathbb{I}(|u|>\gamma)$, where $\gamma$ is a fixed constant. We write
\[
\int_{-(b-a)}^{\,b-a} w(c-u)\,e^{-u^2/2}\,du
= \int_{-(b-a)}^{\,b-a} \mathbb{I}(|c-u|>\gamma)\,e^{-u^2/2}\,du
= \int_{-(b-a)}^{\,c-\gamma} e^{-u^2/2}\,du + \int_{c+\gamma}^{\,b-a} e^{-u^2/2}\,du.
\]
To minimize this expression over values of $c$, take the derivative with respect to $c$ and set it equal to zero to obtain
\[
e^{-(c-\gamma)^2/2} - e^{-(c+\gamma)^2/2} = 0, \quad\text{or, equivalently,}\quad (c-\gamma)^2 = (c+\gamma)^2.
\]
The solution is $c=0$.
Finally, the result holds for any loss function $w$ since it can be written as a limit of linear combinations of indicator functions,
\[
\int_{-(b-a)}^{\,b-a} w(c-u)\,e^{-u^2/2}\,du
= \lim_{k\to\infty}\sum_{i=1}^{k}\Delta w_i\int_{-(b-a)}^{\,b-a}\mathbb{I}(|c-u|>\gamma_i)\,e^{-u^2/2}\,du,
\]
where $\gamma_i = \dfrac{(b-a)\,i}{k}$ and $\Delta w_i = w(\gamma_i)-w(\gamma_{i-1})$.
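A quick numerical sketch (arbitrary choices of $b-a$, $\gamma$, and grid; not part of the original solution) evaluating $\int_{-(b-a)}^{b-a}\mathbb{I}(|c-u|>\gamma)\,e^{-u^2/2}\,du$ over a grid of $c$ values and confirming that the minimum is attained at $c=0$.

```python
import numpy as np

b_minus_a, gamma = 3.0, 0.7                     # arbitrary illustration values
u = np.linspace(-b_minus_a, b_minus_a, 200_001)
du = u[1] - u[0]
weights = np.exp(-u**2 / 2)

def objective(c):
    """Riemann-sum approximation of the integral of I(|c - u| > gamma) * exp(-u^2/2)."""
    return np.sum((np.abs(c - u) > gamma) * weights) * du

cs = np.linspace(-2.0, 2.0, 401)
values = np.array([objective(c) for c in cs])
print("minimizing c:", cs[np.argmin(values)])    # 0.0 up to grid resolution
```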


EXERCISE 3.19 We will show that for both distributions the representation (3.15) takes place.
(i) For the exponential model, as shown in Exercise 2.11, the Fisher information $I_n(\theta)=n/\theta^2$, hence,
\[
L_n\big(\theta_0 + t/\sqrt{I_n(\theta_0)}\big) - L_n(\theta_0)
= L_n\Big(\theta_0 + \frac{t\,\theta_0}{\sqrt n}\Big) - L_n(\theta_0)
= n\ln\Big(\theta_0 + \frac{\theta_0 t}{\sqrt n}\Big) - \Big(\theta_0+\frac{\theta_0 t}{\sqrt n}\Big) n\bar X_n
- n\ln(\theta_0) + \theta_0\, n\bar X_n
\]
\[
= n\ln\Big(1+\frac{t}{\sqrt n}\Big) - t\,\theta_0\sqrt n\,\bar X_n.
\]
Using the Taylor expansion, we get that for large $n$,
\[
n\ln\Big(1+\frac{t}{\sqrt n}\Big) = n\Big(\frac{t}{\sqrt n}-\frac{t^2}{2n}+o_n\big(\tfrac1n\big)\Big)
= t\sqrt n - \frac{t^2}{2} + o_n(1).
\]
Also, by the Central Limit Theorem, for all sufficiently large $n$, $\bar X_n$ is approximately $N\big(1/\theta_0,\,1/(n\theta_0^2)\big)$, that is, $(\bar X_n - 1/\theta_0)\,\theta_0\sqrt n = (\theta_0\bar X_n - 1)\sqrt n$ is approximately $N(0,1)$. Consequently, $Z = -(\theta_0\bar X_n-1)\sqrt n$ is approximately standard normal as well. Thus,
\[
n\ln\Big(1+\frac{t}{\sqrt n}\Big) - t\,\theta_0\sqrt n\,\bar X_n
= t\sqrt n - \frac{t^2}{2} + o_n(1) - t\,\theta_0\sqrt n\,\bar X_n
= -\,t\,(\theta_0\bar X_n-1)\sqrt n - \frac{t^2}{2} + o_n(1)
= tZ - \frac{t^2}{2} + o_n(1).
\]
(ii) For the Poisson model, by Exercise 1.4, $I_n(\theta) = n/\theta$, thus,
\[
L_n\big(\theta_0+t/\sqrt{I_n(\theta_0)}\big) - L_n(\theta_0)
= L_n\Big(\theta_0 + t\sqrt{\tfrac{\theta_0}{n}}\Big) - L_n(\theta_0)
= n\bar X_n\ln\Big(\theta_0+t\sqrt{\tfrac{\theta_0}{n}}\Big) - n\Big(\theta_0+t\sqrt{\tfrac{\theta_0}{n}}\Big)
- n\bar X_n\ln(\theta_0) + n\theta_0
\]
\[
= n\bar X_n\ln\Big(1+\frac{t}{\sqrt{\theta_0 n}}\Big) - t\sqrt{\theta_0 n}
= n\bar X_n\Big(\frac{t}{\sqrt{\theta_0 n}} - \frac{t^2}{2\theta_0 n} + o_n\big(\tfrac1n\big)\Big) - t\sqrt{\theta_0 n}
= t\sqrt{\frac{n}{\theta_0}}\,(\bar X_n - \theta_0) - \frac{t^2}{2}\cdot\frac{\bar X_n}{\theta_0} + o_n(1)
= tZ - \frac{t^2}{2} + o_n(1).
\]
Here we used the fact that by the CLT, for all large enough $n$, $\bar X_n$ is approximately $N(\theta_0,\theta_0/n)$, and hence,
\[
Z = \frac{\bar X_n - \theta_0}{\sqrt{\theta_0/n}} = (\bar X_n - \theta_0)\sqrt{\frac{n}{\theta_0}}
\]
is approximately an $N(0,1)$ random variable. Also,
\[
\frac{\bar X_n}{\theta_0} = \frac{\big(\sqrt{\theta_0 n} + Z\big)\sqrt{\theta_0/n}}{\theta_0}
= 1 + \frac{Z}{\sqrt{\theta_0 n}} = 1 + o_n(1).
\]
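A simulation sketch (Poisson model; arbitrary $\theta_0$, $t$, $n$, and seed; not part of the original solution) comparing the exact log-likelihood difference $L_n(\theta_0+t/\sqrt{I_n(\theta_0)})-L_n(\theta_0)$ with its local approximation $tZ - t^2/2$, where $Z = (\bar X_n-\theta_0)\sqrt{n/\theta_0}$.

```python
import numpy as np

rng = np.random.default_rng(4)
theta0, t, n = 2.0, 1.3, 5_000                          # arbitrary illustration values

for _ in range(5):
    x = rng.poisson(theta0, size=n)
    theta1 = theta0 + t / np.sqrt(n / theta0)           # theta_0 + t / sqrt(I_n(theta_0))
    # L_n(theta1) - L_n(theta0); the log(x_i!) terms cancel in the difference
    exact = x.sum() * (np.log(theta1) - np.log(theta0)) - n * (theta1 - theta0)
    z = (x.mean() - theta0) * np.sqrt(n / theta0)
    print(f"exact = {exact:8.4f},   t*Z - t^2/2 = {t * z - t**2 / 2:8.4f}")
```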

EXERCISE 3.20 Consider a truncated loss function $w_C(u) = \min(w(u),C)$ for some $C>0$. As in the proof of Theorem 3.8, we write
\[
\sup_{\theta\in\Theta}\mathbb{E}_\theta\Big[w_C\big(\sqrt{nI(\theta)}\,(\hat\theta_n-\theta)\big)\Big]
\ge \frac{\sqrt{nI(\theta)}}{2b}\int_{-b/\sqrt{nI(\theta)}}^{\,b/\sqrt{nI(\theta)}}
\mathbb{E}_\theta\Big[w_C\big(\sqrt{nI(\theta)}\,(\hat\theta_n-\theta)\big)\Big]\,d\theta
= \frac{1}{2b}\int_{-b}^{\,b}
\mathbb{E}_{t/\sqrt{nI(\theta)}}\Big[w_C\big(\sqrt{nI(\theta)}\,\hat\theta_n - t\big)\Big]\,dt,
\]
where we used a change of variables $t = \theta\sqrt{nI(\theta)}$. Let $a_n = nI\big(t/\sqrt{nI(\theta)}\big)$. We continue
\[
= \frac{1}{2b}\int_{-b}^{\,b}
\mathbb{E}_{\theta}\Big[w_C\big(\sqrt{a_n}\,\hat\theta_n - t\big)
\exp\big\{\Delta L_n\big(\theta,\,t/\sqrt{nI(\theta)}\big)\big\}\Big]\,dt.
\]
Applying the LAN condition (3.16), we get
\[
= \frac{1}{2b}\int_{-b}^{\,b}
\mathbb{E}_{\theta}\Big[w_C\big(\sqrt{a_n}\,\hat\theta_n - t\big)
\exp\big\{z_n(\theta)\,t - t^2/2 + \varepsilon_n(\theta,t)\big\}\Big]\,dt.
\]
An elementary inequality $|x|\ge|y|-|x-y|$ for any $x,y\in\mathbb{R}$ implies that this is bounded from below by
\[
\frac{1}{2b}\int_{-b}^{\,b}\mathbb{E}_\theta\Big[w_C\big(\sqrt{a_n}\,\hat\theta_n-t\big)
\exp\big\{z_n(\theta)\,t - t^2/2\big\}\Big]\,dt
- \frac{1}{2b}\int_{-b}^{\,b}\mathbb{E}_\theta\Big[w_C\big(\sqrt{a_n}\,\hat\theta_n-t\big)
\Big|\exp\big\{z_n(\theta)\,t - t^2/2 + \varepsilon_n(\theta,t)\big\}
- \exp\big\{z_n(\theta)\,t - t^2/2\big\}\Big|\Big]\,dt.
\]
Now, by Theorem 3.11, and the fact that $w_C\le C$, the second term vanishes as $n$ grows, and thus is $o_n(1)$ as $n\to\infty$. Hence, we obtain the following lower bound
\[
\sup_{\theta\in\Theta}\mathbb{E}_\theta\Big[w_C\big(\sqrt{nI(\theta)}\,(\hat\theta_n-\theta)\big)\Big]
\ge \frac{1}{2b}\int_{-b}^{\,b}\mathbb{E}_\theta\Big[w_C\big(\sqrt{a_n}\,\hat\theta_n-t\big)
\exp\big\{z_n(\theta)\,t - t^2/2\big\}\Big]\,dt + o_n(1).
\]
Put $\eta_n = \sqrt{a_n}\,\hat\theta_n - z_n(\theta)$. We can rewrite the bound as
\[
\ge \frac{1}{2b}\int_{-b}^{\,b}\mathbb{E}_\theta\Big[\exp\big\{\tfrac12 z_n^2(\theta)\big\}\,
w_C\big(\eta_n - (t-z_n(\theta))\big)\,
\exp\big\{-\tfrac12\,(t - z_n(\theta))^2\big\}\Big]\,dt + o_n(1),
\]
which, after the substitution $u = t - z_n(\theta)$, becomes
\[
\frac{1}{2b}\,\mathbb{E}_\theta\Big[\exp\big\{\tfrac12 z_n^2(\theta)\big\}
\int_{-b-z_n(\theta)}^{\,b-z_n(\theta)} w_C(\eta_n-u)\,e^{-u^2/2}\,du\Big] + o_n(1).
\]
As in the proof of Theorem 3.8, for $n\to\infty$, and by an argument similar to the proof of Theorem 3.9, putting $a = b - \sqrt b$ and letting $b$, $C$, and $n$ go to infinity, we arrive at the conclusion that
\[
\liminf_{n\to\infty}\,\sup_{\theta\in\Theta}\mathbb{E}_\theta\Big[w\big(\sqrt{nI(\theta)}\,(\hat\theta_n-\theta)\big)\Big]
\ge \int_{-\infty}^{\infty} w(u)\,\frac{1}{\sqrt{2\pi}}\,e^{-u^2/2}\,du.
\]

EXERCISE 3.21 Note that the distorted parabola $zt - t^2/2 + \varepsilon(t)$ can be written in the form
\[
-\tfrac12\,(t-z)^2 + \frac{z^2}{2} + \varepsilon(t).
\]
The parabola $-\tfrac12(t-z)^2 + z^2/2$ is maximized at $t=z$. The value of the distorted parabola at $t=z$ is bounded from below by $z^2/2-\delta$. On the other hand, for all $t$ such that $|t-z| > 2\sqrt\delta$, this function is strictly less than $z^2/2-\delta$. Indeed,
\[
-\tfrac12\,(t-z)^2 + \frac{z^2}{2} + \varepsilon(t)
< -\tfrac12\,(2\sqrt\delta)^2 + \frac{z^2}{2} + \varepsilon(t)
\le -2\delta + \frac{z^2}{2} + \delta = \frac{z^2}{2} - \delta.
\]
Thus, the value $t = t^*$ at which the function is maximized must satisfy
\[
|t^*-z| \le 2\sqrt\delta.
\]


Chapter 4
EXERCISE 4.22 (i) The likelihood function has the form
\[
\prod_{i=1}^{n} p(X_i,\theta) = \theta^{-n}\prod_{i=1}^{n}\mathbb{I}(0\le X_i\le\theta)
= \theta^{-n}\,\mathbb{I}(0\le X_1\le\theta,\,0\le X_2\le\theta,\dots,0\le X_n\le\theta)
= \theta^{-n}\,\mathbb{I}(X_{(n)}\le\theta).
\]
Here $X_{(n)} = \max(X_1,\dots,X_n)$. As depicted in the figure below, the function $\theta^{-n}$ decreases everywhere, attaining its maximum at the left-most point. Therefore, the MLE of $\theta$ is $\hat\theta_n = X_{(n)}$.

[Figure: the likelihood $\theta^{-n}\,\mathbb{I}(\theta\ge X_{(n)})$ as a decreasing function of $\theta$, maximized at $\theta = X_{(n)}$.]

(ii) The c.d.f. of $X_{(n)}$ can be found as follows:
\[
F_{X_{(n)}}(x) = \mathbb{P}_\theta(X_{(n)}\le x)
= \mathbb{P}_\theta(X_1\le x,\,X_2\le x,\dots,X_n\le x)
= \mathbb{P}_\theta(X_1\le x)\,\mathbb{P}_\theta(X_2\le x)\cdots\mathbb{P}_\theta(X_n\le x)
\;\text{(by independence)}
= \big[\mathbb{P}_\theta(X_1\le x)\big]^n = \Big(\frac{x}{\theta}\Big)^n, \quad 0\le x\le\theta.
\]
Hence the density of $X_{(n)}$ is
\[
f_{X_{(n)}}(x) = F_{X_{(n)}}'(x) = \Big(\frac{x^n}{\theta^n}\Big)' = \frac{n\,x^{n-1}}{\theta^n},
\quad 0\le x\le\theta.
\]
The expected value of $X_{(n)}$ is computed as
\[
\mathbb{E}_\theta\big[X_{(n)}\big] = \int_0^\theta x\,\frac{n\,x^{n-1}}{\theta^n}\,dx
= \frac{n}{\theta^n}\int_0^\theta x^n\,dx = \frac{n\,\theta^{n+1}}{(n+1)\,\theta^n} = \frac{n\,\theta}{n+1},
\]
and therefore,
\[
\mathbb{E}_\theta\big[\theta_n^*\big] = \mathbb{E}_\theta\Big[\frac{n+1}{n}\,X_{(n)}\Big]
= \frac{n+1}{n}\cdot\frac{n\,\theta}{n+1} = \theta.
\]
(iii) The variance of $X_{(n)}$ is
\[
\mathrm{Var}_\theta\big[X_{(n)}\big] = \int_0^\theta x^2\,\frac{n\,x^{n-1}}{\theta^n}\,dx
- \Big(\frac{n\,\theta}{n+1}\Big)^2
= \frac{n\,\theta^{n+2}}{(n+2)\,\theta^n} - \frac{n^2\theta^2}{(n+1)^2}
= \frac{n\,\theta^2}{n+2} - \frac{n^2\theta^2}{(n+1)^2}
= \frac{n\,\theta^2}{(n+1)^2(n+2)}.
\]
Consequently, the variance of $\theta_n^*$ is
\[
\mathrm{Var}_\theta\big[\theta_n^*\big] = \mathrm{Var}_\theta\Big[\frac{n+1}{n}\,X_{(n)}\Big]
= \frac{(n+1)^2}{n^2}\cdot\frac{n\,\theta^2}{(n+1)^2(n+2)} = \frac{\theta^2}{n\,(n+2)}.
\]
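A short simulation sketch (arbitrary $\theta$, $n$, and seed; not part of the original solution) checking that $\theta_n^* = \frac{n+1}{n}X_{(n)}$ is unbiased for Uniform$(0,\theta)$ data and that its variance is close to $\theta^2/(n(n+2))$.

```python
import numpy as np

rng = np.random.default_rng(5)
theta, n, reps = 4.0, 20, 500_000                 # arbitrary illustration values

x = rng.uniform(0.0, theta, size=(reps, n))
theta_star = (n + 1) / n * x.max(axis=1)           # the unbiased estimator (n+1) X_(n) / n

print("mean of theta*    :", theta_star.mean())    # ~ theta
print("variance of theta*:", theta_star.var())     # ~ theta^2 / (n (n + 2))
print("formula           :", theta**2 / (n * (n + 2)))
```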

EXERCISE 4.23 (i) The likelihood function can be written as
\[
\prod_{i=1}^{n} p(X_i,\theta) = \prod_{i=1}^{n} e^{-(X_i-\theta)}\,\mathbb{I}(X_i\ge\theta)
= \exp\Big\{-\Big(\sum_{i=1}^{n}X_i - n\theta\Big)\Big\}\prod_{i=1}^{n}\mathbb{I}(X_i\ge\theta)
\]
\[
= \exp\Big\{-\sum_{i=1}^{n}X_i + n\theta\Big\}\,\mathbb{I}(X_1\ge\theta,\,X_2\ge\theta,\dots,X_n\ge\theta)
= \exp\{n\theta\}\,\mathbb{I}(X_{(1)}\ge\theta)\,\exp\Big\{-\sum_{i=1}^{n}X_i\Big\}
\]
with $X_{(1)} = \min(X_1,\dots,X_n)$. The second exponent is constant with respect to $\theta$ and may be disregarded for maximization purposes. The function $\exp\{n\theta\}$ is increasing and therefore reaches its maximum at the right-most point $\hat\theta_n = X_{(1)}$.

(ii) The c.d.f. of the minimum can be found by the following argument:
\[
1 - F_{X_{(1)}}(x) = \mathbb{P}_\theta(X_{(1)}\ge x)
= \mathbb{P}_\theta(X_1\ge x,\,X_2\ge x,\dots,X_n\ge x)
= \mathbb{P}_\theta(X_1\ge x)\,\mathbb{P}_\theta(X_2\ge x)\cdots\mathbb{P}_\theta(X_n\ge x)
\;\text{(by independence)}
\]
\[
= \big[\mathbb{P}_\theta(X_1\ge x)\big]^n
= \Big[\int_x^{\infty} e^{-(y-\theta)}\,dy\Big]^n
= \big[e^{-(x-\theta)}\big]^n = e^{-n(x-\theta)},
\]
whence $F_{X_{(1)}}(x) = 1 - e^{-n(x-\theta)}$, $x\ge\theta$. Therefore, the density of $X_{(1)}$ is derived as
\[
f_{X_{(1)}}(x) = F_{X_{(1)}}'(x) = \big[1 - e^{-n(x-\theta)}\big]' = n\,e^{-n(x-\theta)},\qquad x\ge\theta.
\]
The expected value of $X_{(1)}$ is equal to
\[
\mathbb{E}_\theta\big[X_{(1)}\big] = \int_\theta^{\infty} x\,n\,e^{-n(x-\theta)}\,dx
= \int_0^{\infty}\Big(\frac{y}{n}+\theta\Big)e^{-y}\,dy
\quad\text{(after the substitution } y = n(x-\theta))
= \frac1n\int_0^{\infty} y\,e^{-y}\,dy + \theta\int_0^{\infty} e^{-y}\,dy = \frac1n + \theta.
\]
As a result, the estimator $\theta_n^* = X_{(1)} - 1/n$ is an unbiased estimator of $\theta$.

(iii) The variance of $X_{(1)}$ is computed as
\[
\mathrm{Var}_\theta\big[X_{(1)}\big] = \int_\theta^{\infty} x^2\,n\,e^{-n(x-\theta)}\,dx - \Big(\frac1n+\theta\Big)^2
= \int_0^{\infty}\Big(\frac{y}{n}+\theta\Big)^2 e^{-y}\,dy - \Big(\frac1n+\theta\Big)^2
\]
\[
= \frac{1}{n^2}\int_0^{\infty} y^2 e^{-y}\,dy + \frac{2\theta}{n}\int_0^{\infty} y\,e^{-y}\,dy
+ \theta^2\int_0^{\infty} e^{-y}\,dy - \Big(\frac1n+\theta\Big)^2
= \frac{2}{n^2} + \frac{2\theta}{n} + \theta^2 - \frac{1}{n^2} - \frac{2\theta}{n} - \theta^2
= \frac{1}{n^2}.
\]
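A companion simulation sketch (arbitrary $\theta$, $n$, and seed; not part of the original solution): for the shifted exponential model, $\theta_n^* = X_{(1)} - 1/n$ is unbiased with variance $1/n^2$.

```python
import numpy as np

rng = np.random.default_rng(6)
theta, n, reps = 1.7, 25, 500_000                  # arbitrary illustration values

x = theta + rng.exponential(1.0, size=(reps, n))   # density e^{-(x - theta)}, x >= theta
theta_star = x.min(axis=1) - 1.0 / n               # the unbiased estimator X_(1) - 1/n

print("mean of theta*    :", theta_star.mean())    # ~ theta
print("variance of theta*:", theta_star.var())     # ~ 1 / n^2
print("formula           :", 1.0 / n**2)
```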
EXERCISE 4.24 We will show that the squared $L_2$-norm $\|\sqrt{p(\cdot\,,\theta+\Delta\theta)}-\sqrt{p(\cdot\,,\theta)}\|_2^2$ is equal to $\Delta\theta + o(\Delta\theta)$ as $\Delta\theta\to 0$. Then by Theorem 4.3 and Example 4.4 it will follow that the Fisher information does not exist. By definition, we obtain (for $\Delta\theta>0$)
\[
\big\|\sqrt{p(\cdot\,,\theta+\Delta\theta)}-\sqrt{p(\cdot\,,\theta)}\big\|_2^2
= \int_{\theta}^{\theta+\Delta\theta} e^{-(x-\theta)}\,dx
+ \int_{\theta+\Delta\theta}^{\infty}\big(e^{-(x-\theta-\Delta\theta)/2}-e^{-(x-\theta)/2}\big)^2\,dx
\]
\[
= 1 - e^{-\Delta\theta} + \big(e^{\Delta\theta/2}-1\big)^2\int_{\theta+\Delta\theta}^{\infty} e^{-(x-\theta)}\,dx
= 1 - e^{-\Delta\theta} + \big(e^{\Delta\theta/2}-1\big)^2 e^{-\Delta\theta}
= 2 - 2\,e^{-\Delta\theta/2}
= \Delta\theta + o(\Delta\theta) \quad\text{as } \Delta\theta\to 0.
\]

EXERCISE 4.25 First of all, we find the values of $c_-$ and $c_+$ as functions of $\theta$. By our assumption, $c_+ - c_- = \theta$. Also, since the density integrates to one, $c_+ + c_- = 1$. Hence, $c_- = (1-\theta)/2$ and $c_+ = (1+\theta)/2$.
Next, we use the formula proved in Theorem 4.3 to compute the Fisher information. We have
\[
I(\theta) = 4\,\Big\|\frac{\partial\sqrt{p(\cdot\,,\theta)}}{\partial\theta}\Big\|_2^2
= 4\Big[\int_{-1}^{0}\Big(\frac{\partial\sqrt{(1-\theta)/2}}{\partial\theta}\Big)^2 dx
+ \int_{0}^{1}\Big(\frac{\partial\sqrt{(1+\theta)/2}}{\partial\theta}\Big)^2 dx\Big]
= 4\Big[\frac{1}{8(1-\theta)} + \frac{1}{8(1+\theta)}\Big]
= \frac{1}{1-\theta^2}.
\]



EXERCISE 4.26 In the case of the shifted exponential distribution we have
\[
Z_n(\theta,\theta+u/n)
= \frac{\prod_{i=1}^{n}\exp\{-X_i+(\theta+u/n)\}\,\mathbb{I}(X_i\ge\theta+u/n)}
{\prod_{i=1}^{n}\exp\{-X_i+\theta\}\,\mathbb{I}(X_i\ge\theta)}
= \frac{\exp\{-\sum_{i=1}^{n}X_i + n(\theta+u/n)\}\,\mathbb{I}(X_{(1)}\ge\theta+u/n)}
{\exp\{-\sum_{i=1}^{n}X_i + n\theta\}\,\mathbb{I}(X_{(1)}\ge\theta)}
\]
\[
= e^{u}\,\frac{\mathbb{I}(X_{(1)}\ge\theta+u/n)}{\mathbb{I}(X_{(1)}\ge\theta)}
= e^{u}\,\mathbb{I}(u\le T_n), \qquad\text{where } T_n = n\,(X_{(1)}-\theta).
\]
Here $\mathbb{P}_\theta(X_{(1)}\ge\theta) = 1$, and
\[
\mathbb{P}_\theta(T_n > t) = \mathbb{P}_\theta\big(n(X_{(1)}-\theta)>t\big)
= \mathbb{P}_\theta\big(X_{(1)}>\theta+t/n\big)
= \exp\{-n(\theta+t/n-\theta)\} = \exp\{-t\}.
\]
Therefore, the likelihood ratio has a representation that satisfies property (ii) in the definition of an asymptotically exponential statistical experiment with $\lambda(\theta)=1$. Note that in this case, $T_n$ has an exact exponential distribution for any $n$, and $o_n(1) = 0$.
EXERCISE 4.27 (i) From Exercise 4.22, the estimator $\theta_n^*$ is unbiased and its variance is equal to $\theta^2/[n(n+2)]$. Therefore,
\[
\lim_{n\to\infty}\mathbb{E}_{\theta_0}\Big[\big(n\,(\theta_n^*-\theta_0)\big)^2\Big]
= \lim_{n\to\infty} n^2\,\mathrm{Var}_{\theta_0}\big[\theta_n^*\big]
= \lim_{n\to\infty}\frac{n^2\,\theta_0^2}{n\,(n+2)} = \theta_0^2.
\]
(ii) From Exercise 4.23, $\theta_n^*$ is unbiased and its variance is equal to $1/n^2$. Hence,
\[
\lim_{n\to\infty}\mathbb{E}_{\theta_0}\Big[\big(n\,(\theta_n^*-\theta_0)\big)^2\Big]
= \lim_{n\to\infty} n^2\,\mathrm{Var}_{\theta_0}\big[\theta_n^*\big]
= \lim_{n\to\infty} n^2\cdot\frac{1}{n^2} = 1.
\]

EXERCISE 4.28 Consider the case $y\le 0$. Then
\[
\lambda_0\min_{y\le 0}\int_0^{\infty}|u-y|\,e^{-\lambda_0 u}\,du
= \lambda_0\min_{y\le 0}\int_0^{\infty}(u-y)\,e^{-\lambda_0 u}\,du
= \min_{y\le 0}\Big(\frac{1}{\lambda_0}-y\Big) = \frac{1}{\lambda_0},
\quad\text{attained at } y=0.
\]
In the case $y>0$,
\[
\lambda_0\min_{y>0}\int_0^{\infty}|u-y|\,e^{-\lambda_0 u}\,du
= \min_{y>0}\Big(\frac{2\,e^{-\lambda_0 y}-1}{\lambda_0}+y\Big) = \frac{\ln 2}{\lambda_0},
\quad\text{attained at } y=\ln 2/\lambda_0.
\]
Thus,
\[
\lambda_0\min_{y\in\mathbb{R}}\int_0^{\infty}|u-y|\,e^{-\lambda_0 u}\,du
= \min\Big(\frac{1}{\lambda_0},\,\frac{\ln 2}{\lambda_0}\Big) = \frac{\ln 2}{\lambda_0}.
\]
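A numerical sketch (arbitrary $\lambda_0$ and grid; not part of the original solution) minimizing $y\mapsto\lambda_0\int_0^\infty|u-y|\,e^{-\lambda_0 u}\,du = \mathbb{E}|U-y|$ with $U\sim\mathrm{Exp}(\lambda_0)$: the minimum is $\ln 2/\lambda_0$, attained at the median $y=\ln 2/\lambda_0$.

```python
import numpy as np

lam = 2.0                                          # arbitrary rate lambda_0
u = np.linspace(0.0, 40.0 / lam, 200_001)          # tail beyond 40/lambda_0 is negligible
du = u[1] - u[0]
density = lam * np.exp(-lam * u)

def risk(y):
    # lambda_0 * integral of |u - y| e^{-lambda_0 u} du  =  E|U - y|, U ~ Exp(lambda_0)
    return np.sum(np.abs(u - y) * density) * du

ys = np.linspace(0.0, 2.0 / lam, 801)
vals = np.array([risk(y) for y in ys])
print("numerical minimum:", vals.min(), "at y =", ys[np.argmin(vals)])
print("ln 2 / lambda_0  :", np.log(2.0) / lam)
```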

EXERCISE 4.29 (i) For a normalizing constant $C$, we write by definition
\[
f(\theta\,|\,X_1,\dots,X_n)
= C\exp\Big\{-\sum_{i=1}^{n}(X_i-\theta)\Big\}\,\mathbb{I}(X_1\ge\theta)\cdots\mathbb{I}(X_n\ge\theta)\,
\frac1b\,\mathbb{I}(0<\theta\le b)
= C_1\,e^{\,n\theta}\,\mathbb{I}(0<\theta\le Y),
\]
where
\[
C_1 = \Big(\int_0^{Y} e^{\,n\theta}\,d\theta\Big)^{-1} = \frac{n}{\exp\{nY\}-1},
\qquad Y = \min\big(X_{(1)},\,b\big).
\]
(ii) The posterior mean follows by direct integration,
\[
\theta_n^*(b) = \int_0^{Y}\theta\,\frac{n\,e^{\,n\theta}}{\exp\{nY\}-1}\,d\theta
= \frac{1}{\exp\{nY\}-1}\cdot\frac1n\int_0^{nY} t\,e^{t}\,dt
= \frac1n\cdot\frac{nY\exp\{nY\}-(\exp\{nY\}-1)}{\exp\{nY\}-1}
= Y - \frac1n + \frac{Y}{\exp\{nY\}-1}.
\]
(iii) Consider the last term in the expression for the estimator $\theta_n^*(b)$. Since by our assumption $\theta\ge\sqrt b$, we have that $\sqrt b\le Y\le b$. Therefore, for all large enough $b$, the deterministic upper bound holds with $\mathbb{P}_\theta$-probability 1:
\[
\frac{Y}{\exp\{nY\}-1} \le \frac{b}{\exp\{n\sqrt b\}-1} \to 0 \quad\text{as } b\to\infty.
\]
Hence the last term is negligible. To prove the proposition, it remains to show that
\[
\lim_{b\to\infty}\mathbb{E}_\theta\Big[n^2\Big(Y-\frac1n-\theta\Big)^2\Big] = 1.
\]
Using the definition of $Y$ and the explicit formula for the distribution of $X_{(1)}$, we get
\[
\mathbb{E}_\theta\Big[n^2\Big(Y-\frac1n-\theta\Big)^2\Big]
= \mathbb{E}_\theta\Big[n^2\Big(X_{(1)}-\frac1n-\theta\Big)^2\mathbb{I}(X_{(1)}\le b)\Big]
+ n^2\Big(b-\frac1n-\theta\Big)^2\mathbb{P}_\theta\big(X_{(1)}>b\big)
\]
\[
= n^2\int_\theta^{b}\Big(y-\frac1n-\theta\Big)^2 n\,e^{-n(y-\theta)}\,dy
+ n^2\Big(b-\frac1n-\theta\Big)^2 e^{-n(b-\theta)}
= \int_0^{n(b-\theta)}(t-1)^2 e^{-t}\,dt
+ \big(n(b-\theta)-1\big)^2 e^{-n(b-\theta)} \to 1 \quad\text{as } b\to\infty.
\]
Here the first term tends to 1, while the second one vanishes as $b\to\infty$, uniformly in $\theta\in[\sqrt b,\,b-\sqrt b\,]$.
(iv) We write, for any estimator $\hat\theta_n$,
\[
\sup_{\sqrt b\,\le\,\theta\,\le\,b}\mathbb{E}_\theta\Big[\big(n(\hat\theta_n-\theta)\big)^2\Big]
\ge \frac1b\int_0^{b}\mathbb{E}_\theta\Big[\big(n(\hat\theta_n-\theta)\big)^2\Big]\,d\theta
\ge \frac1b\int_0^{b}\mathbb{E}_\theta\Big[\big(n(\theta_n^*(b)-\theta)\big)^2\Big]\,d\theta
\]
\[
\ge \frac1b\int_{\sqrt b}^{\,b-\sqrt b}\mathbb{E}_\theta\Big[\big(n(\theta_n^*(b)-\theta)\big)^2\Big]\,d\theta
\ge \frac{b-2\sqrt b}{b}\,\inf_{\sqrt b\,\le\,\theta\,\le\,b-\sqrt b}
\mathbb{E}_\theta\Big[\big(n(\theta_n^*(b)-\theta)\big)^2\Big].
\]
The infimum is arbitrarily close to 1 if $b$ is sufficiently large. Thus, the limit as $b\to\infty$ of the right-hand side equals 1.


Chapter 5
EXERCISE 5.30 The Bayes estimator $\theta_n^*$ is the posterior mean,
\[
\theta_n^* = \frac{(1/n)\sum_{\theta=1}^{n}\theta\,\exp\{L_n(\theta)\}}
{(1/n)\sum_{\theta=1}^{n}\exp\{L_n(\theta)\}}
= \frac{\sum_{\theta=1}^{n}\theta\,\exp\{L_n(\theta)\}}{\sum_{\theta=1}^{n}\exp\{L_n(\theta)\}}.
\]
Applying Theorem 5.1 and some transformations, we get
\[
\theta_n^* = \frac{\sum_{\theta=1}^{n}\theta\,\exp\{L_n(\theta)-L_n(\theta_0)\}}
{\sum_{\theta=1}^{n}\exp\{L_n(\theta)-L_n(\theta_0)\}}
= \frac{\sum_{1\le j+\theta_0\le n}(j+\theta_0)\,\exp\{L_n(j+\theta_0)-L_n(\theta_0)\}}
{\sum_{1\le j+\theta_0\le n}\exp\{L_n(j+\theta_0)-L_n(\theta_0)\}}
= \theta_0 + \frac{\sum_{1\le j+\theta_0\le n} j\,\exp\{c\,W(j)-c^2|j|/2\}}
{\sum_{1\le j+\theta_0\le n}\exp\{c\,W(j)-c^2|j|/2\}}.
\]

EXERCISE 5.31 We use the definition of $W(j)$ to notice that $W(j)$ has a $N(0,|j|)$ distribution. Therefore,
\[
\mathbb{E}_{\theta_0}\big[\exp\{c\,W(j)-c^2|j|/2\}\big]
= \exp\{-c^2|j|/2\}\,\mathbb{E}_{\theta_0}\big[\exp\{c\,W(j)\}\big]
= \exp\{-c^2|j|/2 + c^2|j|/2\} = 1.
\]
The expected value of the numerator in (5.3) is infinite,
\[
\mathbb{E}_{\theta_0}\Big[\sum_{j\ne 0} |j|\,\exp\{c\,W(j)-c^2|j|/2\}\Big] = \sum_{j\ne 0} |j| = \infty.
\]
Likewise, the expectation of the denominator is equal to
\[
\mathbb{E}_{\theta_0}\Big[\sum_{j}\exp\{c\,W(j)-c^2|j|/2\}\Big] = \sum_{j} 1 = \infty.
\]

EXERCISE 5.32

\[
\int_{-\infty}^{\infty}\Big[\ln\frac{p_0(x\pm\mu)}{p_0(x)}\Big]\,p_0(x)\,dx
= \int_{-\infty}^{\infty}\Big[\ln\Big(1+\frac{p_0(x\pm\mu)-p_0(x)}{p_0(x)}\Big)\Big]\,p_0(x)\,dx
\]
\[
< \int_{-\infty}^{\infty}\Big[\frac{p_0(x\pm\mu)-p_0(x)}{p_0(x)}\Big]\,p_0(x)\,dx
= \int_{-\infty}^{\infty}\big[p_0(x\pm\mu)-p_0(x)\big]\,dx = 1-1 = 0.
\]
Here we have applied the inequality $\ln(1+y) < y$ for $y\ne 0$, and the fact that the probability densities $p_0(x\pm\mu)$ and $p_0(x)$ integrate to 1.
EXERCISE 5.33 Assume for simplicity that $\hat\theta_n > \theta_0$. By the definition of the MLE, $\Delta L_n(\theta_0,\hat\theta_n) = L_n(\hat\theta_n) - L_n(\theta_0) \ge 0$. Also, by Theorem 5.4,
\[
\Delta L_n(\theta_0,\hat\theta_n) = W(\hat\theta_n-\theta_0) - K_+(\hat\theta_n-\theta_0)
= \sum_{i=\theta_0+1}^{\hat\theta_n}\varepsilon_i - K_+(\hat\theta_n-\theta_0).
\]
Therefore, the following inequalities take place:
\[
\mathbb{P}_{\theta_0}\big(\hat\theta_n-\theta_0\ge m\big)
\le \sum_{l=m}^{\infty}\mathbb{P}_{\theta_0}\big(\Delta L_n(\theta_0,\theta_0+l)\ge 0\big)
= \sum_{l=m}^{\infty}\mathbb{P}_{\theta_0}\Big(\sum_{i=1}^{l}\varepsilon_i\ge K_+\,l\Big)
\le c_1\sum_{l=m}^{\infty} e^{-c_2\, l}.
\]
A similar argument treats the case $\hat\theta_n < \theta_0$. Thus, there exists a positive constant $c_3$ such that
\[
\mathbb{P}_{\theta_0}\big(|\hat\theta_n-\theta_0|\ge m\big) \le c_3\,e^{-c_2\, m}.
\]
Consequently,
\[
\sum_{m=0}^{\infty} m^2\,\mathbb{P}_{\theta_0}\big(|\hat\theta_n-\theta_0| = m\big)
\le c_3\sum_{m=0}^{\infty} m^2\,e^{-c_2\, m} < \infty.
\]

EXERCISE 5.34 We estimate the true change point value by the maximum likelihood method. The log-likelihood function has the form
\[
L(\theta) = \sum_{i=1}^{\theta}\big[X_i\ln(0.4) + (1-X_i)\ln(0.6)\big]
+ \sum_{i=\theta+1}^{n}\big[X_i\ln(0.7) + (1-X_i)\ln(0.3)\big].
\]
Plugging in the concrete observations, we obtain the values of the log-likelihood function for $\theta = 1,\dots,30$:

θ    L(θ)       θ    L(θ)       θ    L(θ)
1   -21.87     11   -19.95     21   -20.53
2   -21.18     12   -20.51     22   -21.09
3   -21.74     13   -21.07     23   -21.65
4   -21.04     14   -20.37     24   -20.96
5   -21.60     15   -20.93     25   -21.52
6   -20.91     16   -20.24     26   -20.83
7   -20.22     17   -19.55     27   -21.39
8   -20.78     18   -20.11     28   -21.95
9   -21.36     19   -20.67     29   -22.51
10  -20.65     20   -19.97     30   -21.81

The log-likelihood function reaches its maximum $-19.55$ when $\theta = 17$.
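A short sketch (not from the original solution) of how the table and the MLE can be computed; the 0/1 sequence below is a hypothetical placeholder, since the concrete observations are given in the exercise statement and are not reproduced here.

```python
import numpy as np

# Hypothetical 0/1 observations of length 30 (placeholder data).
x = np.array([1, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0,
              1, 1, 1, 0, 1, 0, 1, 1, 1, 0])
n = len(x)

def loglik(theta):
    """L(theta): success probability 0.4 for i <= theta and 0.7 for i > theta."""
    head, tail = x[:theta], x[theta:]
    return (np.sum(head * np.log(0.4) + (1 - head) * np.log(0.6))
            + np.sum(tail * np.log(0.7) + (1 - tail) * np.log(0.3)))

values = [loglik(theta) for theta in range(1, n + 1)]
theta_hat = 1 + int(np.argmax(values))
print("MLE of the change point:", theta_hat, "  L =", round(values[theta_hat - 1], 2))
```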

EXERCISE 5.35 Consider a set $\mathcal{X}\subseteq\mathbb{R}$ with the property that the probability of a random variable with the c.d.f. $F_1$ falling into that set is not equal to the probability of this event for a random variable with the c.d.f. $F_2$. Note that such a set necessarily exists, because otherwise $F_1$ and $F_2$ would be identically equal. Ideally we would like the set $\mathcal{X}$ to be as large as possible. That is, we want $\mathcal{X}$ to be the largest set such that
\[
\int_{\mathcal{X}} dF_1(x) \ne \int_{\mathcal{X}} dF_2(x).
\]
Replacing the original observations $X_i$ by the indicators $\mathbb{I}_i = \mathbb{I}(X_i\in\mathcal{X})$, $i = 1,\dots,n$, we get a model of Bernoulli observations with the probability of a success $p_1 = \int_{\mathcal{X}} dF_1(x)$ before the jump, and $p_2 = \int_{\mathcal{X}} dF_2(x)$ afterwards. The method of maximum likelihood may be applied to find the MLE of the change point (see Exercise 5.34).

