
Cramer's Rule


descriptive statistics



I. One random variable X:
   p.d.f. f(x)
   1. P(a ≤ X ≤ b)
   2. c.d.f. F(x)
   3. E(X) and Var(X)
   4. …

II. Two random variables (X, Y):
   1. Joint p.d.f. f(x, y)
      Marginal p.d.f. f_X(x); E(X); V(X) = E(X²) − (E X)²
      Conditional p.d.f. f(x | Y = y), f(y | X = x); E(X | Y = y), V(X | Y = y)
      E[E(X | Y)] = E(X);  V(X) = E[V(X | Y)] + V[E(X | Y)]
   2. Joint c.d.f. F(x, y) = P(X ≤ x, Y ≤ y), with ∂²F(x, y)/∂x∂y = f(x, y)
      Marginal c.d.f. F_X(x) = P(X ≤ x, Y < ∞)
      Conditional c.d.f. F(y | x)
   3. r.v. transformation: if X = g(U, V) and Y = h(U, V), then f_UV(u, v) = f_XY(g(u, v), h(u, v)) · |J|, where J is the Jacobian
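As a small worked illustration of the transformation rule in item 3 (the choice of X, Y independent standard normal and the map U = X + Y, V = X − Y is mine, not from the outline):

```latex
% Inverse map: X = (U+V)/2, Y = (U-V)/2, so
\[
J =
\begin{vmatrix}
\partial x/\partial u & \partial x/\partial v\\
\partial y/\partial u & \partial y/\partial v
\end{vmatrix}
=
\begin{vmatrix}
\tfrac12 & \tfrac12\\
\tfrac12 & -\tfrac12
\end{vmatrix}
= -\tfrac12,
\qquad
f_{UV}(u,v) = f_{XY}\!\Big(\tfrac{u+v}{2},\tfrac{u-v}{2}\Big)\,\lvert J\rvert
= \frac{1}{4\pi}\,e^{-\frac{u^{2}+v^{2}}{4}},
\]
% i.e., U and V are independent N(0, 2) variables.
```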

Correlation and covariance:
(1) r_XY = S_XY / √(S_XX · S_YY);  ρ_XY = Cov(X, Y) / (σ_X σ_Y)
(2) Cov(X, Y) = E(XY) − (E X)(E Y)
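A minimal numpy sketch of the two formulas above; the data values and variable names are illustrative only.

```python
import numpy as np

# Toy data (hypothetical values, just for illustration).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.7, 5.2, 6.1])

# Sums of squares and cross-products about the means.
S_xx = np.sum((x - x.mean()) ** 2)
S_yy = np.sum((y - y.mean()) ** 2)
S_xy = np.sum((x - x.mean()) * (y - y.mean()))

# Sample correlation r_XY = S_XY / sqrt(S_XX * S_YY).
r_xy = S_xy / np.sqrt(S_xx * S_yy)

# Cov(X, Y) = E(XY) - E(X)E(Y), here with the 1/n convention.
cov_xy = np.mean(x * y) - x.mean() * y.mean()

print(r_xy, cov_xy, np.corrcoef(x, y)[0, 1])  # r_xy matches corrcoef
```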



Distributions:
1. …
2. …
3. Gamma
4. Normal
5. t
6. …
7. F
8. Beta
9. …


II. Order statistics, inequalities, and limit theorems:
- Order statistics
- Inequalities (statements follow this list):
  1. Markov Inequality
  2. Chebyshev Inequality
- Law of Large Numbers
- Central Limit Theorem
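Standard statements of the two inequalities and the C.L.T., for reference:

```latex
% Markov: for X >= 0 and a > 0; Chebyshev: for any k > 0.
\[
P(X \ge a) \le \frac{E(X)}{a},
\qquad
P\big(|X-\mu| \ge k\sigma\big) \le \frac{1}{k^{2}} .
\]
% C.L.T.: for i.i.d. X_1, ..., X_n with mean mu and finite variance sigma^2,
\[
\frac{\bar{X}_n-\mu}{\sigma/\sqrt{n}} \;\xrightarrow{d}\; N(0,1)
\quad\text{as } n\to\infty .
\]
```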

Point estimation methods:
(1) MLE: maximize the likelihood L(θ) (a sketch follows this list)
(2) MME (method of moments)
(3) …
(4) OLSE (least squares)
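A minimal sketch of item (1), maximizing a likelihood numerically; the exponential model and the sample values are my own illustration.

```python
import numpy as np

# Hypothetical sample assumed to come from an Exponential(lambda) model.
x = np.array([0.4, 1.3, 0.7, 2.1, 0.9, 1.6])

def log_likelihood(lam, data):
    # log L(lambda) = n*log(lambda) - lambda * sum(x_i)
    return len(data) * np.log(lam) - lam * data.sum()

# Crude grid search for the maximizer of log L(lambda).
grid = np.linspace(0.01, 5.0, 5000)
lam_hat_grid = grid[np.argmax([log_likelihood(l, x) for l in grid])]

# Closed form for this model: lambda_hat = 1 / x_bar.
lam_hat_closed = 1.0 / x.mean()

print(lam_hat_grid, lam_hat_closed)  # the two should agree closely
```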

Criteria for estimators:
(1) Unbiasedness: E(θ̂) = θ
(2) MSE(θ̂, θ) = Var(θ̂) + [E(θ̂) − θ]²; compare estimators by their MSE (derivation follows this list)
(3) Consistency: lim_{n→∞} E(θ̂) = θ and lim_{n→∞} V(θ̂) = 0
(4) Sufficiency: Fisher–Neyman factorization theorem
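The MSE identity in (2) follows by adding and subtracting E(θ̂):

```latex
\[
\mathrm{MSE}(\hat\theta,\theta)
= E\big[(\hat\theta-\theta)^2\big]
= E\big[(\hat\theta-E\hat\theta)^2\big] + \big(E\hat\theta-\theta\big)^2
= \mathrm{Var}(\hat\theta) + \big[\mathrm{Bias}(\hat\theta)\big]^2,
\]
% the cross term 2 E[hat(theta) - E hat(theta)] (E hat(theta) - theta)
% vanishes because E[hat(theta) - E hat(theta)] = 0.
```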

UMVUE:
1. Information inequality (Cramér–Rao lower bound)
2. Rao–Blackwell theorem
BLUE

Confidence intervals:
1. One Normal population:
   (1) C.I. for μ (a sketch follows)
   (2) C.I. for σ²
2. Two Normal populations:
   (1) C.I. for μ1 − μ2
   (2) C.I. for σ1²/σ2²
3. Paired Normal samples:
   C.I. for μ_D
Large-sample C.I. (n ≥ 30, by the C.L.T.):
1. C.I. for μ1 − μ2; C.I. for p
2. Binomial: C.I. for p1 − p2
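A minimal sketch of case 1.(1), the t-based C.I. for μ when σ² is unknown; the sample values and α = 0.05 are illustrative.

```python
import numpy as np
from scipy import stats

# Hypothetical sample from a Normal population with unknown mu and sigma^2.
x = np.array([10.2, 9.8, 11.1, 10.5, 9.6, 10.9, 10.4])
n = len(x)
alpha = 0.05

xbar = x.mean()
s = x.std(ddof=1)                       # sample standard deviation
t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)

# (1 - alpha) C.I. for mu: xbar +/- t * s / sqrt(n)
half_width = t_crit * s / np.sqrt(n)
print(xbar - half_width, xbar + half_width)
```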

Tests for means and variances (Normal populations):
1. One sample:
   (1) Test for μ: Z-test if σ² is known; t-test if σ² is unknown
   (2) Test for σ²: χ² test
2. Two samples:
   (1) Test for μ1 − μ2: Z-test if σ1² and σ2² are known; pooled t-test if σ1² = σ2² (unknown)
   (2) Test for σ1²/σ2²: F-test
3. Paired samples:
   Test for μ_D = μ1 − μ2: paired-t-test
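A minimal sketch of case 2.(1), the pooled t-test under σ1² = σ2²; the two samples are illustrative, and scipy's ttest_ind with equal_var=True is one way to carry it out.

```python
import numpy as np
from scipy import stats

# Two hypothetical independent Normal samples with assumed equal variances.
x1 = np.array([5.1, 4.9, 5.6, 5.3, 4.8, 5.2])
x2 = np.array([4.6, 4.9, 4.4, 4.7, 5.0, 4.5])

# Pooled variance: s_p^2 = ((n1-1)s1^2 + (n2-1)s2^2) / (n1 + n2 - 2)
n1, n2 = len(x1), len(x2)
sp2 = ((n1 - 1) * x1.var(ddof=1) + (n2 - 1) * x2.var(ddof=1)) / (n1 + n2 - 2)
t_stat = (x1.mean() - x2.mean()) / np.sqrt(sp2 * (1 / n1 + 1 / n2))

# Same test via scipy (equal_var=True gives the pooled version).
t_scipy, p_value = stats.ttest_ind(x1, x2, equal_var=True)
print(t_stat, t_scipy, p_value)
```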


Large-sample tests:
1. One sample:
   (1) Test for μ
   (2) Test for p
2. Two samples:
   (1) Test for μ1 − μ2
   (2) Test for p1 − p2
(When n ≥ 30, use a Z statistic by the C.L.T.)

III. Theory of testing:
1. MP test: Neyman–Pearson Lemma
2. UMP test
3. LR test
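For item 3, the LR statistic in its standard form, for reference:

```latex
\[
\lambda(\mathbf{x}) =
\frac{\sup_{\theta\in\Theta_0} L(\theta\mid\mathbf{x})}
     {\sup_{\theta\in\Theta} L(\theta\mid\mathbf{x})},
\qquad
\text{reject } H_0 \text{ when } \lambda(\mathbf{x}) \le c .
\]
% Under regularity conditions, -2 log(lambda) is approximately chi-square
% with df = dim(Theta) - dim(Theta_0) for large n.
```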

ANOVA (analysis of variance):
(1) CRD (completely randomized design): one-way ANOVA (a sketch follows this outline)
    Multiple comparisons: Scheffé, Bonferroni, Tukey, Duncan
    Tests for equal variances, H0: σ1² = σ2² = ⋯ = σk²: Bartlett test, Hartley test
(2) RBD (randomized block design): two-way ANOVA
(3) LSD (Latin square design)
Two-factor designs:
(1) Two-way ANOVA (with interaction)
(2) …
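A minimal sketch of one-way ANOVA for a CRD; the group data are illustrative, and scipy.stats.f_oneway is one readily available routine.

```python
import numpy as np
from scipy import stats

# Three hypothetical treatment groups from a completely randomized design.
g1 = np.array([23.0, 25.1, 24.3, 26.0])
g2 = np.array([27.2, 28.4, 26.9, 27.8])
g3 = np.array([24.8, 25.5, 26.1, 25.0])

# H0: all group means are equal; the F statistic compares the between-group
# mean square with the within-group mean square.
f_stat, p_value = stats.f_oneway(g1, g2, g3)
print(f_stat, p_value)
```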


I. Simple linear regression:
- Residuals vs. errors
- Assumptions:
  1. Model assumptions
  2. Error assumptions: (1) Gauss–Markov conditions (2) normality
- Estimation (a sketch follows this section):
  1. Least squares (OLSE)  2. Maximum likelihood (MLE)  3. Method of moments (MME)
- Tests:
  1. ANOVA F-test  2. t-test  3. …-test  4. Fisher Z transformation test
- C.I. for E(Y | x) and prediction interval for Y
- Coefficient of determination R²
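A minimal sketch of the least-squares estimates for the simple model Y = β0 + β1·x + ε; the data pairs are illustrative.

```python
import numpy as np

# Hypothetical (x, y) pairs for the simple linear regression sketch.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.3, 2.9, 4.1, 4.8, 6.2, 6.9])

S_xx = np.sum((x - x.mean()) ** 2)
S_xy = np.sum((x - x.mean()) * (y - y.mean()))

# OLSE: b1 = S_xy / S_xx, b0 = ybar - b1 * xbar
b1 = S_xy / S_xx
b0 = y.mean() - b1 * x.mean()

# Coefficient of determination R^2 = 1 - SSE / SST
y_hat = b0 + b1 * x
sse = np.sum((y - y_hat) ** 2)
sst = np.sum((y - y.mean()) ** 2)
r2 = 1 - sse / sst
print(b0, b1, r2)
```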

II. Lack of fit test; nonlinear regression

Matrix approach:
- …
- Quadratic forms
- …
- OLSE
- ANOVA
- …

I. Model:
1. Classical assumptions
2. Normality assumption
II. Estimation:
1. OLSE: normal equations (a sketch follows section IV)
2. Correlation matrix
3. …
III. Tests:
(1) Overall F-test that all slope coefficients are 0 (ANOVA test)
(2) t-test for an individual coefficient
(3) t-test …
(4) Partial F-test (extra sum of squares)
IV. Measures of fit:
1. R² (coefficient of multiple determination) and adjusted R²
2. Coefficient of multiple correlation
3. Coefficient of partial determination
4. Coefficient of partial correlation
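For II.1 above, a minimal sketch of solving the normal equations (X'X)b = X'Y with numpy; the design matrix and response values are illustrative.

```python
import numpy as np

# Hypothetical data: two predictors plus an intercept column of ones.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y = np.array([3.1, 3.9, 7.2, 7.8, 10.5])

X = np.column_stack([np.ones_like(x1), x1, x2])   # design matrix

# Normal equations: (X'X) b = X'Y  =>  b = (X'X)^{-1} X'Y
b = np.linalg.solve(X.T @ X, X.T @ y)

# Fitted values and the coefficient of multiple determination R^2.
y_hat = X @ b
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(b, r2)
```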


Dummy variables (indicator variables):
1. …
2. …
3. …
4. Piecewise linear regression
5. ANOVA

Multicollinearity:
1. …
2. VIF = 1/(1 − R²) (variance inflation factor; a sketch follows this list)
3. …
4. …
5. …
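A minimal numpy sketch of item 2, computing VIF_j = 1/(1 − R_j²), where R_j² comes from regressing the j-th predictor on the remaining predictors; the predictor matrix is illustrative.

```python
import numpy as np

# Hypothetical predictor matrix (columns = predictors), deliberately correlated.
X = np.array([
    [1.0, 2.1, 3.0],
    [2.0, 3.9, 5.1],
    [3.0, 6.2, 6.8],
    [4.0, 8.1, 9.2],
    [5.0, 9.8, 11.1],
])

def vif(X, j):
    """VIF_j = 1 / (1 - R_j^2) from regressing column j on the other columns."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    Z = np.column_stack([np.ones(len(y)), others])   # add an intercept
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    y_hat = Z @ beta
    r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
    return 1.0 / (1.0 - r2)

print([round(vif(X, j), 2) for j in range(X.shape[1])])
```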

Standardized regression coefficients (Beta coefficients)



Autocorrelation:
1. Model
2. Autoregressive model AR(1)
3. Durbin–Watson test
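For item 3, a minimal sketch of the Durbin–Watson statistic d = Σ(e_t − e_{t−1})² / Σ e_t², computed directly from a residual series; the residuals are illustrative.

```python
import numpy as np

# Hypothetical residual series e_1, ..., e_n (e.g., from a fitted regression).
e = np.array([0.5, 0.7, 0.3, -0.2, -0.6, -0.4, 0.1, 0.4])

# Durbin-Watson: d = sum_{t=2}^{n} (e_t - e_{t-1})^2 / sum_{t=1}^{n} e_t^2.
d = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)
print(d)   # d near 2 suggests no first-order autocorrelation
```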

Residual plots of e_i = Y_i − Ŷ_i (e.g., against the time order i).

Nonparametric methods:
1. …
2. Goodness-of-fit tests: (1) … (2) Kolmogorov–Smirnov test (3) Lilliefors test
3. Rank tests: parametric test → nonparametric counterpart (a short sketch follows this list):
   - Normal, one-sample test for μ → (1) sign test (2) Wilcoxon signed-rank test
   - Normal, paired samples, paired-t-test for μ_D = μ1 − μ2 → (1) sign test (2) Wilcoxon signed-rank test
   - Normal, two samples, μ1 − μ2, pooled-t-test → Mann-Whitney-Wilcoxon test
   - ANOVA, C.R.D. → Kruskal–Wallis test
   - ANOVA, R.B.D. → Friedman test
   - Test for ρ → (1) Spearman rank correlation coefficient test (2) Kendall rank correlation test

4. Run test
5. McNemar test
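As referenced in item 3 above, a minimal sketch of two of the rank tests using scipy; the sample values are illustrative.

```python
import numpy as np
from scipy import stats

# Two hypothetical independent samples (no normality assumed).
x1 = np.array([12.0, 15.5, 11.2, 14.8, 13.1, 16.0])
x2 = np.array([10.4, 12.1, 9.8, 11.5, 10.9, 12.6])

# Mann-Whitney-Wilcoxon test: nonparametric counterpart of the pooled t-test.
u_stat, p_mw = stats.mannwhitneyu(x1, x2, alternative="two-sided")

# Wilcoxon signed-rank test on paired differences: counterpart of paired-t.
before = np.array([8.1, 7.9, 9.2, 8.6, 9.0, 8.4])
after = np.array([8.8, 8.2, 9.9, 8.5, 9.6, 9.1])
w_stat, p_w = stats.wilcoxon(before, after)

print(p_mw, p_w)
```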

Decision analysis:
1. …
2. …
3. …
4. Revising prior probabilities P(S_i) to posterior probabilities P(S_i | I)
5. Expected value of sample information (EVSI)
6. …

II. Time series:
Components of a time series:
1. Secular Trend
2. Seasonal Movement
3. Cyclical Fluctuation
4. Irregular Movement
Method of Moving Averages
Exponential Smoothing Method
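A minimal sketch of simple exponential smoothing, S_t = α·Y_t + (1 − α)·S_{t−1}; the series and the smoothing constant α = 0.3 are illustrative.

```python
import numpy as np

# Hypothetical time series and smoothing constant.
y = np.array([112.0, 118.0, 115.0, 121.0, 127.0, 124.0, 130.0])
alpha = 0.3

# Simple exponential smoothing: S_t = alpha * Y_t + (1 - alpha) * S_{t-1}.
s = np.empty_like(y)
s[0] = y[0]                      # initialize with the first observation
for t in range(1, len(y)):
    s[t] = alpha * y[t] + (1 - alpha) * s[t - 1]

print(np.round(s, 2))            # smoothed series; s[-1] forecasts the next period
```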

