
CFA Level 2 Quantitative Analysis

Testing the Existence of a Significant Relationship between the
Dependent Variable and the Independent Variable
1. Test on Population Slope

H0: β1 = 0 (There is no relationship)

H1: β1 ≠ 0 (There is a relationship)

Test Statistic:

t = (b1 - β1) / sb1    (under H0, β1 = 0, so t = b1 / sb1)

where sb1 is the standard error of the sample slope b1

If the absolute value of the t-ratio exceeds the critical t-statistic, then H0 is rejected
(the slope is significantly different from zero) and evidence of a linear relationship is concluded.
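The slope test above can be sketched in a few lines of Python. The data set below is hypothetical, chosen only to illustrate the calculation of b1, sb1, and the t-ratio:

```python
# A minimal sketch of the slope t-test (hypothetical data, not from the text).
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1, 12.2]
n = len(x)

x_bar = sum(x) / n
y_bar = sum(y) / n

# OLS slope b1 and intercept b0
sxx = sum((xi - x_bar) ** 2 for xi in x)
sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
b1 = sxy / sxx
b0 = y_bar - b1 * x_bar

# Residual variance and the standard error of the slope, sb1
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
s2 = sse / (n - 2)            # two estimated parameters: b0 and b1
sb1 = math.sqrt(s2 / sxx)

# t-ratio for H0: beta1 = 0
t = (b1 - 0) / sb1
print(round(b1, 4), round(t, 2))
```

A t-ratio this large would far exceed any usual critical value, so H0 would be rejected.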

2. Test on the Existence of a Significant Correlation, ρ

H0: ρ = 0 (There is no relationship)

H1: ρ ≠ 0 (There is a relationship)

Test Statistic:

t = r √(n - 2) / √(1 - r²)

where (n - 2) = Degrees of Freedom (= n - k - 1, with k = 1)


n = no. of observations
k = no. of independent variables

If the absolute value of the t-ratio exceeds the critical t-statistic, then H0 is rejected
(the correlation coefficient is significantly different from zero) and evidence of a
linear relationship is concluded.
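The correlation test is a one-line computation. The sample correlation and sample size below are hypothetical, and the critical value is the standard two-tailed 5% t-value for 30 degrees of freedom from a t-table:

```python
# A minimal sketch of the correlation t-test: t = r*sqrt(n-2)/sqrt(1-r^2)
# (hypothetical r and n).
import math

r = 0.60        # sample correlation coefficient (assumed)
n = 32          # number of observations (assumed)

t = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)
df = n - 2      # = n - k - 1 with k = 1 independent variable

# Two-tailed 5% critical t with 30 df is about 2.042 (from a t-table)
critical_t = 2.042
reject_h0 = abs(t) > critical_t
print(round(t, 3), reject_h0)
```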

3. Test on the Confidence Interval of β1

H0: β1 = 0 (There is no relationship)

H1: β1 ≠ 0 (There is a relationship)

Test Statistic: b1 ± tn-2 sb1

If the interval b1 - tn-2 sb1 < β1 < b1 + tn-2 sb1 does not include zero, then H0 is
rejected (the slope is significantly different from zero) and evidence of a linear
relationship is concluded.
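The confidence-interval version of the test can be sketched as below. The slope estimate and its standard error are hypothetical; the critical t is the two-tailed 5% value for 8 degrees of freedom from a t-table:

```python
# A minimal sketch of the confidence-interval test: build b1 +/- t*sb1
# and check whether zero falls inside (hypothetical numbers).
b1 = 0.75        # estimated slope (assumed)
sb1 = 0.30       # standard error of the slope (assumed)
t_crit = 2.306   # two-tailed 5% critical t with n - 2 = 8 df (t-table)

lower = b1 - t_crit * sb1
upper = b1 + t_crit * sb1

# H0: beta1 = 0 is rejected only if the interval excludes zero
reject_h0 = not (lower <= 0.0 <= upper)
print(round(lower, 4), round(upper, 4), reject_h0)
```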

P-VALUE APPROACH

The p-value approach involves calculating the value of the specified test statistic
and the corresponding p-value from sample data, then comparing the p-value with the
chosen significance level α.
Norman Cheung


If the p-value is less than the significance level α, reject H0.
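As a sketch of the p-value approach, the example below uses a two-tailed z-test, because the standard-normal p-value can be computed with only the standard library; the test-statistic value is hypothetical:

```python
# A minimal sketch of the p-value approach for a two-tailed z-test
# (hypothetical test-statistic value).
import math

z = 2.5          # value of the test statistic (assumed)
alpha = 0.05     # chosen significance level

# Two-tailed p-value for a standard normal statistic:
# P(|Z| > z) = erfc(z / sqrt(2))
p_value = math.erfc(abs(z) / math.sqrt(2))

reject_h0 = p_value < alpha
print(round(p_value, 4), reject_h0)
```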

F-STATISTIC

F-Statistic The F-statistic is used to test the hypothesis that all the
regression coefficients of the independent variables are simultaneously
equal to zero. Expressed mathematically, the F-statistic is used to test the null
hypothesis H0: β1 = β2 = β3 = 0 (i.e. the slope coefficients are tested for
significance as a group; rejecting H0 means at least one slope differs from zero).

For Multiple Regression Model

If the F-statistic > the critical value, then H0 is rejected
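The F-statistic is formed from the ANOVA quantities as F = MSR / MSE. The sums of squares and dimensions below are hypothetical, and the critical value is the 5% F-value for (3, 21) degrees of freedom from an F-table:

```python
# A minimal sketch of the F-test: F = MSR / MSE (hypothetical SSR/SSE figures).
ssr = 80.0      # regression sum of squares (assumed)
sse = 40.0      # error sum of squares (assumed)
n = 25          # number of observations (assumed)
k = 3           # number of independent variables (assumed)

msr = ssr / k
mse = sse / (n - (k + 1))
f_stat = msr / mse

# Critical F with (k, n - (k + 1)) = (3, 21) df at the 5% level is
# about 3.07 (from an F-table)
reject_h0 = f_stat > 3.07
print(round(f_stat, 2), reject_h0)
```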

T-STATISTIC

T-Statistic The T-statistic is used to test the hypothesis that a single
regression coefficient of a linear model, or one coefficient of a multiple
regression model, is equal to zero (note the difference between the t-test and
the F-test). Expressed mathematically, the T-statistic is used to test the null
hypothesis H0: β1 = 0.

If the absolute value of the T-statistic exceeds the critical value, then H0 is rejected

Interpretation of R2: Goodness of Fit or Coefficient of Determination

The R2 indicates the percentage of the total variability of the dependent variable
that can be explained by the regression model.
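The R2 calculation can be sketched directly from the sums of squares (the figures below are hypothetical):

```python
# A minimal sketch of R^2 = SSR / SST = 1 - SSE / SST (hypothetical figures).
sst = 120.0      # total sum of squares (assumed)
sse = 30.0       # unexplained (error) sum of squares (assumed)
ssr = sst - sse  # explained (regression) sum of squares

r_squared = ssr / sst      # equivalently 1 - sse / sst
print(r_squared)           # 0.75: the model explains 75% of the variation
```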

INTERCEPT AND SLOPE COEFFICIENT

The intercept coefficient may be interpreted as the level of the dependent
variable given a zero value for the independent variable. This interpretation
assumes that the prediction occurs within the range of values of the independent
variable used in developing the regression equation.

The slope coefficient represents the change in the dependent variable given a
unit change in the independent variable.
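The two interpretations above can be sketched with a hypothetical estimated regression line (the coefficient values are illustrative only):

```python
# A minimal sketch of interpreting an estimated line y_hat = b0 + b1*x
# (hypothetical coefficients).
b0 = 4.0   # intercept: predicted y when x = 0
b1 = 1.5   # slope: change in y for a one-unit change in x

def predict(x):
    return b0 + b1 * x

# A one-unit increase in x raises the prediction by exactly b1
assert predict(11) - predict(10) == b1
print(predict(0), predict(10))   # 4.0 at x = 0; 19.0 at x = 10
```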

AUTOCORRELATION

Autocorrelation occurs when successive observations of the dependent variable
are correlated. It tends to occur when data are collected over a period of time,
and a residual analysis is required to determine whether autocorrelation exists.



MULTICOLLINEARITY

Multicollinearity exists when there is a high degree of correlation among the
independent variables in a regression model.

Symptoms of high multicollinearity include:


1. Large R2 but statistically insignificant regression coefficients.
2. Regression coefficients that change greatly in value when independent
variables are dropped from or added to the equation.
3. The magnitude of one or more coefficients is unexpectedly large or small
relative to expectations.
4. A coefficient appears to have a wrong sign (the sign is counterintuitive).

The Danger of Multicollinearity is that the coefficients of the regression
equation may be unstable. When multicollinearity exists in a model, it is difficult to
determine which independent variables are contributing significantly to the
explanatory power of the model. Therefore, the reliability of the model for
explaining which variables contribute to the forecast of the value of the dependent
variable is reduced.

STATISTICALLY SIGNIFICANT AND ECONOMICALLY SIGNIFICANT

Statistically Significant means there is a relationship between the dependent and
independent variables, say a linear relationship.

Economically Significant means the impact of the independent variable on the
dependent variable is significant in terms of magnitude; i.e., the regression
coefficient (or the estimated effect on the dependent variable) is large enough,
say 100 or 200 instead of 0.0001 or 0.0002.

ANOVA ANALYSIS OF VARIANCE TABLE

Source of Variation   Degrees of Freedom (df)   Sum of Squares   Mean Square                 F-Statistic
Regression (R)        k                         SSR              MSR = SSR / k               F = MSR / MSE
Sampling Error (E)    n - (k + 1)               SSE              MSE = SSE / (n - (k + 1))
Total (T)             n - 1                     SST

SEE, Standard Error of the Estimate

SEE = √MSE = √( SSE / (n - (k + 1)) )
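The ANOVA table and the SEE can be filled in together from the sums of squares. The figures below are hypothetical, chosen only to show how each cell is computed:

```python
# A minimal sketch filling in the ANOVA table and the SEE from hypothetical
# sums of squares (SSR, SSE assumed; SST = SSR + SSE).
import math

n, k = 20, 2           # observations and independent variables (assumed)
ssr, sse = 60.0, 30.0  # regression and error sums of squares (assumed)
sst = ssr + sse

df_regression = k
df_error = n - (k + 1)
df_total = n - 1

msr = ssr / df_regression
mse = sse / df_error
f_stat = msr / mse

see = math.sqrt(mse)   # SEE = sqrt(MSE) = sqrt(SSE / (n - (k + 1)))
print(round(f_stat, 2), round(see, 4))
```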

More CFA info & materials can be retrieved from the followings:
For visitors from Hong Kong: http://normancafe.uhome.net/StudyRoom.htm
For visitors outside Hong Kong: http://www.angelfire.com/nc3/normancafe/StudyRoom.htm

