17-1
Chapter Outline
1) Overview
2) Product-Moment Correlation
3) Partial Correlation
4) Nonmetric Correlation
5) Regression Analysis
6) Bivariate Regression
7) Statistics Associated with Bivariate Regression Analysis
8) Conducting Bivariate Regression Analysis
i. Scatter Diagram
ii. Bivariate Regression Model
17-2
Chapter Outline
17-3
Chapter Outline
13) Multicollinearity
18) Summary
17-4
Product Moment Correlation
17-5
Product Moment Correlation
Division of the numerator and denominator by (n − 1) gives

r = [Σi=1..n (Xi − X̄)(Yi − Ȳ) / (n − 1)]
    / {√[Σi=1..n (Xi − X̄)² / (n − 1)] × √[Σi=1..n (Yi − Ȳ)² / (n − 1)]}

  = COVxy / (Sx Sy)
17-6
Product Moment Correlation
17-7
Explaining Attitude Toward the City of
Residence
Table 17.1
Respondent No.  Attitude Toward the City  Duration of Residence  Importance Attached to Weather
 1    6   10    3
 2    9   12   11
 3    8   12    4
 4    3    4    1
 5   10   12   11
 6    4    6    1
 7    5    8    7
 8    2    2    4
 9   11   18    8
10    9    9   10
11   10   17    8
12    2    2    5
17-8
Product Moment Correlation
X̄ = (10 + 12 + 12 + 4 + 12 + 6 + 8 + 2 + 18 + 9 + 17 + 2)/12 = 9.333
Ȳ = (6 + 9 + 8 + 3 + 10 + 4 + 5 + 2 + 11 + 9 + 10 + 2)/12 = 6.583

Σi=1..n (Xi − X̄)(Yi − Ȳ)
  = (10 − 9.33)(6 − 6.58) + (12 − 9.33)(9 − 6.58)
  + (12 − 9.33)(8 − 6.58) + (4 − 9.33)(3 − 6.58)
  + (12 − 9.33)(10 − 6.58) + (6 − 9.33)(4 − 6.58)
  + (8 − 9.33)(5 − 6.58) + (2 − 9.33)(2 − 6.58)
  + (18 − 9.33)(11 − 6.58) + (9 − 9.33)(9 − 6.58)
  + (17 − 9.33)(10 − 6.58) + (2 − 9.33)(2 − 6.58)
  = −0.3886 + 6.4614 + 3.7914 + 19.0814
  + 9.1314 + 8.5914 + 2.1014 + 33.5714
  + 38.3214 − 0.7986 + 26.2314 + 33.5714
  = 179.6668
17-9
Product Moment Correlation
Σi=1..n (Xi − X̄)² = (10 − 9.33)² + (12 − 9.33)² + (12 − 9.33)² + (4 − 9.33)²
                  + (12 − 9.33)² + (6 − 9.33)² + (8 − 9.33)² + (2 − 9.33)²
                  + (18 − 9.33)² + (9 − 9.33)² + (17 − 9.33)² + (2 − 9.33)²
                  = 0.4489 + 7.1289 + 7.1289 + 28.4089
                  + 7.1289 + 11.0889 + 1.7689 + 53.7289
                  + 75.1689 + 0.1089 + 58.8289 + 53.7289
                  = 304.6668

Σi=1..n (Yi − Ȳ)² = (6 − 6.58)² + (9 − 6.58)² + (8 − 6.58)² + (3 − 6.58)²
                  + (10 − 6.58)² + (4 − 6.58)² + (5 − 6.58)² + (2 − 6.58)²
                  + (11 − 6.58)² + (9 − 6.58)² + (10 − 6.58)² + (2 − 6.58)²
                  = 0.3364 + 5.8564 + 2.0164 + 12.8164
                  + 11.6964 + 6.6564 + 2.4964 + 20.9764
                  + 19.5364 + 5.8564 + 11.6964 + 20.9764
                  = 120.9168

Thus,
r = 179.6668 / √[(304.6668)(120.9168)]
  = 0.9361
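The calculation above can be checked with a short Python sketch (plain Python, no statistics library; the variable names are mine):

```python
# Product-moment correlation for Table 17.1:
# X = duration of residence, Y = attitude toward the city.
from math import sqrt

x = [10, 12, 12, 4, 12, 6, 8, 2, 18, 9, 17, 2]
y = [6, 9, 8, 3, 10, 4, 5, 2, 11, 9, 10, 2]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))  # sum of cross-products
ssx = sum((xi - mx) ** 2 for xi in x)
ssy = sum((yi - my) ** 2 for yi in y)
r = cov / sqrt(ssx * ssy)
print(round(r, 4))  # 0.9361
```

The three sums reproduce 179.67, 304.67, and 120.92 without the per-term rounding used on the slides.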
17-10
Decomposition of the Total Variation
r² = Explained variation / Total variation
   = SSx / SSy
   = (Total variation − Error variation) / Total variation
   = (SSy − SSerror) / SSy
17-11
Decomposition of the Total Variation
H0: ρ = 0
H1: ρ ≠ 0
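The slides state the hypotheses but not the test statistic; the usual test (a standard result, not taken from this deck) uses t = r√((n − 2)/(1 − r²)) with n − 2 degrees of freedom. A minimal sketch for the Table 17.1 result:

```python
from math import sqrt

r, n = 0.9361, 12                      # correlation and sample size from Table 17.1
t = r * sqrt((n - 2) / (1 - r ** 2))   # t statistic with n - 2 = 10 df
print(round(t, 2))  # 8.42
```

With 10 df this far exceeds the usual critical values, so H0: ρ = 0 would be rejected.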
17-12
Decomposition of the Total Variation
Fig. 17.1
[Figure: plot of Y against X, with X ranging from −3 to 3]
17-14
Partial Correlation
17-16
Part Correlation Coefficient
ry(x.z) = (rxy − ryz rxz) / √(1 − rxz²)
The partial correlation coefficient is generally viewed as
more important than the part correlation coefficient.
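The two coefficients can be compared directly in code. The pairwise correlations below are illustrative values I chose, not numbers from the text:

```python
from math import sqrt

# Hypothetical pairwise correlations (illustrative only)
rxy, ryz, rxz = 0.9, 0.3, 0.4

# Part (semipartial) correlation: z is partialed out of x only,
# as in the formula above
part = (rxy - ryz * rxz) / sqrt(1 - rxz ** 2)

# Partial correlation: z is partialed out of both x and y
partial = (rxy - ryz * rxz) / (sqrt(1 - rxz ** 2) * sqrt(1 - ryz ** 2))

print(round(part, 3), round(partial, 3))  # 0.851 0.892
```

The extra √(1 − ryz²) in the partial correlation's denominator is the only difference between the two formulas.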
17-17
Nonmetric Correlation
17-18
Regression Analysis
17-19
Statistics Associated with Bivariate
Regression Analysis
17-20
Statistics Associated with Bivariate
Regression Analysis
17-22
Conducting Bivariate Regression Analysis
Plot the Scatter Diagram
17-23
Conducting Bivariate Regression Analysis
Fig. 17.2
Plot the Scatter Diagram
Yi = β0 + β1Xi + ei
where
Y = dependent or criterion variable
X = independent or predictor variable
β0 = intercept of the line
β1 = slope of the line
Fig. 17.3
[Figure: scatter diagram of Attitude against Duration of Residence]
17-26
Which Straight Line Is Best?
Fig. 17.4
[Figure: the scatter points with four candidate straight lines, Line 1 through Line 4]
17-27
Bivariate Regression
Fig. 17.5
[Figure: the fitted line Ŷ = β0 + β1X, showing an observed value Yj, its prediction Ŷj, and the residual ej, over X1 through X5]
17-28
Conducting Bivariate Regression Analysis
Estimate the Parameters
In most cases, β0 and β1 are unknown and are estimated
from the sample observations using the equation
Ŷi = a + bXi
where Ŷi is the estimated or predicted value of Yi, and
a and b are estimators of β0 and β1, respectively.

b = COVxy / Sx²
  = Σi=1..n (Xi − X̄)(Yi − Ȳ) / Σi=1..n (Xi − X̄)²
  = (Σi=1..n XiYi − nX̄Ȳ) / (Σi=1..n Xi² − nX̄²)
17-29
Conducting Bivariate Regression Analysis
Estimate the Parameters
a = Ȳ − bX̄
For the data in Table 17.1, the estimation of parameters may be
illustrated as follows:
Σi=1..12 XiYi = (10)(6) + (12)(9) + (12)(8) + (4)(3) + (12)(10) + (6)(4)
             + (8)(5) + (2)(2) + (18)(11) + (9)(9) + (17)(10) + (2)(2)
             = 917
Σi=1..12 Xi² = 10² + 12² + 12² + 4² + 12² + 6² + 8² + 2² + 18² + 9² + 17² + 2²
             = 1350
17-30
Conducting Bivariate Regression Analysis
Estimate the Parameters
b = (917 − (12)(9.333)(6.583)) / (1350 − (12)(9.333)²)
  = 179.667 / 304.667
  = 0.5897
a = Ȳ − bX̄
  = 6.583 − (0.5897)(9.333)
  = 1.0793
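The least-squares estimates follow from a few lines of Python (a sketch using the sums just computed):

```python
x = [10, 12, 12, 4, 12, 6, 8, 2, 18, 9, 17, 2]   # duration of residence
y = [6, 9, 8, 3, 10, 4, 5, 2, 11, 9, 10, 2]      # attitude toward the city

n = len(x)
mx, my = sum(x) / n, sum(y) / n                   # 9.333, 6.583
sxy = sum(xi * yi for xi, yi in zip(x, y))        # 917
sxx = sum(xi ** 2 for xi in x)                    # 1350
b = (sxy - n * mx * my) / (sxx - n * mx ** 2)     # slope
a = my - b * mx                                   # intercept
print(round(b, 4), round(a, 4))  # 0.5897 1.0793
```

The fitted line is therefore Ŷ = 1.0793 + 0.5897X.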
17-31
Conducting Bivariate Regression Analysis
Estimate the Standardized Regression Coefficient
17-32
Conducting Bivariate Regression Analysis
Test for Significance
17-33
Conducting Bivariate Regression Analysis
Test for Significance
17-34
Conducting Bivariate Regression Analysis
Determine the Strength and Significance of Association
where
SSy = Σi=1..n (Yi − Ȳ)²
SSreg = Σi=1..n (Ŷi − Ȳ)²
SSres = Σi=1..n (Yi − Ŷi)²
17-35
Decomposition of the Total
Variation in Bivariate Regression
Fig. 17.6
[Figure: decomposition of the total variation around Ȳ into residual variation (SSres) and explained variation (SSreg), over X1 through X5]
17-36
Conducting Bivariate Regression Analysis
Determine the Strength and Significance of Association
r² = (SSy − SSres) / SSy
To illustrate the calculation of r², let us consider again the effect of the
duration of residence on attitude toward the city. It may be recalled from the
earlier calculation of the simple correlation coefficient that:
SSy = Σi=1..n (Yi − Ȳ)²
    = 120.9168
17-37
Conducting Bivariate Regression Analysis
Determine the Strength and Significance of Association
17-38
Conducting Bivariate Regression Analysis
Determine the Strength and Significance of Association
Therefore,
SSreg = Σi=1..n (Ŷi − Ȳ)²
      = (6.9763 − 6.5833)² + (8.1557 − 6.5833)²
      + (8.1557 − 6.5833)² + (3.4381 − 6.5833)²
      + (8.1557 − 6.5833)² + (4.6175 − 6.5833)²
      + (5.7969 − 6.5833)² + (2.2587 − 6.5833)²
      + (11.6939 − 6.5833)² + (6.3866 − 6.5833)²
      + (11.1042 − 6.5833)² + (2.2587 − 6.5833)²
      = 0.1544 + 2.4724 + 2.4724 + 9.8922 + 2.4724
      + 3.8643 + 0.6184 + 18.7021 + 26.1182
      + 0.0387 + 20.4385 + 18.7021
      = 105.9524
17-39
Conducting Bivariate Regression Analysis
Determine the Strength and Significance of Association
SSres = Σi=1..n (Yi − Ŷi)² = (6 − 6.9763)² + (9 − 8.1557)² + (8 − 8.1557)²
      + (3 − 3.4381)² + (10 − 8.1557)² + (4 − 4.6175)²
      + (5 − 5.7969)² + (2 − 2.2587)² + (11 − 11.6939)²
      + (9 − 6.3866)² + (10 − 11.1042)² + (2 − 2.2587)²
      = 14.9644
r² = SSreg / SSy
   = 105.9524/120.9168
   = 0.8762
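The decomposition can be reproduced with the fitted coefficients from the earlier slides (a = 1.0793, b = 0.5897):

```python
x = [10, 12, 12, 4, 12, 6, 8, 2, 18, 9, 17, 2]   # duration of residence
y = [6, 9, 8, 3, 10, 4, 5, 2, 11, 9, 10, 2]      # attitude toward the city
a, b = 1.0793, 0.5897                    # intercept and slope fitted earlier

yhat = [a + b * xi for xi in x]          # predicted values (6.9763, 8.1557, ...)
my = sum(y) / len(y)
ss_y = sum((yi - my) ** 2 for yi in y)
ss_reg = sum((yh - my) ** 2 for yh in yhat)
ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
r2 = ss_reg / ss_y
print(round(r2, 4))  # 0.8762
```

Note that r² = 0.8762 equals the square of the simple correlation r = 0.9361 computed earlier, as it must in bivariate regression.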
17-40
Conducting Bivariate Regression Analysis
Determine the Strength and Significance of Association
H0: R2pop = 0
17-41
Conducting Bivariate Regression Analysis
Determine the Strength and Significance of Association
F = SSreg / (SSres/(n − 2))

r² = 105.9522/(105.9522 + 14.9644)
   = 0.8762
F = 105.9522/(14.9644/10)
  = 70.8027
17-43
Bivariate Regression
Table 17.2
Multiple R        0.93608
R²                0.87624
Adjusted R²       0.86387
Standard Error    1.22329

ANALYSIS OF VARIANCE
             df   Sum of Squares   Mean Square
Regression    1        105.9522      105.9522
Residual     10         14.9644        1.4964
F = 70.8027
17-44
Conducting Bivariate Regression Analysis
Check Prediction Accuracy
To estimate the accuracy of the predicted values, Ŷ, it is useful to
calculate the standard error of estimate, SEE:
SEE = √[Σi=1..n (Yi − Ŷi)² / (n − 2)]
or
SEE = √[SSres / (n − 2)]
or, with k independent variables,
SEE = √[SSres / (n − k − 1)]
For the data given in Table 17.2, the SEE is estimated as follows:
SEE = √[14.9644/(12 − 2)]
    = 1.22329
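Continuing with SSres = 14.9644 and n = 12 from the slides, the calculation is a one-liner:

```python
from math import sqrt

ss_res, n = 14.9644, 12
see = sqrt(ss_res / (n - 2))   # standard error of estimate
print(round(see, 5))  # 1.22329
```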
17-45
Assumptions
17-46
Multiple Regression
Y = β0 + β1X1 + β2X2 + β3X3 + ... + βkXk + e
which is estimated by the following equation:
Ŷ = a + b1X1 + b2X2 + b3X3 + ... + bkXk
17-47
Statistics Associated with Multiple Regression
• F test. The F test is used to test the null hypothesis that the
coefficient of multiple determination in the population, R²pop, is
zero. This is equivalent to testing the null hypothesis H0: β1 = β2
= β3 = ... = βk = 0. The test statistic has an F distribution with k
and (n − k − 1) degrees of freedom.
17-48
Statistics Associated with Multiple Regression
17-49
Conducting Multiple Regression Analysis
Partial Regression Coefficients
Ŷ = a + b1X1 + b2X2
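For the two-predictor case, the coefficients can be obtained from the normal equations in deviation form. The sketch below fits attitude (Y) on duration of residence (X1) and importance attached to weather (X2) from Table 17.1, without external libraries:

```python
y  = [6, 9, 8, 3, 10, 4, 5, 2, 11, 9, 10, 2]     # attitude toward the city
x1 = [10, 12, 12, 4, 12, 6, 8, 2, 18, 9, 17, 2]  # duration of residence
x2 = [3, 11, 4, 1, 11, 1, 7, 4, 8, 10, 8, 5]     # importance attached to weather

n = len(y)
my, m1, m2 = sum(y) / n, sum(x1) / n, sum(x2) / n
s11 = sum((v - m1) ** 2 for v in x1)
s22 = sum((v - m2) ** 2 for v in x2)
s12 = sum((u - m1) * (v - m2) for u, v in zip(x1, x2))
s1y = sum((u - m1) * (v - my) for u, v in zip(x1, y))
s2y = sum((u - m2) * (v - my) for u, v in zip(x2, y))

# Solve the 2x2 normal equations by Cramer's rule
det = s11 * s22 - s12 ** 2
b1 = (s1y * s22 - s12 * s2y) / det
b2 = (s11 * s2y - s12 * s1y) / det
a = my - b1 * m1 - b2 * m2

ss_y = sum((v - my) ** 2 for v in y)
r2 = (b1 * s1y + b2 * s2y) / ss_y        # coefficient of multiple determination
print(round(r2, 3))  # 0.945, matching the R² of 0.94498 reported in Table 17.3
```

Adding the weather variable raises R² from 0.876 (bivariate) to about 0.945.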
17-50
Conducting Multiple Regression Analysis
Partial Regression Coefficients
17-51
Conducting Multiple Regression Analysis
Partial Regression Coefficients
or
17-52
Multiple Regression
Table 17.3
Multiple R 0.97210
R2 0.94498
Adjusted R2 0.93276
Standard Error 0.85974
ANALYSIS OF VARIANCE
df Sum of Squares Mean Square
where
SSy = Σi=1..n (Yi − Ȳ)²
SSreg = Σi=1..n (Ŷi − Ȳ)²
SSres = Σi=1..n (Yi − Ŷi)²
17-54
Conducting Multiple Regression Analysis
Strength of Association
R² = SSreg / SSy
17-55
Conducting Multiple Regression Analysis
Significance Testing
H0: R²pop = 0
H0: β1 = β2 = β3 = ... = βk = 0
F = (SSreg/k) / (SSres/(n − k − 1))
  = (R²/k) / ((1 − R²)/(n − k − 1))
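Plugging the Table 17.3 values (R² = 0.94498, k = 2, n = 12) into the second form gives the overall F statistic; the resulting value is computed here, not quoted from the table:

```python
r2, k, n = 0.94498, 2, 12
f = (r2 / k) / ((1 - r2) / (n - k - 1))  # F with k and (n - k - 1) df
print(round(f, 2))  # 77.29
```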
17-56
Conducting Multiple Regression Analysis
Significance Testing
t = b / SEb
17-57
Conducting Multiple Regression Analysis
Examination of Residuals
17-58
Conducting Multiple Regression Analysis
Examination of Residuals
17-59
Residual Plot Indicating that
Variance Is Not Constant
Fig. 17.7
[Figure: residuals plotted against predicted Y values]
17-60
Residual Plot Indicating a Linear Relationship
Between Residuals and Time
Fig. 17.8
[Figure: residuals plotted against time]
17-61
Plot of Residuals Indicating that
a Fitted Model Is Appropriate
Fig. 17.9
[Figure: residuals plotted against predicted Y values]
17-62
Stepwise Regression
17-63
Multicollinearity
17-64
Multicollinearity
17-65
Relative Importance of Predictors
17-66
Relative Importance of Predictors
17-67
Cross-Validation
17-68
Regression with Dummy Variables
17-69
Analysis of Variance and Covariance
with Regression
17-70
Analysis of Variance and Covariance
with Regression
R² = η²
Overall F test = F test
17-71
SPSS Windows
17-72
SPSS Windows: Correlations
7. Click OK.
17-73
SPSS Windows: Bivariate Regression
8. Click CONTINUE.
9. Click OK.