Model Summary

Model   R       R Square   Adjusted R Square   Std. Error of the Estimate
1       .737a   .543       .502                .52614
ANOVAa

Model          Sum of Squares   df   Mean Square   F        Sig.
1  Regression  14.803            4   3.701         13.369   .000b
   Residual    12.457           45    .277
   Total       27.261           49
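As a quick sanity check, the figures in the two tables above are internally consistent; the sketch below reproduces them from the sums of squares alone (values copied from the ANOVA table):

```python
# Reproduce R Square, Adjusted R Square, and F from the sums of squares.
ss_reg, ss_res, ss_tot = 14.803, 12.457, 27.261  # from the ANOVA table
df_reg, df_res = 4, 45                           # 4 predictors, n - k - 1 = 45

r2 = ss_reg / ss_tot                                  # R Square
adj_r2 = 1 - (1 - r2) * (df_reg + df_res) / df_res    # Adjusted R Square
f = (ss_reg / df_reg) / (ss_res / df_res)             # F = MS_reg / MS_res
print(round(r2, 3), round(adj_r2, 3), round(f, 3))    # → 0.543 0.502 13.369
```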
Coefficientsa

                           Unstandardized        Standardized
                           Coefficients          Coefficients                   Collinearity Statistics
Model                      B       Std. Error    Beta           t       Sig.    Tolerance   VIF
1  (Constant)              .358    .569                          .630   .532
   Distributive justice    .146    .120          .170           1.218   .229    .523        1.913
   Procedural justice      .076    .147          .110            .521   .605    .227        4.398
   Interactive justice     .237    .128          .306           1.848   .071    .369        2.710
   Informational justice   .282    .135          .292           2.090   .042    .520        1.925
a. Dependent Variable: Job satisfaction
EVALUATION
The explanatory variables distributive justice (tolerance .523) and informational justice (tolerance .520) show only a moderate degree of multicollinearity, since their tolerance values are closer to 1. Procedural justice (.227) and interactive justice (.369) have low tolerance and therefore a higher degree of multicollinearity.
The VIF column tells the same story: distributive justice has a VIF of 1.913 and informational justice 1.925, both comparatively low; interactive justice is higher at 2.710; and procedural justice shows the most severe multicollinearity, with a VIF of 4.398.
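The Tolerance and VIF columns above can be reproduced by hand: VIF_j = 1/(1 − R_j²), where R_j² comes from regressing predictor j on the other predictors. A minimal numpy sketch, using synthetic scores since the study's raw data are not shown (the variable names are illustrative):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of the predictor matrix X.

    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j
    on the remaining columns (plus an intercept). Tolerance is 1 / VIF.
    """
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1 - resid @ resid / np.sum((y - y.mean()) ** 2)
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

# Illustration with synthetic data:
rng = np.random.default_rng(0)
dj = rng.normal(size=200)
pj = 0.9 * dj + 0.3 * rng.normal(size=200)   # built to correlate with dj
inj = rng.normal(size=200)
X = np.column_stack([dj, pj, inj])
print(np.round(vif(X), 3))   # the correlated pair shares an inflated VIF
```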
MULTICOLLINEARITY REMEDIES
1: Do nothing
One solution, familiar to every econometrics student, is simply to do nothing, i.e. not to interfere with the model. This remedy is adopted by many econometricians (e.g. Koutsoyiannis, Gujarati), because multicollinearity does not bias the OLS estimates; it only inflates their standard errors.
2: Increase the sample size (n)
By increasing the number of observations, the problem is either removed entirely or at least reduced. Increasing the sample size is therefore a comparatively good remedy for multicollinearity.
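A small simulation illustrates why: with the same degree of collinearity between the regressors, a larger sample shrinks the standard error of each slope estimate. This is a sketch with made-up data, not the study's sample:

```python
import numpy as np

rng = np.random.default_rng(1)

def slope_se(n, rho=0.95):
    """Estimated standard error of b1 in y = b0 + b1*x1 + b2*x2 + e,
    where x1 and x2 are built to have correlation ~rho."""
    x1 = rng.normal(size=n)
    x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.normal(size=n)
    y = 1 + 0.5 * x1 + 0.5 * x2 + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (n - 3)          # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)     # usual OLS covariance matrix
    return np.sqrt(cov[1, 1])

# Same collinearity in both samples; the larger one pins the slope
# down far more precisely:
print(slope_se(50), slope_se(5000))
```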
3: Dropping one variable
By dropping one of the collinear variables (the one that is less important for the study), the problem can be solved. E.g. consider the consumption function with wealth (X1) and income (X2):
Yi = β0 + β1X1i + β2X2i + ei
Here wealth X1 is not as important a variable as income X2, so by dropping X1 from the model the problem is removed:
Yi = β0 + β1X2i + ei   (New Model)
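The effect of dropping the less important regressor can be seen in the conditioning of the predictor matrix; a sketch with synthetic wealth/income data standing in for X1 and X2:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
wealth = rng.normal(size=n)                          # X1
income = 0.95 * wealth + 0.1 * rng.normal(size=n)    # X2, nearly collinear with X1

# Condition number of the predictor matrix before and after dropping
# the less important of the two collinear regressors:
X_full = np.column_stack([wealth, income])
X_drop = income.reshape(-1, 1)
print(np.linalg.cond(X_full), np.linalg.cond(X_drop))
```

The full matrix is badly conditioned because its two columns are nearly parallel; with a single regressor left, the condition number falls to 1.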
4: Taking logs of the variables
By taking the log of the data, the correlation between the X's can sometimes be removed or reduced, because the log transformation compresses the scale of the variables.
E.g.: log Yi = β0 + β1 log X1i + β2 log X2i + ei
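Whether logs actually weaken the correlation depends on the data; the sketch below (entirely synthetic) just shows how the log-log specification above is fitted with ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
x1 = np.exp(rng.normal(size=n))     # strictly positive regressors, as logs require
x2 = np.exp(rng.normal(size=n))
y = np.exp(0.5 + 0.3 * np.log(x1) + 0.2 * np.log(x2) + 0.1 * rng.normal(size=n))

# Fit the log-log model: log y = b0 + b1*log x1 + b2*log x2 + e
Z = np.column_stack([np.ones(n), np.log(x1), np.log(x2)])
beta, *_ = np.linalg.lstsq(Z, np.log(y), rcond=None)
print(np.round(beta, 2))    # close to the true (0.5, 0.3, 0.2)
```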
5: Ratio transformation
One way to remove a strong correlation between the X's is to divide the whole model through by one of the explanatory variables.
E.g.: Yi = β0 + β1X1i + β2X2i + ei  --------- (A)
Where Y = consumption, X1 = population, X2 = net income.
Dividing equation (A) by X1:
Yi/X1i = β1 + β0(1/X1i) + β2(X2i/X1i) + ei/X1i
which can be rewritten as
Yi* = β1 + β0X1i* + β2X2i* + ei*  ---------- (B)
Where Yi* = per-person consumption, X1i* = reciprocal of population, X2i* = per-capita income.
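The transformation is a one-liner in practice. A sketch with made-up population/income figures (the numbers are arbitrary) showing that the level variables are strongly correlated while the per-capita versions are not:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
pop = rng.uniform(1e5, 1e7, size=n)                  # X1 = population
income = pop * rng.uniform(500, 5000, size=n)        # X2 = net income, scales with pop
cons = 0.8 * income + 50 * pop + rng.normal(scale=1e6, size=n)  # Y = consumption

# Divide every term through by X1 to work in per-capita terms:
y_star = cons / pop        # per-person consumption
x1_star = 1.0 / pop        # reciprocal of population
x2_star = income / pop     # per-capita income

before = np.corrcoef(pop, income)[0, 1]
after = np.corrcoef(x1_star, x2_star)[0, 1]
print(round(before, 2), round(after, 2))   # correlation is much weaker after
```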
6: Taking differences
By taking differences, the problem of multicollinearity can be removed.
E.g. let Yt = β0 + β1X1t + β2X2t + et  --------- (A)
The same model lagged one period is
Yt-1 = β0 + β1X1,t-1 + β2X2,t-1 + et-1  -------- (B)
Subtracting (B) from (A), the intercept cancels:
Yt - Yt-1 = β1(X1t - X1,t-1) + β2(X2t - X2,t-1) + (et - et-1)
i.e. ΔYt = β1ΔX1t + β2ΔX2t + Δet
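Differencing helps most when the collinearity comes from a shared trend; a sketch with two synthetic trending series:

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(200)
x1 = t + rng.normal(scale=3, size=200)       # two regressors sharing a common trend
x2 = 2 * t + rng.normal(scale=3, size=200)

before = np.corrcoef(x1, x2)[0, 1]
after = np.corrcoef(np.diff(x1), np.diff(x2))[0, 1]
print(round(before, 3), round(after, 3))  # near 1 in levels, near 0 in differences
```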
Collinearity Diagnosticsa

                                                Variance Proportions
Model  Dimension  Eigenvalue  Condition Index   (Constant)  Distributive  Procedural  Interactive  Informational
                                                            justice       justice     justice      justice
1      1          4.944        1.000            .00         .00           .00         .00
       2           .022       14.858            .44         .04           .10         .03
       3           .014       18.488            .12         .54           .04         .14
       4           .014       18.518            .00         .00           .04         .20
       5           .005       32.083            .44         .42           .83         .63
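These diagnostics come from the eigenvalues of the scaled cross-product matrix of the design matrix; the condition index of each dimension is the square root of the largest eigenvalue divided by that dimension's eigenvalue. A minimal numpy sketch on synthetic data (the study's raw scores are not shown):

```python
import numpy as np

def condition_indices(X):
    """Eigenvalues and condition indices of a design matrix: scale each
    column (intercept included) to unit length, take the eigenvalues of
    X'X, and report sqrt(largest eigenvalue / each eigenvalue)."""
    Xs = X / np.linalg.norm(X, axis=0)
    eig = np.linalg.eigvalsh(Xs.T @ Xs)[::-1]   # descending order
    return eig, np.sqrt(eig[0] / eig)

rng = np.random.default_rng(6)
n = 50
a = rng.normal(size=n)
b = 0.95 * a + 0.1 * rng.normal(size=n)         # a nearly collinear pair
X = np.column_stack([np.ones(n), a, b, rng.normal(size=n)])
eig, ci = condition_indices(X)
print(np.round(ci, 1))   # a large last index flags the collinear dimension
```

By common rules of thumb, condition indices above roughly 15 suggest possible collinearity and above 30 a serious problem, which matches the largest index of 32.083 in the table above.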