
Analysis of the Presence of Multicollinearity (QAM-II)

Submitted to Prof. Abhijit Bhattacharya

SUBMITTED BY GROUP 5
December 13, 2011

Amey Rambole (ABM08009), Abhijit Talukdar (PGP27134), Manish Pushkar (PGP27161), Sumeet Choudhary (PGP27185),
Nitin Rawat (PGP26104), Akash Joshi (PGP27136), Suraj Somashekhar (PGP27187)

Problem Statement
The owner of Pizza Corner, Bangalore would like to build a regression model consisting of six well-defined explanatory variables to predict the sales of pizzas. The six variables are:

X1: Number of delivery boys
X2: Cost (in Rupees) of advertisements (000s)
X3: Number of outlets
X4: Varieties of pizzas
X5: Competitors' activities index
X6: Number of existing customers (000s)

Sales data for the past fifteen months on the variables listed above is given below:

SALES DATA FOR PIZZA
Month  Sales  X1  X2  X3  X4  X5  X6
1      81     15  20  35  17  4   70
2      23     10  12  10  13  4   43
3      18     7   11  14  14  3   31
4      8      2   6   9   13  3   10
5      16     4   10  11  12  4   17
6      4      1   5   6   12  5   8
7      29     4   14  15  15  2   39
8      22     7   12  16  16  3   40
9      15     5   10  18  15  4   30
10     6      3   5   8   13  2   16
11     45     13  17  20  14  2   30
12     11     2   9   10  12  3   20
13     20     5   12  15  12  3   25
14     60     12  18  30  15  4   50
15     5      1   5   6   12  5   20

Initial Regression Model


Y (Sales) was regressed on X1, X2, X3, X4, X5 and X6 in SPSS, and the following model was developed:

Ŷ(Sales) = 6.372 + 0.919 X1 + 0.699 X2 + 1.620 X3 - 1.978 X4 + 0.067 X5 + 0.242 X6
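As a cross-check, the same fit can be reproduced outside SPSS. Below is a minimal sketch in Python using the statsmodels OLS routine and the data table above; the column names X1-X6 follow the report, while the DataFrame layout is our own.

```python
# Minimal sketch: refitting the initial model with statsmodels OLS,
# using the 15-month data listed above.
import pandas as pd
import statsmodels.api as sm

data = pd.DataFrame({
    "Sales": [81, 23, 18, 8, 16, 4, 29, 22, 15, 6, 45, 11, 20, 60, 5],
    "X1":    [15, 10, 7, 2, 4, 1, 4, 7, 5, 3, 13, 2, 5, 12, 1],
    "X2":    [20, 12, 11, 6, 10, 5, 14, 12, 10, 5, 17, 9, 12, 18, 5],
    "X3":    [35, 10, 14, 9, 11, 6, 15, 16, 18, 8, 20, 10, 15, 30, 6],
    "X4":    [17, 13, 14, 13, 12, 12, 15, 16, 15, 13, 14, 12, 12, 15, 12],
    "X5":    [4, 4, 3, 3, 4, 5, 2, 3, 4, 2, 2, 3, 3, 4, 5],
    "X6":    [70, 43, 31, 10, 17, 8, 39, 40, 30, 16, 30, 20, 25, 50, 20],
})

predictors = ["X1", "X2", "X3", "X4", "X5", "X6"]
X = sm.add_constant(data[predictors])      # adds the intercept column
model = sm.OLS(data["Sales"], X).fit()
print(model.summary())                     # coefficients should match the SPSS output below
```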

SPSS Regression Output for initial model

Descriptive Statistics
Variable  Mean   Std. Deviation  N
Sales     24.20  21.913          15
X1        6.07   4.511           15
X2        11.07  4.758           15
X3        14.87  8.340           15
X4        13.67  1.633           15
X5        3.40   .986            15
X6        29.93  16.529          15

This table shows the mean, standard deviation and sample size (15 months) for the dependent and independent variables.

Variables Entered/Removed(b)
Model  Variables Entered          Variables Removed  Method
1      X6, X5, X4, X1, X3, X2(a)  .                  Enter

a. All requested variables entered.
b. Dependent Variable: Sales

This table tells you the method SPSS used to run the regression. "Enter" means that all independent variables were entered together in the usual fashion; none were removed.

Correlations (Pearson)
       Sales  X1     X2     X3     X4     X5     X6
Sales  1.000  .902   .934   .953   .725   -.040  .880
X1     .902   1.000  .905   .845   .672   -.103  .841
X2     .934   .905   1.000  .904   .702   -.189  .867
X3     .953   .845   .904   1.000  .794   -.036  .856
X4     .725   .672   .702   .794   1.000  -.178  .819
X5     -.040  -.103  -.189  -.036  -.178  1.000  .006
X6     .880   .841   .867   .856   .819   .006   1.000

This table provides pairwise correlation coefficients between the dependent and independent variables.

Model Summary(b)
Model  R      R Square  Adjusted R Square  Std. Error of the Estimate  Durbin-Watson
1      .976a  .953      .918               6.260                       1.745

a. Predictors: (Constant), X6, X5, X4, X1, X3, X2
b. Dependent Variable: Sales

R is the correlation between the observed and predicted values of the dependent variable.

R-Square - 95.3% of the variation is explained by the model. This is the proportion of variance in the dependent variable (Sales) that can be explained by the independent variables (X1 through X6). It is an overall measure of the strength of association and does not reflect the extent to which any particular independent variable is associated with the dependent variable.

Adjusted R-Square - 91.8% of the variation in Sales is explained by the model after adjusting for the number of independent variables and the sample size.

Std. Error of the Estimate - This is also referred to as the root mean squared error. It is the standard deviation of the error term, and equals the square root of the Mean Square for the Residuals in the ANOVA table.

ANOVA(b)
Model          Sum of Squares  df  Mean Square  F       Sig.
1  Regression  6408.864        6   1068.144     27.254  .000a
   Residual    313.536         8   39.192
   Total       6722.400        14

a. Predictors: (Constant), X6, X5, X4, X1, X3, X2
b. Dependent Variable: Sales

Sum of Squares - These are the sums of squares associated with the three sources of variance: Total, Regression and Residual. The total variance is partitioned into the variance that can be explained by the independent variables (Regression) and the variance that cannot (Residual).

df - These are the degrees of freedom associated with the sources of variance. The total variance has N - 1 = 14 degrees of freedom. The Regression degrees of freedom correspond to the number of coefficients estimated minus 1: including the intercept there are 7 coefficients, so the model has 7 - 1 = 6 degrees of freedom. The Residual degrees of freedom are the total df minus the model df: 14 - 6 = 8.

Mean Square - These are the Mean Squares: the Sums of Squares divided by their respective df.

F and Sig. - This is the F-statistic and the p-value associated with it. The F-statistic is the Mean Square (Regression) divided by the Mean Square (Residual): 1068.144/39.192 = 27.254. The p-value is compared to some alpha level in testing the null hypothesis that all of the model coefficients are 0:

H0: β1 = β2 = β3 = β4 = β5 = β6 = 0
H1: Not all βj are 0

In this model we reject H0 and conclude that not all βj are 0, since the p-value is approximately 0: Sales depends significantly on at least some of the predictors.
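For illustration, the overall F-test can be reproduced by hand from the ANOVA numbers above; a small sketch using SciPy for the F distribution:

```python
# Sketch: the overall F-test computed by hand from the ANOVA table above.
from scipy import stats

F = 1068.144 / 39.192       # MS(Regression) / MS(Residual)
p = stats.f.sf(F, 6, 8)     # upper-tail probability; df_model = 6, df_residual = 8
print(F, p)                 # F ~ 27.25, p well below 0.05 -> reject H0
```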

Coefficients(a)
Model          B       Std. Error  Beta   t      Sig.  95% CI Lower  95% CI Upper  Tolerance  VIF
1  (Constant)  6.372   32.586             .196   .850  -68.773       81.516
   X1          .919    .910        .189   1.010  .342  -1.179        3.017         .166       6.018
   X2          .699    1.303       .152   .537   .606  -2.306        3.704         .073       13.733
   X3          1.620   .618        .617   2.621  .031  .195          3.046         .105       9.500
   X4          -1.978  2.310       -.147  -.856  .417  -7.305        3.349         .197       5.083
   X5          .067    2.211       .003   .030   .977  -5.032        5.165         .589       1.696
   X6          .242    .299        .182   .808   .442  -.448         .931          .115       8.719

a. Dependent Variable: Sales

B - These are the values for the regression equation for predicting the dependent variable from the independent variables. The interpretations are:

o Holding other variables constant, if the number of delivery boys increases by 1, Sales increases by 0.919.
o Holding other variables constant, if the cost of advertisements (000s of Rupees) increases by 1, Sales increases by 0.699.
o Holding other variables constant, if the number of outlets increases by 1, Sales increases by 1.620.
o Holding other variables constant, if the varieties of pizzas increase by 1, Sales decreases by 1.978.
o Holding other variables constant, if the competitors' activities index increases by 1, Sales increases by 0.067.
o Holding other variables constant, if the number of existing customers (000s) increases by 1, Sales increases by 0.242.

Std. Error - These are the standard errors associated with the coefficients.

Beta - These are the standardized coefficients. By standardizing the variables before running the regression, all of the variables are put on the same scale, so the magnitudes of the coefficients can be compared to see which one has more of an effect. These coefficients are interpreted the same way as the B values, except that the standardized constant is 0. The regressor whose standardized coefficient has the largest magnitude contributes the most.

The variables are standardized as:

$$\text{Standardized } Y_i = \frac{Y_i - \bar{Y}}{s_Y}, \qquad \text{Standardized } X_{1i} = \frac{X_{1i} - \bar{X}_1}{s_{X_1}}, \qquad \text{Standardized } X_{2i} = \frac{X_{2i} - \bar{X}_2}{s_{X_2}}$$
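Equivalently, the Beta values can be recovered from the unstandardized B values as Beta_j = B_j * s_Xj / s_Y. A brief sketch, reusing the `data`, `model` and `predictors` objects from the earlier snippet (an assumption of this example):

```python
# Sketch: standardized (Beta) coefficients from the unstandardized B values,
# using the sample standard deviations from the descriptive statistics.
for name in predictors:
    beta = model.params[name] * data[name].std() / data["Sales"].std()
    print(f"{name}: Beta = {beta:.3f}")   # X3 should be the largest, ~0.617
```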

t and Sig. - These are the t-statistics and their associated 2-tailed p-values, used in testing whether a given coefficient is significantly different from zero. Using an alpha of 0.05:

o The coefficient for X1 (0.919) is not significantly related to the dependent variable because its p-value is 0.342, which is greater than 0.05.
o The coefficient for X2 (0.699) is not significantly related to the dependent variable because its p-value is 0.606, which is greater than 0.05.
o The coefficient for X3 (1.620) is significantly related to the dependent variable because its p-value is 0.031, which is less than 0.05.
o The coefficient for X4 (-1.978) is not significantly related to the dependent variable because its p-value is 0.417, which is greater than 0.05.
o The coefficient for X5 (0.067) is not significantly related to the dependent variable because its p-value is 0.977, which is greater than 0.05.
o The coefficient for X6 (0.242) is not significantly related to the dependent variable because its p-value is 0.442, which is greater than 0.05.

Tolerance - The tolerance of a variable is defined as 1 minus the squared multiple correlation of that variable with all the other independent variables in the regression equation. The smaller the tolerance of a variable, the more redundant its contribution to the regression (i.e., it is redundant with the contribution of the other independent variables). A small tolerance indicates multicollinearity.

VIF - This is equal to 1/Tolerance. A high VIF (more than 10) indicates multicollinearity.
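These definitions can be checked directly: statsmodels exposes a VIF helper that runs the auxiliary regressions internally. A sketch, again reusing `data` and `predictors` from the first snippet:

```python
# Sketch: VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing X_j
# on all the other predictors. statsmodels computes this internally.
from statsmodels.stats.outliers_influence import variance_inflation_factor

X = sm.add_constant(data[predictors])
for i, name in enumerate(X.columns):
    if name != "const":                   # the intercept's VIF is not meaningful here
        print(name, round(variance_inflation_factor(X.values, i), 3))
# X2 should come out near 13.7, i.e. above the usual cutoff of 10
```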

Collinearity Diagnostics(a)
                                                 Variance Proportions
Model  Dimension  Eigenvalue  Condition Index  (Constant)  X1   X2   X3   X4   X5   X6
1      1          6.470       1.000            .00         .00  .00  .00  .00  .00  .00
       2          .388        4.082            .00         .04  .00  .01  .00  .03  .00
       3          .052        11.145           .01         .04  .02  .00  .01  .42  .03
       4          .044        12.101           .00         .64  .00  .12  .00  .00  .13
       5          .032        14.166           .00         .00  .00  .34  .00  .03  .38
       6          .012        23.582           .00         .27  .63  .14  .04  .09  .00
       7          .001        74.026           .99         .00  .35  .39  .95  .42  .46

a. Dependent Variable: Sales

A small eigenvalue (λ) indicates the presence of multicollinearity. The condition index of dimension i is √(λmax / λi); this square-root form is what reproduces the values in the table. A high condition index indicates multicollinearity.
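The diagnostics above can be reproduced numerically. The sketch below assumes, as SPSS documents for this table, that the columns of the design matrix (including the constant) are scaled to unit length before taking the eigenvalues of X'X:

```python
# Sketch: eigenvalues and condition indices of the unit-length-scaled X'X.
import numpy as np

X_aug = np.column_stack([np.ones(len(data))] + [data[c] for c in predictors])
X_scaled = X_aug / np.linalg.norm(X_aug, axis=0)           # unit-length columns
eigvals = np.linalg.eigvalsh(X_scaled.T @ X_scaled)[::-1]  # descending order
print(eigvals)                                # smallest should be near 0.001
print(np.sqrt(eigvals[0] / eigvals))          # largest condition index near 74
```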

1. Check for Multicollinearity


Multicollinearity is a statistical phenomenon in which two or more predictor variables in a multiple regression model are highly correlated. In this situation the coefficient estimates may change erratically in response to small changes in the model or the data. Multicollinearity does not reduce the predictive power or reliability of the model as a whole, at least within the sample data themselves; it only affects calculations regarding individual predictors. That is, a multiple regression model with correlated predictors can indicate how well the entire bundle of predictors predicts the outcome variable, but it may not give valid results about any individual predictor, or about which predictors are redundant with respect to others. The primary concern is that as the degree of multicollinearity increases, the regression model estimates of the coefficients become unstable and the standard errors for the coefficients can get wildly inflated.

VIF Test: Since the VIF of variable X2 (13.733) in the Coefficients table is greater than 10, collinearity is present.

Eigenvalue Test: The very small eigenvalue (.001) of the 7th dimension and its very high corresponding condition index (74.026) in the Collinearity Diagnostics table indicate the presence of collinearity.

Pearson Correlation:

Correlations
       Sales  X1     X2     X3     X4     X5     X6
Sales  1.000
X1     .902   1.000
X2     .934   .905   1.000
X3     .953   .845   .904   1.000
X4     .725   .672   .702   .794   1.000
X5     -.040  -.103  -.189  -.036  -.178  1.000
X6     .880   .841   .867   .856   .819   .006   1.000

If the absolute value of a Pearson correlation is greater than 0.9, collinearity is very likely to exist. Here the predictor pairs (X1, X2) with r = .905 and (X2, X3) with r = .904 exceed this threshold, indicating multicollinearity.
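A quick way to flag such pairs programmatically (a sketch reusing the data frame from the first snippet):

```python
# Sketch: flag predictor pairs whose |Pearson r| exceeds 0.9.
corr = data[predictors].corr()
for i, a in enumerate(predictors):
    for b in predictors[i + 1:]:
        if abs(corr.loc[a, b]) > 0.9:
            print(a, b, round(corr.loc[a, b], 3))   # expect (X1, X2) and (X2, X3)
```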

Statistical Test: From the correlations table we can see that the Pearson correlations of X1 and X2 with Sales are .902 and .934 respectively, yet the corresponding p-values of X1 and X2 in the Coefficients table are .342 and .606. Even at a 10% level of significance, these high p-values do not allow rejection of H0: βj = 0, so by the t-test X1 and X2 are not significantly related to the dependent variable. The Pearson correlations and the t-tests therefore present contradictory information, which points to the presence of collinearity.

Variable        X1     X2    X3     X4     X5    X6
t statistic     1.010  .537  2.621  -.856  .030  .808
Sig. (p-value)  .342   .606  .031   .417   .977  .442

2. SPSS Stepwise Regression equation


Ŷ(Sales) = -11.817 + 1.640 X1 + 1.753 X3

3. SPSS Stepwise Regression Output

Descriptive Statistics
Variable  Mean     Std. Deviation  N
Sales     24.2000  21.91281        15
X1        6.0667   4.51136         15
X2        11.0667  4.75795         15
X3        14.8667  8.33981         15
X4        13.6667  1.63299         15
X5        3.4000   .98561          15
X6        29.9333  16.52905        15

This table shows the mean, standard deviation and sample size (15 months) for the dependent and independent variables.

Variables Entered/Removed(a)
Model  Variables Entered  Variables Removed  Method
1      X3                 .                  Stepwise (Criteria: Probability-of-F-to-enter <= .050, Probability-of-F-to-remove >= .100).
2      X1                 .                  Stepwise (Criteria: Probability-of-F-to-enter <= .050, Probability-of-F-to-remove >= .100).

a. Dependent Variable: Sales

Here we can see that in the first iteration X3 is entered because it has the highest correlation with Sales (see the correlation table).


Variables are only added or removed if the Sig. F Change value is significant. For this, SPSS runs regressions of Y on (X3, X1), Y on (X3, X2), and so on up to Y on (X3, X6). For each candidate X_I it computes the partial F statistic over all (X3, X_I) combinations:

$$F(1, n-3) = \frac{SSR(X_3, X_I) - SSR(X_3)}{MSE(X_3, X_I)}$$

It then enters the X_I with the maximum F ratio.

o This value can be read off the Excluded Variables table, which shows the partial correlation between each candidate for entry and the dependent variable.
o Partial correlation is a measure of the relationship of the dependent variable to an independent variable, where the variance explained by previously entered independent variables has been removed from both.
o From the Excluded Variables table we can see that X1 has the maximum partial correlation (.594) and the minimum p-value (.025). So X1 is the next variable to be entered, subject to whether the resulting model, with Sales as the dependent variable and (X1, X3) as independent variables, improves R.

The process of adding variables stops when all of the available variables have been included, or when it is not possible to make a statistically significant improvement in R using any of the variables not yet included. A minimal sketch of this forward-stepwise loop is given below.
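The sketch assumes the same entry/removal criteria SPSS reports (p-to-enter <= .05, p-to-remove >= .10); the p-values here come from the coefficient t-tests rather than the F-change statistic, which is equivalent when a single variable is added.

```python
# Sketch: forward-stepwise selection with entry/removal p-value criteria.
import statsmodels.api as sm

def stepwise(data, target, candidates, p_enter=0.05, p_remove=0.10):
    selected = []
    while True:
        changed = False
        # Entry: fit a trial model for each remaining candidate and take
        # the one with the smallest coefficient p-value, if it qualifies.
        pvals = {}
        for c in [v for v in candidates if v not in selected]:
            X = sm.add_constant(data[selected + [c]])
            pvals[c] = sm.OLS(data[target], X).fit().pvalues[c]
        if pvals:
            best = min(pvals, key=pvals.get)
            if pvals[best] <= p_enter:
                selected.append(best)
                changed = True
        # Removal: drop the worst entered variable if it is now insignificant.
        if selected:
            fit = sm.OLS(data[target], sm.add_constant(data[selected])).fit()
            worst = fit.pvalues[selected].idxmax()
            if fit.pvalues[worst] >= p_remove:
                selected.remove(worst)
                changed = True
        if not changed:
            return selected

print(stepwise(data, "Sales", ["X1", "X2", "X3", "X4", "X5", "X6"]))
# Expected result on this data: ['X3', 'X1']
```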


Correlations (Pearson)
       Sales  X1     X2     X3     X4     X5     X6
Sales  1.000  .902   .934   .953   .725   -.040  .880
X1     .902   1.000  .905   .845   .672   -.103  .841
X2     .934   .905   1.000  .904   .702   -.189  .867
X3     .953   .845   .904   1.000  .794   -.036  .856
X4     .725   .672   .702   .794   1.000  -.178  .819
X5     -.040  -.103  -.189  -.036  -.178  1.000  .006
X6     .880   .841   .867   .856   .819   .006   1.000

This table provides pairwise correlation coefficients between the dependent and independent variables. In stepwise regression it is used to find the first variable to enter: the one with the maximum correlation coefficient with Sales, which in this case is X3.

Model Summary(c)
Model  R      R Square  Adjusted R Square  Std. Error of the Estimate  Durbin-Watson
1      .953a  .908      .900               6.91277
2      .970b  .940      .930               5.78925                     1.477

a. Predictors: (Constant), X3
b. Predictors: (Constant), X3, X1
c. Dependent Variable: Sales


As we can see, the inclusion of X1 in iteration 2 increased the R-square value, so the model is now able to explain 94% of the variability in Sales.

The other parameters are explained in the initial-model output above.

ANOVA(c)
Model          Sum of Squares  df  Mean Square  F        Sig.
1  Regression  6101.176        1   6101.176     127.676  .000a
   Residual    621.224         13  47.786
   Total       6722.400        14
2  Regression  6320.215        2   3160.108     94.288   .000b
   Residual    402.185         12  33.515
   Total       6722.400        14

a. Predictors: (Constant), X3
b. Predictors: (Constant), X3, X1
c. Dependent Variable: Sales

F and Sig. - This is the F-statistic and the p-value associated with it. The p-value is compared to some alpha level in testing the null hypothesis that all of the model coefficients are 0:

H0: all βj = 0
H1: Not all βj are 0

In both models we reject H0 and conclude that not all βj are 0, since the p-value is approximately 0 in both cases. Sales depends significantly on X3 in model 1, and on X3 and/or X1 in model 2; the Coefficients table below shows which. The other parameters are explained in the initial-model output above.


Coefficients(a)
Model          B        Std. Error  Beta  t       Sig.  95% CI Lower  95% CI Upper  Tolerance  VIF
1  (Constant)  -13.013  3.746             -3.474  .004  -21.106       -4.921
   X3          2.503    .222        .953  11.299  .000  2.025         2.982         1.000      1.000
2  (Constant)  -11.817  3.172             -3.726  .003  -18.728       -4.906
   X3          1.753    .347        .667  5.053   .000  .997          2.510         .286       3.498
   X1          1.640    .641        .338  2.556   .025  .242          3.038         .286       3.498

a. Dependent Variable: Sales

t-statistics and Sig. for Model 2:

o The coefficient for X1 (1.640) is significantly related to the dependent variable because its p-value is 0.025, which is less than 0.05.
o The coefficient for X3 (1.753) is significantly related to the dependent variable because its p-value is 0.000, which is less than 0.05.

B values for Model 2:

o Holding other variables constant, if the number of delivery boys increases by 1, Sales increases by 1.640.
o Holding other variables constant, if the number of outlets increases by 1, Sales increases by 1.753.

Since the VIF values of X1 and X3 (3.498 each) in the Coefficients table are less than 10, we can safely say there is no multicollinearity in the final model. The other parameters are explained in the initial-model output above.


Excluded Variables(c)
                                           Collinearity Statistics
Model   Beta In  t      Sig.  Partial Corr.  Tolerance  VIF    Min. Tolerance
1  X1   .338a    2.556  .025  .594           .286       3.498  .286
   X2   .400a    2.361  .036  .563           .183       5.465  .183
   X4   -.085a   -.600  .560  -.171          .370       2.703  .370
   X5   -.006a   -.064  .950  -.018          .999       1.001  .999
   X6   .241a    1.560  .145  .411           .267       3.740  .267
2  X2   .226b    1.085  .301  .311           .113       8.820  .113
   X4   -.087b   -.732  .480  -.215          .370       2.703  .193
   X5   .019b    .257   .802  .077           .981       1.020  .281
   X6   .113b    .736   .477  .217           .219       4.569  .214

a. Predictors in the Model: (Constant), X3
b. Predictors in the Model: (Constant), X3, X1

t-test and Sig.:

o Model 1: X1 and X2 are both eligible to enter after iteration 1, as both have p-values less than .05. X1 has the higher partial correlation, so it enters in the second iteration. X4, X5 and X6 have p-values greater than .05, so H0: βj = 0 is not rejected; Sales does not depend significantly on X4, X5 or X6.
o Model 2: X2, X4, X5 and X6 have p-values greater than .05, so H0: βj = 0 is not rejected; Sales does not depend significantly on X2, X4, X5 or X6.

So the final model retains only X3 and X1 as significant predictors.

Collinearity Diagnostics(a)
                                                 Variance Proportions
Model  Dimension  Eigenvalue  Condition Index  (Constant)  X3   X1
1      1          1.879       1.000            .06         .06
       2          .121        3.944            .94         .94
2      1          2.761       1.000            .03         .01  .01
       2          .198        3.732            .71         .01  .16
       3          .040        8.273            .26         .98  .83

a. Dependent Variable: Sales

In model 2 the eigenvalues are not close to 0 and the condition indices are below 10, so there is little evidence of multicollinearity.

