
Econometrics

Multicollinearity
Faroza Shamsi, MA Economics, Semester II, ECO/12/02

Introduction
Assumption 8 of the classical linear regression model is that there is no multicollinearity among the regressors included in the regression model. Multicollinearity occurs when two or more independent variables in a model are highly correlated with each other — that is, when assumption 8 is violated.

Meaning of multicollinearity
Multicollinearity denotes the presence of a linear relationship among the explanatory variables of the model. When the explanatory variables are perfectly correlated, the method of least squares breaks down. On the other hand, variables are said to be orthogonal if they are uncorrelated. For example, consider the cubic cost function:

Yi = β0 + β1·Xi + β2·Xi² + β3·Xi³ + ui

where Y = total cost and X = output. The variables Xi² and Xi³ are obviously functionally related to Xi (output), but the relationship is non-linear. Therefore, in such a model the assumption of no multicollinearity is not violated. As a matter of fact, to depict the U-shaped cost curves of economic theory the model has to be in this form.
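This point can be checked numerically. The sketch below (hypothetical output data, coefficient names b0–b3 are illustrative) shows that Xi, Xi², and Xi³ are highly correlated, yet no column of the design matrix is an exact linear combination of the others, so least squares remains computable:

```python
import numpy as np

# Hypothetical output levels for the cubic cost model
# Yi = b0 + b1*Xi + b2*Xi^2 + b3*Xi^3 + ui
X = np.arange(1.0, 11.0)  # output: 1..10
design = np.column_stack([np.ones_like(X), X, X**2, X**3])

# X and X^2 are highly correlated, but the relationship is non-linear,
# so the correlation is strictly below 1:
corr = np.corrcoef(X, X**2)[0, 1]
print(round(corr, 3))

# The design matrix has full column rank, so X'X is invertible
# and the OLS estimator exists:
rank = np.linalg.matrix_rank(design)
print(rank)  # 4 = full column rank
```

High (but imperfect) correlation among the polynomial terms means the estimates may be imprecise, yet the no-multicollinearity assumption — which rules out only exact linear dependence — is satisfied.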

Nature of multicollinearity
The term multicollinearity is due to Ragnar Frisch. Originally it meant the existence of a perfect or exact linear relationship among some or all explanatory variables of a regression model. The nature of multicollinearity can be well depicted by the Ballantine (Venn) diagrams.

[Figure 3c: Ballantine diagram — the circles for X2 and X3 do not overlap, depicting no multicollinearity]

Sources of multicollinearity
- The data collection method employed
- Constraints on the model or the population being sampled
- Model specification
- An overdetermined model
- Regressors that share a common trend, i.e. they increase or decrease over time

Consequences of multicollinearity
- Least squares estimators are indeterminate (under perfect multicollinearity), and the variances and covariances of the estimators become infinitely large.
- Because of large standard errors, the probability of accepting a false hypothesis increases.
- OLS estimates and their standard errors become very sensitive to the slightest change in the sample data.
- If multicollinearity is high, one may obtain a high R² even though none, or very few, of the regression coefficients are statistically significant.
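The variance inflation caused by near-collinearity can be seen directly from the diagonal of (X'X)⁻¹, since Var(b1) = σ²·[(X'X)⁻¹]₁₁. A minimal sketch on hypothetical data, comparing a near-collinear regressor against an unrelated one:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2_collinear = 2 * x1 + rng.normal(scale=0.01, size=n)  # nearly a multiple of x1
x2_clean = rng.normal(size=n)                           # unrelated regressor

def slope_variance_factor(x1, x2):
    """Diagonal entry of (X'X)^-1 for the x1 slope: the estimator's
    variance is sigma^2 times this factor, so a huge factor means a
    huge standard error."""
    X = np.column_stack([np.ones(len(x1)), x1, x2])
    return np.linalg.inv(X.T @ X)[1, 1]

f_coll = slope_variance_factor(x1, x2_collinear)
f_clean = slope_variance_factor(x1, x2_clean)

# The ratio is orders of magnitude above 1: collinearity blows up
# the variance of the slope estimator.
print(f_coll / f_clean)
```

This is exactly why t ratios collapse while R² stays high: the individual slopes are measured with enormous standard errors even though the regressors jointly explain Y well.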

Tests for detecting multicollinearity


Farrar–Glauber test, which includes:
- a chi-square test for the presence and severity of multicollinearity in a function,
- an F test for the pattern of multicollinearity, and
- a t test for the location of multicollinearity.
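The chi-square part of the Farrar–Glauber test can be sketched as follows (hypothetical data). The statistic is χ² = −[n − 1 − (2k + 5)/6]·ln(det R), where R is the correlation matrix of the k regressors, with k(k − 1)/2 degrees of freedom; if the regressors were orthogonal, det R = 1 and χ² = 0, while det R near 0 signals severe multicollinearity:

```python
import numpy as np
from math import log

rng = np.random.default_rng(1)
n, k = 100, 3
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)  # strongly related to x1
x3 = rng.normal(size=n)                  # unrelated

# Correlation matrix of the regressors (rows = observations)
R = np.corrcoef(np.column_stack([x1, x2, x3]), rowvar=False)
det_R = np.linalg.det(R)

# Farrar-Glauber chi-square statistic and its degrees of freedom
chi2 = -(n - 1 - (2 * k + 5) / 6) * log(det_R)
df = k * (k - 1) // 2
print(round(chi2, 1), df)
```

A χ² value far above the critical value for k(k − 1)/2 degrees of freedom (here 3) leads to rejecting the hypothesis that the regressors are orthogonal.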

Corrective measures
- Increase the size of the sample.
- Introduce an additional equation in the model.
- Use extraneous information:
  a) the method of using a priori information,
  b) transformation of variables,
  c) the method of pooling cross-section and time-series data.
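The transformation-of-variables remedy can be illustrated with first differencing (a common choice for trending series; the data here are hypothetical). Two regressors that share a time trend are almost perfectly correlated in levels, but differencing removes the common trend and the correlation largely disappears:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(100.0)
x1 = 0.5 * t + rng.normal(size=100)  # trending series
x2 = 0.3 * t + rng.normal(size=100)  # shares the same time trend

# Correlation in levels vs. correlation after first differencing
corr_levels = np.corrcoef(x1, x2)[0, 1]
corr_diffs = np.corrcoef(np.diff(x1), np.diff(x2))[0, 1]

print(round(corr_levels, 2))       # close to 1
print(round(abs(corr_diffs), 2))   # much smaller
```

The differenced regression estimates the same slope parameters while avoiding the trend-induced collinearity, though differencing changes the error structure and discards the level information, so it is not a free lunch.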

Thank you
