
The vector autoregressive model (VAR) is actually simpler to estimate than the VEC model.

It is used when there is no cointegration among the variables, and it is estimated using time series that have been transformed to their stationary values. In the example from your book, we have macroeconomic data for the U.S. economy over the period 1960:1 to 2009:4: the log of real personal disposable income (denoted Y) and the log of real personal consumption expenditure (denoted C), found in the fred.dta dataset. As in the previous example, the first step is to determine whether the variables are stationary. If they are not, difference them, checking to make sure that the differences are stationary (i.e., that the levels are integrated). Next, test for cointegration. If the series are cointegrated, estimate the VEC model; if not, use the differences and lagged differences to estimate a VAR model. Load the data. In this exercise we'll be using the fred.dta data.
use fred, clear
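As a roadmap, the workflow just described maps onto the commands developed in the rest of this section; the outline below is only a sketch, and each step is explained in detail later:
* Step 1: plot the series and run ADF unit-root tests on the levels
*         (tsline, dfuller)
* Step 2: test for cointegration using the residuals of the cointegrating
*         regression (reg lc ly, predict ehat, then an ADF-type regression on ehat)
* Step 3: if cointegrated, estimate a VEC model; if not, estimate a VAR
*         in first differences (varbasic D.lc D.ly)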

Estimating a VAR

The data are quarterly and begin in 1960q1 and extend to 2009q4. Just as we did in the example above, generate a sequence of quarterly dates and declare the data to be time series:
gen date = q(1960q1) + _n - 1
format %tq date
tsset date
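To confirm that the dates were constructed correctly, you can list the first few observations (an optional check, not part of the book's code):
* optional check: the first observation should be dated 1960q1
list date lc ly in 1/4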

The first step is to plot the series in order to identify whether constants or trends should be included in the tests of nonstationarity. Both the levels and differences are plotted.
tsline lc ly, legend(lab(1 "ln(RPCE)") lab(2 "ln(RPDI)")) ///
    name(l1, replace)
tsline D.lc D.ly, legend(lab(1 "ln(RPCE)") lab(2 "ln(RPDI)")) ///
    name(d1, replace)
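Because each graph was saved under a name with the name() option, the two plots can optionally be combined into a single figure; a sketch using graph combine:
* optional: stack the level and difference plots in one figure
graph combine l1 d1, cols(1)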
[Figure: time-series plot of ln(RPCE) and ln(RPDI) in levels, 1960q1 to 2010q1]

The levels series appear to be trending together. The differences show no obvious trend, but the mean of the series appears to be greater than zero, suggesting that a constant be included in the ADF regressions.
[Figure: time-series plot of the first differences of ln(RPCE) and ln(RPDI), 1960q1 to 2010q1]

The other decision that needs to be made is the number of lagged differences to include in the augmented Dickey-Fuller regressions. The principle to follow is to include just enough so that the residuals of the ADF regression are not autocorrelated. So, start out with a basic regression that contains no lags, estimate the DF regression, then use the LM test discussed earlier to determine whether the residuals are autocorrelated. Add enough lags to eliminate the autocorrelation among residuals. If this strategy is pursued in Stata, then the ADF regressions will have to be explicitly estimated; the estat bgodfrey command will not be based on the proper regression if issued after dfuller. The regressions for the ADF tests are
qui reg D.lc L.lc L(1/1).D.lc
estat bgodfrey, lags(1 2 3)
qui reg D.lc L.lc L(1/2).D.lc
estat bgodfrey, lags(1 2 3)
qui reg D.lc L.lc L(1/3).D.lc
estat bgodfrey, lags(1 2 3)
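These three regressions could also be run more compactly in a forvalues loop, the technique used for the income series later in this section; a sketch, equivalent to the commands above since L(0/`p').D.lc makes D.lc the dependent variable:
forvalues p = 1/3 {
    qui reg L(0/`p').D.lc L.lc
    di "Lags =" `p'
    estat bgodfrey, lags(1 2 3)
}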

The test results for the last two of these regressions appear below.

. estat bgodfrey, lags(1 2 3)

Breusch-Godfrey LM test for autocorrelation
---------------------------------------------------------------------------
    lags(p)  |          chi2               df                 Prob > chi2
-------------+-------------------------------------------------------------
       1     |          5.451               1                   0.0196
       2     |          5.518               2                   0.0634
       3     |          7.795               3                   0.0504
---------------------------------------------------------------------------
                        H0: no serial correlation

. qui reg L(0/3).D.lc L.lc

. di "Lags = 3"
Lags = 3

. estat bgodfrey, lags(1 2 3)

Breusch-Godfrey LM test for autocorrelation
---------------------------------------------------------------------------
    lags(p)  |          chi2               df                 Prob > chi2
-------------+-------------------------------------------------------------
       1     |          0.056               1                   0.8122
       2     |          2.311               2                   0.3148
       3     |          3.499               3                   0.3209
---------------------------------------------------------------------------
                        H0: no serial correlation

It is clear that the residuals of the ADF(2) are autocorrelated and those of the ADF(3) are not. The resulting ADF statistic is obtained using:
dfuller lc, lags(3)

where the indicated number of lags is used.


. dfuller lc, lags(3)

Augmented Dickey-Fuller test for unit root         Number of obs   =       196

                               ---------- Interpolated Dickey-Fuller ---------
                  Test         1% Critical       5% Critical      10% Critical
               Statistic           Value             Value             Value
------------------------------------------------------------------------------
 Z(t)             -1.995            -3.478            -2.884            -2.574
------------------------------------------------------------------------------
MacKinnon approximate p-value for Z(t) = 0.2886

Note also that this regression contains a constant and that the test statistic is -1.995. The unit root hypothesis is not rejected at the 5% level. For the income series, loops are used to reduce the amount of coding. Two side benefits are 1) less output and 2) a smaller probability of a coding error. Of course, if what's put in the body of the loop contains an error, then all of your results will be wrong! At any rate, here is the example for income:
forvalues p = 1/3 {
    qui reg L(0/`p').D.ly L.ly
    di "Lags =" `p'
    estat bgodfrey, lags(1 2 3)
}

As seen earlier in the course, the loop structure is fairly easy to use. It starts with forvalues p, where p is the counter, and p is instructed to increment beginning at 1 and ending at 3. The default increment size is one. Thus p will be set at 1, 2 and 3 as the loop runs. The first line must end with a left brace {. In the body of the loop are the regression, a display statement to indicate the lag length, and a call to the post-estimation command bgodfrey. The loop has to be closed on a separate line using a right brace }. After initializing the counter, p, it must be referred to in single quotes (a left ` and a right ', which as you may recall are different characters on the keyboard). This is as it appears in the first two lines of the loop's body. The output is:
Lags =1
Breusch-Godfrey LM test for autocorrelation
    lags(p)  |      chi2      df    Prob > chi2
-------------+----------------------------------
       1     |     0.157       1      0.6916
       2     |     1.271       2      0.5297
       3     |     2.098       3      0.5523
-------------+----------------------------------
             H0: no serial correlation

Lags =2
Breusch-Godfrey LM test for autocorrelation
    lags(p)  |      chi2      df    Prob > chi2
-------------+----------------------------------
       1     |     2.077       1      0.1495
       2     |     2.539       2      0.2810
       3     |     2.542       3      0.4677
-------------+----------------------------------
             H0: no serial correlation

Lags =3
Breusch-Godfrey LM test for autocorrelation
    lags(p)  |      chi2      df    Prob > chi2
-------------+----------------------------------
       1     |     0.208       1      0.6487
       2     |     2.853       2      0.2401
       3     |     2.880       3      0.4105
-------------+----------------------------------
             H0: no serial correlation

If nothing else, the output is a lot neater looking since none of the commands are echoed to the screen. None of the regressions show evidence of residual autocorrelation, so for this series no lagged differences of ly are included as regressors (i.e., the regular Dickey-Fuller regression is used). The Dickey-Fuller test yields:
. dfuller ly, lags(0)

Dickey-Fuller test for unit root                   Number of obs   =       199

                               ---------- Interpolated Dickey-Fuller ---------
                  Test         1% Critical       5% Critical      10% Critical
               Statistic           Value             Value             Value
------------------------------------------------------------------------------
 Z(t)             -2.741            -3.477            -2.883            -2.573
------------------------------------------------------------------------------
MacKinnon approximate p-value for Z(t) = 0.0673

The p-value is less than 10% but larger than 5%. It's your call as to whether you are convinced that the unit root is rejected. Recall that the cointegrating relationship can be estimated using least squares:
$C_t = \beta_1 + \beta_2 Y_t + v_t$

The residuals from this regression are obtained, and their changes are regressed on the lagged residual and a lagged change:
$\Delta \hat{e}_t = \gamma \hat{e}_{t-1} + \delta \, \Delta \hat{e}_{t-1} + v_t$

The Stata code for this procedure is:


reg lc ly
predict ehat, res
reg D.ehat L.ehat D.L.ehat, noconst
di _b[L.ehat]/_se[L.ehat]

. di _b[L.ehat]/_se[L.ehat]
-2.8728997
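As a convenience, the statistic can be saved and compared directly to the 5% critical value for this cointegration test (-3.37, discussed below); the comparison is only a sketch and is not part of the book's code:
* store the tau statistic from the regression above and compare it to the
* 5% critical value of the Engle-Granger test (-3.37)
scalar tau = _b[L.ehat]/_se[L.ehat]
di "tau = " tau
di cond(tau < -3.37, "residuals stationary: cointegration", ///
    "residuals nonstationary: no cointegration")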

Note that an intercept term has been included here to capture the component of (log) consumption that is independent of disposable income. The 5% critical value of the test for stationarity in the cointegrating residuals is -3.37. Since the unit-root t-value of -2.873 is greater than -3.37, it indicates that the errors are not stationary, and hence that the relationship between C (i.e., ln(RPCE)) and Y (i.e., ln(RPDI)) is spurious; that is, we have no cointegration. In this case, estimate the coefficients of the model using a VAR in differences. The VAR is simple to estimate in Stata. The easiest route is to use the varbasic command. varbasic fits a basic vector autoregressive (VAR) model and graphs the impulse-response functions (IRFs) or the forecast-error variance decompositions (FEVDs). The basic structure of the VAR is found in the equations below:

$\Delta y_t = \beta_{11} \Delta y_{t-1} + \beta_{12} \Delta x_{t-1} + v_t^{\Delta y}$
$\Delta x_t = \beta_{21} \Delta y_{t-1} + \beta_{22} \Delta x_{t-1} + v_t^{\Delta x}$


The variables xt and yt are nonstationary, but the differences are stationary. Each difference is a linear function of its own lagged differences and of lagged differences of each of the other variables in the system. The equations are linear, and least squares can be used to estimate the parameters. The varbasic command simplifies this. You need to specify the variables in the system (yt and xt) and the number of lags to include on the right-hand side of the model. In our example, only 1 lag is included and the syntax to estimate the VAR is:
varbasic D.lc D.ly, lags(1/1) step(12) nograph

The syntax lags(1/1) tells Stata to include lags from the first number to the last, which in this case is lag 1 to lag 1. If your VAR is longer than 1 lag then you'll change that here. Also added is the step(12) option. This option is useful in generating forecasts, a topic covered next. The output from this is:

. varbasic D.lc D.ly, lags(1/1) step(12) nograph

Vector autoregression

Sample:  1960q3 - 2009q4                        No. of obs      =        198
Log likelihood =   1400.444                     AIC             =   -14.0853
FPE            =   2.62e-09                     HQIC            =  -14.04496
Det(Sigma_ml)  =   2.46e-09                     SBIC            =  -13.98565

Equation           Parms      RMSE     R-sq      chi2     P>chi2
----------------------------------------------------------------
D_lc                  3     .006575   0.1205   27.12459   0.0000
D_ly                  3     .008562   0.1118   24.92656   0.0000
----------------------------------------------------------------

------------------------------------------------------------------------------
             |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
D_lc         |
          lc |
         LD. |   .2156068   .0741801     2.91   0.004     .0702164    .3609972
          ly |
         LD. |   .1493798   .0572953     2.61   0.009     .0370832    .2616765
       _cons |   .0052776   .0007516     7.02   0.000     .0038046    .0067507
-------------+----------------------------------------------------------------
D_ly         |
          lc |
         LD. |   .4754276   .0965863     4.92   0.000      .286122    .6647332
          ly |
         LD. |  -.2171679   .0746013    -2.91   0.004    -.3633839    -.070952
       _cons |   .0060367   .0009786     6.17   0.000     .0041187    .0079547
------------------------------------------------------------------------------
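Because each equation of the VAR is simply a regression of one difference on the lagged differences of both series, the coefficients above can be reproduced equation by equation with OLS; a sketch (the point estimates match varbasic, though the standard errors differ slightly because regress applies a small-sample degrees-of-freedom correction):
* re-estimate each VAR equation by ordinary least squares
reg D.lc LD.lc LD.ly
reg D.ly LD.lc LD.ly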

In light of the fact that longer lags were used in the Dickey-Fuller regressions, it is likely that the VAR should also have longer lags. In practice, it would probably be a good idea to test the residuals of the VAR for autocorrelation. The Stata command varlmar, issued after varbasic, performs an LM test of the residuals similar to the autocorrelation tests we performed earlier.
. varlmar

   Lagrange-multiplier test
   +--------------------------------------+
   | lag  |      chi2    df   Prob > chi2 |
   |------+-------------------------------|
   |   1  |    9.5086     4     0.04957   |
   |   2  |    5.6784     4     0.22449   |
   +--------------------------------------+
   H0: no autocorrelation at lag order

There is evidence of autocorrelation in the residuals since the p-value at lag 1 is less than 5%. Stata includes another procedure that makes selecting lag lengths in VAR models very easy. The varsoc command reports the final prediction error (FPE), Akaike's information criterion (AIC), Schwarz's Bayesian information criterion (SC), and the Hannan and Quinn information criterion (HQIC) lag-order selection statistics for a series of vector autoregressions. This can be used to find lag lengths for VAR or VEC models of unknown order. For the example above Stata yields:

. varsoc D.lc D.ly, maxlag(4)

   Selection-order criteria
   Sample:  1961q2 - 2009q4                   Number of obs      =        195
  +--------------------------------------------------------------------------+
  |lag |    LL       LR      df    p     FPE       AIC       HQIC      SBIC  |
  |----+---------------------------------------------------------------------|
  |  0 | 1355.02                       3.2e-09  -13.8772  -13.8636  -13.8436 |
  |  1 | 1379.09   48.129    4  0.000  2.6e-09   -14.083  -14.0422* -13.9823*|
  |  2 | 1383.92   9.6655*   4  0.046  2.6e-09  -14.0915  -14.0235  -13.9237 |
  |  3 | 1388.24   8.6379    4  0.071  2.6e-09* -14.0948* -13.9996  -13.8598 |
  |  4 |  1391.6   6.7149    4  0.152  2.6e-09  -14.0882  -13.9659  -13.7861 |
  +--------------------------------------------------------------------------+
   Endogenous:  D.lc D.ly
    Exogenous:  _cons

The AIC selects a lag order of 3, while the SC (labeled SBIC by Stata) chooses 1.
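If the longer lag length suggested by the AIC is preferred, the VAR can be re-estimated with three lags and the residuals re-checked for autocorrelation; a sketch:
* re-estimate the VAR with 3 lags and re-test the residuals
varbasic D.lc D.ly, lags(1/3) step(12) nograph
varlmar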
