ARCH is similar to autocorrelation; in fact, it can be described as autocorrelation in the squared residuals of an AR model. An AR model displays ARCH (autoregressive conditional heteroskedasticity) if the variance of the residuals in one period depends on the variance of the residuals in a previous period. When this condition exists, the standard errors of the regression coefficients in AR models, and the hypothesis tests of those coefficients, are invalid.
We test for ARCH using an ARCH model. In this model, ARCH(1) means that the variance of the residuals in one period is dependent on (i.e., a function of) the variance of the residuals in the preceding period. To test for it, we regress the squared residuals from an estimated time-series model on the first lag of the squared residuals:
εt² = a0 + a1εt−1² + μt

where a0 is the constant and μt is an error term.
If a1 is statistically different from zero, the time series exhibits ARCH(1). We can correct for this using methods that correct for heteroskedasticity, such as generalized least squares.
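The ARCH(1) test above can be sketched in a few lines. This is a minimal illustration, not the source's procedure: the simulated series, parameter values (a0 = 0.2, a1 = 0.5), and the hand-rolled OLS are all assumptions for demonstration.

```python
import numpy as np

def arch1_test(residuals):
    """Regress squared residuals on their first lag:
    e_t^2 = a0 + a1 * e_{t-1}^2 + u_t.
    Returns the slope estimate a1 and its t-statistic."""
    e2 = np.asarray(residuals) ** 2
    y, x = e2[1:], e2[:-1]
    X = np.column_stack([np.ones_like(x), x])      # constant + lagged e^2
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (len(y) - 2)              # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)              # OLS covariance matrix
    t_a1 = beta[1] / np.sqrt(cov[1, 1])
    return beta[1], t_a1

# Simulate ARCH(1) errors: each period's variance depends on the
# previous squared error (illustrative parameters).
rng = np.random.default_rng(0)
n, a0, a1 = 1000, 0.2, 0.5
e = np.zeros(n)
for t in range(1, n):
    e[t] = rng.standard_normal() * np.sqrt(a0 + a1 * e[t - 1] ** 2)

slope, tstat = arch1_test(e)
print(slope, tstat)   # a large |t| on the slope suggests ARCH(1) effects
```

A significant t-statistic on the lagged squared residual is the evidence of ARCH(1); on genuinely homoskedastic residuals the slope should be insignificant.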
Regressions with Two Time Series
A requirement of AR models is that the time series must be covariance stationary. Covariance stationarity occurs if the following three conditions are satisfied:
- The expected value of the time series is constant over time (constant finite mean)
- The time series volatility around its mean does not change over time (constant finite variance)
- The covariance of the time series with its own leading or lagged values is constant over time.
The Dickey-Fuller test is used to determine whether an AR model is covariance stationary. The D-F test is a test for a unit root: the AR(1) model xt = b0 + b1xt−1 + εt is rewritten as

xt − xt−1 = b0 + (b1 − 1)xt−1 + εt

and we test whether the coefficient g1 = b1 − 1 equals zero. If g1 is not significantly different from zero, then b1 = 1 and the series has a unit root. In this case it is not covariance stationary and there is no mean-reverting level.
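The transformed regression can be sketched directly; this is an illustrative implementation with simulated data, not a substitute for a library routine. Note one assumption worth flagging: the resulting t-statistic does not follow the usual t distribution under the unit-root null, so in practice it is compared against Dickey-Fuller critical values.

```python
import numpy as np

def dickey_fuller_t(x):
    """Estimate (x_t - x_{t-1}) = b0 + g1 * x_{t-1} + e_t and return
    the t-statistic on g1 = b1 - 1.  Failing to reject g1 = 0 is
    evidence of a unit root."""
    x = np.asarray(x, float)
    dx = np.diff(x)
    X = np.column_stack([np.ones(len(dx)), x[:-1]])
    beta, *_ = np.linalg.lstsq(X, dx, rcond=None)
    resid = dx - X @ beta
    s2 = resid @ resid / (len(dx) - 2)
    se_g1 = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1] / se_g1

rng = np.random.default_rng(2)
e1, e2 = rng.standard_normal(2000), rng.standard_normal(2000)
random_walk = np.cumsum(e1)              # b1 = 1: unit root
stationary = np.zeros(2000)              # b1 = 0.5: covariance stationary
for t in range(1, 2000):
    stationary[t] = 0.5 * stationary[t - 1] + e2[t]

print(dickey_fuller_t(random_walk))  # typically cannot reject the unit root
print(dickey_fuller_t(stationary))   # strongly negative: reject the unit root
```

For the stationary series the statistic is far below any plausible critical value, while for the random walk it stays near zero.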
In certain cases, we run a regression involving two time series. Either or both series may have a unit root (i.e., be nonstationary), and we must test for this condition to determine whether the regression is valid. The following outcomes are possible:
- One series has a unit root – linear regression is not valid
- Both series have unit roots – test the variables for cointegration
- Neither series has a unit root – linear regression is valid
Cointegration means that the two variables share a long-term link, which allows a valid regression even though each series has a unit root. We test for cointegration with the Engle-Granger approach: regress one series on the other, then apply the Dickey-Fuller test (using Engle-Granger critical values) to the residuals of that regression.
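The two-step Engle-Granger procedure can be sketched as follows. The simulated series (two random walks sharing a common trend) and the no-constant residual regression are illustrative assumptions; in practice the statistic would be compared against Engle-Granger critical values, which are stricter than the plain Dickey-Fuller ones.

```python
import numpy as np

def df_t(u):
    """Dickey-Fuller t-statistic for g1 in du_t = g1 * u_{t-1} + e_t
    (no constant: regression residuals already have zero mean)."""
    du, lag = np.diff(u), u[:-1]
    g1 = (lag @ du) / (lag @ lag)
    resid = du - g1 * lag
    se = np.sqrt((resid @ resid) / (len(du) - 1) / (lag @ lag))
    return g1 / se

rng = np.random.default_rng(3)
n = 2000
trend = np.cumsum(rng.standard_normal(n))        # shared random-walk component
x = trend + rng.standard_normal(n)               # both series have unit roots...
y = 2.0 + 0.5 * trend + rng.standard_normal(n)   # ...but share the same trend

# Step 1: estimate the cointegrating regression y = b0 + b1 * x.
X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
u = y - X @ b        # residuals should be stationary if cointegrated

# Step 2: Dickey-Fuller test on the residuals.
print(df_t(u))       # strongly negative when the residuals are stationary
```

A strongly negative statistic means the residuals have no unit root, so the two series are cointegrated and the regression of y on x is valid.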