In cross-sectional studies, the data often involve households (as in a consumption function analysis) or firms (as in an investment study); if, by chance, the error term of one household or firm is correlated with that of another household or firm, such correlation is termed autocorrelation. Autocorrelation also occurs in time series data: if the observations exhibit intercorrelations, especially when the time interval between successive observations is short, those intercorrelations constitute autocorrelation.

So, the term autocorrelation is defined as the correlation between members of a series of observations ordered with respect to time. Two cases, one cross-sectional and one time series, help explain autocorrelation more concretely. In cross-sectional data, if a change in the income of one person affects the consumption expenditure of another household (not his own), then autocorrelation is present in the data. In time series data, if output is low in one quarter due to a labor strike, and the data show that low output continues in the next quarter as well, then autocorrelation is present in the data.

Autocorrelation can be defined as the lag correlation of a given series with itself, lagged by a number of time units. Serial correlation, on the other hand, refers to the lag correlation between two different series in time series data.
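The lag correlation of a series with itself can be computed directly from this definition. A minimal sketch (the function name `autocorr` is chosen for illustration, not taken from any particular library):

```python
import numpy as np

def autocorr(x, lag=1):
    """Sample autocorrelation: correlation of a series with itself
    shifted by `lag` time units, normalised by the total variation."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    # Dot product of the series with its lagged copy, over the sum of squares.
    return np.dot(x[lag:], x[:-lag]) / np.dot(x, x)

# A smoothly rising series is strongly correlated with its own lagged values.
series = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
print(autocorr(series, lag=1))  # 0.625
```

Values near 1 indicate that successive observations move together; values near 0 indicate no lag relationship.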

Autocorrelation comes in various patterns. One example is a discernible pattern among the residual errors: autocorrelation exists when the residual errors show a cyclical pattern, or an upward or downward trend in the disturbances.

One of the major reasons for autocorrelation is inertia, or sluggishness, in time series data. The use of an incorrect functional form can also cause autocorrelation to occur.

Manipulation of the data through extrapolation and interpolation is another source of autocorrelation. Here, the time series data are averaged to smooth the series, and this smoothing imposes a systematic pattern which, in turn, introduces autocorrelation into the data.

Non-stationarity in time series data also gives rise to autocorrelation. Therefore, to make the time series free of autocorrelation, the researcher should first make the data stationary.
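A common way to make a series stationary is to take first differences. A small sketch illustrating the idea on a simulated random walk (a textbook non-stationary series); the variable names are illustrative:

```python
import numpy as np

# A random walk is non-stationary: its level wanders without a fixed mean.
rng = np.random.default_rng(0)
shocks = rng.normal(size=500)     # stationary white-noise innovations
walk = np.cumsum(shocks)          # non-stationary cumulative level

# First differencing recovers a stationary series - here, the shocks themselves.
diffed = np.diff(walk)
print(np.allclose(diffed, shocks[1:]))  # True
```

Because differencing exactly inverts the cumulative sum, the differenced series inherits the stationarity of the underlying shocks.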

Researchers should know that autocorrelation can be positive as well as negative. Economic time series generally exhibit positive autocorrelation, as these series tend to move in sustained upward or downward patterns. If a series instead alternates constantly, moving up and then down, its autocorrelation is negative.
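The sign distinction can be seen by comparing the lag-1 autocorrelation of a trending series with that of an alternating one. A self-contained sketch (function and variable names are illustrative):

```python
import numpy as np

def lag1_autocorr(x):
    """Lag-1 sample autocorrelation of a series."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return np.dot(x[1:], x[:-1]) / np.dot(x, x)

trending = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]        # smooth upward drift
alternating = [1.0, -1.0, 1.0, -1.0, 1.0, -1.0]   # constant up-down movement

print(lag1_autocorr(trending) > 0)     # True: positive autocorrelation
print(lag1_autocorr(alternating) < 0)  # True: negative autocorrelation
```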

The major consequence of using ordinary least squares (OLS) in the presence of autocorrelation is that the estimators, while still unbiased, become inefficient. As a result, hypothesis testing procedures give unreliable results in the presence of autocorrelation.

A popular test for detecting the presence of autocorrelation is the Durbin-Watson test. The test is conducted under the null hypothesis that there is no autocorrelation in the data. A test statistic called 'd' is computed: 'd' is defined as the ratio of the sum of squared differences between successive residuals (the residual at time i minus the residual at time i-1) to the sum of squared residuals. If 'd' exceeds the upper critical value of the test, there is no autocorrelation; if 'd' falls below the lower critical value, there is autocorrelation; between the two critical values, the test is inconclusive.

If one detects autocorrelation in the data, the first thing a researcher should do is try to find out whether or not the autocorrelation is pure. If it is pure autocorrelation, then the original model can be transformed into one that is free from pure autocorrelation.
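The 'd' statistic described above translates directly into code. A minimal sketch of the computation (the function name is illustrative; in practice, statsmodels provides a `durbin_watson` function in `statsmodels.stats.stattools`):

```python
import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson d: sum of squared differences between successive
    residuals, divided by the sum of squared residuals."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# d is near 2 for uncorrelated residuals, near 0 for strong positive
# autocorrelation, and approaches 4 for strong negative autocorrelation.
alternating = [1.0, -1.0, 1.0, -1.0, 1.0, -1.0]
print(durbin_watson(alternating))  # 20/6 = 3.33..., near 4
```

The computed 'd' is then compared against the tabulated lower and upper critical values for the given sample size and number of regressors.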