What is the difference between AR and MA time series models?




I was reading about time series models and I could not understand the concept of a moving average (MA) time series model. I get what an auto-regressive (AR) time series model is, but what is the difference between these two models?
How does noise / shock vanish quickly with time in MA models? Does the next instant in an MA model not depend on the previous ones, like it does in AR models?




Here’s the difference between AR and MA models:

Pure AR models - forecast using the lagged values of the series you are modeling

Pure MA models - forecast using the errors (residuals) of your previous forecasts

Mixed ARMA models - take both of the above into account when making predictions
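To make the three cases concrete, here is a minimal one-step-forecast sketch. The coefficients and data are made up purely for illustration (and I'm assuming a zero-mean series, so no intercept term):

```python
import numpy as np

# Hypothetical fitted coefficients, purely for illustration
phi = 0.7    # AR coefficient on the previous value
theta = 0.5  # MA coefficient on the previous forecast error

last_value = 1.5   # most recent observation x(t-1)
last_error = -0.2  # most recent one-step forecast error e(t-1)

ar_forecast = phi * last_value                         # pure AR(1): lagged value only
ma_forecast = theta * last_error                       # pure MA(1): lagged error only
arma_forecast = phi * last_value + theta * last_error  # ARMA(1,1): both

print(ar_forecast, ma_forecast, arma_forecast)
```

The point is just the inputs: the AR forecast is driven by where the series *was*, while the MA forecast is driven by how *wrong* the last forecast turned out to be.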

Hope that helps! :smile:
Disclaimer - I’m not an expert in time series, but this is what I know at the moment! Thanks


Hi @adityashrm21,

The moving average model specifies that the output variable depends linearly on the current and various past values of a stochastic (imperfectly predictable) term. Rather than using the past values of the forecast variable in a regression, a moving average model uses past forecast errors in a regression-like model.

The primary difference between an AR and an MA model lies in the correlation between observations at different time points. For an MA(q) model, the covariance between x(t) and x(t-n) is exactly zero for n greater than q, so its autocorrelation function cuts off sharply after lag q. In an AR model, by contrast, the correlation between x(t) and x(t-n) declines gradually as n becomes larger and never cuts off exactly.
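You can see this cutoff-versus-decay behaviour in the sample autocorrelations of simulated series. A small sketch (the 0.7 coefficients and the series length are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20_000
eps = rng.normal(size=n)  # white-noise shocks e(t)

# MA(1): x(t) = e(t) + 0.7 * e(t-1)
ma = eps[1:] + 0.7 * eps[:-1]

# AR(1): x(t) = 0.7 * x(t-1) + e(t)
ar = np.zeros(n)
for t in range(1, n):
    ar[t] = 0.7 * ar[t - 1] + eps[t]

def acf(x, lag):
    """Sample autocorrelation of x at a positive lag."""
    x = x - x.mean()
    return float(np.dot(x[lag:], x[:len(x) - lag]) / np.dot(x, x))

for lag in (1, 2, 5):
    print(f"lag {lag}: MA {acf(ma, lag):+.3f}  AR {acf(ar, lag):+.3f}")
```

The MA(1) autocorrelation is noticeably nonzero at lag 1 and essentially zero from lag 2 onward, while the AR(1) autocorrelation shrinks roughly geometrically (about 0.7 per lag) but stays positive at every lag shown.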

This means that the moving average (MA) model does not use the past values of the series to predict future values; instead, it uses the errors from past forecasts. The autoregressive (AR) model, on the other hand, uses the past values of the series itself to predict future values.

As mentioned earlier, an MA(q) model depends only on the last q forecast errors rather than on past values of the series, so a shock can influence the series for at most q periods before vanishing completely. In an AR model, by contrast, a shock propagates forward through the lagged values indefinitely, only decaying gradually. Hence the noise quickly vanishes with time in the MA model.
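A quick way to see this is an impulse response: feed each model a single unit shock at t = 0 and no noise afterwards, then watch how long the shock lingers (a sketch with arbitrary coefficients phi = theta = 0.7):

```python
import numpy as np

phi, theta = 0.7, 0.7  # arbitrary AR(1) / MA(1) coefficients for illustration
horizon = 6

# A single unit shock at t = 0, then no further noise
shock = np.zeros(horizon)
shock[0] = 1.0

# AR(1): x(t) = phi * x(t-1) + e(t) -- the shock echoes through the lagged value
ar_resp = np.zeros(horizon)
ar_resp[0] = shock[0]
for t in range(1, horizon):
    ar_resp[t] = phi * ar_resp[t - 1] + shock[t]

# MA(1): x(t) = e(t) + theta * e(t-1) -- the shock is gone after lag 1
ma_resp = np.zeros(horizon)
ma_resp[0] = shock[0]
for t in range(1, horizon):
    ma_resp[t] = shock[t] + theta * shock[t - 1]

print("AR:", np.round(ar_resp, 3))  # decays geometrically, never exactly zero
print("MA:", np.round(ma_resp, 3))  # exactly zero from lag 2 onwards
```

The MA(1) response is [1, 0.7, 0, 0, 0, 0]: the shock disappears entirely after one lag. The AR(1) response is 0.7 raised to increasing powers: it shrinks but never hits zero.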

Hope this helps!!