Autoregressive Defined

What Does Autoregressive Mean?

A statistical model is autoregressive if it predicts future values based on past values. For example, an autoregressive model might seek to predict a stock's future prices based on its past performance.

Autoregressive models predict future values based on past values.
They are widely used in technical analysis to forecast future security prices.
Autoregressive models implicitly assume that the future will resemble the past. Therefore, they can prove inaccurate under certain market conditions, such as financial crises or periods of rapid technological change.

Understanding Autoregressive Models

Autoregressive models operate under the premise that past values have an effect on current values, which makes the statistical technique popular for analyzing nature, economics, and other processes that vary over time. Multiple regression models forecast a variable using a linear combination of predictors, whereas autoregressive models use a combination of past values of the variable.

An AR(1) autoregressive process is one in which the current value is based on the immediately preceding value, while an AR(2) process is one in which the current value is based on the previous two values. An AR(0) process is used for white noise and has no dependence between the terms. Beyond choosing the order, there are also several ways to estimate the model's coefficients, such as the least squares method.
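To make the mechanics concrete, the sketch below simulates a simple AR(2) process and then recovers its coefficients with ordinary least squares. It is a minimal Python illustration using NumPy; the coefficient values (0.5, 0.6, and 0.3) are hypothetical, chosen only so the simulated series is stable, and are not drawn from any real market data.

```python
import numpy as np

# Hypothetical AR(2) coefficients (illustrative values, not from the article)
c, phi1, phi2 = 0.5, 0.6, 0.3
rng = np.random.default_rng(0)

# Simulate the series: each value depends on the previous two values plus noise
n = 500
x = np.zeros(n)
for t in range(2, n):
    x[t] = c + phi1 * x[t - 1] + phi2 * x[t - 2] + rng.normal()

# Estimate the coefficients by least squares: regress each value on a constant,
# the previous value, and the value before that
X = np.column_stack([np.ones(n - 2), x[1:-1], x[:-2]])
y = x[2:]
estimates, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated c, phi1, phi2:", estimates)
```

With enough simulated data, the least squares estimates land close to the coefficients used to generate the series, which is the basic idea behind fitting an autoregressive model to historical prices.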

These concepts and techniques are used by technical analysts to forecast security prices. However, since autoregressive models base their predictions only on past information, they implicitly assume that the fundamental forces that influenced the past prices will not change over time. This can lead to surprising and inaccurate predictions if the underlying forces in question are in fact changing, such as if an industry is undergoing rapid and unprecedented technological transformation.

Nevertheless, traders continue to refine the use of autoregressive models for forecasting purposes. A notable example is the Autoregressive Integrated Moving Average (ARIMA), a more sophisticated autoregressive model that can account for trends, cycles, seasonality, errors, and other non-static features of the data when making forecasts.
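As a rough illustration of how such a model might be applied, the sketch below fits an ARIMA model to a simulated price series using the statsmodels library and forecasts the next few periods. The simulated series, the ARIMA order of (2, 1, 1), and the five-period forecast horizon are all illustrative assumptions rather than settings taken from the article.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA  # assumes statsmodels is installed

# Hypothetical price series: a gentle upward drift plus noise, standing in
# for real historical prices
rng = np.random.default_rng(1)
prices = 100 + np.cumsum(rng.normal(0.2, 1.0, size=250))

# order=(p, d, q): p lagged values, d rounds of differencing to remove the
# trend, and q lagged forecast errors
model = ARIMA(prices, order=(2, 1, 1))
fitted = model.fit()

# Forecast the next five periods from the fitted model
print(fitted.forecast(steps=5))
```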

Analytical Approaches

Although autoregressive models are associated with technical analysis, they can also be combined with other approaches to investing. For example, investors can use fundamental analysis to identify a compelling opportunity and then use technical analysis to identify entry and exit points.

Real World Example of an Autoregressive Model

Autoregressive models are based on the assumption that past values have an effect on current values. For example, an investor using an autoregressive model to forecast stock prices would need to assume that new buyers and sellers of that stock are influenced by recent market transactions when deciding how much to offer or accept for the security.

Although this assumption holds under most circumstances, it does not always hold. For example, in the years prior to the 2008 Financial Crisis, most investors were not aware of the risks posed by the large portfolios of mortgage-backed securities held by many financial firms. During that period, an investor using an autoregressive model to predict the performance of U.S. financial stocks would have had good reason to forecast a continuing trend of stable or rising stock prices in that sector.

However, once it became public knowledge that many financial institutions were at risk of imminent collapse, investors suddenly became less concerned with these stocks' recent prices and far more concerned with their underlying risk exposure. Therefore, the market rapidly revalued financial stocks to a much lower level, a move which would have utterly confounded an autoregressive model.

It is important to note that, in an autoregressive model, a one-time shock will affect the values of the calculated variables infinitely into the future. Therefore, the legacy of the financial crisis lives on in today’s autoregressive models.
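A small numerical sketch can show what this persistence looks like. Assuming a stable AR(1) model with an illustrative coefficient of 0.9, a single shock's contribution shrinks by a constant fraction each period but never reaches exactly zero, which is why its legacy remains in every later forecast.

```python
# Illustrative AR(1) coefficient; any value below 1 gives a stable model
phi = 0.9
effect = 10.0  # size of a one-time shock at time t = 0
for t in range(6):
    print(f"t={t}: remaining effect of the shock = {effect:.4f}")
    effect = phi * effect  # the shock's influence shrinks but never hits zero
```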

Related terms:

Autoregressive Integrated Moving Average (ARIMA)

An autoregressive integrated moving average (ARIMA) is a statistical analysis model that leverages time series data to forecast future trends.

Box-Jenkins Model

The Box-Jenkins Model is a mathematical model designed to forecast data from a specified time series.

Implied Volatility (IV)

Implied volatility (IV) is the market's forecast of a likely movement in a security's price. It is often used to determine trading strategies and to set prices for option contracts.

Least Squares Method

The least squares method is a statistical technique used to determine the line of best fit for a model by minimizing the sum of the squared differences between the observed data and the model's predicted values.

Mortgage-Backed Security (MBS)

A mortgage-backed security (MBS) is an investment similar to a bond that consists of a bundle of home loans bought from the banks that issued them.

Multiple Linear Regression (MLR)

Multiple linear regression (MLR) is a statistical technique that uses several explanatory variables to predict the outcome of a response variable.

Serial Correlation

Serial correlation is a statistical representation of the degree of similarity between a given time series and a lagged version of itself over successive time intervals.

Technical Analysis of Stocks and Trends

Technical analysis of stocks and trends is the study of historical market data, including price and volume, to predict future market behavior.

Technical Analysis

Technical analysis is a trading discipline that seeks to identify trading opportunities by analyzing statistical data gathered from trading activity.

Trend Analysis

Trend analysis is a technique used in technical analysis that attempts to predict future stock price movements based on recently observed trend data.