What Are the Theoretical Implications of a Time-Series Being Autoregressive?

In summary, while much of the literature on autoregressive time-series models concerns empirical and statistical issues, there is also research on their theoretical implications. The model's behavior can be analyzed in terms of the roots of the auxiliary equation of its deterministic part, and "Box Jenkins" is a good search phrase for further information. The auxiliary equation largely determines the behavior of expectations, but confidence intervals are more complex.
  • #1
euroazn
There's a bunch of literature out there about empirical/data-fitting/statistical concerns regarding autoregressive time-series models, but is there anything out there about the theoretical implications of a time-series being autoregressive? For example, when does lim_{t→∞} E(X_t) exist? Is the prediction interval bounded as t goes to infinity?
 
  • #2
There are theoretical results - and I don't claim to know them off the top of my head!

An autoregressive model resembles a "linear recurrence relation" http://en.wikipedia.org/wiki/Recurrence_relation except that it has a noise term. The solution of a homogeneous linear recurrence relation is determined by finding the roots of its "auxiliary equation". (It's analogous to solving a homogeneous differential equation via its auxiliary equation.) There are theoretical results that analyze the behavior of an autoregressive model in terms of the roots of the auxiliary equation of its deterministic part. The big names in ARMA models used to be "Box Jenkins". I don't know the current theory, but "Box Jenkins" would be a good search phrase.
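To make the root condition concrete, here is a small sketch (not from the thread; plain `numpy`, and the function name `is_stationary` is my own): it checks the stationarity of an AR(p) model's deterministic part by computing the eigenvalues of its companion matrix, which are exactly the roots of the auxiliary equation of the recurrence.

```python
import numpy as np

def is_stationary(phi):
    """Check whether X_t = phi[0]*X_{t-1} + ... + phi[p-1]*X_{t-p} + e_t
    is (covariance-)stationary.

    The deterministic part is a linear recurrence; its behavior is governed
    by the roots of the auxiliary (characteristic) equation, obtained here
    as the eigenvalues of the companion matrix. All eigenvalues strictly
    inside the unit circle <=> the recurrence decays <=> the process is
    stationary.
    """
    p = len(phi)
    companion = np.zeros((p, p))
    companion[0, :] = phi               # first row holds the AR coefficients
    companion[1:, :-1] = np.eye(p - 1)  # sub-diagonal shifts the lags along
    eigenvalues = np.linalg.eigvals(companion)
    return bool(np.max(np.abs(eigenvalues)) < 1)

print(is_stationary([0.5, 0.3]))  # True: roots inside the unit circle
print(is_stationary([1.2]))       # False: explosive recurrence
```

In the stationary case this also answers the original question for AR(1): with |phi| < 1 the mean converges, and the h-step forecast variance converges to sigma^2 / (1 - phi^2), so the prediction interval stays bounded as t goes to infinity.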
 
  • #3
This is what I figured, and of course it's easy to see that the auxiliary equation has a lot to do with expectations, but confidence intervals aren't quite so easy :(

Thanks, you've been very helpful!
 

FAQ: What Are the Theoretical Implications of a Time-Series Being Autoregressive?

What is an autoregression model?

An autoregression model is a statistical model that uses past values of a variable to predict future values. It is based on the assumption that the current value of a variable is influenced by its previous values.
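As an illustration of the idea (not part of the original FAQ; plain `numpy` on simulated data, with hypothetical helper names `fit_ar1` and `forecast_next`): an AR(1) coefficient can be estimated by regressing the series on its own lag, and the fitted coefficient then predicts the next value from the current one.

```python
import numpy as np

def fit_ar1(x):
    """Estimate phi in X_t = phi * X_{t-1} + e_t by least squares
    (assumes a zero-mean series)."""
    x = np.asarray(x, dtype=float)
    x_lag, x_now = x[:-1], x[1:]
    return np.dot(x_lag, x_now) / np.dot(x_lag, x_lag)

def forecast_next(x, phi):
    """One-step-ahead prediction: the last observed value scaled by phi."""
    return phi * x[-1]

# Simulate an AR(1) with known phi and recover it from the data
rng = np.random.default_rng(0)
true_phi, n = 0.7, 5000
x = np.zeros(n)
for t in range(1, n):
    x[t] = true_phi * x[t - 1] + rng.standard_normal()

phi_hat = fit_ar1(x)
print(round(phi_hat, 2))  # close to 0.7
```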

How is an autoregression model different from a linear regression model?

An autoregression model is different from a linear regression model in that it uses the variable's past values as predictors, rather than other variables. In other words, an autoregression model is a regression of a variable on its own lagged values.

What is the difference between an autoregression model and a moving average model?

An autoregression model and a moving average model are both time series models used for forecasting. The main difference is that an autoregression model uses past values of the variable itself, while a moving average model uses past forecast errors.
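The difference shows up clearly in the autocorrelation function, which is a standard diagnostic: an MA(1) autocorrelation cuts off after lag 1, while an AR(1) autocorrelation decays geometrically. A simulation sketch (not from the original FAQ; plain `numpy`, parameters chosen for illustration):

```python
import numpy as np

def sample_acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(42)
n = 20000
e = rng.standard_normal(n)

# MA(1): X_t = e_t + 0.8 * e_{t-1}  -> autocorrelation cuts off after lag 1
ma = e[1:] + 0.8 * e[:-1]

# AR(1): X_t = 0.8 * X_{t-1} + e_t  -> autocorrelation decays geometrically
ar = np.zeros(n)
for t in range(1, n):
    ar[t] = 0.8 * ar[t - 1] + e[t]

print(round(sample_acf(ma, 1), 2), round(sample_acf(ma, 2), 2))  # roughly 0.49, 0.0
print(round(sample_acf(ar, 1), 2), round(sample_acf(ar, 2), 2))  # roughly 0.8, 0.64
```

The theoretical values are theta/(1 + theta^2) ≈ 0.488 at lag 1 and 0 beyond for the MA(1), versus phi^k = 0.8^k at lag k for the AR(1).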

How do you determine the appropriate lag order for an autoregression model?

The appropriate lag order for an autoregression model can be determined using statistical tests such as the Akaike information criterion (AIC) or the Bayesian information criterion (BIC). These criteria measure the goodness of fit of the model and penalize for including too many or too few lagged values.
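A minimal sketch of that procedure (not from the original FAQ; a simple least-squares AR fit with a Gaussian-likelihood AIC, and the helper names `ar_aic` and `select_order` are my own): fit AR(p) for a range of candidate orders and pick the one minimizing the criterion.

```python
import numpy as np

def ar_aic(x, p):
    """Fit AR(p) by least squares and return its AIC (Gaussian likelihood,
    additive constants dropped)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - p
    # Design matrix of lagged values: column j holds X_{t-j-1}
    X = np.column_stack([x[p - j - 1 : len(x) - j - 1] for j in range(p)])
    y = x[p:]
    coef, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    sigma2 = np.mean(resid ** 2)
    # AIC = n*log(residual variance) + 2*(number of parameters)
    return n * np.log(sigma2) + 2 * p

def select_order(x, max_p=5):
    """Return the lag order in 1..max_p with the smallest AIC."""
    return min(range(1, max_p + 1), key=lambda p: ar_aic(x, p))
```

Swapping the penalty term `2 * p` for `np.log(n) * p` gives BIC, which penalizes extra lags more heavily on long series.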

What are the limitations of using an autoregression model?

Some limitations of an autoregression model include the assumption that the data are stationary (constant mean and variance) and its inability to capture non-linear relationships between variables. Additionally, the model may not perform well on short series or when there are missing values.
