Mean and variance of difference operators on a time series process

In summary: for the decomposition of the time series $Y_t = m_t + \varepsilon_t$ with $m_t = a + bt$, treating $\nabla_2 Y_t$ as the second-order difference gives a process with mean $0$ and variance $6\sigma^2$, whereas the quoted textbook answer is mean $2b$ and variance $2\sigma^2$. The $2b$ could be explained by taking $m_t = a + bt^2$, but it is unclear where the $2\sigma^2$ comes from. Further clarification is needed.
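One way to reconcile the two answers, assuming the convention (used, e.g., by Brockwell and Davis) that $\nabla_d$ denotes the lag-$d$ difference $\nabla_d Y_t = Y_t - Y_{t-d}$ rather than the second-order difference $\nabla^2 = \nabla\nabla$, is the following sketch:

$$\nabla_2 Y_t = Y_t - Y_{t-2} = (m_t - m_{t-2}) + (\varepsilon_t - \varepsilon_{t-2}) = 2b + \varepsilon_t - \varepsilon_{t-2},$$

so that $E[\nabla_2 Y_t] = 2b$ and, by independence of the $\varepsilon_t$, $\operatorname{Var}(\nabla_2 Y_t) = 2\sigma^2$, matching the quoted answer. Under the second-order reading, $\nabla^2 Y_t = \varepsilon_t - 2\varepsilon_{t-1} + \varepsilon_{t-2}$ has mean $0$ and variance $6\sigma^2$ instead.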
  • #1
deba123
Consider the following decomposition of the time series $Y_t$, where $Y_t = m_t + \varepsilon_t$ and $\varepsilon_t$ is a sequence of i.i.d. $(0,\sigma^2)$ random variables. Compute the mean and variance of the process $\nabla_2 Y_t$ when $m_t = a + bt$.
 
  • #2
Welcome to MHB! :D

Can you show what you have tried so far so our helpers have some idea where you are stuck and can then offer better help?
 
  • #3
MarkFL said:
Welcome to MHB! :D

Can you show what you have tried so far so our helpers have some idea where you are stuck and can then offer better help?

O.K. so here's what I did:

$$\nabla_2 Y_t = \nabla(\nabla Y_t) = \nabla(Y_t - Y_{t-1}) = Y_t - 2Y_{t-1} + Y_{t-2},$$

which, after expanding $Y_t = a + bt + \varepsilon_t$, comes out to

$$\varepsilon_t - 2\varepsilon_{t-1} + \varepsilon_{t-2}.$$

So the mean is $0$ and the variance is $6\sigma^2$.
But the answer is: mean $= 2b$ and variance $= 2\sigma^2$. I don't get how the answer came out to something like that. Please help.
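As a quick numerical check, here is a minimal simulation sketch (NumPy, with arbitrary illustrative values for $a$, $b$, $\sigma$) comparing the second-order difference with a lag-2 difference, since the notation $\nabla_2$ is sometimes used for the latter:

```python
import numpy as np

# Illustrative values only; a, b, sigma and n are arbitrary choices.
rng = np.random.default_rng(0)
a, b, sigma, n = 1.0, 0.5, 2.0, 200_000
t = np.arange(n)
Y = a + b * t + rng.normal(0.0, sigma, size=n)   # Y_t = m_t + eps_t with m_t = a + b*t

second_diff = np.diff(Y, n=2)   # (1 - B)^2 Y_t = Y_t - 2 Y_{t-1} + Y_{t-2}
lag2_diff = Y[2:] - Y[:-2]      # (1 - B^2) Y_t = Y_t - Y_{t-2}

print("second-order difference: mean %.3f, var %.3f (6*sigma^2 = %.1f)"
      % (second_diff.mean(), second_diff.var(), 6 * sigma**2))
print("lag-2 difference:        mean %.3f, var %.3f (2b = %.1f, 2*sigma^2 = %.1f)"
      % (lag2_diff.mean(), lag2_diff.var(), 2 * b, 2 * sigma**2))
```

The second-order difference should come out with sample mean near $0$ and sample variance near $6\sigma^2$, while the lag-2 difference should come out near $2b$ and $2\sigma^2$, which matches the quoted answer.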
 
  • #4
Did the answer detail the computations?
 
  • #5
girdav said:
Did the answer detail the computations?

No, it didn't. But the answer could be wrong. I just want to know where I was wrong (or right), and maybe whether I was using the wrong definition. Thanks for your help.
 
  • #6
The term $2b$ for the mean could be explained if we had considered $m_t=a+bt^2$, but I fail to see where the $2\sigma^2$ term comes from.
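Spelling that computation out: if $m_t = a + bt^2$, then

$$\nabla^2 m_t = m_t - 2m_{t-1} + m_{t-2} = b\left[t^2 - 2(t-1)^2 + (t-2)^2\right] = 2b,$$

so the mean of $\nabla^2 Y_t$ would indeed be $2b$; the variance of $\varepsilon_t - 2\varepsilon_{t-1} + \varepsilon_{t-2}$, however, would still be $6\sigma^2$ rather than $2\sigma^2$.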
 

FAQ: Mean and variance of difference operators on a time series process

What is the mean of a difference operator on a time series process?

The mean of a difference operator on a time series process is a measure of the central tendency of the differenced series: the average value of the differences between consecutive data points in the time series.
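In symbols, for the first-difference operator $\nabla Y_t = Y_t - Y_{t-1}$ the mean is

$$E[\nabla Y_t] = E[Y_t] - E[Y_{t-1}],$$

so, for example, a linear-trend process $Y_t = a + bt + \varepsilon_t$ with zero-mean noise has $E[\nabla Y_t] = b$.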

How is the mean of a difference operator calculated?

The mean of a difference operator is calculated by first taking the differences between consecutive data points in the time series, and then taking the average of those differences. This can be done using mathematical formulas or software programs.
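For example, a minimal Python/NumPy sketch (the data values are arbitrary):

```python
import numpy as np

y = np.array([3.0, 5.0, 4.0, 8.0, 9.0])  # arbitrary example series
diffs = np.diff(y)                        # consecutive differences y[t] - y[t-1]
mean_diff = diffs.mean()                  # average of those differences

print(diffs)      # [ 2. -1.  4.  1.]
print(mean_diff)  # 1.5
```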

What is the variance of a difference operator on a time series process?

The variance of a difference operator on a time series process is a measure of the spread or variability of the differenced series. It tells us how much the differences between consecutive data points deviate from their mean.

How is the variance of a difference operator calculated?

The variance of a difference operator is calculated by first taking the differences between consecutive data points in the time series, then computing each difference's deviation from the mean of the differences, squaring those deviations, and averaging them. This can also be done using mathematical formulas or software programs.
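Continuing the same minimal NumPy sketch (arbitrary data values):

```python
import numpy as np

y = np.array([3.0, 5.0, 4.0, 8.0, 9.0])  # arbitrary example series
diffs = np.diff(y)                        # [ 2. -1.  4.  1.]
var_diff = diffs.var()                    # mean squared deviation from diffs.mean()

print(var_diff)   # 3.25; use diffs.var(ddof=1) for the unbiased sample version
```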

Why is it important to understand the mean and variance of difference operators on a time series process?

Understanding the mean and variance of difference operators on a time series process is important because it allows us to analyze and interpret the patterns and trends in the data. It also helps us to make predictions and forecasts about future values of the time series. Additionally, the mean and variance can be used to compare different time series processes and determine which one has a stronger or more stable pattern.
