401-4623-00: Time Series Analysis
Section 5
Linear Processes
Swiss Federal Institute of Technology Zurich
Eidgenössische Technische Hochschule Zürich
Last Edit Date: 01/09/2024
Disclaimer and Terms of Use:
We do not guarantee the accuracy or completeness of this summary. Some of the course material may not be included, and some of the content in the summary may be incorrect. You should use this file properly and legally. We are not responsible for any consequences of using this file.
This personal note is adapted from the lectures of Professor Fadoua Balabdaoui and Professor Nicolai Meinshausen. Please contact us to delete this file if you think your rights have been violated.
This work is licensed under a Creative Commons Attribution 4.0 International License.
Linear Processes Basics¶
A time series $\{X_t\}_t$ is said to be a linear process if it admits the representation $X_t = \sum_{j = -\infty}^{\infty} \psi_j Z_{t-j}$, $t\in \mathbb{Z}$, with $Z_t \sim WN(0, \sigma^2)$ and $\{\psi_j\}_j$ a deterministic, absolutely summable sequence:
$$\sum_{j = -\infty}^{\infty} |\psi_j| < \infty.$$
Alternative notation: let $\psi(z) = \sum_{j = -\infty}^{\infty} \psi_j z^j$ for $z$ in the convergence domain of this power series.
Recall that the backward shift operator $B$ is defined by
$$ \begin{align} BX_t &= X_{t-1}\\ B^jX_t &= X_{t-j} \end{align} $$
For some stationary time series $\{Y_t\}_t$,
$$ \begin{align} \psi (B) Y_t &= \left( \sum_{j=-\infty}^{\infty} \psi_j B^j \right) Y_t \\ &= \sum_{j=-\infty}^{\infty} \psi_j B^j Y_t \\ &= \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} \end{align} $$
This means that the linear process can be rewritten as
$$X_t = \psi (B) Z_t.$$
Remark: The condition $\sum_{j=-\infty}^{\infty} |\psi_j| < \infty$ ensures that $X_t$ is finite with probability 1.
Recall that $X_t = \sum_{j = -\infty}^{\infty} \psi_j Z_{t-j}$. Let $Y_t = \sum_{j=-\infty}^{\infty} |\psi_j| |Z_{t-j}| \ge 0$, we have
$$ \begin{align} \mathbb{E}[Y_t] &= \mathbb{E}\left[ \sum_{j=-\infty}^{\infty} |\psi_j| |Z_{t-j}| \right]\\ &= \sum_{j=-\infty}^{\infty} |\psi_j| \mathbb{E}\left[ |Z_{t-j}| \right] \end{align} $$
by Fubini–Tonelli (for nonnegative terms, expectation and infinite summation can always be exchanged), and
$$ \begin{align} \mathbb{E}[|Z_{t-j}|] &\le \sqrt{\mathbb{E}[Z_{t-j}^2]}\\ &= \sqrt{\mathrm{Var}(Z_{t-j})}\\ &= \sqrt{\sigma^2} = \sigma < \infty \end{align} $$
so
$$\sum_{j=-\infty}^{\infty} |\psi_j| \mathbb{E}[|Z_{t-j}|] \le \sigma \sum_{j=-\infty}^{\infty} |\psi_j| < \infty.$$
Hence, $\mathbb{E}[Y_t]$ is finite.
This implies that $\mathbb{P}(Y_t < \infty) = 1$
$$ \begin{align} \mathbb{P}(Y_t = \infty) &= \mathbb{P}\left( \bigcap_{M=1}^{\infty} \left\{ Y_t > M \right\} \right) \\ &= \lim_{M\rightarrow \infty} \mathbb{P} \left( \{Y_t > M\} \right)~~~~~\text{(continuity of } \mathbb{P} \text{ from above)}\\ &\le \lim_{M\rightarrow \infty} \frac{\mathbb{E}[Y_t]}{M} = 0~~~~~\text{(Markov's inequality)} \end{align} $$
Hence, $\sum_{j=-\infty}^\infty \psi_j Z_{t-j}$ is absolutely convergent, meaning that $X_t$ is well-defined with probability 1.
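As a numerical sanity check, a two-sided linear process can be approximated by truncating the sum; the geometric decay of the (hypothetical, illustrative) coefficients $\psi_j = 0.5^{|j|}$ makes the discarded tail negligible. This is only a sketch, not part of the course material:

```python
import numpy as np

rng = np.random.default_rng(0)

sigma = 1.0            # white-noise standard deviation (hypothetical)
n = 500                # number of X_t values to generate
pad = 50               # truncation width of the two-sided sum

# White noise Z_t ~ WN(0, sigma^2), with extra values on both sides
Z = rng.normal(0.0, sigma, n + 2 * pad)

# Absolutely summable coefficients psi_j = 0.5^|j|, j = -pad..pad
j = np.arange(-pad, pad + 1)
psi = 0.5 ** np.abs(j)

# X_t = sum_j psi_j Z_{t-j}, truncated to |j| <= pad
X = np.array([psi @ Z[t - j] for t in range(pad, n + pad)])

print(X.shape)                       # (500,)
print(bool(np.all(np.isfinite(X))))  # True
```

Every realization is finite, as the absolute-summability argument above guarantees.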
Moving Average Processes¶
If $\psi_j = 0$, $\forall j < 0$, then $X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j}$. In this case, $\{X_t\}_t$ is called a Moving Average Process:
$$\{X_t\}_t \sim MA(\infty).$$
Let $\{Y_t\}_t$ be stationary with $\mathbb{E}[Y_t] = 0$ and ACVF $\gamma_y$. Also, let $\{\psi_j\}_j$ be a sequence such that
$$\sum_{j=-\infty}^{\infty} |\psi_j| < \infty.$$
Then, $X_t = \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j}$ is stationary with $\mu_x(t) = \mathbb{E}[X_t] = 0$ and $\gamma_x(h) = \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \psi_j \psi_k \gamma_y(h + k -j)$.
In particular, if $\{Y_t\}_t \sim WN(0, \sigma^2)$, then $\gamma_x(h) = \sigma^2 \sum_{j=-\infty}^{\infty} \psi_j \psi_{j+h}$.
Proof: Consider $W_t = \sum_{j = -\infty}^{\infty} |\psi_j| |Y_{t-j}|$ and
$$ \begin{align} \mathbb{E}[W_t] &= \sum_{j=-\infty}^{\infty} |\psi_j| \mathbb{E}[|Y_{t-j}|] \\ &\le \sum_{j=-\infty}^{\infty} |\psi_j| \sqrt{\gamma_y(0)}~~~~~\text{(Jensen's inequality)}\\ &= \sqrt{\gamma_y(0)} \sum_{j=-\infty}^{\infty} |\psi_j| < \infty \end{align} $$
This implies that $W_t$ is finite with probability 1, meaning that $X_t$ is finite with probability 1.
We can also conclude that $\mathbb{E}[|X_t|] < \infty$ (since $|X_t| \le W_t$ and hence $\mathbb{E}[|X_t|] \le \mathbb{E}[W_t]$). Hence, $\mathbb{E}[X_t]$ is finite. Then
$$ \begin{align} \mathbb{E}[X_t] &= \sum_{j=-\infty}^{\infty} \psi_j \underbrace{\mathbb{E}[Y_{t-j}]}_{0}~~~~~\text{(Fubini's theorem)}\\ &= 0 \end{align} $$
Fix $h \in \mathbb{Z}$:
$$ \begin{align} |X_t X_{t+h}| &= \left| \left( \sum_{j=-\infty}^{\infty} \psi_j Y_{t-j} \right) \left( \sum_{j=-\infty}^{\infty} \psi_j Y_{t+h-j} \right)\right| \\ &= \left| \sum_{j,k=-\infty}^{\infty} \psi_j \psi_k Y_{t-j} Y_{t+h-k} \right| \\ &\le \sum_{j,k=-\infty}^{\infty} \left| \psi_j \right| \left| \psi_k \right| \left| Y_{t-j} \right| \left| Y_{t+h-k} \right| \end{align} $$
Now we have
$$ \begin{align} \mathbb{E}\left[ \sum_{j,k=-\infty}^{\infty} |\psi_j| |\psi_k| |Y_{t-j}| |Y_{t+h-k}| \right] &= \sum_{j,k=-\infty}^{\infty} |\psi_j| |\psi_k| \mathbb{E}[|Y_{t-j}| |Y_{t+h-k}|]\\ &\le \sum_{j,k=-\infty}^{\infty} |\psi_j| |\psi_k| \sqrt{\gamma_y(0)} \sqrt{\gamma_y(0)}\\ &= \gamma_y(0) \sum_{j,k=-\infty}^{\infty} |\psi_j| |\psi_k|\\ &= \gamma_y(0) \left( \sum_{j=-\infty}^{\infty} |\psi_j| \right)^2 < \infty \end{align} $$
This implies that $\mathbb{E}[|X_t X_{t+h}|] < \infty$, meaning that $\mathbb{E}[X_t X_{t+h}]$ is finite.
$$ \begin{align} \gamma_x(t, t+h) &= \mathrm{Cov}(X_t, X_{t+h})\\ &= \mathbb{E}[X_t X_{t+h}] - 0 \cdot 0\\ &= \mathbb{E}[X_t X_{t+h}] \end{align} $$
By Fubini's theorem, we compute
$$ \begin{align} \mathbb{E}[X_t X_{t+h}] &= \sum_{j,k=-\infty}^{\infty} \psi_j \psi_k \mathbb{E}[Y_{t-j}Y_{t+h-k}]\\ \gamma_x(t, t+h) &= \sum_{j,k=-\infty}^{\infty} \psi_j \psi_k \gamma_y(t+h-k-(t-j))\\ &= \sum_{j,k=-\infty}^{\infty} \psi_j \psi_k \gamma_y(h-k+j) \end{align} $$
Suppose $\{Y_t\} \sim WN(0, \sigma^2)$. In this case, we have that $\gamma_y(h-k+j) = \sigma^2 \mathbb{1}\{h-k+j=0\} = \sigma^2 \mathbb{1}\{k=h+j\}$.
This means that
$$\gamma_x(h) = \sigma^2 \sum_{j=-\infty}^{\infty} \psi_j \psi_{h+j}.$$
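The agreement between the general double-sum formula and its white-noise specialization can be checked numerically for a finite filter. The coefficient values below are hypothetical and purely illustrative:

```python
import numpy as np

# A finite filter psi_0..psi_3 (hypothetical values); psi_j = 0 otherwise.
psi = np.array([1.0, 0.4, -0.3, 0.2])
sigma2 = 2.0

def gamma_y(h):
    """ACVF of WN(0, sigma^2): sigma^2 at lag 0, zero elsewhere."""
    return sigma2 if h == 0 else 0.0

def gamma_x_double_sum(h):
    """General formula: sum_{j,k} psi_j psi_k gamma_y(h - k + j)."""
    m = len(psi)
    return sum(psi[j] * psi[k] * gamma_y(h - k + j)
               for j in range(m) for k in range(m))

def gamma_x_wn(h):
    """White-noise specialization: sigma^2 sum_j psi_j psi_{j+h}."""
    h = abs(h)              # the ACVF is even
    if h >= len(psi):
        return 0.0
    return sigma2 * float(np.dot(psi[: len(psi) - h], psi[h:]))

# Both expressions agree at every lag.
print(all(abs(gamma_x_double_sum(h) - gamma_x_wn(h)) < 1e-12
          for h in range(-5, 6)))  # True
```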
Example¶
Consider $X_t = Z_t + \theta Z_{t-1}$ with $\theta \in \mathbb{R}$ and $\{Z_t\} \sim WN(0, \sigma^2)$. Note that $\{X_t\} \sim MA(1)$, a special case of linear processes.
Given $X_t = \psi(B) Z_t$ with $\psi(B) = \sum_{j=-\infty}^{\infty} \psi_j B^j$, where
$$ \psi_j = \begin{cases} 1 & j = 0\\ \theta & j = 1\\ 0 & \text{otherwise} \end{cases}. $$
Applying $\gamma_x(h) = \sigma^2 \sum_{j} \psi_j \psi_{j+h}$ and using that $\gamma_x$ is even, we conclude that
$$ \gamma_x(h) = \begin{cases} \sigma^2 (1 + \theta^2) & h = 0\\ \sigma^2 \theta & |h| = 1\\ 0 & \text{otherwise} \end{cases} $$
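A quick simulation confirms this ACVF. The parameter values below are hypothetical; with a long path, the sample autocovariances land close to $\sigma^2(1+\theta^2)$, $\sigma^2\theta$, and $0$:

```python
import numpy as np

rng = np.random.default_rng(42)
theta, sigma = 0.6, 1.0           # hypothetical MA(1) parameters
n = 200_000                       # long path so the sample ACVF is accurate

Z = rng.normal(0.0, sigma, n + 1)
X = Z[1:] + theta * Z[:-1]        # X_t = Z_t + theta Z_{t-1}

def sample_acvf(x, h):
    """Sample autocovariance at lag h (the process has mean 0)."""
    return float(np.mean(x[: len(x) - h] * x[h:]))

# Theoretical values: gamma(0) = sigma^2 (1 + theta^2) = 1.36,
# gamma(1) = sigma^2 theta = 0.6, gamma(h) = 0 for |h| >= 2.
for h, target in [(0, 1.36), (1, 0.6), (2, 0.0)]:
    print(h, abs(sample_acvf(X, h) - target) < 0.05)  # all True
```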
Composition of two linear processes¶
Let $\{\alpha_j\}_j$ and $\{\beta_j\}_j$ be 2 real sequences such that $\sum_{j=-\infty}^{\infty} |\alpha_j| < \infty$ and $\sum_{j=-\infty}^{\infty} |\beta_j| < \infty$. Also, let $\{Y_t\}_t$ be some stationary time series.
Consider $X_t = \alpha(B) Y_t = \sum_{j=-\infty}^{\infty} \alpha_j Y_{t-j}$ and $W_t = \beta(B) X_t = \sum_{j=-\infty}^{\infty} \beta_j X_{t - j}$.
How can we write $W_t$ as a function of $Y_t$?
$$\underbrace{Y_t \stackrel{\alpha(B)}{\rightarrow} X_t \stackrel{\beta(B)}{\rightarrow} W_t}_{?}$$
We can write this as
$$W_t = \left( \underbrace{\beta(B) \cdot \alpha(B)}_{\psi(B)} \right) Y_t$$
where $\psi_j = \sum_{k=-\infty}^{\infty} \beta_k \alpha_{j-k}$.
Indeed, $W_t = \sum_{j=-\infty}^{\infty} \beta_j X_{t - j} = \sum_{j=-\infty}^{\infty}\beta_j \sum_{k=-\infty}^{\infty} \alpha_k Y_{t-j-k}$.
Now, note that $\sum_{j,k=-\infty}^{\infty} \beta_j \alpha_k Y_{t-j-k}$ is finite with probability 1.
In fact
$$ \begin{align} \mathbb{E}\left[ \sum_{j,k=-\infty}^{\infty} |\beta_j| |\alpha_k| |Y_{t-j-k}| \right] &= \sum_{j,k=-\infty}^{\infty} |\beta_j| |\alpha_k| \mathbb{E} [|Y_{t-j-k}|]~~~~~\text{(Fubini's theorem)}\\ &\le \sqrt{\gamma_y(0)} \sum_{j,k=-\infty}^{\infty} |\beta_j| |\alpha_k|\\ &= \sqrt{\gamma_y(0)} \left( \sum_{j=-\infty}^{\infty} |\beta_j| \right) \left( \sum_{k=-\infty}^{\infty} |\alpha_k| \right) < \infty \end{align} $$
Using Fubini's theorem, we can write
$$ \begin{align} W_t &= \sum_{j,k=-\infty}^{\infty} \beta_j \alpha_k Y_{t-j-k} \\ &= \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \beta_j \alpha_k Y_{t-j-k}\\ &= \sum_{j=-\infty}^{\infty} \sum_{k=-\infty}^{\infty} \beta_j \alpha_{k'-j} Y_{t-k'}~~~~~\text{(replacing } k = k' - j \text{)}\\ &= \sum_{k'=-\infty}^{\infty} \sum_{j=-\infty}^{\infty} \beta_j \alpha_{k'-j} Y_{t-k'}\\ &= \sum_{k=-\infty}^{\infty} \left( \sum_{j=-\infty}^{\infty} \beta_j \alpha_{k-j} \right) Y_{t-k}\\ &= \sum_{k=-\infty}^{\infty} \psi_k Y_{t-k} \end{align} $$
As a result, we have
$$W_t = \psi(B)Y_t$$
where $\psi(B) = \sum_{j=-\infty}^{\infty}\psi_j B^j$ and $\psi_j = \sum_{k=-\infty}^{\infty}\beta_k \alpha_{j-k}$.
Note that also $\psi_j = \sum_{k=-\infty}^{\infty} \alpha_k \beta_{j-k}$. This implies that $\alpha(B) \circ \beta(B) = \beta(B) \circ \alpha(B)$.
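The coefficient identity $\psi_j = \sum_k \beta_k \alpha_{j-k}$ is exactly a discrete convolution, so the composition can be illustrated with `np.convolve`. The two finite filters below are hypothetical:

```python
import numpy as np

# Two absolutely summable (here: finite) filters; values are hypothetical.
alpha = np.array([1.0, 0.5, 0.25])   # coefficients of alpha(B)
beta = np.array([1.0, -0.4])         # coefficients of beta(B)

# psi_j = sum_k beta_k alpha_{j-k} is the discrete convolution of the
# two coefficient sequences.
psi = np.convolve(beta, alpha)       # [1, 0.1, 0.05, -0.1] (up to rounding)

# Convolution commutes, so alpha(B) beta(B) = beta(B) alpha(B).
print(bool(np.allclose(psi, np.convolve(alpha, beta))))  # True

# Applying beta(B) after alpha(B) to a series equals applying psi(B) once.
rng = np.random.default_rng(1)
Y = rng.normal(size=1000)
W_two_step = np.convolve(beta, np.convolve(alpha, Y))
W_one_step = np.convolve(psi, Y)
print(bool(np.allclose(W_two_step, W_one_step)))  # True
```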
Autoregressive processes: $AR(1)$¶
Let $\phi \in \mathbb{R} \setminus \{-1, 1\}$. An autoregressive process of order 1 ($AR(1)$) is a stationary solution to the equations:
$$X_t = \phi X_{t-1} + Z_t$$
where $\{Z_t\}_t \sim WN(0, \sigma^2)$.
We can rewrite it as
$$ \begin{align} &\Leftrightarrow X_t - \phi X_{t-1} = Z_t\\ &\Leftrightarrow X_t - \phi B X_t = Z_t\\ &\Leftrightarrow \phi(B)X_t = Z_t \end{align} $$
where $\phi(B) = 1 - \phi B$.
First case - $|\phi| < 1$
We show that the $MA(\infty)$ process given by $\sum_{j=0}^{\infty} \phi^j Z_{t-j}$ is the unique stationary time series satisfying $X_t = \phi X_{t-1} + Z_t$ almost surely.
Consider $X_t = \sum_{j=0}^{\infty} \phi^j Z_{t-j} = \psi(B) Z_t$, where $\psi(B) = \sum_{j=-\infty}^{\infty} \psi_j B^j$ and $\psi_j = \begin{cases} \phi^j & j \ge 0\\ 0 & j < 0 \end{cases}$.
$$\sum_{j = -\infty}^{\infty} |\psi_j| = \sum_{j=0}^{\infty} |\phi^j| = \frac{1}{1 - |\phi|} < \infty$$
Using the previous proposition, we see that $\{X_t\}_t$ is stationary.
$$ \begin{align} X_t - \phi X_{t-1} &= \sum_{j=0}^{\infty} \phi^j Z_{t-j} - \phi \sum_{j=0}^{\infty} \phi^j Z_{t-1-j}\\ &= \sum_{j=0}^{\infty} \phi^j Z_{t-j} - \phi \sum_{j=1}^{\infty} \phi^{j-1} Z_{t-j}\\ &= \sum_{j=0}^{\infty} \phi^j Z_{t-j} - \sum_{j=1}^{\infty} \phi^{j} Z_{t-j}\\ &= \phi^0 Z_{t-0} = Z_t \end{align} $$
Hence, $X_t = \sum_{j=0}^{\infty} \phi^j Z_{t-j}$ is a stationary solution of $X_t = \phi X_{t-1} + Z_t$.
Suppose there exists another stationary solution, say $\{Y_t\}_t$. Then
$$ \begin{align} X_t - Y_t &= (\phi X_{t-1} + Z_t) - (\phi Y_{t-1} + Z_t)\\ &= \phi (X_{t-1} - Y_{t-1})\\ &= \phi^{k} (X_{t-k} - Y_{t-k})\\ \Rightarrow |X_t - Y_t| &= |\phi|^k |X_{t-k} - Y_{t-k}| \end{align} $$
As a result, the expectation can be computed as
$$ \begin{align} \mathbb{E}[|X_t - Y_t|] &= |\phi|^k \mathbb{E}[|X_{t-k} - Y_{t-k}|]\\ &\le |\phi|^k (\mathbb{E}[|X_{t-k}|] + \mathbb{E}[|Y_{t-k}|])\\ &\le |\phi|^k \left(\sqrt{\gamma_x(0)} + \sqrt{\gamma_y(0)}\right)\\ &\rightarrow 0~~~~~\text{(as } k \rightarrow \infty \text{)} \end{align} $$
This is because $\mathbb{E}[X_t] = \mathbb{E}[Y_t] = 0$, $\mathbb{E}[X_t^2] = \gamma_x(0)$, and $\mathbb{E}[Y_t^2] = \gamma_y(0)$.
As a result, we have $X_t = Y_t$ almost surely.
Note that in this case, the $AR$ model is automatically causal, meaning that $\mathrm{Cov}(X_s, Z_t) = 0, \forall s < t$. Indeed, $X_t$ depends only on the present and past terms of $\{Z_t\}_t$.
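The argument above can be checked numerically: a truncated version of the $MA(\infty)$ sum satisfies the $AR(1)$ recursion up to an error of order $\phi^K$. The parameter values are hypothetical and this is only a sketch:

```python
import numpy as np

rng = np.random.default_rng(7)
phi, sigma = 0.8, 1.0            # hypothetical AR(1) with |phi| < 1
n, K = 200, 400                  # K: truncation order of the MA(inf) sum

Z = rng.normal(0.0, sigma, n + K)

# X_t = sum_{j=0}^{K-1} phi^j Z_{t-j} (truncated; the tail is O(phi^K))
w = phi ** np.arange(K)
X = np.array([w @ Z[t: t - K: -1] for t in range(K, n + K)])

# The recursion X_t = phi X_{t-1} + Z_t holds up to the truncation error,
# which equals phi^K Z_{t-K} and is astronomically small here.
resid = X[1:] - phi * X[:-1] - Z[K + 1: n + K]
print(bool(np.max(np.abs(resid)) < 1e-8))  # True
```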
Second case - $|\phi| > 1$
Note that $\sum_{j=0}^{\infty} |\phi|^j = \infty$. However we can write
$$X_{t+1} = \phi X_t + Z_{t+1} \Leftrightarrow X_t = \frac{1}{\phi} X_{t+1} - \frac{1}{\phi}Z_{t+1},$$
which is valid since $\phi \neq 0$.
We can now show that $-\sum_{j=1}^{\infty} \frac{1}{\phi^j} Z_{t+j}$ is the unique stationary solution of $X_t = \phi X_{t-1} + Z_t$ almost surely.
Consider $X_t = -\sum_{j=1}^{\infty} \frac{1}{\phi^j} Z_{t+j}$. Note that $X_t = \psi(B) Z_t$ where $\psi_j = \begin{cases} -\phi^j & j \le -1\\ 0 & j > -1 \end{cases}$.
$$ \begin{align} \sum_{j=-\infty}^{\infty} |\psi_j| &= \sum_{j=-\infty}^{-1} |\phi|^j = \sum_{j=1}^{\infty} |\phi|^{-j}\\ &= -1 + \sum_{j=0}^{\infty} |\phi|^{-j} = -1 + \frac{1}{1 - \frac{1}{|\phi|}} = \frac{1}{|\phi| - 1} < \infty \end{align} $$
Using the first proposition, we see that $\{X_t\}_t$ is stationary.
$$ \begin{align} X_t - \phi X_{t-1} &= -\sum_{j=1}^{\infty} \phi^{-j} Z_{t+j} + \phi \sum_{j=1}^{\infty} \phi^{-j} Z_{t-1+j}\\ &= -\sum_{j=1}^{\infty} \phi^{-j} Z_{t+j} + \phi \sum_{j=0}^{\infty} \phi^{-j-1} Z_{t+j}\\ &= -\sum_{j=1}^{\infty} \phi^{-j} Z_{t+j} + \sum_{j=0}^{\infty} \phi^{-j} Z_{t+j}\\ &= \phi^0 Z_{t+0} = Z_t \end{align} $$
Suppose there exists a stationary solution $\{Y_t\}_t$. Then
$$ \begin{align} X_t - Y_t &= \left(\frac{1}{\phi} X_{t+1} - \frac{1}{\phi} Z_{t+1}\right) - \left(\frac{1}{\phi} Y_{t+1} - \frac{1}{\phi} Z_{t+1}\right) \\ &= \frac{1}{\phi} (X_{t+1} - Y_{t+1})\\ &= \frac{1}{\phi^k} (X_{t+k} - Y_{t+k}) \end{align} $$
As for the expectation, we have
$$\mathbb{E}[|X_t - Y_t|] \le |\phi|^{-k} \left( \sqrt{\gamma_x(0)} + \sqrt{\gamma_y(0)} \right),$$
which approaches 0 as $k \rightarrow \infty$.
As a result $X_t = Y_t$ almost surely.
Note that the $AR$ model is non-causal in this case since $X_t$ depends on future terms of $\{Z_t\}_t$.
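The non-causal solution can also be verified numerically: truncating the sum over future noise still satisfies the recursion up to an error of order $\phi^{-K}$. Parameter values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)
phi, sigma = 2.0, 1.0            # hypothetical AR(1) with |phi| > 1
n, K = 200, 60                   # K: truncation of the sum over future noise

Z = rng.normal(0.0, sigma, n + K + 1)

# X_t = -sum_{j=1}^{K} phi^{-j} Z_{t+j} (truncated; the tail is O(phi^{-K}))
w = phi ** (-np.arange(1, K + 1))
X = np.array([-(w @ Z[t + 1: t + K + 1]) for t in range(n)])

# The AR(1) recursion X_t = phi X_{t-1} + Z_t holds up to the truncation
# error, which equals -phi^{-K} Z_{t+K} and is negligible here.
resid = X[1:] - phi * X[:-1] - Z[1:n]
print(bool(np.max(np.abs(resid)) < 1e-8))  # True
```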
An alternative perspective: inversion of operators¶
Consider an $AR(1)$ process,
$$X_t - \phi X_{t-1} = Z_t.$$
For $|\phi| < 1$, we can write the equation as
$$\Phi(B)X_t = Z_t$$
where $\Phi(z) = 1 - \phi z$ for $z \in \mathbb{C}$. We have shown that $X_t - \phi X_{t-1} = Z_t$ is uniquely solved by
$$X_t = \sum_{j=0}^{\infty} \phi^j Z_{t-j} = \chi(B)Z_t$$
for $\chi(z) = \sum_{j=0}^{\infty} \phi^j z^j$, defined for $|\phi z| < 1$. It is as if the $AR(1)$ equations $X_t - \phi X_{t-1} = Z_t$ were solved by formally inverting the operator:
$X_t = (1 - \phi B)^{-1} Z_t$
$(1 - \phi B)^{-1} = \frac{1}{1 - \phi B} = \sum_{j=0}^{\infty} \phi^j B^j$
$X_t = \chi(B) Z_t = \sum_{j=0}^{\infty} \phi^j Z_{t-j}$
which is the only solution as seen previously.
Question: How can we justify these manipulations?
Answer: Let $\{\alpha_j\}_j$ and $\{\beta_j\}_j$ be such that $\sum_j |\alpha_j| < \infty$ and $\sum_j |\beta_j| < \infty$. For any stationary time series $\{Y_t\}_t$, $\{\psi(B) Y_t\}_t$ is also stationary for $\psi(B) = \alpha(B) \circ \beta(B)$, where $\alpha (B) = \sum_{j=-\infty}^{\infty} \alpha_j B^j$ and $\beta(B) = \sum_{j=-\infty}^{\infty} \beta_j B^j$.
Now let $\alpha(B) = \chi(B) = \sum_{j=0}^{\infty} \phi^j B^j$ and $\beta(B) = \Phi(B) = 1 - \phi B$.
It is clear that the requirements are satisfied:
$\sum_{j=-\infty}^{\infty} |\alpha_j| = \sum_{j=0}^{\infty} |\phi|^j < \infty$ and $\sum_{j=-\infty}^\infty |\beta_j| = 1 + |\phi| < \infty$
$\underbrace{\beta(B)}_{\Phi(B)} X_t = Z_t \Rightarrow (\underbrace{\alpha(B) \circ \beta(B)}_{\psi(B)}) X_t = \alpha(B) Z_t = \chi(B) Z_t = \sum_{j=0}^{\infty}\phi^j Z_{t-j}$.
Recall that $\psi_j = \sum_{k=-\infty}^{\infty} \beta_k \alpha_{j-k}$ where $\alpha_j = \begin{cases} \phi^j & j \ge 0 \\ 0 & \text{otherwise} \end{cases}$ and $\beta_j = \begin{cases} 1 & j=0 \\ -\phi & j=1 \\ 0 & \text{otherwise} \end{cases}$, we have
$$ \psi_j = \alpha_j - \phi \alpha_{j-1} = \begin{cases} 1 & j = 0\\ \phi^j - \phi \cdot \phi^{j-1} = 0 & j \ge 1\\ 0 & j < 0 \end{cases} $$
As a result
$$\psi(B) = \sum_{j=-\infty}^{\infty} \psi_j B^j = B^0 = I.$$
Moreover we have
$$IX_t = \sum_{j=0}^{\infty} \phi^j Z_{t-j} \Leftrightarrow X_t = \sum_{j=0}^{\infty} \phi^j Z_{t-j}.$$
It is also possible to directly show that $\psi(B) = I$. Generally, we can use these manipulations as follows:
Let $\{Y_t\}_t$ be a stationary time series and $\{\psi_j\}_j$ a real sequence with $\sum_j |\psi_j| < \infty$.
Put $\psi(z) = \sum_{j=-\infty}^{\infty} \psi_j z^j$ for all $z$ in the convergence domain. Suppose that $\frac{1}{\psi(z)} = \sum_{j=-\infty}^{\infty} \chi_j z^j$ for some real sequence $\{\chi_j\}_j$ with $\sum_{j=-\infty}^{\infty} |\chi_j| < \infty$, on a suitable convergence domain.
Then the stationary solution, $\{X_t\}_t$ of $\psi(B)X_t = Y_t$ is given by
$$X_t = \chi(B) Y_t.$$
In fact, we can show that $\chi(B) \circ \psi(B) = I$.
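The identity $\chi(B) \circ \Phi(B) = I$ can be illustrated for the $AR(1)$ case by convolving truncated coefficient sequences; the only nonzero remainder is the truncation tail $-\phi^K$ at lag $K$. The value of $\phi$ is hypothetical:

```python
import numpy as np

phi = 0.7                      # hypothetical AR(1) coefficient, |phi| < 1
K = 50                         # truncation order of the power series

chi = phi ** np.arange(K)      # chi(z) = sum_{j=0}^{K-1} phi^j z^j
Phi = np.array([1.0, -phi])    # Phi(z) = 1 - phi z

# Coefficient convolution realizes the operator composition chi(B) Phi(B);
# up to the truncation tail, the result is the identity I = B^0.
psi = np.convolve(chi, Phi)
print(psi[0])                                 # 1.0
print(bool(np.max(np.abs(psi[1:])) < 1e-7))   # True (tail ~ phi^K ~ 1.8e-8)
```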