Tom's second Feb 21 edits of svd_intro masterpiece
thomassargent30 committed Feb 21, 2022
1 parent 60fbd63 commit 1b3d3d4
Showing 1 changed file with 23 additions and 19 deletions: lectures/svd_intro.md
## Reduced-order VAR
DMD is a natural tool for estimating a **reduced-order vector autoregression**, an object that we define in terms of the population regression equation
$$
X_{t+1} = \check A X_t + C \epsilon_{t+1}
$$ (eq:VARred)
where
* $X_t$ is an $m \times 1$ vector
* $\check A$ is an $m \times m$ matrix of rank $r$ whose eigenvalues are all less than $1$ in modulus
* $\epsilon_{t+1} \sim {\mathcal N}(0, I)$ is an $m \times 1$ vector of i.i.d. shocks
* $E \epsilon_{t+1} X_t = 0$, so that the shocks are orthogonal to the regressors
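As a concrete illustration, here is a minimal sketch that simulates a system of the form {eq}`eq:VARred`; the particular low-rank $\check A$, the shock loading $C$, and all names are assumptions for the example, not part of the lecture's data.

```python
import numpy as np

def simulate_var(A_check, C, n, seed=None):
    """Simulate X_{t+1} = A_check @ X_t + C @ eps_{t+1} for n periods."""
    rng = np.random.default_rng(seed)
    m = A_check.shape[0]
    X = np.empty((m, n))
    X[:, 0] = rng.standard_normal(m)
    for t in range(n - 1):
        X[:, t + 1] = A_check @ X[:, t] + C @ rng.standard_normal(m)
    return X

# Hypothetical example: m = 4, rank r = 2
m, r = 4, 2
rng = np.random.default_rng(0)
A_check = rng.standard_normal((m, r)) @ rng.standard_normal((r, m))
# rescale so all eigenvalues have modulus less than 1, as assumed above
A_check *= 0.9 / np.max(np.abs(np.linalg.eigvals(A_check)))
C = 0.1 * np.eye(m)

X = simulate_var(A_check, C, n=200, seed=1)
print(np.linalg.matrix_rank(A_check))  # 2
```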
so that according to model {eq}`eq:VARred`
$$
X' = \begin{bmatrix} \check A X_1 + C \epsilon_2 \mid \check A X_2 + C \epsilon_3 \mid \cdots \mid \check A X_{n-1} + C \epsilon_n \end{bmatrix}
$$
To illustrate some useful calculations, assume that $n = 3$ and form
$$
X' X^T = \begin{bmatrix} \check A X_1 + C \epsilon_2 & \check A X_2 + C \epsilon_3 \end{bmatrix}
\begin{bmatrix} X_1^T \cr X_2^T \end{bmatrix}
$$
Evidently,

$$
X' X^T = \check A X X^T + C \left( \epsilon_2 X_1^T + \epsilon_3 X_2^T \right)
$$
but because
$$
E ( \epsilon_2 X_1^T + \epsilon_3 X_2^T) = 0
$$
we have
$$
E X' X^T = \check A E X X^T
$$

$$
C \epsilon_{t+1} = X_{t+1} - \check A X_t , \quad t =1, \ldots, n-1
$$
and check whether they are serially uncorrelated as assumed in {eq}`eq:VARred`.
For example, we can compute spectra and cross-spectra of components of $C \epsilon_{t+1}$
and check for the absence of serial correlation in the usual ways.
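One simple version of such a check computes sample autocorrelations of a residual series, which should be close to zero at all nonzero lags; the series `e` below is a white-noise stand-in for an actual residual component.

```python
import numpy as np

rng = np.random.default_rng(0)
e = rng.standard_normal(1000)   # stand-in for a component of C eps_{t+1}

def sample_autocorr(x, max_lag):
    """Sample autocorrelations r_1, ..., r_{max_lag} of a 1-D series."""
    x = x - x.mean()
    denom = x @ x
    return np.array([x[:-k] @ x[k:] / denom for k in range(1, max_lag + 1)])

r = sample_autocorr(e, 10)
# for serially uncorrelated data, |r_k| is typically within ~2/sqrt(T) of zero
print(np.round(r, 3))
```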
We can also estimate the covariance matrix of $C \epsilon_{t+1}$
from
$$
\frac{1}{n-1} \sum_{t=1}^{n-1} (C \epsilon_{t+1} )( C \epsilon_{t+1})^T
$$
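A sketch of the residual and covariance calculations, assuming data matrices are in hand (the random `data` below is only a placeholder, and the names `X`, `Xprime`, `A_hat` are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# placeholder data: n observations of an m-vector (replace with real data)
m, n = 4, 200
data = rng.standard_normal((m, n))

X, Xprime = data[:, :-1], data[:, 1:]        # X_1..X_{n-1} and X_2..X_n

# least-squares estimate of A_check implied by E X' X^T = A_check E X X^T
A_hat = (Xprime @ X.T) @ np.linalg.pinv(X @ X.T)

# residuals C eps_{t+1} = X_{t+1} - A_check X_t, t = 1, ..., n-1
resid = Xprime - A_hat @ X

# sample covariance matrix (1/(n-1)) sum of outer products
Sigma_hat = (resid @ resid.T) / (n - 1)
print(Sigma_hat.shape)  # (4, 4)
```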
It can be enlightening to diagonalize our reduced-order VAR {eq}`eq:VARred` by noting that it can be written
$$
X_{t+1} = \Phi \Lambda \Phi^{+} X_t + C \epsilon_{t+1}
$$
and then writing it as
$$
\Phi^+ X_{t+1} = \Lambda \Phi^{+} X_t + \Phi^+ C \epsilon_{t+1}
$$

or
$$
\tilde X_{t+1} = \Lambda \tilde X_t + \tilde \epsilon_{t+1}
$$ (eq:VARmodes)
where $\tilde X_t \equiv \Phi^{+} X_t$ is an $r \times 1$ vector of **modes** and $\tilde \epsilon_{t+1} \equiv \Phi^{+} C \epsilon_{t+1}$ is an $r \times 1$ shock.
The $r$ modes $\tilde X_t$ obey the first-order VAR {eq}`eq:VARmodes` in which $\Lambda$ is an $r \times r$ diagonal matrix.
Note that while $\Lambda$ is diagonal, the contemporaneous covariance matrix of $\tilde \epsilon_{t+1}$ need not be.
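To see the diagonalization concretely, the following sketch builds a rank-$r$ transition matrix $\check A = \Phi \Lambda \Phi^{+}$ from an assumed mode matrix $\Phi$ and diagonal $\Lambda$ (both hypothetical), and verifies that premultiplying by $\Phi^{+}$ diagonalizes the dynamics:

```python
import numpy as np

rng = np.random.default_rng(0)

# assumed m x r mode matrix Phi and r x r diagonal Lambda
m, r = 4, 2
Phi = rng.standard_normal((m, r))
Lam = np.diag([0.9, 0.5])
Phi_plus = np.linalg.pinv(Phi)     # satisfies Phi^+ Phi = I_r

A_check = Phi @ Lam @ Phi_plus     # rank-r transition matrix

# premultiplying X_{t+1} = A_check X_t by Phi^+ gives the mode dynamics
# X_tilde_{t+1} = Lam X_tilde_t, with X_tilde_t = Phi^+ X_t
X_t = rng.standard_normal(m)
X_tp1 = A_check @ X_t              # noiseless step for illustration
print(np.allclose(Phi_plus @ X_tp1, Lam @ (Phi_plus @ X_t)))  # True
```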
**Remark:** It is permissible for $X_t$ to contain lagged values of observables. For example:
$$
X_t = \begin{bmatrix}
y_{1, t} \cr
y_{2, t} \cr
y_{1, t-1} \cr
y_{2, t-1} \cr
\end{bmatrix}
$$
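A small sketch of building such a stacked state from observable series (the helper name and toy data are assumptions for illustration):

```python
import numpy as np

def stack_with_lags(Y, lags=1):
    """Stack a k x T array of observables with `lags` lagged copies.

    Returns a (k*(lags+1)) x (T-lags) matrix whose column t is
    [y_t; y_{t-1}; ...; y_{t-lags}].
    """
    k, T = Y.shape
    cols = [Y[:, lags - j : T - j] for j in range(lags + 1)]
    return np.vstack(cols)

Y = np.arange(10.0).reshape(2, 5)   # two observables, five periods
X = stack_with_lags(Y, lags=1)
print(X.shape)  # (4, 4)
```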
+++
## Source for Some Python Code
