Inverting in subspaces

A matrix used as a learning matrix has to be invertible. A non-invertible matrix can only be inverted on the subspace that is the complement of its zero space. For a symmetric ${\bf A}$ we define the projector onto its zero space, ${\bf Q}_0 = {\bf I} - \sum_i \psi_i\,\psi_i^T$ (for the more general case of a normal ${\bf A}$ replace $\psi_i^T$ by the hermitian conjugate $\psi_i^\dagger$), and the projector onto its complement, ${\bf Q}_1 = {\bf I} - {\bf Q}_0 = \sum_i \psi_i\,\psi_i^T$, where the $\psi_i$ denote orthonormal eigenvectors of ${\bf A}$ with nonzero eigenvalues $a_i$, i.e., ${\bf A} \psi_i = a_i \psi_i \ne 0$. Then, denoting projected sub-matrices by ${\bf A}_{ij} = {\bf Q}_i {\bf A} {\bf Q}_j$, we have ${\bf A}_{00} = {\bf A}_{10} = {\bf A}_{01} = 0$, i.e.,

\begin{displaymath}
{\bf A} =
{\bf Q}_1 {\bf A} {\bf Q}_1
=
{\bf A}_{11}.
\end{displaymath} (660)
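For illustration, the following is a minimal numerical sketch (in NumPy, with a made-up $3\times 3$ singular symmetric matrix) that constructs ${\bf Q}_0$ and ${\bf Q}_1$ from the eigendecomposition and verifies Eq. (660):
\begin{verbatim}
import numpy as np

# Made-up singular symmetric example matrix (eigenvalues 2, 2, 0).
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]])

# Orthonormal eigenvectors; those with a_i != 0 span subspace 1.
a, psi = np.linalg.eigh(A)
keep = np.abs(a) > 1e-12

# Q1 projects onto the complement of the zero space, Q0 onto the zero space.
Q1 = psi[:, keep] @ psi[:, keep].T
Q0 = np.eye(3) - Q1

# Eq. (660): A has no components connecting to the zero space.
assert np.allclose(A, Q1 @ A @ Q1)
assert np.allclose(Q0 @ A @ Q0, 0.0)
\end{verbatim}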

In the update equation
\begin{displaymath}
{\bf A} \Delta \phi^{(i)} = \eta \, G
\end{displaymath} (661)

only ${\bf A}_{11}$ can be inverted. Writing $\phi_j = {\bf Q}_j \phi$ for a projected vector, the iteration scheme takes the form
\begin{displaymath}
\Delta \phi^{(i)}_1 = \eta \, {\bf A}_{11}^{-1} G_1,
\end{displaymath} (662)
\begin{displaymath}
0 = \eta \, G_0.
\end{displaymath} (663)

For positive semi-definite ${\bf A}$ the sub-matrix ${\bf A}_{11}$ is positive definite. If the second equation is already fulfilled, or its solution is postponed to a later iteration step, we have
\begin{displaymath}
\phi^{(i+1)}_1 = \phi^{(i)}_1 + \eta \, {\bf A}_{11}^{-1}
\left( T_1^{(i)} - {\bf K}_{11}^{(i)} \phi_1^{(i)} - {\bf K}_{10}^{(i)} \phi_0^{(i)} \right),
\end{displaymath} (664)
\begin{displaymath}
\phi^{(i+1)}_0 = \phi^{(i)}_0.
\end{displaymath} (665)

If the projector ${\bf Q}_0 = {\bf I}_0$ is diagonal in the chosen representation, the projected equation can be solved directly by skipping the corresponding components. Otherwise, one can use the Moore-Penrose inverse ${\bf A}^{\#}$ of ${\bf A}$ to solve the projected equation
\begin{displaymath}
\Delta \phi^{(i)} = \eta \, {\bf A}^{\#} G.
\end{displaymath} (666)
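In NumPy, for instance, the Moore-Penrose inverse is available as numpy.linalg.pinv, so Eq. (666) reduces to the following (a sketch under the same assumptions as above):
\begin{verbatim}
import numpy as np

def pinv_step(A, G, eta=0.5):
    """Eq. (666): Delta phi = eta A^# G.  The Moore-Penrose inverse
    equals A11^{-1} on subspace 1 and vanishes on the zero space."""
    return eta * np.linalg.pinv(A) @ G
\end{verbatim}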

Alternatively, an invertible operator $\tilde {\bf A}_{00}$ acting on the zero space can be added to ${\bf A}_{11}$ to obtain a complete iteration scheme, replacing ${\bf A}^{-1}$ by $({\bf A}_{11} + \tilde {\bf A}_{00})^{-1} = {\bf A}_{11}^{-1} + \tilde {\bf A}_{00}^{-1}$ (the two operators act on orthogonal subspaces):
\begin{displaymath}
\phi^{(i+1)} = \phi^{(i)}
+ \eta \, {\bf A}_{11}^{-1}
\left( T_1^{(i)} - {\bf K}_{11}^{(i)} \phi_1^{(i)} - {\bf K}_{10}^{(i)} \phi_0^{(i)} \right)
+ \eta \, \tilde {\bf A}_{00}^{-1}
\left( T_0^{(i)} - {\bf K}_{01}^{(i)} \phi_1^{(i)} - {\bf K}_{00}^{(i)} \phi_0^{(i)} \right).
\end{displaymath} (667)

The choice $\tilde {\bf A}_{00} = {\bf I}_{00}$, i.e., ${\bf A}^{-1} = \left( {\bf A}_{11} + {\bf I}_{00} \right)^{-1} = {\bf A}_{11}^{-1} + {\bf I}_{00} = {\bf A}_{11}^{-1} + {\bf Q}_{0}$, for instance, results in a gradient algorithm on the zero space, with additional coupling between the two subspaces.
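A sketch of the complete step of Eq. (667) for this particular choice (same placeholder names and assumptions as in the snippets above):
\begin{verbatim}
import numpy as np

def combined_step(A, K, T, phi, eta=0.5, tol=1e-12):
    """Eq. (667) with tilde A00 = I00: Newton-like step on subspace 1,
    plain gradient step on the zero space; the K-term couples the two."""
    a, psi = np.linalg.eigh(A)
    keep = np.abs(a) > tol
    U = psi[:, keep]
    A11_inv = U @ np.diag(1.0 / a[keep]) @ U.T    # A11^{-1} on subspace 1
    Q0 = np.eye(len(A)) - U @ U.T                 # projector onto zero space
    G = T - K @ phi
    return phi + eta * (A11_inv + Q0) @ G
\end{verbatim}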

