
Example: Distribution functions

Instead of formulating the prior in terms of the probability density function, one can formulate it in terms of its integral, the distribution function. The density $P$ is then recovered from the distribution function $\phi $ by differentiation,

\begin{displaymath}
P(\phi)
= \prod_k^{d_y} \frac{\partial \phi}{\partial y_k}
= \bigotimes_k^{d_y} {\bf R}_k^{-1} \phi
= {\bf R}^{-1} \phi,
\end{displaymath} (203)

resulting in a non-diagonal ${\bf P}^\prime$. The inverse of the derivative operator ${\bf R}^{-1}$ is the integration operator ${\bf R}$ = $\bigotimes_k^{d_y} {\bf R}_k$ with matrix elements
\begin{displaymath}
{\bf R} (x,y;x^\prime,y^\prime)
= \delta (x-x^\prime) \theta (y-y^\prime),
\end{displaymath} (204)

i.e.,
\begin{displaymath}
{\bf R}_k (x,y;x^\prime,y^\prime)
= \delta (x-x^\prime) \theta (y_k-y^\prime_k)
\prod_{l\ne k}\delta (y_l-y_l^\prime).
\end{displaymath} (205)
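
As a concrete illustration of (203)-(205), the following minimal numerical sketch discretizes the one-dimensional case ($d_y=1$) at fixed $x$: the integration operator ${\bf R}$ becomes a lower-triangular matrix of step functions, its inverse a backward-difference operator, and applying ${\bf R}^{-1}$ to $\phi $ = ${\bf R} P$ recovers the density. Grid, step size, and the example density are arbitrary choices, not taken from the text.
\begin{verbatim}
# Discretized 1-d sketch of Eqs. (203)-(205) at fixed x.
import numpy as np

n = 400
y = np.linspace(-6.0, 6.0, n)
dy = y[1] - y[0]

# Integration operator R: matrix elements theta(y_i - y_j) times dy.
R = np.tril(np.ones((n, n))) * dy
Rinv = np.linalg.inv(R)                     # discrete derivative operator

print(np.allclose(R @ Rinv, np.eye(n)))     # True: R^{-1} inverts R

p = np.exp(-0.5 * y**2) / np.sqrt(2*np.pi)  # example density P
phi = R @ p                                 # distribution function phi = R P
p_back = Rinv @ phi                         # Eq. (203): P = R^{-1} phi
print(np.max(np.abs(p_back - p)))           # zero up to round-off
\end{verbatim}
In this discretization ${\bf R}^{-1}$ is the backward-difference matrix divided by the step size, the lattice analogue of $\partial/\partial y$.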

Thus, (203) corresponds to the transformation of ($x$-conditioned) density functions $P$ into ($x$-conditioned) distribution functions $\phi $ = ${\bf R} P$, i.e., $\phi(x,y)$ = $\int_{-\infty}^y P(x,y^\prime )\, dy^\prime$. Because ${\bf R}^T {{\bf K}} {\bf R}$ is positive (semi-)definite if ${{\bf K}}$ is, a specific prior which is Gaussian in the distribution function $\phi $ is also Gaussian in the density $P$. The Jacobian ${\bf P}^\prime$ becomes
\begin{displaymath}
{\bf P}^\prime (x,y;x^\prime,y^\prime )
= \frac{\delta P(x,y)}{\delta \phi (x^\prime,y^\prime )}
= \delta (x-x^\prime ) \prod_k^{d_y}\delta^\prime (y_k-y_k^\prime ).
\end{displaymath} (206)
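
The statement above that ${\bf R}^T {\bf K} {\bf R}$ inherits positive (semi-)definiteness from ${\bf K}$ can be checked numerically on the same discretization; the random ${\bf K}$ below is only an illustrative choice, not one of the prior operators discussed in the text.
\begin{verbatim}
# Positive semi-definiteness of R^T K R for positive semi-definite K.
import numpy as np

rng = np.random.default_rng(0)
n, dy = 100, 0.1
R = np.tril(np.ones((n, n))) * dy    # discretized integration operator

A = rng.standard_normal((n, n))
K = A @ A.T                          # generic positive semi-definite K

eigvals = np.linalg.eigvalsh(R.T @ K @ R)
print(eigvals.min() >= -1e-8)        # True: no negative eigenvalues
\end{verbatim}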

Here the derivative of the $\delta$-function is defined by formal integration by parts,
\begin{displaymath}
\int_{-\infty}^\infty \!dy^\prime\, f(y^\prime)\, \delta^\prime (y^\prime -y)
= f(y^\prime ) \delta(y^\prime -y)\Big\vert _{-\infty}^\infty
- f^\prime (y)
.
\end{displaymath} (207)
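
A quick numerical consistency check of (207), replacing the $\delta$-function by a narrow Gaussian, is sketched below; test function, evaluation point and width are arbitrary, and the boundary term vanishes because $f$ decays at infinity.
\begin{verbatim}
# Check Eq. (207): integrating f against delta'(y'-y) gives -f'(y)
# when f vanishes at +/- infinity (the boundary term drops out).
import numpy as np

yp = np.linspace(-10.0, 10.0, 20001)
dyp = yp[1] - yp[0]
y, eps = 0.7, 0.05                       # evaluation point, delta width

f = np.exp(-yp**2)                       # test function f(y')
u = yp - y
delta_prime = -u / eps**2 * np.exp(-u**2 / (2*eps**2)) / (np.sqrt(2*np.pi) * eps)

lhs = np.sum(f * delta_prime) * dyp      # int dy' f(y') delta'(y'-y)
rhs = 2.0 * y * np.exp(-y**2)            # -f'(y)
print(lhs, rhs)                          # agree up to O(eps^2)
\end{verbatim}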

Fixing $\phi(x,-\infty) = 0$, the variational derivative $\delta/( \delta \phi (x, -\infty) )$ is not needed. For the distribution function $\phi $ = ${\bf R} P$ the normalization condition for $P$ becomes the boundary condition $\phi(x,\infty) = 1$, $\forall x\in X$, and the non-negativity condition for $P$ corresponds to the monotonicity condition $\phi(x,y) \ge \phi(x,y^\prime)$, $\forall y\ge y^\prime$, $\forall x\in X$, together with $\phi(x,-\infty) \ge 0$, $\forall x\in X$.
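
These conditions can again be made concrete on a grid: integrating a normalized, non-negative example density yields a $\phi $ that starts at (approximately) $0$, ends at $1$ and is monotone non-decreasing. Density and grid below are arbitrary illustrative choices.
\begin{verbatim}
# Boundary and monotonicity conditions on phi = R P for a normalized,
# non-negative example density P.
import numpy as np

y = np.linspace(-8.0, 8.0, 4001)
dy = y[1] - y[0]
p = np.exp(-np.abs(y))                   # non-negative example density
p /= np.sum(p) * dy                      # enforce normalization of P

phi = np.cumsum(p) * dy                  # phi = R P
print(phi[0], phi[-1])                   # ~0 and 1: boundary conditions
print(np.all(np.diff(phi) >= 0.0))       # True: phi is monotone
\end{verbatim}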

