In [1]:
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
sns.set()
np.random.seed(0)

Parallels between the two views

To understand the parallel between the two views more clearly, we start with the weight space view and make some draws.

Let \begin{equation} \phi(x) = \begin{bmatrix} 1 \\ \sin(x) \end{bmatrix} \end{equation}

The draws of functions from this space thus look like \begin{equation} f(x) = w_0 + w_1 \sin{x} \end{equation}

We will draw the weights from a unit normal distribution \begin{equation} \begin{bmatrix} w_0 \\ w_1 \end{bmatrix} \sim \mathcal{N}\left(0,\Sigma_p=\mathbb{I}\right) \end{equation}

In [2]:
x = np.linspace(0,2*np.pi,20)

# Draw 5 sets of weights (w_0, w_1) from a standard normal
w_0, w_1 = np.random.normal(scale=1.0,size=[2,1,5])

# Each column of f is one draw of f(x) = w_0 + w_1 sin(x)
f = w_0 + w_1*np.sin(x.reshape([-1,1]))
plt.plot(x,f);

Now we compute the covariance kernel induced by these basis functions:

\begin{align} k(x,x') &= \phi(x)^T \Sigma_p \phi(x')\\ &= 1 + \sin(x) \sin(x') \end{align}

We now generate draws using the function space point of view: we draw from a GP with mean 0 and the covariance kernel computed above.

In [3]:
x = np.linspace(0,2*np.pi,20)
μ = np.zeros_like(x)

# Evaluate k(x,x') = 1 + sin(x) sin(x') on the grid
cov = 1 + np.sin(x.reshape([1,-1]))*np.sin(x.reshape([-1,1]))
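As a quick sanity check (not part of the original notebook), the same matrix can also be built directly from the feature map, \(k(x,x') = \phi(x)^T \Sigma_p \phi(x')\) with \(\Sigma_p = \mathbb{I}\); the two constructions should agree exactly:

```python
import numpy as np

x = np.linspace(0, 2*np.pi, 20)

# Feature matrix: row i is phi(x_i) = [1, sin(x_i)]
Phi = np.stack([np.ones_like(x), np.sin(x)], axis=1)  # shape (20, 2)

# With Sigma_p = I, the kernel matrix is Phi @ Phi.T
cov_features = Phi @ Phi.T

# Explicit formula: k(x, x') = 1 + sin(x) sin(x')
cov_explicit = 1 + np.sin(x.reshape([1, -1])) * np.sin(x.reshape([-1, 1]))

print(np.allclose(cov_features, cov_explicit))  # True
```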
In [4]:
# 5 draws from the GP: y ~ N(μ, cov)
y = np.random.multivariate_normal(μ, cov, size=5)
plt.plot(x,y.transpose());

We draw functions from the same space!
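We can check this claim empirically (a sketch, not in the original notebook): the covariance of many weight-space draws, estimated across samples, should match the analytic kernel up to Monte Carlo error.

```python
import numpy as np

np.random.seed(0)
x = np.linspace(0, 2*np.pi, 20)

# Many weight-space draws: f(x) = w_0 + w_1 sin(x), with w ~ N(0, I)
n_draws = 100_000
w = np.random.normal(size=[2, n_draws])
f = w[0] + w[1] * np.sin(x.reshape([-1, 1]))   # shape (20, n_draws)

# Empirical covariance across draws vs the analytic kernel
emp_cov = np.cov(f)                            # (20, 20)
kernel = 1 + np.sin(x.reshape([1, -1])) * np.sin(x.reshape([-1, 1]))

# The maximum deviation is small and shrinks as n_draws grows
print(np.abs(emp_cov - kernel).max())
```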

We look at the covariance kernel more closely:

In [5]:
plt.pcolor(cov)
plt.colorbar();
In [6]:
e_val,_ = np.linalg.eigh(cov)
plt.semilogy(e_val[::-1],'-x');

Since we have only 2 basis functions (from the weight space view), the corresponding kernel has only two non-zero eigenvalues. In fact, the eigenfunctions of the covariance kernel give us back the basis functions: the Karhunen–Loève (KL) decomposition of the 'function space view' gives us the parameters for the 'weight space view'. If we were to use something like a squared exponential kernel, the space of functions would be all smooth functions, and the covariance kernel would have infinitely many non-zero eigenvalues.
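To make this concrete, here is a small sketch (assuming the same grid as above): the kernel matrix has exactly two numerically non-zero eigenvalues, and the corresponding eigenvectors span the same space as the basis functions \([1, \sin(x)]\) evaluated on the grid.

```python
import numpy as np

x = np.linspace(0, 2*np.pi, 20)
cov = 1 + np.sin(x.reshape([1, -1])) * np.sin(x.reshape([-1, 1]))

# eigh returns eigenvalues in ascending order
e_val, e_vec = np.linalg.eigh(cov)

# Only two eigenvalues are (numerically) non-zero
print(np.sum(e_val > 1e-10))                   # 2

# The top two eigenvectors span the same space as [1, sin(x)]:
# projecting the basis functions onto them loses (almost) nothing
basis = np.stack([np.ones_like(x), np.sin(x)], axis=1)  # (20, 2)
top = e_vec[:, -2:]                            # eigenvectors of the 2 largest
proj = top @ (top.T @ basis)                   # project basis onto their span
print(np.allclose(proj, basis))                # True
```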