Merge pull request #15 from AaltoML/develop
Develop
William J. Wilkinson authored Nov 29, 2022
2 parents af3f625 + f583909 commit f50cd2c
Showing 4 changed files with 43 additions and 6 deletions.
36 changes: 34 additions & 2 deletions README.md
@@ -1,18 +1,50 @@
# Bayes-Newton

Bayes-Newton is a library for approximate inference in Gaussian processes (GPs) in [JAX](https://github.com/google/jax) (with [objax](https://github.com/google/objax)), built and actively maintained by [Will Wilkinson](https://wil-j-wil.github.io/).
Bayes-Newton is a library for approximate inference in Gaussian processes (GPs) in [JAX](https://github.com/google/jax) (with [objax](https://github.com/google/objax)), built and maintained by [Will Wilkinson](https://wil-j-wil.github.io/).

Bayes-Newton provides a unifying view of approximate Bayesian inference and allows many models (e.g. GPs, sparse GPs, Markov GPs, sparse Markov GPs) to be combined with the inference method of your choice (VI, EP, Laplace, Linearisation). For a full list of the implemented methods, scroll down to the bottom of this page.

The methodology is outlined in the following article:
* W.J. Wilkinson, S. Särkkä, and A. Solin (2021): **Bayes-Newton Methods for Approximate Bayesian Inference with PSD Guarantees**. [*arXiv preprint arXiv:2111.01721*](https://arxiv.org/abs/2111.01721).
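
To give a feel for how these model and inference combinations are exposed, here is a minimal sketch; the class names follow the library's `<Model><Inference>GP` naming pattern and are illustrative assumptions, not content taken from this diff:
```python
# Illustrative only: each model class pairs a GP model family with an inference
# method. The class names below assume the library's naming pattern and may
# need checking against bayesnewton.models.
import numpy as np
import bayesnewton

x = np.linspace(0.0, 10.0, 100)
y = np.sin(x) + 0.1 * np.random.randn(100)

kern = bayesnewton.kernels.Matern52()
lik = bayesnewton.likelihoods.Gaussian()

# Markov GP with variational inference (VI)
vi_model = bayesnewton.models.MarkovVariationalGP(kernel=kern, likelihood=lik, X=x, Y=y)
# The same Markov GP with expectation propagation (EP)
ep_model = bayesnewton.models.MarkovExpectationPropagationGP(kernel=kern, likelihood=lik, X=x, Y=y)
```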

## Installation

Install the latest (stable) release from PyPI:
```bash
pip install bayesnewton
```

## Example
For *development*, you may want to use the latest source from GitHub: from a checkout of the develop branch of the BayesNewton GitHub repository, run
```bash
pip install -e .
```

### Step-by-step: Getting started with the examples

To run the demos or experiments in this repository, or to build on top of it, first create and activate a virtual environment:
```bash
python3 -m venv venv
source venv/bin/activate
```

Install all required dependencies for the examples:
```bash
python -m pip install -r requirements.txt
python -m pip install -e .
```
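
As an optional sanity check that the editable install is importable (a minimal sketch, assuming the steps above completed without error):
```python
# Optional sanity check: these imports should succeed after the steps above.
import jax
import objax
import bayesnewton

print("JAX version:", jax.__version__)
```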

Running the tests additionally requires a specific version of GPflow to test against:
```bash
python -m pip install pytest
python -m pip install tensorflow==2.10 tensorflow-probability==0.18.0 gpflow==2.5.2
```

Run the tests:
```bash
cd tests; pytest
```

## Simple Example
Given some inputs `x` and some data `y`, you can construct a Bayes-Newton model as follows,
```python
kern = bayesnewton.kernels.Matern52()
# ...
```
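
Only the first line of the example is shown in this hunk. For orientation, here is a minimal sketch of how such a model is typically built and trained, following the library's documented usage; the `Gaussian` likelihood, the `MarkovVariationalGP` class, and the objax training loop are assumptions, not part of the diff above:
```python
# Minimal sketch of a full workflow, assuming the documented Bayes-Newton API.
import numpy as np
import objax
import bayesnewton

x = np.linspace(0.0, 10.0, 200)
y = np.sin(x) + 0.1 * np.random.randn(200)

kern = bayesnewton.kernels.Matern52()
lik = bayesnewton.likelihoods.Gaussian()
model = bayesnewton.models.MarkovVariationalGP(kernel=kern, likelihood=lik, X=x, Y=y)

lr_adam = 0.1
lr_newton = 1.0
opt_hypers = objax.optimizer.Adam(model.vars())
energy = objax.GradValues(model.energy, model.vars())

@objax.Function.with_vars(model.vars() + opt_hypers.vars())
def train_op():
    model.inference(lr=lr_newton)   # approximate inference step (updates the approximate posterior)
    dE, E = energy()                # training objective and its gradients w.r.t. the hyperparameters
    opt_hypers(lr_adam, dE)         # Adam step on kernel/likelihood hyperparameters
    return E

train_op = objax.Jit(train_op)
for i in range(20):
    loss = train_op()

mu, var = model.predict(X=x)        # posterior mean and marginal variance at the inputs
```
Here `model.predict` returns the posterior mean and marginal variance, matching the `predict` signature visible in the `basemodels.py` diff below.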
2 changes: 1 addition & 1 deletion bayesnewton/basemodels.py
@@ -813,7 +813,7 @@ def predict(self, X=None, R=None, pseudo_lik_params=None):

# if np.squeeze(test_var).ndim > 2: # deal with spatio-temporal case (discard spatial covariance)
if self.spatio_temporal: # deal with spatio-temporal case (discard spatial covariance)
test_var = diag(np.squeeze(test_var))
test_var = diag(test_var)
return np.squeeze(test_mean), np.squeeze(test_var)

def filter_energy(self):
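
For context, the modified line keeps the per-time-step marginal variances while discarding spatial cross-covariances. A generic illustration of that idea in JAX, with hypothetical shapes and not the library's own `diag` helper:
```python
# Hypothetical illustration: given a stack of per-time-step spatial covariance
# matrices of shape (T, S, S), keep only the marginal variances of shape (T, S).
import jax
import jax.numpy as jnp

T, S = 5, 3
key = jax.random.PRNGKey(0)
A = jax.random.normal(key, (T, S, S))
covs = A @ jnp.swapaxes(A, 1, 2)          # (T, S, S) stack of PSD matrices

marginal_vars = jax.vmap(jnp.diag)(covs)  # (T, S): diagonal of each matrix
print(marginal_vars.shape)
```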
1 change: 1 addition & 0 deletions requirements.txt
@@ -1,6 +1,7 @@
jax==0.2.9
jaxlib==0.1.60
objax==1.3.1
numba
numpy
matplotlib
scipy
10 changes: 7 additions & 3 deletions tests/test_vs_gpflow_shutters.py
@@ -168,9 +168,13 @@ def test_gradient_step(var_f, len_f, var_y):
loss_fn = gpflow_model.training_loss_closure(data)
adam_vars = gpflow_model.trainable_variables
adam_opt.minimize(loss_fn, adam_vars)
gpflow_hypers = np.array([gpflow_model.kernel.lengthscales.numpy()[0],
gpflow_model.kernel.lengthscales.numpy()[1],
gpflow_model.kernel.variance.numpy(),
#gpflow_hypers = np.array([gpflow_model.kernel.lengthscales.numpy()[0],
# gpflow_model.kernel.lengthscales.numpy()[1],
# gpflow_model.kernel.variance.numpy(),
# gpflow_model.likelihood.variance.numpy()])
gpflow_hypers = np.array([gpflow_model.kernel.parameters[0].numpy(),
gpflow_model.kernel.parameters[2].numpy(),
gpflow_model.kernel.parameters[1].numpy(),
gpflow_model.likelihood.variance.numpy()])
print(gpflow_hypers)
print(gpflow_grads)
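
The updated test reads the hyperparameters positionally from `kernel.parameters` instead of by attribute name. A small sketch for verifying which index corresponds to which hyperparameter, assuming GPflow 2.5.x and a placeholder kernel rather than the exact one used in this test:
```python
# Sketch: inspect the ordering of a GPflow kernel's .parameters tuple so that
# positional indexing (as in the test above) can be checked against named access.
import gpflow

kern = gpflow.kernels.Matern52(lengthscales=[1.0, 2.0], variance=0.5)

print(kern.lengthscales.numpy(), kern.variance.numpy())  # named access
for i, p in enumerate(kern.parameters):                  # positional access
    print(i, p.numpy())
```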
