A bare-bones TensorFlow framework for Bayesian deep learning and Gaussian process approximation
.. |copy| unicode:: 0xA9
.. image:: https://circleci.com/gh/data61/aboleth/tree/develop.svg?style=svg&circle-token=f02db635cf3a7e998e17273c91f13ffae7dbf088
    :target: https://circleci.com/gh/data61/aboleth/tree/develop
    :alt: circleCI

.. image:: https://readthedocs.org/projects/aboleth/badge/?version=stable
    :target: http://aboleth.readthedocs.io/en/stable/?badge=stable
    :alt: Documentation Status
A TensorFlow_ framework for Bayesian deep learning and Gaussian process approximation [1]_ with stochastic gradient variational Bayes inference [2]_.
Some of the features of Aboleth:

- Compatible and interoperable with other neural net frameworks such as
  Keras_ (see the demos_ for more information).
The purpose of Aboleth is to provide a set of high performance and light weight components for building Bayesian neural nets and approximate (deep) Gaussian process computational graphs. We aim for minimal abstraction over pure TensorFlow, so you can still assign parts of the computational graph to different hardware, use your own data feeds/queues, and manage your own sessions etc.
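The ``>>`` layer-composition style used in the example below can be sketched in plain Python via operator overloading. The ``Layer`` class here is an illustrative toy of the general idea, not Aboleth's actual implementation:

```python
# Toy sketch of ">>"-style function composition (illustrative only --
# not Aboleth's actual implementation). Each Layer wraps a callable,
# and __rshift__ chains them so that (f >> g)(x) == g(f(x)).

class Layer:
    def __init__(self, fn):
        self.fn = fn

    def __call__(self, x):
        return self.fn(x)

    def __rshift__(self, other):
        # Compose: feed this layer's output into the next layer.
        return Layer(lambda x: other(self(x)))


# Chain two layers: add one, then double.
net = Layer(lambda x: x + 1) >> Layer(lambda x: x * 2)
print(net(3))  # (3 + 1) * 2 = 8
```

Because each composed object is itself a ``Layer``, arbitrarily long chains can be built up with the same operator.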
Here is an example of building a simple Bayesian neural net classifier with one hidden layer and Normal prior/posterior distributions on the network weights:
.. code-block:: python

    import tensorflow as tf
    import aboleth as ab

    # Define the network; ">>" implements function composition,
    # the InputLayer gives a kwarg for this network, and
    # allows us to specify the number of samples for stochastic
    # gradient variational Bayes.
    net = (
        ab.InputLayer(name="X", n_samples=5) >>
        ab.DenseVariational(output_dim=100) >>
        ab.Activation(tf.nn.relu) >>
        ab.DenseVariational(output_dim=1)
    )

    # D is the number of input features; N is the dataset size.
    X_ = tf.placeholder(tf.float32, shape=(None, D))
    Y_ = tf.placeholder(tf.float32, shape=(None, 1))

    # Build the network, nn, and the parameter regularization, kl
    nn, kl = net(X=X_)

    # Define the likelihood model
    likelihood = tf.distributions.Bernoulli(logits=nn).log_prob(Y_)

    # Build the final loss function to use with TensorFlow train
    loss = ab.elbo(likelihood, kl, N)

    # Now your TensorFlow training code here!
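As a rough guide to what the ``elbo`` loss represents, here is a NumPy sketch of the standard stochastic ELBO estimate. This is a hedged illustration of the formula, not necessarily ``ab.elbo``'s exact implementation: the per-datum log-likelihood is averaged over posterior samples, summed over the mini-batch, rescaled to the full dataset size ``N``, and the KL term is subtracted.

```python
import numpy as np


def elbo_estimate(log_likelihood, kl, N):
    """Stochastic ELBO estimate (illustrative sketch only).

    log_likelihood has shape (n_samples, batch_size): one log-probability
    per posterior sample and per mini-batch datum.
    """
    batch_size = log_likelihood.shape[1]
    # Monte Carlo average over posterior samples, summed over the batch,
    # then rescaled from the mini-batch to the full dataset of size N.
    ell = log_likelihood.mean(axis=0).sum() * (N / batch_size)
    # Subtract the KL divergence between the posterior and the prior.
    return ell - kl


# Toy numbers: 5 posterior samples, mini-batch of 10, dataset of 100.
ll = np.log(np.full((5, 10), 0.5))
print(elbo_estimate(ll, kl=2.0, N=100.0))
```

Maximising this quantity (or minimising its negative as a loss) trades data fit against the KL regulariser on the weights.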
At the moment the focus of Aboleth is on supervised tasks; however, this is subject to change in subsequent releases if there is interest in other capabilities.
NOTE: Aboleth is a Python 3 library only. Some of its functionality depends on features only found in Python 3. Sorry.
To get up and running quickly you can use pip and get the Aboleth package from
PyPI_::

    $ pip install aboleth
For the best performance on your architecture, we recommend installing
TensorFlow from sources_.
Or, to install the additional dependencies required by the demos_::

    $ pip install aboleth[demos]
To install in develop mode with packages required for development we recommend
you clone the repository from GitHub::

    $ git clone git@github.com:data61/aboleth.git
Then in the directory that you cloned into, issue the following::

    $ pip install -e .[dev]
See the quick start guide_ to get started, and for a more in-depth guide, have
a look at our tutorials_. Also see the demos_ folder for more examples of
creating and training algorithms with Aboleth.

The full project documentation can be found on readthedocs_.
.. [1] Cutajar, K., Bonilla, E., Michiardi, P. and Filippone, M. Random
   Feature Expansions for Deep Gaussian Processes. In ICML, 2017.
.. [2] Kingma, D. P. and Welling, M. Auto-encoding Variational Bayes. In
   ICLR, 2014.
.. [3] Hafner, D., Tran, D., Irpan, A., Lillicrap, T. and Davidson, J.
   Reliable Uncertainty Estimates in Deep Neural Networks using Noise
   Contrastive Priors. arXiv preprint arXiv:1807.09289, 2018.
Copyright |copy| 2017 CSIRO (Data61)
Licensed under the Apache License, Version 2.0 (the "License"); you may not
use this file except in compliance with the License. You may obtain a copy of
the License at

    http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.