- Added `SoftMax` likelihood (#799)
- Added likelihoods where expectations are evaluated with Monte Carlo, `MonteCarloLikelihood` (#799)
- GPflow monitor refactoring, check `monitor-tensorboard.ipynb` for details (#792)
- Speedup of testing on Travis using utility functions for configuration in notebooks (#789)
- Support Python 3.5.2 in typing checks (Ubuntu 16.04 default python3) (#787)
- Corrected scaling in the Student-t likelihood variance (#777)
- Removed jitter before taking the Cholesky of the covariance in the NatGrad optimizer (#768)
- Added GPflow logger. Created option for setting the logger level in `gpflowrc` (#764)
- Fixed bug in `params_as_tensors_for` (#751)
- Fixed GPflow SciPy optimizer to pass options to the actual scipy optimizer correctly (#738)
- Improved quadrature for likelihoods. Unified quadrature method introduced - `ndiagquad` (#736), (#747)
- Added support for multi-output GPs, check `multioutput.ipynb` for details (#724); a usage sketch follows this list
  - Multi-output features
  - Multi-output kernels
  - Multi-dispatch for conditional
  - Multi-dispatch for Kuu and Kuf
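A minimal sketch of a shared-independent multi-output SVGP, following `multioutput.ipynb`; the module and class names (`SharedIndependentMok`, `SharedIndependentMof`) are taken from that notebook and may differ between versions:

```python
import numpy as np
import gpflow
import gpflow.multioutput.features as mf
import gpflow.multioutput.kernels as mk

P = 3                       # number of outputs
X = np.random.rand(100, 1)  # shared inputs
Y = np.random.rand(100, P)  # one column per output
Z = X[::10].copy()          # inducing inputs

# a single latent kernel shared across all P outputs
kern = mk.SharedIndependentMok(gpflow.kernels.RBF(1), P)
feat = mf.SharedIndependentMof(gpflow.features.InducingPoints(Z))
m = gpflow.models.SVGP(X, Y, kern, gpflow.likelihoods.Gaussian(), feat=feat)
```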
- Support Exponential distribution as prior (#717)
- Added a notebook demonstrating advanced usage of GPflow, such as combining a GP with a neural network (#712)
- Minibatch shape is `None` by default to allow dynamic change of data size (#704)
- The epsilon parameter of the Robustmax likelihood is now trainable (#635)
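A minimal sketch of the dynamic data size, assuming the `SVGP` constructor of this release with its `minibatch_size` argument; the data arrays are illustrative:

```python
import numpy as np
import gpflow

X, Y = np.random.rand(100, 1), np.random.rand(100, 1)
m = gpflow.models.SVGP(X, Y, gpflow.kernels.RBF(1),
                       gpflow.likelihoods.Gaussian(),
                       Z=X[::10].copy(), minibatch_size=25)

# the data placeholders now have shape None, so a dataset of a different
# size can be assigned without rebuilding the TensorFlow graph
m.X, m.Y = np.random.rand(500, 1), np.random.rand(500, 1)
```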
- GPflow model saver (#660)
  - Supports native GPflow models and provides an interface for defining custom savers for users' models
  - Stores GPflow structures and pythonic types as numpy structured arrays and serializes them using HDF5
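A minimal sketch of saving and restoring a model with the new saver; the file path is illustrative and the `gpflow.saver.Saver` entry point is an assumption based on this entry:

```python
import numpy as np
import gpflow

X = np.random.rand(20, 1)
Y = np.sin(X)
m = gpflow.models.GPR(X, Y, gpflow.kernels.RBF(1))

saver = gpflow.saver.Saver()
saver.save('/tmp/gpr.gpflow', m)          # stored as structured arrays in HDF5
m_restored = saver.load('/tmp/gpr.gpflow')
```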
- Added inter-domain inducing features. Inducing points are used by default and are now set with `model.feature.Z`.
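For example, reading and assigning the inducing inputs through the feature attribute might look as follows; this is a sketch, and `read_value()` is assumed to be the parameter read API of this release:

```python
import numpy as np
import gpflow

X = np.random.rand(100, 1)
Y = np.sin(X)
m = gpflow.models.SGPR(X, Y, gpflow.kernels.RBF(1), Z=X[::10].copy())

print(m.feature.Z.read_value())               # inducing inputs live on the feature
m.feature.Z = np.linspace(0, 1, 10)[:, None]  # assign new inducing inputs
```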
- Clear design, aligned with the tree-like structure of GPflow models.
- GPflow trainable parameters are no longer packed into one TensorFlow variable.
- Integrating bare TensorFlow and Keras models with GPflow is now very simple.
- A GPflow parameter wraps multiple tensors: the unconstrained variable, the constrained tensor and the prior tensor.
- Parameters are built into the TensorFlow graph immediately: once you create a parameter instance, the necessary tensors are created on the default graph.
- New implementation of AutoFlow; the `autoflow` decorator replaces the old mechanism (a usage sketch follows this list).
- GPflow optimizers match TensorFlow optimizer names. For example, `gpflow.train.GradientDescentOptimizer` mimics `tf.train.GradientDescentOptimizer`; they even have the same instantiation signature.
- GPflow has native support for SciPy optimizers - `gpflow.train.ScipyOptimizer`.
- GPflow has an advanced HMC implementation - `gpflow.train.HMC`. It works only within the TensorFlow memory scope.
- Tensor conversion decorator and context manager for cases when the user needs to implicitly convert parameters to TensorFlow tensors: `gpflow.params_as_tensors` and `gpflow.params_as_tensors_for`.
- GPflow parameters and parameterized objects provide convenient methods and properties for building and initializing their tensors. Check the `initializables`, `initializable_feeds`, `feeds` and other properties and methods.
- Floating shapes of parameters and data holders without re-building the TensorFlow graph.
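A minimal sketch tying several of these pieces together: a parameterized object with an `autoflow`-decorated method, `params_as_tensors` for implicit tensor conversion, and the SciPy optimizer. The values are illustrative, and `gpflow.Param`, `gpflow.Parameterized` and `gpflow.settings.float_type` are assumed to be the public names of this release:

```python
import numpy as np
import tensorflow as tf
import gpflow

class Scaler(gpflow.Parameterized):
    def __init__(self):
        super().__init__()
        self.a = gpflow.Param(2.0)  # wraps variable, constrained and prior tensors

    @gpflow.autoflow((gpflow.settings.float_type, [None]))
    @gpflow.params_as_tensors
    def scale(self, x):
        # under params_as_tensors, self.a resolves to its constrained tensor
        return self.a * x

s = Scaler()
print(s.scale(np.array([1.0, 2.0])))  # autoflow handles graph, session and feeds

# optimizers mirror TensorFlow names; SciPy support is native
X, Y = np.random.rand(20, 1), np.random.rand(20, 1)
m = gpflow.models.GPR(X, Y, gpflow.kernels.RBF(1))
gpflow.train.ScipyOptimizer().minimize(m)
```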
- Bugfix for `log_jacobian` in transforms
- The different variants of `gauss_kl_*` are now deprecated in favour of a unified `gauss_kl` implementation
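A sketch of the unified call, assuming `K=None` selects the white (identity-covariance) prior:

```python
import tensorflow as tf
from gpflow.kullback_leiblers import gauss_kl

M, L = 10, 2  # inducing points, latent GPs
q_mu = tf.zeros([M, L], dtype=tf.float64)
q_sqrt = tf.eye(M, batch_shape=[L], dtype=tf.float64)

kl_white = gauss_kl(q_mu, q_sqrt)  # K=None: KL against the N(0, I) prior
# kl_white is a scalar Tensor; evaluate it in a session
```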
- Renamed the python package to `gpflow`.
- The compile function takes external session and graph arguments.
- Tests use the TensorFlow TestCase class for proper session management.
- Change to the LowerTriangular transform interface.
- The LowerTriangular transform is now used by default in VGP and SVGP
- The LowerTriangular transform now uses native TensorFlow
- GPflow no longer uses bespoke user ops.
- Improvements to VGP class allow more straightforward optimization
- Changed ordering of parameters to be alphabetical, to ensure consistency
- Update to work with TensorFlow 0.12.1.
- Changes to stop computations all being done on the default graph.
- Update list of GPflow contributors and other small changes to front page.
- Better deduction of `input_dim` for `kernels.Combination`
- Some kernels did not properly respect `active_dims`; this is now fixed.
- Make sure the log Jacobian is computed even for fixed variables
- Housekeeping changes for paper submission.
- Updated to work with TensorFlow 0.11 (release candidate 1 available at time of writing)
- Bugfixes in `vgp._compile`
- Added configuration file, which controls verbosity and level of numerical jitter
- `tf_hacks` is deprecated and has become `tf_wraps` (`tf_hacks` will raise visible deprecation warnings)
- Documentation now at gpflow.readthedocs.io
- Many functions are now contained in TensorFlow scopes for easier TensorBoard visualisation and profiling
- Improvements to the way that parameters for triangular matrices are stored and optimised.
- Automatically generated Apache license headers.
- Ability to track log probabilities.
- Significant improvements to the way that data and fixed parameters are handled. Previously, data and fixed parameters were treated as TensorFlow constants; now, a new mechanism called `get_feed_dict()` gathers up data and fixed parameters and passes them into the graph as placeholders.
- To enable the above, data are now stored in objects called `DataHolder`. To access the values of the data, use the same syntax as for parameters: `print(m.X.value)` (a usage sketch follows this list)
- Models do not need to be recompiled when the data changes.
- Two models, VGP and GPMC, do need to be recompiled if the shape of the data changes.
- A multi-class likelihood is implemented.
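A sketch of the new mechanism, using the capitalised `GPflow` package name of that era (the rename to `gpflow` came later); data values are illustrative:

```python
import numpy as np
import GPflow  # package name prior to the rename to gpflow

X, Y = np.random.rand(10, 1), np.random.rand(10, 1)
m = GPflow.gpr.GPR(X, Y, kern=GPflow.kernels.RBF(1))

print(m.X.value)             # DataHolders expose .value like parameters
m.X = np.random.rand(15, 1)  # new data, no recompilation needed for GPR
m.Y = np.random.rand(15, 1)
```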
- Updated to work with TensorFlow 0.9
- Added a Logistic transform to enable constraining a parameter between two bounds
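For instance, a parameter constrained to an interval might be created as below; this is a sketch, and the `GPflow.param.Param` and `GPflow.transforms.Logistic(a, b)` spellings are assumptions from the API of that era:

```python
import GPflow

# a parameter constrained to lie in (0, 2) via the new Logistic transform
p = GPflow.param.Param(1.0, transform=GPflow.transforms.Logistic(0.0, 2.0))
```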
- Added a Laplace distribution to use as a prior
- Added a periodic kernel
- Several improvements to the AutoFlow mechanism
- Added FITC approximation (see comparison notebook)
- Improved readability of code according to PEP 8
- Significantly improved the speed of the test suite
- Allowed passing of the 'tol' argument to the `scipy.optimize.minimize` routine
- Added the ability to add and multiply MeanFunction objects
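A sketch of composing mean functions, assuming the `+` and `*` operators introduced by this entry return additive and product mean functions; constructor arguments are illustrative:

```python
import numpy as np
import GPflow

linear = GPflow.mean_functions.Linear(A=np.ones((1, 1)), b=np.zeros(1))
const = GPflow.mean_functions.Constant(np.zeros(1))

combined = linear + const  # additive mean function
scaled = linear * const    # product mean function
```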
- Several new contributors (see README.md)
- Removed the need for a fork of TensorFlow. Some of our bespoke ops are replaced by equivalent versions.
- Included the ability to compute the full covariance matrix at predict time. See `GPModel.predict_f`
- Included the ability to sample from the posterior function values. See `GPModel.predict_f_samples`
- Unified code in conditionals.py: see deprecations in `gp_predict`, etc.
- Added SGPR method (Sparse GP Regression)
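A sketch of the new prediction methods; model construction follows the module layout of that era, and the data are illustrative:

```python
import numpy as np
import GPflow

X = np.random.rand(20, 1)
Y = np.sin(X)
m = GPflow.gpr.GPR(X, Y, kern=GPflow.kernels.RBF(1))
Xnew = np.linspace(0, 1, 50)[:, None]

mean, var = m.predict_f(Xnew)            # predictive mean and variance
samples = m.predict_f_samples(Xnew, 10)  # 10 joint samples from the posterior
```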
- Included the ability to use TensorFlow's optimizers as well as the SciPy ones
The initial release of GPflow.