MCMC Sampling (liesel.goose)#
This is an overview of the most important building blocks for sampling from the
posterior of your model with liesel.goose.
We usually import liesel.goose
as follows:
import liesel.goose as gs
The workflow for MCMC sampling with goose consists of the following steps:
1. Set up your model
2. Set up an Engine with MCMC kernels for your parameters and draw posterior samples
3. Inspect your results
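To make these steps concrete without pulling in the full library, here is a toy, pure-Python sketch of the same workflow with a hand-rolled random-walk Metropolis step. None of the names below are part of the goose API; in practice, the Engine performs the sampling loop for you.

```python
import math
import random

# Step 1: set up the model. Here a toy log-posterior (a standard
# normal) stands in for a full liesel model.
def log_prob(theta):
    return -0.5 * theta**2

# Step 2: draw posterior samples. A hand-rolled random-walk
# Metropolis step stands in for an Engine with an MCMC kernel.
random.seed(42)
theta = 0.0
samples = []
for _ in range(5000):
    proposal = theta + random.gauss(0.0, 1.0)
    if math.log(random.random()) < log_prob(proposal) - log_prob(theta):
        theta = proposal
    samples.append(theta)

# Step 3: inspect the results, e.g. via the posterior mean,
# which should be close to 0 for this model.
mean = sum(samples) / len(samples)
print(round(mean, 2))
```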
Note
This document provides an overview of the most important classes for MCMC sampling. You can find more guidance on how to use them in the respective API documentation and in the tutorials.
Set up your model#
A natural option for setting up your model is to use liesel.model. See
Model Building with liesel.model for more. However, you are not locked
into using Model. Goose currently includes the following interfaces:
DictInterface: A model interface for a model state represented by a dict.
DataclassInterface: A model interface for a model state represented by a dataclass.
NamedTupleInterface: A model interface for a model state represented by a NamedTuple.
LieselInterface: An implementation of the Goose ModelInterface for use with a Liesel Model.
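Conceptually, all of these interfaces give the engine a uniform way to read a position out of the model state, write it back, and evaluate the log-probability. The following is a hedged, pure-Python sketch of such an interface; the class and method names are illustrative only, not goose's actual classes or signatures.

```python
# Illustrative sketch of a dict-based model interface; the real
# goose interfaces also handle JAX pytrees and more bookkeeping.
class ToyDictInterface:
    def __init__(self, log_prob_fn):
        self.log_prob_fn = log_prob_fn

    def extract_position(self, keys, model_state):
        # Pull the requested subset of the state (the "position").
        return {k: model_state[k] for k in keys}

    def update_state(self, position, model_state):
        # Return a new state with the position written back.
        return {**model_state, **position}

    def log_prob(self, model_state):
        return self.log_prob_fn(model_state)

# Toy model state and log-probability: x ~ Normal(1, 1), up to a constant.
interface = ToyDictInterface(lambda s: -0.5 * (s["x"] - 1.0) ** 2)
state = {"x": 2.0, "sigma": 1.0}
pos = interface.extract_position(["x"], state)
new_state = interface.update_state({"x": 0.0}, state)
print(pos, interface.log_prob(new_state))  # prints {'x': 2.0} -0.5
```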
Find good starting values#
It can often be beneficial to find good starting values to get your MCMC sampling scheme
going. Goose provides the function optim()
for this purpose, which allows you
to run stochastic gradient descent on a liesel model.
OptimResult: Holds the results of model optimization with optim().
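The idea can be illustrated without liesel: treat the negative log-posterior as a loss, descend its gradient, and use the optimum as a starting value for the chains. Below is a minimal sketch with a hand-coded gradient; all names are illustrative, and goose's optim() works on full Liesel models instead.

```python
# Gradient descent on the negative log-posterior of a toy model,
# log p(theta) = -0.5 * (theta - 3)^2, whose mode is theta = 3.
def neg_log_prob_grad(theta):
    return theta - 3.0  # derivative of 0.5 * (theta - 3)^2

theta = 0.0  # poor starting value
for _ in range(200):
    theta -= 0.1 * neg_log_prob_grad(theta)

print(round(theta, 4))  # prints 3.0, the posterior mode
```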
Set up an MCMC Engine and draw posterior samples#
To set up an MCMC engine, goose provides the EngineBuilder
. Please refer to
the linked EngineBuilder documentation to learn how to use it.
EngineBuilder: The builder used to construct an MCMC Engine.
Engine: MCMC engine capable of combining multiple transition kernels.
Available MCMC kernels
Goose makes it easy for you to combine different MCMC kernels for different blocks of model parameters. Currently, the available MCMC kernels are:
RWKernel: A random walk kernel.
IWLSKernel: An IWLS kernel with dual averaging and an (optional) user-defined function for computing the Cholesky decomposition of the Fisher information matrix, implementing the Kernel protocol.
HMCKernel: An HMC kernel with dual averaging and an inverse mass matrix tuner, implementing the Kernel protocol.
NUTSKernel: A NUTS kernel with dual averaging and an inverse mass matrix tuner, implementing the Kernel protocol.
GibbsKernel: A Gibbs kernel implementing the Kernel protocol.
You can also define your own kernel by implementing the Kernel
protocol.
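As an illustration of what a kernel has to provide, and of how an engine can combine kernels for different parameter blocks, here is a hedged, pure-Python sketch. The method name transition and the loop below are simplifications, not goose's actual Kernel protocol or Engine.

```python
import math
import random

class ToyRWKernel:
    """Random-walk Metropolis update for a single parameter block."""

    def __init__(self, name, scale):
        self.name = name
        self.scale = scale

    def transition(self, state, log_prob):
        proposal = dict(state)
        proposal[self.name] = state[self.name] + random.gauss(0.0, self.scale)
        if math.log(random.random()) < log_prob(proposal) - log_prob(state):
            return proposal
        return state

def log_prob(state):
    # Independent standard normals for "mu" and "beta".
    return -0.5 * (state["mu"] ** 2 + state["beta"] ** 2)

# One kernel per parameter block; the engine cycles through them.
random.seed(1)
kernels = [ToyRWKernel("mu", 1.0), ToyRWKernel("beta", 1.0)]
state = {"mu": 5.0, "beta": -5.0}
samples = []
for _ in range(2000):
    for kernel in kernels:
        state = kernel.transition(state, log_prob)
    samples.append(state)

mu_mean = sum(s["mu"] for s in samples) / len(samples)
print(round(mu_mean, 2))  # close to 0 after the chain settles
```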
To draw samples from your posterior, you will want to call
sample_all_epochs()
. Once sampling is done, you can obtain the results
with get_results()
, which will return a SamplingResults
instance.
Inspect your results#
The two central classes for handling your sampling results are:
SamplingResults: Contains the results of the MCMC engine.
Summary: A summary object.
You can obtain your posterior samples as a dictionary via
get_posterior_samples()
. There is also experimental support
for turning your samples into an arviz.InferenceData
object via
to_arviz_inference_data()
.
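However you consume them, posterior samples are essentially arrays indexed by chain and draw. The following sketch computes a simple summary from such a dictionary in plain Python; it is illustrative only, and goose's summary tooling and ArviZ provide far richer diagnostics.

```python
# Posterior samples laid out as {name: chains x draws}; the numbers
# here are made up to stand in for real MCMC output.
samples = {"mu": [[0.9, 1.1, 1.0], [1.2, 0.8, 1.0]]}

def summarize(draws_per_chain):
    flat = sorted(x for chain in draws_per_chain for x in chain)
    n = len(flat)
    mean = sum(flat) / n
    if n % 2:
        median = flat[n // 2]
    else:
        median = 0.5 * (flat[n // 2 - 1] + flat[n // 2])
    return {"mean": round(mean, 3), "median": round(median, 3)}

summary = {name: summarize(chains) for name, chains in samples.items()}
print(summary)  # prints {'mu': {'mean': 1.0, 'median': 1.0}}
```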
Plot posterior samples
Goose comes with a number of plotting functions that give you quick access to important diagnostics.
plot_param: Visualizes trace plot, density plot and autocorrelation plot of a single subparameter.
plot_trace: Visualizes posterior samples over time with a trace plot.
plot_density: Visualizes posterior distributions with a density plot.
plot_pairs: Produces a pairplot panel.
plot_cor: Visualizes autocorrelations of posterior samples.
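For intuition, the autocorrelation shown by these plots is, for each lag k, the correlation of the chain with a copy of itself shifted by k draws; values near zero indicate good mixing. Here is a minimal sketch of that computation in plain Python (illustrative, not the plotting code itself):

```python
def autocorrelation(chain, lag):
    # Sample autocorrelation of a single chain at the given lag.
    n = len(chain)
    mean = sum(chain) / n
    var = sum((x - mean) ** 2 for x in chain) / n
    cov = sum(
        (chain[t] - mean) * (chain[t + lag] - mean) for t in range(n - lag)
    ) / n
    return cov / var

# A perfectly alternating chain mixes badly: it is strongly
# negatively correlated with itself at lag 1.
chain = [1.0, -1.0] * 50
print(round(autocorrelation(chain, 1), 2))  # prints -0.99
```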