liesel.model.nodes.param(value, distribution=None, name='')

Helper function that returns a parameter Var.

Sets the Var.parameter flag. If the parameter variable is a random variable, i.e. if it has an associated probability distribution, its log-probability is automatically added to the model log-prior (see Model.log_prior).

Parameters:

  • value (Union[Any, Calc]) – The value of the parameter.

  • distribution (Optional[Dist]) – The probability distribution of the parameter. (default: None)

  • name (str) – The name of the parameter. If you do not specify a name, a unique name will be automatically generated upon initialization of a Model. (default: '')

Returns:

A parameter variable.

Return type:

Var

See also


Calc
A node representing a general calculation/operation in JAX or Python.


Data
A node representing some static data.


Dist
A node representing a tensorflow_probability Distribution.


Var
A variable in a statistical model, typically with a probability distribution.


obs
A helper function to initialize a Var as an observed variable.


Notes

A variable computes its log-probability only when Var.update() is called; this does not happen automatically upon initialization. Commonly, the first time this method is called is during the initialization of a Model. To update the value and log-probability immediately, you can call Var.update() manually.
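The lazy-update behavior described above can be sketched in plain Python. This is a toy stand-in, not Liesel's actual implementation: a node leaves its log-probability unset at construction and computes it only when update() is called.

```python
import math


class LazyVar:
    """Toy stand-in for a variable with a lazily computed log-probability."""

    def __init__(self, value, logpdf):
        self.value = value
        self._logpdf = logpdf
        self.log_prob = None  # not computed upon initialization

    def update(self):
        # Only here is the log-probability actually evaluated.
        self.log_prob = self._logpdf(self.value)
        return self


# Standard-normal log-density as an example distribution.
v = LazyVar(0.0, lambda x: -0.5 * x * x - 0.5 * math.log(2.0 * math.pi))
print(v.log_prob)  # → None, nothing computed yet
v.update()
print(v.log_prob)
```

In Liesel itself, building a Model triggers these updates for you; calling Var.update() manually is only needed when you want the values before the model exists.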


Examples

>>> import jax.numpy as jnp
>>> import liesel.model as lsl
>>> import tensorflow_probability.substrates.jax.distributions as tfd

A variance parameter with an inverse-gamma prior:

>>> prior = lsl.Dist(tfd.InverseGamma, concentration=0.1, scale=0.1)
>>> variance = lsl.param(1.0, prior, name="variance")
>>> variance

We can use this parameter variable in the distribution of an observed variable:

>>> scale = lsl.Calc(jnp.sqrt, variance)
>>> dist = lsl.Dist(tfd.Normal, loc=0.0, scale=scale)
>>> y = lsl.obs(jnp.array([-0.5, 0.0, 0.5]), dist, name="y")
>>> y

Now we can build the model graph:

>>> model = lsl.GraphBuilder().add(y).build_model()

The log_prior of the model is the sum of the log-priors of all parameters. In this case, that is only our variance parameter:

>>> model.log_prior
Array(-2.582971, dtype=float32)
>>> variance.log_prob
Array(-2.582971, dtype=float32)
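This value can be checked by hand: for an inverse-gamma distribution with concentration a and scale b, the log-density is a·log b − log Γ(a) − (a + 1)·log x − b/x. A standard-library sketch, independent of Liesel and TFP:

```python
import math


def inverse_gamma_logpdf(x, concentration, scale):
    """Log-density of the inverse-gamma distribution."""
    a, b = concentration, scale
    return a * math.log(b) - math.lgamma(a) - (a + 1) * math.log(x) - b / x


# Prior log-density of variance = 1.0 under InverseGamma(0.1, 0.1);
# matches model.log_prior and variance.log_prob above.
print(round(inverse_gamma_logpdf(1.0, 0.1, 0.1), 6))  # → -2.582971
```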

The log-likelihood of the model is the sum of the log-probabilities of all observed variables. In this case, that is only our y variable:

>>> model.log_lik
Array(-3.0068154, dtype=float32)
>>> jnp.sum(y.log_prob)
Array(-3.0068154, dtype=float32)
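The likelihood can also be verified by hand: it is the sum of standard-normal log-densities at the three observations (the scale is sqrt(1.0) = 1.0 here). A standard-library sketch, independent of Liesel and TFP:

```python
import math


def normal_logpdf(x, loc, scale):
    """Log-density of the normal distribution."""
    z = (x - loc) / scale
    return -0.5 * z * z - math.log(scale) - 0.5 * math.log(2.0 * math.pi)


# Sum over the three observations; agrees with model.log_lik above
# up to float32 precision.
log_lik = sum(normal_logpdf(x, 0.0, 1.0) for x in [-0.5, 0.0, 0.5])
print(log_lik)
```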