.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_calibration/bayesian_calibration/plot_gibbs.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_calibration_bayesian_calibration_plot_gibbs.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_calibration_bayesian_calibration_plot_gibbs.py:


Gibbs sampling of the posterior distribution
============================================

.. GENERATED FROM PYTHON SOURCE LINES 7-17

We sample from the posterior distribution of the parameters of a mixture model.

.. math::
   X \sim 0.7 \mathcal{N}(\mu_0, 1) + 0.3 \mathcal{N}(\mu_1, 1),

where :math:`\mu_0` and :math:`\mu_1` are unknown parameters.
They are a priori i.i.d. with prior distribution :math:`\mathcal{N}(0, \sqrt{10})`.

This example is drawn from Example 9.2 of *Monte Carlo Statistical Methods*
by Robert and Casella (2004).

.. GENERATED FROM PYTHON SOURCE LINES 17-24

.. code-block:: Python

    import openturns as ot
    from openturns.viewer import View
    import numpy as np

    ot.RandomGenerator.SetSeed(100)

.. GENERATED FROM PYTHON SOURCE LINES 25-26

Sample data with :math:`\mu_0 = 0` and :math:`\mu_1 = 2.7`.

.. GENERATED FROM PYTHON SOURCE LINES 26-37

.. code-block:: Python

    N = 500
    p = 0.3

    mu0 = 0.0
    mu1 = 2.7

    nor0 = ot.Normal(mu0, 1.0)
    nor1 = ot.Normal(mu1, 1.0)
    true_distribution = ot.Mixture([nor0, nor1], [1 - p, p])
    observations = np.array(true_distribution.getSample(500))

.. GENERATED FROM PYTHON SOURCE LINES 38-39

Plot the true distribution.

.. GENERATED FROM PYTHON SOURCE LINES 39-46

.. code-block:: Python

    graph = true_distribution.drawPDF()
    graph.setTitle("True distribution")
    graph.setXTitle("")
    graph.setLegends([""])
    _ = View(graph)

.. image-sg:: /auto_calibration/bayesian_calibration/images/sphx_glr_plot_gibbs_001.png
   :alt: True distribution
   :srcset: /auto_calibration/bayesian_calibration/images/sphx_glr_plot_gibbs_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 47-77

A natural step at this point is to introduce an auxiliary (unobserved) random variable
:math:`Z` telling from which distribution :math:`X` was sampled.

For any nonnegative integer :math:`i`, :math:`Z_i` follows the Bernoulli distribution
with :math:`p = 0.3`, and :math:`X_i | Z_i \sim \mathcal{N}(\mu_{Z_i}, 1)`.

Let :math:`n_0` (resp. :math:`n_1`) denote the number of indices :math:`i` such that
:math:`Z_i = 0` (resp. :math:`Z_i = 1`).

Conditionally on all :math:`X_i` and all :math:`Z_i`,
:math:`\mu_0` and :math:`\mu_1` are independent:
:math:`\mu_0` follows
:math:`\mathcal{N} \left( \sum_{Z_i=0} \frac{X_i}{0.1 + n_0}, \frac{1}{0.1 + n_0} \right)`
and :math:`\mu_1` follows
:math:`\mathcal{N} \left( \sum_{Z_i=1} \frac{X_i}{0.1 + n_1}, \frac{1}{0.1 + n_1} \right)`.

For any :math:`i`, conditionally on :math:`X_i`, :math:`\mu_0` and :math:`\mu_1`,
:math:`Z_i` is independent of all :math:`Z_j` (:math:`j \neq i`) and follows the
Bernoulli distribution with parameter

.. math::
   \frac{p \exp \left( -(X_i - \mu_1)^2 / 2 \right)}
        {p \exp \left( -(X_i - \mu_1)^2 / 2 \right) + (1 - p) \exp \left( -(X_i - \mu_0)^2 / 2 \right)}.

We now sample from the joint distribution of
:math:`(\mu_0, \mu_1, Z_0, ..., Z_{N-1})` conditionally on the :math:`X_i`
using the Gibbs algorithm.
We define functions that will translate a given state of the Gibbs algorithm
into the correct parameters for the distributions of :math:`\mu_0`, :math:`\mu_1`,
and the :math:`Z_i`.
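Before implementing these functions, it may help to see where the :math:`0.1 + n_0`
terms come from. This is a brief sketch of the standard conjugate Normal-Normal update,
assuming, as above, that the prior on :math:`\mu_0` has mean :math:`0` and standard
deviation :math:`\sqrt{10}`, i.e. prior precision :math:`0.1`:

.. math::
   \pi(\mu_0 | X, Z) \propto
   \exp \left( -\frac{\mu_0^2}{2 \times 10} \right)
   \prod_{Z_i = 0} \exp \left( -\frac{(X_i - \mu_0)^2}{2} \right)
   \propto
   \exp \left( -\frac{0.1 + n_0}{2}
   \left( \mu_0 - \frac{\sum_{Z_i=0} X_i}{0.1 + n_0} \right)^2 \right),

so the conditional precision of :math:`\mu_0` is :math:`0.1 + n_0` and its conditional
mean is :math:`\sum_{Z_i=0} X_i / (0.1 + n_0)`. The same computation with :math:`n_1`
gives the conditional distribution of :math:`\mu_1`.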
.. GENERATED FROM PYTHON SOURCE LINES 77-109

.. code-block:: Python

    def nor0post(pt):
        # Parameters of the conditional distribution of mu_0 given the Z_i.
        z = np.array(pt)[2:]
        x0 = observations[z == 0]
        mu0 = x0.sum() / (0.1 + len(x0))
        sigma0 = 1.0 / (0.1 + len(x0))
        return [mu0, sigma0]


    def nor1post(pt):
        # Parameters of the conditional distribution of mu_1 given the Z_i.
        z = np.array(pt)[2:]
        x1 = observations[z == 1]
        mu1 = x1.sum() / (0.1 + len(x1))
        sigma1 = 1.0 / (0.1 + len(x1))
        return [mu1, sigma1]


    def zpost(pt):
        # Conditional Bernoulli parameters of the Z_i given mu_0 and mu_1.
        mu0 = pt[0]
        mu1 = pt[1]
        term1 = p * np.exp(-((observations - mu1) ** 2) / 2)
        term0 = (1.0 - p) * np.exp(-((observations - mu0) ** 2) / 2)
        res = term1 / (term1 + term0)
        # output must be a 1d list or array in order to create a PythonFunction
        return res.reshape(-1)


    nor0posterior = ot.PythonFunction(2 + N, 2, nor0post)
    nor1posterior = ot.PythonFunction(2 + N, 2, nor1post)
    zposterior = ot.PythonFunction(2 + N, N, zpost)

.. GENERATED FROM PYTHON SOURCE LINES 110-111

We can now construct the Gibbs algorithm.

.. GENERATED FROM PYTHON SOURCE LINES 111-129

.. code-block:: Python

    # State layout: [mu_0, mu_1, Z_0, ..., Z_{N-1}]
    initialState = [0.0] * (N + 2)

    # Sampler updating mu_0 (component 0): a Normal whose parameters
    # are computed from the current state by nor0posterior.
    sampler0 = ot.RandomVectorMetropolisHastings(
        ot.RandomVector(ot.Normal()), initialState, [0], nor0posterior
    )
    # Sampler updating mu_1 (component 1).
    sampler1 = ot.RandomVectorMetropolisHastings(
        ot.RandomVector(ot.Normal()), initialState, [1], nor1posterior
    )
    # Sampler updating the N latent variables Z_0, ..., Z_{N-1} at once.
    big_bernoulli = ot.ComposedDistribution([ot.Bernoulli()] * N)
    sampler2 = ot.RandomVectorMetropolisHastings(
        ot.RandomVector(big_bernoulli), initialState, range(2, N + 2), zposterior
    )

    gibbs = ot.Gibbs([sampler0, sampler1, sampler2])

.. GENERATED FROM PYTHON SOURCE LINES 130-131

Run the Gibbs algorithm.

.. GENERATED FROM PYTHON SOURCE LINES 131-134

.. code-block:: Python

    s = gibbs.getSample(10000)

.. GENERATED FROM PYTHON SOURCE LINES 135-136

Extract the relevant marginals: the first (:math:`\mu_0`) and the second (:math:`\mu_1`).

.. GENERATED FROM PYTHON SOURCE LINES 136-139

.. code-block:: Python

    posterior_sample = s[:, 0:2]

.. GENERATED FROM PYTHON SOURCE LINES 140-141

Let us plot the posterior density.

.. GENERATED FROM PYTHON SOURCE LINES 141-151

.. code-block:: Python

    ks = ot.KernelSmoothing().build(posterior_sample)
    graph = ks.drawPDF()
    graph.setTitle("Posterior density")
    graph.setLegendPosition("lower right")
    graph.setXTitle(r"$\mu_0$")
    graph.setYTitle(r"$\mu_1$")
    _ = View(graph)

    View.ShowAll()

.. image-sg:: /auto_calibration/bayesian_calibration/images/sphx_glr_plot_gibbs_002.png
   :alt: Posterior density
   :srcset: /auto_calibration/bayesian_calibration/images/sphx_glr_plot_gibbs_002.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 11.939 seconds)

.. _sphx_glr_download_auto_calibration_bayesian_calibration_plot_gibbs.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_gibbs.ipynb <plot_gibbs.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_gibbs.py <plot_gibbs.py>`