.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_meta_modeling/general_purpose_metamodels/plot_expert_mixture.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_auto_meta_modeling_general_purpose_metamodels_plot_expert_mixture.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_meta_modeling_general_purpose_metamodels_plot_expert_mixture.py:


Mixture of experts
==================

.. GENERATED FROM PYTHON SOURCE LINES 6-37

In this example we approximate a piecewise continuous function using a mixture of
experts built from local metamodels.

The metamodels belong to the family :math:`f_k, \forall k \in [1, N]`:

.. math::
   \begin{align}
     f(\underline{x}) = f_1(\underline{x}) \quad \forall \underline{z} \in \textrm{Class}\, 1 \\
     \dots \\
     f(\underline{x}) = f_k(\underline{x}) \quad \forall \underline{z} \in \textrm{Class}\, k \\
     \dots \\
     f(\underline{x}) = f_N(\underline{x}) \quad \forall \underline{z} \in \textrm{Class}\, N
   \end{align}

where the :math:`N` classes are defined by the classifier.

In supervised mode, the classifier partitions the input and output spaces at once:

.. math::
   \underline{z} = (\underline{x}, f(\underline{x}))

The classifier is a MixtureClassifier based on a Mixture distribution defined as:

.. math::
   p(\underline{x}) = \sum_{i=1}^N w_i p_i(\underline{x})

The rule to assign a point to a class is defined as follows: :math:`\underline{x}`
is assigned to the class :math:`j = \operatorname{argmax}_k \log w_k p_k(\underline{z})`.

The grade of :math:`\underline{x}` with respect to the class :math:`k` is
:math:`\log w_k p_k(\underline{x})`.

.. GENERATED FROM PYTHON SOURCE LINES 39-47

.. code-block:: default

    import openturns as ot
    import openturns.viewer as viewer
    from matplotlib import pyplot as plt
    import numpy as np

    ot.Log.Show(ot.Log.NONE)

.. GENERATED FROM PYTHON SOURCE LINES 48-64

.. code-block:: default

    dimension = 1

    # Define the piecewise model we want to rebuild
    def piecewise(X):
        # if x < 0.0:
        #     f = (x + 0.75) ** 2 - 0.75 ** 2
        # else:
        #     f = 2.0 - x ** 2
        xarray = np.array(X, copy=False)
        return np.piecewise(
            xarray,
            [xarray < 0, xarray >= 0],
            [lambda x: x * (x + 1.5), lambda x: 2.0 - x * x],
        )


    f = ot.PythonFunction(1, 1, func_sample=piecewise)

.. GENERATED FROM PYTHON SOURCE LINES 65-66

Build a metamodel over each segment

.. GENERATED FROM PYTHON SOURCE LINES 66-76

.. code-block:: default

    degree = 5
    samplingSize = 100
    enumerateFunction = ot.LinearEnumerateFunction(dimension)
    productBasis = ot.OrthogonalProductPolynomialFactory(
        [ot.LegendreFactory()] * dimension, enumerateFunction
    )
    adaptiveStrategy = ot.FixedStrategy(
        productBasis, enumerateFunction.getStrataCumulatedCardinal(degree)
    )
    projectionStrategy = ot.LeastSquaresStrategy(ot.MonteCarloExperiment(samplingSize))

.. GENERATED FROM PYTHON SOURCE LINES 77-78

Segment 1: (-1.0, 0.0)

.. GENERATED FROM PYTHON SOURCE LINES 78-85

.. code-block:: default

    d1 = ot.Uniform(-1.0, 0.0)
    fc1 = ot.FunctionalChaosAlgorithm(f, d1, adaptiveStrategy, projectionStrategy)
    fc1.run()
    mm1 = fc1.getResult().getMetaModel()
    graph = mm1.draw(-1.0, -1e-6)
    view = viewer.View(graph)

.. image-sg:: /auto_meta_modeling/general_purpose_metamodels/images/sphx_glr_plot_expert_mixture_001.png
   :alt: v0 as a function of x0
   :srcset: /auto_meta_modeling/general_purpose_metamodels/images/sphx_glr_plot_expert_mixture_001.png
   :class: sphx-glr-single-img
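Before building the second expert, we may optionally check the quality of the first
one. The sketch below is not part of the generated script: it compares the local
expert ``mm1`` with the exact model ``f`` on a few points of the first segment; the
test points and the printed quantity are illustrative choices.

.. code-block:: default

    # Optional sanity check (not in the original example): compare the local
    # expert mm1 with the exact model f on a few arbitrary points of segment 1.
    testPoints = ot.Sample([[-0.9], [-0.5], [-0.1]])
    exactValues = f(testPoints)
    chaosValues = mm1(testPoints)
    maxError = np.max(np.abs(np.array(exactValues) - np.array(chaosValues)))
    print("Maximum absolute error of mm1 on the test points:", maxError)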
.. GENERATED FROM PYTHON SOURCE LINES 86-87

Segment 2: (0.0, 1.0)

.. GENERATED FROM PYTHON SOURCE LINES 87-94

.. code-block:: default

    d2 = ot.Uniform(0.0, 1.0)
    fc2 = ot.FunctionalChaosAlgorithm(f, d2, adaptiveStrategy, projectionStrategy)
    fc2.run()
    mm2 = fc2.getResult().getMetaModel()
    graph = mm2.draw(1e-6, 1.0)
    view = viewer.View(graph)

.. image-sg:: /auto_meta_modeling/general_purpose_metamodels/images/sphx_glr_plot_expert_mixture_002.png
   :alt: v0 as a function of x0
   :srcset: /auto_meta_modeling/general_purpose_metamodels/images/sphx_glr_plot_expert_mixture_002.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 95-96

Define the mixture

.. GENERATED FROM PYTHON SOURCE LINES 96-103

.. code-block:: default

    R = ot.CorrelationMatrix(2)
    d1 = ot.Normal([-1.0, -1.0], [1.0] * 2, R)  # segment 1
    d2 = ot.Normal([1.0, 1.0], [1.0] * 2, R)  # segment 2
    weights = [1.0] * 2
    atoms = [d1, d2]
    mixture = ot.Mixture(atoms, weights)

.. GENERATED FROM PYTHON SOURCE LINES 104-105

Create the classifier based on the mixture

.. GENERATED FROM PYTHON SOURCE LINES 105-107

.. code-block:: default

    classifier = ot.MixtureClassifier(mixture)

.. GENERATED FROM PYTHON SOURCE LINES 108-109

Create local experts using the metamodels

.. GENERATED FROM PYTHON SOURCE LINES 109-111

.. code-block:: default

    experts = ot.Basis([mm1, mm2])

.. GENERATED FROM PYTHON SOURCE LINES 112-113

Create a mixture of experts

.. GENERATED FROM PYTHON SOURCE LINES 113-116

.. code-block:: default

    evaluation = ot.ExpertMixture(experts, classifier)
    moe = ot.Function(evaluation)

.. GENERATED FROM PYTHON SOURCE LINES 117-118

Draw the mixture of experts

.. GENERATED FROM PYTHON SOURCE LINES 118-121

.. code-block:: default

    graph = moe.draw(-1.0, 1.0)
    view = viewer.View(graph)
    plt.show()

.. image-sg:: /auto_meta_modeling/general_purpose_metamodels/images/sphx_glr_plot_expert_mixture_003.png
   :alt: v0 as a function of x0
   :srcset: /auto_meta_modeling/general_purpose_metamodels/images/sphx_glr_plot_expert_mixture_003.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 0.184 seconds)


.. _sphx_glr_download_auto_meta_modeling_general_purpose_metamodels_plot_expert_mixture.py:

.. only :: html

 .. container:: sphx-glr-footer
    :class: sphx-glr-footer-example

  .. container:: sphx-glr-download sphx-glr-download-python

     :download:`Download Python source code: plot_expert_mixture.py <plot_expert_mixture.py>`

  .. container:: sphx-glr-download sphx-glr-download-jupyter

     :download:`Download Jupyter notebook: plot_expert_mixture.ipynb <plot_expert_mixture.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_