.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_meta_modeling/kriging_metamodel/plot_kriging_chose_trend.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_auto_meta_modeling_kriging_metamodel_plot_kriging_chose_trend.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_meta_modeling_kriging_metamodel_plot_kriging_chose_trend.py:


Kriging: choose a trend vector space
=====================================

.. GENERATED FROM PYTHON SOURCE LINES 5-9

.. code-block:: default

    import openturns as ot
    import openturns.viewer as otv
    from matplotlib import pylab as plt

.. GENERATED FROM PYTHON SOURCE LINES 10-25

Introduction
------------

In this example we present the basic trends which we may use in the kriging metamodel procedure:

- ConstantBasis;
- LinearBasis;
- QuadraticBasis.

The model is the simple real function defined by

.. math::

   f : x \mapsto x \left( \sin \frac{x}{2} \right)

which we study on the :math:`[0,10]` interval.

.. GENERATED FROM PYTHON SOURCE LINES 27-30

.. code-block:: default

    dimension = 1

.. GENERATED FROM PYTHON SOURCE LINES 31-48

The focus being on the trend, we assume a given squared exponential covariance kernel:

.. math::

   C_{\theta}(s,t) = \sigma^2 e^{-\frac{(t-s)^2}{2\rho^2}}

where :math:`\theta = (\sigma, \rho)` is the vector of hyperparameters. The nature of the
covariance model is fixed but its parameters are calibrated during the kriging process: we want
to choose them so as to best fit our data.

The kriging (meta)model :math:`\hat{Y}(x)` then reads as

.. math::

   \hat{Y}(x) = m(x) + Z(x)

where :math:`m(\cdot)` is the trend and :math:`Z(\cdot)` is a Gaussian process with zero mean and
covariance function :math:`C_{\theta}(s,t)`. The trend is deterministic and the Gaussian process
is probabilistic, but both contribute to the metamodel. A special feature of kriging is the
interpolation property: the metamodel is exact at the training data.

.. GENERATED FROM PYTHON SOURCE LINES 48-50

.. code-block:: default

    covarianceModel = ot.SquaredExponential([1.], [1.0])

.. GENERATED FROM PYTHON SOURCE LINES 51-52

We define our exact model with a `SymbolicFunction`:

.. GENERATED FROM PYTHON SOURCE LINES 52-55

.. code-block:: default

    model = ot.SymbolicFunction(['x'], ['x*sin(0.5*x)'])

.. GENERATED FROM PYTHON SOURCE LINES 56-57

We use the following sample to train our metamodel:

.. GENERATED FROM PYTHON SOURCE LINES 57-60

.. code-block:: default

    nTrain = 5
    Xtrain = ot.Sample([[0.5], [1.3], [2.4], [5.6], [8.9]])

.. GENERATED FROM PYTHON SOURCE LINES 61-62

The values of the exact model are also needed for training:

.. GENERATED FROM PYTHON SOURCE LINES 62-64

.. code-block:: default

    Ytrain = model(Xtrain)

.. GENERATED FROM PYTHON SOURCE LINES 65-66

We shall test the model on a set of points and use a regular grid for this matter.

.. GENERATED FROM PYTHON SOURCE LINES 66-70

.. code-block:: default

    nTest = 101
    step = 0.1
    x_test = ot.RegularGrid(0, step, nTest).getVertices()
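As a quick side check (not part of the original example), we can verify numerically that
`covarianceModel` matches the kernel formula given above. This sketch assumes the
`computeAsScalar` method of the covariance model is available in the installed OpenTURNS version;
with :math:`\sigma = \rho = 1` we expect :math:`C(0,0) = 1` and
:math:`C(0,1) = e^{-1/2} \approx 0.607`.

.. code-block:: default

    import math

    # Assumption: CovarianceModel.computeAsScalar is available in this OpenTURNS version.
    # With sigma = rho = 1 the kernel reads C(s, t) = exp(-(t - s)^2 / 2).
    print("C(0, 0)   = %.3f" % covarianceModel.computeAsScalar([0.0], [0.0]))
    print("C(0, 1)   = %.3f" % covarianceModel.computeAsScalar([0.0], [1.0]))
    print("exp(-1/2) = %.3f" % math.exp(-0.5))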
.. GENERATED FROM PYTHON SOURCE LINES 71-72

We draw the training points and the model at the testing points. We encapsulate it into a
function to use it again later.

.. GENERATED FROM PYTHON SOURCE LINES 72-88

.. code-block:: default

    def plot_exact_model():
        graph = ot.Graph('', 'x', '', True, '')
        y_test = model(x_test)
        curveModel = ot.Curve(x_test, y_test)
        curveModel.setLineStyle("solid")
        curveModel.setColor("black")
        graph.add(curveModel)
        cloud = ot.Cloud(Xtrain, Ytrain)
        cloud.setColor("black")
        cloud.setPointStyle("fsquare")
        graph.add(cloud)
        return graph

.. GENERATED FROM PYTHON SOURCE LINES 89-96

.. code-block:: default

    graph = plot_exact_model()
    graph.setLegends(['exact model', 'training data'])
    graph.setLegendPosition("bottom")
    graph.setTitle('1D Kriging : exact model')
    view = otv.View(graph)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_001.png
   :alt: 1D Kriging : exact model
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 97-101

A common pre-processing step is to apply a transform on the input data before performing the
kriging. To do so we write a linear transform of our input data: we center it at its mean and
scale it to unit standard deviation. We then fix the mean and the standard deviation to their
values with the `ParametricFunction`. We build the inverse transform as well.

We first compute the mean and standard deviation of the input data:

.. GENERATED FROM PYTHON SOURCE LINES 101-106

.. code-block:: default

    mean = Xtrain.computeMean()[0]
    stdDev = Xtrain.computeStandardDeviation()[0]
    print("Xtrain, mean : %.3f" % mean)
    print("Xtrain, standard deviation : %.3f" % stdDev)

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Xtrain, mean : 3.740
    Xtrain, standard deviation : 3.476

.. GENERATED FROM PYTHON SOURCE LINES 107-113

.. code-block:: default

    tf = ot.SymbolicFunction(['mu', 'sigma', 'x'], ['(x-mu)/sigma'])
    itf = ot.SymbolicFunction(['mu', 'sigma', 'x'], ['sigma*x+mu'])
    myInverseTransform = ot.ParametricFunction(itf, [0, 1], [mean, stdDev])
    myTransform = ot.ParametricFunction(tf, [0, 1], [mean, stdDev])

.. GENERATED FROM PYTHON SOURCE LINES 114-119

A constant basis
----------------

In this paragraph we choose a constant basis for the kriging: the only unknown is the value of
the constant. The basis is built with the :class:`~openturns.ConstantBasisFactory` class.

.. GENERATED FROM PYTHON SOURCE LINES 119-121

.. code-block:: default

    basis = ot.ConstantBasisFactory(dimension).build()

.. GENERATED FROM PYTHON SOURCE LINES 122-124

We build the kriging algorithm by giving it the transformed data, the output data, the covariance
model and the basis.

.. GENERATED FROM PYTHON SOURCE LINES 124-126

.. code-block:: default

    algo = ot.KrigingAlgorithm(myTransform(Xtrain), Ytrain, covarianceModel, basis)

.. GENERATED FROM PYTHON SOURCE LINES 127-128

We run the algorithm and store the result:

.. GENERATED FROM PYTHON SOURCE LINES 128-131

.. code-block:: default

    algo.run()
    result = algo.getResult()

.. GENERATED FROM PYTHON SOURCE LINES 132-133

The metamodel is the following :class:`~openturns.ComposedFunction`:

.. GENERATED FROM PYTHON SOURCE LINES 133-135

.. code-block:: default

    metamodel = ot.ComposedFunction(result.getMetaModel(), myTransform)
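Before drawing the metamodel, we can check the interpolation property announced in the
introduction: the metamodel should reproduce the training outputs at the training points up to
numerical precision. This short verification is an addition to the original example.

.. code-block:: default

    # Interpolation property of kriging: the metamodel matches the training data.
    for i in range(nTrain):
        x = Xtrain[i]
        print("x = %.2f, exact = %.4f, metamodel = %.4f"
              % (x[0], Ytrain[i][0], metamodel(x)[0]))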
.. GENERATED FROM PYTHON SOURCE LINES 136-137

We can draw the metamodel and the exact model on the same graph.

.. GENERATED FROM PYTHON SOURCE LINES 137-148

.. code-block:: default

    graph = plot_exact_model()
    y_test = metamodel(x_test)
    curve = ot.Curve(x_test, y_test)
    curve.setLineStyle("dashed")
    curve.setColor("red")
    graph.add(curve)
    graph.setLegends(['exact model', 'training data', 'kriging metamodel'])
    graph.setLegendPosition("bottom")
    graph.setTitle('1D Kriging : exact model and metamodel')
    view = otv.View(graph)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_002.png
   :alt: 1D Kriging : exact model and metamodel
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_002.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 149-150

We can retrieve the calibrated trend coefficient:

.. GENERATED FROM PYTHON SOURCE LINES 150-153

.. code-block:: default

    c0 = result.getTrendCoefficients()
    print("The trend is the curve m(x) = %.6e" % c0[0][0])

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    The trend is the curve m(x) = -2.393406e+00

.. GENERATED FROM PYTHON SOURCE LINES 154-156

We also pay attention to the trained covariance model and observe the values of the
hyperparameters:

.. GENERATED FROM PYTHON SOURCE LINES 156-162

.. code-block:: default

    rho = result.getCovarianceModel().getScale()[0]
    print("Scale parameter : %.3e" % rho)

    sigma = result.getCovarianceModel().getAmplitude()[0]
    print("Amplitude parameter : %.3e" % sigma)

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Scale parameter : 9.428e-01
    Amplitude parameter : 6.305e+00

.. GENERATED FROM PYTHON SOURCE LINES 163-164

We build the trend from the coefficient:

.. GENERATED FROM PYTHON SOURCE LINES 164-168

.. code-block:: default

    constantTrend = ot.SymbolicFunction(['a', 'x'], ['a'])
    myTrend = ot.ParametricFunction(constantTrend, [0], [c0[0][0]])

.. GENERATED FROM PYTHON SOURCE LINES 169-170

We draw the trend found by the kriging procedure:

.. GENERATED FROM PYTHON SOURCE LINES 170-182

.. code-block:: default

    y_test = myTrend(myTransform(x_test))
    curve = ot.Curve(x_test, y_test)
    curve.setLineStyle("dotdash")
    curve.setColor("green")
    graph.add(curve)
    graph.setLegends(['exact model', 'training data', 'kriging metamodel', 'constant trend'])
    graph.setLegendPosition("bottom")
    graph.setTitle('1D Kriging with a constant trend')
    view = otv.View(graph)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_003.png
   :alt: 1D Kriging with a constant trend
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_003.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 183-184

We create a small function returning the bounds of the 68% confidence interval
:math:`[m(x) - \sigma, m(x) + \sigma]` around the trend:

.. GENERATED FROM PYTHON SOURCE LINES 184-194

.. code-block:: default

    def plot_ICBounds(x_test, y_test, nTest, sigma):
        vUp = [[x_test[i][0], y_test[i][0] + sigma] for i in range(nTest)]
        vLow = [[x_test[i][0], y_test[i][0] - sigma] for i in range(nTest)]
        polyData = [[vLow[i], vLow[i + 1], vUp[i + 1], vUp[i]] for i in range(nTest - 1)]
        polygonList = [ot.Polygon(polyData[i], "grey", "grey") for i in range(nTest - 1)]
        boundsPoly = ot.PolygonArray(polygonList)
        return boundsPoly
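This band has a constant half-width equal to the amplitude :math:`\sigma`: it is a simplification
which ignores the fact that the conditional variance of a kriging metamodel vanishes at the
training points. As an optional complement (not part of the original example; the method name and
signature may vary with the OpenTURNS version), the exact conditional variance can be retrieved
from the kriging result, evaluated on the transformed inputs:

.. code-block:: default

    # Hedged sketch: KrigingResult.getConditionalMarginalVariance is assumed available here.
    x_test_std = myTransform(x_test)
    condVar = [result.getConditionalMarginalVariance(x_test_std[i]) for i in range(nTest)]
    # The variance is (numerically) zero near the training points and grows in between.
    print("Conditional variance, min : %.3e" % min(condVar))
    print("Conditional variance, max : %.3e" % max(condVar))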
.. GENERATED FROM PYTHON SOURCE LINES 195-196

We now draw this band:

.. GENERATED FROM PYTHON SOURCE LINES 196-205

.. code-block:: default

    boundsPoly = plot_ICBounds(x_test, y_test, nTest, sigma)
    graph.add(boundsPoly)
    graph.setLegends(['exact model', 'training data', 'kriging metamodel',
                      'constant trend', '68% - confidence interval'])
    graph.setLegendPosition("bottom")
    graph.setTitle('1D Kriging with a constant trend')
    view = otv.View(graph)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_004.png
   :alt: 1D Kriging with a constant trend
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_004.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 206-211

As expected with a constant basis, the trend obtained is a horizontal line amidst the data
points. The main idea is that at any point :math:`x_0`, the Gaussian term :math:`Z(x_0)` helps to
find a good value of :math:`\hat{Y}(x_0)` thanks to the high value of the amplitude
(:math:`\sigma \approx 6.305`), which allows the metamodel to move far away from the trend
:math:`m(x)`. As a consequence, the 68% confidence interval is wide.

.. GENERATED FROM PYTHON SOURCE LINES 214-219

Linear basis
------------

With a linear basis, the vector space is spanned by the basis :math:`\{1, X\}`: it contains all
the lines of the form :math:`y(X) = aX + b`.

.. GENERATED FROM PYTHON SOURCE LINES 219-222

.. code-block:: default

    basis = ot.LinearBasisFactory(dimension).build()

.. GENERATED FROM PYTHON SOURCE LINES 223-224

We run the kriging analysis and store the result:

.. GENERATED FROM PYTHON SOURCE LINES 224-230

.. code-block:: default

    algo = ot.KrigingAlgorithm(myTransform(Xtrain), Ytrain, covarianceModel, basis)
    algo.run()
    result = algo.getResult()
    metamodel = ot.ComposedFunction(result.getMetaModel(), myTransform)

.. GENERATED FROM PYTHON SOURCE LINES 231-232

We can draw the metamodel and the exact model on the same graph.

.. GENERATED FROM PYTHON SOURCE LINES 232-244

.. code-block:: default

    graph = plot_exact_model()
    y_test = metamodel(x_test)
    curve = ot.Curve(x_test, y_test)
    curve.setLineStyle("dashed")
    curve.setColor("red")
    graph.add(curve)
    graph.setLegends(['exact model', 'training data', 'kriging metamodel'])
    graph.setLegendPosition("bottom")
    graph.setTitle('1D Kriging : exact model and metamodel')
    view = otv.View(graph)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_005.png
   :alt: 1D Kriging : exact model and metamodel
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_005.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 245-246

We can retrieve the calibrated trend coefficients with `getTrendCoefficients`:

.. GENERATED FROM PYTHON SOURCE LINES 246-250

.. code-block:: default

    c0 = result.getTrendCoefficients()
    print("Trend is the curve m(X) = %.6e X + %.6e" % (c0[0][1], c0[0][0]))

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Trend is the curve m(X) = -4.065459e+00 X + -1.937932e+00

.. GENERATED FROM PYTHON SOURCE LINES 251-253

We also pay attention to the trained covariance model and observe the values of the
hyperparameters:

.. GENERATED FROM PYTHON SOURCE LINES 253-260

.. code-block:: default

    rho = result.getCovarianceModel().getScale()[0]
    print("Scale parameter : %.3e" % rho)

    sigma = result.getCovarianceModel().getAmplitude()[0]
    print("Amplitude parameter : %.3e" % sigma)

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Scale parameter : 9.443e-01
    Amplitude parameter : 5.513e+00
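Since the kriging was performed on the standardized input, these coefficients apply to the
transformed variable :math:`X = (x - \mu)/\sigma_X`, where :math:`\mu` and :math:`\sigma_X` are
the training mean and standard deviation computed earlier. As an aside (not part of the original
example), the trend can be written back in the original coordinate as
:math:`m(x) = \frac{a}{\sigma_X} x + b - \frac{a \mu}{\sigma_X}`:

.. code-block:: default

    # Express the calibrated linear trend in the original (untransformed) coordinate x.
    a, b = c0[0][1], c0[0][0]
    slope = a / stdDev
    intercept = b - a * mean / stdDev
    print("Trend in the original coordinate : m(x) = %.6e x + %.6e" % (slope, intercept))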
.. GENERATED FROM PYTHON SOURCE LINES 261-262

We draw the linear trend that we are interested in.

.. GENERATED FROM PYTHON SOURCE LINES 262-276

.. code-block:: default

    linearTrend = ot.SymbolicFunction(['a', 'b', 'x'], ['a*x+b'])
    myTrend = ot.ParametricFunction(linearTrend, [0, 1], [c0[0][1], c0[0][0]])

    y_test = myTrend(myTransform(x_test))
    curve = ot.Curve(x_test, y_test)
    curve.setLineStyle("dotdash")
    curve.setColor("green")
    graph.add(curve)
    graph.setLegends(['exact model', 'training data', 'kriging metamodel', 'linear trend'])
    graph.setLegendPosition("bottom")
    graph.setTitle('1D Kriging with a linear trend')
    view = otv.View(graph)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_006.png
   :alt: 1D Kriging with a linear trend
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_006.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 277-278

The same graphic with the 68% confidence interval:

.. GENERATED FROM PYTHON SOURCE LINES 278-286

.. code-block:: default

    boundsPoly = plot_ICBounds(x_test, y_test, nTest, sigma)
    graph.add(boundsPoly)
    graph.setLegends(['exact model', 'training data', 'kriging metamodel',
                      'linear trend', '68% - confidence interval'])
    graph.setLegendPosition("bottom")
    graph.setTitle('1D Kriging with a linear trend')
    view = otv.View(graph)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_007.png
   :alt: 1D Kriging with a linear trend
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_007.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 287-291

The trend obtained is decreasing on the interval of study: that is the general tendency we
observe in the exact model. It is still nowhere close to the exact model, but as in the constant
case the Gaussian part does the job of building a (at least visually) correct metamodel. We note
that the values of the amplitude and scale parameters are similar to those of the previous,
constant case. As can be seen in the previous figure, the metamodel is interpolating (see the
last data point).

.. GENERATED FROM PYTHON SOURCE LINES 294-298

Quadratic basis
---------------

In this last paragraph we turn to the quadratic basis. All the subsequent analysis remains the
same.

.. GENERATED FROM PYTHON SOURCE LINES 298-301

.. code-block:: default

    basis = ot.QuadraticBasisFactory(dimension).build()

.. GENERATED FROM PYTHON SOURCE LINES 302-303

We run the kriging analysis and store the result:

.. GENERATED FROM PYTHON SOURCE LINES 303-308

.. code-block:: default

    algo = ot.KrigingAlgorithm(myTransform(Xtrain), Ytrain, covarianceModel, basis)
    algo.run()
    result = algo.getResult()
    metamodel = ot.ComposedFunction(result.getMetaModel(), myTransform)

.. GENERATED FROM PYTHON SOURCE LINES 309-310

We can draw the metamodel and the exact model on the same graph.

.. GENERATED FROM PYTHON SOURCE LINES 310-322

.. code-block:: default

    graph = plot_exact_model()
    y_test = metamodel(x_test)
    curve = ot.Curve(x_test, y_test)
    curve.setLineStyle("dashed")
    curve.setColor("red")
    graph.add(curve)
    graph.setLegends(['exact model', 'training data', 'kriging metamodel'])
    graph.setLegendPosition("bottom")
    graph.setTitle('1D Kriging : exact model and metamodel')
    view = otv.View(graph)
.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_008.png
   :alt: 1D Kriging : exact model and metamodel
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_008.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 323-324

We can retrieve the calibrated trend coefficients with `getTrendCoefficients`:

.. GENERATED FROM PYTHON SOURCE LINES 324-329

.. code-block:: default

    c0 = result.getTrendCoefficients()
    print("Trend is the curve m(X) = %.6e X**2 + %.6e X + %.6e"
          % (c0[0][2], c0[0][1], c0[0][0]))

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Trend is the curve m(X) = -4.804137e+00 X**2 + -6.654850e-01 X + 3.128888e+00

.. GENERATED FROM PYTHON SOURCE LINES 330-332

We also pay attention to the trained covariance model and observe the values of the
hyperparameters:

.. GENERATED FROM PYTHON SOURCE LINES 332-338

.. code-block:: default

    rho = result.getCovarianceModel().getScale()[0]
    print("Scale parameter : %.3e" % rho)

    sigma = result.getCovarianceModel().getAmplitude()[0]
    print("Amplitude parameter : %.3e" % sigma)

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Scale parameter : 1.000e-02
    Amplitude parameter : 6.844e-01

.. GENERATED FROM PYTHON SOURCE LINES 339-340

The quadratic trend obtained is:

.. GENERATED FROM PYTHON SOURCE LINES 340-345

.. code-block:: default

    quadraticTrend = ot.SymbolicFunction(['a', 'b', 'c', 'x'], ['a*x^2 + b*x + c'])
    myTrend = ot.ParametricFunction(quadraticTrend, [0, 1, 2],
                                    [c0[0][2], c0[0][1], c0[0][0]])

.. GENERATED FROM PYTHON SOURCE LINES 346-347

In this case we restrict the visualization to the domain :math:`[0,6] \times [-2.5,4]`.

.. GENERATED FROM PYTHON SOURCE LINES 347-362

.. code-block:: default

    y_test = myTrend(myTransform(x_test))
    curve = ot.Curve(x_test, y_test)
    curve.setLineStyle("dotdash")
    curve.setColor("green")
    graph.add(curve)
    graph.setLegends(['exact model', 'training data', 'kriging metamodel', 'quadratic trend'])
    graph.setLegendPosition("bottom")
    graph.setTitle('1D Kriging with a quadratic trend')
    view = otv.View(graph)
    axes = view.getAxes()
    _ = axes[0].set_xlim(0.0, 6.0)
    _ = axes[0].set_ylim(-2.5, 4.0)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_009.png
   :alt: 1D Kriging with a quadratic trend
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_009.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 363-365

The same graphic with the 68% confidence interval.

.. GENERATED FROM PYTHON SOURCE LINES 365-377

.. code-block:: default

    boundsPoly = plot_ICBounds(x_test, y_test, nTest, sigma)
    graph.add(boundsPoly)
    graph.setLegends(['exact model', 'training data', 'kriging metamodel',
                      'quadratic trend', '68% - confidence interval'])
    graph.setLegendPosition("bottom")
    graph.setTitle('1D Kriging with a quadratic trend')
    view = otv.View(graph)
    axes = view.getAxes()
    _ = axes[0].set_xlim(0.0, 6.0)
    _ = axes[0].set_ylim(-2.5, 4.0)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_010.png
   :alt: 1D Kriging with a quadratic trend
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_010.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 378-379

In this case the analysis is interesting to understand the general behaviour of the kriging
process.

.. GENERATED FROM PYTHON SOURCE LINES 381-393

From the previous figure we see that the metamodel is mostly characterized by the trend, which is
a good approximation of the exact solution over the interval. However, to a certain extent, we
have lost flexibility by choosing such a good and rigid trend, because we still have to impose
the interpolation property. The gap between the trend and the exact model is small, and so is the
amplitude of the Gaussian process: :math:`\sigma \approx 0.684`. The confidence interval is
therefore narrower than before. The scale parameter is small, :math:`\rho \approx 0.010`, meaning
that two distant points rapidly become uncorrelated: the only way to enforce the interpolation
property is then to create small peaks (see the figure). The unfortunate conclusion is that the
metamodel loses its smoothness. In that case one should prefer a constant or linear basis, even
if a quadratic trend seems right.
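One way to back this conclusion with numbers (this comparison is an addition, not part of the
original script) is to rebuild the three metamodels in a loop and to measure, for each trend
basis, the maximum absolute error with respect to the exact model on the test grid:

.. code-block:: default

    # Optional comparison of the three trend bases on the test grid.
    y_exact = model(x_test)
    factories = [("constant", ot.ConstantBasisFactory(dimension)),
                 ("linear", ot.LinearBasisFactory(dimension)),
                 ("quadratic", ot.QuadraticBasisFactory(dimension))]
    for name, factory in factories:
        algo = ot.KrigingAlgorithm(myTransform(Xtrain), Ytrain, covarianceModel, factory.build())
        algo.run()
        meta = ot.ComposedFunction(algo.getResult().getMetaModel(), myTransform)
        y_meta = meta(x_test)
        maxError = max(abs(y_meta[i][0] - y_exact[i][0]) for i in range(nTest))
        print("%-9s basis, max |metamodel - exact| on the test grid : %.3e" % (name, maxError))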
.. GENERATED FROM PYTHON SOURCE LINES 396-397

Display figures

.. GENERATED FROM PYTHON SOURCE LINES 397-398

.. code-block:: default

    plt.show()

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 1.360 seconds)


.. _sphx_glr_download_auto_meta_modeling_kriging_metamodel_plot_kriging_chose_trend.py:

.. only:: html

  .. container:: sphx-glr-footer
     :class: sphx-glr-footer-example

     .. container:: sphx-glr-download sphx-glr-download-python

        :download:`Download Python source code: plot_kriging_chose_trend.py <plot_kriging_chose_trend.py>`

     .. container:: sphx-glr-download sphx-glr-download-jupyter

        :download:`Download Jupyter notebook: plot_kriging_chose_trend.ipynb <plot_kriging_chose_trend.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_