.. DO NOT EDIT. .. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY. .. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE: .. "auto_meta_modeling/kriging_metamodel/plot_kriging_chose_trend.py" .. LINE NUMBERS ARE GIVEN BELOW. .. only:: html .. note:: :class: sphx-glr-download-link-note :ref:`Go to the end ` to download the full example code .. rst-class:: sphx-glr-example-title .. _sphx_glr_auto_meta_modeling_kriging_metamodel_plot_kriging_chose_trend.py: Kriging : choose a trend vector space ===================================== .. GENERATED FROM PYTHON SOURCE LINES 5-9 .. code-block:: Python import openturns as ot import openturns.viewer as otv from matplotlib import pylab as plt .. GENERATED FROM PYTHON SOURCE LINES 10-25 Introduction ------------ In this example we present the basic trends which we may use in the kriging metamodel procedure : - ConstantBasis ; - LinearBasis ; - QuadraticBasis ; The model is the simple real function defined by .. math:: f : x \mapsto x \left( \sin \frac{x}{2} \right) which we study on the :math:`[0,10]` interval. .. GENERATED FROM PYTHON SOURCE LINES 27-30 .. code-block:: Python dimension = 1 .. GENERATED FROM PYTHON SOURCE LINES 31-49 The focus being on the trend, we assume a given squared exponential covariance kernel : .. math:: C_{\theta}(s,t) = \sigma^2 e^{ - \frac{(t-s)^2}{2\rho^2} } where :math:`\theta = (\sigma, \rho)` is the vector of hyperparameters. The nature of the covariance model is fixed but its parameters are calibrated during the kriging process. We want to choose them so as to best fit our data. Finally, the kriging (meta)model :math:`\hat{Y}(x)` reads as .. math:: \hat{Y}(x) = m(x) + Z(x) where :math:`m(.)` is the trend and :math:`Z(.)` is a Gaussian process with zero mean and covariance function :math:`C_{\theta}(s,t)`. The trend is deterministic and the Gaussian process is probabilistic, but both contribute to the metamodel.
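As a quick numerical check of the kernel formula above, we can evaluate :math:`C_{\theta}(s,t)` by hand. The sketch below is independent of the library; the values :math:`\sigma = 1.5` and :math:`\rho = 2` are arbitrary choices for illustration only :

```python
import math


def squared_exponential(s, t, sigma=1.5, rho=2.0):
    # C_theta(s, t) = sigma^2 * exp(-(t - s)^2 / (2 * rho^2))
    return sigma**2 * math.exp(-((t - s) ** 2) / (2.0 * rho**2))


# At s = t the kernel reduces to the variance sigma^2.
print(squared_exponential(1.0, 1.0))  # 2.25
# The correlation decays as the distance |t - s| grows :
print(squared_exponential(0.0, 1.0) > squared_exponential(0.0, 4.0))  # True
```

The kernel is stationary : it depends only on the distance :math:`|t-s|`, and :math:`\rho` controls how fast the correlation between two points vanishes.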
A special feature of kriging is the interpolation property : the metamodel is exact at the training data. .. GENERATED FROM PYTHON SOURCE LINES 49-51 .. code-block:: Python covarianceModel = ot.SquaredExponential([1.0], [1.0]) .. GENERATED FROM PYTHON SOURCE LINES 52-53 We define our exact model with a `SymbolicFunction` : .. GENERATED FROM PYTHON SOURCE LINES 53-56 .. code-block:: Python model = ot.SymbolicFunction(["x"], ["x*sin(0.5*x)"]) .. GENERATED FROM PYTHON SOURCE LINES 57-58 We use the following sample to train our metamodel : .. GENERATED FROM PYTHON SOURCE LINES 58-61 .. code-block:: Python nTrain = 5 Xtrain = ot.Sample([[0.5], [1.3], [2.4], [5.6], [8.9]]) .. GENERATED FROM PYTHON SOURCE LINES 62-63 The values of the exact model are also needed for training : .. GENERATED FROM PYTHON SOURCE LINES 63-65 .. code-block:: Python Ytrain = model(Xtrain) .. GENERATED FROM PYTHON SOURCE LINES 66-67 We shall test the metamodel on a set of points given by a regular grid. .. GENERATED FROM PYTHON SOURCE LINES 67-71 .. code-block:: Python nTest = 101 step = 0.1 x_test = ot.RegularGrid(0, step, nTest).getVertices() .. GENERATED FROM PYTHON SOURCE LINES 72-73 We draw the training points and the model at the testing points. We encapsulate this into a function so as to use it again later. .. GENERATED FROM PYTHON SOURCE LINES 73-89 .. code-block:: Python def plot_exact_model(): graph = ot.Graph("", "x", "", True, "") y_test = model(x_test) curveModel = ot.Curve(x_test, y_test) curveModel.setLineStyle("solid") curveModel.setColor("black") graph.add(curveModel) cloud = ot.Cloud(Xtrain, Ytrain) cloud.setColor("black") cloud.setPointStyle("fsquare") graph.add(cloud) return graph .. GENERATED FROM PYTHON SOURCE LINES 90-97 .. code-block:: Python graph = plot_exact_model() graph.setLegends(["exact model", "training data"]) graph.setLegendPosition("bottom") graph.setTitle("1D Kriging : exact model") view = otv.View(graph) ..
image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_001.png :alt: 1D Kriging : exact model :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_001.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 98-103 A common pre-processing step is to apply a transform on the input data before performing the kriging. To do so we write a linear transform of our input data : we center it at its mean and scale it by its standard deviation. Then we freeze the mean and the standard deviation at their estimated values with the `ParametricFunction` class. We build the inverse transform as well. We first compute the mean and standard deviation of the input data : .. GENERATED FROM PYTHON SOURCE LINES 103-108 .. code-block:: Python mean = Xtrain.computeMean()[0] stdDev = Xtrain.computeStandardDeviation()[0] print("Xtrain, mean : %.3f" % mean) print("Xtrain, standard deviation : %.3f" % stdDev) .. rst-class:: sphx-glr-script-out .. code-block:: none Xtrain, mean : 3.740 Xtrain, standard deviation : 3.476 .. GENERATED FROM PYTHON SOURCE LINES 109-115 .. code-block:: Python tf = ot.SymbolicFunction(["mu", "sigma", "x"], ["(x-mu)/sigma"]) itf = ot.SymbolicFunction(["mu", "sigma", "x"], ["sigma*x+mu"]) myInverseTransform = ot.ParametricFunction(itf, [0, 1], [mean, stdDev]) myTransform = ot.ParametricFunction(tf, [0, 1], [mean, stdDev]) .. GENERATED FROM PYTHON SOURCE LINES 116-121 A constant basis ---------------- In this paragraph we choose a constant basis for the kriging. There is only one unknown, which is the value of the constant. The basis is built with the :class:`~openturns.ConstantBasisFactory` class. .. GENERATED FROM PYTHON SOURCE LINES 121-123 .. code-block:: Python basis = ot.ConstantBasisFactory(dimension).build() .. GENERATED FROM PYTHON SOURCE LINES 124-126 We build the kriging algorithm by giving it the transformed data, the output data, the covariance model and the basis. .. GENERATED FROM PYTHON SOURCE LINES 126-128 ..
code-block:: Python algo = ot.KrigingAlgorithm(myTransform(Xtrain), Ytrain, covarianceModel, basis) .. GENERATED FROM PYTHON SOURCE LINES 129-130 We can run the algorithm and store the result : .. GENERATED FROM PYTHON SOURCE LINES 130-133 .. code-block:: Python algo.run() result = algo.getResult() .. GENERATED FROM PYTHON SOURCE LINES 134-135 The metamodel is the following :class:`~openturns.ComposedFunction` : .. GENERATED FROM PYTHON SOURCE LINES 135-137 .. code-block:: Python metamodel = ot.ComposedFunction(result.getMetaModel(), myTransform) .. GENERATED FROM PYTHON SOURCE LINES 138-139 We can draw the metamodel and the exact model on the same graph. .. GENERATED FROM PYTHON SOURCE LINES 139-150 .. code-block:: Python graph = plot_exact_model() y_test = metamodel(x_test) curve = ot.Curve(x_test, y_test) curve.setLineStyle("dashed") curve.setColor("red") graph.add(curve) graph.setLegends(["exact model", "training data", "kriging metamodel"]) graph.setLegendPosition("bottom") graph.setTitle("1D Kriging : exact model and metamodel") view = otv.View(graph) .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_002.png :alt: 1D Kriging : exact model and metamodel :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_002.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 151-152 We can retrieve the calibrated trend coefficient : .. GENERATED FROM PYTHON SOURCE LINES 152-155 .. code-block:: Python c0 = result.getTrendCoefficients() print("The trend is the curve m(x) = %.6e" % c0[0]) .. rst-class:: sphx-glr-script-out .. code-block:: none The trend is the curve m(x) = -2.330278e+00 .. GENERATED FROM PYTHON SOURCE LINES 156-158 We also pay attention to the trained covariance model and observe the values of the hyperparameters : .. GENERATED FROM PYTHON SOURCE LINES 158-164 .. 
code-block:: Python rho = result.getCovarianceModel().getScale()[0] print("Scale parameter : %.3e" % rho) sigma = result.getCovarianceModel().getAmplitude()[0] print("Amplitude parameter : %.3e" % sigma) .. rst-class:: sphx-glr-script-out .. code-block:: none Scale parameter : 9.465e-01 Amplitude parameter : 6.360e+00 .. GENERATED FROM PYTHON SOURCE LINES 165-166 We build the trend from the coefficient : .. GENERATED FROM PYTHON SOURCE LINES 166-170 .. code-block:: Python constantTrend = ot.SymbolicFunction(["a", "x"], ["a"]) myTrend = ot.ParametricFunction(constantTrend, [0], [c0[0]]) .. GENERATED FROM PYTHON SOURCE LINES 171-172 We draw the trend found by the kriging procedure : .. GENERATED FROM PYTHON SOURCE LINES 172-185 .. code-block:: Python y_test = myTrend(myTransform(x_test)) curve = ot.Curve(x_test, y_test) curve.setLineStyle("dotdash") curve.setColor("green") graph.add(curve) graph.setLegends( ["exact model", "training data", "kriging metamodel", "constant trend"] ) graph.setLegendPosition("bottom") graph.setTitle("1D Kriging with a constant trend") view = otv.View(graph) .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_003.png :alt: 1D Kriging with a constant trend :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_003.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 186-187 We create a small function returning the bounds of the 68% confidence interval :math:`[-\sigma, \sigma]` : .. GENERATED FROM PYTHON SOURCE LINES 187-196 .. code-block:: Python def plot_ICBounds(x_test, y_test, nTest, sigma): vUp = [[x_test[i][0], y_test[i][0] + sigma] for i in range(nTest)] vLow = [[x_test[i][0], y_test[i][0] - sigma] for i in range(nTest)] polyData = [[vLow[i], vLow[i + 1], vUp[i + 1], vUp[i]] for i in range(nTest - 1)] polygonList = [ot.Polygon(polyData[i], "grey", "grey") for i in range(nTest - 1)] boundsPoly = ot.PolygonArray(polygonList) return boundsPoly ..
GENERATED FROM PYTHON SOURCE LINES 197-198 We now draw it : .. GENERATED FROM PYTHON SOURCE LINES 198-214 .. code-block:: Python boundsPoly = plot_ICBounds(x_test, y_test, nTest, sigma) graph.add(boundsPoly) graph.setLegends( [ "exact model", "training data", "kriging metamodel", "constant trend", "68% - confidence interval", ] ) graph.setLegendPosition("bottom") graph.setTitle("1D Kriging with a constant trend") view = otv.View(graph) .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_004.png :alt: 1D Kriging with a constant trend :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_004.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 215-220 As expected with a constant basis, the trend obtained is a horizontal line amid the data points. The main idea is that at any point :math:`x_0`, the Gaussian term :math:`Z(x_0)` helps to find a good value of :math:`\hat{Y}(x_0)`, even far away from the mean :math:`m(x)`, thanks to the high value of the amplitude :math:`\sigma` (:math:`\sigma \approx 6.36`). As a consequence the 68% confidence interval is wide. .. GENERATED FROM PYTHON SOURCE LINES 223-228 Linear basis ------------ With a linear basis, the vector space is defined by the basis :math:`\{1, X\}` : that is, all the lines of the form :math:`y(X) = aX + b`. .. GENERATED FROM PYTHON SOURCE LINES 228-231 .. code-block:: Python basis = ot.LinearBasisFactory(dimension).build() .. GENERATED FROM PYTHON SOURCE LINES 232-233 We run the kriging analysis and store the result : .. GENERATED FROM PYTHON SOURCE LINES 233-239 .. code-block:: Python algo = ot.KrigingAlgorithm(myTransform(Xtrain), Ytrain, covarianceModel, basis) algo.run() result = algo.getResult() metamodel = ot.ComposedFunction(result.getMetaModel(), myTransform) .. GENERATED FROM PYTHON SOURCE LINES 240-241 We can draw the metamodel and the exact model on the same graph. .. GENERATED FROM PYTHON SOURCE LINES 241-253 ..
code-block:: Python graph = plot_exact_model() y_test = metamodel(x_test) curve = ot.Curve(x_test, y_test) curve.setLineStyle("dashed") curve.setColor("red") graph.add(curve) graph.setLegends(["exact model", "training data", "kriging metamodel"]) graph.setLegendPosition("bottom") graph.setTitle("1D Kriging : exact model and metamodel") view = otv.View(graph) .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_005.png :alt: 1D Kriging : exact model and metamodel :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_005.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 254-255 We can retrieve the calibrated trend coefficients with `getTrendCoefficients` : .. GENERATED FROM PYTHON SOURCE LINES 255-259 .. code-block:: Python c0 = result.getTrendCoefficients() print("Trend is the curve m(X) = %.6e X + %.6e" % (c0[1], c0[0])) .. rst-class:: sphx-glr-script-out .. code-block:: none Trend is the curve m(X) = -4.110244e+00 X + -1.910779e+00 .. GENERATED FROM PYTHON SOURCE LINES 260-262 We also pay attention to the trained covariance model and observe the values of the hyperparameters : .. GENERATED FROM PYTHON SOURCE LINES 262-269 .. code-block:: Python rho = result.getCovarianceModel().getScale()[0] print("Scale parameter : %.3e" % rho) sigma = result.getCovarianceModel().getAmplitude()[0] print("Amplitude parameter : %.3e" % sigma) .. rst-class:: sphx-glr-script-out .. code-block:: none Scale parameter : 9.519e-01 Amplitude parameter : 5.576e+00 .. GENERATED FROM PYTHON SOURCE LINES 270-271 We draw the linear trend that we are interested in. .. GENERATED FROM PYTHON SOURCE LINES 271-284 .. 
code-block:: Python linearTrend = ot.SymbolicFunction(["a", "b", "x"], ["a*x+b"]) myTrend = ot.ParametricFunction(linearTrend, [0, 1], [c0[1], c0[0]]) y_test = myTrend(myTransform(x_test)) curve = ot.Curve(x_test, y_test) curve.setLineStyle("dotdash") curve.setColor("green") graph.add(curve) graph.setLegends(["exact model", "training data", "kriging metamodel", "linear trend"]) graph.setLegendPosition("bottom") graph.setTitle("1D Kriging with a linear trend") view = otv.View(graph) .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_006.png :alt: 1D Kriging with a linear trend :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_006.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 285-286 The same graphic with the 68% confidence interval : .. GENERATED FROM PYTHON SOURCE LINES 286-301 .. code-block:: Python boundsPoly = plot_ICBounds(x_test, y_test, nTest, sigma) graph.add(boundsPoly) graph.setLegends( [ "exact model", "training data", "kriging metamodel", "linear trend", "68% - confidence interval", ] ) graph.setLegendPosition("bottom") graph.setTitle("1D Kriging with a linear trend") view = otv.View(graph) .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_007.png :alt: 1D Kriging with a linear trend :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_007.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 302-307 The trend obtained is decreasing on the interval of study : this is the general tendency we observe in the exact model. It is still nowhere close to the exact model, but as in the constant case the Gaussian part does the job of building a (visually at least) correct metamodel. We note that the values of the amplitude and scale parameters are similar to those of the previous constant case.
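Note that the calibrated coefficients apply to the standardized input, which is why the trend is evaluated as `myTrend(myTransform(x_test))` in the code above. The following sketch reproduces this composition by hand, using the rounded values printed earlier (mean, standard deviation, and linear trend coefficients); it is an illustration only, not part of the original example :

```python
# Rounded values printed earlier in this example.
mu, std = 3.740, 3.476               # mean / standard deviation of Xtrain
a, b = -4.110244, -1.910779          # calibrated linear trend coefficients


def linear_trend(x):
    # The trend is fitted in the standardized coordinate u = (x - mu) / std,
    # so a physical point x must be transformed before evaluation.
    u = (x - mu) / std
    return a * u + b


# At x = mu the standardized coordinate is 0, so the trend equals b.
print(linear_trend(3.740))  # -1.910779
```

Evaluating the trend directly at a physical point without the transform would give a different (wrong) value : the transform and the trend must always be composed.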
As can be seen in the previous figure, the metamodel is interpolating (see the last data point). .. GENERATED FROM PYTHON SOURCE LINES 310-314 Quadratic basis --------------- In this last paragraph we turn to the quadratic basis. The rest of the analysis remains the same. .. GENERATED FROM PYTHON SOURCE LINES 314-317 .. code-block:: Python basis = ot.QuadraticBasisFactory(dimension).build() .. GENERATED FROM PYTHON SOURCE LINES 318-319 We run the kriging analysis and store the result : .. GENERATED FROM PYTHON SOURCE LINES 319-324 .. code-block:: Python algo = ot.KrigingAlgorithm(myTransform(Xtrain), Ytrain, covarianceModel, basis) algo.run() result = algo.getResult() metamodel = ot.ComposedFunction(result.getMetaModel(), myTransform) .. GENERATED FROM PYTHON SOURCE LINES 325-326 We can draw the metamodel and the exact model on the same graph. .. GENERATED FROM PYTHON SOURCE LINES 326-338 .. code-block:: Python graph = plot_exact_model() y_test = metamodel(x_test) curve = ot.Curve(x_test, y_test) curve.setLineStyle("dashed") curve.setColor("red") graph.add(curve) graph.setLegends(["exact model", "training data", "kriging metamodel"]) graph.setLegendPosition("bottom") graph.setTitle("1D Kriging : exact model and metamodel") view = otv.View(graph) .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_008.png :alt: 1D Kriging : exact model and metamodel :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_008.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 339-340 We can retrieve the calibrated trend coefficients with `getTrendCoefficients` : .. GENERATED FROM PYTHON SOURCE LINES 340-344 .. code-block:: Python c0 = result.getTrendCoefficients() print("Trend is the curve m(X) = %.6e X**2 + %.6e X + %.6e" % (c0[2], c0[1], c0[0])) .. rst-class:: sphx-glr-script-out .. code-block:: none Trend is the curve m(X) = -4.804137e+00 X**2 + -6.654850e-01 X + 3.128888e+00 ..
GENERATED FROM PYTHON SOURCE LINES 345-347 We also pay attention to the trained covariance model and observe the values of the hyperparameters : .. GENERATED FROM PYTHON SOURCE LINES 347-353 .. code-block:: Python rho = result.getCovarianceModel().getScale()[0] print("Scale parameter : %.3e" % rho) sigma = result.getCovarianceModel().getAmplitude()[0] print("Amplitude parameter : %.3e" % sigma) .. rst-class:: sphx-glr-script-out .. code-block:: none Scale parameter : 1.000e-02 Amplitude parameter : 6.844e-01 .. GENERATED FROM PYTHON SOURCE LINES 354-355 The quadratic trend obtained is : .. GENERATED FROM PYTHON SOURCE LINES 355-359 .. code-block:: Python quadraticTrend = ot.SymbolicFunction(["a", "b", "c", "x"], ["a*x^2 + b*x + c"]) myTrend = ot.ParametricFunction(quadraticTrend, [0, 1, 2], [c0[2], c0[1], c0[0]]) .. GENERATED FROM PYTHON SOURCE LINES 360-361 In this case we restrict the view to the domain :math:`[0,6] \times [-2.5,4]` for visualization. .. GENERATED FROM PYTHON SOURCE LINES 361-377 .. code-block:: Python y_test = myTrend(myTransform(x_test)) curve = ot.Curve(x_test, y_test) curve.setLineStyle("dotdash") curve.setColor("green") graph.add(curve) graph.setLegends( ["exact model", "training data", "kriging metamodel", "quadratic trend"] ) graph.setLegendPosition("bottom") graph.setTitle("1D Kriging with a quadratic trend") view = otv.View(graph) axes = view.getAxes() _ = axes[0].set_xlim(0.0, 6.0) _ = axes[0].set_ylim(-2.5, 4.0) .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_009.png :alt: 1D Kriging with a quadratic trend :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_009.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 378-380 The same graphic with the 68% confidence interval : .. GENERATED FROM PYTHON SOURCE LINES 380-399 ..
code-block:: Python boundsPoly = plot_ICBounds(x_test, y_test, nTest, sigma) graph.add(boundsPoly) graph.setLegends( [ "exact model", "training data", "kriging metamodel", "quadratic trend", "68% - confidence interval", ] ) graph.setLegendPosition("bottom") graph.setTitle("1D Kriging with a quadratic trend") view = otv.View(graph) axes = view.getAxes() _ = axes[0].set_xlim(0.0, 6.0) _ = axes[0].set_ylim(-2.5, 4.0) .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_010.png :alt: 1D Kriging with a quadratic trend :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_010.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 400-401 This case is interesting to understand the general behaviour of the kriging process. .. GENERATED FROM PYTHON SOURCE LINES 403-415 The previous figure shows that the metamodel is mostly characterized by the trend, which is a good approximation of the exact solution over the interval. However, to a certain extent, we have lost flexibility : the trend is too rigid, while the interpolation property must still be imposed. We see that the gap between the trend and the exact model is small and that the amplitude of the Gaussian process is small as well : :math:`\sigma \approx 0.684`. Consequently the confidence interval is narrower than before. The scale parameter is small, :math:`\rho \approx 0.010`, meaning that two distant points rapidly become independent. The only way to enforce the interpolation property is then to create small peaks (see the figure). The unfortunate conclusion is that the metamodel loses its smoothness. In that case one should prefer a constant or linear basis even if a quadratic trend seems right. .. GENERATED FROM PYTHON SOURCE LINES 418-419 Display figures .. GENERATED FROM PYTHON SOURCE LINES 419-420 .. code-block:: Python plt.show() .. _sphx_glr_download_auto_meta_modeling_kriging_metamodel_plot_kriging_chose_trend.py: ..
only:: html .. container:: sphx-glr-footer sphx-glr-footer-example .. container:: sphx-glr-download sphx-glr-download-jupyter :download:`Download Jupyter notebook: plot_kriging_chose_trend.ipynb ` .. container:: sphx-glr-download sphx-glr-download-python :download:`Download Python source code: plot_kriging_chose_trend.py `