.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_meta_modeling/kriging_metamodel/plot_kriging_chose_trend.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_meta_modeling_kriging_metamodel_plot_kriging_chose_trend.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_meta_modeling_kriging_metamodel_plot_kriging_chose_trend.py:


Kriging: choose a polynomial trend
==================================

.. GENERATED FROM PYTHON SOURCE LINES 5-9

.. code-block:: Python

    import openturns as ot
    import openturns.viewer as otv
    from matplotlib import pylab as plt

.. GENERATED FROM PYTHON SOURCE LINES 10-44

Introduction
------------

In this example we present the polynomial trends which we may use in a kriging metamodel.
This example focuses on three polynomial trends:

- :class:`~openturns.ConstantBasisFactory`;
- :class:`~openturns.LinearBasisFactory`;
- :class:`~openturns.QuadraticBasisFactory`.

In the :doc:`/auto_meta_modeling/kriging_metamodel/plot_kriging_beam_trend` example,
we give another example of this method.
In the :doc:`/auto_meta_modeling/kriging_metamodel/plot_kriging_beam_arbitrary_trend` example,
we show how to configure an arbitrary trend.

The model is the real function:

.. math::
    \model(x) = x \sin \left( \frac{x}{2} \right)

for any :math:`x \in [0,10]`.
We consider the :class:`~openturns.MaternModel` covariance kernel
where :math:`\vect{\theta} = (\sigma, \rho)` is the vector of hyperparameters.
The covariance model is fixed, but its parameters must be calibrated
depending on the data.

The kriging metamodel is:

.. math::
    \widehat{Y}(x) = m(x) + Z(x)

where :math:`m(.)` is the trend and :math:`Z(.)` is a Gaussian process with zero mean
and covariance model :math:`C_{\vect{\theta}}(s,t)`.
The trend is deterministic and the Gaussian process is probabilistic, but they both contribute to the metamodel.
A special feature of kriging is the interpolation property: the metamodel is exact at the training data.

.. GENERATED FROM PYTHON SOURCE LINES 46-48

.. code-block:: Python

    covarianceModel = ot.SquaredExponential([1.0], [1.0])

.. GENERATED FROM PYTHON SOURCE LINES 49-51

Define the model
----------------

.. GENERATED FROM PYTHON SOURCE LINES 53-54

First, we define the :class:`~openturns.MaternModel` covariance model.

.. GENERATED FROM PYTHON SOURCE LINES 54-56

.. code-block:: Python

    covarianceModel = ot.MaternModel([1.0], [1.0], 2.5)

.. GENERATED FROM PYTHON SOURCE LINES 57-58

We define our exact model with a :class:`~openturns.SymbolicFunction`.

.. GENERATED FROM PYTHON SOURCE LINES 58-61

.. code-block:: Python

    model = ot.SymbolicFunction(["x"], ["x * sin(0.5 * x)"])

.. GENERATED FROM PYTHON SOURCE LINES 62-63

We use the following sample to train our metamodel.

.. GENERATED FROM PYTHON SOURCE LINES 63-70

.. code-block:: Python

    xmin = 0.0
    xmax = 10.0
    ot.RandomGenerator.SetSeed(0)
    nTrain = 8
    Xtrain = ot.Uniform(xmin, xmax).getSample(nTrain).sort()
    Xtrain

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

         v0
    0    0.3250275
    1    1.352764
    2    3.47057
    3    5.030402
    4    6.298766
    5    8.828052
    6    9.206796
    7    9.69423

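Before evaluating the model on this training sample, we can take a quick look at the
Matérn covariance model defined earlier. The introduction mentioned its hyperparameters
:math:`\vect{\theta} = (\sigma, \rho)`, which will be calibrated from the data; the following
snippet is not part of the original script and only uses the standard
:class:`~openturns.MaternModel` accessors.

.. code-block:: Python

    # Optional check (not in the original example): inspect the initial
    # hyperparameters of the Matérn covariance model defined above.
    print("Initial scale (rho):", covarianceModel.getScale())
    print("Initial amplitude (sigma):", covarianceModel.getAmplitude())
    print("Regularity (nu):", covarianceModel.getNu())
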
.. GENERATED FROM PYTHON SOURCE LINES 71-72

The values of the exact model are also needed for training.

.. GENERATED FROM PYTHON SOURCE LINES 72-75

.. code-block:: Python

    Ytrain = model(Xtrain)
    Ytrain

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

         y0
    0    0.05258924
    1    0.8467968
    2    3.423725
    3    2.94895
    4   -0.0490677
    5   -8.43802
    6   -9.152166
    7   -9.606383

.. GENERATED FROM PYTHON SOURCE LINES 76-77

We shall test the model on a set of points based on a regular grid.

.. GENERATED FROM PYTHON SOURCE LINES 77-81

.. code-block:: Python

    nTest = 100
    step = (xmax - xmin) / (nTest - 1)
    x_test = ot.RegularGrid(xmin, step, nTest).getVertices()

.. GENERATED FROM PYTHON SOURCE LINES 82-83

We draw the training points and the model at the testing points.
We encapsulate this plot into a function so that we can reuse it later.

.. GENERATED FROM PYTHON SOURCE LINES 83-102

.. code-block:: Python

    def plot_exact_model(color):
        graph = ot.Graph("", "x", "", True, "")
        y_test = model(x_test)
        curveModel = ot.Curve(x_test, y_test)
        curveModel.setLineStyle("solid")
        curveModel.setColor(color)
        curveModel.setLegend("Model")
        graph.add(curveModel)
        cloud = ot.Cloud(Xtrain, Ytrain)
        cloud.setColor(color)
        cloud.setPointStyle("fsquare")
        cloud.setLegend("Data")
        graph.add(cloud)
        graph.setLegendPosition("bottom")
        return graph

.. GENERATED FROM PYTHON SOURCE LINES 103-108

.. code-block:: Python

    palette = ot.Drawable.BuildDefaultPalette(5)
    graph = plot_exact_model(palette[0])
    graph.setTitle("1D Kriging: exact model")
    view = otv.View(graph)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_001.png
   :alt: 1D Kriging: exact model
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 109-111

Scale the input training sample
-------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 113-121

We often have to apply a transform to the input data before building the kriging metamodel.
This makes the estimation of the hyperparameters of the kriging metamodel easier for the optimization algorithm.
To do so, we define a linear transform of the input data: we center it at its mean and scale it by its standard deviation.
Then we fix the mean and the standard deviation to their estimated values with the :class:`~openturns.ParametricFunction` class.
We build the inverse transform as well.

We first compute the mean and standard deviation of the input data.

.. GENERATED FROM PYTHON SOURCE LINES 121-126

.. code-block:: Python

    mean = Xtrain.computeMean()[0]
    stdDev = Xtrain.computeStandardDeviation()[0]
    print("Xtrain, mean: %.3f" % mean)
    print("Xtrain, standard deviation: %.3f" % stdDev)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Xtrain, mean: 5.526
    Xtrain, standard deviation: 3.613

.. GENERATED FROM PYTHON SOURCE LINES 127-132

.. code-block:: Python

    tf = ot.SymbolicFunction(["mu", "sigma", "x"], ["(x - mu) / sigma"])
    itf = ot.SymbolicFunction(["mu", "sigma", "x"], ["sigma * x + mu"])
    myInverseTransform = ot.ParametricFunction(itf, [0, 1], [mean, stdDev])
    myTransform = ot.ParametricFunction(tf, [0, 1], [mean, stdDev])

.. GENERATED FROM PYTHON SOURCE LINES 133-134

Scale the input training sample.

.. GENERATED FROM PYTHON SOURCE LINES 134-138

.. code-block:: Python

    scaledXtrain = myTransform(Xtrain)
    scaledXtrain

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

         y0
    0   -1.439601
    1   -1.15512
    2   -0.5689025
    3   -0.1371353
    4    0.2139527
    5    0.9140688
    6    1.018907
    7    1.15383

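As a quick sanity check (this snippet is not part of the original script), we can verify that the
scaled sample is indeed centered with unit standard deviation, and that the inverse transform maps
the scaled points back to the original ones. Only objects defined above are used.

.. code-block:: Python

    # Sanity check (not in the original example): the scaled training sample
    # should have (approximately) zero mean and unit standard deviation.
    print("scaledXtrain, mean: %.3f" % scaledXtrain.computeMean()[0])
    print("scaledXtrain, standard deviation: %.3f" % scaledXtrain.computeStandardDeviation()[0])
    # The inverse transform recovers the first original training point.
    print("Back-transformed first point:", myInverseTransform(scaledXtrain[0]))
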
.. GENERATED FROM PYTHON SOURCE LINES 139-146

Constant basis
--------------

In this paragraph we choose a constant basis for the kriging trend.
This trend has only one parameter, which is the value of the constant.
The basis is built with the :class:`~openturns.ConstantBasisFactory` class.

.. GENERATED FROM PYTHON SOURCE LINES 148-151

.. code-block:: Python

    dimension = 1
    basis = ot.ConstantBasisFactory(dimension).build()

.. GENERATED FROM PYTHON SOURCE LINES 152-154

We build the kriging algorithm by giving it the transformed data, the output data,
the covariance model and the basis.

.. GENERATED FROM PYTHON SOURCE LINES 154-156

.. code-block:: Python

    algo = ot.KrigingAlgorithm(scaledXtrain, Ytrain, covarianceModel, basis)

.. GENERATED FROM PYTHON SOURCE LINES 157-158

We can run the algorithm and store the result.

.. GENERATED FROM PYTHON SOURCE LINES 158-161

.. code-block:: Python

    algo.run()
    result = algo.getResult()

.. GENERATED FROM PYTHON SOURCE LINES 162-163

The metamodel is the following :class:`~openturns.ComposedFunction`.

.. GENERATED FROM PYTHON SOURCE LINES 163-165

.. code-block:: Python

    metamodel = ot.ComposedFunction(result.getMetaModel(), myTransform)

.. GENERATED FROM PYTHON SOURCE LINES 166-167

Define a function to plot the metamodel.

.. GENERATED FROM PYTHON SOURCE LINES 167-180

.. code-block:: Python

    def plotMetamodel(x_test, krigingResult, myTransform, color):
        scaled_x_test = myTransform(x_test)
        metamodel = krigingResult.getMetaModel()
        y_test = metamodel(scaled_x_test)
        curve = ot.Curve(x_test, y_test)
        curve.setLineStyle("dashed")
        curve.setColor(color)
        curve.setLegend("Metamodel")
        return curve

.. GENERATED FROM PYTHON SOURCE LINES 181-182

We can draw the metamodel and the exact model on the same graph.

.. GENERATED FROM PYTHON SOURCE LINES 182-187

.. code-block:: Python

    graph = plot_exact_model(palette[0])
    graph.add(plotMetamodel(x_test, result, myTransform, palette[1]))
    graph.setTitle("1D Kriging: exact model and metamodel")
    view = otv.View(graph)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_002.png
   :alt: 1D Kriging: exact model and metamodel
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_002.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 188-189

We can retrieve the calibrated trend coefficient with :meth:`~openturns.KrigingResult.getTrendCoefficients`.

.. GENERATED FROM PYTHON SOURCE LINES 189-192

.. code-block:: Python

    c0 = result.getTrendCoefficients()
    print("The trend is the curve m(x) = %.6e" % c0[0])

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    The trend is the curve m(x) = -2.726225e+00

.. GENERATED FROM PYTHON SOURCE LINES 193-194

We also observe the values of the hyperparameters of the trained covariance model.

.. GENERATED FROM PYTHON SOURCE LINES 194-200

.. code-block:: Python

    rho = result.getCovarianceModel().getScale()[0]
    print("Scale parameter: %.3e" % rho)

    sigma = result.getCovarianceModel().getAmplitude()[0]
    print("Amplitude parameter: %.3e" % sigma)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Scale parameter: 1.368e+00
    Amplitude parameter: 6.077e+00

.. GENERATED FROM PYTHON SOURCE LINES 201-202

We build the trend from the coefficient.

.. GENERATED FROM PYTHON SOURCE LINES 202-206

.. code-block:: Python

    constantTrend = ot.SymbolicFunction(["a", "x"], ["a"])
    myTrend = ot.ParametricFunction(constantTrend, [0], [c0[0]])

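Before plotting, we can check the interpolation property stated in the introduction: at the
training points, the kriging metamodel reproduces the training outputs up to numerical precision.
This small verification is an addition to this example and only uses objects defined above.

.. code-block:: Python

    # Check of the interpolation property (not in the original example):
    # the residuals at the training points should be numerically close to zero.
    predictions = metamodel(Xtrain)  # the composed metamodel takes unscaled inputs
    for i in range(nTrain):
        residual = Ytrain[i, 0] - predictions[i, 0]
        print("x = %.4f, residual = %.2e" % (Xtrain[i, 0], residual))
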
.. GENERATED FROM PYTHON SOURCE LINES 207-208

Define a function to plot the trend.

.. GENERATED FROM PYTHON SOURCE LINES 208-220

.. code-block:: Python

    def plotTrend(x_test, myTrend, myTransform, color):
        scale_x_test = myTransform(x_test)
        y_test = myTrend(scale_x_test)
        curve = ot.Curve(x_test, y_test)
        curve.setLineStyle("dotdash")
        curve.setColor(color)
        curve.setLegend("Trend")
        return curve

.. GENERATED FROM PYTHON SOURCE LINES 221-222

We draw the trend found by the kriging procedure.

.. GENERATED FROM PYTHON SOURCE LINES 222-226

.. code-block:: Python

    graph.add(plotTrend(x_test, myTrend, myTransform, palette[2]))
    graph.setTitle("1D Kriging with a constant trend")
    view = otv.View(graph)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_003.png
   :alt: 1D Kriging with a constant trend
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_003.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 227-228

Create a function to plot confidence bounds.

.. GENERATED FROM PYTHON SOURCE LINES 230-262

.. code-block:: Python

    def plotKrigingConfidenceBounds(krigingResult, x_test, myTransform, color, alpha=0.05):
        bilateralCI = ot.Normal().computeBilateralConfidenceInterval(1.0 - alpha)
        quantileAlpha = bilateralCI.getUpperBound()[0]
        sqrt = ot.SymbolicFunction(["x"], ["sqrt(x)"])
        n_test = x_test.getSize()
        epsilon = ot.Sample(n_test, [1.0e-8])
        scaled_x_test = myTransform(x_test)
        conditionalVariance = (
            krigingResult.getConditionalMarginalVariance(scaled_x_test) + epsilon
        )
        conditionalSigma = sqrt(conditionalVariance)
        metamodel = krigingResult.getMetaModel()
        y_test = metamodel(scaled_x_test)
        dataLower = [
            [y_test[i, 0] - quantileAlpha * conditionalSigma[i, 0]] for i in range(n_test)
        ]
        dataUpper = [
            [y_test[i, 0] + quantileAlpha * conditionalSigma[i, 0]] for i in range(n_test)
        ]
        dataLower = ot.Sample(dataLower)
        dataUpper = ot.Sample(dataUpper)
        vLow = [[x_test[i, 0], dataLower[i, 0]] for i in range(n_test)]
        vUp = [[x_test[i, 0], dataUpper[i, 0]] for i in range(n_test)]
        polyData = [[vLow[i], vLow[i + 1], vUp[i + 1], vUp[i]] for i in range(n_test - 1)]
        polygonList = [ot.Polygon(polyData[i], color, color) for i in range(n_test - 1)]
        boundsPoly = ot.PolygonArray(polygonList)
        boundsPoly.setLegend("%d%% C.I." % ((1.0 - alpha) * 100))
        return boundsPoly

.. GENERATED FROM PYTHON SOURCE LINES 263-264

Plot a confidence interval.

.. GENERATED FROM PYTHON SOURCE LINES 264-275

.. code-block:: Python

    graph.add(plotKrigingConfidenceBounds(result, x_test, myTransform, palette[3]))
    graph.setTitle("1D Kriging with a constant trend")
    view = otv.View(
        graph,
        legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"},
        figure_kw={"figsize": (7.0, 3.0)},
    )
    plt.subplots_adjust(right=0.6)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_004.png
   :alt: 1D Kriging with a constant trend
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_004.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 276-277

As expected with a constant basis, the trend obtained is a horizontal line.

.. GENERATED FROM PYTHON SOURCE LINES 280-286

Linear basis
------------

With a linear basis, the vector space is spanned by the basis :math:`\{1, z\}`:
it contains all the lines of the form :math:`y(z) = az + b`,
where :math:`a` and :math:`b` are real parameters.

.. GENERATED FROM PYTHON SOURCE LINES 286-289

.. code-block:: Python

    basis = ot.LinearBasisFactory(dimension).build()

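To make the previous statement concrete, we can list the functions contained in this basis
(an optional inspection, not part of the original script): the first one is the constant function
and the second one is the identity.

.. code-block:: Python

    # Optional inspection (not in the original example): the linear basis in
    # dimension 1 contains two functions, psi_0(z) = 1 and psi_1(z) = z.
    print("Basis size:", basis.getSize())
    for i in range(basis.getSize()):
        print("Basis function %d:" % i, basis.build(i))
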
.. GENERATED FROM PYTHON SOURCE LINES 290-291

We run the kriging analysis and store the result.

.. GENERATED FROM PYTHON SOURCE LINES 291-297

.. code-block:: Python

    algo = ot.KrigingAlgorithm(scaledXtrain, Ytrain, covarianceModel, basis)
    algo.run()
    result = algo.getResult()
    metamodel = ot.ComposedFunction(result.getMetaModel(), myTransform)

.. GENERATED FROM PYTHON SOURCE LINES 298-299

We can draw the metamodel and the exact model on the same graph.

.. GENERATED FROM PYTHON SOURCE LINES 299-303

.. code-block:: Python

    graph = plot_exact_model(palette[0])
    graph.add(plotMetamodel(x_test, result, myTransform, palette[1]))

.. GENERATED FROM PYTHON SOURCE LINES 304-305

We can retrieve the calibrated trend coefficients with :meth:`~openturns.KrigingResult.getTrendCoefficients`.

.. GENERATED FROM PYTHON SOURCE LINES 305-309

.. code-block:: Python

    c0 = result.getTrendCoefficients()
    print("Trend is the curve m(X) = %.6e X + %.6e" % (c0[1], c0[0]))

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Trend is the curve m(X) = -2.578407e+00 X + -3.020888e+00

.. GENERATED FROM PYTHON SOURCE LINES 310-311

We observe the values of the hyperparameters of the trained covariance model.

.. GENERATED FROM PYTHON SOURCE LINES 311-318

.. code-block:: Python

    rho = result.getCovarianceModel().getScale()[0]
    print("Scale parameter: %.3e" % rho)

    sigma = result.getCovarianceModel().getAmplitude()[0]
    print("Amplitude parameter: %.3e" % sigma)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Scale parameter: 1.100e+00
    Amplitude parameter: 4.627e+00

.. GENERATED FROM PYTHON SOURCE LINES 319-320

We draw the linear trend that we are interested in.

.. GENERATED FROM PYTHON SOURCE LINES 320-325

.. code-block:: Python

    linearTrend = ot.SymbolicFunction(["a", "b", "z"], ["a * z + b"])
    myTrend = ot.ParametricFunction(linearTrend, [0, 1], [c0[1], c0[0]])
    graph.add(plotTrend(x_test, myTrend, myTransform, palette[2]))

.. GENERATED FROM PYTHON SOURCE LINES 326-327

Add the 95% confidence interval.

.. GENERATED FROM PYTHON SOURCE LINES 327-337

.. code-block:: Python

    graph.add(plotKrigingConfidenceBounds(result, x_test, myTransform, palette[3]))
    graph.setTitle("1D Kriging with a linear trend")
    view = otv.View(
        graph,
        legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"},
        figure_kw={"figsize": (7.0, 3.0)},
    )
    plt.subplots_adjust(right=0.6)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_005.png
   :alt: 1D Kriging with a linear trend
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_005.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 338-342

Quadratic basis
---------------

In this last paragraph we turn to the quadratic basis.
The rest of the analysis is the same.

.. GENERATED FROM PYTHON SOURCE LINES 342-345

.. code-block:: Python

    basis = ot.QuadraticBasisFactory(dimension).build()

.. GENERATED FROM PYTHON SOURCE LINES 346-347

We run the kriging analysis and store the result.

.. GENERATED FROM PYTHON SOURCE LINES 347-352

.. code-block:: Python

    algo = ot.KrigingAlgorithm(scaledXtrain, Ytrain, covarianceModel, basis)
    algo.run()
    result = algo.getResult()
    metamodel = ot.ComposedFunction(result.getMetaModel(), myTransform)

.. GENERATED FROM PYTHON SOURCE LINES 353-354

We can draw the metamodel and the exact model on the same graph.

.. GENERATED FROM PYTHON SOURCE LINES 354-358

.. code-block:: Python

    graph = plot_exact_model(palette[0])
    graph.add(plotMetamodel(x_test, result, myTransform, palette[1]))

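Before looking at the trend coefficients, we can quantify how close this metamodel is to the exact
model on the test grid. The following error computation is an addition to this example; it only
relies on objects already defined above.

.. code-block:: Python

    # Additional check (not in the original example): root mean squared error
    # of the quadratic-trend kriging metamodel over the regular test grid.
    import math

    y_exact = model(x_test)
    y_meta = metamodel(x_test)  # the composed metamodel takes unscaled inputs
    squaredErrors = [(y_exact[i, 0] - y_meta[i, 0]) ** 2 for i in range(nTest)]
    rmse = math.sqrt(sum(squaredErrors) / nTest)
    print("RMSE on the test grid: %.3e" % rmse)
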
.. GENERATED FROM PYTHON SOURCE LINES 359-360

We can retrieve the calibrated trend coefficients with :meth:`~openturns.KrigingResult.getTrendCoefficients`.

.. GENERATED FROM PYTHON SOURCE LINES 360-364

.. code-block:: Python

    c0 = result.getTrendCoefficients()
    print("Trend is the curve m(X) = %.6e Z**2 + %.6e Z + %.6e" % (c0[2], c0[1], c0[0]))

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Trend is the curve m(X) = -4.583776e+00 Z**2 + -5.327795e+00 Z + 1.564663e+00

.. GENERATED FROM PYTHON SOURCE LINES 365-366

We observe the values of the hyperparameters of the trained covariance model.

.. GENERATED FROM PYTHON SOURCE LINES 366-372

.. code-block:: Python

    rho = result.getCovarianceModel().getScale()[0]
    print("Scale parameter: %.3e" % rho)

    sigma = result.getCovarianceModel().getAmplitude()[0]
    print("Amplitude parameter: %.3e" % sigma)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Scale parameter: 8.374e-02
    Amplitude parameter: 9.440e-01

.. GENERATED FROM PYTHON SOURCE LINES 373-374

We build the quadratic trend from the coefficients.

.. GENERATED FROM PYTHON SOURCE LINES 374-378

.. code-block:: Python

    quadraticTrend = ot.SymbolicFunction(["a", "b", "c", "z"], ["a * z^2 + b * z + c"])
    myTrend = ot.ParametricFunction(quadraticTrend, [0, 1, 2], [c0[2], c0[1], c0[0]])

.. GENERATED FROM PYTHON SOURCE LINES 379-380

Add the quadratic trend.

.. GENERATED FROM PYTHON SOURCE LINES 380-384

.. code-block:: Python

    y_test = myTrend(myTransform(x_test))
    graph.add(plotTrend(x_test, myTrend, myTransform, palette[2]))

.. GENERATED FROM PYTHON SOURCE LINES 385-386

Add the 95% confidence interval.

.. GENERATED FROM PYTHON SOURCE LINES 388-389

sphinx_gallery_thumbnail_number = 6

.. GENERATED FROM PYTHON SOURCE LINES 389-399

.. code-block:: Python

    graph.add(plotKrigingConfidenceBounds(result, x_test, myTransform, palette[3]))
    graph.setTitle("1D Kriging with a quadratic trend")
    view = otv.View(
        graph,
        legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"},
        figure_kw={"figsize": (7.0, 3.0)},
    )
    plt.subplots_adjust(right=0.6)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_006.png
   :alt: 1D Kriging with a quadratic trend
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_chose_trend_006.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 400-401

Display all the figures.

.. GENERATED FROM PYTHON SOURCE LINES 401-402

.. code-block:: Python

    otv.View.ShowAll()


.. _sphx_glr_download_auto_meta_modeling_kriging_metamodel_plot_kriging_chose_trend.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_kriging_chose_trend.ipynb <plot_kriging_chose_trend.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_kriging_chose_trend.py <plot_kriging_chose_trend.py>`