.. DO NOT EDIT. .. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY. .. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE: .. "auto_meta_modeling/kriging_metamodel/plot_kriging_advanced.py" .. LINE NUMBERS ARE GIVEN BELOW. .. only:: html .. note:: :class: sphx-glr-download-link-note :ref:`Go to the end ` to download the full example code. .. rst-class:: sphx-glr-example-title .. _sphx_glr_auto_meta_modeling_kriging_metamodel_plot_kriging_advanced.py: Advanced Kriging ================ .. GENERATED FROM PYTHON SOURCE LINES 6-10 In this example we will build a metamodel using Gaussian process regression of the :math:`x\sin(x)` function. We will choose the number of learning points, the basis, and the covariance model. .. GENERATED FROM PYTHON SOURCE LINES 13-21 .. code-block:: Python import openturns as ot from openturns.viewer import View import numpy as np import matplotlib.pyplot as plt import openturns.viewer as viewer ot.Log.Show(ot.Log.NONE) .. GENERATED FROM PYTHON SOURCE LINES 22-27 Generate design of experiment ----------------------------- We create training samples from the function :math:`x\sin(x)`. We can change their number and distribution in the :math:`[0; 10]` range. If the `with_error` boolean is `True`, then the data is computed by adding Gaussian noise to the function values. .. GENERATED FROM PYTHON SOURCE LINES 29-35 .. code-block:: Python dim = 1 xmin = 0 xmax = 10 n_pt = 20 # number of initial points with_error = True # whether to use generation with error .. GENERATED FROM PYTHON SOURCE LINES 36-49 .. code-block:: Python ref_func_with_error = ot.SymbolicFunction(["x", "eps"], ["x * sin(x) + eps"]) ref_func = ot.ParametricFunction(ref_func_with_error, [1], [0.0]) x = np.vstack(np.linspace(xmin, xmax, n_pt)) ot.RandomGenerator.SetSeed(1235) eps = ot.Normal(0, 1.5).getSample(n_pt) X = ot.Sample(n_pt, 2) X[:, 0] = x X[:, 1] = eps if with_error: y = np.array(ref_func_with_error(X)) else: y = np.array(ref_func(x)) ..
GENERATED FROM PYTHON SOURCE LINES 50-60 .. code-block:: Python graph = ref_func.draw(xmin, xmax, 200) cloud = ot.Cloud(x, y) cloud.setColor("red") cloud.setPointStyle("bullet") graph.add(cloud) graph.setLegends(["Function", "Data"]) graph.setLegendPosition("upper left") graph.setTitle("Sample size = %d" % (n_pt)) view = viewer.View(graph) .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_advanced_001.png :alt: Sample size = 20 :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_advanced_001.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 61-63 Create the Kriging algorithm ---------------------------- .. GENERATED FROM PYTHON SOURCE LINES 63-93 .. code-block:: Python # 1. basis ot.ResourceMap.SetAsBool( "GeneralLinearModelAlgorithm-UseAnalyticalAmplitudeEstimate", True ) basis = ot.ConstantBasisFactory(dim).build() print(basis) # 2. covariance model cov = ot.MaternModel([1.0], [2.5], 1.5) print(cov) # 3. Kriging algorithm algokriging = ot.KrigingAlgorithm(x, y, cov, basis) # error measure # algokriging.setNoise([5*1e-1]*n_pt) # 4. Optimization # algokriging.setOptimizationAlgorithm(ot.NLopt('GN_DIRECT')) lhsExperiment = ot.LHSExperiment(ot.Uniform(1e-1, 1e2), 50) algokriging.setOptimizationAlgorithm(ot.MultiStart(ot.TNC(), lhsExperiment.generate())) algokriging.setOptimizationBounds(ot.Interval([0.1], [1e2])) # if we choose not to optimize parameters # algokriging.setOptimizeParameters(False) # 5. run the algorithm algokriging.run() .. rst-class:: sphx-glr-script-out .. code-block:: none Basis( [class=LinearEvaluation name=Unnamed center=[0] constant=[1] linear=[[ 0 ]]] ) MaternModel(scale=[1], amplitude=[2.5], nu=1.5) .. GENERATED FROM PYTHON SOURCE LINES 94-96 Results ------- .. GENERATED FROM PYTHON SOURCE LINES 98-99 get some results .. GENERATED FROM PYTHON SOURCE LINES 99-108 .. 
code-block:: Python krigingResult = algokriging.getResult() print("residual = ", krigingResult.getResiduals()) print("relative error = ", krigingResult.getRelativeErrors()) print("Optimal scale = {}".format(krigingResult.getCovarianceModel().getScale())) print( "Optimal amplitude = {}".format(krigingResult.getCovarianceModel().getAmplitude()) ) print("Optimal trend coefficients = {}".format(krigingResult.getTrendCoefficients())) .. rst-class:: sphx-glr-script-out .. code-block:: none residual = [5.39875e-16] relative error = [2.99965e-31] Optimal scale = [0.818671] Optimal amplitude = [4.51225] Optimal trend coefficients = [-0.115697] .. GENERATED FROM PYTHON SOURCE LINES 109-110 get the metamodel .. GENERATED FROM PYTHON SOURCE LINES 110-150 .. code-block:: Python krigingMeta = krigingResult.getMetaModel() n_pts_plot = 1000 x_plot = np.vstack(np.linspace(xmin, xmax, n_pts_plot)) fig, [ax1, ax2] = plt.subplots(1, 2, figsize=(12, 6)) # On the left, the function graph = ref_func.draw(xmin, xmax, n_pts_plot) graph.setLegends(["Function"]) graphKriging = krigingMeta.draw(xmin, xmax, n_pts_plot) graphKriging.setColors(["green"]) graphKriging.setLegends(["Kriging"]) graph.add(graphKriging) cloud = ot.Cloud(x, y) cloud.setColor("red") cloud.setLegend("Data") graph.add(cloud) graph.setLegendPosition("upper left") View(graph, axes=[ax1]) # On the right, the conditional Kriging variance graph = ot.Graph("", "x", "Conditional Kriging variance", True, "") # Sample for the data sample = ot.Sample(n_pt, 2) sample[:, 0] = x cloud = ot.Cloud(sample) cloud.setColor("red") graph.add(cloud) # Sample for the variance sample = ot.Sample(n_pts_plot, 2) sample[:, 0] = x_plot variance = [[krigingResult.getConditionalCovariance(xx)[0, 0]] for xx in x_plot] sample[:, 1] = variance curve = ot.Curve(sample) curve.setColor("green") graph.add(curve) View(graph, axes=[ax2]) fig.suptitle("Kriging result") ..
image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_advanced_002.png :alt: Kriging result :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_advanced_002.png :class: sphx-glr-single-img .. rst-class:: sphx-glr-script-out .. code-block:: none Text(0.5, 0.98, 'Kriging result') .. GENERATED FROM PYTHON SOURCE LINES 151-153 Display the confidence interval ------------------------------- .. GENERATED FROM PYTHON SOURCE LINES 153-176 .. code-block:: Python level = 0.95 quantile = -ot.Normal().computeQuantile((1 - level) / 2)[0] borne_sup = krigingMeta(x_plot) + quantile * np.sqrt(variance) borne_inf = krigingMeta(x_plot) - quantile * np.sqrt(variance) fig, ax = plt.subplots(figsize=(8, 8)) ax.plot(x, y, "ro") ax.plot(x_plot, borne_sup, "--", color="orange", label="Confidence interval") ax.plot(x_plot, borne_inf, "--", color="orange") graph_ref_func = ref_func.draw(xmin, xmax, n_pts_plot) graph_krigingMeta = krigingMeta.draw(xmin, xmax, n_pts_plot) for graph in [graph_ref_func, graph_krigingMeta]: graph.setTitle("") View(graph_ref_func, axes=[ax], plot_kw={"label": "$x sin(x)$"}) View( graph_krigingMeta, plot_kw={"color": "green", "label": "prediction"}, axes=[ax], ) legend = ax.legend() ax.autoscale() .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_advanced_003.png :alt: plot kriging advanced :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_advanced_003.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 177-179 Generate conditional trajectories --------------------------------- .. GENERATED FROM PYTHON SOURCE LINES 181-182 Build the support of the trajectories, removing the training points. .. GENERATED FROM PYTHON SOURCE LINES 182-187 .. code-block:: Python values = np.linspace(0, 10, 500) for xx in x: if len(np.argwhere(values == xx)) == 1: values = np.delete(values, np.argwhere(values == xx)[0, 0]) ..
GENERATED FROM PYTHON SOURCE LINES 188-189 Conditional Gaussian process .. GENERATED FROM PYTHON SOURCE LINES 189-192 .. code-block:: Python krv = ot.KrigingRandomVector(krigingResult, np.vstack(values)) krv_sample = krv.getSample(5) .. GENERATED FROM PYTHON SOURCE LINES 193-216 .. code-block:: Python x_plot = np.vstack(np.linspace(xmin, xmax, n_pts_plot)) fig, ax = plt.subplots(figsize=(8, 6)) ax.plot(x, y, "ro") for i in range(krv_sample.getSize()): if i == 0: ax.plot( values, krv_sample[i, :], "--", alpha=0.8, label="Conditional trajectories" ) else: ax.plot(values, krv_sample[i, :], "--", alpha=0.8) View( graph_ref_func, axes=[ax], plot_kw={"color": "black", "label": "$x sin(x)$"}, ) View( graph_krigingMeta, axes=[ax], plot_kw={"color": "green", "label": "prediction"}, ) legend = ax.legend() ax.autoscale() .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_advanced_004.png :alt: plot kriging advanced :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_advanced_004.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 217-219 Validation ---------- .. GENERATED FROM PYTHON SOURCE LINES 221-230 .. code-block:: Python n_valid = 10 x_valid = ot.Uniform(xmin, xmax).getSample(n_valid) X_valid = ot.Sample(x_valid) if with_error: X_valid.stack(ot.Normal(0.0, 1.5).getSample(n_valid)) y_valid = np.array(ref_func_with_error(X_valid)) else: y_valid = np.array(ref_func(X_valid)) .. GENERATED FROM PYTHON SOURCE LINES 231-233 .. code-block:: Python validation = ot.MetaModelValidation(y_valid, krigingMeta(x_valid)) .. GENERATED FROM PYTHON SOURCE LINES 234-237 .. code-block:: Python graph = validation.drawValidation() view = viewer.View(graph) .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_advanced_005.png :alt: Metamodel validation - n = 10 :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_advanced_005.png :class: sphx-glr-single-img ..
GENERATED FROM PYTHON SOURCE LINES 238-242 .. code-block:: Python graph = validation.getResidualDistribution().drawPDF() graph.setXTitle("Residuals") view = viewer.View(graph) .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_advanced_006.png :alt: plot kriging advanced :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_advanced_006.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 243-247 Nugget effect ------------- Let us try again, but this time we optimize the nugget effect. .. GENERATED FROM PYTHON SOURCE LINES 247-250 .. code-block:: Python cov.activateNuggetFactor(True) .. GENERATED FROM PYTHON SOURCE LINES 251-252 We have to run the optimization algorithm again. .. GENERATED FROM PYTHON SOURCE LINES 252-257 .. code-block:: Python algokriging_nugget = ot.KrigingAlgorithm(x, y, cov, basis) algokriging_nugget.setOptimizationAlgorithm(ot.NLopt("GN_DIRECT")) algokriging_nugget.run() .. GENERATED FROM PYTHON SOURCE LINES 258-259 We get the results and the metamodel. .. GENERATED FROM PYTHON SOURCE LINES 259-275 .. code-block:: Python krigingResult_nugget = algokriging_nugget.getResult() print("residual = ", krigingResult_nugget.getResiduals()) print("relative error = ", krigingResult_nugget.getRelativeErrors()) print("Optimal scale = {}".format(krigingResult_nugget.getCovarianceModel().getScale())) print( "Optimal amplitude = {}".format( krigingResult_nugget.getCovarianceModel().getAmplitude() ) ) print( "Optimal trend coefficients = {}".format( krigingResult_nugget.getTrendCoefficients() ) ) .. rst-class:: sphx-glr-script-out .. code-block:: none residual = [6.52848e-16] relative error = [4.3864e-31] Optimal scale = [1.15712] Optimal amplitude = [4.67517] Optimal trend coefficients = [-0.350133] .. GENERATED FROM PYTHON SOURCE LINES 276-279 .. code-block:: Python krigingMeta_nugget = krigingResult_nugget.getMetaModel() variance = [[krigingResult_nugget.getConditionalCovariance(xx)[0, 0]] for xx in x_plot] ..
GENERATED FROM PYTHON SOURCE LINES 280-282 Plot the confidence interval again. Note that this time, it always contains the true value of the function. .. GENERATED FROM PYTHON SOURCE LINES 282-319 .. code-block:: Python # sphinx_gallery_thumbnail_number = 7 borne_sup_nugget = krigingMeta_nugget(x_plot) + quantile * np.sqrt(variance) borne_inf_nugget = krigingMeta_nugget(x_plot) - quantile * np.sqrt(variance) fig, ax = plt.subplots(figsize=(8, 8)) ax.plot(x, y, "ro") ax.plot( x_plot, borne_sup_nugget, "--", color="orange", label="Confidence interval with nugget", ) ax.plot(x_plot, borne_inf_nugget, "--", color="orange") graph_krigingMeta_nugget = krigingMeta_nugget.draw(xmin, xmax, n_pts_plot) graph_krigingMeta_nugget.setTitle("") View(graph_ref_func, axes=[ax], plot_kw={"label": "$x sin(x)$"}) View( graph_krigingMeta_nugget, plot_kw={"color": "green", "label": "prediction with nugget"}, axes=[ax], ) View( graph_krigingMeta, plot_kw={ "color": "green", "linestyle": "dotted", "label": "prediction without nugget", }, axes=[ax], ) legend = ax.legend() ax.autoscale() plt.show() .. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_advanced_007.png :alt: plot kriging advanced :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_advanced_007.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 320-322 We validate the model with the nugget effect: its R2 score is slightly improved. .. GENERATED FROM PYTHON SOURCE LINES 322-327 .. code-block:: Python validation_nugget = ot.MetaModelValidation(y_valid, krigingMeta_nugget(x_valid)) print("R2 score with nugget: ", validation_nugget.computeR2Score()) print("R2 score without nugget: ", validation.computeR2Score()) .. rst-class:: sphx-glr-script-out .. code-block:: none R2 score with nugget: [0.884249] R2 score without nugget: [0.861246] .. GENERATED FROM PYTHON SOURCE LINES 328-329 Reset default settings ..
GENERATED FROM PYTHON SOURCE LINES 329-330 .. code-block:: Python ot.ResourceMap.Reload() .. _sphx_glr_download_auto_meta_modeling_kriging_metamodel_plot_kriging_advanced.py: .. only:: html .. container:: sphx-glr-footer sphx-glr-footer-example .. container:: sphx-glr-download sphx-glr-download-jupyter :download:`Download Jupyter notebook: plot_kriging_advanced.ipynb ` .. container:: sphx-glr-download sphx-glr-download-python :download:`Download Python source code: plot_kriging_advanced.py `
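As a closing aside, the kriging predictor used throughout this example can be sketched with NumPy alone. This is an illustrative sketch, not the OpenTURNS implementation: it assumes a fixed Matern covariance (nu = 3/2, the same family as `ot.MaternModel` above, with hyperparameters chosen by hand rather than optimized) and a zero trend (simple kriging) instead of the constant basis actually used.

```python
import numpy as np


def matern32(r, scale=1.0, amplitude=2.0):
    # Matern covariance with nu = 3/2 (same family as ot.MaternModel above)
    a = np.sqrt(3.0) * np.abs(r) / scale
    return amplitude**2 * (1.0 + a) * np.exp(-a)


def simple_kriging(x_train, y_train, x_new, scale=1.0, amplitude=2.0):
    # Zero-trend (simple) kriging mean: m(x*) = k(x*, X) K^{-1} y
    K = matern32(x_train[:, None] - x_train[None, :], scale, amplitude)
    k_star = matern32(x_new[:, None] - x_train[None, :], scale, amplitude)
    return k_star @ np.linalg.solve(K, y_train)


x_train = np.linspace(0.0, 10.0, 8)
y_train = x_train * np.sin(x_train)
x_new = np.linspace(0.0, 10.0, 50)
y_pred = simple_kriging(x_train, y_train, x_new)
# Without a nugget, the predictor interpolates the training data exactly
# (up to floating-point round-off)
err = np.max(np.abs(simple_kriging(x_train, y_train, x_train) - y_train))
```

This makes visible why the nugget effect matters in the noisy setting: without it, the covariance matrix `K` forces the predictor through every (noisy) observation, which is what the conditional variance of zero at the training points in the plots above reflects.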
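Similarly, the half-width of the 95% confidence interval drawn above is 1.96 conditional standard deviations. This factor can be cross-checked with the Python standard library alone; `statistics.NormalDist` plays the role of `ot.Normal()` here, as a minimal sketch:

```python
from statistics import NormalDist

level = 0.95
# Two-sided quantile q such that P(|Z| <= q) = level for a standard normal Z
q = NormalDist().inv_cdf(1.0 - (1.0 - level) / 2.0)
print(round(q, 3))  # 1.96
```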