.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_surrogate_modeling/gaussian_process_regression/plot_gpr_advanced.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_surrogate_modeling_gaussian_process_regression_plot_gpr_advanced.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_surrogate_modeling_gaussian_process_regression_plot_gpr_advanced.py:


Advanced Gaussian process regression
====================================

.. GENERATED FROM PYTHON SOURCE LINES 7-11

In this example we build a metamodel of the :math:`x\sin(x)` function using
Gaussian process regression. We choose the number of learning points, the trend
basis and the covariance model.

.. GENERATED FROM PYTHON SOURCE LINES 14-20

.. code-block:: Python

    import openturns as ot
    import openturns.experimental as otexp
    import openturns.viewer as otv
    import numpy as np
    import matplotlib.pyplot as plt

.. GENERATED FROM PYTHON SOURCE LINES 21-26

Generate design of experiments
------------------------------

We create training samples from the function :math:`x\sin(x)`. Their number and
their distribution over the :math:`[0; 10]` range can be changed. If the
``with_error`` boolean is ``True``, the data are obtained by adding Gaussian
noise to the function values.

.. GENERATED FROM PYTHON SOURCE LINES 28-34

.. code-block:: Python

    dim = 1
    xmin = 0
    xmax = 10
    n_pt = 20  # number of initial points
    with_error = True  # whether to use generation with error

.. GENERATED FROM PYTHON SOURCE LINES 35-48

.. code-block:: Python

    ref_func_with_error = ot.SymbolicFunction(["x", "eps"], ["x * sin(x) + eps"])
    ref_func = ot.ParametricFunction(ref_func_with_error, [1], [0.0])
    x = np.vstack(np.linspace(xmin, xmax, n_pt))
    ot.RandomGenerator.SetSeed(1235)
    eps = ot.Normal(0, 1.5).getSample(n_pt)
    X = ot.Sample(n_pt, 2)
    X[:, 0] = x
    X[:, 1] = eps
    if with_error:
        y = np.array(ref_func_with_error(X))
    else:
        y = np.array(ref_func(x))

.. GENERATED FROM PYTHON SOURCE LINES 49-59

.. code-block:: Python

    graph = ref_func.draw(xmin, xmax, 200)
    cloud = ot.Cloud(x, y)
    cloud.setColor("red")
    cloud.setPointStyle("bullet")
    graph.add(cloud)
    graph.setLegends(["Function", "Data"])
    graph.setLegendPosition("upper left")
    graph.setTitle("Sample size = %d" % (n_pt))
    view = otv.View(graph)

.. image-sg:: /auto_surrogate_modeling/gaussian_process_regression/images/sphx_glr_plot_gpr_advanced_001.svg
   :alt: Sample size = 20
   :srcset: /auto_surrogate_modeling/gaussian_process_regression/images/sphx_glr_plot_gpr_advanced_001.svg
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 60-62

Create the Gaussian process regression algorithm
------------------------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 62-90

.. code-block:: Python

    # 1. Basis
    ot.ResourceMap.SetAsBool("GaussianProcessFitter-UseAnalyticalAmplitudeEstimate", True)
    basis = ot.ConstantBasisFactory(dim).build()
    print(basis)

    # 2. Covariance model
    cov = ot.MaternModel([1.0], [2.5], 1.5)
    print(cov)

    # 3. Gaussian process fitter algorithm
    algogpfitter = otexp.GaussianProcessFitter(x, y, cov, basis)

    # 4. Optimization
    # algogpfitter.setOptimizationAlgorithm(ot.NLopt("GN_DIRECT"))
    lhsExperiment = ot.LHSExperiment(ot.Uniform(1e-1, 1e2), 50)
    algogpfitter.setOptimizationAlgorithm(ot.MultiStart(ot.TNC(), lhsExperiment.generate()))

    # 5. Run the algorithm
    algogpfitter.run()
    gpfitterResult = algogpfitter.getResult()

    # 6. Gaussian process regression algorithm
    algogpr = otexp.GaussianProcessRegression(gpfitterResult)

    # 7. Run the algorithm
    algogpr.run()

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Basis( [[x0]->[1]] )
    MaternModel(scale=[1], amplitude=[2.5], nu=1.5)
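Before conditioning the process on the data, the fitter result already holds the
optimized trend and covariance parameters. As a quick illustrative check (not part
of the generated example), one could print them directly from ``gpfitterResult``;
this assumes the fitter result exposes the same getters as the regression result
used below.

.. code-block:: Python

    # Illustrative sketch: assumes GaussianProcessFitterResult provides the same
    # accessors (getCovarianceModel, getTrendCoefficients) as the regression result.
    print(gpfitterResult.getCovarianceModel())    # optimized Matern parameters
    print(gpfitterResult.getTrendCoefficients())  # fitted constant trend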
.. GENERATED FROM PYTHON SOURCE LINES 91-93

Results
-------

.. GENERATED FROM PYTHON SOURCE LINES 95-96

Get some results

.. GENERATED FROM PYTHON SOURCE LINES 96-101

.. code-block:: Python

    gprResult = algogpr.getResult()
    print("Optimal scale= {}".format(gprResult.getCovarianceModel().getScale()))
    print("Optimal amplitude = {}".format(gprResult.getCovarianceModel().getAmplitude()))
    print("Optimal trend coefficients = {}".format(gprResult.getTrendCoefficients()))

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Optimal scale= [0.818671]
    Optimal amplitude = [4.51224]
    Optimal trend coefficients = [-0.115697]

.. GENERATED FROM PYTHON SOURCE LINES 102-103

Get the metamodel

.. GENERATED FROM PYTHON SOURCE LINES 103-145

.. code-block:: Python

    gprMeta = gprResult.getMetaModel()
    n_pts_plot = 1000
    x_plot = np.vstack(np.linspace(xmin, xmax, n_pts_plot))
    fig, [ax1, ax2] = plt.subplots(1, 2, figsize=(12, 6))

    # On the left, the function
    graph = ref_func.draw(xmin, xmax, n_pts_plot)
    graph.setLegends(["Function"])
    graphGpr = gprMeta.draw(xmin, xmax, n_pts_plot)
    graphGpr.setColors(["green"])
    graphGpr.setLegends(["Gaussian process regression"])
    graph.add(graphGpr)
    cloud = ot.Cloud(x, y)
    cloud.setColor("red")
    cloud.setLegend("Data")
    graph.add(cloud)
    graph.setLegendPosition("upper left")
    otv.View(graph, axes=[ax1])

    # On the right, the conditional Gaussian process regression variance
    graph = ot.Graph("", "x", "Conditional Gaussian process regression variance", True, "")
    # Sample for the data
    sample = ot.Sample(n_pt, 2)
    sample[:, 0] = x
    cloud = ot.Cloud(sample)
    cloud.setColor("red")
    graph.add(cloud)
    # Sample for the variance
    sample = ot.Sample(n_pts_plot, 2)
    sample[:, 0] = x_plot
    variance = otexp.GaussianProcessConditionalCovariance(
        gprResult
    ).getConditionalMarginalVariance(x_plot)
    sample[:, 1] = variance
    curve = ot.Curve(sample)
    curve.setColor("green")
    graph.add(curve)
    otv.View(graph, axes=[ax2])

    fig.suptitle("Gaussian process regression result")

.. image-sg:: /auto_surrogate_modeling/gaussian_process_regression/images/sphx_glr_plot_gpr_advanced_002.svg
   :alt: Gaussian process regression result
   :srcset: /auto_surrogate_modeling/gaussian_process_regression/images/sphx_glr_plot_gpr_advanced_002.svg
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Text(0.5, 0.98, 'Gaussian process regression result')

.. GENERATED FROM PYTHON SOURCE LINES 146-148

Display the confidence interval
-------------------------------

The pointwise confidence bounds are built from the conditional mean and variance
of the Gaussian process: :math:`m(x) \pm q_{0.975}\,\sigma(x)`, where
:math:`q_{0.975}` is the 97.5% quantile of the standard normal distribution.

.. GENERATED FROM PYTHON SOURCE LINES 148-171

.. code-block:: Python

    level = 0.95
    quantile = ot.Normal().computeQuantile((1 + level) / 2)[0]
    borne_sup = gprMeta(x_plot) + quantile * np.sqrt(variance)
    borne_inf = gprMeta(x_plot) - quantile * np.sqrt(variance)

    fig, ax = plt.subplots(figsize=(8, 8))
    ax.plot(x, y, "ro")
    ax.plot(x_plot, borne_sup, "--", color="orange", label="Confidence interval")
    ax.plot(x_plot, borne_inf, "--", color="orange")
    graph_ref_func = ref_func.draw(xmin, xmax, n_pts_plot)
    graph_gprMeta = gprMeta.draw(xmin, xmax, n_pts_plot)
    for graph in [graph_ref_func, graph_gprMeta]:
        graph.setTitle("")
    otv.View(graph_ref_func, axes=[ax], plot_kw={"label": r"$x\sin(x)$"})
    otv.View(
        graph_gprMeta,
        plot_kw={"color": "green", "label": "prediction"},
        axes=[ax],
    )
    legend = ax.legend()
    ax.autoscale()

.. image-sg:: /auto_surrogate_modeling/gaussian_process_regression/images/sphx_glr_plot_gpr_advanced_003.svg
   :alt: plot gpr advanced
   :srcset: /auto_surrogate_modeling/gaussian_process_regression/images/sphx_glr_plot_gpr_advanced_003.svg
   :class: sphx-glr-single-img
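Since the reference function is known in this example, the visual check can be
complemented by a quantitative one. The sketch below (illustrative, not part of
the generated example) computes the fraction of points of the noise-free
reference curve that fall inside the 95% pointwise interval, reusing only the
objects defined above and NumPy.

.. code-block:: Python

    # Illustrative check: empirical coverage of the pointwise 95% interval.
    true_values = np.asarray(ref_func(x_plot)).ravel()
    lower = np.asarray(borne_inf).ravel()
    upper = np.asarray(borne_sup).ravel()
    coverage = np.mean((true_values >= lower) & (true_values <= upper))
    print("Empirical coverage of the true function: %.2f" % coverage)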
.. GENERATED FROM PYTHON SOURCE LINES 172-174

Generate conditional trajectories
---------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 176-177

Support for trajectories

.. GENERATED FROM PYTHON SOURCE LINES 177-179

.. code-block:: Python

    grid = ot.IntervalMesher([500]).build(ot.Interval(0, 10))

.. GENERATED FROM PYTHON SOURCE LINES 180-181

Conditional Gaussian process

.. GENERATED FROM PYTHON SOURCE LINES 181-186

.. code-block:: Python

    krv = otexp.ConditionedGaussianProcess(gprResult, grid)
    krv_sample = krv.getSample(5)
    values = grid.getVertices()

.. GENERATED FROM PYTHON SOURCE LINES 187-215

.. code-block:: Python

    x_plot = grid.getVertices()
    fig, ax = plt.subplots(figsize=(8, 6))
    ax.plot(x, y, "ro")
    for i in range(krv_sample.getSize()):
        if i == 0:
            ax.plot(
                values,
                krv_sample[i].asPoint(),
                "--",
                alpha=0.8,
                label="Conditional trajectories",
            )
        else:
            ax.plot(values, krv_sample[i].asPoint(), "--", alpha=0.8)
    otv.View(
        graph_ref_func,
        axes=[ax],
        plot_kw={"color": "black", "label": r"$x\sin(x)$"},
    )
    otv.View(
        graph_gprMeta,
        axes=[ax],
        plot_kw={"color": "green", "label": "prediction"},
    )
    legend = ax.legend()
    ax.autoscale()

.. image-sg:: /auto_surrogate_modeling/gaussian_process_regression/images/sphx_glr_plot_gpr_advanced_004.svg
   :alt: plot gpr advanced
   :srcset: /auto_surrogate_modeling/gaussian_process_regression/images/sphx_glr_plot_gpr_advanced_004.svg
   :class: sphx-glr-single-img
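The pointwise average of many conditional trajectories converges to the Gaussian
process regression predictor. The following sketch (illustrative, not part of the
generated example) checks this with a larger trajectory sample; the remaining gap
only reflects the Monte Carlo error.

.. code-block:: Python

    # Illustrative check: the mean of many conditional trajectories should be
    # close to the GPR prediction on the same grid, up to Monte Carlo error.
    many_trajectories = krv.getSample(200)
    traj_mean = np.mean(
        [np.asarray(many_trajectories[i]).ravel() for i in range(many_trajectories.getSize())],
        axis=0,
    )
    prediction = np.asarray(gprMeta(values)).ravel()
    print("Max |trajectory mean - prediction| =", np.abs(traj_mean - prediction).max())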
.. GENERATED FROM PYTHON SOURCE LINES 216-218

Validation
----------

We validate the metamodel on an independent sample of 10 points, generated with
the same noise model as the training data.

.. GENERATED FROM PYTHON SOURCE LINES 220-229

.. code-block:: Python

    n_valid = 10
    x_valid = ot.Uniform(xmin, xmax).getSample(n_valid)
    X_valid = ot.Sample(x_valid)
    if with_error:
        X_valid.stack(ot.Normal(0.0, 1.5).getSample(n_valid))
        y_valid = np.array(ref_func_with_error(X_valid))
    else:
        y_valid = np.array(ref_func(X_valid))

.. GENERATED FROM PYTHON SOURCE LINES 230-234

.. code-block:: Python

    metamodelPredictions = gprMeta(x_valid)
    validation = ot.MetaModelValidation(y_valid, metamodelPredictions)
    validation.computeR2Score()

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    class=Point name=Unnamed dimension=1 values=[0.692168]
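The R2 score can be complemented by an absolute error measure. The sketch below
(illustrative, not part of the generated example) computes the root-mean-square
error from the same validation predictions with NumPy.

.. code-block:: Python

    # Illustrative addition: RMSE on the validation sample.
    residuals = np.asarray(y_valid).ravel() - np.asarray(metamodelPredictions).ravel()
    rmse = np.sqrt(np.mean(residuals**2))
    print("Validation RMSE:", rmse)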
.. GENERATED FROM PYTHON SOURCE LINES 235-238

.. code-block:: Python

    graph = validation.drawValidation()
    view = otv.View(graph)

.. image-sg:: /auto_surrogate_modeling/gaussian_process_regression/images/sphx_glr_plot_gpr_advanced_005.svg
   :alt: Metamodel validation - n = 10
   :srcset: /auto_surrogate_modeling/gaussian_process_regression/images/sphx_glr_plot_gpr_advanced_005.svg
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 239-243

.. code-block:: Python

    graph = validation.getResidualDistribution().drawPDF()
    graph.setXTitle("Residuals")
    view = otv.View(graph)

.. image-sg:: /auto_surrogate_modeling/gaussian_process_regression/images/sphx_glr_plot_gpr_advanced_006.svg
   :alt: plot gpr advanced
   :srcset: /auto_surrogate_modeling/gaussian_process_regression/images/sphx_glr_plot_gpr_advanced_006.svg
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 244-248

Nugget effect
-------------

Let us try again, but this time we optimize the nugget effect.

.. GENERATED FROM PYTHON SOURCE LINES 248-251

.. code-block:: Python

    cov.activateNuggetFactor(True)

.. GENERATED FROM PYTHON SOURCE LINES 252-253

We have to run the optimization algorithm again.

.. GENERATED FROM PYTHON SOURCE LINES 253-260

.. code-block:: Python

    algogpfitter = otexp.GaussianProcessFitter(x, y, cov, basis)
    algogpfitter.setOptimizationAlgorithm(ot.NLopt("GN_DIRECT"))
    algogpfitter.run()
    algogpr_nugget = otexp.GaussianProcessRegression(algogpfitter.getResult())
    algogpr_nugget.run()

.. GENERATED FROM PYTHON SOURCE LINES 261-262

We get the results and the metamodel.

.. GENERATED FROM PYTHON SOURCE LINES 262-272

.. code-block:: Python

    gprResult_nugget = algogpr_nugget.getResult()
    print("Optimal scale= {}".format(gprResult_nugget.getCovarianceModel().getScale()))
    print(
        "Optimal amplitude = {}".format(
            gprResult_nugget.getCovarianceModel().getAmplitude()
        )
    )
    print("Optimal trend coefficients = {}".format(gprResult_nugget.getTrendCoefficients()))

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Optimal scale= [1.15712]
    Optimal amplitude = [4.67517]
    Optimal trend coefficients = [-0.350133]
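The value of the optimized nugget itself can also be queried from the covariance
model. This short sketch is illustrative and assumes the standard
``getNuggetFactor`` accessor of OpenTURNS covariance models.

.. code-block:: Python

    # Illustrative sketch (assumes the standard getNuggetFactor accessor of
    # ot.CovarianceModel): print the nugget estimated during the optimization.
    print(
        "Optimal nugget factor = {}".format(
            gprResult_nugget.getCovarianceModel().getNuggetFactor()
        )
    )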
.. GENERATED FROM PYTHON SOURCE LINES 273-279

.. code-block:: Python

    gprMeta_nugget = gprResult_nugget.getMetaModel()
    gpr_conditional_covariance = otexp.GaussianProcessConditionalCovariance(
        gprResult_nugget
    )
    variance = gpr_conditional_covariance.getConditionalMarginalVariance(x_plot)

.. GENERATED FROM PYTHON SOURCE LINES 280-282

Plot the confidence interval again. Note that this time it always contains the
true value of the function.

.. GENERATED FROM PYTHON SOURCE LINES 282-317

.. code-block:: Python

    # sphinx_gallery_thumbnail_number = 7
    borne_sup_nugget = gprMeta_nugget(x_plot) + quantile * np.sqrt(variance)
    borne_inf_nugget = gprMeta_nugget(x_plot) - quantile * np.sqrt(variance)

    fig, ax = plt.subplots(figsize=(8, 8))
    ax.plot(x, y, "ro")
    ax.plot(
        x_plot,
        borne_sup_nugget,
        "--",
        color="orange",
        label="Confidence interval with nugget",
    )
    ax.plot(x_plot, borne_inf_nugget, "--", color="orange")
    graph_gprMeta_nugget = gprMeta_nugget.draw(xmin, xmax, n_pts_plot)
    graph_gprMeta_nugget.setTitle("")
    otv.View(graph_ref_func, axes=[ax], plot_kw={"label": r"$x\sin(x)$"})
    otv.View(
        graph_gprMeta_nugget,
        plot_kw={"color": "green", "label": "prediction with nugget"},
        axes=[ax],
    )
    otv.View(
        graph_gprMeta,
        plot_kw={
            "color": "green",
            "linestyle": "dotted",
            "label": "prediction without nugget",
        },
        axes=[ax],
    )
    legend = ax.legend()
    ax.autoscale()

.. image-sg:: /auto_surrogate_modeling/gaussian_process_regression/images/sphx_glr_plot_gpr_advanced_007.svg
   :alt: plot gpr advanced
   :srcset: /auto_surrogate_modeling/gaussian_process_regression/images/sphx_glr_plot_gpr_advanced_007.svg
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 318-320

We validate the model with the nugget effect: its R2 score is slightly improved.

.. GENERATED FROM PYTHON SOURCE LINES 320-325

.. code-block:: Python

    validation_nugget = ot.MetaModelValidation(y_valid, gprMeta_nugget(x_valid))
    print("R2 score with nugget: ", validation_nugget.computeR2Score())
    print("R2 score without nugget: ", validation.computeR2Score())

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    R2 score with nugget:  [0.737924]
    R2 score without nugget:  [0.692168]

.. GENERATED FROM PYTHON SOURCE LINES 326-327

Display all figures

.. GENERATED FROM PYTHON SOURCE LINES 327-328

.. code-block:: Python

    otv.View.ShowAll()


.. _sphx_glr_download_auto_surrogate_modeling_gaussian_process_regression_plot_gpr_advanced.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_gpr_advanced.ipynb <plot_gpr_advanced.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_gpr_advanced.py <plot_gpr_advanced.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_gpr_advanced.zip <plot_gpr_advanced.zip>`