.. DO NOT EDIT. .. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY. .. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE: .. "auto_calibration/least_squares_and_gaussian_calibration/plot_calibration_chaboche.py" .. LINE NUMBERS ARE GIVEN BELOW. .. only:: html .. note:: :class: sphx-glr-download-link-note :ref:`Go to the end ` to download the full example code .. rst-class:: sphx-glr-example-title .. _sphx_glr_auto_calibration_least_squares_and_gaussian_calibration_plot_calibration_chaboche.py: Calibration of the Chaboche mechanical model ============================================ In this example we present calibration methods on the Chaboche model. A detailed explanation of this mechanical law is presented :ref:`here `. As we are going to see, this model is relatively simple to calibrate: its parameters are identifiable and the output is relatively sensitive to the variation of the parameters. Hence, all methods perform correctly in this case. In this example, we use both least squares methods and Bayesian Gaussian methods. Please read :ref:`code_calibration` for more details on code calibration and least squares and read :ref:`gaussian_calibration` for more details on Gaussian calibration. Parameters to calibrate and observations ---------------------------------------- The vector of parameters to calibrate is: .. math:: \theta = (R,C,\gamma). We consider a data set where the number of observations is equal to: .. math:: n = 10. The observations are the pairs :math:`\{(\epsilon_i,\sigma_i)\}_{i=1,...,n}`, i.e. each observation is a couple made of the strain and the corresponding stress. In the particular situation where we want to calibrate this model, the following list presents which variables are observed input variables, input calibrated variables and observed output variables. - :math:`\epsilon`: Input. Observed. - :math:`R`, :math:`C`, :math:`\gamma` : Inputs. Calibrated. - :math:`\sigma`: Output. Observed. .. GENERATED FROM PYTHON SOURCE LINES 47-54 .. code-block:: Python import openturns as ot import openturns.viewer as otv from matplotlib import pylab as plt from openturns.usecases import chaboche_model ot.Log.Show(ot.Log.NONE) .. GENERATED FROM PYTHON SOURCE LINES 55-63 Define the observations ----------------------- In practice, we generally use a data set which has been obtained from measurements. This data set can be loaded using e.g. :meth:`~openturns.Sample.ImportFromCSVFile`. Here we import the data from the :class:`~openturns.usecases.chaboche_model.ChabocheModel` class. .. GENERATED FROM PYTHON SOURCE LINES 65-71 .. code-block:: Python cm = chaboche_model.ChabocheModel() print(cm.data) observedStrain = cm.data[:, 0] observedStress = cm.data[:, 1] nbobs = cm.data.getSize() .. rst-class:: sphx-glr-script-out .. code-block:: none [ Strain Stress (Pa) ] 0 : [ 0 7.56e+08 ] 1 : [ 0.0077 7.57e+08 ] 2 : [ 0.0155 7.85e+08 ] 3 : [ 0.0233 8.19e+08 ] 4 : [ 0.0311 8.01e+08 ] 5 : [ 0.0388 8.42e+08 ] 6 : [ 0.0466 8.49e+08 ] 7 : [ 0.0544 8.79e+08 ] 8 : [ 0.0622 8.85e+08 ] 9 : [ 0.07 8.96e+08 ] .. GENERATED FROM PYTHON SOURCE LINES 72-73 Print the Chaboche model .. GENERATED FROM PYTHON SOURCE LINES 75-78 .. code-block:: Python print("Inputs:", cm.model.getInputDescription()) print("Outputs:", cm.model.getOutputDescription()) .. rst-class:: sphx-glr-script-out .. code-block:: none Inputs: [Strain,R,C,Gamma] Outputs: [Sigma] .. GENERATED FROM PYTHON SOURCE LINES 79-85 We see that there are four inputs: `Strain`, `R`, `C` and `Gamma` and one output: `Sigma`. 
The `Strain` is observed on input and the stress `Sigma` is observed on output. Using these observations, we want to estimate the parameters `R`, `C` and `Gamma`. .. GENERATED FROM PYTHON SOURCE LINES 87-93 Set the calibration parameters ------------------------------ In this part, we begin the calibration study. Define the reference value of the :math:`\theta` parameter. In the Bayesian framework, this is called the mean of the *prior* Gaussian distribution. In the data assimilation framework, this is called the *background*. .. GENERATED FROM PYTHON SOURCE LINES 95-100 .. code-block:: Python R = 700e6 # Exact : 750e6 C = 2500e6 # Exact : 2750e6 Gamma = 8.0 # Exact : 10 thetaPrior = [R, C, Gamma] .. GENERATED FROM PYTHON SOURCE LINES 101-125 In the physical model, the inputs and parameters are ordered as presented in the next table. Notice that there are no parameters in the physical model. +-------+----------------+ | Index | Input variable | +=======+================+ | 0 | Strain | +-------+----------------+ | 1 | R | +-------+----------------+ | 2 | C | +-------+----------------+ | 3 | Gamma | +-------+----------------+ +-------+-----------+ | Index | Parameter | +=======+===========+ | ∅ | ∅ | +-------+-----------+ **Table 1.** Indices and names of the inputs and parameters of the physical model. .. GENERATED FROM PYTHON SOURCE LINES 127-130 .. code-block:: Python print("Physical Model Inputs:", cm.model.getInputDescription()) print("Physical Model Parameters:", cm.model.getParameterDescription()) .. rst-class:: sphx-glr-script-out .. code-block:: none Physical Model Inputs: [Strain,R,C,Gamma] Physical Model Parameters: [] .. GENERATED FROM PYTHON SOURCE LINES 131-154 In order to perform calibration, we have to define a parametric model, with observed inputs and parameters to calibrate. To do this, we create a :class:`~openturns.ParametricFunction` where the parameters are `R`, `C` and `Gamma`, which have the indices 1, 2 and 3 in the physical model. +-------+----------------+ | Index | Input variable | +=======+================+ | 0 | Strain | +-------+----------------+ +-------+-----------+ | Index | Parameter | +=======+===========+ | 0 | R | +-------+-----------+ | 1 | C | +-------+-----------+ | 2 | Gamma | +-------+-----------+ **Table 2.** Indices and names of the inputs and parameters of the parametric model. .. GENERATED FROM PYTHON SOURCE LINES 156-159 The following statement creates the calibrated function from the model. The calibrated parameters `R`, `C`, `Gamma` are at indices 1, 2, 3 in the input arguments of the model. .. GENERATED FROM PYTHON SOURCE LINES 161-164 .. code-block:: Python calibratedIndices = [1, 2, 3] mycf = ot.ParametricFunction(cm.model, calibratedIndices, thetaPrior) .. GENERATED FROM PYTHON SOURCE LINES 165-166 Then we plot the model and compare it to the observations. .. GENERATED FROM PYTHON SOURCE LINES 168-185 ..
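Before plotting, we can quickly check that the parametric function behaves as expected: it should have a single input (the strain) and three parameters. The following small check is not part of the original study; it only queries the :class:`~openturns.ParametricFunction` created above and evaluates it at an arbitrary strain value with the prior parameters.

.. code-block:: Python

    # Optional sanity check of the parametric model.
    print("Input:", mycf.getInputDescription())  # expected: [Strain]
    print("Parameters:", mycf.getParameterDescription())  # expected: [R,C,Gamma]
    # Stress predicted with the prior parameters at a strain of 0.02.
    print("Stress at strain = 0.02:", mycf([0.02]))

..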
code-block:: Python graph = ot.Graph("Model before calibration", "Strain", "Stress (MPa)", True) # Plot the model curve = mycf.draw(cm.strainMin, cm.strainMax, 50).getDrawable(0) curve.setLegend("Model before calibration") curve.setLineStyle(ot.ResourceMap.GetAsString("CalibrationResult-ObservationLineStyle")) graph.add(curve) # Plot the noisy observations cloud = ot.Cloud(observedStrain, observedStress) cloud.setLegend("Observations") cloud.setPointStyle( ot.ResourceMap.GetAsString("CalibrationResult-ObservationPointStyle") ) graph.add(cloud) graph.setColors(ot.Drawable.BuildDefaultPalette(2)) graph.setLegendPosition("topleft") view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_001.png :alt: Model before calibration :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_001.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 186-190 We see that the observations are relatively noisy, but that the trend is clear: this shows that it may be possible to fit the model. At this point, we have a data set that we can use for calibration and a model to calibrate. .. GENERATED FROM PYTHON SOURCE LINES 193-198 Calibration with linear least squares ------------------------------------- The :class:`~openturns.LinearLeastSquaresCalibration` class performs the linear least squares calibration by linearizing the model in the neighbourhood of the reference point. .. GENERATED FROM PYTHON SOURCE LINES 200-204 .. code-block:: Python algo = ot.LinearLeastSquaresCalibration( mycf, observedStrain, observedStress, thetaPrior, "SVD" ) .. GENERATED FROM PYTHON SOURCE LINES 205-207 The :meth:`~openturns.LinearLeastSquaresCalibration.run` method computes the solution of the problem. .. GENERATED FROM PYTHON SOURCE LINES 209-212 .. code-block:: Python algo.run() calibrationResult = algo.getResult() .. GENERATED FROM PYTHON SOURCE LINES 213-217 Analysis of the results ----------------------- The :meth:`~openturns.CalibrationResult.getParameterMAP` method returns the maximum of the posterior density of :math:`\theta`. .. GENERATED FROM PYTHON SOURCE LINES 217-229 .. code-block:: Python def printRCGamma(parameter, indentation=" "): """ Print the [R, C, Gamma] vector with readable units. """ print(indentation, "R = %.1f (MPa)" % (parameter[0] / 1.0e6)) print(indentation, "C = %.1f (MPa)" % (parameter[1] / 1.0e6)) print(indentation, "Gamma = %.4f" % (parameter[2])) return None .. GENERATED FROM PYTHON SOURCE LINES 230-239 .. code-block:: Python thetaMAP = calibrationResult.getParameterMAP() print("theta After = ") printRCGamma(thetaMAP) print("theta Before = ") printRCGamma(thetaPrior) print("theta True = ") thetaTrue = [cm.trueR, cm.trueC, cm.trueGamma] printRCGamma(thetaTrue) .. rst-class:: sphx-glr-script-out .. code-block:: none theta After = R = 751.2 (MPa) C = 2209.6 (MPa) Gamma = 0.7726 theta Before = R = 700.0 (MPa) C = 2500.0 (MPa) Gamma = 8.0000 theta True = R = 750.0 (MPa) C = 2750.0 (MPa) Gamma = 10.0000 .. GENERATED FROM PYTHON SOURCE LINES 240-241 We can compute a 95% confidence interval of the parameter :math:`\theta^\star`. .. GENERATED FROM PYTHON SOURCE LINES 241-261 .. code-block:: Python def printRCGammaInterval(interval, indentation=" "): """ Print the [R, C, Gamma] C.I. with readable units. 
""" lowerBound = interval.getLowerBound() upperBound = interval.getUpperBound() print( indentation, "R in [%.1f, %.1f]" % (lowerBound[0] / 1.0e6, upperBound[0] / 1.0e6), ) print( indentation, "C in [%.1f, %.1f]" % (lowerBound[1] / 1.0e6, upperBound[1] / 1.0e6), ) print(indentation, "Gamma in [%.4f, %.4f]" % (lowerBound[2], upperBound[2])) return None .. GENERATED FROM PYTHON SOURCE LINES 262-270 .. code-block:: Python thetaPosterior = calibrationResult.getParameterPosterior() interval = thetaPosterior.computeBilateralConfidenceIntervalWithMarginalProbability( 0.95 )[0] print("95% C.I.:") printRCGammaInterval(interval) .. rst-class:: sphx-glr-script-out .. code-block:: none 95% C.I.: R in [730.3, 772.1] C in [758.2, 3661.0] Gamma in [-1450.7629, 1452.3081] .. GENERATED FROM PYTHON SOURCE LINES 271-277 We can see that the :math:`\gamma` parameter has a large confidence interval relative to the true value of the parameter: even the sign of the parameter is unknown. The parameter which is calibrated with the smallest (relative) confidence interval is :math:`R`. This is why this parameter seems to be the most important in this case. .. GENERATED FROM PYTHON SOURCE LINES 279-281 Increase the default number of points in the plots. This produces smoother spiky distributions. .. GENERATED FROM PYTHON SOURCE LINES 283-285 .. code-block:: Python ot.ResourceMap.SetAsUnsignedInteger("Distribution-DefaultPointNumber", 300) .. GENERATED FROM PYTHON SOURCE LINES 286-288 We now plot the predicted output stress depending on the input strain before and after calibration. .. GENERATED FROM PYTHON SOURCE LINES 288-294 .. code-block:: Python # sphinx_gallery_thumbnail_number = 3 graph = calibrationResult.drawObservationsVsInputs() graph.setLegendPosition("topleft") view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_002.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_002.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 295-297 We see that there is a good fit after calibration, since the predictions after calibration are close to the observations. .. GENERATED FROM PYTHON SOURCE LINES 299-300 We can also plot the predicted stress depending on the observed stress. .. GENERATED FROM PYTHON SOURCE LINES 302-305 .. code-block:: Python graph = calibrationResult.drawObservationsVsPredictions() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_003.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_003.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 306-308 We see that there is a much better fit after calibration, since the predictions are close to the diagonal of the graphics. .. GENERATED FROM PYTHON SOURCE LINES 310-317 The :meth:`~openturns.CalibrationResult.getObservationsError` method returns the estimated distribution of the observation error. This is necessarily a Gaussian distribution, because this is the distribution that we assume when we use least squares. By hypothesis, the distribution has a zero mean (this is a property of linear least squares). The standard deviation is estimated from the residuals after calibration. .. GENERATED FROM PYTHON SOURCE LINES 319-322 .. 
code-block:: Python observationError = calibrationResult.getObservationsError() print(observationError) .. rst-class:: sphx-glr-script-out .. code-block:: none Normal(mu = 0, sigma = 1.14338e+07) .. GENERATED FROM PYTHON SOURCE LINES 323-326 In order to validate that the distribution of the residuals is Gaussian after calibration, we use the :meth:`~openturns.CalibrationResult.drawResiduals` method. .. GENERATED FROM PYTHON SOURCE LINES 328-336 .. code-block:: Python graph = calibrationResult.drawResiduals() view = otv.View( graph, figure_kw={"figsize": (7.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.6) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_004.png :alt: Residual analysis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_004.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 337-346 The analysis of the distribution of the residuals after calibration shows that the distribution is centered on zero and is symmetric. This indicates that the calibration performed well. With the least squares method, the standard deviation of the observation noise is estimated from the data. On the plot, we see that the Gaussian with zero mean and estimated standard deviation is close to the distribution of the residuals after calibration. This shows that the distribution of the residuals is close to being Gaussian. .. GENERATED FROM PYTHON SOURCE LINES 348-353 This can also be seen on a Normal-plot. This is a QQ-plot applied to the normal distribution. We could use the :meth:`~openturns.VisualTest.DrawHenryLine` method to plot it, but :meth:`~openturns.CalibrationResult.drawResidualsNormalPlot` does it directly. .. GENERATED FROM PYTHON SOURCE LINES 355-358 .. code-block:: Python graph = calibrationResult.drawResidualsNormalPlot() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_005.png :alt: Residuals after calibration vs Gaussian hypothesis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_005.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 359-361 We see that the residuals fit to the Normal distribution, according to the normal plot. .. GENERATED FROM PYTHON SOURCE LINES 363-368 The parameters which best fit to the data may be sensitive to the random noise in the observed outputs. In order to see how this source of randomness changes the optimum parameter, we use :meth:`~openturns.CalibrationResult.drawParameterDistributions`. .. GENERATED FROM PYTHON SOURCE LINES 370-378 .. code-block:: Python graph = calibrationResult.drawParameterDistributions() view = otv.View( graph, figure_kw={"figsize": (10.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.8) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_006.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_006.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 379-385 We see that the :math:`R` and :math:`C` parameters are relatively accurately estimated. 
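This accuracy can be quantified. The following short sketch is not part of the original script; it computes the coefficient of variation of each marginal of the posterior distribution of the parameters obtained by linear least squares.

.. code-block:: Python

    # Optional: relative dispersion (std / |mean|) of each calibrated parameter.
    mean = thetaPosterior.getMean()
    std = thetaPosterior.getStandardDeviation()
    for i, name in enumerate(["R", "C", "Gamma"]):
        print(name, ": relative standard deviation = %.2e" % (std[i] / abs(mean[i])))

A small value indicates a well-identified parameter, a large value a poorly identified one.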
In contrast, the parameter :math:`\gamma` is not estimated accurately, as can be seen from the very large spread of the distribution of this parameter. This may reveal that there are identifiability issues with this parameter when we use a linearization of the model. .. GENERATED FROM PYTHON SOURCE LINES 388-424 .. code-block:: Python def plotDistributionGridPDF(distribution): """ Plot the marginal and bi-dimensional iso-PDF on a grid. Parameters ---------- distribution : ot.Distribution The distribution. Returns ------- grid : ot.GridLayout(dimension, dimension) The grid of plots. """ dimension = distribution.getDimension() grid = ot.GridLayout(dimension, dimension) for i in range(dimension): for j in range(dimension): if i == j: distributionI = distribution.getMarginal([i]) graph = distributionI.drawPDF() else: distributionJI = distribution.getMarginal([j, i]) graph = distributionJI.drawPDF() graph.setLegends([""]) graph.setTitle("") if i < dimension - 1: graph.setXTitle("") if j > 0: graph.setYTitle("") grid.setGraph(i, j, graph) grid.setTitle("Iso-PDF values") return grid .. GENERATED FROM PYTHON SOURCE LINES 425-426 Plot the PDF values of the distribution of the optimum parameters. .. GENERATED FROM PYTHON SOURCE LINES 426-435 .. code-block:: Python grid = plotDistributionGridPDF(thetaPosterior) view = otv.View( grid, figure_kw={"figsize": (6.0, 6.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plot_space = 0.5 plt.subplots_adjust(wspace=plot_space, hspace=plot_space) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_007.png :alt: Iso-PDF values :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_007.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 436-442 Since linear least squares calibration performs well, the study could stop here. In the next cell, we use other methods to see how this can change the results. We are going to see that the lack of identifiability of the :math:`\gamma` parameter can be regularized using Gaussian calibration methods. .. GENERATED FROM PYTHON SOURCE LINES 444-449 Calibration with nonlinear least squares ---------------------------------------- The :class:`~openturns.NonLinearLeastSquaresCalibration` class performs the nonlinear least squares calibration by minimizing the squared Euclidean norm between the predictions and the observations. .. GENERATED FROM PYTHON SOURCE LINES 451-455 .. code-block:: Python algo = ot.NonLinearLeastSquaresCalibration( mycf, observedStrain, observedStress, thetaPrior ) .. GENERATED FROM PYTHON SOURCE LINES 456-460 The optimization algorithm is automatically selected using the default solver which can build a :class:`~openturns.LeastSquaresProblem`. We can see which solver is used with :meth:`~openturns.NonLinearLeastSquaresCalibration.getOptimizationAlgorithm`. .. GENERATED FROM PYTHON SOURCE LINES 462-464 .. code-block:: Python print(algo.getOptimizationAlgorithm()) .. rst-class:: sphx-glr-script-out ..
code-block:: none class=OptimizationAlgorithm implementation=class=Ceres class=OptimizationAlgorithmImplementation problem=class=OptimizationProblem implementation=class=LeastSquaresProblem residual function=class=Function name=Unnamed implementation=class=FunctionImplementation name=Unnamed description=[] evaluationImplementation=class=NoEvaluation name=Unnamed gradientImplementation=class=NoGradient name=Unnamed hessianImplementation=class=NoHessian name=Unnamed dimension=0 startingPoint=class=Point name=Unnamed dimension=0 values=[] maximumIterationNumber=100 maximumEvaluationNumber=1000 maximumAbsoluteError=1e-05 maximumRelativeError=1e-05 maximumResidualError=1e-05 maximumConstraintError=1e-05 verbose=false .. GENERATED FROM PYTHON SOURCE LINES 465-467 The :meth:`~openturns.NonLinearLeastSquaresCalibration.run` method computes the solution of the problem. .. GENERATED FROM PYTHON SOURCE LINES 469-472 .. code-block:: Python algo.run() calibrationResult = algo.getResult() .. GENERATED FROM PYTHON SOURCE LINES 473-475 Analysis of the results ----------------------- .. GENERATED FROM PYTHON SOURCE LINES 477-479 The :meth:`~openturns.CalibrationResult.getParameterMAP` method returns the maximum of the posterior distribution of :math:`\theta`. .. GENERATED FROM PYTHON SOURCE LINES 481-484 .. code-block:: Python thetaMAP = calibrationResult.getParameterMAP() print(thetaMAP) .. rst-class:: sphx-glr-script-out .. code-block:: none [7.49701e+08,2.3567e+09,2.68804] .. GENERATED FROM PYTHON SOURCE LINES 485-486 We can compute a 95% confidence interval of the parameter :math:`\theta^\star`. .. GENERATED FROM PYTHON SOURCE LINES 488-495 .. code-block:: Python thetaPosterior = calibrationResult.getParameterPosterior() interval = thetaPosterior.computeBilateralConfidenceIntervalWithMarginalProbability( 0.95 )[0] print("95% C.I.:") printRCGammaInterval(interval) .. rst-class:: sphx-glr-script-out .. code-block:: none 95% C.I.: R in [717.0, 766.6] C in [986.3, 5795.6] Gamma in [-20.1227, 50.4506] .. GENERATED FROM PYTHON SOURCE LINES 496-504 We can see that :math:`R` and :math:`C` are accurately estimated and that :math:`\gamma` is estimated with a relatively large confidence interval. Notice, however, that this interval is much narrower than the one obtained using linear least squares. This is because the optimization algorithm that we used implicitly introduced some regularization that was absent from linear least squares. .. GENERATED FROM PYTHON SOURCE LINES 506-507 We now check the observations depending on the inputs. .. GENERATED FROM PYTHON SOURCE LINES 509-513 .. code-block:: Python graph = calibrationResult.drawObservationsVsInputs() graph.setLegendPosition("topleft") view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_008.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_008.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 514-516 We see that there is a good fit after calibration, since the predictions after calibration are close to the observations. .. GENERATED FROM PYTHON SOURCE LINES 518-521 .. code-block:: Python graph = calibrationResult.drawObservationsVsPredictions() view = otv.View(graph) ..
image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_009.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_009.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 522-524 We see that there is a much better fit after calibration, since the predictions are close to the diagonal of the graphics. .. GENERATED FROM PYTHON SOURCE LINES 526-527 We now focus on the distribution of the errors. .. GENERATED FROM PYTHON SOURCE LINES 529-532 .. code-block:: Python observationError = calibrationResult.getObservationsError() print(observationError) .. rst-class:: sphx-glr-script-out .. code-block:: none Normal(mu = -2002.54, sigma = 1.01979e+07) .. GENERATED FROM PYTHON SOURCE LINES 533-537 We see that the distribution is Gaussian (this is by hypothesis) with a mean relatively close to zero. Indeed, please consider that the stress has an order of magnitude close to :math:`10^8`. .. GENERATED FROM PYTHON SOURCE LINES 539-540 As with any other distribution, we can draw its PDF. .. GENERATED FROM PYTHON SOURCE LINES 540-543 .. code-block:: Python graph = observationError.drawPDF() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_010.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_010.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 544-552 .. code-block:: Python graph = calibrationResult.drawResiduals() view = otv.View( graph, figure_kw={"figsize": (7.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.6) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_011.png :alt: Residual analysis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_011.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 553-558 The analysis of the residuals shows that the distribution is centered on zero and symmetric. This indicates that the calibration performed well. Moreover, the distribution of the residuals is close to being Gaussian. This indicates that the observation error might be Gaussian. .. GENERATED FROM PYTHON SOURCE LINES 560-568 .. code-block:: Python graph = calibrationResult.drawParameterDistributions() view = otv.View( graph, figure_kw={"figsize": (10.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.8) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_012.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_012.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 569-572 .. code-block:: Python graph = calibrationResult.drawResidualsNormalPlot() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_013.png :alt: Residuals after calibration vs Gaussian hypothesis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_013.png :class: sphx-glr-single-img .. 
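The normal plot is a visual check. As a complement, we can apply a formal normality test, such as the Anderson-Darling test, to the residual sample. The following small sketch is not part of the original study: it recomputes the residuals at the nonlinear least squares optimum before testing them.

.. code-block:: Python

    # Optional formal check of the Gaussian hypothesis on the residuals.
    mycfMAP = ot.ParametricFunction(cm.model, calibratedIndices, thetaMAP)
    predictedStress = mycfMAP(observedStrain)
    residualSample = ot.Sample(
        [[observedStress[i, 0] - predictedStress[i, 0]] for i in range(nbobs)]
    )
    testResult = ot.NormalityTest.AndersonDarlingNormal(residualSample, 0.05)
    print("p-value:", testResult.getPValue())
    print("Gaussian residuals?", testResult.getBinaryQualityMeasure())

With only :math:`n = 10` observations the test has little power, so it only complements the graphical analysis.

..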
GENERATED FROM PYTHON SOURCE LINES 573-574 Plot the PDF values of the distribution of the optimum parameters. .. GENERATED FROM PYTHON SOURCE LINES 576-585 .. code-block:: Python grid = plotDistributionGridPDF(thetaPosterior) view = otv.View( grid, figure_kw={"figsize": (6.0, 6.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plot_space = 0.5 plt.subplots_adjust(wspace=plot_space, hspace=plot_space) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_014.png :alt: Iso-PDF values :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_014.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 586-593 We see that the parameters are relatively well estimated, although the parameter :math:`\gamma` still has a significant dispersion (although it is much reduced compared to linear least squares). Since this parameter cannot be identified easily, it may be appropriate to consider Gaussian calibration, which adds some regularity to the problem and may solve identifiability issues. .. GENERATED FROM PYTHON SOURCE LINES 595-599 Gaussian calibration parameters ------------------------------- In this part, we set the parameters of the Gaussian calibration. We first set the standard deviation of the observation errors. .. GENERATED FROM PYTHON SOURCE LINES 601-603 .. code-block:: Python sigmaStress = 1.0e7 # (Pa) .. GENERATED FROM PYTHON SOURCE LINES 604-607 Define the covariance matrix of the output Y of the model. Since the dimension of the output is equal to 1, this must be a 1-by-1 covariance matrix. .. GENERATED FROM PYTHON SOURCE LINES 609-612 .. code-block:: Python errorCovariance = ot.CovarianceMatrix(1) errorCovariance[0, 0] = sigmaStress ** 2 .. GENERATED FROM PYTHON SOURCE LINES 613-614 Define the covariance matrix of the parameters :math:`\theta` to calibrate. .. GENERATED FROM PYTHON SOURCE LINES 616-620 .. code-block:: Python sigmaR = 0.1 * R sigmaC = 0.1 * C sigmaGamma = 0.1 * Gamma .. GENERATED FROM PYTHON SOURCE LINES 621-623 Since there are 3 parameters, the prior covariance matrix is a 3-by-3 covariance matrix. .. GENERATED FROM PYTHON SOURCE LINES 625-631 .. code-block:: Python sigma = ot.CovarianceMatrix(3) sigma[0, 0] = sigmaR ** 2 sigma[1, 1] = sigmaC ** 2 sigma[2, 2] = sigmaGamma ** 2 print(sigma) .. rst-class:: sphx-glr-script-out .. code-block:: none [[ 4.9e+15 0 0 ] [ 0 6.25e+16 0 ] [ 0 0 0.64 ]] .. GENERATED FROM PYTHON SOURCE LINES 632-637 Gaussian linear calibration --------------------------- The :class:`~openturns.GaussianLinearCalibration` class performs the Gaussian linear calibration by linearizing the model in the neighbourhood of the prior. This is also known as the Kalman filter. .. GENERATED FROM PYTHON SOURCE LINES 639-643 .. code-block:: Python algo = ot.GaussianLinearCalibration( mycf, observedStrain, observedStress, thetaPrior, sigma, errorCovariance ) .. GENERATED FROM PYTHON SOURCE LINES 644-646 The :meth:`~openturns.GaussianLinearCalibration.run` method computes the solution of the problem. .. GENERATED FROM PYTHON SOURCE LINES 648-651 .. code-block:: Python algo.run() calibrationResult = algo.getResult() .. GENERATED FROM PYTHON SOURCE LINES 652-656 Analysis of the results ----------------------- The :meth:`~openturns.CalibrationResult.getParameterMAP` method returns the maximum of the posterior distribution of :math:`\theta`. .. GENERATED FROM PYTHON SOURCE LINES 658-661 ..
code-block:: Python thetaMAP = calibrationResult.getParameterMAP() print(thetaMAP) .. rst-class:: sphx-glr-script-out .. code-block:: none [7.48807e+08,2.67622e+09,7.75012] .. GENERATED FROM PYTHON SOURCE LINES 662-667 We can compute a 95% credibility interval of the parameter :math:`\theta^\star` (when we consider Bayesian methods, confidence intervals are called credibility intervals). This is an interval that contains 95% of the probability mass of the posterior distribution. .. GENERATED FROM PYTHON SOURCE LINES 669-676 .. code-block:: Python thetaPosterior = calibrationResult.getParameterPosterior() interval = thetaPosterior.computeBilateralConfidenceIntervalWithMarginalProbability( 0.95 )[0] print("95% C.I.:") printRCGammaInterval(interval) .. rst-class:: sphx-glr-script-out .. code-block:: none 95% C.I.: R in [736.2, 761.5] C in [2312.0, 3040.5] Gamma in [5.9294, 9.5708] .. GENERATED FROM PYTHON SOURCE LINES 677-679 We can see that all three parameters are estimated with a relatively small confidence interval (including :math:`\gamma`). .. GENERATED FROM PYTHON SOURCE LINES 681-682 Let us analyze the validation graphics. .. GENERATED FROM PYTHON SOURCE LINES 684-688 .. code-block:: Python graph = calibrationResult.drawObservationsVsInputs() graph.setLegendPosition("topleft") view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_015.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_015.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 689-691 We see that there is a good fit after calibration, since the predictions after calibration are close to the observations. .. GENERATED FROM PYTHON SOURCE LINES 693-696 .. code-block:: Python graph = calibrationResult.drawObservationsVsPredictions() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_016.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_016.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 697-699 We see that there is a much better fit after calibration, since the predictions are close to the diagonal of the graphics. .. GENERATED FROM PYTHON SOURCE LINES 701-704 The observation error is a hypothesis of the Gaussian calibration. This is the Gaussian distribution that we introduced in the model. .. GENERATED FROM PYTHON SOURCE LINES 706-709 .. code-block:: Python observationError = calibrationResult.getObservationsError() print(observationError) .. rst-class:: sphx-glr-script-out .. code-block:: none Normal(mu = 0, sigma = 1e+07) .. GENERATED FROM PYTHON SOURCE LINES 710-718 .. code-block:: Python graph = calibrationResult.drawResiduals() view = otv.View( graph, figure_kw={"figsize": (7.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.6) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_017.png :alt: Residual analysis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_017.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 719-723 The Gaussian distribution of the observation errors that we assumed is close to the distribution of the residuals after calibration. Moreover, the distribution of the residuals is centered on zero. This indicates that the calibration performed well.
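Before plotting the parameter distributions, we can quantify how much the observations reduced the prior uncertainty. The following comparison is a small sketch that is not part of the original script; it uses the prior standard deviations defined above and the standard deviation of the posterior distribution.

.. code-block:: Python

    # Optional: compare prior and posterior standard deviations.
    priorStd = [sigmaR, sigmaC, sigmaGamma]
    posteriorStd = thetaPosterior.getStandardDeviation()
    for i, name in enumerate(["R", "C", "Gamma"]):
        print(name, ": prior std =", priorStd[i], ", posterior std =", posteriorStd[i])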
.. GENERATED FROM PYTHON SOURCE LINES 725-728 The posterior distribution of the parameters allows one to see if the observations bring significant information compared to the prior Gaussian distributions. .. GENERATED FROM PYTHON SOURCE LINES 730-738 .. code-block:: Python graph = calibrationResult.drawParameterDistributions() view = otv.View( graph, figure_kw={"figsize": (10.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.8) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_018.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_018.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 739-749 We see that the posterior distribution of :math:`R` is spiky. For the :math:`R` variable, the observations are much more important than the prior: this is shown by the fact that the posterior and prior distributions of the :math:`R` variable are very different. We see that the prior and posterior distributions are close to each other for the :math:`\gamma` variable: the observations did not convey much information for this variable. This shows that this parameter is difficult to identify. One potential solution to estimate this parameter is, if it can be identified, to use a larger sample size (but this is not always easy in practice). .. GENERATED FROM PYTHON SOURCE LINES 751-752 We can check whether the residuals after calibration are normal. .. GENERATED FROM PYTHON SOURCE LINES 754-757 .. code-block:: Python graph = calibrationResult.drawResidualsNormalPlot() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_019.png :alt: Residuals after calibration vs Gaussian hypothesis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_019.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 758-759 Plot the PDF values of the distribution of the optimum parameters. .. GENERATED FROM PYTHON SOURCE LINES 761-770 .. code-block:: Python grid = plotDistributionGridPDF(thetaPosterior) view = otv.View( grid, figure_kw={"figsize": (6.0, 6.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plot_space = 0.5 plt.subplots_adjust(wspace=plot_space, hspace=plot_space) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_020.png :alt: Iso-PDF values :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_020.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 771-776 Gaussian nonlinear calibration ------------------------------ The :class:`~openturns.GaussianNonLinearCalibration` class performs the Gaussian nonlinear calibration. This algorithm is also known as 3DVar. .. GENERATED FROM PYTHON SOURCE LINES 778-782 .. code-block:: Python algo = ot.GaussianNonLinearCalibration( mycf, observedStrain, observedStress, thetaPrior, sigma, errorCovariance ) .. GENERATED FROM PYTHON SOURCE LINES 783-785 The :meth:`~openturns.GaussianNonLinearCalibration.run` method computes the solution of the problem. .. GENERATED FROM PYTHON SOURCE LINES 787-790 ..
code-block:: Python algo.run() calibrationResult = algo.getResult() .. GENERATED FROM PYTHON SOURCE LINES 791-793 Analysis of the results ----------------------- .. GENERATED FROM PYTHON SOURCE LINES 795-797 The :meth:`~openturns.CalibrationResult.getParameterMAP` method returns the maximum of the posterior distribution of :math:`\theta`. .. GENERATED FROM PYTHON SOURCE LINES 799-802 .. code-block:: Python thetaMAP = calibrationResult.getParameterMAP() print(thetaMAP) .. rst-class:: sphx-glr-script-out .. code-block:: none [7.4895e+08,2.67065e+09,7.73735] .. GENERATED FROM PYTHON SOURCE LINES 803-804 We can compute a 95% credibility interval of the parameter :math:`\theta^\star`. .. GENERATED FROM PYTHON SOURCE LINES 804-811 .. code-block:: Python thetaPosterior = calibrationResult.getParameterPosterior() interval = thetaPosterior.computeBilateralConfidenceIntervalWithMarginalProbability( 0.95 )[0] print("95% C.I.:") printRCGammaInterval(interval) .. rst-class:: sphx-glr-script-out .. code-block:: none 95% C.I.: R in [732.2, 759.4] C in [2466.8, 2994.8] Gamma in [7.3087, 7.8756] .. GENERATED FROM PYTHON SOURCE LINES 812-814 This credibility interval can be compared to the one that we previously obtained with the linear Gaussian calibration method. .. GENERATED FROM PYTHON SOURCE LINES 816-820 .. code-block:: Python graph = calibrationResult.drawObservationsVsInputs() graph.setLegendPosition("topleft") view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_021.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_021.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 821-823 We see that there is a good fit after calibration, since the predictions after calibration are close to the observations. .. GENERATED FROM PYTHON SOURCE LINES 825-828 .. code-block:: Python graph = calibrationResult.drawObservationsVsPredictions() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_022.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_022.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 829-831 We see that there is a much better fit after calibration, since the predictions are close to the diagonal of the graphics. .. GENERATED FROM PYTHON SOURCE LINES 833-836 The method used to compute the distribution of the error and the posterior distribution of the parameters depends on the value of the bootstrap size. .. GENERATED FROM PYTHON SOURCE LINES 836-838 .. code-block:: Python print("Bootstrap size : ", algo.getBootstrapSize()) .. rst-class:: sphx-glr-script-out .. code-block:: none Bootstrap size : 100 .. GENERATED FROM PYTHON SOURCE LINES 839-840 Since the default bootstrap size is nonzero, these distributions are estimated by bootstrap. .. GENERATED FROM PYTHON SOURCE LINES 842-844 By default, the observation error is predicted by bootstrapping the problem at the posterior. .. GENERATED FROM PYTHON SOURCE LINES 846-849 .. code-block:: Python observationError = calibrationResult.getObservationsError() print(observationError) .. rst-class:: sphx-glr-script-out .. code-block:: none Normal(mu = -99660.4, sigma = 1e+07) ..
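If a faster, purely Gaussian approximation is preferred, the bootstrap can be turned off. The following variation is only a sketch and is not part of the original study; it assumes that setting the bootstrap size to zero makes the algorithm skip resampling and rely on a Gaussian approximation of the posterior around the optimum.

.. code-block:: Python

    # Optional variation: rerun the Gaussian nonlinear calibration without bootstrap.
    algoNoBootstrap = ot.GaussianNonLinearCalibration(
        mycf, observedStrain, observedStress, thetaPrior, sigma, errorCovariance
    )
    algoNoBootstrap.setBootstrapSize(0)
    algoNoBootstrap.run()
    print(algoNoBootstrap.getResult().getParameterMAP())

We now come back to the observation error estimated with the default bootstrap settings.

..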
GENERATED FROM PYTHON SOURCE LINES 850-851 This can be compared to the residuals distribution, which is computed at the posterior. .. GENERATED FROM PYTHON SOURCE LINES 853-861 .. code-block:: Python graph = calibrationResult.drawResiduals() view = otv.View( graph, figure_kw={"figsize": (7.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.6) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_023.png :alt: Residual analysis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_023.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 862-863 The analysis is very similar to the linear Gaussian calibration. .. GENERATED FROM PYTHON SOURCE LINES 865-873 .. code-block:: Python graph = calibrationResult.drawParameterDistributions() view = otv.View( graph, figure_kw={"figsize": (10.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.8) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_024.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_024.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 874-877 .. code-block:: Python graph = calibrationResult.drawResidualsNormalPlot() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_025.png :alt: Residuals after calibration vs Gaussian hypothesis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_025.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 878-879 Plot the PDF values of the distribution of the optimum parameters. .. GENERATED FROM PYTHON SOURCE LINES 881-890 .. code-block:: Python grid = plotDistributionGridPDF(thetaPosterior) view = otv.View( grid, figure_kw={"figsize": (6.0, 6.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plot_space = 0.5 plt.subplots_adjust(wspace=plot_space, hspace=plot_space) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_026.png :alt: Iso-PDF values :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_026.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 891-895 We see that the prior and posterior distribution for the :math:`\gamma` parameter are close to each other, but not superimposed: the observations significantly brought information to the variable :math:`\gamma` during the calibration. .. GENERATED FROM PYTHON SOURCE LINES 895-898 .. code-block:: Python otv.View.ShowAll() .. GENERATED FROM PYTHON SOURCE LINES 899-900 Reset default settings .. GENERATED FROM PYTHON SOURCE LINES 900-901 .. code-block:: Python ot.ResourceMap.Reload() .. rst-class:: sphx-glr-timing **Total running time of the script:** (0 minutes 6.922 seconds) .. _sphx_glr_download_auto_calibration_least_squares_and_gaussian_calibration_plot_calibration_chaboche.py: .. only:: html .. container:: sphx-glr-footer sphx-glr-footer-example .. container:: sphx-glr-download sphx-glr-download-jupyter :download:`Download Jupyter notebook: plot_calibration_chaboche.ipynb ` .. 
container:: sphx-glr-download sphx-glr-download-python :download:`Download Python source code: plot_calibration_chaboche.py `