.. DO NOT EDIT. .. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY. .. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE: .. "auto_calibration/least_squares_and_gaussian_calibration/plot_calibration_chaboche.py" .. LINE NUMBERS ARE GIVEN BELOW. .. only:: html .. note:: :class: sphx-glr-download-link-note :ref:`Go to the end ` to download the full example code. .. rst-class:: sphx-glr-example-title .. _sphx_glr_auto_calibration_least_squares_and_gaussian_calibration_plot_calibration_chaboche.py: Calibration of the Chaboche mechanical model ============================================ In this example we present calibration methods on the Chaboche model. A detailed explanation of this mechanical law is presented :ref:`here `. As we are going to see, this model is relatively simple to calibrate: its parameters are identifiable and the output is relatively sensitive to the variation of the parameters. Hence, all methods perform correctly in this case. In this example, we use both least squares methods and Bayesian Gaussian methods. Please read :ref:`code_calibration` for more details on code calibration and least squares and read :ref:`gaussian_calibration` for more details on Gaussian calibration. Parameters to calibrate and observations ---------------------------------------- The vector of parameters to calibrate is: .. math:: \theta = (R,C,\gamma). We consider a data set where the number of observations is equal to: .. math:: n = 10. The observations are the pairs :math:`\{(\epsilon_i,\sigma_i)\}_{i=1,...,n}`, i.e. each observation is a couple made of the strain and the corresponding stress. In the particular situation where we want to calibrate this model, the following list presents which variables are observed input variables, input calibrated variables and observed output variables. - :math:`\epsilon`: Input. Observed. - :math:`R`, :math:`C`, :math:`\gamma` : Inputs. Calibrated. - :math:`\sigma`: Output. Observed. .. GENERATED FROM PYTHON SOURCE LINES 45-52 .. code-block:: Python import openturns as ot import openturns.viewer as otv from matplotlib import pylab as plt from openturns.usecases import chaboche_model ot.Log.Show(ot.Log.NONE) .. GENERATED FROM PYTHON SOURCE LINES 53-61 Define the observations ----------------------- In practice, we generally use a data set which has been obtained from measurements. This data set can be loaded using e.g. :meth:`~openturns.Sample.ImportFromCSVFile`. Here we import the data from the :class:`~openturns.usecases.chaboche_model.ChabocheModel` class. .. GENERATED FROM PYTHON SOURCE LINES 63-70 .. code-block:: Python ot.RandomGenerator.SetSeed(0) cm = chaboche_model.ChabocheModel() print(cm.data) observedStrain = cm.data[:, 0] observedStress = cm.data[:, 1] nbobs = cm.data.getSize() .. rst-class:: sphx-glr-script-out .. code-block:: none [ Strain Stress (Pa) ] 0 : [ 0 7.56e+08 ] 1 : [ 0.0077 7.57e+08 ] 2 : [ 0.0155 7.85e+08 ] 3 : [ 0.0233 8.19e+08 ] 4 : [ 0.0311 8.01e+08 ] 5 : [ 0.0388 8.42e+08 ] 6 : [ 0.0466 8.49e+08 ] 7 : [ 0.0544 8.79e+08 ] 8 : [ 0.0622 8.85e+08 ] 9 : [ 0.07 8.96e+08 ] .. GENERATED FROM PYTHON SOURCE LINES 71-72 Print the Chaboche model .. GENERATED FROM PYTHON SOURCE LINES 74-77 .. code-block:: Python print("Inputs:", cm.model.getInputDescription()) print("Outputs:", cm.model.getOutputDescription()) .. rst-class:: sphx-glr-script-out .. code-block:: none Inputs: [Strain,R,C,Gamma] Outputs: [Stress] .. 
GENERATED FROM PYTHON SOURCE LINES 78-84

We see that there are four inputs, `Strain`, `R`, `C` and `Gamma`, and one output, `Stress`.
The strain is an observed input and the stress is the observed output.
Using these observations, we want to estimate the parameters `R`, `C` and `Gamma`.

.. GENERATED FROM PYTHON SOURCE LINES 86-92

Set the calibration parameters
------------------------------
In this part, we begin the calibration study.
Define the reference values of the :math:`\theta` parameter.
In the Bayesian framework, this is called the mean of the *prior* Gaussian distribution.
In the data assimilation framework, this is called the *background*.

.. GENERATED FROM PYTHON SOURCE LINES 94-99

.. code-block:: Python

    R = 700e6  # Exact : 750e6
    C = 2500e6  # Exact : 2750e6
    Gamma = 8.0  # Exact : 10
    thetaPrior = [R, C, Gamma]

.. GENERATED FROM PYTHON SOURCE LINES 100-124

In the physical model, the inputs and parameters are ordered as
presented in the next table.
Notice that there are no parameters in the physical model.

+-------+----------------+
| Index | Input variable |
+=======+================+
| 0     | Strain         |
+-------+----------------+
| 1     | R              |
+-------+----------------+
| 2     | C              |
+-------+----------------+
| 3     | Gamma          |
+-------+----------------+

+-------+-----------+
| Index | Parameter |
+=======+===========+
| ∅     | ∅         |
+-------+-----------+

**Table 1.** Indices and names of the inputs and parameters of the physical model.

.. GENERATED FROM PYTHON SOURCE LINES 126-129

.. code-block:: Python

    print("Physical Model Inputs:", cm.model.getInputDescription())
    print("Physical Model Parameters:", cm.model.getParameterDescription())

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Physical Model Inputs: [Strain,R,C,Gamma]
    Physical Model Parameters: []

.. GENERATED FROM PYTHON SOURCE LINES 130-153

In order to perform calibration, we have to define a parametric model,
with observed inputs and parameters to calibrate.
In order to do this, we create a :class:`~openturns.ParametricFunction`
where the parameters are `R`, `C` and `Gamma`, which have the indices
1, 2 and 3 in the physical model.

+-------+----------------+
| Index | Input variable |
+=======+================+
| 0     | Strain         |
+-------+----------------+

+-------+-----------+
| Index | Parameter |
+=======+===========+
| 0     | R         |
+-------+-----------+
| 1     | C         |
+-------+-----------+
| 2     | Gamma     |
+-------+-----------+

**Table 2.** Indices and names of the inputs and parameters of the parametric model.

.. GENERATED FROM PYTHON SOURCE LINES 155-158

The following statement creates the calibrated function from the model.
The calibrated parameters `R`, `C`, `Gamma` are at indices 1, 2, 3
in the input arguments of the model.

.. GENERATED FROM PYTHON SOURCE LINES 160-163

.. code-block:: Python

    calibratedIndices = [1, 2, 3]
    mycf = ot.ParametricFunction(cm.model, calibratedIndices, thetaPrior)
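As a quick sanity check (this is an optional addition, not part of the original script), we can verify that the parametric function now has `Strain` as its only input and `R`, `C`, `Gamma` as parameters, and evaluate it at a nominal strain using the prior parameter values.

.. code-block:: Python

    # Optional check of the parametric function (assumes the objects defined above).
    print("Calibrated model inputs:", mycf.getInputDescription())
    print("Calibrated model parameters:", mycf.getParameterDescription())
    print("Current parameter values:", mycf.getParameter())
    # Predicted stress at a nominal strain, using the prior parameter values.
    print("Stress at strain = 0.02:", mycf([0.02]))

.. GENERATED FROM PYTHON SOURCE LINES 164-165

Then we plot the model and compare it to the observations.

.. GENERATED FROM PYTHON SOURCE LINES 167-183

..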
code-block:: Python graph = ot.Graph("Model before calibration", "Strain", "Stress (MPa)", True) # Plot the model curve = mycf.draw(cm.strainMin, cm.strainMax, 50).getDrawable(0) curve.setLegend("Model before calibration") curve.setLineStyle(ot.ResourceMap.GetAsString("CalibrationResult-ObservationLineStyle")) graph.add(curve) # Plot the noisy observations cloud = ot.Cloud(observedStrain, observedStress) cloud.setLegend("Observations") cloud.setPointStyle( ot.ResourceMap.GetAsString("CalibrationResult-ObservationPointStyle") ) graph.add(cloud) graph.setLegendPosition("upper left") view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_001.png :alt: Model before calibration :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_001.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 184-188 We see that the observations are relatively noisy, but that the trend is clear: this shows that it may be possible to fit the model. At this point, we have a data set that we can use for calibration and a model to calibrate. .. GENERATED FROM PYTHON SOURCE LINES 191-196 Calibration with linear least squares ------------------------------------- The :class:`~openturns.LinearLeastSquaresCalibration` class performs the linear least squares calibration by linearizing the model in the neighbourhood of the reference point. .. GENERATED FROM PYTHON SOURCE LINES 198-202 .. code-block:: Python algo = ot.LinearLeastSquaresCalibration( mycf, observedStrain, observedStress, thetaPrior, "SVD" ) .. GENERATED FROM PYTHON SOURCE LINES 203-205 The :meth:`~openturns.LinearLeastSquaresCalibration.run` method computes the solution of the problem. .. GENERATED FROM PYTHON SOURCE LINES 207-210 .. code-block:: Python algo.run() calibrationResult = algo.getResult() .. GENERATED FROM PYTHON SOURCE LINES 211-215 Analysis of the results ----------------------- The :meth:`~openturns.CalibrationResult.getParameterMAP` method returns the maximum of the posterior density of :math:`\theta`. .. GENERATED FROM PYTHON SOURCE LINES 215-227 .. code-block:: Python def printRCGamma(parameter, indentation=" "): """ Print the [R, C, Gamma] vector with readable units. """ print(indentation, "R = %.1f (MPa)" % (parameter[0] / 1.0e6)) print(indentation, "C = %.1f (MPa)" % (parameter[1] / 1.0e6)) print(indentation, "Gamma = %.4f" % (parameter[2])) return None .. GENERATED FROM PYTHON SOURCE LINES 228-237 .. code-block:: Python thetaMAP = calibrationResult.getParameterMAP() print("theta After = ") printRCGamma(thetaMAP) print("theta Before = ") printRCGamma(thetaPrior) print("theta True = ") thetaTrue = [cm.trueR, cm.trueC, cm.trueGamma] printRCGamma(thetaTrue) .. rst-class:: sphx-glr-script-out .. code-block:: none theta After = R = 751.2 (MPa) C = 2209.6 (MPa) Gamma = 0.7726 theta Before = R = 700.0 (MPa) C = 2500.0 (MPa) Gamma = 8.0000 theta True = R = 750.0 (MPa) C = 2750.0 (MPa) Gamma = 10.0000 .. GENERATED FROM PYTHON SOURCE LINES 238-239 We can compute a 95% confidence interval of the parameter :math:`\theta^\star`. .. GENERATED FROM PYTHON SOURCE LINES 239-259 .. code-block:: Python def printRCGammaInterval(interval, indentation=" "): """ Print the [R, C, Gamma] C.I. with readable units. 
""" lowerBound = interval.getLowerBound() upperBound = interval.getUpperBound() print( indentation, "R in [%.1f, %.1f]" % (lowerBound[0] / 1.0e6, upperBound[0] / 1.0e6), ) print( indentation, "C in [%.1f, %.1f]" % (lowerBound[1] / 1.0e6, upperBound[1] / 1.0e6), ) print(indentation, "Gamma in [%.4f, %.4f]" % (lowerBound[2], upperBound[2])) return None .. GENERATED FROM PYTHON SOURCE LINES 260-268 .. code-block:: Python thetaPosterior = calibrationResult.getParameterPosterior() interval = thetaPosterior.computeBilateralConfidenceIntervalWithMarginalProbability( 0.95 )[0] print("95% C.I.:") printRCGammaInterval(interval) .. rst-class:: sphx-glr-script-out .. code-block:: none 95% C.I.: R in [730.3, 772.1] C in [758.2, 3661.0] Gamma in [-1450.7629, 1452.3081] .. GENERATED FROM PYTHON SOURCE LINES 269-275 We can see that the :math:`\gamma` parameter has a large confidence interval relative to the true value of the parameter: even the sign of the parameter is unknown. The parameter which is calibrated with the smallest (relative) confidence interval is :math:`R`. This is why this parameter seems to be the most important in this case. .. GENERATED FROM PYTHON SOURCE LINES 277-279 Increase the default number of points in the plots. This produces smoother spiky distributions. .. GENERATED FROM PYTHON SOURCE LINES 281-283 .. code-block:: Python ot.ResourceMap.SetAsUnsignedInteger("Distribution-DefaultPointNumber", 300) .. GENERATED FROM PYTHON SOURCE LINES 284-286 We now plot the predicted output stress depending on the input strain before and after calibration. .. GENERATED FROM PYTHON SOURCE LINES 286-292 .. code-block:: Python # sphinx_gallery_thumbnail_number = 3 graph = calibrationResult.drawObservationsVsInputs() graph.setLegendPosition("upper left") view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_002.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_002.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 293-295 We see that there is a good fit after calibration, since the predictions after calibration are close to the observations. .. GENERATED FROM PYTHON SOURCE LINES 297-298 We can also plot the predicted stress depending on the observed stress. .. GENERATED FROM PYTHON SOURCE LINES 300-303 .. code-block:: Python graph = calibrationResult.drawObservationsVsPredictions() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_003.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_003.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 304-306 We see that there is a much better fit after calibration, since the predictions are close to the diagonal of the graphics. .. GENERATED FROM PYTHON SOURCE LINES 308-315 The :meth:`~openturns.CalibrationResult.getObservationsError` method returns the estimated distribution of the observation error. This is necessarily a Gaussian distribution, because this is the distribution that we assume when we use least squares. By hypothesis, the distribution has a zero mean (this is a property of linear least squares). The standard deviation is estimated from the residuals after calibration. .. GENERATED FROM PYTHON SOURCE LINES 317-320 .. 
code-block:: Python observationError = calibrationResult.getObservationsError() print(observationError) .. rst-class:: sphx-glr-script-out .. code-block:: none Normal(mu = 0, sigma = 1.14338e+07) .. GENERATED FROM PYTHON SOURCE LINES 321-324 In order to validate that the distribution of the residuals is Gaussian after calibration, we use the :meth:`~openturns.CalibrationResult.drawResiduals` method. .. GENERATED FROM PYTHON SOURCE LINES 326-334 .. code-block:: Python graph = calibrationResult.drawResiduals() view = otv.View( graph, figure_kw={"figsize": (7.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.6) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_004.png :alt: Residual analysis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_004.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 335-344 The analysis of the distribution of the residuals after calibration shows that the distribution is centered on zero and is symmetric. This indicates that the calibration performed well. With the least squares method, the standard deviation of the observation noise is estimated from the data. On the plot, we see that the Gaussian with zero mean and estimated standard deviation is close to the distribution of the residuals after calibration. This shows that the distribution of the residuals is close to being Gaussian. .. GENERATED FROM PYTHON SOURCE LINES 346-351 This can also be seen on a Normal-plot. This is a QQ-plot applied to the normal distribution. We could use the :meth:`~openturns.VisualTest.DrawHenryLine` method to plot it, but :meth:`~openturns.CalibrationResult.drawResidualsNormalPlot` does it directly. .. GENERATED FROM PYTHON SOURCE LINES 353-356 .. code-block:: Python graph = calibrationResult.drawResidualsNormalPlot() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_005.png :alt: Residuals after calibration vs Gaussian hypothesis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_005.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 357-359 We see that the residuals fit to the Normal distribution, according to the normal plot. .. GENERATED FROM PYTHON SOURCE LINES 361-366 The parameters which best fit to the data may be sensitive to the random noise in the observed outputs. In order to see how this source of randomness changes the optimum parameter, we use :meth:`~openturns.CalibrationResult.drawParameterDistributions`. .. GENERATED FROM PYTHON SOURCE LINES 368-376 .. code-block:: Python graph = calibrationResult.drawParameterDistributions() view = otv.View( graph, figure_kw={"figsize": (10.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.8) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_006.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_006.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 377-383 We see that the :math:`R` and :math:`C` parameters are relatively accurately estimated. 
The parameter :math:`\gamma` is not estimated accurately, as can be seen
from the very large spread of the distribution of this parameter.
This may reveal that there are identifiability issues with this parameter
when we use a linearization of the model.

.. GENERATED FROM PYTHON SOURCE LINES 386-425

.. code-block:: Python

    def plotDistributionGridPDF(distribution):
        """
        Plot the marginal and bi-dimensional iso-PDF on a grid.

        Parameters
        ----------
        distribution : ot.Distribution
            The distribution.

        Returns
        -------
        grid : ot.GridLayout(dimension, dimension)
            The grid of plots.
        """
        dimension = distribution.getDimension()
        grid = ot.GridLayout(dimension, dimension)
        for i in range(dimension):
            for j in range(dimension):
                if i == j:
                    distributionI = distribution.getMarginal([i])
                    graph = distributionI.drawPDF()
                else:
                    distributionJI = distribution.getMarginal([j, i])
                    graph = distributionJI.drawPDF()
                    contour = graph.getDrawable(0).getImplementation()
                    contour.setColorBarPosition("")  # Hide color bar
                    graph.setDrawable(contour, 0)
                graph.setLegends([""])
                graph.setTitle("")
                if i < dimension - 1:
                    graph.setXTitle("")
                if j > 0:
                    graph.setYTitle("")
                grid.setGraph(i, j, graph)
        grid.setTitle("Iso-PDF values")
        return grid

.. GENERATED FROM PYTHON SOURCE LINES 426-427

Plot the PDF values of the distribution of the optimum parameters.

.. GENERATED FROM PYTHON SOURCE LINES 427-436

.. code-block:: Python

    grid = plotDistributionGridPDF(thetaPosterior)
    view = otv.View(
        grid,
        figure_kw={"figsize": (6.0, 6.0)},
        legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"},
    )
    plot_space = 0.5
    plt.subplots_adjust(wspace=plot_space, hspace=plot_space)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_007.png
   :alt: Iso-PDF values
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_007.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 437-443

Since the linear least squares calibration performs well, the study could stop here.
In the next sections, we use other methods to see how they change the results.
We are going to see that the lack of identifiability of the :math:`\gamma`
parameter can be mitigated by the regularization introduced by Gaussian
calibration methods.

.. GENERATED FROM PYTHON SOURCE LINES 445-450

Calibration with nonlinear least squares
----------------------------------------
The :class:`~openturns.NonLinearLeastSquaresCalibration` class performs the
nonlinear least squares calibration by minimizing the squared Euclidean norm
between the predictions and the observations.

.. GENERATED FROM PYTHON SOURCE LINES 452-456

.. code-block:: Python

    algo = ot.NonLinearLeastSquaresCalibration(
        mycf, observedStrain, observedStress, thetaPrior
    )

.. GENERATED FROM PYTHON SOURCE LINES 457-461

The optimization algorithm is automatically selected as the default solver
that can handle a :class:`~openturns.LeastSquaresProblem`.
We can see which solver is used with
:meth:`~openturns.NonLinearLeastSquaresCalibration.getOptimizationAlgorithm`.

.. GENERATED FROM PYTHON SOURCE LINES 463-465

.. code-block:: Python

    print(algo.getOptimizationAlgorithm())

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    class=OptimizationAlgorithm implementation=class=Ceres algorithm=LEVENBERG_MARQUARDT class=OptimizationAlgorithmImplementation problem=class=OptimizationProblem implementation=class=LeastSquaresProblem residual function=class=Function name=Unnamed implementation=class=FunctionImplementation name=Unnamed description=[] evaluationImplementation=class=NoEvaluation name=Unnamed gradientImplementation=class=NoGradient name=Unnamed hessianImplementation=class=NoHessian name=Unnamed dimension=0 startingPoint=class=Point name=Unnamed dimension=0 values=[] maximumIterationNumber=100 maximumCallsNumber=1000 maximumAbsoluteError=1e-05 maximumRelativeError=1e-05 maximumResidualError=1e-05 maximumConstraintError=1e-05
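If needed, this solver can be tuned before running the calibration.
The following sketch is an optional addition (not part of the original script); it assumes the `algo` object defined above and the usual `setOptimizationAlgorithm` setter to push the modified solver back into the calibration algorithm.

.. code-block:: Python

    # Optional sketch: adjust the stopping criteria of the underlying solver.
    solver = algo.getOptimizationAlgorithm()
    solver.setMaximumIterationNumber(200)  # allow more iterations if needed
    solver.setMaximumAbsoluteError(1.0e-8)  # tighten the convergence criterion
    algo.setOptimizationAlgorithm(solver)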
.. GENERATED FROM PYTHON SOURCE LINES 466-468

The :meth:`~openturns.NonLinearLeastSquaresCalibration.run` method computes the solution of the problem.

.. GENERATED FROM PYTHON SOURCE LINES 470-473

.. code-block:: Python

    algo.run()
    calibrationResult = algo.getResult()

.. GENERATED FROM PYTHON SOURCE LINES 474-476

Analysis of the results
-----------------------

.. GENERATED FROM PYTHON SOURCE LINES 478-480

The :meth:`~openturns.CalibrationResult.getParameterMAP` method returns the maximum of the posterior distribution of :math:`\theta`.

.. GENERATED FROM PYTHON SOURCE LINES 482-485

.. code-block:: Python

    thetaMAP = calibrationResult.getParameterMAP()
    print(thetaMAP)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [7.51192e+08,2.21421e+09,0.897348]

.. GENERATED FROM PYTHON SOURCE LINES 486-487

We can compute a 95% confidence interval of the parameter :math:`\theta^\star`.

.. GENERATED FROM PYTHON SOURCE LINES 489-496

.. code-block:: Python

    thetaPosterior = calibrationResult.getParameterPosterior()
    interval = thetaPosterior.computeBilateralConfidenceIntervalWithMarginalProbability(
        0.95
    )[0]
    print("95% C.I.:")
    printRCGammaInterval(interval)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    95% C.I.:
       R in [684.3, 777.7]
       C in [823.4, 5431.8]
       Gamma in [-26.4494, 26.7070]

.. GENERATED FROM PYTHON SOURCE LINES 497-505

We can see that :math:`R` and :math:`C` are accurately estimated
and that :math:`\gamma` is estimated with a relatively large confidence interval.
Notice, however, that this interval is much narrower than the one
obtained with linear least squares.
This is because the optimization algorithm that we used implicitly
introduced some regularization that was absent from linear least squares.

.. GENERATED FROM PYTHON SOURCE LINES 507-508

We now compare the observations and the predictions versus the input strain.

.. GENERATED FROM PYTHON SOURCE LINES 510-514

.. code-block:: Python

    graph = calibrationResult.drawObservationsVsInputs()
    graph.setLegendPosition("upper left")
    view = otv.View(graph)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_008.png
   :alt: plot calibration chaboche
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_008.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 515-517

We see that there is a good fit after calibration, since the predictions after calibration are close to the observations.

.. GENERATED FROM PYTHON SOURCE LINES 519-522

.. code-block:: Python

    graph = calibrationResult.drawObservationsVsPredictions()
    view = otv.View(graph)

..
image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_009.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_009.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 523-525 We see that there is a much better fit after calibration, since the predictions are close to the diagonal of the graphics. .. GENERATED FROM PYTHON SOURCE LINES 527-528 We now focus on the distribution of the errors. .. GENERATED FROM PYTHON SOURCE LINES 530-533 .. code-block:: Python observationError = calibrationResult.getObservationsError() print(observationError) .. rst-class:: sphx-glr-script-out .. code-block:: none Normal(mu = 18556.8, sigma = 1.02315e+07) .. GENERATED FROM PYTHON SOURCE LINES 534-538 We see that the distribution is Gaussian (this is by hypothesis) with a mean relatively close to zero. Indeed, please consider that the stress has an order of magnitude close to :math:`10^8`. .. GENERATED FROM PYTHON SOURCE LINES 540-541 As with any other distribution, we can draw its PDF. .. GENERATED FROM PYTHON SOURCE LINES 541-544 .. code-block:: Python graph = observationError.drawPDF() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_010.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_010.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 545-553 .. code-block:: Python graph = calibrationResult.drawResiduals() view = otv.View( graph, figure_kw={"figsize": (7.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.6) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_011.png :alt: Residual analysis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_011.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 554-559 The analysis of the residuals shows that the distribution is centered on zero and symmetric. This indicates that the calibration performed well. Moreover, the distribution of the residuals is close to being Gaussian. This indicates that the observation error might be Gaussian. .. GENERATED FROM PYTHON SOURCE LINES 561-569 .. code-block:: Python graph = calibrationResult.drawParameterDistributions() view = otv.View( graph, figure_kw={"figsize": (10.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.8) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_012.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_012.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 570-573 .. code-block:: Python graph = calibrationResult.drawResidualsNormalPlot() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_013.png :alt: Residuals after calibration vs Gaussian hypothesis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_013.png :class: sphx-glr-single-img .. 
GENERATED FROM PYTHON SOURCE LINES 574-575

Plot the PDF values of the distribution of the optimum parameters.

.. GENERATED FROM PYTHON SOURCE LINES 577-586

.. code-block:: Python

    grid = plotDistributionGridPDF(thetaPosterior)
    view = otv.View(
        grid,
        figure_kw={"figsize": (6.0, 6.0)},
        legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"},
    )
    plot_space = 0.5
    plt.subplots_adjust(wspace=plot_space, hspace=plot_space)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_014.png
   :alt: Iso-PDF values
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_014.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 587-594

We see that the parameters are relatively well estimated, although the
parameter :math:`\gamma` still has a significant dispersion (much reduced,
however, compared to linear least squares).
Since this parameter cannot be identified easily, it may be appropriate
to consider Gaussian calibration, which adds some regularity to the problem
and may resolve identifiability issues.

.. GENERATED FROM PYTHON SOURCE LINES 596-600

Gaussian calibration parameters
-------------------------------
In this part, we set the parameters of the Gaussian calibration.
We first set the standard deviation of the observation errors.

.. GENERATED FROM PYTHON SOURCE LINES 602-604

.. code-block:: Python

    sigmaStress = 1.0e7  # (Pa)

.. GENERATED FROM PYTHON SOURCE LINES 605-608

Define the covariance matrix of the output Y of the model.
Since the dimension of the output is equal to 1,
this must be a 1-by-1 covariance matrix.

.. GENERATED FROM PYTHON SOURCE LINES 610-613

.. code-block:: Python

    errorCovariance = ot.CovarianceMatrix(1)
    errorCovariance[0, 0] = sigmaStress**2

.. GENERATED FROM PYTHON SOURCE LINES 614-615

Define the covariance matrix of the parameters :math:`\theta` to calibrate.

.. GENERATED FROM PYTHON SOURCE LINES 617-621

.. code-block:: Python

    sigmaR = 0.1 * R
    sigmaC = 0.1 * C
    sigmaGamma = 0.1 * Gamma

.. GENERATED FROM PYTHON SOURCE LINES 622-624

Since there are 3 parameters, the prior covariance matrix is a 3-by-3 covariance matrix.

.. GENERATED FROM PYTHON SOURCE LINES 626-632

.. code-block:: Python

    sigma = ot.CovarianceMatrix(3)
    sigma[0, 0] = sigmaR**2
    sigma[1, 1] = sigmaC**2
    sigma[2, 2] = sigmaGamma**2
    print(sigma)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [[ 4.9e+15  0        0    ]
     [ 0        6.25e+16 0    ]
     [ 0        0        0.64 ]]
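To make the ingredients of the Gaussian (Bayesian) calibration explicit, the quantities defined above correspond to a Gaussian prior on :math:`\theta` and a Gaussian observation error. The following sketch is an optional illustration (not part of the original script) that simply builds these two distributions from the objects we just created.

.. code-block:: Python

    # Optional sketch: the Gaussian prior on theta and the Gaussian observation error.
    priorDistribution = ot.Normal(thetaPrior, sigma)
    errorDistribution = ot.Normal([0.0], errorCovariance)
    print("Prior on theta:", priorDistribution)
    print("Observation error:", errorDistribution)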
.. GENERATED FROM PYTHON SOURCE LINES 633-638

Gaussian linear calibration
---------------------------
The :class:`~openturns.GaussianLinearCalibration` class performs the Gaussian linear
calibration by linearizing the model in the neighbourhood of the prior.
This is also known as the Kalman filter.

.. GENERATED FROM PYTHON SOURCE LINES 640-644

.. code-block:: Python

    algo = ot.GaussianLinearCalibration(
        mycf, observedStrain, observedStress, thetaPrior, sigma, errorCovariance
    )

.. GENERATED FROM PYTHON SOURCE LINES 645-647

The :meth:`~openturns.GaussianLinearCalibration.run` method computes the solution of the problem.

.. GENERATED FROM PYTHON SOURCE LINES 649-652

.. code-block:: Python

    algo.run()
    calibrationResult = algo.getResult()

.. GENERATED FROM PYTHON SOURCE LINES 653-657

Analysis of the results
-----------------------
The :meth:`~openturns.CalibrationResult.getParameterMAP` method returns the maximum of the posterior distribution of :math:`\theta`.

.. GENERATED FROM PYTHON SOURCE LINES 659-662

.. code-block:: Python

    thetaMAP = calibrationResult.getParameterMAP()
    print(thetaMAP)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [7.48807e+08,2.67622e+09,7.75012]

.. GENERATED FROM PYTHON SOURCE LINES 663-668

We can compute a 95% credibility interval of the parameter :math:`\theta^\star`
(when we consider Bayesian methods, confidence intervals are called
credibility intervals).
This interval contains 95% of the probability mass of the posterior distribution.

.. GENERATED FROM PYTHON SOURCE LINES 670-677

.. code-block:: Python

    thetaPosterior = calibrationResult.getParameterPosterior()
    interval = thetaPosterior.computeBilateralConfidenceIntervalWithMarginalProbability(
        0.95
    )[0]
    print("95% C.I.:")
    printRCGammaInterval(interval)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    95% C.I.:
       R in [736.2, 761.5]
       C in [2312.0, 3040.5]
       Gamma in [5.9294, 9.5708]

.. GENERATED FROM PYTHON SOURCE LINES 678-680

We can see that all three parameters are estimated with a relatively narrow
credibility interval (including :math:`\gamma`).
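To quantify this, one can compare, for each parameter, the prior standard deviation with the standard deviation of the posterior distribution. This is an optional check (not part of the original script); it reuses the `thetaPosterior`, `sigmaR`, `sigmaC` and `sigmaGamma` variables defined above.

.. code-block:: Python

    # Optional sketch: prior versus posterior standard deviation of each parameter.
    posteriorSigma = thetaPosterior.getStandardDeviation()
    priorSigma = [sigmaR, sigmaC, sigmaGamma]
    for i, name in enumerate(["R", "C", "Gamma"]):
        print(
            "%5s : prior sd = %.3e, posterior sd = %.3e"
            % (name, priorSigma[i], posteriorSigma[i])
        )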
.. GENERATED FROM PYTHON SOURCE LINES 682-683

Let us analyze the validation graphics.

.. GENERATED FROM PYTHON SOURCE LINES 685-689

.. code-block:: Python

    graph = calibrationResult.drawObservationsVsInputs()
    graph.setLegendPosition("upper left")
    view = otv.View(graph)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_015.png
   :alt: plot calibration chaboche
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_015.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 690-692

We see that there is a good fit after calibration, since the predictions after calibration are close to the observations.

.. GENERATED FROM PYTHON SOURCE LINES 694-697

.. code-block:: Python

    graph = calibrationResult.drawObservationsVsPredictions()
    view = otv.View(graph)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_016.png
   :alt: plot calibration chaboche
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_016.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 698-700

We see that there is a much better fit after calibration, since the predictions are close to the diagonal of the graphics.

.. GENERATED FROM PYTHON SOURCE LINES 702-705

The observation error is a hypothesis of the Gaussian calibration:
this is the Gaussian distribution that we introduced into the model.

.. GENERATED FROM PYTHON SOURCE LINES 707-710

.. code-block:: Python

    observationError = calibrationResult.getObservationsError()
    print(observationError)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Normal(mu = 0, sigma = 1e+07)

.. GENERATED FROM PYTHON SOURCE LINES 711-719

.. code-block:: Python

    graph = calibrationResult.drawResiduals()
    view = otv.View(
        graph,
        figure_kw={"figsize": (7.0, 4.0)},
        legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"},
    )
    plt.subplots_adjust(right=0.6)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_017.png
   :alt: Residual analysis
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_017.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 720-724

The Gaussian distribution of the observation errors is close to the
distribution of the residuals after calibration.
Moreover, the distribution of the residuals is centered on zero.
This indicates that the calibration performed well.

.. GENERATED FROM PYTHON SOURCE LINES 726-729

The posterior distribution of the parameters allows one to see whether the
observations bring significant information compared to the prior Gaussian distributions.

.. GENERATED FROM PYTHON SOURCE LINES 731-739

.. code-block:: Python

    graph = calibrationResult.drawParameterDistributions()
    view = otv.View(
        graph,
        figure_kw={"figsize": (10.0, 4.0)},
        legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"},
    )
    plt.subplots_adjust(right=0.8)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_018.png
   :alt: plot calibration chaboche
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_018.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 740-750

We see that the posterior distribution of :math:`R` is spiky.
For the :math:`R` variable, the observations are much more important than
the prior: this is shown by the fact that the posterior and prior
distributions of the :math:`R` variable are very different.
We see that the prior and posterior distributions are close to each other
for the :math:`\gamma` variable: the observations did not convey much
information for this variable, which shows that this parameter is difficult
to identify.
One potential way to better estimate this parameter, if it is identifiable
at all, is to use a larger sample size (which is not always easy in practice).

.. GENERATED FROM PYTHON SOURCE LINES 752-753

We can check whether the residuals after calibration are normal.

.. GENERATED FROM PYTHON SOURCE LINES 755-758

.. code-block:: Python

    graph = calibrationResult.drawResidualsNormalPlot()
    view = otv.View(graph)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_019.png
   :alt: Residuals after calibration vs Gaussian hypothesis
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_019.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 759-760

Plot the PDF values of the distribution of the optimum parameters.

.. GENERATED FROM PYTHON SOURCE LINES 762-771

.. code-block:: Python

    grid = plotDistributionGridPDF(thetaPosterior)
    view = otv.View(
        grid,
        figure_kw={"figsize": (6.0, 6.0)},
        legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"},
    )
    plot_space = 0.5
    plt.subplots_adjust(wspace=plot_space, hspace=plot_space)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_020.png
   :alt: Iso-PDF values
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_020.png
   :class: sphx-glr-single-img
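Before moving on, it can be useful to compare the parameters obtained with the Gaussian linear calibration to the exact values used to generate the data. This is an optional summary (not part of the original script); it reuses the `printRCGamma` helper and the `thetaTrue` list defined earlier.

.. code-block:: Python

    # Optional sketch: compare the Gaussian linear MAP estimate with the exact values.
    print("theta MAP (Gaussian linear) = ")
    printRCGamma(thetaMAP)
    print("theta True = ")
    printRCGamma(thetaTrue)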
.. GENERATED FROM PYTHON SOURCE LINES 772-777

Gaussian nonlinear calibration
------------------------------
The :class:`~openturns.GaussianNonLinearCalibration` class performs the Gaussian nonlinear calibration.
This algorithm is also known as 3DVar.

.. GENERATED FROM PYTHON SOURCE LINES 779-783

.. code-block:: Python

    algo = ot.GaussianNonLinearCalibration(
        mycf, observedStrain, observedStress, thetaPrior, sigma, errorCovariance
    )

.. GENERATED FROM PYTHON SOURCE LINES 784-786

The :meth:`~openturns.GaussianNonLinearCalibration.run` method computes the solution of the problem.

.. GENERATED FROM PYTHON SOURCE LINES 788-791

.. code-block:: Python

    algo.run()
    calibrationResult = algo.getResult()

.. GENERATED FROM PYTHON SOURCE LINES 792-794

Analysis of the results
-----------------------

.. GENERATED FROM PYTHON SOURCE LINES 796-798

The :meth:`~openturns.CalibrationResult.getParameterMAP` method returns the maximum of the posterior distribution of :math:`\theta`.

.. GENERATED FROM PYTHON SOURCE LINES 800-803

.. code-block:: Python

    thetaMAP = calibrationResult.getParameterMAP()
    print(thetaMAP)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [7.48601e+08,2.68364e+09,7.75426]

.. GENERATED FROM PYTHON SOURCE LINES 804-805

We can compute a 95% credibility interval of the parameter :math:`\theta^\star`.

.. GENERATED FROM PYTHON SOURCE LINES 805-812

.. code-block:: Python

    thetaPosterior = calibrationResult.getParameterPosterior()
    interval = thetaPosterior.computeBilateralConfidenceIntervalWithMarginalProbability(
        0.95
    )[0]
    print("95% C.I.:")
    printRCGammaInterval(interval)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    95% C.I.:
       R in [730.3, 760.5]
       C in [2485.6, 3009.6]
       Gamma in [7.2810, 7.8985]

.. GENERATED FROM PYTHON SOURCE LINES 813-815

This credibility interval can be compared to the one that we previously
obtained with the linear Gaussian calibration method.

.. GENERATED FROM PYTHON SOURCE LINES 817-821

.. code-block:: Python

    graph = calibrationResult.drawObservationsVsInputs()
    graph.setLegendPosition("upper left")
    view = otv.View(graph)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_021.png
   :alt: plot calibration chaboche
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_021.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 822-824

We see that there is a good fit after calibration, since the predictions after calibration are close to the observations.

.. GENERATED FROM PYTHON SOURCE LINES 826-829

.. code-block:: Python

    graph = calibrationResult.drawObservationsVsPredictions()
    view = otv.View(graph)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_022.png
   :alt: plot calibration chaboche
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_022.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 830-832

We see that there is a much better fit after calibration, since the predictions are close to the diagonal of the graphics.

.. GENERATED FROM PYTHON SOURCE LINES 834-837

The method used to compute the distribution of the error and the posterior
distribution of the parameter depends on the bootstrap size.

.. GENERATED FROM PYTHON SOURCE LINES 837-839

.. code-block:: Python

    print("Bootstrap size : ", algo.getBootstrapSize())

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Bootstrap size : 100

.. GENERATED FROM PYTHON SOURCE LINES 840-841

With this default value, bootstrap is used to estimate the distribution.

.. GENERATED FROM PYTHON SOURCE LINES 843-845

By default, the observation error is predicted by bootstrapping the problem
at the posterior.

.. GENERATED FROM PYTHON SOURCE LINES 847-850

.. code-block:: Python

    observationError = calibrationResult.getObservationsError()
    print(observationError)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Normal(mu = -98784.3, sigma = 1e+07)
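The mean of this bootstrapped observation error is not exactly zero, but it is tiny compared with the magnitude of the observed stresses (of the order of :math:`10^8` Pa). The following optional check (not part of the original script) quantifies this.

.. code-block:: Python

    # Optional sketch: relative bias of the estimated observation error.
    relativeBias = abs(observationError.getMean()[0]) / observedStress.computeMean()[0]
    print("Relative bias of the observation error: %.2e" % relativeBias)

..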
GENERATED FROM PYTHON SOURCE LINES 851-852 This can be compared to the residuals distribution, which is computed at the posterior. .. GENERATED FROM PYTHON SOURCE LINES 854-862 .. code-block:: Python graph = calibrationResult.drawResiduals() view = otv.View( graph, figure_kw={"figsize": (7.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.6) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_023.png :alt: Residual analysis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_023.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 863-864 The analysis is very similar to the linear Gaussian calibration. .. GENERATED FROM PYTHON SOURCE LINES 866-874 .. code-block:: Python graph = calibrationResult.drawParameterDistributions() view = otv.View( graph, figure_kw={"figsize": (10.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.8) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_024.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_024.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 875-878 .. code-block:: Python graph = calibrationResult.drawResidualsNormalPlot() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_025.png :alt: Residuals after calibration vs Gaussian hypothesis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_025.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 879-880 Plot the PDF values of the distribution of the optimum parameters. .. GENERATED FROM PYTHON SOURCE LINES 882-891 .. code-block:: Python grid = plotDistributionGridPDF(thetaPosterior) view = otv.View( grid, figure_kw={"figsize": (6.0, 6.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plot_space = 0.5 plt.subplots_adjust(wspace=plot_space, hspace=plot_space) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_026.png :alt: Iso-PDF values :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_026.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 892-896 We see that the prior and posterior distribution for the :math:`\gamma` parameter are close to each other, but not superimposed: the observations significantly brought information to the variable :math:`\gamma` during the calibration. .. GENERATED FROM PYTHON SOURCE LINES 896-899 .. code-block:: Python otv.View.ShowAll() .. GENERATED FROM PYTHON SOURCE LINES 900-901 Reset default settings .. GENERATED FROM PYTHON SOURCE LINES 901-902 .. code-block:: Python ot.ResourceMap.Reload() .. rst-class:: sphx-glr-timing **Total running time of the script:** (0 minutes 9.799 seconds) .. _sphx_glr_download_auto_calibration_least_squares_and_gaussian_calibration_plot_calibration_chaboche.py: .. only:: html .. container:: sphx-glr-footer sphx-glr-footer-example .. container:: sphx-glr-download sphx-glr-download-jupyter :download:`Download Jupyter notebook: plot_calibration_chaboche.ipynb ` .. 
container:: sphx-glr-download sphx-glr-download-python :download:`Download Python source code: plot_calibration_chaboche.py ` .. container:: sphx-glr-download sphx-glr-download-zip :download:`Download zipped: plot_calibration_chaboche.zip `