.. DO NOT EDIT. .. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY. .. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE: .. "auto_calibration/least_squares_and_gaussian_calibration/plot_calibration_chaboche.py" .. LINE NUMBERS ARE GIVEN BELOW. .. only:: html .. note:: :class: sphx-glr-download-link-note :ref:`Go to the end ` to download the full example code .. rst-class:: sphx-glr-example-title .. _sphx_glr_auto_calibration_least_squares_and_gaussian_calibration_plot_calibration_chaboche.py: Calibration of the Chaboche mechanical model ============================================ In this example we present calibration methods on the Chaboche model. A detailed explanation of this mechanical law is presented :ref:`here `. As we are going to see, this model is relatively simple to calibrate: its parameters are identifiable and the output is relatively sensitive to the variation of the parameters. Hence, all methods perform correctly in this case. In this example, we use both least squares methods and Bayesian Gaussian methods. Please read :ref:`code_calibration` for more details on code calibration and least squares and read :ref:`gaussian_calibration` for more details on Gaussian calibration. Parameters to calibrate and observations ---------------------------------------- The vector of parameters to calibrate is: .. math:: \theta = (R,C,\gamma). We consider a data set where the number of observations is equal to: .. math:: n = 10. The observations are the pairs :math:`\{(\epsilon_i,\sigma_i)\}_{i=1,...,n}`, i.e. each observation is a couple made of the strain and the corresponding stress. In the particular situation where we want to calibrate this model, the following list presents which variables are observed input variables, input calibrated variables and observed output variables. - :math:`\epsilon`: Input. Observed. - :math:`R`, :math:`C`, :math:`\gamma` : Inputs. Calibrated. - :math:`\sigma`: Output. Observed. .. GENERATED FROM PYTHON SOURCE LINES 47-54 .. code-block:: Python import openturns as ot import openturns.viewer as otv from matplotlib import pylab as plt from openturns.usecases import chaboche_model ot.Log.Show(ot.Log.NONE) .. GENERATED FROM PYTHON SOURCE LINES 55-63 Define the observations ----------------------- In practice, we generally use a data set which has been obtained from measurements. This data set can be loaded using e.g. :meth:`~openturns.Sample.ImportFromCSVFile`. Here we import the data from the :class:`~openturns.usecases.chaboche_model.ChabocheModel` class. .. GENERATED FROM PYTHON SOURCE LINES 65-71 .. code-block:: Python cm = chaboche_model.ChabocheModel() print(cm.data) observedStrain = cm.data[:, 0] observedStress = cm.data[:, 1] nbobs = cm.data.getSize() .. rst-class:: sphx-glr-script-out .. code-block:: none [ Strain Stress (Pa) ] 0 : [ 0 7.56e+08 ] 1 : [ 0.0077 7.57e+08 ] 2 : [ 0.0155 7.85e+08 ] 3 : [ 0.0233 8.19e+08 ] 4 : [ 0.0311 8.01e+08 ] 5 : [ 0.0388 8.42e+08 ] 6 : [ 0.0466 8.49e+08 ] 7 : [ 0.0544 8.79e+08 ] 8 : [ 0.0622 8.85e+08 ] 9 : [ 0.07 8.96e+08 ] .. GENERATED FROM PYTHON SOURCE LINES 72-73 Print the Chaboche model .. GENERATED FROM PYTHON SOURCE LINES 75-78 .. code-block:: Python print("Inputs:", cm.model.getInputDescription()) print("Outputs:", cm.model.getOutputDescription()) .. rst-class:: sphx-glr-script-out .. code-block:: none Inputs: [Strain,R,C,Gamma] Outputs: [Sigma] .. GENERATED FROM PYTHON SOURCE LINES 79-85 We see that there are four inputs: `Strain`, `R`, `C` and `Gamma` and one output: `Sigma`. 
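As a quick illustration, the model can be evaluated at a single point made of a
strain value and one value for each of `R`, `C` and `Gamma`.
This is a minimal sketch reusing the `cm` object defined above; the numerical
values below are arbitrary trial values, not calibrated ones.

.. code-block:: Python

    # Evaluate the Chaboche model at one (Strain, R, C, Gamma) point.
    # The values of R, C and Gamma are illustrative trial values.
    trialPoint = [0.02, 700.0e6, 2500.0e6, 8.0]
    print("Sigma (Pa):", cm.model(trialPoint)[0])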
The `Strain` is observed on input and the stress `Sigma` is observed on output.
Using these observations, we want to estimate the parameters `R`, `C` and `Gamma`.

.. GENERATED FROM PYTHON SOURCE LINES 87-93

Set the calibration parameters
------------------------------

In this part, we begin the calibration study.
Define the reference values of the :math:`\theta` parameter.
In the Bayesian framework, this is called the mean of the *prior* Gaussian distribution.
In the data assimilation framework, this is called the *background*.

.. GENERATED FROM PYTHON SOURCE LINES 95-100

.. code-block:: Python

    R = 700e6  # Exact : 750e6
    C = 2500e6  # Exact : 2750e6
    Gamma = 8.0  # Exact : 10
    thetaPrior = [R, C, Gamma]

.. GENERATED FROM PYTHON SOURCE LINES 101-125

In the physical model, the inputs and parameters are ordered as
presented in the next table.
Notice that there are no parameters in the physical model.

+-------+----------------+
| Index | Input variable |
+=======+================+
| 0     | Strain         |
+-------+----------------+
| 1     | R              |
+-------+----------------+
| 2     | C              |
+-------+----------------+
| 3     | Gamma          |
+-------+----------------+

+-------+-----------+
| Index | Parameter |
+=======+===========+
| ∅     | ∅         |
+-------+-----------+

**Table 1.** Indices and names of the inputs and parameters of the physical model.

.. GENERATED FROM PYTHON SOURCE LINES 127-130

.. code-block:: Python

    print("Physical Model Inputs:", cm.model.getInputDescription())
    print("Physical Model Parameters:", cm.model.getParameterDescription())

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Physical Model Inputs: [Strain,R,C,Gamma]
    Physical Model Parameters: []

.. GENERATED FROM PYTHON SOURCE LINES 131-154

To perform calibration, we have to define a parametric model,
with observed inputs and parameters to calibrate.
To do this, we create a :class:`~openturns.ParametricFunction`
where the parameters are `R`, `C` and `Gamma`,
which have the indices 1, 2 and 3 in the physical model.

+-------+----------------+
| Index | Input variable |
+=======+================+
| 0     | Strain         |
+-------+----------------+

+-------+-----------+
| Index | Parameter |
+=======+===========+
| 0     | R         |
+-------+-----------+
| 1     | C         |
+-------+-----------+
| 2     | Gamma     |
+-------+-----------+

**Table 2.** Indices and names of the inputs and parameters of the parametric model.

.. GENERATED FROM PYTHON SOURCE LINES 156-159

The following statement creates the calibrated function from the model.
The calibrated parameters `R`, `C`, `Gamma` are at indices 1, 2, 3
in the input arguments of the model.

.. GENERATED FROM PYTHON SOURCE LINES 161-164

.. code-block:: Python

    calibratedIndices = [1, 2, 3]
    mycf = ot.ParametricFunction(cm.model, calibratedIndices, thetaPrior)

.. GENERATED FROM PYTHON SOURCE LINES 165-166

Then we plot the model and compare it to the observations.

.. GENERATED FROM PYTHON SOURCE LINES 168-184

.. code-block:: Python

    graph = ot.Graph("Model before calibration", "Strain", "Stress (MPa)", True)
    # Plot the model
    curve = mycf.draw(cm.strainMin, cm.strainMax, 50).getDrawable(0)
    curve.setLegend("Model before calibration")
    curve.setLineStyle(ot.ResourceMap.GetAsString("CalibrationResult-ObservationLineStyle"))
    graph.add(curve)
    # Plot the noisy observations
    cloud = ot.Cloud(observedStrain, observedStress)
    cloud.setLegend("Observations")
    cloud.setPointStyle(
        ot.ResourceMap.GetAsString("CalibrationResult-ObservationPointStyle")
    )
    graph.add(cloud)
    graph.setLegendPosition("upper left")
    view = otv.View(graph)

..
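Before looking at the figure, a quick numerical check can be done.
The sketch below (reusing the objects defined above) evaluates the parametric
function, which currently uses the prior parameter values, at the first
observed strain and compares the prediction with the corresponding observed
stress.

.. code-block:: Python

    # Prediction with the prior parameter values at the first observed strain
    predictedStress = mycf(observedStrain[0])[0]
    print("Predicted stress (Pa):", predictedStress)
    print("Observed stress (Pa) :", observedStress[0, 0])

The mismatch between these two values is what the calibration algorithms below
aim to reduce.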
image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_001.png :alt: Model before calibration :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_001.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 185-189 We see that the observations are relatively noisy, but that the trend is clear: this shows that it may be possible to fit the model. At this point, we have a data set that we can use for calibration and a model to calibrate. .. GENERATED FROM PYTHON SOURCE LINES 192-197 Calibration with linear least squares ------------------------------------- The :class:`~openturns.LinearLeastSquaresCalibration` class performs the linear least squares calibration by linearizing the model in the neighbourhood of the reference point. .. GENERATED FROM PYTHON SOURCE LINES 199-203 .. code-block:: Python algo = ot.LinearLeastSquaresCalibration( mycf, observedStrain, observedStress, thetaPrior, "SVD" ) .. GENERATED FROM PYTHON SOURCE LINES 204-206 The :meth:`~openturns.LinearLeastSquaresCalibration.run` method computes the solution of the problem. .. GENERATED FROM PYTHON SOURCE LINES 208-211 .. code-block:: Python algo.run() calibrationResult = algo.getResult() .. GENERATED FROM PYTHON SOURCE LINES 212-216 Analysis of the results ----------------------- The :meth:`~openturns.CalibrationResult.getParameterMAP` method returns the maximum of the posterior density of :math:`\theta`. .. GENERATED FROM PYTHON SOURCE LINES 216-228 .. code-block:: Python def printRCGamma(parameter, indentation=" "): """ Print the [R, C, Gamma] vector with readable units. """ print(indentation, "R = %.1f (MPa)" % (parameter[0] / 1.0e6)) print(indentation, "C = %.1f (MPa)" % (parameter[1] / 1.0e6)) print(indentation, "Gamma = %.4f" % (parameter[2])) return None .. GENERATED FROM PYTHON SOURCE LINES 229-238 .. code-block:: Python thetaMAP = calibrationResult.getParameterMAP() print("theta After = ") printRCGamma(thetaMAP) print("theta Before = ") printRCGamma(thetaPrior) print("theta True = ") thetaTrue = [cm.trueR, cm.trueC, cm.trueGamma] printRCGamma(thetaTrue) .. rst-class:: sphx-glr-script-out .. code-block:: none theta After = R = 751.2 (MPa) C = 2209.6 (MPa) Gamma = 0.7726 theta Before = R = 700.0 (MPa) C = 2500.0 (MPa) Gamma = 8.0000 theta True = R = 750.0 (MPa) C = 2750.0 (MPa) Gamma = 10.0000 .. GENERATED FROM PYTHON SOURCE LINES 239-240 We can compute a 95% confidence interval of the parameter :math:`\theta^\star`. .. GENERATED FROM PYTHON SOURCE LINES 240-260 .. code-block:: Python def printRCGammaInterval(interval, indentation=" "): """ Print the [R, C, Gamma] C.I. with readable units. """ lowerBound = interval.getLowerBound() upperBound = interval.getUpperBound() print( indentation, "R in [%.1f, %.1f]" % (lowerBound[0] / 1.0e6, upperBound[0] / 1.0e6), ) print( indentation, "C in [%.1f, %.1f]" % (lowerBound[1] / 1.0e6, upperBound[1] / 1.0e6), ) print(indentation, "Gamma in [%.4f, %.4f]" % (lowerBound[2], upperBound[2])) return None .. GENERATED FROM PYTHON SOURCE LINES 261-269 .. code-block:: Python thetaPosterior = calibrationResult.getParameterPosterior() interval = thetaPosterior.computeBilateralConfidenceIntervalWithMarginalProbability( 0.95 )[0] print("95% C.I.:") printRCGammaInterval(interval) .. rst-class:: sphx-glr-script-out .. code-block:: none 95% C.I.: R in [730.3, 772.1] C in [758.2, 3661.0] Gamma in [-1450.7629, 1452.3081] .. 
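Before interpreting these intervals, the improvement of the fit can be
quantified with a short sketch reusing the objects defined above; the root
mean squared residual used here is only one possible summary of the fit.

.. code-block:: Python

    # Root mean squared residual with the prior and with the calibrated parameters
    def computeRMSResidual(theta):
        """Return the RMS of (observed - predicted) stress for the parameters theta."""
        model = ot.ParametricFunction(cm.model, calibratedIndices, theta)
        squaredSum = 0.0
        for i in range(nbobs):
            squaredSum += (observedStress[i, 0] - model(observedStrain[i])[0]) ** 2
        return (squaredSum / nbobs) ** 0.5


    print("RMS residual before calibration: %.3e Pa" % computeRMSResidual(thetaPrior))
    print("RMS residual after calibration : %.3e Pa" % computeRMSResidual(thetaMAP))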
GENERATED FROM PYTHON SOURCE LINES 270-276 We can see that the :math:`\gamma` parameter has a large confidence interval relative to the true value of the parameter: even the sign of the parameter is unknown. The parameter which is calibrated with the smallest (relative) confidence interval is :math:`R`. This is why this parameter seems to be the most important in this case. .. GENERATED FROM PYTHON SOURCE LINES 278-280 Increase the default number of points in the plots. This produces smoother spiky distributions. .. GENERATED FROM PYTHON SOURCE LINES 282-284 .. code-block:: Python ot.ResourceMap.SetAsUnsignedInteger("Distribution-DefaultPointNumber", 300) .. GENERATED FROM PYTHON SOURCE LINES 285-287 We now plot the predicted output stress depending on the input strain before and after calibration. .. GENERATED FROM PYTHON SOURCE LINES 287-293 .. code-block:: Python # sphinx_gallery_thumbnail_number = 3 graph = calibrationResult.drawObservationsVsInputs() graph.setLegendPosition("upper left") view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_002.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_002.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 294-296 We see that there is a good fit after calibration, since the predictions after calibration are close to the observations. .. GENERATED FROM PYTHON SOURCE LINES 298-299 We can also plot the predicted stress depending on the observed stress. .. GENERATED FROM PYTHON SOURCE LINES 301-304 .. code-block:: Python graph = calibrationResult.drawObservationsVsPredictions() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_003.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_003.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 305-307 We see that there is a much better fit after calibration, since the predictions are close to the diagonal of the graphics. .. GENERATED FROM PYTHON SOURCE LINES 309-316 The :meth:`~openturns.CalibrationResult.getObservationsError` method returns the estimated distribution of the observation error. This is necessarily a Gaussian distribution, because this is the distribution that we assume when we use least squares. By hypothesis, the distribution has a zero mean (this is a property of linear least squares). The standard deviation is estimated from the residuals after calibration. .. GENERATED FROM PYTHON SOURCE LINES 318-321 .. code-block:: Python observationError = calibrationResult.getObservationsError() print(observationError) .. rst-class:: sphx-glr-script-out .. code-block:: none Normal(mu = 0, sigma = 1.14338e+07) .. GENERATED FROM PYTHON SOURCE LINES 322-325 In order to validate that the distribution of the residuals is Gaussian after calibration, we use the :meth:`~openturns.CalibrationResult.drawResiduals` method. .. GENERATED FROM PYTHON SOURCE LINES 327-335 .. code-block:: Python graph = calibrationResult.drawResiduals() view = otv.View( graph, figure_kw={"figsize": (7.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.6) .. 
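In addition to the graphical analysis shown just below, a quantitative
normality test could be applied to the residuals.
The following is a sketch: the residuals are recomputed from the calibrated
parameters, and the :class:`~openturns.NormalityTest` static methods are
assumed to behave as in recent OpenTURNS versions.

.. code-block:: Python

    # Build the sample of residuals at the calibrated parameters
    modelAtMAP = ot.ParametricFunction(cm.model, calibratedIndices, thetaMAP)
    residuals = ot.Sample(nbobs, 1)
    for i in range(nbobs):
        residuals[i, 0] = observedStress[i, 0] - modelAtMAP(observedStrain[i])[0]
    # Cramer-von Mises normality test at the default level
    testResult = ot.NormalityTest.CramerVonMisesNormal(residuals)
    print("Residuals compatible with a Gaussian distribution?", testResult.getBinaryQualityMeasure())

With only :math:`n = 10` observations such a test has limited power, so the
graphical checks below remain useful.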
image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_004.png :alt: Residual analysis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_004.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 336-345 The analysis of the distribution of the residuals after calibration shows that the distribution is centered on zero and is symmetric. This indicates that the calibration performed well. With the least squares method, the standard deviation of the observation noise is estimated from the data. On the plot, we see that the Gaussian with zero mean and estimated standard deviation is close to the distribution of the residuals after calibration. This shows that the distribution of the residuals is close to being Gaussian. .. GENERATED FROM PYTHON SOURCE LINES 347-352 This can also be seen on a Normal-plot. This is a QQ-plot applied to the normal distribution. We could use the :meth:`~openturns.VisualTest.DrawHenryLine` method to plot it, but :meth:`~openturns.CalibrationResult.drawResidualsNormalPlot` does it directly. .. GENERATED FROM PYTHON SOURCE LINES 354-357 .. code-block:: Python graph = calibrationResult.drawResidualsNormalPlot() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_005.png :alt: Residuals after calibration vs Gaussian hypothesis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_005.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 358-360 We see that the residuals fit to the Normal distribution, according to the normal plot. .. GENERATED FROM PYTHON SOURCE LINES 362-367 The parameters which best fit to the data may be sensitive to the random noise in the observed outputs. In order to see how this source of randomness changes the optimum parameter, we use :meth:`~openturns.CalibrationResult.drawParameterDistributions`. .. GENERATED FROM PYTHON SOURCE LINES 369-377 .. code-block:: Python graph = calibrationResult.drawParameterDistributions() view = otv.View( graph, figure_kw={"figsize": (10.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.8) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_006.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_006.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 378-384 We see that the :math:`R` and :math:`C` parameters are relatively accurately estimated. The parameter :math:`\gamma` is not estimated accurately, as can be be seen from the very large spread of the distribution of this parameter. This may reveal that there are identifiability issues with this parameter when we use a linearization of the model. .. GENERATED FROM PYTHON SOURCE LINES 387-423 .. code-block:: Python def plotDistributionGridPDF(distribution): """ Plot the marginal and bi-dimensional iso-PDF on a grid. Parameters ---------- distribution : ot.Distribution The distribution. Returns ------- grid : ot.GridLayout(dimension, dimension) The grid of plots. 
""" dimension = distribution.getDimension() grid = ot.GridLayout(dimension, dimension) for i in range(dimension): for j in range(dimension): if i == j: distributionI = distribution.getMarginal([i]) graph = distributionI.drawPDF() else: distributionJI = distribution.getMarginal([j, i]) graph = distributionJI.drawPDF() graph.setLegends([""]) graph.setTitle("") if i < dimension - 1: graph.setXTitle("") if j > 0: graph.setYTitle("") grid.setGraph(i, j, graph) grid.setTitle("Iso-PDF values") return grid .. GENERATED FROM PYTHON SOURCE LINES 424-425 Plot the PDF values of the distribution of the optimum parameters. .. GENERATED FROM PYTHON SOURCE LINES 425-434 .. code-block:: Python grid = plotDistributionGridPDF(thetaPosterior) view = otv.View( grid, figure_kw={"figsize": (6.0, 6.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plot_space = 0.5 plt.subplots_adjust(wspace=plot_space, hspace=plot_space) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_007.png :alt: Iso-PDF values :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_007.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 435-441 Since linear least squares calibration perform well, the study could stop here. In the next cell, we use other methods to see how this can change the results. We are going to see that the lack of identifiability of the :math:`\gamma` parameter can be regularized using Gausssian calibration methods. .. GENERATED FROM PYTHON SOURCE LINES 443-448 Calibration with nonlinear least squares ---------------------------------------- The :class:`~openturns.NonLinearLeastSquaresCalibration` class performs the non linear least squares calibration by minimizing the squared Euclidian norm between the predictions and the observations. .. GENERATED FROM PYTHON SOURCE LINES 450-454 .. code-block:: Python algo = ot.NonLinearLeastSquaresCalibration( mycf, observedStrain, observedStress, thetaPrior ) .. GENERATED FROM PYTHON SOURCE LINES 455-459 The optimization algorithm is automatically selected using the default solver which can build a :class:`~openturns.LeastSquaresProblem`. We can see which solved is used using :meth:`~openturns.NonLinearLeastSquaresCalibration.getOptimizationAlgorithm`. .. GENERATED FROM PYTHON SOURCE LINES 461-463 .. code-block:: Python print(algo.getOptimizationAlgorithm()) .. rst-class:: sphx-glr-script-out .. code-block:: none class=OptimizationAlgorithm implementation=class=Ceres algorithm=LEVENBERG_MARQUARDT class=OptimizationAlgorithmImplementation problem=class=OptimizationProblem implementation=class=LeastSquaresProblem residual function=class=Function name=Unnamed implementation=class=FunctionImplementation name=Unnamed description=[] evaluationImplementation=class=NoEvaluation name=Unnamed gradientImplementation=class=NoGradient name=Unnamed hessianImplementation=class=NoHessian name=Unnamed dimension=0 startingPoint=class=Point name=Unnamed dimension=0 values=[] maximumIterationNumber=100 maximumEvaluationNumber=1000 maximumAbsoluteError=1e-05 maximumRelativeError=1e-05 maximumResidualError=1e-05 maximumConstraintError=1e-05 .. GENERATED FROM PYTHON SOURCE LINES 464-466 The :meth:`~openturns.NonLinearLeastSquaresCalibration.run` method computes the solution of the problem. .. GENERATED FROM PYTHON SOURCE LINES 468-471 .. code-block:: Python algo.run() calibrationResult = algo.getResult() .. 
GENERATED FROM PYTHON SOURCE LINES 472-474 Analysis of the results ----------------------- .. GENERATED FROM PYTHON SOURCE LINES 476-478 The :meth:`~openturns.CalibrationResult.getParameterMAP` method returns the maximum of the posterior distribution of :math:`\theta`. .. GENERATED FROM PYTHON SOURCE LINES 480-483 .. code-block:: Python thetaMAP = calibrationResult.getParameterMAP() print(thetaMAP) .. rst-class:: sphx-glr-script-out .. code-block:: none [7.49701e+08,2.3567e+09,2.68804] .. GENERATED FROM PYTHON SOURCE LINES 484-485 We can compute a 95% confidence interval of the parameter :math:`\theta^\star`. .. GENERATED FROM PYTHON SOURCE LINES 487-494 .. code-block:: Python thetaPosterior = calibrationResult.getParameterPosterior() interval = thetaPosterior.computeBilateralConfidenceIntervalWithMarginalProbability( 0.95 )[0] print("95% C.I.:") printRCGammaInterval(interval) .. rst-class:: sphx-glr-script-out .. code-block:: none 95% C.I.: R in [717.0, 766.6] C in [986.3, 5795.6] Gamma in [-20.1227, 50.4506] .. GENERATED FROM PYTHON SOURCE LINES 495-503 We can see that :math:`R` and :math:`C` are accurately estimated and that :math:`\gamma` is estimated with a relatively large confidence interval. Notice, however, that the interval is much more narrow than the one with obtained using linear least squares. This is because the optimization algorithm that we used implicitly introduced some regularization that was absent from linear least squares. .. GENERATED FROM PYTHON SOURCE LINES 505-506 We now check the observations depending on the inputs. .. GENERATED FROM PYTHON SOURCE LINES 508-512 .. code-block:: Python graph = calibrationResult.drawObservationsVsInputs() graph.setLegendPosition("upper left") view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_008.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_008.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 513-515 We see that there is a good fit after calibration, since the predictions after calibration are close to the observations. .. GENERATED FROM PYTHON SOURCE LINES 517-520 .. code-block:: Python graph = calibrationResult.drawObservationsVsPredictions() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_009.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_009.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 521-523 We see that there is a much better fit after calibration, since the predictions are close to the diagonal of the graphics. .. GENERATED FROM PYTHON SOURCE LINES 525-526 We now focus on the distribution of the errors. .. GENERATED FROM PYTHON SOURCE LINES 528-531 .. code-block:: Python observationError = calibrationResult.getObservationsError() print(observationError) .. rst-class:: sphx-glr-script-out .. code-block:: none Normal(mu = -2002.54, sigma = 1.01979e+07) .. GENERATED FROM PYTHON SOURCE LINES 532-536 We see that the distribution is Gaussian (this is by hypothesis) with a mean relatively close to zero. Indeed, please consider that the stress has an order of magnitude close to :math:`10^8`. .. GENERATED FROM PYTHON SOURCE LINES 538-539 As with any other distribution, we can draw its PDF. .. 
GENERATED FROM PYTHON SOURCE LINES 539-542 .. code-block:: Python graph = observationError.drawPDF() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_010.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_010.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 543-551 .. code-block:: Python graph = calibrationResult.drawResiduals() view = otv.View( graph, figure_kw={"figsize": (7.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.6) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_011.png :alt: Residual analysis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_011.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 552-557 The analysis of the residuals shows that the distribution is centered on zero and symmetric. This indicates that the calibration performed well. Moreover, the distribution of the residuals is close to being Gaussian. This indicates that the observation error might be Gaussian. .. GENERATED FROM PYTHON SOURCE LINES 559-567 .. code-block:: Python graph = calibrationResult.drawParameterDistributions() view = otv.View( graph, figure_kw={"figsize": (10.0, 4.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plt.subplots_adjust(right=0.8) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_012.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_012.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 568-571 .. code-block:: Python graph = calibrationResult.drawResidualsNormalPlot() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_013.png :alt: Residuals after calibration vs Gaussian hypothesis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_013.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 572-573 Plot the PDF values of the distribution of the optimum parameters. .. GENERATED FROM PYTHON SOURCE LINES 575-584 .. code-block:: Python grid = plotDistributionGridPDF(thetaPosterior) view = otv.View( grid, figure_kw={"figsize": (6.0, 6.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plot_space = 0.5 plt.subplots_adjust(wspace=plot_space, hspace=plot_space) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_014.png :alt: Iso-PDF values :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_014.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 585-592 We see that the parameters are relatively well estimated, although the parameter :math:`\gamma` still has a significant dispersion (but this is significantly reduced compared to linear least squares). Since this parameter cannot be identified easily, it may be appropriate to consider Gaussian calibration, which adds some regularity to the problem that may solve identificability issues. .. 
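The dispersion mentioned above can be summarized numerically.
The sketch below reuses the `thetaPosterior` distribution computed for the
nonlinear least squares result.

.. code-block:: Python

    # Standard deviation of the estimated parameters (nonlinear least squares posterior)
    posteriorSigma = thetaPosterior.getStandardDeviation()
    print("sigma(R)     = %.3e (Pa)" % posteriorSigma[0])
    print("sigma(C)     = %.3e (Pa)" % posteriorSigma[1])
    print("sigma(Gamma) = %.3e" % posteriorSigma[2])

The relatively large value for :math:`\gamma` is consistent with the wide
credibility interval obtained above.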
GENERATED FROM PYTHON SOURCE LINES 594-598 Gaussian calibration parameters ------------------------------- In this part, we set the parameters of the Gaussian calibration. We first set the standard deviation of the observations errors. .. GENERATED FROM PYTHON SOURCE LINES 600-602 .. code-block:: Python sigmaStress = 1.0e7 # (Pa) .. GENERATED FROM PYTHON SOURCE LINES 603-606 Define the covariance matrix of the output Y of the model. Since the dimension of the output is equal to 1, this must be a 1-by-1 covariance matrix. .. GENERATED FROM PYTHON SOURCE LINES 608-611 .. code-block:: Python errorCovariance = ot.CovarianceMatrix(1) errorCovariance[0, 0] = sigmaStress**2 .. GENERATED FROM PYTHON SOURCE LINES 612-613 Define the covariance matrix of the parameters :math:`\theta` to calibrate. .. GENERATED FROM PYTHON SOURCE LINES 615-619 .. code-block:: Python sigmaR = 0.1 * R sigmaC = 0.1 * C sigmaGamma = 0.1 * Gamma .. GENERATED FROM PYTHON SOURCE LINES 620-622 Since there are 3 parameters, the prior covariance matrix is a 3-by-3 covariance matrix. .. GENERATED FROM PYTHON SOURCE LINES 624-630 .. code-block:: Python sigma = ot.CovarianceMatrix(3) sigma[0, 0] = sigmaR**2 sigma[1, 1] = sigmaC**2 sigma[2, 2] = sigmaGamma**2 print(sigma) .. rst-class:: sphx-glr-script-out .. code-block:: none [[ 4.9e+15 0 0 ] [ 0 6.25e+16 0 ] [ 0 0 0.64 ]] .. GENERATED FROM PYTHON SOURCE LINES 631-636 Gaussian linear calibration --------------------------- The :class:`~openturns.GaussianLinearCalibration` class performs the Gaussian linear calibration by linearizing the model in the neighbourhood of the prior. This is also known as the Kalman filter. .. GENERATED FROM PYTHON SOURCE LINES 638-642 .. code-block:: Python algo = ot.GaussianLinearCalibration( mycf, observedStrain, observedStress, thetaPrior, sigma, errorCovariance ) .. GENERATED FROM PYTHON SOURCE LINES 643-645 The :meth:`~openturns.GaussianLinearCalibration.run` method computes the solution of the problem. .. GENERATED FROM PYTHON SOURCE LINES 647-650 .. code-block:: Python algo.run() calibrationResult = algo.getResult() .. GENERATED FROM PYTHON SOURCE LINES 651-655 Analysis of the results ----------------------- The :meth:`~openturns.CalibrationResult.getParameterMAP` method returns the maximum of the posterior distribution of :math:`\theta`. .. GENERATED FROM PYTHON SOURCE LINES 657-660 .. code-block:: Python thetaMAP = calibrationResult.getParameterMAP() print(thetaMAP) .. rst-class:: sphx-glr-script-out .. code-block:: none [7.48807e+08,2.67622e+09,7.75012] .. GENERATED FROM PYTHON SOURCE LINES 661-666 We can compute a 95% credibility interval of the parameter :math:`\theta^\star` (when we consider Bayesian methods, confidence intervals are called credibility intervals). This interval reflects the interval that contains 95% of the posterior distribution. .. GENERATED FROM PYTHON SOURCE LINES 668-675 .. code-block:: Python thetaPosterior = calibrationResult.getParameterPosterior() interval = thetaPosterior.computeBilateralConfidenceIntervalWithMarginalProbability( 0.95 )[0] print("95% C.I.:") printRCGammaInterval(interval) .. rst-class:: sphx-glr-script-out .. code-block:: none 95% C.I.: R in [736.2, 761.5] C in [2312.0, 3040.5] Gamma in [5.9294, 9.5708] .. GENERATED FROM PYTHON SOURCE LINES 676-678 We can see that all three parameters are estimated with a relatively small confidence interval (including :math:`\gamma`). .. GENERATED FROM PYTHON SOURCE LINES 680-681 Let us analyze the validation graphics. .. 
GENERATED FROM PYTHON SOURCE LINES 683-687

.. code-block:: Python

    graph = calibrationResult.drawObservationsVsInputs()
    graph.setLegendPosition("upper left")
    view = otv.View(graph)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_015.png
   :alt: plot calibration chaboche
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_015.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 688-690

We see that there is a good fit after calibration, since the predictions after
calibration are close to the observations.

.. GENERATED FROM PYTHON SOURCE LINES 692-695

.. code-block:: Python

    graph = calibrationResult.drawObservationsVsPredictions()
    view = otv.View(graph)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_016.png
   :alt: plot calibration chaboche
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_016.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 696-698

We see that there is a much better fit after calibration, since the predictions
are close to the diagonal of the graphics.

.. GENERATED FROM PYTHON SOURCE LINES 700-703

The observation error is a hypothesis of the Gaussian calibration.
This is the Gaussian distribution that we introduced in the model.

.. GENERATED FROM PYTHON SOURCE LINES 705-708

.. code-block:: Python

    observationError = calibrationResult.getObservationsError()
    print(observationError)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Normal(mu = 0, sigma = 1e+07)

.. GENERATED FROM PYTHON SOURCE LINES 709-717

.. code-block:: Python

    graph = calibrationResult.drawResiduals()
    view = otv.View(
        graph,
        figure_kw={"figsize": (7.0, 4.0)},
        legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"},
    )
    plt.subplots_adjust(right=0.6)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_017.png
   :alt: Residual analysis
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_017.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 718-722

The assumed Gaussian distribution of the observation errors is close to the
distribution of the residuals after calibration.
Moreover, the distribution of the residuals is centered on zero.
This indicates that the calibration performed well.

.. GENERATED FROM PYTHON SOURCE LINES 724-727

The posterior distribution of the parameters allows one to see if the
observations bring significant information compared to the prior Gaussian
distributions.

.. GENERATED FROM PYTHON SOURCE LINES 729-737

.. code-block:: Python

    graph = calibrationResult.drawParameterDistributions()
    view = otv.View(
        graph,
        figure_kw={"figsize": (10.0, 4.0)},
        legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"},
    )
    plt.subplots_adjust(right=0.8)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_018.png
   :alt: plot calibration chaboche
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_018.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 738-748

We see that the posterior distribution of :math:`R` is spiky;
the sketch below compares the prior and posterior standard deviations.
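A minimal sketch of that comparison, reusing the current `calibrationResult`
of the Gaussian linear calibration:

.. code-block:: Python

    # Compare the prior and posterior standard deviations of each parameter
    priorSigma = calibrationResult.getParameterPrior().getStandardDeviation()
    posteriorSigma = calibrationResult.getParameterPosterior().getStandardDeviation()
    for i, name in enumerate(["R", "C", "Gamma"]):
        print(
            "%5s : prior sigma = %.3e, posterior sigma = %.3e"
            % (name, priorSigma[i], posteriorSigma[i])
        )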
For the :math:`R` variable, the observations are much more important than the prior: this is shown by the fact that the posterior and prior distribution of the :math:`R` variable are very different. We see that the prior and posterior distribution are close to each other for the :math:`\gamma` variable: the observations did not convey much information for this variable. This shows that this parameters is difficult to identify. One potential solution to estimate this parameter is, if it can be identified, to use a larger sample size (but this is not always easy in practice). .. GENERATED FROM PYTHON SOURCE LINES 750-751 We can check that if the residuals after calibration are normal. .. GENERATED FROM PYTHON SOURCE LINES 753-756 .. code-block:: Python graph = calibrationResult.drawResidualsNormalPlot() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_019.png :alt: Residuals after calibration vs Gaussian hypothesis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_019.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 757-758 Plot the PDF values of the distribution of the optimum parameters. .. GENERATED FROM PYTHON SOURCE LINES 760-769 .. code-block:: Python grid = plotDistributionGridPDF(thetaPosterior) view = otv.View( grid, figure_kw={"figsize": (6.0, 6.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plot_space = 0.5 plt.subplots_adjust(wspace=plot_space, hspace=plot_space) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_020.png :alt: Iso-PDF values :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_020.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 770-775 Gaussian nonlinear calibration ------------------------------ The :class:`~openturns.GaussianNonLinearCalibration` class performs the Gaussian nonlinear calibration. This algorithm is also known as 3DVar. .. GENERATED FROM PYTHON SOURCE LINES 777-781 .. code-block:: Python algo = ot.GaussianNonLinearCalibration( mycf, observedStrain, observedStress, thetaPrior, sigma, errorCovariance ) .. GENERATED FROM PYTHON SOURCE LINES 782-784 The :meth:`~openturns.GaussianNonLinearCalibration.run` method computes the solution of the problem. .. GENERATED FROM PYTHON SOURCE LINES 786-789 .. code-block:: Python algo.run() calibrationResult = algo.getResult() .. GENERATED FROM PYTHON SOURCE LINES 790-792 Analysis of the results ----------------------- .. GENERATED FROM PYTHON SOURCE LINES 794-796 The :meth:`~openturns.CalibrationResult.getParameterMAP` method returns the maximum of the posterior distribution of :math:`\theta`. .. GENERATED FROM PYTHON SOURCE LINES 798-801 .. code-block:: Python thetaMAP = calibrationResult.getParameterMAP() print(thetaMAP) .. rst-class:: sphx-glr-script-out .. code-block:: none [7.4895e+08,2.67065e+09,7.73735] .. GENERATED FROM PYTHON SOURCE LINES 802-803 We can compute a 95% credibility interval of the parameter :math:`\theta^\star`. .. GENERATED FROM PYTHON SOURCE LINES 803-810 .. code-block:: Python thetaPosterior = calibrationResult.getParameterPosterior() interval = thetaPosterior.computeBilateralConfidenceIntervalWithMarginalProbability( 0.95 )[0] print("95% C.I.:") printRCGammaInterval(interval) .. rst-class:: sphx-glr-script-out .. 
code-block:: none

    95% C.I.:
      R in [727.8, 759.5]
      C in [2446.9, 2991.9]
      Gamma in [7.2145, 7.8737]

.. GENERATED FROM PYTHON SOURCE LINES 811-813

This credibility interval can be compared to the one that we previously
obtained with the linear Gaussian calibration method.

.. GENERATED FROM PYTHON SOURCE LINES 815-819

.. code-block:: Python

    graph = calibrationResult.drawObservationsVsInputs()
    graph.setLegendPosition("upper left")
    view = otv.View(graph)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_021.png
   :alt: plot calibration chaboche
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_021.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 820-822

We see that there is a good fit after calibration, since the predictions after
calibration are close to the observations.

.. GENERATED FROM PYTHON SOURCE LINES 824-827

.. code-block:: Python

    graph = calibrationResult.drawObservationsVsPredictions()
    view = otv.View(graph)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_022.png
   :alt: plot calibration chaboche
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_022.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 828-830

We see that there is a much better fit after calibration, since the predictions
are close to the diagonal of the graphics.

.. GENERATED FROM PYTHON SOURCE LINES 832-835

The method used to compute the distribution of the error and the posterior
distribution of the parameters depends on the bootstrap size.

.. GENERATED FROM PYTHON SOURCE LINES 835-837

.. code-block:: Python

    print("Bootstrap size : ", algo.getBootstrapSize())

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Bootstrap size :  100

.. GENERATED FROM PYTHON SOURCE LINES 838-839

With the default value of this parameter, bootstrap is used to estimate these
distributions.

.. GENERATED FROM PYTHON SOURCE LINES 841-843

By default, the observation error is predicted by bootstrapping the problem
at the posterior (a sketch showing how the bootstrap size could be changed is
given below).

.. GENERATED FROM PYTHON SOURCE LINES 845-848

.. code-block:: Python

    observationError = calibrationResult.getObservationsError()
    print(observationError)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Normal(mu = -99660.4, sigma = 1e+07)

.. GENERATED FROM PYTHON SOURCE LINES 849-850

This can be compared to the residuals distribution, which is computed at the
posterior.

.. GENERATED FROM PYTHON SOURCE LINES 852-860

.. code-block:: Python

    graph = calibrationResult.drawResiduals()
    view = otv.View(
        graph,
        figure_kw={"figsize": (7.0, 4.0)},
        legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"},
    )
    plt.subplots_adjust(right=0.6)

.. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_023.png
   :alt: Residual analysis
   :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_023.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 861-862

The analysis is very similar to the linear Gaussian calibration.

.. GENERATED FROM PYTHON SOURCE LINES 864-872

.. code-block:: Python

    graph = calibrationResult.drawParameterDistributions()
    view = otv.View(
        graph,
        figure_kw={"figsize": (10.0, 4.0)},
        legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"},
    )
    plt.subplots_adjust(right=0.8)

..
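As announced above, the bootstrap size used by
:class:`~openturns.GaussianNonLinearCalibration` can be changed before running
the algorithm.
This is a sketch: `setBootstrapSize` is assumed to be the counterpart of the
`getBootstrapSize` method used above.

.. code-block:: Python

    # Sketch: use a smaller bootstrap size (faster, but less accurate distributions)
    otherAlgo = ot.GaussianNonLinearCalibration(
        mycf, observedStrain, observedStress, thetaPrior, sigma, errorCovariance
    )
    otherAlgo.setBootstrapSize(50)
    # otherAlgo.run() would then estimate the error and posterior distributions
    # from 50 bootstrap replications.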
image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_024.png :alt: plot calibration chaboche :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_024.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 873-876 .. code-block:: Python graph = calibrationResult.drawResidualsNormalPlot() view = otv.View(graph) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_025.png :alt: Residuals after calibration vs Gaussian hypothesis :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_025.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 877-878 Plot the PDF values of the distribution of the optimum parameters. .. GENERATED FROM PYTHON SOURCE LINES 880-889 .. code-block:: Python grid = plotDistributionGridPDF(thetaPosterior) view = otv.View( grid, figure_kw={"figsize": (6.0, 6.0)}, legend_kw={"bbox_to_anchor": (1.0, 1.0), "loc": "upper left"}, ) plot_space = 0.5 plt.subplots_adjust(wspace=plot_space, hspace=plot_space) .. image-sg:: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_026.png :alt: Iso-PDF values :srcset: /auto_calibration/least_squares_and_gaussian_calibration/images/sphx_glr_plot_calibration_chaboche_026.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 890-894 We see that the prior and posterior distribution for the :math:`\gamma` parameter are close to each other, but not superimposed: the observations significantly brought information to the variable :math:`\gamma` during the calibration. .. GENERATED FROM PYTHON SOURCE LINES 894-897 .. code-block:: Python otv.View.ShowAll() .. GENERATED FROM PYTHON SOURCE LINES 898-899 Reset default settings .. GENERATED FROM PYTHON SOURCE LINES 899-900 .. code-block:: Python ot.ResourceMap.Reload() .. rst-class:: sphx-glr-timing **Total running time of the script:** (0 minutes 6.875 seconds) .. _sphx_glr_download_auto_calibration_least_squares_and_gaussian_calibration_plot_calibration_chaboche.py: .. only:: html .. container:: sphx-glr-footer sphx-glr-footer-example .. container:: sphx-glr-download sphx-glr-download-jupyter :download:`Download Jupyter notebook: plot_calibration_chaboche.ipynb ` .. container:: sphx-glr-download sphx-glr-download-python :download:`Download Python source code: plot_calibration_chaboche.py `