MetaModelAlgorithm

class MetaModelAlgorithm(*args)

Base class for metamodel algorithms.

Parameters:
sampleX, sampleY : 2-d sequence of float

Input/output samples

distribution : Distribution, optional

Joint probability density function of the physical input vector.

Methods

BuildDistribution(inputSample)

Recover the distribution, with metamodel performance in mind.

getClassName()

Accessor to the object's name.

getDistribution()

Accessor to the joint probability density function of the physical input vector.

getInputSample()

Accessor to the input sample.

getName()

Accessor to the object's name.

getOutputSample()

Accessor to the output sample.

getWeights()

Return the weights of the input sample.

hasName()

Test if the object is named.

run()

Compute the response surfaces.

setDistribution(distribution)

Accessor to the joint probability density function of the physical input vector.

setName(name)

Accessor to the object's name.

__init__(*args)
static BuildDistribution(inputSample)

Recover the distribution, with metamodel performance in mind.

For each marginal, the best 1-d continuous parametric model is selected; if none is valid, a nonparametric model is used as a fallback.

The selection is done as follows:

  • We start with a list of all parametric models (all factories).

  • For each model, we estimate its parameters when feasible.

  • We then check whether the model is valid, i.e. whether its Kolmogorov p-value exceeds the threshold fixed by the MetaModelAlgorithm-PValueThreshold ResourceMap key (default value: 5%).

  • We sort all valid models and return the one with the optimal criterion.

For the last step, the criterion may be BIC, AIC or AICC. The criterion is specified through the MetaModelAlgorithm-ModelSelectionCriterion ResourceMap key (default value: BIC). If there is no valid candidate, a nonparametric model (KernelSmoothing or Histogram) is estimated. The MetaModelAlgorithm-NonParametricModel ResourceMap key allows selecting the preferred one (default value: Histogram).

Once each marginal is estimated, we use the Spearman independence test on each pair of components to decide whether an independent copula can be used. If independence is rejected, we rely on a NormalCopula.

Parameters:
sample : Sample

Input sample.

Returns:
distribution : Distribution

Input distribution.

getClassName()

Accessor to the object’s name.

Returns:
class_name : str

The object class name (object.__class__.__name__).

getDistribution()

Accessor to the joint probability density function of the physical input vector.

Returns:
distribution : Distribution

Joint probability density function of the physical input vector.

getInputSample()

Accessor to the input sample.

Returns:
inputSample : Sample

Input sample of a model evaluated separately.

getName()

Accessor to the object’s name.

Returns:
name : str

The name of the object.

getOutputSample()

Accessor to the output sample.

Returns:
outputSample : Sample

Output sample of a model evaluated separately.

getWeights()

Return the weights of the input sample.

Returns:
weights : sequence of float

The weights of the points in the input sample.

hasName()

Test if the object is named.

Returns:
hasName : bool

True if the name is not empty.

run()

Compute the response surfaces.

Notes

It computes the response surfaces and creates a MetaModelResult structure containing all the results.

setDistribution(distribution)

Accessor to the joint probability density function of the physical input vector.

Parameters:
distribution : Distribution

Joint probability density function of the physical input vector.

setName(name)

Accessor to the object’s name.

Parameters:
name : str

The name of the object.

Examples using the class

Build and validate a linear model

Getting started

Gaussian Process Regression vs KrigingAlgorithm

Metamodel of a field function

Viscous free fall: metamodel of a field function

Distribution of estimators in linear regression

Mixture of experts

Export a metamodel

Create a general linear model metamodel

Create a linear model

Perform stepwise regression

Gaussian Process Regression: multiple input dimensions

Gaussian Process Regression: quick-start

Gaussian Process-based active learning for reliability

Advanced Gaussian process regression

Gaussian Process Regression: choose an arbitrary trend

Gaussian Process Regression: choose a polynomial trend on the beam model

Gaussian Process Regression: cantilever beam model

Gaussian Process Regression: surrogate model with continuous and categorical variables

Gaussian Process Regression: choose a polynomial trend

Gaussian process fitter: configure the optimization solver

Gaussian Process Regression: use an isotropic covariance kernel

Gaussian process regression: draw the likelihood

Gaussian Process Regression: generate trajectories from the metamodel

Gaussian Process Regression: metamodel of the Branin-Hoo function

Example of multi output Gaussian Process Regression on the fire satellite model

Sequentially adding new points to a Gaussian Process metamodel

Gaussian Process Regression: propagate uncertainties

Polynomial chaos is sensitive to the degree

Fit a distribution from an input sample

Create a polynomial chaos metamodel by integration on the cantilever beam

Create a sparse chaos by integration

Conditional expectation of a polynomial chaos expansion

Polynomial chaos expansion cross-validation

Validate a polynomial chaos

Create a polynomial chaos for the Ishigami function: a quick start guide to polynomial chaos

Compute grouped indices for the Ishigami function

Compute Sobol' indices confidence intervals

Create a polynomial chaos metamodel from a data set

Advanced polynomial chaos construction

Create a full or sparse polynomial chaos expansion

Polynomial chaos exploitation

Compute leave-one-out error of a polynomial chaos expansion

EfficientGlobalOptimization examples

LOLA-Voronoi sequential design of experiment

Sobol' sensitivity indices from chaos

Use the ANCOVA indices

Example of sensitivity analyses on the wing weight model