Create an FCE for dependent inputs: transformation vs domination
In this example, we create a functional chaos expansion for the Ishigami function when the input distribution has dependent marginals.
Refer to Functional Chaos Expansion to learn more about functional chaos expansion.
We provide one input sample and one output sample of the Ishigami function. We build two meta models:
Meta model 1: we use an isoprobabilistic transformation that maps the input distribution to another one with independent marginals.
Meta model 2: we use the domination method: the basis of the projection space is not orthonormal to the input distribution.
Define the Ishigami model
from openturns.usecases import ishigami_function
import openturns as ot
import openturns.viewer as otv
We load the Ishigami model. The IshigamiModel data class contains the input
distribution of the random vector X = (X1, X2, X3)
in im.distribution.
im = ishigami_function.IshigamiModel()
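For reference, the Ishigami function has a well-known closed form with the usual constants a = 7 and b = 0.1. A minimal pure-Python sketch, independent of the use case class (this is the standard formula, not the im.model object itself):

```python
import math


def ishigami(x1, x2, x3, a=7.0, b=0.1):
    """Ishigami function with the usual constants a = 7, b = 0.1."""
    return math.sin(x1) + a * math.sin(x2) ** 2 + b * x3**4 * math.sin(x1)


# The three inputs are uniform on [-pi, pi]; the origin maps to 0:
print(ishigami(0.0, 0.0, 0.0))  # 0.0
```

The additive-plus-interaction structure visible in this formula is what makes the Ishigami function a classical benchmark for sensitivity analysis.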
input_names = im.distribution.getDescription()
We want to introduce some dependence between the components. That is why we have to change the
input distribution stored in the Ishigami model (which has independent components).
We use a copula that links X1 with X2 through a
ClaytonCopula. The last component X3
is independent of (X1, X2). The final 3d copula is a
BlockIndependentCopula copula.
We keep the initial marginal distributions.
copula = ot.BlockIndependentCopula([ot.ClaytonCopula(1.5), ot.IndependentCopula(1)])
input_dist = ot.JointDistribution(
[im.distribution.getMarginal(i) for i in range(im.dim)], copula
)
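The bivariate Clayton copula used above has a simple closed-form CDF. A small sketch of the standard formula, with theta = 1.5 matching the code above (this is for illustration, not the OpenTURNS implementation):

```python
def clayton_cdf(u, v, theta=1.5):
    """CDF of a bivariate Clayton copula: C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta)."""
    if u == 0.0 or v == 0.0:
        return 0.0
    return (u**-theta + v**-theta - 1.0) ** (-1.0 / theta)


# Copula property: uniform margins, so C(u, 1) = u
print(clayton_cdf(0.5, 1.0))  # approximately 0.5
```

Larger values of theta produce stronger positive dependence between the two linked components.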
We generate an input sample from the input distribution and we compute the output sample from the Ishigami function that is contained in im.model.
sampleSize = 1000
inputTrain = im.distribution.getSample(sampleSize)
outputTrain = im.model(inputTrain)
We display the relationships between the outputs and the inputs.
grid = ot.VisualTest.DrawPairsXY(inputTrain, outputTrain)
view = otv.View(grid, figure_kw={"figsize": (12.0, 4.0)})
We draw the histogram of the output values.
graph = ot.HistogramFactory().build(outputTrain).drawPDF()
graph.setTitle("Ishigami outputs")
graph.setXTitle("y")
view = otv.View(graph)
Meta model 1: Transformation method
As the input distribution has dependent marginals, we can use an isoprobabilistic transformation that maps the input distribution to another one with independent marginals. This new distribution is not specified, therefore the default adaptive strategy and the default projection strategy are used.
The default adaptive strategy is defined as follows:

- a basis is built as the tensorization of the univariate polynomial families orthonormal to the standard representatives of the input marginal distributions. This basis is therefore orthonormal to the distribution defined as the tensorization of these standard representatives;
- the enumerate function is chosen according to the FunctionalChaosAlgorithm-QNorm entry of the ResourceMap: if this parameter is equal to 1, then the LinearEnumerateFunction class is used; otherwise, the HyperbolicAnisotropicEnumerateFunction class is used. The default value of the key being 0.5, the HyperbolicAnisotropicEnumerateFunction is used;
- the first elements of the basis are used to build the approximation space. The number of elements is computed from the maximum total degree (using the enumerate function of the basis) specified by the FunctionalChaosAlgorithm-MaximumTotalDegree entry of the ResourceMap. The default value being 10, the polynomial approximation space is generated by the basis polynomials of total degree at most 10.
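To illustrate the effect of the hyperbolic anisotropic enumeration: the q-norm of a multi-index alpha is (sum_i alpha_i^q)^(1/q), and with q < 1 it penalizes interaction terms relative to pure terms of the same total degree. A small sketch of this ranking criterion (not the OpenTURNS enumeration code):

```python
def q_norm(alpha, q=0.5):
    """q-(quasi)norm of a multi-index, used to rank candidate basis terms."""
    return sum(a**q for a in alpha) ** (1.0 / q)


# With q = 0.5, the interaction multi-index (1, 1, 0) gets a larger norm
# (hence is enumerated later) than the pure multi-index (2, 0, 0),
# although both have total degree 2:
print(q_norm((1, 1, 0)))  # 4.0
print(q_norm((2, 0, 0)))  # approximately 2.0
```

With q = 1 the q-norm reduces to the total degree, which recovers the linear enumeration.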
The FunctionalChaosAlgorithm class uses an isoprobabilistic
transformation T that maps the input random vector X into a new random
vector Z = T(X). In this new Z-space, the basis built by the adaptive strategy is
orthonormal to the distribution of the new random vector Z.
The meta model is built for the transformed Ishigami model, that is, the composition
of the Ishigami function with the inverse transformation T^(-1).
The projection strategy is not specified either: we use the least-squares strategy with no model selection
if the key FunctionalChaosAlgorithm-Sparse of the ResourceMap is False, and
with model selection otherwise, using the selection algorithm specified by the key
FunctionalChaosAlgorithm-FittingAlgorithm. Considering the default values of the keys, we use a
least-squares strategy with no model selection.
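The least-squares strategy amounts to a linear regression of the output on the basis functions evaluated at the input sample. A minimal sketch with a toy monomial basis and synthetic data (hypothetical example, not the OpenTURNS solver):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: y = 2 + 3 x - x^2, observed on a random sample
x = rng.uniform(-1.0, 1.0, size=100)
y = 2.0 + 3.0 * x - x**2

# Design matrix: each column is one basis function evaluated at the sample
design = np.column_stack([np.ones_like(x), x, x**2])

# Least-squares projection of y onto the span of the basis
coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)
print(np.round(coeffs, 6))  # recovers [2, 3, -1]
```

When the model lies in the span of the basis, as here, the projection recovers the exact coefficients; otherwise it returns the best L2 approximation on the sample.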
Then, the meta model built in the Z-space is finally composed with the
isoprobabilistic transformation to get the meta model of the Ishigami function in the initial
X-space.
Note that the final meta model is expanded on a basis whose elements are not polynomials, due to the action of the isoprobabilistic transformation, which is not affine.
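The idea behind the transformation can be sketched in the simple case of an independent marginal: compose the marginal CDF with a standard quantile function, z = G^(-1)(F(x)). A minimal sketch, assuming a Uniform(-pi, pi) marginal mapped to the standard normal; with dependent components, the actual transformation uses conditional CDFs (e.g. the Rosenblatt transformation):

```python
import math
from statistics import NormalDist


def uniform_cdf(x, a=-math.pi, b=math.pi):
    """CDF of a Uniform(a, b) marginal."""
    return min(max((x - a) / (b - a), 0.0), 1.0)


def to_standard_normal(x):
    """Map x through its marginal CDF, then through the standard normal quantile."""
    return NormalDist().inv_cdf(uniform_cdf(x))


# The median of the uniform marginal maps to the median of the normal:
print(to_standard_normal(0.0))  # 0.0
```

Because the composition G^(-1) o F is nonlinear, polynomials in the Z-space are pulled back to non-polynomial functions in the X-space, which is the point made above.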
We create the functional chaos algorithm.
chaos_algo = ot.FunctionalChaosAlgorithm(inputTrain, outputTrain, input_dist)
chaos_algo.setUseDomination(False)
chaos_algo.run()
We get the result and the resulting meta model.
chaos_result = chaos_algo.getResult()
metamodel = chaos_result.getMetaModel()
In order to validate the meta model, we generate a test sample.
n_valid = 1000
inputTest = input_dist.getSample(n_valid)
outputTest = im.model(inputTest)
metamodel_predictions = metamodel(inputTest)
We draw the validation graph and we get the score: the meta model is of poor quality.
val = ot.MetaModelValidation(outputTest, metamodel_predictions)
r2Score = val.computeR2Score()[0]
print(f"r2Score with Transformation method = {r2Score:.6f}")
graph = val.drawValidation()
graph.setTitle(f"R2={r2Score * 100:.2f}%, use domination = false")
view = otv.View(graph)
r2Score with Transformation method = 0.647130
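For reference, the R2 score reported by computeR2Score() is the classical coefficient of determination. A minimal sketch of the standard formula (not the OpenTURNS implementation):

```python
def r2_score(y_true, y_pred):
    """R2 = 1 - SS_res / SS_tot, the classical coefficient of determination."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((yt - yp) ** 2 for yt, yp in zip(y_true, y_pred))
    ss_tot = sum((yt - mean) ** 2 for yt in y_true)
    return 1.0 - ss_res / ss_tot


# A perfect meta model scores 1; the score decreases as predictions degrade:
print(r2_score([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 1.0
```

A score of about 0.65, as obtained here, means the meta model leaves roughly a third of the output variance unexplained on the test sample.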
Meta model 2: Domination method
Now, we want to use the domination method, which means that the basis created by the
adaptive strategy is used directly to project the model. This basis is not orthonormal to
the input distribution.
We use the setUseDomination() method.
We implement the same steps as before, until the validation graph.
chaos_algo.setUseDomination(True)
chaos_algo.run()
chaos_result = chaos_algo.getResult()
metamodel_dom = chaos_result.getMetaModel()
metamodel_dom_predictions = metamodel_dom(inputTest)
val = ot.MetaModelValidation(outputTest, metamodel_dom_predictions)
r2Score = val.computeR2Score()[0]
print(f"r2Score with Domination method = {r2Score:.6f}")
graph = val.drawValidation()
graph.setTitle(f"R2={r2Score * 100:.2f}%, use domination = true")
view = otv.View(graph)
r2Score with Domination method = 0.989526
We can see that the meta model obtained with the domination method is much better than
the meta model obtained with the Transformation method, even though
the multivariate basis of the approximation space is not orthonormal to the input distribution.
This can be explained by the fact that the multivariate tensorized basis used by the
domination method is able to
capture the tensorized structure of the Ishigami model. This was not the case for the
Transformation method, which uses a basis in the Z-space that is not tensorized,
due to the action of the isoprobabilistic transformation.
Display all figures
otv.View.ShowAll()