Advanced polynomial chaos construction¶
In this example we present the advanced elements involved in the construction of a polynomial chaos algorithm:
construction of the multivariate orthonormal basis,
truncation strategy of the multivariate orthonormal basis,
evaluation strategy of the approximation coefficients.
In this example, we consider the following function:
\[ g(x_1, x_2, x_3, x_4) = 1 + x_1 x_2 + 2 x_3^2 + x_4^4 \]
for any \(x = (x_1, x_2, x_3, x_4) \in \mathbb{R}^4\).
We assume that the inputs have normal, uniform, gamma and beta distributions:
\[ X_1 \sim \mathcal{N}(0, 1), \quad X_2 \sim \mathcal{U}(-1, 1), \quad X_3 \sim \Gamma(2.75, 1), \quad X_4 \sim \mathrm{Beta}(2.5, 1, -1, 2), \]
and \(X_1\), \(X_2\), \(X_3\) and \(X_4\) are independent.
Define the model and the input distribution¶
[1]:
from __future__ import print_function
import openturns as ot
[2]:
model = ot.SymbolicFunction(['x1', 'x2', 'x3', 'x4'], ['1+x1*x2 + 2*x3^2+x4^4'])
Create a distribution of dimension 4.
[3]:
distribution = ot.ComposedDistribution([ot.Normal(), ot.Uniform(), ot.Gamma(2.75, 1.0), ot.Beta(2.5, 1.0, -1.0, 2.0)])
[4]:
inputDimension = distribution.getDimension()
inputDimension
[4]:
4
STEP 1: Construction of the multivariate orthonormal basis¶
Create the univariate polynomial family collection which gathers the polynomial families for each direction.
[5]:
polyColl = ot.PolynomialFamilyCollection(inputDimension)
We could use the Krawtchouk and Charlier families (for discrete distributions).
[6]:
polyColl[0] = ot.KrawtchoukFactory()
polyColl[1] = ot.CharlierFactory()
We could also use the automatic selection of the polynomial family which corresponds to the distribution: this is done with the StandardDistributionPolynomialFactory class.
[7]:
for i in range(inputDimension):
marginal = distribution.getMarginal(i)
polyColl[i] = ot.StandardDistributionPolynomialFactory(marginal)
In our case, we select the specific polynomial factory adapted to each marginal distribution.
[8]:
polyColl[0] = ot.HermiteFactory()
polyColl[1] = ot.LegendreFactory()
polyColl[2] = ot.LaguerreFactory(2.75)
# Parameter for the Jacobi factory: 'Probability' encoded with 1
polyColl[3] = ot.JacobiFactory(2.5, 3.5, 1)
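For illustration, a minimal sketch of what such a factory produces: the first orthonormal Hermite polynomials associated with the standard normal marginal.
[ ]:
# Sketch: print the first three orthonormal Hermite polynomials
# associated with the standard normal marginal.
for degree in range(3):
    print(ot.HermiteFactory().build(degree))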
Create the enumeration function.
The first possibility is to use the LinearEnumerateFunction.
[9]:
enumerateFunction = ot.LinearEnumerateFunction(inputDimension)
Another possibility is to use the HyperbolicAnisotropicEnumerateFunction, which gives less weight to interactions.
[10]:
q = 0.4
enumerateFunction_1 = ot.HyperbolicAnisotropicEnumerateFunction(inputDimension, q)
Create the multivariate orthonormal basis, which is the Cartesian product of the univariate bases.
[11]:
multivariateBasis = ot.OrthogonalProductPolynomialFactory(polyColl, enumerateFunction)
Ask how many polynomials have a total degree equal to k=5.
[12]:
k = 5
enumerateFunction.getStrataCardinal(k)
[12]:
56
Ask how many polynomials have a total degree lower than or equal to k=5.
[13]:
enumerateFunction.getStrataCumulatedCardinal(k)
[13]:
126
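For comparison, the hyperbolic enumeration function defined above typically keeps far fewer terms up to the same stratum; the following sketch prints its cumulated cardinal for the same k.
[ ]:
# Sketch: cumulated cardinal of the hyperbolic enumeration (q = 0.4)
# up to stratum k, to compare with the 126 terms of the linear enumeration.
enumerateFunction_1.getStrataCumulatedCardinal(k)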
Give the multi-index of the k-th term of the multivariate basis. To calculate its total degree, sum the components of the multi-index.
[14]:
k = 5
enumerateFunction(k)
[14]:
[2,0,0,0]
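As a sketch of how the enumeration works, we can list the first multi-indices together with their total degree (the sum of the components):
[ ]:
# Sketch: the first 10 multi-indices of the linear enumeration and their total degree.
for j in range(10):
    multiIndex = enumerateFunction(j)
    print(j, multiIndex, sum(multiIndex))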
Build a term of the basis as a Function. In general, we do not need to construct any term manually: all terms are constructed automatically by the basis construction strategy.
[15]:
i = 5
Psi_i = multivariateBasis.build(i)
Psi_i
[15]:
-0.707107 + 0.707107 * x0^2
Get the measure mu associated with the multivariate basis.
[16]:
distributionMu = multivariateBasis.getMeasure()
distributionMu
[16]:
ComposedDistribution(Normal(mu = 0, sigma = 1), Uniform(a = -1, b = 1), Gamma(k = 3.75, lambda = 1, gamma = 0), Beta(alpha = 2.5, beta = 1, a = -1, b = 1), IndependentCopula(dimension = 4))
STEP 2: Truncation strategy of the multivariate orthonormal basis¶
FixedStrategy: all the polynomials of total degree lower than or equal to 2, which corresponds to the first 15 terms.
[17]:
p = 15
truncatureBasisStrategy = ot.FixedStrategy(multivariateBasis, p)
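As a quick check (assuming the linear enumeration defined above), p = 15 is indeed the number of multi-indices of total degree lower than or equal to 2:
[ ]:
# Sketch: number of polynomials of total degree <= 2 in dimension 4.
enumerateFunction.getStrataCumulatedCardinal(2)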
SequentialStrategy: among the first maximumCardinalBasis = 100 polynomials of the multivariate basis, keep those verifying the convergence criterion.
[18]:
maximumCardinalBasis = 100
truncatureBasisStrategy_1 = ot.SequentialStrategy(
multivariateBasis, maximumCardinalBasis)
CleaningStrategy: among the first maximumConsideredTerms = 500 polynomials, keep those which have the mostSignificant = 50 most significant contributions, with a significance criterion significanceFactor equal to 1.0e-4. The last boolean (True) indicates whether we are interested in the online monitoring of the current basis update (removed or added coefficients).
[19]:
maximumConsideredTerms = 500
mostSignificant = 50
significanceFactor = 1.0e-4
truncatureBasisStrategy_2 = ot.CleaningStrategy(
multivariateBasis, maximumConsideredTerms, mostSignificant, significanceFactor, True)
STEP 3: Evaluation strategy of the approximation coefficients¶
The technique illustrated here is the least squares technique, where the points come from a design of experiments. Here, the design is generated by Monte Carlo sampling.
[20]:
sampleSize = 100
evaluationCoeffStrategy = ot.LeastSquaresStrategy(
ot.MonteCarloExperiment(sampleSize))
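The strategy generates its input design internally; as an illustrative sketch only, a Monte Carlo design of the same size can be previewed by sampling the input distribution directly.
[ ]:
# Sketch: preview a Monte Carlo sample of the same size as the design
# the strategy will generate internally.
inputSample = distribution.getSample(sampleSize)
inputSample.getSize()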
You can specify the approximation algorithm. This is the algorithm that generates a sequence of sparse bases using Least Angle Regression (LARS).
[21]:
basisSequenceFactory = ot.LARS()
This algorithm estimates the empirical error on each sub-basis using a corrected leave-one-out strategy.
[22]:
fittingAlgorithm = ot.CorrectedLeaveOneOut()
# Finally, the metamodel selection algorithm embedded in the LeastSquaresStrategy
approximationAlgorithm = ot.LeastSquaresMetaModelSelectionFactory(
basisSequenceFactory, fittingAlgorithm)
evaluationCoeffStrategy_2 = ot.LeastSquaresStrategy(
ot.MonteCarloExperiment(sampleSize), approximationAlgorithm)
Another possibility is to compute the coefficients by integration, here with a Gauss product design of experiments.
[23]:
marginalDegrees = [2] * inputDimension
evaluationCoeffStrategy_3 = ot.IntegrationStrategy(
ot.GaussProductExperiment(distributionMu, marginalDegrees))
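As a sketch, the size of the Gauss product design can be checked by generating it: with 2 nodes per marginal in dimension 4 it contains 2^4 = 16 quadrature nodes.
[ ]:
# Sketch: generate the Gauss product design and count its nodes (2^4 = 16).
ot.GaussProductExperiment(distributionMu, marginalDegrees).generate().getSize()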
STEP 4: Creation of the Functional Chaos Algorithm¶
The FunctionalChaosAlgorithm class combines:
the model: model,
the distribution of the input random vector: distribution,
the truncation strategy of the multivariate basis,
and the evaluation strategy of the coefficients.
[24]:
polynomialChaosAlgorithm = ot.FunctionalChaosAlgorithm(
model, distribution, truncatureBasisStrategy, evaluationCoeffStrategy)
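A minimal sketch of the remaining steps, assuming the strategies defined above: run the algorithm, then retrieve the metamodel and its coefficients from the result.
[ ]:
# Sketch: run the functional chaos algorithm and inspect its result.
polynomialChaosAlgorithm.run()
result = polynomialChaosAlgorithm.getResult()
metamodel = result.getMetaModel()
result.getCoefficients()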