Polynomial chaos exploitation

In this example we are going to create a global approximation of a model response using functional chaos and expose the associated results:

  • the composed model h: \underline{Z} \longrightarrow \underline{Y} = g \circ T^{-1}(\underline{Z}), which is the model of the reduced variables \underline{Z}. We have h = \sum_{k \in \mathbb{N}} \underline{\alpha}_k \Psi_k,

  • the coefficients of the polynomial approximation: (\underline{\alpha}_k)_{k \in K},

  • the composed meta model \hat{h}, which is the model of the reduced variables restricted to the truncated multivariate basis (\Psi_k)_{k \in K}. We have \hat{h} = \sum_{k \in K} \underline{\alpha}_k \Psi_k,

  • the meta model \hat{g}: \underline{X} \longrightarrow \underline{Y} = \hat{h} \circ T(\underline{X}), which is the polynomial chaos approximation as a Function. We have \hat{g} = \sum_{k \in K} \underline{\alpha}_k \Psi_k \circ T,

  • the truncated multivariate basis (\Psi_k)_{k \in K},

  • the indices K,

  • the composition of each polynomial \Psi_k of the truncated multivariate basis,

  • the distribution \mu of the transformed variables \underline{Z}.

from __future__ import print_function
import openturns as ot
import openturns.viewer as viewer
from matplotlib import pylab as plt
ot.Log.Show(ot.Log.NONE)

Prepare some X/Y data

ot.RandomGenerator.SetSeed(0)
dimension = 2
input_names = ['x1', 'x2']
formulas = ['cos(x1 + x2)', '(x2 + 1) * exp(x1 - 2 * x2)']
model = ot.SymbolicFunction(input_names, formulas)
distribution = ot.Normal(dimension)
x = distribution.getSample(30)
y = model(x)

Create a functional chaos algorithm

algo = ot.FunctionalChaosAlgorithm(x, y)
algo.run()

Stream out the result

result = algo.getResult()

Get the polynomial chaos coefficients. First, all the coefficients:

result.getCoefficients()
        v0           v1
0    0.3406373   -0.7628467
1    0            1.31123
2   -0.2314675   -2.844915
3    0            1.79254
4   -0.4095754    0
5    0           -1.019301
6    0           -1.233743
7    0           -1.014213
8    0           -0.7458156


The coefficients of the i-th term of the basis (one value per output marginal)

i = 1
result.getCoefficients()[i]

[0,1.31123]
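
The Sample returned by getCoefficients() has one row per basis term and one column per output marginal, so the coefficients of a single output can also be extracted column-wise. This is an extra illustration, not part of the original script, using the usual Sample.getMarginal accessor:

# coefficients attached to output marginal j only (extra step, not in the original script)
j = 1
print(result.getCoefficients().getMarginal(j))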



Get the indices of the selected polynomials: K

subsetK = result.getIndices()
subsetK

[0,2,4,6,7,9,12,13,20]



Get the composition of the polynomials of the truncated multivariate basis

for i in range(subsetK.getSize()):
    print("Polynomial number ", i, " in truncated basis <-> polynomial number ",
          subsetK[i], " = ", ot.LinearEnumerateFunction(dimension)(subsetK[i]), " in complete basis")

Out:

Polynomial number  0  in truncated basis <-> polynomial number  0  =  [0,0]  in complete basis
Polynomial number  1  in truncated basis <-> polynomial number  2  =  [0,1]  in complete basis
Polynomial number  2  in truncated basis <-> polynomial number  4  =  [1,1]  in complete basis
Polynomial number  3  in truncated basis <-> polynomial number  6  =  [3,0]  in complete basis
Polynomial number  4  in truncated basis <-> polynomial number  7  =  [2,1]  in complete basis
Polynomial number  5  in truncated basis <-> polynomial number  9  =  [0,3]  in complete basis
Polynomial number  6  in truncated basis <-> polynomial number  12  =  [2,2]  in complete basis
Polynomial number  7  in truncated basis <-> polynomial number  13  =  [1,3]  in complete basis
Polynomial number  8  in truncated basis <-> polynomial number  20  =  [0,5]  in complete basis
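
As a complement (not in the original script), each retained multi-index can be printed next to its coefficient vector, assuming the default linear enumeration used above:

coefficients = result.getCoefficients()
enum_function = ot.LinearEnumerateFunction(dimension)
for i in range(subsetK.getSize()):
    # multi-index of the i-th retained polynomial and its coefficients (one per output)
    print(enum_function(subsetK[i]), coefficients[i])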

Get the truncated multivariate basis as a collection of Function objects

reduced = result.getReducedBasis()
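
Each element of this collection behaves like a Function and can be evaluated directly. A quick check added here (not in the original script), assuming the collection exposes getSize() and indexing:

print(reduced.getSize())
# evaluate the first polynomial of the reduced basis at a point of the reduced space
print(reduced[0]([0.5, 0.5]))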

Get the orthogonal basis

orthgBasis = result.getOrthogonalBasis()
orthgBasis

class=OrthogonalProductPolynomialFactory univariate polynomial collection=[class=OrthogonalUniVariatePolynomialFamily implementation=class=StandardDistributionPolynomialFactory hasSpecificFamily=false orthonormalization algorithm=class=OrthonormalizationAlgorithm implementation=class=AdaptiveStieltjesAlgorithm measure=class=VonMises name=VonMises dimension=1 mu=0.0630857 kappa=1.61753 monicRecurrenceCoefficients=[class=Point name=Unnamed dimension=3 values=[1,-0.0630857,0],class=Point name=Unnamed dimension=3 values=[1,-0.0630857,-0.999574],class=Point name=Unnamed dimension=3 values=[1,-0.0630857,-2.56788],class=Point name=Unnamed dimension=3 values=[1,-0.0630857,-2.93234],class=Point name=Unnamed dimension=3 values=[1,-0.0630857,-2.54128],class=Point name=Unnamed dimension=3 values=[1,-0.0630857,-2.46372],class=Point name=Unnamed dimension=3 values=[1,-0.0630857,-2.48756],class=Point name=Unnamed dimension=3 values=[1,-0.0630857,-2.4853],class=Point name=Unnamed dimension=3 values=[1,-0.0630857,-2.4789]] monicSquaredNorms=class=Point name=Unnamed dimension=9 values=[1,0.999574,2.56678,7.52666,19.1273,47.1244,117.225,291.338,722.198] isElliptical=false,class=OrthogonalUniVariatePolynomialFamily implementation=class=StandardDistributionPolynomialFactory hasSpecificFamily=true specificFamily=class=OrthogonalUniVariatePolynomialFamily implementation=class=LegendreFactory measure=class=Uniform name=Uniform dimension=1 a=-1 b=1] measure=class=ComposedDistribution name=ComposedDistribution dimension=2 copula=class=IndependentCopula name=IndependentCopula dimension=2 marginal[0]=class=VonMises name=VonMises dimension=1 mu=0.0630857 kappa=1.61753 marginal[1]=class=Uniform name=Uniform dimension=1 a=-1 b=1
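
Individual multivariate polynomials can also be rebuilt from this factory by their rank in the complete basis. A small illustration, not in the original script, assuming the standard build accessor:

# 4-th polynomial of the complete basis, i.e. multi-index [1,1] in the table above
print(orthgBasis.build(4))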



Get the distribution of the reduced variables Z

orthgBasis.getMeasure()

ComposedDistribution(VonMises(mu = 0.0630857, kappa=1.61753), Uniform(a = -1, b = 1), IndependentCopula(dimension = 2))
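
This measure is a regular Distribution, so it can be used like any other one, for instance to sample points distributed as the reduced variables Z (extra illustration, not in the original script):

measure = orthgBasis.getMeasure()
print(measure.getSample(5))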



Get the composed model, which is the model of the reduced variables Z

result.getComposedModel()

(DatabaseEvaluation
input sample :
[ X0 X1 ]
0 : [ 0.608202 -1.26617 ]
1 : [ -0.438266 1.20548 ]
2 : [ -2.18139 0.350042 ]
3 : [ -0.355007 1.43725 ]
4 : [ 0.810668 0.793156 ]
5 : [ -0.470526 0.261018 ]
6 : [ -2.29006 -1.28289 ]
7 : [ -1.31178 -0.0907838 ]
8 : [ 0.995793 -0.139453 ]
9 : [ -0.560206 0.44549 ]
10 : [ 0.322925 0.445785 ]
11 : [ -1.03808 -0.856712 ]
12 : [ 0.473617 -0.125498 ]
13 : [ 0.351418 1.78236 ]
14 : [ 0.0702074 -0.781366 ]
15 : [ -0.721533 -0.241223 ]
16 : [ -1.78796 0.40136 ]
17 : [ 1.36783 1.00434 ]
18 : [ 0.741548 -0.0436123 ]
19 : [ 0.539345 0.29995 ]
20 : [ 0.407717 -0.485112 ]
21 : [ -0.382992 -0.752817 ]
22 : [ 0.257926 1.96876 ]
23 : [ -0.671291 1.85579 ]
24 : [ 0.0521593 0.790446 ]
25 : [ 0.716353 -0.743622 ]
26 : [ 0.184356 -1.53073 ]
27 : [ 0.655027 0.538071 ]
28 : [ 1.73821 -0.958722 ]
29 : [ 0.377922 -0.181004 ]
output sample :
[ y0 y1 ]
0 : [ 0.791234 -6.153 ]
1 : [ 0.719848 0.127674 ]
2 : [ -0.257609 0.075673 ]
3 : [ 0.46935 0.0964592 ]
4 : [ -0.0330217 0.825582 ]
5 : [ 0.978133 0.467366 ]
6 : [ -0.9084 -0.372691 ]
7 : [ 0.167439 0.293644 ]
8 : [ 0.655206 3.07871 ]
9 : [ 0.993427 0.338667 ]
10 : [ 0.718808 0.818737 ]
11 : [ -0.318354 0.28152 ]
12 : [ 0.940016 1.80491 ]
13 : [ -0.533709 0.111917 ]
14 : [ 0.757606 1.11916 ]
15 : [ 0.571259 0.59742 ]
16 : [ 0.183152 0.105058 ]
17 : [ -0.718312 1.05597 ]
18 : [ 0.76617 2.19061 ]
19 : [ 0.667988 1.22357 ]
20 : [ 0.997007 2.04242 ]
21 : [ 0.421399 0.759585 ]
22 : [ -0.609865 0.0749114 ]
23 : [ 0.376759 0.0356671 ]
24 : [ 0.665521 0.388187 ]
25 : [ 0.999628 2.32215 ]
26 : [ 0.222539 -13.6308 ]
27 : [ 0.368781 1.00946 ]
28 : [ 0.711272 1.59716 ]
29 : [ 0.980674 1.71644 ])o(| y0 = [x0]->[x0]
| y1 = [x1]->[0.2190125596644127981+1.8591062333030965448*x1]
)



Get the composed meta model, which is the model of the reduced variables Z expressed in the reduced polynomial basis

result.getComposedMetaModel()

[0.340637,-0.762847] + [0,1.31123] * (1.73205 * x1) + [-0.231467,-2.84491] * (-1.11803 + 3.3541 * x1^2) + [0,1.79254] * (-3.96863 * x1 + 6.61438 * x1^3) + [-0.409575,0] * ((-0.0630992 + 1.00021 * x0) * (1.73205 * x1)) + [0,-1.0193] * (1.125 - 11.25 * x1^2 + 13.125 * x1^4) + [0,-1.23374] * ((-0.621424 - 0.0787529 * x0 + 0.624174 * x0^2) * (1.73205 * x1)) + [0,-1.01421] * ((-0.0630992 + 1.00021 * x0) * (-1.11803 + 3.3541 * x1^2)) + [0,-0.745816] * ((-0.621424 - 0.0787529 * x0 + 0.624174 * x0^2) * (-1.11803 + 3.3541 * x1^2))
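
The composed meta model is a Function defined on the reduced space and can be evaluated there directly (extra illustration, not in the original script):

composedMetaModel = result.getComposedMetaModel()
# evaluate h_hat at an arbitrary point of the reduced space
print(composedMetaModel([0.5, -0.5]))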



Get the meta model, which is the composed meta model combined with the isoprobabilistic transformation

result.getMetaModel()

([0.340637,-0.762847] + [0,1.31123] * (1.73205 * x1) + [-0.231467,-2.84491] * (-1.11803 + 3.3541 * x1^2) + [0,1.79254] * (-3.96863 * x1 + 6.61438 * x1^3) + [-0.409575,0] * ((-0.0630992 + 1.00021 * x0) * (1.73205 * x1)) + [0,-1.0193] * (1.125 - 11.25 * x1^2 + 13.125 * x1^4) + [0,-1.23374] * ((-0.621424 - 0.0787529 * x0 + 0.624174 * x0^2) * (1.73205 * x1)) + [0,-1.01421] * ((-0.0630992 + 1.00021 * x0) * (-1.11803 + 3.3541 * x1^2)) + [0,-0.745816] * ((-0.621424 - 0.0787529 * x0 + 0.624174 * x0^2) * (-1.11803 + 3.3541 * x1^2)))o(| y0 = [x0]->[x0]
| y1 = [x1]->[0.537892876741792203*(x1-0.2190125596644127981)]
)
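
To get an idea of the quality of the approximation, a simple out-of-sample check can be added; this is not part of the original script (OpenTURNS also provides a dedicated MetaModelValidation class for this purpose):

metamodel = result.getMetaModel()
x_test = distribution.getSample(100)
y_test = model(x_test)
y_pred = metamodel(x_test)
# mean squared error per output component
mse = [0.0] * y_test.getDimension()
for i in range(x_test.getSize()):
    for j in range(y_test.getDimension()):
        mse[j] += (y_pred[i, j] - y_test[i, j]) ** 2
mse = [v / x_test.getSize() for v in mse]
print("MSE per output:", mse)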



Get the projection strategy

algo.getProjectionStrategy()

class=ProjectionStrategy implementation=class=LeastSquaresStrategy experiment=class=FixedExperiment name=Unnamed sample=class=Sample name=Normal implementation=class=SampleImplementation name=Normal size=30 dimension=2 description=[X0,X1] data=[[0.608202,-1.26617],[-0.438266,1.20548],[-2.18139,0.350042],[-0.355007,1.43725],[0.810668,0.793156],[-0.470526,0.261018],[-2.29006,-1.28289],[-1.31178,-0.0907838],[0.995793,-0.139453],[-0.560206,0.44549],[0.322925,0.445785],[-1.03808,-0.856712],[0.473617,-0.125498],[0.351418,1.78236],[0.0702074,-0.781366],[-0.721533,-0.241223],[-1.78796,0.40136],[1.36783,1.00434],[0.741548,-0.0436123],[0.539345,0.29995],[0.407717,-0.485112],[-0.382992,-0.752817],[0.257926,1.96876],[-0.671291,1.85579],[0.0521593,0.790446],[0.716353,-0.743622],[0.184356,-1.53073],[0.655027,0.538071],[1.73821,-0.958722],[0.377922,-0.181004]] weights=class=Point name=Unnamed dimension=30 values=[0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333]
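
The two-argument constructor used above infers the input distribution and relies on default adaptive and projection strategies. For reference, here is a sketch of an explicit configuration; the basis and truncation below are illustrative choices, not the ones selected automatically in this example:

multivariateBasis = ot.OrthogonalProductPolynomialFactory([ot.HermiteFactory()] * dimension)
enumerateFunction = multivariateBasis.getEnumerateFunction()
# keep all polynomials of total degree <= 5
basisSize = enumerateFunction.getStrataCumulatedCardinal(5)
adaptiveStrategy = ot.FixedStrategy(multivariateBasis, basisSize)
projectionStrategy = ot.LeastSquaresStrategy()
algo_explicit = ot.FunctionalChaosAlgorithm(x, y, distribution, adaptiveStrategy, projectionStrategy)
algo_explicit.run()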



Total running time of the script: (0 minutes 0.099 seconds)
