Polynomial chaos exploitation
In this example we are going to create a global approximation of a model response using functional chaos and expose the associated results:
- the composed model $h: \underline{Z} \mapsto \underline{Y} = g \circ T^{-1}(\underline{Z})$, which is the model of the reduced variables $\underline{Z}$. We have $h = \sum_{k \in \mathbb{N}} \underline{y}_k \Psi_k$,
- the coefficients of the polynomial approximation: $(\underline{y}_k)_{k \in K}$,
- the composed meta model $\hat{h}$, which is the model of the reduced variables $\underline{Z}$ reduced to the truncated multivariate basis $(\Psi_k)_{k \in K}$. We have $\hat{h} = \sum_{k \in K} \underline{y}_k \Psi_k$,
- the meta model $\hat{g} = \hat{h} \circ T$, which is the polynomial chaos approximation as a Function. We have $\hat{g} = \sum_{k \in K} \underline{y}_k \Psi_k \circ T$,
- the truncated multivariate basis $(\Psi_k)_{k \in K}$,
- the indices $K$,
- the composition of each polynomial $\Psi_k$ of the truncated multivariate basis,
- the distribution $\mu$ of the transformed variables $\underline{Z}$.
from __future__ import print_function
import openturns as ot
import openturns.viewer as viewer
from matplotlib import pylab as plt
ot.Log.Show(ot.Log.NONE)
Prepare some X/Y data
ot.RandomGenerator.SetSeed(0)
dimension = 2
input_names = ['x1', 'x2']
formulas = ['cos(x1 + x2)', '(x2 + 1) * exp(x1 - 2 * x2)']
model = ot.SymbolicFunction(input_names, formulas)
distribution = ot.Normal(dimension)
x = distribution.getSample(30)
y = model(x)
Create a functional chaos algorithm
algo = ot.FunctionalChaosAlgorithm(x, y)
algo.run()
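Here only the input/output samples are given, so the algorithm infers the input distribution from the data. If the true distribution is known, it can be passed explicitly; a minimal variant (not part of this example, assuming the constructor that also takes the distribution) would be:
# Hypothetical variant (not in the original example): provide the known input
# distribution instead of letting the algorithm infer it from the data.
algo_known = ot.FunctionalChaosAlgorithm(x, y, distribution)
algo_known.run()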
Stream out the result
result = algo.getResult()
Get the polynomial chaos coefficients
All the coefficients:
result.getCoefficients()
|    | v0         | v1          |
|----|------------|-------------|
| 0  | 0.3382773  | -0.4009759  |
| 1  | 0          | 1.889049    |
| 2  | 0          | 0.7075578   |
| 3  | -0.228331  | -2.224586   |
| 4  | 0          | 1.774134    |
| 5  | -0.4226791 | 0           |
| 6  | 0          | -0.9198962  |
| 7  | 0          | 0.07833277  |
| 8  | 0          | -0.7791349  |
| 9  | 0          | -0.2046361  |
| 10 | 0          | 0.5270544   |
| 11 | 0          | 0.3482843   |
The coefficients associated with the basis term of index i (one component per output marginal)
i = 1
result.getCoefficients()[i]
[0,1.88905]
Get the indices of the selected polynomials: K
subsetK = result.getIndices()
subsetK
[0,2,3,4,6,7,9,12,13,17,19,20]#12
Get the composition of the polynomials of the truncated multivariate basis
for i in range(subsetK.getSize()):
    print("Polynomial number ", i, " in truncated basis <-> polynomial number ",
          subsetK[i], " = ", ot.LinearEnumerateFunction(dimension)(subsetK[i]), " in complete basis")
Out:
Polynomial number 0 in truncated basis <-> polynomial number 0 = [0,0] in complete basis
Polynomial number 1 in truncated basis <-> polynomial number 2 = [0,1] in complete basis
Polynomial number 2 in truncated basis <-> polynomial number 3 = [2,0] in complete basis
Polynomial number 3 in truncated basis <-> polynomial number 4 = [1,1] in complete basis
Polynomial number 4 in truncated basis <-> polynomial number 6 = [3,0] in complete basis
Polynomial number 5 in truncated basis <-> polynomial number 7 = [2,1] in complete basis
Polynomial number 6 in truncated basis <-> polynomial number 9 = [0,3] in complete basis
Polynomial number 7 in truncated basis <-> polynomial number 12 = [2,2] in complete basis
Polynomial number 8 in truncated basis <-> polynomial number 13 = [1,3] in complete basis
Polynomial number 9 in truncated basis <-> polynomial number 17 = [3,2] in complete basis
Polynomial number 10 in truncated basis <-> polynomial number 19 = [1,4] in complete basis
Polynomial number 11 in truncated basis <-> polynomial number 20 = [0,5] in complete basis
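The coefficients follow the same ordering as the selected indices, so each coefficient row can be paired with the multi-degree of its basis term; a small sketch (not part of the original output) of that pairing:
# Hypothetical pairing (not in the original example): associate each coefficient
# row with the multi-degree of the corresponding truncated-basis polynomial.
coefficients = result.getCoefficients()
enum_function = ot.LinearEnumerateFunction(dimension)
for i in range(subsetK.getSize()):
    print("multi-index", enum_function(subsetK[i]), "-> coefficients", coefficients[i])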
Get the multivariate basis as a collection of Function
reduced = result.getReducedBasis()
Get the orthogonal basis
orthgBasis = result.getOrthogonalBasis()
orthgBasis
class=OrthogonalProductPolynomialFactory univariate polynomial collection=[class=OrthogonalUniVariatePolynomialFamily implementation=class=StandardDistributionPolynomialFactory hasSpecificFamily=true specificFamily=class=OrthogonalUniVariatePolynomialFamily implementation=class=HermiteFactory measure=class=Normal name=Normal dimension=1 mean=class=Point name=Unnamed dimension=1 values=[0] sigma=class=Point name=Unnamed dimension=1 values=[1] correlationMatrix=class=CorrelationMatrix dimension=1 implementation=class=MatrixImplementation name=Unnamed rows=1 columns=1 values=[1],class=OrthogonalUniVariatePolynomialFamily implementation=class=StandardDistributionPolynomialFactory hasSpecificFamily=true specificFamily=class=OrthogonalUniVariatePolynomialFamily implementation=class=LegendreFactory measure=class=Uniform name=Uniform dimension=1 a=-1 b=1] measure=class=ComposedDistribution name=ComposedDistribution dimension=2 copula=class=IndependentCopula name=IndependentCopula dimension=2 marginal[0]=class=Normal name=Normal dimension=1 mean=class=Point name=Unnamed dimension=1 values=[0] sigma=class=Point name=Unnamed dimension=1 values=[1] correlationMatrix=class=CorrelationMatrix dimension=1 implementation=class=MatrixImplementation name=Unnamed rows=1 columns=1 values=[1] marginal[1]=class=Uniform name=Uniform dimension=1 a=-1 b=1
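A single multivariate polynomial of the complete basis can also be built directly from this factory; for instance (a sketch, not part of the original output), the term of global index 3, which the enumeration above maps to the multi-degree [2,0]:
# Hypothetical illustration (not in the original example): build the multivariate
# polynomial of global index 3 directly from the orthogonal basis factory.
print(orthgBasis.build(3))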
Get the distribution of variables Z
orthgBasis.getMeasure()
ComposedDistribution(Normal(mu = 0, sigma = 1), Uniform(a = -1, b = 1), IndependentCopula(dimension = 2))
Get the composed model which is the model of the reduced variables Z
result.getComposedModel()
(DatabaseEvaluation
input sample :
[ X0 X1 ]
0 : [ 0.608202 -1.26617 ]
1 : [ -0.438266 1.20548 ]
2 : [ -2.18139 0.350042 ]
3 : [ -0.355007 1.43725 ]
4 : [ 0.810668 0.793156 ]
5 : [ -0.470526 0.261018 ]
6 : [ -2.29006 -1.28289 ]
7 : [ -1.31178 -0.0907838 ]
8 : [ 0.995793 -0.139453 ]
9 : [ -0.560206 0.44549 ]
10 : [ 0.322925 0.445785 ]
11 : [ -1.03808 -0.856712 ]
12 : [ 0.473617 -0.125498 ]
13 : [ 0.351418 1.78236 ]
14 : [ 0.0702074 -0.781366 ]
15 : [ -0.721533 -0.241223 ]
16 : [ -1.78796 0.40136 ]
17 : [ 1.36783 1.00434 ]
18 : [ 0.741548 -0.0436123 ]
19 : [ 0.539345 0.29995 ]
20 : [ 0.407717 -0.485112 ]
21 : [ -0.382992 -0.752817 ]
22 : [ 0.257926 1.96876 ]
23 : [ -0.671291 1.85579 ]
24 : [ 0.0521593 0.790446 ]
25 : [ 0.716353 -0.743622 ]
26 : [ 0.184356 -1.53073 ]
27 : [ 0.655027 0.538071 ]
28 : [ 1.73821 -0.958722 ]
29 : [ 0.377922 -0.181004 ]
output sample :
[ y0 y1 ]
0 : [ 0.791234 -6.153 ]
1 : [ 0.719848 0.127674 ]
2 : [ -0.257609 0.075673 ]
3 : [ 0.46935 0.0964592 ]
4 : [ -0.0330217 0.825582 ]
5 : [ 0.978133 0.467366 ]
6 : [ -0.9084 -0.372691 ]
7 : [ 0.167439 0.293644 ]
8 : [ 0.655206 3.07871 ]
9 : [ 0.993427 0.338667 ]
10 : [ 0.718808 0.818737 ]
11 : [ -0.318354 0.28152 ]
12 : [ 0.940016 1.80491 ]
13 : [ -0.533709 0.111917 ]
14 : [ 0.757606 1.11916 ]
15 : [ 0.571259 0.59742 ]
16 : [ 0.183152 0.105058 ]
17 : [ -0.718312 1.05597 ]
18 : [ 0.76617 2.19061 ]
19 : [ 0.667988 1.22357 ]
20 : [ 0.997007 2.04242 ]
21 : [ 0.421399 0.759585 ]
22 : [ -0.609865 0.0749114 ]
23 : [ 0.376759 0.0356671 ]
24 : [ 0.665521 0.388187 ]
25 : [ 0.999628 2.32215 ]
26 : [ 0.222539 -13.6308 ]
27 : [ 0.368781 1.00946 ]
28 : [ 0.711272 1.59716 ]
29 : [ 0.980674 1.71644 ])o(| y0 = [x0]->[-0.05126222674900215992+0.97297586513129663555*x0]
| y1 = [x1]->[0.2190125596644127981+1.8591062333030965448*x1]
)
Get the composed meta model which is the model of the reduced variables Z within the reduced polynomial basis
result.getComposedMetaModel()
[0.338277,-0.400976] + [0,1.88905] * (1.73205 * x1) + [0,0.707558] * (-0.707107 + 0.707107 * x0^2) + [-0.228331,-2.22459] * (-1.11803 + 3.3541 * x1^2) + [0,1.77413] * (-3.96863 * x1 + 6.61438 * x1^3) + [-0.422679,0] * ((x0) * (1.73205 * x1)) + [0,-0.919896] * (1.125 - 11.25 * x1^2 + 13.125 * x1^4) + [0,0.0783328] * ((-0.707107 + 0.707107 * x0^2) * (1.73205 * x1)) + [0,-0.779135] * ((x0) * (-1.11803 + 3.3541 * x1^2)) + [0,-0.204636] * (-8.47215 * x1 + 76.2494 * x1^3 - 167.749 * x1^5 + 103.844 * x1^7) + [0,0.527054] * ((x0) * (-3.96863 * x1 + 6.61438 * x1^3)) + [0,0.348284] * ((-0.707107 + 0.707107 * x0^2) * (-1.11803 + 3.3541 * x1^2))
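Since the composed meta model acts on the reduced variables, it can be evaluated directly on points drawn from the measure obtained above; a minimal sketch (not part of the original example):
# Hypothetical evaluation (not in the original example): apply the composed meta
# model to a small sample drawn from the measure of the reduced variables Z.
composedMetaModel = result.getComposedMetaModel()
z = orthgBasis.getMeasure().getSample(5)
print(composedMetaModel(z))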
Get the meta model which is the composed meta model combined with the isoprobabilistic transformation
result.getMetaModel()
([0.338277,-0.400976] + [0,1.88905] * (1.73205 * x1) + [0,0.707558] * (-0.707107 + 0.707107 * x0^2) + [-0.228331,-2.22459] * (-1.11803 + 3.3541 * x1^2) + [0,1.77413] * (-3.96863 * x1 + 6.61438 * x1^3) + [-0.422679,0] * ((x0) * (1.73205 * x1)) + [0,-0.919896] * (1.125 - 11.25 * x1^2 + 13.125 * x1^4) + [0,0.0783328] * ((-0.707107 + 0.707107 * x0^2) * (1.73205 * x1)) + [0,-0.779135] * ((x0) * (-1.11803 + 3.3541 * x1^2)) + [0,-0.204636] * (-8.47215 * x1 + 76.2494 * x1^3 - 167.749 * x1^5 + 103.844 * x1^7) + [0,0.527054] * ((x0) * (-3.96863 * x1 + 6.61438 * x1^3)) + [0,0.348284] * ((-0.707107 + 0.707107 * x0^2) * (-1.11803 + 3.3541 * x1^2)))o(| y0 = [x0]->[1.0277747227214693027*(x0+0.05126222674900215992)]
| y1 = [x1]->[0.537892876741792203*(x1-0.2190125596644127981)]
)
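The meta model can be evaluated like any Function; a quick check (not part of the original example, values not shown in the original output) comparing it with the model at a test point and recovering the same value through the composition with the transformation:
# Hypothetical check (not in the original example): compare the model and its
# polynomial chaos approximation at a test point, and verify the composition
# metaModel = composedMetaModel o T using the transformation stored in the result.
metaModel = result.getMetaModel()
x0 = [0.5, -0.5]
print("model     :", model(x0))
print("metamodel :", metaModel(x0))
transformation = result.getTransformation()
print("composed  :", result.getComposedMetaModel()(transformation(x0)))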
Get the projection strategy
algo.getProjectionStrategy()
class=ProjectionStrategy implementation=class=LeastSquaresStrategy experiment=class=FixedExperiment name=Unnamed sample=class=Sample name=Normal implementation=class=SampleImplementation name=Normal size=30 dimension=2 description=[X0,X1] data=[[0.608202,-1.26617],[-0.438266,1.20548],[-2.18139,0.350042],[-0.355007,1.43725],[0.810668,0.793156],[-0.470526,0.261018],[-2.29006,-1.28289],[-1.31178,-0.0907838],[0.995793,-0.139453],[-0.560206,0.44549],[0.322925,0.445785],[-1.03808,-0.856712],[0.473617,-0.125498],[0.351418,1.78236],[0.0702074,-0.781366],[-0.721533,-0.241223],[-1.78796,0.40136],[1.36783,1.00434],[0.741548,-0.0436123],[0.539345,0.29995],[0.407717,-0.485112],[-0.382992,-0.752817],[0.257926,1.96876],[-0.671291,1.85579],[0.0521593,0.790446],[0.716353,-0.743622],[0.184356,-1.53073],[0.655027,0.538071],[1.73821,-0.958722],[0.377922,-0.181004]] weights=class=Point name=Unnamed dimension=30 values=[0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333,0.0333333]
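Finally, the quality of the least-squares fit can be assessed from the result itself; a short sketch (not part of the original output) printing the residual and relative error of each output component:
# Hypothetical quality check (not in the original example): residual and relative
# error per output component, as stored in the functional chaos result.
print("residuals       :", result.getResiduals())
print("relative errors :", result.getRelativeErrors())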
Total running time of the script: ( 0 minutes 0.032 seconds)