Sample

class Sample(*args)

Sample of real vectors.

Available constructors:

Sample(array)

Sample(size, dim)

Sample(size, point)

Sample(other, first, last)

Parameters:
array : 2-d sequence of float

The data.

size : int, m > 0, optional

The sample size. Default creates an empty sample with dimension 1.

dimension : int, n \geq 0, optional

The dimension of the real vectors. Default creates an empty sample with dimension 1.

point : Point or flat (1-d) array, list or tuple of floats, optional

The point repeated along the sample. Default creates a sample filled with zeros (null vectors).

other : Sample

The sample containing the points to copy.

first : int, 0 \leq first < m

The index of the first point to copy.

last : int, first < last \leq m, optional

The index after the last point to copy.

Methods

BuildFromDataFrame()

Convert a pandas DataFrame to Sample.

BuildFromPoint(point)

Static method for building a sample from a sequence of float.

ImportFromCSVFile(*args)

Static method for building a sample from a CSV file.

ImportFromTextFile(*args)

Static method for building a sample from a text file.

add(*args)

Append a sample (in-place).

argsort([isIncreasing])

Returns indices of sorted sample.

asDataFrame()

Convert to pandas DataFrame.

asPoint()

Accessor to the internal linear storage for 1D sample.

clear()

Erase all values.

computeCentralMoment(k)

Estimate componentwise central moments.

computeCovariance()

Estimate the covariance matrix.

computeEmpiricalCDF(point[, tail])

Estimate the empirical cumulative distribution function (ECDF).

computeKendallTau()

Estimate the Kendall coefficients matrix.

computeKurtosis()

Estimate the componentwise kurtosis (4th order central normalized moment).

computeLinearCorrelation()

Estimate the linear (Pearson) correlation matrix.

computeMean()

Estimate the mean vector.

computeMedian()

Estimate the componentwise medians (50%-quantiles).

computeQuantile(*args)

Estimate the quantile of the joint distribution underlying the sample.

computeQuantilePerComponent(*args)

Estimate the componentwise quantiles.

computeRange()

Compute the range per component.

computeRawMoment(k)

Compute the raw (non-central) moment per component.

computeSkewness()

Estimate the componentwise skewness (3rd order central normalized moment).

computeSpearmanCorrelation()

Estimate the Spearman correlation matrix.

computeStandardDeviation()

Estimate the componentwise standard deviations.

computeVariance()

Estimate the componentwise variances.

erase(*args)

Erase point(s) at or between index(es) (in-place).

exportToCSVFile(*args)

Dump the sample to a CSV file.

find(point)

Get the position of a point in the sample.

getClassName()

Accessor to the object's name.

getDescription()

Accessor to the componentwise description.

getDimension()

Accessor to the sample's dimension.

getId()

Accessor to the object's id.

getImplementation()

Accessor to the underlying implementation.

getMarginal(*args)

Accessor to sample marginal(s) (column(s)).

getMax()

Accessor to the componentwise maximum values.

getMin()

Accessor to the componentwise minimum values.

getName()

Accessor to the object's name.

getSize()

Accessor to the sample size.

rank(*args)

Compute the sample (componentwise) ranks.

select(indices)

Select points in a sample.

setDescription(description)

Accessor to the componentwise description.

setName(name)

Accessor to the object's name.

sort(*args)

Sort the sample.

sortAccordingToAComponent(index)

Sort the sample according to the given component.

sortAccordingToAComponentInPlace(index)

Sort the sample in place according to the given component.

sortInPlace()

Sort the sample in place.

sortUnique()

Sort the sample and remove duplicate points.

sortUniqueInPlace()

Sort the sample in place and remove duplicate points.

split(index)

Truncate the sample.

stack(sample)

Stack (horizontally) the given sample to the current one (in-place).

Examples

Create a Sample

>>> import openturns as ot
>>> sample = ot.Sample(3, 2)
>>> print(sample)
0 : [ 0 0 ]
1 : [ 0 0 ]
2 : [ 0 0 ]

Create a Sample from a (2d) array, list or tuple

>>> import numpy as np
>>> sample = ot.Sample(np.array([(1.0, 2.0), (3.0, 4.0), (5.0, 6.0)]))

and back

>>> z = np.array(sample)

Load a sample from a CSV file

>>> sample = ot.Sample.ImportFromCSVFile('sample.csv', ',')

Samples may also be generated from probability distributions or experiments

>>> random_sample = ot.Normal(2).getSample(10)
>>> experiment = ot.LHSExperiment(ot.Normal(2), 10).generate()

Translation: addition or subtraction of a (compatible) sample or point, or of a scalar, which is promoted to a point of compatible dimension with equal components

>>> print(sample + sample)
0 : [  2  4 ]
1 : [  6  8 ]
2 : [ 10 12 ]
>>> print(sample - sample)
0 : [ 0 0 ]
1 : [ 0 0 ]
2 : [ 0 0 ]
>>> print(sample - sample[0])
0 : [ 0 0 ]
1 : [ 2 2 ]
2 : [ 4 4 ]
>>> print(sample - sample[0, 0])
0 : [ 0 1 ]
1 : [ 2 3 ]
2 : [ 4 5 ]
__init__(*args)
BuildFromDataFrame()

Convert a pandas DataFrame to Sample.

Parameters:
df : pandas DataFrame

The data to convert

Returns:
sample : Sample

The converted sample

static BuildFromPoint(point)

Static method for building a sample from a sequence of float.

Parameters:
data : 1-d array-like

Data.

Returns:
sample : Sample

Sample generated from sequence

Examples

>>> import openturns as ot
>>> n = 20
>>> x = ot.Sample.BuildFromPoint(range(n))
>>> data = [2.0, 2.0, 1.0, 1.0, 2.0, 3.0, 1.0, 2.0, 2.0, 1.0]
>>> sample = ot.Sample.BuildFromPoint(data)
static ImportFromCSVFile(*args)

Static method for building a sample from a CSV file.

Parameters:
file_name : str

Path to CSV file.

separator : str, optional

Separating string. Default uses Sample-CSVFileSeparator from the ResourceMap.

Returns:
sample : Sample

Sample loaded from the CSV file.

Notes

The file may or may not contain a header line (column names given as strings delimited with quotes). If it does contain such a header line, it is used to set the sample description via setDescription().

The implementation follows RFC 4180 (https://tools.ietf.org/html/rfc4180); for more permissive formats see ImportFromTextFile().

Examples

>>> import openturns as ot
>>> separator = ','
>>> sample = ot.Sample.ImportFromCSVFile('sample.csv', separator)
static ImportFromTextFile(*args)

Static method for building a sample from a text file.

Parameters:
file_name : str

Path to text file.

separator : str, optional

Separating string. Default uses a blank space.

skipped_lines : int, optional

Number of lines skipped. Default is 0.

decimalSeparator : str, optional

Decimal separator. Default is dot.

Returns:
sample : Sample

Sample loaded from the text file.

Notes

The file may or may not contain a header line (column names given as strings delimited with quotes). If it does contain such a header line, it is used to set the sample description via setDescription().

This method allows for more permissive file formatting than ImportFromCSVFile():

  • The field separator can be a whitespace

  • Comment lines or empty ones are allowed

  • Lines can be skipped from the start of the file

The comment marker is defined by the Sample-CommentsMarker entry from ResourceMap.

Examples

>>> import openturns as ot
>>> separator = ' '
>>> sample = ot.Sample.ImportFromTextFile('sample.txt', separator)
add(*args)

Append a sample (in-place).

Parameters:
point or sample : sequence or 2-d sequence of float

The point(s) to append.

Examples

Append an existing sample with a single point.

>>> import openturns as ot
>>> sample = ot.Sample(3, 2)
>>> sample.add([1.0, 2.0])
>>> print(sample)
0 : [ 0 0 ]
1 : [ 0 0 ]
2 : [ 0 0 ]
3 : [ 1 2 ]

Append an existing sample with another sample.

>>> sample.add(ot.Sample(2, [2.0, 1.0]))
>>> print(sample)
0 : [ 0 0 ]
1 : [ 0 0 ]
2 : [ 0 0 ]
3 : [ 1 2 ]
4 : [ 2 1 ]
5 : [ 2 1 ]
argsort(isIncreasing=True)

Returns indices of sorted sample.

The algorithm sorts the points of the sample in lexicographic order.

Parameters:
isIncreasing : bool, optional

If True, sort in increasing order; if False, sort in decreasing order. Default is True.

Returns:
indices : Indices

The indices that sort the sample.

Examples

>>> import openturns as ot
>>> sample = ot.Sample(
...     [[-1.0, 1.0, 0.0], [-1.0, 1.0, 1.0], [-1.0, 0.0, 1.0], [-1.0, 0.0, -1.0]]
... )
>>> print(sample)
0 : [ -1  1  0 ]
1 : [ -1  1  1 ]
2 : [ -1  0  1 ]
3 : [ -1  0 -1 ]
>>> indices = sample.argsort()
>>> print(indices)
[3,2,0,1]
>>> print(sample[indices])
    [ v0 v1 v2 ]
0 : [ -1  0 -1 ]
1 : [ -1  0  1 ]
2 : [ -1  1  0 ]
3 : [ -1  1  1 ]
>>> indices = sample.argsort(False)
>>> print(indices)
[1,0,2,3]
asDataFrame()

Convert to pandas DataFrame.

Returns:
df : pandas DataFrame

The converted data

asPoint()

Accessor to the internal linear storage for 1D sample.

Returns:
values : Point

Flat internal representation of the sample.

Notes

Available only for 1D sample.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal().getSample(5)
>>> print(sample)
    [ X0        ]
0 : [  0.608202 ]
1 : [ -1.26617  ]
2 : [ -0.438266 ]
3 : [  1.20548  ]
4 : [ -2.18139  ]
>>> print(sample.asPoint())
[0.608202,-1.26617,-0.438266,1.20548,-2.18139]
clear()

Erase all values.

computeCentralMoment(k)

Estimate componentwise central moments.

Parameters:
k : int

The central moment’s order.

Returns:
m : Point

Componentwise central moment of order k estimated from the sample.

Notes

The central moment of order k is estimated as follows:

\vect{\widehat{m}}^{(k)}_0 = \Tr{\left(\frac{1}{m}
                                       \sum_{j=1}^m
                                       \left(x_i^{(j)} - \widehat{\mu}_i\right)^k,
                                       \quad i = 1, \ldots, n\right)}

where \vect{\widehat{\mu}} is the estimator of the mean.

These estimators are the natural (possibly biased) estimators. For unbiased estimators use the other dedicated methods such as computeVariance() for the variance.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeCentralMoment(2))
[0.915126,0.873119]
computeCovariance()

Estimate the covariance matrix.

Returns:
covariance : CovarianceMatrix

Covariance matrix estimated from the sample.

Notes

The covariance matrix is estimated as follows:

\mat{\widehat{\Sigma}} = \left[\frac{1}{m - 1}
                               \sum_{k=1}^m
                               \left(x_i^{(k)} - \widehat{\mu}_i\right)
                               \left(x_j^{(k)} - \widehat{\mu}_j\right),
                               \quad i, j = 1, \ldots, n\right]

where \vect{\widehat{\mu}} denotes the estimate of the mean.

This is an unbiased estimator.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeCovariance())
[[ 0.946682  0.0182104 ]
 [ 0.0182104 0.903226  ]]
computeEmpiricalCDF(point, tail=False)

Estimate the empirical cumulative distribution function (ECDF).

Parameters:
x : sequence of float

CDF input.

survival : bool, optional

A flag telling whether this should estimate the empirical cumulative distribution function or the empirical survival function. Default is False and estimates the CDF.

Returns:
p : float, 0 \leq p \leq 1

Empirical CDF or SF value at point x.

Notes

The empirical cumulative distribution function (CDF) is estimated as follows:

\hat{F}(\vect{x}) = \frac{1}{m} \sum_{j=1}^m
                    \mathbf{1}_{\cap_{i=1}^n x_i^{(j)} \leq x_i}(\vect{x})

The empirical survival function (SF) is estimated in a similar way:

\hat{S}(\vect{x}) = \frac{1}{m} \sum_{j=1}^m
                    \mathbf{1}_{\cap_{i=1}^n x_i^{(j)} > x_i}(\vect{x})

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeEmpiricalCDF(sample[0]))
0.1
computeKendallTau()

Estimate the Kendall coefficients matrix.

Returns:
tau : CorrelationMatrix

Kendall coefficients matrix estimated from the sample.

Notes

This uses an external implementation provided under the Boost Software License by David Simcha based on the paper by [knight1966]. It actually switches between two implementations depending on the sample size:

  • The most basic implementation performing in O(m^2) is used when the sample size is less than SampleImplementation-SmallKendallTau from the ResourceMap.

  • The other more complex implementation performing in O(m\log(m)) is used for larger samples.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeKendallTau())
[[ 1          0.00689655 ]
 [ 0.00689655 1          ]]
computeKurtosis()

Estimate the componentwise kurtosis (4th order central normalized moment).

Returns:
kurtosis : Point

Componentwise kurtosis estimated from the sample.

Notes

The componentwise kurtosis is estimated as follows:

\vect{\widehat{\kappa}} = \Tr{\left(\frac{m (m-1) (m+1)}{(m-2) (m-3)}
                                    \frac{\sum_{j=1}^m
                                          \left(x_i^{(j)} - \widehat{\mu}_i\right)^4}
                                         {\left(\sum_{j=1}^m
                                                \left(x_i^{(j)} - \widehat{\mu}_i\right)^2
                                          \right)^2}
                                    - 3 \frac{3 (m-5)}{(m-2) (m-3)},
                                    \quad i = 1, \ldots, n\right)}

where \vect{\widehat{\mu}} is the estimate of the mean.

This estimator is unbiased.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeKurtosis())
[3.27647,2.40275]
computeLinearCorrelation()

Estimate the linear (Pearson) correlation matrix.

Returns:
rho : CorrelationMatrix

Pearson correlation matrix estimated from the sample.

Notes

The Pearson correlation matrix is estimated as follows:

\mat{\widehat{\rho}} = \left[\frac{\widehat{\Sigma}_{i,j}}
                                  {\sqrt{\widehat{\Sigma}_{i,i} \widehat{\Sigma}_{j,j}}},
                             \quad i,j = 1, \ldots, n\right]

where \mat{\widehat{\Sigma}} denotes the estimate of the covariance.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeLinearCorrelation())
[[ 1         0.0196933 ]
 [ 0.0196933 1         ]]
computeMean()

Estimate the mean vector.

Returns:
mean : Point

Mean vector estimated from the sample.

Notes

The mean is estimated as follows:

\vect{\widehat{\mu}} = \Tr{\left(\frac{1}{m}
                                 \sum_{j=1}^m x_i^{(j)},
                                 \quad i = 1, \ldots, n\right)}

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeMean())
[-0.0512622,0.136653]
computeMedian()

Estimate the componentwise medians (50%-quantiles).

Returns:
median : Point

Median vector estimated from the sample.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeMedian())
[0.221141,0.108703]
computeQuantile(*args)

Estimate the quantile of the joint distribution underlying the sample.

Parameters:
p : float, 0 \leq p \leq 1, or sequence of float

Input probability level.

Returns:
quantile : Point or Sample

Quantile of the joint distribution at probability level p, estimated from the sample.

Raises:
NotImplementedYetError : if the dimension is greater than 1.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(1).getSample(30)
>>> print(sample.computeQuantile(.2))
[-0.947394]
computeQuantilePerComponent(*args)

Estimate the componentwise quantiles.

Parameters:
p : float, 0 \leq p \leq 1, or sequence of float

Input probability level.

Returns:
quantile : Point or Sample

Componentwise quantile at probability level p, estimated from the sample.

Notes

The present implementation interpolates the quantile between the two adjacent empirical quantiles (\widehat{x}_i^- and \widehat{x}_i^+):

\widehat{q}_i = \alpha \widehat{x}_i^- + (1 - \alpha) \widehat{x}_i^+

where \alpha = p m - 0.5.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeQuantilePerComponent(0.2))
[-0.696412,-0.767092]
computeRange()

Compute the range per component.

Returns:
range : Point

Componentwise ranges estimated from the sample.

Notes

The statistical range is the difference between the maximal and the minimal value of the sample.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeRange())
[4.02827,3.49949]
computeRawMoment(k)

Compute the raw (non-central) moment per component.

Parameters:
k : int, k \geq 0

Componentwise moment’s order.

Returns:
moments : Point

Componentwise moments estimated from the sample.

Notes

The (raw) moment of order k is estimated as follows:

\vect{\widehat{m}}^{(k)} = \Tr{\left(\frac{1}{m}
                                     \sum_{j=1}^m {x_i^{(j)}}^k,
                                     \quad i = 1, \ldots, n\right)}

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeRawMoment(2))
[0.917754,0.891793]
computeSkewness()

Estimate the componentwise skewness (3rd order central normalized moment).

Returns:
skewness : Point

Componentwise skewness estimated from the sample.

Notes

The componentwise skewnesses are estimated as follows:

\vect{\widehat{\delta}} = \Tr{\left(m \frac{\sqrt{m-1}}{m-2}
                                    \frac{\sum_{j=1}^m
                                          \left(x_i^{(j)} - \widehat{\mu}_i\right)^3}
                                         {\left(\sum_{j=1}^m
                                                \left(x_i^{(j)} - \widehat{\mu}_i\right)^2
                                          \right)^{3/2}},
                                    \quad i = 1, \ldots, n\right)}

where \vect{\widehat{\mu}} is the estimate of the mean.

This is an unbiased estimator.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeSkewness())
[-0.69393,0.231931]
computeSpearmanCorrelation()

Estimate the Spearman correlation matrix.

Returns:
rho : CorrelationMatrix

Spearman correlation matrix estimated from the sample.

Notes

The Spearman correlation matrix is estimated as the Pearson correlation matrix of the ranks sample (i.e. using self.rank().computeLinearCorrelation()).

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeSpearmanCorrelation())
[[  1          -0.00556174 ]
 [ -0.00556174  1          ]]
computeStandardDeviation()

Estimate the componentwise standard deviations.

Returns:
standard_deviations : Point

Componentwise standard deviation estimated from the sample.

See also

computeVariance

Notes

The componentwise standard deviations are estimated as the square root of the componentwise variances.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeStandardDeviation())
[0.972976,0.950382]
computeVariance()

Estimate the componentwise variances.

Returns:
variances : Point

Componentwise variances estimated from the sample.

Notes

The componentwise variances are estimated as follows:

\vect{\widehat{\sigma^2}} = \Tr{\left(\frac{1}{m-1}
                                      \sum_{j=1}^m
                                      \left(x_i^{(j)} - \widehat{\mu}_i\right)^2,
                                      \quad i = 1, \ldots, n\right)}

where \vect{\widehat{\mu}} is the estimate of the mean.

This estimator is unbiased.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.computeVariance())
[0.946682,0.903226]
erase(*args)

Erase point(s) at or between index(es) (in-place).

Parameters:
f : int, 0 \leq f < m

The index of the first point to erase.

l : int, f < l \leq m, optional

The index after the last point to erase. Default uses l = f + 1 and only removes sample[f].

Examples

>>> import openturns as ot
>>> sample = ot.Sample([[i] for i in range(5)])
>>> print(sample)
0 : [ 0 ]
1 : [ 1 ]
2 : [ 2 ]
3 : [ 3 ]
4 : [ 4 ]
>>> sample.erase(1, 3)
>>> print(sample)
0 : [ 0 ]
1 : [ 3 ]
2 : [ 4 ]
exportToCSVFile(*args)

Dump the sample to a CSV file.

Parameters:
file_name : str

Path to CSV file.

separator : str, optional

Separating string. Default uses Sample-CSVFileSeparator from the ResourceMap.

decimalSeparator : str, optional

Decimal separator. Default is dot.

precision : int, optional

Numerical precision. Default takes the Sample-CSVPrecision entry from ResourceMap.

format : str, optional

Floating-point formatting, one of:

  • scientific: exponent notation

  • fixed: constant number of digits

  • defaultfloat: variable number of digits

Default takes Sample-CSVFormat entry from ResourceMap.

Notes

This will create a header line with componentwise descriptions (obtained from getDescription()) between quotes as column names. In scientific formatting the number of significant digits is precision + 1.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> sample.exportToCSVFile('sample.csv', '; ')
find(point)

Get the position of a point in the sample.

Parameters:
point : sequence of float

The wanted point.

Returns:
index : int

The index of the point in the sample; equals m if the point does not belong to the sample.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(30)
>>> print(sample.find(sample[10]))
10
>>> print(sample.find([0.0, 0.0]))
30
getClassName()

Accessor to the object’s name.

Returns:
class_name : str

The object class name (object.__class__.__name__).

getDescription()

Accessor to the componentwise description.

Returns:
description : Description

Description of the sample’s components.

See also

setDescription
getDimension()

Accessor to the sample’s dimension.

Returns:
n : int

The number of components of the points in the sample.

getId()

Accessor to the object’s id.

Returns:
id : int

Internal unique identifier.

getImplementation()

Accessor to the underlying implementation.

Returns:
impl : Implementation

A copy of the underlying implementation object.

getMarginal(*args)

Accessor to sample marginal(s) (column(s)).

Parameters:
indices : int, sequence of int, 0 \leq i < n, or sequence of str

The identifiers of the wanted marginal(s). When the description contains duplicate labels, the first marginal is picked up.

Returns:
sample : Sample

A subsample of the present sample with the requested marginal(s).

Notes

The Sample also implements slicing in its __getitem__ method.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(10).getSample(3)
>>> print(sample.getMarginal([1, 4]))
    [ X1        X4        ]
0 : [ -1.26617  -2.18139  ]
1 : [  0.261018 -1.31178  ]
2 : [  0.445785  0.473617 ]
getMax()

Accessor to the componentwise maximum values.

Returns:
maximum_values : Point

Componentwise maximum values.

getMin()

Accessor to the componentwise minimum values.

Returns:
minimum_values : Point

Componentwise minimum values.

getName()

Accessor to the object’s name.

Returns:
name : str

The name of the object.

getSize()

Accessor to the sample size.

Returns:
m : int

The number of points in the sample.

rank(*args)

Compute the sample (componentwise) ranks.

Parameters:
marginal_index : int, 0 \leq i < n, optional

The component whose ranks are wanted. Default computes the ranks of all the components.

Returns:
ranks : Sample

The requested ranks.

Notes

The rank of each point of a 1-d sample is its index in the ascending sorted sample. Ties (equal points) are given averaged ranks.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(3)
>>> print(sample)
    [ X0        X1        ]
0 : [  0.608202 -1.26617  ]
1 : [ -0.438266  1.20548  ]
2 : [ -2.18139   0.350042 ]
>>> print(sample.rank())
    [ X0 X1 ]
0 : [ 2  0  ]
1 : [ 1  2  ]
2 : [ 0  1  ]
select(indices)

Select points in a sample.

It selects the points at given locations and returns them as a new sample.

Parameters:
indices : sequence of int, 0 \leq i < m

The selected indices.

Returns:
selected_sample : Sample

The selected points as a sample.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(3)
>>> print(sample)
    [ X0        X1        ]
0 : [  0.608202 -1.26617  ]
1 : [ -0.438266  1.20548  ]
2 : [ -2.18139   0.350042 ]
>>> selected_sample = sample.select([1, 0, 1])
>>> print(selected_sample)
    [ X0        X1        ]
0 : [ -0.438266  1.20548  ]
1 : [  0.608202 -1.26617  ]
2 : [ -0.438266  1.20548  ]
setDescription(description)

Accessor to the componentwise description.

Parameters:
description : sequence of str

Description of the sample’s components.

See also

getDescription
setName(name)

Accessor to the object’s name.

Parameters:
name : str

The name of the object.

sort(*args)

Sort the sample.

The ordering is based on the comparison operator of the Point. Hence, the sort method orders the points in the sample according to lexicographic order.

Parameters:
marginal_index : int, 0 \leq i < n, optional

The index of the component to sort. Default sorts the whole sample, i.e. the returned sorted sample has the same dimension as the input sample. If marginal_index is provided, then the returned sorted sample has dimension 1: the corresponding marginal sample is sorted and returned.

Returns:
sorted_sample : Sample

The requested sorted sample.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(3)
>>> print(sample)
    [ X0        X1        ]
0 : [  0.608202 -1.26617  ]
1 : [ -0.438266  1.20548  ]
2 : [ -2.18139   0.350042 ]
>>> print(sample.sort())
    [ X0        X1        ]
0 : [ -2.18139   0.350042 ]
1 : [ -0.438266  1.20548  ]
2 : [  0.608202 -1.26617  ]
>>> print(sample.sort(1))
0 : [ -1.26617  ]
1 : [  0.350042 ]
2 : [  1.20548  ]

In the following sample, the first component of all points is equal to -1, which creates a tie where the other components must be used to make a difference in the comparison. The algorithm sorts the points taking into account all the components in the points. This shows that the algorithm uses lexicographic ordering, since using only the first component would leave the sample unchanged.

>>> import openturns as ot
>>> sample = ot.Sample(
...     [[-1.0, 1.0, 1.0], [-1.0, 1.0, 0.0], [-1.0, 0.0, 1.0], [-1.0, 0.0, -1.0]]
... )
>>> print(sample)
0 : [ -1  1  1 ]
1 : [ -1  1  0 ]
2 : [ -1  0  1 ]
3 : [ -1  0 -1 ]
>>> print(sample.sort())
0 : [ -1  0 -1 ]
1 : [ -1  0  1 ]
2 : [ -1  1  0 ]
3 : [ -1  1  1 ]
sortAccordingToAComponent(index)

Sort the sample according to the given component.

Parameters:
marginal_index : int, 0 \leq i < n

The component to use for sorting the sample.

Returns:
sorted_sample : Sample

The sample sorted according to the given component.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(3)
>>> print(sample)
    [ X0        X1        ]
0 : [  0.608202 -1.26617  ]
1 : [ -0.438266  1.20548  ]
2 : [ -2.18139   0.350042 ]
>>> print(sample.sortAccordingToAComponent(0))
    [ X0        X1        ]
0 : [ -2.18139   0.350042 ]
1 : [ -0.438266  1.20548  ]
2 : [  0.608202 -1.26617  ]
sortAccordingToAComponentInPlace(index)

Sort the sample in place according to the given component.

Parameters:
marginal_index : int, 0 \leq i < n

The component to use for sorting the sample.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(3)
>>> print(sample)
    [ X0        X1        ]
0 : [  0.608202 -1.26617  ]
1 : [ -0.438266  1.20548  ]
2 : [ -2.18139   0.350042 ]
>>> sample.sortAccordingToAComponentInPlace(0)
>>> print(sample)
    [ X0        X1        ]
0 : [ -2.18139   0.350042 ]
1 : [ -0.438266  1.20548  ]
2 : [  0.608202 -1.26617  ]
sortInPlace()

Sort the sample in place.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(3)
>>> print(sample)
    [ X0        X1        ]
0 : [  0.608202 -1.26617  ]
1 : [ -0.438266  1.20548  ]
2 : [ -2.18139   0.350042 ]
>>> sample.sortInPlace()
>>> print(sample)
    [ X0        X1        ]
0 : [ -2.18139   0.350042 ]
1 : [ -0.438266  1.20548  ]
2 : [  0.608202 -1.26617  ]
sortUnique()

Sort the sample and remove duplicate points.

Returns:
unique_sample : Sample

The requested sorted sample with duplicate points removed.

Examples

>>> import openturns as ot
>>> sample = ot.Sample([[3, 0, 3], [1, 1, 0], [0, 2, 2], [1, 1, 0]])
>>> print(sample)
0 : [ 3 0 3 ]
1 : [ 1 1 0 ]
2 : [ 0 2 2 ]
3 : [ 1 1 0 ]
>>> print(sample.sortUnique())
0 : [ 0 2 2 ]
1 : [ 1 1 0 ]
2 : [ 3 0 3 ]
sortUniqueInPlace()

Sort the sample in place and remove duplicate points.

Examples

>>> import openturns as ot
>>> sample = ot.Sample([[3, 0, 3], [1, 1, 0], [0, 2, 2], [1, 1, 0]])
>>> print(sample)
0 : [ 3 0 3 ]
1 : [ 1 1 0 ]
2 : [ 0 2 2 ]
3 : [ 1 1 0 ]
>>> sample.sortUniqueInPlace()
>>> print(sample)
0 : [ 0 2 2 ]
1 : [ 1 1 0 ]
2 : [ 3 0 3 ]
split(index)

Truncate the sample.

It splits the sample before the given index and returns the remainder as a new sample.

Parameters:
index : int, 0 \leq i < m

The truncation index.

Returns:
remainder_sample : Sample

The remainder sample (everything that comes after the truncation index).

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(3)
>>> print(sample)
    [ X0        X1        ]
0 : [  0.608202 -1.26617  ]
1 : [ -0.438266  1.20548  ]
2 : [ -2.18139   0.350042 ]
>>> remainder_sample = sample.split(1)
>>> print(sample)
    [ X0        X1        ]
0 : [  0.608202 -1.26617  ]
>>> print(remainder_sample)
    [ X0        X1        ]
0 : [ -0.438266  1.20548  ]
1 : [ -2.18139   0.350042 ]
stack(sample)

Stack (horizontally) the given sample to the current one (in-place).

Parameters:
sample : Sample

The sample to stack; it must have the same size as the current sample.

Examples

>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> sample = ot.Normal(2).getSample(3)
>>> print(sample)
    [ X0        X1        ]
0 : [  0.608202 -1.26617  ]
1 : [ -0.438266  1.20548  ]
2 : [ -2.18139   0.350042 ]
>>> another_sample = ot.Normal(2).getSample(3)
>>> print(another_sample)
    [ X0        X1        ]
0 : [ -0.355007  1.43725  ]
1 : [  0.810668  0.793156 ]
2 : [ -0.470526  0.261018 ]
>>> sample.stack(another_sample)
>>> print(sample)
    [ X0        X1        X0        X1        ]
0 : [  0.608202 -1.26617  -0.355007  1.43725  ]
1 : [ -0.438266  1.20548   0.810668  0.793156 ]
2 : [ -2.18139   0.350042 -0.470526  0.261018 ]

Examples using the class

Customize your Metropolis-Hastings algorithm

Bayesian calibration of a computer code

Bayesian calibration of the flooding model

Gibbs sampling of the posterior distribution

Linear Regression with interval-censored observations

Bayesian calibration of hierarchical fission gas release models

Sampling from an unscaled probability density

Posterior sampling using a PythonDistribution

Calibration of the Chaboche mechanical model

Calibration of the deflection of a tube

Calibration of the flooding model

Calibration of the logistic model

Calibrate a parametric model: a quick-start guide to calibration

Calibration without observed inputs

Generate observations of the Chaboche mechanical model

Generate flooding model observations

Fitting a distribution with customized maximum likelihood

Get the asymptotic distribution of the estimators

Estimate a conditional quantile

Estimate a GEV on the Fremantle sea-levels data

Estimate a GEV on the Port Pirie sea-levels data

Estimate a GEV on race times data

Estimate a GEV on the Venice sea-levels data

Estimate a GPD on the Dow Jones Index data

Estimate a GPD on the daily rainfall data

Estimate a GPD on the Wooster temperature data

Estimate a multivariate distribution

Fit a non parametric distribution

Fit a parametric distribution

Fit an extreme value distribution

Fit a distribution by maximum likelihood

Model a singular multivariate distribution

Model a singular multivariate distribution

Bandwidth sensitivity in kernel smoothing

Bandwidth sensitivity in kernel smoothing

Fit a parametric copula

Fit a parametric copula

Estimate tail dependence coefficients on the wave-surge data

Estimate tail dependence coefficients on the wave-surge data

Estimate tail dependence coefficients on the wind data

Estimate tail dependence coefficients on the wind data

Fit a non parametric copula

Fit a non parametric copula

Estimate a spectral density function

Estimate a spectral density function

Estimate a stationary covariance function

Estimate a stationary covariance function

Visualize sensitivity

Visualize sensitivity

Visualize clouds

Visualize clouds

Visualize pairs between two samples

Visualize pairs between two samples

Visualize pairs

Visualize pairs

Estimate moments from sample

Estimate moments from sample

Import / export a sample via a CSV file

Import / export a sample via a CSV file

Build and validate a linear model

Build and validate a linear model

Estimate quantile confidence intervals from data

Estimate quantile confidence intervals from data

Estimate a confidence interval of a quantile

Estimate a confidence interval of a quantile

A quick start guide to the Point and Sample classes

A quick start guide to the Point and Sample classes

Randomize the lines of a Sample

Randomize the lines of a Sample

Estimate correlation coefficients

Estimate correlation coefficients

Sample manipulation

Sample manipulation

Link Pandas and OpenTURNS

Link Pandas and OpenTURNS

Sort a sample

Sort a sample

Compare unconditional and conditional histograms

Compare unconditional and conditional histograms

Draw a survival function

Draw a survival function

Compute squared SRC indices confidence intervals

Compute squared SRC indices confidence intervals

Draw the empirical CDF

Draw the empirical CDF

Draw an histogram

Draw an histogram

Test a discrete distribution

Test a discrete distribution

Select fitted distributions

Select fitted distributions

Kolmogorov-Smirnov : get the statistics distribution

Kolmogorov-Smirnov : get the statistics distribution

Kolmogorov-Smirnov : understand the p-value

Kolmogorov-Smirnov : understand the p-value

Kolmogorov-Smirnov : understand the statistics

Kolmogorov-Smirnov : understand the statistics

Use the Kolmogorov/Lilliefors test

Use the Kolmogorov/Lilliefors test

Draw the QQ-Plot

Draw the QQ-Plot

Test identical distributions

Test identical distributions

Test the copula

Test the copula

Test independence

Test independence

Test Normality

Test Normality

Function manipulation

Function manipulation

Logistic growth model

Logistic growth model

Create a process sample from a sample

Create a process sample from a sample

Value function

Value function

Vertex value function

Vertex value function

Define a function with a field output: the viscous free fall example

Define a function with a field output: the viscous free fall example

Define a connection function with a field output

Define a connection function with a field output

Defining Python and symbolic functions: a quick start introduction to functions

Defining Python and symbolic functions: a quick start introduction to functions

Getting started

Getting started

Gaussian Process Regression vs KrigingAlgorithm

Gaussian Process Regression vs KrigingAlgorithm

A quick start guide to graphs

A quick start guide to graphs

A quick start guide to contours

A quick start guide to contours

How to fill an area

How to fill an area

Plot the log-likelihood contours of a distribution

Plot the log-likelihood contours of a distribution

Metamodel of a field function

Metamodel of a field function

Viscous free fall: metamodel of a field function

Viscous free fall: metamodel of a field function

Create a linear least squares model

Create a linear least squares model

Mixture of experts

Mixture of experts

Export a metamodel

Export a metamodel

Create a general linear model metamodel

Create a general linear model metamodel

Create a linear model

Create a linear model

Over-fitting and model selection

Over-fitting and model selection

Perform stepwise regression

Perform stepwise regression

Gaussian Process Regression: multiple input dimensions

Gaussian Process Regression: multiple input dimensions

Gaussian Process Regression : quick-start

Gaussian Process Regression : quick-start

Gaussian Process-based active learning for reliability

Gaussian Process-based active learning for reliability

Advanced Gaussian process regression

Advanced Gaussian process regression

Gaussian Process Regression: choose an arbitrary trend

Gaussian Process Regression: choose an arbitrary trend

Gaussian Process Regression: choose a polynomial trend on the beam model

Gaussian Process Regression: choose a polynomial trend on the beam model

Gaussian Process Regression : cantilever beam model

Gaussian Process Regression : cantilever beam model

Gaussian Process Regression: surrogate model with continuous and categorical variables

Gaussian Process Regression: surrogate model with continuous and categorical variables

Gaussian Process Regression: choose a polynomial trend

Gaussian Process Regression: choose a polynomial trend

Gaussian process fitter: configure the optimization solver

Gaussian process fitter: configure the optimization solver

Gaussian Process Regression: use an isotropic covariance kernel

Gaussian Process Regression: use an isotropic covariance kernel

Gaussian process regression: draw the likelihood

Gaussian process regression: draw the likelihood

Gaussian Process Regression : generate trajectories from the metamodel

Gaussian Process Regression : generate trajectories from the metamodel

Gaussian Process Regression: metamodel of the Branin-Hoo function

Gaussian Process Regression: metamodel of the Branin-Hoo function

Example of multi output Gaussian Process Regression on the fire satellite model

Example of multi output Gaussian Process Regression on the fire satellite model

Sequentially adding new points to a Gaussian Process metamodel

Sequentially adding new points to a Gaussian Process metamodel

Gaussian Process Regression: propagate uncertainties

Gaussian Process Regression: propagate uncertainties

Polynomial chaos is sensitive to the degree

Polynomial chaos is sensitive to the degree

Fit a distribution from an input sample

Fit a distribution from an input sample

Create a polynomial chaos metamodel by integration on the cantilever beam

Create a polynomial chaos metamodel by integration on the cantilever beam

Create a sparse chaos by integration

Create a sparse chaos by integration

Conditional expectation of a polynomial chaos expansion

Conditional expectation of a polynomial chaos expansion

Polynomial chaos expansion cross-validation

Polynomial chaos expansion cross-validation

Apply a transform or inverse transform on your polynomial chaos

Apply a transform or inverse transform on your polynomial chaos

Validate a polynomial chaos

Validate a polynomial chaos

Create a polynomial chaos for the Ishigami function: a quick start guide to polynomial chaos

Create a polynomial chaos for the Ishigami function: a quick start guide to polynomial chaos

Compute grouped indices for the Ishigami function

Compute grouped indices for the Ishigami function

Compute Sobol’ indices confidence intervals

Compute Sobol' indices confidence intervals

Plot enumeration rules

Plot enumeration rules

Create a polynomial chaos metamodel from a data set

Create a polynomial chaos metamodel from a data set

Advanced polynomial chaos construction

Advanced polynomial chaos construction

Create a full or sparse polynomial chaos expansion

Create a full or sparse polynomial chaos expansion

Polynomial chaos exploitation

Polynomial chaos exploitation

Integrate a function with Gauss-Kronrod algorithm

Integrate a function with Gauss-Kronrod algorithm

Estimate a multivariate integral with IteratedQuadrature

Estimate a multivariate integral with IteratedQuadrature

Iterated Functions System

Iterated Functions System

Compute leave-one-out error of a polynomial chaos expansion

Compute leave-one-out error of a polynomial chaos expansion

Compute confidence intervals of a regression model from data

Compute confidence intervals of a regression model from data

Compute confidence intervals of a univariate noisy function

Compute confidence intervals of a univariate noisy function

Estimate extrema iteratively

Estimate extrema iteratively

Estimate moments iteratively

Estimate moments iteratively

Estimate threshold exceedance iteratively

Estimate threshold exceedance iteratively

EfficientGlobalOptimization examples

EfficientGlobalOptimization examples

Mix/max search and sensitivity from design

Mix/max search and sensitivity from design

Multi-objective optimization using Pagmo

Multi-objective optimization using Pagmo

Optimization of the Rastrigin test function

Optimization of the Rastrigin test function

Quick start guide to optimization

Quick start guide to optimization

Create a Joint by Conditioning distribution

Create a Joint by Conditioning distribution

Create and draw scalar distributions

Create and draw scalar distributions

Create and draw multivariate distributions

Create and draw multivariate distributions

Create a random mixture

Create a random mixture

Create a deconditioned random vector

Create a deconditioned random vector

Generate random variates by inverting the CDF

Generate random variates by inverting the CDF

Draw minimum volume level sets

Draw minimum volume level sets

Create a mixture of distributions

Create a mixture of distributions

Compare frequentist and Bayesian estimation

Compare frequentist and Bayesian estimation

Quick start guide to distributions

Quick start guide to distributions

Truncate a distribution

Truncate a distribution

Create a maximum entropy order statistics distribution

Create a maximum entropy order statistics distribution

Use the Ratio of Uniforms algorithm to sample a distribution

Use the Ratio of Uniforms algorithm to sample a distribution

Create and manipulate an ARMA process

Create and manipulate an ARMA process

Create a mesh

Create a mesh

Export a field to VTK

Export a field to VTK

Draw a field

Draw a field

Manipulate a time series

Manipulate a time series

Trend computation

Trend computation

Create a custom covariance model

Create a custom covariance model

Create a spectral model

Create a spectral model

Analyse the central tendency of a cantilever beam

Analyse the central tendency of a cantilever beam

Create a composite design of experiments

Create a composite design of experiments

Create a deterministic design of experiments

Create a deterministic design of experiments

Create a random design of experiments

Create a random design of experiments

Create a design of experiments with discrete and continuous variables

Create a design of experiments with discrete and continuous variables

Various design of experiments

Various design of experiments

Deterministic design of experiments

Deterministic design of experiments

Create a Gauss product design

Create a Gauss product design

LOLA-Voronoi sequential design of experiment

LOLA-Voronoi sequential design of experiment

Generate low discrepancy sequences

Generate low discrepancy sequences

Create mixed deterministic and probabilistic designs of experiments

Create mixed deterministic and probabilistic designs of experiments

Create a Monte Carlo design of experiments

Create a Monte Carlo design of experiments

Optimize an LHS design of experiments

Optimize an LHS design of experiments

The PlotDesign method

The PlotDesign method

Probabilistic design of experiments

Probabilistic design of experiments

Plot the Smolyak quadrature

Plot the Smolyak quadrature

Use the Smolyak quadrature

Use the Smolyak quadrature

Estimate a probability with Monte-Carlo on axial stressed beam: a quick start guide to reliability

Estimate a probability with Monte-Carlo on axial stressed beam: a quick start guide to reliability

Create a domain event

Create a domain event

Create a threshold event

Create a threshold event

Cross Entropy Importance Sampling

Cross Entropy Importance Sampling

Use the FORM - SORM algorithms

Use the FORM - SORM algorithms

Using the FORM - SORM algorithms on a nonlinear function

Using the FORM - SORM algorithms on a nonlinear function

Estimate a flooding probability

Estimate a flooding probability

Estimate a probability using Line Sampling

Estimate a probability using Line Sampling

Use the FORM algorithm in case of several design points

Use the FORM algorithm in case of several design points

Non parametric Adaptive Importance Sampling (NAIS)

Non parametric Adaptive Importance Sampling (NAIS)

Time variant system reliability problem

Time variant system reliability problem

Exploitation of simulation algorithm results

Exploitation of simulation algorithm results

Estimate a buckling probability

Estimate a buckling probability

Test the design point with the Strong Maximum Test

Test the design point with the Strong Maximum Test

Subset Sampling

Subset Sampling

Estimate Sobol indices on a field to point function

Estimate Sobol indices on a field to point function

Sobol’ sensitivity indices from chaos

Sobol' sensitivity indices from chaos

The HSIC sensitivity indices: the Ishigami model

The HSIC sensitivity indices: the Ishigami model

Use the ANCOVA indices

Use the ANCOVA indices

Parallel coordinates graph as sensitivity tool

Parallel coordinates graph as sensitivity tool

Sobol’ sensitivity indices using rank-based algorithm

Sobol' sensitivity indices using rank-based algorithm

Estimate Sobol’ indices for the Ishigami function by a sampling method: a quick start guide to sensitivity analysis

Estimate Sobol' indices for the Ishigami function by a sampling method: a quick start guide to sensitivity analysis

Estimate Sobol’ indices for a function with multivariate output

Estimate Sobol' indices for a function with multivariate output

Example of sensitivity analyses on the wing weight model

Example of sensitivity analyses on the wing weight model