# Least squares polynomial response surface

Instead of seeking a *local* approximation of the model response $h$ around a given set $x_0$ of input parameters, one may seek a *global* approximation of $h$ over its whole domain of definition. A common choice to this end is global polynomial approximation. For the sake of simplicity, a *scalar* model response $y = h(x)$, with input vector $x = (x_1, \dots, x_{n_X})$, will be considered from now on. Nonetheless, the following derivations hold for a vector-valued response.

Two types of global polynomial approximation are considered:

- a linear function, i.e. a polynomial of degree one;
- a quadratic function, i.e. a polynomial of degree two.

The linear approximation of the model response reads:

$$\hat{y} \, = \, a_0 \, + \, \sum_{i=1}^{n_X} a_i \, x_i$$

and the quadratic approximation reads:

$$\hat{y} \, = \, a_0 \, + \, \sum_{i=1}^{n_X} a_i \, x_i \, + \, \sum_{1 \leq i \leq j \leq n_X} a_{i,j} \, x_i \, x_j$$

where $\{a_j, a_{i,j}\}$ is a set of unknown coefficients. Both cases may be recast using the single formalism:

$$\hat{y} \, = \, \sum_{j=0}^{P-1} a_j \, \psi_j(x)$$

where $P$ denotes the number of terms, which is equal to $n_X + 1$ (resp. to $1 + 3 n_X / 2 + n_X^2 / 2$) when using a linear (resp. a quadratic) approximation, and the family $\{\psi_j, \, j = 0, \dots, P-1\}$ gathers the constant monomial $1$, the monomials of degree one $x_i$ and possibly the cross-terms $x_i x_j$ ($i < j$) as well as the monomials of degree two $x_i^2$. Using the vector notation $a = (a_0, \dots, a_{P-1})^{\mathsf{T}}$ and $\psi(x) = (\psi_0(x), \dots, \psi_{P-1}(x))^{\mathsf{T}}$, this rewrites:

$$\hat{y} \, = \, a^{\mathsf{T}} \, \psi(x)$$
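As an illustration of the monomial family above, the following sketch (plain NumPy, with a hypothetical helper name `quadratic_basis`) evaluates the quadratic basis at a point and checks the term count $P = 1 + 3 n_X / 2 + n_X^2 / 2$:

```python
import numpy as np

def quadratic_basis(x):
    """Evaluate the monomial family (psi_j) at a point x: the constant 1,
    the degree-one terms x_i, and the squares/cross-terms x_i * x_j (i <= j)."""
    x = np.asarray(x, dtype=float)
    terms = [1.0]          # constant monomial
    terms += list(x)       # monomials of degree one
    n = len(x)
    for i in range(n):
        for j in range(i, n):
            terms.append(x[i] * x[j])  # squares (i == j) and cross-terms (i < j)
    return np.array(terms)

# With n_X = 3 inputs, P = 1 + 3*3/2 + 3**2/2 = 10 terms.
print(len(quadratic_basis([0.5, -1.0, 2.0])))  # -> 10
```

Dropping the inner loop leaves the linear basis with $P = n_X + 1$ terms.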

A *global* approximation of the model response over its whole definition domain is sought. To this end, the coefficients $a_j$ may be computed using a least squares regression approach. In this context, an experimental design $\{x^{(1)}, \dots, x^{(N)}\}$, i.e. a set of realizations of the input parameters, is required, as well as the corresponding model evaluations $\{y^{(1)}, \dots, y^{(N)}\}$. The coefficients are obtained by minimizing the sum of squared residuals:

$$\hat{a} \, = \, \arg\min_{a} \, \sum_{i=1}^{N} \left( y^{(i)} - a^{\mathsf{T}} \, \psi\big(x^{(i)}\big) \right)^2$$

The solution is given by:

$$\hat{a} \, = \, \left( \Psi^{\mathsf{T}} \Psi \right)^{-1} \Psi^{\mathsf{T}} \, y$$

where $y = (y^{(1)}, \dots, y^{(N)})^{\mathsf{T}}$ and $\Psi$ is the $N \times P$ information matrix with entries:

$$\Psi_{i,j} \, = \, \psi_j\big(x^{(i)}\big) \, , \quad i = 1, \dots, N \, , \quad j = 0, \dots, P-1$$
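The normal-equation solution can be sketched in plain NumPy. The model $h$ and the design below are made up for illustration; since $h$ is exactly linear, the fitted coefficients recover it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar model with n_X = 2 inputs (illustration only).
def h(x):
    return 1.0 + 2.0 * x[0] - 3.0 * x[1]

# Experimental design: N = 20 input realizations and the model evaluations.
X = rng.uniform(-1.0, 1.0, size=(20, 2))
y = np.array([h(x) for x in X])

# Information matrix Psi: row i holds psi(x^(i)) for the linear basis
# (1, x_1, x_2), so P = n_X + 1 = 3.
Psi = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1]])

# a_hat = (Psi^T Psi)^{-1} Psi^T y, via a linear solve rather than an
# explicit inverse.
a_hat = np.linalg.solve(Psi.T @ Psi, Psi.T @ y)
print(np.round(a_hat, 6))  # recovers [1, 2, -3]
```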

It is clear that the above equation is only valid for a full-rank information matrix $\Psi$, i.e. when $\Psi^{\mathsf{T}} \Psi$ is invertible. A necessary condition is that the size $N$ of the experimental design is not less than the number $P$ of coefficients to estimate. In practice, it is not recommended to invert $\Psi^{\mathsf{T}} \Psi$ directly, since the solution may be particularly sensitive to ill-conditioning of this matrix. The least squares problem is rather solved using more robust numerical methods such as the singular value decomposition (SVD) or the QR decomposition.
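Both robust routes are available in NumPy, as the following sketch shows on a synthetic noisy design (the coefficients and noise level are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 30, 3
# Synthetic design matrix (constant column + 2 random inputs) and noisy data.
Psi = np.column_stack([np.ones(N), rng.uniform(-1.0, 1.0, size=(N, P - 1))])
y = Psi @ np.array([1.0, 2.0, -3.0]) + 0.01 * rng.standard_normal(N)

# QR route: Psi = Q R, then solve the triangular system R a = Q^T y.
Q, R = np.linalg.qr(Psi)
a_qr = np.linalg.solve(R, Q.T @ y)

# SVD route: numpy's lstsq solves the least squares problem via an SVD.
a_svd, *_ = np.linalg.lstsq(Psi, y, rcond=None)

print(np.allclose(a_qr, a_svd))  # -> True: both routes agree
```

Neither route forms $\Psi^{\mathsf{T}} \Psi$, which is what avoids squaring the condition number of $\Psi$.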


References:

Å. Björck, 1996, *Numerical Methods for Least Squares Problems*, SIAM Press, Philadelphia, PA.