# Kriging

Kriging (also known as Gaussian process regression) is a Bayesian technique that aims at approximating functions (most often in order to surrogate them because they are expensive to evaluate). In the following it is assumed that we aim at building a surrogate model of a scalar-valued model $\mathcal{M}: \mathcal{D} \subset \mathbb{R}^d \to \mathbb{R}$. Note that the implementation of Kriging deals with vector-valued functions ($\mathcal{M}: \mathcal{D} \to \mathbb{R}^p$), without simply looping over each output. It is also assumed that the model is evaluated over a design of experiments in order to produce a set of observations gathered in the following dataset: $\left((\boldsymbol{x}_i, y_i)\right)_{1 \le i \le n}$ with $y_i = \mathcal{M}(\boldsymbol{x}_i)$. Ultimately Kriging aims at producing a predictor (also known as a response surface or metamodel) denoted as $\hat{\mathcal{M}}$.

We put the following Gaussian process prior on the model $\mathcal{M}$:

$$Y(\boldsymbol{x}) = \boldsymbol{f}(\boldsymbol{x})^\top \boldsymbol{\beta} + Z(\boldsymbol{x})$$

where:

$\boldsymbol{f}(\boldsymbol{x})^\top \boldsymbol{\beta}$ is a generalized linear model based upon a functional basis $(f_j)_{1 \le j \le m}$ and a vector of coefficients $\boldsymbol{\beta} \in \mathbb{R}^m$,

$Z(\boldsymbol{x})$ is a zero-mean stationary Gaussian process whose covariance function reads:

$$\mathrm{Cov}\left(Z(\boldsymbol{x}), Z(\boldsymbol{x}')\right) = \sigma^2 \, \rho(\boldsymbol{x} - \boldsymbol{x}'; \boldsymbol{\theta})$$

where $\sigma^2 > 0$ is the variance and $\rho$ is the correlation function that solely depends on the Manhattan distance between the input points $\boldsymbol{x}$ and $\boldsymbol{x}'$ and a vector of parameters $\boldsymbol{\theta}$.
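As an illustration of such a correlation function, the following sketch (a hypothetical example kernel, not the library's implementation) uses an anisotropic exponential correlation, which depends only on the componentwise absolute differences $|x_k - x'_k|$ scaled by the parameters $\theta_k$:

```python
import numpy as np

def rho(x, x_prime, theta):
    """Anisotropic exponential correlation (hypothetical example kernel):
    depends solely on the componentwise distances |x - x'| scaled by theta."""
    h = np.abs(np.asarray(x, dtype=float) - np.asarray(x_prime, dtype=float))
    return np.exp(-np.sum(h / theta))

theta = np.array([0.5, 2.0])
print(rho([0.0, 0.0], [0.0, 0.0], theta))  # correlation of a point with itself: 1.0
print(rho([0.0, 0.0], [1.0, 1.0], theta))  # decays with distance: exp(-(1/0.5 + 1/2)) = exp(-2.5)
```

Any valid choice must equal 1 at zero distance and decay as the points move apart; the parameters $\boldsymbol{\theta}$ control how fast the correlation decays along each input dimension.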

Under the Gaussian process prior assumption, the observations $\boldsymbol{y} = (y_1, \dots, y_n)^\top$ and a prediction $Y(\boldsymbol{x})$ at some unobserved input $\boldsymbol{x}$ are jointly normally distributed:

$$\begin{pmatrix} \boldsymbol{y} \\ Y(\boldsymbol{x}) \end{pmatrix} \sim \mathcal{N}_{n+1} \left( \begin{pmatrix} \mathbf{F} \boldsymbol{\beta} \\ \boldsymbol{f}(\boldsymbol{x})^\top \boldsymbol{\beta} \end{pmatrix},\; \sigma^2 \begin{pmatrix} \mathbf{R} & \boldsymbol{r}(\boldsymbol{x}) \\ \boldsymbol{r}(\boldsymbol{x})^\top & 1 \end{pmatrix} \right)$$

where:

$\mathbf{F} = \left[ f_j(\boldsymbol{x}_i) \right]_{i = 1, \dots, n;\, j = 1, \dots, m}$ is the regression matrix,

$\mathbf{R} = \left[ \rho(\boldsymbol{x}_i - \boldsymbol{x}_j; \boldsymbol{\theta}) \right]_{i, j = 1, \dots, n}$ is the observations’ correlation matrix, and:

$\boldsymbol{r}(\boldsymbol{x}) = \left( \rho(\boldsymbol{x} - \boldsymbol{x}_i; \boldsymbol{\theta}) \right)_{i = 1, \dots, n}$ is the vector of cross-correlations between the prediction and the observations.
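To make these definitions concrete, here is a small NumPy sketch (assuming, purely for illustration, a constant trend basis $f(\boldsymbol{x}) = 1$ and an exponential correlation $\rho(\boldsymbol{h}; \boldsymbol{\theta}) = \exp(-\sum_k |h_k| / \theta_k)$) that assembles $\mathbf{F}$, $\mathbf{R}$ and $\boldsymbol{r}(\boldsymbol{x})$:

```python
import numpy as np

def rho(h, theta):
    # Exponential correlation of the componentwise differences h (assumed kernel).
    return np.exp(-np.sum(np.abs(h) / theta, axis=-1))

def build_matrices(X, x, theta):
    """X: (n, d) observed inputs; x: (d,) prediction point."""
    n = X.shape[0]
    F = np.ones((n, 1))                            # regression matrix (constant basis)
    R = rho(X[:, None, :] - X[None, :, :], theta)  # (n, n) observations' correlations
    r = rho(X - x, theta)                          # (n,) cross-correlations with x
    return F, R, r

X = np.array([[0.0], [0.5], [1.0]])                # toy 1-d design of experiments
F, R, r = build_matrices(X, np.array([0.25]), np.array([1.0]))
# R is symmetric with a unit diagonal; the entries of r lie in (0, 1].
```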

As such, the Kriging predictor is defined as the following conditional distribution:

$$\hat{Y}(\boldsymbol{x}) = \left[ Y(\boldsymbol{x}) \,\middle|\, \boldsymbol{y}, \hat{\boldsymbol{\theta}}, \hat{\sigma}^2 \right]$$

where $\hat{\boldsymbol{\theta}}$ and $\hat{\sigma}^2$ are the maximum likelihood estimates of the correlation parameters $\boldsymbol{\theta}$ and variance $\sigma^2$ (see references).

It can be shown (see references) that the predictor is also Gaussian:

$$\hat{Y}(\boldsymbol{x}) \sim \mathcal{N}_1 \left( \mu_{\hat{Y}}(\boldsymbol{x}), \sigma^2_{\hat{Y}}(\boldsymbol{x}) \right)$$

with mean:

$$\mu_{\hat{Y}}(\boldsymbol{x}) = \boldsymbol{f}(\boldsymbol{x})^\top \hat{\boldsymbol{\beta}} + \boldsymbol{r}(\boldsymbol{x})^\top \mathbf{R}^{-1} \left( \boldsymbol{y} - \mathbf{F} \hat{\boldsymbol{\beta}} \right)$$

where $\hat{\boldsymbol{\beta}}$ is the generalized least squares solution of the underlying linear regression problem:

$$\hat{\boldsymbol{\beta}} = \left( \mathbf{F}^\top \mathbf{R}^{-1} \mathbf{F} \right)^{-1} \mathbf{F}^\top \mathbf{R}^{-1} \boldsymbol{y}$$

and variance:

$$\sigma^2_{\hat{Y}}(\boldsymbol{x}) = \hat{\sigma}^2 \left( 1 - \boldsymbol{r}(\boldsymbol{x})^\top \mathbf{R}^{-1} \boldsymbol{r}(\boldsymbol{x}) + \boldsymbol{u}(\boldsymbol{x})^\top \left( \mathbf{F}^\top \mathbf{R}^{-1} \mathbf{F} \right)^{-1} \boldsymbol{u}(\boldsymbol{x}) \right)$$

where:

$$\boldsymbol{u}(\boldsymbol{x}) = \mathbf{F}^\top \mathbf{R}^{-1} \boldsymbol{r}(\boldsymbol{x}) - \boldsymbol{f}(\boldsymbol{x})$$
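Putting the formulas together, a minimal self-contained NumPy sketch of the predictor follows (with hypothetical simplifications: a constant trend, an exponential correlation $\rho(h; \theta) = \exp(-|h|/\theta)$, and fixed values of $\theta$ and $\sigma^2$ in place of the maximum likelihood estimates):

```python
import numpy as np

def kriging_predict(X, y, x, theta, sigma2):
    """Mean and variance of the Kriging predictor at x (hypothetical choices:
    constant trend, exponential correlation, fixed theta and sigma2)."""
    n = X.shape[0]
    F = np.ones((n, 1))                                   # regression matrix
    f = np.ones(1)                                        # trend basis at x
    R = np.exp(-np.sum(np.abs(X[:, None, :] - X[None, :, :]) / theta, axis=-1))
    r = np.exp(-np.sum(np.abs(X - x) / theta, axis=-1))   # cross-correlations
    FtRiF = F.T @ np.linalg.solve(R, F)                   # F^T R^{-1} F
    beta = np.linalg.solve(FtRiF, F.T @ np.linalg.solve(R, y))  # GLS estimate
    mu = f @ beta + r @ np.linalg.solve(R, y - F @ beta)  # predictor mean
    u = F.T @ np.linalg.solve(R, r) - f
    var = sigma2 * (1.0 - r @ np.linalg.solve(R, r)
                    + u @ np.linalg.solve(FtRiF, u))      # predictor variance
    return mu, var

X = np.array([[0.0], [0.5], [1.0]])   # toy design of experiments
y = np.array([1.0, 0.2, 0.9])
mu, var = kriging_predict(X, y, X[1], theta=np.array([1.0]), sigma2=1.0)
# at an observed point the predictor interpolates: mu ~ y[1] and var ~ 0
```

At an observed point $\boldsymbol{x}_i$ the mean reproduces the data ($\mu_{\hat{Y}}(\boldsymbol{x}_i) = y_i$) and the variance vanishes, which is the characteristic interpolating behavior of a noise-free Kriging predictor.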

Kriging may also be referred to as *Gaussian process regression*.

API:

See `KrigingAlgorithm`


References:

S. Lophaven, H. Nielsen and J. Sondergaard, 2002, “DACE, A Matlab kriging toolbox”, Technical University of Denmark. http://www2.imm.dtu.dk/projects/dace/

T. Santner, B. Williams and W. Notz, 2003, “The design and analysis of computer experiments”, Springer, New York.

C. Rasmussen and C. Williams, 2006, T. Dietterich (Ed.), “Gaussian processes for machine learning”, MIT Press.