Linear and Quadratic Taylor Expansions

The approximation of the model response \underline{y} = h(\underline{x}) around a specific point \underline{x}_0 = (x_{0,1},\dots,x_{0,n_{X}}) of the input parameters may be of interest. One may then replace h with its Taylor expansion at the point \underline{x}_0. Hence h is replaced with a first- or second-order polynomial \widehat{h} whose evaluation is inexpensive, allowing the analyst to apply the uncertainty propagation methods.
We consider the first- and second-order Taylor expansions around \underline{x} = \underline{x}_0. The first-order expansion reads:

\underline{y} \, \, \approx \, \, \widehat{h}(\underline{x}) \, \, = \, \, h(\underline{x}_0) \, + \,  \sum_{i=1}^{n_{X}} \; \frac{\partial h}{\partial x_i}(\underline{x}_0).\left(x_i - x_{0,i} \right)

Introducing a vector notation, the previous equation can be rewritten as:

\underline{y} \, \, \approx \, \,  \underline{y}_0 \, + \, \underline{\underline{L}} \: \left(\underline{x}-\underline{x}_0\right)

where:

  • \underline{y}_0 = (y_{0,1} , \dots, y_{0,n_Y})^{\textsf{T}} = h(\underline{x}_0) is the vector model response evaluated at \underline{x}_0;
  • \underline{x} is the current set of input parameters;
  • \underline{\underline{L}} = \left( \frac{\partial y_{0,j}}{\partial x_i} \, \, , \, \, i=1,\ldots, n_X \, \, , \, \, j=1, \ldots, n_Y \right) is the Jacobian matrix of h evaluated at \underline{x}_0; for the product \underline{\underline{L}} \left(\underline{x}-\underline{x}_0\right) to be well defined, its row j holds the partial derivatives of the j-th output component.
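As an illustration, the first-order expansion above can be sketched in plain Python, approximating the Jacobian \underline{\underline{L}} by forward finite differences. The toy model h, the expansion point, and the step eps below are illustrative assumptions, not part of the original text:

```python
import math

def fd_jacobian(h, x0, eps=1e-6):
    """Forward-difference Jacobian of h at x0, stored row-wise (n_Y x n_X)."""
    y0 = h(x0)
    jac = [[0.0] * len(x0) for _ in y0]
    for i in range(len(x0)):
        xp = list(x0)
        xp[i] += eps  # perturb one input at a time
        yp = h(xp)
        for j in range(len(y0)):
            jac[j][i] = (yp[j] - y0[j]) / eps
    return y0, jac

def linear_taylor(h, x0):
    """Return hhat(x) = h(x0) + L (x - x0), the first-order surrogate."""
    y0, L = fd_jacobian(h, x0)
    def hhat(x):
        return [y0[j] + sum(L[j][i] * (x[i] - x0[i]) for i in range(len(x0)))
                for j in range(len(y0))]
    return hhat

# Hypothetical toy model: y = (x1 * x2, sin(x1)), expanded around (1, 2).
h = lambda x: [x[0] * x[1], math.sin(x[0])]
hhat = linear_taylor(h, [1.0, 2.0])
```

The surrogate hhat is exact at \underline{x}_0 and inexpensive to evaluate anywhere else, which is the point of replacing h in a propagation study.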

The second-order expansion reads:

\begin{aligned}
    \underline{y} \, \, \approx \, \, \widehat{h}(\underline{x}) \, \, = \, \,
    h(\underline{x}_0) \, +  \, \sum_{i=1}^{n_{X}} \;  \frac{\partial h}{\partial x_i}(\underline{x}_0).\left(x_i - x_{0,i} \right) \, + \, \frac{1}{2} \; \sum_{i,j=1}^{n_X} \;  \frac{\partial^2 h}{\partial x_i \partial x_j}(\underline{x}_0).\left(x_i - x_{0,i} \right).\left(x_j - x_{0,j} \right)
  \end{aligned}

Introducing a vector notation, the previous equation can be rewritten as:

\underline{y} \, \, \approx  \, \,  \underline{y}_0 \, + \,  \underline{\underline{L}} \: \left(\underline{x}-\underline{x}_0\right) \, + \,  \frac{1}{2} \; \left\langle \left\langle\underline{\underline{\underline{Q}}}\:,\underline{x}-\underline{x}_0 \right\rangle,\:\underline{x}-\underline{x}_0 \right\rangle

where \underline{\underline{\underline{Q}}} = \left\{ \frac{\partial^2 y_{0,k}}{\partial x_i \partial x_j} \, \, , \, \, i,j=1,\ldots, n_X \, \, , \, \, k=1, \ldots, n_Y \right\} is the Hessian tensor of h evaluated at \underline{x}_0, gathering the Hessian matrices of the n_Y output components.
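The second-order surrogate can be sketched similarly, here for a single output component (one slice y_k of the Hessian tensor at a time), with central finite differences for the gradient and Hessian. The toy model, the expansion point, and the step eps are illustrative assumptions:

```python
def quadratic_taylor(h, x0, eps=1e-4):
    """Second-order surrogate hhat(x) = h(x0) + grad.dx + 0.5 dx^T H dx
    for a scalar-output model h, using central finite differences."""
    n = len(x0)
    y0 = h(x0)

    def shift(i, s):
        x = list(x0)
        x[i] += s
        return x

    # Central-difference gradient at x0.
    grad = [(h(shift(i, eps)) - h(shift(i, -eps))) / (2 * eps) for i in range(n)]

    # Central-difference Hessian at x0 (four-point stencil per entry).
    hess = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            xpp = list(x0); xpp[i] += eps; xpp[j] += eps
            xpm = list(x0); xpm[i] += eps; xpm[j] -= eps
            xmp = list(x0); xmp[i] -= eps; xmp[j] += eps
            xmm = list(x0); xmm[i] -= eps; xmm[j] -= eps
            hess[i][j] = (h(xpp) - h(xpm) - h(xmp) + h(xmm)) / (4 * eps * eps)

    def hhat(x):
        dx = [x[i] - x0[i] for i in range(n)]
        lin = sum(grad[i] * dx[i] for i in range(n))
        quad = 0.5 * sum(hess[i][j] * dx[i] * dx[j]
                         for i in range(n) for j in range(n))
        return y0 + lin + quad
    return hhat

# Hypothetical toy scalar model: y = x1^2 * x2, expanded around (1, 2).
h = lambda x: x[0] ** 2 * x[1]
hhat = quadratic_taylor(h, [1.0, 2.0])
```

The quadratic surrogate captures the curvature of h around \underline{x}_0, so it stays accurate over a larger neighborhood than the linear one, at the cost of the extra second-derivative evaluations.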

Examples: