The Hessian matrix is the square matrix of the second-order partial derivatives of a scalar-valued multivariate function.
Mathematically, for a scalar-valued function \(f\) of \(n\) variables (so \(n\) is the dimension of the domain of \(f\)), the Hessian is the \(n \times n\) matrix
\[
H(f) = \begin{bmatrix}
\frac{\partial^2 f}{\partial x_1^2} & \frac{\partial^2 f}{\partial x_1\,\partial x_2} & \cdots & \frac{\partial^2 f}{\partial x_1\,\partial x_n} \\[1.5ex]
\frac{\partial^2 f}{\partial x_2\,\partial x_1} & \frac{\partial^2 f}{\partial x_2^2} & \cdots & \frac{\partial^2 f}{\partial x_2\,\partial x_n} \\[1.5ex]
\vdots & \vdots & \ddots & \vdots \\[1.5ex]
\frac{\partial^2 f}{\partial x_n\,\partial x_1} & \frac{\partial^2 f}{\partial x_n\,\partial x_2} & \cdots & \frac{\partial^2 f}{\partial x_n^2}
\end{bmatrix}
\]
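As a small illustrative example (not taken from the implementation itself), for \(f(x, y) = x^2 y\) the Hessian is
\[
H(f) = \begin{bmatrix} 2y & 2x \\ 2x & 0 \end{bmatrix},
\]
which is symmetric because the mixed partial derivatives \(\frac{\partial^2 f}{\partial x\,\partial y}\) and \(\frac{\partial^2 f}{\partial y\,\partial x}\) agree.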
This implementation computes the Hessian matrix numerically using the finite difference method.
We assume that f is twice continuously differentiable, so by the symmetry of second derivatives (Schwarz's theorem) the Hessian matrix is symmetric.
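As a minimal sketch of the finite difference approach (assuming a Python/NumPy setting; the function name `hessian`, the step size `h`, and the example function are illustrative, not the repository's actual API), each entry \(\frac{\partial^2 f}{\partial x_i\,\partial x_j}\) can be approximated with a central difference:

```python
import numpy as np


def hessian(f, x, h=1e-5):
    """Approximate the Hessian of f at point x using central differences.

    f maps an (n,) array to a scalar; the result is an (n, n) array.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    hess = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n)
            e_j = np.zeros(n)
            e_i[i] = h
            e_j[j] = h
            # Central difference approximation of d^2 f / (dx_i dx_j)
            hess[i, j] = (
                f(x + e_i + e_j) - f(x + e_i - e_j)
                - f(x - e_i + e_j) + f(x - e_i - e_j)
            ) / (4 * h * h)
    return hess


if __name__ == "__main__":
    # f(x, y) = x^2 * y has Hessian [[2y, 2x], [2x, 0]]
    f = lambda v: v[0] ** 2 * v[1]
    print(hessian(f, [1.0, 2.0]))  # approximately [[4, 2], [2, 0]]
```

With this central-difference scheme the approximation error is \(O(h^2)\), and the computed matrix is symmetric up to floating-point rounding, matching the symmetry guaranteed by Schwarz's theorem.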