Numerical differentiation is a method of estimating the derivative of a function at a given point from numerical values of the function. It is defined as follows:

Let $f$  be a given function that is only known at a number of isolated points. The problem of numerical differentiation is to compute an approximation to the derivative $f'$ of $f$ by suitable combinations of the known values of $f$.

A difference quotient is normally used to approximate the derivative. There are numerous ways to construct a difference quotient, and a Taylor series expansion can be used to measure the error term of each approximation. Additionally, an extrapolation method can be used to improve the accuracy by extrapolating the quotient toward the limit where the denominator becomes infinitesimally small. In this chapter, we shall also look at some important derivatives used in other branches of mathematics, such as optimization theory: the gradient vector, the Hessian matrix, and the Jacobian matrix.
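As a minimal sketch of these ideas (the function names are illustrative, not fixed by the text), the following shows a central difference quotient, whose error a Taylor expansion shows to be $O(h^2)$, and a Richardson-style extrapolation step that combines two step sizes to cancel that leading error term:

```python
import math

def central_difference(f, x, h=1e-5):
    """Approximate f'(x) by the central difference quotient.

    A Taylor expansion of f(x + h) and f(x - h) shows the
    truncation error of this formula is O(h^2).
    """
    return (f(x + h) - f(x - h)) / (2.0 * h)

def richardson(f, x, h=1e-3):
    """Improve the estimate by extrapolating toward h -> 0.

    Combining D(h) and D(h/2) cancels the O(h^2) error term,
    leaving an O(h^4) approximation (Richardson extrapolation).
    """
    d_h = central_difference(f, x, h)
    d_h2 = central_difference(f, x, h / 2.0)
    return (4.0 * d_h2 - d_h) / 3.0

# Example: d/dx sin(x) at x = 1 should be close to cos(1).
print(central_difference(math.sin, 1.0))
print(richardson(math.sin, 1.0))
```

For smooth functions the extrapolated value is typically several orders of magnitude more accurate than the raw quotient at the same step size, which is why such schemes are preferred when the function can be evaluated at arbitrary points.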