34 Finite differences

Definition of a derivative:

\[ \underbrace{\dot{f} = f'(t) = \frac{df(t)}{dt}}_{\text{same thing}} = \lim_{\Delta t \rightarrow 0} \frac{f(t+\Delta t) - f(t)}{\Delta t}. \]

Numerically, we can approximate the derivative \(f'(t)\) of a time series \(f(t)\) as

\[ \frac{df(t)}{dt} = \frac{f(t+\Delta t) - f(t)}{\Delta t} + \mathcal{O}(\Delta t). \tag{34.1}\]

The expression \(\mathcal{O}(\Delta t)\) means that the error associated with the approximation is proportional to \(\Delta t\). This is called “Big O notation”.

The expression above is called the two-point forward difference formula. Likewise, we can define the two-point backward difference formula:

\[ \frac{df(t)}{dt} = \frac{f(t) - f(t-\Delta t)}{\Delta t} + \mathcal{O}(\Delta t). \tag{34.2}\]
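
To make the one-sided formulas concrete, here is a minimal sketch (in Python with NumPy, which is also used below for np.gradient) that applies Equation 34.1 and Equation 34.2 to a test function whose derivative is known in closed form; the choice of \(\sin\), the point \(t = 1\), and the step \(\Delta t = 10^{-3}\) are arbitrary illustrations.

```python
import numpy as np

# Approximate the derivative of sin(t) at t = 1; the exact answer is cos(1).
f = np.sin
t, dt = 1.0, 1e-3

forward = (f(t + dt) - f(t)) / dt    # Equation 34.1
backward = (f(t) - f(t - dt)) / dt   # Equation 34.2

exact = np.cos(t)
print(abs(forward - exact))   # error is roughly dt/2 * |f''(t)|, i.e. O(dt)
print(abs(backward - exact))  # same order of magnitude
```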

If we add Equation 34.1 and Equation 34.2, we get:

\[ \begin{aligned} 2\frac{df(t)}{dt} &= \frac{f(t+\Delta t) - \cancel{f(t)}}{\Delta t} + \frac{\cancel{f(t)} - f(t-\Delta t)}{\Delta t} \\ &= \frac{f(t+\Delta t) - f(t-\Delta t)}{\Delta t}. \end{aligned} \tag{34.3}\]

Dividing both sides by 2 gives the two-point central difference formula:

\[ \frac{df(t)}{dt} = \frac{f(t+\Delta t) - f(t-\Delta t)}{2\Delta t} + \mathcal{O}(\Delta t^2). \tag{34.4}\]

Two things are worth mentioning about the approximation above:

  1. it is balanced, that is, there is no preference for the future over the past.
  2. its error is proportional to \(\Delta t^2\), so it is a lot more accurate than the unbalanced approximations (see the numerical check after this list) :)
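
The sketch below uses the same arbitrary test function, \(\sin\) at \(t = 1\), to check both error orders numerically: halving \(\Delta t\) roughly halves the forward-difference error but reduces the central-difference error by about a factor of four.

```python
import numpy as np

f, t, exact = np.sin, 1.0, np.cos(1.0)

for dt in (1e-2, 5e-3, 2.5e-3):
    err_fwd = abs((f(t + dt) - f(t)) / dt - exact)             # O(dt)
    err_cen = abs((f(t + dt) - f(t - dt)) / (2 * dt) - exact)  # O(dt^2)
    print(f"dt={dt:.4g}  forward error={err_fwd:.2e}  central error={err_cen:.2e}")
```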

To understand why the error is proportional to \(\Delta t^2\), one can subtract the Taylor expansion of \(f(t-\Delta t)\) from the Taylor expansion of \(f(t+\Delta t)\). See this, pages 3 and 4.
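
In outline, the two expansions are

\[ \begin{aligned} f(t+\Delta t) &= f(t) + \Delta t\, f'(t) + \frac{\Delta t^2}{2} f''(t) + \frac{\Delta t^3}{6} f'''(t) + \mathcal{O}(\Delta t^4), \\ f(t-\Delta t) &= f(t) - \Delta t\, f'(t) + \frac{\Delta t^2}{2} f''(t) - \frac{\Delta t^3}{6} f'''(t) + \mathcal{O}(\Delta t^4), \end{aligned} \]

and subtracting them cancels \(f(t)\) and the \(f''(t)\) term, leaving

\[ \frac{f(t+\Delta t) - f(t-\Delta t)}{2\Delta t} = f'(t) + \frac{\Delta t^2}{6} f'''(t) + \mathcal{O}(\Delta t^4) = f'(t) + \mathcal{O}(\Delta t^2). \]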

The function np.gradient calculates the derivative using the central difference for points in the interior of the array, and uses the forward (backward) difference for the derivative at the beginning (end) of the array.
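
A minimal usage sketch (the uniformly sampled signal below is just an illustrative choice):

```python
import numpy as np

# Sample sin(t) on a uniform grid and differentiate it numerically.
t = np.linspace(0, 2 * np.pi, 100)
dt = t[1] - t[0]
f = np.sin(t)

# Central differences in the interior, one-sided differences at the two ends.
dfdt = np.gradient(f, dt)

# Worst-case error against the exact derivative cos(t).
print(np.max(np.abs(dfdt - np.cos(t))))
```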

The term “gradient” usually refers to a first derivative with respect to space, denoted \(\nabla f(x)=\frac{df(x)}{dx}\) in one dimension. However, it doesn’t really matter whether we call the independent variable \(x\) or \(t\): the derivative operator is exactly the same.

Check out this nice example.