Numerical Analysis/Differentiation/Examples


When deriving a finite difference approximation of the jth derivative of a function f : \mathbb{R} \rightarrow \mathbb{R}, we wish to find a_1, a_2, ..., a_n \in \mathbb{R} and b_1, b_2, ..., b_n \in \mathbb{R} such that

f^{(j)}(x_0) = h^{-j}\sum_{i=1}^{n}a_i f(x_0 + b_i h) + O(h^k) \text{ as } h \to 0

or, equivalently,

h^{-j}\sum_{i=1}^{n}a_i f(x_0 + b_i h) = f^{(j)}(x_0) + O(h^k) \text{ as } h \to 0

where O(h^k) denotes the error, that is, the difference between the exact value and the approximation, expressed in Big-O notation. Because h is presumed to be small, a larger value of k gives a more accurate approximation.

A general method for finding the coefficients is to generate the Taylor expansion of h^{-j}\sum_{i=1}^{n}a_i f(x_0 + b_i h) and choose a_1, a_2, ..., a_n and b_1, b_2, ..., b_n such that f^{(j)}(x_0) and the remainder term are the only non-zero terms. If there are no such coefficients, a smaller value for k must be chosen.
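To make the procedure concrete, consider the case j = 1, n = 2 with a_1 = 1/2, a_2 = -1/2, b_1 = 1, b_2 = -1 (this is the classical central difference; the derivation below is a standard worked example added for illustration). Taylor's theorem gives

f(x_0 + h) = f(x_0) + h f'(x_0) + \frac{h^2}{2} f''(x_0) + \frac{h^3}{6} f'''(x_0) + O(h^4)

f(x_0 - h) = f(x_0) - h f'(x_0) + \frac{h^2}{2} f''(x_0) - \frac{h^3}{6} f'''(x_0) + O(h^4)

so subtracting the second expansion from the first and dividing by 2h yields

\frac{f(x_0 + h) - f(x_0 - h)}{2 h} = f'(x_0) + \frac{h^2}{6} f'''(x_0) + O(h^3) = f'(x_0) + O(h^2) \text{ as } h \to 0

The even-order terms cancel, so the method attains k = 2 with only two function evaluations.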

For a function of m variables g:\mathbb{R}^m \rightarrow \mathbb{R}, the procedure is similar, except x_0, b_1, b_2, ..., b_n are replaced by points in \mathbb{R}^m and the multivariate extension of Taylor's theorem is used.


Single-Variable

In all single-variable examples, x_0 \in \mathbb{R} is an arbitrary point, f : \mathbb{R} \rightarrow \mathbb{R} is an arbitrary function, and h \in \mathbb{R} is small. Additionally, let f be 5 times continuously differentiable on \mathbb{R} unless stated otherwise.

First Derivative

Find a, b, c \in \mathbb{R} such that \frac{a f(x_0 + h) + b f(x_0 + c h)}{h} best approximates f'(x_0).

Let f : \mathbb{R} \rightarrow \mathbb{R} be 42 times continuously differentiable on \mathbb{R}. Find the largest n \in \mathbb{N} such that

\frac{df}{dx}(x_0) = \frac{-f(x_0 + 2 h) + 8 f(x_0 + h) - 8 f(x_0 - h) + f(x_0 - 2 h)}{12 h} + O(h^n) \text{ as } h \to 0

In other words, find the order of the error of the method.
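The five-point stencil from the exercise above translates directly into C++; the sketch below is illustrative (the function name derivative4 and the default step size are choices made here, not part of the exercise).

```cpp
#include <cassert>
#include <cmath>

// Central difference approximation of f'(x0) using the five-point
// stencil (-1, 8, -8, 1) / 12h from the exercise above.
double derivative4 (double (*f)(double), double x0, double h = 0.01) {
  return (-f(x0 + 2*h) + 8*f(x0 + h) - 8*f(x0 - h) + f(x0 - 2*h)) / (12*h);
}
```

Because the error term of this stencil involves the fifth derivative of f, the formula is exact (up to rounding) for polynomials of degree at most four.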

Second Derivative

Find a, b, c \in \mathbb{R} such that \frac{a f(x_0 - h) + b f(x_0) + c f(x_0 + h)}{h^2} best approximates f''(x_0).
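For reference, the well-known three-point formula with a = 1, b = -2, c = 1 solves this exercise with an O(h^2) error; a minimal C++ sketch follows (the function name second_derivative is an illustrative choice).

```cpp
#include <cassert>
#include <cmath>

// Three-point central difference approximation of f''(x0),
// with error O(h^2) as h -> 0.
double second_derivative (double (*f)(double), double x0, double h = 0.01) {
  return (f(x0 - h) - 2*f(x0) + f(x0 + h)) / (h * h);
}
```

Readers may wish to verify the coefficients with the Taylor-expansion method described above.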

Multivariate

In all two-variable examples, (x_0, y_0) \in \mathbb{R}^2 is an arbitrary point, f : \mathbb{R}^2 \rightarrow \mathbb{R} is an arbitrary function, and h \in \mathbb{R} is small.

Non-Mixed Derivatives

Because a partial derivative holds all but one variable constant, some partial derivatives may be calculated using single-variable methods. This is done by fixing the remaining variables to form a new function of one variable. For example, if g(y) = f(x_0, y), then \frac{\partial f}{\partial y}(x_0, y_0) = g'(y_0).

Find an approximation of \frac{\partial f}{\partial y}(x_0, y_0).
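One possible answer applies the single-variable central difference to the slice obtained by holding x fixed; a minimal C++ sketch (the name partial_y is an illustrative choice):

```cpp
#include <cassert>
#include <cmath>

// Central difference approximation of the partial derivative of f
// with respect to y at (x0, y0), holding x fixed at x0.
double partial_y (double (*f)(double, double), double x0, double y0,
                  double h = 0.01) {
  return (f(x0, y0 + h) - f(x0, y0 - h)) / (2 * h);
}
```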

Mixed Derivatives

Mixed derivatives may require the multivariate extension of Taylor's theorem.

Let f : \mathbb{R}^2 \rightarrow \mathbb{R} be 42 times continuously differentiable on \mathbb{R}^2 and let g : \mathbb{R}^3 \rightarrow \mathbb{R} be defined as

g(x_0, y_0, h) = \frac{f(x_0 + h, y_0 + h) + f(x_0 - h, y_0 - h) - f(x_0 - h, y_0 + h) - f(x_0 + h, y_0 - h)}{4 h^2}\,.

Find the largest n \in \mathbb{N} such that

\frac{\partial^2 f}{\partial x \partial y}(x_0, y_0) = g(x_0, y_0, h) + O(h^n) \text{ as } h \to 0\,.

In other words, find the order of the error of the approximation.
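The approximation g from the exercise translates directly into C++ (a minimal sketch; mixed_partial is an illustrative name):

```cpp
#include <cassert>
#include <cmath>

// Four-point approximation of the mixed partial derivative of f
// with respect to x and y at (x0, y0), as defined in the exercise above.
double mixed_partial (double (*f)(double, double), double x0, double y0,
                      double h = 0.01) {
  return (f(x0 + h, y0 + h) + f(x0 - h, y0 - h)
        - f(x0 - h, y0 + h) - f(x0 + h, y0 - h)) / (4 * h * h);
}
```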

Example Code

Implementing these methods is reasonably simple in programming languages that support higher-order functions. For example, the method from the first example may be implemented in C++ using function pointers, as follows:

// Returns a central-difference approximation of the derivative of f at x.
double derivative (double (*f)(double), double x, double h = 0.01) {
  return (f(x + h) - f(x - h)) / (2 * h);
}
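The function might be called as follows (a minimal usage sketch; a non-capturing lambda converts implicitly to the required function pointer type):

```cpp
#include <cassert>
#include <cmath>

// Central-difference approximation, as defined above.
double derivative (double (*f)(double), double x, double h = 0.01) {
  return (f(x + h) - f(x - h)) / (2 * h);
}

// Example: approximate the derivative of x^2 at x = 3 (exact value: 6).
double example () {
  return derivative([](double x) { return x * x; }, 3.0);
}
```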
This article is issued from Wikiversity - version of the Saturday, April 02, 2016. The text is available under the Creative Commons Attribution-ShareAlike License; additional terms may apply for the media files.