Finally On A Calculus Of Differences


My fellow students and I have spent much of our spare time this past year[1][2][3] investigating the similarities between the calculus of functions and that of sequences, which we have defined for a sequence \(s_n\) with the differential operator
\[ \Delta \, s_n = s_n - s_{n-1} \]
and the integral operator
\[ \Delta^{-1} \, s_n = \sum_{i=1}^n s_i \]
where \(\sum\) is the summation sign, adopting the convention that terms with non-positive indices equate to zero.
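
By way of illustration, a minimal sketch of these two operators, written here in Python rather than in the scripts of the decks that follow, might represent a sequence as a function of its index; the names seq, delta and delta_inv are merely illustrative.

```python
# A minimal sketch of the two operators, assuming that a sequence is represented
# as a Python function of an integer index; all names here are illustrative only.
def seq(s):
    # adopt the convention that terms with non-positive indices equate to zero
    return lambda n: s(n) if n >= 1 else 0

def delta(s):
    # the differential operator: delta s_n = s_n - s_{n-1}
    return lambda n: seq(s)(n) - seq(s)(n - 1)

def delta_inv(s):
    # the integral operator: delta_inv s_n = s_1 + s_2 + ... + s_n
    return lambda n: sum(seq(s)(i) for i in range(1, n + 1))

s = lambda n: n * n                # the monomial sequence n^2
print(delta(s)(5))                 # 25 - 16 = 9
print(delta_inv(delta(s))(5))      # recovers s_5 = 25
print(delta(delta_inv(s))(5))      # likewise 25
```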

We have thus far discovered how to differentiate and integrate monomial sequences, found product and quotient rules for differentiation, a rule of integration by parts and figured solutions to some familiar-looking differential equations, all of which bear a striking resemblance to their counterparts for functions. To conclude our investigation, we decided to try to find an analogue of Taylor's theorem for sequences.

Taylor's Theorem

Recall that Taylor's theorem states that we can approximate a function within some region of an argument \(x\) with a polynomial of the form
\[ f(x+\delta) \approx f(x) + \delta D \, f(x) + \tfrac12 \delta^2 D^2 \, f(x) + \dots + \tfrac{1}{n!} \delta^n D^n \, f(x) \]
where \(D\) is the differential operator, satisfying
\[ \begin{align*} D^0 \, f(x) &= f(x)\\ D^i \, f(x) &= \frac{\mathrm{d}^i}{\mathrm{d}x^i} f(x) \end{align*} \]
and the exclamation mark stands for the factorial, being the product of every integer from one up to and including the number that precedes it, or one if that number is zero.
In particular, if we consider the limit as \(n\) tends to infinity then we recover the Taylor series representation of the function in the neighbourhood of \(x\)
\[ f(x+\delta) = \sum_{i=0}^\infty \tfrac{1}{i!} \delta^i D^i \, f(x) \]
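
By way of a concrete example, the following brief sketch, again in Python and chosen purely for illustration, applies the truncated series to the exponential function, whose derivatives of every order equal the function itself.

```python
# Truncated Taylor series of exp about x, exploiting D^i exp(x) = exp(x).
from math import exp, factorial

def taylor_exp(x, delta, n):
    # sum_{i=0}^n (1/i!) * delta^i * D^i exp(x)
    return sum(delta**i / factorial(i) * exp(x) for i in range(n + 1))

for n in range(0, 9, 2):
    print(n, taylor_exp(1.0, 0.5, n), exp(1.5))   # converges upon exp(1.5) as n increases
```
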
In seeking an equivalent representation for sequences, the first thing that my fellow students and I noted is that for monomial functions the series
\[ (x+\delta)^k = \sum_{i=0}^k \tfrac{1}{i!} \delta^i D^i \, x^k \]
is exact for all \(x\) and \(\delta\). This follows from the fact that
\[ \begin{align*} D^i \, x^k &= k \times (k-1) \times \dots \times (k-i+1) \times x^{k-i} = \tfrac{k!}{(k-i)!} x^{k-i}\\ D^{k+1} \, x^k &= 0 \end{align*} \]
for \(i\) between zero and \(k\), and so
\[ \sum_{i=0}^k \tfrac{1}{i!} \delta^i D^i \, x^k = \sum_{i=0}^k \tfrac{1}{i!} \delta^i \times \tfrac{k!}{(k-i)!} x^{k-i} = \sum_{i=0}^k \tfrac{k!}{i! \times (k-i)!} \times \delta^i \times x^{k-i} = \sum_{i=0}^k {^k}C_i \times \delta^i \times x^{k-i} \]
which is simply the binomial expansion of \((x+\delta)^k\), in which \({^k}C_i\) is the combination which counts how many ways we can pick \(i\) from \(k\) items when the order of picking is of no account.
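
A quick numerical confirmation of this exactness might take the form of the following Python sketch, in which monomial_series is merely an illustrative name.

```python
# A brief check that the k+1 term series exactly reproduces (x+delta)^k,
# using the binomial coefficient kCi from the standard library.
from math import comb

def monomial_series(x, delta, k):
    # sum_{i=0}^k kCi * delta^i * x^(k-i), the binomial expansion above
    return sum(comb(k, i) * delta**i * x**(k - i) for i in range(k + 1))

for k in range(6):
    for x, delta in [(2.0, 0.5), (-3.0, 1.25), (0.0, 4.0)]:
        assert abs(monomial_series(x, delta, k) - (x + delta)**k) < 1e-9
```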

Newton Polynomials

This very much reminded us of Newton's observation that given a function \(f\) and a set of distinct arguments \(x_i\) for \(i\) from zero to \(k\) we may construct a \(k^\mathrm{th}\) order polynomial that matches the function at each of them with
\[ f^\ast(x) = f\left[x_0\right] + \sum_{i=1}^k f\left[x_0 \dots x_i\right] \times \prod_{j=0}^{i-1} \left(x - x_j\right) \]
where \(\prod\) is the product sign and \(f\left[x_0 \dots x_i\right]\) is a divided difference, defined with the recurrence
\[ \begin{align*} f\left[x_i\right] &= f\left(x_i\right)\\ f\left[x_i \dots x_j\right] &= \frac{f\left[x_{i+1} \dots x_j\right] - f\left[x_i \dots x_{j-1}\right]}{x_j - x_i} \end{align*} \]
In particular, since any \(k^\mathrm{th}\) order polynomial is uniquely determined by \(k+1\) distinct points upon its graph, if
\[ f(x) = x^k \]
then we have
\[ f^\ast(x) = f(x) \]
for all \(x\). For example, if \(k\) equals two then
\[ \begin{align*} f\left[x_0\right] &= x_0^2\\ f\left[x_1\right] &= x_1^2\\ f\left[x_2\right] &= x_2^2\\ \\ f\left[x_0, x_1\right] &= \frac{f\left[x_1\right] - f\left[x_0\right]}{x_1 - x_0} = \frac{x_1^2 - x_0^2}{x_1 - x_0} = \frac{\left(x_1 - x_0\right) \times \left(x_1 + x_0\right)}{x_1 - x_0} = x_1 + x_0\\ f\left[x_1, x_2\right] &= x_2 + x_1\\ \\ f\left[x_0, x_1, x_2\right] &= \frac{f\left[x_1, x_2\right] - f\left[x_0, x_1\right]}{x_2 - x_0} = \frac{\left(x_2 + x_1\right) - \left(x_1 + x_0\right)}{x_2 - x_0} = \frac{x_2 - x_0}{x_2 - x_0} = 1 \end{align*} \]
and consequently
\[ \begin{align*} f^\ast(x) &= f\left[x_0\right] + \sum_{i=1}^k f\left[x_0 \dots x_i\right] \times \prod_{j=0}^{i-1} \left(x - x_j\right)\\ &= f\left[x_0\right] + f\left[x_0,x_1\right] \times \left(x - x_0\right) + f\left[x_0,x_1,x_2\right] \times \left(x - x_0\right) \times \left(x - x_1\right)\\ &= x_0^2 + \left(x_1+x_0\right) \times \left(x - x_0\right) + 1 \times \left(x - x_0\right) \times \left(x - x_1\right)\\ &= x_0^2 + \left(x_1 \times x - x_1 \times x_0 + x_0 \times x - x_0^2\right) + \left(x^2 - x \times x_1 - x_0 \times x + x_0 \times x_1\right)\\ &= \left(x_0^2 - x_0^2\right) + \left(x_1 \times x - x \times x_1\right) + \left(x_0 \times x_1 - x_1 \times x_0\right) + \left(x_0 \times x - x_0 \times x\right) + x^2 = x^2 \end{align*} \]
as claimed.
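
A short Python sketch of the divided difference recurrence and of the Newton polynomial itself, assuming distinct nodes and using illustrative names, bears out this figuring.

```python
# A sketch of the divided difference recurrence and the Newton polynomial;
# the nodes in xs are assumed to be distinct.
def divided_difference(f, xs):
    # f[x_0 ... x_k], computed directly from the recurrence
    if len(xs) == 1:
        return f(xs[0])
    return (divided_difference(f, xs[1:]) - divided_difference(f, xs[:-1])) / (xs[-1] - xs[0])

def newton_polynomial(f, xs, x):
    # f*(x) = f[x_0] + sum_i f[x_0 ... x_i] * prod_{j<i} (x - x_j)
    result = f(xs[0])
    factor = 1.0
    for i in range(1, len(xs)):
        factor *= x - xs[i - 1]
        result += divided_difference(f, xs[:i + 1]) * factor
    return result

f = lambda x: x * x
xs = [1.0, 2.0, 4.0]
print(divided_difference(f, xs))      # 1, just as f[x0,x1,x2] was figured above
print(newton_polynomial(f, xs, 3.0))  # 9, exactly reproducing x^2 at x=3
```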

To apply this to the monomial sequence
\[ s_n = n^k \]
we simply defined
\[ \begin{align*} x_i &= n-k+i\\ f(x) &= s_x \end{align*} \]
so that
\[ f\left[x_0 \dots x_i\right] = \tfrac{1}{i!} \Delta^i \, s_{n-k+i} \]
for \(n\) strictly greater than \(k\). To prove this relationship between the divided difference and our differential operator we deduced that if
\[ \begin{align*} &0 \leqslant i \leqslant j \leqslant k\\ &f\left[x_i \dots x_j\right] = \tfrac{1}{(j-i)!} \; \Delta^{j-i} \, s_{n-k+j} \end{align*} \]
holds true, then so must
\[ \begin{align*} &0 \leqslant i \leqslant j \leqslant k-1\\ &f\left[x_{i+1} \dots x_{j+1}\right] = \tfrac{1}{((j+1)-(i+1))!} \; \Delta^{(j+1)-(i+1)} \, s_{n-k+j+1} = \tfrac{1}{(j-i)!} \; \Delta^{j-i} \, s_{n-k+j+1} \end{align*} \]
and therefore
\[ \begin{align*} f\left[x_i \dots x_{j+1}\right] &= \frac{f\left[x_{i+1} \dots x_{j+1}\right] - f\left[x_i \dots x_j\right]}{x_{j+1} - x_i} = \frac{\tfrac{1}{(j-i)!} \; \Delta^{j-i} \, s_{n-k+j+1} - \tfrac{1}{(j-i)!} \; \Delta^{j-i} \, s_{n-k+j}}{x_{j+1} - x_i}\\ &= \frac{\tfrac{1}{(j-i)!} \; \Delta^{j-i} \, s_{n-k+j+1} - \tfrac{1}{(j-i)!} \; \Delta^{j-i} \, s_{n-k+j}}{(n-k+j+1) - (n-k+i)} = \frac{\Delta^{j-i} \, s_{n-k+j+1} - \Delta^{j-i} \, s_{n-k+j}}{(j-i)! \times (j-i+1)}\\ &= \frac{\Delta^{j-i+1} \, s_{n-k+j+1}}{(j-i+1)!} = \tfrac{1}{(j+1-i)!} \; \Delta^{j+1-i} \, s_{n-k+j+1} \end{align*} \]
Finally, since it trivially holds for
\[ f\left[x_i \dots x_i\right] = f\left[x_i\right] = f\left(x_i\right) = s_{n-k+i} = \Delta^0 \, s_{n-k+i} \]
then it must do so for all
\[ 0 \leqslant i \leqslant j \leqslant k \]
and we consequently have
\[ \begin{align*} s_{n+d} &= s_{n-k} + \sum_{i=1}^k \tfrac{1}{i!} \Delta^i \, s_{n-k+i} \times \prod_{j=0}^{i-1} \left(\left(n+d\right) - \left(n-k+j\right)\right)\\ &= s_{n-k} + \sum_{i=1}^k \tfrac{1}{i!} \Delta^i \, s_{n-k+i} \times \prod_{j=0}^{i-1} (d+k-j) = \sum_{i=0}^k \tfrac{1}{i!} \Delta^i \, s_{n-k+i} \times \frac{(d+k)!}{(d+k-i)!}\\ &= \sum_{i=0}^k \frac{(d+k)!}{i! \times (d+k-i)!} \times \Delta^i \, s_{n-k+i} = \sum_{i=0}^k {^{d+k}}C_i \times \Delta^i \, s_{n-k+i} \end{align*} \]
as demonstrated by deck 1.

Deck 1: The Monomial Sequence Newton Polynomial
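
A minimal sketch of the kind of check that deck 1 performs, here in Python rather than in the deck's own script, might be put together as follows.

```python
# Confirms that the Newton polynomial of the monomial sequence s_n = n^k
# exactly recovers s_{n+d} for n strictly greater than k.
from math import comb

def delta_i(s, i, n):
    # the i-fold difference Delta^i s_n, with non-positive indices equating to zero
    if i == 0:
        return s(n) if n >= 1 else 0
    return delta_i(s, i - 1, n) - delta_i(s, i - 1, n - 1)

def newton(s, n, k, d):
    # sum_{i=0}^k (d+k)Ci * Delta^i s_{n-k+i}
    return sum(comb(d + k, i) * delta_i(s, i, n - k + i) for i in range(k + 1))

k = 3
s = lambda n: n**k
for n in range(k + 1, 10):
    for d in range(5):
        assert newton(s, n, k, d) == s(n + d)
```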

That, contrary to our assumptions, this formula also holds for \(n\) equal to \(k\) is but a consequence of our convention that zero indexed elements of a sequence equate to zero, which is consistent with the fact that monomials equal zero when given arguments of zero.

Non-Polynomial Sequences

Whilst both Taylor series and Newton polynomials require but a finite number of terms to exactly reproduce polynomial functions, the same cannot be said in general. Unlike the Taylor series, however, we cannot extend the Newton polynomial to an infinite number of terms for our sequences since we equate every term before the first with zero. Instead, the best that we can do is to employ as many terms of the sequence as we have to hand with
\[ s_{n+d} \approx \sum_{i=0}^{n-1} {^{d+n-1}}C_i \times \Delta^i \, s_{i+1} \]
Deck 2 illustrates this for our exponential sequence
\[ s_n = 2^n \]
Deck 2: The Exponential Sequence Newton Polynomial
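
A rough Python sketch of what deck 2 illustrates, assuming that it simply tabulates the truncated prediction of \(s_{n+1}\) alongside the true value, might read as follows.

```python
# Tabulates the truncated Newton polynomial prediction of s_{n+1} for the
# exponential sequence s_n = 2^n against its true value.
from math import comb

def delta_i(s, i, n):
    # Delta^i s_n, with terms of non-positive index equating to zero
    if i == 0:
        return s(n) if n >= 1 else 0
    return delta_i(s, i - 1, n) - delta_i(s, i - 1, n - 1)

def truncated_newton(s, n, d):
    # s_{n+d} ~ sum_{i=0}^{n-1} (d+n-1)Ci * Delta^i s_{i+1}
    return sum(comb(d + n - 1, i) * delta_i(s, i, i + 1) for i in range(n))

s = lambda n: 2**n
for n in range(2, 11):
    approx = truncated_newton(s, n, 1)
    print(n, approx, s(n + 1))   # the relative error shrinks as n increases
```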

That the Newton polynomial grows ever more accurate as the index increases is a consequence of the fact that such polynomial approximations of the exponential function improve as their order increases. Unfortunately, they also rapidly degrade as their arguments grow in size, as illustrated by deck 3 in which \(n\) is fixed and \(d\) is increased.

Deck 3: Increasing d

Runge's Phenomenon

Rather more irritatingly, there are many sequences for which increasing the order of the Newton polynomial does little to improve its accuracy, a failing known as Runge's phenomenon. For example, the sequence
\[ s_n = \frac{1}{1 + n^2} \]
might not seem especially troublesome upon the face of it, but it is based upon a function for which Runge studied the failure of polynomial approximation and quite confounds the ability of a Newton polynomial to predict the value of the next element, as demonstrated by deck 4.

Deck 4: Runge's Phenomenon

Here we have taken the trouble to buffer the values of the differences so that we might figure the errors that much more swiftly for large indices, by which we see that they ultimately appear to grow without bound.
Whilst it must be admitted that the Taylor series of functions may sometimes fail to converge, Runge's phenomenon is very much more severe, to the extent that we really cannot trust our Newton polynomials of sequences as analogues for them in general.
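
The following Python sketch gives a flavour of the kind of calculation behind deck 4, with the differences buffered by functools.lru_cache so that large indices may be figured swiftly; the deck's own manner of buffering is assumed rather than known.

```python
# Tabulates the error of the truncated Newton polynomial prediction of s_{n+1}
# for Runge's sequence s_n = 1/(1+n^2), buffering the differences for speed.
from functools import lru_cache
from math import comb

def s(n):
    # terms with non-positive indices equate to zero
    return 1.0 / (1.0 + n * n) if n >= 1 else 0.0

@lru_cache(maxsize=None)
def delta_i(i, n):
    # the buffered i-fold difference Delta^i s_n
    if i == 0:
        return s(n)
    return delta_i(i - 1, n) - delta_i(i - 1, n - 1)

def truncated_newton(n, d):
    # s_{n+d} ~ sum_{i=0}^{n-1} (d+n-1)Ci * Delta^i s_{i+1}
    return sum(comb(d + n - 1, i) * delta_i(i, i + 1) for i in range(n))

for n in range(5, 41, 5):
    print(n, truncated_newton(n, 1) - s(n + 1))   # the errors ultimately appear to grow without bound
```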

Low Order Approximation

There is one way, however, in which we use Taylor series that easily translates to our Newton polynomials: to construct low order polynomials that approximate a function within some neighbourhood. For example, we can approximate the function
\[ f(x) = \frac{1}{1 + x^2} \]
in the neighbourhood of \(x\) with the second order truncated Taylor series
\[ f(x+\delta) \approx f(x) + \delta D \, f(x) + \tfrac12 \delta^2 D^2 \, f(x) \]
where the derivatives can be figured by the chain and product rules to be
\[ \begin{align*} D \, f(x) &= D \, \left(1+x^2\right)^{-1} = -\left(1+x^2\right)^{-2} \times 2x\\ D^2 \, f(x) &= - D \, \left(1+x^2\right)^{-2} \times 2x = 2 \times \left(1+x^2\right)^{-3} \times 2x \times 2x - \left(1+x^2\right)^{-2} \times 2\\ &= \left(1+x^2\right)^{-3} \times 8x^2 - \left(1+x^2\right)^{-3} \times 2\left(1+x^2\right)\\ &= \left(1+x^2\right)^{-3} \times \left(6x^2 - 2\right) \end{align*} \]
yielding
\[ f(x+\delta) \approx \frac{1}{1 + x^2} - \frac{2x\delta}{\left(1+x^2\right)^2} + \frac{\left(3x^2 - 1\right) \delta^2}{\left(1+x^2\right)^3} \]
and, in particular
\[ \begin{align*} f(x+1) &\approx \frac{1}{1 + x^2} - \frac{2x}{\left(1+x^2\right)^2} + \frac{3x^2-1}{\left(1+x^2\right)^3} = \frac{\left(1+x^2\right)^2}{\left(1+x^2\right)^3} - \frac{2x\left(1+x^2\right)}{\left(1+x^2\right)^3} + \frac{3x^2-1}{\left(1+x^2\right)^3}\\ &= \frac{\left(1+2x^2+x^4\right) - \left(2x+2x^3\right) + \left(3x^2-1\right)}{1+3x^2+3x^4+x^6} = \frac{x^4 - 2x^3 + 5x^2 - 2x}{x^6+3x^4+3x^2+1} \end{align*} \]
The equivalent second order truncated Newton polynomial is given by
\[ \begin{align*} s_{n+1} &\approx \sum_{i=0}^2 {^3}C_i \times \Delta^i \, s_{n-2+i} ={^3}C_0 \times \Delta^0 \, s_{n-2} + {^3}C_1 \times \Delta^1 \, s_{n-1} + {^3}C_2 \times \Delta^2 \, s_n\\ &= s_{n-2} + 3 \times \Delta \, s_{n-1} + 3 \times \Delta^2 \, s_n = s_{n-2} + 3 \times \left(s_{n-1}-s_{n-2}\right) + 3 \times \left(\left(s_n-s_{n-1}\right)-\left(s_{n-1}-s_{n-2}\right)\right)\\ &= 3 \times s_n - 3 \times s_{n-1} + s_{n-2} \end{align*} \]
which, with some careful figuring, we found to resolve to
\[ \begin{align*} s_{n+1} &\approx \frac{3}{1+n^2} - \frac{3}{1+(n-1)^2} + \frac{1}{1+(n-2)^2} = \frac{3}{n^2+1} - \frac{3}{n^2 - 2n + 2} + \frac{1}{n^2 - 4n + 5}\\ &= \frac{3 \times \left(n^2 - 2n + 2\right) \times \left(n^2 - 4n + 5\right) - 3\times \left(n^2+1\right) \times \left(n^2 - 4n + 5\right) + \left(n^2+1\right) \times \left(n^2 - 2n + 2\right)}{\left(n^2+1\right) \times \left(n^2 - 2n + 2\right) \times \left(n^2 - 4n + 5\right)}\\ &= \frac{\left(3n^4 - 18n^3 + 45n^2 -54n + 30\right) - \left(3n^4-12n^3+18n^2-12n+15\right) + \left(n^4-2n^3+3n^2-2n+2\right)}{n^6 - 6n^5 + 16n^4 - 24n^3 + 25n^2 - 18n + 10}\\ &= \frac{n^4 - 8n^3 + 30n^2 - 44n + 17}{n^6 - 6n^5 + 16n^4 - 24n^3 + 25n^2 - 18n + 10} \end{align*} \]
after which we put together deck 5 to compare these two approximations to the true value.

Deck 5: The Two Approximations
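
A small Python sketch of the sort of comparison that deck 5 makes, assuming that it simply tabulates the two second order approximations of \(s_{n+1}\) against its true value, might run as follows.

```python
# Compares the second order truncated Taylor series and Newton polynomial
# approximations of s_{n+1} for s_n = 1/(1+n^2) with the true value.
def f(x):
    return 1.0 / (1.0 + x * x)

def taylor(x):
    # the second order truncated Taylor series at delta equal to one, as figured above
    return f(x) - 2.0 * x / (1.0 + x * x)**2 + (3.0 * x * x - 1.0) / (1.0 + x * x)**3

def newton(n):
    # the second order truncated Newton polynomial, 3*s_n - 3*s_{n-1} + s_{n-2}
    return 3.0 * f(n) - 3.0 * f(n - 1) + f(n - 2)

for n in range(3, 21):
    true = f(n + 1)
    print(n, taylor(n) - true, newton(n) - true)   # the Taylor series' error is the smaller, but both tend to zero
```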

Clearly the Taylor series approximation is more accurate than that of the Newton polynomial, but my fellow students and I were nevertheless most pleased that the latter does indeed tend towards the next term of the sequence as the index increases.

And with that observation we concluded our musings upon the relationship between the calculus of functions and our calculus of sequences; musings that, I must say, we have found quite edifying, as I sincerely hope that you have too!
\(\Box\)

References

[1] On A Calculus Of Differences, www.thusspakeak.com, 2017.

[2] Further On A Calculus Of Differences, www.thusspakeak.com, 2017.

[3] Further Still On A Calculus Of Differences, www.thusspakeak.com, 2017.
