The interest of my fellow students and me has been somewhat piqued of late by a curious similarity of the relationship between sequences and series to that between the derivatives and integrals of functions. Specifically, for a function \(f\) taking a nonnegative argument \(x\), we have
\[
\begin{align*}
F(x) &= \int_0^x f(x) \, \mathrm{d}x\\
f(x) &= \frac{\mathrm{d}}{\mathrm{d}x} F(x)
\end{align*}
\]
and for a sequence \(s\) having terms
\[
s_1, s_2, s_3, \dots
\]
we can define a series \(S\) with terms
\[
S_n = s_1 + s_2 + s_3 + \dots + s_n = \sum_{i=1}^n s_i
\]
where \(\sum\) is the summation sign, from which we can recover the terms of the sequence with
\[
s_n = S_n - S_{n-1}
\]
using the convention that \(S_0\) equals zero.

This similarity rather set us to wondering whether we could employ the language of calculus to reason about sequences and series.
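This back-and-forth between a sequence and its series is easily demonstrated; the following is a brief sketch, with Python assumed for the purpose and the helper `partial_sums` an invention of this sketch alone.

```python
# A sketch in Python: the partial sums S_n of a sequence and the recovery of
# its terms from their backward differences, with S_0 equal to zero by
# convention.
def partial_sums(s):
    """Return [S_0, S_1, ..., S_n] where S_n = s_1 + ... + s_n and S_0 = 0."""
    S = [0]
    for term in s:
        S.append(S[-1] + term)
    return S

s = [3, 1, 4, 1, 5]
S = partial_sums(s)                              # [0, 3, 4, 8, 9, 14]
recovered = [S[n] - S[n - 1] for n in range(1, len(S))]
print(recovered)                                 # prints [3, 1, 4, 1, 5]
```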
A Calculus Of Differences
Noting that the derivative of a function may be defined as the limit of the backward relative difference
\[
\frac{f(x) - f(x-\delta)}{\delta}
\]
as \(\delta\) tends to zero, we decided to try defining the derivative of a series as the limit of
\[
\frac{S_i - S_{i-d}}{d}
\]
as \(d\) tends to zero which, since it must be a positive integer, means that it can be no smaller than one. Taking inspiration from the operator notation for differentiation
\[
D = \frac{\mathrm{d}}{\mathrm{d}x}
\]
with which we may express the aforementioned derivative and integral as
\[
\begin{align*}
D \, F(x) &= \frac{\mathrm{d}}{\mathrm{d}x} F(x) = f(x)\\
D^{-1} \, f(x) &= \int_0^x f(x) \, \mathrm{d}x = F(x)
\end{align*}
\]
since the latter is most trivially the inverse of the former, we defined the operators
\[
\begin{align*}
\Delta \, S_n &= S_n - S_{n-1} = s_n\\
\Delta^{-1} \, s_n &= \sum_{i=1}^n s_i = S_n
\end{align*}
\]
again with the convention that the zeroth element of a series is equal to zero, making quite explicit the similarity that has so caught our attention.

Now my fellow students and I could not think of an especially compelling reason why we should not take derivatives of sequences or integrals of series and so, for the sake of simplicity, we shall henceforth use the term sequence to refer to both sequences and series.
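The pair of operators may be rendered in a few lines of code, with Python assumed for the purpose; here sequences are represented as functions of an index \(n\), with every element at an index of zero or below taken to be zero.

```python
# A sketch of the delta and inverse delta operators.
def delta(S, n):
    """The sequence derivative: S_n - S_{n-1}, with S_0 equal to zero."""
    return S(n) - (S(n - 1) if n > 1 else 0)

def delta_inv(s, n):
    """The sequence integral: s_1 + s_2 + ... + s_n."""
    return sum(s(i) for i in range(1, n + 1))

squares = lambda n: n * n
# applying delta to delta_inv recovers the original terms, the one
# undoing the other
print([delta(lambda m: delta_inv(squares, m), n) for n in range(1, 6)])
# prints [1, 4, 9, 16, 25]
```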
Derivatives Of Some Elementary Sequences
To get a feel for how the derivatives of sequences might compare to those of functions, we decided to apply our differential operator to some basic sequences; specifically those whose elements are constant, equal to their indices, to their squares or to their cubes.

For a constant sequence we trivially have
\[
\begin{align*}
s_n &= c\\
\Delta \, s_1 &= s_1 - s_{0\phantom{1}} = c - 0 = c\\
\Delta \, s_n &= s_n - s_{n-1} = c - c = 0
\end{align*}
\]
which aligns precisely with the derivative of a constant function, at least for all but the very first element. For the identity sequence, whose elements equal their indices, we have
\[
\begin{align*}
s_n &= n\\
\Delta \, s_n &= n - (n-1) = 1
\end{align*}
\]
which again follows the derivative of the identity function. Now for the sequence of squares we find that
\[
\begin{align*}
s_n &= n^2\\
\Delta \, s_n &= n^2 - (n-1)^2 = n^2 - \left(n^2 - 2n + 1\right) = 2n - 1
\end{align*}
\]
and for that of cubes
\[
\begin{align*}
s_n &= n^3\\
\Delta \, s_n &= n^3 - (n-1)^3 = n^3 - \left(n^3 - 3n^2 + 3n - 1\right) = 3n^2 - 3n + 1
\end{align*}
\]
Whilst these are not identical to the derivatives of their corresponding functions, they are somewhat similar in the sense that their highest order terms, having the greatest power of \(n\), take the form of those derivatives
\[
\begin{align*}
D \, x^2 &= 2x\\
D \, x^3 &= 3x^2
\end{align*}
\]
This similarity persists for all unit monomial sequences, being those whose elements are nonnegative integer powers of their indices. To demonstrate that this is so we must first note that
\[
(a+b)^n = \sum_{r=0}^n \frac{n!}{r! \times (n-r)!} a^{n-r} b^r = \sum_{r=0}^n {^n}C_r \, a^{n-r} b^r
\]
where the exclamation mark stands for the factorial of the term to its immediate left, equal to one if it is zero and to the product of every integer from one up to and including it otherwise, and \({^n}C_r\) is the combination which enumerates the number of ways in which we can choose \(r\) from \(n\) items when the order of selection is of no consequence.

Now the combinations for \(r\) equal to zero and one are
\[
\begin{align*}
{^n}C_0 &= \frac{n!}{0! \times (n-0)!} = \frac{n!}{n!} = 1\\
{^n}C_1 &= \frac{n!}{1! \times (n-1)!} = \frac{n!}{(n-1)!} = \frac{n \times (n-1)!}{(n-1)!} = n
\end{align*}
\]
and so if \(n\) is two or greater
\[
\begin{align*}
(a+b)^n &= a^n b^0 + n a^{n-1} b^1 + \sum_{r=2}^n {^n}C_r \, a^{n-r} b^r\\
&= a^n + n a^{n-1} b + \sum_{r=2}^n {^n}C_r \, a^{n-r} b^r
\end{align*}
\]
For unit monomial sequences we therefore have
\[
\begin{align*}
s_n &= n^k\\
\Delta \, s_n &= n^k - (n-1)^k\\
&= n^k - \sum_{r=0}^k (-1)^r \times {^k}C_r \, n^{k-r}\\
&= n^k - \left(n^k - k n^{k-1} + \sum_{r=2}^k (-1)^r \times {^k}C_r \, n^{k-r}\right)\\
&= k n^{k-1} - \sum_{r=2}^k (-1)^r \times {^k}C_r \, n^{k-r}
\end{align*}
\]
as compared to the derivatives of unit monomial functions
\[
D \, x^k = k x^{k-1}
\]
To demonstrate that this formula holds true, my fellow students and I put together a deck for Professor B's wondrous calculating machine^{[1]} to compare its results with the differences between consecutive terms of monomial sequences
Deck 1: The Monomial Series Derivative



reassuring us that our reckoning had been quite sound.
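Readers lacking access to such a machine may make do with the following stand-in for the deck's comparison, with Python assumed in its place.

```python
from math import comb

# A stand-in for Deck 1: the formula for the derivative of the unit monomial
# sequence n^k, compared with the differences between consecutive terms.
def monomial_delta(n, k):
    """k*n^(k-1) less the sum of the lower-order binomial terms."""
    return k * n ** (k - 1) - sum(
        (-1) ** r * comb(k, r) * n ** (k - r) for r in range(2, k + 1))

for k in range(2, 6):
    for n in range(1, 10):
        assert monomial_delta(n, k) == n ** k - (n - 1) ** k
```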
For functions that are simply negative powers of their arguments, the derivative follows much the same rule
\[
D \, x^{-k} = -k x^{-k-1}
\]
but the derivatives of equivalent sequences do not, upon the face of it, appear to behave similarly
\[
\begin{align*}
s_n &= n^{-k}\\
\Delta \, s_n
&= n^{-k} - (n-1)^{-k}
= \frac{1}{n^k} - \frac{1}{(n-1)^k}\\
&= \frac{(n-1)^k - n^k}{n^k \times (n-1)^k}
= - \frac{n^k - (n-1)^k}{n^k \times (n-1)^k}\\
&= - \frac{k n^{k-1} - \sum_{r=2}^k (-1)^r \times {^k}C_r \, n^{k-r}}{n^k \times \sum_{r=0}^k (-1)^r \times {^k}C_r \, n^{k-r}}
\end{align*}
\]
However, with a little algebraic manipulation we find that
\[
\begin{align*}
\Delta \, s_n
&= - \frac{k n^{k-1} - \sum_{r=2}^k (-1)^r \times {^k}C_r \, n^{k-r}}{n^{2k} \times \sum_{r=0}^k (-1)^r \times {^k}C_r \, n^{-r}}\\
&= - \frac{k n^{-k-1} - \sum_{r=2}^k (-1)^r \times {^k}C_r \, n^{-k-r}}{\sum_{r=0}^k (-1)^r \times {^k}C_r \, n^{-r}}\\
&= - \frac{k n^{-k-1} - \sum_{r=2}^k (-1)^r \times {^k}C_r \, n^{-k-r}}{1 + \sum_{r=1}^k (-1)^r \times {^k}C_r \, n^{-r}}
\end{align*}
\]
revealing that the similarity does indeed persist for negative powers.

Deck 2 compares the results of this formula with the differences between neighbouring terms of negative power sequences.
Deck 2: The Negative Power Series Derivative



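Again a stand-in may serve those without an engine to hand, with Python assumed in place of the deck.

```python
from math import comb, isclose

# A stand-in for Deck 2: the rearranged formula for the derivative of
# n^(-k), compared with the differences between neighbouring terms.
# At n = 1 the denominator sums to zero and the result is infinite.
def negative_power_delta(n, k):
    numerator = k * n ** (-k - 1) - sum(
        (-1) ** r * comb(k, r) * n ** (-k - r) for r in range(2, k + 1))
    denominator = 1 + sum(
        (-1) ** r * comb(k, r) * n ** (-r) for r in range(1, k + 1))
    return -numerator / denominator

for k in range(1, 5):
    for n in range(2, 10):
        assert isclose(negative_power_delta(n, k), n ** -k - (n - 1) ** -k)
```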
That its first term is infinite rather than one informs us that we must always be sure to define
\[
\Delta \, s_1 = s_1
\]
since we have adopted the convention that \(s_0\) is equal to zero for any sequence.

Note that it is trivially the case that, for a pair of sequences \(s_n\) and \(t_n\) and constants \(a\) and \(b\), we have
\[
\Delta \, \left(a \times s_n + b \times t_n\right) = a \times \Delta \, s_n + b \times \Delta \, t_n
\]
since
\[
\begin{align*}
\Delta \, \left(a \times s_n + b \times t_n\right)
&= \left(a \times s_n + b \times t_n\right) - \left(a \times s_{n-1} + b \times t_{n-1}\right)\\
&= \left(a \times s_n - a \times s_{n-1}\right) + \left(b \times t_n - b \times t_{n-1}\right)\\
&= a \times \left(s_n - s_{n-1}\right) + b \times \left(t_n - t_{n-1}\right)
\end{align*}
\]
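This linearity admits of a quick confirmation, with Python assumed for the purpose; the sequences chosen here are merely illustrative.

```python
# A quick confirmation of linearity: delta(a*s + b*t) = a*delta(s) +
# b*delta(t), using 1-indexed lists whose zeroth slots are unused and
# whose zeroth elements are taken to be zero.
def delta(seq, n):
    return seq[n] - (seq[n - 1] if n > 1 else 0)

s = [None, 1, 4, 9, 16]     # the squares
t = [None, 1, 8, 27, 64]    # the cubes
a, b = 3, -2
combined = [None] + [a * s[n] + b * t[n] for n in range(1, 5)]
assert all(delta(combined, n) == a * delta(s, n) + b * delta(t, n)
           for n in range(1, 5))
```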
Products And Quotients
The next task that my fellow students and I set for ourselves was to figure the sequence equivalents of the product and quotient rules for the derivatives of functions
\[
\begin{align*}
D \, (f(x) \times g(x)) &= D \, f(x) \times g(x) + f(x) \times D \, g(x)\\
D \, \frac{f(x)}{g(x)} &= \frac{D \, f(x) \times g(x) - f(x) \times D \, g(x)}{g(x)^2}
\end{align*}
\]
For the former we swiftly deduced that
\[
\begin{align*}
\Delta \, \left(s_n \times t_n\right)
&= s_n \times t_n - s_{n-1} \times t_{n-1}\\
&= s_n \times t_n - \left(s_{n-1} \times t_n - s_{n-1} \times t_n\right) - s_{n-1} \times t_{n-1}\\
&= \left(s_n \times t_n - s_{n-1} \times t_n\right) + \left(s_{n-1} \times t_n - s_{n-1} \times t_{n-1}\right)\\
&= \left(s_n - s_{n-1}\right) \times t_n + s_{n-1} \times \left(t_n - t_{n-1}\right)\\
&= \Delta \, s_n \times t_n + \left(s_n - \Delta \, s_n\right) \times \Delta \, t_n\\
&= \Delta \, s_n \times t_n + s_n \times \Delta \, t_n - \Delta \, s_n \times \Delta \, t_n
\end{align*}
\]
as demonstrated by deck 3.
Deck 3: The Product Rule



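For those keeping to more commonplace machinery, the deck's comparison may be sketched as follows, with Python assumed in its place.

```python
# A stand-in for Deck 3: the sequence product rule
#   delta(s*t) = delta(s)*t + s*delta(t) - delta(s)*delta(t)
# checked against the differences of the products of a pair of sequences.
def delta(seq, n):
    return seq[n] - (seq[n - 1] if n > 1 else 0)

s = [None, 2, 3, 5, 7, 11]     # 1-indexed; the zeroth slot is unused
t = [None, 1, 4, 9, 16, 25]
product = [None] + [s[n] * t[n] for n in range(1, 6)]
for n in range(1, 6):
    ds, dt = delta(s, n), delta(t, n)
    assert delta(product, n) == ds * t[n] + s[n] * dt - ds * dt
```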
To figure the sequence quotient rule we first noted that
\[
\frac{s_n}{t_n} = s_n \times t_n^{-1}
\]
so that we could use the product rule to yield
\[
\begin{align*}
\Delta \, \frac{s_n}{t_n}
&= \Delta \, s_n \times t_n^{-1} + s_n \times \Delta \, t_n^{-1} - \Delta \, s_n \times \Delta \, t_n^{-1}\\
&= \Delta \, s_n \times t_n^{-1} + s_n \times \left(t_n^{-1} - t_{n-1}^{-1}\right) - \Delta \, s_n \times \left(t_n^{-1} - t_{n-1}^{-1}\right)\\
&= \frac{\Delta \, s_n \times t_{n-1} + s_n \times \left(t_{n-1} - t_n\right) - \Delta \, s_n \times \left(t_{n-1} - t_n\right)}{t_n \times t_{n-1}}\\
&= \frac{\Delta \, s_n \times \left(t_n - \Delta \, t_{n}\right) - s_n \times \Delta \, t_{n} + \Delta \, s_n \times \Delta \, t_{n}}{t_n \times \left(t_n - \Delta \, t_{n}\right)}\\
&= \frac{\Delta \, s_n \times t_n - s_n \times \Delta \, t_{n}}{t_n^2 - t_n \times \Delta \, t_{n}}
\end{align*}
\]
as verified by deck 4.
Deck 4: The Quotient Rule



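The quotient rule admits of a similar stand-in, with Python again assumed in place of the deck.

```python
from math import isclose

# A stand-in for Deck 4: the sequence quotient rule
#   delta(s/t) = (delta(s)*t - s*delta(t)) / (t^2 - t*delta(t))
# checked from n = 2 onwards, since the convention that t_0 equals zero
# would otherwise place a zero in the denominator at n = 1.
def delta(seq, n):
    return seq[n] - (seq[n - 1] if n > 1 else 0)

s = [None, 2.0, 3.0, 5.0, 7.0, 11.0]
t = [None, 1.0, 4.0, 9.0, 16.0, 25.0]
quotient = [None] + [s[n] / t[n] for n in range(1, 6)]
for n in range(2, 6):
    ds, dt = delta(s, n), delta(t, n)
    assert isclose(delta(quotient, n),
                   (ds * t[n] - s[n] * dt) / (t[n] ** 2 - t[n] * dt))
```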
Higher Order Derivatives
Just as with the derivatives of functions we can define higher order derivatives through repeated differentiation. For example
\[
\begin{align*}
\Delta \, s_n
&= s_n - s_{n-1}\\
\Delta^2 \, s_n
&= \Delta \, \left(\Delta \, s_n\right)
= \left(s_n - s_{n-1}\right) - \left(s_{n-1} - s_{n-2}\right)
= s_n - 2s_{n-1} + s_{n-2}\\
\Delta^3 \, s_n
&= \Delta \, \left(\Delta^2 \, s_n\right)
= \left(s_n - 2s_{n-1} + s_{n-2}\right) - \left(s_{n-1} - 2s_{n-2} + s_{n-3}\right)
= s_n - 3s_{n-1} + 3s_{n-2} - s_{n-3}
\end{align*}
\]
Now by simple inspection it is clear that each of these examples conforms to the expression
\[
\Delta^d \, s_n = \sum_{r=0}^d (-1)^r \times {^d}C_r \, s_{n-r}
\]
To determine whether or not it holds for derivatives of any order we need simply confirm that it satisfies the rule that the \((d+1)^{\mathrm{th}}\) order derivative is the derivative of the \(d^{\mathrm{th}}\) order derivative since we already know that it holds for \(d\) equal to one, two and three and so by induction may consequently deduce that it holds for \(d\) equal to four and thence for \(d\) equal to five, six, seven and so on and so forth.

Specifically, we require that
\[
\sum_{r=0}^{d+1} (-1)^r \times {^{d+1}}C_r \, s_{n-r}
= \sum_{r=0}^d (-1)^r \times {^d}C_r \, s_{n-r}
- \sum_{r=0}^d (-1)^r \times {^d}C_r \, s_{(n-1)-r}
\]
which my fellow students and I figured to be true by noting that
\[
\begin{align*}
{^d}C_0 = 1 &= {^{d+1}}C_0\\
{^d}C_{d} = 1 &= {^{d+1}}C_{d+1}\\
{^d}C_r + {^d}C_{r-1} &= \frac{d!}{r! \times (d-r)!} + \frac{d!}{(r-1)! \times (d-(r-1))!}
= \frac{d!}{r! \times (d-r)!} + \frac{d!}{(r-1)! \times (d-r+1)!}\\
&= \frac{d! \times (d-r+1) + d! \times r}{r! \times (d-r+1)!}
= \frac{d! \times (d+1)}{r! \times (d-r+1)!}
= \frac{(d+1)!}{r! \times ((d+1)-r)!}
= {^{d+1}}C_r
\end{align*}
\]
and consequently
\[
\begin{align*}
&\sum_{r=0}^d (-1)^r \times {^d}C_r \, s_{n-r} - \sum_{r=0}^d (-1)^r \times {^d}C_r \, s_{(n-1)-r}\\
&\quad\quad= \sum_{r=0}^d (-1)^r \times {^d}C_r \, s_{n-r} - \sum_{r=1}^{d+1} (-1)^{r-1} \times {^d}C_{r-1} \, s_{(n-1)-(r-1)}\\
&\quad\quad= \sum_{r=0}^d (-1)^r \times {^d}C_r \, s_{n-r} + \sum_{r=1}^{d+1} (-1)^r \times {^d}C_{r-1} \, s_{n-r}\\
&\quad\quad= (-1)^0 \times {^d}C_0 \, s_{n-0} + \sum_{r=1}^d (-1)^r \times \left({^d}C_r + {^d}C_{r-1}\right) \times s_{n-r} + (-1)^{d+1} \times {^d}C_{d} \, s_{n-(d+1)}\\
&\quad\quad= (-1)^0 \times {^{d+1}}C_0 \, s_{n-0} + \sum_{r=1}^d (-1)^r \times {^{d+1}}C_r \, s_{n-r} + (-1)^{d+1} \times {^{d+1}}C_{d+1} \, s_{n-(d+1)}\\
&\quad\quad= \sum_{r=0}^{d+1} (-1)^r \times {^{d+1}}C_r \, s_{n-r}
\end{align*}
\]
as needed.

Now this raises the possibility that we might refer to terms with negative indices which, just as we had for those with indices of zero, we decided to equate to zero, as we have done in deck 5 which compares the values of this expression to those of recursively applied derivatives for a negative power sequence.
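As for the combination identities upon which the induction rests, they too may be checked directly, with Python assumed for the purpose.

```python
from math import comb, factorial

# The boundary cases and Pascal's rule, C(d,r) + C(d,r-1) = C(d+1,r),
# together with the factorial definition of the combination.
for d in range(1, 10):
    assert comb(d, 0) == comb(d + 1, 0) == 1
    assert comb(d, d) == comb(d + 1, d + 1) == 1
    for r in range(1, d + 1):
        assert comb(d, r) + comb(d, r - 1) == comb(d + 1, r)
        assert comb(d, r) == factorial(d) // (factorial(r) * factorial(d - r))
```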
Deck 5: Higher Order Derivatives



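One last stand-in, with Python assumed in place of the deck, compares the closed form with recursively applied first differences.

```python
from math import comb

# A stand-in for Deck 5: the closed form for the d-th order derivative of a
# negative power sequence, compared with recursively applied first
# differences. Terms with indices of zero or below are equated to zero.
def term(s, n):
    return s(n) if n >= 1 else 0.0

def delta_recursive(s, n, d):
    if d == 0:
        return term(s, n)
    return delta_recursive(s, n, d - 1) - delta_recursive(s, n - 1, d - 1)

def delta_closed(s, n, d):
    return sum((-1) ** r * comb(d, r) * term(s, n - r) for r in range(d + 1))

s = lambda n: n ** -2.0
for d in range(1, 5):
    for n in range(1, 8):
        assert abs(delta_recursive(s, n, d) - delta_closed(s, n, d)) < 1e-12
```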
The many similarities that we have found between the derivatives of sequences and those of functions have left my fellow students and me quite keen to investigate whether or not the same might be true of their integrals, but our studies presently demand our fullest attention and so we must put the matter aside for the time being.
\(\Box\)