For some time now[1][2] my fellow students and I have been whiling away our spare time considering the similarities of the relationships between sequences and series and those between the derivatives and integrals of functions. Having defined differential and integral operators for a sequence \(s_n\) with
\[
\Delta \, s_n = s_n - s_{n-1}
\]
and
\[
\Delta^{-1} \, s_n = \sum_{i=1}^n s_i
\]
where \(\sum\) is the summation sign, we found analogues for the product rule, the quotient rule and the rule of integration by parts, as well as formulae for the derivatives and integrals of monomial sequences, being those whose terms are non-negative integer powers of their indices, and higher order, or repeated, derivatives and integrals in general.
We have since spent some time considering how we might solve equations relating sequences to their derivatives, known as differential equations when involving functions, and it is upon our findings that I shall now report.
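By way of illustration, the two operators can be realised as a minimal Python sketch (Python being merely a convenient choice here), representing a sequence as a function of its positive integer index and adopting our convention that terms with non-positive indices equal zero.

```python
# A minimal sketch of the sequence differential and integral operators,
# with a sequence represented as a function of its index and terms with
# non-positive indices taken to equal zero.

def delta(s):
    """The sequence derivative: (delta s)_n = s_n - s_(n-1)."""
    return lambda n: s(n) - (s(n - 1) if n > 1 else 0)

def delta_inv(s):
    """The sequence integral: (delta_inv s)_n = s_1 + s_2 + ... + s_n."""
    return lambda n: sum(s(i) for i in range(1, n + 1))

squares = lambda n: n * n
print(delta(squares)(5))      # s_5 - s_4 = 25 - 16
print(delta_inv(squares)(3))  # 1 + 4 + 9

# the integral of the derivative recovers the sequence, by telescoping
assert all(delta_inv(delta(squares))(n) == squares(n) for n in range(1, 10))
```

Note that, with the zero convention for non-positive indices, the integral of the derivative recovers the original sequence exactly, as the final assertion confirms.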
Exponential Sequences
The sequence differential equation with which we began our exploration was
\[
\Delta \, s_n = s_{n-1}
\]
which was very much inspired by a common definition of the exponential function
\[
D \, e^x = e^x
\]
where \(D\) is the differential operator for functions, standing for the derivative with respect to the argument of the function that follows it, which in this case is
\[
D \, e^x = \frac{\mathrm{d}}{\mathrm{d}x} e^x
\]
By our definition of the sequence differential operator, we have
\[
\begin{align*}
s_n - s_{n-1} &= s_{n-1}\\
s_n &= 2 \times s_{n-1}
\end{align*}
\]
which is trivially satisfied by
\[
s_n = a \times 2^n
\]
for any constant \(a\), although for the canonical exponential sequence we shall choose it to be equal to one
\[
s^\mathrm{exp}_n = 2^n
\]
Furthermore, just as we can with the exponential function's differential equation, we can generalise this to
\[
\Delta \, s_n = c \times s_{n-1}
\]
yielding arbitrary exponential sequences
\[
\begin{align*}
s_n - s_{n-1} &= c \times s_{n-1}\\
s_n &= (c+1) \times s_{n-1}\\
s_n &= a \times (c+1)^n
\end{align*}
\]
Now this is but a slight reworking of the definition of a geometric sequence; a fact that we can highlight with
\[
\begin{align*}
r & = c+1\\
a^\prime & = a \times (c+1)\\
s_n &= a \times (c+1)^n = a^\prime \times r^{n-1}
\end{align*}
\]
Nevertheless, we shall continue to use the term exponential sequence in the context of differential equations.
Note that the similarity between exponential functions and sequences persists for higher order derivatives with
\[
\begin{align*}
D^{\,d} \, \left(a \times c^x\right) &= (\ln c)^d \times \left(a \times c^x\right)\\
\Delta^d \, \left(a \times c^n\right) &= (c-1)^d \times \left(a \times c^{n-d}\right)
\end{align*}
\]
at least for \(n\) greater than \(d\), as demonstrated by deck 1.
Deck 1: Exponential Sequence Derivatives



Here we have used our formula for the higher order derivatives of a sequence \(s_n\)
\[
\Delta^d \, s_n = \sum_{r=0}^d (-1)^r \times {^d}C_r \, s_{n-r}
\]
where \(^{d}C_r\) is the combination, which equals the number of ways that we can choose \(r\) from \(d\) items if the order of their choosing is unimportant, given by
\[
{^d}C_r = \frac{d!}{r! \times (d-r)!}
\]
where the exclamation mark stands for the factorial, being the product of every integer from one to that preceding it.
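These formulae are readily put to the test; the following Python sketch recreates, in essence, the demonstration of deck 1, computing higher order derivatives with the alternating-sign sum and comparing them against \((c-1)^d \times a \times c^{n-d}\), with the values of \(a\), \(c\) and \(d\) being arbitrary choices.

```python
from math import comb

# A sketch in the spirit of deck 1: the d-th order derivative of the
# exponential sequence a * c**n, computed with the alternating-sign sum
# formula, equals (c-1)**d * a * c**(n-d) whenever n exceeds d.

def delta_d(s, d, n):
    # the d-th order derivative of the sequence s at index n
    return sum((-1) ** r * comb(d, r) * s(n - r) for r in range(d + 1))

a, c = 3, 2  # arbitrarily chosen constants
s = lambda n: a * c ** n
for d in range(5):
    for n in range(d + 1, 12):
        assert delta_d(s, d, n) == (c - 1) ** d * a * c ** (n - d)
```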
Sinusoidal Sequences
Given this property of the higher order derivatives of exponential sequences, we can choose to define the canonical exponential sequence with the second order differential equation
\[
\Delta^2 \, s^\mathrm{exp}_n = s^\mathrm{exp}_{n-2}
\]
which we can confirm does indeed have the same solution as the first order differential equation with
\[
\begin{align*}
s^\mathrm{exp}_n - 2 \times s^\mathrm{exp}_{n-1} + s^\mathrm{exp}_{n-2} &= s^\mathrm{exp}_{n-2}\\
s^\mathrm{exp}_n &= 2 \times s^\mathrm{exp}_{n-1}
\end{align*}
\]
provided that we explicitly define
\[
s^\mathrm{exp}_1 = 2
\]
We were consequently curious to see whether or not the second order differential equation
\[
\Delta^2 \, s_n = -s_{n-2}
\]
might admit a sinusoidal solution akin to
\[
D^{\,2} \, \left(a \times \sin(x) + b \times \cos(x)\right) = -\left(a \times \sin(x) + b \times \cos(x)\right)
\]
Expanding the second order derivative of \(s_n\) yields
\[
\begin{align*}
s_n - 2 \times s_{n-1} + s_{n-2} &= -s_{n-2}\\
s_n &= 2 \times s_{n-1} - 2 \times s_{n-2}\\
&= 2 \times \Delta \, s_{n-1}
\end{align*}
\]
which, upon the face of it, does not seem to suggest a relationship with sinusoidal functions. To get a sense of how this sequence behaves my fellow students and I built a deck to figure its terms assuming that \(s_1\) equals one and, as per usual, that terms with non-positive indices equal zero.
Deck 2: The Terms Of The Sequence



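In the same spirit as deck 2, a short Python sketch suffices to figure the first few terms, again taking \(s_1\) to equal one and terms with non-positive indices to equal zero.

```python
# A sketch in the spirit of deck 2: the first few terms of the sequence
# defined by s_n = 2*s_(n-1) - 2*s_(n-2), with s_1 equal to one and terms
# with non-positive indices equal to zero.

def s(n):
    if n <= 0:
        return 0
    if n == 1:
        return 1
    return 2 * s(n - 1) - 2 * s(n - 2)

print([s(n) for n in range(1, 9)])  # -> [1, 2, 2, 0, -4, -8, -8, 0]
```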
Whilst the terms certainly oscillate between positive and negative values they also grow exponentially, quite unlike sinusoidal functions. If we divide each term by \(2^{\frac12 n}\), however, we recover something much more familiar, as demonstrated by deck 3.
Deck 3: A Sinusoidal Sequence



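The scaling demonstrated by deck 3 can likewise be checked with a Python sketch, dividing each term by \(2^{\frac12 n}\) and comparing the result with samples of a sine wave; the tolerance merely allows for floating point rounding.

```python
from math import sin, pi, isclose

# A sketch in the spirit of deck 3: dividing each term of the sequence by
# 2**(n/2) recovers samples of a sine wave of period eight, sin(n*pi/4).

def s(n):
    if n <= 0:
        return 0
    if n == 1:
        return 1
    return 2 * s(n - 1) - 2 * s(n - 2)

for n in range(1, 17):
    assert isclose(s(n) / 2 ** (0.5 * n), sin(n * pi / 4), abs_tol=1e-9)
```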
This strongly suggests that this particular solution to the differential equation is
\[
s_n = 2^{\frac12 n} \times \sin \left(n \times \tfrac{\pi}{4}\right)
\]
To prove it we require that
\[
\begin{align*}
\Delta^2 \, s_n + s_{n-2} &= 2^{\frac12 n} \times \sin \left(n \times \tfrac{\pi}{4}\right)
- 2 \times 2^{\frac12 (n-1)} \times \sin \left((n-1) \times \tfrac{\pi}{4}\right)
+ 2 \times 2^{\frac12 (n-2)} \times \sin \left((n-2) \times \tfrac{\pi}{4}\right)\\
&= 2^{\frac12 n} \times \left(\sin \left(\tfrac{n\pi}{4}\right)
- \sqrt{2} \times \sin \left(\tfrac{n\pi}{4} - \tfrac{\pi}{4}\right)
+ \sin \left(\tfrac{n\pi}{4} - \tfrac{\pi}{2}\right)\right)\\
&= 2^{\frac12 n} \times \left(\sin \left(\tfrac{n\pi}{4}\right)
- \sqrt{2} \times \sin \left(\tfrac{n\pi}{4} - \tfrac{\pi}{4}\right)
- \cos \left(\tfrac{n\pi}{4}\right)\right)
\end{align*}
\]
be equal to zero, which it most certainly will if, for any \(\theta\)
\[
\sin(\theta) - \cos(\theta) = \sqrt{2} \times \sin \left(\theta - \tfrac{\pi}{4}\right)
\]
To show that this is indeed the case, my fellow students and I turned to the angle addition formula for the sine function
\[
\sin(\theta + \alpha) = \sin(\theta) \times \cos(\alpha) + \cos(\theta) \times \sin(\alpha)
\]
which in this case equates to
\[
\begin{align*}
\sin\left(\theta - \tfrac{\pi}{4}\right) &= \sin(\theta) \times \cos\left(-\tfrac{\pi}{4}\right) + \cos(\theta) \times \sin\left(-\tfrac{\pi}{4}\right)\\
&= \sin(\theta) \times \tfrac{1}{\sqrt{2}} - \cos(\theta) \times \tfrac{1}{\sqrt{2}}\\
&= \tfrac{1}{\sqrt{2}} \times \left(\sin(\theta) - \cos(\theta)\right)
\end{align*}
\]
as required, and we consequently named this sequence
\[
s^\mathrm{sin}_n = 2^{\frac12 n} \times \sin \left(\tfrac{n\pi}{4}\right)
\]
Similarly, we defined
\[
s^\mathrm{cos}_n = 2^{\frac12 n} \times \cos \left(\tfrac{n\pi}{4}\right)
\]
so that we might recover analogues of the trigonometric identities
\[
\begin{align*}
\sin^2(\theta) + \cos^2(\theta) &= 1\\
\cos\left(\theta-\tfrac{\pi}{2}\right) &= \sin(\theta)\\
\sin\left(\theta-\tfrac{\pi}{2}\right) &= -\cos(\theta)
\end{align*}
\]
with
\[
\begin{align*}
\left(s^\mathrm{sin}_n\right)^2 + \left(s^\mathrm{cos}_n\right)^2
&= 2^n \times \sin^2 \left(\tfrac{n\pi}{4}\right) + 2^n \times \cos^2 \left(\tfrac{n\pi}{4}\right)\\
&= 2^n \times \left(\sin^2 \left(\tfrac{n\pi}{4}\right) + \cos^2 \left(\tfrac{n\pi}{4}\right)\right)\\
&= 2^n = s^\mathrm{exp}_n
\end{align*}
\]
\[
\begin{align*}
s^\mathrm{cos}_{n-2}
&= 2^{\frac12 (n-2)} \times \cos \left(\tfrac{(n-2)\pi}{4}\right)\\
&= \tfrac12 \times 2^{\frac12 n} \times \cos \left(\tfrac{n\pi}{4} - \tfrac{\pi}{2}\right)\\
&= \tfrac12 \times 2^{\frac12 n} \times \sin \left(\tfrac{n\pi}{4}\right)\\
&= \tfrac12 s^\mathrm{sin}_n
\end{align*}
\]
\[
\begin{align*}
s^\mathrm{sin}_{n-2}
&= 2^{\frac12 (n-2)} \times \sin \left(\tfrac{(n-2)\pi}{4}\right)\\
&= \tfrac12 \times 2^{\frac12 n} \times \sin \left(\tfrac{n\pi}{4} - \tfrac{\pi}{2}\right)\\
&= -\tfrac12 \times 2^{\frac12 n} \times \cos \left(\tfrac{n\pi}{4}\right)\\
&= -\tfrac12 s^\mathrm{cos}_n
\end{align*}
\]
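These identities are easily confirmed numerically, as in the following Python sketch of the sequences \(s^\mathrm{sin}_n\) and \(s^\mathrm{cos}_n\); the tolerances again allow for floating point rounding.

```python
from math import sin, cos, pi, isclose

# A sketch confirming the sequence analogues of the trigonometric identities
# for s_sin(n) = 2**(n/2) * sin(n*pi/4) and s_cos(n) = 2**(n/2) * cos(n*pi/4).

s_sin = lambda n: 2 ** (0.5 * n) * sin(n * pi / 4)
s_cos = lambda n: 2 ** (0.5 * n) * cos(n * pi / 4)

for n in range(3, 21):
    # the squares sum to the exponential sequence 2**n
    assert isclose(s_sin(n) ** 2 + s_cos(n) ** 2, 2.0 ** n, rel_tol=1e-9)
    # the analogues of the phase-shift identities
    assert isclose(s_cos(n - 2), 0.5 * s_sin(n), abs_tol=1e-6)
    assert isclose(s_sin(n - 2), -0.5 * s_cos(n), abs_tol=1e-6)
```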
Note that since any multiple of \(s^\mathrm{sin}_n\) is also a solution of the differential equation, this means that \(s^\mathrm{cos}_n\) must be too, albeit only for \(n\) greater than two, and we consequently have a general solution with
\[
\Delta^2 \, \left(a \times s^\mathrm{sin}_n + b \times s^\mathrm{cos}_n\right)
= -\left(a \times s^\mathrm{sin}_{n-2} + b \times s^\mathrm{cos}_{n-2}\right)
\]
for any constants \(a\) and \(b\), as shown by deck 4.
Deck 4: A General Solution



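In the manner of deck 4, a Python sketch can confirm the general solution for an arbitrarily chosen pair of constants \(a\) and \(b\).

```python
from math import sin, cos, pi, isclose

# A sketch in the spirit of deck 4: a*s_sin(n) + b*s_cos(n) satisfies the
# second order differential equation delta^2 s_n = -s_(n-2).

a, b = 1.5, -0.25  # arbitrarily chosen constants
s = lambda n: 2 ** (0.5 * n) * (a * sin(n * pi / 4) + b * cos(n * pi / 4))

for n in range(3, 21):
    delta2 = s(n) - 2 * s(n - 1) + s(n - 2)
    assert isclose(delta2, -s(n - 2), abs_tol=1e-6)
```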
Furthermore, for \(n\) greater than one, we have
\[
\begin{align*}
\Delta \, s^\mathrm{sin}_{n}
&= 2^{\frac12 n} \times \sin \left(\tfrac{n\pi}{4}\right)
- 2^{\frac12 (n-1)} \times \sin \left(\tfrac{(n-1)\pi}{4}\right)\\
&= \sqrt{2} \times 2^{\frac12 (n-1)} \times \sin \left(\tfrac{(n-1)\pi}{4} + \tfrac{\pi}{4}\right)
- 2^{\frac12 (n-1)} \times \sin \left(\tfrac{(n-1)\pi}{4}\right)\\
&= \sqrt{2} \times 2^{\frac12 (n-1)} \times \left(\sin \left(\tfrac{(n-1)\pi}{4}\right) \times \cos\left(\tfrac{\pi}{4}\right) + \cos \left(\tfrac{(n-1)\pi}{4}\right) \times \sin\left(\tfrac{\pi}{4}\right)\right)
- 2^{\frac12 (n-1)} \times \sin \left(\tfrac{(n-1)\pi}{4}\right)\\
&= \sqrt{2} \times 2^{\frac12 (n-1)} \times \left(\sin \left(\tfrac{(n-1)\pi}{4}\right) \times \tfrac{1}{\sqrt{2}} + \cos \left(\tfrac{(n-1)\pi}{4}\right) \times \tfrac{1}{\sqrt{2}}\right)
- 2^{\frac12 (n-1)} \times \sin \left(\tfrac{(n-1)\pi}{4}\right)\\
&= 2^{\frac12 (n-1)} \times \cos \left(\tfrac{(n-1)\pi}{4}\right)
= s^\mathrm{cos}_{n-1}
\end{align*}
\]
and
\[
\begin{align*}
\Delta \, s^\mathrm{cos}_{n}
&= 2^{\frac12 n} \times \cos \left(\tfrac{n\pi}{4}\right)
- 2^{\frac12 (n-1)} \times \cos \left(\tfrac{(n-1)\pi}{4}\right)\\
&= \sqrt{2} \times 2^{\frac12 (n-1)} \times \cos \left(\tfrac{(n-1)\pi}{4} + \tfrac{\pi}{4}\right)
- 2^{\frac12 (n-1)} \times \cos \left(\tfrac{(n-1)\pi}{4}\right)\\
&= \sqrt{2} \times 2^{\frac12 (n-1)} \times \sin \left(\tfrac{(n-1)\pi}{4} + \tfrac{\pi}{4} + \tfrac{\pi}{2}\right)
- 2^{\frac12 (n-1)} \times \cos \left(\tfrac{(n-1)\pi}{4}\right)\\
&= \sqrt{2} \times 2^{\frac12 (n-1)} \times \left(\sin \left(\tfrac{(n-1)\pi}{4} + \tfrac{\pi}{2}\right) \times \tfrac{1}{\sqrt{2}} + \cos \left(\tfrac{(n-1)\pi}{4} + \tfrac{\pi}{2}\right) \times \tfrac{1}{\sqrt{2}}\right)
- 2^{\frac12 (n-1)} \times \cos \left(\tfrac{(n-1)\pi}{4}\right)\\
&= \sqrt{2} \times 2^{\frac12 (n-1)} \times \left(\cos \left(\tfrac{(n-1)\pi}{4}\right) \times \tfrac{1}{\sqrt{2}} - \sin \left(\tfrac{(n-1)\pi}{4}\right) \times \tfrac{1}{\sqrt{2}}\right)
- 2^{\frac12 (n-1)} \times \cos \left(\tfrac{(n-1)\pi}{4}\right)\\
&= -2^{\frac12 (n-1)} \times \sin \left(\tfrac{(n-1)\pi}{4}\right)
= -s^\mathrm{sin}_{n-1}
\end{align*}
\]
reflecting the rules for differentiating the sine and cosine functions
\[
\begin{align*}
D \, \sin(x) &= \cos(x)\\
D \, \cos(x) &= -\sin(x)
\end{align*}
\]
as confirmed by deck 5.
Deck 5: Sinusoidal Sequence Derivatives



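In the manner of deck 5, these derivative rules can be confirmed numerically with another Python sketch.

```python
from math import sin, cos, pi, isclose

# A sketch in the spirit of deck 5: the derivatives of the sinusoidal
# sequences satisfy delta s_sin(n) = s_cos(n-1) and
# delta s_cos(n) = -s_sin(n-1).

s_sin = lambda n: 2 ** (0.5 * n) * sin(n * pi / 4)
s_cos = lambda n: 2 ** (0.5 * n) * cos(n * pi / 4)

for n in range(2, 21):
    assert isclose(s_sin(n) - s_sin(n - 1), s_cos(n - 1), abs_tol=1e-6)
    assert isclose(s_cos(n) - s_cos(n - 1), -s_sin(n - 1), abs_tol=1e-6)
```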
The Fibonacci Sequence
Just as the solutions to differential equations of the form
\[
\Delta \, s_n = c \times s_{n-1}
\]
are simply geometric sequences, so the solution to the differential equation
\[
\begin{align*}
\Delta \, s_n &= s_{n-2}\\
s_1 &= 1
\end{align*}
\]
is but the Fibonacci sequence, as revealed by replacing the derivative term with its definition and rearranging
\[
\begin{align*}
s_n - s_{n-1} &= s_{n-2}\\
s_n &= s_{n-1} + s_{n-2}
\end{align*}
\]
and noting again that we have adopted the convention that terms with non-positive indices equate to zero.
This set us to wondering quite how the solution to
\[
\begin{align*}
\Delta \, s_n &= -s_{n-2}\\
s_1 &= 1
\end{align*}
\]
which, by the same reasoning, must be given by
\[
s_n = s_{n-1} - s_{n-2}
\]
might behave and so we put together deck 6 to investigate.
Deck 6: The AntiFibonacci Sequence



By mere inspection it's quite evident that this is simply another sinusoidal sequence; specifically
\[
s_n = \tfrac{2}{\sqrt{3}} \sin\left(\tfrac{n\pi}{3}\right)
\]
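A Python sketch, in the spirit of deck 6, confirms the match between the anti-Fibonacci recurrence and this sinusoidal formula, and reveals the sequence's period-six oscillation.

```python
from math import sin, pi, isclose

# A sketch in the spirit of deck 6: the anti-Fibonacci sequence
# s_n = s_(n-1) - s_(n-2), with s_1 equal to one, coincides with the
# sinusoidal formula (2/sqrt(3)) * sin(n*pi/3).

def s(n):
    if n <= 0:
        return 0
    if n == 1:
        return 1
    return s(n - 1) - s(n - 2)

print([s(n) for n in range(1, 13)])  # -> [1, 1, 0, -1, -1, 0, 1, 1, 0, -1, -1, 0]
for n in range(1, 13):
    assert isclose(s(n), 2 / 3 ** 0.5 * sin(n * pi / 3), abs_tol=1e-9)
```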
In retrospect this was perhaps not so very surprising given that the terms of the Fibonacci sequence can be expressed as a function of exponentials with
\[
s_n = \frac{\phi^n - \left(-\phi\right)^{-n}}{\sqrt{5}}
\]
where \(\phi\) is the golden ratio, given by
\[
\phi = \frac{1 + \sqrt{5}}{2}
\]
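This closed form is itself readily checked against the recurrence with a short Python sketch; the tolerance allows for floating point rounding in the powers of \(\phi\).

```python
from math import isclose

# A sketch comparing the closed form of the Fibonacci sequence, expressed in
# terms of the golden ratio phi, with the recurrence s_n = s_(n-1) + s_(n-2).

phi = (1 + 5 ** 0.5) / 2

def fib(n):
    if n <= 0:
        return 0
    if n == 1:
        return 1
    return fib(n - 1) + fib(n - 2)

for n in range(1, 16):
    assert isclose(fib(n), (phi ** n - (-phi) ** -n) / 5 ** 0.5, abs_tol=1e-9)
```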
Tricks Of The Trade
Unfortunately, my fellow students and I have found that many of the tricks that we employ to solve ordinary differential equations cannot be tidily modified to apply to sequences. For example, to solve differential equations of the form
\[
\frac{\mathrm{d}^2 y}{\mathrm{d}x^2} = f(y)
\]
we exploit the fact that, by the chain rule
\[
\frac{\mathrm{d}}{\mathrm{d}x} \left(\frac{\mathrm{d}y}{\mathrm{d}x}\right)^2
= 2 \left(\frac{\mathrm{d}y}{\mathrm{d}x}\right) \times \frac{\mathrm{d}}{\mathrm{d}x} \frac{\mathrm{d}y}{\mathrm{d}x}
= 2 \left(\frac{\mathrm{d}y}{\mathrm{d}x}\right) \frac{\mathrm{d}^2 y}{\mathrm{d}x^2}
\]
so that
\[
\begin{align*}
2 \left(\frac{\mathrm{d}y}{\mathrm{d}x}\right) \frac{\mathrm{d}^2 y}{\mathrm{d}x^2}
&= 2 \left(\frac{\mathrm{d}y}{\mathrm{d}x}\right) f(y)\\
\int 2 \left(\frac{\mathrm{d}y}{\mathrm{d}x}\right) \frac{\mathrm{d}^2 y}{\mathrm{d}x^2} \, \mathrm{d}x
&= \int 2 \left(\frac{\mathrm{d}y}{\mathrm{d}x}\right) f(y) \, \mathrm{d}x\\
\int \frac{\mathrm{d}}{\mathrm{d}x} \left(\frac{\mathrm{d}y}{\mathrm{d}x}\right)^2 \, \mathrm{d}x
&= \int 2 f(y) \, \frac{\mathrm{d}y}{\mathrm{d}x} \mathrm{d}x\\
\left(\frac{\mathrm{d}y}{\mathrm{d}x}\right)^2
&= \int 2 f(y) \, \mathrm{d}y + A
\end{align*}
\]
for any constant \(A\). Now if we define
\[
g(y) = \int 2 f(y) \, \mathrm{d}y + A
\]
then
\[
\begin{align*}
\frac{\mathrm{d}y}{\mathrm{d}x} &= \sqrt{g(y)}\\
\frac{1}{\sqrt{g(y)}} \, \mathrm{d}y &= \mathrm{d}x\\
\int \frac{1}{\sqrt{g(y)}} \, \mathrm{d}y &= \int \mathrm{d}x = x + B
\end{align*}
\]
for any constant \(B\), and the solution is consequently
\[
\begin{align*}
h(y) &= \int \frac{1}{\sqrt{g(y)}} \, \mathrm{d}y - B = x\\
y &= h^{-1}(x)
\end{align*}
\]
The problem that has dogged my fellow students and me with regard to translating such techniques to sequences is that we have found ourselves quite unable to figure equivalents for the transformations upon which they rely, such as the chain rule.
Thankfully, however, we can at least transform sequence differential equations into comparatively simple recurrence relations. For example, we can rearrange
\[
\Delta^2 \, s_n = f\left(s_n\right)
\]
into
\[
\begin{align*}
\Delta \, s_n - \Delta \, s_{n-1} &= f\left(s_n\right)\\
\Delta \, s_n &= \Delta \, s_{n-1} + f\left(s_n\right)
\end{align*}
\]
which, upon integrating, yields
\[
\begin{align*}
s_n &= s_{n-1} + \Delta^{-1} \, f\left(s_n\right) + C\\
&= s_{n-1} + \sum_{i=1}^n f\left(s_i\right) + C\\
&= s_{n-1} + f\left(s_n\right) + \sum_{i=1}^{n-1} f\left(s_i\right) + C\\
&= s_{n-1} + f\left(s_n\right) + \Delta^{-1} \, f\left(s_{n-1}\right) + C
\end{align*}
\]
for some freely chosen constant \(C\), and we consequently have a solution with
\[
\begin{align*}
g\left(s_n\right) &= s_n - f\left(s_n\right) = s_{n-1} + \Delta^{-1} \, f\left(s_{n-1}\right) + C\\
s_n &= g^{-1}\left(s_{n-1} + \Delta^{-1} \, f\left(s_{n-1}\right) + C\right)
\end{align*}
\]
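To illustrate, the following Python sketch applies this scheme to the hypothetical, and conveniently invertible, choice \(f(s) = -k \times s\), for which \(g(s) = (1+k) \times s\) and \(g^{-1}(x) = x / (1+k)\), and confirms that the terms so computed do indeed satisfy the original differential equation; the values of \(k\) and \(s_1\) are arbitrary choices.

```python
# A sketch of the recurrence s_n = g_inv(s_(n-1) + delta_inv_f + C) for the
# hypothetical linear choice f(s) = -k*s, with g(s) = s - f(s) = (1+k)*s
# and hence g_inv(x) = x/(1+k). The values of k and s_1 are arbitrary.

k = 0.5
f = lambda s: -k * s
g_inv = lambda x: x / (1 + k)

s = [0.0, 1.0]          # s_0 = 0 by convention and s_1 = 1
C = s[1] - f(s[1])      # the constant implied by the choice of s_1
running = f(s[1])       # the running integral of f over the terms so far

for n in range(2, 20):
    s.append(g_inv(s[n - 1] + running + C))
    running += f(s[n])

# confirm that the computed terms satisfy delta^2 s_n = f(s_n)
for n in range(2, 20):
    assert abs((s[n] - 2 * s[n - 1] + s[n - 2]) - f(s[n])) < 1e-9
```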
This was all that my fellow students and I were able to discover before we needed to return to our studies, but we shall be sure to recommence our investigations as time allows.
\(\Box\)
References
[1] On A Calculus Of Differences, www.thusspakeak.com, 2017.
[2] Further On A Calculus Of Differences, www.thusspakeak.com, 2017.