A Costly Proposition


Now that we know the statistical properties of memoryless processes[1][2][3][4], being those in which the waiting time for the occurrence of an event is independent of how long we have already been waiting for it, I think it would make for an interesting postscript to briefly cover how we might use them to model real-world problems.

This will consequently be a rather maths-focussed post and so I shall not be using sidebars for the derivations. It won't include any new ak library functionality though, so if that prospect is a daunting one then you can just skip this post.

A Compound Poisson Process

If we suppose that each event in a memoryless process with rate \(\lambda\) has an associated cost independently distributed as \(P\), then we can conclude that the total cost in one unit of time, \(X\), is distributed as
\[ \begin{align*} k &\sim Poisson(\lambda)\\ X_j &\sim P\\ X &= \sum_{j=1}^k X_j \end{align*} \]
where \(\sim\) means drawn from and \(\sum\) is the summation sign. This is known as a compound Poisson process and is often used to model insurance claims, for example.
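To make this concrete, here's a minimal sketch of drawing total cost samples directly from this definition, using an exponential cost distribution and plain Math.random as stand-ins for the ak library's distributions; the samplers and the particular parameters are my own illustrative choices, not part of the model itself.

```javascript
// Draw from Poisson(lambda) with Knuth's multiplication method; fine for
// the modest rates used here, if not the most efficient choice.
function poissonSample(lambda) {
  const limit = Math.exp(-lambda);
  let k = 0, p = 1;
  do { ++k; p *= Math.random(); } while (p > limit);
  return k - 1;
}

// Draw an event cost from an exponential distribution with the given mean,
// standing in for the cost distribution P.
function costSample(mean) {
  return -mean * Math.log(1 - Math.random());
}

// The total cost in one unit of time: the sum of k ~ Poisson(lambda)
// independently drawn event costs.
function totalCostSample(lambda, mean) {
  const k = poissonSample(lambda);
  let x = 0;
  for (let j = 0; j < k; ++j) x += costSample(mean);
  return x;
}
```

By Wald's identity the expected total cost is \(\lambda\) times the expected event cost, which makes for a handy sanity check on the sampler.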

If we assume for the moment that \(P\) is a discrete distribution having the PMF
\[ p(x_j) = p_j \]
then events costing a particular amount \(x_j\) must also be triggered by a memoryless process and, since on average they make up a proportion \(p_j\) of all of the events, it must have a rate of \(\lambda p_j\). We can use this to simplify the formula for the distribution of \(X\) by turning it inside out
\[ \begin{align*} k_j &\sim Poisson(\lambda p_j)\\ X &= \sum_j x_j k_j \end{align*} \]
On the face of it, this might not seem to be much of a simplification, but it allows us to bring characteristic functions into play.
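As a quick check of this inside-out formulation, we can draw a total cost by giving each distinct cost its own Poisson count with rate \(\lambda p_j\); the Poisson sampler, costs and probabilities below are purely illustrative.

```javascript
// Draw from Poisson(lambda) with Knuth's multiplication method.
function poissonSample(lambda) {
  const limit = Math.exp(-lambda);
  let k = 0, p = 1;
  do { ++k; p *= Math.random(); } while (p > limit);
  return k - 1;
}

// The inside-out total cost: each distinct cost xs[j], occurring with
// probability ps[j], is triggered by its own memoryless process with rate
// lambda*ps[j], so the total is the sum of xs[j] times independent counts.
function totalCostByValue(lambda, xs, ps) {
  let total = 0;
  for (let j = 0; j < xs.length; ++j) {
    total += xs[j] * poissonSample(lambda * ps[j]);
  }
  return total;
}
```

Its expected value is \(\lambda \sum_j p_j x_j\), the same as that of the original formulation.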

The Characteristic Of Cost

Recall that the CF of a random variable \(X\) is given by the complex expectation
\[ \hat{p}_X(t) = \mathrm{E}\left[e^{itX}\right] \]
Now, the CF of the sum of a set of random variables is simply the product of their individual CFs. We can also easily show that the CF of a constant multiple of a random variable \(X\) is given by
\[ \hat{p}_{cX}(t) = \mathrm{E}\left[e^{it(cX)}\right] = \mathrm{E}\left[e^{i(ct)X}\right] = \hat{p}_{X}(ct) \]
Consequently, the CF of the total cost is given by
\[ \hat{p}(t) = \prod_j \hat{p}_{\lambda p_j}\left(x_j t\right) \]
where \(\prod\) is the product sign and \(\hat{p}_\lambda\) is the CF of a Poisson distribution with rate \(\lambda\). Expanding out these Poisson CFs yields
\[ \begin{align*} \hat{p}(t) &= \prod_j \exp\left(\lambda p_j\left(e^{i x_j t}-1\right)\right)\\ &= \exp\left(\sum_j \lambda p_j\left(e^{i x_j t}-1\right)\right)\\ &= \exp\left(\lambda \left(\sum_j p_j e^{i t x_j} - \sum_j p_j\right)\right) \end{align*} \]
Now the sum of the probabilities \(p_j\) must equal one and, since the expectation of a function of a discrete random variable is
\[ \mathrm{E}[f(X)] = \sum_j p_j f\left(x_j\right) \]
then the first sum is simply the CF of the event cost distribution, \(\hat{p}_X\). The CF of the total cost is therefore
\[ \hat{p}(t) = e^{\lambda \left(\hat{p}_X(t) - 1\right)} \]
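We can confirm this identity numerically for a small discrete cost distribution by comparing the product of the scaled Poisson CFs with the compound expression; the complex arithmetic helpers and the particular costs and probabilities here are of my own devising.

```javascript
// The complex exponential: exp(a+bi) = e^a (cos b + i sin b), returned as
// a [real, imaginary] pair.
function cexp(re, im) {
  const r = Math.exp(re);
  return [r * Math.cos(im), r * Math.sin(im)];
}

// The left hand side: the product over distinct costs xs[j] of the CFs of
// Poisson(lambda*ps[j]) evaluated at xs[j]*t.
function productCF(lambda, xs, ps, t) {
  let z = [1, 0];
  for (let j = 0; j < xs.length; ++j) {
    const f = cexp(lambda * ps[j] * (Math.cos(xs[j] * t) - 1),
                   lambda * ps[j] * Math.sin(xs[j] * t));
    z = [z[0] * f[0] - z[1] * f[1], z[0] * f[1] + z[1] * f[0]];
  }
  return z;
}

// The right hand side: exp(lambda*(cf_X(t)-1)) where cf_X is the CF of the
// discrete cost distribution, being the sum of ps[j]*exp(i*xs[j]*t).
function compoundCF(lambda, xs, ps, t) {
  let re = 0, im = 0;
  for (let j = 0; j < xs.length; ++j) {
    re += ps[j] * Math.cos(xs[j] * t);
    im += ps[j] * Math.sin(xs[j] * t);
  }
  return cexp(lambda * (re - 1), lambda * im);
}
```

The two agree to rounding error at every \(t\), as they must, the second simply being an algebraic rearrangement of the first.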
Note that this holds for continuous cost distributions too, which we can show by approximating their PDFs with PMFs
\[ p_\delta\left(x_j\right) = \int_{x_j-\delta}^{x_j+\delta} p(x) \, \mathrm{d}x \]
where \(x_{j+1} = x_j + 2\delta\), and taking the limit of the expectation as \(\delta\) tends to zero
\[ \begin{align*} \lim_{\delta \to 0} \sum_j p_\delta\left(x_j\right) f\left(x_j\right) &= \lim_{\delta \to 0} \sum_j \int_{x_j-\delta}^{x_j+\delta} p(x) \, \mathrm{d}x \, f\left(x_j\right)\\ &= \lim_{\delta \to 0} \sum_j \tfrac{1}{2}\times 2\delta \left(p\left(x_j-\delta\right) + p\left(x_j+\delta\right)\right) f\left(x_j\right)\\ &= \lim_{\delta \to 0} \sum_j \tfrac{1}{2}\times 2\delta \left(p\left(x_j-\delta\right) f\left(x_j\right) + p\left(x_j+\delta\right) f\left(x_j\right)\right)\\ &= \lim_{\delta \to 0} \sum_j \tfrac{1}{2}\times 2\delta \left(p\left(x_j-\delta\right) f\left(x_j-\delta\right) + p\left(x_j+\delta\right) f\left(x_j+\delta\right)\right)\\ &= \int_{\min(X)}^{\max(X)} p(x) f(x) \, \mathrm{d}x = \mathrm{E}\left[f(X)\right] \end{align*} \]

A Model Of Cost

When modelling the statistical behaviour of such processes we are typically interested in calculating expectations and probabilities and so will need their PDFs, CDFs and inverse CDFs rather than their CFs. Fortunately we have already implemented functions to approximate the former from the latter using the fast Fourier transform[5]: ak.fourierPDF, ak.fourierCDF and ak.fourierInvCDF.

In program 1 we compare a histogram of randomly generated total costs, using our first formula
\[ \begin{align*} k &\sim Poisson(\lambda)\\ X_j &\sim P\\ X &= \sum_{j=1}^k X_j \end{align*} \]
against the approximate CDF derived from the CF of the total cost
\[ \hat{p}(t) = e^{\lambda \left(\hat{p}_X(t) - 1\right)} \]
using a \(2^{10}\) point FFT by default.

Program 1: Probabilities Of Total Cost

All of the ak distribution functions are available to program 1, so I'd encourage you to try changing the cost distribution to see if the approximation still holds as well as it does for the log-normal distribution.

It will, of course, or I wouldn't have suggested it!

Whilst this is a nice visual example, it's a pretty atypical use of this model. In practice, we are usually interested in questions such as the probability that the total cost will exceed available funds, known as the risk of ruin, or the amount that the total cost will only exceed with a given probability, known as the value at risk. These are given by one minus the CDF at the value of those funds and the inverse CDF at one minus that probability respectively, as demonstrated by program 2.
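Without the ak Fourier distribution functions to hand, a Monte Carlo sketch of these two quantities, under the same assumed Poisson event rate and exponential costs as before, looks something like this; the samplers and parameters are again my own illustrative stand-ins.

```javascript
// Draw from Poisson(lambda) with Knuth's multiplication method.
function poissonSample(lambda) {
  const limit = Math.exp(-lambda);
  let k = 0, p = 1;
  do { ++k; p *= Math.random(); } while (p > limit);
  return k - 1;
}

// Draw an exponentially distributed event cost with the given mean.
function costSample(mean) {
  return -mean * Math.log(1 - Math.random());
}

// Draw a compound Poisson total cost.
function totalCostSample(lambda, mean) {
  const k = poissonSample(lambda);
  let x = 0;
  for (let j = 0; j < k; ++j) x += costSample(mean);
  return x;
}

// The empirical risk of ruin: the fraction of sampled total costs that
// exceed the available funds, approximating one minus the CDF.
function riskOfRuin(samples, funds) {
  let n = 0;
  for (const x of samples) if (x > funds) ++n;
  return n / samples.length;
}

// The empirical value at risk: the sampled cost exceeded with roughly the
// given probability, approximating the inverse CDF at one minus it.
function valueAtRisk(samples, prob) {
  const sorted = samples.slice().sort((a, b) => a - b);
  const j = Math.min(sorted.length - 1,
                     Math.floor((1 - prob) * sorted.length));
  return sorted[j];
}
```

Note that the empirical value at risk is simply an order statistic of the samples, so applying riskOfRuin to it recovers approximately the given probability.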

Program 2: Risk Of Ruin And Value At Risk

Note that to accurately approximate the total cost distribution we must be careful about how we choose the parameters of the Fourier distribution functions; specifically, the midpoint x and width w should be chosen so that the PDF at the ends of the range is almost zero and the order n of the FFT should be chosen so that there are sufficiently many points to keep the approximation error within acceptable limits.
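One way to make those choices is from the moments of the total cost: a compound Poisson distribution has mean \(\lambda \mathrm{E}[X]\) and variance \(\lambda \mathrm{E}\left[X^2\right]\), so centring the range on the mean and making it a good few standard deviations wide keeps the PDF at its ends close to zero. The helper below is a hypothetical convenience of my own, not part of the ak library.

```javascript
// Choose the midpoint x and width w of the Fourier distribution functions'
// range from the moments of the compound Poisson total cost: its mean is
// lambda*E[X] and its variance is lambda*E[X^2], so a range centred on the
// mean and stddevs standard deviations either side of it should leave the
// PDF negligible at both ends.
function fourierParams(lambda, costMean, costSecondMoment, stddevs) {
  const mu = lambda * costMean;
  const sigma = Math.sqrt(lambda * costSecondMoment);
  return {x: mu, w: 2 * stddevs * sigma};
}
```

For the exponential costs used above, with mean \(m\), the second moment is \(2m^2\); the order \(n\) of the FFT then simply controls how finely that range is sampled.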

Nevertheless, I still think that this is a pretty neat trick!


[1] How Long Must I Wait?, www.thusspakeak.com, 2015.

[2] kth Time's The Charm, www.thusspakeak.com, 2015.

[3] ...And Then Three Came Along At Once!, www.thusspakeak.com, 2015.

[4] The Longest Wait, www.thusspakeak.com, 2015.

[5] Characteristically Tricksy, www.thusspakeak.com, 2015.
