Recently in Probability Category

Archimedean View

Last time we took a look at how we could define copulas to represent the dependency between random variables by applying a generator function φ to the results of their cumulative distribution functions, or CDFs, summing them and then applying its inverse φ⁻¹ to that sum.
These are known as Archimedean copulas and are valid whenever φ is strictly decreasing over the interval [0,1], equals zero when its argument equals one and has nth derivatives that are non-negative over that interval when n is even and non-positive when it is odd, for n up to the number of random variables.
Whilst such copulas are relatively easy to implement, we saw that deriving their densities is a rather trickier job, in contrast to Gaussian copulas for which the reverse is true. In this post we shall see how to draw random vectors from Archimedean copulas, which is also much more difficult than doing so from Gaussian copulas.
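As a concrete, if purely illustrative, example of that construction, the sketch below builds the two dimensional Clayton copula, a standard member of the Archimedean family, from its generator in Python; note that this is a hedged sketch rather than the implementation developed in these posts.

  # An illustrative Archimedean copula: the Clayton copula with parameter theta,
  # built by summing the generator phi of its arguments and inverting the result.
  theta = 2.0

  def phi(t):
      # the Clayton generator, strictly decreasing on (0,1] with phi(1) = 0
      return (t ** -theta - 1.0) / theta

  def phi_inv(s):
      # the inverse of the generator
      return (1.0 + theta * s) ** (-1.0 / theta)

  def clayton(u, v):
      # C(u, v) = phi_inv(phi(u) + phi(v))
      return phi_inv(phi(u) + phi(v))

  print(clayton(0.5, 0.5))   # the probability that both uniforms are at most 0.5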

Full text...  

Archimedean Skew

About a year and a half ago we saw how we could use Gaussian copulas to define dependencies between the elements of a vector-valued multivariate random variable whose elements, when considered in isolation, were governed by arbitrary cumulative distribution functions, known as marginals. Whilst Gaussian copulas are quite flexible, they can't represent every possible dependency between those elements and in this post we shall take a look at some others defined by the Archimedean family of copulas.

Full text...  

Copulating Normally

Last year we took a look at multivariate uniformly distributed random variables, which generalise uniform random variables to multiple dimensions with random vectors whose elements are independently uniformly distributed. We have now seen how we can similarly generalise normally distributed random variables with the added property that the normally distributed elements of their vectors may be dependent upon each other; specifically that they may be correlated.
As it turns out, we can generalise this dependence to arbitrary sets of random variables with a fairly simple observation.
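At the heart of that generalisation is the probability integral transform; applying a continuous random variable's CDF to it yields a standard uniform, so correlated normals can be turned into correlated uniforms and thence into correlated values with whatever marginals we please. The Python sketch below, using numpy and scipy rather than the code developed in the post and with exponential marginals chosen purely for illustration, shows the idea.

  # An illustrative Gaussian copula: map correlated normals to correlated uniforms
  # with the normal CDF, then to arbitrary marginals with their inverse CDFs.
  import numpy as np
  from scipy.stats import norm, expon

  rho = 0.8
  cov = [[1.0, rho], [rho, 1.0]]

  rng = np.random.default_rng(42)
  z = rng.multivariate_normal([0.0, 0.0], cov, size=10000)

  u = norm.cdf(z)    # each column uniform on [0,1], but correlated
  x = expon.ppf(u)   # each column exponentially distributed, but correlated

  print(np.corrcoef(x, rowvar=False))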

Full text...  

The Cumulative Distribution Unction

We have previously seen how we can generalise normally distributed random variables to multiple dimensions by defining vectors with elements that are linear functions of independent standard normally distributed random variables, which have means of zero and standard deviations of one, with

  Z' = L × Z + μ

where L is a constant matrix, Z is a vector whose elements are the independent standard normally distributed random variables and μ is a constant vector.
So far we have derived and implemented the probability density function and the characteristic function of the multivariate normal distribution that governs such random vectors, but have yet to do the same for its cumulative distribution function since it's a rather more difficult task and thus requires a dedicated treatment, which we shall have in this post.
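To give a flavour of why it warrants one, the CDF is an integral of the density over a semi-infinite rectangular region that has no closed form expression in general and so must be approximated. A crude, purely illustrative, approach is to estimate P(Z' ≤ x) by the proportion of simulated vectors L × Z + μ whose elements all lie below those of x, as in the Python sketch below, which is emphatically not the algorithm derived in the post.

  # A crude Monte Carlo estimate of the multivariate normal CDF at x: the
  # proportion of samples of Z' = L*Z + mu that are no greater than x elementwise.
  import numpy as np

  mu = np.array([0.0, 1.0])
  L = np.array([[1.0, 0.0], [0.5, 0.5]])
  x = np.array([0.5, 1.5])

  rng = np.random.default_rng(0)
  z = rng.standard_normal((1000000, 2))      # independent standard normals
  zp = z @ L.T + mu                          # samples of Z'

  print(np.mean(np.all(zp <= x, axis=1)))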

Full text...  

Multiple Multiply Normal Functions

Last time we took a look at how we could define multivariate normally distributed random variables with linear functions of multiple independent standard univariate normal random variables.
Specifically, given a vector Z whose elements are independent standard univariate normal random variables, a constant vector μ and a constant matrix L

  Z' = L × Z + μ

has linearly dependent normally distributed elements, a mean vector of μ and a covariance matrix of

  Σ' = L × Lᵀ

where Lᵀ is the transpose of L, in which the rows and columns are switched.
We got as far as deducing the characteristic function and the probability density function of the multivariate normal distribution, leaving its cumulative distribution function and its complement aside until we'd implemented both them and the random variable itself, which we shall do in this post.
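As a quick and purely illustrative check of that construction, rather than of the implementation developed in the post, the Python sketch below draws such vectors for a desired covariance matrix by recovering a suitable L from it with the Cholesky decomposition and confirms that their sample mean and covariance are close to μ and Σ'.

  # Drawing multivariate normal vectors as Z' = L*Z + mu, with L chosen so
  # that L times its transpose equals a desired covariance matrix sigma.
  import numpy as np

  mu = np.array([1.0, -2.0])
  sigma = np.array([[4.0, 2.0], [2.0, 2.0]])   # the desired covariance matrix

  L = np.linalg.cholesky(sigma)                # satisfies L @ L.T == sigma
  rng = np.random.default_rng(1)
  z = rng.standard_normal((500000, 2))         # independent standard normal elements
  zp = z @ L.T + mu                            # samples of Z'

  print(zp.mean(axis=0))                       # close to mu
  print(np.cov(zp, rowvar=False))              # close to sigma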

Full text...  

Every Which Way Is Normal

A few months ago we saw how we could generalise the concept of a random variable to multiple dimensions by generating random vectors rather than numbers. Specifically we took a look at the multivariate uniform distribution which governs random vectors whose elements are independently uniformly distributed.
Whilst it demonstrated that we can find multivariate versions of distribution functions such as the probability density function, the cumulative distribution function and the characteristic function, the uniform distribution is fairly trivial and so, for a more interesting example, this time we shall look at generalising the normal distribution to multiple dimensions.

Full text...  

Into The Nth Dimension

A few years ago we took a look at some of the properties of uniformly distributed random variables, whose values have equal probabilities of falling within intervals of equal width within their ranges. A simple generalisation of these is the family of multivariate uniform distributions, which govern random vectors that fall within equal volumes with equal probability.
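For example, a uniformly distributed random vector on the unit square is equally likely to land in any two regions of equal area, which the short Python sketch below, given purely as an illustration, checks for a pair of rectangles of equal area.

  # A toy check that a uniform random vector on the unit square falls in
  # regions of equal area with approximately equal probability.
  import numpy as np

  rng = np.random.default_rng(2)
  u = rng.random((1000000, 2))

  in_a = np.mean((u[:, 0] < 0.5) & (u[:, 1] < 0.2))   # a 0.5 by 0.2 rectangle
  in_b = np.mean((u[:, 0] > 0.8) & (u[:, 1] < 0.5))   # a 0.2 by 0.5 rectangle
  print(in_a, in_b)                                   # both close to 0.1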

Full text...  

A Costly Proposition

Now that we know the statistical properties of memoryless processes, being those in which the waiting time for the occurrence of an event is independent of how long we have already been waiting for it, I think it would make for an interesting postscript to briefly cover how we might use them to model real world problems.

Full text...  

The Longest Wait

We have seen how the waiting time for an event in a memoryless process, being one in which the probability that we must wait some given time doesn't change no matter how long we've already been waiting, must be exponentially distributed, that the waiting time for the kth such event must be gamma distributed and that the number of such events occurring in one unit of time must be Poisson distributed.
This time I'd like to ask how long we must wait for the first and the last of several such processes that are happening at the same time to trigger an event.
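Without spoiling the post's derivations, two standard facts give a sense of the answer: the first of several independent memoryless processes to trigger does so after an exponentially distributed time whose rate is the sum of the individual rates, whilst the time until the last has a CDF equal to the product of the individual CDFs. The Python sketch below is a purely illustrative simulation check of both.

  # Simulating the first and last arrival times of several independent
  # memoryless (exponentially distributed) processes.
  import numpy as np

  rates = np.array([0.5, 1.0, 2.0])
  rng = np.random.default_rng(3)
  waits = rng.exponential(1.0 / rates, size=(1000000, 3))

  first = waits.min(axis=1)
  last = waits.max(axis=1)

  print(first.mean(), 1.0 / rates.sum())                         # exponential with the summed rate
  t = 1.5                                                        # an arbitrary time
  print(np.mean(last <= t), np.prod(1.0 - np.exp(-rates * t)))   # CDF of the last arrival at t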

Full text...  

...And Then Three Came Along At Once!

In the last few posts we have been looking at the properties of memoryless processes, being those processes that trigger events at intervals that are independent of how long we have been waiting for one to happen. First we answered the question of what is the probability that we must wait some given time for an event to occur, with the exponential distribution, and then the question of what is the probability that we must wait some given time for the kth event to occur, with the gamma distribution, which allows the somewhat counter-intuitive case of non-integer k.

This time I'd like to ask another question; what is the probability that we'll observe k events, occurring at a rate λ, in a period of one unit of time?
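As the teaser for The Longest Wait above has already given away, the answer is the Poisson distribution, under which the probability of observing k events at a rate λ in one unit of time is λ^k × e^(-λ) / k!. The Python sketch below, included purely as an illustration rather than as the post's derivation, checks that formula against a simulation that counts exponentially distributed inter-arrival times falling within one unit of time.

  # Counting events with exponentially distributed inter-arrival times that
  # occur within one unit of time, and comparing with the Poisson probability.
  import numpy as np
  from math import exp, factorial

  lam, k = 3.0, 2
  rng = np.random.default_rng(4)

  runs = 100000
  counts = np.empty(runs, dtype=int)
  for i in range(runs):
      t, n = 0.0, 0
      while True:
          t += rng.exponential(1.0 / lam)   # waiting time to the next event
          if t > 1.0:
              break
          n += 1
      counts[i] = n

  print(np.mean(counts == k))                 # simulated probability of k events
  print(lam ** k * exp(-lam) / factorial(k))  # Poisson probability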

Full text...