Recently in Probability Category

Copulating Normally

Last year we took a look at multivariate uniformly distributed random variables, which generalise uniform random variables to multiple dimensions with random vectors whose elements are independently uniformly distributed. We have now seen how we can similarly generalise normally distributed random variables with the added property that the normally distributed elements of their vectors may be dependent upon each other; specifically that they may be correlated.
As it turns out, we can generalise this dependence to arbitrary sets of random variables with a fairly simple observation.
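As a taste of how that might work, here is a minimal sketch in Python, assuming, since the teaser doesn't spell it out, that the observation in question is the probability integral transform, under which applying a distribution's cumulative distribution function to one of its own random variables yields a standard uniform; the covariance and marginal distributions are illustrative only.

  import numpy as np
  from scipy import stats

  rng = np.random.default_rng(42)

  # Correlated standard normals via the Cholesky factor of a covariance matrix.
  sigma = np.array([[1.0, 0.8],
                    [0.8, 1.0]])
  L = np.linalg.cholesky(sigma)
  z = rng.standard_normal((100000, 2)) @ L.T

  # The probability integral transform maps each element to a uniform
  # whilst preserving the dependence between them...
  u = stats.norm.cdf(z)

  # ...and the inverse CDFs of other distributions map those uniforms to
  # dependent random variables with the marginals of our choosing.
  x0 = stats.expon.ppf(u[:, 0])
  x1 = stats.gamma.ppf(u[:, 1], a=2.0)

  print(np.corrcoef(x0, x1)[0, 1])  # still strongly positively correlated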

Full text...

The Cumulative Distribution Unction

We have previously seen how we can generalise normally distributed random variables to multiple dimensions by defining vectors with elements that are linear functions of independent standard normally distributed random variables, having means of zero and standard deviations of one, with

  Z' = L × Z + μ

where L is a constant matrix, Z is a vector whose elements are the independent standard normally distributed random variables and μ is a constant vector.
So far we have derived and implemented the probability density function and the characteristic function of the multivariate normal distribution that governs such random vectors but have yet to do the same for its cumulative distribution function since it's a rather more difficult task and thus requires a dedicated treatment, which we shall have in this post.
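To get a sense of that difficulty, note that the CDF at a point x is the probability that every element of Z' is no greater than the corresponding element of x, a multiple integral that has no closed form in general. A crude Monte Carlo sketch of it in Python, with illustrative values for L, μ and x rather than the scheme developed in the post, is

  import numpy as np

  rng = np.random.default_rng(42)

  mu = np.array([0.0, 1.0])
  L = np.array([[1.0, 0.0],
                [0.5, 1.0]])
  x = np.array([1.0, 2.0])

  z = rng.standard_normal((100000, 2))   # independent standard normals
  zp = z @ L.T + mu                      # Z' = L × Z + μ

  # Estimate the CDF at x by the fraction of samples whose elements all
  # fall at or below the corresponding elements of x.
  print(np.mean(np.all(zp <= x, axis=1)))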

Full text...

Multiple Multiply Normal Functions

Last time we took a look at how we could define multivariate normally distributed random variables with linear functions of multiple independent standard univariate normal random variables.
Specifically, given a vector Z whose elements are independent standard univariate normal random variables, a constant vector μ and a constant matrix L

  Z' = L × Z + μ

has linearly dependent normally distributed elements, a mean vector of μ and a covariance matrix of

  Σ' = L × Lᵀ

where Lᵀ is the transpose of L, in which the rows and columns are switched.
We got as far as deducing the characteristic function and the probability density function of the multivariate normal distribution, leaving its cumulative distribution function and its complement aside until we'd implemented both them and the random variable itself, which we shall do in this post.
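As a quick sanity check of those formulae, the following Python sketch, with an illustrative L and μ, draws a large sample of Z' and compares its sample mean and covariance with μ and L × Lᵀ.

  import numpy as np

  rng = np.random.default_rng(42)

  mu = np.array([1.0, -1.0])
  L = np.array([[2.0, 0.0],
                [1.0, 0.5]])

  z = rng.standard_normal((200000, 2))   # independent standard normals
  zp = z @ L.T + mu                      # Z' = L × Z + μ

  print(np.mean(zp, axis=0))             # should be close to μ
  print(np.cov(zp, rowvar=False))        # should be close to L × Lᵀ
  print(L @ L.T)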

Full text...

Every Which Way Is Normal

A few months ago we saw how we could generalise the concept of a random variable to multiple dimensions by generating random vectors rather than numbers. Specifically we took a look at the multivariate uniform distribution which governs random vectors whose elements are independently uniformly distributed.
Whilst it demonstrated that we can find multivariate versions of distribution functions such as the probability density function, the cumulative distribution function and the characteristic function, the uniform distribution is fairly trivial and so, for a more interesting example, this time we shall look at generalising the normal distribution to multiple dimensions.
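As a preview of where this leads, the multivariate normal's density at a point x, for a mean vector μ and covariance matrix Σ, can be evaluated directly and checked numerically; a minimal Python sketch, with illustrative values for μ, Σ and x, is

  import numpy as np
  from scipy import stats

  mu = np.array([0.0, 1.0])
  sigma = np.array([[2.0, 0.6],
                    [0.6, 1.0]])
  x = np.array([0.5, 0.5])

  # The density exp(-(x-μ)ᵀ Σ⁻¹ (x-μ)/2) / √((2π)ⁿ |Σ|) evaluated directly...
  n = len(mu)
  d = x - mu
  p = (np.exp(-0.5 * d @ np.linalg.solve(sigma, d))
       / np.sqrt((2.0 * np.pi) ** n * np.linalg.det(sigma)))

  # ...should agree with scipy's implementation.
  print(p)
  print(stats.multivariate_normal(mu, sigma).pdf(x))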

Full text...

Into The Nth Dimension

A few years ago we took a look at some of the properties of uniformly distributed random variables, whose values have equal probabilities of falling within intervals of equal width within their ranges. A simple generalisation of these is the multivariate uniform distribution, which governs random vectors that fall within equal volumes with equal probability.
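For instance, a random vector with independent standard uniform elements falls within an axis-aligned box inside the unit cube with probability equal to the box's volume, as the following Python sketch, with an illustrative box, demonstrates.

  import numpy as np

  rng = np.random.default_rng(42)

  lo = np.array([0.1, 0.2, 0.3])   # an illustrative box within the unit cube
  hi = np.array([0.6, 0.7, 0.8])

  u = rng.random((100000, 3))      # multivariate uniform random vectors
  inside = np.all((u >= lo) & (u <= hi), axis=1)

  print(np.mean(inside))           # close to the box's volume...
  print(np.prod(hi - lo))          # ...which is 0.5 × 0.5 × 0.5 = 0.125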

Full text...

A Costly Proposition

Now that we know the statistical properties of memoryless processes, being those in which the waiting time for the occurrence of an event is independent of how long we have already been waiting for it, I think it would make for an interesting postscript to briefly cover how we might use them to model real world problems.

Full text...

The Longest Wait

We have seen how the waiting time for an event in a memoryless process, being one in which the probability that we must wait some given time doesn't change no matter how long we've already been waiting, must be exponentially distributed, that the waiting time for the kth such event must be gamma distributed and that the number of such events occurring in one unit of time must be Poisson distributed.
This time I'd like to ask how long we must wait for the first and the last of several such processes that are happening at the same time to trigger an event.
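As a hint at the answer, the wait for the first event is the minimum of the processes' exponential waiting times and that for the last is their maximum; the former turns out to be exponential with the summed rate, as the following Python sketch, with illustrative rates, suggests.

  import numpy as np

  rng = np.random.default_rng(42)

  rates = np.array([0.5, 1.0, 2.0])                    # illustrative rates
  t = rng.exponential(1.0 / rates, size=(100000, 3))   # one column per process

  first = t.min(axis=1)   # the wait for the first of the processes to trigger
  last = t.max(axis=1)    # the wait for the last of them to trigger

  # The minimum is exponentially distributed with the summed rate, so its
  # sample mean should be close to one over that sum...
  print(np.mean(first), 1.0 / np.sum(rates))
  # ...whilst the maximum is not exponential at all; its CDF is the product
  # of the individual processes' CDFs.
  print(np.mean(last))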

Full text...

...And Then Three Came Along At Once!

In the last few posts we have been looking at the properties of memoryless processes, being those processes that trigger events at intervals that are independent of how long we have been waiting for one to happen. First we answered the question of how likely it is that we must wait some given time for an event to occur with the exponential distribution, and then the question of how likely it is that we must wait some given time for the kth event to occur with the gamma distribution, which allows the somewhat counter-intuitive case of non-integer k.

This time I'd like to ask another question: what is the probability that we'll observe k events, occurring at a rate λ, in a period of one unit of time?
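To foreshadow the answer, we can count the events falling within one unit of time by accumulating exponentially distributed gaps and compare the observed frequencies with the Poisson probabilities; a minimal Python sketch, with an illustrative rate, is

  import math
  import numpy as np

  rng = np.random.default_rng(42)

  lam = 3.0
  counts = np.empty(100000, dtype=int)
  for i in range(len(counts)):
      t, k = rng.exponential(1.0 / lam), 0
      while t <= 1.0:                      # accumulate exponential gaps
          k += 1                           # until we pass one unit of time
          t += rng.exponential(1.0 / lam)
      counts[i] = k

  for k in range(6):
      print(k, np.mean(counts == k),                        # observed frequency
            math.exp(-lam) * lam ** k / math.factorial(k))  # Poisson probability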

Full text...

Kth Time's The Charm

Last time we proved that if the waiting time for an event does not depend upon how long we have already been waiting then it must be exponentially distributed. I promised that there was a lot more to say about such memoryless waiting time processes and in this post we shall begin to investigate them further. Specifically, we shall ask how long we must wait for more than one event to occur.
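A sketch of the answer for whole numbers of events: the wait for the kth event is the sum of k independent exponential waits and is consequently gamma distributed, as comparing a few sample quantiles against the gamma's in Python, with an illustrative rate and k, suggests.

  import numpy as np
  from scipy import stats

  rng = np.random.default_rng(42)

  lam, k = 2.0, 3    # illustrative rate and number of events

  # The wait for the kth event is the sum of k independent exponential waits.
  waits = rng.exponential(1.0 / lam, size=(100000, k)).sum(axis=1)

  # Its sample quantiles should be close to those of the gamma distribution.
  for q in (0.25, 0.5, 0.75):
      print(np.quantile(waits, q), stats.gamma.ppf(q, a=k, scale=1.0 / lam))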

Full text...

How Long Must I Wait?

Whilst we're back on the subject of probability distributions I should like to cover the one that first drew my interest to probability and statistics. Back in my undergraduate days I was chiefly interested in pure mathematics, in deriving mathematical truths from first principles, and my impression of statistics was, rather unfairly in retrospect, that it involved the rote learning of formulae and how to use them rather than proving why they should be used in the first place.
Then I chanced upon a description of a waiting time problem and learned the error of my ways.

Full text...
