Quick and Easy Gaussian Random Numbers

Generating random numbers which fit a normal distribution is essential for stochastic optimization, especially for continuous evolutionary algorithms.

For high-quality results the weapon of choice is the Box–Muller transform. It’s a little expensive; it involves logarithms, square roots, and trigonometry. Recently I was working on an EA written in JavaScript, and I wanted to avoid using those functions. There’s actually a damn simple alternate method.
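For reference, here is a minimal sketch of the standard Box–Muller recipe, just to show the transcendental calls involved (the function name boxMullerRandom is illustrative, not from the original post):

    // Standard Box–Muller: two uniform deviates in, one exactly Gaussian
    // deviate out, at the cost of a log, a square root, and a cosine.
    function boxMullerRandom() {
      var u1 = 1 - Math.random(); // shift to (0, 1] so Math.log never sees 0
      var u2 = Math.random();
      return Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
    }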

Box–Muller gives a true Gaussian distribution, which (by the central limit theorem) is what you would get from accumulating and rescaling an unlimited number of uniformly distributed random values. It turns out you don’t need to accumulate very many to get a half-decent approximation. This is the function I’m using:
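Something along these lines, summing four uniform samples and rescaling (the four-sample count matches the 3.46-sigma bound and the figure below; the name gaussRandom is illustrative):

    // Rough Gaussian deviate: sum four uniform samples, then shift and scale
    // to mean 0 and standard deviation 1. The sum of four uniforms has mean 2
    // and variance 4/12 = 1/3, so subtract 2 and multiply by sqrt(3) ≈ 1.732.
    function gaussRandom() {
      var sum = Math.random() + Math.random() + Math.random() + Math.random();
      return (sum - 2) * Math.sqrt(3);
    }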

A very important thing to note about this approximation is that the returned value can never be more than 3.46 standard deviations from the mean: the sum of four uniform samples can be at most 2 away from its mean, and 2 divided by its standard deviation of sqrt(1/3) is 2·sqrt(3) ≈ 3.46. Extreme values that should merely be very rare are in fact totally absent. This limitation is actually handy for my purposes.
The distribution of this thing’s output looks like this:

[Figure: four-sample gauss, distribution of the function’s output]

The quality of the approximation and the maximum possible deviation can both be controlled via the number of uniform random samples. The sum of n uniform random values follows what’s called the Irwin–Hall distribution: its probability density resembles a Gaussian with mean n/2 and standard deviation sqrt(n/12), so after normalizing, the output is confined to within sqrt(3n) standard deviations of the mean. To illustrate the math, the following method uses a given number of samples:
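A sketch of such a method (the name gaussApprox and its signature are illustrative):

    // Generalized version: sum n uniform samples. The sum follows the
    // Irwin–Hall distribution with mean n/2 and variance n/12, so after
    // normalizing, the output has mean 0, standard deviation 1, and is
    // bounded by ±(n/2) / sqrt(n/12) = ±sqrt(3n) standard deviations.
    function gaussApprox(n) {
      var sum = 0;
      for (var i = 0; i < n; i++) {
        sum += Math.random();
      }
      return (sum - n / 2) / Math.sqrt(n / 12);
    }

gaussApprox(4) matches the four-sample function above; gaussApprox(12) is the classic choice, since sqrt(12/12) is exactly 1 (no scaling needed) and the output reaches out to ±6 standard deviations.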
