Generating random numbers that fit a normal distribution is essential for stochastic optimization, especially for continuous evolutionary algorithms.
For high-quality results the weapon of choice is the Box–Muller transform. It’s a little expensive; it involves logarithms and trigonometry and such. Recently I was working on an EA written in JavaScript, and I wanted to avoid using those functions. There’s actually a damn simple alternate method.
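For reference, a basic Box–Muller implementation looks something like this (my own sketch, not code from the original post) — note the `Math.log`, `Math.sqrt`, and `Math.cos` calls that I wanted to avoid:

```javascript
// Basic Box-Muller: turns two uniform samples into one Gaussian sample.
var boxMuller = function(mean, stddev) {
    var u1 = 1 - Math.random();  // shift to (0, 1] so log(0) can't happen
    var u2 = Math.random();
    var z = Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
    return mean + z * stddev;
};
```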
Box–Muller gives a true Gaussian distribution, which is what you would get from accumulating an unlimited number of uniformly-distributed random values. Turns out you don’t need to accumulate very many to get a half-decent distribution. This is the function I’m using:
```javascript
var rand = Math.random;

var randGauss = function(mean, stddev) {
    var x = (rand() + rand() + rand() + rand()) * 1.73 - 3.46;
    return mean + x * stddev;
};
```
A very important thing to note about this approximation is that the returned value will never be more than 3.46 standard deviations from the mean. Extreme values which should be very rare are in fact totally absent. This limitation is actually handy, for my purposes.
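You can check both claims empirically — the hard bound and the quality of the fit — by drawing a large sample and looking at its moments (this verification loop is my own sketch, not part of the original code):

```javascript
var rand = Math.random;
var randGauss = function(mean, stddev) {
    var x = (rand() + rand() + rand() + rand()) * 1.73 - 3.46;
    return mean + x * stddev;
};

// Draw a large standard-normal sample and track the first two
// moments plus the largest absolute deviation seen.
var n = 100000, sum = 0, sumSq = 0, maxDev = 0, v;
for (var i = 0; i < n; i++) {
    v = randGauss(0, 1);
    sum += v;
    sumSq += v * v;
    maxDev = Math.max(maxDev, Math.abs(v));
}
var mean = sum / n;
var stddev = Math.sqrt(sumSq / n - mean * mean);
// mean comes out near 0, stddev near 1, and maxDev never exceeds 3.46.
console.log(mean.toFixed(3), stddev.toFixed(3), maxDev.toFixed(3));
```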
The distribution of this thing’s output looks like this:
The quality of the approximation and the maximum possible deviation can be controlled via the number of uniform random samples. Accumulated uniform random values fit what’s called the Irwin–Hall distribution: the sum of n uniform samples on [0, 1] has mean n/2 and standard deviation sqrt(n/12), and its probability density resembles a Gaussian as n grows. Normalizing by those two values gives a generalized method which takes the number of samples as a parameter:
```javascript
var randGaussCustom = function(mean, stddev, n) {
    var i, a, b, x = 0;
    // Sum n uniform samples; the sum follows the Irwin-Hall distribution.
    for (i = 0; i < n; i++) {
        x += rand();
    }
    a = 2 / Math.sqrt(n / 3); // 1 / sqrt(n / 12), reciprocal of the sum's stddev
    b = n / Math.sqrt(n / 3); // (n / 2) * a, shifts the mean to zero
    x = x * a - b;            // x is now approximately standard normal
    return mean + x * stddev;
};
```
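As a quick sanity check of my own: with n = 4 these formulas reproduce the rounded constants hard-coded in randGauss, and the generalized version's output can never stray more than sqrt(3n) standard deviations from the mean.

```javascript
// a = 2 / sqrt(n / 3) simplifies to sqrt(12 / n), the reciprocal of the
// Irwin-Hall standard deviation sqrt(n / 12); b = (n / 2) * a recenters it.
var a = 2 / Math.sqrt(4 / 3);
var b = 4 / Math.sqrt(4 / 3);
console.log(a.toFixed(2), b.toFixed(2));  // 1.73 3.46

// The normalized sum ranges over [-b, n * a - b] = [-sqrt(3n), +sqrt(3n)]:
// with n = 12 samples, the output is capped at 6 standard deviations.
console.log(Math.sqrt(3 * 12));  // 6
```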