On 23 Nov 1999, Hans-Bernhard Broeker wrote:
> > The
> > theory behind it is the well-known theorem which states that the sum
> > of a large number of independent random numbers approaches the normal
> > distribution.
>
> Nice theory, but its application has a significant drawback: 6 is not
> 'large'. Not by any statistical definition I've seen. In the actual
> mathematical theorem, it is only for 'infinitely' many terms that the
> true Gaussian is guaranteed to come out of the process.
As I said, this is good enough for the IMSL library, so it should be
good enough for most of us. I have actually used this function in many
simulations, and as far as the simulation results go, they are usually
indistinguishable from those produced by more complicated methods.
Of course, if you are doing a PhD thesis on random number generation,
don't dare use it ;-)
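Just to show what we're talking about, here is roughly what the
sum-of-6 version looks like in C. This is only my sketch, not the IMSL
code, and rand() is just a stand-in for whatever uniform generator you
actually use:

#include <stdlib.h>
#include <math.h>

/* Approximate N(0,1) deviate from the sum of 6 uniform deviates.
   The sum of 6 U(0,1) variates has mean 3.0 and variance 6/12 = 0.5,
   so (sum - 3.0) / sqrt(0.5) is roughly standard normal. */
double gauss_sum6(void)
{
    double sum = 0.0;
    int i;

    for (i = 0; i < 6; i++)
        sum += rand() / (RAND_MAX + 1.0);
    return (sum - 3.0) / sqrt(0.5);
}

Note that the output can never exceed about +/- 4.24 sigma, which is
exactly the tail problem raised above; in most simulations that region
simply never matters.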
> With a properly implemented inverse error function, or the 2D trick of
> transforming via polar coordinates posted here by someone else, the
> accuracy will be quite a lot better than that, usually. The only real
> justification for the sum-of-6 version would be speed, then. But that
> would only hold if the random number generator itself is significantly
> faster than, say, a sin() or log() evaluation. For good RNGs, that
> won't usually hold.
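(For reference, the 2D polar trick usually appears in something like
the following form; this is the rejection variant, which trades the
sin()/cos() calls of the textbook Box-Muller transform for an
acceptance loop. Again, rand() is only a placeholder for a real
uniform RNG.)

#include <stdlib.h>
#include <math.h>

/* One N(0,1) deviate via the polar transform.  A uniform point in the
   unit disk is mapped to two independent normals; the second one is
   cached and returned by the next call. */
double gauss_polar(void)
{
    static int have_spare = 0;
    static double spare;
    double u, v, s, factor;

    if (have_spare) {
        have_spare = 0;
        return spare;
    }
    do {
        u = 2.0 * rand() / (RAND_MAX + 1.0) - 1.0;
        v = 2.0 * rand() / (RAND_MAX + 1.0) - 1.0;
        s = u * u + v * v;
    } while (s >= 1.0 || s == 0.0);
    factor = sqrt(-2.0 * log(s) / s);
    spare = v * factor;
    have_spare = 1;
    return u * factor;
}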
Actually, modern uniform RNGs are lightning-fast: they involve only a
handful of arithmetic instructions and bit shifts. See the RNGs
posted by Marsaglia a few months ago (Dejanews will find them).
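To give an idea of the flavour (this is not Marsaglia's exact code,
and the shift constants here are just one commonly used triple), a
shift-register generator of the kind he combines into his KISS family
looks like this:

/* xorshift-style 32-bit generator: three shifts and three XORs per
   call, no multiplies or divides.  Assumes unsigned long is 32 bits
   (true for DJGPP); the seed must be nonzero. */
static unsigned long jsr = 123456789UL;

unsigned long shr3_like(void)
{
    jsr ^= jsr << 13;
    jsr ^= jsr >> 17;
    jsr ^= jsr << 5;
    return jsr;
}

Multiply the result by 1/2^32 (about 2.328306437e-10) to get a uniform
deviate in [0,1). The generators he posted add two or three such
components together, but even then the whole thing costs only a dozen
or so instructions, far cheaper than a log() or sin().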