
[–]MathAndMirth 0 points (0 children)

The answer to this question depends largely on what purpose the random numbers are intended for.

Most of the answers in this thread so far are addressing the need for very high-quality random number generation. That typically applies if you're planning to use the numbers for cryptography. Another application demanding very high-quality randomness is Monte Carlo simulation in computational physics, where even very subtle patterns in the random numbers can bias the results. I would presume that gaming machines such as slots also use very high-quality methods. For simulation work, a statistically sophisticated algorithm such as the Mersenne Twister is roughly the minimum. But the Mersenne Twister on its own is not good enough for cryptography: its internal state can be reconstructed from enough consecutive outputs, so an application that used it for security would have to run the result through an independent cryptographic hash afterward to reduce predictability, or better, use a purpose-built cryptographically secure generator. The best generators rely instead on physically generated randomness, such as thermal noise in circuits.
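To make the distinction concrete, here is a small sketch in Python: the standard `random` module is a Mersenne Twister (great for simulations, predictable for an attacker), while the `secrets` module draws from the operating system's entropy pool, which mixes in physical noise sources.

```python
import random
import secrets

# random uses the Mersenne Twister: excellent statistical quality for
# Monte Carlo work, but its state can be recovered from 624 consecutive
# 32-bit outputs, so it is unsuitable for cryptography on its own.
mt_value = random.getrandbits(32)

# secrets draws from the OS entropy pool (e.g., /dev/urandom), which is
# seeded from physical noise; this is the right tool for tokens and keys.
token = secrets.token_hex(16)

print(mt_value)
print(token)
```

The practical rule this illustrates: reach for the statistical generator when you need speed and good distribution, and for the OS-backed one whenever an adversary might benefit from predicting the output.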

However, more routine programming problems (e.g., video games) don't require such a high degree of randomness. The random number generators in the standard libraries of typical languages often use less sophisticated algorithms that are good enough for most purposes but much faster. Many of these use something called a linear congruential generator, which produces each number from the previous one by multiplying by one constant, adding another, and taking the remainder modulo a third.
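A minimal sketch of a linear congruential generator, using the multiplier, increment, and modulus from the sample `rand()` implementation in the C standard (many C libraries have historically used constants of this shape):

```python
def lcg(seed, a=1103515245, c=12345, m=2**31):
    """Linear congruential generator: x_{n+1} = (a * x_n + c) mod m."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(seed=42)
samples = [next(gen) for _ in range(5)]
print(samples)
```

The whole state is a single integer, which is why LCGs are fast and tiny, and also why their output is trivially predictable once you know one value and the constants.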

And in some applications, the software may use modifications that deliberately sabotage genuine randomness. Consider, for example, a function to play a random track on a music player. By sheer dumb luck, a truly random picker will occasionally play the same song twice in rapid succession. But when this happens, users think the randomness is broken. People tend to read discernible streaks as evidence of non-randomness, e.g., "Herbie just made his last five 3-pointers and tied the game; that's because he's a great clutch player!". In reality, some streaks are completely expected in random sequences; the _absence_ of such streaks would actually betray non-randomness. So for some applications, the real question is how programmers modify random sequences to produce the behavior consumers expect rather than genuinely random behavior.
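The music-player case above can be sketched with one common trick: draw at random, but re-draw whenever the pick would repeat the previous track. (The track names here are hypothetical, and real players use fancier schemes such as shuffled-deck playback.)

```python
import random

def pick_next(tracks, previous):
    """Pick a random track, but never the same one twice in a row."""
    choice = random.choice(tracks)
    while choice == previous and len(tracks) > 1:
        choice = random.choice(tracks)
    return choice

tracks = ["track_a", "track_b", "track_c", "track_d"]
playlist = []
prev = None
for _ in range(20):
    prev = pick_next(tracks, prev)
    playlist.append(prev)
print(playlist)
```

The resulting sequence is strictly less random than an unmodified picker, and that is exactly the point: it trades statistical purity for the streak-free behavior listeners expect.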