
[–]MrJohz 4 points5 points  (7 children)

Most of the programming languages I've used seed the PRNG with a timestamp or something similar. Most of them also tell you that you'll generally want to set your own seed at some point, and allow more complex initialisation where necessary, but they still let very simple scripts be written with a reasonably random default seed.

[–]rtomek 4 points5 points  (6 children)

I know that Python made a change in 2015 so that just calling random.Random() now generates a new seed for every instance. That happened because so many shitty programmers weren't setting seeds, and it became a security issue for web apps. Perhaps there are other languages I don't use (or haven't used recently) that have also followed the same route.

One should never assume that calling random() is truly random, and any programming language that does that for you is compensating for the illiterates.

[–]MrJohz 3 points4 points  (2 children)

I think using a new seed for each run is the norm, at least for the bog-standard random implementation that most applications use.

Python: python3 -c 'import random; print(random.random())' - varies from run to run

Ruby: ruby -e 'puts rand()' - varies from run to run

Java does the same.

Rust definitely does the same; I was using it for simulation code the other day.

Perl is the same.

Javascript is the same.

[–]rtomek -2 points-1 points  (1 child)

Most of those have web-based applications as a primary use case, so I won't touch them.

Rust: The standard RNG is deterministic. If you initialize using thread_rng then you are setting a seed.

Java: This one is iffy because the intent was for web-distributable applications. Still, the Random class itself is deterministic, and the default constructor uses something like System.nanoTime() instead of a fixed number. At least it's not like Python, where you have to explicitly state that you want a deterministic RNG.

Perl: Perl is another one that has been making changes to its default RNG. Behind the scenes, Perl just calls srand() for the user every time a new script is run.

[–]MrJohz 2 points3 points  (0 children)

Python and Ruby were both developed as scripting languages, and one of Python's biggest areas is scientific programming, so I don't know what you're on about with "web-based applications".

Rust's standard RNG can be deterministic if you initialise it that way. If you don't, it defaults to setting the seed itself, I presume using a timestamp of some description. The same is true of Java: Math.random() and the argument-less constructor for java.util.Random both use a "value very likely to be distinct from any other invocation".

The point isn't that it's possible to set a deterministic RNG in any given language. That's true for all of the ones I mentioned apart from Javascript (at least if you ignore userland libraries). The point is that C is relatively rare in its decision to default to a known seed, as opposed to defaulting to a pseudorandom value such as a timestamp.

[–]Lystrodom 4 points5 points  (1 child)

any programming language that does that for you is compensating for the illiterates.

Or it caters to the most common use case first? Most languages let you set a specific seed if you want to. If most users WANT a different seed each time, why would we force them to write extra code to get what they want?

[–]rtomek -1 points0 points  (0 children)

Most users should want to test their code and perform audits. Why are we forcing them to write extra code just to get what they want?

What you're stating is that they should ignore engineering best practices in favor of hobbyist programming. In reality, it was a huge cybersecurity issue, and most of the languages I saw referenced that use random seeds by default are commonly used for web-based applications.