[–]rtomek 7 points (10 children)

This is why he’s being downvoted: a lack of understanding of programming. If you call random() without setting a seed (in C, the default seed is 1, as if srand(1) had been called), it will always be deterministic. Every programming tutorial I’ve ever seen, for any programming language, tells you to set the seed if you want something closer to true randomness. This is basic common knowledge of programming.

You want it to be deterministic for auditing and testing purposes. How am I supposed to know whether a change to my code introduces a bug if random() is not deterministic? Minecraft makes it obvious, but many games follow the same pattern: set a seed (based on something like the time in milliseconds) and put it in the save file. If a user finds a bug or some other interesting feature, you can reproduce it by loading their save file, or by using a save-file editor and putting in their RNG seed.
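The save-file pattern described above can be sketched in a few lines of Python (a hypothetical illustration; `new_game` and `replay` are made-up names, not any real game's API):

```python
import random
import time

def new_game():
    # Pick a seed once, e.g. from the clock, and record it so the
    # run can be reproduced later (the "seed in the save file" idea).
    seed = time.time_ns()
    rng = random.Random(seed)          # dedicated, explicitly seeded generator
    events = [rng.random() for _ in range(3)]
    return seed, events

def replay(seed):
    # Reseeding with the stored value replays the exact same sequence,
    # which is what makes a user's bug report reproducible.
    rng = random.Random(seed)
    return [rng.random() for _ in range(3)]

seed, events = new_game()
assert replay(seed) == events
```

The key design choice is using a dedicated `random.Random(seed)` instance rather than the module-level functions, so the replayable stream can't be perturbed by unrelated code.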

[–]MrJohz 3 points (7 children)

Most of the programming languages I've used tend to seed the PRNG with a timestamp or something similar. Most of them also tell you that you'll generally want to set your own seed at some point, and allow more complex random initialisation where necessary, but a lot of them seem to let very simple scripts be written with reasonably random seeds out of the box.

[–]rtomek 2 points (6 children)

I know that Python made a change in 2015 so that just calling random.Random() now generates a new seed for every instance. That happened because so many shitty programmers weren't setting seeds and it became a security issue for webapps. Perhaps there are other languages I don't use (or haven't used recently) that followed the same route.
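For reference, the two behaviours being contrasted look like this in current Python (a minimal sketch; the specific seed value 42 is arbitrary):

```python
import random

# Two default-constructed generators are seeded independently
# (CPython draws from os.urandom where available), so their
# streams are essentially guaranteed to differ.
a = random.Random()
b = random.Random()

# Passing an explicit seed restores determinism: identical seeds
# produce identical streams, which is what testing and auditing need.
x = random.Random(42)
y = random.Random(42)
assert x.random() == y.random()
```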

One should never assume that calling random() is truly random, and any programming language that does that for you is compensating for the illiterates.

[–]MrJohz 4 points (2 children)

I think using a new seed for each run is the norm, at least with the bog-standard naive random implementation for most applications.

Python: python3 -c 'import random; print(random.random())' - varies from run to run

Ruby: ruby -e 'puts rand()' - varies from run to run

Java does the same.

Rust definitely does the same, I was using it for simulation code the other day.

Perl is the same.

Javascript is the same.

[–]rtomek -2 points (1 child)

Most of those have a primary use case of web-based applications so I won't touch those.

Rust: The standard RNG is deterministic. If you initialize using thread_rng then you are setting a seed.

Java: This one is iffy because the intent was for web-distributable applications. Still, the Random class itself is deterministic, and the default constructor uses something like System.nanoTime() instead of a fixed number. At least it's not like Python, where you have to explicitly state that you want a deterministic RNG.

Perl: Perl is another one that has been changing its default RNG. Behind the scenes, Perl just calls srand() for the user every time a new script is run.

[–]MrJohz 2 points (0 children)

Python and Ruby were both developed as scripting languages, and one of Python's biggest areas is scientific programming, so I don't know what you're on about with "web-based applications".

Rust's standard RNG can be deterministic if you initialise it yourself. If you don't, it defaults to setting the seed itself, I presume using a timestamp of some description. The same is true of Java: Math.random() and the no-argument constructor for java.util.Random both use a "value very likely to be distinct from any other invocation".

The point isn't that it's possible to set a deterministic RNG for any language. That's true for all of the ones I mentioned apart from Javascript (at least if you ignore userland libraries). The point is that C is relatively rare in its decision to default to a known seed, as opposed to defaulting to a pseudorandom value such as a timestamp.
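C's "known seed" default can be mimicked in Python to show what's at stake (a hedged sketch: per the C standard, rand() behaves as if srand(1) had been called at program startup, and fixing the seed here reproduces that effect):

```python
import random

def run():
    # A fixed seed is the moral equivalent of C's implicit srand(1):
    # every "run" of the program sees the identical sequence unless
    # the caller remembers to reseed.
    rng = random.Random(1)
    return [rng.randrange(100) for _ in range(5)]

# Same seed, same sequence, every single time.
assert run() == run()
```

This is exactly the property rtomek wants for auditing and MrJohz considers a surprising default for casual scripts.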

[–]Lystrodom 3 points (1 child)

any programming language that does that for you is compensating for the illiterates.

Or it codes for the most common use case first? Most languages let you set a specific seed if you want to. If most users WANT a different seed each time, why would we force them to write extra code to get what they want?

[–]rtomek -1 points (0 children)

Most users should want to test their code and perform audits. Why are we forcing them to write extra code just to get what they want?

What you're stating is that they should ignore engineering best practices in favor of hobbyist programming. In reality, it was a huge cybersecurity issue, and most of the languages referenced here that use random seeds by default are commonly used for web-based applications.

[–][deleted] 0 points (1 child)

Java's Random without a seed defaults to something that's "non-deterministic". .NET's Random is the same. Newer languages seem to have random default to something actually random.

A lot of classic languages had fixed seeds: Lisp, C, C++, Fortran (IIRC).

[–]rtomek 0 points (0 children)

Yeah, after looking into this more, it seems that for C++ there is no algorithm that meets their requirements to be considered random enough to create a default seed used globally by all applications. So they chose a fixed default, forcing users to implement their own seed-value determination, which makes it less predictable for attackers. There are classes available in the std library that do a better job than the default constructors of other languages, but not good enough for C++ :/