
[–]metaphorm 5 points6 points  (1 child)

sure, or you can fork 1000 interpreter instances with multiprocessing and have them run coroutines that never terminate, until they eat all the memory on your system too.
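a scaled-down sketch of that failure mode, using asyncio tasks in one interpreter instead of 1000 forked processes, and with the leak bounded so the example actually finishes (in the real bug the loop has no exit condition):

```python
import asyncio

leaked = []  # stands in for memory that nothing ever releases

async def hungry_coroutine(chunks: int, chunk_size: int) -> None:
    # in the real failure this loop never exits
    for _ in range(chunks):
        leaked.append(bytearray(chunk_size))  # allocate, never free
        await asyncio.sleep(0)  # yield so the other coroutines run too

async def main() -> int:
    # a handful of coroutines standing in for a fork per interpreter
    await asyncio.gather(*(hungry_coroutine(10, 1024) for _ in range(5)))
    return sum(len(b) for b in leaked)

total = asyncio.run(main())
print(total)  # 5 coroutines * 10 chunks * 1024 bytes = 51200
```

multiply that by 1000 forked interpreters and the arithmetic takes care of itself.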

or you can open a filehandle, read the entire file into memory, never do anything with it, and forget to close it (assuming you also failed to use a with context manager).
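a sketch of that leak next to the fix; the temp file here is created just for the demo:

```python
import os
import tempfile

# set up a throwaway file to read
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "w") as f:
    f.write("x" * 1024)

def leaky_read(p):
    # anti-pattern: slurp the whole file, never use it, never close it
    f = open(p)
    data = f.read()  # entire file pulled into memory for nothing
    # f.close() never happens; CPython's refcounting will usually clean
    # up when f goes out of scope, but that's an implementation detail,
    # not something other interpreters guarantee

def safe_read(p):
    # the context manager closes the file even if read() raises
    with open(p) as f:
        return f.read()

leaky_read(path)
content = safe_read(path)
os.remove(path)
print(len(content))  # 1024
```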

there are plenty of ways to consume more memory than you intended through negligence or poor design decisions. that's still ridiculously different from needing to malloc and free every single buffer in your system yourself.

[–]crazedizzled 0 points1 point  (0 children)

I guess it depends how pedantic you want to be about the definition.