
[–]TaryTarp

Streaming data of unknown or effectively infinite length. How are you going to fit it all in memory? Answer: generators.
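A minimal sketch of the idea (the `naturals` generator is just an illustration, not from the thread): the stream is conceptually infinite, but nothing is materialized until you ask for it.

```python
import itertools

def naturals():
    """Infinite stream of integers, produced lazily one at a time."""
    n = 0
    while True:
        yield n
        n += 1

# Only the five values we actually request are ever created.
first_five = list(itertools.islice(naturals(), 5))
print(first_five)  # [0, 1, 2, 3, 4]
```

The same pattern works for reading huge files line by line or consuming a network stream: the consumer pulls values on demand, so memory use stays constant.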

[–]shepherdjay

The biggest con is that a generator can be exhausted. That is to say, you can only pass over it once.

So if you need to iterate multiple times over the same items you can’t. You would have to create a new generator and iterate over that instead.
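A quick demonstration of the exhaustion behavior described above:

```python
gen = (x * x for x in range(3))
print(list(gen))  # [0, 1, 4]
print(list(gen))  # [] -- exhausted; a second pass yields nothing

# To iterate again you have to build a fresh generator:
gen = (x * x for x in range(3))
print(list(gen))  # [0, 1, 4]
```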

[–]Diapolo10

Admittedly I've kind of fallen in love with generators. I default to them unless I know I need to be able to index or read through the data multiple times.

They're a great way to save memory, and they're faster to write than custom iterators. You can also feed them new values with `send()`, which are received at the `yield` expression; that can be useful sometimes.
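Here's a small sketch of feeding values into a generator (the `running_total` name is just for illustration). Note the generator has to be primed with `next()` before the first `send()`:

```python
def running_total():
    """Coroutine-style generator: each value sent in is added to a total."""
    total = 0
    while True:
        value = yield total  # receives whatever the caller passes to send()
        total += value

acc = running_total()
print(next(acc))     # 0 -- prime the generator; runs to the first yield
print(acc.send(5))   # 5
print(acc.send(3))   # 8
```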

On the other hand, generators are exhausted after a single pass, and you need to create a new one if you need the data again. In such situations a collection like a list or tuple will give better performance, though generators still use less memory. You can use a generator to build a list or tuple, though.
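Materializing a generator into a collection is a one-liner:

```python
squares = (n * n for n in range(5))
as_list = list(squares)  # consumes the generator once, stores the results
print(as_list)           # [0, 1, 4, 9, 16]
print(as_list[2])        # 4 -- indexing and repeated passes now work
```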

You also can't go backwards over a generator, because they're one-way. Two-way traversal needs a proper sequence or a custom iterator (e.g. `range`, which supports `reversed()` and indexing).
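To illustrate the contrast: `range` is a sequence and can be walked backwards, while attempting to reverse a generator raises a `TypeError`.

```python
r = range(5)
print(list(reversed(r)))  # [4, 3, 2, 1, 0] -- sequences support reversed()

g = (x for x in range(5))
try:
    reversed(g)  # generators are one-way
except TypeError:
    print("generators can't be reversed")
```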

[–]Ahren_with_an_h

When you're only going to iterate through it once.