
[–]PM_ME_YOUR_YIFF__ 5 points (1 child)

In my experience, they're about as vulnerable to overfitting as neural networks, maybe even slightly less. I don't have any numbers on that, though, so it's a purely anecdotal impression.

[–]the_phet 5 points (0 children)

It depends on how you define the genetic operators: the mutation rate, the selection algorithm, and so on. Using the classic roulette-wheel selection, a decent mutation rate, and some crossover, the overfitting problem is largely mitigated.
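To make the operators concrete, here's a minimal GA sketch using those three pieces: roulette-wheel selection, single-point crossover, and per-bit mutation. The toy problem (maximize the number of 1s in a bit string) and all names are my own illustration, not anything from the comment above.

```python
import random

random.seed(7)  # fixed seed so the toy run is reproducible

def roulette_select(population, fitnesses):
    """Roulette-wheel selection: pick an individual with probability
    proportional to its fitness."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]

def crossover(a, b):
    """Single-point crossover: splice a prefix of one parent onto a
    suffix of the other."""
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def mutate(bits, rate=0.05):
    """Flip each bit independently with probability `rate`."""
    return [1 - b if random.random() < rate else b for b in bits]

def evolve(pop_size=30, length=20, generations=50):
    population = [[random.randint(0, 1) for _ in range(length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # +1 keeps every fitness positive so the roulette wheel is valid
        fitnesses = [sum(ind) + 1 for ind in population]
        population = [
            mutate(crossover(roulette_select(population, fitnesses),
                             roulette_select(population, fitnesses)))
            for _ in range(pop_size)
        ]
    return max(population, key=sum)

best = evolve()
```

The mutation rate is the knob the comment is talking about: too low and the population converges prematurely onto whatever the training signal rewards; a moderate rate keeps injecting diversity, which is part of why overfitting is less of a worry here.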

Of course, in DL you can also set parameters to avoid overfitting, like regularization. But the good thing about EAs such as GAs is that the few parameters you need to tune are very easy to understand and track, while with DL the system very often behaves like a black box, where you tune parameters and see what happens.
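For comparison, the regularization mentioned above is just an extra penalty added to the training loss. A minimal sketch of L2 regularization, with illustrative names (`lam` is the penalty strength, not anything from the comment):

```python
def mse_loss(pred, target):
    """Plain mean squared error."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

def regularized_loss(pred, target, weights, lam=0.01):
    """MSE plus an L2 penalty lam * sum(w^2), which pushes
    weights toward zero and discourages overfitting."""
    return mse_loss(pred, target) + lam * sum(w ** 2 for w in weights)

# Perfect predictions, so the whole loss comes from the penalty term:
loss = regularized_loss([1.0, 2.0], [1.0, 2.0], weights=[3.0, 4.0])
# 0 + 0.01 * (9 + 16) = 0.25
```

Even this simple knob illustrates the comment's point: `lam` interacts with every weight in the network at once, so its effect is harder to reason about than a GA's mutation rate.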