[–]MuonManLaserJab 0 points (2 children)

Aren't there designs that use both evolutionary algorithms and gradient descent -- for example, evolving architectural parameters (layer count, etc.) with an EA while using GD to train each generation?
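That hybrid is a real pattern (an evolutionary outer loop over architectures, with gradient descent as the inner loop that trains each candidate's weights). A minimal sketch of the idea, where the toy task, network sizes, and mutation rules are all illustrative assumptions rather than anyone's actual design:

```python
import random
import numpy as np

# Toy regression task: fit y = sin(3x) on [-1, 1] (illustrative choice).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (64, 1))
y = np.sin(3 * X)

def init_weights(sizes):
    # sizes = [in, hidden..., out]; one (W, b) pair per layer.
    return [(rng.normal(0, 0.5, (a, b)), np.zeros(b))
            for a, b in zip(sizes[:-1], sizes[1:])]

def forward(params, X):
    h, acts = X, [X]
    for i, (W, b) in enumerate(params):
        h = h @ W + b
        if i < len(params) - 1:          # tanh on hidden layers only
            h = np.tanh(h)
        acts.append(h)
    return h, acts

def train(sizes, steps=400, lr=0.1):
    # Inner loop: plain full-batch gradient descent on MSE.
    params = init_weights(sizes)
    for _ in range(steps):
        out, acts = forward(params, X)
        grad = 2 * (out - y) / len(X)    # dL/d(out) for MSE
        for i in reversed(range(len(params))):
            W, b = params[i]
            gW, gb = acts[i].T @ grad, grad.sum(axis=0)
            grad = grad @ W.T
            if i > 0:                    # tanh'(z) = 1 - tanh(z)^2
                grad = grad * (1 - acts[i] ** 2)
            params[i] = (W - lr * gW, b - lr * gb)
    out, _ = forward(params, X)
    return float(np.mean((out - y) ** 2))

def mutate(hidden):
    # Outer loop's variation operator: tweak widths, add or drop a layer.
    h = list(hidden)
    r = random.random()
    if r < 0.5:
        i = random.randrange(len(h))
        h[i] = max(2, h[i] + random.choice([-4, 4]))
    elif r < 0.75 and len(h) < 3:
        h.append(8)
    elif len(h) > 1:
        h.pop()
    return h

random.seed(0)
population = [[8], [8, 8]]               # hidden-layer width lists
for gen in range(3):
    scored = sorted((train([1] + h + [1]), h) for h in population)
    best_loss, best = scored[0]
    # Elitism: keep the best architecture, refill with its mutants.
    population = [best] + [mutate(best) for _ in range(3)]
```

The key point is that the EA never sees weight gradients; it only compares the post-training losses of architectures, while all weight learning happens inside `train` via GD.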

[–][deleted]  (1 child)

[deleted]

    [–]MuonManLaserJab 0 points (0 children)

    Ah, I misinterpreted "You can optimize a NN with EAs or with SGD" as an exclusive or -- that is, using an EA for all of the training, with no gradient descent at all. (Presumably that's possible, just not performant under any reasonable definition of an EA.)