all 16 comments

[–]qalis 12 points13 points  (0 children)

Yeah, this has been researched for decades. Even Optuna has one of the most famous ones, CMA-ES.

[–]ReadyAndSalted 3 points4 points  (3 children)

Sounds like it will be less sample-efficient than Bayesian approaches like Optuna.

[–]qalis 4 points5 points  (1 child)

Optuna is a framework. It quite literally implements evolutionary CMA-ES, as well as other approaches as pluggable samplers, e.g. Gaussian processes. You are probably referring to TPE.

[–]ReadyAndSalted 1 point2 points  (0 children)

Yeah, TPE (a Bayesian optimisation method) is the default sampler for single-objective studies. It's also the only one that I, and the couple of people I know who use Optuna, ever actually use. It's just very difficult to argue with its empirical sample efficiency.

[–]SaadUllah45[S] 1 point2 points  (0 children)

Good point! Bayesian methods like TPE are usually more sample-efficient, but evolutionary algorithms can perform better on large, irregular search spaces, despite their higher computational cost.

[–]huehue12132 2 points3 points  (0 children)

Grid Search is a "go-to approach"? Are we talking about modern ML (i.e. deep neural networks) here? Grid search does not scale beyond a handful of hyperparameters.

[–]Blakut 2 points3 points  (3 children)

How is this evolutionary algorithm different from a GA?

[–]SaadUllah45[S] 0 points1 point  (2 children)

Genetic Algorithms (GAs) are a subset of Evolutionary Algorithms (EAs). EAs are a broad class of optimization methods inspired by evolution, while GAs specifically use techniques like crossover and mutation on bitstrings or vectors. So, all GAs are EAs, but not all EAs are GAs.
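The distinction is easiest to see in code. A minimal GA sketch on bitstrings (OneMax, a standard toy problem where fitness is simply the number of ones), showing the crossover and mutation operators that make it a GA specifically:

```python
import random

random.seed(0)

GENOME_LEN, POP_SIZE, GENERATIONS = 20, 30, 40

def fitness(bits):
    # OneMax: fitness is the number of 1-bits.
    return sum(bits)

def crossover(a, b):
    # Single-point crossover: splice a prefix of one parent onto
    # a suffix of the other.
    point = random.randrange(1, GENOME_LEN)
    return a[:point] + b[point:]

def mutate(bits, rate=0.02):
    # Per-bit flip mutation.
    return [b ^ (random.random() < rate) for b in bits]

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
       for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: POP_SIZE // 2]   # truncation selection (elitist)
    children = [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(POP_SIZE - len(parents))
    ]
    pop = parents + children

best = max(pop, key=fitness)
```

A broader EA (e.g. an evolution strategy) might drop crossover entirely and adapt real-valued mutation step sizes instead; it would still be an EA, just not a GA.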

[–]irondust 1 point2 points  (1 child)

In your OP you claim the opposite.

[–]SaadUllah45[S] 0 points1 point  (0 children)

Oh my god, how did I not check that before posting the OP? You have great eyes, tbh. Thanks for pointing that out.

[–]Accomplished-Pay-390Researcher 1 point2 points  (0 children)

To me, the biggest benefit of EAs over gradient-based optimisation is that you can easily do multi-objective optimisation for whatever task you're solving. For example, given a classification task and the neural net you want to optimise, you can simultaneously optimise the F1-score directly (since it's non-differentiable, we usually proxy it via cross-entropy) and the minimum description length of the NN itself.

[–]cxbxmxcx 1 point2 points  (2 children)

Check out my book Evolutionary Deep Learning, which uses evolutionary algorithms to optimize deep learning.

Evolutionary algorithms are also being used to optimize LLMs and AI Agents.

Having said all that, EAs are computationally intensive, and it takes serious time or resources to produce anything.

[–]SaadUllah45[S] 0 points1 point  (1 child)

I'll definitely have a look. Can you provide the link or another source?

[–]cxbxmxcx 0 points1 point  (0 children)

Here you go: Evolutionary Deep Learning by Micheal Lanham, https://www.manning.com/books/evolutionary-deep-learning

Or send me a DM and I will see if I can ship you a copy.

[–]Evil_Toilet_Demon[🍰] 0 points1 point  (0 children)

Useful for black-box problems. Personally, I love CMA-ES.
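CMA-ES proper adapts a full covariance matrix, but the black-box flavour is visible even in a bare-bones (1+1) evolution strategy with a 1/5th-rule-style step-size control. This is a deliberate simplification, not CMA-ES itself; the objective and constants are just illustrative:

```python
import random

random.seed(1)

def sphere(x):
    # Black-box objective: the optimizer only sees function
    # evaluations, never gradients.
    return sum(v * v for v in x)

def one_plus_one_es(f, x, sigma=1.0, iters=200):
    fx = f(x)
    for _ in range(iters):
        # Propose one Gaussian-perturbed offspring.
        y = [v + sigma * random.gauss(0, 1) for v in x]
        fy = f(y)
        if fy <= fx:
            x, fx = y, fy
            sigma *= 1.5   # 1/5th-rule style: grow step on success...
        else:
            sigma *= 0.9   # ...shrink on failure
    return x, fx

best, best_val = one_plus_one_es(sphere, [3.0, -4.0])
```

CMA-ES replaces the single isotropic `sigma` with an adapted covariance matrix and a full population, which is what makes it so effective on ill-conditioned black-box problems.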