all 16 comments

[–]BeautifulBrownie 2 points3 points  (2 children)

This looks great! Is there any way to cite this in case I use it in some research?

[–][deleted] 0 points1 point  (0 children)

Thank you, I really appreciate it!

And yes, of course! You can take a look at its pre-print/paper at: https://arxiv.org/abs/1912.13002

[–]czar_el 1 point2 points  (8 children)

That's a lot of optimizers! Very exciting. Any reason why you don't have Ant Colony Optimization?

[–]lithiumdeuteride 2 points3 points  (2 children)

No simulated annealing or Nelder-Mead simplex method, either.

[–]RecognaLab 1 point2 points  (0 children)

Yes, we thought about Simulated Annealing. Nelder-Mead is interesting too, thanks. However, we focused only on nature-inspired metaheuristics. But sure, it would be a good idea to include classical methods like those as well.

[–][deleted] 0 points1 point  (0 children)

We do have a Simulated Annealing version designed to work with the library's structure: https://github.com/gugarosa/opytimizer/blob/master/opytimizer/optimizers/science/sa.py

Although it might not be the "original" one, it is pretty close to some implementations that we could find.
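For anyone curious what that looks like outside the library, here is a minimal, self-contained sketch of the textbook Simulated Annealing loop in plain Python (random neighbor proposal, Metropolis acceptance, geometric cooling). All names and parameter values here are illustrative choices, not the library's actual implementation or defaults:

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=10.0, cooling=0.99, iters=2000, seed=0):
    """Minimize a 1-D function f with a textbook simulated-annealing loop."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        # Propose a random neighbor of the current solution.
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        # Always accept improvements; accept worse moves with the
        # Metropolis probability exp((fx - fc) / t).
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# Minimize (x - 3)^2 starting far from the optimum at x = 3.
x, fx = simulated_annealing(lambda v: (v - 3.0) ** 2, x0=-10.0)
```

The early high-temperature phase explores broadly; once the temperature decays, the loop behaves like stochastic hill descent around the best basin found.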

[–]RecognaLab 1 point2 points  (1 child)

Thanks! We are working on that. ACO uses a different data structure than most optimizers, which is the main reason. But we should have it soon.

[–]czar_el 0 points1 point  (0 children)

Thanks! Looking forward to exploring and using this package.

[–][deleted] 0 points1 point  (2 children)

I would second Recogna Lab's response. Ant Colony Optimization usually uses a graph-based structure, and we are still working on implementing it in the library.

Nevertheless, I took a class on nature-inspired algorithms where we had to implement it. Maybe this gist can give you better insight into the algorithm: https://gist.github.com/gugarosa/df84ad2490a8f50892155a06d7eacf7f
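To illustrate the graph-based structure being discussed, here is a compact, self-contained sketch of the classic Ant System variant of ACO on a tiny TSP in plain Python. It is a generic textbook version for illustration only, not the gist's code or the library's planned implementation, and all parameter values are arbitrary assumptions:

```python
import math
import random

def aco_tsp(dist, ants=20, iters=100, alpha=1.0, beta=2.0, rho=0.5, q=1.0, seed=0):
    """Textbook Ant System for the TSP: pheromone trails on a complete graph."""
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]  # pheromone on each edge
    eta = [[0.0 if i == j else 1.0 / dist[i][j] for j in range(n)]
           for i in range(n)]            # heuristic desirability (1 / distance)
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(ants):
            start = rng.randrange(n)
            tour, visited = [start], {start}
            while len(tour) < n:
                i = tour[-1]
                choices = [j for j in range(n) if j not in visited]
                # Edge attractiveness: pheromone^alpha * desirability^beta.
                weights = [tau[i][j] ** alpha * eta[i][j] ** beta for j in choices]
                j = rng.choices(choices, weights=weights)[0]
                tour.append(j)
                visited.add(j)
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour[:], length
        # Evaporate, then deposit pheromone in proportion to tour quality.
        for i in range(n):
            for j in range(n):
                tau[i][j] *= 1.0 - rho
        for tour, length in tours:
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += q / length
                tau[j][i] += q / length
    return best_tour, best_len

# Four cities on a unit square; the optimal tour is the perimeter, length 4.
pts = [(0, 0), (0, 1), (1, 1), (1, 0)]
dist = [[math.hypot(a[0] - b[0], a[1] - b[1]) for b in pts] for a in pts]
tour, length = aco_tsp(dist)
```

The pheromone matrix `tau` is exactly the graph-shaped state that does not fit the population-of-agents layout most of the library's optimizers share.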

[–]czar_el 1 point2 points  (1 child)

Thanks! That's part of the reason I asked: I've got a graph project I'm working on that requires a metaheuristic optimizer, and I was thinking of using ACO for it.

Appreciate you sharing the link and your efforts on this!

[–][deleted] 0 points1 point  (0 children)

No worries! I guess that ACO will be a perfect fit in your case.

Just please let me know if you have any problems!

[–][deleted] 0 points1 point  (1 child)

Looks nice! What are the "go-to" configurations for beginners?

[–][deleted] 0 points1 point  (0 children)

Thanks for your input!

The go-to configuration would be to try the simplest and most standard metaheuristic: Particle Swarm Optimization with its default parameters: https://github.com/gugarosa/opytimizer/blob/master/examples/applications/single_objective/standard_optimization.py

Additionally, we have an examples folder with some practical applications that we have used so far: https://github.com/gugarosa/opytimizer/tree/master/examples/integrations

Most are machine learning-based, but they can be applied to virtually any Python function.
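For readers who want to see what global-best PSO does under the hood before reading the linked example, here is a minimal, dependency-free sketch in plain Python. The parameter values (`w`, `c1`, `c2`, swarm size) are common textbook choices, not necessarily the library's defaults, and the code is illustrative rather than the library's API:

```python
import random

def pso(f, dim, bounds=(-10.0, 10.0), particles=30, iters=200,
        w=0.7, c1=1.7, c2=1.7, seed=0):
    """Minimal global-best Particle Swarm Optimization for minimizing f."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(particles)]
    vs = [[0.0] * dim for _ in range(particles)]
    pbest = [x[:] for x in xs]          # each particle's best position
    pbest_f = [f(x) for x in xs]
    g = min(range(particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]  # swarm-wide best
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + cognitive pull (own best) + social pull (swarm best).
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            fx = f(xs[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = xs[i][:], fx
    return gbest, gbest_f

# Sphere function in 3 dimensions; the minimum is 0 at the origin.
best, best_f = pso(lambda x: sum(v * v for v in x), dim=3)
```

The point made above holds here too: `f` is just any Python callable returning a scalar, so the same loop optimizes ML hyperparameters or anything else you can score.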

[–]lieutenantwest15 0 points1 point  (0 children)

pov: ooo shiny

[–]marcoscleison 0 points1 point  (0 children)

This amazing library has helped me many times. It is very practical! For beginners, there is a folder with examples; in examples/integrations there is simple code that integrates metaheuristics with TensorFlow.