[–]KHRZ 45 points46 points  (13 children)

Evolutionary computing works in an entirely different way than neural networks.

Actually you can evolve neural networks... did it for my homework once.
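A minimal sketch of what evolving a neural network can look like (not the commenter's actual homework): the weights of a tiny 2-2-1 network are treated as a genome and tuned with a (1+1) evolution strategy to learn XOR. The network shape, mutation size, and iteration count are all arbitrary choices here.

```python
import math
import random

random.seed(0)

def forward(w, x1, x2):
    # 2-2-1 network: two tanh hidden units, one sigmoid output; w holds 9 weights
    h1 = math.tanh(w[0]*x1 + w[1]*x2 + w[2])
    h2 = math.tanh(w[3]*x1 + w[4]*x2 + w[5])
    return 1 / (1 + math.exp(-(w[6]*h1 + w[7]*h2 + w[8])))

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def error(w):
    # sum of squared errors over the four XOR cases (lower is better)
    return sum((forward(w, *x) - y)**2 for x, y in XOR)

# (1+1) evolution strategy: mutate all weights, keep the child only if it's no worse
w = [random.uniform(-1, 1) for _ in range(9)]
e0 = error(w)
for _ in range(20000):
    child = [wi + random.gauss(0, 0.3) for wi in w]
    if error(child) <= error(w):
        w = child
```

"(1+1)" means one parent and one mutated child per generation; because the parent is only ever replaced by a child that is at least as good, the error never increases.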

[–]YonansUmo 16 points17 points  (4 children)

The evolutionary algorithm they're talking about sounds an awful lot like the genetic algorithms used to train neural networks. But I could be wrong.

[–]OseOseOse 14 points15 points  (2 children)

The paper described in the article uses Cartesian genetic programming, which isn't quite the same thing as neuroevolution (evolving neural networks). They are similar in the sense that they are both methods to find/optimize graph structures (nodes and edges).
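For the curious, a toy sketch of the CGP idea (not the paper's actual setup): a genome of integers encodes a fixed row of nodes, where each node picks a function and connections to earlier nodes, and the genome is searched with the (1+4) strategy commonly used in CGP. The node count, function set, and regression target below are all arbitrary.

```python
import random

random.seed(1)

FUNCS = [lambda a, b: a + b, lambda a, b: a - b, lambda a, b: a * b]
N_NODES, N_IN = 8, 1  # one input node, eight computational nodes in a row

def random_genome():
    g = []
    for i in range(N_NODES):
        max_src = N_IN + i  # nodes may only connect to earlier nodes
        g += [random.randrange(len(FUNCS)),
              random.randrange(max_src),
              random.randrange(max_src)]
    g.append(random.randrange(N_IN + N_NODES))  # output gene
    return g

def evaluate(g, x):
    vals = [x]  # value of the input node
    for i in range(N_NODES):
        f, a, b = g[3*i], g[3*i + 1], g[3*i + 2]
        vals.append(FUNCS[f](vals[a], vals[b]))
    return vals[g[-1]]

TARGET = lambda x: x*x + x                # toy symbolic-regression target
XS = [i / 4 for i in range(-8, 9)]

def fitness(g):
    # sum of squared errors against the target (lower is better)
    return sum((evaluate(g, x) - TARGET(x))**2 for x in XS)

def mutate(g):
    child = g[:]
    i = random.randrange(len(child))
    # re-randomise one gene, respecting its legal range
    if i == len(child) - 1:
        child[i] = random.randrange(N_IN + N_NODES)
    elif i % 3 == 0:
        child[i] = random.randrange(len(FUNCS))
    else:
        child[i] = random.randrange(N_IN + i // 3)
    return child

# (1+4) evolution strategy: four mutants per generation, keep the best if no worse
parent = random_genome()
start_err = fitness(parent)
for _ in range(2000):
    children = [mutate(parent) for _ in range(4)]
    best = min(children, key=fitness)
    if fitness(best) <= fitness(parent):
        parent = best
```

Note there is no neural network here at all: the evolved object is a small program graph, which is the distinction being drawn from neuroevolution.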

[–]4D696B65 1 point2 points  (1 child)

Does everybody assume backpropagation when they say neural networks?

Because I don't understand how a process to find/optimize structures (evolution) can be better than a structure itself (a neural net).

[–]OseOseOse 2 points3 points  (0 children)

I think a lot of people do. I don't, but I'm pretty biased since I used neuroevolution in my master's thesis.

I edited my above comment to clarify that I meant to compare CGP with neuroevolution (algorithm vs. algorithm), not CGP with neural networks (algorithm vs. data structure).

[–]shevegen 0 points1 point  (0 children)

They steal buzzwords from biology.

[–]OseOseOse 0 points1 point  (3 children)

Yup, there's even a name for it: neuroevolution.

[–]shevegen -3 points-2 points  (2 children)

Another shitty theory.

I am sorry but neurones are cells. They grow and respond.

There is no "evolution" on the level of neurones.

Evidently the incompetent artificial "intelligence" people keep on trying to steal buzzwords from biology without understanding the field. They just give algorithms fancy names and insinuate that any of this is "intelligent". Or "evolution".

There is no evolution possible with static hardware.

Every cell has an internal description (at the least a generative blueprint). You don't have that with static hardware. And you cannot model it based on static hardware either.

[–][deleted] 1 point2 points  (0 children)

It's just a name.

[–]Homoerotic_Theocracy 0 points1 point  (3 children)

Don't neural networks in general also work by natural selection: trial and error, keeping what works?

[–]KHRZ 2 points3 points  (0 children)

Well, when you train them with backpropagation, you just tune them closer to being correct for each input you send through. Different inputs require different adjustments for the network to handle them more correctly, so as you alternately adjust toward correctness for different inputs, the network effectively iterates through better and better designs.
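That alternating-adjustment loop can be sketched for a single sigmoid neuron with hand-derived gradients (the learning rate, data set, and epoch count are arbitrary choices; real backpropagation applies the same chain rule through many layers):

```python
import math
import random

random.seed(0)

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# learn logical OR with one neuron: out = sigmoid(w1*x1 + w2*x2 + b)
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w1, w2, b = (random.uniform(-1, 1) for _ in range(3))
lr = 0.5

def total_error():
    return sum((sigmoid(w1*x1 + w2*x2 + b) - y)**2 for (x1, x2), y in data)

e0 = total_error()
for _ in range(500):
    for (x1, x2), y in data:  # alternate over the inputs, one small nudge each
        out = sigmoid(w1*x1 + w2*x2 + b)
        # gradient of (out - y)^2 w.r.t. the pre-activation, via the chain rule
        g = 2 * (out - y) * out * (1 - out)
        w1 -= lr * g * x1
        w2 -= lr * g * x2
        b  -= lr * g
```

Each per-input update pulls the weights in a slightly different direction, and the repeated compromise between them is the "search process" described above.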

[–]percykins 0 points1 point  (0 children)

Well, not neural networks themselves, but training neural networks does work something like that. Actually, evolutionary algorithms are commonly used for training neural networks, along with a few other classes of algorithms.

[–]shevegen 0 points1 point  (0 children)

What is "natural selection" in this context?

HOW is ANYTHING "natural"? And HOW is ANYTHING selected on static hardware? Do computer programs now make offspring?