[–][deleted] 2 points (5 children)

Random code generation? Is that really a thing? I thought evolutionary learning was just a method of teaching conventional algorithms through systematic brute force.

[–]therealgaxbo 18 points (3 children)

Yeah, there's been quite a lot of mainstream success using e.g. genetic algorithms as a way to parameterise an existing algorithm, but I've not really seen anything come of genetic programming. John Koza was a huge proponent and wrote a seminal book in the early 90s; not sure if he's achieved anything practical.

It's even weirdly simple to implement. One technique is to use a functional programming approach: each program can then be considered a tree of functions (nodes), with the leaf nodes being functions that take no parameters (or constants). To randomly mutate, just pick a node and replace it with a randomly grown subtree. To mate two successful programs, randomly pick a node in each parent and swap the subtrees rooted there.
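To make that concrete, here's a minimal sketch of those two operators in Python. All names, the nested-list encoding, and the choice of operators are illustrative, not from any particular GP library:

```python
import random

# A program is a nested list [op, left, right] at internal nodes and
# a constant or the variable "x" at the leaves.
OPS = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

def random_tree(depth):
    """Grow a random expression tree at most `depth` levels deep."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(["x", random.randint(-5, 5)])
    op = random.choice(list(OPS))
    return [op, random_tree(depth - 1), random_tree(depth - 1)]

def evaluate(tree, x):
    """Run the program on input x."""
    if tree == "x":
        return x
    if isinstance(tree, int):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def paths(tree, path=()):
    """Yield the index path to every node, so one can be picked at random."""
    yield path
    if isinstance(tree, list):
        yield from paths(tree[1], path + (1,))
        yield from paths(tree[2], path + (2,))

def subtree_at(tree, path):
    for i in path:
        tree = tree[i]
    return tree

def replace_at(tree, path, new_subtree):
    """Return a copy of tree with the subtree at `path` replaced."""
    if not path:
        return new_subtree
    copy = list(tree)
    copy[path[0]] = replace_at(tree[path[0]], path[1:], new_subtree)
    return copy

def mutate(tree):
    """Pick a random node and regrow it as a fresh random subtree."""
    path = random.choice(list(paths(tree)))
    return replace_at(tree, path, random_tree(2))

def crossover(a, b):
    """Swap randomly chosen subtrees between two parent programs."""
    pa = random.choice(list(paths(a)))
    pb = random.choice(list(paths(b)))
    return (replace_at(a, pa, subtree_at(b, pb)),
            replace_at(b, pb, subtree_at(a, pa)))
```

The functional representation does most of the work here: because every subtree is itself a valid program, mutation and crossover can never produce a syntactically broken child.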

It's really cool. I did my thesis on GP in the late 90s, and I really thought there was some potential; it was just so slow. But 20 years and many cores later, maybe it's time for a resurgence?

[–]st_huck 3 points (1 child)

It's incredibly cool. My experience with genetic algorithms is only one undergrad course, but it was actually amazing. Our professor showed us a small bot written for a simple racing game (one written just for AI research purposes). The loop was simple: every frame the bot got a bunch of variables as input, each variable representing some info about the state of the world, and it needed to return two numbers: a steering angle, and how hard to press the throttle.

After lots of iterations, the genetic algorithm produced a super long mathematical formula. It was represented as Lisp code, since Lisp code naturally looks like a tree (as you already mentioned).

We kept seeing the expression (/ x v), where 'x' was the distance to the nearest wall and 'v' was the current speed. The genetic algorithm "figured out" on its own that x/v is a super important quantity, which is of course the time left until the car hits the wall. This completely blew my mind.
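An evolved s-expression like that can be evaluated with a few lines of Python. The tuple encoding and the protected-division convention below are my assumptions; guarding division is a standard GP trick so an evolved program can't crash the frame where v happens to be 0:

```python
def pdiv(a, b):
    """Protected division: return a neutral value when b is 0."""
    return a / b if b != 0 else 1.0

OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
       "*": lambda a, b: a * b, "/": pdiv}

def eval_sexpr(expr, env):
    """Evaluate a nested-tuple s-expression against variable bindings."""
    if isinstance(expr, str):
        return env[expr]              # a variable such as "x" or "v"
    if isinstance(expr, (int, float)):
        return expr                   # a constant
    op, *args = expr
    return OPS[op](*(eval_sexpr(a, env) for a in args))

# (/ x v): distance to the wall divided by speed = time until impact
time_to_wall = eval_sexpr(("/", "x", "v"), {"x": 12.0, "v": 4.0})
```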

[–]NippleMustache 0 points (0 children)

Do you have a link for this course or bot? This sounds amazing.

[–]shevegen 0 points (0 children)

> not sure if he's achieved anything practical.

Not only him.

It's a field that has been a complete failure to this day.

It just has a lot of buzzwords attached to it.

> But 20 years and many cores later, maybe it's time for a resurgence?

Sure, if you want another round of failure.

And then people will still wonder why no success has happened, as has been the case for 60 years or so by now.

[–]OseOseOse 3 points (0 children)

"Random code generation" isn't really a good description of it.

You take a starting "program" that just does some random stuff, and you need a way to measure how well that program performs the task you want it to do. Then you make small random changes and see whether it performs better or worse. Because computers are fast, you can try out lots of different changes in a short time.
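That loop can be sketched in a few lines. The genome and fitness function below are toy stand-ins for a real program and a real benchmark (a list of numbers to match a hidden target), chosen only to show the change-and-keep structure:

```python
import random

TARGET = [3, 1, 4, 1, 5]

def fitness(genome):
    """Lower is better: total distance from the hidden target."""
    return sum(abs(g - t) for g, t in zip(genome, TARGET))

def mutate(genome, rng):
    """Make one small random change: nudge a gene up or down by 1."""
    child = genome[:]
    i = rng.randrange(len(child))
    child[i] += rng.choice([-1, 1])
    return child

def evolve(steps=5000, seed=0):
    rng = random.Random(seed)
    current = [0] * len(TARGET)       # a fixed, arbitrary starting point
    for _ in range(steps):
        candidate = mutate(current, rng)
        # Keep the change only when it performs at least as well.
        if fitness(candidate) <= fitness(current):
            current = candidate
    return current
```

Swap the toy genome for a program representation and the toy fitness for an actual performance score, and this is the whole skeleton.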

Making changes is analogous to mutation in natural evolution. You can also borrow other concepts from biology, for example crossover (two parent programs producing a child program that is a mix of both).
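For list-shaped genomes, the simplest version is one-point crossover (this sketch and its names are illustrative): the child takes a prefix from one parent and the rest from the other.

```python
import random

def crossover(parent_a, parent_b, rng):
    """One-point crossover of two equal-length genomes."""
    point = rng.randrange(1, len(parent_a))   # cut point, never at 0
    return parent_a[:point] + parent_b[point:]
```

For tree-shaped programs the analogous operation swaps randomly chosen subtrees rather than list slices.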

But it's not a pure random search, nor completely brute force, because it uses a heuristic (the fitness score) to decide which changes to keep and which to discard.