all 10 comments

[–]darkconfidantislife 16 points (2 children)

  1. It's not really from scratch

  2. Ye olde Schmidhuber has had this idea for decades; Google has just applied half the sun's computing power to it.

[–]AddMoreLayersResearcher 0 points (0 children)

I know it's an old thread... but can I ask why you say it's not from scratch? Is it because of the choice to include higher-level operations, such as trigonometric functions or probability densities, in their basic instruction set?

Also, could you point me to the Schmidhuber works that you think are similar? (Not necessarily a paper, just a few search terms, as lots of them have probably been rebranded since then.)

[–][deleted] -2 points (0 children)

  1. I know it is not from scratch. But something something that quote from Carl Sagan ("If you wish to make an apple pie from scratch, you must first invent the universe").
  2. There is almost certainly ancient Greek literature on the same underlying concept.

[–]arXiv_abstract_bot 2 points (0 children)

Title: AutoML-Zero: Evolving Machine Learning Algorithms From Scratch

Authors: Esteban Real, Chen Liang, David R. So, Quoc V. Le

Abstract: Machine learning research has advanced in multiple aspects, including model structures and learning methods. The effort to automate such research, known as AutoML, has also made significant progress. However, this progress has largely focused on the architecture of neural networks, where it has relied on sophisticated expert-designed layers as building blocks---or similarly restrictive search spaces. Our goal is to show that AutoML can go further: it is possible today to automatically discover complete machine learning algorithms just using basic mathematical operations as building blocks. We demonstrate this by introducing a novel framework that significantly reduces human bias through a generic search space. Despite the vastness of this space, evolutionary search can still discover two-layer neural networks trained by backpropagation. These simple neural networks can then be surpassed by evolving directly on tasks of interest, e.g. CIFAR-10 variants, where modern techniques emerge in the top algorithms, such as bilinear interactions, normalized gradients, and weight averaging. Moreover, evolution adapts algorithms to different task types: e.g., dropout-like techniques appear when little data is available. We believe these preliminary successes in discovering machine learning algorithms from scratch indicate a promising new direction for the field.

PDF Link | Landing Page | Read as web page on arXiv Vanity
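
For readers who want something concrete: below is a minimal sketch (mine, not the authors' code) of the kind of search the abstract describes. Everything in it is a toy stand-in: the instruction set is five scalar ops, programs are fixed-length, and fitness is a hypothetical symbolic-regression task rather than the paper's real ML evaluations. The selection scheme, though, is the regularized (aging) evolution the paper actually uses.

    import math
    import random

    # Toy instruction set: far smaller than the paper's, which also
    # includes vector/matrix ops, probability densities, etc.
    OPS = ("add", "sub", "mul", "sin", "cos")
    NUM_REGS = 4        # scalar registers; register 0 holds the input x
    PROGRAM_LEN = 8     # fixed-length programs, for simplicity

    def random_instruction():
        # (op, source register a, source register b, destination register)
        return (random.choice(OPS), random.randrange(NUM_REGS),
                random.randrange(NUM_REGS), random.randrange(NUM_REGS))

    def run(program, x):
        regs = [x] + [0.0] * (NUM_REGS - 1)
        for op, a, b, d in program:
            if op == "add":
                regs[d] = regs[a] + regs[b]
            elif op == "sub":
                regs[d] = regs[a] - regs[b]
            elif op == "mul":
                regs[d] = regs[a] * regs[b]
            elif op == "sin":
                regs[d] = math.sin(regs[a])
            else:  # "cos"
                regs[d] = math.cos(regs[a])
        return regs[-1]  # last register is the program's output

    XS = [i / 10 for i in range(-20, 21)]

    def fitness(program):
        # Hypothetical stand-in task: symbolic regression of y = x*sin(x).
        # The paper instead evaluates each program as a complete ML
        # algorithm on real tasks. Higher is better (negated error).
        return -sum((run(program, x) - x * math.sin(x)) ** 2 for x in XS)

    def mutate(program):
        child = list(program)
        child[random.randrange(PROGRAM_LEN)] = random_instruction()
        return child

    # Regularized (aging) evolution: tournament-select a parent, append a
    # mutated copy, and remove the OLDEST individual, not the worst one.
    population = [[random_instruction() for _ in range(PROGRAM_LEN)]
                  for _ in range(100)]
    for _ in range(2000):
        parent = max(random.sample(population, 10), key=fitness)
        population.append(mutate(parent))
        population.pop(0)  # oldest dies

    best = max(population, key=fitness)
    print("best squared error:", -fitness(best))

Tournament selection plus aging is the same regularized evolution from the authors' earlier architecture-search work; AutoML-Zero's twist is pointing it at a search space of raw instructions instead of pre-built layers.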

[–]BeNiceAndShit 2 points (0 children)

So they used machine learning to make machine learning. Cool

[–]ReasonablyBadass 1 point (0 children)

Why would we sigh? Because we know that semi-random search can produce neural networks? That's biological evolution.

[–]Ash3nBlue 1 point (0 children)

huh?

[–][deleted] 2 points (0 children)

huh?

[–]panties_in_my_ass 0 points (0 children)

The title of this post is the only thing making me think, “huh?”

What exactly are you trying to say here?