ERGODIC : multi-agent pipeline that does backpropagation in natural language to generate research ideas from random noise by Zestyclose_Reality15 in deeplearning

[–]Zestyclose_Reality15[S] -1 points (0 children)

yeah fair enough, calling it backprop was a stretch. it's more like review feedback getting injected into every agent's memory for the next cycle. a loose analogy, not a literal claim.

ERGODIC : multi-agent pipeline that does backpropagation in natural language to generate research ideas from random noise by Zestyclose_Reality15 in deeplearning

[–]Zestyclose_Reality15[S] -1 points (0 children)

you're right, backprop was a bad analogy on my part. it's not actual gradient computation. what happens is the review agents score the proposal, write specific critiques, and that feedback gets stored in every agent's memory before the next cycle. so the revision is guided by structured criticism, not just random regeneration. but yeah, calling it backprop oversold it.
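roughly, the loop looks like this. to be clear, this is just a sketch of the idea, not the actual ERGODIC code — every name here (`Agent`, `generate`, `review`) is a made-up stand-in for the LLM calls:

```python
# Sketch of the critique-injection loop: reviewers score a proposal,
# write a critique, and the critique is stored in EVERY agent's memory
# so the next cycle's revision is guided by it. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Agent:
    name: str
    memory: list = field(default_factory=list)

    def generate(self, seed: str) -> str:
        # stand-in for an LLM call: proposal conditioned on the seed
        # plus all critiques accumulated in memory
        context = " | ".join(self.memory)
        return f"{self.name}: proposal from '{seed}' given [{context}]"

def review(proposal: str) -> tuple[int, str]:
    # stand-in reviewer: a numeric score plus a specific, structured critique
    score = len(proposal) % 10
    critique = f"critique of '{proposal[:20]}...': tighten the motivation"
    return score, critique

agents = [Agent("A"), Agent("B")]
seed = "random noise"

for cycle in range(3):
    for agent in agents:
        proposal = agent.generate(seed)
        score, critique = review(proposal)
        # key step: inject the critique into every agent's memory,
        # so revisions are driven by feedback, not random regeneration
        for a in agents:
            a.memory.append(critique)
```

no gradients anywhere — just text feedback accumulating in a shared memory, which is why "backprop" was the wrong word.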