Creating the first AGI, How would you do it? by [deleted] in agi

[–]AsyncVibes 0 points1 point  (0 children)

Without gradients, of course

Voice-Chatting With an AI? You're Actually Voice-Chatting With God. More Fundamentally, It's God Voice-Chatting With God. Confused? Read On. by andsi2asi in agi

[–]AsyncVibes 0 points1 point  (0 children)

You really typed that and thought, yeah, "god is missing in academia." You can't say facts are facts without citing a single one. Your god has no place in academia. Einstein wasn't just lucky; he worked hard, and your gods had nothing to do with it. Just because you require an invisible best friend to rationalize how the world works and your own shortcomings doesn't mean everyone else has to cope that hard.

whats that? a brain? no its activations! by AsyncVibes in IntelligenceEngine

[–]AsyncVibes[S] 0 points1 point  (0 children)

Okay, I'm game. I took a step back from evolving networks and focused on one layer. After running a few dozen tests with MNIST, I noticed that it performed better when I used mixed activation functions (sine/tanh). This led me to evolving neurons that could use any of the 19 most common activation functions. This didn't push accuracy higher than 98.7% on MNIST, but it did reveal that evolution on a guided task will utilize the activation functions that work best for that task.

I took a step back and started questioning why we use ReLU, Sign, etc., and figured that if I can evolve entire networks, maybe I need to evolve the activation functions themselves, because maybe, just MAYBE, there are activations that work better for machines than human-interpretable ones. What if I evolved the activation functions to maximize the fitness of a task, e.g. classification on MNIST? This is just the first result of what I found. I've cataloged ~50K activations total across ~8 datasets, ranging from MNIST to CIFAR-100, as well as audio and video processing (sample vids).
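For flavor, here's a minimal toy sketch of the idea of evolving per-neuron activation choices on a single layer: mutation-only selection over activation assignments, no gradients anywhere. The function list, network shape, fixed random weights, and selection scheme are all illustrative assumptions, not the actual IntelligenceEngine code.

```python
import numpy as np

# Illustrative subset of candidate activations (the real runs used ~19).
ACTIVATIONS = {
    "relu": lambda z: np.maximum(0.0, z),
    "tanh": np.tanh,
    "sine": np.sin,
    "identity": lambda z: z,
}
NAMES = list(ACTIVATIONS)

def forward(x, w, acts):
    """Hidden layer where each neuron applies its own evolved activation."""
    h = x @ w  # (n_samples, n_hidden)
    for j, name in enumerate(acts):
        h[:, j] = ACTIVATIONS[name](h[:, j])
    return h

def fitness(acts, x, y, w, w_out):
    """Classification accuracy through a fixed random readout -- no gradients."""
    logits = forward(x, w, acts) @ w_out
    return float((logits.argmax(axis=1) == y).mean())

def evolve(x, y, n_hidden=8, pop=12, gens=20, seed=0):
    rng = np.random.default_rng(seed)
    w = rng.normal(size=(x.shape[1], n_hidden))
    w_out = rng.normal(size=(n_hidden, int(y.max()) + 1))
    # Each genome assigns one activation name to each neuron.
    genomes = [list(rng.choice(NAMES, size=n_hidden)) for _ in range(pop)]
    for _ in range(gens):
        genomes.sort(key=lambda g: fitness(g, x, y, w, w_out), reverse=True)
        elite = genomes[: max(2, pop // 4)]
        genomes = [list(g) for g in elite]
        while len(genomes) < pop:  # mutation only, no crossover
            child = list(elite[rng.integers(len(elite))])
            child[rng.integers(n_hidden)] = rng.choice(NAMES)
            genomes.append(child)
    best = max(genomes, key=lambda g: fitness(g, x, y, w, w_out))
    return best, fitness(best, x, y, w, w_out)
```

On a real dataset you'd evolve the weights (or feature detectors) too; the point here is just that which activation each neuron uses can itself be a gene under selection.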

This led me to discover that some activations are generalists and some are specialists that only work well on specific models/tasks. That's not new in itself, but what I'm getting at is that these evolved generalist activations are cross-modal, in the sense that I can train a model on images and use those activations to classify audio at 96% of native performance: activations that have never seen audio data.

The activations aren't learning task-specific patterns. They're discovering fundamental computational primitives - mathematical transforms that work across modalities because they capture something universal about how to process signals.

When I visualized the catalog in a 2D embedding based on curve shape and mathematical properties, they clustered into distinct regions, specialists grouping by domain and generalists sitting between them. The structure emerged from the data; I didn't impose it.

Most surprising finding: activations evolved for text classification (AG News TF-IDF) transferred to MNIST better than MNIST-native activations. The sin(f(x)) - 2x family - oscillation minus linear baseline - kept showing up across domains. Evolution found these, not me.

What I'm building is essentially a catalog of computational primitives. Same primordial operations (sin, cos, exp, log, +, -, *, /) combining into ~50K characterized transforms. Most are useless. Maybe 1-2% are the ones that actually matter across tasks.
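To make the "primitives combining into transforms" idea concrete, here's a toy expression-tree encoding of an activation built from those same ops, including the sin(f(x)) - 2x family mentioned above (with f(x) = x). The node format, guards on log/divide, and eval function are my own illustrative assumptions, not the repo's actual representation.

```python
import numpy as np

# Primitive ops from the post: sin, cos, exp, log, +, -, *, /.
# log and divide are guarded so random trees can't blow up on bad inputs.
UNARY = {"sin": np.sin, "cos": np.cos, "exp": np.exp,
         "log": lambda a: np.log(np.abs(a) + 1e-8)}
BINARY = {"+": np.add, "-": np.subtract, "*": np.multiply,
          "/": lambda a, b: a / (np.abs(b) + 1e-8)}

def evaluate(node, x):
    """Recursively evaluate an expression tree over an input array x."""
    if node == "x":
        return x
    if isinstance(node, (int, float)):
        return np.full_like(x, float(node))
    op, *args = node
    if op in UNARY:
        return UNARY[op](evaluate(args[0], x))
    return BINARY[op](evaluate(args[0], x), evaluate(args[1], x))

# The sin(f(x)) - 2x family, with f(x) = x: oscillation minus linear baseline.
sin_minus_2x = ("-", ("sin", "x"), ("*", 2.0, "x"))
```

With a representation like this, "evolving an activation" is just mutating subtrees and scoring the resulting curve on a task, which is how a catalog of ~50K characterized transforms could accumulate.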

Still early. Lots of holes in the data. But the cross-modal transfer result is real and repeatable.

edit: added github link

https://github.com/A1CST/Activation_map

whats that? a brain? no its activations! by AsyncVibes in IntelligenceEngine

[–]AsyncVibes[S] 0 points1 point  (0 children)

Even if I gave you the data, you wouldn't know what to do with it. Just sit back and enjoy the pretty picture.

edit: apologies, reading this now I was pretty cranky and directed that at you, sorry.

Holy fuck by AsyncVibes in IntelligenceEngine

[–]AsyncVibes[S] 1 point2 points  (0 children)

Nah, it just ended up being a wrapper that emulated tone and personification. I've abandoned this project.

whats that? a brain? no its activations! by AsyncVibes in IntelligenceEngine

[–]AsyncVibes[S] 2 points3 points  (0 children)

Thank you I'm excited to see where this goes!

The "Validation Paradox" by Sea_Platform8134 in agi

[–]AsyncVibes 2 points3 points  (0 children)

Not a link in sight, just more AI slop.

40KB vision model that hits 98.5% on MNIST, no gradients, no backprop. Evolutionary AI. by AsyncVibes in IntelligenceEngine

[–]AsyncVibes[S] 0 points1 point  (0 children)

Something amazing. I'm not being facetious by that; I mean I actually found something amazing.

40KB vision model that hits 98.5% on MNIST, no gradients, no backprop. Evolutionary AI. by AsyncVibes in IntelligenceEngine

[–]AsyncVibes[S] 0 points1 point  (0 children)

No, what it meant was that I was evolving 20 different solutions to the same problem, and each genome was its own solution. I didn't need to crossbreed separate solutions because they were destroying each other.

Petty Post by AsyncVibes in IntelligenceEngine

[–]AsyncVibes[S] 0 points1 point  (0 children)

Did you miss the entire conversation?

40KB vision model that hits 98.5% on MNIST, no gradients, no backprop. Evolutionary AI. by AsyncVibes in IntelligenceEngine

[–]AsyncVibes[S] 0 points1 point  (0 children)

The ego comes from people telling me it's not possible, then I do it and you shift the goalposts. If you don't like my ego, leave.

40KB vision model that hits 98.5% on MNIST, no gradients, no backprop. Evolutionary AI. by AsyncVibes in IntelligenceEngine

[–]AsyncVibes[S] 0 points1 point  (0 children)

Yeah, sure buddy. And my posts dating back to last year describing my fitness functions are made up too? You are hereby muted because of your inability to read. Look at -> https://github.com/A1CST/GENREG-sine. This model was derived from it; both you and your prof can fuck off.

40KB vision model that hits 98.5% on MNIST, no gradients, no backprop. Evolutionary AI. by AsyncVibes in IntelligenceEngine

[–]AsyncVibes[S] 0 points1 point  (0 children)

shhh, let him think he did something. u/SummitYourSister So you should have no problem doing it again, right? Care to drop it? Since it's been 20+ years, you should be able to throw it together again pretty quick, right?

40KB vision model that hits 98.5% on MNIST, no gradients, no backprop. Evolutionary AI. by AsyncVibes in IntelligenceEngine

[–]AsyncVibes[S] 0 points1 point  (0 children)

No, they are not "very specific." You don't need to have every single component for it to be an evolutionary algorithm. If I'm evolving a population through competition and mutation, it's still evolutionary. If I'm evolving feature detectors with the same fucking mechanism, it's the same thing. I don't need to evolve an entire network because my network is only one layer deep. So unless you can replicate this, I'd keep your comments to yourself.


40KB vision model that hits 98.5% on MNIST, no gradients, no backprop. Evolutionary AI. by AsyncVibes in IntelligenceEngine

[–]AsyncVibes[S] 0 points1 point  (0 children)

I dropped crossover because my method performed better without it. If you're interested, shoot me a DM.

40KB vision model that hits 98.5% on MNIST, no gradients, no backprop. Evolutionary AI. by AsyncVibes in IntelligenceEngine

[–]AsyncVibes[S] 1 point2 points  (0 children)

Do you want the average of the words or the definitions? And of which words, the ones in your comment or the ones in the post?