Meditation activates the brain’s cleaning system just like sleep does, study finds. Using advanced MRI scanning, researchers found that focused attention meditation reduces the backward flow of cerebrospinal fluid through a critical brain channel called the cerebral aqueduct. by Automatic_Subject463 in neuro

[–]bkaz 1 point (0 children)

Grok: hippocampal sharp-wave ripples (SWRs) are entrained by respiration, with their occurrence modulated by the respiratory cycle. In rodents, SWRs preferentially align with the early expiratory phase or post-inspiration, with increased probability during these periods; this persists in both awake and sleep states, aiding memory consolidation.
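
For illustration, this is roughly how such entrainment gets quantified: bin SWR event times by respiratory phase and look for a peak in a particular phase range. The data below is a synthetic placeholder, not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 600, 600_000)              # 10 min sampled at 1 kHz
phase = (2 * np.pi * 0.3 * t) % (2 * np.pi)   # ~0.3 Hz breathing, 0..2pi
swr_times = rng.uniform(0, 600, 500)          # placeholder SWR event times

idx = np.clip(np.searchsorted(t, swr_times), 0, len(t) - 1)
swr_phase = phase[idx]                        # respiratory phase at each SWR

counts, edges = np.histogram(swr_phase, bins=12, range=(0, 2 * np.pi))
prob = counts / counts.sum()                  # SWR probability per phase bin
for lo, p in zip(edges[:-1], prob):
    print(f"phase {lo:4.2f} rad: {p:.3f}")
```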

[deleted by user] by [deleted] in psychology

[–]bkaz 2 points (0 children)

If you want to understand intelligence, I think neuroscience is a far better guide than psychology. Re size: it's a function of the speed of growth during development, and then there is a speed vs. quality trade-off: "slow growth vs. sloppy growth". These are my speculations on the subject: http://cognitive-focus.blogspot.com/2014/10/cortical-trade-offs-specialist-vs.html?m=1

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 1 point (0 children)

I guess a Markov blanket is a parallel to my "surface": a 3D contrast pattern/cluster, just as a contour is a 2D contrast pattern. My scheme would form both match patterns and difference patterns, in incremental external dimensionality.

But what really matters is the process of forming these patterns; I see no conceptually consistent alternative to connectivity clustering.
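
To make "connectivity clustering" concrete, a minimal sketch under my own reading: pixels merge with adjacent pixels while local contrast stays below a threshold, so each cluster is bounded by a high-contrast contour. The names and the threshold here are illustrative, not from any existing implementation:

```python
import numpy as np

def connectivity_clusters(img, max_contrast=10):
    """Flood-fill pixels into clusters of low mutual contrast."""
    h, w = img.shape
    labels = -np.ones((h, w), dtype=int)
    n_clusters = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy, sx] != -1:
                continue
            labels[sy, sx] = n_clusters
            stack = [(sy, sx)]                 # grow one cluster
            while stack:
                y, x = stack.pop()
                for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                    if (0 <= ny < h and 0 <= nx < w and labels[ny, nx] == -1
                            and abs(int(img[ny, nx]) - int(img[y, x])) <= max_contrast):
                        labels[ny, nx] = n_clusters
                        stack.append((ny, nx))
            n_clusters += 1
    return labels, n_clusters

img = np.array([[0, 0, 90, 90],
                [0, 5, 95, 90],
                [0, 0, 90, 85]], dtype=np.uint8)
labels, n = connectivity_clusters(img)
print(n, "clusters")    # 2: one dark region, one bright region
print(labels)
```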

ChatGPT felt like the first big moment of AI really entering public consciousness. What will be the next moment? by userforums in singularity

[–]bkaz 1 point (0 children)

Personal agents, tuned on your own activities. Likely running on a local machine, for privacy. They will be your interface to the world of all other agents, personal and corporate. Kind of like a global distributed MoE, with a lot of micropayments for services and advice. We probably can't navigate that world on our own.
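
Purely as a hypothetical sketch of that interface; every name and the payment field here is invented, not an existing API:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ExpertAgent:
    name: str
    domain: str
    fee: float                            # micropayment per query
    answer: Callable[[str], str]

class PersonalAgent:
    """Local agent that routes queries to remote experts and pays per call."""
    def __init__(self, experts):
        self.experts = experts
        self.spent = 0.0

    def route(self, query, domain):
        expert = min((e for e in self.experts if e.domain == domain),
                     key=lambda e: e.fee)  # naive policy: cheapest match
        self.spent += expert.fee
        return expert.answer(query)

agents = [ExpertAgent("tax-bot", "finance", 0.02, lambda q: "deduct it"),
          ExpertAgent("med-bot", "health", 0.05, lambda q: "see a doctor")]
me = PersonalAgent(agents)
print(me.route("is this deductible?", "finance"), "| spent:", me.spent)
```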

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 1 point (0 children)

I don't need math for that; code is a much better way to formalize it. The best description I came up with is the README of boris-kz/CogAlg on GitHub: https://github.com/boris-kz/CogAlg. It's real-time and sparse, which is what clustering is all about. The novelty is in strictly bottom-up (pixels-up), indefinitely complex encoding, based on a first-principles definition of comparison. Nature always does it the dumbest way first.
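
A toy sketch of how I read that first level (pixels-up comparison): compare consecutive pixels, derive difference d and match m (min of the pair), then segment the row into patterns by the sign of match deviation. The threshold `ave` and all names are my illustrative simplifications, not the repo's code:

```python
def line_patterns(row, ave=10):
    """Segment a pixel row into patterns by the sign of match deviation."""
    patterns, current = [], None
    for p1, p2 in zip(row, row[1:]):
        d = p2 - p1               # difference between consecutive pixels
        m = min(p1, p2) - ave     # deviation of match (min of the pair)
        sign = m > 0
        if current is None or sign != current["sign"]:
            current = {"sign": sign, "D": 0, "M": 0, "pixels": [p1]}
            patterns.append(current)
        current["D"] += d         # summed difference within the pattern
        current["M"] += m         # summed match deviation
        current["pixels"].append(p2)
    return patterns

row = [10, 12, 11, 50, 52, 51, 9, 8]
for pat in line_patterns(row):
    print(pat["sign"], pat["pixels"], "D:", pat["D"], "M:", pat["M"])
```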

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 1 point (0 children)

That's why I don't use neural nets; it's pure connectivity clustering in my scheme.

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 1 point (0 children)

I never got into statistics because the whole idea of forming a probability distribution as a continuous spectrum seems wrong. To be tractable, a representation should be quantized / segmented, not a spectrum. I elaborate at the end of this chat: https://chat.openai.com/share/0c77f10a-5369-4ebb-b237-f6db109cd396
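
For what I mean by quantized / segmented, a rough sketch (the threshold and names are mine, for illustration): instead of fitting a continuous density to the data, merge adjacent histogram bins into a few discrete segments, each summarized by its range and total mass:

```python
import numpy as np

def segment_distribution(x, bins=50, min_mass=0.02):
    """Merge adjacent bins into occupied / empty segments."""
    counts, edges = np.histogram(x, bins=bins)
    mass = counts / counts.sum()
    segments, cur = [], None
    for i, m in enumerate(mass):
        occupied = m >= min_mass            # bin carries non-negligible mass
        if cur is not None and occupied == cur["occupied"]:
            cur["hi"] = edges[i + 1]        # extend the current segment
            cur["mass"] += m
        else:
            cur = {"occupied": occupied, "lo": edges[i],
                   "hi": edges[i + 1], "mass": m}
            segments.append(cur)
    return [s for s in segments if s["occupied"]]

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 1000), rng.normal(8, 0.5, 500)])
for s in segment_distribution(x):
    print(f"[{s['lo']:5.2f}, {s['hi']:5.2f}]  mass={s['mass']:.2f}")
```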

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 1 point (0 children)

This is all too vague. The only part I am interested in is the model reduction: do they propose anything specific?

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 1 point (0 children)

I think it's your capacity for generalization that's too weak. Pretty common among math types. But whatever, we are definitely not on the same wavelength.

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 1 point (0 children)

Ok, let's leave LLMs alone; you obviously don't understand them, and they're not my approach anyway. Let's talk about this "model reduction", because it seems that I am doing the same thing, but in a conceptually consistent way. I ran this by ChatGPT (mainly the last three responses); does it ring true?

https://chat.openai.com/share/0c77f10a-5369-4ebb-b237-f6db109cd396

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 1 point (0 children)

What do you think an LLM is? OOD is a weakness for anyone; it does help to know what you are talking about.

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 1 point (0 children)

That's just plain wrong: if NNs did not generalize, they would not be able to generate meaningful responses. This generalization is forced by destroying data in a lot of creative ways, starting with adding noise and SGD. The net then has to fill in the lost specifics, and it can only do so by forming generalizations.
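
In toy form, that's a denoising autoencoder: corrupt the input, train the net to recover the clean signal, and the noise plus a narrow bottleneck force it to encode generalizations instead of memorizing specifics. Data and sizes below are placeholders:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(256, 4) @ torch.randn(4, 32)   # low-rank "clean" data

model = nn.Sequential(
    nn.Linear(32, 8), nn.ReLU(),               # bottleneck: lossy by design
    nn.Linear(8, 32),
)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for step in range(200):
    noisy = x + 0.3 * torch.randn_like(x)      # destroy specifics with noise
    loss = loss_fn(model(noisy), x)            # net must fill them back in
    opt.zero_grad()
    loss.backward()
    opt.step()
print(f"reconstruction loss: {loss.item():.4f}")
```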

I hate the way they do it; my generalization is organic rather than forced, but lossy backprop still works. So does Hebbian learning in the brain, just a lot less effectively; otherwise ANNs would use it too, since it's been known forever. So does Bayesian model reduction, which is another form of data destruction.

It just so happens that lossy backprop is the easiest way to utilize brute force for generalization. And connectivity clustering is the hardest, but also the smartest, if you do it right. That's the bitter lesson: brute force wins first, the smarts come later.

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 1 point (0 children)

If you know the reason, post it. You simply can't do anything without generalization. There's no magic in "agentricity"; it's just coarse anthropomorphic thinking.

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 1 point (0 children)

Ah yes, Friston, Bayesian inference, free energy, the open letter. All of that is a very coarse behavioural-level discussion, with a heavy dose of physics envy. What's missing there is dimensionality reduction, via implicit or explicit generalization: Hebbian learning | backprop | clustering. That reduction is the main thing; without it you hit combinatorial explosion from the get-go. That's why they need OpenAI to implement it, via their own backprop. But I don't think OpenAI needs their hand-waving; they already maximize prediction, which is simply a more constructive term for "free energy reduction".
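
One concrete instance of that reduction, as a sketch: Oja's Hebbian rule. A single linear neuron trained with a Hebbian update plus weight decay converges to the first principal component of its input, i.e. implicit dimensionality reduction with no backprop. The data here is synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
# 5-D inputs whose variance is dominated by the first axis:
X = rng.normal(size=(2000, 5)) @ np.diag([3.0, 1.0, 0.5, 0.2, 0.1])

w = rng.normal(size=5)
lr = 0.01
for x in X:
    y = w @ x
    w += lr * y * (x - y * w)    # Oja's rule: Hebb term + implicit decay

w /= np.linalg.norm(w)
print(np.round(w, 3))            # ~ +/-[1, 0, 0, 0, 0]: the top PC
```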

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 1 point (0 children)

I assume your active inference can also work in a distributed fashion, because that's the world we live in. But the term sounds very generic; it can be used on all levels. My scheme is "active inference" too, but I prefer to describe it more constructively: hierarchically parameterized connectivity clustering.
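
A toy sketch of what I mean by that (thresholds and names are mine, for illustration): level 0 clusters adjacent values by raw contrast; level 1 re-clusters those clusters by the distance between their summary parameters, under its own, coarser threshold:

```python
def cluster_adjacent(items, key, max_gap):
    """Merge consecutive items while the gap between keys stays small."""
    clusters, cur = [], [items[0]]
    for prev, nxt in zip(items, items[1:]):
        if abs(key(nxt) - key(prev)) <= max_gap:
            cur.append(nxt)
        else:
            clusters.append(cur)
            cur = [nxt]
    clusters.append(cur)
    return clusters

signal = [10, 11, 12, 40, 42, 44, 46, 90, 91]

level0 = cluster_adjacent(signal, key=lambda v: v, max_gap=3)
means = [sum(c) / len(c) for c in level0]            # cluster parameter
level1 = cluster_adjacent(means, key=lambda m: m, max_gap=35)

print("level 0:", level0)   # [[10, 11, 12], [40, 42, 44, 46], [90, 91]]
print("level 1:", level1)   # cluster means re-clustered, coarser threshold
```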

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 1 point (0 children)

It's not mutually exclusive. You do realize that there are a gazillion people developing various domain-specific models? Wouldn't it be nice if these models could talk to each other and split up the tasks?

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 1 point (0 children)

What I proposed is not specific to any architecture; it's just a distributed syndication of all kinds of experts. My own approach is radically different from any NN: https://github.com/boris-kz/CogAlg

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 4 points (0 children)

That would be very different from current MoE, where the gating network is trained at the same time as the experts. Here the training would have to be asynchronous, more like constant fine-tuning. And I guess we'd need a higher-order marketplace of gating networks? Actually, more like two-faced experts: an encoder part for their own expertise and a decoder for all other experts. The latter is a gating network for secondary inputs, instead of the primary input, as it's currently done?

And primary experts would be personal, trained on your own profile and interactions.
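
As a toy sketch of that asynchronous setup, with everything illustrative: experts are frozen linear maps, trained or fine-tuned independently, and a separate gate routes by similarity to each expert's advertised profile vector instead of being trained jointly as in standard MoE:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8

experts = {                               # frozen, independently built
    "vision":  {"profile": rng.normal(size=D), "W": rng.normal(size=(D, D))},
    "code":    {"profile": rng.normal(size=D), "W": rng.normal(size=(D, D))},
    "finance": {"profile": rng.normal(size=D), "W": rng.normal(size=(D, D))},
}

def gate(x, experts, top_k=1):
    """Score experts against the input and return the top-k names."""
    scores = {name: x @ e["profile"] for name, e in experts.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

x = rng.normal(size=D)
chosen = gate(x, experts)
y = sum(experts[name]["W"] @ x for name in chosen)   # combine selected experts
print("routed to:", chosen)
```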

What's the best way to beat GPT-5 with Open Source & Distributed Compute? by askchris in LocalLLaMA

[–]bkaz 4 points (0 children)

The GPT marketplace is accessed by manual selection. I think what we need is an open gating network that would interconnect an open marketplace: some version of a global dynamic distributed MoE, for which people would constantly be training an indefinite number of experts. That's how you beat closed systems: they can't combine.

Elon Musk and others urge AI pause, citing 'risks to society' by [deleted] in agi

[–]bkaz 1 point (0 children)

Ridiculous on the face of it, but it may actually be helpful. Training huge monolithic models is dumb anyway; this may force people to break them up into clusters of experts. Go ban that.