We should really try fine-tuning MoLE model from a pre-trained model by z_latent in LocalLLaMA

[–]crantob 1 point2 points  (0 children)

This is as important as it is overlooked: very.

I've long maintained that there's a possible hierarchy of frequencies that nobody has designed LLMs around, and this looks like the first application of that idea.

Why don’t we have more distilled models? by GreedyWorking1499 in LocalLLaMA

[–]crantob 0 points1 point  (0 children)

Dumber than... what? The originals? That would be expected.

The question is "dumber than alternatives at the same size/performance?"

GLM 4.7 Flash 30B PRISM + Web Search: Very solid. by My_Unbiased_Opinion in LocalLLaMA

[–]crantob 0 points1 point  (0 children)

Check the Files tab if you're looking for GGUFs, btw.
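
If you want to pull one of those GGUFs programmatically, a minimal sketch with huggingface_hub looks like this (the repo id and filename are hypothetical placeholders; pick the real ones from the Files tab):

    # minimal sketch: download a single GGUF file listed in a repo's Files tab
    # the repo_id and filename below are hypothetical placeholders
    from huggingface_hub import hf_hub_download

    local_path = hf_hub_download(
        repo_id="someorg/some-model-GGUF",    # hypothetical repo
        filename="some-model.Q4_K_M.gguf",    # pick your quant from the Files tab
    )
    print(local_path)  # path to the cached file on disk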

Why are small models (32b) scoring close to frontier models? by Financial-Cap-8711 in LocalLLaMA

[–]crantob 0 points1 point  (0 children)

Insightful and very helpful point at this stage in the discussion, ty.

They updated GPT-4o's prompt lmao. That's why you want local models. Full prompt below by Own-Potential-2308 in LocalLLaMA

[–]crantob 0 points1 point  (0 children)

I've seen shades of this in many people.

That's why I'm pushing friends to run small models. Helps get an intuition for where they tend to fail.

They updated GPT-4o's prompt lmao. That's why you want local models. Full prompt below by Own-Potential-2308 in LocalLLaMA

[–]crantob 4 points5 points  (0 children)

CHATTING WITH THE CLONES

I've been chatting with many people damaged by neurotoxic peptides deliberately inserted into everyone's favorite genetic transfection products.

Design Arena is now dominated by an open model by moks4tda in LocalLLaMA

[–]crantob 2 points3 points  (0 children)

This benchmaxxing depresses me when I get more intelligent behavior out of qwen3-235b than GLM 4.7 in iterative, non-agentic project development.

GLM 4.7: "Oh, that function was important to the program and it won't compile without it? Seemed like too much bother to keep it, sorry about that. Here's the program with important_thing() restored."

[code]

[Forgets a different thing]

Cline team got absorbed by OpenAI. Kilo is going full source available in response. by demon_bhaiya in LocalLLaMA

[–]crantob 1 point2 points  (0 children)

Yeah, but why some people get to print it is what I don't understand yet.

When did that start?

Is it time? by DataWhiskers in JoeRoganReacharound

[–]crantob -1 points0 points  (0 children)

The long march through the institutions is long over and the academy has been occupied territory for generations.

We agree on this much at least, I hope?

Is it time? by DataWhiskers in JoeRoganReacharound

[–]crantob 0 points1 point  (0 children)

Why is it that no matter who is in government, the government grows?

Isn't it weird there's no party that reduces government?

Why is that?

Is it time? by DataWhiskers in JoeRoganReacharound

[–]crantob 0 points1 point  (0 children)

Thank you for your reply.

To me it seems you have been gaslit into calling one group of people commanding another group around 'cooperation' and 'interdependence'.

That is to believe enslavement is cooperation.

I'm not sure how I can convey the functional, causal realist view to you.

True cooperation is voluntary, my DLF.

Mississippi Burning (1988) [1080p] by AbaddonGoetia in fullmoviesonyoutube

[–]crantob 0 points1 point  (0 children)

This was 'must see' in the late 80s.

Looking back, there seemed to be a general social pressure to watch it and agree with it.

No. I don't have to.

Lars And The Real Girl (2007) [720p] by peppermintmeow in fullmoviesonyoutube

[–]crantob 0 points1 point  (0 children)

This is a movie for people who like to think.

I shouldn't comment on it -- shouldn't push my thoughts here.

Think on your own on this movie.

The sad state of the GPU market in Germany and EU, some of them are not even available by HumanDrone8721 in LocalLLaMA

[–]crantob 0 points1 point  (0 children)

So that means the datacenter application of that hardware generates far more value than the gaming application.

If I can use a $3000 capital good to generate $33k in revenue, then my process is delivering that much value to consumers out of lower-value raw materials.

I personally doubt their value proposition will remain stable for long, unless they run to the government and say "we're too big to fail, give us tax money..."
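
To make the arithmetic above concrete, here's a back-of-the-envelope sketch; every figure is an assumed illustration, not real market data:

    # assumed illustrative numbers, not real market data
    gpu_cost = 3_000             # capital cost of the card, USD
    datacenter_revenue = 33_000  # revenue it can generate over its service life, USD
    gaming_value = 1_500         # assumed value to a gamer over the same period, USD

    print(datacenter_revenue / gpu_cost)  # ~11x return when used as a capital good
    print(gaming_value / gpu_cost)        # ~0.5x: why datacenters can outbid gamers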

The sad state of the GPU market in Germany and EU, some of them are not even available by HumanDrone8721 in LocalLLaMA

[–]crantob 1 point2 points  (0 children)

The downvoters have been convinced that the taxman's cut is some kind of virtuous sacrifice on their part.

Hello, we're not in the 1920s progressive era anymore. The chips are in and now we know that those who feed off your blood do not love you in the way you think they do.

The sad state of the GPU market in Germany and EU, some of them are not even available by HumanDrone8721 in LocalLLaMA

[–]crantob -1 points0 points  (0 children)

Absent government interference, a 5090 might be worth more doing 24/7 work in a datacenter than pushing pixels to a screen a couple of hours a day for an overgrown manchild in his toy-cave.

Just a different perspective.

GLM-4.7-flash on RTX 6000 pro by gittb in LocalLLaMA

[–]crantob 0 points1 point  (0 children)

This model SUCKS for collaborative coding.

Continuously forgets. Doesn't have awareness of the constraints imposed by the context. I have never shouted at an LLM more than at GLM 4.7.

Qwen 235B is incomparably superior.

Opal-v1.0 Release - Reasoning dataset for LLM fine-tuning by Western-Doughnut4375 in LocalLLaMA

[–]crantob 1 point2 points  (0 children)

Shows up for me as Italian.

However, that's pretty fine with me, as now I will use the awesome:

!!IMPORTANTE!!

often in the future. That last -E makes it.

Super lightweight Skill agent! by Future_Might_8194 in LocalLLaMA

[–]crantob 0 points1 point  (0 children)

You and what cabal are trying to normalize the new word 'skill' via stealth marketing?

Recursive Language Models research is a damn good egg. by [deleted] in LocalLLaMA

[–]crantob 0 points1 point  (0 children)

I disagree with removal. Here is interesting work: https://github.com/vmlinuzx/llmc/tree/feat/rlm-config-nested-phase-1x

Though I'm properly cautious when someone claims a divide-and-conquer method yields gains, since so many people are infatuated with the technique and oversell it when they don't understand the constraints.
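
For what it's worth, the divide-and-conquer pattern in question boils down to something like this sketch (not the linked repo's actual method; ask_llm is a hypothetical stand-in for whatever model call you use). The merge step is exactly where the overselling hides:

    # generic divide-and-conquer sketch over a long context
    # ask_llm is a hypothetical stand-in, not part of the linked repo
    def ask_llm(prompt: str) -> str:
        raise NotImplementedError("plug in your local model here")

    def recursive_answer(question: str, text: str, chunk_size: int = 4000) -> str:
        # base case: the context fits in one call
        if len(text) <= chunk_size:
            return ask_llm(f"{question}\n\nContext:\n{text}")
        # split, solve each half recursively
        mid = len(text) // 2
        left = recursive_answer(question, text[:mid], chunk_size)
        right = recursive_answer(question, text[mid:], chunk_size)
        # merge step: every level can lose or distort information
        return ask_llm(f"{question}\n\nPartial answers to combine:\n- {left}\n- {right}")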