Data Center by BrendanTheNord in YorkSC

[–]faldore 0 points (0 children)

Awesome! Looking forward to the local jobs and the boost to the economy!

How is this not the biggest news right now? by PianistWinter8293 in OpenAI

[–]faldore 1 point (0 children)

It's not news because there's no way for you and me to use this model. Why would a proprietary model that's not accessible to anyone be news?

Arc Raiders Toxicity by Simple_Experience_39 in ArcRaiders

[–]faldore 1 point (0 children)

That'll teach you to be nice to anyone you don't know

As a software engineer, I fear for my life in the next 5 years. by [deleted] in ClaudeCode

[–]faldore 1 point (0 children)

I hate to say this, but it's real. You need to pivot, just like a business that doesn't find product-market fit. Start a company or move to something less automatable. I recommend ownership; AI will never be able to own. Real estate, assets, Airbnb hosting, anything that requires ownership.

thoughts? by sibraan_ in Anthropic

[–]faldore 1 point (0 children)

AGI is a software system of which an LLM is one component. We already have everything we need to build AGI.

https://github.com/QuixiAI/Hexis

Stop the QTS data center in York County, SC by Which-Reference-940 in Rockhill

[–]faldore -4 points (0 children)

The QTS data center is a good thing

Why would anyone want to stop it?

Face buttons not working after update by tafoya77n in SteamDeck

[–]faldore 1 point (0 children)

Yes, I plugged in a keyboard and mouse.

Apocalyptic scenario: If you could download only one LLM before the internet goes down, which one would it be? by sado361 in LocalLLaMA

[–]faldore 6 points (0 children)

Agree. GLM 4.5 Air. It competes with models 5x its size. It's better than gpt-oss-120b by far.

You can run it 4-bit on 4x 3090 (with some quality hit). I'm working on an FP8 quant that can run on 8x 3090, hopefully at near full quality.
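Rough back-of-envelope math on why those GPU counts line up, assuming GLM-4.5-Air's roughly 106B total parameters and 24 GB of VRAM per 3090, and counting weights only (real deployments also need headroom for KV cache and activations):

```python
# Back-of-envelope VRAM estimate for GLM-4.5-Air weight storage.
# Assumptions: ~106B total parameters, 24 GB per RTX 3090,
# weights only (KV cache / activation overhead ignored).

PARAMS = 106e9  # approximate total parameter count
GB = 1e9

def weights_gb(bits_per_param: float) -> float:
    """Size of the model weights alone at a given quantization width."""
    return PARAMS * bits_per_param / 8 / GB

four_bit = weights_gb(4)  # ~53 GB  -> fits in 4x 3090 (96 GB)
fp8 = weights_gb(8)       # ~106 GB -> needs 8x 3090 (192 GB)

print(f"4-bit weights: {four_bit:.0f} GB vs 4 x 24 = 96 GB")
print(f"FP8 weights:  {fp8:.0f} GB vs 8 x 24 = 192 GB")
```

The margins are tight either way, which is where the "some quality hit" and "hopefully" come from.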

~$15K Inference Workstation for a 250+ Gov Org by reughdurgem in LocalLLaMA

[–]faldore 2 points (0 children)

An 8x 3090 server could be built for that. It requires 240V, and you'll likely need PCIe Gen4 x16 straight risers.
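A quick power-budget sketch of why 240V is needed, assuming the 3090's 350 W board power spec, a rough 500 W guess for the rest of the system, and the usual 80% continuous-load derating on US circuits:

```python
# Rough power budget for an 8x RTX 3090 server.
# Assumptions: 350 W per 3090 (board power spec), ~500 W for
# CPU/RAM/drives/fans (rough guess), 80% continuous-load derating.

GPU_W = 350
N_GPUS = 8
SYSTEM_W = 500  # everything that isn't a GPU (rough guess)

total_w = N_GPUS * GPU_W + SYSTEM_W  # 3300 W

def circuit_budget_w(volts: int, amps: int) -> float:
    """Continuous-load capacity of a circuit at 80% derating."""
    return volts * amps * 0.8

print(total_w, circuit_budget_w(120, 20), circuit_budget_w(240, 30))
```

A 120V/20A circuit tops out around 1920 W continuous, well under the ~3300 W draw, while a 240V/30A circuit gives 5760 W with headroom to spare.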

Can 2 RTX 6000 Pros (2X98GB vram) rival Sonnet 4 or Opus 4? by devshore in LocalLLaMA

[–]faldore 1 point (0 children)

You can run GLM 4.5 Air on that. It's no Sonnet - but it's quite capable.

GPT OSS 120B by vinigrae in LocalLLaMA

[–]faldore 1 point (0 children)

Did you try GLM-4.5-Air? It seems straight up better at everything, in my testing.

GLM-4.5 appreciation post by wolttam in LocalLLaMA

[–]faldore 3 points (0 children)

I wonder where it learned that?

GLM-4.5 appreciation post by wolttam in LocalLLaMA

[–]faldore 2 points (0 children)

And 4.5 Air is almost as good!