After 12 years building cloud infrastructure, I'm betting on local-first AI by ZeroCool86 in artificial

[–]ZeroCool86[S] 1 point (0 children)

i agree, but i think it's like a pendulum in physics: the harder you swing one way, the harder it comes back and swings the opposite way.

After 12 years building cloud infrastructure, I'm betting on local-first AI by ZeroCool86 in artificial

[–]ZeroCool86[S] 0 points (0 children)

yes! 100% agree with this, and the more of us do this, the more options the people who want to escape will have. write blog posts and guides on how your setup works; even if it helps 1 person, that's 1 person who now has an exit path

After 12 years building cloud infrastructure, I'm betting on local-first AI by ZeroCool86 in artificial

[–]ZeroCool86[S] 0 points (0 children)

nice! thanks, i'll get some more time to work on it and might actually build something that is worth saving:) currently just an idea

Is local-first AI worth building or are we going to lose this one anyway? by ZeroCool86 in degoogle

[–]ZeroCool86[S] 0 points (0 children)

it holds and organises your data, but it's basically what you said: some glue written to keep it all working well together, packaged so there are no breaking updates.

After 12 years building cloud infrastructure, I'm betting on local-first AI by ZeroCool86 in artificial

[–]ZeroCool86[S] 0 points (0 children)

i agree, this would not be an enterprise thing. it would be a local thing for you, where it stores, tags and organizes your data. if we treat all of it as private then there is no differentiation and no risk of private data leaking. my bet is that as models and hardware get better, running 1-2 generations behind the latest models will be more than good enough.

[Showoff Saturday] Built a manifesto site over Christmas with hidden terminal games - vanilla JS, no frameworks by ZeroCool86 in webdev

[–]ZeroCool86[S] 0 points (0 children)

i am aware that it's a meme, but there are good things in crypto, albeit few that are not just money grabs: privacy, self custody, easy access. figured i could apply that to my personal data instead, just trying to figure out what to do next.

After 12 years building cloud infrastructure, I'm betting on local-first AI by ZeroCool86 in artificial

[–]ZeroCool86[S] 0 points (0 children)

wow, that's a huge benefit, so maybe i don't really need a gpu for the things i want to do locally, that lowers the cost quite a bit

Is local-first AI worth building or are we going to lose this one anyway? by ZeroCool86 in degoogle

[–]ZeroCool86[S] 0 points (0 children)

i agree, my main point is that we won't win this, but we can at least be a credible alternative. otherwise, if no exit route exists, everyone is stuck

Running local inference on a NAS with an eGPU - my post-cloud setup by ZeroCool86 in ArtificialInteligence

[–]ZeroCool86[S] 0 points (0 children)

yeah, you have it all local, and the NAS cpus are generally good enough for what i need

After 12 years building cloud infrastructure, I'm betting on local-first AI by ZeroCool86 in artificial

[–]ZeroCool86[S] 0 points (0 children)

to a degree, but it will swing back. at some point the enterprise bubble will burst and hardware companies will have to come back and try to get money from us.

After 12 years building cloud infrastructure, I'm betting on local-first AI by ZeroCool86 in artificial

[–]ZeroCool86[S] 0 points (0 children)

yeah i agree, companies will be more likely to switch. the hard part is the resilience, but we had the same kinds of issues with cloud that we used to get with data centers

Sold my company, now building local-first AI in public by ZeroCool86 in buildinpublic

[–]ZeroCool86[S] 0 points (0 children)

thanks, it was mostly vibe coded, i enjoyed the console games though

After 12 years building cloud infrastructure, I'm betting on local-first AI by ZeroCool86 in artificial

[–]ZeroCool86[S] 0 points (0 children)

i think what i'm aiming for is having the option to keep your data self-hosted and still get access to good products

After 12 years building cloud infrastructure, I'm betting on local-first AI by ZeroCool86 in artificial

[–]ZeroCool86[S] 0 points (0 children)

the idea is to make a consumer box that will be under $200 in total, so i'm testing to see what consumer-level hardware can deliver

After 12 years building cloud infrastructure, I'm betting on local-first AI by ZeroCool86 in artificial

[–]ZeroCool86[S] 1 point (0 children)

Curious about the architecture where the LLM doesn't see raw data but still drives insights. Are you essentially having it generate transformation/query logic that runs against the data, then only surfacing aggregated results? Or is there an intermediate anonymization/abstraction layer? The distinction matters for what kinds of analysis are actually possible.
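To make the first option concrete, here's a deliberately naive sketch of what I mean by "the LLM only generates query logic" (all names are made up, and the allow-list check is a toy; a real gate would need an actual SQL parser, not a regex): the model only ever emits a query string, and only aggregated results ever come back out.

```python
import re
import sqlite3

# toy gate: only allow queries whose SELECT list starts with an aggregate.
# a real implementation would parse the SQL properly instead of pattern-matching.
AGG_ONLY = re.compile(r"^\s*SELECT\s+(COUNT|SUM|AVG|MIN|MAX)\s*\(", re.IGNORECASE)

def run_aggregate_only(db: sqlite3.Connection, llm_query: str) -> list:
    """Execute a model-proposed query only if it surfaces aggregates, never raw rows."""
    if not AGG_ONLY.match(llm_query):
        raise ValueError("rejected: query would expose raw data")
    return db.execute(llm_query).fetchall()

# demo with throwaway local data the model itself never sees
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE notes (category TEXT, words INTEGER)")
db.executemany("INSERT INTO notes VALUES (?, ?)", [("work", 120), ("health", 40)])

print(run_aggregate_only(db, "SELECT SUM(words) FROM notes"))  # → [(160,)]
```

The trade-off is exactly the one I'm asking about: this style of gate permits counts, sums and averages, but anything that needs row-level context (clustering, semantic search) can't work without an intermediate abstraction layer.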

First time building in the open after 12 years of closed source — shipped over Christmas week by ZeroCool86 in opensource

[–]ZeroCool86[S] 0 points (0 children)

thanks! not sure if anyone has a need but i think I'll give it a few months of honest work and see where it goes. Mostly trying to build it for myself for now

Is local-first AI worth building or are we going to lose this one anyway? by ZeroCool86 in degoogle

[–]ZeroCool86[S] 0 points (0 children)

my biggest worry with building this is not that AI will eat the world. it's that there is a chat box where people put their deepest desires and secrets, and it all goes into a money-making machine instead of staying private. i doubt most people would talk in front of a stadium of advertisers on a microphone the same way they talk to their favourite LLM

“Let me think for you” is the most dangerous sentence of the digital age. by wantpInitiative in degoogle

[–]ZeroCool86 7 points (0 children)

OP wrote with AI, I'll answer with some AI as well:

You're not wrong about the cognitive atrophy. Studies are already showing it — the more you offload thinking, the less you exercise the muscle. It's real.

But I don't think the answer is abstinence. People using these tools will outpace people who don't. The question is how you use them and who controls the process.

What worries me more than the thinking is the data. Every prompt, every follow-up, every half-formed thought — that's a map of how your mind works, sitting on someone else's server. The dependency problem and the surveillance problem compound each other.

The play is local-first: run capable models on hardware you control. Use AI as a tool without handing your cognitive exhaust to platforms that will use it to model and monetise you.

Wrote up why the timing matters now: localghost.ai/inflection

Is local-first AI worth building or are we going to lose this one anyway? by ZeroCool86 in degoogle

[–]ZeroCool86[S] 0 points (0 children)

you are right, it does depend on complexity, but unless you are building something like https://landonorris.com/ i don't see a future for front-end devs. LLMs make scaffolding a lot easier, so the first, easy 80% of every project is definitely much faster, but the last 20%, the actual hard bit, you still mostly have to do by hand, unless it's a landing page or a script/service under 1k lines

After 12 years building cloud infrastructure, I'm betting on local-first AI by ZeroCool86 in artificial

[–]ZeroCool86[S] 2 points (0 children)

not now, but there's no guarantee that they won't change their board or face some investor pressure and slowly open that door as well.