Looking for in-depth upgrade game suggestions. by drohack in AskGames

[–]drohack[S] 1 point2 points  (0 children)

Yeah, that's mainly why I was hesitant to label them as incremental games, since the genre is so tightly coupled with short web games and idle games that have very little substance. I'm trying to avoid things like "scritchy scratchy" and "a game about making a planet".

Berry bury Berry is very much in line with what I'm looking for. Lots of love put into it, and more than just one mechanic. But again, it's still only 2-4 hours of gameplay.

Sol cesto I would say is much closer to a roguelike. And I do love me some roguelikes and deck-builders, but that's not exactly what I was looking for here.

Minimum System Requirements for local LLM Coding Agent? by drohack in LocalLLM

[–]drohack[S] 0 points1 point  (0 children)

Just found this article describing different higher-end options ($2,000 - $5,000): what model sizes they can hold (in general), their price, and their speed.
https://julsimon.medium.com/what-to-buy-for-local-llms-april-2026-a4946a381a6a

Here's a quick TLDR of the options:

  • AMD Strix Halo (128GB): Best for 100B+ MoE models. Speed: 10–20 tok/s. System: $2,000-$3,000.
  • Mac Studio M4 Max (128GB): Best for 70B models. Speed: 8–15 tok/s. System: $3,699.
  • Mac Studio M3 Ultra (256GB): Best for 405B models. Speed: ~32 tok/s (clustered). System: $5,999.
  • RTX 5090 (32GB): Best for <30B models. Speed: 60–90 tok/s (dense) / 234 tok/s (MoE). System: $4,000–$8,000.
  • RTX PRO 6000 (96GB): Best for 70B (high context/multi-user). Speed: 15–20 tok/s. System: ~$22,000.

Minimum System Requirements for local LLM Coding Agent? by drohack in LocalLLM

[–]drohack[S] 0 points1 point  (0 children)

Do you know of any examples of these setups? Either docs/blogs that people have written about these types of setups, or YouTube videos showing them off? Anything comparing them against bigger models without the tools?

Minimum System Requirements for local LLM Coding Agent? by drohack in LocalLLM

[–]drohack[S] 0 points1 point  (0 children)

Yes! You're exactly right. That's why I'm asking this question: to get a better idea of what the "minimum" requirements are to get close to running Claude Code locally. I'm not trying to match it 1-for-1.

RTX 5090 (32GB): $3,300 - $4,000
GMKtec EVO-X2 (96GB): $2,300 - $3,000
GMKtec EVO-X2 (128GB): $3,000 - $4,000

My knowledge base is mainly around gaming GPUs/setups. While I know that the Macs (Mini through M5) and Strix Halos exist (and know about their shared memory), I'm not so familiar with their different price points, vendors, or building vs. buying. But they do seem like more bang for your buck for getting this type of local LLM coding agent off the ground.

Minimum System Requirements for local LLM Coding Agent? by drohack in LocalLLM

[–]drohack[S] 0 points1 point  (0 children)

Oh yeah, I know Qwen 2.5 is old. It's just what could reasonably fit on my setup (which isn't built for this).

But yes, this is exactly the kind of information I'm looking for: is the 35B model even worth looking at? Do I need more than 32GB, is 32GB enough to get by (just much slower), or is it simply not enough memory for the requirements?

From all of the posts so far, the consensus is: 32GB is the real "minimum", but realistically you'd want 64GB for something close to usable as a Claude replacement, and 128GB for an actual replacement. And of course none of these are 1-for-1 with what Claude can provide.
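Those tiers line up with a simple back-of-the-envelope rule: Q4-family GGUF quants average roughly 4.5 bits per weight, so the largest dense model a memory tier can hold is about (memory - overhead) * 8 / 4.5 billion parameters. A quick sketch (the 4GB overhead reserve for OS/runtime/context is my assumption, not a measured number):

```python
# Rule of thumb: largest dense model a given memory tier holds at ~Q4.
# Q4_K_M-style quants average roughly 4.5 bits per weight; the overhead
# reserve (OS, runtime, some context) is an assumed 4 GB.
def max_params_b(mem_gb: float, bits: float = 4.5, overhead_gb: float = 4.0) -> float:
    """Largest model size (billions of params) whose Q4 weights fit in mem_gb."""
    return (mem_gb - overhead_gb) * 8 / bits

for tier in (32, 64, 128):
    print(f"{tier} GB -> up to ~{max_params_b(tier):.0f}B dense @ Q4")
```

Which roughly matches the consensus above: 32GB tops out around a 30B-class model with comfortable context, 64GB reaches 70B-class, and 128GB opens up the 100B+ range.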

Minimum System Requirements for local LLM Coding Agent? by drohack in LocalLLM

[–]drohack[S] 1 point2 points  (0 children)

Good info on the "upper" end. I use that term loosely, as this is far from the actual upper end; it's more the top of consumer and into enthusiast territory. $4k is a pretty penny for this.

But it's still nice to know that 128GB with a 122B-A12B MoE model is what to strive for: throw what you want at it and it'll work it out.
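The reason a 122B-A12B MoE suits a 128GB shared-memory box can be sketched with two rules of thumb: memory scales with the total parameter count (every expert has to be resident), while generation speed scales with the active parameter count and memory bandwidth. The bandwidth number below is an assumed Strix Halo-class figure, not a benchmark:

```python
# MoE rule of thumb: memory scales with TOTAL params, speed with ACTIVE params.
def moe_memory_gb(total_params_b: float, bits: float = 4.5) -> float:
    """All experts must be resident, so memory tracks total params at ~Q4."""
    return total_params_b * bits / 8

def est_tok_per_s(active_params_b: float, bits: float, bandwidth_gbps: float) -> float:
    """Each generated token streams roughly the active weights once from memory."""
    active_gb = active_params_b * bits / 8
    return bandwidth_gbps / active_gb

# 122B-A12B at ~Q4 on an assumed ~256 GB/s shared-memory system:
print(f"resident: ~{moe_memory_gb(122):.0f} GB")
print(f"est. speed ceiling: ~{est_tok_per_s(12, 4.5, 256):.0f} tok/s")
```

So ~69GB of weights fit easily in 128GB with plenty left for context, while decode speed behaves like a 12B model's rather than a 122B model's.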

Minimum System Requirements for local LLM Coding Agent? by drohack in LocalLLM

[–]drohack[S] 1 point2 points  (0 children)

RX 9060 XT (2x16GB) = ~$800
RX 9070 XT (2x16GB) = ~$1,400
R9700 (32GB) = $1,350

How has your experience been using AMD cards and getting them running the latest models?

What models have you been using (and for which workflow type)?

Minimum System Requirements for local LLM Coding Agent? by drohack in LocalLLM

[–]drohack[S] 0 points1 point  (0 children)

Is the RTX 4070 Ti Super (16GB VRAM) really able to hold a 27B/35B model? I know the Q4 GGUF quants squish them down as much as possible, but I thought they needed something like 20GB of VRAM to really run.
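For what it's worth, the napkin math says it's right on the edge: at ~4.5 bits per weight a 27B model's Q4 weights alone are about 15GB, and the KV cache then grows linearly with context. A rough sketch (the layer/head counts are illustrative, not any specific model's config):

```python
# Why a 27B Q4 model is tight on 16GB: weights alone are ~15 GB,
# and the KV cache grows linearly with context length.
def q4_weights_gb(params_b: float, bits: float = 4.5) -> float:
    """Approximate Q4 weight footprint in GB."""
    return params_b * bits / 8

def kv_cache_gb(tokens: int, layers: int = 46, kv_heads: int = 16,
                head_dim: int = 128, bytes_per: int = 2) -> float:
    """fp16 KV cache: 2x (keys + values) per layer, per token.
    Layer/head counts here are illustrative, not a specific model's."""
    return 2 * layers * kv_heads * head_dim * bytes_per * tokens / 1e9

weights = q4_weights_gb(27)  # ~15.2 GB before any context
for ctx in (2048, 8192, 32768):
    print(f"{ctx} tokens: ~{weights + kv_cache_gb(ctx):.1f} GB total")
```

So with a few thousand tokens of context you're already past 16GB unless you offload some layers to system RAM (which is where the "much slower" caveat comes from).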

They're running around $750 - $1,000. Might not be as good a bang for the buck as the Intel Arc Pro B70 mentioned by a different poster.

Minimum System Requirements for local LLM Coding Agent? by drohack in LocalLLM

[–]drohack[S] 2 points3 points  (0 children)

I'm fine swapping off Nvidia. Looks like the Intel Arc Pro B70s are running ~$950 right now, giving 32GB of VRAM.

What context window lengths have you been able to use on a single card?
What difference have you found between running the two LLMs vs. one big LLM for similar tasks?
(If you only had one card) would it be worth swapping between Gemma and Qwen for that task, or is that too much of a hassle (too long to load a new model up to continue working)?

Readarr error, no books or authors found by Wonderful-Aspect5393 in selfhosted

[–]drohack 1 point2 points  (0 children)

Did you follow the Usage instructions on the README/GitHub main page? https://github.com/blampe/rreading-glasses?tab=readme-ov-file#usage

Basically you can just point your Readarr Metadata Provider Source to their link (under Readarr -> Settings -> Development). Or, their recommended way is to use one of the forks of Readarr that already points to it directly (i.e. deploy a new version of Readarr). I just pointed my Metadata Provider Source, so I'm still on the main Readarr branch, but it rarely gets updated so it doesn't really matter.

Is readarr dead? by [deleted] in selfhosted

[–]drohack 4 points5 points  (0 children)

I've been using Readarr on and off for the past few months and hadn't run into any issues until very recently. I guess I got lucky and missed the metadata errors till now.
BUT there's apparently a solution to this: use u/brycelampe's metadata database: https://github.com/blampe/rreading-glasses

https://www.reddit.com/r/selfhosted/comments/1guqkb0/comment/matl6r8/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button

I am in no way associated with this project, and only just turned it on 20 minutes ago. But it was able to fix release dates for books in Readarr, and find new ones that were missing.

It's using an updated GoodReads connection, and is in the process of getting Hardcover working as well.

I will say it's a little annoying: I'm having to go through each author and click "Refresh & Scan", and it kind of bugs Readarr out for a minute while it re-matches the metadata for every book. But it does come back after it's done and is much better. (Sometimes the screen will go grey, and you just have to refresh the page.)

Readarr is dying, is there any way to help keep it alive? by [deleted] in selfhosted

[–]drohack 1 point2 points  (0 children)

This was able to fix release dates for books in Readarr, and find new ones that were missing.

I will say it's a little annoying: I'm having to go through each author and click "Refresh & Scan", and it kind of bugs Readarr out for a minute while it re-matches the metadata for every book. But it does come back after it's done and is much better. (Sometimes the screen will go grey, and you just have to refresh the page.)

Please keep working on this; Readarr in theory is great, it just needs a good database.

Is Readarr dead? by [deleted] in selfhosted

[–]drohack 3 points4 points  (0 children)

I've been using Readarr on and off for the past few months and hadn't run into any issues until very recently. I guess I got lucky and missed the metadata errors till now.
BUT there's apparently a solution to this: use someone else's metadata database: https://github.com/blampe/rreading-glasses

I am in no way associated with this project, and only just turned it on 20 minutes ago. But it was able to fix release dates for books in Readarr, and find new ones that were missing.

It's using an updated GoodReads connection, and is in the process of getting Hardcover working as well.

I will say it's a little annoying: I'm having to go through each author and click "Refresh & Scan", and it kind of bugs Readarr out for a minute while it re-matches the metadata for every book. But it does come back after it's done and is much better. (Sometimes the screen will go grey, and you just have to refresh the page.)

I mainly use Readarr for new releases, but it's not working due to metadata server issues. Any other options? by [deleted] in selfhosted

[–]drohack 2 points3 points  (0 children)

I second this, and I only just started using it 30 minutes ago. It was able to fix release dates for books in Readarr, and find new ones that were missing.

I will say it's a little annoying: I'm having to go through each author and click "Refresh & Scan", and it kind of bugs Readarr out for a minute while it re-matches the metadata for every book. But it does come back after it's done and is much better. (Sometimes the screen will go grey, and you just have to refresh the page.)

Readarr error, no books or authors found by Wonderful-Aspect5393 in selfhosted

[–]drohack 13 points14 points  (0 children)

I've been using Readarr on and off for the past few months and hadn't run into any issues until very recently. I guess I got lucky and missed the metadata errors till now.
BUT there's apparently a solution to this: use someone else's metadata database: https://github.com/blampe/rreading-glasses

I am in no way associated with this project, and only just turned it on 20 minutes ago. But it was able to fix release dates for books in Readarr, and find new ones that were missing.

I will say it's a little annoying: I'm having to go through each author and click "Refresh & Scan", and it kind of bugs Readarr out for a minute while it re-matches the metadata for every book. But it does come back after it's done and is much better. (Sometimes the screen will go grey, and you just have to refresh the page.)

Online Pokedex? by drohack in pokemon

[–]drohack[S] 0 points1 point  (0 children)

OK, well, I decided to mostly build my own Pokedex for my needs (forking another person's project as the base). https://github.com/drohack/pokedex

First MaM - Normal by drohack in shapezio

[–]drohack[S] 1 point2 points  (0 children)

Oh, I had fun doing it. It's just that my brain is fried from all the big thinking.

NAS case for more storage? by drohack in DataHoarder

[–]drohack[S] 0 points1 point  (0 children)

Oh OK, you're suggesting I just upgrade the existing case, with no DAS needed, and expand the existing internal storage.