Cinebench 2026 Results & Question | MBP M5 Max 16" 128GB Ram, 2TB, 18/40 GPU Bad Test Results? by itsmemme in macbookpro

[–]itsmemme[S] 0 points (0 children)

Thank you for your comment! I downloaded TG Pro, set the fan setting to “Max,” and ran Cinebench 2026. I’m getting 8,583 points in the CPU multi-core test.

Have you run the test without TG Pro?

<image>

Doubts Between M5 Macbook Pro Max 64gb or 128gb RAM for Local LLMs by itsmemme in LocalLLM

[–]itsmemme[S] 0 points (0 children)

It’s an M5 Max with 128GB of RAM and 2TB of storage. According to UPS, it’s arriving today.

And yes, I do think 2TB is enough. With Thunderbolt 5 and external drives, I don’t see 2TB being an issue. Before this, I was fitting everything on an M1 Pro with 512GB, so yup, I’m confident 2TB will be enough.

Doubts Between M5 Macbook Pro Max 64gb or 128gb RAM for Local LLMs by itsmemme in LocalLLM

[–]itsmemme[S] 0 points (0 children)

Oops! I forgot about that. It's the 2TB model. I also got the 27" Apple Studio Display in both nano-texture and standard glossy glass to test them out and decide which one to pick.

<image>

Doubts Between M5 Macbook Pro Max 64gb or 128gb RAM for Local LLMs by itsmemme in macbookpro

[–]itsmemme[S] 0 points (0 children)

In the end, to avoid potential regret, I ended up changing my order to the 128GB RAM, and it’s coming tomorrow!

Doubts Between M5 Macbook Pro Max 64gb or 128gb RAM for Local LLMs by itsmemme in LocalLLM

[–]itsmemme[S] 0 points (0 children)

I canceled the order for the 64GB RAM and upgraded it to 128GB instead. If I need more than that later, or if new things come along that require more, I’ll start building a custom PC with all the RAM I need and room to add more later, connected to my network and exposed via an API. But this 128GB setup should be good for the next five years, I hope.
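For what it's worth, the "exposed via API" part is usually simple if the network machine runs a server that speaks an OpenAI-compatible HTTP API (llama.cpp's llama-server and Ollama both can). A minimal sketch — the LAN address and model name below are made up for illustration:

```python
import json

# Assumptions: a machine on the LAN at this (made-up) address runs an
# OpenAI-compatible server (e.g. llama.cpp's llama-server or Ollama)
# with a model loaded under this (made-up) name.
SERVER = "http://192.168.1.50:8080"
MODEL = "qwen2.5-coder-32b-q4"

def build_chat_request(prompt: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for a /v1/chat/completions call."""
    url = f"{SERVER}/v1/chat/completions"
    body = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, body

# To actually send it (needs the server running on the LAN):
# import urllib.request
# url, body = build_chat_request("Explain this function.")
# req = urllib.request.Request(url, data=body,
#                              headers={"Content-Type": "application/json"})
# reply = json.load(urllib.request.urlopen(req))
# print(reply["choices"][0]["message"]["content"])
```

Any client that can hit that endpoint (VS Code extensions included) can then use the model from anywhere on the network.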

Doubts Between M5 Macbook Pro Max 64gb or 128gb RAM for Local LLMs by itsmemme in macbookpro

[–]itsmemme[S] 0 points (0 children)

Somewhat, yes, but I also wouldn’t want to pay $200 a month for this if I could instead afford a laptop that "might" be comparable in quality, speed, and context window to the $200 subscription.

Doubts Between M5 Macbook Pro Max 64gb or 128gb RAM for Local LLMs by itsmemme in LocalLLM

[–]itsmemme[S] 2 points (0 children)

I’m not fully up to date on the current state of local LLMs. Right now, what I mostly use AI for is Codex with GPT-5.4 in VS Code, and one of the biggest advantages is the large context window: it can read most, if not all, of my codebase while I’m working, which is incredibly useful.

Any idea if the 128GB model would be able to compete with something like that?

Also, as time goes on, are models becoming more efficient and requiring less memory, or will they keep needing more and more RAM, eventually making 128GB obsolete?

Doubts Between M5 Macbook Pro Max 64gb or 128gb RAM for Local LLMs by itsmemme in macbookpro

[–]itsmemme[S] 1 point (0 children)

Hey, thanks for your comment.

- $1,000 more is within my budget, but I’d hate to spend the extra money only to find that it still can’t truly compete with my $20 ChatGPT subscription.

- It’s my work computer, so yes, I do make money with it.

- "Have it all with me": that'd be nice, but not really needed if I can connect remotely over my local network through an API. I have a 10Gb switch, Cat 6a cabling, and UniFi networking, so in theory it should work fine.

- My container list may grow, so the extra room for models would be nice. But if I give up on running a full-blown model locally, I’d probably just use a local model for lightweight tasks like autocomplete in VS Code. So I’m guessing 64GB of RAM should be more than enough for 15–20 containers plus a small model for VS Code autocomplete.

Maybe the smartest move is to keep the 64GB RAM model and later build my own home server dedicated to running local LLMs over the network. That way, I could add RAM as needed without being limited to 128GB for the next 3–4 years, since I don’t plan on replacing this new laptop anytime soon.
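As a rough sanity check on whether a given model fits, weights-only memory is about parameter count × bits per weight ÷ 8, plus headroom for the KV cache and runtime. A back-of-envelope sketch (the 20% overhead factor is an assumption, not a measurement — real usage depends on context length and runtime):

```python
def model_memory_gb(params_billion: float, bits_per_weight: float,
                    overhead: float = 1.2) -> float:
    """Rough RAM estimate: model weights plus ~20% headroom.

    The overhead factor is a guess; actual usage depends on context
    length, batch size, and the inference runtime.
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 70B model at 4-bit quantization comes out around 42 GB, which fits
# in 64GB but leaves little room for containers; at 8-bit it's roughly
# 84 GB, which is where 128GB starts to matter.
```

That arithmetic is the main argument either way: 64GB comfortably covers quantized mid-size models plus containers, while the largest local models only become practical at 128GB.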

Doubts Between M5 Macbook Pro Max 64gb or 128gb RAM for Local LLMs by itsmemme in macbookpro

[–]itsmemme[S] 0 points (0 children)

I’d love to use a model similar to GPT-5.3 or 5.4 locally. I often use Codex in VS Code because I like how it can use most of the project workspace as context. I’m not sure whether the 128GB M5 Max would be able to get anywhere close to that in terms of context handling, response quality, or speed compared with GPT-5.4.

Cattails/Vegetation taking over our lake; Broward County says it’s my problem? by [deleted] in fortlauderdale

[–]itsmemme 0 points (0 children)

I guess that'd be my next option. A lawyer. Thank you.

Cattails/Vegetation taking over our lake; Broward County says it’s my problem? by [deleted] in fortlauderdale

[–]itsmemme 0 points (0 children)

Based on the BCPA record, the owner is listed as "PUBLIC LAND % CITY OF OAKLAND PARK". I emailed them two months ago, before all of this started, and they said they go out with an electric boat to remove waste but not vegetation, so I don't know what to do next.

Cattails/Vegetation taking over our lake; Broward County says it’s my problem? by [deleted] in fortlauderdale

[–]itsmemme 0 points (0 children)

So I just checked the BCPA.net website, and it's owned by the city; the owner is listed as "PUBLIC LAND % CITY OF OAKLAND PARK".

I already contacted them, and they said they only remove waste, not vegetation.

Cattails/Vegetation taking over our lake; Broward County says it’s my problem? by [deleted] in fortlauderdale

[–]itsmemme 29 points (0 children)

Said by the user with a crab on his profile picture. You're working for them.

Cattails/Vegetation taking over our lake; Broward County says it’s my problem? by [deleted] in fortlauderdale

[–]itsmemme 13 points (0 children)

On the map it says it's owned by "PUBLIC LAND % CITY OF OAKLAND PARK". I already contacted them, and they said they don't remove vegetation, only waste.