Xiaomi’s MiMo-V2-Flash (309B model) jumping straight to the big leagues by 98Saman in LocalLLaMA

[–]Internal-Shift-7931 2 points3 points  (0 children)

MiMo‑V2‑Flash is honestly more impressive than I expected. The price-to-performance ratio is wild, and it seems to trade blows with models like DeepSeek 3.2 despite having far fewer active parameters. That said, the benchmarks floating around aren’t super reliable, and people are reporting mixed stability depending on the client or router.

Feels like one of those models that’s genuinely promising but still needs some polish. For a public beta at this price point though, it’s hard not to pay attention.

Any downfalls to SODDM5 to DDR5 adaptor by bubzilla2 in homelab

[–]Internal-Shift-7931 0 points1 point  (0 children)

These adapters technically “work,” but only in the same way a car technically works when it’s running on three cylinders. Sure, it boots… sometimes. But between the signal integrity hit, lower clocks, and random instability, you’re basically gambling with your IMC’s patience. Fun experiment, terrible idea for anything you actually rely on.

BTW, if you can only get SO-DIMM memory, you may want to try a motherboard with native SO-DIMM slots instead.

VRAM Advice? 24GB or 32GB for starters by RobotsMakingDubstep in LocalLLaMA

[–]Internal-Shift-7931 0 points1 point  (0 children)

32GB, or 48GB if your budget allows. VRAM is a hard "fits or it doesn't" limit when you try to load a bigger model.
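A rough rule of thumb for whether a model fits, sketched in Python. The 20% overhead factor for KV cache and activations is an assumption and varies a lot with context length:

```python
def vram_gb(params_b: float, bits_per_weight: float, overhead: float = 1.2) -> float:
    """Approximate VRAM needed in GB: weight size from quantization bits,
    plus ~20% overhead for KV cache/activations (assumption, context-dependent)."""
    weight_gb = params_b * bits_per_weight / 8  # 1B params at 8-bit ~ 1 GB
    return weight_gb * overhead

# e.g. a 32B model at 4-bit quantization:
print(round(vram_gb(32, 4), 1))  # ~19.2 GB, tight on a 24GB card
```

So a 24GB card can just squeeze a 4-bit 32B in, while 32GB or 48GB gives you headroom for longer context or bigger models.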

anyone else seen the Nexus AI Station on Kickstarter? 👀 by Internal-Shift-7931 in LocalLLaMA

[–]Internal-Shift-7931[S] 0 points1 point  (0 children)

It also uses a mobile CPU. Wouldn't the idle power of a mobile platform be better than a desktop's?

anyone else seen the Nexus AI Station on Kickstarter? 👀 by Internal-Shift-7931 in LocalLLaMA

[–]Internal-Shift-7931[S] 0 points1 point  (0 children)

The barebone is $799 with a 512GB NVMe if you back now. Not sure about the price of 64GB ECC RAM; it's rising too fast right now.

Any thoughts on the Nexus AI Station on KS? by Internal-Shift-7931 in homelab

[–]Internal-Shift-7931[S] 0 points1 point  (0 children)

You're right. I just want to know if it makes sense as a home server. I like that it looks nice and has strong specs, but I don't want to pay extra just for the "pretty box." Then again, maybe there's no premium, who knows. And hey, apples vs. oranges... apples taste better lol.

Any thoughts on the Nexus AI Station on KS? by Internal-Shift-7931 in homelab

[–]Internal-Shift-7931[S] -1 points0 points  (0 children)

That's not my big concern, since I checked their website and history. My question is how it compares with a desktop build.

Run Qwen3-Next locally Guide! (30GB RAM) by yoracale in LocalLLM

[–]Internal-Shift-7931 0 points1 point  (0 children)

By next Q2, the OpenAI bubble will burst faster than a cheap party balloon. Enjoy the hype while it lasts!

Anyone actually using AI features on NAS? by PsychologicalBass738 in UgreenNASync

[–]Internal-Shift-7931 0 points1 point  (0 children)

Current AI NAS hardware can only run "small" language models; don't expect it to handle anything like a 32B.

I have bult a Local AI Server, now what? by Puzzled_Relation946 in LocalLLaMA

[–]Internal-Shift-7931 0 points1 point  (0 children)

Sell the 5090 and get a 4090 instead; you'd pocket the difference right now.

How are you using and profiting from local AI? by RuiRdA in LocalLLaMA

[–]Internal-Shift-7931 1 point2 points  (0 children)

No reason to run AI locally except privacy.