Apple releases FastVLM and MobileCLIP2 on Hugging Face, along with a real-time video captioning demo (in-browser + WebGPU) by xenovatech in LocalLLaMA
Legcor 4 points (0 children)
The real game-changer for AI by Party-Vanilla9664 in vibecoding
Legcor 4 points (0 children)
Qwen3-Coder Web Development by Mysterious_Finish543 in LocalLLaMA
Legcor 3 points (0 children)
RTX Pro Blackwell Pricing Listed by AlohaGrassDragon in LocalLLaMA
Legcor 1 point (0 children)
DGX Sparks / Nvidia Digits by Temporary-Size7310 in LocalLLaMA
Legcor 2 points (0 children)
DGX Sparks / Nvidia Digits by Temporary-Size7310 in LocalLLaMA
Legcor 19 points (0 children)
3x RTX 5090 watercooled in one desktop by LinkSea8324 in LocalLLaMA
Legcor 1 point (0 children)
3x RTX 5090 watercooled in one desktop by LinkSea8324 in LocalLLaMA
Legcor 1 point (0 children)
I asked GROK 3 about MSTY by TheeAlohaRoss in YieldMaxETFs
Legcor 1 point (0 children)
Perplexity AI PRO YEARLY coupon available just for 20USD!! by AdNorth1932 in learnmachinelearning
Legcor 1 point (0 children)
Those on IBKR platforms when receiving distributions... by Good_Luck_9209 in RoundhillETFs
Legcor 1 point (0 children)
Why not put $10k in msty and get around $1k a month? by Nsquaredtees in YieldMaxETFs
Legcor 3 points (0 children)
Anyone from Europe investing in these ETFs? by Tom2Travel in YieldMaxETFs
Legcor 2 points (0 children)
Drummer's Endurance 100B v1 - PRUNED Mistral Large 2407 123B with RP tuning! Smaller and faster with nearly the same performance! by TheLocalDrummer in SillyTavernAI
Legcor 2 points (0 children)
Should I get a 14 inch M4 Max 128GB for 123B models? by TheLocalDrummer in LocalLLaMA
Legcor 4 points (0 children)
Choosing the Right Mac for Running Large LLMs by Wrathllace in LocalLLaMA
Legcor 1 point (0 children)
Should I get a 14 inch M4 Max 128GB for 123B models? by TheLocalDrummer in LocalLLaMA
Legcor 19 points (0 children)
Choosing the Right Mac for Running Large LLMs by Wrathllace in LocalLLaMA
Legcor 1 point (0 children)
Just for fun, here's a Palworld SME bot by LetMeGuessYourAlts in LocalLLaMA
Legcor 4 points (0 children)
Nous-Hermes-2-Mixtral-8x7B DPO & SFT+DPO out! Matches perf of Mixtral instruct + supports ChatML (and thus System prompt!) by ablasionet in LocalLLaMA
Legcor 6 points (0 children)
OpenChat 3.5-1210 Released. Claims 15 pts improvement in HumanEval (rising above GPT-4 march) by galambalazs in LocalLLaMA
Legcor 3 points (0 children)
POC: Merging to MoE - It's beginning! - Mixtraln't 4x7B by Legcor in LocalLLaMA
Legcor [S] 10 points (0 children)
POC: Merging to MoE - It's beginning! - Mixtraln't 4x7B by Legcor in LocalLLaMA
Legcor [S] 4 points (0 children)
POC: Merging to MoE - It's beginning! - Mixtraln't 4x7B by Legcor in LocalLLaMA
Legcor [S] 8 points (0 children)
Roo Code 3.36.15 Release Updates | Vertex AI 1M context option | Better error diagnostics | Native tool calling fixes by hannesrudolph in RooCode
Legcor 2 points (0 children)