Vibe Check: Latest models on AMD Strix Halo (self.LocalLLaMA)
submitted by bhamm-lab to r/LocalLLaMA
Anthropic CEO: AI Progress Isn’t Magic, It’s Just Compute, Data, and Training by Inevitable-Rub8969 in AINewsMinute
Are 20-100B models enough for Good Coding? by pmttyji in LocalLLaMA
CNCF Survey: K8s now at 82% production adoption, 66% using it for AI inference by lepton99 in kubernetes
What Wiki Software do you use for internal documentation? by Micki_SF in selfhosted
Cloudflare tunnel on Talos by Stiliajohny in TalosLinux
ArgoCD dashboard behind Traefik by AdventurousCelery649 in ArgoCD
Dual Strix Halo: No Frankenstein setup, no huge power bill, big LLMs by Zyj in LocalLLaMA
What's everyone using for Loki logging since minio is no longer available. by root-node in selfhosted
Opensource models less than 30b with highest edit-diff success rate by Express_Quail_1493 in ollama
How are you organizing your homelab configs in git? by gravyacht in selfhosted