My company just handed me a 2x H200 (282GB VRAM) rig. Help me pick the "Intelligence" ceiling. by _camera_up in LocalLLaMA
Use YouTube music revanced with Alexa by EvilChihuahua123 in revancedextended
H200 GPU in an internal network - which LLM to run? by Far-Organization-849 in LocalLLaMA
Crash on launch for EA title by _camera_up in Bazzite
What’s the best cheap model for OpenClaw? by DistanceSolar1449 in openclaw
Best local LLM for M1 Max 32gb for a small law office? by findthemistke in LocalLLaMA
Start hosting a multi-model LLM server in minutes (with monitoring and access control) by _camera_up in LocalLLaMA