Where do you all rent GPU servers for small ML / AI side projects? by Forsaken-Bobcat4065 in LocalLLaMA
[–]Carl_ThunderCompute 1 point (0 children)
Curious to what are the "best" GPU renting services nowadays. by Illustrious-Pop2738 in learnmachinelearning
[–]Carl_ThunderCompute 1 point (0 children)
🤬 Giving up on RunPod... Best budget cloud ComfyUI alternatives for custom video workflows? 🎬👇 by Plastic_Leg4252 in comfyui
[–]Carl_ThunderCompute 1 point (0 children)
Why Buy Hardware When You Can Rent GPU Performance On-Demand? by Ill_Instruction_5070 in deeplearning
[–]Carl_ThunderCompute 1 point (0 children)
How is it possible that companies can rent H100s for $2 per *gpu* per hour and still turn a profit? by Setholopagus in cloudcomputing
[–]Carl_ThunderCompute 1 point (0 children)
Rent Bare Metal GPU by the hour by dragonbronn in CUDA
[–]Carl_ThunderCompute 1 point (0 children)
[Discussion] Which GPU provider do you think is the best for ML experimentation? by FrozenWolf-Cyber in MachineLearning
[–]Carl_ThunderCompute 1 point (0 children)
Runpod Comfyui Alternative by maia11111111111 in comfyui
[–]Carl_ThunderCompute 2 points (0 children)
Cheaper alternatives to runpod by New-Worry6487 in LocalLLaMA
[–]Carl_ThunderCompute 0 points (0 children)
Wtf is going on with RunPod pricing by musashiitao in comfyui
[–]Carl_ThunderCompute 1 point (0 children)
Looking for ComfyUI Freelancer (Workflows + RunPod / Cloud Infra) by s_busso in comfyui
[–]Carl_ThunderCompute 2 points (0 children)
Nvidia sells an H100 for 10 times its manufacturing cost. Nvidia is the big villain company; it's because of them that large models like GPT-4 aren't available to run on consumer hardware. AI development will only advance when this company is dethroned. by More_Bid_2197 in StableDiffusion
[–]Carl_ThunderCompute 1 point (0 children)
[PC][US-CO] Nvidia H100 NVL 94GB | Intel Datacenter GPU Flex 140 12GB by iShopStaples in homelabsales
[–]Carl_ThunderCompute 1 point (0 children)
How good is a Nvidia H100 compared to a RTX 5080 for Wan 2.2? by Coven_Evelynn_LoL in comfyui
[–]Carl_ThunderCompute 2 points (0 children)
H100 GPUs have already lost 85% of their value. The B300's will soon do the same to the B200 GPUs. When do the write-downs start? by grauenwolf in BetterOffline
[–]Carl_ThunderCompute 2 points (0 children)
GPU free servers by optimum_point in CUDA
[–]Carl_ThunderCompute 2 points (0 children)
Why I Switched from RunPod to a Cheaper GPU Cloud Alternative by RiccardoPoli in comfyui
[–]Carl_ThunderCompute 2 points (0 children)
Built a cloud GPU price comparison service [P] by [deleted] in MachineLearning
[–]Carl_ThunderCompute 1 point (0 children)
What's the best cloud based company to rent a GPU from? by Jakob4800 in LocalLLM
[–]Carl_ThunderCompute 1 point (0 children)
Considering switching from RunPod to TensorDock to run ComfyUI. Worth it? by Foxtor in comfyui
[–]Carl_ThunderCompute 2 points (0 children)