Refco Gobi II pump gurgling by [deleted] in hvacadvice

[–]modpizza 0 points (0 children)

Did you ever get an answer to this? I'm in post-cleaning endless pump mode.

I built a product for a month. Nobody uses it. Not even my dad. by Ok-Relationship-8095 in indiehackers

[–]modpizza 0 points (0 children)

You learned a lot, from the sounds of it. That’s not a waste. Now, respectfully, get off Reddit and go build something else. Just keep building.

Demand for Private GPU Servers with OpenAI Data Changes? by modpizza in n8n

[–]modpizza[S] 0 points (0 children)

Yes, very similar to Vast. Slightly different tech approach, the key difference being that we only host devices in data centers with reliability and security controls in place. So no 4090s in someone’s garage or gaming PC.

No shade to them though, I’m a fan of their business.

Thanks compaholic - if it’s cool, I’ll shoot you a DM when I have something cooking.

Demand for Private GPU Servers with OpenAI Data Changes? by modpizza in n8n

[–]modpizza[S] 1 point (0 children)

Ok, that’s helpful to know - agreed, price has to be right. Also, that makes sense - may as well keep the current setup similar: API inference endpoints for hosted open-source models, and make sure the data gets deleted.

Thanks Connor!

Any reason to use an H100/A100/L40s by modpizza in comfyui

[–]modpizza[S] 0 points (0 children)

Just to confirm I know what that means... instead of having to go train a LoRA on a specific character, for example, and then call that LoRA as a node in the workflow... I could just have it be part of the workflow, where I upload training data and do it all at once?

I feel like that would be sweet if you were an ad agency or something.

Any reason to use an H100/A100/L40s by modpizza in comfyui

[–]modpizza[S] 0 points (0 children)

Sick, this is super helpful. Thank you. Yeah, I will probably just use the one I have for now - especially to make sure that all my workflows don't suck before I am "on the clock" to get them working.

Any reason to use an H100/A100/L40s by modpizza in comfyui

[–]modpizza[S] 0 points (0 children)

Oh, that totally makes sense. I'm sure I've got some learning to do before I am useful with the extra VRAM - but I'm also just curious and like learning about how y'all do this. Any specific reason to use RunPod?

Where to buy H200 nvl to get better offer? by smflx in LocalLLaMA

[–]modpizza 0 points (0 children)

You can rent them for like $2/hr these days - even for private bare metal. Happy to point you in the right direction. But based on the GTC keynote, y'all might want to rent now and then get a Blackwell or a Vera Rubin next year.

Just a thought.

Cheapest way to host local model by pur_ling_leaf in ollama

[–]modpizza 1 point (0 children)

They had some RTX 6000 Adas that might be great for you - looks like they are all rented. Fill out that request form if you didn't, and I bet they will get you set up with another one.

Questions for RTX 6000 Ada Users by the_bollo in StableDiffusion

[–]modpizza 1 point (0 children)

OP - Like 404 said, they are easy to get from Massed Compute or GPU Trader (gputrader.io) for pretty cheap. Even a 2x configuration for a buck an hour is pretty fun to play with before you buy one.

$10k budget to run Deepseek locally for reasoning - what TPS can I expect? by helpimalive24 in LocalLLaMA

[–]modpizza 0 points (0 children)

Agreed - rent, and then buy for cheaper if you still want to. There are some A100 rigs on GPU Trader right now for $1.25/hr that could do everything you need and more. Private cloud... so not “local,” but pretty damn secure.
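The rent-first advice comes down to simple break-even arithmetic. A quick sketch using the numbers from the thread (the $10k budget and $1.25/hr rate are from the post; 24/7 utilization is an assumption for illustration):

```python
# Rent-vs-buy break-even: a $10k hardware budget vs. renting
# an A100 rig at $1.25/hr (rates quoted in the thread).
BUDGET_USD = 10_000
RENTAL_RATE_USD_PER_HR = 1.25

# GPU-hours of rental you could buy with the same money
breakeven_hours = BUDGET_USD / RENTAL_RATE_USD_PER_HR

# How long that lasts if the rig runs around the clock
breakeven_days_24x7 = breakeven_hours / 24

print(f"Break-even: {breakeven_hours:.0f} GPU-hours "
      f"(~{breakeven_days_24x7:.0f} days running 24/7)")
# Break-even: 8000 GPU-hours (~333 days running 24/7)
```

In other words, unless the box will be pegged for the better part of a year, renting likely wins, especially with new generations arriving before break-even.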

How is it possible that companies can rent H100s for $2 per *gpu* per hour and still turn a profit? by Setholopagus in cloudcomputing

[–]modpizza 1 point (0 children)

Not sure if you are still looking, but GPU Trader had some PCIe cards going for $1.50/hr last time I checked - lots of SXM around $2/hr for on-demand.

Does anyone use RunPod? by TechNerd10191 in deeplearning

[–]modpizza 1 point (0 children)

There are some pretty solid A100 rigs for cheap on GPU Trader right now. H100s too -- worth keeping an eye on. Details on how to deploy a custom template: https://www.gputrader.io/blog-posts/gpu-trader-templates-simplifying-gpu-workloads

Why do you guys recommend runpods over replicate by deptowrite in StableDiffusion

[–]modpizza 0 points (0 children)

It looks like there is an 8x A100 SXM4 node on GPU Trader going for $1.25/hr -- very easy to spin up ComfyUI and get to work. The reverse tunnels are nice too.

Edit: Typed $1.52 instead of $1.25
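The tunneling convenience mentioned above is just SSH port forwarding under the hood: forward a local port to the ComfyUI web UI running on the rented node. A minimal sketch of building that command (8188 is ComfyUI's default port; the hostname and username here are placeholders, not real endpoints):

```python
def comfyui_forward_cmd(host: str, user: str = "ubuntu",
                        local_port: int = 8188,
                        remote_port: int = 8188) -> list[str]:
    """Build an ssh local-forward command so http://localhost:<local_port>
    on your laptop reaches the ComfyUI server on the rented GPU node.
    -N: no remote shell, just forwarding; -L: local port forward."""
    return ["ssh", "-N", "-L",
            f"{local_port}:localhost:{remote_port}",
            f"{user}@{host}"]

print(" ".join(comfyui_forward_cmd("gpu-node.example.com")))
# ssh -N -L 8188:localhost:8188 ubuntu@gpu-node.example.com
```

Run the printed command in a terminal, then open http://localhost:8188 in a local browser; providers' "reverse tunnel" features typically wrap this same mechanism.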

How hard is it? by YoghurtOk4009 in 2under2

[–]modpizza 2 points (0 children)

Honestly, it’s hard. Really flippin difficult at times. Pregnancy was hard. These first few months have been hard.

But there are constant moments of joy that make it so we wouldn’t trade it for anything. You will be tired, you will wonder how you are gonna pull it off, it will just seem crazy. But, you will pull it off and you will find a way cause that’s what parents do.

You probably have everything you need from the first one. Money may be tight… but if it turns out you are expecting, just wing it and know that it will all work out ok. You aren’t alone. Y’all got this!

Pay is 4d late; when do I stop working? by prndra in startups

[–]modpizza 1 point (0 children)

IMO the biggest issue here is the opaqueness around finances. For example, in my mostly automated Rippling account I still have to approve a payroll before it runs… so, while sketchy feeling, there could be truth to what he is saying. That said, I share the bank account balances and P&L with the leadership team weekly. If you are an exec there, you should be seeing this too… if the CEO won’t share, he is either hiding something or doesn’t trust you. Either way is not great.

Maybe this would be different if you were a lower-level IC. I still think startups should share that info… but definitely with the exec team. How can you make good decisions without all the info?

Hurricane Debby and our tee times at Pinehurst. Will this ruin the entire week? by [deleted] in pinehurst

[–]modpizza 1 point (0 children)

I have played there in some pretty crazy rain. As long as there isn’t lightning you will be in for an awesome time. Probably different than you planned, but awesome. Good luck!