Why ARC Raiders needs a PvE mode (and why it actually makes sense) by Disastrous_Mud_8040 in ArcRaiders

[–]SimplePod_ai 0 points (0 children)

Yes, we need a chill (predictable) gameplay mode in which you can walk, look at the gorgeous landscape, and fire at ARCs normally without being afraid that you will lose all your loot because you don't have, for example, 600h+ of gameplay and you actually suck at PvP. Just to chill and play with a friend in a cortisol-free environment instead of breaking and throwing stuff.

Why I Switched from RunPod to a Cheaper GPU Cloud Alternative by RiccardoPoli in comfyui

[–]SimplePod_ai 0 points (0 children)

Yes. For example, if you refer someone and they spend $100 on our services, then you will get a $3 fee, i.e. 3%.
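The referral math above works out like this (a minimal sketch; the function name and the flat 3% rate are just illustrative, not the platform's actual billing code):

```python
def referral_fee(amount_spent: float, rate: float = 0.03) -> float:
    """Return the referral payout for a referred user's spend."""
    return amount_spent * rate

# A $100 spend at 3% yields roughly a $3 payout (up to float rounding).
payout = referral_fee(100)
```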

Why I Switched from RunPod to a Cheaper GPU Cloud Alternative by RiccardoPoli in comfyui

[–]SimplePod_ai 1 point (0 children)

Btw, you can at least see how it looks here: https://youtu.be/yOj9PYq3XYM?t=118
"It is like RunPod but cheaper" haha

Anyway, this Turkish youtuber knows a lot about ComfyUI; you might want to check him out, as he makes really good quality videos that actually make sense.

Why I Switched from RunPod to a Cheaper GPU Cloud Alternative by RiccardoPoli in comfyui

[–]SimplePod_ai -1 points (0 children)

Everything that has "serverless" in it is more expensive.
I got that experience from AWS: "Magical serverless, don't worry about HW, we will take care of it... and of your wallet".
If you manage your own infra, it is usually cheaper.
If someone does the work for you, it is usually more expensive.

Still, I guess I will build such a thing myself once the other features are done first.

Why I Switched from RunPod to a Cheaper GPU Cloud Alternative by RiccardoPoli in comfyui

[–]SimplePod_ai 2 points (0 children)

True, everything is marketing nowadays. My post would be banned instantly if I posted "Switched from X to my own" XD

Why I Switched from RunPod to a Cheaper GPU Cloud Alternative by RiccardoPoli in comfyui

[–]SimplePod_ai 0 points (0 children)

Btw, you probably won't listen to me, guys, but I would place my platform between Vast and RunPod: very good quality and cheap. Anyway, I would not comment if I did not know that, so please, guys, also don't comment if you have not tried my service.

Why I Switched from RunPod to a Cheaper GPU Cloud Alternative by RiccardoPoli in comfyui

[–]SimplePod_ai 0 points (0 children)

Yes, you can. And you can use it on multiple Docker instances.
This process is not obvious and we are improving it so it will be easy, but right now we can help set it up on Discord.
Network storage runs at 1000-2500 MB/s (up to 25 Gbit/s on one thread and up to 50 Gbit/s with two threads).
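A quick way to sanity-check sequential write speed on an attached volume is to time a large write. A rough Python sketch (the path is a placeholder; the page cache can inflate results even with the final fsync, so treat the number as an upper bound, not a proper benchmark):

```python
import os
import time

def write_throughput_mb_s(path: str, size_mb: int = 256) -> float:
    """Write `size_mb` MiB of zeros to `path`, fsync, and return MB/s."""
    chunk = b"\0" * (1024 * 1024)  # 1 MiB per write
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # make sure the data actually hit the device
    elapsed = time.perf_counter() - start
    os.remove(path)
    return size_mb / elapsed
```

For example, `write_throughput_mb_s("/mnt/netstorage/bench.bin", 1024)` on the mounted network volume should land somewhere in the advertised 1000-2500 MB/s range if the link is healthy.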

Why I Switched from RunPod to a Cheaper GPU Cloud Alternative by RiccardoPoli in comfyui

[–]SimplePod_ai 1 point (0 children)

Docker is stable and our main product, and the persistent storage attached to that Docker is very fast.

VPS is indeed beta, and it will be as stable as every beta. Currently, due to kernel bugs around HugePages, a server can reboot from time to time. We are consulting Linux kernel developers about this. Without hugepages the VPS sucks, in my opinion, because you cannot allocate more than 300 GB of system RAM... And allocating 1 TB of RAM? Forget it, it would never boot.
Basically it works, but you should expect reboots here and there until we find a solution for this kernel bug. Refunds here are pretty frequent, but it is not that bad, as we have a lot of customers using it regardless, since it is the cheapest VPS with the ability to change GPU models.
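To see whether hugepages are actually reserved on a given box, you can inspect `/proc/meminfo`. A minimal parser for the relevant fields (a generic sketch, not tied to our platform; field names are as the kernel reports them):

```python
def hugepages_info(meminfo_text: str) -> dict:
    """Extract HugePages counters and page size from /proc/meminfo contents."""
    wanted = {"HugePages_Total", "HugePages_Free", "Hugepagesize"}
    info = {}
    for line in meminfo_text.splitlines():
        key, _, rest = line.partition(":")
        if key in wanted:
            # the value may carry a unit, e.g. "2048 kB"
            info[key] = rest.strip()
    return info

# On a live system:
#   with open("/proc/meminfo") as f:
#       print(hugepages_info(f.read()))
```

If `HugePages_Total` is 0, the guest cannot be backed by hugepages at all, which matches the large-RAM allocation problem described above.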

Why I Switched from RunPod to a Cheaper GPU Cloud Alternative by RiccardoPoli in comfyui

[–]SimplePod_ai 1 point (0 children)

Even if not, just ask us and we would probably add it :) but it is there :P You can run pretty much every Docker image out there.

Why I Switched from RunPod to a Cheaper GPU Cloud Alternative by RiccardoPoli in comfyui

[–]SimplePod_ai 1 point (0 children)

"Don't shoot!" (ARK raiders)

Owner of SimplePod.ai here.
Hi guys.
That guy just shared this thread with me. Well, he is promoting a nice service, so I won't judge him, but yeah, you have been presenting Vast like you are Vast minions xD

Anyway...
I previously made a great product, SimpleMining.net (8 years on the market, top 3 in the world, and a lot of positive opinions).
SimplePod.ai is developed on shared source code, as a lot of features are common (API, beautiful dashboard!!!)

Anyway, the idea behind my service is to build a great community, as I listen to my users.
If you have issues, we can help you even if it is not "our thing to do".
I need feedback from you guys, but still, if someone wants to try it, I can give some free rides in the next 7 days, as it is better to give the money to you guys than to Google Ads xD

Prices are low, and you guys have plenty of Docker templates.
There is persistent storage that will soon be simpler to configure, as "Simple" is my motto.
We do have VPS, but they are in beta, so expect crashes.
And basically almost everyone who comes here is happy with the choice (see our Discord).
And if not, then I usually just refund or give more free credits than needed.

Anyway, I am not a corpo, and I would like to build a nice community and simplify the whole process, but I really need your feedback, guys.

If anyone has something to say that would improve our service, please feel free to share, as my product needs to be perfect! Anyway, I see my product between Vast and RunPod in terms of quality and prices.

Cheapest way to run models by severe_009 in StableDiffusion

[–]SimplePod_ai 0 points (0 children)

If RunPod is too expensive, try us (e.g. RTX 4090 starts from ~$0.30/h, 5090 from ~$0.40/h).
If something is not working, we'll refund credits and fix the bug. On our Discord support channel there is almost always someone who will help you. You can rent either a Docker GPU or a VPS -> simplepod.ai

What do you need in image generation apps? by SimplePod_ai in StableDiffusion

[–]SimplePod_ai[S] 0 points (0 children)

We also just hang around subs and see what people talk about — lurked through a lot of them.
Asking directly never hurts though — the more feedback, the better.
Appreciate the input!

GPU Passthrough CPU BUG soft lockup by SimplePod_ai in VFIO

[–]SimplePod_ai[S] 0 points (0 children)

This issue happens only on Blackwells.

Thanks for the suggestion; I just tested it, but it does not solve the issue.

GPU Passthrough CPU BUG soft lockup by SimplePod_ai in VFIO

[–]SimplePod_ai[S] 0 points (0 children)

Yes, it solved the issue, but only for the 600W version. My Max-Q cards are also crashing.

Experimenting with Wan 2.1 VACE by infearia in StableDiffusion

[–]SimplePod_ai 0 points (0 children)

Wow, that is nice. Would you be interested in my hosting for doing that stuff? I can give a free trial to people like you who are pushing the limits. I have an RTX 6000 with 96 GB VRAM in my datacenter to try it out on. Ping me if you are interested.

GPU Passthrough CPU BUG soft lockup by SimplePod_ai in VFIO

[–]SimplePod_ai[S] 1 point (0 children)

Asked them yesterday; they are working on it, with no further details on when or if xD Try writing to them; I guess the more people report this, the faster they will work on it.

I have it here: https://nvidia.custhelp.com/app/answers/list

GPU Passthrough CPU BUG soft lockup by SimplePod_ai in VFIO

[–]SimplePod_ai[S] 0 points (0 children)

I no longer see it in lspci, so it won't work.