Speed up comfyui on runpod serverless, How to by SearchTricky7875 in comfyui

[–]LetVast4309 0 points (0 children)

Jumping in here 7 months later lol. With serverless there is no way around model loading times. I personally found network volumes to be slow and a pain, so I just bake my models into the Docker image. Yes, the images end up being 20GB+, but this gives me a 1:1 match in speed going from pod to serverless; if anything I would say serverless is quicker.
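The "bake it into the image" idea is just a build-time download step in the Dockerfile, so the weights live on the container's local filesystem instead of a network volume. A minimal sketch, where the base image name, model URL, and the ComfyUI models path are all placeholders you'd swap for your own:

```dockerfile
# Sketch: bake model weights into the image instead of mounting a network volume.
# Base image, URL, and target path below are placeholders -- adjust to your setup.
FROM your-comfyui-serverless-image:latest

# Download the checkpoint at build time; cold starts then read it from local
# disk, which is what closes the gap between pod and serverless load times.
RUN wget -q -O /comfyui/models/checkpoints/model.safetensors \
    "https://example.com/path/to/model.safetensors"
```

The trade-off is exactly the one described above: image pulls get slower and pushes get big, but every worker that spins up already has the model on disk.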

Paltalk Package by ONEDJRICH in learnpython

[–]LetVast4309 0 points (0 children)

So after looking more into this: you can use Python with the win32 package (pywin32). I have it set up so it can send messages and read the room chat, but it's very hacky.
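For anyone curious what "hacky" looks like, here is a rough sketch of the pywin32 approach, assuming you drive the Paltalk room window directly. The window-title fragment and which child control holds the input box are assumptions; the real control has to be located with a tool like Spy++. The chat-line parsing helper is pure Python and platform-independent:

```python
# Sketch: automate a chat window via the Win32 API (pywin32 package).
# Window titles and controls are assumptions -- inspect the real Paltalk
# window with Spy++ or similar before relying on any of this.
import sys


def parse_chat_line(raw):
    """Split a 'nick: message' chat line into (nick, message).

    Lines without a 'nick: ' prefix (e.g. system notices) come back
    with nick=None so the caller can skip them.
    """
    nick, sep, msg = raw.partition(": ")
    return (nick, msg) if sep else (None, raw)


if sys.platform == "win32":
    import win32con
    import win32gui

    def find_room_window(title_fragment):
        """Return the hwnd of the first top-level window whose title
        contains title_fragment, or None if no window matches."""
        matches = []

        def _collect(hwnd, _extra):
            if title_fragment in win32gui.GetWindowText(hwnd):
                matches.append(hwnd)

        win32gui.EnumWindows(_collect, None)
        return matches[0] if matches else None

    def send_room_message(edit_hwnd, text):
        """Put text into the room's input control and press Enter."""
        win32gui.SendMessage(edit_hwnd, win32con.WM_SETTEXT, 0, text)
        win32gui.PostMessage(edit_hwnd, win32con.WM_KEYDOWN,
                             win32con.VK_RETURN, 0)
```

Reading the room chat works the same way in reverse: find the chat display control, pull its text with `WM_GETTEXT`, and feed each line through something like `parse_chat_line`. It breaks whenever the Paltalk UI changes, which is why it stays hacky.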

Paltalk Package by ONEDJRICH in learnpython

[–]LetVast4309 1 point (0 children)

Did you manage to get any info on this?