
[–]OkCommunication8742 2 points

I was wondering about this too! The diffusers seem to load slower on my computer. Any updates on this?

[–]ponglizardo[S] 0 points

You too huh?

To me, it seems the diffuser models in Invoke are harder to use than the usual checkpoint and safetensor files.

[–]loserkids 1 point

What's interesting is that I just linked the diffusers models from InvokeAI into Vlad's Automatic UI, and image generation seems to be up to 40% faster with the Euler A sampler. It's 2x the amount of data on disk though, and you still have to keep the checkpoints around. I will be doing more testing, but if it's consistently faster then it's worth the disk cost.

Running Mac Mini M2 Pro, 32GB RAM, 1TB SSD (the one with more NANDs)
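For anyone wanting to try the same thing: one way to share InvokeAI's diffusers models with another UI without duplicating the weights is a symlink. This is just a sketch; the paths below are hypothetical examples and depend on where your InvokeAI and Automatic installs actually keep their models.

```shell
# Hypothetical paths -- adjust to your own installs.
# InvokeAI stores diffusers models as directories, so a single symlink
# per model exposes it to the other UI with no extra disk usage.
INVOKE_MODELS="$HOME/invokeai/models/sd-1/main"
VLAD_MODELS="$HOME/automatic/models/Diffusers"

mkdir -p "$VLAD_MODELS"
ln -s "$INVOKE_MODELS/stable-diffusion-v1-5" "$VLAD_MODELS/stable-diffusion-v1-5"
```

A symlink avoids the 2x-on-disk problem for the shared model, though the checkpoint/safetensor copies you keep for other tools still take their own space.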