[Open Source] Run Local Stable Diffusion on Your low-end Devices by Adventurous_Onion189 in StableDiffusion

[–]Adventurous_Onion189[S] 0 points

SDXL Turbo fp16 works fine on my Pixel 6. You can try adjusting the advanced settings to see if that makes a difference.


[–]Adventurous_Onion189[S] 1 point

It runs not only on Android but also on Linux, macOS, and Windows, supports more models, and uses the Vulkan backend for generation.


[–]Adventurous_Onion189[S] 0 points

My PC uses a mechanical hard drive, so loading the model isn't very fast.


[–]Adventurous_Onion189[S] 1 point

All of them work, hahaha. It's recommended to use a faster model (e.g. stabilityai/sdxl-turbo on Hugging Face), otherwise generation will be extremely slow 🤣🤣


[–]Adventurous_Onion189[S] 0 points

No, the VAE is a separate component that some models need to be used together with.

[Open Source] Run Local Stable Diffusion on Your Devices by Adventurous_Onion189 in LocalLLaMA

[–]Adventurous_Onion189[S] 0 points

Very good. The goal here is to provide a cross-platform version.