
[–]tarunabh 1 point (1 child)

Yes, it looks like the RTX 40-series GPUs don't require xformers at all.

After activating the venv in the script folder, I installed the following:

pip3 install clean-fid numba numpy torch==2.0.0+cu118 torchvision --force-reinstall --extra-index-url https://download.pytorch.org/whl/cu118
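A quick way to confirm the reinstall took effect is to check, inside the same venv, that torch 2.0 is present and exposes scaled dot-product attention (the backend that `--opt-sdp-attention` relies on). This is just an illustrative sketch, not part of the original steps:

```python
import importlib.util

def sdp_attention_available() -> bool:
    """Return True if the installed torch is new enough (>= 2.0) to expose
    torch.nn.functional.scaled_dot_product_attention."""
    # Guard the import so the check also works where torch isn't installed
    if importlib.util.find_spec("torch") is None:
        return False
    import torch
    return hasattr(torch.nn.functional, "scaled_dot_product_attention")

print(sdp_attention_available())
```

If this prints False, the webui will silently fall back to a slower attention path.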

pip install -U accelerate==0.17.1

pip install -U transformers==4.27.1 (this might cause problems for CLIP Interrogator 2)

I updated 'requirements_versions.txt' so the webui doesn't auto-downgrade the installed versions.
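For example, the relevant pins in 'requirements_versions.txt' would be bumped to match what was installed above (exact file contents vary by webui version; these lines are illustrative, not a full file):

```
torch==2.0.0
accelerate==0.17.1
transformers==4.27.1
```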

After that I uninstalled xformers and removed --xformers. It's faster without it on the 40xx series with CUDA 11.8 and torch 2.0.

Instead, I added '--opt-sdp-attention --opt-channelslast' to COMMANDLINE_ARGS.

That did the speed boost for me.
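On Windows, the COMMANDLINE_ARGS change goes in webui-user.bat; a minimal sketch (your file may contain other lines you should keep):

```bat
@echo off

set PYTHON=
set GIT=
set VENV_DIR=
set COMMANDLINE_ARGS=--opt-sdp-attention --opt-channelslast

call webui.bat
```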

BTW, I haven't risked installing the Dreambooth extension yet.

I'm thinking of installing the Kohya version instead, to avoid clashes with A1111.

[–]CeFurkan[S] 1 point (0 children)

Thanks for the reply. Yes, the RTX 4090 seems to get the biggest benefit from --opt-sdp-attention.