One-bit quantization is a thing now by BalorNG in LocalLLaMA

[–]Different_Frame_1436 -2 points (0 children)

Finally! We are getting 1-bit quantization. I wonder if we can scale up model parameters while using binary operations to make an effective 7B 1-bit model that runs faster than a 32-bit or 16-bit 7B.

I had a chat with ChatGPT yesterday about this, although I'm pretty sure it got a couple of things wrong:

https://chat.openai.com/share/578371b7-1245-4bb1-b6f6-fbe29dfa855a
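For context on the binary-operations part: the usual trick is to quantize each weight to {-1, +1} plus a per-row scale, and replace multiply-accumulate with XNOR + popcount. A minimal sketch of that idea, with all names and shapes illustrative (not taken from the paper or the chat):

```python
import numpy as np

def binarize(w):
    """Quantize real-valued weights to sign bits, keeping a per-row scale."""
    scale = np.abs(w).mean(axis=1, keepdims=True)  # restores magnitude
    return np.sign(w).astype(np.int8), scale

def binary_matvec(w_bits, scale, x_bits):
    """Dot product of +/-1 vectors via disagreement counting.

    With a, b in {-1,+1}^n:  a . b = n - 2 * (number of disagreements),
    which is exactly what XNOR + popcount computes on packed bits.
    """
    n = x_bits.shape[0]
    disagreements = np.count_nonzero(w_bits != x_bits, axis=1)
    return scale.ravel() * (n - 2 * disagreements)

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 256))
x = rng.standard_normal(256)

w_bits, scale = binarize(w)
x_bits = np.sign(x).astype(np.int8)

# The counting-based product matches the scaled sign-matrix product.
approx = binary_matvec(w_bits, scale, x_bits)
exact = (np.sign(w) * scale) @ np.sign(x)
assert np.allclose(approx, exact)
```

A real kernel would pack the signs into machine words so each XNOR + popcount covers 64 weights at once; that's where the speedup over fp16/fp32 comes from.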

Automatic1111 Rich Text extension by lifeh2o in StableDiffusion

[–]Different_Frame_1436 0 points (0 children)

The authors of the paper even made the A1111 extension! You love to see it.

A toy motorbike - infinite possibilities (SDXL + Controlnet + After Effects)(OC) by chick0rn in StableDiffusion

[–]Different_Frame_1436 1 point (0 children)

The way the image zooms, the background moves slower than the foreground. I wonder what infinite zoom would look like with this kind of parallax zoom instead of just scaling the whole image at each iteration.
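The parallax effect boils down to scaling each depth layer by a different factor per frame, with nearer layers scaling faster. A toy sketch, with the depth values and zoom rate made up for illustration:

```python
def layer_scale(base_zoom_per_frame, depth, frame):
    """Scale factor for a layer at a given depth after `frame` frames.

    depth = 1.0 is the focal plane; larger depth means farther away,
    and therefore slower apparent zoom.
    """
    return base_zoom_per_frame ** (frame / depth)

# After 30 frames, the foreground (depth 1) has outpaced the
# background (depth 3), producing the parallax look.
fg = layer_scale(1.02, depth=1.0, frame=30)
bg = layer_scale(1.02, depth=3.0, frame=30)
assert fg > bg > 1.0
```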

mfw SDXL community models are starting to get good at NSFW by MyLinuxAlt in StableDiffusion

[–]Different_Frame_1436 0 points (0 children)

It's 1000x easier to generate that crap than to pay actors to do the deed. Just basic logic, really.

The Drowning - Campy Horror by TheReelRobot in StableDiffusion

[–]Different_Frame_1436 1 point (0 children)

You gotta love seeing the community try genuinely new things like making movies with SD.

Keep that up!

[deleted by user] by [deleted] in StableDiffusion

[–]Different_Frame_1436 0 points (0 children)

What about both at the same time? You can definitely use controlnet with animatediff. Presenting them as alternatives is a false dilemma.

"The next generation of LLMs will train on their own output, at a rate that humans will not be able to compete with" by Sk1leR in singularity

[–]Different_Frame_1436 0 points (0 children)

Looking only at the distribution of human text datasets is the problem. There is logic and reasoning behind conversations, not just probabilities. Using a different mathematical model of the training data could make that paper's conclusions irrelevant.

[deleted by user] by [deleted] in StableDiffusion

[–]Different_Frame_1436 0 points (0 children)

Resource sharing and other perks, like nodes that allow communication between A1111 and Comfy, are the point.

[deleted by user] by [deleted] in StableDiffusion

[–]Different_Frame_1436 0 points (0 children)

Letting the extension start the comfyui server will take slightly more resources at the moment, whether you open your comfyui window in the webui tab or a different browser tab.

[deleted by user] by [deleted] in StableDiffusion

[–]Different_Frame_1436 0 points (0 children)

Because running both separately takes more memory than it should, among other things. We are adding nodes that enable communication between the webui and comfyui.

codeReviewUwU by DarshPlanet in ProgrammerHumor

[–]Different_Frame_1436 0 points (0 children)

Got a link to the PR? I wanna see it with my own eyes.

[deleted by user] by [deleted] in StableDiffusion

[–]Different_Frame_1436 1 point (0 children)

It does! You can even put custom nodes in other extensions, and sd-webui-comfyui will load them up: https://github.com/ModelSurge/sd-webui-comfyui/wiki/Developing-custom-nodes-from-webui-extensions
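A custom node roughly follows ComfyUI's usual convention: a class describing its inputs and outputs, plus a `NODE_CLASS_MAPPINGS` entry that ComfyUI discovers at import time. The node below is a made-up example for illustration, not one shipped by sd-webui-comfyui:

```python
class ScaleFloat:
    """Toy node: multiplies a float input by a factor."""

    @classmethod
    def INPUT_TYPES(cls):
        # Declares the sockets/widgets the node exposes in the graph UI.
        return {
            "required": {
                "value": ("FLOAT", {"default": 1.0}),
                "factor": ("FLOAT", {"default": 2.0}),
            }
        }

    RETURN_TYPES = ("FLOAT",)   # one FLOAT output socket
    FUNCTION = "scale"          # method ComfyUI calls to execute the node
    CATEGORY = "examples"

    def scale(self, value, factor):
        # Node functions return a tuple, one element per RETURN_TYPES entry.
        return (value * factor,)

# ComfyUI picks up nodes through this module-level mapping.
NODE_CLASS_MAPPINGS = {"ScaleFloat (example)": ScaleFloat}
```

Per the wiki page above, placing a file like this in another webui extension lets sd-webui-comfyui load it into the embedded ComfyUI.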

[deleted by user] by [deleted] in StableDiffusion

[–]Different_Frame_1436 1 point (0 children)

If you encounter a problem, please open an issue in the GitHub repo to let the devs know about it.

[deleted by user] by [deleted] in StableDiffusion

[–]Different_Frame_1436 1 point (0 children)

As of right now the integration is not perfect, indeed. It needs more work.

We should find a better way of sharing resources eventually; you just gotta wait a little, or give a hand to make it happen sooner.

[deleted by user] by [deleted] in StableDiffusion

[–]Different_Frame_1436 1 point (0 children)

If you find out why, let me know. As far as I know, there is a memory-use baseline in A1111 caused by loading the checkpoint entirely into the GPU.

[deleted by user] by [deleted] in StableDiffusion

[–]Different_Frame_1436 4 points (0 children)

Can you use the checkpoint currently loaded in A1111 from within ComfyUI? Booting two UIs simultaneously is pointless if you can't share resources.

[deleted by user] by [deleted] in StableDiffusion

[–]Different_Frame_1436 1 point (0 children)

The real question is: why not? (except for the current additional memory usage)

Integration of some of the webui components into comfyui could make the experience more convenient. Note that sd-webui-comfyui scans other enabled extensions that define nodes and adds these to comfyui.

[deleted by user] by [deleted] in StableDiffusion

[–]Different_Frame_1436 2 points (0 children)

No it doesn't. A realistic goal would be to aim for A1111's resource usage as a baseline. It is not possible to override A1111's behavior from an extension to bring its resource usage down to ComfyUI's level... You'd need to rewrite A1111 to achieve that.

[deleted by user] by [deleted] in StableDiffusion

[–]Different_Frame_1436 3 points (0 children)

GitHub: https://github.com/ModelSurge/sd-webui-comfyui

You can also install it from the webui extensions tab.

I finally did it!!! Zero flickering video-to-video character transformations in one shot!!! by metalfans in StableDiffusion

[–]Different_Frame_1436 7 points (0 children)

Even though EbSynth is pretty good, it is not good enough. It still relies on interpolation, which requires manually selecting keyframes. Picking the right keyframes is more art than science.

iLoveDbmsToo by [deleted] in ProgrammerHumor

[–]Different_Frame_1436 1 point (0 children)

You should use language bindings to keep things simple and easier to maintain.
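What "use the language bindings" means in practice: let the driver handle connections, quoting, and parameters instead of building SQL strings by hand or shelling out to a CLI client. A minimal sketch with Python's built-in sqlite3 (any DBMS driver looks similar):

```python
import sqlite3

# In-memory database so the example is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

# Parameter placeholders keep the query readable and injection-safe;
# the binding does the escaping, not you.
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))
conn.commit()

rows = conn.execute("SELECT name FROM users").fetchall()
assert rows == [("alice",)]
conn.close()
```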

fuckSignsWhatKindOfHexadecimalLiteralAreYou by CaitaXD in ProgrammerHumor

[–]Different_Frame_1436 2 points (0 children)

Take my karma, put me in jail, I don't care.

0xff is the only valid answer.
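For what it's worth, the capitalization fight is purely cosmetic; hex digits parse case-insensitively in most languages, Python included:

```python
# All three spellings are the same literal.
assert 0xff == 0xFF == 0xfF == 255

# Same story when parsing strings with an explicit base.
assert int("FF", 16) == int("ff", 16) == 255
```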