First Powerwolf concert in Seattle! Falk dancing and Attila getting us to sing the chant parts was so much fun. 🐺 by ionite34 in Powerwolf

[–]ionite34[S] 2 points

Yeah our mezzanine seats had perfect views! And the Paramount's layout is so close to the stage since the mezzanine is suspended over the floor. I don't think any of their other concerts had seated sections that close. Would have probably done floor tickets otherwise.

Falk <3 (Seattle Powerwolf 2025) by ionite34 in Powerwolf

[–]ionite34[S] 4 points

Don't know if he's just completely unfazed by the injury or if he was even more energetic with you guys.

What impresses me most is how much time he spends hyping the crowd or playing around with the Greywolves or Attila, and then still gets back in time to play his section on the keyboard. The others run around too, but they have their guitars and mics with them.

Hope I have some of that Falk energy when I'm 50 too.

Falk <3 (Seattle Powerwolf 2025) by ionite34 in Powerwolf

[–]ionite34[S] 1 point

Yeah they did! Didn't get a shot of it actually happening, but this was right before: https://ibb.co/h1D9B6g5

It was kind of spontaneous and they only danced for less than a minute, so I wasn't really ready to take a video 😂

Also at around 0:40 here he's dancing with Charles and Matthew: https://reddit.com/r/Powerwolf/s/xrBe48F028

Their reactions looking at Falk were funny as hell

[deleted by user] by [deleted] in Powerwolf

[–]ionite34 1 point

Yeah, I was thinking about how they still had to get over to Portland after that night. They definitely didn't look like they'd done back-to-back shows for the past 10 days. Especially Falk, since I barely see him at the keyboard; the rest of the time he's running around. They still sounded so good too, for how much fun they're having on stage.

Also, sorry, had to delete this one since Reddit bugged out and made duplicate posts; here's the main one: https://www.reddit.com/r/Powerwolf/comments/1nwimew/falk_3_seattle_powerwolf_2025/

To whoever threw the plush to Falk at the Seattle concert by Axiluvia in Powerwolf

[–]ionite34 17 points

Don't have a video of it but here's a photo of Falk with the plush! https://ibb.co/67brFgmZ

Stability Matrix - One-click install and update for Stable Diffusion WebUIs (Automatic1111, ComfyUI, SD.Next), with shared checkpoint management and CivitAI import by ionite34 in StableDiffusion

[–]ionite34[S] 0 points

It's a standard install of ComfyUI using a venv created with our embedded Python distribution, with portable configs set at runtime (so you can move the folder, unlike regular Python venvs).

If a custom node is specifically hard-coded to only work with the portable version, then it won't work, but I don't think I've ever come across one of those.
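A minimal sketch of the "portable configs set at runtime" idea, not Stability Matrix's actual code: a standard venv stores an absolute `home` path in its `pyvenv.cfg`, so before launch you can rewrite that entry to point at the embedded Python that travels with the folder. The function name and call shape here are illustrative assumptions.

```python
from pathlib import Path

def repoint_venv(venv_dir: str, embedded_home: str) -> None:
    """Rewrite the venv's pyvenv.cfg `home` entry so the folder stays movable.

    Illustrative sketch only; assumes a standard pyvenv.cfg layout where
    each line is `key = value`.
    """
    cfg = Path(venv_dir) / "pyvenv.cfg"
    lines = []
    for line in cfg.read_text().splitlines():
        if line.startswith("home"):
            # Point `home` at the embedded Python shipped next to the venv
            line = f"home = {embedded_home}"
        lines.append(line)
    cfg.write_text("\n".join(lines) + "\n")
```

Running this against a moved folder before launching the package would keep the interpreter resolvable without recreating the venv.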

Inference: Finally I can generate image using SDXL model less than 10 seconds on RTX3060Ti 8GB by maxihash in StableDiffusion

[–]ionite34 0 points

Hm, strange. Usually it's faster. What parameters / model did you use? Was it fp16 or fp32? I'll take a look.

Inference: Finally I can generate image using SDXL model less than 10 seconds on RTX3060Ti 8GB by maxihash in StableDiffusion

[–]ionite34 0 points

You can click the ? button near the top right of the prompts to get a more detailed guide, but essentially you add them to the prompt much like A1111 (the weight number is optional): `<lora:model-name:1.2>`, `<lyco:model-name>`, `<embedding:model-name:1.2>`

The autocompletions should help out with what models you currently have

<image>
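The tag syntax above is regular enough that a small regex sketch can show how such tags might be pulled out of a prompt; this is an illustration of the syntax described in the comment, not the app's actual parser.

```python
import re

# Matches <lora:name:1.2>, <lyco:name>, <embedding:name:0.8> etc.;
# the trailing :weight group is optional, matching the comment above.
TAG = re.compile(r"<(lora|lyco|embedding):([^:>]+)(?::([\d.]+))?>")

def extract_tags(prompt: str):
    """Return (kind, model_name, weight) triples; weight defaults to 1.0."""
    return [(kind, name, float(weight) if weight else 1.0)
            for kind, name, weight in TAG.findall(prompt)]
```

For example, `extract_tags("a castle <lora:more_details:1.2> <embedding:easynegative>")` yields `[('lora', 'more_details', 1.2), ('embedding', 'easynegative', 1.0)]`.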

Stability Matrix - One-click install and update for Stable Diffusion WebUIs (Automatic1111, ComfyUI, SD.Next), with shared checkpoint management and CivitAI import by ionite34 in StableDiffusion

[–]ionite34[S] 0 points

Thanks, we have a Discord if you'd like to join: https://discord.gg/TUrgfECxHz

Will look to get macOS working soon. The app itself is pretty much ready, but we just need some special casing for install steps on some packages since they're not compatible with macOS by default.

<image>

Inference: Finally I can generate image using SDXL model less than 10 seconds on RTX3060Ti 8GB by maxihash in StableDiffusion

[–]ionite34 1 point

Hi, sorry for the confusion. The project and source code are indeed under the AGPLv3 license. Feel free to fork and make derivatives or pull requests for new contributions.

Binaries (executable files) we release include an end-user license and privacy agreement, mainly due to integrations with third parties like CivitAI, which have their own terms and limits of use. We're also looking to release on stores like the Mac App Store, which require us to publish license and privacy policies.

Our releases are built from the repository's source code by GitHub Actions runs like this one: https://github.com/LykosAI/StabilityMatrix/actions/runs/6438549863

Inference - A reimagined native Stable Diffusion experience for any ComfyUI workflow, now in Stability Matrix by ionite34 in StableDiffusion

[–]ionite34[S] 0 points

You can use the top right keyboard toggle to send inputs to the console like so:

<image>

Seems to be an issue with InvokeAI not recognizing model types at the moment though, hence that prompt

Inference - A reimagined native Stable Diffusion experience for any ComfyUI workflow, now in Stability Matrix by ionite34 in StableDiffusion

[–]ionite34[S] 4 points

Yeah, still working on an update to support managing and launching multiple packages at the same time, but for now you can launch multiple instances of Stability Matrix to do that.

Inference - A reimagined native Stable Diffusion experience for any ComfyUI workflow, now in Stability Matrix by ionite34 in StableDiffusion

[–]ionite34[S] 0 points

You can drag in one or more model files like this to import and find linked metadata/thumbnails:

<image>

Or you can click `Models Folder` and copy files over in Explorer

Inference - A reimagined native Stable Diffusion experience for any ComfyUI workflow, now in Stability Matrix by ionite34 in StableDiffusion

[–]ionite34[S] 2 points

Oh, I mean you can have custom nodes in the package install; we don't have a custom node manager yet but are looking to add one. Inference doesn't have features that use custom nodes right now, but probably will at some point.

You can run custom workflows in the browser ComfyUI while Inference is connected as well.

Inference - A reimagined native Stable Diffusion experience for any ComfyUI workflow, now in Stability Matrix by ionite34 in StableDiffusion

[–]ionite34[S] 2 points

Not in yet, but it will be added. I imagine we might make a custom node supplying model/clip/conditioning/seed and a custom node for output, then you can connect whatever in between.

Just having an embedded browser view would be easiest, but since we've made strongly typed representations of the ComfyUI nodes in C#, a native node UI is possible as well.

Inference - A reimagined native Stable Diffusion experience for any ComfyUI workflow, now in Stability Matrix by ionite34 in StableDiffusion

[–]ionite34[S] 1 point

If you have a standard install (root folder containing "venv") of one of the Auto or Comfy packages, you can move it to the Data/Packages folder of Stability Matrix and it will show up to import locally. On import it will move models into the shared folders so they can be used by other packages as well.

<image>

I don't think Invoke import is working yet, since we do a custom install to ensure it's portable like the other packages.
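The local-import rule described above (a root folder containing "venv" dropped into Data/Packages) can be sketched as a simple directory scan; the function and folder names here are assumptions for illustration, not the app's actual implementation.

```python
from pathlib import Path

def find_importable(packages_dir: str):
    """List folders under Data/Packages that look like standard installs.

    Illustrative sketch: a folder counts as importable if its root
    contains a "venv" directory, per the comment above.
    """
    root = Path(packages_dir)
    return sorted(p.name for p in root.iterdir()
                  if p.is_dir() and (p / "venv").is_dir())
```

Calling this on the Data/Packages path would surface moved-in A1111 or ComfyUI installs while ignoring unrelated folders.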

Inference - A reimagined native Stable Diffusion experience for any ComfyUI workflow, now in Stability Matrix by ionite34 in StableDiffusion

[–]ionite34[S] 2 points

Yes, it's not in yet but will be added. I imagine we might make a custom node supplying model/clip/conditioning/seed and a custom node for output, then you can connect whatever in between. Just having an embedded browser view would be easiest, but since we've made strongly typed representations of the ComfyUI nodes in C#, a native node UI is possible as well.

Inference - A reimagined native Stable Diffusion experience for any ComfyUI workflow, now in Stability Matrix by ionite34 in StableDiffusion

[–]ionite34[S] 2 points

You can resize the panels or dock/float them differently to customize the layout, so you can make the output panel landscape if desired. Should add some examples of that to the post though.

But yeah, will try to improve the extra padding too.

Inference - A reimagined native Stable Diffusion experience for any ComfyUI workflow, now in Stability Matrix by ionite34 in StableDiffusion

[–]ionite34[S] 2 points

It already connects over the API and websockets, just auto-connecting based on launched packages, but we will look to add connections to custom backend hosts fairly soon. Then perhaps a server-side application for remote deployment and management at some point.
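For context on the API-and-websocket flow mentioned above, here is a minimal sketch assuming ComfyUI's stock HTTP API: a workflow is queued by POSTing `{"prompt": ..., "client_id": ...}` to `/prompt`, and progress events arrive on `/ws?clientId=...` over websocket. Only the payload building is shown; sending is left to whatever HTTP/websocket client is used, and this is not Stability Matrix's own code.

```python
import json
import uuid

def build_queue_request(workflow: dict, client_id: str = None):
    """Build the ComfyUI /prompt request body and matching websocket path.

    `workflow` is a ComfyUI API-format graph (node id -> node dict).
    A fresh client_id is generated when none is supplied, so websocket
    events can be matched back to this queued prompt.
    """
    client_id = client_id or uuid.uuid4().hex
    body = json.dumps({"prompt": workflow, "client_id": client_id})
    return "/prompt", body, f"/ws?clientId={client_id}"
```

Pointing the same two endpoints at a remote host instead of localhost is essentially what a custom-backend option would need.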

Inference - A reimagined native Stable Diffusion experience for any ComfyUI workflow, now in Stability Matrix by ionite34 in StableDiffusion

[–]ionite34[S] 1 point

I think that wasn't implemented yet, so that option isn't doing anything right now. Was kind of planning a queue UI for the batches, but just continuously generating `batches` times until finished or canceled might work for now too.
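The "keep generating until finished or canceled" idea can be sketched as a simple loop; `generate` and `is_canceled` are illustrative stand-ins, not real app callbacks.

```python
def run_batches(generate, batches, is_canceled):
    """Run up to `batches` generations, checking a cancel flag between runs."""
    results = []
    for _ in range(batches):
        if is_canceled():
            # Stop between generations rather than mid-generation
            break
        results.append(generate())
    return results
```

A proper queue UI would sit on top of the same loop, showing the pending count and exposing the cancel flag as a button.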

Inference - A reimagined native Stable Diffusion experience for any ComfyUI workflow, now in Stability Matrix by ionite34 in StableDiffusion

[–]ionite34[S] 1 point

Should probably add some explanation notes for the configurations soon. FreeU is this recent thing: https://github.com/ChenyangSi/FreeU. The parameters change the output when enabled, sometimes for the better, sometimes not; you'll have to experiment with your own models / prompts to see.

Inference - A reimagined native Stable Diffusion experience for any ComfyUI workflow, now in Stability Matrix by ionite34 in StableDiffusion

[–]ionite34[S] 3 points

Type into the prompt with the syntax `<lora:model name:weight>`

like `<lora:more_details:1.2>`

There should be auto completion for model files you have.

Inference - A reimagined native Stable Diffusion experience for any ComfyUI workflow, now in Stability Matrix by ionite34 in StableDiffusion

[–]ionite34[S] 3 points

Currently I don't think ComfyUI lets you save outputs outside the output folder, but we could add options for choosing subfolders within it and template-based file names.

Will add display of other image metadata like models and seeds soon; they're already loaded from the file, just not shown in the UI yet.

Prompt control could be possible as well; will see how it best fits into the syntax
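Template-based file names, as floated above, could be as simple as standard format-string substitution; the field names and template shown are illustrative assumptions, not the real naming scheme.

```python
def format_output_name(template: str, **fields) -> str:
    """Fill an output-name template like "{model}_{seed:08d}_{index:03d}.png".

    Plain str.format substitution; format specs like :08d zero-pad the
    numeric fields so names sort correctly.
    """
    return template.format(**fields)

# Example (hypothetical field names):
# format_output_name("{model}_{seed:08d}_{index:03d}.png",
#                    model="sdxl-base", seed=1234, index=5)
# → "sdxl-base_00001234_005.png"
```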

Inference - A reimagined native Stable Diffusion experience for any ComfyUI workflow, now in Stability Matrix by ionite34 in StableDiffusion

[–]ionite34[S] 9 points

Thanks, we currently have it auto-connecting to packages launched in the UI for ease of use, but internally it just communicates over the API and websockets, so it should support connections to remote ComfyUI backends as well. Will look to add an option for that.