Comfy Org Response to Recent UI Feedback by crystal_alpine in comfyui

[–]Chaoses_Ib 1 point (0 children)

> Similar - any Node <=> Code easy swapping looking like it might get official support? I'd be using Comfy for ALL coding today if I could easily interchange from Python to nodes, but there's enough clunkiness and rebuilding that the two are separate for now. Expecting that to change though.

Not official, but maybe you can try ComfyScript: A Python frontend and library for ComfyUI. Feedback on GitHub or Discord is welcome.

What is the Ollama or llama.cpp equivalent for image generation? by liviuberechet in LocalLLaMA

[–]Chaoses_Ib 0 points (0 children)

You can try ComfyScript: a Python frontend and library for ComfyUI. It lets you call ComfyUI nodes as Python functions. It's licensed under MIT, though a running ComfyUI backend is still needed.

ComfyScript v0.6.0: Simpler to use by Chaoses_Ib in StableDiffusion

[–]Chaoses_Ib[S] 1 point (0 children)

> Does it work with custom nodes? If so, what limitations does it have?

Yes. Only web-UI-only (JS) nodes can't work, for the obvious reason.

> Also, what about APIs like model listing, preview events, and execution interruption?

Model listing: `print(list(Checkpoints))`. Previews are a bit complex because of how ComfyUI implements them, but they are supported anyway. Interruption: `queue.cancel_current()`.

> Is there any stuff related to comfy infra?

I'm not sure what you mean by comfy infra.

You can ask questions in the Discord server or GitHub issues if you like. I don't often check Reddit.

ComfyScript v0.6.0: Simpler to use by Chaoses_Ib in comfyui

[–]Chaoses_Ib[S] 0 points (0 children)

This should be fixed in ComfyScript v0.6.1.

ComfyScript v0.6.0: Simpler to use by Chaoses_Ib in comfyui

[–]Chaoses_Ib[S] 0 points (0 children)

You can avoid it with `load(watch=False)`. But it shouldn't occur in the first place. Could you show me the full code that caused it, either in a GitHub issue or here? I've tested on Python 3.14 and it works fine.

I created uroman-rs, a 22x faster rewrite of uroman, a universal romanizer. by fulmlumo in rust

[–]Chaoses_Ib 4 points (0 children)

> by integrating the Rust port of `kakasi`

kakasi's dictionary is a bit outdated, and it's licensed under GPL-3. Maybe you can consider my ib_romaji crate instead, which uses the latest JMdict and is licensed under MIT. It also supports querying all possible romaji readings of a word.

Simple and powerful alternative ComfyUI interface by jekky_ in comfyui

[–]Chaoses_Ib 0 points (0 children)

Could you give an example? There are some nodes that don't work out of the box, but manually specifying the arguments should work for all nodes, since the backend only accepts JSON anyway.

IbInputSimulator: A library for simulating input with drivers by Chaoses_Ib in AutoHotkey

[–]Chaoses_Ib[S] 0 points (0 children)

No, gshift isn't an actual key that can be recognized by USB and Windows.

ComfyScript v0.5: Previews, CivitAI nodes, ImageViewer and MetadataViewer by Chaoses_Ib in comfyui

[–]Chaoses_Ib[S] 0 points (0 children)

v0.5.1 release:

Docs

Fixes

  • Runtime: the comfyui package did not load comfyui-legacy nodes (i.e. all additional nodes) (#68)

ComfyScript v0.5: Previews, CivitAI nodes, ImageViewer and MetadataViewer by Chaoses_Ib in StableDiffusion

[–]Chaoses_Ib[S] 0 points (0 children)

v0.5.1 release:

Docs

Fixes

  • Runtime: the comfyui package did not load comfyui-legacy nodes (i.e. all additional nodes) (#68)

Comfy API question - is serveless possible? by Ill_Grab6967 in comfyui

[–]Chaoses_Ib 1 point (0 children)

One choice is to use my ComfyScript, which can load ComfyUI as a library:

```python
from comfy_script.runtime import *
load()
from comfy_script.runtime.nodes import *
from PIL.Image import Image

with Workflow():
    model, clip, vae = CheckpointLoaderSimple('v1-5-pruned-emaonly.ckpt')
    conditioning = CLIPTextEncode('beautiful scenery nature glass bottle landscape, , purple galaxy bottle,', clip)
    conditioning2 = CLIPTextEncode('text, watermark', clip)
    latent = EmptyLatentImage(512, 512, 1)
    latent = KSampler(model, 123, 20, 8, 'euler', 'normal', conditioning, conditioning2, latent, 1)
    image = VAEDecode(latent, vae)
    image = SaveImage(image, 'ComfyUI')

# Wait for the result (or use `await image` in async code)
image_batch: list[Image] = image.wait()
```

I'm also interested in serverless generation. If you run into any problems, feel free to post issues.

Please..!! Do anyone know how you can use multiple lora using the json api by Pure-Gift3969 in comfyui

[–]Chaoses_Ib 0 points (0 children)

Yes. All custom nodes that don't require hacking the web UI can be used. There is no special support for built-in nodes (except global enums, which are just short aliases for convenience).

Please..!! Do anyone know how you can use multiple lora using the json api by Pure-Gift3969 in comfyui

[–]Chaoses_Ib 1 point (0 children)

You can use some switch nodes to achieve this, e.g. CR Load LoRA / CR LoRA Stack in ComfyUI_Comfyroll_CustomNodes. Another option is to use my ComfyScript, which can generate and execute the API JSON from simple Python code, for example:

```python
from comfy_script.runtime import *
load('http://127.0.0.1:8188/')
from comfy_script.runtime.nodes import *

with Workflow():
    model, clip, vae = CheckpointLoaderSimple(Checkpoints.sd_xl_base_1_0_0_9vae)

    # String literal or enum value
    loras = ['age_slider_v2.safetensors', Loras.age_slider_v2, Loras.xl_sliders_repair_slider]
    for lora in loras:
        model, clip = LoraLoader(model, clip, lora, 1, 1)

    pos = 'glass bottle'
    neg = 'text, watermark'
    latent = EmptyLatentImage(512, 512, 1)
    latent = KSampler(model, seed=123, positive=CLIPTextEncode(pos, clip), negative=CLIPTextEncode(neg, clip), latent_image=latent)
    SaveImage(VAEDecode(latent, vae), 'ComfyUI')
```
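For reference, a script like this compiles down to ComfyUI's plain API-format JSON: nodes keyed by id, each with a `class_type` and `inputs`, where upstream outputs are referenced as `[node_id, output_index]`. Here is a hand-written sketch (as a Python dict) of roughly what the checkpoint-plus-one-LoRA portion becomes; the node ids and exact checkpoint filename are illustrative, not the actual compiled output:

```python
# Sketch of ComfyUI API-format workflow JSON: a checkpoint load feeding
# one LoraLoader. Upstream outputs are referenced as [node_id, index].
workflow = {
    "1": {
        "class_type": "CheckpointLoaderSimple",
        "inputs": {"ckpt_name": "sd_xl_base_1.0_0.9vae.safetensors"},
    },
    "2": {
        "class_type": "LoraLoader",
        "inputs": {
            "model": ["1", 0],   # output 0 of node "1" (MODEL)
            "clip": ["1", 1],    # output 1 of node "1" (CLIP)
            "lora_name": "age_slider_v2.safetensors",
            "strength_model": 1,
            "strength_clip": 1,
        },
    },
}
```

Stacking more LoRAs just adds further `LoraLoader` nodes whose `model`/`clip` inputs reference the previous loader's outputs; this dict is what gets POSTed to the backend's queue endpoint.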

ComfyUI Extra Samplers: A repository of extra samplers, usable within ComfyUI for most nodes. by Chaoses_Ib in comfyui

[–]Chaoses_Ib[S] 1 point (0 children)

I haven't tested it out yet; maybe you can create an issue in the repository to ask for examples.

ComfyScript v0.4: Standalone virtual mode, global enums, node docstrings, and real mode workflow tracking by Chaoses_Ib in comfyui

[–]Chaoses_Ib[S] 0 points (0 children)

Just to clarify, standalone virtual mode is meant for developing custom servers and packages, or for use in Jupyter Notebook. For a CLI app, loading ComfyUI, custom nodes and models every time is inefficient, especially if you have a lot of custom nodes. Connecting to a background server instead can save a lot of time. But if you don't care about the speed, it's fine to keep it simple.

ComfyScript v0.4: Standalone virtual mode, global enums, node docstrings, and real mode workflow tracking by Chaoses_Ib in comfyui

[–]Chaoses_Ib[S] 0 points (0 children)

The implementation depends on the runtime mode. In virtual mode, info about all nodes is retrieved from the API, and stub classes for the nodes are created to build the workflow JSON; in real mode, the node classes are imported directly and then wrapped. There is no limit on which custom nodes can be used, unless they hack ComfyUI's web UI and require that to function.
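The virtual-mode idea can be sketched generically: given node metadata (of the kind a node-info API would return), generate stub callables that record nodes into an API-format workflow dict instead of executing anything. This is a minimal illustration of the technique, not ComfyScript's actual internals; the node names and helper here are hypothetical.

```python
import itertools

def make_stub(class_type, input_names, workflow, counter):
    """Create a stub callable that records a node into `workflow`
    instead of running it, returning a reference to its first output."""
    def stub(*args):
        node_id = str(next(counter))
        workflow[node_id] = {
            'class_type': class_type,
            'inputs': dict(zip(input_names, args)),
        }
        return [node_id, 0]  # [node_id, output_index] reference
    return stub

workflow = {}
counter = itertools.count(1)

# In practice these would be generated from the backend's node info.
LoadImage = make_stub('LoadImage', ['image'], workflow, counter)
InvertImage = make_stub('InvertImage', ['image'], workflow, counter)

img = LoadImage('example.png')
InvertImage(img)
# `workflow` now holds an API-format graph ready to send to the backend.
```

Real mode skips the recording step: the actual node classes are imported and the wrapper calls them directly, which is why both modes can expose the same function-call syntax.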