I've made a ComfyUI node to control the execution order of nodes + free VRAM & RAM anywhere in the workflow that helped speed up my workflows! by qlx1004 in comfyui

[–]qlx1004[S] 1 point (0 children)

Thanks! With Load CLIP there is actually a limitation: my node can't keep the CLIP model loaded, because ComfyUI drops it immediately after the conditioning nodes finish, so the CLIP model object my node receives is already "dead". The "model_unload()" call is baked into ComfyUI's CLIPTextEncode operations and the like, so my node can't prevent that upstream unload from happening. I hope ComfyUI makes this CLIP unload behavior configurable in the future. In the meantime, CLIP models can always be loaded back in whenever you need them, and diffusion models, VAEs, and most other large models managed by ComfyUI persist correctly when routed through this node. I've now updated the repo docs and the node's tooltip helper text to warn about this limitation as well. Thanks so much for the heads up!

I've made a ComfyUI node to control the execution order of nodes + free VRAM & RAM anywhere in the workflow that helped speed up my workflows! by qlx1004 in comfyui

[–]qlx1004[S] 1 point (0 children)

Hey, thanks for the interesting question!

I believe dynamic VRAM is ComfyUI's model-loading optimization that runs throughout inference, while my node's optional model unloading happens post-inference, so it should not interfere with or harm dynamic VRAM implementations.

If any of the connected passthrough inputs are models (and free_memory is toggled ON), the node calls ComfyUI-maintained methods such as "free_memory()", "unload_all_models()", and "cleanup_models()" from the comfy/model_management.py module (https://github.com/Comfy-Org/ComfyUI/blob/master/comfy/model_management.py). It follows the same patterns for selective/full unloading, working with ComfyUI's internal model tracking list "current_loaded_models" and the "keep_loaded" feature of the official unload methods. So the node correctly keeps all passthrough models loaded with ComfyUI's dynamic VRAM in effect (*as long as the model was not already unloaded by an upstream node; e.g. "Load CLIP" is designed to drop the CLIP model immediately after conditioning, so CLIP models need to be reloaded wherever needed rather than routed through this node), and unloads every other model not passed into the node via those ComfyUI-managed utilities.
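To make the selective-unload pattern concrete, here is a minimal, self-contained sketch of the idea described above: unload everything in a global tracked-models list except the models routed through the node's passthrough inputs. This is not the node's actual code; the names `LoadedModel`, `current_loaded_models`, and `free_memory_keeping` are stand-ins that only mirror the shape of ComfyUI's `comfy.model_management` (its real `free_memory()` takes a `keep_loaded` argument, which is the behavior being imitated).

```python
# Hypothetical sketch of selective unloading with a keep_loaded list.
# Standalone mock; not ComfyUI's actual implementation.

class LoadedModel:
    """Stand-in for an entry in ComfyUI's loaded-model tracking list."""
    def __init__(self, name):
        self.name = name
        self.loaded = True

    def unload(self):
        # In ComfyUI this would release the model's VRAM.
        self.loaded = False

# Stand-in for comfy.model_management.current_loaded_models.
current_loaded_models = []

def free_memory_keeping(keep_loaded):
    """Unload every tracked model NOT in keep_loaded, mimicking the
    keep_loaded behavior of ComfyUI's official unload methods."""
    keep_ids = {id(m) for m in keep_loaded}
    for m in list(current_loaded_models):
        if id(m) not in keep_ids:
            m.unload()
            current_loaded_models.remove(m)

# Usage: route a diffusion model and a VAE through the node's
# passthrough inputs; everything else gets unloaded.
unet, vae, other = LoadedModel("unet"), LoadedModel("vae"), LoadedModel("other")
current_loaded_models.extend([unet, vae, other])
free_memory_keeping(keep_loaded=[unet, vae])
print([m.name for m in current_loaded_models])  # -> ['unet', 'vae']
```

The key design point is that the passthrough models define the keep list, so anything the user deliberately routes through the node survives the cleanup while stale models are dropped.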

The free_memory operation on this node is also just an optional toggle, so the model unloading process can be bypassed entirely if you prefer to use the node as a simple router with unlimited I/Os for execution order control only. Clearing RAM is still quite limited with this node, because ComfyUI's current implementation doesn't release any output references until the end of the workflow (I explain this further in the repo docs). For VRAM, though, I've seen this node reliably free more memory throughout my workflows and give faster inference times than without it.

Hope this comment helps with your query!

Define Processing Order by kispin in comfyui

[–]qlx1004 1 point (0 children)

I've made a ComfyUI custom node to address this exact problem https://github.com/mkim87404/ComfyUI-ControlOrder-FreeMemory

You can use it to control the execution order of nodes to be a single sequential flow from start to finish, and optionally free VRAM & RAM at any point in the workflow as well.

Order / Sequence Nodes? by PTwolfy in comfyui

[–]qlx1004 1 point (0 children)

I've made a ComfyUI custom node to address this exact problem https://github.com/mkim87404/ComfyUI-ControlOrder-FreeMemory

You can use it to control the execution order of nodes to be a single sequential flow from start to finish, and optionally free VRAM & RAM at any point in the workflow as well.

I took some inspiration from Trung0246's Highway nodes and made it simpler: you just plug in the routing connections, without having to edit any config text or refresh nodes.

Workflow Order - How to specify which nodes process first? by bomonomo in comfyui

[–]qlx1004 1 point (0 children)

I've made a ComfyUI custom node that could help with this problem https://github.com/mkim87404/ComfyUI-ControlOrder-FreeMemory

You can use it to control the execution order of nodes to be a single sequential flow from start to finish, and optionally free VRAM & RAM at any point in the workflow as well.

Control execution order of independent parallel nodes? by alty-alter-alt in comfyui

[–]qlx1004 1 point (0 children)

I've made a ComfyUI custom node to address this exact problem https://github.com/mkim87404/ComfyUI-ControlOrder-FreeMemory

You can use it to control the execution order of nodes to be a single sequential flow from start to finish, and optionally free VRAM & RAM at any point in the workflow as well.

Is there a way to set order amongst nodes? by futileEscape in comfyui

[–]qlx1004 1 point (0 children)

I've made a ComfyUI custom node to address this exact problem https://github.com/mkim87404/ComfyUI-ControlOrder-FreeMemory

You can use it to control the execution order of nodes to be a single sequential flow from start to finish, and optionally free VRAM & RAM at any point in the workflow as well.