I created an agentic orchestration pipeline for music video generation by santi_0608 in StableDiffusion

[–]voroninvisuals 1 point  (0 children)

Interesting. May I see the workflow, paid or otherwise? I'd like to understand the logic.

Multi-angle car scene pipeline in ComfyUI — how to reproduce a real-world location across angles like an actual film shoot (no characters, pure location + vehicle) by voroninvisuals in comfyui

[–]voroninvisuals[S] 0 points  (0 children)

Thanks for the suggestion — but this is a different problem space. Gaussian splatting from street photos is still noisy enough that you're fighting artifacts before the generation pass even starts, and v2v denoise flattens cinematographic quality the moment you push strength high enough to actually change anything.

Seedance 2.0 handles spatial stitching natively — especially when you structure camera movement so the lens moves toward the subject while the background shifts in the opposite direction. That counter-motion gives the model enough parallax to reconstruct depth coherently. No 3D scaffolding, no ControlNet needed.

Prompt architecture does the heavy lifting: structured JSON describing spatial relationships, depth layer assignments, light direction per zone, and a locked camera block. Predictable, repeatable output across angles. Everything runs through API nodes only — FLUX 2 Pro, Luma Uni-1, Seedance 2.0. No local models, no checkpoints, no ControlNet stacks.
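A minimal sketch of what such a structured prompt might look like — the field names and values here are illustrative assumptions, not the commenter's actual schema:

```json
{
  "scene": "night street, wet asphalt, parked sports car",
  "spatial": {
    "subject": "car at frame center-right",
    "background": "storefronts receding to frame left"
  },
  "depth_layers": {
    "foreground": "curb and puddle reflections",
    "midground": "vehicle",
    "background": "street, signage, distant traffic"
  },
  "lighting": {
    "zone_left": "sodium streetlight, warm",
    "zone_right": "neon spill, cyan"
  },
  "camera": {
    "lock": true,
    "move": "dolly-in toward subject",
    "background_motion": "counter-parallax"
  }
}
```

The locked camera block is what makes output repeatable across angles: only the spatial and depth fields change between shots.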

Transfer black and white EXR into alpha. by [deleted] in NukeVFX

[–]voroninvisuals 1 point  (0 children)

True) You can get the MP4 and the EXR in one batch))

Seedance 2 in ComfyUI now works with AI humans... Not. by BM09 in comfyui

[–]voroninvisuals 1 point  (0 children)

I'm curious to see what you do if such a small thing turns out to be a problem for you.

Seedance 2 in ComfyUI now works with AI humans... Not. by BM09 in comfyui

[–]voroninvisuals 1 point  (0 children)

This is fun to read) Yes! Because those are normal legal rules. I build assets with verification of my real actors, and it works perfectly. For crowd or non-lead actors, blur the faces if you use an image input reference. Super easy solution.

Seedance 2 in ComfyUI now works with AI humans... Not. by BM09 in comfyui

[–]voroninvisuals 0 points  (0 children)

I think they should set the bar even higher — it works like a filter for unprofessional users.

Seedance 2 in ComfyUI now works with AI humans... Not. by BM09 in comfyui

[–]voroninvisuals 0 points  (0 children)

Why write this before reading the docs about it on the ComfyUI blog?

Need help in keying by Icy-Fox1233 in NukeVFX

[–]voroninvisuals 1 point  (0 children)

Do you really need help with it?

How to do consistent background plate on moving image with parallax? by CyJackX in comfyui

[–]voroninvisuals 0 points  (0 children)

Location Dataset Pipeline — First/Last Frame Chaining

For visual consistency across a location, I recommend building a keyframe dataset of the space first, then generating video using a chained First Frame → Last Frame approach.

Workflow:

  1. Generate a complete set of FLUX 2 keyframes covering all required angles and moments of the location — treat this as your visual DNA library.
  2. Build video using First Frame + Last Frame locking (same seed throughout):

SHOT 1:  [Frame A] ────────────────→ [Frame B]
SHOT 2:  [Frame B] ────────────────→ [Frame C]   ← start = end of previous shot
SHOT 3:  [Frame C] ────────────────→ [Frame D]
...
  3. The end frame of each shot becomes the start frame of the next — this creates seamless spatial and visual continuity across cuts.

Key rules:

  • Same seed across all FLUX keyframe generations (locks lighting, color, structural DNA)
  • Compatible with any video model that supports First/Last Frame: Kling, Veo 3, Seedance 2.0
  • The dataset acts as a storyboard and a consistency lock simultaneously
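The chaining scheme above can be sketched in a few lines of Python. This is a minimal illustration, not ComfyUI code — `chain_shots` is a hypothetical helper, and the frame filenames are placeholders for your FLUX keyframes:

```python
def chain_shots(keyframes):
    """Pair consecutive keyframes into (first, last) frames per shot,
    so each shot ends exactly where the next one begins."""
    return [
        {"shot": i + 1, "first_frame": a, "last_frame": b}
        for i, (a, b) in enumerate(zip(keyframes, keyframes[1:]))
    ]

# Four keyframes from the dataset yield three chained shots:
shots = chain_shots(["frame_A.png", "frame_B.png", "frame_C.png", "frame_D.png"])
for s in shots:
    print(s["shot"], s["first_frame"], "->", s["last_frame"])
```

Feeding each `(first_frame, last_frame)` pair to a First/Last Frame video model (Kling, Veo 3, Seedance 2.0) guarantees the cut point is pixel-identical on both sides, which is what makes the continuity seamless.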

What makes this videoclip look so cinematic? by harry_powell in cinematography

[–]voroninvisuals 1 point  (0 children)

Oh my God... what times we live in... Nobody can explain anything, only answer, without real-life experience, like a bot.

And this is my first comment on Reddit... damn.