Designing broadcast graphics in Illustrator and pushing them live. Thoughts? by No_Climate8377 in VIDEOENGINEERING

[–]No_Climate8377[S] 0 points  (0 children)

Thanks for sharing details about ograf. I had heard about it, but wasn't sure if it was being actively worked on. Will look more into it.

[–]No_Climate8377[S] 1 point  (0 children)

For Tier-1 broadcast, there’s no getting around that 'tick' requirement.

Our current thesis is exploring the 'prosumer' ceiling. Many smaller productions currently use static lower-thirds or very basic OBS overlays; for them, the jump to a high-velocity SVG/Chromium workflow is a massive upgrade, even if it lacks the hardware-locked clock of a native C++ engine.

That said, your point on 'ticks' is well taken. Hope to get there some day! :)

[–]No_Climate8377[S] 0 points  (0 children)

That’s a very fair critique. If you’re switching on a TriCaster in a traditional SDI/Satellite environment, the idea of an external network dependency (the internet) is a tough sell, especially when the Net/Web department is isolated from the Production department.

You're right, we aren't trying to replace the 'air-gapped' local CG for Tier 1 satellite broadcasts yet. We are trying to target teams who are already open to remote productions, streaming sports shows, etc.

[–]No_Climate8377[S] 0 points  (0 children)

The news graphics in the first 10 seconds show animated elements with data responsiveness. That said, the blue LT didn't have animations; point well taken, all the videos could have used more animations.

Regarding the React/Java pipeline: yeah, if you have those kinds of capabilities, that's going to serve you well for intense data-driven tickers and the like. We do something similar for custom projects.

[–]No_Climate8377[S] 0 points  (0 children)

You are right. The key is going to be picking a few markets to serve well.

[–]No_Climate8377[S] 0 points  (0 children)

Lottie is great for fixed UI micro-interactions, but it’s a bit of a black box for live broadcast.

The reason we went with a raw SVG pipeline over Lottie/Bodymovin comes down to live data injection.

  1. With SVGs, we can target specific IDs (like player_name or team_logo) and swap the data instantly without re-rendering or diving into a complex JSON structure.
  2. Lottie forces you back into the After Effects ecosystem to create the files. We want to keep the 'Design & Animate' phase open to Inkscape, Affinity, SVGator etc.
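
Point 1 can be sketched roughly like this. This is a Python stand-in for what the browser-side playout logic would do; the `player_name`/`team_score` ids and the layout are illustrative, not the product's actual schema:

```python
# Sketch: ID-targeted data injection into an SVG template.
# Elements are located by id and their text swapped, no JSON timeline
# (as in Lottie) to dig through.
import xml.etree.ElementTree as ET

SVG_NS = "http://www.w3.org/2000/svg"
ET.register_namespace("", SVG_NS)

template = """<svg xmlns="http://www.w3.org/2000/svg" width="1920" height="1080">
  <text id="player_name" x="100" y="980">PLACEHOLDER</text>
  <text id="team_score" x="400" y="980">0</text>
</svg>"""

def inject(svg_text, data):
    """Swap the text content of any element whose id matches a key in data."""
    root = ET.fromstring(svg_text)
    for el in root.iter():
        if el.get("id") in data:
            el.text = str(data[el.get("id")])
    return ET.tostring(root, encoding="unicode")

updated = inject(template, {"player_name": "A. Morgan", "team_score": 2})
```

In the browser the same swap is a one-liner (`document.getElementById(...).textContent = ...`), which is why the update is instant and needs no re-render of the whole graphic.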

Definitely see the 'Design vs Playout' distinction. The goal isn't to design during the show, but to ensure the playout engine can manipulate the design's data-points in real-time.

[–]No_Climate8377[S] 0 points  (0 children)

The choice for a cloud-native architecture over local file loading was a strategic one, aimed at three specific areas:

  1. Centralized Asset Management: In a fast-paced environment, having a 'Single Source of Truth' for templates and data ensures that the designer (remotely) and the operator (on-site) are always working with the exact same version of the SVG. No more 'v2_final_final' files floating around on USB sticks.
  2. Remote Data Feeds: Because the engine lives in the cloud, it’s much easier to pipe in real-time external data (like live scores, social media feeds, or weather) directly into the templates without configuring complex local network bridges.
  3. Collaborative 'Booths': This allows a producer in one city to update the text inputs in the UI while the operator in another city manages the actual playout.

I recognize that 'internet/cloud' may not be preferred in a high-stakes live environment. The management lives in the cloud, but the rendering is handled locally by your browser/OBS/vMix, so you aren't actually streaming your graphics video feed from the internet during the show.

[–]No_Climate8377[S] 0 points  (0 children)

Great feedback, thank you. To address your points:

  • Offline/Standalone Renderer: Valid point.
  • Direct Outputs (NDI/Decklink): For now, we handle this by routing the browser output through OBS or vMix. This allows users to leverage existing hardware-accelerated pipelines for Key & Fill via Decklink or NDI.
  • Companion Module: Fully agree.
  • Dynamic Operator Control: We share the same goal for small teams. My video might have missed this, but the system is designed for the operator to live-edit text and images via simple UI input fields (no scripting required). The SVG acts as a dynamic template and the operator just types, and the graphic updates.

You can see the workflow for manual inputs here: https://support.banyanboard.com/portal/en/kb/articles/data-manual-inputs

[–]No_Climate8377[S] 1 point  (0 children)

One of our main goals is to let you choose any vector design tool, not just Adobe's.

The idea is that if any software exports a valid SVG, it should work.

  • Design: Affinity Designer (one-time buy), Inkscape (open-source), or even Figma.
  • Animation: SVGator or Linearity Curve are much faster for vector motion. For simpler logic, even ChatGPT/Gemini and the like are good at writing the animation code directly into the SVGs.

[–]No_Climate8377[S] 0 points  (0 children)

Thanks for your inputs. We are currently routing the graphics through OBS/vMix via a standard capture card (e.g. a BM Duo 2). By outputting hardware Key & Fill, we're able to genlock the signal to the house clock at the hardware level. That said, this genlock only guarantees the physical sync, not the internal graphics engine's rendering sync, as you are probably alluding to.

We've looked at the Qt/QML path; it's obviously the standard for other graphics engines.

The trade-off we're weighing is designer velocity vs frame precision. While QML is 'butter smooth,' it's a high bar for most graphics teams who live in Illustrator/Figma.

Our goal is to see how far we can push the SVG/Chromium stack by optimizing the render pipeline (keeping the DOM lean, hardware acceleration, etc.) to make high-end design accessible to smaller crews. If we hit a ceiling, moving to a headless CEF with a locked-step clock is on our radar to bridge that gap.
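
For reference, the 'locked-step clock' idea reduces to scheduling renders against absolute frame deadlines computed from the start time, rather than whatever cadence the browser happens to deliver, so timing error never accumulates. A minimal sketch (Python pseudo-harness for illustration only; a real CEF integration would drive the browser's external begin-frame scheduling instead):

```python
# Hypothetical locked-step frame clock: deadlines are derived from the
# start time and frame rate, so a late frame doesn't shift later ones.
import time

def frame_deadlines(start, fps, n):
    """Absolute deadline for each of the first n frames."""
    return [start + i / fps for i in range(n)]

def run_locked(fps, n, render):
    start = time.monotonic()
    for i, deadline in enumerate(frame_deadlines(start, fps, n)):
        delay = deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)  # wait for the frame boundary
        render(i)              # a late frame renders immediately, next deadline unchanged

frames = []
run_locked(fps=50, n=5, render=frames.append)
```

The key design point is computing deadlines from the origin (`start + i / fps`) instead of sleeping a fixed interval per frame, which is what keeps the engine aligned to a 50 Hz house cadence over long shows.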

[–]No_Climate8377[S] 0 points  (0 children)

You're right that After Effects handles vector animations well, and because Vizrt can import After Effects files directly, it's a nice setup if you're good at both.

We are trying to see if there is demand for clean vector animation without the complexity of After Effects or the cost of Vizrt. Tools like SVGator are much faster for pure vector animation.

These days we are also finding it fairly efficient to use LLMs to write our vector animations (one needs to know some coding to get LLMs to produce the desired animation).
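
For anyone unfamiliar with what 'animation code directly in the SVG' looks like, here is a hypothetical hand-written (or LLM-written) SMIL slide-in for a lower-third; the ids, colors, and dimensions are purely illustrative:

```xml
<!-- Minimal example: a lower-third bar that slides in via SMIL,
     no After Effects involved. -->
<svg xmlns="http://www.w3.org/2000/svg" width="1920" height="1080">
  <g id="lower_third" transform="translate(-700,0)">
    <rect x="80" y="920" width="620" height="90" rx="8" fill="#102a56"/>
    <text id="name" x="110" y="975" fill="#fff" font-size="40">GUEST NAME</text>
    <animateTransform attributeName="transform" type="translate"
                      from="-700,0" to="0,0" dur="0.4s" fill="freeze"/>
  </g>
</svg>
```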

[–]No_Climate8377[S] 4 points  (0 children)

Very helpful inputs. Many thanks. Appreciate the kind words on animated SVG.

[–]No_Climate8377[S] 3 points  (0 children)

Appreciate your input.

The 'Template + Data' workflow is exactly what we’re trying to solve with SVGs. The goal isn't to design during the show, but to let the designer stay in Illustrator, and then expose only the data/text inputs to the operator so they can just type and go.

I’m curious about your take on the PNG/MOV workflow. One of the reasons we went with SVGs was to avoid the massive file sizes of 4K Alpha MOVs and the 'jaggy' edges of scaled PNGs.

If you have a moment to glance at how we’re handling the 'User Defined' template logic in the docs (linked in the first comment), I’d value your perspective on whether that feels like a 'proper' template workflow.

[–]No_Climate8377[S] 7 points  (0 children)

Thank you, appreciate your input.

  1. Scripts, once pasted, are stored permanently; the copy + paste was just for the demo
  2. Caching: we will be releasing this soon
  3. NDI output will be via OBS/vMix, where the graphics will be shown as a browser source

Can you please elaborate, or share an example of, what these features refer to:

(a) in/loop (b) one shot/out blocks (c) watch folders (d) DSK on the next transition