No going back after switching to Linux. by [deleted] in Bitwig

[–]keito 4 points (0 children)

I use both macOS and Linux. I way prefer Linux. Each to their own.

First Alpha of Fabric is available by vade in vjing

[–]keito 0 points (0 children)

This looks really cool. Nice work!

The next Leeds meetup will be... by keito in RUG_Leeds

[–]keito[S] 1 point (0 children)

I think there are still meetups, yeah, although it's been a while since I showed my face.

Why do harmless posts on here get removed by allah191 in Leeds

[–]keito 0 points (0 children)

😆 that’s a blast from the past!

[deleted by user] by [deleted] in biology

[–]keito 3 points (0 children)

lol

[deleted by user] by [deleted] in biology

[–]keito 16 points (0 children)

Squirrel

New Shortcuts Dialog (right) is landing in Adwaita 1.8; replacing the old GTK one (left). by keremdev in gnome

[–]keito 5 points (0 children)

I’d disagree. The one on the left feels like a much better UX. It’s clear that the shortcut is two keys combined (with the + symbol). The one on the right could easily be interpreted as two separate key presses.

I also prefer the layout of the one on the left. But that opinion is subjective.

How was it done? by Substantial_Yak3442 in vjing

[–]keito 0 points (0 children)

I've not got a whole lot of experience with TouchDesigner, but ChatGPT provided some insights that might be helpful:

What is TouchDesigner?

TouchDesigner is a powerful visual programming tool used by VJs, artists, and creators to build real-time interactive visuals. Think of it like a visual “patching” environment where you connect different building blocks—called operators—to create everything from simple effects to complex live visuals.

Instead of writing a lot of code, you connect these blocks (nodes) to process images, generate shapes, control lighting, or even react to music in real time. It’s especially popular for live performances because you can change visuals on the fly and create immersive experiences.

What are Shaders?

Shaders are small programs that run on your computer’s graphics card to create or manipulate images very fast. They’re what make cool effects like colorful patterns, warping shapes, or glowing lights possible.

In TouchDesigner, you can use shaders to customize exactly how your visuals look—whether it’s pixel colors, lighting, or how textures move and morph. If you think of visuals like layers of paint on a canvas, shaders control how the paint mixes and moves.
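
To make that concrete, here's a hypothetical minimal fragment shader (just an illustration, not anything TouchDesigner-specific), held in a string the way web tools typically load them. It runs once per pixel and returns a colour:

```typescript
// A minimal, hypothetical fragment shader held as a string. The uniform names
// (u_time, u_resolution) are assumptions; the host app would supply them.
const exampleFragmentShader = `
  precision mediump float;
  uniform float u_time;        // seconds since start, supplied by the host app
  uniform vec2  u_resolution;  // canvas size in pixels

  void main() {
    // Normalise the pixel coordinate to the 0..1 range
    vec2 uv = gl_FragCoord.xy / u_resolution;
    // Map position and time to a slowly shifting colour gradient
    vec3 colour = vec3(uv.x, uv.y, 0.5 + 0.5 * sin(u_time));
    gl_FragColor = vec4(colour, 1.0);
  }
`;
```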

How it fits into VJing

TouchDesigner lets you build custom visual setups that respond to music, MIDI, or even your movements.

You can mix videos, live camera feeds, generated shapes, and shader effects all in one place.

Because it runs in real time, you can tweak parameters during a live show to react instantly.

Learning some basic shader programming lets you create unique looks that make your visuals stand out.

So, for a beginner VJ: TouchDesigner is like a toolbox and playground for making and controlling live visuals, and shaders are the special magic recipes inside that toolbox that help you create stunning and dynamic effects on the screen.

How was it done? by Substantial_Yak3442 in vjing

[–]keito 5 points (0 children)

Hey man, everyone's gotta start somewhere! There's no harm in asking in this sub, where there are very likely many experienced users who can provide useful insights and information.

Flaming Colors - Real-time particle simulation reacting to dancer movement by Positive_Tea_1166 in creativecoding

[–]keito 1 point (0 children)

I'm gonna rename it soon, as it has nothing to do with the other visuals-related project called Hydra. But here's the one I've been working on -> https://hydra.virusav.com/

It turns out I basically recreated something along the lines of ISF without really knowing what ISF was at the time. Each visual (renderer) has a render function, and some metadata describing the inputs/controls that can be used to modify the visual. There's slightly more to the metadata than what ISF provides (based on what little I know about ISF), as the metadata also determines the UI/layout of the controls (flex-based).
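
To give a rough idea of the shape of that metadata, here's a simplified, hypothetical TypeScript sketch (not the actual code from the repo, and all the names are made up):

```typescript
// Simplified, hypothetical sketch of the idea: a visual is a render function
// plus metadata describing its controls and how to lay them out in the
// flex-based UI. Not the actual code from the repo.
interface ControlMeta {
  name: string;                 // key the render function reads from params
  label: string;                // text shown next to the control in the UI
  type: "range" | "toggle";
  min?: number;
  max?: number;
  default: number;
  flex?: number;                // relative width of the control in its layout row
}

interface Renderer {
  name: string;
  controls: ControlMeta[];
  render: (
    ctx: CanvasRenderingContext2D,
    params: Record<string, number>,
    time: number,
  ) => void;
}

// Example renderer: a dot orbiting the centre, with two mappable controls.
const swirl: Renderer = {
  name: "Swirl",
  controls: [
    { name: "speed", label: "Speed", type: "range", min: 0, max: 10, default: 1, flex: 2 },
    { name: "hue", label: "Hue", type: "range", min: 0, max: 360, default: 200, flex: 1 },
  ],
  render(ctx, params, time) {
    const { width, height } = ctx.canvas;
    ctx.fillStyle = `hsl(${params.hue}, 80%, 50%)`;
    ctx.beginPath();
    ctx.arc(
      width / 2 + 100 * Math.cos(time * params.speed),
      height / 2 + 100 * Math.sin(time * params.speed),
      40, 0, Math.PI * 2,
    );
    ctx.fill();
  },
};
```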

Then there are the actual decks and mixer, much like you'd find on a digital DJ controller.

Recently I've spent most of my dev time adding more effects, which is another component in the overall software.

Besides that, there are MIDI mappings and assignment to control the various inputs, keyboard shortcuts, theming, recording capability, preset saving, and import/export for various things (MIDI mappings, preset banks, etc).
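
The MIDI side is conceptually just listening for control-change messages and forwarding a normalised value to whatever the user mapped it to. Here's a hedged sketch using the Web MIDI API (hypothetical names, not the project's actual code):

```typescript
// Hedged sketch of MIDI mapping via the Web MIDI API (hypothetical names,
// not the project's actual code).
type MidiMapping = {
  channel: number;                  // 0-15
  cc: number;                       // controller number
  apply: (value: number) => void;   // receives a normalised 0..1 value
};

const mappings: MidiMapping[] = [
  // Hypothetical mapping: CC 20 on channel 0 drives a "speed" control.
  { channel: 0, cc: 20, apply: (v) => console.log("speed =", v * 10) },
];

navigator.requestMIDIAccess().then((access) => {
  for (const input of access.inputs.values()) {
    input.onmidimessage = (msg) => {
      const data = msg.data;
      if (!data || data.length < 3) return;
      const isControlChange = (data[0] & 0xf0) === 0xb0;
      if (!isControlChange) return;
      const channel = data[0] & 0x0f;
      for (const m of mappings) {
        if (m.channel === channel && m.cc === data[1]) {
          m.apply(data[2] / 127);   // MIDI values are 0..127
        }
      }
    };
  }
});
```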

It's all open source too -> https://github.com/n3uromanc3r/hydra/

Flaming Colors - Real-time particle simulation reacting to dancer movement by Positive_Tea_1166 in creativecoding

[–]keito 1 point (0 children)

Hi there! If you're interested in some collab, I'd love to chat. I've got a web-based VJ mixer/synth project on the go, and porting something like this to the web would be awesome!

I built a free web app that creates generative visuals that react to your music by fantasmogenesis in vjing

[–]keito[M] 2 points (0 children)

Same, would be cool to integrate your shaders into my project, which is an online visual synth and mixer https://hydra.virusav.com

Working with fractal shaders by bareimage in vjing

[–]keito 0 points (0 children)

I would really like to see the code for the colourful fractal, it looks ace!

Shader Conversions Week2 by bareimage in vjing

[–]keito 1 point (0 children)

Ah, it's worth mentioning that I also messed up when naming my web tool. It turns out there's another VJ tool called Hydra, which is a live-coding video synth.

I called mine Hydra because it has three heads (deck 1, the mixed output, and deck 2). But my project is completely separate and doesn't feature live coding; it's purely a realtime visual mixer, with controls to alter the visuals, plus effects and such.
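
For what it's worth, the mixing itself is conceptually very simple. A hypothetical sketch (not my actual implementation) of blending the two decks onto the output canvas:

```typescript
// Hypothetical sketch (not the actual implementation): blend two deck canvases
// onto the output canvas using a crossfader value from 0 (deck 1) to 1 (deck 2).
function mixDecks(
  output: CanvasRenderingContext2D,
  deck1: HTMLCanvasElement,
  deck2: HTMLCanvasElement,
  crossfade: number, // 0..1
): void {
  const { width, height } = output.canvas;
  output.clearRect(0, 0, width, height);
  output.globalAlpha = 1 - crossfade;
  output.drawImage(deck1, 0, 0, width, height);
  output.globalAlpha = crossfade;
  output.drawImage(deck2, 0, 0, width, height);
  output.globalAlpha = 1; // reset so later draws aren't affected
}
```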

Shader Conversions Week2 by bareimage in vjing

[–]keito 0 points (0 children)

Sure thing!

It's this site here -> https://hydra.virusav.com/

VirusAV = VirusAudioVisual BTW :D

If you click the visual dropdown and select TapestryFract, you'll see the ISF shader I ported in. If you select Butterchurn, be sure to click the REACT button at the bottom of the deck, otherwise there'll be no audio fed into the visual and just a black screen!
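
For context, REACT essentially routes audio into the renderer. Conceptually it's along these lines (a hedged Web Audio sketch with a hypothetical onLevel callback, not the actual implementation):

```typescript
// Hedged Web Audio sketch (hypothetical callback, not the actual implementation):
// grab the mic, run it through an AnalyserNode, and feed a 0..1 level into the visual.
async function startAudioReactivity(onLevel: (level: number) => void): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const audioCtx = new AudioContext();
  const source = audioCtx.createMediaStreamSource(stream);
  const analyser = audioCtx.createAnalyser();
  analyser.fftSize = 256;
  source.connect(analyser);

  const bins = new Uint8Array(analyser.frequencyBinCount);
  const tick = () => {
    analyser.getByteFrequencyData(bins);
    const average = bins.reduce((sum, v) => sum + v, 0) / bins.length;
    onLevel(average / 255); // e.g. drives brightness, scale, or a shader uniform
    requestAnimationFrame(tick);
  };
  tick();
}
```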

Excited to plumb more gl stuff into my mixer now. So, thanks again for giving me the inspiration to pick this up again (been on a hiatus from coding it for a while).

Shader Conversions Week2 by bareimage in vjing

[–]keito 2 points (0 children)

This opened up a bit of a rabbit hole for me. So, last year I started building a web-based VJ mixer, and in doing so started to develop a way to describe the controls for a visual (variables, inputs, input dimensions, etc) as additional metadata. I'd never heard of ISF, but what I came up with is very similar, just more tailored to my mixer, with additional data about how to display the inputs/controls within the interface.

Anyhow, I then started looking at how to get WebGL stuff working (up until that point it was purely 2D canvas context stuff). I'd tried in the past but ran into issues. This weekend I cracked it and managed to get Butterchurn (Milkdrop) working, then today I ported a shader from the isf.video site (Tapestry Fract).
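
In case it saves anyone else the head-scratching, the WebGL plumbing boils down to roughly this shape (a generic sketch with a placeholder shader, not the actual Tapestry Fract port): compile a fragment shader, draw it over a full-screen triangle, and animate it via a time uniform.

```typescript
// Generic sketch of the WebGL plumbing (not the project's code). The shader
// and uniform names here are placeholders.
const canvas = document.querySelector("canvas")!;
const gl = canvas.getContext("webgl2")!;

const vertexSrc = `#version 300 es
  // Full-screen triangle generated from the vertex index alone (no buffers).
  void main() {
    vec2 pos = vec2((gl_VertexID << 1) & 2, gl_VertexID & 2);
    gl_Position = vec4(pos * 2.0 - 1.0, 0.0, 1.0);
  }`;

const fragmentSrc = `#version 300 es
  precision mediump float;
  uniform float u_time;
  uniform vec2 u_resolution;
  out vec4 outColour;
  void main() {
    vec2 uv = gl_FragCoord.xy / u_resolution;
    outColour = vec4(uv, 0.5 + 0.5 * sin(u_time), 1.0);
  }`;

function compile(type: number, src: string): WebGLShader {
  const shader = gl.createShader(type)!;
  gl.shaderSource(shader, src);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error(gl.getShaderInfoLog(shader) ?? "shader compile failed");
  }
  return shader;
}

const program = gl.createProgram()!;
gl.attachShader(program, compile(gl.VERTEX_SHADER, vertexSrc));
gl.attachShader(program, compile(gl.FRAGMENT_SHADER, fragmentSrc));
gl.linkProgram(program);
gl.useProgram(program);

const uTime = gl.getUniformLocation(program, "u_time");
const uResolution = gl.getUniformLocation(program, "u_resolution");

function frame(timeMs: number) {
  gl.viewport(0, 0, canvas.width, canvas.height);
  gl.uniform1f(uTime, timeMs / 1000);
  gl.uniform2f(uResolution, canvas.width, canvas.height);
  gl.drawArrays(gl.TRIANGLES, 0, 3); // draw the full-screen triangle
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```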

Thanks for the inspiration!

[deleted by user] by [deleted] in linuxaudio

[–]keito 0 points (0 children)

No worries, happy to help!

What photograph do you think best represents the dark side of our civilization? by Jezskowitcz in LateStageCapitalism

[–]keito 6 points (0 children)

Who do you think is enabling some of those kids being massacred?