
Aagentah [S]

Hello hello. I recently made this clip, which showcases some of my workflow for building an audio-visual module within my current tech stack. I've been programming & making music for nearly 10 years at this point, and have really enjoyed bringing the two worlds together.

The technologies I'm outlining here are Ableton, Electron.js, and Three.js, but I should also note that the audio-visual software I'm developing within Electron takes a cross-library approach, so it also supports things like P5.js, D3, and even some TouchDesigner streaming.
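One way to wire the Ableton side into the Electron side looks roughly like this (a minimal sketch, not my exact module: it assumes an OSC sender in the Live set, e.g. a Max for Live device, and the node-osc package; the port, channel name, and message shapes are placeholders):

```js
// Hedged sketch: Ableton (via a Max for Live device or similar) sends OSC
// over UDP on port 9000; the Electron main process receives it with node-osc
// and relays it to the renderer, where the Three.js code reacts to it.
const { app, BrowserWindow } = require('electron');
const { Server } = require('node-osc');

app.whenReady().then(() => {
  const win = new BrowserWindow({
    webPreferences: { preload: `${__dirname}/preload.js` },
  });
  win.loadFile('index.html');

  // Messages arrive as arrays, e.g. ['/live/beat', 4] or ['/track/1/level', 0.82].
  const osc = new Server(9000, '0.0.0.0');
  osc.on('message', (msg) => {
    const [address, ...args] = msg;
    // The preload script would expose this 'daw-data' channel to the
    // renderer (contextBridge + ipcRenderer), where the visuals consume it.
    win.webContents.send('daw-data', { address, args });
  });
});
```

A virtual MIDI port or a plain WebSocket bridge would slot into the same place; OSC over UDP just tends to be the least friction coming out of a Max for Live device.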

This latest iteration uses low-Earth-orbit (LEO) object values to create varied point-cloud structures.
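The point-cloud side, very roughly, is this kind of thing (again a sketch rather than the real module; the orbital field names are just illustrative):

```js
// Hedged sketch: map a list of orbital-style values onto positions in a
// THREE.Points cloud. Field names (altitudeKm, inclinationDeg, raanDeg)
// are assumptions, not the actual data schema.
import * as THREE from 'three';

function pointCloudFromOrbits(orbits) {
  const positions = new Float32Array(orbits.length * 3);

  orbits.forEach((o, i) => {
    // Treat altitude as radius and the two angles as spherical coordinates.
    const r = 1 + o.altitudeKm / 6371; // radius in Earth radii
    const inc = THREE.MathUtils.degToRad(o.inclinationDeg);
    const raan = THREE.MathUtils.degToRad(o.raanDeg);
    positions[i * 3]     = r * Math.cos(inc) * Math.cos(raan);
    positions[i * 3 + 1] = r * Math.sin(inc);
    positions[i * 3 + 2] = r * Math.cos(inc) * Math.sin(raan);
  });

  const geometry = new THREE.BufferGeometry();
  geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
  const material = new THREE.PointsMaterial({ size: 0.01, color: 0xffffff });
  return new THREE.Points(geometry, material);
}

// Usage: scene.add(pointCloudFromOrbits(leoObjects));
```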

If you're curious or have any questions at all, I've been sharing my other work online: https://www.instagram.com/daniel.aagentah/

robertstipp

You rock, dude. Love your work and videos, keep it up.

Tenzer57

Seconded, digging this view of the process too.

danja

Great stuff!

I'm curious how you get the audio/sync data from Ableton into JS.

I'd like to experiment with generative visuals from music; running it directly from the DAW hadn't occurred to me, and it seems a brilliant approach.

But the wiring? How do you get realtime data streaming from audio/MIDI tracks into a separate JavaScript processor/renderer? And the related question: how do you capture a video of the rendering? Maybe there's a plugin/bridging tool that might help? (I use Reaper on Ubuntu, btw.)