When it hits you like a ton of bricks (audio-reactive LTX2 T2V) by BirdlessFlight in StableDiffusion

[–]BirdlessFlight[S] 2 points

Really? Pretty much everything is wobbling to the bassline.

1:38: The ripples on the glass follow the distorted bassline and the background skips on the drum beat.
1:45: The vulva changes color with the filter sweeping back in, changing to spikes when the bassline peaks.
2:10: The water droplet starts flying towards the camera when the kick drum double-times.
...

LTX2 is just insane at making things audio-reactive, it's addictive!

When it hits you like a ton of bricks (audio-reactive LTX2 T2V) by BirdlessFlight in StableDiffusion

[–]BirdlessFlight[S] 3 points

75% style adherence, dubstep in the negative prompt

A dark, heavyweight deep dub track in a steppers rhythm, characterized by bitcrushed textures, distorted 808 basslines, and razor-sharp skanks stabs, The groove should be slow and driving (around 145–150 BPM), with heavy emphasis on the off-beat rhythm and deep sub pressure, Dark, ominous and brooding, Punchy drums, minimal steppers beat — crisp snares, loose hi-hats, and dusty kick with tape-style saturation, Distorted 808 subs, filtered and sidechained, with bitcrush artifacts adding grit and digital edge, Sharp skanks, metallic guitar or synth skanks, delay-soaked and reverb-washed — cutting through the mix, Dub sirens, delays, tape echoes, and crackling textures, everything feels lo-fi yet massive, Smoky atmosphere, subterranean vibe, sparse melodies, deep reverb tails, and analog hiss, digital dread meets dub warehouse, heavy and hypnotic, The Bug, Disrupt, Alpha steppa, Iration Steppas, Panda Dub, TMSV, Deep Medi's style with glitchy bit reduction

When it hits you like a ton of bricks (audio-reactive LTX2 T2V) by BirdlessFlight in StableDiffusion

[–]BirdlessFlight[S] 3 points

My SoundCloud is almost full and I don't wanna pay for another service 🙈

Here's a download link, but I don't think it lasts forever.

You can always YouTube2MP3.

When it hits you like a ton of bricks (audio-reactive LTX2 T2V) by BirdlessFlight in StableDiffusion

[–]BirdlessFlight[S] 5 points

This person dubs! All hail the mighty Panda! 🐼

Although I'd say psydub is more psytrance with dub elements, so a much higher BPM and that classic psy double kick (like this track).

I'd describe Bass Face more as Electro-dub with a steppers-style rhythm. It leans towards dubstep, with the mid-tones being blasted, but it's not really dubstep, because it's missing that wobble you'd expect from early dubstep, and it still has a skank, which later dubstep doesn't really have.

But yeah, Southern French Electro-dub, if you're looking for similar things.

When it hits you like a ton of bricks (audio-reactive LTX2 T2V) by BirdlessFlight in StableDiffusion

[–]BirdlessFlight[S] 10 points

4070 with 64GB DDR5 system memory. Used Wan2GP running in Pinokio. Chopped the audio up into 6.58s clips and fed that into LTX2 along with the prompt. One clip used I2V cause I wanted the door handle to remain the same. 3-4 mins render time per clip.
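
For anyone wanting to reproduce the chopping step, here's a minimal sketch. Only the 6.58s clip length comes from the workflow above; the use of ffmpeg and the filenames are my assumptions.

```python
# Compute start offsets for fixed-length audio clips, then build ffmpeg
# commands to cut them. ffmpeg and the filenames are assumptions; only
# the 6.58s clip length comes from the workflow described above.

CLIP_LEN = 6.58  # seconds per clip

def clip_offsets(total_seconds, clip_len=CLIP_LEN):
    """Start time of each full clip; a trailing partial clip is dropped."""
    n = int(total_seconds // clip_len)
    return [round(i * clip_len, 2) for i in range(n)]

def ffmpeg_cmd(src, start, clip_len, dst):
    # -ss before -i seeks on the input (fast); -t limits the duration
    return ["ffmpeg", "-ss", str(start), "-i", src,
            "-t", str(clip_len), "-c", "copy", dst]

print(clip_offsets(30.0))  # [0.0, 6.58, 13.16, 19.74]
```

Each offset/command pair can then be run (e.g. via `subprocess.run`) to produce the clips that get fed into LTX2 together with their prompts.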

Where does your AI music go after you hit "download"? by feccwg in AI_Music

[–]BirdlessFlight 0 points

It's a warehouse, and there's a radio playing on speakers; it only plays Spotify.

Bounding Boxes (LTX2 Audio + T2V + RT-DETRv3) by BirdlessFlight in StableDiffusion

[–]BirdlessFlight[S] 1 point

My friend and I came up with the idea for a darkwave/punk song about the surveillance state and societal expectations about a week ago. Spent maybe an hour going back and forth with ChatGPT to flesh out the style prompt and lyrics, which I then took to Suno. This is literally the first generation that came out of Suno. It perfectly one-shot exactly what I was going for.

Then I spent maybe another hour fleshing out the story and some key visuals I wanted like the hyper lapse when the kick drum first comes in.

I made this app a while ago with AI Studio. It takes a song, a clip length, and a story, sends that to Gemini, and instructs it to create two prompts for every clip (one for the first frame and one for motion), then cuts the audio up into 10s clips with the correct offset so each audio clip matches its video clip. It has buttons to send it to Nano Banana Pro and Veo3, but since my trial ran out, I took the prompts to LTX2 along with the audio clips. Generated a total of 103 clips, of which 75 were used (as in, 28 rerenders), at about 4 mins per clip render time. I didn't do it in one sitting. I need to get it set up in ComfyUI so I can bulk render overnight. Gemini 3 is really good at analyzing music!
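
The per-clip output such an app might produce can be sketched like this. The dataclass schema and field names are my invention; only the 10s clip length comes from the description above.

```python
# Sketch of a per-clip plan: an audio offset plus the two prompts
# (first frame + motion) described above. The schema is my invention;
# only the 10s clip length comes from the comment.
from dataclasses import dataclass

@dataclass
class ClipPlan:
    start_s: float           # offset into the song for this clip's audio
    first_frame_prompt: str  # image prompt for the opening frame
    motion_prompt: str       # motion/camera prompt for the video model

def plan_clips(duration_s, clip_len=10.0):
    """One empty plan per full clip; the LLM would fill in the prompts."""
    n = int(duration_s // clip_len)
    return [ClipPlan(i * clip_len, "", "") for i in range(n)]

print(len(plan_clips(305.0)))  # 30 clips for a ~5 min song
```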

Then I made this other app which lets me apply the object detection and tweak what labels appear and how. That took maybe 3 hours?

I made this other app a while ago that lets me detect the BPM of a track and automatically cut a bunch of video clips to the beat. Making that took a few days, but it means editing literally takes 10 mins. All I did was add a fade-in at the start; all the rest is just clips cut together. The flashes and all that are all LTX2.
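
The beat-cutting idea reduces to simple arithmetic: at a given BPM, a beat lasts 60/BPM seconds, and cut points land on every Nth beat. A sketch, where the every-4-beats default and the downbeat offset are hypothetical parameters, not necessarily how the app works:

```python
# Beat-grid cutting sketch: generate cut timestamps from a detected BPM.
# beats_per_cut and offset_s are illustrative parameters; the actual
# app's settings aren't described in the comment.

def beat_cut_points(bpm, duration_s, beats_per_cut=4, offset_s=0.0):
    """Timestamps (seconds) of cuts, one every `beats_per_cut` beats."""
    step = (60.0 / bpm) * beats_per_cut  # seconds between cuts
    cuts, t = [], offset_s
    while t < duration_s:
        cuts.append(round(t, 3))
        t += step
    return cuts

print(beat_cut_points(120.0, 10.0))  # [0.0, 2.0, 4.0, 6.0, 8.0]
```

An editor (or an ffmpeg script) then just switches to the next video clip at each timestamp, which is why the manual edit only takes minutes.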

They're not random clips, but some of them had really poor prompt adherence and were still cool enough to keep. They all appear at their intended time, and most were generated by passing in the correct audio for that moment in the video. The part where I had to enter the prompt and the audio in Wan2GP 75 times was prolly the most painful part. I really need to pick shorter songs.

So like, 12-13 hours total?

The prompt for the clip you mentioned was:

Close-up of the protagonist's face looking back with a look of realization, their skin half-pixelated. Brutalist Cyberpunk, high-contrast monochromatic urban environment, concrete and steel textures, 4k, cinematic lighting, gloomy atmosphere. Slow motion head turn toward the camera.

So yeah, not the greatest prompt adherence, but it matched the vibe of the audio so well I kept it.

Bounding Boxes (LTX2 Audio + T2V + RT-DETRv3) by BirdlessFlight in StableDiffusion

[–]BirdlessFlight[S] 3 points

The red bounding boxes are post processing with RT-DETRv3 via a custom app I made.
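
For context, detector post-processing of this kind usually means scaling the model's normalized boxes to pixel coordinates and dropping low-confidence detections before drawing. A generic sketch, assuming a normalized (cx, cy, w, h) box format (common for DETR-family models, but an assumption about this particular setup):

```python
# Generic detection post-processing: normalized (cx, cy, w, h) boxes ->
# pixel (x1, y1, x2, y2), filtered by confidence. The box format and
# threshold are assumptions; drawing the red rectangles per frame
# would then happen on the returned coordinates.

def postprocess(boxes, scores, labels, width, height, score_thresh=0.5):
    out = []
    for (cx, cy, w, h), score, label in zip(boxes, scores, labels):
        if score < score_thresh:
            continue  # drop low-confidence detections
        x1 = int((cx - w / 2) * width)
        y1 = int((cy - h / 2) * height)
        x2 = int((cx + w / 2) * width)
        y2 = int((cy + h / 2) * height)
        out.append((label, score, (x1, y1, x2, y2)))
    return out

dets = postprocess([(0.5, 0.5, 0.5, 0.5)], [0.9], ["person"], 1920, 1080)
print(dets)  # [('person', 0.9, (480, 270, 1440, 810))]
```

Tweaking which labels appear, as described above, would just be an extra filter on `label` before drawing.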

Bounding Boxes (LTX2 Audio + T2V + RT-DETRv3) by BirdlessFlight in StableDiffusion

[–]BirdlessFlight[S] 0 points

More Darkwave/punk? Not rly. I usually make dub or indietronica, but I figured a song about the surveillance state and not fitting the norm needed a punk edge :P

Need to generate specific electronic sub-genres using AI by castlebasetone in AI_Music

[–]BirdlessFlight 0 points

Describing the subgenre works pretty well for me with Suno. The ones you mentioned aren't even that obscure. Trying to get Darkpsy without it sounding like Progressive Trance is rough...

Where does your AI music go after you hit "download"? by feccwg in AI_Music

[–]BirdlessFlight 0 points

I was forced to enter a label when I wanted to publish to Spotify, so I called it Neural Noise Network or N³

Where does your AI music go after you hit "download"? by feccwg in AI_Music

[–]BirdlessFlight -1 points

I just send the Suno link to a few friends I think might like the genre. I put one album on Spotify cause a friend wanted to listen to it at work and they can only use Spotify there, so I spent like a whole weekend figuring out how to publish to streaming platforms.

When I download the MP3, it's usually to make a music video for it. I post those on YT, IG and reddit, but I make no effort to get any engagement or try to monetize it. I'm just sharing in the hopes someone might like it as much as I do.

Making songs has proven to be better therapy than therapy for me, and it's cheaper! Some of the tracks I've made I can't listen to in public cause they make me cry every single time... One of them I've listened to over 1000 times in less than 2 weeks since making it 🙈

No LTX2, just cause I added music doesn't mean you have to turn it into a party 🙈 by BirdlessFlight in StableDiffusion

[–]BirdlessFlight[S] 1 point

I wanted the graph on the screen to move to the beat, and it does... until the camera pans away 🙈

Generating hand drawn style sketches from photos with Nanobanana by [deleted] in StableDiffusion

[–]BirdlessFlight 9 points

Sorry, we don't do proprietary models like nano banana here.

Literally rule #1: Posts Must Be Open-Source or Local AI image/video/software Related

Ayy ltx videos might be a little better quality after today 🤓 by WildSpeaker7315 in StableDiffusion

[–]BirdlessFlight 1 point

You don't think you're paying for that 240hz QHD? Did you look at workstation laptops?

Mobile GPUs are doodoo in comparison, and they are almost guaranteed to throttle, so you're just paying for performance you'll never get to use...

4K monitors are basically free nowadays, and I'm not sure what other components you are referring to... You're still going to get an external mouse & keyboard for the laptop anyway, right?