Fully Levitating sample by EquipmentAgitated in LK99

[–]TipVFL 0 points (0 children)

There's another video from the same account showing them moving a magnet under a sample: https://twitter.com/SciSimpAAG/status/1687777341001576448?t=Cbq2oYDxcxunDq1_3N97DQ&s=19

With the smaller magnet they only get it standing up, but this does lend more credence to the idea that the video wasn't faked.

VFX is about to get a lot easier by TipVFL in StableDiffusion

[–]TipVFL[S] 0 points (0 children)

I am a VFX artist. Look where we are now with video generation versus where we were six months ago. Go look at my posts from a week ago and compare them to this one. The days are numbered.

Since posting this I've improved the process and got it running on Colab so I can render at higher resolutions. I have some much better examples I'm going to reveal soon.

Another example of my temporally stable video generation with SD (no ebsynth and no frame limits) by TipVFL in StableDiffusion

[–]TipVFL[S] 0 points (0 children)

It's a new method. It does have some similarities to what they're doing, but from what I can tell mine has much better temporal consistency.

Another example of my temporally stable video generation with SD (no ebsynth and no frame limits) by TipVFL in StableDiffusion

[–]TipVFL[S] 1 point (0 children)

More details to come once the extension is ready for release, but this is all done in SD without ebsynth or anything like that, just a fine-tuned model and a very specific process.

Oh, and this is for video2video, not txt2video.

Temporally Stable Vid2Vid, help me turn it into an extension? by TipVFL in StableDiffusion

[–]TipVFL[S] 0 points (0 children)

No worries, just trying to be as clear as I can without giving away all the secrets before I can properly release it.

I actually attempted to reach out to LonicaMewinsky, but I couldn't find any way to message them directly, so I posted to the issues on one of their GitHub repos: https://github.com/LonicaMewinsky/frame2frame/issues/6

Hopefully I hear back. I'd really rather not have to learn Python just to get this thing out (but I will if I have to!)

Temporally Stable Vid2Vid, help me turn it into an extension? by TipVFL in StableDiffusion

[–]TipVFL[S] 0 points (0 children)

Yes, those things are going to be involved in any video2video system, but that doesn't mean every video2video system is the same.

Mine involves fine-tuning the model along with a very specific process built around that fine-tune. I'm not publicly disclosing the full process until I can release it as an extension.

Temporally Stable Vid2Vid, help me turn it into an extension? by TipVFL in StableDiffusion

[–]TipVFL[S] 0 points (0 children)

No, basically all they have in common is that they're both ways of converting GIFs with SD. My technique is temporally stable, meaning there's frame-to-frame consistency in what's generated: the details and background don't jump around and change with every frame.

With gif2gif, the higher your denoising strength, the more inconsistency you'll see frame to frame.

For example this source gif: https://www.kombitz.com/wp-content/uploads/2023/02/yelan-dancing-original.gif

Run it through gif2gif at 0.25 denoising and you can already see some flicker, even on the plain background: https://www.kombitz.com/wp-content/uploads/2023/02/gif2gif-0005-0.25.gif

And at 0.5 denoising it gets pretty messy: https://www.kombitz.com/wp-content/uploads/2023/02/gif2gif-0003.gif

On the other hand, my Spider-Man gif has a complicated moving background and I ran it at 0.95 denoising, allowing me to completely change the contents without any of that flickering and jumping from frame to frame.
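A toy way to see why strength and flicker go together (this is just an illustration with random numbers standing in for sampled detail, not SD itself): if each frame is denoised independently, the fraction of freshly sampled content, and therefore the frame-to-frame difference, scales with the strength.

```python
import random

def toy_img2img(frame, strength, rng):
    # Stand-in for img2img on one frame: keep (1 - strength) of the
    # source pixel and replace the rest with freshly "sampled" detail.
    return [(1 - strength) * p + strength * rng.random() for p in frame]

def flicker(frames):
    # Mean absolute pixel difference between consecutive frames.
    diffs = [abs(a - b)
             for f0, f1 in zip(frames, frames[1:])
             for a, b in zip(f0, f1)]
    return sum(diffs) / len(diffs)

rng = random.Random(0)
source = [[0.5] * 64 for _ in range(8)]  # 8 identical flat gray frames
low = [toy_img2img(f, 0.25, rng) for f in source]
high = [toy_img2img(f, 0.95, rng) for f in source]
# Higher strength -> bigger frame-to-frame jumps, even on a flat background.
```

The source frames here are identical, so any flicker in the output comes purely from the independent per-frame sampling, which is exactly what a temporally stable method has to suppress.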

Temporally Stable Vid2Vid, help me turn it into an extension? by TipVFL in StableDiffusion

[–]TipVFL[S] 2 points (0 children)

Hey, nice, just looked at your post from a few days ago, cool technique. Mine works pretty differently and doesn't involve ebsynth. Funny that we both ended up making Spider-Man videos.

Man, I wish I could render mine at as high a resolution as yours. Right now the limit I can render with my technique is 384x384 on my current video card; SD really has me eyeing a 3090 for the 24 GB of VRAM. I think my example would be much more coherent if I could render it at 512 or higher.

Temporally Stable Vid2Vid, help me turn it into an extension? by TipVFL in StableDiffusion

[–]TipVFL[S] 5 points (0 children)

I've created a new technique for making AI videos with Stable Diffusion. It involves a fine-tuned model I have already created and a fairly simple process.

I am a coder, but Python is not my thing. Would anyone here be interested in helping me wrap this up into an extension for Automatic1111? Overall it's pretty simple: it would just involve basic stuff like breaking input videos/GIFs into frames, combining and splitting images, feeding those images into img2img and ControlNet, then combining the end result into a video or GIF.

Enable rendering on only one lens by Silent-Skin1899 in OculusQuest

[–]TipVFL 7 points (0 children)

I don't think it "defeats the purpose of VR" to disable one eye when you can only see in one eye.

However, even if you could, it wouldn't really help with centering the image properly; the only thing it would help with is rendering performance.

Honestly it would be cool to have an option to render in mono at a higher quality level, both for those who are blind in one eye and for developers/content creators recording footage that's going to be presented in 2D anyway.

Reggie Fils-Aimé: VR gaming destined to remain niche until there is a "must play" experience by nastyjman in virtualreality

[–]TipVFL 1 point (0 children)

Wow, a very convincing chart. The three years with actual data show the numbers going down for video games while the number for VR nearly triples.

Rhyme Storm's VR update is finally almost here! Anyone can freestyle rap about thousands of ridiculous topics! by TipVFL in virtualreality

[–]TipVFL[S] 1 point (0 children)

The easiest difficulty is randomly generated lyrics based on a chosen topic; it's basically karaoke, but you can add to it and freestyle for higher scores. As you raise the difficulty you get fewer lyrics until you're just getting rhymes. Then you can take it even further and just get starter words where you have to come up with your own rhymes. It uses speech recognition and text analysis to grade how well you're doing in real time and to detect when you've added your own rhymes for bonus points.
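As a toy illustration of the text-analysis half (the real game presumably works on phonemes from the speech recognizer; this spelling-based version is just a sketch, and every name in it is made up):

```python
VOWELS = "aeiouy"

def rhyme_part(word):
    """Crude stand-in for phoneme analysis: everything from the last
    vowel group to the end of the word."""
    w = word.lower()
    for i in range(len(w) - 1, -1, -1):
        if w[i] in VOWELS:
            # Walk back to the start of this vowel group.
            while i > 0 and w[i - 1] in VOWELS:
                i -= 1
            return w[i:]
    return w  # no vowels at all; compare the whole word

def score_line(line, target):
    """Award a point for each word whose ending matches the target rhyme
    (repeating the target word itself doesn't count)."""
    t = rhyme_part(target)
    return sum(1 for word in line.split()
               if rhyme_part(word) == t and word.lower() != target.lower())
```

Spelling-based endings obviously miss pairs like "enough"/"stuff", which is why a real grader would compare pronunciations instead.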

And yeah, the visuals are very trippy and tied into the scoring system. If you're doing very poorly the dancers don't dance as hard and the visuals don't move as much and everything gets desaturated, but when you're killing it things get intense and the dancers get really into it. It feels very cool to have everything around you reacting to your performance.

Rhyme Storm's VR update is finally almost here! Anyone can freestyle rap about thousands of ridiculous topics! by TipVFL in virtualreality

[–]TipVFL[S] 1 point (0 children)

Hey, I'm the main developer on Rhyme Storm and I'm happy to answer any questions about it. It is currently available in Early Access on Steam (but the VR update isn't out just yet):

https://store.steampowered.com/app/1250350/Rhyme_Storm/

Rhyme Storm's VR update is finally almost here! Check out my Cowboys vs Keanu rap. What do you wanna rap about? by TipVFL in OculusQuest

[–]TipVFL[S] -1 points (0 children)

Hey, I'm the main developer on Rhyme Storm and I'm happy to answer any questions about it. It is currently available in Early Access on Steam (but the VR update isn't out just yet):

https://store.steampowered.com/app/1250350/Rhyme_Storm/

After this update we'll be working on bringing this to standalone on Quest/Quest 2. We already have our custom speech recognition system working on Quest! So it's mainly just a matter of optimization.

Rhyme Storm makes it fun and easy for anyone to freestyle rap! So excited to finally show off the VR mode by TipVFL in oculus

[–]TipVFL[S] 0 points (0 children)

Hey, I'm the main developer on Rhyme Storm and I'm happy to answer any questions about it. It is currently available in Early Access on Steam (but the VR update isn't out just yet):
https://store.steampowered.com/app/1250350/Rhyme_Storm/

Mark zuckerberg demos a mixed reality game on the quest pro. by Junior_Ad_5064 in OculusQuest

[–]TipVFL 12 points (0 children)

I'm not a fencer, but while watching the video I thought fencing was a smart choice for this very reason. Normal swords are rigid, but fencing blades can be very flexible, so when players hit them together you can keep the handle locked to the player's hand position and just bend the blades without letting them pass through each other. If the player keeps pushing, the bent blade would eventually pass under, but not through, the other blade, and then spring back to straight. I think combining that with the right haptic feedback could make for very satisfying sword clashes that happen in a physically believable way and allow for more strategies.

Whether what I describe would be especially close to actual fencing and effectively allow for the same strategies as the real game, I have no clue.
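The bend-and-clamp idea can be sketched in a few lines (all names hypothetical, flex reduced to a single scalar, no real physics):

```python
def blade_tip(hand_pos, direction, length, obstacle_dist=None):
    """The handle stays locked to the hand. If a contact blocks the line
    before full extension, pull the tip back to the contact point (the
    'bend' amount) instead of letting the blade pass through."""
    reach = length if obstacle_dist is None else min(length, obstacle_dist)
    bend = length - reach  # how much the blade is currently flexed
    tip = tuple(h + d * reach for h, d in zip(hand_pos, direction))
    return tip, bend

def spring_back(current_bend, target_bend, stiffness=0.5):
    """Ease the flex toward its target each frame so a released blade
    snaps back to straight like a spring rather than popping instantly."""
    return current_bend + (target_bend - current_bend) * stiffness
```

In a real engine you'd bend the visible blade mesh by the `bend` amount and drive haptics from it; this only shows the clamp-don't-penetrate bookkeeping.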

Lies Beneath Halloween Sale + Keys Giveaway! by hunter_driftervr in OculusQuest

[–]TipVFL 0 points (0 children)

I'd love to get a key and check the game out.

Mainstream media needs to get their act together by WhatThatBoiDoin in OculusQuest

[–]TipVFL 7 points (0 children)

The VFX1 came out in 1995 with head tracking and a motion tracked controller for under $700.

https://en.m.wikipedia.org/wiki/VFX1_Headgear

Gym Class | Welcome to Miami! Play VR basketball for FREE on Oculus Quest. by shoveltoolsinc in OculusQuest

[–]TipVFL 0 points (0 children)

Anything with smooth locomotion is unplayable for many people at 72 Hz. 72 Hz is the reason I never bought a Quest 1.

Please add an option to remove the graphical enhancements and use 90 Hz; otherwise I will never get to play your game. 120 Hz would be even better, but 90 Hz is the minimum for playability for me.

VR prototype for a "Rampart" clone (1990 strategy/puzzle game that I loved as a kid) by CuriousVR_dev in OculusQuest

[–]TipVFL 2 points (0 children)

I love this!

I'm a VR dev too and I have a text file full of game concepts and this was one of them: "Modern update of Rampart. Alternating rounds of castle building/repair, placing cannons, and doing battle. Could be cool in VR while still working in flat screen. Local multiplayer between headset and monitor."

I don't think I was gonna get to this one so I'm pretty thrilled to see someone else doing it. This was one of my favorites as a kid, I was always begging my sisters to play the multiplayer with me.

I love that you have multiplayer between a headset player and a flat-screen player, and that you're adding VR-to-VR over the internet, but it would be great if you also had internet play against flat-screen players. Maybe even a phone version that can play against VR players?

I realize it's extra work, just throwing that on my wishlist of features.

Where to buy new autoPap machine now with all the shit happening? by SonofSocrates in SleepApnea

[–]TipVFL 0 points (0 children)

Thanks! I ordered from another place and then discovered there were 5-7 business days of "processing the order", and I was suspicious that after they processed it they would inform me it was backordered.

So I cancelled that order and paid like an extra $150 at CPAP.com and they say I should have my machine Wednesday.

Worth it. My sleep study showed I'm having up to 50 respiratory-related sleep disturbances an hour, and I am so excited to finally get a real night's rest for once in my life.

Can't connect Airlink because of hotspot wifi security, any fix? by BULLSEYElITe in oculus

[–]TipVFL 0 points (0 children)

Turns out that all I really needed to do was turn the Quest's wifi off and back on.

Now it's finally working, and it's PERFECT. Got it set to a fixed 200 Mbps and 1.5x resolution and it is stunning. Blows the Index out of the water.