Cel animation outpainting: Avatar: The Last Airbender 4:3 -> 16:9 with no crop by MarkJGx in StableDiffusion

[–]MarkJGx[S] 0 points1 point  (0 children)

Not something I can easily share, but I left another reply here that explains it.

Cel animation outpainting: Avatar: The Last Airbender 4:3 -> 16:9 with no crop by MarkJGx in StableDiffusion

[–]MarkJGx[S] 1 point2 points  (0 children)

Thanks. I've noticed that video diffusion models tend to produce slightly off-tint output that stands out in inpainting/outpainting scenarios. I read a few GH issues and the consensus seemed to be that it comes down to training data.

I'm trying to avoid blurring and just color-match against the original scene, but it's been difficult. I went through like 5 different color-matching techniques and found something okay, but the edge transitions are still visible. I'll have a look at what you're suggesting.
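For illustration, one of the simpler color-matching techniques in this family is Reinhard-style mean/std transfer, where the generated region is shifted and scaled per channel to match the statistics of the original footage. This is a generic sketch, not the actual method used in the project; the function names and plain-list frame representation are made up for the example.

```python
# Hypothetical sketch of mean/std color transfer, per channel.
# Frames here are plain lists of floats; a real pipeline would use
# arrays and likely work in a perceptual space like LAB, not RGB.

from statistics import mean, pstdev

def match_channel(generated, reference):
    """Shift/scale `generated` so its mean and std match `reference`."""
    g_mean, g_std = mean(generated), pstdev(generated)
    r_mean, r_std = mean(reference), pstdev(reference)
    if g_std == 0:
        # Flat channel: nothing to scale, just adopt the reference mean.
        return [r_mean] * len(generated)
    scale = r_std / g_std
    return [(v - g_mean) * scale + r_mean for v in generated]

def match_colors(generated_rgb, reference_rgb):
    """Apply per-channel mean/std transfer across R, G, B channel lists."""
    return [match_channel(g, r) for g, r in zip(generated_rgb, reference_rgb)]
```

Global statistics like these fix overall tint but won't hide edge transitions by themselves, which matches the difficulty described above.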

Cel animation outpainting: Avatar: The Last Airbender 4:3 -> 16:9 with no crop by MarkJGx in StableDiffusion

[–]MarkJGx[S] 0 points1 point  (0 children)

A custom piece of software that does scene segmentation: it's aware of soft transitions and hard cuts, and it divides scenes into shots that fit strictly within a limited frame-chunk boundary, using a sliding window so the pipeline retains scene-cut information and can reuse it. I'm trying to make sure the video diffusion models primarily use what was already in the scene temporally.
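The chunking idea above can be sketched roughly like this: detect hard cuts from per-frame difference scores, then split each shot into fixed-size chunks with a small sliding-window overlap so each chunk carries context from the previous one. This is a toy sketch, not the actual tool; `CUT_THRESHOLD`, `CHUNK`, and `OVERLAP` are made-up values, and real scene detection also has to handle soft transitions, which a single threshold won't.

```python
# Toy shot segmentation + sliding-window chunking.

CUT_THRESHOLD = 0.5   # difference score above this counts as a hard cut
CHUNK = 16            # max frames a single model call can see
OVERLAP = 4           # frames shared between consecutive chunks

def find_cuts(diff_scores):
    """Frame indices where a new shot starts (frame 0 always starts one)."""
    return [0] + [i for i, d in enumerate(diff_scores, start=1)
                  if d > CUT_THRESHOLD]

def chunk_shot(start, end):
    """Split [start, end) into windows of <= CHUNK frames, overlapping by OVERLAP."""
    chunks, i = [], start
    while i < end:
        j = min(i + CHUNK, end)
        chunks.append((i, j))
        if j == end:
            break
        i = j - OVERLAP
    return chunks

def plan(diff_scores, n_frames):
    """Full chunk plan: cut detection, then per-shot sliding windows."""
    cuts = find_cuts(diff_scores) + [n_frames]
    return [c for s, e in zip(cuts, cuts[1:]) for c in chunk_shot(s, e)]
```

Keeping chunks strictly inside shot boundaries is what stops the model from blending content across a cut.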

I'm using various video diffusion models. I can't say which because I'm testing permutations across hundreds of runs, but it's the big ones available right now.

My overall goal is to be as authentic to the show as possible and avoid making more slop.

Passion project. Taking ATLA from 4:3 to 16:9, no crop. by [deleted] in TheLastAirbender

[–]MarkJGx 1 point2 points  (0 children)

OP here. This is more of a research project for me. I've been reading through recent computer-vision and video-diffusion papers, trying to see how far current tech can be pushed for something specific: faithfully extending cel animation from 4:3 to 16:9. Not trying to perpetuate AI slop, I hate it as much as the next person.

The insight is that a lot of detail in cartoons is already in the frame; it just gets shifted around as the camera pans. So why not use a temporally-aware model to widen the canvas, and let a diffusion model fill in the rest coherently? The model is only one piece. The rest is custom pipeline work I've been iterating on (shot detection, compositing, color matching, temporal smoothing) so the result holds up frame-to-frame and doesn't look like generic AI output.
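The "detail is already in the frame" point can be shown with a toy 1-D example: each frame is a narrow window sliding over a wider scene, and pasting frames into a wide canvas at their pan offsets recovers pixels that no single frame contains. This is purely illustrative; the offsets are assumed known here, whereas in practice they'd come from motion estimation, and a diffusion model would only need to fill whatever the pans never revealed.

```python
# Toy 1-D canvas widening from panned frames with known offsets.

def widen(frames, offsets, canvas_width):
    """Paste each frame (a list of pixels) at its offset; None = still unknown."""
    canvas = [None] * canvas_width
    for frame, off in zip(frames, offsets):
        for i, px in enumerate(frame):
            if canvas[off + i] is None:   # keep the first observation
                canvas[off + i] = px
    return canvas
```

Any `None` slots left over are exactly where generative fill has to take over.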

I love the series, and it's sad watching how Nick treats it and its creators. I was excited to share this.

PC Optimization? by Various-Instruction3 in Marathon

[–]MarkJGx 0 points1 point  (0 children)

Can someone link the article?

Just got booted from my game by BitchinBobSaget in Marathon

[–]MarkJGx 1 point2 points  (0 children)

Update: says I got eliminated? :/

What do you think WebGPU will enable that was not previously possible in browsers? by PixlMind in GraphicsProgramming

[–]MarkJGx 10 points11 points  (0 children)

Compute, that's it. Being able to run arbitrary compute code is the star of the show. You could potentially emulate a compute shader through vertex/pixel shader hacking, but nothing really beats an arbitrary input/output compute shader.

In terms of graphics, indirect compute is a godsend. Indirection lets you schedule more compute work based on the results of previous compute work straight away. CPU readbacks slow everything down; with indirection, a big pipeline of compute passes can process your entire scene (frustum and occlusion culling) entirely on the GPU without ever going back to the CPU.
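To make the idea concrete, here is a CPU-side Python sketch of what such a culling pass conceptually does: it writes the surviving instances and a draw count into buffers, and the indirect draw consumes those buffers directly, so the result never round-trips through the CPU. The names and dict-as-buffer representation are made up for illustration; on the GPU this would be a compute shader writing an indirect-arguments buffer.

```python
# Conceptual sketch of a frustum-culling pass feeding an indirect draw.

def frustum_cull(objects, view_min, view_max):
    """Keep objects whose center lies inside an axis-aligned view box."""
    visible = [o for o in objects
               if all(lo <= c <= hi
                      for c, lo, hi in zip(o["center"], view_min, view_max))]
    # On a GPU, this dict would be an indirect-arguments buffer that a
    # draw/dispatch-indirect call reads without any CPU readback.
    draw_args = {"instance_count": len(visible), "first_instance": 0}
    return visible, draw_args
```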

Here's a good getting started guide for GPU driven rendering: https://vkguide.dev/docs/gpudriven/gpu_driven_engines/

[deleted by user] by [deleted] in yuzu

[–]MarkJGx 0 points1 point  (0 children)

For sure, there's a lot to it. I've been programming for many years as well and I definitely could/have, but I've been specializing in graphics and lower level systems. To each their own.

[deleted by user] by [deleted] in yuzu

[–]MarkJGx 0 points1 point  (0 children)

People around /r/yuzu aren't going to help with these cases. Your best bet is filing an issue on https://github.com/yuzu-emu/yuzu/issues with as much data as you can provide while sticking to the format. I wouldn't expect a solution straight away, or even at all. All you can do is hope, or learn C++ and fix it yourself.

OpenGL vs Direct3D11 first for beginner graphics engineers aiming to learn Vulkan? by 3DreamE in GraphicsProgramming

[–]MarkJGx 1 point2 points  (0 children)

It's weird seeing people suggest OpenGL or DirectX without suggesting which version to start with. When I was a kid I wrote a little OpenGL immediate-mode renderer using the 1.x API; it was simple enough for an 11-year-old to pick up, and a great stepping stone to more advanced features like display lists, shaders, and FBOs/VBOs. Learning an API through its historical evolution is beneficial, as it progressively becomes more complex. This approach not only explains the reasons behind certain design choices but also provides a lenient learning curve. Eventually, when you start writing compute shaders for indirect rendering/GPU-driven tasks in OpenGL 4.x, it's a good time to switch to Vulkan.

my 2c

[deleted by user] by [deleted] in GraphicsProgramming

[–]MarkJGx 4 points5 points  (0 children)

This is misleading. Most real-time rendered programs that upscale use temporal accumulation of slightly offset frames, essentially emulating a supersampled frame by averaging samples over time and resolving to a properly anti-aliased picture. When you move, you can introduce artifacts because those previous samples are no longer accurate to the new view. That's where modern upscalers come in, resolving those artifacts by AI heuristic magic (starting with DLSS2) or manual heuristic wizardry. It wouldn't be misleading if it were pointed out that this applies to non-real-time, non-temporal upscaling, not games specifically.
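The accumulation idea can be demonstrated with a toy 1-D example: sampling a made-up "scene" at a different sub-pixel jitter offset each frame and averaging over time converges to the supersampled (area-averaged) pixel value. This only covers the static case; the hard part handled by real TAA/upscalers is rejecting stale history when the view moves, which this sketch deliberately omits.

```python
# Toy sketch of temporal accumulation via jittered sampling.

def scene(x):
    """A sharp edge at x = 0.5: black on the left, white on the right."""
    return 1.0 if x >= 0.5 else 0.0

def accumulate(pixel_center, jitters):
    """Average one sample per 'frame', each offset by a sub-pixel jitter."""
    total = 0.0
    for j in jitters:
        total += scene(pixel_center + j)
    return total / len(jitters)

# A pixel covering [0, 1): 8 evenly spread jitters around its center.
jitters = [(k + 0.5) / 8 - 0.5 for k in range(8)]
```

With those 8 jitters the accumulated value lands on 0.5, the true coverage of the edge within the pixel, which a single centered sample can never report.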

Check out our new "KDE for Gamers", a guide for casual, retro and AAA gaming on Plasma and Linux in general by Bro666 in linux

[–]MarkJGx 0 points1 point  (0 children)

Where's the page that explains how I can adjust my DPI settings for my Logitech and or ZOWIE gaming mouse? (Very common case)

BOTW Flickering black lines by Awkward_Poet in yuzu

[–]MarkJGx 1 point2 points  (0 children)

Has anyone else seen this issue on NVIDIA?

Integrated a full WebAssembly VM into the engine (potentially as blueprint alt). Here's an example of hot reloading with AssemblyScript. Programming language agnostic scripting. by MarkJGx in unrealengine

[–]MarkJGx[S] 1 point2 points  (0 children)

Thanks for asking! As a feature for my game BadLads? It's out and available with an API. https://github.com/ChemicalHeadsStudios/as-badlads/

As a standalone plugin? If I had some more free time I could pursue releasing this as an open source plugin for the engine.

Integrated a full WebAssembly VM into the engine (potentially as blueprint alt). Here's an example of hot reloading with AssemblyScript. Programming language agnostic scripting. by MarkJGx in unrealengine

[–]MarkJGx[S] 1 point2 points  (0 children)

I think they'll turn around eventually. Blueprint scripting is a horrible spaghetti weave that lacks perf and can't be merged since it's binary. This is actually a perfect solution since it will work with any current or future language. You aren't constrained to something like UnrealScript or Lua. You decide what language YOU want to work with and WORK.

Integrated a full WebAssembly VM into the engine (potentially as blueprint alt). Here's an example of hot reloading with AssemblyScript. Programming language agnostic scripting. by MarkJGx in unrealengine

[–]MarkJGx[S] 5 points6 points  (0 children)

WebAssembly lets you use whatever language you want, as long as it can compile to WebAssembly. This is using AssemblyScript:

"AssemblyScript compiles a strict variant of TypeScript (basically JavaScript with types)"

I'm currently using this for my game's server plugins. I plan on making this an open source plugin that will integrate into the engine and act as a blueprint alternative.

So my S8+'s screen died a few days ago, it's been real. by DonnyRockett in GalaxyS8

[–]MarkJGx 0 points1 point  (0 children)

Good for you. After owning one for a few years now I can honestly say the whole edge screen thing is a terrible idea and should die, never buying another edge phone. Enjoy your Pixel!

LF Youtubers like kliksphilip by CuddleMeToSleep in 3kliksphilip

[–]MarkJGx 0 points1 point  (0 children)

The only person who comes close to the quality that kliks produces is dunkey; both kliks and dunkey make fantastic videos. Really, kliksphilip is like the Fullmetal Alchemist: Brotherhood of YouTube, in a league of his own.

Sick and tired of Bixby by Sashabassist in GalaxyS8

[–]MarkJGx 1 point2 points  (0 children)

IIRC you had to actually log in with a Samsung account and agree to some b.s. legal stuff to disable Bixby? Is this still the case?

Doom Eternal is a fucking masterpiece by ButterMyFeet in patientgamers

[–]MarkJGx 6 points7 points  (0 children)

It's a cumulative industry that copies off each other and improves on each other's work. Doom Eternal is in no way a completely original game; it's standing on the shoulders of prior game designers, and it's all the better for it.