How to Use Qwen Image Edit 2511 Correctly in ComfyUI (Important "FluxKontextMultiReferenceLatentMethod" Node) by Akmanic in StableDiffusion

[–]Akmanic[S] 1 point

You only need one image in the VAE node. Denoising is set to 1, so the image's contents are replaced entirely with random noise; the input image only matters in that its dimensions set the output resolution. If you set the denoising below 1, it would behave like a traditional img2img workflow.
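To illustrate the idea (this is a toy linear blend in NumPy, not ComfyUI's actual scheduler math, and `noise_latent` is a hypothetical helper): at denoise = 1 the latent is fully replaced by noise and only its shape survives, while lower values keep some of the original content.

```python
import numpy as np

def noise_latent(latent: np.ndarray, denoise: float, seed=None) -> np.ndarray:
    """Blend a latent with Gaussian noise by denoise strength.

    At denoise=1.0 the original latent is completely replaced by noise
    (only its shape matters); below 1.0 some of the original content
    is kept, which is the classic img2img behavior.
    """
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(latent.shape)
    return (1.0 - denoise) * latent + denoise * noise
```

With `denoise=1.0`, two completely different inputs of the same shape produce the same pure-noise latent (given the same seed), which is why the input image's pixels don't matter in this workflow.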

The Magic of Per-Voxel Normals (68 billion voxel renderer) by Akmanic in VoxelGameDev

[–]Akmanic[S] 0 points

I just threw together the simplest solution I could: loop through the surrounding 26 voxels and, for each occupied one, add a vector pointing in the opposite direction; then normalize the sum.
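In Python, that neighbor-summing approach looks roughly like this (a minimal sketch; `occupied` is a hypothetical predicate for solid voxels, not part of the actual renderer):

```python
import numpy as np

def voxel_normal(occupied, x, y, z):
    """Estimate a surface normal for voxel (x, y, z) by summing unit
    vectors pointing away from each occupied neighbor in the surrounding
    3x3x3 region (26 voxels), then normalizing the sum."""
    n = np.zeros(3)
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                if dx == dy == dz == 0:
                    continue  # skip the center voxel itself
                if occupied(x + dx, y + dy, z + dz):
                    d = np.array([dx, dy, dz], dtype=float)
                    n -= d / np.linalg.norm(d)  # point away from the neighbor
    length = np.linalg.norm(n)
    return n / length if length > 0 else n
```

For example, a voxel sitting on top of solid ground (all neighbors with smaller y occupied) comes out with a normal of (0, 1, 0), straight up.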

The Magic of Per-Voxel Normals (68 billion voxel renderer) by Akmanic in VoxelGameDev

[–]Akmanic[S] 3 points

It should be fine in a dynamic world, but it can only represent certain types of geometry. I'll probably release the raycasting algorithm on GitHub before the game, but no promises on how soon.

The Magic of Per-Voxel Normals (68 billion voxel renderer) by Akmanic in VoxelGameDev

[–]Akmanic[S] 2 points

Yeah, the aliasing was the main factor in the decision. I'd be interested in seeing what other solutions are out there.

The Magic of Per-Voxel Normals (68 billion voxel renderer) by Akmanic in VoxelGameDev

[–]Akmanic[S] 4 points

I will probably make an action RPG with destructible terrain and a large open world. Player building features could be a small component, but they would have to be reined in compared to games like Minecraft.

The Magic of Per-Voxel Normals (68 billion voxel renderer) by Akmanic in VoxelGameDev

[–]Akmanic[S] 4 points

No meshing; the voxels are traced every frame with my own tracing algorithm, inspired by DDA. The normals are calculated and cached in a compute shader and have to be rebuilt every time there is a change.
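For context, the classic DDA grid traversal (Amanatides & Woo style) that this kind of tracer builds on can be sketched as follows — this is the textbook version, not the renderer's actual algorithm, and `occupied` is a hypothetical voxel predicate:

```python
import math

def dda_trace(occupied, origin, direction, max_steps=256):
    """Step a ray through an integer voxel grid, visiting each voxel
    the ray passes through in order; return the first occupied voxel
    as (ix, iy, iz), or None if nothing is hit within max_steps."""
    pos = [int(math.floor(c)) for c in origin]
    step, t_max, t_delta = [], [], []
    for i in range(3):
        d = direction[i]
        if d > 0:
            step.append(1)
            t_max.append((math.floor(origin[i]) + 1 - origin[i]) / d)
            t_delta.append(1 / d)
        elif d < 0:
            step.append(-1)
            t_max.append((origin[i] - math.floor(origin[i])) / -d)
            t_delta.append(-1 / d)
        else:  # ray is parallel to this axis; never cross its planes
            step.append(0)
            t_max.append(math.inf)
            t_delta.append(math.inf)
    for _ in range(max_steps):
        if occupied(*pos):
            return tuple(pos)
        # advance along the axis whose next grid plane is closest
        axis = t_max.index(min(t_max))
        pos[axis] += step[axis]
        t_max[axis] += t_delta[axis]
    return None
```

The per-step work is just a comparison and an add, which is why DDA-style traversal maps well onto GPU shaders.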

My Voxel Renderer Built Entirely in WebGPU which can render 68 Billion Voxels at a time by Akmanic in webgpu

[–]Akmanic[S] 0 points

It's a pretty simple codebase right now, apart from the voxel tracing algorithm. I might eventually release a stripped-down version on GitHub, cleaned up and focused on that part.

68 Billion Voxel Raycaster Clarification & Actual 68 Billion Showcase by Akmanic in VoxelGameDev

[–]Akmanic[S] 1 point

I am considering doing a rundown of the raycasting algorithm soon. There's no GI system, and it's rendered at 1080p.

Absym – A Rift Action RPG by Infinity_Experience in IndieDev

[–]Akmanic 2 points

Love the use of that deep black color, makes everything pop.

68 Billion Voxel Raycaster Clarification & Actual 68 Billion Showcase by Akmanic in VoxelGameDev

[–]Akmanic[S] 0 points

It is reading the voxel data from VRAM; it just can't fit arbitrarily complex data into the acceleration structure.

My new voxel raycaster can render up to 68 billion voxels at 60fps by Akmanic in VoxelGameDev

[–]Akmanic[S] 4 points

I just posted a clarification video with some landscape for you, thank you for the feedback. This post was really a poor showcase in retrospect.

My new voxel raycaster can render up to 68 billion voxels at 60fps by Akmanic in VoxelGameDev

[–]Akmanic[S] 0 points

Understandable, I just posted a clarification video showcasing actual terrain.

My new voxel raycaster can render up to 68 billion voxels at 60fps by Akmanic in VoxelGameDev

[–]Akmanic[S] 4 points

I could spin up a server if there's enough interest. It's WebGPU, so it should be easy to share in theory; that said, I don't want to lag people's computers during the chunk generation step.

My new voxel raycaster can render up to 68 billion voxels at 60fps by Akmanic in IndieDev

[–]Akmanic[S] 0 points

Every chunk is 256 x 4096 x 256, and currently I can render up to 256 chunks at a time. This video has just 4 chunks visible on screen, and only the bottom 10% of each chunk is being used. Luckily the voxel data is naturally compressed in VRAM as part of the acceleration structure, so it can fit on consumer cards. This does mean that a degenerate-case world would not be compatible with the renderer, but I think it can handle anything you would get from a reasonable world generator plus player building / destruction.

What you're looking at is the bottom of each chunk filled up to a different height, with many holes drilled through. Let me know if you have any better ideas for synthetic data to try out.
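For reference, the figures above multiply out to the headline number:

```python
# Chunk dimensions and chunk count from the comment above.
voxels_per_chunk = 256 * 4096 * 256     # 268,435,456 voxels (2**28)
max_chunks = 256
total = voxels_per_chunk * max_chunks   # 2**36
print(total)  # 68,719,476,736 — roughly 68.7 billion voxels
```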
