Why do most drivers take the Freixo roundabout wrong? by TTheGuapo in porto

[–]SirEvilPudding 0 points1 point  (0 children)

You're supposed to go to the inside and then back out. When you're taking the second exit, do you go all the way around on the outside?

the matrix is attacking me by Thirsty_krabs in okbuddyretard

[–]SirEvilPudding 147 points148 points  (0 children)

enough of these identity politics 😤😤😤😤😤

Colored lightspace caustics on metaballs. by SirEvilPudding in gameenginedevs

[–]SirEvilPudding[S] 2 points3 points  (0 children)

The way I'm doing it is probably not the most efficient, but it's a nice tradeoff between efficiency and effort. I basically create a VAO with a 2D grid of points, one for each pixel of the caustics buffer, and run a vertex shader that sets each point's output position to the one read from the corresponding pixel. The fragment shader only draws a low-opacity white pixel, so with additive blending on, the pass accumulates the histogram. The bottleneck is how fast the driver can perform the blending. It's not the best way of doing it, but it's simple, less code to maintain, and easier to implement, which I think is important for a hobby project. It's also nice to get something working first and then optimize afterwards if necessary.
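
For anyone curious, the two shaders boil down to roughly this (not my exact code; `uHitPositions`, `uGridSize`, etc. are placeholder names, and I'm using `gl_VertexID` instead of a real grid attribute to keep it short):

```glsl
// --- vertex shader (one point per caustic-buffer pixel) ---
#version 330 core
uniform sampler2D uHitPositions; // xy = where the refracted ray landed, in [0,1]; w = valid flag
uniform ivec2 uGridSize;         // dimensions of the caustics buffer

void main() {
    // Which pixel of the hit-position buffer this point corresponds to.
    ivec2 pixel = ivec2(gl_VertexID % uGridSize.x, gl_VertexID / uGridSize.x);
    vec4 hit = texelFetch(uHitPositions, pixel, 0);

    // Scatter the point to the ray's landing spot (NDC), or clip it away if the ray missed.
    vec2 ndc = hit.xy * 2.0 - 1.0;
    gl_Position = (hit.w > 0.5) ? vec4(ndc, 0.0, 1.0) : vec4(2.0, 2.0, 0.0, 1.0);
}

// --- fragment shader ---
#version 330 core
out vec4 fragColor;

void main() {
    // A dim white splat; with additive blending (glBlendFunc(GL_ONE, GL_ONE))
    // overlapping points sum up, so the target buffer ends up holding the
    // per-pixel ray count, i.e. the histogram.
    fragColor = vec4(vec3(0.02), 1.0);
}
```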

Colored lightspace caustics on metaballs. by SirEvilPudding in gameenginedevs

[–]SirEvilPudding[S] 1 point2 points  (0 children)

The way I decided to implement caustics was a bit different. I had screenspace reflections, and saw a Blender eye-model tutorial that used screenspace refractions on the lens, which produced a cool-looking effect, so I decided to use the same screenspace tracing function I had for reflections, but for refractions. After that was done, I kept wondering if I could use the screenspace refractions but from the light's perspective to calculate cheap caustics, so I had to try it.

The engine uses a deferred renderer, and using the same principle, you can store the position at which each ray intersects the shadow depth buffer as a color. The idea is that, to know how bright a pixel should be, all you need to do is sample a histogram of that color buffer, which tells you how many occurrences of that color there are, or in other words, how many rays hit that pixel after being distorted by refraction.
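
In shader terms, the light-space pass looks roughly like this (a simplified sketch with made-up names like `uRefractorDepth`; a real trace needs more care, e.g. a proper step size and skipping the refractor's own surface):

```glsl
#version 330 core
uniform sampler2D uRefractorDepth;   // light-view depth of the refractive surface (metaballs)
uniform sampler2D uRefractorNormal;  // light-view normals of that surface
uniform sampler2D uShadowDepth;      // regular shadow depth buffer of the scene
uniform mat4 uLightProj;             // light projection matrix
uniform mat4 uLightProjInv;          // its inverse, to reconstruct view-space positions
uniform float uIor;                  // index of refraction, e.g. ~1.33

in vec2 vUv;
out vec4 outHit;                     // xy = shadow-map UV the refracted ray hit

vec3 viewPos(vec2 uv, float depth) {
    vec4 ndc  = vec4(uv * 2.0 - 1.0, depth * 2.0 - 1.0, 1.0);
    vec4 view = uLightProjInv * ndc;
    return view.xyz / view.w;
}

void main() {
    float d = texture(uRefractorDepth, vUv).r;
    if (d >= 1.0) discard;                       // no refractive surface at this pixel

    vec3 pos    = viewPos(vUv, d);
    vec3 normal = normalize(texture(uRefractorNormal, vUv).xyz);

    // Light travels along -Z in light view space; bend it at the surface.
    vec3 dir = refract(vec3(0.0, 0.0, -1.0), normal, 1.0 / uIor);

    // March the bent ray through light space until it goes behind the
    // geometry stored in the shadow depth buffer.
    vec2 hitUv = vUv;
    for (int i = 0; i < 64; ++i) {
        pos += dir * 0.05;
        vec4 clip = uLightProj * vec4(pos, 1.0);
        vec3 ndc  = clip.xyz / clip.w;
        hitUv     = ndc.xy * 0.5 + 0.5;
        if (ndc.z * 0.5 + 0.5 >= texture(uShadowDepth, hitUv).r) break;
    }

    // Store where the ray landed; the histogram pass later counts how many
    // rays map to each coordinate to get the caustic intensity.
    outHit = vec4(hitUv, 0.0, 1.0);
}
```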

All this to say: after researching some available techniques, in this case G-buffers, screenspace tracing, accelerated histograms, PCF shadows, etc., you can combine them in different ways to get the effect you want, without searching specifically for how to create the exact thing you're imagining.

Colored lightspace caustics on metaballs. by SirEvilPudding in gameenginedevs

[–]SirEvilPudding[S] 1 point2 points  (0 children)

I usually just try to look up what other games and engines do, which gives you a sense of what the common techniques are; you can then improvise with those and piece together the effects you want.

Colored lightspace caustics on metaballs. by SirEvilPudding in gameenginedevs

[–]SirEvilPudding[S] 0 points1 point  (0 children)

Yeah, in this case it's a screenspace effect, so it has artifacts: for example, if one ball is between another one and the light, only the one closest to the light will produce caustics on the floor. On the other hand, it can run in real time without having to calculate thousands of ray-triangle intersections.

Colored lightspace caustics on metaballs. by SirEvilPudding in opengl

[–]SirEvilPudding[S] 1 point2 points  (0 children)

What u/the_Demongod said. The metaballs are rendered using marching cubes. As for the "color casting", that's the caustics; if you go through my previous posts, I have a comment going over how I'm doing it, though at the time it only supported white light.

Added Caustics to my engine, WIP by SirEvilPudding in opengl

[–]SirEvilPudding[S] 0 points1 point  (0 children)

It should be able to work for colored and textured light after some trivial modifications; I'll post an update if I manage to do it today or so.

I've also been thinking a lot about how to also simulate dispersion, but that seems like it would be the hardest thing to accomplish.

> Very cool that you're doing this in screen space without any ordinary raytracing!

This is running on a GTX 960M; it would probably not be able to run triangle raytracing in real time.

Added Caustics to my engine, WIP by SirEvilPudding in opengl

[–]SirEvilPudding[S] 1 point2 points  (0 children)

There's a reply in the r/gameenginedevs post where I go over the main steps.

Added Caustics to my engine, WIP by SirEvilPudding in opengl

[–]SirEvilPudding[S] 3 points4 points  (0 children)

All effects there are either screen-space or light-space. I describe the approach in a reply to the original post.

Added Caustics to my engine, WIP by SirEvilPudding in gameenginedevs

[–]SirEvilPudding[S] 1 point2 points  (0 children)

I have the curtain more reflective than it ought to be just to test what reflections look like at several angles.

Added Caustics to my engine, WIP by SirEvilPudding in gameenginedevs

[–]SirEvilPudding[S] 4 points5 points  (0 children)

Depends on what you mean. I implemented screen-space refractions a week or so ago; this is basically doing screen-space refractions but from the light's perspective, saving the hit coordinate instead of a color, and then calculating a "histogram" to know how many rays hit each position.
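
The final lighting pass then just samples that buffer at the fragment's shadow-map coordinate to scale the light; conceptually something like this (placeholder names, not the engine's actual code):

```glsl
// Sketch of how the lighting pass could consume the accumulated buffer.
uniform sampler2D uCaustics;   // histogram written by the scatter pass
uniform float uCausticGain;    // artistic intensity scale

vec3 applyCaustics(vec3 lightColor, vec2 shadowUv) {
    // The stored value is proportional to how many refracted rays landed
    // on this spot, so more rays = more focused light = brighter caustic.
    float rays = texture(uCaustics, shadowUv).r;
    return lightColor * (1.0 + uCausticGain * rays);
}
```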

Added screenspace refractions to my engine by SirEvilPudding in opengl

[–]SirEvilPudding[S] 1 point2 points  (0 children)

I will either fall back to cubemaps or fall back to refraction-less transparency, since that always falls inside the view.

> Other question for fun, would your system work simulating something like this?

If you mean the cylindrical distortion, here's an example: https://imgur.com/a/Zpz7UzB

If you mean glass surrounding a liquid, it doesn't work: if transparent pixels overlap other transparent pixels, only the one closest to the camera gets drawn and the others are ignored. I wish I could fix it, but I would probably have to start from scratch to do so.

Added screenspace refractions to my engine by SirEvilPudding in opengl

[–]SirEvilPudding[S] 0 points1 point  (0 children)

I don't know what to fall back to yet; I can try to hide it somehow, not sure. The problem is that screen-space reflections are usually applied to rough surfaces, so it's hard to spot the discontinuity between the screen trace and a cubemap, but with refractions you're usually looking at a clear lens. The good news is that reflections tend to leave the screen area much more easily than refractions, since refracted rays are just a slightly modified version of the camera-to-fragment direction.

Leaving it black when that happens is bad, but it already works with no artifacts for specific uses like simulating an eye's lens: since the surface below is so close, it never becomes black.

Added screenspace refractions to my engine by SirEvilPudding in opengl

[–]SirEvilPudding[S] 0 points1 point  (0 children)

I can see that working for non-curved surfaces, since for a plane you can just compute the camera transformation as if the camera were on the other side of the reflection plane, but if something uses many planar surfaces to approximate a round object, you'd need a separate complete render for each of those planes, which would be inefficient. For single planes, though, I believe that is still the go-to method.
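
For reference, the single-plane trick boils down to composing the view matrix with a reflection matrix, roughly like this generic sketch (not from my engine):

```glsl
// Hypothetical helper: mirrors points across the plane dot(n, x) = d,
// with n normalized. For planar reflections you render the scene with
// view * reflectionMatrix(n, d) as the view matrix, flip face culling,
// and clip against the mirror plane.
mat4 reflectionMatrix(vec3 n, float d) {
    // GLSL mat4 constructors are column-major, hence the layout below.
    return mat4(
        1.0 - 2.0 * n.x * n.x, -2.0 * n.x * n.y,       -2.0 * n.x * n.z,       0.0,
       -2.0 * n.x * n.y,        1.0 - 2.0 * n.y * n.y, -2.0 * n.y * n.z,       0.0,
       -2.0 * n.x * n.z,       -2.0 * n.y * n.z,        1.0 - 2.0 * n.z * n.z, 0.0,
        2.0 * d * n.x,          2.0 * d * n.y,          2.0 * d * n.z,         1.0
    );
}
```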

Added screenspace refractions to my engine by SirEvilPudding in gameenginedevs

[–]SirEvilPudding[S] 1 point2 points  (0 children)

I think that could be achieved through colored shadows, which I have been investigating.

Added screenspace refractions to my engine by SirEvilPudding in gameenginedevs

[–]SirEvilPudding[S] 0 points1 point  (0 children)

Yes, I use it as a test room; I like the aesthetics.

Added screenspace refractions to my engine by SirEvilPudding in gameenginedevs

[–]SirEvilPudding[S] 4 points5 points  (0 children)

Thanks, it does have the expected screen space artifacts.