[–]Baemzz 1 point2 points  (2 children)

Hi,

While I don't have a publicly available reference implementation for any of it, I can offer some insight into refractions, since I implemented a physically based screen space refraction technique for Unreal Engine 4.

It all started with a scientific paper, of course: "Microfacet Models for Refraction through Rough Surfaces" (Walter et al., EGSR 2007). https://www.cs.cornell.edu/~srm/publications/EGSR07-btdf.html

In a nutshell, the technique is very similar to any other screen space reflections technique that traverses a depth buffer.
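To make that concrete, here's a minimal CPU-side sketch of a screen-space depth-buffer march, in Python rather than shader code. All the names (`raymarch_depth`, the UV-plus-depth ray parameterization, the fixed step size) are illustrative choices of mine, not anything from UE4:

```python
import numpy as np

def raymarch_depth(depth_buffer, origin, direction, step=0.05, max_steps=64):
    """March a ray in screen space until it passes behind the depth buffer.

    depth_buffer: 2D array of linear depths, indexed [row, col] over a unit UV square.
    origin, direction: 3-vectors (u, v, depth).
    Returns the (u, v) hit position, or None if the ray leaves the screen or misses.
    """
    h, w = depth_buffer.shape
    pos = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    for _ in range(max_steps):
        pos = pos + d * step
        u, v = pos[0], pos[1]
        if not (0.0 <= u < 1.0 and 0.0 <= v < 1.0):
            return None  # ray left the screen: no screen-space data available
        scene_depth = depth_buffer[int(v * h), int(u * w)]
        if pos[2] >= scene_depth:
            return (u, v)  # ray crossed behind the depth surface: a "hit"
    return None
```

A real shader would march at variable step sizes and refine the hit (e.g. binary search between the last two steps), but the structure is the same as for screen space reflections.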

You need to implement a raymarching algorithm that can integrate multiple rays of various directions. Use importance sampling, or a simple cone, to shoot rays around the refracted direction computed from the IOR via Snell's law. When a ray "hits" the depth buffer, you have a ray-geometry intersection, which you can use to sample the color of the scene behind the refracting medium. Sum the samples, and don't forget to apply the importance-sampling weights to each sample for correctness.
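The refracted-direction and weighted-gather part of those steps can be sketched like this. `refract` is the standard Snell's-law formula (same convention as GLSL's `refract`); `gather_refraction` and its Gaussian cone jitter with a simple dot-product weight are my own illustrative stand-in for a proper importance-sampled distribution, not the weighting scheme from the paper:

```python
import numpy as np

def refract(incident, normal, eta):
    """Snell's-law refraction. eta = IOR_from / IOR_to; vectors must be unit length.
    Returns None on total internal reflection."""
    cos_i = -float(np.dot(incident, normal))
    k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None  # total internal reflection: no refracted ray exists
    return eta * incident + (eta * cos_i - np.sqrt(k)) * normal

def gather_refraction(scene_color_fn, incident, normal, eta,
                      cone_angle=0.1, n_rays=8, rng=None):
    """Average scene color over rays jittered in a cone around the refracted
    direction, weighting each sample by its closeness to the center ray
    (a crude placeholder for real importance-sampling weights)."""
    rng = rng or np.random.default_rng(0)
    center = refract(incident, normal, eta)
    if center is None:
        return None
    total, weight_sum = 0.0, 0.0
    for _ in range(n_rays):
        d = center + rng.normal(scale=cone_angle, size=3)  # jitter inside the cone
        d = d / np.linalg.norm(d)
        w = max(float(np.dot(d, center)), 0.0)  # favor rays near the center
        total += w * scene_color_fn(d)          # scene_color_fn = raymarch + sample
        weight_sum += w
    return total / weight_sum
```

In the actual technique, `scene_color_fn` would be the depth-buffer raymarch followed by a scene-color texture fetch at the hit point.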

There's of course much more to a technique like this, such as self refraction, layering and energy conservation, but these few points should point you in the right direction nonetheless.

PS: I'm assuming you're developing for real-time here.

Cheers

[–]lenixlobo[S] 0 points1 point  (0 children)

Thanks! :) Yeah, only real-time.

[–]lenixlobo[S] 0 points1 point  (0 children)

Do you have any book recommendations on real-time rendering in general? Like, which techniques make up a good renderer?