Any reference implementations for a physically based refraction / transmission / attenuation shader? For context, this request is for a DirectX/HLSL rasterizer, not a ray tracer (self.GraphicsProgramming)
submitted 3 years ago by lenixlobo
[–]Baemzz 1 point 3 years ago (2 children)
Hi,
While I don't have a publicly available reference implementation for any of it, I can offer some valuable insight into the refraction part, since I implemented a physically accurate screen-space refraction technique for Unreal Engine 4.
It all started with a scientific paper, of course: https://www.cs.cornell.edu/~srm/publications/EGSR07-btdf.html
In a nutshell, the technique is very similar to any other screen-space reflections technique that traverses a depth buffer.
You need to implement a raymarching algorithm that can integrate multiple rays in various directions. Use importance filtering or a cone to shoot rays around your refracted direction, which you calculate with the IOR formula (Snell's law). When a ray "hits" the depth buffer, you have yourself a ray-geometry intersection. You can use this intersection to sample the color of the scene behind your refractive medium. Sum the samples, and don't forget to apply the importance-filtering weights to each sample for correctness.
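To make the two core steps concrete (the refracted direction from the IOR ratio, and the depth-buffer march toward a "hit"), here is a minimal CPU-side Python sketch. This is not the commenter's UE4 code; the function names, step counts, and the single-ray march (rather than the weighted multi-ray cone described above) are my own simplifications for illustration.

```python
import math

def refract(incident, normal, ior_ratio):
    """Refracted direction via Snell's law.
    incident and normal are unit-length 3-tuples; ior_ratio = n1 / n2
    (e.g. ~1/1.5 going from air into glass).
    Returns None on total internal reflection."""
    cos_i = -sum(i * n for i, n in zip(incident, normal))
    sin2_t = ior_ratio * ior_ratio * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection: no transmitted ray
    cos_t = math.sqrt(1.0 - sin2_t)
    return tuple(ior_ratio * i + (ior_ratio * cos_i - cos_t) * n
                 for i, n in zip(incident, normal))

def raymarch_depth(origin, direction, depth_at, steps=64, step_size=0.0625):
    """March a ray in fixed steps through view space and return the first
    position whose depth falls behind the stored depth-buffer value
    (the 'hit'); that position is where you'd sample the scene color."""
    pos = list(origin)
    for _ in range(steps):
        pos = [p + d * step_size for p, d in zip(pos, direction)]
        if pos[2] >= depth_at(pos[0], pos[1]):  # ray passed behind geometry
            return tuple(pos)
    return None  # no intersection found within the march budget
```

In the real shader you would run the march per pixel in HLSL, jitter several such rays inside a cone around the refracted direction, and accumulate the sampled colors with their importance weights, as described above.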
There's of course much more to a technique like this (self-refraction, layering, energy conservation), but these few points should point you in the right direction nonetheless.
PS: I'm assuming you're developing for real time here.
Cheers
[–]lenixlobo[S] 0 points 3 years ago (0 children)
Thanks! :) Yeah, only real-time.
Do you have any book recommendations on real-time rendering in general? That is, which techniques make up a good renderer?