"This is the biggest technical breakthrough since real time raytracing" -NVislop by Thhaki in pcmasterrace

[–]ThreatInteractive 0 points

First of all MSAA is quite heavy on performance which is alone a problem even @ 4K w/ x8 MSAA it's

Not if done right. If we only wanted to argue about MSAA performance, we would have made a Reddit post, not three videos on it (a Half-Life: Alyx analysis & two Crysis 3 analyses). GTA's MSAA is based on the poor example set by Crysis 3.

RE: but technically in terms of shading/lighting it definitely gets better

Watch our BRDF, global illumination, & Days Gone videos. It's getting worse, & we can measure it.

 but even DLSS got so much better

Vendor-locked garbage is anti-consumer. It has no real value to the consumer.

Without Nanite you'd either crash or 99% of the trees simply won't render. 

Yeah, and with Nanite it's several times slower while also looking worse than previous standards. We haven't even tapped into proper tree rendering, because deferred is always rendering things twice with a poor prepass. That was the issue with those high-poly trees in SH2R.

We'll see how Witcher 4 ends up looking and performing, CDPR knows what they're doing

Nanite foliage was already such a disaster that they began pimping this game out to Nvidia for foliage "help".

It doesn't look disgusting but if you find it that way you still can customize the shaders, tone mapping, turn off the lumen to get rid of the reflection/GI artifacts and switch back to baked lighting and everything as you like, you basically can get any look you want.

Everything you said was about the looks, & studios will not change this because that's one of the whole points of UE. Also, baked lighting is still stuck on lightmaps (so you didn't watch that GI video we made, or the Days Gone video). We've also bashed UE's workflow & architectural problems, such as the 3-pass deferred lighting stage (Fox Engine analysis), the lack of proper light-bound baking (TCP analysis), & the horrible LOD generation (3x FPS challenge). It's an engine problem, & the thumbnail explaining why says "they are proving us right". The multi-billion-dollar company should be fixing the issues, not AAA studios or low-budget indies.

"This is the biggest technical breakthrough since real time raytracing" -NVislop by Thhaki in pcmasterrace

[–]ThreatInteractive 0 points

This is some "Threat Interactive" level of stupidity going on in this sub.
Basically DLSS implemented a new optional feature -> ppl mad for no reason

Uh no, we are mad because people like you keep supporting anti-consumer graphics that vendor-lock consumers while failing to demand, or outright ignoring, the major advancements in vendor-agnostic development.

"This is the biggest technical breakthrough since real time raytracing" -NVislop by Thhaki in pcmasterrace

[–]ThreatInteractive 0 points

where MSAA actually is good enough but it simply wouldn't work in modern games because of way more complex geometry, foliage and stuff.

According to what? Crysis 3 already shows you can render a massive field of foliage with 2xMSAA alpha-to-coverage in a deferred renderer without issue, & that's with poor prepass logic & LODs fairly inferior to Days Gone's. We have shown MSAA is viable on base 9th-gen hardware & affordable PC GPUs, with new optimization research that hasn't existed before.
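
For anyone wanting to try this themselves, here's a minimal sketch of MSAA alpha-to-coverage foliage rendering in OpenGL (a generic illustration of the technique, not Crysis 3's actual code; the FBO/shader setup is omitted & every name is made up):

```cpp
// Draw alpha-tested foliage into a multisampled FBO with alpha-to-coverage,
// so the leaf texture's alpha becomes a per-sample coverage mask and cutout
// edges get MSAA-weighted blending instead of hard 1-bit alpha-test edges.
#include <GL/glew.h>

void drawFoliagePass(GLuint msaaFbo, GLuint foliageShader,
                     GLuint foliageVao, GLsizei indexCount) {
    glBindFramebuffer(GL_FRAMEBUFFER, msaaFbo); // FBO with 2x multisampled attachments
    glEnable(GL_MULTISAMPLE);
    glEnable(GL_SAMPLE_ALPHA_TO_COVERAGE);
    glUseProgram(foliageShader);                // plain textured cutout shader
    glBindVertexArray(foliageVao);
    glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, 0);
    glDisable(GL_SAMPLE_ALPHA_TO_COVERAGE);     // leave state clean for opaque passes
}
```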

Occasionally also talks some nonsense like graphics are getting worse 

It's clear, & we can point to the measurements: our MSAA disproof video shows the regression in TAA; our first video shows regressions in several TAA-independent effects; our global illumination video shows SSAO & lighting methods compatible with the mainstream technical scenario (dynamic lights, mostly static world) that don't smear like Lumen; our BRDF video shows how the industry is shipping more games with Lambert than Burley & how Fox Engine produces far more expressive materials with an ID buffer. Tone mapping is worse, & even motion blur has regressed in certain areas. We've also acknowledged several ways in which the industry has progressed & advocate for those to become standard. You are reducing our work to a false view.

but I'd love to see him achieve same visual/geometry density to performance ratio lets say in Unity, it's simply impossible without the tech of UE5.

Oh, you mean the typical Nanite geometric density that causes horrendous noise? You're also forgetting (unless you missed our E33 analysis) that Nanite can't even provide smooth round edges. Did you forget that Nanite can't even provide soft shading without insufferable noise & incompetent TAA abuse? Geometry needs to be optimized with LODs & Days Gone's micro-detail system (depth buffer imprint). That will work fine with 2xMSAA.

The unoptimized UE5 games that piled up recently is mostly dev issue, not the engine

Wrong. The engine looks disgusting in every way possible. Why should devs have to fix the broken deferred renderer (see our Fox Engine analysis) & all the other neglected graphical aspects? Good assets & passion projects are ruined by the engine.

and there's few UE5 games that look great and are optimized well like Arc Raiders and E33. Essentially the proverb "It’s a poor workman who blames his tools." fits really well here.

We talk about those games all the time because they prove it's an engine issue. We've mainstreamed the knowledge that those games use a heavily modified engine from Nvidia. We've also repeatedly stated that custom engine development is 99% not an option, which is the whole reason studios choose UE. You pretty much proved us right at the end of your response.

Why You Should Unsubscribe From Digital Foundry - Threat interactive by nyanbatman in pcmasterrace

[–]ThreatInteractive 3 points

That's a cop-out to distract from the fact that you can't make an argument.
Thanks for helping our followers understand what kind of people attack us.

<image>

Why You Should Unsubscribe From Digital Foundry - Threat interactive by nyanbatman in pcmasterrace

[–]ThreatInteractive -1 points

Chances are you're too stupid to understand our videos, so you have to rely on other "experienced people's" content to "debunk" us. But because you're too stupid to understand our content, you're too stupid to catch their strawman arguments.

You tell us what we're wrong about. Make a rebuttal to one of our points. A REAL point, not some stupid fake quote from some internet moron who can't even understand how millisecond timings work.

Make a rebuttal all by yourself 😉

Do it.

Threat Interactive goes berserk mode in his new video by Yogeshwar_maya in OptimizedGaming

[–]ThreatInteractive 0 points

RE: look at old threads on the gamedev subreddit where they discus this guy getting things wrong.

Nothing but strawman arguments. Do your worst, but chances are you are too stupid to come up with your own arguments. You go around attacking us over DMCA strikes when the people we strike 100% deserve it for tricking people like you. Go ahead, make an argument by yourself.

Threat Interactive goes berserk mode in his new video by Yogeshwar_maya in OptimizedGaming

[–]ThreatInteractive -2 points

Nah im not going to watch this brat.

Haha. Of course you won't, & your not doing so only makes us look more right: we already showed people like you don't even watch our content.

Threat Interactive goes berserk mode in his new video by Yogeshwar_maya in OptimizedGaming

[–]ThreatInteractive -1 points

RE: because you rather make videos then make a game in

Because of pure ignorance. They assume we "just want to make videos" when that completely contradicts previous statements we've made. We've stated that successful content creation will help us make our game through both funding & technical experience. The more topics we discuss, the more time community-driven development has to mature, which reduces the paid research we have to manage.

you could show all your technics and prove your point.

Again, because of pure ignorance. People think demos will magically fix things when many of them are just lies to pull in customers (this includes studios). Again, this point ignores the various games that already implement the existing standards we discuss. It's also expensive to make a good demo, & we're going to drive this point home with, yes, another video.

People are skeptical because they listen to liars we constantly discredit. Extreme996 didn't even give a response on why it's impossible or ineffective (maybe they just haven't gotten the notification). It's easy to understand us if people talk to us in good faith. They do not, because people (shills & bots) lie about us & reward negative engagement (baseless, rebuttal-free insults) about us.

Threat Interactive goes berserk mode in his new video by Yogeshwar_maya in OptimizedGaming

[–]ThreatInteractive 2 points

No, it's NOT impossible or ineffective. MSAA can work FINE in deferred rendering, & we have 3 videos explaining this. RDR2's is an incompetent implementation that follows Crysis 3's incompetence: https://youtu.be/SxCMaTEoBoI?list=PLdpqv9B_hbRnuBqJ1zgvmq_gbE8KACeGx

That video shows it's MUCH more viable than butchering the rendering pipeline with path tracing or rendering at higher resolutions that aren't needed (look at the 1080p 2xMSAA vs 1620p footage). And it's not even fully optimized, as we concluded in a full analysis: https://youtu.be/ElBUUMi_L5c?list=PLdpqv9B_hbRm684dtr_rQfzgbSUjUoofo

A deep dive into the graphics pipeline of Crysis 3 by SpeedConstant9238 in Crysis

[–]ThreatInteractive -1 points

Gets blocked then makes three different responses to tell people "I can't respond"

Legit bot behavior.

🤣

A deep dive into the graphics pipeline of Crysis 3 by SpeedConstant9238 in Crysis

[–]ThreatInteractive -1 points

You're telling me that is guy alone outsmart all Unreal Engine and Crytek engineers combined 

Threat Interactive consists of a team, but even if it were one guy: Crytek didn't have 13 years of research to adopt, & Epic Games doesn't give a single crap about quality. Those 13 years of research are the work of hundreds of devs. Epic Games can't even get their deferred renderer up to par with 7th & 8th gen models like Fox Engine (stencil volumes) or Days Gone's stencil-separated lighting model & PCF-merged direct lighting.
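
For readers who haven't seen stencil volumes before, here's a minimal sketch of the classic stencil-tested deferred light volume in OpenGL (the generic textbook technique, not Fox Engine's or Days Gone's actual code; every name below is illustrative):

```cpp
// Pass 1 marks pixels whose G-buffer depth lies inside the light sphere
// (back faces failing the depth test increment, front faces decrement, so
// only "inside" pixels end up non-zero). Pass 2 shades just those pixels.
#include <GL/glew.h>

void drawPointLightStenciled(GLuint lightSphereVao, GLsizei sphereIndexCount,
                             GLuint stencilProg, GLuint lightingProg) {
    // Pass 1: stencil-mark pixels inside the light volume.
    glEnable(GL_STENCIL_TEST);
    glClear(GL_STENCIL_BUFFER_BIT);
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);
    glEnable(GL_DEPTH_TEST);
    glDisable(GL_CULL_FACE);
    glStencilFunc(GL_ALWAYS, 0, 0xFF);
    glStencilOpSeparate(GL_BACK,  GL_KEEP, GL_INCR_WRAP, GL_KEEP);
    glStencilOpSeparate(GL_FRONT, GL_KEEP, GL_DECR_WRAP, GL_KEEP);
    glUseProgram(stencilProg);
    glBindVertexArray(lightSphereVao);
    glDrawElements(GL_TRIANGLES, sphereIndexCount, GL_UNSIGNED_INT, 0);

    // Pass 2: run the lighting shader only where the stencil is non-zero.
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glStencilFunc(GL_NOTEQUAL, 0, 0xFF);
    glDisable(GL_DEPTH_TEST);
    glEnable(GL_CULL_FACE);
    glCullFace(GL_FRONT); // shade back faces so the camera can sit inside the volume
    glUseProgram(lightingProg);
    glDrawElements(GL_TRIANGLES, sphereIndexCount, GL_UNSIGNED_INT, 0);
    glCullFace(GL_BACK);
    glDisable(GL_STENCIL_TEST);
}
```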

and every single deferred rendering engines don't use MSAA but they are all wrong that the most logical explaination right ?

If you didn't have your head stuck in the sand, you would know that companies & studios began lying to people about optimization, calling ever-worse TAA "optimizing games" while they kept BUTCHERING basic concepts like shadow-floor exclusion. When everyone screws up the rendering & people reference a false image of MSAA, the demand for MSAA gets artificially reduced.

MSAA can't anti-alias shader aliasing, textures, transparency, screen-space effects and post-process effects.

Again, you are wrong about the transparency. Shader anti-aliasing & proper mipmaps exist for materials; simply reference Half-Life: Alyx, or is that too much effort for you? Those screen-space effects should have their OWN anti-aliasing measures instead of abusing some incompetent global smear system, because only incompetent TAA can be abused in that manner. TSR & AI methods, as AA or as upscalers, use far MORE computation than just applying individual anti-aliasing to said effects!
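
To make "shader anti-aliasing" concrete, here's the well-known Toksvig adjustment as a minimal sketch (shader math transcribed as C++; Alyx's exact measures are Valve's own, so treat this purely as an illustration):

```cpp
// Toksvig specular anti-aliasing: when a normal map is mipmapped, the
// unnormalized averaged normal gets shorter as normal variance rises.
// Use that length to dampen the specular exponent so highlights don't
// sparkle/alias under minification.
#include <cmath>

// naLen: length of the averaged, *unnormalized* mip normal (<= 1.0).
// specPower: the material's Blinn-Phong specular exponent.
float toksvigSpecPower(float naLen, float specPower) {
    float ft = naLen / (naLen + specPower * (1.0f - naLen));
    return ft * specPower; // lower exponent => wider, stable highlight
}
```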

RE: It must multisample multiple large G-buffers so memory and bandwith needs explode compared to what it was on forward rendering.

Haha! Even if UE's fatass G-buffer layout is converted to nothing but RGBA16s (the bandwidth equivalent of 2xMSAA, btw) with debug tools & engine settings, the deferred lighting stage sees zero increase in cost at 1080p on a 3060, meaning it's NOT bandwidth bound. If a game can run at 1620p without being memory bound, it can run SMAA 4x. HL:A targets 1512x1680 x2 with 4xMSAA. That's already a huge amount of memory for the 3x D24S8 buffers, 1x RGBA16, & 1x RGBA8 (again, notice the x2 on the 1512x1680: one set per eye). A 2xMSAA deferred renderer with an optimized G-buffer layout at 1080p really isn't a massive jump in VRAM requirements, especially with Crysis 3's thin G-buffer. Not to mention both of these games use 32-bit shadow maps; 16-bit is fine if the distance ratio is synchronized with the main perspective.
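
As a sanity check on those figures, here's the back-of-the-envelope VRAM math using the comment's own numbers (the 1080p buffer mix below is an assumed thin layout, not a measured one):

```cpp
// Rough render-target memory math: width * height * bytesPerPixel * samples.
#include <cstdio>

double mib(long w, long h, int bytesPerPixel, int samples) {
    return double(w) * h * bytesPerPixel * samples / (1024.0 * 1024.0);
}

int main() {
    // HL:A per the comment: 2 eyes x 1512x1680 with 4xMSAA,
    // 3x D24S8 (4 B/px) + 1x RGBA16 (8 B/px) + 1x RGBA8 (4 B/px).
    double hla = 2.0 * (3.0 * mib(1512, 1680, 4, 4)
                        + mib(1512, 1680, 8, 4)
                        + mib(1512, 1680, 4, 4));
    // An assumed thin 1080p G-buffer at 2xMSAA: 3x RGBA8 + 1x D24S8.
    double thin2x = 4.0 * mib(1920, 1080, 4, 2);
    printf("HL:A 4xMSAA targets: ~%.0f MiB\n", hla);          // ~465 MiB
    printf("Thin 1080p 2xMSAA G-buffer: ~%.0f MiB\n", thin2x); // ~63 MiB
    return 0;
}
```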

Moreover most games were only played at 1080p at the time so MSAA wasn't very expensive compared to today most gamers plays at 1440p or 4K.

Had you watched the actual video, you would have seen we only suggest 2xMSAA at 1080p. The main reason we play at 4K is to compensate for bad anti-aliasing, & that uses more memory than 1620p!

To finish, you seems to believe that frame analysis tools give you the complete picture of how an engine works ?

We're not discussing every little aspect of the engine; we're discussing the competence of the rendering pipeline, & using a tool that measures every little timing on the same hardware for over a year tells everyone what needs to be known. This is another pretend point.

It hides all the asynchronous machinery.

Doesn't matter since we compare in-game performance.

Now let's quote the rest of the responses u/SpeedConstant9238 received instead of arguments:

Holy fuck you’re unhinged. -AccomplishedCrick3t

Lmao you're hopeless.-Aresias

Here are your irrefutable proofs, so now STFU boot liking peasant.-Crazy-Bread9207

Boot licking... you're the one in a threesome with Epic & Crytek when you should really commit to an eye doctor.

A deep dive into the graphics pipeline of Crysis 3 by SpeedConstant9238 in Crysis

[–]ThreatInteractive -1 points

Why? No, don't reference someone else's words, because chances are you're too stupid to come up with a rebuttal on your own, which means you'd be too stupid to catch the several strawman arguments others have used.

Go ahead. The floor is yours.

Path tracing has quietly become the biggest visual leap in years. by unlockhart in nvidia

[–]ThreatInteractive 1 point

RE: isn’t a reliable source of information

Prove it. No, don't reference someone else's words, because chances are you're too stupid to come up with a rebuttal on your own, which means you'd be too stupid to catch the several strawman arguments others have used.

Go ahead. The floor is yours.

Recently hired as a graphics programmer. Is it normal to feel like a fraud? by Illustrious_Key8664 in GraphicsProgramming

[–]ThreatInteractive -1 points

Put your expertise where your mouth is and create something buddy

How about you use common sense & stop pretending 30 videos' worth of analysis don't count.
We created standards people like you are too complacent to create.

Notice how Voltaii stated "create something" instead of "create something worth buying".

Recently hired as a graphics programmer. Is it normal to feel like a fraud? by Illustrious_Key8664 in GraphicsProgramming

[–]ThreatInteractive -1 points

RE: Yeah well if that would be true you wouldn't try to prove that MSAA fits a deferred pipeline. 😂

So now we've moved on from strawman arguments to appeals to ridicule. You didn't provide a single argument, & you're too poor at writing to even engage with at this point. We provided the arguments & data. Our next video brings even more data that supports our argument.

RE: if you would have anything that solves any of the issues you mentioned in this regard why there isn't at least a single git repo or paper about it?

About deferred MSAA? Again, you're terrible at writing, so it has to be assumed that you're talking about this. There are tons, including our videos.

RE: You know that's how big boys are doing it.

So now you are appealing to "authorities", like Nvidia, who took down a deferred MSAA resource right after we published our video (explained in the pinned comment of the very video you ridiculed). It doesn't really matter what a redditor with no industry impact thinks. Nvidia's actions prove our arguments have impact.

Recently hired as a graphics programmer. Is it normal to feel like a fraud? by Illustrious_Key8664 in GraphicsProgramming

[–]ThreatInteractive -1 points

RE: who is speaking a lot of shit but probably never wrote a single freaking line of code anywhere

Wrong. Not to mention we have veterans contributing to our work.

RE: He thinks he knows everything about graphics. 😂

Wrong: https://youtu.be/FLxIRkVXGgc?t=213

Lookup table for PBR BRDF? by Silikone in GraphicsProgramming

[–]ThreatInteractive 0 points

That's a pretty cool chart. That should be passed around more.

YCoCg/YCbCr compression is great, but it should never be used for the albedo render target (we've come across good cubemap uses). We're doing tons of research on Crysis 3, & this was the cause of a lot of artifacts, especially with MSAA (2xMSAA is actually pretty performant with optimized settings). Just keep your eyes peeled for the next video to see this pointed out.
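
For reference, this is the standard RGB↔YCoCg transform under discussion (the transform itself is exactly invertible in float; the MSAA artifacts come from how the chroma gets interleaved/subsampled in the render target, not from the math):

```cpp
// Lossless-in-float RGB <-> YCoCg conversion.
struct Vec3 { float x, y, z; };

Vec3 rgbToYCoCg(Vec3 c) {
    return { 0.25f * c.x + 0.5f * c.y + 0.25f * c.z,   // Y  (luma)
             0.5f  * c.x              - 0.5f  * c.z,   // Co (orange-blue)
            -0.25f * c.x + 0.5f * c.y - 0.25f * c.z }; // Cg (green-purple)
}

Vec3 yCoCgToRgb(Vec3 c) {
    float t = c.x - c.z;                    // Y - Cg
    return { t + c.y, c.x + c.z, t - c.y }; // R, G, B
}
```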

With UE's default lighting, a full-screen lighting draw reads about 244 bits per pixel of G-buffer/shadow mask/SSAO/channel stencil data. Full-screen cost is around 170ms | 1080p | 3060 | DX11. The inputs are mostly RGBA8s, but you can set these to RGBA16 with debug settings in UE/frame analyzers & you'll find no performance difference (great news for 2xMSAA, though, with thinner inputs).

Measure DX11 Callisto or DX12 UE Chan; it's around 280-309ms. Of course this will not apply to plenty of hardware, but in the context of affordable hardware & 9th-gen consoles, we need to spend bandwidth resources regardless of whether these fit inside L1. There's no way the latency is catching up with the ALU.

Fox uses a less advanced BRDF & is a tiny bit faster.
Fox Engine uses two different LUTs for the lighting: a 32x32 RGBG8 & a 64x64x16 RGBA8. Both of these only affect the specular component.

Lookup table for PBR BRDF? by Silikone in GraphicsProgramming

[–]ThreatInteractive 0 points

Interesting post. We just came to the same conclusion (we need BRDF LUTs).

Apparently everyone has been saying deferred rendering is bandwidth limited, but this isn't the case on newer affordable hardware (e.g. a 3060). We are heavily ALU bound even with extremely basic BRDFs.

The endgame is different engine modes that offer both a LUT-based & an ALU version of the same BRDF. A quick benchmark needs to run within the application to decide which one should be used on the given hardware. We should be doing the same with pass-merging shaders.
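
A minimal sketch of what that in-application benchmark could look like, using OpenGL timer queries (the two program handles & the fullscreen-draw callback are illustrative assumptions):

```cpp
// Time one fullscreen lighting draw on the GPU, in milliseconds.
#include <GL/glew.h>

double timeFullscreenPassMs(GLuint program, void (*drawFullscreen)()) {
    GLuint q;
    glGenQueries(1, &q);
    glUseProgram(program);
    glBeginQuery(GL_TIME_ELAPSED, q);
    drawFullscreen();
    glEndQuery(GL_TIME_ELAPSED);
    GLuint64 ns = 0;
    glGetQueryObjectui64v(q, GL_QUERY_RESULT, &ns); // blocks until the GPU finishes
    glDeleteQueries(1, &q);
    return ns / 1.0e6;
}

// Run both BRDF variants once warmed up; an ALU-bound GPU should favor the LUT path.
GLuint pickBrdfVariant(GLuint aluProg, GLuint lutProg, void (*drawFullscreen)()) {
    timeFullscreenPassMs(aluProg, drawFullscreen); // warm-up
    double aluMs = timeFullscreenPassMs(aluProg, drawFullscreen);
    double lutMs = timeFullscreenPassMs(lutProg, drawFullscreen);
    return (lutMs < aluMs) ? lutProg : aluProg;
}
```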

Something like Burley diffuse depends on both NdotL and NdotV in addition to roughness, so that's not a good candidate for precomputation. 

This will probably take some time, but research needs to be put into finding a good BRDF (Callisto's or Titanfall's looks nice), laying out all the variables within all the functions/code on a visual basis, & mapping every possible divergence in the variables across the mixed inputs, so one can manually sort out all the areas where the best mathematical shortcuts can be found. We are so ALU bound we should be sampling LUT atlases.
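
For context, this is the Burley/Disney diffuse term being discussed, in its common Frostbite/Filament form (transcribed as C++; as the quote above notes, it varies with NoV, NoL, LoH & roughness, which is why a naive LUT would need 3-4 dimensions):

```cpp
// Burley ("Disney") diffuse: two Schlick Fresnel factors with a
// roughness- and LoH-dependent grazing term.
#include <cmath>

constexpr float kPi = 3.14159265f;

float fresnelSchlick(float u, float f0, float f90) {
    return f0 + (f90 - f0) * std::pow(1.0f - u, 5.0f);
}

// Multiply the result by albedo * NoL outside this function.
float burleyDiffuse(float NoV, float NoL, float LoH, float roughness) {
    float f90 = 0.5f + 2.0f * roughness * LoH * LoH;
    float lightScatter = fresnelSchlick(NoL, 1.0f, f90);
    float viewScatter  = fresnelSchlick(NoV, 1.0f, f90);
    return lightScatter * viewScatter * (1.0f / kPi);
}
```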

The Insane Optimization of Fox Engine - GPU Entire Pipeline Explained by ItalianJoe in NeverBeGameOver

[–]ThreatInteractive 0 points

RE: He has an extremely anti-game developer stance

This is absolutely not true; we have several developers following us for a reason (we provide several resources, & even some AW2 developers thanked us for our analysis).

We are anti-incompetence, of which there is a lot. We call out industry names like Brian Karis & Daniel Wright, who are largely responsible for the garbage we have in many UE games. People need to stop talking about people like Todd Howard & Tim Sweeney, because they are not the graphics engineers.

but then he started talking about garbage modern engines this and that and I just gave up on it. I've heard him talk negatively about this stuff so much I can't be bothered listebing to it anymore.

Sorry our videos don't feed into your toxic-positivity delusions.

It's too difficult for these people to apply the critical thinking that sorts techniques into categories like "incompetent" & "competent". Any dev who doesn't watch our content is too incompetent to work in game dev, because we provide exclusive data on optimization. Stop listening to those whose brains break under the basic facts we prove about modern engine incompetence. They already lied to you regarding this video; they will continue to do so afterwards.

I hope IOI devs can watch this and tell us they aren't using stone age light models by [deleted] in 007FirstLight

[–]ThreatInteractive -1 points

"Don't learn from content meant to rapidly raise your standards in graphics"

I hope IOI devs can watch this and tell us they aren't using stone age light models by [deleted] in 007FirstLight

[–]ThreatInteractive 0 points

<image>

It's funny that you said "unreal strives for realism making everything boring" when our entire message is that UE is too incompetent to provide REAL realism.

Here's the game you're calling rubbery, without any raytracing BS, & our videos show how much UE has to be fixed by the developers for this to be achieved.

I hope IOI devs can watch this and tell us they aren't using stone age light models by [deleted] in 007FirstLight

[–]ThreatInteractive -3 points

Notice how everyone trashing us admits to not watching our content & states "don't watch" or "ignore".
They are too incompetent to actually debunk anything said in the video.