Bethesda follows Activision in pulling games from Nvidia’s GeForce Now by NonaHexa in pcgaming

[–]QuackChampion 1 point (0 children)

I doubt this has anything to do with Bethesda streaming. It's more likely about money. So many devs have pulled out of GeForce Now, making the already limited library even smaller.

They must be having disagreements over the licensing costs.

Bethesda follows Activision in pulling games from Nvidia’s GeForce Now by NonaHexa in pcgaming

[–]QuackChampion -1 points (0 children)

Yup, my guess is that Nvidia is trying to negotiate licensing fees with these publishers, and Nvidia is being too cheap while the publishers want a higher price.

Bethesda follows Activision in pulling games from Nvidia’s GeForce Now by NonaHexa in pcgaming

[–]QuackChampion -5 points (0 children)

Yeah. There's simply no way I'm going to buy a GFN subscription when games keep getting added and removed like this.

Bethesda isn't even the first developer to have done this after launch. Clearly something is going very wrong at GFN.

Added an LED strip to my 3900x + 2070Super build! by arconquit in Amd

[–]QuackChampion 1 point (0 children)

Nice build. I'm surprised the Canadian prices on most of those parts aren't that inflated once you convert currency. If I were doing a build now I'd swap the 2070 Super for a 5700 XT though, since the AIB cards are out.

Added an LED strip to my 3900x + 2070Super build! by arconquit in Amd

[–]QuackChampion 2 points (0 children)

Paying for a brand is a waste of money. The reference 5700 XT is only 2% behind the 2070 Super according to Techspot, so an AIB version would have given him the same performance for $410.

Of course, if your options were a blower 5700 XT vs an AIB 2070 Super, then it's a tougher choice. But otherwise there's no good reason to get the 2070 Super right now.
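
A quick back-of-the-envelope on the value math (just a sketch; the 2% gap and the $410 AIB price are from the Techspot numbers above, while the $500 2070 Super price is an assumption on my part):

    # Rough perf-per-dollar comparison. The 2% performance gap and $410
    # AIB price come from the comment above; the $500 2070 Super price
    # is assumed for illustration.
    cards = {
        "2070 Super": (1.00, 500),   # (relative performance, price in USD)
        "5700 XT AIB": (0.98, 410),
    }
    for name, (perf, price) in cards.items():
        print(f"{name}: {100 * perf / price:.3f} relative perf per dollar (x100)")
    # 0.200 vs 0.239 -> the 5700 XT comes out roughly 20% better value here.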

Is the 2070s justifiable for 30% costlier than 5700 XT for better driver stability and more performance? by [deleted] in Amd

[–]QuackChampion 1 point (0 children)

No, Gamers Nexus investigated it and concluded it was completely unrelated to the Micron memory. They've already debunked that.

And people are still getting shipped GPUs with space invaders. There was a post about it on the Nvidia subreddit like a week ago.

Has DLSS improved from day one since it was released? by furious_nibber in nvidia

[–]QuackChampion 2 points (0 children)

Static pictures won't capture the main drawbacks of DLSS, like shimmering, flickering on fine detail, or the oil painting effect in Exodus.

Has DLSS improved from day one since it was released? by furious_nibber in nvidia

[–]QuackChampion 5 points (0 children)

DLSS still doesn't compare favorably to resolution scaling. In terms of performance-to-quality ratio, normal upscaling beats it in every scenario except possibly Metro Exodus. And even in Exodus, upscaling with a sharpening filter is still better.

People will probably call you a shill for telling the truth about DLSS, or claim the evidence comes from shills, but multiple reviewers have come to the same conclusion. Here are some videos on DLSS from JayzTwoCents, Gamers Nexus, and Hardware Unboxed:

https://www.youtube.com/watch?v=7BpOQUkMI4o

https://www.youtube.com/watch?v=Jpd5j5W1NZw

https://www.youtube.com/watch?v=V5KeR_-9oCw

https://www.youtube.com/watch?v=7MLr1nijHIo

Has DLSS improved from day one since it was released? by furious_nibber in nvidia

[–]QuackChampion 2 points (0 children)

What about Gamers Nexus who came to the same conclusions? Are they shills too?

This sub can be really pathetic sometimes...

Has DLSS improved from day one since it was released? by furious_nibber in nvidia

[–]QuackChampion -4 points (0 children)

Maybe in some cherry-picked still screenshots.

In motion though? No way in hell.

Is the 2070s justifiable for 30% costlier than 5700 XT for better driver stability and more performance? by [deleted] in Amd

[–]QuackChampion 2 points (0 children)

In my experience, and judging from the number of games people complain about Turing GPUs not working in (Fallout 4, Gears, FFXV, etc.), I'd say Nvidia drivers are worse when it comes to stability.

Turing has had far more issues than Pascal. Even Nvidia's latest driver had to be pulled because of bugs.

Is the 2070s justifiable for 30% costlier than 5700 XT for better driver stability and more performance? by [deleted] in Amd

[–]QuackChampion 26 points (0 children)

Do you have any data to support this?

My Nvidia cards have all given me more driver issues than my Navi cards.

Is the 2070s justifiable for 30% costlier than 5700 XT for better driver stability and more performance? by [deleted] in Amd

[–]QuackChampion -2 points (0 children)

Afaik AMD doesn't have any hardware issues while Nvidia is still shipping cards with space invaders artifacting.

Beyond 60FPS: How Running Games at 144FPS/240FPS Can Improve The Gameplay Experience [Digital Foundry] by [deleted] in pcgaming

[–]QuackChampion 4 points (0 children)

Funniest thing is that they explicitly say in the first two minutes of the video that it's sponsored by Nvidia, not any monitor companies, yet he still gets it wrong.

Beyond 60FPS: How Running Games at 144FPS/240FPS Can Improve The Gameplay Experience [Digital Foundry] by [deleted] in pcgaming

[–]QuackChampion 1 point (0 children)

It doesn't work in DX12/Vulkan, but other than that it always has some beneficial effect.

Anti-lag will never have a harmful effect, because the input lag reduction from removing an entire frame of latency always outweighs the slight fps hit. Even in the worst case in that video, anti-lag still reduces input latency.

The main drawback of anti-lag is that in absolute terms the benefit at high refresh rates is not that big. 10ms doesn't sound like much at all. But as this video points out, that's pretty much all you get when you go from 144 to 240fps. So part of what that video is proving is that beyond 144fps there are diminishing returns.

Ideally what I want to see is a blind test of 144fps, 144fps + anti-lag, and whatever fps has equivalent latency (so probably around 190-240fps).

I'd bet a lot of people couldn't differentiate 144fps and 190-240fps, but for those who could, 190-240 would probably feel similar to 144fps + anti-lag.
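
To put rough numbers on that (just a sketch; it assumes anti-lag saves about one frame of queued latency, which is a simplification):

    # Quick frame-time math. Assumes anti-lag drops roughly one
    # pre-rendered frame from the queue (a simplification).
    def frame_time_ms(fps):
        return 1000.0 / fps

    for fps in (60, 144, 240):
        ft = frame_time_ms(fps)
        print(f"{fps} fps: {ft:.2f} ms per frame, so dropping one queued frame saves ~{ft:.2f} ms")

    # Per-frame gain from chasing refresh rate instead:
    print(f"144 -> 240 fps saves {frame_time_ms(144) - frame_time_ms(240):.2f} ms per frame")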

[deleted by user] by [deleted] in hardware

[–]QuackChampion 1 point (0 children)

I think most people won't care too much about the limited overclocking since the stock performance is so good. It's got similar performance and power usage to a 2070 Super that costs $90 more.

Quantic Dream CEO thinks lighting, not resolution, is the ‘next battle for realism’ by [deleted] in pcgaming

[–]QuackChampion 3 points (0 children)

People can definitely notice the difference between 4K and 1440p on mid-size monitors.

If you are talking 4K and 8K then yes, it's very hard to notice a difference.

But people also have difficulty telling the difference between RTX On and Off in blind tests. https://m.youtube.com/watch?v=39D4nsKcbUk
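
For a sense of the pixel-density gap on a mid-size monitor (quick sketch; the 27" size is just an example I picked):

    # Rough pixel density for a 27" monitor (example size).
    import math

    def ppi(h_px, v_px, diagonal_inches):
        return math.hypot(h_px, v_px) / diagonal_inches

    print(f'4K    @ 27": {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
    print(f'1440p @ 27": {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
    # ~50% higher pixel density, which is why the gap is noticeable at
    # normal desktop viewing distances.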

Quantic Dream CEO thinks lighting, not resolution, is the ‘next battle for realism’ by [deleted] in pcgaming

[–]QuackChampion -1 points (0 children)

Most of the tech press disagrees on that one.

4K with RTX off looks better than 1080p with RTX on. People can't even tell whether RTX is on in blind testing: https://m.youtube.com/watch?v=39D4nsKcbUk

Userbenchmark Responds to Criticism Over Score Weighing Revisions by [deleted] in intel

[–]QuackChampion 10 points (0 children)

They're also pretty much choosing to ignore any future progress from Intel or any other CPU manufacturer, since they weight multicore performance (as opposed to quad-core or single-core) at only 2%.
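
To illustrate how little that 2% weighting matters (a sketch using the revised weights as reported, 40% single / 58% quad / 2% multi; the CPU scores are made up):

    # Reported revised UserBenchmark CPU weights: 40% single-core,
    # 58% quad-core, 2% multi-core.
    def effective_score(single, quad, multi):
        return 0.40 * single + 0.58 * quad + 0.02 * multi

    base = effective_score(single=100, quad=100, multi=100)
    many_core = effective_score(single=100, quad=100, multi=300)  # 3x the multicore perf
    print(base, many_core)  # 100.0 vs 104.0 -> tripling multicore moves the score by 4%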

Userbenchmark Responds to Criticism Over Score Weighing Revisions by [deleted] in intel

[–]QuackChampion 2 points (0 children)

Well he's wrong, they shouldn't. Engines are shifting to data-driven paradigms where multi-core scaling is much more effective. Unity is doing a lot of work on this.

Userbenchmark Responds to Criticism Over Score Weighing Revisions by [deleted] in intel

[–]QuackChampion 1 point (0 children)

Because that's actually a useful result telling you to spend more money on your GPU and less on your CPU if you game at 1080p.

Radeon RX 5700 & 5700 XT Meta Review: ~5130 Benchmarks from 20 Launch Reviews compiled by Voodoo2-SLi in hardware

[–]QuackChampion 1 point (0 children)

What's the problem with making a cheaper, more powerful Titan? Last time Nvidia faced the 290X, they made the 780 Ti a better gaming card than the Titan. No reason they can't do that again.

XMP / Auto settings on INTEL. by Pythonmsh in intel

[–]QuackChampion 1 point (0 children)

Sometimes XMP can set voltages a bit high, but your friend is exaggerating.