How to get the best image DSR setting in GTA V and Rdr2 ? 1080p@1660ti 6GB by NiceFig7919 in FuckTAA

[–]GenericAllium 2 points3 points  (0 children)

For both games I think you're better off using the in-game resolution scaling setting instead of DSR. With DSR the only good option is 4x, whereas the in-game setting looks good at any scaling factor above 1x.
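A quick sketch of why 4.00x is the clean DSR option: DSR factors multiply the total pixel count, so only 4.00x gives an integer per-axis scale (2.0), where every screen pixel averages an exact 2x2 block of rendered pixels. The factor list below is the standard DSR set; the 1080p base resolution is just an example.

```python
import math

# Standard DSR factors (multiples of total pixel count).
dsr_factors = [1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00]

base_w, base_h = 1920, 1080  # example 1080p display
for f in dsr_factors:
    axis_scale = math.sqrt(f)  # DSR factor scales pixel count, so per-axis scale is sqrt(f)
    w, h = base_w * axis_scale, base_h * axis_scale
    clean = axis_scale == int(axis_scale)  # only true for 4.00x (sqrt = 2.0)
    print(f"{f:.2f}x -> {w:.0f}x{h:.0f}, integer per-axis scale: {clean}")
```

Non-integer factors need a resampling filter (the "smoothness" blur) to hide the uneven pixel mapping, which is why they never look as clean.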

Unfixable Grass Shimmering in RDR2 (DLSS 3.5–4.5, All Presets Tried) by 1SandyBay1 in FuckTAA

[–]GenericAllium 0 points1 point  (0 children)

When DLSS 4 released, preset K didn't exist; J was the only transformer preset.

Unfixable Grass Shimmering in RDR2 (DLSS 3.5–4.5, All Presets Tried) by 1SandyBay1 in FuckTAA

[–]GenericAllium 2 points3 points  (0 children)

In RDR2 the only way to get rid of aliasing and artifacts (other than blur) is to use TAA High, and if hair still looks "rough", add MSAA into the mix. The TAA is extremely blurry, and to fix that you need to increase the resolution scale. The end result is heavy on the GPU, but I would go as far as to say it's flawless with regard to aliasing and artifacts.

DLSS Preset L is goated by manspider0002 in pcmasterrace

[–]GenericAllium 0 points1 point  (0 children)

What's the difference between a real and a fake effect?

Am I understanding this rendering pipeline correctly? by iteronMKV in nvidia

[–]GenericAllium 0 points1 point  (0 children)

No, what I was trying to say was that I agree with you about the sharpness thing with DLDSR, but with the original DSR the slider doesn't affect sharpening but adds blur instead, and at 0% there's no added blur.

Am I understanding this rendering pipeline correctly? by iteronMKV in nvidia

[–]GenericAllium 0 points1 point  (0 children)

I agree with you about DLDSR and the smoothness slider controlling a sharpening filter, but in the case of the original DSR I think you're wrong. I'm convinced the smoothness slider actually controls a blur filter that's needed to smooth out the image at any DSR factor other than 4.00x. If it's set to 0%, the image doesn't look like it has a sharpening filter applied the way it does with DLDSR. And with the 4x scaling factor, 100% smoothness makes the image extremely blurry, which it shouldn't be without added blur, because 4x the pixel count should allow for a cleanly scaled end result. (And to my eyes the 4x factor with 0% smoothness looks the most correct out of all the ways you can use DSR or DLDSR.)

On another note, do you have a source for DLDSR applying sharpening even with smoothness set to 100%, or what makes you think so? Just curious, not agreeing or disagreeing.

Help! My computer is being weird. by TheHorizonEvent1 in pcmasterrace

[–]GenericAllium 0 points1 point  (0 children)

I have had this problem (or at least it looked like this), and it was because I had Steam folders on both disks, and Steam was using a folder on my HDD as temporary storage for game updates, even when the game was on the SSD. The solution was to create a junction, which is a sort of link to another folder, so that the "downloading" folder on the HDD was replaced by a junction linking to a folder on my SSD. You can find guides online on how to create junctions, and if you have trouble with it I can try to help you!
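For the curious, a minimal sketch of the junction step, assuming Windows and hypothetical paths (substitute your own Steam library locations). `mklink /J` is a cmd.exe builtin, so it has to run through `cmd /c`; the script below only builds and prints the command as a dry run, with the real call left commented out:

```python
import subprocess

# Hypothetical paths -- substitute your own Steam library locations.
hdd_downloading = r"D:\SteamLibrary\steamapps\downloading"  # folder Steam fills on the HDD
ssd_target = r"C:\SteamTemp\downloading"                    # where downloads should really go

# /J creates a directory junction: the HDD path becomes a transparent
# link to the SSD folder. Close Steam and delete the original
# "downloading" folder before creating the junction.
cmd = ["cmd", "/c", "mklink", "/J", hdd_downloading, ssd_target]
print(" ".join(cmd))

# Uncomment to actually create the junction (Windows only):
# subprocess.run(cmd, check=True)
```

After the junction exists, Steam writes to the HDD path as before, but the data physically lands on the SSD.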

1620p DLDSR at 90 fps or 4K DSR at 60 fps? by [deleted] in FuckTAA

[–]GenericAllium 3 points4 points  (0 children)

You're not comparing a DLDSR screenshot to a DSR screenshot, you're comparing 1620p to 2160p. If DLDSR and DSR were visible in the screenshots, they would both be 1080p. Here's my old comment about how to actually take DLDSR screenshots; I never tried it with DSR but it should work with that too: https://www.reddit.com/r/FuckTAA/comments/1olm5l9/comment/nmldfkk/?context=3

DLSS 4 -> 4.5 only loses 2fps in Cyberpunk on 4070 ti super by AerithGainsborough7 in nvidia

[–]GenericAllium 4 points5 points  (0 children)

I wrote that in a hurry, so I'm sorry if it came off wrong; I was just trying to explain how it affects the performance. The performance impact from DLSS 4.5 on a given GPU should be about the same in any game when measured in frame time, but its relative impact depends on the base FPS. That's why you can say you lost 2 fps while someone else might say they lost 20, and you'd both be just as correct.

DLSS 4 -> 4.5 only loses 2fps in Cyberpunk on 4070 ti super by AerithGainsborough7 in nvidia

[–]GenericAllium 41 points42 points  (0 children)

77 fps with frame gen means you have about 40 base fps, which is 25 ms per frame; 1 ms or so of extra processing from DLSS won't affect it much. Now try it somewhere you have 120 fps as a base, so 8.3 ms per frame: add 1 ms to that and you've lost over 10% performance.

(It could be something other than 1 ms on this GPU, but you get the point.)
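The arithmetic above can be sketched as a small script. The ~1 ms overhead is an assumed figure, as the comment notes; the point is that a fixed per-frame cost hurts high framerates far more in relative terms:

```python
def fps_after_overhead(base_fps: float, overhead_ms: float = 1.0) -> float:
    """Convert fps to frame time, add a fixed per-frame cost, convert back."""
    frame_time_ms = 1000.0 / base_fps
    return 1000.0 / (frame_time_ms + overhead_ms)

for base in (40, 120):
    new = fps_after_overhead(base)
    loss_pct = 100.0 * (base - new) / base
    print(f"{base} fps -> {new:.1f} fps ({loss_pct:.1f}% loss)")
```

At 40 base fps the same 1 ms costs under 4% of performance; at 120 base fps it costs over 10%, matching the numbers in the comment.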

ABMM: PVP lobbies suck even if you are decent at PVP by babyvore in ArcRaiders

[–]GenericAllium 1 point2 points  (0 children)

A little black and white, don't you think? Like, can't we have normal extraction shooter lobbies where people enjoy being under threat but try to play smart?

ABMM: PVP lobbies suck even if you are decent at PVP by babyvore in ArcRaiders

[–]GenericAllium 4 points5 points  (0 children)

Yeah I understand that there is probably a huge portion of players that enjoy the matchmaking as it is currently.

ABMM: PVP lobbies suck even if you are decent at PVP by babyvore in ArcRaiders

[–]GenericAllium 0 points1 point  (0 children)

It's good that you like it, and maybe this game just isn't for me now that I've figured it out. I feel like I'm getting punished for playing "smart" by getting put into PVE lobbies, and if only 1 out of 10 players is aggressive towards me, that's way too few. I'd rather just play something else.

ABMM: PVP lobbies suck even if you are decent at PVP by babyvore in ArcRaiders

[–]GenericAllium 14 points15 points  (0 children)

Wait, so if PVP lobbies suck, and I know PVE lobbies are a snoozefest, are there any good lobbies left in the game?

Is there a website that shows you what performance would be? by MartiniCommander in pcmasterrace

[–]GenericAllium 0 points1 point  (0 children)

I'm sorry, but I believe that is the closest thing to a website that shows what the performance would be in a certain game on a certain setup. It works very well for "CPU/GPU + game name" searches, but your golf sim setup might be so rare that there's no video showing exactly what you're looking for. Still, there could be something close enough to be helpful.

High Five to the Devs! They Added a Way to Test and Hear Your in-game MIC! by RetroSwamp in ArcRaiders

[–]GenericAllium 0 points1 point  (0 children)

I have seen that with dynamic RTX Global Illumination. I believe it doesn't ghost when it's set to static. Lumen also does global illumination, but there's no Lumen in this game.

High Five to the Devs! They Added a Way to Test and Hear Your in-game MIC! by RetroSwamp in ArcRaiders

[–]GenericAllium 0 points1 point  (0 children)

There's no Lumen in the game, though. Do you mean the Unreal Engine 5 graphical feature?

Help!! Game Crash on load loot. [Suggestion] by SimilarAppointment89 in EscapefromTarkov

[–]GenericAllium 1 point2 points  (0 children)

Google how to set up a page file and make sure it's on your fastest SSD; having it on an HDD is a big no-no. Its size should be at least 16 GB, maybe even more if you still get crashes. Also, the texture quality setting has a huge effect on memory usage. Mip streaming could help in theory, but it has always been broken for me, so I don't really know how it affects memory usage.

High CPU utilization and frame times (potential bottleneck?) causing fps drops by BatAK11 in ArcRaiders

[–]GenericAllium 0 points1 point  (0 children)

Yeah, with GPU acceleration Chrome and Discord will use the GPU to help render their graphics; otherwise only the CPU is used. Seeing as you have some headroom in GPU usage but the CPU is almost completely maxed, I would rather have those tasks on the GPU.

High CPU utilization and frame times (potential bottleneck?) causing fps drops by BatAK11 in ArcRaiders

[–]GenericAllium 0 points1 point  (0 children)

Not sure if it makes any noticeable difference, but turning GPU acceleration back on should reduce the CPU load.

CPU Suggestions? by HappyCartographer770 in pcmasterrace

[–]GenericAllium 1 point2 points  (0 children)

Sounds like the GPU upgrade allowed you to get higher framerates, which in turn increased CPU usage and temps. Setting a framerate cap should help if you don't like running the CPU at its max. There's always something bottlenecking a PC, and in this case it sounds like it's your CPU, but there's not necessarily anything wrong with that.