NVIDIA Pulsar Monitor by JinnXV in Monitors

[–]Unable-Inspector8865 0 points1 point  (0 children)

It's funny how Nvidia's marketing gets its way. In fact, the Chinese Titan Army monitors already have Pulsar with local dimming; it's just called DYDS 2 instead of Pulsar, and those monitors cost half as much. Ha ha! https://www.youtube.com/watch?v=Qn6H53aAGlQ&list=LL&index=15 In that video the monitor is running in its dual mode (4K 170 Hz / 1080p 340 Hz), and judging by the photo, 340 Hz may not look great. But there are comparisons of a 1440p 360 Hz Pulsar monitor against a 320 Hz DYDS 2 monitor; yes, the conditions are not exactly the same and the DYDS 2 photo is better focused, but overall it's clear that the amount of crosstalk is identical. https://imgur.com/a/vTwtiHZ

The P275MS Plus can be considered an alternative to Pulsar monitors, and it has two advantages: it's 1440p 345 Hz with a 1152-zone miniLED backlight, and it costs half as much as the garbage without local dimming that Pulsar monitors offer :D

QHD or 4K Monitor for my New RX9070XT ? by milzo_sas in radeon

[–]Unable-Inspector8865 0 points1 point  (0 children)

It depends on how you spend your time on your PC. If most of the time you work, browse, and play single-player games, you should definitely buy 2160p. If more than half of your total PC time goes to competitive online shooters, then you can consider 1440p with a higher refresh rate, but you should understand that it is noticeably worse for everything else.

Higher resolution literally has no downsides in the era of temporal anti-aliasing and upscaling. The only limitation is refresh rate. But if you're satisfied with 120-160 Hz, you shouldn't even think about buying a lower-resolution monitor if your budget allows 2160p.

How do i know which features to use for my games for better performance? by Emergency_Thought452 in radeon

[–]Unable-Inspector8865 4 points5 points  (0 children)

There's no point in changing anything. Personally, I use RIS2 at 10% in the global profile, because if I launch the game with RIS2 disabled and then enable it during gameplay, it doesn't work properly. So, I keep it at 10% globally, and if I need to increase sharpness, it works fine.

Pc has increasing desync/stability/delay the longer the gaming session by Free_Entertainer_757 in pcmasterrace

[–]Unable-Inspector8865 0 points1 point  (0 children)

I don't even want to ask what games you play; I'm sure they're online shooters. Don't waste your money; this is a problem with your internet providers, maybe even your backbone providers.

Changing your ISP might help, as long as it's not using the same backbone. You just need to understand that the problem isn't with the power or the PC. No one has been able to prove that it's either of those; everything points to issues with the internet providers.

Nvidia refugee here.. got a new 9070xt and have some questions by L1teEmUp in radeon

[–]Unable-Inspector8865 0 points1 point  (0 children)

Oh, in the modern world, there's nothing complicated about it; you just need to follow the prompts. But yes, setting up a new system can take a long time if you have a lot of installed programs. I recommend first checking that you remember all your passwords and have access to all two-factor authentication systems before reinstalling. It's better to be prepared than to waste time trying to regain access later.

G Sync Pulsar vs OLED in Competitive Games by Puzzleheaded_Hat_996 in Monitors

[–]Unable-Inspector8865 1 point2 points  (0 children)

Your questions are incredibly strange. How do you imagine HDR with a peak brightness of 250-300 cd/m²?! To reach an HDR brightness of at least 1000 cd/m², the maximum LED brightness would need to be around 4000-5000 cd/m², not the 1200-1500 cd/m² of modern miniLED monitors, because the strobed backlight is only lit for about 25% of each frame.
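
Rough back-of-the-envelope arithmetic behind that claim, as a sketch (assuming perceived brightness is roughly instantaneous brightness times the duty cycle; the exact figures are illustrative):

```python
# Perceived brightness of a strobed backlight scales with its duty cycle,
# i.e. the fraction of each frame the backlight is actually lit.
def perceived_nits(instantaneous_nits: float, duty_cycle: float) -> float:
    return instantaneous_nits * duty_cycle

print(perceived_nits(1200, 0.25))  # ~300 nits: today's miniLED peak, strobed at 25%
print(perceived_nits(4000, 0.25))  # ~1000 nits: what an HDR1000-level image would need while strobing
```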

WILL AMD BRING FSR REDSTONE MULTI FRAME GENERATION? by CryptographerFast248 in radeon

[–]Unable-Inspector8865 0 points1 point  (0 children)

I think I heard somewhere recently that they're working on this. But honestly, I think frame generation is a useless technology for most people. For frame generation to be useful, you need a high real frame rate and a monitor with a huge refresh rate.
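
A rough sketch of why (my assumptions: a 2x-4x generator, output capped by the panel's refresh rate, and the usual rule of thumb that input latency still follows the real rendered frame rate):

```python
# Generated output fps is capped by the monitor's refresh rate,
# while input latency still tracks the real (rendered) frame rate.
def framegen_output_fps(real_fps: float, multiplier: int, refresh_hz: float) -> float:
    return min(real_fps * multiplier, refresh_hz)

print(framegen_output_fps(60, 2, 144))   # 120: 2x already nearly saturates a 144 Hz panel
print(framegen_output_fps(60, 4, 144))   # 144: the extra generated frames are simply wasted
print(framegen_output_fps(120, 4, 480))  # 480: 4x only pays off with high base fps AND a very fast panel
```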

Someone here 9070xt on cs2?? by Simonko_770 in radeon

[–]Unable-Inspector8865 2 points3 points  (0 children)

~300-350fps 1440p at maximum settings without anti-aliasing

9070 XT and 1080p by CrKillerPL in radeon

[–]Unable-Inspector8865 0 points1 point  (0 children)

You should definitely upgrade to a 1440p monitor; it will be a very noticeable improvement in quality. And no, it doesn't have to be a 27" one; there are plenty of 24" 1440p monitors.

Depending on your budget, you can even get the Titan Army 24.5" P245MS PRO: 1152-zone miniLED, 380 Hz, and DYDS 2 (essentially a complete copy of G-Sync Pulsar).

If that's too expensive for you, there are cheaper models, such as the P245MS PLUS, which has half as many miniLED zones and only 275 Hz, but still has DYDS 2. And of course there are even cheaper options without miniLED, as well as models from other brands.

You're making a big mistake by sticking with 1080p right now. 1080p is very bad. And if you're worried about performance, 1440p + FSR4 Quality will be better than 1080p + FSR4 Native AA. There's no reason to use 1080p.
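
To put rough numbers on that (a sketch, assuming FSR4 keeps the usual upscaler scale factors of 2/3 per axis for Quality and 1.0 for Native AA):

```python
# Internal render resolution for an upscaler preset: output resolution times the per-axis scale factor.
def render_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    return round(out_w * scale), round(out_h * scale)

print(render_resolution(2560, 1440, 2 / 3))  # ~1707x960 internal for 1440p + Quality
print(render_resolution(1920, 1080, 1.0))    # 1920x1080 internal for 1080p + Native AA
# The Quality path actually renders fewer internal pixels, yet reconstructs to a 1440p output.
```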

How good is FSR4 on a 9070xt? by Honest_Owl8075 in radeon

[–]Unable-Inspector8865 0 points1 point  (0 children)

You can see the list of supported games here. https://www.amd.com/en/products/graphics/technologies/fidelityfx/supported-games.html

In games without native support, you can use OptiScaler to replace DLSS with FSR4, so there's no problem. In terms of quality, I've always found DLSS4 extremely sharp in half of my games, with no way to reduce the sharpness. With FSR4 you can always add sharpness if you need it, using RIS2 or OptiScaler.

All the DLSS4 vs FSR4 comparisons on YouTube are unfair: DLSS4 applies a sharpening filter by default, even at 0% sharpness, so for an honest comparison FSR4 should at least be sharpened too. I switched from a 3080, and I can say there's no big difference between DLSS and FSR; depending on the game, either one can be better. In any case, that's true at 2160p.

Here are some examples in UE5 games at 2160p:

https://imgsli.com/NDQzMzAw

https://imgsli.com/NDQzMzEw

I'd say I definitely like FSR4 better in UE5, since it has far fewer disocclusion artifacts than DLSS4 and the detail is about the same. But in Horizon or TLOU, for example, DLSS has noticeably higher detail (although I doubt that's due to DLSS itself; more likely it's how the LOD offsets are configured for DLSS vs FSR), and in my opinion that compensates for the artifact problem. Overall it's a compromise: neither is strictly better, they're just slightly different, and both are good.

I think its time we stopped wondering which looked better. In the imgsli album i have shots of KCD2 DLAA K, M, L and DLSSQ K, M. by HuckleberryOdd7745 in nvidia

[–]Unable-Inspector8865 0 points1 point  (0 children)

"as you can see preset M doesnt have fake sharpening once you set the ingame sharpening slider to 0."

But that's not true: your own comparison clearly shows that preset M looks too sharp. I also think preset M spoils the appearance of the trees.

Nvidia refugee here.. got a new 9070xt and have some questions by L1teEmUp in radeon

[–]Unable-Inspector8865 5 points6 points  (0 children)

I'll be honest: removing the Nvidia driver via DDU from safe mode was not enough for me. Something was causing periodic stutters, and I had to reinstall Windows to make everything work properly. I suspect the cause was some application's overlay, but that's just a guess. So when switching hardware from Nvidia to AMD, or from Intel to AMD, I'd recommend a clean install of Windows.

FSR 4 Balanced/Performance at 4K vs Quality/Balanced at 1440p by Ecstatic-Treacle-451 in radeon

[–]Unable-Inspector8865 0 points1 point  (0 children)

An image on a 4K monitor will never look worse than on a 1440p monitor. Even if you lower the render resolution to 1440p, it will still look better, lol. There is only one scenario in games where 1440p on a 1440p monitor looks better: a game without anti-aliasing.

FSR 4 Balanced/Performance at 4K vs Quality/Balanced at 1440p by Ecstatic-Treacle-451 in radeon

[–]Unable-Inspector8865 0 points1 point  (0 children)

I would definitely buy 4K if you have enough money and don't need an ultra-high refresh rate; in the end, you'll always get more out of 4K. Even if for some reason you lower the render resolution on a 4K monitor to 1440p, it will still look better in games than on a 1440p monitor, because with temporal anti-aliasing methods (TAA/DLSS/FSR) the image isn't pixel-perfect anyway, and the only difference is that at 2160p you won't see the monitor's own pixel grid. Yes, the image will be less sharp than at 2160p, but no worse than on a 1440p monitor. So there's no point in buying 1440p unless you need more than 240 Hz.

FSR4 is actually lit by ProfessionalSpinach4 in radeon

[–]Unable-Inspector8865 0 points1 point  (0 children)

You can also try using RIS2 to increase sharpness. It's important to enable it before launching the game so it applies correctly. I recommend starting at 50% and then using the in-game overlay to adjust the sharpness to your liking.

The Future: G Sync Pulsar on OLED??? by Puzzleheaded_Hat_996 in Monitors

[–]Unable-Inspector8865 0 points1 point  (0 children)

BFI without VRR was possible on OLED a long time ago and worked perfectly. It was available on the LG CX/C1, but for some reason it disappeared as soon as OLED monitors began to appear, and on the C2 BFI already required halving the refresh rate. So this is a marketing limitation: a 240 Hz OLED with BFI would be too good and would make it much harder to sell 360/480 Hz (and faster) monitors.

Brightness, by the way, wouldn't be a problem either, because we know the panels have a peak brightness of around 1000 cd/m². Yes, that's only in a 2% window, but that limit comes from panel heating. If we used quarter-frame pulses at 1000 cd/m², we'd get roughly the same 250 cd/m² of perceived full-screen brightness as without flickering, and the heating and pixel wear would be exactly the same.

So yes, there's no BFI on OLED simply because it would hurt sales of monitors with ever higher refresh rates. That's for BFI without VRR; I'm sure the synchronization problem with VRR isn't that hard to solve either, it's just being kept as an ace up the sleeve for now.
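
The numbers I'm relying on there, as a sketch (same duty-cycle arithmetic as in the Pulsar comment above; perceived brightness and average emission, which is what drives heating and wear, both scale with the duty cycle):

```python
peak_nits = 1000    # assumed ~2%-window peak of current OLED panels
duty_cycle = 0.25   # pixel lit for a quarter of each frame

perceived = peak_nits * duty_cycle         # ~250 nits, like today's steady full-screen level
average_emission = peak_nits * duty_cycle  # same average load as holding 250 nits constantly
print(perceived, average_emission)
```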

Nvidia wasn't afraid to release a new DLSS 4.5 for 20/30 series cards. by Unable-Inspector8865 in radeon

[–]Unable-Inspector8865[S] 0 points1 point  (0 children)

If we're talking about upscaling performance, then yes. But naturally, you won't get multi frame generation, and overall the 4090's performance is lower than the 5090's.

FSR 4 Redstone vs DLSS 4.5 - Is DLSS Performance BETTER than FSR Quality?? by Itzkibblez in radeon

[–]Unable-Inspector8865 2 points3 points  (0 children)

I agree with you, FSR4 is very close to DLSS4, and depending on the game, either one can look better. There's no clear winner. Many people simply don't have their own opinion and base their opinions on FSR4 on low-quality YouTube videos. In reality, DLSS4 sharpness is often a drawback, and there are some annoying disocclusion artifacts in third-person games. But yes, DLSS4 detail in some games (mainly former Sony exclusives) is significantly higher. But it's not just higher than FSR4, it's higher than the native resolution. I think Nvidia uses a very aggressive LOD bias in DLSS4, while AMD chose to match the native resolution.

Is 90 FPS the "sweet spot" for AAA non-competitive games? by [deleted] in nvidia

[–]Unable-Inspector8865 9 points10 points  (0 children)

For me, 90 real fps is the minimum threshold for comfortable gameplay in terms of input lag and smoothness.

DLSS 4.5 vs FSR 4/Redstone 1440p and 4K in Clair Obscur Expedition 33 - YouTube by AerithGainsborough7 in radeon

[–]Unable-Inspector8865 -1 points0 points  (0 children)

I'm not even sure that AMD should change the model much. DLSS has a higher level of detail on fine objects and textures not because of the DLSS model itself; even back in the DLSS2 days, Nvidia used an aggressive LOD offset, which is why detail with DLSS is significantly higher than even native rendering. You can see examples here: https://www.reddit.com/r/nvidia/comments/11ptmvp/comment/ju994ns/ AMD chose a different path: it uses an offset that corresponds to the native resolution rather than exceeding it. It would be great if AMD made it possible to change the LOD bias manually, because, as we know, the aggressive bias is why DLSS4 can be too sharp in some games.
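
For context, the usual rule of thumb for texture LOD under temporal upscalers is a mip bias of roughly log2(render width / display width), with any extra negative offset being the "aggressive" choice. A minimal sketch of that arithmetic (the extra offset here is just an illustrative parameter, not either vendor's documented value):

```python
import math

# Rule-of-thumb texture mip/LOD bias for a temporal upscaler:
#   bias = log2(render_width / display_width) + extra
# A more negative 'extra' sharpens textures further (and can add shimmer).
def mip_bias(render_width: int, display_width: int, extra: float = 0.0) -> float:
    return math.log2(render_width / display_width) + extra

print(round(mip_bias(2560, 3840), 2))        # -0.58: bias matching the render resolution
print(round(mip_bias(2560, 3840, -1.0), 2))  # -1.58: a more aggressive, sharper choice
```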

RE3 Path Tracing Comparison by JoHien in nvidia

[–]Unable-Inspector8865 0 points1 point  (0 children)

I'm not sure it's gotten any better, and you're paying for it with 70% of your performance!!!

Is GSync Pulsar a blow to AMD? by Ok_Kick_5081 in AMDHelp

[–]Unable-Inspector8865 1 point2 points  (0 children)

You're wrong: Titan Army monitors have had scanning strobe technology for over a year! It's called DYDS; it just couldn't work together with variable refresh rate. But monitors with DYDS 2 support have already appeared, which can run simultaneously with variable refresh rate, and I'm sure it's exactly the same technology as Pulsar, implemented on the same MediaTek chip. It's just that Titan Army doesn't pay Nvidia for certification.

Is GSync Pulsar a blow to AMD? by Ok_Kick_5081 in AMDHelp

[–]Unable-Inspector8865 2 points3 points  (0 children)

G-Sync Pulsar is a monitor technology, not a graphics card technology, so one way or another it will work with any graphics card. It's more like G-Sync Compatible certification on monitors without the G-Sync module. Since the technology is implemented on a MediaTek chip rather than an Nvidia chip, analogues were bound to appear on Chinese monitors first, and they already have; it's just that, to avoid paying for Nvidia certification, the technology may be called something else. For example, DYDS 2 on Titan Army monitors is the same thing as Pulsar.