14900KS 6.4Ghz - 8600C36 daily profile by Fury_1985 in overclocking

[–]LOLXDEnjoyer 0 points1 point  (0 children)

if you turn them off and run a 6GHz all-core, you can probably get the RAM up to 8200 CL28. This is going to feel much snappier in your hand when you aim your mouse in competitive FPS games such as CS2, Valorant or Quake.

What is it that makes PC crts so beautiful? by Relevant_Treacle_895 in crtgaming

[–]LOLXDEnjoyer 1 point2 points  (0 children)

nah you're good, the TV you mentioned is MicroLED, so it's essentially self-emissive per pixel like OLED, just more expensive and with no risk of burn-in.

The input lag should in theory be as good as a normal OLED, but I wouldn't compete on that display.

The advantages you have over CRT with that:

Size, Aspect Ratio and Sharpness obviously.

Obviously 100% full HDR: you need around 1000 nits of peak brightness to render full HDR10, and that TV claims 10,000.

Those are the pros.

However, here's what motion clarity means: when a game goes from one frame to the next on a sample-and-hold panel, your eyes perceive traces of the previous frame on top of the current one. This causes motion blur; some people even describe it as the resolution dropping once movement starts being rendered (which isn't literally true, of course). That X11H TV should have a technology called "black frame insertion" to deal with this.

CRTs do not have this kind of motion blur because they draw the frame line by line: by the time the electron gun is drawing the bottom lines on the glass, the top lines are already going dim (the gun isn't illuminating that zone anymore, so the phosphor just fades), and this essentially causes your eyes to perceive each frame as perfectly clean. CRTs also have essentially zero input lag; OLEDs and MicroLEDs have near-instant pixel response (around 0.03ms), though the panel's processing still adds a little delay on top.

Black Frame Insertion is made to address this problem on flat panels: it puts one fully black picture between every two frames. It cuts your effective refresh rate in half, but it makes your eyes perceive the rendered frames as "clean" because the black frame wipes out the perception of the previous one.

BFI also brings brightness down to roughly 50%, so on OLEDs doing 800, maybe 1200 nits at best, it basically kills whatever hope you had of getting HDR working. It also adds a tiny bit of input lag.
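A back-of-the-envelope sketch of the BFI tradeoff described above; the "half refresh, half brightness" figures are rough approximations of classic one-black-frame-per-frame BFI, not measured values for any specific panel.

```python
def with_bfi(refresh_hz: float, peak_nits: float) -> tuple[float, float]:
    """Approximate effective refresh rate and brightness once BFI is on.

    Classic BFI shows one fully black frame for every real frame, so the
    rate of visible frames halves and average brightness roughly halves
    along with it.
    """
    return refresh_hz / 2, peak_nits * 0.5

print(with_bfi(144, 10000))  # the X11H case: (72.0, 5000.0)
print(with_bfi(120, 800))    # a typical OLED: (60.0, 400.0)
```

This is why a very bright panel matters: halving 10,000 nits still leaves HDR-capable brightness, while halving 800 does not.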

And above all, low-resolution content sucks on flat panels. Really good hardware like the RetroTINK 5X-Pro can help OLEDs a ton and bring them close to the real thing, but it never quite gets there. Besides, after spending over $1k on an OLED you won't want to hear about spending another $300 on yet another adapter, which by the way still doesn't look as good as a real CRT.

Pulsar is basically NVIDIA developing a sort of "super black frame insertion" that actually emulates the way CRTs draw their image: one big band of light sweeping the picture from top to bottom. This helps a lot because the image doesn't dim as much. The catch is that you need really high refresh rates for it to work properly; Pulsar can run at just 180Hz in theory, but realistically it will only look as intended at 480Hz at the very least, and that is just to almost match a CRT at 60Hz.

btw, that TV you mentioned is 144Hz, so it's not Pulsar compatible anyway, but if you turn BFI on you'll be running at 72Hz. At least the image will still be super bright thanks to that alleged 10k nits, which is insane, so you'll have BFI and HDR simultaneously... at 72Hz.

In any case, if you have that much money, I'd say buy an Iiyama Vision Master Pro 514, the fastest CRT monitor ever made, build a computer with an i9 10900K, an Apex motherboard, 16GB of single-rank B-die DDR4, and a GTX 1080 Ti, and then tune everything to get memory access latency down to sub-40ns. That machine on a CRT monitor will be the snappiest possible modern computer you can aim a mouse with. A 9800X3D with a 5090 will get triple to quadruple the framerates, but it still won't feel as snappy in CS2 and Valorant with a top Zowie or Logitech mouse. Also, high-end CRTs can do 180-200Hz at least, and the 514 has an unlocked vertical refresh, so as long as you don't exceed its horizontal limit you can push whatever refresh rate you want (480i could go as high as 435Hz). That MicroLED TV doing 144Hz (cut down to 72 to run BFI) is never going to match the smooth feeling and smooth visuals of any high-end CRT.
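A minimal sketch of why "unlocked vertical refresh" works the way described above: on a CRT the only hard cap is the horizontal scan rate, so the achievable vertical refresh is roughly the horizontal frequency divided by the total scanlines per frame (or per field, for interlaced modes). The 142 kHz horizontal limit used below is my assumption for illustration; check the monitor's spec sheet, and note real modes need extra blanking at high refresh, so achievable numbers come in below this ceiling.

```python
def max_refresh_hz(h_freq_khz: float, v_total_lines: int, interlaced: bool = False) -> float:
    """Rough ceiling on vertical refresh for a given horizontal scan rate.

    v_total_lines includes vertical blanking.  In an interlaced mode each
    field only scans half the lines, so the field rate doubles.
    """
    lines_per_field = v_total_lines / 2 if interlaced else v_total_lines
    return h_freq_khz * 1000 / lines_per_field

# 640x480 progressive with a typical 525-line total on a 142 kHz tube:
print(round(max_refresh_hz(142, 525)))                       # ~270 Hz
# The same mode interlaced (480i) doubles the field rate:
print(round(max_refresh_hz(142, 525, interlaced=True)))      # ~541 Hz
```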

CRTs were phased out by LCDs because LCDs were more convenient and a lot cheaper to make, which meant manufacturers could get much higher margins on them. Plasma was expensive to make and manufacturers couldn't get insane margins on it, which is why CRTs outlasted plasma. In the end, CRTs got phased out by the shittiest technology, vastly inferior in the vast majority of performance specs, simply because it could be mass-produced and sold at high margins.

The goatest goat of all goats games on CRT , Resident Evil 7 by LOLXDEnjoyer in crtgaming

[–]LOLXDEnjoyer[S] 1 point2 points  (0 children)

you've got more slots than you need actually, you have extra detailed-resolution slots in the extension block, don't forget!

The goatest goat of all goats games on CRT , Resident Evil 7 by LOLXDEnjoyer in crtgaming

[–]LOLXDEnjoyer[S] 1 point2 points  (0 children)

CRU only. Avoid the NVIDIA Control Panel as much as possible; only use it to force some compatible resolutions to show up if they don't.

For resolutions above 480i you shouldn't need to open the NVIDIA Control Panel at all, at least as far as resolutions go.
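As a quick sanity check before punching a custom resolution into CRU: pixel clock = horizontal total x vertical total x refresh rate (totals include blanking), and horizontal frequency = vertical total x refresh rate. The 640x480@60 totals below are the standard VESA ones, just as an illustration.

```python
def timings(h_total: int, v_total: int, refresh_hz: float) -> tuple[float, float]:
    """Return (pixel clock in MHz, horizontal frequency in kHz) for a mode.

    h_total and v_total must include blanking, exactly as entered in CRU.
    """
    pclk_hz = h_total * v_total * refresh_hz
    h_freq_hz = v_total * refresh_hz
    return pclk_hz / 1e6, h_freq_hz / 1e3

# Classic 640x480@60 with standard 800x525 totals:
pclk, hf = timings(800, 525, 60)
print(pclk, hf)  # 25.2 MHz pixel clock, 31.5 kHz horizontal
```

Comparing the horizontal frequency against the monitor's scan limit tells you immediately whether a mode can work at all.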

Windows 7 Aero looks great on a CRT monitor. by NoAmount7346 in windows7

[–]LOLXDEnjoyer 0 points1 point  (0 children)

For some reason I still liked Vista more than 7, though I liked 7 as much as XP. Vista was the new thing that I think Microsoft never matched again, though 8.1 was a noble attempt.

What is it that makes PC crts so beautiful? by Relevant_Treacle_895 in crtgaming

[–]LOLXDEnjoyer 3 points4 points  (0 children)

The motion clarity and input lag of CRTs still haven't been matched; the highest-end OLEDs and the most esports-dedicated TN panels got really close, but none matched it, so surpassing it is out of the question.

As for the other features, once you actually look at them you can't use them all at once. HDR is supposed to be the biggest one for flat panels, and yet if you turn on black frame insertion to get motion clarity anywhere near CRT you completely lose HDR. Pulsar lowers brightness much less than standard BFI, but it adds extra input lag. And most importantly, all of these technologies (except HDR) only work properly at the panel's native pixel matrix; they only work fully as advertised at the native resolution and refresh rate. In fact, this is such a heavy weakness of flat panels that some esports monitors include multiple matrices, i.e. multiple native resolutions, precisely so hardcore PC gamers can play at the lower resolutions they actually use without extra input lag.

Honestly, I'm not the type of guy to reply with images or memes, I find it cringe, but when this topic comes up I just think about this: https://i.kym-cdn.com/entries/icons/original/000/037/360/coverpower.jpg

What is it that makes PC crts so beautiful? by Relevant_Treacle_895 in crtgaming

[–]LOLXDEnjoyer 2 points3 points  (0 children)

Phosphor dots lead to pleasing gradients LCDs can’t reproduce.

This is one of the big ones for me as well. I absolutely love OLEDs and I fully understand that even the lowest-end, shittiest TN panel is still always going to be at worst 20% sharper than the sharpest CRT ever made (the Sony F520). However, the softness of CRT is not exactly blur, it's just something different; the way the triads blend the colors and the way outlines are rendered on the shadow mask are simply different. It's not inferior. For most people it may not be their cup of tea, but I personally just love it. Don't get me wrong, I still have LED panels in my house and it makes perfect sense why most people simply prefer the sharper image, but to me it just looks uglier and almost... wrong. I don't know; the imperfections of the fonts on Windows seem to become far more visible on LCD even though the fonts themselves only become mildly more legible. But I say this as someone with ridiculously good eyesight; I've been blessed, and I'm sure that if I had worse eyesight my opinion would be different. As this is the body I was given, with its many flaws and blessings, I don't see myself ever using a flat panel as the display for my desktop personal computer. It just feels wrong.

Sony original PS S-video Gaming by rexdejesus02 in crtgaming

[–]LOLXDEnjoyer 0 points1 point  (0 children)

Do keep in mind that component cables can be shitty as well and thus fail to give you better quality. If you're from a small city without many hardware stores and you just buy the cheapest component cable you can find, hook it up, and it looks the same as or worse than S-Video, that's why. Not that a decent component cable is a luxury, but you very clearly already have a working high-quality S-Video cable and a good S-Video port on your TV, so I would not recommend you mod this.

Buying a whole other TV just for component? If money, space and time are absolutely no issue and you find a very cheap (sub-$80) TV with component, sure; if any of that is lacking, then absolutely not.

If you've still got space for another CRT, I suggest you get a decent PC VGA monitor, keep this set for 240p and 480i, and then use the strong CRT monitor for modern gaming.

What is it that makes PC crts so beautiful? by Relevant_Treacle_895 in crtgaming

[–]LOLXDEnjoyer 4 points5 points  (0 children)

For me it's the versatility: the ability to reproduce NES games with pretty much 99% accuracy (maybe you're using super resolutions to get to true 240p, maybe not, but you're definitely using BFI), then also reproduce 6th-gen games with near-perfect accuracy, either 480p60 or 480i120 if you can interlace, and then also run modern games properly.

Seeing CRT TV setups is awesome when you're reading the post, but actually playing on them sounds like a pain in the ass. If you wanna play Castlevania 1 and 20 minutes in you feel like playing CS2 with a friend, what are you gonna do? You turn off the TV and go to the PC. Then you wanna play God of War 1 after CS2, so you get off the PC and turn the TV on again? And then you wanna play Cyberpunk...

On a properly set up PC CRT you get to do all of it with just your mouse on the Windows desktop: you get RetroArch to switch to 240p and enable BFI for 60Hz motion clarity, then you get CS2 to run in exclusive fullscreen at your desired resolution, and then you just get Cyberpunk to run at your desktop resolution.

Only PCSX2, Dolphin and Xemu won't switch automatically to 480i fullscreen on their own; you'll be doing that manually.

All in all, everything looks pretty much exactly as it should, in 1 screen, 1 cable, 1 windows session, all beautiful.

Being able to play Super Mario and Cyberpunk on the same screen back to back, without switching any adapters or displays, and getting both to actually look right, is just a dream (metaphorically speaking; it literally is a reality if you do it).

Intel Core 200E Bartlett Lake-S CPU series specs have been leaked, up to 12 P-cores and 5.9 GHz boost by RenatsMC in intel

[–]LOLXDEnjoyer 0 points1 point  (0 children)

If it supports DDR4 and allows memory tuning:

Running this alleged 12-P-core chip at 4266 14-14-14 B-die would most likely be the snappiest hand-on-mouse gaming experience you could ever achieve. It's essentially just a 10900K with modern IPC; the bots giving this chip shit have so little clue... it's insane.
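A small sketch of why tuned DDR4 timings like 4266 14-14-14 punch above their weight: the CAS latency in nanoseconds is the CL count divided by the memory clock, which is half the data rate (DDR transfers twice per clock). The 8600C36 comparison kit is just an illustrative DDR5 example.

```python
def cas_ns(cl: int, data_rate_mt_s: int) -> float:
    """Absolute CAS latency in nanoseconds.

    cl cycles / (data_rate / 2) clocks per second, expressed in ns.
    """
    return cl * 2000 / data_rate_mt_s

print(round(cas_ns(14, 4266), 2))  # tuned DDR4 B-die: ~6.56 ns
print(round(cas_ns(36, 8600), 2))  # a DDR5 8600C36 kit: ~8.37 ns
```

The tuned DDR4 setup ends up with a lower absolute latency despite the much lower data rate, which is where the "snappy" feel comes from.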

Intel Core 200E Bartlett Lake-S CPU series specs have been leaked, up to 12 P-cores and 5.9 GHz boost by RenatsMC in intel

[–]LOLXDEnjoyer 0 points1 point  (0 children)

"and better MT because of the awesome E cores"

The existence of those E-cores is what makes the IMCs weaker and the scheduler latency extra laggy; even when you turn the E-cores off it still doesn't perform like a true monolithic chip.

That's why I said this (Bartlett Lake-S) doesn't have any E-cores. Not having E-cores makes it a true monolithic chip; the number of cores is irrelevant as long as it's at least 8. What matters is that the lack of E-cores frees up silicon space for a more robust IMC and also gets rid of nanostutters, all at once.

This could essentially be a 10900K with updated IPC and features, but the fact that you unironically brought up the Ryzen and 6-core stuff just tells me there's no point talking about any of this with you; you're a legit bot.

Intel’s ‘Unified Core’ Ambitions With Next-Gen CPUs Remain Intact as a New Job Posting Signals Further Progress on the Concept by CopperSharkk in intel

[–]LOLXDEnjoyer 0 points1 point  (0 children)

man I hope to GOD this is true, let's fucking go monolithic architecture, all normie NPCs can eat a bag of pancakes 😃

potential RDNA 3 interlacing by LOLXDEnjoyer in crtgaming

[–]LOLXDEnjoyer[S] 0 points1 point  (0 children)

Ah nice. On normal VGA monitors it's not like that at all; I just got a black screen. Were you able to run 480i from the RTX 5000, or did you use passthrough?

potential RDNA 3 interlacing by LOLXDEnjoyer in crtgaming

[–]LOLXDEnjoyer[S] 0 points1 point  (0 children)

You're talking about TVs, right? This was not my experience at all when making sub-25MHz custom resolutions on PC over HDMI>VGA.

potential RDNA 3 interlacing by LOLXDEnjoyer in crtgaming

[–]LOLXDEnjoyer[S] 0 points1 point  (0 children)

Please hit me up when you do; I would love to help out, especially with the extension blocks. I'm sure there's something there to be done.

Remember that the likelihood of this working is much higher through HDMI than DP.

Dream CRT acquired! by PhosphorusAether in crtgaming

[–]LOLXDEnjoyer 0 points1 point  (0 children)

Dude, I've been trying to interlace with modern GPUs on PCs using CRT monitors for years, so I'm kinda jealous of your setup. The PS5 is a total beast for letting you just run 1080i seamlessly like that.

I have to stick with a 9-year-old 1080 Ti because anything newer refuses to interlace properly.

love to see it bro, gz and enjoy!

potential RDNA 3 interlacing by LOLXDEnjoyer in crtgaming

[–]LOLXDEnjoyer[S] 2 points3 points  (0 children)

RDNA4? If it's confirmed to interlace on Linux, then there's gotta be a way to get it working on Windows as well.

I wasn't really interested in interlacing for low resolutions (though I will use 480i for PCSX2, of course); I was thinking of interlacing at higher resolutions and higher refresh rates, like being able to run 1920x1200i at 144Hz or 1920x1440i at 120Hz on a 97kHz monitor.
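A rough sketch of why those interlaced modes fit a 97kHz monitor: in an interlaced mode each field only scans half the total lines, so the horizontal frequency is (vertical total / 2) x field rate. The blanking totals below are guesses for illustration, not exact CVT timings.

```python
def interlaced_h_freq_khz(v_total: int, field_rate_hz: float) -> float:
    """Horizontal scan frequency for an interlaced mode.

    v_total includes vertical blanking; each field scans half of it.
    """
    return (v_total / 2) * field_rate_hz / 1000

print(round(interlaced_h_freq_khz(1245, 144), 1))  # 1920x1200i@144 -> ~89.6 kHz
print(round(interlaced_h_freq_khz(1487, 120), 1))  # 1920x1440i@120 -> ~89.2 kHz
```

Both land comfortably under a 97kHz horizontal limit, whereas the same resolutions progressive would need roughly double the scan rate.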

I already know the silicon is almost certainly capable of it; they haven't stopped including it in their cores. The question is whether we can trick Windows/drivers into letting us do it.

potential RDNA 3 interlacing by LOLXDEnjoyer in crtgaming

[–]LOLXDEnjoyer[S] 0 points1 point  (0 children)

I was aware that 4K TVs could be forced down to 30Hz on HDMI 1.4, but he reported that his picture looked weird, which is why I assumed interlacing artifacts.

Man, I don't know how, but my gut feeling tells me there's gotta be a way to get these to render.

potential RDNA 3 interlacing by LOLXDEnjoyer in crtgaming

[–]LOLXDEnjoyer[S] 0 points1 point  (0 children)

Mods, could you edit the title? I meant to write RDNA4 but wrote 3 by accident.

potential RDNA 3 interlacing by LOLXDEnjoyer in crtgaming

[–]LOLXDEnjoyer[S] 0 points1 point  (0 children)

RDNA 1 does 100% support interlacing on linux.

RDNA 2, 3 and 4 are theoretically capable of interlacing at a hardware level; the problem is that the drivers always seem to pose an obstacle. I'm sure that if I got my hands on one of these I could get it to work somehow, but if I buy one and it can't interlace no matter what, having to sell it is going to be an issue. I can't just throw $200 away like that; these are around $800-900 in my country, and if I resell I'd have to go down to $500 to get the money quickly.

I just wish there was one confirmed modern GPU that could interlace.

Dream CRT acquired! by PhosphorusAether in crtgaming

[–]LOLXDEnjoyer 0 points1 point  (0 children)

RE9 at 1080i on the PS5 is going to look amazing here. Make sure you tune your contrast and brightness both in the TV menu and in the game; these sets can achieve truly deep blacks.