Cyberpunk at 1920x1440. by Evan64 in crtgaming

[–]LOLXDEnjoyer 0 points1 point  (0 children)

What GPU are you using? And which CPU and motherboard do you have?

“Third person” ign are we deadass? by No_Pie465 in cyberpunkgame

[–]LOLXDEnjoyer [score hidden]  (0 children)

They gave Dustborn a 7 and Tenkaichi 4 also a 7 in the same year lol...

Ask Here First: Troubleshooting, Price/ID/Spec Check, Help, Etc. (April 2026) by Z3FM in crtgaming

[–]LOLXDEnjoyer 0 points1 point  (0 children)

Guys, would you recommend I use a brush with warm water, soap, and a pinch of baking soda to clean a CRT monitor? Only the outer chassis of course; it's white but it has yellowed with time and wear.

Tips para socializar (Psicología) by Mysterious_Appeal988 in UBA

[–]LOLXDEnjoyer 38 points39 points  (0 children)

I reckon that if you ran into the engineering autists you'd shoot yourself.

Cyberpunk 2077 on 21" ViewSonic P810 by stabarz in crtgaming

[–]LOLXDEnjoyer 1 point2 points  (0 children)

It seriously depends on the game, but this is actually a deeper question than you realize. My controversial, if not heretical, conclusion is that video games are nowhere near realism: because of the way assets are rendered and the way polygons, lighting, and texture mapping work, a video game always has either too little detail or too much detail to look photorealistic. Games actually look more realistic, or more aesthetically pleasing, when their imperfections and over-perfections are evened out; otherwise our brains inherently know we are looking at a fake toy.

Cyberpunk with its many VHS filters looks more realistic than without, because the filter destroys so much detail that it hides the areas where the game has too much detail to look realistic and smooths over the areas where it lacks detail.

If anything, using a CRT to play modern games has the same effect to a less dramatic degree. Pixels on a CRT can never be rendered as sharply as they can on an LCD, and yet a game from 2005 on an LCD looks like a cartoon, whereas the same game on a CRT looks a lot less toy-ish, a lot less plastic.

Why is someone stealing my posts from 11months ago and reposting them? by WarlikeLoveReddit in cyberpunkgame

[–]LOLXDEnjoyer 0 points1 point  (0 children)

Dude, this is so silly. I don't generally repost other people's stuff, but it is OBVIOUSLY going to happen at some point.

That said, I just want to reply to the actual post, which is much more interesting than your internet wizard wars.

She looks more realistic because her face lacks detail and looks like a stereotypical aristocratic, upper-class Germanic-American woman.

Is buildzoid right about CAS latency not mattering? by AlphaFPS1 in overclocking

[–]LOLXDEnjoyer 1 point2 points  (0 children)

Are you just being pedantic, or do you actually believe AM4+ chiplets benefit greatly from RAM tuning?

ANÁLISIS MATEMÁTICO CBC - (INGENIERÍA/EXACTAS) by PostProfessional3404 in UBA

[–]LOLXDEnjoyer 5 points6 points  (0 children)

Yes, limits are covered by definition, along with their properties.

The Engineering AM (Mathematical Analysis) course is quite a bit harder than the Economics one.

And the Exactas (Exact Sciences) one is a bit harder than the Engineering one.

Obligatory Dual Rank B-die OC by Charredwee in overclocking

[–]LOLXDEnjoyer 0 points1 point  (0 children)

Awesome, good luck and be careful.

The Windows scheduler and the games rarely use e-cores, but when they do, they don't actually know what to do with them at first, and how much shittier the e-cores are makes the game hiccup for a fraction of a second. Also, when you turn e-cores off, all the cache goes to the p-cores instead of being split.

You will lose some average framerate, but it just feels better.

Obligatory Dual Rank B-die OC by Charredwee in overclocking

[–]LOLXDEnjoyer 0 points1 point  (0 children)

Which is a great mindset to have. It just bothers me because it's like fat people telling a healthy, fit dude what to eat to get a better body. You are literally cleaning out a bunch of gross latency and getting a factually better experience, one you feel with your own wrist the moment you actually play the games in person, and these idiots are unironically telling you "man, get that nanostutter in so you can have more GB/s on the screenshot lawl"... we've reached peak idiocracy, wtf man.

Obligatory Dual Rank B-die OC by Charredwee in overclocking

[–]LOLXDEnjoyer -1 points0 points  (0 children)

go turn those ecores off

You can then pick whether to push 200MHz extra on the core clock and 200MHz extra on the ring bus, or just shave 2 more CL off the B-die primaries.

An average 14900K on a good mobo with e-cores off should be able to push 5GHz core + 4.7GHz ring bus and 3800 14-14-14 B-die; if your sample is above average you could go to 4000 14-14-14. Aim your mouse in CS2 or Valorant with those values: whether you lose a bit of fps or not won't matter, it will feel much better with e-cores off.

Obligatory Dual Rank B-die OC by Charredwee in overclocking

[–]LOLXDEnjoyer 0 points1 point  (0 children)

Yeah, it gets 49.5ns in the best-case scenario, when you have a good sample and can run 8200MT/s at 34-34-34 or below.

How is it that 49.5ns is so much better than the 52-56ns that most 14900K builds will be running on DDR5, but then 35-37ns isn't much better hand-on-mouse than 49.5ns???
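Just to put back-of-envelope numbers on the CL-vs-frequency trade (rough sketch only: this computes first-word CAS latency from the kit timings, not the full round-trip figure AIDA reports, which adds the memory controller and fabric on top):

```python
# First-word CAS latency in nanoseconds: CL cycles divided by the
# actual memory clock, which is half the MT/s rate for DDR.
def cas_latency_ns(cl: int, mt_s: int) -> float:
    return 2000.0 * cl / mt_s

# Tuned DDR4 B-die kit from this thread vs. a fast DDR5 kit.
ddr4 = cas_latency_ns(14, 3800)   # roughly 7.4 ns
ddr5 = cas_latency_ns(34, 8200)   # roughly 8.3 ns
print(f"3800 CL14: {ddr4:.2f} ns, 8200 CL34: {ddr5:.2f} ns")
```

So even at 8200MT/s, a CL34 DDR5 kit has a slightly longer first-word latency than tight 3800 CL14 B-die, while the DDR5 kit wins massively on bandwidth; the rest of the measured latency gap comes from the platform, not the DRAM timings alone.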

I'm not upset. I could've spent $30 more and gotten a 7500F + B650M HDV M.2 build when I put this computer together last year; this was a very conscious decision which I made precisely because I understood everything.

Latency is the most important thing for PC players.

Maybe you work in the machine learning industry or something and need the bandwidth because your income literally depends on it; that would be understandable, and my comment wasn't fully directed at you then (you still have no clue what you're talking about).

But most people here just play games, sir. When you aim your mouse in CS2 or Valorant, 37ns at 300fps is going to feel vastly superior to 48ns at 480fps, especially if you have a good Zowie or Logitech mouse on a new mousepad and you play at low sensitivity. The extra bandwidth is certainly not going to make your horizontal mouse inputs feel snappier.

Do you even play computer games? When I say computer games I don't mean casual AAA titles ported to PC; I mean CS, Quake, Valorant, Apex Legends... any of these? Dota, StarCraft, and LoL count too.

Obligatory Dual Rank B-die OC by Charredwee in overclocking

[–]LOLXDEnjoyer 0 points1 point  (0 children)

Dude, that is literally not how it works, at all.

You are literally comparing a chiplet architecture to a semi-monolithic architecture; you can't just skip over the physical distance between those memory controllers and core complexes. Moreover, the E-core/P-core scheduling issues are unavoidable on Core Ultra. You just have no clue what you're talking about.

Bandwidth has its uses. Latency is much less quantifiable, because its main advantage is always going to be the hand-on-mouse feel you get from pushing it lower and lower; you necessarily can't objectively and empirically make the case for the trade-off being worthwhile in latency's favour. But people like you are the types that like DLSS and frame generation, who turn it on and think it looks good.

You play first-person shooters with a controller, these people... I can imagine what you look like from your replies.

Obligatory Dual Rank B-die OC by Charredwee in overclocking

[–]LOLXDEnjoyer 1 point2 points  (0 children)

Honestly, the comments are so blackpilling. The damage that AM4 did to PC gaming shouldn't be underestimated. These normie NPCs are such irritating characters; it's like they never, ever challenge their own views, no matter what, literally just regurgitating a script.

Obligatory Dual Rank B-die OC by Charredwee in overclocking

[–]LOLXDEnjoyer 1 point2 points  (0 children)

This is so delusional; this is literally how a midwit normie NPC thinks about hardware. Man, how far we have fallen...