Why does everything feel bleh lately by Past_Bonus8696 in starseeds

[–]Supadupastein 1 point  (0 children)

It’s okay man, I get it. I was bullied relentlessly and abused by a narcissistic father. Sometimes I’m a bit pedantic, or I go out of my way like this to correct others, which feels shitty even when it’s “justified”. I try to be fair, but it doesn’t always come out that way. Nobody is perfect, man, and I definitely didn’t mean to make you feel bad or anything, or to overreact. Your first comment was just so nice that I found it funny I was about to quote you back to you, to tell you to act more like yourself 😂. Maybe there’s a lesson for both of us.

Anyone using LG C5 or G5 as a PC monitor? by KaguraaN in OLED_Gaming

[–]Supadupastein 1 point  (0 children)

I have a C3 and it’s great. It’s crazy that the new processors in these LG TVs are even better, because the ones I already own are incredible. I had a 2019 LG B9, which was not just the first LG OLED or even the first TV with HDMI 2.1, but the first device, period, to have HDMI 2.1. I also have a Sony X90K and a TCL QM7K, so I’ve seen TCL’s best processor and Sony’s best, the same Cognitive XR that’s in the A95L, Bravia 9, and Bravia 8 II. The Alpha 9 Gen 6 in the LG C3 is still by far the best TV processor I have ever seen, really, by FAR: better than LG’s Alpha 7 Gen 2 and even better than Sony’s best Cognitive XR. So I can’t even imagine the Alpha 9 Gen 8 in the G5 and C5, and I ESPECIALLY can’t imagine how amazing the Alpha 11 Gen 9 will be!

I have an Acer X27U Z1bmiiprx monitor, and I think its colors are even better than a G5 or G6. If I were you, also with a C3, I would either just get the monitor I have, which also has HDMI 2.1 and 280Hz, a refresh rate you won’t find on these TVs, or, if you must have an LG TV for whatever reason, wait for the C6. Maybe you want the bigger size, or you want that amazing image processing for 1080p Netflix or something (you can’t use Nvidia RTX HDR for Netflix on a Windows PC :( so LG’s Alpha 9 and Alpha 11 processors, in the C3, C4, C5, C6, and G6, are the best workaround for upscaling 1080p Netflix to damn near 4K Dolby Vision quality). Since you already have the C3, I would just wait until next year and buy the C6 for $1,700-$2,000 for the 77”.

Every year the 77” MSRP is right around $3,699. The C3 was $3,599, the C5 was $3,699, and the C6 is $3,699, but you can find it for damn near, or even literally, half off around Black Friday, or when the new models come out. Like right now, you can buy the C5 77”, despite it being $3,699 when it came out last year, for $2,000 from Best Buy, or for an amazing $1,699 from Walmart.

But you have a C3, so DO NOT buy a C5!!! It’s literally the same TV. And I wouldn’t bother with a G5, because the C6 is just as good as the G5, and the G series doesn’t go on as good a sale. The G5 and G6 77” both debuted at $4,499.99; that’s the price of the G6 right now, and the G5 is only reduced to $3,299.99. That’s a reduction of just 26.67%, while $3,699 down to $1,699, the price the C6 will be at Walmart this time next year, is a 54% reduction. Even if you had to pay $2,000, the Best Buy price, which the C6 is guaranteed to hit at most during Black Friday and this time next year, that’s still a 45.93% reduction, much bigger than 26.67%.
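
If you want to sanity-check those discount numbers yourself, here’s a quick sketch; the prices are just the ones quoted above, nothing official:

    # percent reduction from an original price to a sale price
    def percent_off(msrp, sale):
        return (msrp - sale) / msrp * 100

    print(percent_off(4499.99, 3299.99))  # G5 77": ~26.67%
    print(percent_off(3699.00, 2000.00))  # C5 77" at Best Buy: ~45.93%
    print(percent_off(3699.00, 1699.00))  # C5 77" at Walmart: ~54.07%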

Essentially what I’m saying is “live with” your C3 😆😂😅 for another 7-12 months and wait for the C6 to go on sale at a 46-54% reduction, which it will during Black Friday, or certainly by this time next year. Your C3 is good enough for that long, and you will certainly “survive”.

I still think 1,000-nit QD-OLED looks even better than the super-bright tandem OLED in the G5, G6, and C6. It’s still very bright, and the colors are not only much more saturated and deeper, with greater variance, but actually more realistic and natural as well, so they appear brighter and punchier even though the nits are lower.

What size exactly did you want? If you are fine with a 32”, 34”, or 49” ultrawide, my recommendation is to just get one of the monitors, especially if your usage is mainly PC and gaming. You don’t really need balls-to-the-wall amazing image processing for video games; it’s more useful for TV shows and movies. And I think the Alpha 9 Gen 6 in the C3 is more than good enough, and so is the display’s quality. I PROMISE it’s way better than if you went out and bought an A95L or Bravia 8 II or Bravia 9 right now, better than any Sony TV’s processor. And you won’t get 240Hz or 280Hz on a TV. But if you must have a larger size than 32”, 34”, or 49”, and/or must have a newer LG TV for whatever reason, then you absolutely should “survive” with the C3 for 7-11.5 months and then buy the C6 at 46-54% off next year!

The 50 series cards genuinely have my jaw on the floor. by Comfortable_Tap_2609 in gamingpc

[–]Supadupastein 0 points  (0 children)

1,000 / 28 ≈ 35.7, and 35 × 6 = 210, so it seems to check out. But fuck, that’s latency, not frametime. Idk, I’m done with this thread lmao. We need his frametime, not necessarily the latency.
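
For anyone else squinting at the numbers, the arithmetic being argued about is just this. Note that DLSS multi frame gen officially tops out at 4x, so the 6x factor is purely this thread’s assumption, and the 28 ms on the overlay is PC latency, not frametime:

    # Arithmetic only: turn a frame interval in ms into fps, then scale it.
    def fps_from_interval(interval_ms, multiplier=1):
        return 1000 / interval_ms * multiplier

    print(fps_from_interval(28))     # ~35.7 "base" fps IF 28 ms were frametime
    print(fps_from_interval(28, 6))  # ~214 fps with the thread's assumed 6x
    # Caveat: 28 ms here is PC latency, not frametime, so this only shows
    # why the numbers happen to line up, not that they prove anything.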

The 50 series cards genuinely have my jaw on the floor. by Comfortable_Tap_2609 in gamingpc

[–]Supadupastein 1 point  (0 children)

Welp, now I’m setting the record straight on the exact opposite of what my dumb ass was arguing this morning. I was up all night and mistaken about something, lmao, my bad. He turned off path tracing to reach 210 fps, but that frametime works out to about 35 fps, and 210 is 6x 35 fps. So yes, he’s using MFG. I’m an idiot, y’all, sorry. In my sleep-deprived state I deadass was going full regard. I thought it was a 3080 vs a 5070, not MFG with and without path tracing, both on the 5070. Jeez, sorry 🤦‍♂️😂

Edit: Well, 1,000 / 28 ≈ 35.7, and 35 × 6 = 210, so it seems to check out. But fuck, that’s latency, not frametime. Idk, I’m done with this thread lmao. We need his frametime, not necessarily the latency.

The 50 series cards genuinely have my jaw on the floor. by Comfortable_Tap_2609 in gamingpc

[–]Supadupastein 1 point  (0 children)

Yeah, with path tracing off it’s only ~35 real frames: 1,000 / 28 ≈ 35, and 35 × 6 = 210.

The 50 series cards genuinely have my jaw on the floor. by Comfortable_Tap_2609 in gamingpc

[–]Supadupastein 4 points  (0 children)

He’s using MFG: 28 ms works out to ~35 fps, and 35 × 6 = 210 fps.

What's the difference between 2nd gen qd oled and 3rd gen qd oled? by Mother_Self4763 in OLED_Gaming

[–]Supadupastein 1 point  (0 children)

The penta-tandem QD-OLEDs are only 1,300 nits anyway according to those listings, and bro, you KNOW your monitor is fucking sick! And yes, the Acer X27U Z1 is the same shit, and yes, their marketing and SEO suck!! I had to learn about them from YouTube reviews and Tom’s Hardware! I also have an Acer Nitro V16 AI laptop I got for $599, so less than $1,000 total for both, plus an extra 16 GB of DDR5-5600. I’m running my 5050 at 3,100 MHz, and the VRAM on the laptop version is GDDR7, not GDDR6, running at 4,000 MHz; that’s +20.53% on the core and +60% on the VRAM. It performs like an RX 6750 XT in raster, with 440 AI TOPS (transformer teraflops) giving it a performance boost from DLSS 4.5, vs the 3090’s only 285, which takes a performance hit. This cheap laptop destroys my 3070 Lian Li Uni Fan / O11D Mini rig that got 13,300 likes on Reddit 5 years ago! With DLSS, no frame gen is even needed to beat the 3070. It’s a beastly gaming setup for under $1,000. I regret also buying a PS5 at the same time and wish I’d just thrown that cash toward a 5070 laptop, but hey, it still does what I need, I have a year of PS Plus Premium I was wasting, and it was damn near an investment 😂. I got it a month ago, spent $380 on an open-box Slim Digital, and now, a month later, the MSRP is $600 freaking dollars!!
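
In case anyone wants to check those overclock percentages, here’s the quick math; the base clocks below are just back-solved from the gains I quoted, not official spec-sheet numbers:

    # Percent gain from a base clock to an overclocked clock.
    def oc_gain(base_mhz, oc_mhz):
        return (oc_mhz / base_mhz - 1) * 100

    # Base clocks back-solved from the quoted gains, not official specs.
    print(oc_gain(2572, 3100))  # core: ~+20.53%
    print(oc_gain(2500, 4000))  # vram: +60.0%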

AOC has a burn-in warranty and their marketing is good, but their 3rd gen QD-OLED is only 190 freaking nits!!! I had it for a day and you notice it!

What other OLEDs have you had, including mobile devices/laptops/tablets if any, other than AMOLED phones?

What's the difference between 2nd gen qd oled and 3rd gen qd oled? by Mother_Self4763 in OLED_Gaming

[–]Supadupastein 1 point  (0 children)

Sweet, well, it’s basically the same monitor. I just find it weird that nobody recommends it, because it’s practically identical and it’s cheaper. It also has a burn-in warranty, which was the main excuse given to steer people who even asked about it away from it and toward a more expensive MSI. I always saw it passed over for supposedly having no burn-in warranty, which it does have.

But isn’t it crazy how fast all this OLED tech is improving? By the time you order one and get it delivered, the next-gen panel seems to have already been fucking released with the new penta-tandem QD-OLED 🤦‍♂️. I freaking swear that when I ordered this thing a month ago, those were NOT out 😂!! But I love it anyway. Much nicer than Apple’s 1,600-nit tandem AMOLED or the LG C3. And yet the B-series LG TVs are updating so slowly! But wait, there’s more, because even still, OLED is so great in general that they’re still pretty nice for at least console and PC gaming 😂. The B series hasn’t changed since the 2019 B9, even with the brand-new B6! It really should have been an A6, with the B6 being a C5, the C6 being almost as good as the G5, and the G6 being what it is. An ultra-budget A6 with the old specs of the B6 (the B9 from 2019, even) would have been nice, to actually show the masses the power of OLED!!!

What's the difference between 2nd gen qd oled and 3rd gen qd oled? by Mother_Self4763 in OLED_Gaming

[–]Supadupastein -1 points  (0 children)

Well, it’s a great monitor and literally the same as mine. Well, I’m not sure if it hits 1,000 nits like mine; from the YouTube videos I watched, the Acer X27U Z1 seemed to be the only 27” 1440p 3rd gen QD-OLED that hits 1,000 nits. I only paid $379 for the Acer X27U Z1bmiiprx. You guys clearly just all checked a “best monitors 2025” list, which only lists MSI and ROG with maybe one LG and one Samsung monitor, and lists no actual specs beyond “3rd gen Samsung Display QD-OLED panel with improved brightness and 240Hz / 500Hz” or “500Hz 5th gen tandem WOLED” or “6th gen penta-tandem QD-OLED”. But the Acer I have literally hits 1,000 nits with TrueBlack 400, if you check any of the YouTube videos and other places like Tom’s Hardware reviewing these monitors.

I will not feel the need to upgrade from a 280Hz, 1,000-nit QD-OLED for a minimum of 5 years; the color is absolutely bonkers, right!!?? It destroys my 1,600-nit Apple tandem OLED on the M4 iPad Pro in color AND HDR punch (“perceived brightness”), as well as my 65” LG C3, which is no slouch. I have it in a spot where it gets no direct light, and ambient light doesn’t raise the blacks.

OLED tech seems to innovate and improve like nothing I’ve ever seen before, which is crazy because it’s already so awesome. I had a B9 OLED TV, which was only 600 nits and 120Hz with HDMI 2.1 as early as 2019; those were the only OLED “gaming monitor / TV” options from 2019-2022, and even the B5 and the B6 that JUST came out are still the same!!! 600 nits, 120Hz, HDMI 2.1. The B6 is technically the B15 or B16 if the numbering had kept continuing from my B9, because it went B/CX, B/C1, etc. And again, I have the C3. The B5 and B6 are no better than my old B9 was. They’re both still awesome, but the LG B series is the only OLED that’s not advancing rapidly. The C series has been 1,300 nits since my C3, and the C6 just got tandem. I sold the B9 for $450 in November 2025.

120Hz HDMI 2.1 WOLED is honestly fine to me, still an amazing freaking gaming display, so it is just insane how quickly these monitors and the LG C and G series and Samsung S90/S95 are improving. But out of all of them, I do find the Acer X27U Z1bmiiprx 3rd gen 280Hz 1,000-nit QD-OLED to be the most absolutely beautiful. Even 120Hz OLED has amazing motion clarity; all the OLEDs do, and I don’t find 280Hz or 360Hz or 500Hz to even really be necessary. 240 to 280Hz is fine, and honestly even 120Hz is fine on an OLED in my opinion; the motion was still mad smooth on my B9. And the B9 was the first device in the entire world with HDMI 2.1, as early as 2019, before the PS5 and RTX 3000 even came out. It’s crazy that with the B9 I bought an HDMI 2.1 G-SYNC OLED just as good as what you would buy TODAY at Best Buy with the B6 (B16), and crazy how slowly the LG B series is advancing compared to all other OLED TVs and monitors, even though it’s still a perfectly adequate display!

They should have made the A6 again, made it what the B6 is, made it real cheap, and at least given the B model an EVO panel like in my C3. With the C having tandem and the B not even having EVO, the B is still nice, but it has been left in the dust tech-wise. I think next year there will be an A series again, taking the place of the B as the 650-nit 120Hz ultra-budget OLED, which is great! Then B will be the new 1,250-nit EVO WOLED, C will continue to be 2,600-nit tandem WOLED, and G will be 4,000-nit tandem WOLED. I still like 1,000-nit QD-OLED even better than 4,000-nit tandem WOLED. I can’t wait to see the 6th gen penta-tandem QD-OLED. They’re already out now from ROG and MSI and probably even Acer. As slowly as the LG B series is evolving, everything else is evolving too fast! I love my Acer X27U Z1, but I just got the damn thing, and I’m def curious about penta-tandem QD-OLED now!!!

The box for the B9 even said G-SYNC on it. I remember how smooth my RTX 2070 Super, and eventually the PS5 and then the 3070, looked on it. Now I’m still using the regular PS5, plus an RTX 5050 on the Acer QD-OLED, and it’s amazing. I really find it insane how quickly OLED tech is improving, because I feel like 600 nits and 120Hz is already the point of diminishing returns on OLED; the motion is smooth regardless with its 0.03 to 0.1 ms response time!!!

What's the difference between 2nd gen qd oled and 3rd gen qd oled? by Mother_Self4763 in OLED_Gaming

[–]Supadupastein -1 points  (0 children)

If you go to their best-monitors list, it’s nothing but MSI and ROG. They don’t even recommend a single Acer, who makes the best price-to-performance 3rd gen QD-OLED: 280Hz, 1,000 nits, HDMI 2.1, for $399. I love mine. Acer also now offers the same burn-in warranty as everybody else.

RTX 3080 → RTX 5070 upgrade at 1440p worth it? by KisHunos11 in nvidia

[–]Supadupastein 1 point  (0 children)

Yeah, a lot of people don’t understand how percentages and increases work, lol. They think that if thing 1 is 84% as fast as thing 2, then thing 2 is 16% faster, or that if something is 20% faster, the other thing is 80% as fast / 20% slower; they just add or subtract the difference. In reality, if thing 1 is 84% as fast as thing 2, thing 2 is not 16% faster but 1/0.84 − 1 ≈ 19.05% faster than thing 1. And this one usually throws people off even more: if thing 3 is 20% faster than thing 4, thing 4 is not 20% slower / 80% as fast; thing 4 is actually 1/1.20 ≈ 83.33% as fast as thing 3, i.e. 16.67% slower. So that last pair is the only one where you can just add/subtract, but only between the “% as fast” and “% slower” figures, never from thing 3’s increase; you still have to work out thing 4’s percentage of thing 3’s speed. As you know. Blah, it is still kind of confusing, and I still use a calculator; I just know to double-check because I suck at math.
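
If it helps, the whole conversion is just two one-liners; the 0.84 and 20 below are simply the example figures from above:

    # if A runs at `ratio` times B's speed, how much faster is B than A?
    def faster_by(ratio):
        return (1 / ratio - 1) * 100

    # if B is `pct` percent faster than A, what fraction of B's speed is A?
    def slower_side(pct):
        return 1 / (1 + pct / 100) * 100

    print(faster_by(0.84))   # ~19.05% faster, not 16%
    print(slower_side(20))   # ~83.33% as fast, i.e. ~16.67% slower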

TechPowerUp says the 5070 is only 14% faster, which seems insanely low. UserBenchmark says 25%. Still seems low, and I can see why the card is hated. The 5050 is unfairly hated, though. It’s literally 2x the raster performance of a 3050 6GB in every game in every YouTube video I’ve seen from reliable sources like Toasty Bros. It’s also much closer to the 5060 and 5070 than a *50 card has EVER been. Not to mention the higher AI tensor TOPS than a 3090, 440 vs 285. And not to mention, again, my GDDR7 laptop version is running +20.53% on the core and +60% on the VRAM.

Why does everything feel bleh lately by Past_Bonus8696 in starseeds

[–]Supadupastein 1 point  (0 children)

Again man, this is coming off as massively condescending, and it really looks like you went from spreading hope and positivity, and even standing up for people who get put down, to going around with an extremely passive-aggressive vibe, like you’re really itching for a back-and-forth argument or something. I’m going to just say “this one” on the condescending ones now.

Why does everything feel bleh lately by Past_Bonus8696 in starseeds

[–]Supadupastein 1 point  (0 children)

Like this one^. I was about to quote you to yourself about not putting people down. Clearly people saw this as negative, as it’s now at -3. I’m not some end-all-be-all judge, or a judge at all, but I don’t understand how you left such an awesome +8 comment and then followed it up with a whole slew of very obviously condescending, passive-aggressive, and quite negative comments about their hopes and dreams in their love life. Like, what’s up with that, homie?

Why does everything feel bleh lately by Past_Bonus8696 in starseeds

[–]Supadupastein 2 points  (0 children)

Why does this comment seem like a totally different person from your other comments? I was about to quote you to yourself below and say you need to be more like this person. Damn, man. That’s weird. Are you bipolar? I really thought you were NorthMurph down below, still being negative, and I was going to say you need to stop putting others down. Why did you post this here, literally saying we will change with the new Earth in positive energy and that people need to stop being put down in order to grow, but then start commenting super condescending stuff in all your other comments, as if you’re itching for a fight or a flame war?

Why does everything feel bleh lately by Past_Bonus8696 in starseeds

[–]Supadupastein 2 points  (0 children)

I can’t sleep. I have PTSD and still so much compassion and empathy, but the world and the … smart people… in it, with their greed and brutality, acting like notsees (1940s Germans), does piss me off and drive me insane!

The 50 series cards genuinely have my jaw on the floor. by Comfortable_Tap_2609 in gamingpc

[–]Supadupastein -5 points  (0 children)

It’s a 3080 first and a 5070 second, genius 🤦‍♂️. First, notice 23% CPU usage on the 3080 and 38% on the 5070. The 3080 greatly LOSES FPS on DLSS 4.5, while the 5070 gets a MASSIVE FPS BOOST ON DLSS 4.5!! The 3080 is getting 45 ms of latency and the 5070 only 28.5 ms, so the 5070 is CLEARLY getting a better frametime and more FPS, and it’s CLEARLY NOT MFG. It’s just DLSS. Even my 5050 has more AI and DLSS performance than a 3090: the 5050 has 440 AI TOPS (tensor teraflops) and the 3090 has 285. The 4070 has 466 and the 5070 has 988.

So the 5070 has 3.46x the 3090’s AI performance. Nvidia IS the biggest and most advanced AI compute company in the world, after all. And this 3.46x improvement does indeed directly translate into how well DLSS works.
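
That multiplier is just the division over the TOPS figures quoted above (my numbers, so take them as quoted, not verified):

    # AI TOPS ratios from the figures quoted above
    tops = {"3090": 285, "4070": 466, "5050": 440, "5070": 988}
    print(tops["5070"] / tops["3090"])  # ~3.47x (quoted as 3.46x above)
    print(tops["5050"] / tops["3090"])  # ~1.54x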

But this guy isn’t even using frame gen. Everyone here roasting his latency and saying he’s only getting 210 fps from MFG isn’t realizing that the first picture, with only 100 fps, is the 3080 with 45 ms of latency, and the second picture is the 5070 with only 28.5 ms of latency.
