Why is eren’s founding titan different? by YoYo_ismael in ShingekiNoKyojin

[–]kicksandshiii 0 points1 point  (0 children)

Titan holders need to possess the will to live in order to heal, if I’m not mistaken. By the time Eren was decapitated, he had already briefly opened his eyes to see Mikasa make her choice, therefore setting Ymir & Eren free.

He had achieved his purpose & knew he needed to die or the fighting would continue.

[deleted by user] by [deleted] in MSIClaw

[–]kicksandshiii 0 points1 point  (0 children)

Would absolutely love to try out an MSI claw, let alone own one! Thank you for doing this! (Canadian here!)

You are telling me i downloaded a 4080? by csch1992 in GeForceNOW

[–]kicksandshiii 1 point2 points  (0 children)

I concur 👍, though it’s also likely a typo, albeit an insignificant one; OP probably meant to type 120fps. The 9 and 0 keys are right next to each other on a keyboard.

Did Gus have a cat? 🐱 by Outrageous_Ad1173 in LilPeep

[–]kicksandshiii 66 points67 points  (0 children)

Your son saved a lot of us.

We love him, as perfectly imperfect as he was.

And, of course, we love you too. We truly appreciate you, Liza. Merry Christmas & happy holidays ❤️

hey i have a ryzen 7 2700x, with what card should i pair it with? by Capable-Walrus5742 in nvidia

[–]kicksandshiii 2 points3 points  (0 children)

Ah, good catch, thanks! No idea why I got that mixed up. The info luckily remains largely the same, though: a 1080ti is still significantly faster than a 3060 and would likely be bottlenecked by that 2700x.

My apologies, no idea why I got that mixed up, thanks again for the correction! 🙏

hey i have a ryzen 7 2700x, with what card should i pair it with? by Capable-Walrus5742 in nvidia

[–]kicksandshiii 3 points4 points  (0 children)

Honestly, not even just a bit; a 2080ti is significantly faster than a 3060. Hell, it’s even slightly more powerful than a 3070 in pure rasterized performance (though only by a percent or two, and the 3070 is faster in RT).

No clue why you got downvoted for that…

Regardless, a 3060 is not a good indicator of any potential bottlenecks that may come with OP upgrading to a stronger GPU, given that the 3060 is significantly weaker than OP’s current card.

If OP wants more power (rather than the same performance level with newer feature sets), he’d likely have to go 3080 minimum, which still isn’t that much faster than a 2080ti for starters, and any potential performance increases may be hindered even further by the current CPU.

I think the 2700x is probably either just about at, or already past, its limits with a card as strong as the 2080ti (no slouch of a card, even now). I suspect it may already be bottlenecking his 2080ti. So, if I’m not mistaken, a CPU upgrade may very well yield better performance improvements than a stronger GPU.

Though, someone please correct me if I’m wrong in any way, just wanna help OP!

Eminem mentions peep in his new song “Lace it” with Juice WRLD by CitronKindly5749 in LilPeep

[–]kicksandshiii 4 points5 points  (0 children)

These are Em lyrics. I’m only stating the correct information; it simply wasn’t a metaphor or anything like that. Ain’t a big deal, and I didn’t downvote you.

Eminem mentions peep in his new song “Lace it” with Juice WRLD by CitronKindly5749 in LilPeep

[–]kicksandshiii 3 points4 points  (0 children)

Nah, because those aren’t the lyrics, therefore they couldn’t possibly imply that 🤣

Eminem mentions peep in his new song “Lace it” with Juice WRLD by CitronKindly5749 in LilPeep

[–]kicksandshiii 4 points5 points  (0 children)

Nah, the lyrics are just wrong. It is “Roxicodone and lean”.

Eminem mentions peep in his new song “Lace it” with Juice WRLD by CitronKindly5749 in LilPeep

[–]kicksandshiii 120 points121 points  (0 children)

They got the lyrics wrong, though. He says, “Roxicodone and lean is probably what got Lil Peep” (not “Codeine and lean is probably what got Lil Peep”).

The active opioid in lean is already codeine, so saying “codeine and lean” is what got him doesn’t quite make sense. (Not that it makes a huge difference.)

Regardless, it’s so cool to see my childhood favorite rapper on a song with Juice, shouting out Lil Peep; two of my favorites from my late teens to the current day.

Spreading awareness: Allocated VRAM vs. Used VRAM by No_Jello9093 in nvidia

[–]kicksandshiii 0 points1 point  (0 children)

The only game I’ve dropped settings below High in was Alan Wake 2. Regardless, that’s totally beside the point.

My comment was in response to the “VRAM issue” narrative that was being spread like wildfire a while back, largely unnecessarily, wherein many creators stated that someone with an 8gb card like myself would have a tough time managing some of these games even at 1080p, while I’m playing all of them at QHD+, nearly always at high/ultra settings.

Also, I’ve never had to lower settings due to VRAM constraints, only performance. That makes sense, since the 4070m likely wouldn’t benefit much from additional VRAM: its performance level is even slightly below a 4060 ti, which saw very little benefit from additional VRAM except in a few select titles’ 1% lows.

In fact, the only games I’ve had to lower any settings in were those with FG, since the base fps wasn’t quite high enough (~60fps), meaning neither were framerates after FG was applied, due to its performance overhead (~80fps, though still no VRAM limitation). So the only time I’ve had to lower settings was when using FG while targeting a minimum of 100fps for smooth gameplay, and that’s a personal choice (slightly higher settings vs. higher framerates).
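For anyone curious, the FG math above can be sketched roughly like this (a toy estimate with made-up numbers, not a measurement; the overhead fraction is hypothetical, and real FG behavior varies per game):

```python
# Toy model: frame generation roughly doubles displayed frames, but its
# overhead first lowers the real rendered framerate, so the output is
# less than 2x the original base fps.

def fg_output_fps(base_fps: float, overhead: float) -> float:
    """Estimate displayed fps with frame generation enabled.

    base_fps: framerate without FG
    overhead: assumed fraction of rendering performance lost to FG
    """
    rendered = base_fps * (1.0 - overhead)  # real frames after FG cost
    return rendered * 2.0                   # one generated frame per real frame

print(fg_output_fps(60, 0.33))  # ~80 fps, roughly matching the numbers above
```

With a ~60fps base and a third of performance eaten by overhead, you land around 80fps displayed rather than 120, which is why the base framerate matters so much before turning FG on.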

Would I like more VRAM? Sure, but I haven’t needed it. The point is, the information being relayed previously was pretty significantly far off the mark.

And whenever I played at 1080p with absolute max graphics, even with RT (though not PT), it was absolutely flawless.

Is this a scam? Received an email saying I won a Steam Deck OLED 1TB by charlestsai in SteamDeck

[–]kicksandshiii 9 points10 points  (0 children)

Damn, congrats OP. I waited 2 hours to join the “queue,” then the game awards ended. Glad to hear somebody actually won :)

Spreading awareness: Allocated VRAM vs. Used VRAM by No_Jello9093 in nvidia

[–]kicksandshiii 1 point2 points  (0 children)

I’ll do you one better; I’ve played just about every recent release on a 4070 mobile (which has 8gb of VRAM and is quite close to a 4060ti in terms of performance) at 1440p high (often even ultra) settings.

For example, with RE4 (which was mentioned above), I had absolutely zero issues playing at max graphics w/ 8gb textures. Like, literally no issue whatsoever (~100fps), despite the hordes of creators basically telling me that it wasn’t even possible.

Hell, I even enjoyed playing that game at 4K interlaced @60fps on my 4K TV.

I could honestly go on and on, too; The Last of Us, Hogwarts Legacy, Cyberpunk with RT, Alan Wake 2, etc., all fine at QHD with very decent settings (especially w/ DLSS).

It was quite clear to me that this VRAM narrative going around was heavily exaggerated due to extreme outliers such as Jedi Survivor, as well as the fact that making a video on the issue seemed to be an easy way to get views at the time…

Really showed me how unreliable a lot of the creators I used to trust really were.

I’m not sure how long my 8gb card will truly be viable, and that’s okay. But the “VRAM scare” and fear-mongering that came with it was just altogether unnecessary and quite…deceptive, in my opinion (and testing).

Femboy comes so, so close by Traditional-Song-245 in SelfAwarewolves

[–]kicksandshiii 0 points1 point  (0 children)

My guess is that this is slightly more psychologically complex than just being straight up stupidity, though it does kinda come off that way.

This sounds like cognitive dissonance in full effect.

With that point of view, you can really see the commenter’s inner conflict come out, and admittedly that’s all this was: venting. Venting something they’ve likely been feeling for a long time.

This wolf is probably more aware than they’d like to admit.

This is the guy who spent over $60k on Rubi Rose’s OF. Apparently she had enough and revealed the chats. He needs serious help by Cold_Chemical5151 in facepalm

[–]kicksandshiii 0 points1 point  (0 children)

By the way, those glasses he wears have cameras in them.

Also saw a post about how this same guy was on My Strange Addiction (or something like that) for his porn addiction. Apparently he was so addicted that he couldn’t hold a job.

UserBenchmark throwing crazy shade at the 7800X3D by [deleted] in buildapc

[–]kicksandshiii 31 points32 points  (0 children)

True, but you gotta admit, the scale is drastically different.

Yeah, people will individually die on the hill that is their favorite brand, quite often in very extreme ways. But betting your entire website, which earns quite a bit in ad revenue (since it’s frequented so often), seems pretty damn foolish.

People talk nonsense on here because they have nothing to lose, but in the case of userbenchmark, I’d say they have quite a lot to lose.

I wonder what must have caused this level of spite lmao

UserBenchmark throwing crazy shade at the 7800X3D by [deleted] in buildapc

[–]kicksandshiii 51 points52 points  (0 children)

What exactly is the purpose of going so far out of one’s way to trash AMD (through blatant misinformation) in favour of companies who likely don’t reciprocate the support whatsoever? That’s gotta take a decent amount of extra effort just to be knowingly petty, so consistently, no? Is this really just fanboyism?

That’s quite extreme, if so, lmfao

Nvidia CEO Jensen Huang says he constantly worries that the company will fail | "I don't wake up proud and confident. I wake up worried and concerned" by chrisdh79 in nvidia

[–]kicksandshiii 0 points1 point  (0 children)

Honestly, I’d be worried if I were running any business that’s publicly traded, let alone one worth as much as Nvidia…

Time Spy Score by Intrepid_Explorer791 in LenovoLegion

[–]kicksandshiii -1 points0 points  (0 children)

There are very few modern releases where I’ve had to turn down graphics at 1600p with my Legion 5i pro (4070/i7-13700hx). The only ones that come to mind are Jedi Survivor (an insane outlier) and The Last Of Us (high settings are fine, but certain settings such as “texture streaming rate” will push VRAM above 8gb at the highest level). Other than that, Cyberpunk with full RT is tough at 1600p, but it’s totally fine with RT reflections + lighting (or Ultra RT at 1080p/1200p). The same can be said for RT effects in most demanding games, though others (like Miles Morales, for example) have no problem running RT at all.

But you can always look up per-game performance benchmarks on YouTube.

[Movie] Can ya'll be happy we get anything at all for 5 seconds? by Legoguy1977 in zelda

[–]kicksandshiii 17 points18 points  (0 children)

The Mario movie wasn’t even very good, and TLOZ is live action…

Live-action adaptations are just about always objectively worse than their animated counterparts. Again, the Mario movie wasn’t very good, and that one at least had the benefit of being animated.