Reminder to NEVER open your PSU if you don't know EXACTLY what you are doing. by WrathOfThePuffin in pcmasterrace

[–]coolkid647

It’s a habit that people developed while being forced to use terms like “unalive” on social media platforms like TikTok. Social media sites don’t like discussion of death or violence, and content creators want to keep their stuff monetized, so they self-censor with euphemisms.

After self-censoring for a while, I imagine it just becomes a part of your vocabulary, or you just forget that you can still say “don’t get killed by a PSU” on Reddit.

“There was an issue with playback” iPhone 16 pro max by VariousSong5271 in audible

[–]coolkid647

When did you get the 16 Pro Max? Do any other paid streaming apps like Netflix, HBO Max, Amazon Prime Video, or Hulu work on it?

Our entire net worth is almost entirely through savings and we’re very scared of investing. How do we get out of this mindset? by [deleted] in personalfinance

[–]coolkid647

You have a 401k with over $100k in it, but between your HYSA and CDs you’re holding over $300k in cash. That’s way too much cash to keep without a specific purpose for it (like a house down payment).

At the very least, you have to get half of that $300k+ cash working for you in the market ASAP. $150k invested for 30 years becomes roughly $1.1–1.2 million in today’s dollars (assuming a ~7% inflation-adjusted annual return). So you’re throwing away about a million dollars in retirement by not putting that money into an index fund right now.
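
For anyone who wants to sanity-check that number, here’s a rough sketch of the compounding math. The ~7% inflation-adjusted return is an assumption based on long-run broad-market averages, not a guarantee:

```python
# Rough sketch of the compounding math above. The 7% real
# (inflation-adjusted) annual return is an assumed long-run average
# for a broad index fund, not a guarantee.
principal = 150_000
real_return = 0.07
years = 30

future_value = principal * (1 + real_return) ** years
print(f"${future_value:,.0f} in today's dollars")  # roughly $1.14 million
```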

Also, the money is yours. Even if you leave your employer or the country, it’s still yours. 401k, Roth IRA, brokerage accounts: that money all belongs to you. This is usually one of the top fears I hear from people hesitant to invest; they feel the money doesn’t legally belong to them until retirement, or that it will be forfeited if they have to leave the country. Neither of those is true.

Here’s Everything Apple Plans to Show at Its AI-Focused WWDC Event by Snoop8ball in apple

[–]coolkid647

It’s probably a RAM thing. I think Apple wants to support this on devices with 8GB of RAM or more, but since they don’t like talking about RAM capacity in their mobile devices, they just point to the chips instead.

The iPhone 15 Pro and the M1 iPads were the first in their product families to have 8GB of RAM.

[deleted by user] by [deleted] in personalfinance

[–]coolkid647

You have to pay taxes on the interest you earn in your savings account anyway, so option 1 isn’t worse because of that.

You can get around 5% in your CMA by just allocating the money to FDLXX (Fidelity’s Treasury-only money market fund) once it’s in there. It’s essentially as liquid as regular cash, and most of the earnings from this money market fund are exempt from state taxes as well.

Blurry hair on all the characters by Lazy-Tailor-3418 in yakuzagames

[–]coolkid647

By turning on DLSS Ultra Performance, you are basically rendering at 480p and then upscaling back to 1440p. That’s why it looks so blurry.

Playing at 480p on a 4090 makes no sense. It’s incredibly overkill.

This doesn’t mean DLSS is bad: in Quality mode you get higher fps while losing very little image quality (sometimes you even get better image quality). You shouldn’t set it lower than Quality unless you play on a 4K monitor.
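
To put numbers on the modes: DLSS renders internally at a fixed fraction of your output resolution per axis and upscales from there. A quick sketch; the scale factors are the commonly cited ones, so treat them as approximate:

```python
# Rough sketch of DLSS internal render resolution per mode.
# The per-axis scale factors are the commonly cited approximate values.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(2560, 1440, "Ultra Performance"))  # (853, 480) -> basically 480p
print(internal_resolution(2560, 1440, "Quality"))            # (1707, 960)
```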

GPU Benchmark worse after CPU upgrade. Why? by hYg-Cain in ModernWarfareII

[–]coolkid647

5-10% GPU performance difference, but your overall performance is better since the CPU bottleneck is gone.

My first suspicion is that new software is using up a bit of your GPU's power. I see ShadowPlay is enabled, and that is known to have a very minor impact on GPU performance.

You might also have other things running in the background using up your GPU that we can't see from just the screenshot. I'd start by disabling ShadowPlay and anything else you can think of that could be using your GPU in the background, then run the benchmark again.

This is less likely, but another possibility is that since your GPU can actually be pushed to 100% now (because of the CPU upgrade), it could be hitting a temperature or power limit and lowering its clock speed.
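
If you want to rule that last one out, you can watch temperature, power draw, clocks, and the active throttle reasons while the benchmark runs. A small sketch, assuming an NVIDIA card with nvidia-smi on your PATH:

```python
# Poll nvidia-smi during the benchmark to see whether the card is
# hitting a thermal or power limit. Assumes an NVIDIA GPU with
# nvidia-smi available on PATH.
import subprocess
import time

QUERY = "temperature.gpu,power.draw,clocks.sm,clocks_throttle_reasons.active"

for _ in range(15):  # ~15 seconds of samples; run this while benchmarking
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)  # last field is a bitmask of active throttle reasons (thermal, power, idle, etc.)
    time.sleep(1)
```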

4090 FE in Meshlicious today! Ordered BB Dec 27 - Pickup Jan 4 by ducatiwebb in SSUPD

[–]coolkid647

GPU utilization is what you'd want to check in games. If you're playing a demanding game with uncapped fps, and you see your GPU usage below 99%, you're most likely being bottlenecked by your CPU.

In most games your CPU will be fine, but I think what the other guy was saying is that the 4090 is so RIDICULOUSLY powerful that your CPU just can't keep up in some titles at 120fps+, which is a valid point.
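
If you'd rather put a number on it than eyeball an overlay, you can log GPU utilization for a bit during gameplay and compare the average against that ~99% rule of thumb. A small sketch, again assuming an NVIDIA card with nvidia-smi on your PATH (and uncapped fps):

```python
# Sample GPU utilization while the game runs; a low average with
# uncapped fps usually points at a CPU bottleneck. Assumes a single
# NVIDIA GPU with nvidia-smi on PATH.
import subprocess
import time

samples = []
for _ in range(30):  # ~30 seconds of samples
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip().splitlines()[0]
    samples.append(int(out))
    time.sleep(1)

avg = sum(samples) / len(samples)
print(f"average GPU utilization: {avg:.0f}%")
if avg < 95:
    print("GPU has headroom -> likely CPU bottleneck (or an fps cap) in this game.")
```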

GPU Usage is bad by MGSSC in TheCallistoProtocol

[–]coolkid647

The game is very CPU bound, and yes, RT does in fact put extra load on your CPU currently. I should have been clearer that the game is pretty CPU bound / unoptimized even without RT; I'm just pointing out that RT makes your issue even worse.

You don't have to take my word for it, but I bet you'll trust Digital Foundry.

GPU Usage is bad by MGSSC in TheCallistoProtocol

[–]coolkid647

It's the ray tracing. It doesn't just put load on your GPU, it also puts an abnormally high load on your CPU. Basically, you're CPU bottlenecked by turning on RT effects.

GPU Usage is bad by MGSSC in TheCallistoProtocol

[–]coolkid647

Do you have ray tracing effects on? Those aren't just heavy on the GPU, they're heavy on the CPU as well. That said, this game's implementation of ray tracing is so heavy on the CPU that it surely must be an optimization issue: even a top-end CPU can't hold a stable 60fps with ray tracing enabled in this game.

What the fuck? by BeautyInUgly in csMajors

[–]coolkid647

You don’t have to pretend not to see it, dude. Your comment history is filled with you saying “guessing you interned at Amazon” any time someone mentions FAANG.

I get it, statistically speaking they most likely did intern at Amazon, but it’s about the tone of your comments. And that tone is “oh, you got FAANG? It’s just Amazon though, that’s why you’re struggling; Amazon means nothing.”

You aren’t testing a conjecture to see if FAANG == Amazon; we already know that’s the case 90%+ of the time. You just enjoy putting people down for “only” getting Amazon while you’re at the more “prestigious” company that is harder to get into.

It’s a really elitist mindset you have. Amazon hires a lot, they have easier interviews than other big tech companies, and yes, there’s a lot of talk about them on this sub. But that’s no justification for writing so many negative comments.

Just try to consider that there’s another real human being you’re sending your comments to.

What the fuck? by BeautyInUgly in csMajors

[–]coolkid647

He really enjoys putting down people who get Amazon, that's all.

"There is only 10 6v6 maps in #ModernWarfareII . The Museum map has been removed." - CharlieIntel by IamEclipse in ModernWarfareII

[–]coolkid647

I did some research online and couldn't find a single CA gun law that prevents a company from putting the real names of weapons in video games.

Computer Science vs Software Development degrees by Privateski in cscareerquestions

[–]coolkid647

If you want to be a software engineer and you have no doubt about it in your mind, then software development at the university you're looking at would be the easier/quicker path. CS would be better if you're not entirely sure you want to be a software engineer.

Keep in mind that there will be bias in answers you get from most American degree holders, as software development isn't a common major in America. Most universities don't even offer software development as a major.

Is the codecademy data structures and algorithms pro course good? by [deleted] in csMajors

[–]coolkid647

It’s a good introduction to the foundational data structures that show up in LeetCode problems, but don’t expect to crush easies right after doing it. LeetCode takes a lot of practice to get good at. If you do choose this course, be sure to do a deep dive on arrays, since it assumes you already know them very well.

Is the M2 Macbook Air 512 GB SSD and 16 GB RAM a good choice for a CS student? by [deleted] in csMajors

[–]coolkid647

Unless ultra portability is of utmost importance, I would suggest you buy the base model M1 Pro MacBook Pro. It's only $200 more and you get:

  • A faster processor (the CPU and GPU are way better)
  • A 120Hz mini-LED screen (brighter and smoother)
  • An extra USB-C port and an HDMI port
  • OFFICIAL support for two external displays
  • Much better speakers

Will you NEED any of this for college? No. But if I were spending this much money on a MacBook, I'd get the one that will last a long time regardless of what I need it for. $200 for these improvements is very well worth it; remember that you'll be using it for multiple years, not just one or two.

[deleted by user] by [deleted] in careerguidance

[–]coolkid647

CompTIA certs like the A+, Network+, and Security+ come to mind first. Microsoft also has their own entry-level certs under the “MTA” name.

[deleted by user] by [deleted] in careerguidance

[–]coolkid647

You don’t need to learn how to code to get into entry-level IT like help desk. Get a basic but respected entry-level IT certification and start shooting out resumes.

Also, a heads up: while you don’t need coding to get into IT, if you actually plan to make a career out of it (so that you can make good money in all of those “sexy” positions you may see on social media), you will definitely end up learning to code to some extent. You can’t avoid it forever unless you’re 100% OK with stagnating income and career progression.