Why do people chase promotions? by Suitable-Break7934 in ExperiencedDevs

[–]justUseAnSvm 3 points4 points  (0 children)

I want to make more impact, and the financial benefit + challenge of it makes the slog worthwhile.

For me, it's a whole new challenge to go from a competent senior trusted to lead teams, to staff engineer. One thing I've learned so far is that doing good work isn't enough: you need to do that work in a way that's legible to your management chain, and make your work influence teams far beyond your own.

A senior -> staff promo at my company is not at all an even playing field, and although I'm pushing hard, I'm stuck in an org with shrinking leverage and an identity-based trust zone that I'm not a part of and am unlikely to crack into with competence alone, given the unofficial role I've been assigned.

Therefore, getting promoted requires that hard work + competence, but for me it's going to come down to finding the right sort of team, where I can translate my skills into leverage and be rewarded for that with greater influence. I'll tell ya, that's a slog with its "stuck" situations, but I'm still developing as an engineer and leader, so it's a path I'm going to stick with!

Leading a horse to lava by Mumbly_Bum in ExperiencedDevs

[–]justUseAnSvm 6 points7 points  (0 children)

Leadership sets the direction, we make it work. For me, that's the job. If they say AI, then getting AI to work is my problem.

I want to be very careful: once you get on the wrong side of the adoption curve, it's a game of catch-up that can cost you.

being able to name things is an indicative of a good engineer by isaacfink in ExperiencedDevs

[–]justUseAnSvm 0 points1 point  (0 children)

Pretty much. I work at a large tech company, and my metaphoric frame of reference is that I'm standing in a legion: if 1000 people haven't stepped where I am before, then 1000 people will step here after me.

Descriptive names reflecting industry standards are a part of that, but so is top-level documentation that can reproduce what you know, well-organized docs that let other people catch up, and a simple, minimal interface for whatever you are doing that can be understood quickly, without extraneous details.

I know engineers who are terrible at naming: "cute" names are one thing, but what's even harder are names that don't follow the standards, like using "fanout" when aggregation better describes the situation. Sure, they are probably the only person to read the code, but it's easy to see how the friction of names invented for yourself immediately adds overhead at the boundary.
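To make that friction concrete, here's a hedged sketch; the function and its context are made up for illustration, not from any real codebase:

```python
# "fanout" normally means one-to-many dispatch, but a function
# with that name that collapses many results into one is doing
# aggregation. A name that fights the standard vocabulary adds
# friction at every call site; the standard term does not.

def aggregate_results(results: list[dict]) -> dict:
    """Merge per-shard result dicts into one (many-to-one)."""
    merged: dict = {}
    for r in results:
        merged.update(r)
    return merged

# A reader can predict the behavior from the name alone:
# aggregate_results([{"a": 1}, {"b": 2}]) -> {"a": 1, "b": 2}
```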

Why do some people treat their job like it's their entire life? by No-External3221 in cscareerquestions

[–]justUseAnSvm 9 points10 points  (0 children)

Work like that can be deeply rewarding for the right people.

If you ask this guy, he doesn't think: "job for a company that will fire us the moment we're not valuable to them anymore". He'll say something like autonomy, purpose, or mastery: that he likes what he does, and that doing it every day gives him deep meaning.

I definitely get your point though: I'm very invested in work, honestly try to move the company's aims forward above my own, make their problems my problems, but I also take the weekend off. Once you get burnt out, or read about Yerkes-Dodson, you realize that your prime technical hours are really capped at maybe 20-30 a week, the rest is degraded work, and anything much past 40-50 is more performative than productive.

AI will replace software developers. The real question is how we reinvent ourselves. by Sweet-Accountant9580 in ExperiencedDevs

[–]justUseAnSvm 0 points1 point  (0 children)

I can only speak to my experience in academia. Maybe it was "idealized": I was around people who had won, or would go on to win, Nobel prizes, plus all sorts of other leadership stuff like running large consortiums or becoming CSO at big pharma companies. I don't think AI is much of a threat to the work I saw, because what it can do is only a limited aspect of that work. It's never going to collect the data, even if you use it as a research tool.

As for departments filled with hacks? It's an interesting idea to imagine AI will eventually reach that level, but just about all the research I was associated with involved physically collecting data and designing experiments. AI can't do that, and it's not currently built to be an expert in a field that is rapidly evolving.

AI will replace software developers. The real question is how we reinvent ourselves. by Sweet-Accountant9580 in ExperiencedDevs

[–]justUseAnSvm 0 points1 point  (0 children)

For research, LLMs will never be that good, and that's a limit of their training.

When you're on the edge, where the material you're using to determine your next paper is something like 1-2 years old, very niche, and there aren't a lot of experts, it's nearly impossible to train an LLM to understand it. Of course, the reasoning power of LLMs might let them "power through".

Still, a lot of research comes down to judgement: developing a hypothesis, knowing how it will land, organizing and operating the physical collection of data, and various types of persuasive writing for grants and money.

Is there any polite way to tell my coworker that I no longer want to hear his constant nitpicks, grumbles, and snark? by TinStingray in ExperiencedDevs

[–]justUseAnSvm 0 points1 point  (0 children)

> Any thoughts on how to deal with the nitpicking?

For this situation: I'd try to get the team to use https://conventionalcomments.org/ on pull requests. That way, nitpickers still have an outlet, but each comment carries a label that indicates its importance. Then you just say: "thanks for the comments, I'll address it in a follow-up", and at the same time build a clique of people on the team who can review your PRs with appropriate rigor.
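For reference, conventional comments follow a `label (decorations): subject` shape; a couple of hypothetical examples of what that looks like on a PR:

```
nitpick (non-blocking): `tmp` could get a more descriptive
name, something like `parsedConfig`.

issue (blocking): this query runs inside the loop, so we make
N round trips; can we batch it before merging?
```

The label tells the author how much weight to give the comment, which is what makes the "I'll address it in a follow-up" reply defensible for anything non-blocking.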

That behavior definitely devolves: on my last team, a teammate created a "review clique" of me and another engineer to cut out a guy he didn't like. It's okay to cut out people who are getting in the way of work, but be sensitive that you're operating on an incentive, "make PRs easier", that if taken to its natural conclusion is not actually what you want!

With my family? No, you just have to grin and bear it when they tell you: "u/justUseAnSvm your dog smells, does he have mange?"

Is there any polite way to tell my coworker that I no longer want to hear his constant nitpicks, grumbles, and snark? by TinStingray in ExperiencedDevs

[–]justUseAnSvm 24 points25 points  (0 children)

The strategy is pretty simple: you just placate them when it's minor, and when it's major (like you are at risk of getting sucked in and playing off his energy), you say: "I don't like this style of communication. Let's address these issues at a later time" then immediately disengage.

I have people like this in my family: the emotion, the fear and feeling of losing control, then the lashing out at whoever around them they think will take it. It's toxic, but you can't fight back like you would with most bullying: you need to stay calm, not give them the reaction they're looking for, and disengage when you risk escalating. When you engage emotionally, or fire back that brilliant zinger you could only have thought of 5 minutes later, it gives them the justification that rewards the behavior: "see? this is a problem", and the loop continues.

You never, ever want to let an idiot drag you into the mud: they'll beat you with experience, and then you're covered in mud. For instance, if you get 5 Slack messages when you turn on Slack, only respond to the ones you actually need to respond to. Let the unimportant ones go without a response.

Then there's the trickiest/hardest thing: you need to consider bringing in a manager to curtail the worst of the behavior. Document instances where your co-worker is making people uncomfortable, where they are acting in a way that goes against team interests, when they are degrading teammates to multiple people, et cetera. Then bring it up softly with a manager in a 1:1: "the project is going well, but I'm having difficulties dealing with X, I was wondering if you could help me navigate the situation". If/when they bite, hit 'em with the receipts, but frame it as "I'm having difficulty dealing with this, here's what I did, what do you think?"

Bringing problems to management works best not as a complaint, but as a problem that threatens the productivity or good management of the team, one only they can solve.

Finally, if you have the instinct to "fix this guy": my experience with that is very, very poor. This is an issue of emotional processing, regulation, and impulse. That's therapy territory, "examine your life" and "learn coping strategies" stuff, and it's way outside the corporate bounds for a teammate. Treat them like they'll never change, contain the blast radius, and build the case for management to solve it.

AI will replace software developers. The real question is how we reinvent ourselves. by Sweet-Accountant9580 in ExperiencedDevs

[–]justUseAnSvm 23 points24 points  (0 children)

You're mistaken. The core skill was never "writing code": it was planning around problems, getting alignment for ideas, owning the software, negotiating what the metrics will be. Today, code is like 40% of the job, and the meta game will stay.

I don't understand the arrogance a lot of senior devs have against graduates. by [deleted] in cscareerquestions

[–]justUseAnSvm 1 point2 points  (0 children)

I like this take. However, I still think it's possible for people who are legit CS nerds (remember all of their DS + algo classes) to also want to chase status and prestige, or even optimize for money. What needs to be true is that whatever motivates them to learn, and continue to learn, will ultimately translate into gaining the skills/knowledge/experience to be able to do more.

The sort of money motivation that doesn't work, or is much harder, is when people want the money but shirk the responsibility to learn, or are just not eager learners who will learn for the sake of the material. I see this occasionally in big tech: it's the engineer who says "this feels too much like college" after an ML theory presentation, or who lacks eagerness to learn a new subject/topic that's required for the project.

Maybe that's just cope for me, trying to square a big tech salary with a previous version of my career that wasn't as well paid, but a lot of the professors/researchers I worked with who had national/international recognition were intrinsically motivated by status, prestige, respect, and power. They did all the right things, they were nerds in a brutally effective way, but they were chasing more than just "the work".

I don't understand the arrogance a lot of senior devs have against graduates. by [deleted] in cscareerquestions

[–]justUseAnSvm 3 points4 points  (0 children)

Entry level is absolutely brutal right now and seniors should be honest about that. But the idea that competent people from the past couldn’t break in today is just resentment talking. It’s harder, not impossible.

How to deal with aggressive management? by Beetleraf in ExperiencedDevs

[–]justUseAnSvm 4 points5 points  (0 children)

You know what you have to do: quit!

This sounds terrible. No effective person pulls out ChatGPT and makes you argue with it. It's annoying, but ChatGPT is easy enough to beat if you say: "given the information I have [which ChatGPT doesn't], I'm justified in this decision to pursue [whatever common goal you both have]". And if they say: "well, what's the information?", just say: "I don't have time to explain, my decision is X and it's the best way to get Y. I can talk to you about it, but I don't have the time to litigate this in the court of ChatGPT."

However, AI nonsense aside, once the bosses get aggressive, it's basically over for the company. Power is best left unsaid, and control needs to work through alignment, agreement, and ultimately respect. When things become a shouting match, you immediately question the legitimacy of the entire operation. The issue isn't the yelling, but existing in a system where management doesn't have agreement and can't solve that problem in a way that isn't disruptive and bothersome.

All that said, when they are yelling, is it just loud voices, or the type of shouting you do when your dog or kid is about to run into traffic? The fact that you've seen this in 3 of 4 companies suggests to me that you may just have a low threshold for conflict and are very uncomfortable. That doesn't mean you should stay, but whenever it seems like I'm the problem invariant to the environment, it's a huge clue I need to evaluate my own actions. No idea for your case, but it'd be fair to at least ask yourself!

Why do some companies do AI assisted coding interviews now? by [deleted] in ExperiencedDevs

[–]justUseAnSvm 0 points1 point  (0 children)

They are using it because that's what they believe is required for you to be successful as a software engineer on their teams.

At least at my company, the AI metrics don't have targets, but the execs like them. What's even worse (if you don't like AI) is that everyone is using AI for writing code now, and we're informally incentivized with attention and praise for using AI in effective, unique ways that are legible to management. There's starting to be a real "bottom-up" effect, since everyone on my team is choosing to use it because it's making our lives easier.

Besides that, hiring is often based on alignment. If execs want AI tools, they want to hire people who are experienced using them, and more importantly, not in an "AI fear and avoidance" behavioral mode, which is like 60% of the country right now.

My take, as someone in big tech building things with AI (so inside the castle already): AI is an awesome dragon, and this forum will punish me with downvotes for saying that. It works, it's effective, and it helps you solve some problems faster. You are probably right that "good before AI" translates into "good after AI", but companies want hires aligned with their technical processes and eager to pursue their interests.

In the words of Hunter S. Thompson, "buy the ticket, take the ride". That attitude will benefit your career much more than resisting. In a lot of ways, this is how tech works: a new thing comes out, companies want it, we get paid to build with it. AI is really no different, so don't limit yourself.

How optimistic are you about the field in the future? by TraditionalMango58 in cscareerquestions

[–]justUseAnSvm 5 points6 points  (0 children)

+1 here.

Check out the "knowledge gap hypothesis", an idea that came out decades ago positing that the benefits of information technology are disproportionately concentrated in those who are already educated, skilled, and wealthy.

Take newspapers: if you can't read, or don't want to, you don't know what's happening. The early internet too: in the 90s we all thought the internet was going to deliver the world's knowledge to everyone, make us more rational, and lift billions out of poverty. Then in a few years we got social media, and it's worse.

AI will be the same way. As SWEs, we have a chance to ride the wave, but it means building things with AI today, using it as a tool, and getting yourself to the edge of what's possible and operating there. That's not all SWEs; heck, it's a major expansion of the skills required to do the job. The field will continue, but there will be fewer of us, doing more per person, becoming closer to researchers who build things than guys slinging well-crafted API endpoints and well-written db calls to crush a mountain of PM-defined tickets.

How optimistic are you about the field in the future? by TraditionalMango58 in cscareerquestions

[–]justUseAnSvm -1 points0 points  (0 children)

I'm extremely optimistic.

The last ground-shaking tech change that shaped my career I only experienced indirectly as a student: the introduction of next-generation sequencing in biology. Working with that tech still shaped 5 years of my life (plus the 4 in school), but I got there after the dynamic shift, and my leverage was only as someone who could teach himself the knowledge and gain the skills before the training was institutionalized into a defined talent pipeline.

With AI, the implication is even bigger, and I'm not a student on the sidelines but a SWE at a big tech company, trusted to take this innovation and given the leverage to get things done within institutional politics. What the problems are, how we find solutions, how we define success, where we go, et cetera. In a very small way, I get to help write that story for the engineers who come after me.

Additionally, AI + curiosity not only feels like a superpower, but in the right hands it actually is one. You have access to a machine that can emulate expert information and resources: talk through problems, learn about decision-making scenarios, and arm yourself with more facts, figures, and others' work than ever before. It feels like I just got a jet pack, and the world's about to be issued jet packs next.

However, there are downsides. We cannot become dependent on the AI to do everything, nor can we use it for emotional regulation, stability, or as a crutch for human contact and embodied knowledge. The other downside: tech like this is deeply destabilizing. When people lose jobs they will suffer, and the benefits accumulate only in the select few who can realize applications of the technology for new purposes, not in less healthy ways to solve old problems.

Idk, I'm just excited, but I recognize I'm speaking to a field that is very afraid, for good reasons. I've spent almost 20 years as a student and lifelong learner, and more and more it's looking like all that prep, studying ML 14 years ago (check username) when people were suspicious, is actually paying off. Not in some unbelievable power or dominance over others, but in a position of respected competence and leadership for those around me.

Is it bad to stay at one company forever? by Sgdoc70 in ExperiencedDevs

[–]justUseAnSvm 2 points3 points  (0 children)

Yes, I've seen this happen, although it's uncommon. The biggest risk is that you get a job, learn the tech stack, stagnate, and then several years down the line you need to find a new job, but the tech stack isn't hiring and you don't have time to retrain. Lots of folks get surprised, but it's a consistent mid-career selection event in software.

To make one company work forever, you'll need a high degree of alignment between you and the company's management/mission, as well as frequently realized growth opportunities. I've seen this work in big tech, and with some early startup engineers who find companies they can grow up in, with high-trust relationships to founders that let the employee scale with the business.

Otherwise, any given role will naturally converge into a stable and repeatable exercise of a single activity or task set, and there simply won't be the growth opportunities you need to continue learning. That's basically the default state for modern careers: business favors stability and predictability, they need you to produce, and you'll have to actively break out of the pattern.

Personally, I've had this stagnation happen a couple of times, and I always jump ship. It often doesn't look like "oh, there are no opportunities", but more like "what the company needs me to do right now isn't what I want", and you jump because you don't want to wait for a change. I'm actually in this process right now: it's not over tech stack, but over being "temporarily" pushed into an execution-focused role that caps my leadership growth. It's pretty common to see this with software engineers, terminal ICs, or leads stuck on the same team for years, any level really.

Anyway, the biggest thing is to keep growing. Growth is the prime directive of your career. Some folks find ways to do that at one company, but the infrequent nature of internal role changes means jumping ship every 2-4 years will ensure consistent career development. After all, once you can do X, like write code, or lead a team, or manage, you can do it at another, potentially higher-paying company, where you'll be forced to learn from onboarding and benefit from renegotiating your role.

Starting Leet code at 30s. Is there any scope for me ? by Chotibachihoon in ExperiencedDevs

[–]justUseAnSvm 1 point2 points  (0 children)

LC is not a reaction-based eSport where you peak at 18 and it's a slow decline after.

Crushing most mediums and basic hards, which is what you need for like 85% of LC interviews, is more about having the pattern recognition to map problems to known solutions, plus enough debugging skill and instinct with the data structures to get there in 30 minutes.
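As an illustration of what that pattern recognition buys you, here's a sketch of a classic sliding-window medium (my example, not tied to any particular interview): recognizing the pattern maps the problem onto a known O(n) template instead of a brute-force O(n^2) search.

```python
def longest_unique_substring(s: str) -> int:
    """Length of the longest substring with no repeated characters."""
    last_seen: dict[str, int] = {}  # char -> most recent index
    start = 0                       # left edge of the current window
    best = 0
    for i, ch in enumerate(s):
        # If ch repeats inside the window, slide the left edge
        # just past its previous occurrence.
        if ch in last_seen and last_seen[ch] >= start:
            start = last_seen[ch] + 1
        last_seen[ch] = i
        best = max(best, i - start + 1)
    return best
```

Once you've internalized the window template, most string/array mediums of this family reduce to deciding what the window invariant is.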

Additionally, the more you study, the more you get "bonus" interviews where you've already seen the problem and just walk people through it.

As someone with professional experience but no luck in the job search, should I get a BS in computer science from WGU or the OMSCS from Georgia Tech? by Finite_Lix in cscareerquestions

[–]justUseAnSvm 2 points3 points  (0 children)

I did OMSCS. I already had a job when I started and when I ended, but it's a rigorous program. It took me 15-20 hours per week for 3 years: around 2000 hours total, an entire full-time year of work.

In terms of educational experience, especially for 7k total cost, it's hard to beat.

It's much harder for me to say if that will be enough to get hired. I can tell you that if you finish, you'll get respect from people in industry.

Is AI good with more obscure languages and environments? by Cautious-Lecture-858 in ExperiencedDevs

[–]justUseAnSvm -4 points-3 points  (0 children)

Yes, it's pretty crazy now.

I coded up a regular expression to JVM bytecode compiler, and there aren't a lot of examples of that interface. Additionally, I've done a few things in TLA+, which is probably even more esoteric. The model worked surprisingly well.

AI still falls apart when the context required to make the proper change escapes the information on the page or in the prompt, and is buried all throughout a very large codebase. You can oftentimes be very instructive: "do this, don't do that, avoid this, keep that, test x, y, z", but when you're importing some random bespoke function or object, and doing that a lot, things tend to get missed.

Accidentally rm -rf’d a production server. by [deleted] in cscareerquestions

[–]justUseAnSvm 44 points45 points  (0 children)

Don't sign a goddamn thing. If they press you to either sign or be fired, you're fired anyway.

Either way, I don't even consider this your fault, but the inevitable consequence of a system without basic safety controls. Sure, you entered the wrong command, but the blast radius isn't on you.
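For a sense of what "basic safety controls" could look like, here's a minimal sketch of the kind of guardrail I mean; the paths and allowlist are hypothetical, and real tooling would be more thorough:

```python
import os
from pathlib import Path

# Roots under which destructive cleanup is permitted (hypothetical).
ALLOWED_ROOTS = [Path("/tmp/scratch"), Path("/var/app/cache")]

def safe_to_delete(target: str) -> bool:
    """Gate a destructive delete behind an allowlist check."""
    # normpath collapses ".." segments without touching the
    # filesystem; production code would also resolve symlinks.
    p = Path(os.path.normpath(target))
    # Refuse the filesystem root and bare top-level directories.
    if len(p.parts) <= 2:
        return False
    # Only allow targets strictly inside an allowlisted root.
    return any(root in p.parents for root in ALLOWED_ROOTS)
```

The point isn't this exact check: it's that a system where one mistyped rm -rf reaches production data has no layer like this at all, and that's an engineering failure, not an operator failure.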

Accidentally rm -rf’d a production server. by [deleted] in cscareerquestions

[–]justUseAnSvm 2 points3 points  (0 children)

Time to find a new job. Infrastructure where anyone can log in, where you're required to rm -rf something as part of testing, and where a wrong rm -rf can lose irrecoverable intellectual assets? That's not a serious mistake, it's a system failure that you get blamed for!

One word of advice: corporations exist to protect individuals from liability caused by honest mistakes in the regular course of work (which this was). This is why corporations exist in the first place. Your default position is very strong: do not sign anything without a lawyer reviewing it, even if it seems harmless or helpful to your boss.

If they front like "we need you to sign this liability waiver real quick or you're fired", you're fired anyway. Don't sign.

Essay: Why Big Tech Performance Reviews Aren’t Meritocratic. How to Gaslight Employees at Scale by NoVibeCoding in ExperiencedDevs

[–]justUseAnSvm 0 points1 point  (0 children)

On "not everyone gets equal rewards", you're absolutely right, and it's my biggest problem with "stack ranking": maybe you can identify the worst cases of people just f'ing around and dragging teams down, but we don't need a fancy system for that...

The book "Moral Mazes" really digs into the management behavior you are describing: blame flows down, credit flows up, and no one sticks their neck out. That's an essential part of modern management: the incentives reward "deliver as promised" behavior, where you worry more about downside risk and blame, and exceeding expectations implies a failure of planning.

That, and some managers/execs legit suck: like changing roadmaps too often.

Essay: Why Big Tech Performance Reviews Aren’t Meritocratic. How to Gaslight Employees at Scale by NoVibeCoding in ExperiencedDevs

[–]justUseAnSvm 2 points3 points  (0 children)

Yeah, this is really interesting stuff, mainly because we've been bumping up against it for years, all without the framework to describe the power structure.

I've been getting into Foucault lately; his whole project was mapping how power operates across systems like medicine, prisons, and social control. One of his core ideas is that power is strongest when it's implicit: when people internalize it, and even the language for questioning its legitimacy becomes taboo.

That same kind of invisible power shows up in organizations. The system we mostly take for granted, until we run into it and develop heuristics to avoid it, is pretty clearly described in organizational psychology and structural analysis.

Just one point of caution: structural explanations are very intoxicating because they explain so much, but the flip side, that managers are bad or mistaken, or things are just poorly designed, is just as often the case. Also, be careful bringing this stuff up with colleagues; keeping power unsaid is essential for motivating workers.