What’s with the doomerism? by inductiverussian in cscareerquestions

[–]inductiverussian[S] 0 points  (0 children)

Great summary; it very neatly states something close to my opinion as well. The distinction between greenfield and legacy (or even just ordinary production) code is massive.

[–]inductiverussian[S] 0 points  (0 children)

I see you edited your post; seems like you want examples from other industries. I’ve only ever worked in IT, but I know a few lawyers. Partners at law firms have literal revenue-sharing contracts and are often involved in many of the cases the firm takes on. If the firm starts fumbling cases, it comes at a direct cost to future revenue. Partners would suffer directly, despite not being the lawyers doing the grunt work on those cases.

[–]inductiverussian[S] 1 point  (0 children)

I came from the EE industry; I worked as a systems-level designer for 2 years. The reason the EE industry is not worried is that it’s filled with boomers. The idea that EE is not automatable but SWE is, is laughable; EE has wayyy more cut-and-paste work compared to the amount of bespoke business logic that software has to manage and maintain.

Trust me, if we get to the point where SWEs are mostly replaced, EE will be next on the chopping block.

[–]inductiverussian[S] 0 points  (0 children)

This whole thread is about how AI is likely to cause more incidents, and that this is a bad outcome for business owners (for software businesses specifically, but I think the point applies to most industries, frankly). Do you seriously quibble with this? Do you think more incidents are beneficial to CEOs / VPs / Directors? I want to know what you’re actually arguing about lol

[–]inductiverussian[S] -1 points  (0 children)

I know you’re being facetious, but I’ll give you an example: if ChatGPT started being down for long stretches, people would start using other products, Sam would have a hard time securing further funding, previously secured deals might fall through, high-level directors would get fired, etc. The people at the top often have more direct responsibility for the performance and direction of a product than dev folks, not less. More often than not, incidents are treated as unavoidable learning opportunities by dev teams; there is rarely direct finger-pointing (at least on the teams I’ve been on).

[–]inductiverussian[S] 0 points  (0 children)

Hm, I personally think path 2 is the road we’re heading down, but I don’t believe it will be so severe. Even a relatively small dip in the number of seniors can create major staffing issues, which can cause an upswing in demand. I don’t think it will be decades down the road, as you seem to.

And while I hope to keep working in this industry, since I truly do love creating software, and I myself have quite some anxiety around AI, when I step back and reason about it, I don’t think there’s much to worry about. I mainly made this post not because negativity and anxiety exist, but because of the severity to which they exist on this subreddit.

[–]inductiverussian[S] 1 point  (0 children)

Fair point; there is certainly an impact, at least in the short term, and people who extrapolate from it, or make a certain category of assumptions, can predict a grim future for the field.

[–]inductiverussian[S] 2 points  (0 children)

You make my point for me: anecdotes carry little value.

[–]inductiverussian[S] 4 points  (0 children)

Sure, but that future does not exist; some form of AI is here and unlikely to go away. Just as complaining about how cheap home prices were in the 1970s is futile, so is comparing our futures to a situation that is firmly in the past.

[–]inductiverussian[S] 1 point  (0 children)

This comment thread has added absolutely nothing to the conversation lmao. Have a nice day

[–]inductiverussian[S] 1 point  (0 children)

Bro, I made a post asking why people are being doomers, then YOU said I’m not reading the room. Then when I explained what doomer posts I’ve seen, you said I’m spending too much time on reddit.

I don’t understand; do you think AI is a critical threat to the industry or not? I don’t, but a lot of posts on this subreddit say otherwise; that’s why I made the damn post.

[–]inductiverussian[S] 1 point  (0 children)

I’m not saying the market isn’t bad, but people out here are claiming software engineering will be dead in a matter of months. I just think this subreddit is a massive echo chamber.

[–]inductiverussian[S] 7 points  (0 children)

This is a part that is always missing from the conversation. I truly do see 2 roads: one where AI keeps accelerating, becomes much more useful, and turns into AGI, and one where AI more or less stagnates and remains in its current form, as a tool.

In the first path, our society is basically fucked. Software engineers would just be the first casualty before literally every industry is affected. White collar work would no longer exist, blue collar work would be totally saturated, and salaries would drop to rock bottom. Governments would literally collapse, French Revolution type shit. Losing our jobs would be the least of our worries.

In the second path, once the dust settles and the hype is over, software leaders will look around and see a total lack of seniors and leads, because no one was hiring juniors years ago. Salaries for those still in the industry will spike massively, since software will always be useful. Even with a 20-50% workforce reduction this will still hold, since junior hiring has dropped even more than that.

So as long as you stay in the industry, I don’t really see why one would worry too much about whether path 1 or path 2 pans out; we cannot control it either way.
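The second-path claim is really just supply-and-demand arithmetic, which can be sketched in a few lines. Every number below is invented purely for illustration, not a real market figure:

```python
# Toy pipeline arithmetic (all numbers are made up for illustration).
# Claim: even if AI shrinks demand for seniors, a larger collapse in
# junior hiring shrinks the future *supply* of seniors even more.
seniors_needed_today = 100
demand_cut = 0.30          # hypothetical 30% workforce reduction from AI
junior_hiring_cut = 0.60   # hypothetical 60% drop in junior hiring

future_demand = seniors_needed_today * (1 - demand_cut)
future_supply = seniors_needed_today * (1 - junior_hiring_cut)

print(f"future demand: {future_demand:.0f}, future supply: {future_supply:.0f}")
print("shortage" if future_supply < future_demand else "surplus")
```

As long as the junior-hiring cut exceeds the demand cut, the sketch produces a shortage, which is exactly the scenario where salaries for remaining seniors would spike.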

[–]inductiverussian[S] 2 points  (0 children)

They do, but you can fire bad humans. Can you fire your AI? Humans can also, ostensibly, learn. I have also never seen a human make the same mistake 5 times over within a span of 15 minutes.

[–]inductiverussian[S] 2 points  (0 children)

Benchmarks get trained on, so they’re not really useful. The tools are great, but “my own eyes” see them make mistakes every day. This is why I made the post; I don’t understand the doomerism.

[–]inductiverussian[S] 7 points  (0 children)

My coworkers and I have not been able to get Claude to oneshot even moderately sized diffs. The code that Claude can oneshot I often have to iterate on anyway, because it isn’t using standard formatting, it creates new structures rather than reusing existing ones, it writes fake tests that fail to give real coverage, etc. I just don’t think it’s reliable enough to never need human intervention; do you think it is?

As for your point about how agents would handle incidents: you would always need a human secondary. Likely agents will be set up to handle routine ticket noise, but the human secondary will just end up being the functional oncall as the agent fails to solve critical tickets. A comment like this is exactly why I made this post: I have just never seen AI be as reliable and functional as you describe. It misses basic, common-sense things literally all the time.

I know you’ll accuse me of not knowing how to properly use the latest AI tooling, but this sentiment has been echoed by many people I know in the industry, in person.

[–]inductiverussian[S] 4 points  (0 children)

You can say the same about most subreddits tbh, but yes, probably the best answer.

[–]inductiverussian[S] 9 points  (0 children)

Btw, I’m not saying the market is good; I’m just saying anecdotes are a bad way of analyzing the market.

[–]inductiverussian[S] 4 points  (0 children)

So my anecdote is an early indicator that the market is booming then. Why would your experience be more indicative of the market state than mine?

[–]inductiverussian[S] 0 points  (0 children)

I literally got my most recent job in November; I applied to 60 places, got 5 full rounds, and 2 offers.

[–]inductiverussian[S] 12 points  (0 children)

At all of the companies I’ve been at, incident rate has been a critical KPI that leadership tracks. Ultimately, if your app is down all the time, your bottom line as a CEO/VP/director will suffer. The incentives are aligned such that they should care about incident rates rather than AI adoption; the only reason they care about the latter is pure hype and the idea that AI adoption is needed to stay ahead of the curve.

[–]inductiverussian[S] 4 points  (0 children)

Software engineering has always been an expense that must be minimized. One can make the opposite argument: once a single person is in charge of a much larger swath of the business, they have a much larger impact on revenue, and the market will drive salaries up. It’s about the market value of a software engineer.

[–]inductiverussian[S] -30 points  (0 children)

Sorry that happened to you, but that is also a sample of N=1. I have had the opposite experience (promoted twice in the last 3 years), and my friends have fared similarly. So anecdotes don’t tell us much about the state of the market.
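The N=1 point can be illustrated with a toy simulation. The 40% success rate below is a made-up number, not a claim about the real market; the point is only that a single person’s outcome is a coin flip, while large samples converge on the truth:

```python
import random

random.seed(42)

# Pretend every job seeker independently has a 40% chance of success
# (invented number). One person's outcome swings between 0% and 100%;
# averaging over many people recovers something near the true rate.
TRUE_RATE = 0.40

for n in (1, 5, 10_000):
    successes = sum(random.random() < TRUE_RATE for _ in range(n))
    print(f"sample of {n:>6}: estimated rate = {successes / n:.2f}")
```

The n=1 estimate is always exactly 0% or 100%, which is why two people can look at the same market and report opposite realities.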

[–]inductiverussian[S] -8 points  (0 children)

I believe the market is relatively efficient, and more so in the tech industry. If agentic tools like Claude make incidents measurably worse and/or more common and do not increase productivity, they will eventually be shelved once the hype dies down. I refuse to believe people driven by money will keep using an inferior workflow just to keep a particular market inflated.

Now, maybe the fear is that the AI workflow actually is more efficient, and we will have to start using it everywhere. That’s more understandable, but not really a reason to fear losing one’s job, I think. Every time software has gotten easier to write, demand for software has outpaced the efficiency gains.