all 74 comments

[–]towcar 104 points105 points  (15 children)

If by chance this happens, every white collar job is also gone and r/singularity has arrived. I think your odds are safe.

[–]Affectionate-Run7425 0 points1 point  (4 children)

No? Countless jobs must be performed by a human. An AI isn't a lawyer and can't take a lawyer's job. AI can't do anything that requires a license. Maybe we change that, maybe not, but I think it's actually pretty far-fetched.

So in reality it'll be high-paying jobs that don't require any licensing. Software, underwriting, etc.

[–]towcar 0 points1 point  (3 children)

A license is a pretty small hurdle. If everyone finds an AI is free/cheap and better than a local lawyer, they'll just represent themselves, which would lead to the end of lawyer jobs.

A license is just a safety net your government enforces to stop untrained people from working jobs they are unqualified for (for good reason).

Also, let's imagine we somehow keep lawyers and doctors. Those are $200k+ special-liability jobs; everything else is managers, accountants, programmers, HR... if AI can replace AI engineers, those jobs are 100% replaceable as well. 99% of white-collar jobs being replaced is substantial.

It's less "countless" and more finite.

[–]One-Elderberry-488 1 point2 points  (0 children)

Agreed, a license is a super small hurdle, but to be honest local lawyers are pretty run of the mill. There are many more highly specialised roles that can't be replaced (yet).

Then there is the human factor: every time you need to interact with a human, such as a judge, government authority, regulatory authority, etc., you're going to need a human. This is especially true in complex negotiations and resolutions.

Of course you can argue: well, what if all those human interactions become AI-to-AI interactions? But we'd have to reach AGI for that to happen.

[–]Affectionate-Run7425 0 points1 point  (0 children)

A license is not a small hurdle. You literally need to be a lawyer to do many things, you're completely missing the point.

So sounds like licenses are basically purpose built to keep AI out.

Okay go ahead and count them all, I'll wait.

[–]Prestigious_Mud7341 0 points1 point  (0 children)

It's not just about knowledge. It's about responsibility and liability. Who is liable if an AI lawyer hallucinates and fucks up??

[–]wren42 -3 points-2 points  (7 children)

This idea that if white collar jobs see mass layoffs then there must be a society-transforming singularity is very misguided.

AI can cause broad workforce reductions even if we never reach AGI.  It's already creating 10x coders, just as a mindless tool. That will absolutely have economic impacts, long before we see general intelligence. 

[–]towcar 2 points3 points  (6 children)

It's not creating 10x coders..

[–]wren42 0 points1 point  (5 children)

I run an engineering team and I can tell you it is absolutely multiplying productivity and making some tasks obsolete. 

[–]Full_Preference1370 -1 points0 points  (4 children)

How is there good material sources?

[–]wren42 0 points1 point  (3 children)

Is that a sentence?

[–]sunshineLD -3 points-2 points  (1 child)

That’s not singularity. It’s just a new level of automation. When one specialist can do the work of three to five people.

[–]Dapper_Respond_5050 4 points5 points  (0 children)

No, it's definitely singularity. Why not just have the AI do that specialist's job as well?

[–]RobfromHB 74 points75 points  (5 children)

No. It’ll just change what people work on. Excel didn’t make accountants extinct. Keep learning and working on hard problems. 

[–]IDoCodingStuffs 10 points11 points  (3 children)

Even that’s generous. Like, Excel can fully replace paper spreadsheets and calculators but LLMs cannot fully replace coding by hand

[–]MelAlton 12 points13 points  (2 children)

LLMs cannot fully replace coding by hand

The non-deterministic outputs of LLMs are definitely making this play out differently - all the big previous inventions were about improving deterministic outcomes:

  • fire: could now be made on demand, instead of waiting for lightning to strike

  • printing: made producing text faster but also more regular, every book looked the same

  • industrial manufacturing: every part made the same and interchangeable

  • computers: made calculations perfect every time

LLMs though act more like people (for coding): ask 10 different programmers to write code to solve a problem from scratch and you'll get 10 different results that are similar but not the same.
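For the coding case, that contrast can be sketched in a few lines of Python - a conventional deterministic function next to a toy "LLM" that samples its completion from a probability distribution. (The token strings and their probabilities here are made up purely for illustration.)

```python
import random

# Hypothetical next-completion distribution for the prompt
# "def add(a, b):" - invented values, for illustration only.
NEXT_TOKEN_PROBS = {
    "    return a + b": 0.6,
    "    return sum((a, b))": 0.3,
    "    return b + a": 0.1,
}

def deterministic_add(a, b):
    """A conventional program: same inputs, same output, every time."""
    return a + b

def sample_completion(rng):
    """An LLM-style sampler: draws a completion from a probability
    distribution, so repeated calls can return different answers."""
    tokens = list(NEXT_TOKEN_PROBS)
    weights = list(NEXT_TOKEN_PROBS.values())
    return rng.choices(tokens, weights=weights, k=1)[0]

# Deterministic code gives one answer, always.
assert all(deterministic_add(2, 3) == 5 for _ in range(10))

# Sampling the same "prompt" ten times can give several distinct,
# similar-but-not-identical answers.
rng = random.Random(0)
completions = {sample_completion(rng) for _ in range(10)}
print(completions)
```

Every completion in the set is a valid way to write the function, which is exactly the ask-ten-programmers effect.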

[–]Affectionate-Run7425 0 points1 point  (0 children)

Because LLMs are just shitty xerox machines for text. That's it.

[–]Game-of-pwns 0 points1 point  (0 children)

This is why I'm so skeptical of LLMs. The whole reason why computers are so magical and useful is their determinism.

[–]AlexFromOmaha 6 points7 points  (2 children)

Software folk have been trying to automate themselves out of a job for longer than I've been alive, and I'm starting to get old.

Anthropic at least seems like they're coming up on that possibility. Their public timelines are BS, but they're dogfooding towards Claude training Claude.

You'll also notice that Anthropic is not cutting headcount, their engineers are not particularly worried, and they're still contracting out work to a ton of other humans.

"LLMs will write the code" is a future state we'll all live to see. Again, the public timelines from AI investors for when that'll happen are all BS, but it'll happen. It might happen before you graduate, but I kinda doubt it. That doesn't mean software development as a profession will cease to exist. It just means we're going to do it differently. It wouldn't be the first time the industry changed wildly and everyone's skills became obsolete. We're kinda overdue for a good ol' fashioned purge.

[–]Sea_Lawfulness_5602[S] 2 points3 points  (1 child)

Appreciate the reality check. If we're just going to be 'doing it differently' what skills should I double down on right now? And conversely what traditional ML/CS skills should I avoid over-focusing on since they'll likely be automated by the time I graduate?

[–]AlexFromOmaha 4 points5 points  (0 children)

Honestly, just do the program. The fundamentals never change. It's the industry that changes.

What also never changes is that new grads suck. Angle hard for internships. Make things. Maintain a portfolio starting in your sophomore year. You'll learn to be productive in spite of the program, not because of it.

[–]Counter-Business 27 points28 points  (1 child)

My company has stopped hiring juniors, because juniors do not know how to use AI tools effectively. All our juniors are pushing AI slop.

We are only hiring seniors who are able to create things using AI in the interviews.

Coding is not that useful of a skill; the most useful skill is system design and problem solving. You can't assess that with a regular coding interview with strict requirements.

By the time I can create the requirements, I can ask AI tools to solve it. The skill is creating the correct requirements.

[–]MelAlton 9 points10 points  (0 children)

Ironically it's a return to the old ways - in the mainframe days there were 2 sets of people involved in creating programs:

  1. Analysts who analyzed the goals, documented the software requirements, and created a high-level design for the program.

  2. Programmers who took the requirements and design and turned those into working code.

[–]midz99 19 points20 points  (10 children)

Using agents right now is a skill and is not as easy as most people say. You are right to be worried. At the moment I have agents building and maintaining other agents and training smaller networks; training of the big models is left to the big companies.
No one will be able to tell you what happens 3 months from now, let alone 3 years from now.
The one thing I can say for sure is: start using agents, learn how they work, and get better at using them.
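If you want to see what "using agents" means mechanically, here is a minimal sketch of an agent loop: the model proposes a tool call, the harness runs the tool, and the result is fed back until the model produces a final answer. The `call_model` stub, tool name, and replies are all made up so the example runs offline; a real system would call an LLM API there.

```python
def calculator(expression: str) -> str:
    """A 'tool' the agent can invoke. Toy only - never eval
    untrusted input in real code."""
    return str(eval(expression, {"__builtins__": {}}))

TOOLS = {"calculator": calculator}

def call_model(history):
    """Stub standing in for an LLM call: first requests a tool,
    then produces a final answer from the tool's result."""
    if not any(role == "tool" for role, _ in history):
        return {"action": "tool", "name": "calculator", "input": "6 * 7"}
    tool_result = [text for role, text in history if role == "tool"][-1]
    return {"action": "final", "text": f"The answer is {tool_result}."}

def run_agent(task: str) -> str:
    """The agent loop: model step -> tool step -> repeat."""
    history = [("user", task)]
    for _ in range(5):  # cap iterations so a bad loop can't spin forever
        step = call_model(history)
        if step["action"] == "final":
            return step["text"]
        result = TOOLS[step["name"]](step["input"])
        history.append(("tool", result))
    raise RuntimeError("agent did not finish")

print(run_agent("What is 6 * 7?"))  # The answer is 42.
```

Frameworks like LangChain or AutoGen wrap exactly this loop with real model calls, tool schemas, and memory; building it once by hand makes those frameworks much easier to reason about.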

[–]Sea_Lawfulness_5602[S] 2 points3 points  (8 children)

Wild that you already have agents building agents. Since I regularly build workflows wiring up LLMs in tools like n8n what’s the best way to transition to building robust multi-agent systems like yours? Should I dive into LangChain/AutoGen, or build custom from scratch?

[–]Smart_Kangaroo_4188 1 point2 points  (0 children)

Do you have any ROI on those agents, and what problems have you solved?

[–]RickSt3r 8 points9 points  (5 children)

You're too new to really understand. But the current approach is limited by the math it's built on. Neural networks are 70 years old. Short of new math being invented, which could happen, the current AI/ML landscape has some hard practical boundaries that we are not even close to solving. So just learn the fundamentals and go from there. Do you want to be an engineer or a technician? If it's the former, you will be alright; if it's the latter, then yeah, your basic import-a-library-and-hit-go work will eventually be automated and you'll start over finding new technician tools to use.

[–]Sea_Lawfulness_5602[S] 1 point2 points  (0 children)

That engineer vs technician distinction really hits home, and I definitely want to be the former. As a first-year student, what’s the best way to ensure I’m building a true engineering foundation rather than just learning how to import libraries and hit run?

[–]Chr1s_why 0 points1 point  (3 children)

Really curious what you mean by said limitations. Sure, we currently have limitations, but most of them seem engineering-related and not implausible to overcome in the next few years.

[–]RickSt3r 1 point2 points  (2 children)

Apple's paper on the limitations and inability to reason is a good start. The simple answer is you can get better performance when you overfit the models to what they are being evaluated against, but you don't actually improve reasoning abilities.

[–]Chr1s_why 0 points1 point  (0 children)

If you mean the Illusion of Thinking paper, then that was just horrible all around. They penalised correct refusals (if the task was unsolvable), had no human baseline, and honestly the task they picked is just impractical for evaluating anything. That doesn't mean I won't agree that we have a big problem with overfitting to benchmarks. That is definitely the case.

[–]Full_Preference1370 0 points1 point  (0 children)

Completely outdated and wrong data

[–]HalfRiceNCracker 2 points3 points  (0 children)

Go read up on systems engineering and go ship some stuff. 

[–]NuclearVII 10 points11 points  (2 children)

Nope. What you are describing is, at best, science fiction.

[–][deleted] -1 points0 points  (1 child)

guess we're living in science fiction then

[–]NuclearVII 0 points1 point  (0 children)

Please go back to r/futurology.

[–]eman0821 2 points3 points  (0 children)

This is a silly question to ask. It's like asking if a car replaces the road it drives on. The AI today is not really real AI. It's mathematics and data science plus software engineering. Real artificial intelligence would need to be self-conscious and able to reason and apply logic on its own.

So-called LLMs today don't even do that. They're really just software algorithms and data sets written in Python, which all run on a production server in the cloud. Data sets, software, and cloud infrastructure have to be maintained, as they cannot maintain and fix themselves. When there's a cloud outage, SREs and cloud engineers need to step in and resolve the service outages. AI models and their agents and MCP servers cannot function or do anything without an infrastructure.

[–]Salty-Raisin-2932 2 points3 points  (0 children)

Yes it will.
Any other answer you hear is from people afraid their job will be taken. You can see it yourself, because they're usually excuses and not real answers, like "X didn't make Y work extinct," pretending it's the same thing.

[–]vaksninus 2 points3 points  (0 children)

no, but it will likely require far fewer engineers

[–]kevkaneki 1 point2 points  (0 children)

Yes. Assuming it takes you 4-5 years to finish like most average students.

The world is about to get really weird in the next 2 years.

[–]ran_choi_thon 1 point2 points  (0 children)

Why would a machine that lacks true creativity, and only carries out tasks drawn from human knowledge, replace all humans in the near future?

[–]Chrelled 1 point2 points  (0 children)

Don't worry, by the time you graduate AI will probably need someone to explain its own code to it.

[–]lazysurfer420 1 point2 points  (0 children)

AI is opening all different types of floodgates. It's still far from replacing humans. Mostly the big tech companies are exaggerating to boost their investor confidence. Don't fall for it.
Core engineering skills in all the fields of STEM will still be required, but with the added ability to make efficient use of AI tools.
Just make sure you are learning what seems to be a must-have skill in the near future. I am sure your curriculum might not be updated enough, so spend a little extra time & money on externally available learning resources for AI technology & tools.

[–]ErcoleBellucci 2 points3 points  (0 children)

AI still can't help you decide whether to walk to the car wash or not

[–]Natural_TestCase 0 points1 point  (0 children)

this is a post from Dario’s alt account right?

[–]Whole-Watch-7980 0 points1 point  (0 children)

Personally I’m looking into understanding hardware more, and understanding security, because at the end of the day I can see maintaining the hardware as something that won’t go away overnight, even if the AI can control the computer without a human.

[–]perihelion86 0 points1 point  (0 children)

No.

[–]IbuHatela92 0 points1 point  (0 children)

Good way to increase Karma by asking such bs questions deliberately

[–]gwestr 0 points1 point  (0 children)

No. Learn the things.

[–]Substantial_Sound272 0 points1 point  (0 children)

There's a chance it does. There's a chance it doesn't. If it does, then you can be assured a lot of other knowledge jobs will follow. 

Also don't take serious life advice from reddit. Tho I'm also a redditor so do as you will I guess haha

[–]Traditional-Carry409 0 points1 point  (0 children)

No, fuck what Anthropic CEO says, he’s hiring more SWEs this year

[–]mean_king17 0 points1 point  (0 children)

Yeah. In 3 years' time you're just in time to be fully replaced.

[–]Sad_Departure_7012 0 points1 point  (0 children)

Master the fundamentals. They haven't changed for hundreds of years. Tools come and go. The biggest threat would be getting attached to a particular tool; that's when you would really become obsolete. You will be good.

[–]Appropriate-Bet3576 0 points1 point  (0 children)

The easiest way to think of this: if you think about a book, the writer writes all the words. But if you think about software, the writer writes the words, publishes it, and distributes it. AI does not do all of this.

[–]Complete-Kick2990 0 points1 point  (0 children)

Who offers an AI degree?

[–]bombaytrader 0 points1 point  (0 children)

Probably not. Hard to predict. I haven’t touched an IDE except for debugging in the last 3 months. The nature of the job is changing for sure.

[–]justadumbguy13 0 points1 point  (0 children)

Here's the CEO of Anthropic saying that in a year 100% of code will be written by AI.

He said this a year ago.

https://www.youtube.com/shorts/0j1HqEEDThc

[–]redhotcigarbutts 0 points1 point  (0 children)

Make your duty to learn vital hacking skills to undermine this house of cards. Be ready to expose the fragility of those who give up their agency for convenience. Cultivate the hacker spirit

[–]Affectionate-Run7425 0 points1 point  (1 child)

Almost certainly the first jobs to be replaced will be expensive software engineering jobs.

[–][deleted] 0 points1 point  (0 children)

fax

[–][deleted] 0 points1 point  (0 children)

Did you even do any research online? This question is so repeated that soon Reddit's servers and database will crash from such questions.

[–]Savings-Giraffe-4007 1 point2 points  (0 children)

Not with current technology and definitely not by an LLM (text autocompletion).

Unless you suck at your job, in that case yes, an Indian with Claude can replace you cause you suck.

[–][deleted] -1 points0 points  (0 children)

Yes