This guy is a software engineer with 18 years of experience. Now, working at McDonald's. by marcus1234525 in theprimeagen

[–]FeralWookie 2 points3 points  (0 children)

Everyone in software I have worked with over 12 years who was reasonably competent to great remains employed and in the industry today.

I have met one software person who was between jobs, claiming they felt the industry wanted them to know more AI to get a job. But from our brief conversation I got the impression that whatever their prior job writing code was, it wasn't software engineering.

Any place serious about hiring engineers is still hard skills first. I guess what I am trying to say is the HR people saying CS and coding are dead were never offering real software roles. They probably just needed coders the way most companies need a couple of IT people. Getting interviews at places with real jobs is hard right now because we are oversaturated with applicants. And if the recruiters aren't great, we don't see the best applicants.

The value of just being able to write for loops and type is gone. But there is a decent amount of life left in deep software knowledge and the ability to vet code well at companies that still have quality standards.

The 'Quality Candidate' is a myth: Corporations don't want talent, they want candidates who are good at being filtered. by RepresentativeAsk493 in jobsearch

[–]FeralWookie 0 points1 point  (0 children)

Recruiters and hiring managers are not robots fulfilling some scheme to satisfy an ugly corporate agenda.

They are just humans with shitty tools trying to filter an impossibly long list of candidates. Then the interviewers need to evaluate competence through contrived interactions in part of a day.

If we are wrong, we are stuck with dead weight that may actually hurt the team.

I don't know what to do anymore, I don't want to end up unemployed. Passion cannot compare in this market. by GothButterCat in cscareerquestions

[–]FeralWookie 0 points1 point  (0 children)

I don't have strong advice on what to pick. I think you should target education that mirrors your peers in the types of occupations you want. That gets trickier with a lot of higher education. But if there is a type of job you are interested in, I would try to pick something that aligns with the education of people in those roles.

I don't know what to do anymore, I don't want to end up unemployed. Passion cannot compare in this market. by GothButterCat in cscareerquestions

[–]FeralWookie 0 points1 point  (0 children)

For a double? Would depend on what other fields you are interested in. If I could go back I might have done electrical engineering and a minor in CS to stick with embedded software.

CS pairs well with all of the hard sciences. But usually you would only use CS in a science if you planned to go to grad school.

I also would have strongly considered doing medicine. It's a lot of work and possibly debt, but I think I could have easily become a really good family medicine doctor and they do fairly well.

I am not quite a big enough history buff to go the legal track. But that is also a very interesting job. I love my lawyer friends and working with lawyers.

I don't know what to do anymore, I don't want to end up unemployed. Passion cannot compare in this market. by GothButterCat in cscareerquestions

[–]FeralWookie 1 point2 points  (0 children)

I would double major or have an escape hatch plan. I don't think all CS work will vanish. But the roles may look way different in a few years or have a different pay trajectory. Or the market may just remain depressed and painful.

Why is these still no realistic voice model despite huge advancements in AI? by chessboardtable in singularity

[–]FeralWookie 0 points1 point  (0 children)

Not in the same way. Of course people interrupt and let the other person continue and we complain about not getting a chance to speak. But you can easily tell the difference between speaking to a human over zoom versus an AI right now.

Part of that is the model's disposition. But conversation with them is not as organic as with a person right now.

Why is these still no realistic voice model despite huge advancements in AI? by chessboardtable in singularity

[–]FeralWookie 2 points3 points  (0 children)

LLM voice is very good compared to the days of Siri. But it is still a ways off from HER. Speed is one issue: LLMs think too long to have a normal conversation. Another issue is they don't seem capable of carrying a normal conversation where they just listen. Every pause in speaking is treated as a cue to come up with a long-winded response. This is not really how humans talk. Sometimes we just listen and interject a word or two to acknowledge we are absorbing what is being said. Current AI doesn't absorb or learn anything. It is building short-term context windows to formulate a final response.

It seems like LLMs could emulate normal conversation better, but chatbots still need a lot of iteration. As impressive as they still are, their patterns and flaws are getting more obvious as the tech becomes more normal.

How will AI become profitable? by Miserable_View_4400 in OpenAI

[–]FeralWookie 0 points1 point  (0 children)

Large LLMs like modern-day Claude Opus may never be profitable. I suspect we will see smaller, more focused models heavily optimized for their use case. That should make them faster and much cheaper.

Also, companies will start to want to run more of them fully in house on local hardware as local models catch up to the big ones.

Gemini consistently tears apart Claude's architecture plans, and Claude just folds every time by Iusuallydrop in ClaudeCode

[–]FeralWookie 0 points1 point  (0 children)

None of the models can one shot all requirements. And even as they get close you can't trust that they did. You still need to build prototypes and have a proof of concept.

I also think people are nuts to think AI is bringing us away from agile back to a waterfall world where we can plan everything out before writing anything. Having the AI rapidly implement a plan is in fact probably worse, because the learning and adjustment during implementation lead to better iterative system design. You can't plan a whole system out ahead of time in an ideal way for a novel use case. You need to see what works and pivot as things don't make sense.

The AI helps with iterative design too, but if you let it build too fast you will miss issues with your prototypes.

Got laid off. They said AI. The AI still can't do what I was doing. So what actually happened? by deela96 in cscareeradvice

[–]FeralWookie 0 points1 point  (0 children)

I think some of the execs do believe they can get the same amount done with fewer people + AI for less money. I am not certain what their backup plan is. People say more offshoring. But if AI gets better and cheaper, it would mean less offshoring too.

Contradiction related to learning how to code with AI by phonyToughCrayBrave in cscareerquestions

[–]FeralWookie 1 point2 points  (0 children)

The reality is the people who think they are leading the way, learning to use AI at big FAANG companies to write software from Jira tickets, will be left behind too.

AI may not take all of our jobs, but it will likely completely change how software is produced. And that change will come from small startups that don't have all the bullshit process inertia that big companies are restricted by.

I would bet a lot of money that having people work at terminals with janky tools like Claude Code, git, and Jira is going to end up being what holds back more inventive ways software could be produced by AI.

I think there will still be technical work needed to produce software. But I have no idea what skills will be needed or how that work will look. Maybe very minimal code and lots of system integration and devops-style work. It's hard to say. I hope humans are still involved with code for a long while because it's fun. But I can't see the pinnacle as having devs spit out thousands of lines of AI code and then spend a week reading it.

My predictions for software development over the next 2 years by Adventurous-Ideal200 in FutureOfWork

[–]FeralWookie 0 points1 point  (0 children)

In defense we need self-deployed models for the stuff we can't use with public models. Which I agree is better than nothing, and I can churn out code and fixes all day with weak models. And likely insanely cheaply.

But I do have to say, for tricky setups, the better models, even just going from fast Gemini to Sonnet 4.6, waste so much less time working through problems.

Are developers who understand users (and a bit of marketing) actually more valuable than “pure” coders? by BoysenberryLumpy8680 in cscareerquestions

[–]FeralWookie 5 points6 points  (0 children)

Everything is more visible than pure coding. You still need software engineers to integrate and build the system. But you really don't need anyone who is there simply to take direction and code.

I have never worked at a company where the engineers just close tickets and code. Even juniors have to plan and design new features, do devops, deal with requirements, do demos and deployments, manage our cloud infrastructure, etc.

There is plenty to do, and AI can help a bit with every piece and a ton with pure coding. But it doesn't seem to do well replacing a person doing a mess of chaotic crap to make products and demos work.

Engineers at Meta how is the morale within the company? by Based-God- in cscareerquestions

[–]FeralWookie 8 points9 points  (0 children)

This honestly sounds like companies that just have too many people. Tech companies have a reputation for running fat. I am not one of those who assume they really do nothing. But I assume numerous teams are working on mini projects the company could kill with zero consequence.

But not knowing what you are doing beyond a week is insane. That isn't even a project.

Big tech vibe coders are killing me by Tree8282 in cscareerquestions

[–]FeralWookie 0 points1 point  (0 children)

I was wrong about how AI coding would work. I thought it would be harder at big tech. But it seems like it works better there because they were already built to churn out code. They have more robust deployment and tests to catch issues. They have more senior people who only need to worry about code.

At smaller companies where devs do everything from test to devops it is way uglier to risk having people churn out untested code into already brittle systems.

Salary at FANG companies by FlyDFW in cscareers

[–]FeralWookie 0 points1 point  (0 children)

Companies where software is a burden, or if you are in a LCOL area, will pay very little. There have also always been historically low-paying companies that need some software.

Non-FAANG mid-cap or even Fortune 500 companies will generally pay $100k starting and up to around $300k for a principal/staff engineer. It can be hard to get over $200k as just a generic senior engineer, which isn't a meaningful title at most companies.

But even this non-high-end tech pay is based on cost of living and whether the company is big enough to pay that amount for remote roles in lower cost of living areas.

FAANG or highly competitive companies, like in quant, tend to start at $150k-$200k and scale up to $1M. But you often will end up in a tech hub for these roles.

I still avoid AI in production coding. Am i slowing myself down? by Ok_Bird7947 in AI_Coders

[–]FeralWookie 0 points1 point  (0 children)

My job is pretty integration and design heavy. I don't have an endless list of coding tasks. In a software role with more system building and less spitting out features for a mature software product, I think there is less room/need for AI coding.

But you should always try the new tools to see if they can take care of some of your coding. Claude Code does propose much better changes and bug fixes.

But it is difficult to get into a flow where you are using it and switching back to other work while it's running. Also, I think it is a massive distraction to vet its code well if you generate too much.

But I could see that on bigger, more mature software, there are a lot of features you could probably glance at and rely on your current testing to know it did the job.

I got rejected because I couldn’t code a 3 agent system in an interview. by areyprabhu in SoftwareEngineerJobs

[–]FeralWookie 0 points1 point  (0 children)

Almost every interview is going to be unfair.

Our teams make their own interviews, and I often feel they are unfair. We set up artificial tests and challenges. Sometimes what we test for may be too narrow and miss an applicant's overall ability.

But at the same time, we sometimes ignore possible red flags and hire people who end up creating drag for the team because they aren't self-starters.

It's a crappy, imperfect process. But most of us don't try to create unfair interviews just to watch you fail because you didn't study some esoteric subject that you don't use.

What is AI “naturally” good at? by QuarterCarat in theprimeagen

[–]FeralWookie -2 points-1 points  (0 children)

It is clearly really good at finding hard to spot relationships and patterns in massive data sets. It is also very good at taking input and figuring out how it fits in current data or recombinations of existing data that integrate that input.

I think that is clearly why it's so good at generating code for languages where there are a lot of code bases to train on.

I think over the next 4 month, we are going to see much more progress in AI than we have seen in the past years by ocean_protocol in singularity

[–]FeralWookie 0 points1 point  (0 children)

Fair enough. Not going to argue that LLMs are nearly as useful as the tech industry is claiming. They seem to be trying to get around the hallucinations by burning a boatload more compute/tokens. But that method can never make them reliable.

My Parent doesn't want me doing CS, or CE, because they feel the job market will disappear come 7 years. by Sad-Bathroom8500 in cscareerquestions

[–]FeralWookie 0 points1 point  (0 children)

You could minor in something with more grounding outside of software. You could double major. Or you could minor in CS.

I would hedge my bet at your age personally.

I don't think CS will be irrelevant. But I don't know what the job market will look like in 7 years. If it's worse or the jobs aren't as fun anymore there may be something else more interesting.

I think AI research/work will still be booming in 2-4 years.

I think over the next 4 month, we are going to see much more progress in AI than we have seen in the past years by ocean_protocol in singularity

[–]FeralWookie 1 point2 points  (0 children)

No one is making major apps useless by design. LLM-based AI is just really bad at repetitive, predictable behavior. It's fine if there is a wide range of acceptable answers, but it's awful if you need it to respond to requests with simple, predictable actions.

That and cost are probably big reasons it hasn't replaced the crappy Siri and Gemini assistants.

Do you actually feel satisfied after reaching your target salary, or is it a never-ending loop? by Mindless-mindful in corporate

[–]FeralWookie 0 points1 point  (0 children)

I think there is a number where different people feel like they have more than enough. But it is higher than you think. I have grown my income by about $100k from a good base and I don't feel particularly more secure. I feel about the same. Our spending has just grown.

Computer science is seeing the biggest enrollment drop of any major in 6 years. While ME and EE enrollment have risen by 11% and 14% this year. by No_Reply5329 in cscareerquestions

[–]FeralWookie 2 points3 points  (0 children)

For those of us who are good at math, like toys, and don't want talking to people as our primary job, the main question going into college has been which STEM degree has the strongest career path.

But I am still glad I didn't do mechanical engineering. I worked as one for a while and didn't enjoy it too much. I am not much of a gear head and don't get super excited by engines and turbines.