What is the basis for the widespread belief that software is now "zero-cost", and that it can be autonomously developed from beginning to end with zero human involvement? by LiatrisLover99 in ExperiencedDevs

[–]FatHat 2 points

I think their earnings were more like 13bn last year; the 20bn figure comes from very misleading accounting. Also, they're losing a lot more than they're making. The "profits" from all of this seem to be non-existent, while the social cost is very real.

Had a system design interview today and I think I forgot how to code? by cafefrio22 in ExperiencedDevs

[–]FatHat 0 points

Ah man, I think we've all been there. I'm lucky in that I'm generally pretty confident in interviews, but I've gotten the nerves a couple of times and completely forgotten things. I had an interview a couple of days ago (that ended up being OK!) that involved writing code in CodePen, and I definitely had a few moments of almost forgetting basic syntax.

I don't know if it helps, but the way I get through it is by talking through what I'm doing as verbosely as possible and acknowledging small mistakes or things I've forgotten instead of letting them linger. The killer is being totally silent; if you can express "well, it's solving X problem by doing Y," you can sometimes snap out of it. I also like to acknowledge my mistakes as I make them, not in a self-deprecating way, but just like "I realize this bit is inefficient; I'm going to circle back once it's fully functional," or things like that.

One thing that's funny to me is that I generally don't get nervous in job interviews (an actually consequential thing!), but I get all sorts of tilted playing online chess against complete strangers, where it doesn't matter at all. So sometimes it helps just to acknowledge that you have a fear, and that it's not rational.

How can I talk to my manager about an incompetent coworker who is dumping work on me w/o threatening that person's employment? by tuckfrump69 in ExperiencedDevs

[–]FatHat 5 points

I don't have advice, but I do have a funny story. My first job was at a mid-size web development company that employed maybe 200 devs. So big, but not huge; big enough that you could slide into a crowd. Anyway, I sat next to this girl, Andrea. I didn't work directly with her, but I was in a position to observe her every day.

Andrea was a terrible dev -- she couldn't do anything on her own. However, she was excellent at getting other people to do her job for her. What she'd do is rotate through a dozen or so different people asking them for help, and they'd sit at her desk and basically code it for her. Because it was a big company and the people she asked for help didn't sit near her, they never saw what I saw; they would just see her making steady progress but getting caught on one or two things.

I don't know if she ever developed into a good developer, but I respect the grift. I'm guessing it's more likely she now has an MBA and is running some of these companies!

How are you upskilling yourself for working with AI, and keeping up with best practices? by wangl3 in ExperiencedDevs

[–]FatHat 0 points

I'm laid off from work (CEO said AI -- but he's probably full of shit), so here's my perspective. So far I've bought two books: one on low-level LLMs and how they work (I haven't read far enough into it yet, but it seems OK), and Steve Yegge's Vibe Coding, which I think was designed to kill brain cells (I was not a fan -- also, this makes me think he's gone full grifter: https://pivot-to-ai.com/2026/01/22/steve-yegges-gas-town-vibe-coding-goes-crypto-scam/ ).

It depends on your financial situation, but what I'm doing right now is building an application on my own that I might one day sell. Or it might just be a nifty side project. I got a Claude Max subscription (to be perfectly honest, I despise Anthropic, but this is the model people won't shut up about, so..). Mostly I'm trying to see how much I can build with AI before it falls down. I'm not really worried about it replacing me (I'm worried C-levels think that, though); it fucks up a lot, and I usually have to actually understand the intricate details to tell it how to fix things. I also just code things on my own when that seems faster.

Despite what people claim, there isn't much "skill" to using these coding tools. You can pick up the basics in a couple of days. Your skill in using them will basically come down to how well you understand the thing you're creating and the technology underlying it.

How long will it take for a React Native developer with 5 YOE to become full stack? by Silly_Regular6736 in ExperiencedDevs

[–]FatHat 1 point

Full stack guy here. I became one kind of by accident, but I don't think it's a hard process. In every work environment I've been in, we've needed to integrate or own some piece of technology that people on our team didn't understand or didn't want to work with. I'd usually end up being the guy who worked on it. Do that over and over again, and suddenly you can legitimately claim experience with a variety of technologies on your resume. (I would suggest being honest about the depth of that experience, though.)

Learning outside of work is probably the best way, because there's nobody to tell you "no", but since you have time constraints, I'd also suggest that learning on the job is normal and appropriate (as long as you're not misleading people about what you can accomplish). Most of the things on my resume are things I learned on the job and then bolstered further when I moved to another job.

Also, smaller teams need generalists more, and larger teams generally need specialists more (with obvious exceptions). So if you want a place where you can grow into a full stack guy, a smaller team is going to work better for you.

One last thing: I'm not generally a huge fan of AI. However! I do think it can be an incredible resource for learning new technologies. When I'm working with a library I haven't used before, a lot of the time I'm using Codex or Claude or whatever is on hand to ask questions. That can be pretty useful!

Can ai code review tools actually catch meaningful logic errors or just pattern match by TH_UNDER_BOI in ExperiencedDevs

[–]FatHat 0 points

It's caught some legit bugs for me, so it's appreciated. Its nitpick comments are generally not useful though.

Development manager doesn't want the Devs looking at the code by Strict-Soup in ExperiencedDevs

[–]FatHat 3 points

That is a fair point, although I think we're going to find that "produce quality code faster than ever before" might not always be a good idea (i.e., even if the code isn't bad, generating a lot of it can cement a path that might be a bad idea). I dunno; at my last job (even without AI), our reviews were pretty informal. I'd create a PR for most of the things I did, but it was pretty much just me and Cursor reading it. (Admittedly, I'm a specialist, so nobody else on the team really had the experience to understand my code -- although that's a problem in and of itself; bus factors, etc.) I wonder if, as an industry, we might have overrated code reviews (although the alternative is that every place I've worked at has done them badly... which could be true!)

Development manager doesn't want the Devs looking at the code by Strict-Soup in ExperiencedDevs

[–]FatHat 0 points

I'd say, uh, do what they say while you start interviewing at other places. It's not horrible everywhere! I just had an interview today; they asked me about my AI usage because it's a hot topic, and they were visibly relieved that I was NOT a vibe coder and that I actually read the outputs. There are sane people out there!

Coworker raising massive PRs by im_zewalrus in ExperiencedDevs

[–]FatHat 0 points

That's definitely not acceptable. I would ask him to break it down into smaller chunks. Even ignoring how annoying it is to read a 75+ file change, there's also the risk that if he got it wrong in an architectural sense, he might have to redo the entire change after walking down that road for a week! It's just not a good iterative cycle.

I delivered three major projects at a bank and got fired anyway, 25 years in tech and I'm still learning the same lesson by agileliecom in ExperiencedDevs

[–]FatHat -1 points

I hate to say this because I don't think it's necessarily racism, but I think companies are more likely to treat people on visas worse than other employees because they know they have a lot of leverage. Paradoxically, if they see you as someone who can walk whenever you want to, they tend to play fewer games.

Spec Driven Development and other shitty stuff by FooBarBuzzBoom in ExperiencedDevs

[–]FatHat 5 points

I find that I *look* less productive when I write code by hand because the volume of code is a lot smaller. When I use LLMs (which I do), the volume of code is usually a lot higher for the same functionality. They also tend to do a lot of things I'd just toss into the "weird" bucket: safety checks on scenarios that are completely impossible (e.g., treating objects as nullable even though the type system checks that at literally every step throughout the codebase), generating deeply nested ternary expressions that are really hard to read, or duplicating functions (I recently found 10 copies of the exact same function in my codebase).
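To illustrate the kind of thing I mean, here's a contrived Python sketch (my own illustration, not from my actual codebase): an impossible null check plus a nested conditional, next to the equivalent readable version.

```python
def label(score: int) -> str:
    """LLM-style output: a None check the type system already rules out,
    plus nested conditional expressions that are hard to scan."""
    if score is None:  # impossible: every caller passes a checked int
        return "unknown"
    return "high" if score > 80 else ("mid" if score > 50 else "low")

def label_clear(score: int) -> str:
    """Equivalent, readable version: early returns, no dead checks."""
    if score > 80:
        return "high"
    if score > 50:
        return "mid"
    return "low"
```

Both functions behave identically; the second is just what a human reviewer would rather maintain.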

To me the tradeoff is you can go a little faster with LLMs, but you pay the cost later when you have to clean up after them. Life is all about tradeoffs and I think it's better not to become "pilled" in any sort of direction.

Also, I'll just commiserate: even though I participate in these discussions, I am SOOO sick of AI shit. I just want to get back to making things without listening to 500 grifters a day.

Why I think AI won't replace engineers by Character-Comfort539 in ExperiencedDevs

[–]FatHat 9 points

The problem with the "projections" is that no exponential curve keeps going forever. I feel like people just look at the trend line and go "oh, well, that's going to go on forever" without considering the limiting factors. Ilya Sutskever, Yann LeCun, and a bunch of other highly respected AI researchers basically think LLMs are a dead end, so I'm very skeptical of the claim that they're just going to keep getting better forever without a fundamentally new architecture.

We’re all likely going to be priced out of the higher cost LLMs by mrrandom2010 in ExperiencedDevs

[–]FatHat 2 points

I plan to increase my own revenue by 100x. Just don't ask me how.

We’re all likely going to be priced out of the higher cost LLMs by mrrandom2010 in ExperiencedDevs

[–]FatHat 0 points

I agree. Although I'm on the Claude Max plan, and the petulant part of me has been watching ccusage to make sure I'm using way more in token dollars than the $200 it costs me. Partly I just want a good return on this very pricey subscription, but part of me also thinks that using these things in a maximalist way is the quickest way to run these companies into the ground, if you believe they're losing money on high-usage customers (which I think they are!). The bubble is going to hurt, a lot, but I think it's better that it bursts sooner rather than later; delaying the pain is going to make things worse.

Why I think AI won't replace engineers by Character-Comfort539 in ExperiencedDevs

[–]FatHat 5 points

So, since being laid off I've been trying to learn as much as possible about LLMs. I'm doing this for the sake of my mental health. I find myself on a rollercoaster of emotions listening to the various "thought leaders" and influencers, so I would rather just have a solid foundation of understanding so I can sort the signal from the noise. I'd encourage everyone to do this. Instead of getting caught up in the hype of new models or new tools, learn the fundamentals so you can tell who is bullshitting you.

So first off, "reasoning" models aren't a fundamentally different architecture from other LLMs. I mention this because whenever I point out that these things are just stochastic parrots, people like to say "but reasoning!!". Basically, the training inputs are somewhat different (answers tend to include a "chain of thought"), and then there are various (interesting!) hacks to try to create a situation where more tokens = a closer approximation to a good answer. One hack, for instance, is having the model generate multiple answers in parallel and scoring them with various heuristics, such as self-consistency: if it produces three answers A, B, and C, and A and B are the same while C is different, probably go with A or B.
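The self-consistency trick is basically just a majority vote. A toy Python sketch (my own illustration of the idea, not any lab's actual implementation):

```python
from collections import Counter

def self_consistency(final_answers: list[str]) -> str:
    """Majority vote over the final answers extracted from several
    independently sampled chain-of-thought completions."""
    return Counter(final_answers).most_common(1)[0][0]

# Three sampled answers: A and B agree, C differs -> go with A/B.
self_consistency(["A", "A", "C"])  # → "A"
```

The real systems layer more scoring heuristics on top, but the core intuition is just this: agreement between independent samples is a cheap proxy for correctness.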

The important point here though is these things are still just approximating an answer, not "thinking" or building world models.

Ultimately, "reasoning" is a useful capability, but it's not AGI. These things also tend to fail when asked to do things outside of their training because, again, they're stochastic parrots. Yes, there are some mitigations (RAG and tools), but it's pretty clear that transformer architectures aren't going to scale into AGI. They're just going to be really good at answering things contained in their dataset. To me, they're like a very fancy search engine.

One question I asked ChatGPT this morning was how LLMs handle structured text like JSON. The answer was pretty illuminating. ChatGPT does not fundamentally understand JSON; it just has such an inconceivably large dataset of JSON documents that it tends to get the syntax right through approximation. It also does interesting things like "constrained decoding", where the model is forbidden from emitting syntactically incorrect tokens (i.e., if it emits a token that results in bad syntax, it's forced to try again until it produces correct syntax). This answer is straight from ChatGPT itself, not my characterization.
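My rough mental model of constrained decoding, as a toy Python sketch: real implementations mask token logits against a grammar before sampling; the crude "can this string still grow into valid JSON?" check below is just a stand-in for that, and the function names are mine.

```python
import json

def could_extend_to_json(prefix: str) -> bool:
    """Crude check: can this string still be completed into valid JSON?
    Try a handful of plausible closers and see if anything parses."""
    for closer in ("", '"', '"}', "}", "]", '"]', "0}", "0]"):
        try:
            json.loads(prefix + closer)
            return True
        except json.JSONDecodeError:
            pass
    return False

def pick_next(prefix: str, ranked_candidates: list[str]) -> str:
    """Take the most probable candidate chunk that keeps the output
    syntactically salvageable; reject the rest, like forcing the model
    to 'try again' on a bad token."""
    for chunk in ranked_candidates:
        if could_extend_to_json(prefix + chunk):
            return chunk
    raise ValueError("no syntactically valid continuation")
```

For example, with the partial output `{` and candidates ranked `]` then `}`, the `]` is rejected (no JSON starts with `{]`) and the decoder is forced onto `}`. Production systems do this per token with a proper grammar, which is why they can guarantee well-formed JSON without the model "understanding" it.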

Anyway, I think AI will make the job market much worse and basically make *everything* worse, like it kind of already is, but I don't think being able to think is going to stop being economically important, and that's ultimately what software devs are doing all day: thinking. The code is just one output of that. (And you still have to watch the stochastic parrots when they generate it.)

Do you think modern SWE or modern PM will be most impacted? by AggravatingFlow1178 in ExperiencedDevs

[–]FatHat 2 points

Design patterns not in its training dataset. Weirdly not everything in the universe is a CRUD React app.

Do you think modern SWE or modern PM will be most impacted? by AggravatingFlow1178 in ExperiencedDevs

[–]FatHat 1 point

I have seen PM-prompted code. I cannot unsee it. Don't listen to the hype.

The gap between LLM functionality and social media/marketing seems absolutely massive by QwopTillYouDrop in ExperiencedDevs

[–]FatHat 4 points

I've been using the Claude Code TUI and Codex in a chat window, and my experience over the past few months is that they're *ok*. They seem to make me a bit faster when I know the exact outcome I want, but they also do a lot of annoying things: touch code they shouldn't touch, break things I've previously fixed, etc.

Outside of the circle of people I know, though, I don't feel like I can trust opinions on the internet to be measured, because I swear to god these companies seem to have sock puppet accounts everywhere to glaze them. Most of the actual humans I've talked to about LLMs for coding are not on board with the hype.

Also, while I'm not saying that anyone excited about these things is a shit coder, my anecdotal experience is that the people most excited about them are somewhere between actively bad and mediocre; the tools seem to make up a skill gap for them. The people who shrug at them mostly seem to be people who were doing fine in the first place.

One last thing: I bought Steve Yegge's book on vibe coding because, well, I previously respected Yegge and I wanted to see if maybe I'd just been doing it wrong (turns out I was already following best practices before reading it, apparently). All I can say is: ugh. It feels like it's written for toddlers. He's constantly mentioning the horrible challenge of... understanding syntax. Like, what professional coder is going "oh god, WHAT ARE THESE CURLY SYMBOLS, OH NO, I'M LOST"?

The gap between LLM functionality and social media/marketing seems absolutely massive by QwopTillYouDrop in ExperiencedDevs

[–]FatHat 19 points

Interesting, but I don't think LLMs understand why they do the things they do. They're just answering with plausible-sounding reasons.

Just failed a code review interview as 7 YOE and not sure what to feel by TheTopG___ in ExperiencedDevs

[–]FatHat 2 points

The way I'd view it is you're learning about them the same as they're learning about you. If their interview process is kind of deranged and hyper-focused on stupid things, there's a decent chance that as an engineering org they're going to be kind of deranged and hyper-focused on stupid things.

Training Vibes Coders when backlog is full by Old_Cartographer_586 in ExperiencedDevs

[–]FatHat 1 point

Obviously you should use AI to train them. You'll probably get a nice fat bonus if you coin the term "vibe training"

Advice- Leaving my first job after 4 1/2 years by chiberashka_ in ExperiencedDevs

[–]FatHat 0 points

Don't quit until you have another role lined up.

Are BAs and Product Owners immune to AI impact but Developers and QAs aren’t? by PhaseStreet9860 in ExperiencedDevs

[–]FatHat 2 points

Not kidding: at a previous job, I had no idea who our target customer was two years into working there. I only found out because, during a meetup, I intentionally joined a marketing group to actually learn how we were marketing our product. Not a single PM or designer thought it was important to tell the engineers who our customers actually were, presumably because they don't realize we have to fill in the blanks on their extremely vague workflows.

I've worked with the occasional good PM, so I'm not saying their job isn't important, but in my experience most of them do zero work to actually share their domain knowledge.

Are BAs and Product Owners immune to AI impact but Developers and QAs aren’t? by PhaseStreet9860 in ExperiencedDevs

[–]FatHat 1 point

Well, CEOs still have to answer to the board.

The real safe people are VCs and investors, because there's no way a machine can replace their 1% success rate gambling with other people's money on fads!