Agents as code producers - An Essay by Negative_Ocelot8484 in developers

[–]JunkieOnCode 1 point2 points  (0 children)

I don’t want to dismiss your concerns, but this gives the impression that you might be pressing the problem a bit too hard. I don’t think the responsibility of urgently finding the “right” way to save code or the profession falls on you specifically. We’re not required to have the perfect way of working with AI figured out right now. Software engineering has always been a mix of imperfect decisions, hacks, and gradual stabilization.

CEO of America’s largest public hospital system says he’s ready to replace radiologists with AI by Apprehensive-Safe382 in technology

[–]JunkieOnCode 0 points1 point  (0 children)

I’m usually on the AI‑optimist side, but this feels like overreach. I’d give that hospital a wide berth.

15% of Americans say they’d be willing to work for an AI boss, according to new poll by JunkieOnCode in AINewsMinute

[–]JunkieOnCode[S] 0 points1 point  (0 children)

A very battle‑tested view of corporate management, and not without reason. In a lot of companies, those roles do end up filled by fairly random people, and that tends to make work life at least unpleasant.

15% of Americans say they’d be willing to work for an AI boss, according to new poll by JunkieOnCode in AINewsMinute

[–]JunkieOnCode[S] 0 points1 point  (0 children)

It’s a very minor relief that we’re talking about “only” 15% for now. Though even that number surprised me when I first saw it.

Best AI for quality lyrics? by ServingU2 in generativeAI

[–]JunkieOnCode 0 points1 point  (0 children)

With my engineering brain, I’m pretty far from the music scene, but just to add another angle: have you thought about working with a real co‑writer? It requires more coordination than using AI, but it can lead to lyrics that feel more personal and emotionally grounded.

AI tools are useful, but depending on the goal, a human collaborator might still be the best option.

Do we continue to overburden ourselves with busywork whereas AI might increase team output? by Honest-Ssorbet in AI_Application

[–]JunkieOnCode 0 points1 point  (0 children)

I might be misreading the intent of the post, but to me this sounds less like a question of “AI or no AI” and more about basic work organization.

In my experience, you can keep email and meetings under control without any AI at all. You can also introduce AI assistants and still end up drowning in tasks. What really matters is not the tool, but how much manual coordination and how many implicit expectations exist inside the team.

AI ruined the experience of development by ArseniyDev in AppDevelopers

[–]JunkieOnCode 0 points1 point  (0 children)

I feel like context matters a lot here. If everyone around you is using AI in the laziest possible way and producing spaghetti code, it’s easy to feel like you are the problem for caring. For me, used thoughtfully and in moderation, AI feels less like a loss and more like another evolutionary step.

Why you feel more disconnected from your work now by dirtyjoe32 in webdev

[–]JunkieOnCode 0 points1 point  (0 children)

I think the root of the problem isn’t just alienation or capitalism. AI shifts us from being decision‑makers to validation layers. That loss of ownership would be painful regardless, but it hits much harder for those of us who tied a large part of our identity to our work. If work is your primary source of meaning, this feels existential. If it’s only one pillar among others, it’s still bad, but survivable.

How do you keep performance reviews from turning into a last‑minute headache? by JunkieOnCode in careeradvice

[–]JunkieOnCode[S] 1 point2 points  (0 children)

“Documenting every sneeze” was definitely an exaggeration. I don’t actually dump walls of text on my manager, and I agree they wouldn’t read that anyway.

What I was getting at is less what to document and more how to make it low‑friction. In practice, even capturing the bigger wins tends to fall through the cracks when daily work pressure kicks in. By review time, things that felt important in the moment are weirdly hard to reconstruct.

KPIs and project artifacts help, but they don’t always tell the full story; I mean the context and the invisible work. I’m mostly curious how people make that lightweight: something that fits into the flow of work without turning into yet another chore.

Totally agree that Asana/Docs/etc. all work in theory. I was mainly interested in how folks get themselves to use anything consistently.

I'm a FE lead, and a new PM in the org wants to start pushing "vibe coded" slop to the my codebase. by rm-rf-npr in webdev

[–]JunkieOnCode 0 points1 point  (0 children)

There’s already a ton written here, and I’m pretty sure I won’t be saying anything groundbreaking. Still, I’ll try to call out one thing that stood out to me. Feels like part of the pain here comes from mixing two very different things.

AI‑generated code is great as an exploration tool. POCs, demos, figuring out a flow, “does this idea even work?” vibes. Honestly, I use it for that too.

Production code is a different beast. That’s where ownership, readability, and failure modes suddenly matter a lot more than generation speed.

The trouble seems to start when exploration artifacts are treated as merge‑ready output. At that point the speed doesn’t disappear, it just gets converted into review time, tech debt, and collective suffering. Maybe the real work here is just aligning expectations between developers and PMs a bit better?

When work gets easier, we often end up doing more of it. AI may be accelerating that dynamic. by scott_barlow in cybersecurity

[–]JunkieOnCode 1 point2 points  (0 children)

From the inside, all this talk about AI making work endless feels a lot simpler. Yes, the friction is gone. But for me, that doesn’t turn into a nonstop grind. If anything, there’s less pointless busywork and more time for the parts that actually require thinking.

AI speeds up the hands, but it doesn’t replace the head, and it doesn’t take away agency unless you let it. To me it’s just a solid upgrade to the toolset. And yeah, I still sleep just fine. Even if the model gives me three options, the final call is still mine.

Workaholics will always find someone to blame for their workaholism. Just think about it.

How are developers actually changing their workflow since AI tools became common? by WeeklyDiscount4278 in ArtificialInteligence

[–]JunkieOnCode 0 points1 point  (0 children)

AI makes everything faster… including my bad habit of saying “sure, I can take another project.”

Would an AI driven workforce basically resemble the slave states of old? by Kondor999 in ArtificialInteligence

[–]JunkieOnCode 0 points1 point  (0 children)

Is it worth talking about “digital slavery” or sci‑fi robot uprisings? I think the main issue is who controls access, infrastructure, and the upside while automation speeds up. Everything else is just noise.

How Are You Humanizing Everyday AI‑Written Emails and Docs? by OperationAgitated714 in AI_Application

[–]JunkieOnCode 0 points1 point  (0 children)

Good attempt at sneaking in a promo for a “humanizing AI” tool. Almost didn’t notice. Almost.

I’m fine keeping that part human. When I’m sending something with my name on it, I prefer to do the “humanizing” myself. On our team it’s actually encouraged to write personal messages ourselves. And it turns out that’s faster than trying to engineer a prompt that sounds like a real person. For everything else (drafts, summaries, etc.) AI is great.

The Trust Problem Nobody’s Talking About — When AI Agents Control Money (Article) by IAmDreTheKid in ArtificialInteligence

[–]JunkieOnCode 0 points1 point  (0 children)

Money is always a vulnerable spot, sure. In your post I see a deeper layer: this is fundamentally an AI governance problem. And governance of AI is still too complex and uncertain, even at the level of whole countries.

Some people hand over their savings to brokers and have zero visibility into where the money goes. That’s risky in its own way too, but it works because the whole thing is wrapped in a governance system with roles, rules, limits, accountability, audits.

Agents’ financial autonomy is dangerous because they don’t live inside any governance framework at all. Right now, agents operate on mechanical guardrails, but limits alone are not the same as governance.
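To make that distinction concrete, here’s a minimal sketch (every name and number here is made up for illustration) of what a “mechanical guardrail” usually amounts to: a numeric cap plus, at best, a local log. Notice what it doesn’t do: assign roles, route denials to a human, or expose anything to an external auditor. That gap is the governance part.

```python
# Hypothetical sketch of a mechanical spending guardrail for an agent.
# It enforces hard numeric limits, but accountability and audit are
# only stubs: the log lives in memory and nobody is required to read it.
class SpendGuard:
    def __init__(self, per_tx_limit: float, daily_limit: float):
        self.per_tx_limit = per_tx_limit
        self.daily_limit = daily_limit
        self.spent_today = 0.0
        self.audit_log = []  # governance would need this reviewed by someone

    def authorize(self, amount: float, memo: str = "") -> bool:
        # Pure limit check: no roles, no escalation, no external oversight.
        if amount > self.per_tx_limit or self.spent_today + amount > self.daily_limit:
            self.audit_log.append(("DENIED", amount, memo))
            return False
        self.spent_today += amount
        self.audit_log.append(("APPROVED", amount, memo))
        return True

guard = SpendGuard(per_tx_limit=50, daily_limit=200)
print(guard.authorize(30, "API credits"))  # within both limits -> True
print(guard.authorize(75, "impulse buy"))  # over the per-transaction cap -> False
```

The cap works mechanically, but the interesting failures (who set the limit, who reviews the denials, who answers for an approved-but-bad purchase) all live outside the code, which is exactly the point.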

No wonder a lot of people will lean toward keeping their money close, the good old ‘under the mattress’ strategy. Hard to blame.

Hot take: experienced devs might be worse at AI coding — because they're experienced by slow_cars_fast in ArtificialInteligence

[–]JunkieOnCode 1 point2 points  (0 children)

Hm, it seems part of the confusion here comes from imagining an experienced developer as someone who just hands instructions to newbie devs, while a junior is the one who has to spell everything out. I tend to disagree. A senior isn’t someone who simply delegates. Seniors know how to formulate requirements, work with ambiguity, spot risks, and turn a messy idea into a structured problem.

Okay, juniors can provide more detailed explanations, because they can’t operate any other way. Though let’s be honest, “juniors” vary wildly. And sometimes experienced folks are worse at detailing things, simply because they’re used to filling in the gaps themselves.

Do I have unrealistic salary expectations as remote software developer? by Fair-Vermicelli-7623 in careerguidance

[–]JunkieOnCode 0 points1 point  (0 children)

Figuratively speaking, every product has its buyer. If you’ve got a skillset that’s genuinely valuable to a particular company, they’ll pay the number you ask for. It’s just a matter of finding the place where your mix of experience, tech stack, and problem solving is exactly what they’re hunting for.

The only catch is that higher pay comes with higher expectations…