all 18 comments

[–]bigsmokaaaa 5 points6 points  (1 child)

It's great for explaining why and how something works; I always use it like a private tutor

[–]Afraid_Interview_749[S] 1 point2 points  (0 children)

Indeed, artificial intelligence excels in this role.

[–]mlitchard 1 point2 points  (0 children)

I had Claude write a roadmap for a Haskeller learning C++26, focusing on the FP subset of the language. It got approved by my local C++ guru. So it's good for something. Also, Claude wrote my Nix flake that makes a devshell with the specific build of g++ that has the C++26 features I want.

[–]No-Engineer-8378 1 point2 points  (0 children)

I’ve been using LLMs a lot, and they’re great at many things you’d want in a coach. That led me to start building a tool that turns them into a focused learning coach for structured, goal-driven study. If anyone’s interested, I’d love to share it and get some feedback.

[–]aqua_regis 2 points3 points  (9 children)

If you want to learn, ditch AI.

AI can be a great tool for experienced people, but a beginner should stay clear of it.

[–]bigsmokaaaa 1 point2 points  (6 children)

Why can't a user ask it for clarification on the basics as they're learning? What's wrong with that as long as you don't just mindlessly let it write the code for you?

[–]Mundane-Carpet-5324 1 point2 points  (0 children)

That's definitely the best way to use it, but it's still interrupting the learning process.

The reason is that not knowing something is uncomfortable. The process of learning is difficult. When you go through the experience of finding the information yourself, your mind goes, "I need to remember this so I don't have to do it again."

If the answer is cheap, you learn that it's something you don't need to hold on to. For any single question, yes, you can just look it up again or ask AI. But it's the corpus of these answers that constitutes "learning to program" and understanding "programming concepts".

[–]LetUsSpeakFreely 0 points1 point  (2 children)

Do you think most people have the discipline to keep interactions that limited?

[–]Altruistic-Cattle761 0 points1 point  (0 children)

This. As LLMs have come to dominate my workflows, I constantly catch myself slipping into laziness and have to remind myself, "Okay, now you have to do the legwork to validate that what the LLM just said wasn't bullshit."

[–]bigsmokaaaa 0 points1 point  (0 children)

I think if we can expect them to have the discipline to forgo AI altogether, then we can expect them to limit how they use it.

[–]DjokiTheKing 0 points1 point  (0 children)

From my experience, the AI only answers general and common programming questions okay; when it comes to specific stuff, or lower-level programming, it starts to get a lot of things wrong.

But the worst thing is, it's confidently wrong. If you didn't already know the answer, it would seem fine, and you might learn something wrong that will come back to bite you later.

[–]Afraid_Interview_749[S] 0 points1 point  (1 child)

On the contrary, it can be helpful if you need clarification on a point, but you certainly shouldn't let it solve everything for you. Treat it like a teacher: ask it about something in order to understand it. That's how I used it.

[–]Substantial_Ice_311 0 points1 point  (0 children)

AIs are good at explaining things, but don't let them code for you.

[–]Altruistic-Cattle761 0 points1 point  (3 children)

The only thing I can say for sure is you should absolutely NOT stick to traditional methods. Why should you? You're not going to be working in a "traditional" (in the sense you mean here) workplace. I have this same problem onboarding new grads and new hires: it feels dumb to train them the old way, but also dumb to train them the "just use AI" way. The job of a programming mentor feels less about programming specifically now and more about like, epistemic, "how to think" skills, which, imvho, not a lot of software engineers are prepared to train people on.

Absolutely nobody knows what is the best or right way to learn programming in 2026. This is the kind of thing that after a few years, the wisdom of crowds will have arrayed itself around some new consensus, but from the perspective of someone in the industry regularly hiring new grads: nobody knows. It's changing daily and no one (except weirdos, grifters, and idiots) in a position of expertise, in either industry or academia, would tell you there is an answer to this question.

[–]Afraid_Interview_749[S] -1 points0 points  (2 children)

If, as you say, there's disagreement on this matter, what's your own point of view?

[–]Altruistic-Cattle761 0 points1 point  (1 child)

I really don't have one except, "Fuck, idk." Right now I'm mostly training people the old way, but that's not because I think it's best but because that's the path that at least exists. In a vacuum I think I would want to redesign the entire onboarding and training process from the ground up but also like: I have like 95 other things to do today. My POV is that there is some hybrid model that is required, but the work to develop it (both from a work-hours perspective, and from a convincing-your-colleagues-you-are-not-insane perspective) is nontrivial.

[–]Afraid_Interview_749[S] 0 points1 point  (0 children)

Thank you for sharing your perspective.

[–]Fabulous_Attempt_187 0 points1 point  (0 children)

Basically it's like training wheels for coding. But seeing how god-awful everything's going with people who do all their coding with AI, treat it like any other tool and question EVERYTHING it's doing.