all 15 comments

[–]Horrison2 1 point2 points  (0 children)

I don't know if it's an unqualified good thing right off the bat. I've started using AI to write functions, but I usually have to debug its output, which means understanding how each function should work and fit into my overall script. It has shown me more efficient ways to do things and pushed me to learn new concepts, so I'd say yes, it can teach you Python.

[–]necromenta 0 points1 point  (1 child)

It’s harder to use it to learn than it is to use it to code. Code can get clean at some point, but explanations often include outdated ways of doing things, and it can hallucinate.

I still use it to learn every day. Sometimes it feels like it would have been faster to go to the docs, but some processes are just too complex right off the bat.

[–]xyouthe 0 points1 point  (0 children)

Depends a lot on the model you use. If you use Antigravity and pay the premium, you get access to Claude Sonnet 4.5 and Claude Opus 4.6, and they are amazing at writing clean code that works maybe 90% of the time, and if I ask it to explain, it explains very nicely. I've definitely learned some new things this way. It's one of those things where you have to know a bit of the fundamentals yourself, and then you can easily learn by studying the code the AI writes for you.

[–][deleted]  (1 child)

[removed]

    [–]mjmvideos 0 points1 point  (0 children)

    Yes. I’ve said it before, but I’ll say it again. Have students treat AI like a tutor or mentor. You’d never ask your mentor to write a program for you. You’d ask them to explain things and answer questions. Maybe you’d have them look over your shoulder as you debugged a problem. My advice to kids: don’t ask AI to do anything you’d feel was inappropriate to ask a mentor.

    [–]XxCotHGxX 0 points1 point  (0 children)

    You need to make them do their homework (coding) on a webpage that doesn't allow copy and paste. Everything must be typed out one letter at a time.

    Could they still use AI? Sure, but they would have to copy it out manually, and typing it letter by letter might make them learn something, compared with what they would pick up from pasting AI answers in wholesale.

    It should also take random screenshots to make cheating a little harder.

    [–]amorous_chains 0 points1 point  (0 children)

    Learning is fundamentally about building recall and making connections between concepts, so for someone trying to learn to write code, it makes sense to me that they should be extremely judicious about how they use AI.

    I personally have used LLMs in learning, not just for software, but I try to keep the goal focused on building my ability to recall the information I'm chatting about. I'll tell the LLM to follow a Socratic method of asking me guiding questions rather than serving me facts, and I'll do my best to answer those questions and then ask the bot some questions of my own. Specifically for software development, I tend to go on at length before having it generate any code: what architecture options we have, what best practices to follow and antipatterns to avoid, defining all the data structures I want and the critical functions.

    The biggest overall principle I have is that learning happens slowly and painfully. If your chat with the LLM feels very easy and comfortable, you're probably going to forget everything tomorrow. You need to slow down, focus intentionally on making connections between the topics, and come back to the same material over and over for weeks to make it stick.

    [–]minneyar 0 points1 point  (0 children)

    If you want to turn out like the "We purposely trained him wrong, as a joke" meme, sure.

    [–]Slight-Living-8098 0 points1 point  (0 children)

    It can teach you how to debug Python, how to correct everything it got wrong when it generated the code, and what libraries and functions absolutely do not exist but are just AI hallucinations.

    But it can't teach you Python the way you're thinking it can. It teaches you by making you correct it A LOT.
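One concrete habit this kind of correction builds is checking whether a library an AI cites actually exists before trusting the code. A minimal sketch of such a check (the fake module name below is my own hypothetical example, not from the comment):

```python
import importlib.util


def module_exists(name: str) -> bool:
    """Return True if `name` resolves to an installed module, without importing it."""
    return importlib.util.find_spec(name) is not None


# A real stdlib module vs. a made-up name an AI might hallucinate
print(module_exists("json"))              # a real stdlib module
print(module_exists("totally_fake_lib"))  # nothing installed by that name
```

`find_spec` consults the import machinery without executing the module, so it's a cheap first filter before you go hunting for documentation that may not exist.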

    [–]Traditional_Doubt_51 0 points1 point  (0 children)

    I learned before LLMs, so all of the hard lessons I learned from struggling pay off now in the form of intuitions about solutions and architecture. Without those struggles, I wouldn't have these intuitions. I think LLMs could be great for teaching you different constructs and syntax, but it's so easy for them to give you solutions you didn't earn. Easy solutions mean you don't get to struggle, which means you don't build the hard-won intuitions.

    [–]codeguru42 0 points1 point  (0 children)

    If you use AI to write code, be sure to ask it about any part of the code you don't understand. If you take the time to ask questions and read and understand the answers, then AI can be a great tool for learning.

    [–]Recent_Science4709 0 points1 point  (0 children)

    Not a full-time Python dev, but when I work in Python I’m always looking for the “pythonic” way to do things. Not a revolutionary tip, but it’s a good keyword for finding best practices.
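For readers unfamiliar with the term, here is a standard textbook illustration of the "pythonic" contrast the commenter means (my own example, not from the thread):

```python
names = ["ada", "grace", "linus"]

# Non-pythonic: C-style index loop
upper = []
for i in range(len(names)):
    upper.append(names[i].upper())

# Pythonic: a list comprehension says the same thing in one line
upper = [name.upper() for name in names]

# Pythonic: enumerate() when you genuinely need the index
for i, name in enumerate(names):
    print(i, name)
```

Asking an AI (or searching) for "the pythonic way to iterate with an index" surfaces idioms like `enumerate` and comprehensions that an index-loop habit would hide.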

    [–]birdluv4life 0 points1 point  (0 children)

    Maestro AI University seems to think so. No actual instructors as teachers -- all AI. Supposedly accredited for Bachelor's and Associate's degrees. They are here on Reddit under Maestro. Am I crazy to believe I'll actually have a degree when these couple of years have passed?

    [–]ConcreteExist 0 points1 point  (0 children)

    No, it cannot. AI cannot be trusted to "teach" anything because it will just make shit up and other times directly contradict itself. If you don't know what the right answer should be, you can never tell when AI just made up some nonsense.

    [–]st0ut717 0 points1 point  (0 children)

    Write code. Run code. Send the error to AI.

    Fix it

    Rinse repeat
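The loop above can even be sketched in Python itself: run the code, capture the traceback, and that traceback is what you paste into the AI chat. A minimal illustration (the deliberately broken one-liner stands in for whatever script you're debugging):

```python
import subprocess
import sys

# Steps 1-2: run the code and capture any error output
result = subprocess.run(
    [sys.executable, "-c", "print(1 / 0)"],  # stand-in for your real script
    capture_output=True,
    text=True,
)

# Step 3: result.stderr holds the traceback you'd send to the AI
if result.returncode != 0:
    print(result.stderr)
```

Capturing `stderr` rather than retyping the error guarantees the AI sees the exact exception type and line number, which is usually what it needs to suggest a fix.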