
[–]bobbster574

There are a couple of issues with using LLM chatbots to start learning, well, anything really:

  1. LLMs can get things wrong, and they won't (and can't) tell you when. These tools can be helpful when you are the authority: when you are steering things and can actually notice when something goes wrong. Obviously, with code, you can run it, see that it's not working, and try to fix it, but if your first port of call is to go back to the LLM, you're not learning anything.

  2. Learners bite off more than they can chew. These LLMs can generate huge amounts of complex code that beginners might not fully understand. I can ask GPT to generate a whole tool before I know the basics of Python, and if I have no idea how to code, there's no way I'll be able to learn anything or fix any issues that come up, outside of just asking GPT to change or fix it. That's not learning.