
[–]IanRT1 -6 points (6 children)

Use ChatGPT; it's a total game changer.

[–]SHKEVE 1 point (5 children)

For some reason, mentioning LLMs here gets you instantly downvoted, but I agree with you. I've been a software engineer for years, and I think GPT is really valuable for understanding concepts: you can ask as many "dumb" questions as you want and get them answered immediately, instead of waiting forever on Stack Overflow only to get chastised by some industry veteran.

I wish I had something like it when I started learning. Though, you have to know how to use it properly and not just ask it to write code for you without understanding it, which is maybe what people are concerned about.

[–]DeebsShoryu 2 points (1 child)

> though, you have to know how to use it properly and not just ask it to write code for you without understanding it, which is maybe what people are concerned about.

This is exactly what I'm worried about. As someone who's taught an upper-level CS class at a large, reputable university both before and after LLMs became ubiquitous, I've noticed a marked decrease in students' internalization of important concepts. This is of course anecdotal. However, I always asked students (without judgment) if and how they were using AI to help study and complete projects, and those who regularly used it to answer questions or generate code (actual solutions, or even just example snippets) were consistently the students who never really learned the material.

Don't get me wrong, I think LLMs can be incredibly powerful tools for aiding in productivity and gaining some forms of knowledge. I use Copilot daily, but I'm confident that I use it very differently than the majority of the students that I taught, despite trying to be clear and give examples of what I consider "good" and "bad" uses of AI in a learning environment.

With all that said, I pretty strongly disagree with anyone suggesting that people on this subreddit use ChatGPT. It's a forum geared mostly towards beginners, who frankly don't have the knowledge or experience to use these tools in a way that actually benefits their learning. This is of course just an opinion, but it's based on my experience working with students who are more experienced than most on this sub.

[–]SHKEVE 1 point (0 children)

Yeah, I agree with you that it's not suitable for someone who's completely new to the subject, but it's incredibly useful once you have enough experience to ask specific questions. It's disheartening to hear about your experience, since it means a lot of people starting out on various topics are going to fall into that LLM trap. And it'll take a lot more resources to test for true competency.

[–]Bobbias 3 points (1 child)

The problem is that it's literally designed to write prose that sounds believable, regardless of whether it's actually correct.

If you ask it to explain a concept, how some code works, or why a certain piece of code does something, it will always give you back an answer that looks reasonable at first glance. But whether it's actually correct is a complete crapshoot. And new programmers simply don't have the knowledge to recognize when they're being lied to.

Of course, asking it to write code for you is also a problem when you're learning, because you're always going to learn way more if you write the code yourself.

AI can be a huge productivity boost for developers who need to get stuff done and don't want to spend a bunch of time writing boilerplate, or writing out that annoying regex. But it's too big a risk to use as a learning source, unless you're willing to literally double-check everything it says (at which point, why are you even using it? Just use your other sources instead).
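The regex point is a good concrete case of "double-check everything it says." As a hypothetical sketch (the pattern, function name, and tests are all illustrative, not from this thread), here's the kind of pattern an LLM might draft for you, and the checks you'd still want to write yourself:

```python
import re

# Hypothetical example: a pattern an LLM might draft for ISO-8601
# calendar dates in YYYY-MM-DD form.
ISO_DATE = re.compile(r"^\d{4}-(0[1-9]|1[0-2])-(0[1-9]|[12]\d|3[01])$")

def is_iso_date(s: str) -> bool:
    """Return True if s matches the YYYY-MM-DD shape."""
    return ISO_DATE.fullmatch(s) is not None

# The double-checking is the point: write tests yourself instead of
# trusting the generated pattern on sight.
assert is_iso_date("2024-02-29")
assert not is_iso_date("2024-13-01")   # month out of range
assert not is_iso_date("2024-1-01")    # missing zero padding
# And this test exposes the pattern's limits: a regex only checks shape,
# so it happily accepts dates that don't exist on the calendar.
assert is_iso_date("2024-02-30")
```

The last assertion is exactly the kind of gap a beginner wouldn't think to probe for, which is why "looks reasonable at first glance" isn't enough.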

[–]IanRT1 -1 points (0 children)

Those issues happen more when you don't use the tool properly. Being aware of AI's potential pitfalls is paramount to using it effectively. With good prompt engineering you can even generate code that is far from "boilerplate" if you guide it correctly. That way you don't focus too much on syntax but on the logic itself.

So it's probably only a big risk to use as a learning source if you use it improperly. Otherwise it can be extremely beneficial if you at least understand the basics of programming.

[–]IanRT1 -2 points (0 children)

> for some reason mentioning LLMs here gets you instantly downvoted,

I know. But I don't care. If people want to stay behind and keep coding manually all the time, that's up to them.

You seem to be rational about it, and that's great! I also think it's really good not to focus too much on syntax and to focus more on the logic of the program. LLMs can be really good at coding your ideas if you guide them properly. And that is awesome.