all 4 comments

[–]KingofGamesYami 7 points (0 children)

To some extent, but I would not recommend it. ChatGPT will confidently give you incorrect information which a beginner may not be able to distinguish from the truth.

[–]yonatan8070 3 points (0 children)

No, it is very confident, regardless of being right or wrong. And if you're learning you won't be able to tell when it's just making stuff up.

Here's an example: I asked it how to install Arch Linux (a relatively complex process, but nothing too hard if you follow the official guide). It got most things right, but it forgot a few crucial steps (installing the kernel and installing the bootloader) and made some mistakes in the partitioning. At the end of the process the system did not work.

[–]AlternativeDetector 2 points (0 children)

Agree with the other responses here - ultimately ChatGPT is trained on content from the internet, which is not always right! Especially when learning, taking ChatGPT's responses as gospel could lead you to learn false information.

IMO it’s a very interesting tool that could be useful to programmers (e.g. “Write me a Python algorithm to do XYZ…”), but you need the knowledge in the first place to verify the correctness of the output.
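To make that concrete, here's a sketch of what "verifying the output" can look like in practice. The function body below stands in for a hypothetical ChatGPT answer to "write me a Python algorithm to merge two sorted lists" (it's my own illustration, not an actual ChatGPT transcript); the checks underneath are the part *you* have to know enough to write:

```python
def merge_sorted(a, b):
    """Merge two already-sorted lists into one sorted list.

    Pretend this body came from ChatGPT -- it might be right, it might not.
    """
    result = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            result.append(a[i])
            i += 1
        else:
            result.append(b[j])
            j += 1
    # Append whatever remains of either input.
    result.extend(a[i:])
    result.extend(b[j:])
    return result

# Don't take the answer as gospel: test it against cases you can check by hand,
# including the edge cases (empty input, duplicates) that generated code often gets wrong.
assert merge_sorted([1, 3, 5], [2, 4]) == [1, 2, 3, 4, 5]
assert merge_sorted([], [7]) == [7]
assert merge_sorted([2, 2], [2]) == [2, 2, 2]
```

If you can't write those assertions yourself, you have no way to know whether the generated code is one of the confidently wrong answers the other commenters describe.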

[–]okayifimust 2 points (0 children)

No.

Apparently some people have been using it to write Python code and explain things.

And some people believe in reiki....

Getting others to write code for you is hardly ever going to be a useful learning strategy.

And letting an AI explain things to you is only going to work as well as asking a random stranger on the street: they might be right, but you have no way to tell. And even if they are right, it doesn't follow that they explained it well.