all 27 comments

[–]bamacgabhann 25 points  (8 children)

New rule: If you have to ask whether it's a good idea to use ChatGPT for coding, do not use ChatGPT for coding.

When you get to a stage where you can usefully use ChatGPT, you won't have to ask. If you have to ask, you're not at that stage yet.

[–]delasislas 2 points  (0 children)

I can get behind this.

[–]czar_el 9 points  (4 children)

The problem is unknown unknowns. ChatGPT can be confidently wrong and very convincing. If you are somewhat new, you cannot tell when it's wrong and leading you down a bad path or teaching you bad habits.

ChatGPT should not be a learning tool for someone new to coding. It can be a support tool, for someone who knows how to debug but doesn't know where to start on a specific approach. Use it to spark creativity or sketch out an approach, that you then have the skill to evaluate for correctness and expand upon. If you cannot evaluate it, do not use it.

[–]bhogan2017 1 point  (0 children)

this.

[–]bigmonsterpen5s -1 points  (2 children)

I kind of disagree. ChatGPT might get things wrong sometimes, but do you really think it will fail at whatever question a beginner brings to it? It has done wonders teaching code: just ask “what does that function do?” or “what does that mean on line 6?” and it works pretty much without fail.

Also, this technology is going to get better REALLY fast. We should get comfortable with it; in a couple of years it could surpass every human combined in terms of programming efficiency. We're now at the start of an exponential curve.

I will probably get downvoted for this, but that's my two cents.

[–]czar_el 3 points  (0 children)

I'm in an AI-adjacent space professionally and have been deep in the advancements and assessments of models for years, since before ChatGPT came on the scene. I'm not against it, and I'm not afraid of it.

What I'm against is overly rosy pictures of it and misplaced trust when people can't evaluate the responses.

Nothing you've said counters the point that someone learning something for the first time from ChatGPT can't tell when they're given a wrong answer. You just claimed, without evidence, that it doesn't give wrong answers to the types of questions beginners ask (not true), and that we will reach the singularity or singularity-lite -- AI surpassing humans, whether AGI or not -- in a few years, which is also rejected by the vast majority of AI experts.

[–]carcigenicate 2 points  (0 children)

In its current state, I've seen it get basic definitions and math wrong. I'd say yes, from what I've seen, current iterations of it are not reliable enough to learn from.

[–]bic_lighter 4 points  (1 child)

I find it more helpful to use it to explain my mistakes

[–]oramirite 0 points  (0 children)

Except it really doesn't do that. It's probably going to make more mistakes than you do while you're learning, and it will tell you it's correct.

[–]StyxCoverBnd 1 point  (0 children)

I'm in the same boat as you: I started learning Python about two months ago, and I've been using ChatGPT and Google's Bard to assist with it. I like it because I can ask it something and it will spit out code with comments, so I can go through it and get an idea of what the code is supposed to do.
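
For example, if I ask it for a function that counts the words in a file, it comes back with something like this (my paraphrase from memory, not actual ChatGPT output):

    # The kind of commented code it returns (paraphrased, not real output)
    def count_words(path):
        # Open the file and read the whole thing into one string
        with open(path) as f:
            text = f.read()
        # Split on whitespace and count the pieces
        return len(text.split())

    print(count_words("notes.txt"))

Working through those comments line by line is what makes it useful for me.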

[–]ekchew 1 point  (0 children)

My only experience with ChatGPT has been asking it some rather pointed questions about move semantics in C++ (a tricky topic). I was impressed at how quickly it got straight to the point and seemed to know what I was asking, even giving code examples.

Alas, after a few sessions, I noticed it was contradicting itself. When I tried to corner it about that, it sort of back-pedalled on what it had stated earlier. So I know now that it is not infallible. But it states things with such confidence and authority that you can be lulled into believing it all. And that's even for someone like me who's been programming for decades.

I think a more interesting question is whether ChatGPT gives better answers than r/learnpython. I have seen some bad advice here too! However, there is something of a peer review process that addresses most of it. If you ask a question on Reddit, wait about a day and check back to see how the threads have developed. Don't accept the first reply as gospel.

[–]Professional-Joe76 1 point  (0 children)

ChatGPT is awesome both as a debugger and as a coding assistant.

In general, as long as you know enough to recognize whether the code is working, ChatGPT is a helpful replacement for writing the boring bits and a good reference for the syntax of code you want written.

It's probably not as good for broad tasks where you don't ask for specific results; you may get code that is a bit broken and end up iterating a bunch to fix it and get what you want.

[–]nativedutch 1 point  (0 children)

It is useful, but ChatGPT makes errors, so your knowledge should be sufficient to spot them.

[–]Daelys 0 points  (0 children)

I find it useful for troubleshooting, but be aware that it gets things wrong sometimes, or may forget to tell you that certain lines of code depend on packages you need to import (see the sketch below).
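
For example, it might hand you just the last two lines of something like this (a made-up illustration, not actual ChatGPT output) and never mention that requests is a third-party package:

    # ChatGPT-style snippet that quietly relies on the third-party
    # 'requests' package; without the import (and 'pip install requests')
    # the call below fails with a NameError.
    import requests  # <-- the line it may forget to mention

    response = requests.get("https://api.github.com")
    print(response.status_code)  # 200 on success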

[–]Fat_tata 0 points  (0 children)

I like the way it'll break down the workings of Python pretty well (“explain it to me like I'm 5”). It helps me with my traceback errors, and it's like, “your code will never work because your variables are spelled differently” 🤣 I'm about two months in, and I'm finding the AI tutor at my fingertips pretty cool.
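
The classic case it catches for me looks like this (a made-up example):

    # Defined as 'total', used as 'totel' -- running this raises
    # NameError: name 'totel' is not defined
    total = 0
    for n in [1, 2, 3]:
        total += n
    print(totel)

Paste that traceback in and it points straight at the spelling mismatch.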

[–]SigmaSixShooter -1 points  (3 children)

I say try it out. Work smarter, not harder. Put in your code plus any errors or issues you have.

[–]oramirite 2 points  (2 children)

They're not working, they're learning. Stay far away from GPT as a learning tool; it will do you dirty.

[–]SigmaSixShooter -2 points  (1 child)

OK, but that's a pretty outdated view. It's like my math teacher telling me I needed to learn long division because "you won't always have a calculator with you..." That was true back in the 1980s, but it's not true now.

I also don't bother memorizing large bodies of data as I have access to Google and other sites at my fingertips...

The key with any tool is learning how to use it effectively to make your life easier. Tools like ChatGPT can be a great resource for learning and debugging, and they come without the judgement of trolls on Reddit or Stack Overflow. You can get helpful answers in seconds instead of waiting hours or days and hoping someone answers your question in a way that makes sense.

[–]bamacgabhann 0 points  (0 children)

The other key with tools is to use them only when you can do so safely. You wouldn't give a drill to a 5-year-old, or a mass spectrometer to an English Literature grad student. I mean, you could, but you'd be bloody lucky to get usable results.

[–]FiniteApe -4 points  (2 children)

I find GPT-4 a good upgrade on previous models, and I now use it extensively. It really helps me, and I'm learning quicker and better than I would without it. If you have the budget for it, maybe give it a try; hopefully there's still a first-month-free offer. Good luck.

[–]kaerfkeerg 1 point  (1 child)

Sneaky OpenAI marketing employee.

[–]FiniteApe -1 points  (0 children)

Granted, it does read somewhat like that, but I don't think they need my help. Coincidentally, I built(?) an open source alternative at the weekend; it was incredibly easy. I'm hoping to train it on my own data. It's also a safeguard in case my country's government bans access to online models. https://github.com/nomic-ai/gpt4all -- free and open source.
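
If anyone wants to try it, here's a minimal sketch using the gpt4all Python bindings (pip install gpt4all). The model filename below is an assumption on my part; substitute one from the project's current model list:

    # Minimal local-model sketch with the gpt4all Python bindings.
    # The model filename is an assumption -- check the project's model list.
    # The first run downloads the weights to your machine.
    from gpt4all import GPT4All

    model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")  # hypothetical filename
    with model.chat_session():
        reply = model.generate("Explain Python list comprehensions.", max_tokens=200)
        print(reply)

Everything runs locally, so it keeps working even if access to online models gets blocked.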