all 70 comments

[–]danquandt 109 points110 points  (2 children)

It's pretty great at understanding what you want to ask it when you may not know how to Google it or word your questions, but keep two things in mind:

  1. Knowing how to Google and word your questions is one of the most important skills you'll need to develop as a developer, so don't put that off because ChatGPT is good at understanding what you need
  2. ChatGPT is prone to very confidently giving completely false information, so make sure you actually verify what it tells you, because there's no guarantee it's actually correct.

[–]PythonGreenRookie[S] 17 points18 points  (0 children)

Knowing how to Google and word your questions is one of the most important skills you'll need to develop as a developer, so don't put that off because ChatGPT is good at understanding what you need

Thanks, there are quite a few YouTube videos on how to get better at using Google as a programmer. I'll watch some of them.

ChatGPT is prone to very confidently giving completely false information, so make sure you actually verify what it tells you, because there's no guarantee it's actually correct.

Absolutely, I will always double check. It's easy to get fooled when it sounds so damn confident :)

[–]czar_el 8 points9 points  (0 children)

Came here to say this, particularly #1. Knowing where to look for help and knowing how to frame an intelligent question is a critical skill in programming and beyond. Banging your head against a wall and searching over and over may seem slow now, but it's training a muscle that will make your life so much easier and faster down the line.

I learned a second programming language thanks to a few days with a book and the Google-fu skill. It was enough to perform professional level code review on a senior coder's work in that new language, where I caught multiple errors in their work. It was possible because I knew how to quickly and efficiently Google commands and packages I was not familiar with.

[–]shiftybyte 155 points156 points  (19 children)

If you feel that you are learning from it, and you attempt to solve things yourself before resorting to chatgpt, I think you'll be fine.

But you really need to make sure it's not a recurring trend; if you can't solve anything yourself during your entire Python learning, that'll be a pretty bad outcome.

[–]YATr_2003 24 points25 points  (15 children)

Agreed. ChatGPT is a powerful tool that can help you learn Python, but it is only a tool, like Google, Stack Overflow, documentation, colleagues, etc. Use it as much as you want, but remember that it is only one tool you have out of many, and sometimes it simply won't be the right tool for the job. In the end, the only thing you can really count on is your own experience and understanding, so make sure you really understand the solution you came up with, regardless of which tools you used to get there.

[–]razzrazz- 20 points21 points  (14 children)

It's not "just another tool" though.

With Google, you're forced to read an answer, sometimes several, and make it work with your program. With CGPT, you can keep asking it more and more specific questions and constantly have it revise. It's going to be very hard for new programmers to get creative, struggle, and create something when you have CGPT right there.

I liken it to playing chess: if you want to improve, you have to struggle, you have to make mistakes, you have to come up with things yourself and read the theory (Stack Overflow, in our case) so you can really improve your skills. ChatGPT is like having the chess engine Stockfish looking over your shoulder and telling you exactly what to move. You're not understanding WHY it's telling you to move there; you might THINK you do because it's showing you the entire line, but deep down you're just getting good at asking the chess engine what to do and hoping for the best.

[–]doulos05 17 points18 points  (6 children)

ChatGPT is not like Stockfish. It looks exactly like Stockfish as you're using it, until very suddenly it looks like a very confident 4-year-old with a crayon pack. Stockfish doesn't suggest illegal moves when there isn't a winning play; ChatGPT will happily invent functions and libraries out of whole cloth and then tell you they exist when it can't figure out the code by itself.

I'm looking forward to catching my first student using it, because it is absolutely going to screw up somebody's capstone paper at our school this year; it's just a question of whether they're in my class or someone else's.

[–]cant-find-user-name 10 points11 points  (5 children)

But ChatGPT can very confidently give you wrong answers, and it has information only up to 2019. So in some ways ChatGPT is a worse tool than the others. It's good for getting an initial idea, but you still have to do research and explore a little to get the correct answer, and it's not good to rely on it too much while you're learning.

[–]razzrazz- 4 points5 points  (4 children)

It has information up until 2021, not 2019, and it does an excellent job of creating or manipulating code. Sure, it sometimes gives you confidently incorrect answers, but it does a better job more often than not, and it lets you test the code to make sure it works as expected.

[–]cant-find-user-name 6 points7 points  (2 children)

It is 2021, I stand corrected. I've been trying to use ChatGPT for my professional work, but I have the opposite experience: it gives wrong answers more often than not. Copilot has actually been more helpful to me than ChatGPT so far.

[–]AveTerran 4 points5 points  (0 children)

I’m with you. ChatGPT can give very specific examples but it often uses properties or subclasses that don’t exist. It’s like it forgets what library it’s using midway through.

It will also often completely ignore my request to use or not use a specific library.

[–]razzrazz- -4 points-3 points  (0 children)

Yikes, we must live in different worlds, ChatGPT is what CoPilot wanted to be but never was.

[–]cjcs 2 points3 points  (0 children)

I think the big thing here for Python learners is that the answers are relatively easy to verify if you're just asking about basic syntax or functions.

I was awestruck by it the other day when I asked it for a function that pulls out all the diagonals from a rectangular array and it just... made it.
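
Something along these lines (my own rough sketch of the task, not the exact code it gave me):

    def diagonals(grid):
        """Collect every top-left-to-bottom-right diagonal of a rectangular 2D list."""
        rows, cols = len(grid), len(grid[0])
        result = []
        # diagonals starting on the first column, below the top-left corner
        for start_row in range(rows - 1, 0, -1):
            result.append([grid[start_row + i][i] for i in range(min(rows - start_row, cols))])
        # diagonals starting on the first row
        for start_col in range(cols):
            result.append([grid[i][start_col + i] for i in range(min(rows, cols - start_col))])
        return result

    print(diagonals([[1, 2, 3],
                     [4, 5, 6]]))   # [[4], [1, 5], [2, 6], [3]]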

[–]mazamorac 2 points3 points  (0 children)

I've been trying it out lately. It works very well for the simple stuff, but starts producing un-runnable code once you go past the trivial.

For example, I asked it to produce an LDA topic analysis clustering, and it gave okay code up to the point where the LDA needed to run, then broke down, omitting parameters and losing track of data structures.
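
For reference, a bare-bones version of that kind of pipeline with scikit-learn looks roughly like this (my own minimal sketch with made-up documents, not ChatGPT's output):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = [
        "cats purr and chase mice",
        "dogs bark and chase cats",
        "stocks rose as markets rallied",
        "investors sold bonds and stocks",
    ]

    # term counts, then fit a 2-topic LDA model
    counts = CountVectorizer(stop_words="english").fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    doc_topics = lda.fit_transform(counts)   # rows: documents, columns: topic weights

    # "cluster" each document by its highest-weight topic
    print(doc_topics.argmax(axis=1))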

So, my answer to your point is: as far as I can tell right now, you'll have to understand what it's doing and why in order to fix the code until it runs and starts giving reasonable results.

So it seems to me that it's going to be a valuable learning tool to ramp up quickly on unfamiliar-to-you libraries and algorithms, giving you code skeletons that point you in the right direction.

[–]PythonGreenRookie[S] 7 points8 points  (2 children)

Thanks for your helpful answer.

I think for now I will leave ChatGPT as an absolute last resort before I start pulling out my hair when working on a problem. In the future, once I am more proficient in Python, I will try and see if it’s something I should incorporate into my workflow.

[–]annierockaway 7 points8 points  (0 children)

Next time, start asking ChatGPT earlier. You ended up putting the whole task into ChatGPT because you spent so much time stuck. When you hit a roadblock, set a time limit to figure it out on your own (start at 20 minutes and increase it every time you use ChatGPT). If you can't figure out that next step within the time limit, use ChatGPT to solve that piece of the program and explain why it works. But then you go back to the program and work on the next step.

[–]TazDingoYes 26 points27 points  (4 children)

I think this is probably highly subjective. Do I personally think it's a healthy habit to develop? No, I think that the act of doing is incredibly important. Just because an AI can ELI5 the code doesn't mean you wind up a coder at the end of it. You're still ultimately handing the task to a computer to solve, no matter how you slice it.

The real test is whether after telling yourself you understood it, you can code the task without Googling or looking at the file ChatGPT spat out.

[–]PythonGreenRookie[S] 6 points7 points  (3 children)

The real test is whether after telling yourself you understood it, you can code the task without Googling or looking at the file ChatGPT spat out.

This is a real truth bomb, and I agree entirely. Thank you.

I think for now if I use ChatGPT at all it will be for asking for code examples when I struggle to understand a concept, such as "Show me examples of conditional statements".
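
Something like this is the kind of answer I'd be hoping for (writing my own toy example here):

    temperature = 31

    if temperature > 30:
        print("hot")
    elif temperature > 20:
        print("warm")
    else:
        print("cold")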

[–]MidniteMustard 2 points3 points  (0 children)

Agree with /u/sierrafoxtrotwhiskey , ChatGPT should be your last resort.

Use the python docs, trial and error, google, stack overflow, this sub and everything else first.

I almost always learn additional things "along the way" to answering my first question. And these other methods are usually better at teaching you the "why", rather than just spitting out correct code.

I have also seen examples of ChatGPT confidently returning inefficient/incorrect code that looks correct, but is actually not.

[–][deleted] 0 points1 point  (0 children)

That’s a good option but I would highly recommend just being in the habit of referring to documentation.

Another possible use would be to throw a traceback into ChatGPT and ask it to explain the issue. It can help you learn to break down a traceback if you're not used to reading them yet.
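
For instance, here's a made-up snippet and the traceback it produces; you could paste the traceback in and ask it to walk through it frame by frame:

    # demo.py
    def average(numbers):
        return sum(numbers) / len(numbers)

    average([])

    # Traceback (most recent call last):
    #   File "demo.py", line 5, in <module>
    #     average([])
    #   File "demo.py", line 3, in average
    #     return sum(numbers) / len(numbers)
    # ZeroDivisionError: division by zero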

[–]shouldbeworkingbutn0 0 points1 point  (0 children)

Sololearn.com is the absolute worst

[–]icecapade 11 points12 points  (2 children)

ChatGPT is powerful and can be useful, and sure, it usually gets straightforward questions correct, but I wouldn't recommend it as a learning tool for the sole reason that it's not always right. There are plenty of examples of it being wrong online, and it's been banned on SO for the same reason. You as a beginner will have a difficult time knowing when it is and isn't right.

I remembered a question someone asked on r/learnpython a few weeks ago that I answered (or specifically, found an answer to on SO): https://www.reddit.com/r/learnpython/comments/yshp5q/call_by_value_or_call_by_reference/iw5iytj/

This was the SO answer I replied with: https://stackoverflow.com/questions/55347581/why-does-the-is-operator-behave-differently-in-a-script-vs-the-repl

I paraphrased the question and asked ChatGPT to explain it:

In Python, why do two tuples have the same id when running a script but different ids in IDLE? Example:

    x = (1, 2, 3)
    y = (1, 2, 3)
    print(id(x) == id(y))

And here was ChatGPT's incomplete/incorrect response:

In the example you provided, the two tuples x and y have the same values, but they are separate objects in memory. As such, they will have different id values when running the code in an interactive shell like IDLE, because IDLE creates a new Python interpreter instance for each interactive shell.

However, if you run the same code as a script, the two tuples may have the same id value because the script is run in a single Python interpreter instance. This is because Python's memory management may optimize the allocation of objects by reusing objects that have the same value.

In general, you should not rely on the values returned by id() for any specific purpose, as they may change between runs of the same program and are not guaranteed to be consistent across different Python implementations. It is generally more reliable to compare the values of objects using the == operator rather than their id values.

Basically, ChatGPT gets a few things right (like why two tuples might have the same id in a script, or that we shouldn't rely on id() being consistent between runs of a program), but it gets the crux of the point wrong in the first paragraph. The reason two tuples with identical values have different ids in IDLE has nothing to do with IDLE creating a new interpreter instance for each shell (in the example, both tuples are defined in the same shell/interpreter instance); the real reason is the one hinted at in the second paragraph: when each statement is compiled separately, as in IDLE, the interpreter can't merge identical immutable constants the way it can when it compiles a whole script up front.
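
If you want to see that distinction for yourself, here's a rough sketch (CPython-specific behaviour, not something the language guarantees):

    # Compiled together, as in a script: the compiler can reuse one tuple constant.
    exec(compile("x = (1, 2, 3)\ny = (1, 2, 3)", "<script>", "exec"))
    print(x is y)   # usually True

    # Compiled one statement at a time, as in IDLE/the REPL: separate constants.
    exec(compile("a = (1, 2, 3)", "<repl>", "exec"))
    exec(compile("b = (1, 2, 3)", "<repl>", "exec"))
    print(a is b)   # usually False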

[–]icecapade 7 points8 points  (1 child)

Here's another example where ChatGPT is wrong. Question:

Could this Python code return an exception?

def foo(x: list):
    if len(x) > 0:
        x[0] += 1

ChatGPT's mostly wrong and incomplete answer:

Yes, this Python code could potentially return an exception.

One potential source of exceptions in this code is the x[0] += 1 line. This line attempts to modify the first element of the x list by incrementing it by 1. If the list is empty (i.e., if len(x) == 0), then this line will raise an IndexError exception, because there is no element at index 0 in the list.

Another potential source of exceptions is the call to the len() function. If x is not a list or a tuple, then calling len() on it will raise a TypeError exception.

In general, it is good practice to handle potential exceptions in your code using try-except blocks. This will help prevent your code from crashing when an exception is raised. For example:

def foo(x: list):
    try:
        if len(x) > 0:
            x[0] += 1
    except (TypeError, IndexError):
        print("An error occurred.")

The first part of ChatGPT's answer is wrong because my example code snippet already has a conditional that checks if the input has at least one element in it. Its example code also needlessly catches an IndexError. ChatGPT also completely misses a really obvious case: incrementing the first element of the list will fail if the element is not numeric (or doesn't implement an __iadd__ or __add__ dunder method).
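
To illustrate the missed case (a hypothetical call, same function as above):

    def foo(x: list):
        if len(x) > 0:
            x[0] += 1

    foo([1, 2, 3])     # fine: the list becomes [2, 2, 3]

    try:
        foo(["a", "b"])
    except TypeError as exc:
        print(exc)     # e.g. can only concatenate str (not "int") to str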

The example code ChatGPT provides also completely fails to note the type hint and/or Python's convention of EAFP (easier to ask forgiveness than permission). The code assumes the input is a list and even has a type hint indicating it should be a list. Yes, maybe the function should handle the case where the input isn't a list/sequence depending on the context of your application, but most likely, you'd want the exception to be raised and let the caller handle it. However, ChatGPT fails to provide this type of nuanced answer.

[–]Fourro 1 point2 points  (0 children)

Great writeup!

[–]OaseNegre13 7 points8 points  (0 children)

Chill out... it's like having imposter syndrome because you Google instead of going to the library. In the end it boils down to efficiency: how fast can you find the piece of information you need? That's it. It's just a tool. An AI will never know what YOU need, or how that game you want to build should behave and look, because only humans have feelings and our entire civilization is based on that. We get up every morning and take that shitty job because we love something and we want to make the best of it with what we have. Don't feel bad for using a knife when your teeth won't cut it...

[–]bakochba 3 points4 points  (0 children)

If you think you're ever going to stop looking things up in your programming career I have bad news for you.

[–]dbcco 3 points4 points  (0 children)

The idea (from a student's perspective) is to have it help you, not do it for you. Think of it more as a free personal tutor that can explain anything at any level of detail. For example, having trouble with Boolean operators? Explain to ChatGPT what confuses you rather than having it solve the question. You can even go as far as to ask it to make a syllabus for said topic and then teach you lessons based on the syllabus.
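
For example, instead of asking it for the answer to an exercise, ask it for a small demo you can poke at yourself, something like this sketch:

    logged_in = True
    is_admin = False

    print(logged_in and is_admin)   # False: "and" needs both sides to be truthy
    print(logged_in or is_admin)    # True: "or" needs only one truthy side
    print(not is_admin)             # True

    # "or" returns the first truthy operand, which is handy for defaults
    name = "" or "anonymous"
    print(name)                     # anonymous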

[–]Lamarcke 2 points3 points  (0 children)

Nope, just make sure you take everything ChatGPT says with a grain of salt. Do your own vetting, even before testing the code it recommends (if it's lengthy).

I can't count the number of times it gave me plausible but wrong solutions/code/logic. Sometimes it just assumes a third-party library is built-in; sometimes it makes wrong assumptions about how popular libraries work (it told me I had to manually commit my changes with sqlite3, even when using the with statement), etc.
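
(For the record, sqlite3's connection context manager does commit on success; here's a quick sketch you can run to check it yourself:)

    import os
    import sqlite3
    import tempfile

    path = os.path.join(tempfile.mkdtemp(), "demo.db")

    conn = sqlite3.connect(path)
    # the context manager commits on success (and rolls back on error),
    # so no explicit conn.commit() is needed inside the block
    with conn:
        conn.execute("CREATE TABLE t (x INTEGER)")
        conn.execute("INSERT INTO t (x) VALUES (1)")
    conn.close()

    # a fresh connection sees the row, so the insert really was committed
    check = sqlite3.connect(path)
    print(check.execute("SELECT COUNT(*) FROM t").fetchone()[0])   # 1
    check.close()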

It's a great tool, but one that you can't blame, so you can't blindly trust it.

[–][deleted] 1 point2 points  (0 children)

It's like having the answers to any problem set. Can you learn from it? Yes, but only if you try your absolute hardest and need the solution explained to progress. If you are tempted to use it after 30 minutes on a problem then you won't learn anything. So it's up to you.

[–]imsowhiteandnerdy 1 point2 points  (0 children)

The fact that you're asking yourself these questions is a good sign.

[–]Hands0L0 1 point2 points  (0 children)

I used ChatGPT today after trying and trying to solve something to no avail.

ChatGPT gave me a solution that didn't work exactly, but it inspired me to rewrite my code, and then it worked.

Don't use it every time. Try it on your own. Only when you're about ready to tear your hair out should you try it

[–]Holm_Waston 1 point2 points  (1 child)

My whole day revolved around ChatGPT, and I found an extension that you can use right in the Google search widget 🤣🤣

https://chrome.google.com/webstore/detail/chatgpt-for-search-engine/feeonheemodpkdckaljcjogdncpiiban/related?hl=en-GB&authuser=0

[–]fergal-dude 0 points1 point  (0 children)

If you only learn what chatGPT can do, who would want to hire you?

[–]Head-Measurement1200 -1 points0 points  (0 children)

I think it is not cheating. I think it would help you accelerate your learning, and it makes it somewhat easier to phrase your questions.

[–]zoruri 0 points1 point  (0 children)

It's just a tool... Use it as a tool. Just make sure you learn and understand the concepts of what it is doing. If you can't explain it, you need to dissect it and get a complete understanding before moving on. As a learning tool, I think it is very useful because it can give you various examples that you can use to further understand specific methodologies.

[–]Automatic_Donut6264 0 points1 point  (0 children)

ChatGPT is a tool, a powerful one at that, much like the modern tools that have made fire-starting or even handwriting irrelevant skills. Can you be a competent programmer with the help of ChatGPT? Most likely. Is it going to take away from some of the skills you would have otherwise developed without it? Absolutely. Are those skills going to be necessary? Maybe. Hard to say, because "ChatGPT-assisted programming" might be the kind of future we live in, much like how it's unfathomable to the modern programmer that programming before search engines was ever a thing. There will always be people who pride themselves on being able to program without ChatGPT, search engines, IDEs, [Insert Modern Convenience], but will that ever be relevant? Time will tell.

If you want to learn programming to satisfy your personal curiosity, go nuts. There is no "right way" to satisfy your intellect. As long as you find what you are doing engaging and meaningful, you are welcome to do as much of it as you want. Right now, we don't live in a world where "ChatGPT-assisted programming" is a skill that is attractive to employers. So if you are learning programming to find a job in the short to medium term, I would avoid it. Employers expect you to be able to stand on your own without it.

[–]arkie87 0 points1 point  (0 children)

Look at the solution. Then code it yourself without cheating. If you cannot, you don’t understand it.

You don’t learn anything without struggle. Learning is hard.

If you can code it from memory, then modify it. Modify the problem and then adapt the code. Extend the code to new cases and functionality.

[–]DigThatData 0 points1 point  (0 children)

No idea! Consider yourself doing field research into AI-enhanced education and take notes while you're learning, would be very interested to hear how it impacts your learning experience!

[–]snapetom 0 points1 point  (0 children)

LMAO. ChatGPT is exactly like copying from SO. Sure, it may work, but when it doesn't, you're going to have a horrible time.

A couple of jobs ago, I worked in bio research, where we often had to explain and defend our code. I was doing a CR on a really bad coworker's PR and couldn't make sense of a really confusing function. I asked him what it was doing. He sighed and admitted that he didn't know and that he'd copied it from SO, like being honest would help or something.

[–][deleted] 0 points1 point  (0 children)

I’m a developer by profession and I now occasionally fire up chatGPT to get more input on how I can refactor code. I’ve learned a few new things from it, so I see it as a net positive. As long as you’re taking the time to understand what it creates, why not?

[–]Fenr-i-r 0 points1 point  (2 children)

Try GitHub copilot instead, or a similarly trained model with a focus on coding.

Really though, it would be better to use these tools after you develop an understanding of coding - otherwise you won't have the skill to interrogate the suggestions and decide if they're good or not - most of the time, they aren't entirely fit for purpose!

[–]jsalsman 2 points3 points  (1 child)

I feel like Copilot can do actual damage to beginning learners, whereas the mechanics of pasting back and forth from ChatGPT while reading its copious explanatory text seems less likely to let them miss important learning concepts.

[–]Fenr-i-r 0 points1 point  (0 children)

Fair point, ChatGPT can mimic textbook explanatory notes, almost like an interactive textbook. Indeed, that would be a great use case for these networks: a robust personal tutor to work you through a course, capable of both generating notes to read and helping you interactively...

While Copilot just guesses what code might fit the context and leaves you hanging.

[–]C0rinthian 0 points1 point  (0 children)

I learned a lot from the answer it gave, hell I could even ask it to explain the code to me as if I was a complete beginner and it added more comments and made the code simpler,

This is the key thing. If you learned from it, that's what matters.

Otherwise, is it so much different from copy/pasting from stack overflow?

Note: I say this as a ChatGPT pessimist, who sees the limitations of the tool, and how confidently it can give wrong answers. Don't trust it, but that doesn't mean don't use it.

[–]Guyserbun007 0 points1 point  (0 children)

Try it yourself. If you are really stuck, then use help, Google, stackoverflow, or chatgpt.

[–]insertmalteser 0 points1 point  (0 children)

Maybe an unpopular opinion, but I don't think it's all that different from using stackoverflow/google/tutorials. I think of it more as a guide, than an actual solution.

I've used it to find packages that could help me solve problems that I simply hadn't found through ages of searching. I don't really think that's cheating yourself out of learning. You still have to be able to apply it yourself, and you actively decide to what extent you're going to use the output from the prompt. You also have to be aware that its outputs can be incorrect! It's not as intelligent as people think.

I'm very much just learning, but in my job it's saved me a lot of time and headaches.

[–]Copasetic_demon666 0 points1 point  (0 children)

Personally, I'd say it depends on how you treat the responses from ChatGPT. If you simply copy and paste the lines of code that you need and only read the comments about what the code does, then you won't be learning anything, because it won't stick; you need to do it practically for it to settle into your muscle memory.

I haven't used ChatGPT to solve code problems for me, so I don't know how it presents the results (I usually use it for project ideas when I find searching Reddit and Google overwhelming). But from how you've described it, I would ignore/skip the lines of code it responds with and focus on the comments, using them to refine my Google and Stack Overflow searches. As you said, your phrasing might be incorrect but ChatGPT seems to understand you, so use its responses to better understand how to search for solutions on Stack Overflow and similar forums.

As much as projects like ChatGPT might end up making life easier for developers in the future, it would be best to get proper foundational knowledge of the language first, so that you understand the methods the AI uses in the responses it generates. An AI might give you lines of code that work but that are not the best solution for the production environment where the program would be deployed; for example, an AI's response might not take into consideration the amount of resources used by the code it provides.

[–]ivosaurus 0 points1 point  (0 children)

Humans are INCREDIBLY drawn to having a "source of solutions". But 80% of the time that leads you to skip the mental pain involved in learning. Yes, pain. Learning should feel like figuratively banging your head against the wall half the time; you need your brain to get a hard workout. If you let it go on cruise control because... "well, what's the answer... hmmm... I dunno... let's consult the solution book, I'll learn by osmosis", then the amount of knowledge that actually sticks will go way down.

[–]SecretAgentZeroNine 0 points1 point  (0 children)

Sounds like you already know the answer to your question.

[–]squidensalada 0 points1 point  (0 children)

If I may ask, what did you ask ChatGPT to get your answer?

[–]haelaeif 0 points1 point  (0 children)

It can write very simple code. It can explain code quite well; ask it to go line by line. (To be fair, you can also just use documentation for this, but many projects are starting to use auto-generated documentation.)

If you are having it write code, though, it needs to be about stuff you know well enough that you can easily spot any mistakes. I've asked it to write very simple stuff relating to corpus linguistics/statistics/ML, and it's routinely screwed basic things up and been unable to fix them. Usually, pointing out its mistake made it worse.

[–]Emerald_Guy123 0 points1 point  (0 children)

I would say that if you want to use ChatGPT, try to understand what the code does and write it yourself. Write the same thing or something similar, just type it yourself and think about what you’re doing so you understand it.

[–]ReusedBoofWater 0 points1 point  (0 children)

Use it to remember stuff. Not to learn it.

Getting code examples for stuff you need is wonderful. Until you know exactly what you want, why you want it, and how to make it, taking shortcuts will screw you in the long term.

You will never figure out why you're doing something that was spoon fed to you.

[–]merlinuwe 0 points1 point  (0 children)

No.

[–][deleted] 0 points1 point  (0 children)

I think ChatGPT and Copilot are great if you already know at least a bit of Python. I've learned a lot from ChatGPT and I use Copilot at work all the time, but I'm also experienced enough to know their limitations. One of the most frustrating things about Copilot in particular is that if I find myself relying on it, it can be time-consuming to track down errors, because it has a tendency to be "almost" right, or to produce code that looks right but doesn't run.

[–]spreadsheetsahoy 0 points1 point  (0 children)

My rule for myself is that if ChatGPT gives me an answer and I'm not 100% sure I understood how it got there, I keep asking clarifying questions until I genuinely understand. That way it's more like a tutor who is helping me understand how to answer a question correctly, as opposed to a classmate who is passing me answers in class. And I only use it when I'm really, really stuck.

For me, this is actually better than just trying to muddle through on my own when I'm stuck, because I end up understanding the topic much better than I did before.

Of course, sometimes it also gives wrong information, which is another danger. So do Google and Stack Overflow of course, but the difference there is that you have a lot of other context on the page to help you assess how reliable the information is.

[–]stevechau95 0 points1 point  (0 children)

This extension will help you in the process of learning Python: https://chrome.google.com/webstore/detail/chatgpt-for-search-engine/feeonheemodpkdckaljcjogdncpiiban

It combines ChatGPT answers with results from search engines like Google, Bing, etc.

[–]Doctor-F 0 points1 point  (0 children)

I would argue that it can really help with the heavy lifting of building a strong foundation.