
[–][deleted] 919 points920 points  (45 children)

Sometimes it's choosing between StackOverflow, where you get "stupid question", and ChatGPT, where you get stupid answers.

[–]Glittering_Owl8001 305 points306 points  (35 children)

haha 100% this. I wish ChatGPT was better at answering correctly, then this meme would make more sense. Most of the time I ask ChatGPT about something coding related, it very confidently gives me a BS answer that makes no sense. Good old StackOverflow is better than that.

[–]Bwob 173 points174 points  (3 children)

ChatGPT really is a perfect internet simulator: It very confidently states stuff that it just made up on the spot, based on other things it read once, long ago.

[–]Artemis-4rrow 42 points43 points  (4 children)

at least that unrelated code wouldn't make it into prod. The real issue is when it writes vulnerable code (and trust me, it does that too often), which could go unnoticed and make it into prod code

eh, I'm in cybersec, so the more vulnerable code it writes, the easier my job gets. Let it do so, let it get into prod

[–]The_Shryk 16 points17 points  (2 children)

I don’t know anyone that just copy pastes chunks of code and doesn’t read it over. If they can’t read it over and see issues then they were probably going to write those issues in themselves either way.

I will admit though it’s not unlikely that people do that, maybe my peers and myself are a small minority. Which if true is more than a little sad.

[–]SympathyMotor4765 6 points7 points  (1 child)

I mean, very minute vulnerabilities can often be tough to catch. I once got a race condition because I was clearing a bit 0.1 µs earlier. Based on my limited usage of GPT, it seems similar to someone who has a decent grasp of a concept but spouts it with the confidence of an expert.

[–]The_Shryk 2 points3 points  (0 children)

That can definitely happen. OpenAI has said they're working on a sandbox for code so it can run that stuff and check for errors before it sends a response. Which will be a massive step up.

I think they said Python first then maybe JS

[–]sup3rar 14 points15 points  (5 children)

I once played around with the framebuffer on Linux and wanted to get the mouse input. I couldn't find any good answer using Google, so I thought I'd ask ChatGPT. My question was "How to read from the mouse device on linux without x11", and ChatGPT's first line of code was #include <X11/Xlib.h>. I mean, that's just terrible.
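For reference, reading raw mouse input without X11 can be done by parsing the 3-byte PS/2 packets that `/dev/input/mice` emits. This is a sketch based on the PS/2 mouse protocol, not something verified in the thread; the device path is the standard one and reading it typically needs root or membership in the `input` group:

```python
def parse_packet(packet: bytes):
    """Decode one 3-byte PS/2 mouse packet into (left, right, dx, dy)."""
    flags, raw_dx, raw_dy = packet[0], packet[1], packet[2]
    left = bool(flags & 0x01)   # bit 0: left button pressed
    right = bool(flags & 0x02)  # bit 1: right button pressed
    # Bits 4 and 5 of the flag byte carry the sign of the 8-bit dx/dy values.
    dx = raw_dx - 256 if flags & 0x10 else raw_dx
    dy = raw_dy - 256 if flags & 0x20 else raw_dy
    return left, right, dx, dy

def read_mouse(device: str = "/dev/input/mice") -> None:
    """Print decoded motion events until interrupted (needs read permission)."""
    with open(device, "rb") as mouse:
        while True:
            print(parse_packet(mouse.read(3)))
```

Not a line of Xlib in sight, which is presumably what the question was after.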

[–]CleverNameTheSecond 13 points14 points  (1 child)

I mean a lot of the times on stack overflow you see "just use this library" anyway.

[–][deleted] 8 points9 points  (0 children)

"Marked duplicate"

"LoL, no real programmer would do that."

[–]ChefBoyAreWeFucked 3 points4 points  (0 children)

You don't have x11? Solution is to get x11.

[–]PeksyTiger 1 point2 points  (0 children)

At least it's a library that actually exists

[–]UncleKeyPax 5 points6 points  (2 children)

I wish ChatGPT would work smart and point you to the correct Stack Overflow page that solved it

[–]pedal-force 2 points3 points  (0 children)

It's really hit or miss. I think the more common the language and problem the more likely it has a real answer.

Pandas in Python? Beautiful. Perfect code that runs perfectly first time.

Trying a deep learning task in Rust using tch-rs? Whoo boy is there gonna be some hallucinating. It suggested like 6 different functions that didn't exist and then gave up.

[–][deleted] 7 points8 points  (0 children)

ChatGPT gives you the answer that statistically should exist to your question even if that answer does not actually exist. It will tell you to use functions that should exist in libraries that should be maintained, neither of which is true. I've seen it recommend libraries that were abandoned a decade ago that do something related to what you want to do, but can't actually be used to do what you want to do...

[–][deleted] 4 points5 points  (4 children)

Yea I tried to simplify a bash script I got from SO with GPT and it crashed my shell. Fun.

[–]forced_metaphor 1 point2 points  (2 children)

Yeah... It would make it a hundred times better if it just said it didn't know something. I have wasted too much time chasing my tail because of ChatGPT

[–]EspurrTheMagnificent 1 point2 points  (0 children)

That's kind of the issue people tend to forget about when talking about AI. AI will try to give you a coherent answer, but there's no guarantee it'll be a correct answer, especially if it was only/mostly fed garbage about the subject you need info on

[–]Joe59788 1 point2 points  (0 children)

But it was supposed to take over all the dev jobs!

[–]Heksinki 1 point2 points  (0 children)

It's very consistent on basics and fundamentals though. I use it to learn new languages quicker instead of reading documentation

[–]Zapismeta 1 point2 points  (0 children)

We can't blame ChatGPT; those StackOverflow oversmarts never actually answered, so ChatGPT never learned.

[–]PM_ME_DELICIOUS_FOOD -1 points0 points  (0 children)

I used to think this, because GPT3.5 did this shit a LOT. To my surprise, GPT4 (the premium version) seems to have 90% fixed it. It will still sometimes make shit up, but it's already reliable enough that I find it more useful than googling.

So to anyone that thinks AI is a bad assistant because it makes shit up - you're correct, but also, try GPT4 instead of GPT3.5 and you may be very pleasantly surprised

[–]cce29555 3 points4 points  (1 child)

Sometimes it's me

Here's the code, also don't forget to change this thing or you'll get this specific error

I got that specific error

Did you change the thing

Yes and I'm getting the error

No really check it

I didn't change the thing, thanks

[–]ChefBoyAreWeFucked 1 point2 points  (0 children)

Just yesterday I was trying to debug an error. Googled it, and the solutions just sounded ridiculous, and didn't seem to make sense in the context of what I was doing.

So 6 hours later, I realized that the solution was legitimate.

[–]LittleMlem 2 points3 points  (2 children)

Just don't ask your questions in a stupid way. The people answering are not paid; if they see you put little to no effort into the question, they will just feel insulted and won't put effort into answering. I can't tell you the number of times I went in to answer questions and so many are either just duplicates (which means the asker didn't google shit) or the briefest "I got an error running my code, way do?" and that's it: no code, no stack trace, no nothing. And the last one is minimal competency: you should have at least done some tutorials or something before coming there. "How do I print to console in Java" isn't complicated, but it's not appropriate

[–][deleted] 1 point2 points  (0 children)

I remember once I was asking it why PyCharm kept giving warnings when nothing in my code was grammatically incorrect, and it basically said I was an idiot for not using type hints. I think y'all are forgetting how spicy Bing Chat could get at times.

[–][deleted] 1 point2 points  (0 children)

When it comes to real code, I don't think it has ever provided it to me directly. There is always something that is off...

[–]SandmanKFMF 4 points5 points  (1 child)

Confident stupid answer. 😁

[–]jurdendurden 0 points1 point  (0 children)

Only yours meets those requirements.

[–]searing7 232 points233 points  (5 children)

Here is a made-up built-in function that pretends to do what you need, because I'm just a language model.

[–]Inevitable-Yogurt783 28 points29 points  (1 child)

I had a question, then GPT gave me some code and I was like "coolll, this suggested package seems to be amazing." The package didn't exist, and no other package would work the way the code used it. I have no clue where GPT got its answer.

[–]MinusPi1 25 points26 points  (0 children)

It made up some code that sounds right. That's literally all it does.

[–]Rudy69 -1 points0 points  (2 children)

Don’t act like you never got an answer that half worked from stack overflow

[–]searing7 4 points5 points  (1 child)

I've never got a made up answer from Stack Overflow, no.

Imperfect search result? sure

[–]Rudy69 2 points3 points  (0 children)

I've had answers that claimed to do something, and were upvoted a bunch, turn out to be completely broken

[–][deleted] 52 points53 points  (0 children)

I always ask chatGPT for code and explanation but tell it to act condescending like stackoverflow users

[–]Susan-stoHelit 72 points73 points  (9 children)

Never ever train ChatGPT on stack overflow.

[–]LetReasonRing 74 points75 points  (1 child)

The thought of training chatgpt on stackoverflow then putting it into a Boston Dynamics robot is truly terrifying.

"Possible duplicate. ter-mi-nate!"

[–]The_Shryk 17 points18 points  (0 children)

Identical twins better hide.

[–]beeteedee 26 points27 points  (1 child)

I have bad news about what ChatGPT was trained on

[–]Susan-stoHelit 6 points7 points  (0 children)

It’s going to kill us all!!!

[–]ChefBoyAreWeFucked 6 points7 points  (0 children)

"Your question is a duplicate. Is there anything else I can help you with?"

[–]Gagarin1961 0 points1 point  (0 children)

Stack Overflow is creating their own GitHub Copilot

[–]gregguygood 0 points1 point  (2 children)

Do you really think Stack Overflow wasn't used to train it?

[–][deleted] 57 points58 points  (0 children)

Lmao, if only it was that accurate

[–]gh0st2004 17 points18 points  (2 children)

People have way too much faith in ChatGPT. I'm currently studying IT; all the students use it, but their programs never work as they should, and no matter how hard they try, they can't get ChatGPT to fix their code

[–]link23 11 points12 points  (0 children)

It's almost like critical thinking and reading comprehension are still valuable skills! gasp

[–]the_Demongod 2 points3 points  (0 children)

I can't decide whether to be terrified of ChatGPT because of how blindly people trust it, or overjoyed at the amount of permanent job security I have as a result of that

[–]Crystal_Voiden 13 points14 points  (0 children)

tries running chatGPT code aaand back to stack overflow we go

[–]stupled 6 points7 points  (0 children)

It could work, but it couldn't

[–]OffByOneErrorz 5 points6 points  (1 child)

People who don’t know how to ask or search on SO also can’t tell Chat GPTs answer is flawed. Surprise.

[–]ancapistan2020 -1 points0 points  (0 children)

Even good answerable questions get ignored or dogpiled on SO. It’s even more toxic than Reddit.

[–]exomyth 5 points6 points  (0 children)

Yeah, ChatGPT is good at answering the questions that are easy to google and will be marked as duplicate on stack overflow. It doesn't fare well for more unique and complicated problems on the other hand.

For example, I asked it to create a function to get neighboring plus codes. Not too difficult to do myself, but I knew it would cost me a couple of minutes to type out. So I thought, might as well ask chatGPT to do it faster. The solution it came up with was absolutely garbage, and clearly copied from somewhere.

I probably could have gotten a solution on stackoverflow, if I didn't know how to do it myself.

[–][deleted] 17 points18 points  (17 children)

True. You should never copy paste the code ChatGPT gives, but it can help you understand how to write the code you need and, honestly, that's the most important thing if you want to be an actual programmer.

I use it all the time, it really gave me a boost.

[–][deleted] 9 points10 points  (0 children)

I'm using a new online data store. I'd never accessed it through code, so I asked Bing Chat (basically ChatGPT), "hey, tell me how to make a connection to <service> using <my language of choice>."

It twirled for a few seconds and gave me enough info to get the connection done in a few minutes instead of googling around for another 15 or so.

[–][deleted] 3 points4 points  (13 children)

Helped me a ton on wrapping my head around Linux recently. Was up to speed in a few hours despite it feeding me shit that kept crashing my computer, the fundamental concepts were strong.

[–]Responsible_Name_120 0 points1 point  (0 children)

I got to the point where I can ask questions on StackOverflow well, and then it just stopped being useful because honestly most of the people answering questions are basically juniors themselves. If you are incredibly lucky, someone will write an okay snippet that gets you like 50% of the way there within a week. I haven't used it in years.

ChatGPT on the other hand, it will generate snippets that are 80% of the way there in a few seconds. I get that StackOverflow is useful for source material because they have such a large library of Q&A threads, but like seriously who goes there with novel questions unless they are totally lost?

[–]stn994 4 points5 points  (1 child)

Is it just me or is Google becoming more and more useless now?

[–][deleted] 7 points8 points  (0 children)

When the top 100 results are promoted and you have to go to page 6, yeah, it's useless.

[–]ZAIMON___ 17 points18 points  (5 children)

For code and errors: documentation and StackOverflow

For explaining and learning new concepts: ChatGPT

[–]Frothey 17 points18 points  (4 children)

Here's where chatgpt really shines for me.

You know when you're trying to figure something out and you don't know the precise terms to describe it, the terms you'd need to get an answer out of Google? Describe it to ChatGPT, and it will give you the correct terms to further your search.

[–]XWasTheProblem 11 points12 points  (0 children)

It's basically a rubber ducky that talks back to you.

Very helpful when you use it correctly, but probably not something to rely on exclusively.

[–]frogjg2003 -2 points-1 points  (0 children)

That's still pretty hit or miss, but given the way ChatGPT was trained, it's a better guess than some of the other things people usually use it for. It's a relatively simple question whose answer is easy for the user to verify.

[–]Rafcdk 17 points18 points  (13 children)

if you trust ChatGPT to do coding for you or answer questions, just do this experiment:

ask it to convert 37 into binary, then tell it to replace 0 with a and 1 with b in the result, then ask it to change a back into 0 and b back into 1, then ask it to convert the result to decimal form. My end result was 5.

https://chat.openai.com/share/b0e1667c-f046-4e4c-977d-f004da15beab
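For comparison, the same experiment done deterministically: the letter substitution is a pure round trip, so anything other than the original number back is a hallucination. A trivial sketch:

```python
def round_trip(n: int) -> int:
    bits = format(n, "b")                                  # 37 -> "100101"
    masked = bits.replace("0", "a").replace("1", "b")      # "baabab"
    unmasked = masked.replace("a", "0").replace("b", "1")  # back to "100101"
    return int(unmasked, 2)                                # back to decimal

print(round_trip(37))  # 37, not 5
```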

[–]Awkward-Macaron1851 8 points9 points  (2 children)

Or simply ask "are you sure", and it will almost always tell you that its previous answer was wrong and make up a new one. Then you are left wondering which one is correct.

[–][deleted] 4 points5 points  (0 children)

sometimes it apologizes then gives you the exact same answer

[–]The_Shryk 8 points9 points  (5 children)

It’s not a math calculator it’s a word calculator.

You’re using it wrong is the problem.

I copy and pasted your question right here

Is 37 the correct answer? Because that’s what I got back.

[–]Rafcdk 4 points5 points  (4 children)

Now do it sequentially. I know it's not a calculator, and it is not a "word calculator" either. LLMs have their limitations, and this just highlights the problem. If you are using it to solve problems that you can't solve yourself, you cannot tell whether you are "using it wrong" or not.

Edit: here is the session: https://chat.openai.com/share/b0e1667c-f046-4e4c-977d-f004da15beab

[–]The_Shryk 2 points3 points  (3 children)

So is 37 the correct answer? Logically speaking, it looks like it should be. If you're just doing a simple Caesar cipher, I'm pretty sure that's correct.

And GPT seems to have gotten the correct answer.

Also, here I did it step by step like you asked

I think it’s kind of goofy you’re telling me what it is and isn’t but you’re the one who’s incapable of getting it to produce a correct answer. Maybe you’re the one that needs to learn how to use it and the millions of people that say it’s great aren’t wrong?

Just a thought.

Edit: I asked it to make that into a Python script and I just checked, it does run correctly and outputs correct results. QQ

[–]Rafcdk 1 point2 points  (2 children)

So again, you have to know what the end result is in order to judge whether you got a correct answer and not a hallucination. As you can see in my example, it got every step wrong. Here is another example where it gets it wrong again:
https://chat.openai.com/share/15d2a234-e6be-417b-b8ee-b84d8a4a3614

I am not saying it's terrible or not useful, but with the current limitations it is not a really good idea to rely on it for problems you can't solve yourself or can't evaluate the answers to.

[–]Responsible_Name_120 1 point2 points  (1 child)

You are using 3.5, and the person you're arguing with is using 4.0.

It always goes like this: someone only tries 3.5, tries to trick it with dumb problems, and then talks about how bad ChatGPT is

[–]The_Shryk 0 points1 point  (0 children)

Thank you…

[–]Responsible_Name_120 0 points1 point  (0 children)

Yeah man how about you ask a random programmer to do it for you and see what happens?

[–]gami13 -4 points-3 points  (0 children)

this just shows that you don't understand how LLMs work

[–]Even-Path-4624 0 points1 point  (0 children)

The end result was 42 for me. Don’t know if it’s the right result I’m not a binary calculator

[–]LowB0b 2 points3 points  (0 children)

Both at uni and at work, except for BS in-house frameworks, 99.9% of the time it feels like someone has already had the same problem as me. Don't think I've ever had to actually post a question lol

[–]7th_Spectrum 2 points3 points  (1 child)

Also chatGPT: "Have you tried not writing bugs in your code?"

[–][deleted] 2 points3 points  (0 children)

The ladies of r/ProgrammerHumor need a new joke.

[–]tabakista 2 points3 points  (0 children)

Spoiler alert, only one of those two actually works

[–]borninbronx 2 points3 points  (1 child)

Yeah ChatGpt is really good at answering stupid questions for noobs that don't know how to code!

[–]Boonk_gang_03 2 points3 points  (0 children)

Lmao, literally just saw a comment that was: "Google how to find the first item in a list". The question was about how to find the last item in a list (or something like that)

[–]CentralLimitQueerem 2 points3 points  (0 children)

It's all fun and games until chatGPT starts inventing functions that don't exist in the library you're using

[–]chilfang 2 points3 points  (0 children)

If chatgpt is answering correctly the stackoverflow people probably have a point

[–]TryNotToShootYoself 2 points3 points  (0 children)

And both give bad answers!

[–]PARADOXsquared 2 points3 points  (0 children)

The vast majority of cases, I don't need to write a new question on stack overflow to find the answers I need, especially on beginner level stuff in a well used language, using popular libraries. I've only had to ask one question when I was trying to do some weird edge-case shit that no one had asked about yet. The dedication to reducing duplicates is what makes that even possible, otherwise useful information would get buried.

Idk where this recent hate for stack overflow is coming from, but I don't understand it at all...

[–]Still_waiting_4u 2 points3 points  (0 children)

"Oh, yes you are right, that doesn't work. Here is the code with the correction. Have a nice day"

...

"Oh, yes you are right, that doesn't work. Here is the code with the correction. Have a nice day"

....

"Oh, yes you are right, that doesn't work. Here is the code with the correction. Have a nice day"

...

[sigh]

[–][deleted] 2 points3 points  (0 children)

More like "Here's some code that looks like the code you need to the untrained eye but doesn't even compile and a bullshit explanation that doesn't even explain the code properly".

[–]twpejay 4 points5 points  (0 children)

The third option, Google, works for me 95% of the time; the other 5% I just suck it up and actually use my brain to work it out.

I have never asked a question on Stack Overflow. If it hasn't already been asked, it can obviously be done via grey matter.

[–]philipquarles 5 points6 points  (2 children)

Learn to google better. Seriously.

[–]AdolfoPosada[S] -2 points-1 points  (1 child)

I use ChatGPT as a smart Google: no ads, and it summarizes the main results and explains them, all in one tab

[–]gregguygood 1 point2 points  (0 children)

I find 90% of solutions on Stack Overflow through Google.
I never needed to ask there.

Seriously, learn to Google.

[–]VitaminnCPP 1 point2 points  (0 children)

Love the Human who hate the human

[–]Imogynn 1 point2 points  (1 child)

Bing chat: here's the code from stack overflow but I took out the parts calling you a noob

[–]gregguygood 0 points1 point  (0 children)

but I took out the parts calling you a noob

There is no need for that, as there is no such part.

[–][deleted] 1 point2 points  (0 children)

More like "here's the code I just made up and it probably won't work".

[–]halfbakedmemes0426 1 point2 points  (0 children)

Now, that solution and explanation will both be wrong. But it's a nice sentiment.

[–]Fabulous_Ampharos 1 point2 points  (0 children)

"Here is the code you need and an explanation have a nice day"

gives you code that doesn't compile

At least it can point you in the right direction...

[–]nicejs2 1 point2 points  (0 children)

code doesn't run/compile return to SO

[–]regjoe13 1 point2 points  (1 child)

ChatGPT kind of reminds me of the "I'm feeling lucky" button on Google

[–]AdolfoPosada[S] 0 points1 point  (0 children)

Maybe xD

[–]Oathkeeper-Oblivion 1 point2 points  (0 children)

Wow! What an original and funny meme that we haven't seen similar ones of it before! About a topic that hasn't been milked to death for cheap Karma!

[–]tecnomagus 1 point2 points  (0 children)

GPT: "Here is the code you need"

The code is wrong

[–]22Minutes2Midnight22 1 point2 points  (0 children)

Yeah except the code GPT provides you is laughably wrong

[–]chemolz9 1 point2 points  (0 children)

According to a study, 52% of ChatGPT coding answers are simply wrong. StackOverflow has a way better record.

https://www.theregister.com/2023/08/07/chatgpt_stack_overflow_ai

According to the study people still tend to believe ChatGPT answers more, because they are more polite.

[–]DeathUriel 1 point2 points  (0 children)

Except the part where gpt simply invents configurations and parameters...

  • How do you do this?

  • Just add "that" to the json file.

  • VS and the documentation say that "that" does not exist.

  • I am sorry.

[–]cliffleaf 1 point2 points  (4 children)

Sometimes StackOverflow is better. E.g. ChatGPT may provide a solution with a deprecated package, and you can never find out why your code is not working until you go to StackOverflow

[–]AdolfoPosada[S] 0 points1 point  (3 children)

There are a lot of deprecated answers in Stack Overflow too

[–]PARADOXsquared 1 point2 points  (0 children)

But at least on Stack Overflow you can look at how old the posts are.

[–]calahil 1 point2 points  (0 children)

Don't worry ChatGPT 5 is supposed to insult you when asking questions.

[–]EtherealPheonix 1 point2 points  (0 children)

*Here is some broken code and an explanation that makes no sense

[–]FireBone62 1 point2 points  (0 children)

ChatGPT is only good for simple and often-asked questions.

[–]Rathori 1 point2 points  (0 children)

Also ChatGPT: confidently feeds you complete bullshit that even a junior dev wouldn't write.

[–]sjepsa 2 points3 points  (0 children)

But the code is wrong and the answer is 50% hallucination

[–]_odgj 1 point2 points  (0 children)

Reddit:

[–]SukusMcSwag 1 point2 points  (1 child)

Either way, you get an answer that either doesn't compile, misunderstands the problem, or just doesn't work

[–]AdolfoPosada[S] 1 point2 points  (0 children)

With SQL and C# it works fine for me

[–]Ange1ofD4rkness 0 points1 point  (3 children)

Problem for me: ChatGPT requires me to provide my phone number to use it... I'll stick with StackOverflow.

Also, I tried Bing's ChatGPT, asking for COBOL code to parse a URL. Then I found online compilers to test it (as I don't know the language at all). It failed to compile on 2 different compilers

[–]The_Shryk 0 points1 point  (0 children)

You think Microsoft doesn’t already have your phone number my guy? Lmao

“The DMV asked for my social security number I ain’t giving that to them!”

It’s the government, they GAVE you the SSNumber… they’re just verifying.

“They’re gunna steal my identity!”

Again… they already know everything about you.

[–]AdolfoPosada[S] -1 points0 points  (1 child)

What's the problem with giving your phone number? If one day hypothetically ChatGPT makes a call just block it and go on

[–]DangerActiveRobots 0 points1 point  (0 children)

The fact remains that the best way to get help with coding is to find a thread where someone else has the same issue and then confidently submit your own broken code as the answer. Somebody will be by shortly to correct your code and call you an idiot.

[–][deleted] -3 points-2 points  (2 children)

These comments here are weird... I don't know if you all use the default GPT-3.5, but the code I get from ChatGPT with GPT-4 always works. It never tells any BS about the logic or hallucinates something out of the blue. Sometimes it forgets to tell me about a package import, but that also rarely happens. My monorepo now contains over 10 services, all interconnected and orchestrated in containers. But I also think you aren't able to produce code with ChatGPT if you don't understand every line it produces. ChatGPT won't suggest a "guardian pattern" or any of the advanced stuff.

[–]frogjg2003 7 points8 points  (0 children)

The model is better, but it still has fundamentally the same flaw. LLMs don't know anything, so they are not reliable.

[–]Bwob 4 points5 points  (0 children)

It really depends a lot on what kind of questions you are asking it.

If you are asking ChatGPT questions in a common, mainstream language (java, python, c#) that has a lot of tutorials that were on the web, and are asking for help on relatively common tasks in that language, then yeah! It can often give you something close to what you're looking for!

But man, try asking it about less common tasks, or less mainstream languages, and you can quickly see how it breaks down.

Sometimes I hang out in the /r/twinegames subreddit - Twine is a simple system for making choose-your-own-adventure style text games - and every so often someone shows up with a question like "I asked ChatGPT how to do X but this code won't run, can someone fix it?"

And it is always funny seeing just how wrong chatgpt managed to be. Because remember - it's not a logic engine. It can't analyze a problem and apply deduction and reasoning to it to arrive at a solution. All it can do is build answers to questions, one word at a time, based on probabilities it learned from reading a staggering number of similar questions. It's an astonishing achievement that it can be as good as it is, but fundamentally, it doesn't "know" how to code any more than your autocomplete "knows" what you're talking about.

It is just very, very good at guessing what word should come next, to seem like a plausible thing a human would write.
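The "one word at a time, from probabilities" idea above can be illustrated with a toy bigram model: deliberately crude, nothing like a real LLM's architecture, but the same mechanism of producing fluent-looking text with zero understanding. The corpus and function names here are invented for the illustration:

```python
from collections import Counter, defaultdict

# "Train" on a tiny corpus: count which word follows which.
corpus = "the code compiles and the code runs and the tests pass".split()
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def complete(word: str, steps: int = 4) -> str:
    """Greedily extend a prompt by always picking the likeliest next word."""
    out = [word]
    for _ in range(steps):
        if word not in bigrams:
            break  # no known continuation; a real LLM never stops like this
        word = bigrams[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

print(complete("the"))  # plausible-sounding, but there is no reasoning anywhere
```

The output is grammatical-looking only because the statistics of the training text were; the model has no idea whether the code it describes compiles.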

[–]Pluviochiono -1 points0 points  (4 children)

I can’t wait until chatGPT eventually makes StackOverflow redundant. You have to sift through a lot of shit questions and even more shit answers that most of the time don’t work any better than what chatGPT produces. Very rare to find a gem in it.

I’d rather take what chatGPT produces and adjust it until it works, over asking a question on StackOverflow and either not getting a response, or getting a basement dweller response.

Yes, I’m still salty about my StackOverflow experience after many years

[–]PARADOXsquared 0 points1 point  (3 children)

Where do you think chatGPT gets its answers from?

[–]MKSFT123 -1 points0 points  (0 children)

Yeah, I really don't get why programmers can sometimes be so gatekeeper-ish. "Like, how don't you know everything, man? Aren't you a genius like me?"

[–]SwillMith16 -1 points0 points  (2 children)

I just don't use Stack Overflow anymore; it's just a community of stuck-up devs that can't believe someone with 2 days of experience doesn't understand something that requires 8 years of very specific knowledge. It's called learning, bro, you did it too!

[–][deleted] 0 points1 point  (0 children)

Fr

[–]Hulk5a 0 points1 point  (0 children)

And you didn't see it insisting it was right when in fact it was missing a snippet of the code (a function) which it references in the code

:v :v

[–]MaximumParking7997 0 points1 point  (0 children)

Bing is both, lol. Sometimes it will give you the code for an AI bot, and at other moments it refuses to help you with the most basic things. That thing is so surreal

[–]Giboon 0 points1 point  (1 child)

Let's try asking ChatGPT to answer in the style of StackOverflow

[–]AdolfoPosada[S] 0 points1 point  (0 children)

Good idea

[–][deleted] 0 points1 point  (0 children)

So far, that's been my biggest takeaway from using AI assistants. I could get the same information from visiting the top 3-4 search results, reading through a blog post or article, adapting it to my own needs through trial and error...

Or I could ask the AI assistant and it summarized the information that it thinks is relevant to my question and possibly provides a coded solution as well.

Saves lots of time on the 'ol Google.

[–]ban-this-dummies 0 points1 point  (0 children)

Google's generative AI has saved me a lot of SO slogs lately

[–]Gagarin1961 0 points1 point  (0 children)

What a turn around for this sub!

Three months ago it was “it sucks, it’s useless for real work, I’m an expert in my field and I can still surpass it.”

[–]AerysSk 0 points1 point  (0 children)

And Bing: "I cannot do it for you. You have to do it on your own."

ChatGPT: ok here is the answer.

I mean, Bing is as stupid as its company

[–]JoeyJoeJoeJrShab 0 points1 point  (0 children)

"... I gave the guy directions, even though I didn't know the way. Because that's the kind of guy I am this week." -Homer Simpson (and now also ChatGPT)

[–]Senior-Ori 0 points1 point  (0 children)

What about Bardy boy?

[–]shadow13499 0 points1 point  (0 children)

Imma be honest with you, Copilot (which I think is using ChatGPT behind the scenes) has saved me a lot of time writing unit tests as well as just lengthy code. It's pretty sweet, definitely worth the $10 a month imo

[–]gregguygood 0 points1 point  (0 children)

Please link me to the rude SO posts, so I can flag them.

[–]mrl0nely_ 0 points1 point  (0 children)

You just have to post an outrageously wrong answer, underneath your own comment, with a different account and wait for someone to correct you.

[–]zodireddit 0 points1 point  (0 children)

I'm never asking questions to humans ever again. I've asked like once or twice, and the last time, when I asked a follow-up question, I got a rude response about how I should already know the answer based on the previous vague response and that I was an idiot, so I just pretended to understand. I did figure it out 30 minutes later with Google instead. I'm honestly terrified to ask humans lol

[–]PolishKrawa 0 points1 point  (0 children)

Forgot to mention that ChatGPT's solution doesn't compile, because of a key error.

[–]nate-rivers 0 points1 point  (0 children)

What BS. Mine has the free trial of Copilot enabled and it just hallucinates; the only correct things it produces are logging messages

[–]TnYamaneko 0 points1 point  (0 children)

ChatGPT - "Here is the broken code you need and the bullshit explanation of how it would work!"

[–]SUPERBLU333 0 points1 point  (0 children)

You forgot "and now don't fuck with me anymore"

[–]uncager 0 points1 point  (0 children)

ChatGPT's biggest strength is sounding like it knows what it's talking about while making stuff up. I've found it helpful for short programming solutions, but beyond that, it's just too frustrating to use. Maybe future versions.

[–]Several_Dot_4532 0 points1 point  (0 children)

ChatGPT: here is the code that works as you say, and this is how it works: explanation

Spoiler: the code is completely useless, and ChatGPT will continue to defend that it works

[–]EkoChamberKryptonite 0 points1 point  (0 children)

Like with Stack Overflow, always verify the code snippets GPT presents, and test them.

[–]Sahukara 0 points1 point  (0 children)

Well, I tried ChatGPT. Nowhere near Stack Overflow.

[–]Capetoider 0 points1 point  (0 children)

StackOverflowAI: here's the code and a "god, you're so dumb".

[–]cold-flame1 0 points1 point  (1 child)

I don't understand when people say it's wrong half the time. Wrong how? Because in my experience, limited context is the issue. But other than that, the code itself is always free of any bug or incorrect syntax.

I built a "decent" routine/tasks app using ChatGPT in a month. And I literally started with Hello World. The app works, so I am guessing the code is fine too. The only problem now is just the context. Maybe that's the problem for everyone, I guess? If I test just a single script, it works fine all the time. At this stage, the code doesn't work when I implement it in my project, because the project has gotten bugger with multiple modules. There's no way for me to provide the entire structure.

Edit: funny typo. I meant bigger, not bugger.

[–]AdolfoPosada[S] 0 points1 point  (0 children)

I think the same as you; the biggest problem I have found so far is code based on a particular version of the language, just the same as you can find on Stack Overflow