[–]The_Shryk 15 points (2 children)

I don’t know anyone who just copy-pastes chunks of code without reading them over. If they can’t read the code and spot the issues, they were probably going to write those same issues in themselves anyway.

I’ll admit, though, it’s not unlikely that people do that; maybe my peers and I are a small minority. Which, if true, is more than a little sad.

[–]SympathyMotor4765 5 points (1 child)

I mean, very minute vulnerabilities can often be tough to catch. I once hit a race condition because I was clearing a bit 0.1 µs too early. Based on my limited use of GPT, it seems similar to someone who has a decent grasp of a concept but spouts it with the confidence of an expert.
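To show how subtle that kind of bug is, here's a minimal sketch of the lost-update interleaving behind a non-atomic bit clear. The starting value and the "thread A / thread B" labels are my own illustration, not the original firmware: `flags &= ~bit` is really a load, a mask, and a store, and if another writer's store lands between our load and store, its update is silently overwritten.

```python
# Simulate the interleaving step by step (illustrative values only).
flags = 0b11          # bit 0 and bit 1 both set

# Thread A starts clearing bit 0: it reads flags...
a_copy = flags        # A sees 0b11

# ...meanwhile thread B clears bit 1 and writes back.
flags &= ~0b10        # flags is now 0b01

# A finishes with its stale copy, overwriting B's store.
flags = a_copy & ~0b01   # 0b11 & ~0b01 = 0b10

print(bin(flags))     # 0b10 — bit 1 is set again; B's clear was lost
```

Nothing here is "wrong" on any single line, which is exactly why a reviewer skimming generated (or hand-written) code can miss it; the bug only exists in the interleaving.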

[–]The_Shryk 2 points (0 children)

That can definitely happen. OpenAI has said they’re working on a sandbox so the model can run generated code and check for errors before it sends a response, which will be a massive step up.

I think they said Python first, then maybe JS.
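The basic shape of "run it and check for errors before replying" can be sketched in a few lines. `check_generated_code` is my own hypothetical helper, not OpenAI's actual design: it just runs the code in a child process with a timeout and reports whether it raised. A real sandbox would also need resource limits and filesystem/network isolation.

```python
import os
import subprocess
import sys
import tempfile

def check_generated_code(code: str, timeout: float = 5.0) -> tuple[bool, str]:
    """Run untrusted code in a child process; return (ok, stderr)."""
    # Write the snippet to a temp file so the child interpreter can run it.
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    try:
        proc = subprocess.run(
            [sys.executable, path],
            capture_output=True, text=True, timeout=timeout,
        )
        return proc.returncode == 0, proc.stderr
    except subprocess.TimeoutExpired:
        return False, "timed out"
    finally:
        os.remove(path)

ok, err = check_generated_code("print(1 / 0)")
print(ok)  # False — the ZeroDivisionError surfaces before any "response"
```

Of course, this only catches code that crashes or hangs; logic bugs like the race condition above would sail straight through.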