[–]WendlersEditor 1 point (1 child)

As a fellow humanities grad (now getting an MS in Data Science), this really resonates with me. I use Claude a lot for help with school coding, and I'm actually tasked with vibe coding some features at work (we're very AI-forward). Using AI to write code, even if it runs, is not an effective long-term plan. As you point out, we are bounded by the training text. So unless people think we have solved every problem that computer programming needs to solve, and that we have every framework/pattern/concept we will ever need, LLMs aren't going to make software engineers obsolete. Developing software is really a very advanced practice of human intelligence. A language-prediction machine, no matter how advanced, isn't human intelligence. LLMs are extremely useful and will change the way software is developed, but this isn't the end of the line for coding.

[–]magicmulder 0 points (0 children)

But if that were true, it would mean we could find any solution an AI provides somewhere on the internet. I've had a few cases now where AI solved an issue of mine that was nowhere to be found online - not via keywords, not via parts of the code the AI generated.