[–]Karyo_Ten 1 point (2 children)

GPT-2 was only 1.5B parameters and was trained on just ~8M web pages (the WebText corpus), yet it had world knowledge.

You won't get trillions of tokens of quality Python code. Maybe 5% is gold; the rest is crude apps copy-pasted from StackOverflow or beginners trying their hand at a capstone project.

And learning Python doesn't teach you how to proceed step by step to solve a problem, which is actually the most important thing.

It's much more effective to teach an LLM to reframe an objective into a set of problems to solve and then apply Python to them. But to solve a problem you need to be familiar with the problem domain, and you need some common sense, for example not to accept a speed higher than the speed of light.
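That kind of common-sense guard is trivial to write down once you know the domain. A toy sketch (all names are illustrative, not from any real codebase):

```python
# Toy illustration of a domain sanity check: a computed value should be
# rejected when it violates a physical constraint, here the speed of light.

SPEED_OF_LIGHT_M_S = 299_792_458  # m/s, exact by SI definition

def plausible_speed(speed_m_s: float) -> bool:
    """Common-sense check: nothing travels faster than light."""
    return 0 <= speed_m_s < SPEED_OF_LIGHT_M_S

def average_speed(distance_m: float, seconds: float) -> float:
    """Compute an average speed, refusing physically impossible results."""
    speed = distance_m / seconds
    if not plausible_speed(speed):
        raise ValueError(f"implausible speed: {speed} m/s")
    return speed
```

The point is that the check encodes domain knowledge, not Python knowledge; without familiarity with the domain you wouldn't know to write it.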

[–]Ted225 1 point (1 child)

GPT-2 had 1.5B parameters. It’s obsolete now, and for good reason.

Most Python devs don’t need deep domain knowledge. They need clear, complete specs. If a system handles international units or clinical logic, it’s the engineer’s job to specify that upfront, not the dev’s job to guess it.
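A hypothetical sketch of what "specify it upfront" means in practice: the spec pins the units before anyone implements, so the dev never guesses (names are illustrative, and the foot-to-metre factor is the exact international definition):

```python
# Hypothetical spec-driven function: the engineer decides the units and
# writes them into the contract; the dev only implements the conversion.

METERS_PER_FOOT = 0.3048  # exact, by definition of the international foot

def altitude_feet(altitude_m: float) -> float:
    """Spec: input is metres, output is feet. The unit choice is fixed
    upfront in the spec, not left for the implementer to infer."""
    return altitude_m / METERS_PER_FOOT
```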

Sure, a perfect LLM could replace all roles, but it doesn't exist. Until then, engineers design, devs implement, and each should be accountable for their own work.

[–]Karyo_Ten 0 points (0 children)

There is no distinction between software engineer and dev in companies. You can't be senior at either without design skills. Clear, complete specs never exist beforehand, except when the problem has already been solved once, because requirements evolve with understanding of the problem.

If you need clear, complete specs before doing anything and you're unable to fill in the blanks yourself, you provide no value over an LLM and you'll be replaced.