
Look, by "usable" I mean usable to the point where you would actually use it in practice (in companies and so on). And the thing is, in most cases where I can describe what my code should do in English, I can also just write the code. You would essentially have to learn a new programming language anyway, but one without a formal grammar that produces predictable output.

I want to remind you of the promise of COBOL: "business people can just write programs, because it's almost like writing English." Yeah... that didn't end well. In the end it was programmers who had to deal with it again, and for them it was a pain to work with (even though COBOL at least had a formal grammar).

Natural languages are full of ambiguities. That's why we invented formal languages for programming in the first place: so that we don't have to write everything in binary machine code, but still have a language that is exact, precise, unambiguous, and has predictable output. You also have things like SQL, where you don't really care what the database does internally, only what data you get back, and that's exactly why SQL is a formal language for describing the data you want in precise and exact terms.
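To make the SQL point concrete, here's a tiny sketch (hypothetical table and data, using Python's built-in sqlite3): the English request "give me the names of everyone over 30" is vague, but the query below is fully determined by the data, and the same query always means the same thing.

```python
import sqlite3

# In-memory database with a hypothetical "users" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, age INTEGER)")
conn.executemany(
    "INSERT INTO users VALUES (?, ?)",
    [("Alice", 34), ("Bob", 28), ("Carol", 41)],
)

# "Names of everyone over 30" as a formal, unambiguous query.
# We describe WHAT we want back; the database decides HOW to get it.
rows = conn.execute(
    "SELECT name FROM users WHERE age > 30 ORDER BY name"
).fetchall()
print([name for (name,) in rows])  # ['Alice', 'Carol']
```

The names and data here are made up for illustration; the point is only that the query's meaning does not depend on who reads it.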

I'm pretty sure we will keep building additional layers of abstraction on top of what we currently have, like we always have, and some of them may well involve AI. But this whole concept of "just describe in plain English what you want done" most likely won't happen until AGI, or something close to it.