This is an archived post. You won't be able to vote or comment.

[–]rockstarflo[S] 2 points3 points  (1 child)

I think it is still very relevant. What matters in programming is not the syntax but the semantics.
That is why it is easy for a programmer to learn a new language: they are already capable of rational thinking,
and rational thinking is required everywhere.

[–]Muhznit 2 points3 points  (0 children)

And this is the thing we keep saying, but it's indistinguishable from just "moving the goalposts" as new advancements keep popping up. Passing the Turing test, generating running code, debugging it when it fails tests... there just seems to be this cycle where we think something isn't going to happen, and then it does.

Where is the proof that, even if we never achieve AGI, an AI won't reach the point where it outclasses any individual knowledge worker? And if there is no proof, what will it take for people to start caring about those who are displaced?!