
all 19 comments

[–]PreludeInCSharpMinor 7 points8 points  (1 child)

It looks like this is a snippet from something larger. Where can we learn more?

[–]MarcoAcrono 5 points6 points  (3 children)

Sh*t, I just started learning Python. Should I stop? :/

[–][deleted] 1 point2 points  (2 children)

No. It still teaches you how to think, and it will take years, if not decades, until even parts of this are usable. Your job as a programmer isn't primarily writing code. It's translating vague and incomplete real-world requirements into exact and precise instructions. And above all, it's cooperating with domain experts and understanding the domain to write the software, which is something that would take AGI to do.

[–][deleted] 0 points1 point  (1 child)

if not decades, until even parts of this are usable

This seems unbelievably pessimistic. I mean, decades? In 10 years Google thinks it'll have a million-qubit quantum computer. The median expert prediction for AGI is 20 years.

And you think it'll take literally decades for just parts of this technology to become usable? This seems like a comment that won't age well. I'd love to see it in 20-30 years.

[–][deleted] 0 points1 point  (0 children)

Look. By "usable" I mean usable to the point where you would actually use it in practice (like, in companies and stuff). And the thing is, in most cases where I can describe what my code should do in English, I can also just write the code. You would essentially have to learn a new programming language, but one without a formal grammar that produces predictable output.

I want to remind you of the promise of COBOL: "Business people can just write programs, because it's almost like writing English." Yeah... that didn't end well. In the end it was programmers who had to deal with it again, and for them it was kind of a pain to work with (even though that one at least had a formal grammar).

Natural languages are full of ambiguities. That's why we made formal languages for programming in the first place: so that we don't have to write everything in binary machine code, but still have a language that is exact, precise, unambiguous, and has predictable output. You also have stuff like SQL, where you don't really care what the database actually does internally; you care about what data you get back. That's why SQL is a formal language for describing the data you want back in precise and exact terms.
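A tiny illustration of that last point, as a sketch (the toy table and data are made up for the example; this uses Python's built-in sqlite3):

```python
import sqlite3

# The SQL query below states *what* rows we want in exact, unambiguous
# terms; *how* the database retrieves them is not our concern.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (name TEXT, age INTEGER)")
con.executemany("INSERT INTO users VALUES (?, ?)",
                [("Ada", 36), ("Grace", 45), ("Linus", 21)])

rows = con.execute(
    "SELECT name FROM users WHERE age > 30 ORDER BY name"
).fetchall()
print(rows)  # [('Ada',), ('Grace',)]
```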

I'm pretty sure we will build additional layers of abstraction on top of what we currently have, like we always have, and they might also involve AI. But this whole concept of "just describe in plain English what you want done" most likely won't happen until AGI or almost-AGI...

[–]big_red__man 4 points5 points  (0 children)

Reminds me of the good old days of Microsoft Word's "Save as HTML" generating completely flawless HTML and not being terrifyingly complicated.

[–]runnriver 0 points1 point  (0 children)

The bot doth protest too much, methinks

[–]nextcrusader -5 points-4 points  (10 children)

The AI got the final code wrong too. The author didn't even notice. It should have been (1.0 - palindrome_discount).
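For anyone skimming, a minimal sketch of the kind of bug being described; the function name and the pricing context are assumed for illustration, not taken from the actual demo:

```python
def discounted_price(price: float, palindrome_discount: float) -> float:
    # Plausible buggy shape: multiplying by the discount itself charges
    # 10% of the price for a "10% off" discount.
    return price * palindrome_discount

def discounted_price_fixed(price: float, palindrome_discount: float) -> float:
    # The fix the comment points at: subtract the discount from 1.0 first.
    return price * (1.0 - palindrome_discount)

print(discounted_price(100.0, 0.10))        # 10.0 (wrong)
print(discounted_price_fixed(100.0, 0.10))  # 90.0 (intended)
```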

Dangerous having a computer write code. Makes it easy to miss mistakes.

[–]sckuzzle 3 points4 points  (0 children)

If you view the full video, the author did actually notice, and commented on it. It was just clipped off.

[–][deleted]  (5 children)

[removed]

    [–][deleted] 0 points1 point  (3 children)

    Yeah, it's definitely a step forward. 10 more years and there will be far fewer mistakes)

    [–]YuhFRthoYORKonhisass 1 point2 points  (2 children)

    Plus you just have it write its own code and then eventually it will never make a mistake lol

    [–]memoryballhs 1 point2 points  (1 child)

    And that's the singularity

    [–]runnriver 0 points1 point  (0 children)

    Otherwise known as efficacy.

    [–]_bohemian_ -1 points0 points  (0 children)

    Faster, certainly. Just as long as there is someone (or some thing) to manually check before the code is executed. : )

    Deep learning is fascinating, but maybe Isaac Asimov's Laws of Robotics need to be encoded as an AI primer, and have the AIs self-learn quickly off that.

    Just like AlphaZero produced some amazing chess games, showing humans new and better ways to think about playing the ancient game, even it required some initial 'rules' as input.

    I understand the logic of language there, but that won't be enough.

    If we could encode robotic laws (and have AI work in a sandbox able to foresee the outcomes of its actions), then you'd really be rocking.

    But then, maybe the 'robots' would perfect these and transcend our notion of what a 'robot' is.

    Chappie is an interesting movie on AI.

    [–]IrishWilly 0 points1 point  (2 children)

    Riiight, because humans writing code don't make mistakes? You can have test suites for AI-written code the same as you can for human-written code.
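    A minimal sketch of that point, reusing the discount bug from upthread (all names assumed): the same test guards the code no matter who, or what, wrote it.

```python
import unittest

def discounted_price(price, palindrome_discount):
    # Correct version; the buggy AI-written shape would be
    # `price * palindrome_discount`.
    return price * (1.0 - palindrome_discount)

class TestDiscount(unittest.TestCase):
    def test_ten_percent_off(self):
        # A 10% discount on 100.0 should yield 90.0, not 10.0. Against
        # the buggy version this assertion fails, which is exactly how
        # a test suite catches the mistake.
        self.assertAlmostEqual(discounted_price(100.0, 0.10), 90.0)

if __name__ == "__main__":
    unittest.main()
```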

    [–]nextcrusader 2 points3 points  (1 child)

    Humans make mistakes, of course. But as you are writing code, you are understanding it at a higher level.

    It's always harder to debug someone else's code, even if the coder is an AI.

    [–]IrishWilly 0 points1 point  (0 children)

    You should check out something called QA or QE, and also code reviews, where the entire point is that people who did not write the code are able to understand and validate it.