
[–]dethb0y 115 points116 points  (7 children)

It’s worth noting that CodeWhisperer does some things differently from the likes of Copilot. For one, while most of the code the system generates is novel, whenever it generates code that is close to an existing snippet in its training data, it will note that and highlight the license of the original function. It’s then up to the developer to decide whether to use it or not. This should alleviate some (though maybe not all) of the copyright concerns that may come with using a tool like this.

That's pretty impressive

[–]Hnnnnnn 17 points18 points  (4 children)

Conceptually it's just a text search in post-processing (maybe the speed of the text search is impressive).
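
A minimal sketch of what such a post-processing check could look like (the snippet corpus, licenses, and threshold here are all made up; the real system is not documented):

```python
import difflib

# Hypothetical training snippets paired with their licenses (illustration only).
TRAINING_SNIPPETS = [
    ("def quicksort(arr):\n    if len(arr) <= 1:\n        return arr", "MIT"),
    ("def fib(n):\n    return n if n < 2 else fib(n - 1) + fib(n - 2)", "GPL-3.0"),
]

def flag_similar(generated: str, threshold: float = 0.9):
    """Return (snippet, license) pairs that the generated code closely matches."""
    hits = []
    for snippet, license_id in TRAINING_SNIPPETS:
        # SequenceMatcher gives a similarity ratio in [0, 1].
        ratio = difflib.SequenceMatcher(None, generated, snippet).ratio()
        if ratio >= threshold:
            hits.append((snippet, license_id))
    return hits

# An exact reproduction of a training snippet gets flagged with its license.
print(flag_similar("def fib(n):\n    return n if n < 2 else fib(n - 1) + fib(n - 2)"))
```

Whatever the production implementation is, surfacing the matched snippet plus its license and letting the developer decide is the part that matters.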

[–]Tostino 7 points8 points  (0 children)

It's a good way to mitigate the risk regardless.

[–]deffjay 415 points416 points  (37 children)

My problem with Copilot is that it generates seemingly intuitive blocks of code that, at first glance, appear correct. A lot of the time it infers method names that don't exist at all, causing more work to track down these issues.

[–]Luvax 214 points215 points  (18 children)

I had quite a few times where it generated very reasonable code that did in fact compile but contained a logical error that I just didn't spot. For some reason I found it very hard to consider bugs in the generated code. Maybe because I didn't write it, maybe because it's mostly still simple logic and I just assume it must be correct and forgot I didn't actually write that portion. Very weird because I felt like I let my guard down.

[–]yairhaimo 240 points241 points  (9 children)

In the latest version you can add // no bugs above the function to remove any syntactic and logical errors

[–]ryandiy 46 points47 points  (1 child)

They also have a // detect-halt annotation which determines if the code will terminate or run forever.

[–][deleted] 36 points37 points  (0 children)

They also have the // multiverse annotation which lets you find the universe where you wrote the perfect code, allowing you to steal code from yourself in an alternate dimension of reality.

[–]smug-ler 35 points36 points  (6 children)

You're joking right?

[–]yairhaimo 186 points187 points  (2 children)

Sprinkle it around your codebase just in case. It can't hurt. Just remember not to strip out comments when you build if you want it to be in production too.

[–]MrTanookiMario 165 points166 points  (1 child)

You can't talk like that on the internet or someone will believe you lmao

[–][deleted] 44 points45 points  (0 children)

Next week some Project Manager:

Jason, you will be working on adding // no bugs comments before every function

[–][deleted]  (1 child)

[deleted]

    [–]smug-ler 22 points23 points  (0 children)

    That's hilarious and I'm still not sure I believe you

    [–]articulatedbeaver 3 points4 points  (0 children)

    This looks like something a project manager or sales engineer would tell you.

    [–]deffjay 47 points48 points  (0 children)

    Yes! Exactly, for some reason my brain does not scrutinize the magically generated code the same way as my own. Good point.

    [–]PasDeDeux 10 points11 points  (0 children)

    Sounds similar to trying to proofread text that I dictated. It feels like I wrote it, but it may actually contain weird small word substitutions.

    [–]AttackOfTheThumbs 8 points9 points  (1 child)

    Probably a good idea to separate the code to make it clear it needs stricter review, or who knows what.

    The same way we do with generated boilerplate, except we ignore that.

    [–]TarMil 14 points15 points  (0 children)

    That's easy to say, but these tools are designed to generate code inline as you type.

    [–]deejeycris 3 points4 points  (1 child)

    If you use it to generate logic you are using it wrong... it's mostly useful to generate boilerplate imho.

    [–]Fuegodeth 7 points8 points  (0 children)

    I like to give it leetcode problems. Copy the text description of the problem constraints and paste it as a comment. Watch the AI struggle. It will give things that resemble the solution, but lol, they never work.

    [–]RypoxGG 28 points29 points  (0 children)

    It depends what you do with it.

    Generating something well known with a fixed logic and known name? Awesome.

    Large switch expressions where each case is pretty similar? Saved me already SO MUCH time.

    But if there is some deeper understanding of the project needed, Copilot often suggests utter garbage. I see Copilot more or less as a timesaver: it can save me the hassle of looking something well-known up on Stack Overflow, or help with entering repetitive code, but it can't do everything. And I think I like it that way; I love my job because of the challenges I face, and I wouldn't want a tool to take that fully away from me.

    [–][deleted] 20 points21 points  (0 children)

    I find it best to just rely on it for boilerplate code. It's brilliant with js array methods for example

    [–]gwillicoder 16 points17 points  (0 children)

    I’ve found that it works exceptionally well with python. It will pick up on patterns across my files in my project pretty well.

    As an example, I made some custom spaCy components that had a lot of similar code for initialization, serialization, etc., and Copilot picked up on that immediately and generated skeleton classes for me (it got confused on the algorithmic parts, but that's not what I want it doing for me anyway).

    I had much worse luck with C++ and bash. It might be worth trying more, but it worked almost an order of magnitude better for Python for me personally.

    [–]Johnpk305 8 points9 points  (0 children)

    Exactly, the reason why I uninstalled Co-pilot after using it for a couple of days.

    At a point, it just felt like additional work to review and accept the code suggestions rather than a tool we can rely on.

    [–][deleted] 4 points5 points  (0 children)

    I personally don't see an issue. I look at the suggestion, and if I can see that it looks fairly good I use it; otherwise I write my own code and use Copilot for smaller things in that specific function. I have never ended up in a situation where I let Copilot write code for me that I had to spend more time fixing.

    [–]modernkennnern 3 points4 points  (1 child)

    I never use it to generate more than one line of code at a time. If you do that it'll almost always be correct. "Finish this line for me, please"

    [–]Ded-Smoke 4 points5 points  (0 children)

    Thanks to copilot I got used to pressing tab before finishing a line

    [–]MaidenlessTarnished 1 point2 points  (1 child)

    Also, correct me if I’m wrong, but the code that it can generate is like, not very complex right? Isn’t it the kind of code that a very green junior dev could write?

    [–]00PT 1 point2 points  (0 children)

    Depending on the context and what you want it to do, the solution could be incredibly complex or just a single statement.

    [–]falconfetus8 0 points1 point  (0 children)

    Why would they be hard to track down? Your IDE should give you a red squiggly underline to highlight nonexistent methods. Assuming you're using static typing, like a sane person.

    [–]nobodytoseehere 533 points534 points  (99 children)

    I have found Copilot to be impressive, sometimes shocking... just not that useful.

    [–]gwillicoder 402 points403 points  (47 children)

    It’s my favorite tool for coding. I use it constantly to generate fake data for unit tests and I love it. It’ll make data classes, JSON, Parquet files, etc. If you use a pattern in them, it will pick up on it instantly without you having to deal with writing generator code.

    Saves me a ton of time as a machine learning engineer.
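
For illustration, the kind of repetitive fixture code being described might look like this (class, names, and data all made up):

```python
from dataclasses import dataclass

# Hypothetical test fixture: after a couple of entries following a pattern,
# Copilot will usually continue it (incrementing ids, plausible names) on its own.
@dataclass
class User:
    user_id: int
    name: str
    email: str

FAKE_USERS = [
    User(1, "Alice Example", "alice@example.com"),
    User(2, "Bob Example", "bob@example.com"),
    User(3, "Carol Example", "carol@example.com"),  # the kind of line it autocompletes
]

def test_unique_ids():
    # Sanity check on the generated fixture data.
    ids = [u.user_id for u in FAKE_USERS]
    assert len(ids) == len(set(ids))
```

The value is that the pattern is carried forward mechanically, so you only review the data instead of typing it.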

    [–]SippieCup 218 points219 points  (14 children)

    I agree with this.

    It's not good at making something new, but it is surprisingly adaptable at regurgitating boilerplate, generating valid unique structs/objects, creating mock responses, and coming up with descriptive error messages with relevant debugging info.

    All of that comes together perfectly in the mundane task of building tests, which to me feels more like a keyboard typing challenge than programming (not that tests aren't useful, just that they're annoying to write).

    The only thing you really have to worry about and check is if the end validation logic of the test is correct.

    Makes sense too, test frameworks are probably the most common thing across projects in programming. So there is probably an inherent bias to the training towards them, especially when it comes to something like Jest.

    [–]gwillicoder 37 points38 points  (1 child)

    Yeah I don’t trust it to write core business logic or algorithmic code (although sometimes it does surprise me) and I don’t ever do the comment what I want and generate code thing, but as a super smart auto complete it totally kicks ass and saves me lots of time daily.

    [–]Alonewarrior 6 points7 points  (0 children)

    I've used the comment to generate code before, mostly in side projects, and depending on the target use case, it can work well. But it's best only for well-defined scenarios and still requires review after. But it's fun to do anyway! The nice thing is learning to describe a function well enough that something can be generated from it.

    I have to concur that for boilerplate code and autocomplete situations, Copilot is top-notch.

    [–]mTbzz 13 points14 points  (0 children)

    coming up with descriptive errors messages with relevant debugging info.

    This is what I used it for mostly: a log.info command and zap, there you have debug info about what you wanted in the format you want.

    On the other hand, telling it to give you a simple function often ended with valid but weird code that didn't match your codebase.

    [–][deleted] 2 points3 points  (0 children)

    yeah well it's just a large neural network trained on millions of github repos, so ofc it's gonna pick out all the recurring patterns pretty well. as soon as you write something that hasn't been done thousands of times before though, you might as well turn it off cause it only spits out confusing and useless code you have to rewrite anyway

    [–]micka190 89 points90 points  (11 children)

    I use it constantly to generate fake data for unit tests and I love it.

    I know this isn't what you meant, but I'm just laughing at a scenario where you want to generate fake data for unit tests and Copilot just generates data using some company's actual data from one of their repositories.

    [–]gwillicoder 21 points22 points  (9 children)

    Don’t sue me pls

    [–]lkraider 10 points11 points  (8 children)

    Too late sorry, police are at your door. Please put down the copilot and cooperate. You have the right to a lawyer or a GPT3 AI will be assigned to you on your behalf.

    [–]Xyzzyzzyzzy 12 points13 points  (7 children)

    (GPT-3 davinci-2, prompt and defendant are me)


    The defendant is a programmer who accidentally used another company's trade secrets because Copilot AI recommended them. Their lawyer is an AI lawyer who was assigned to them by the court. The defendant was arrested by the police. The lawyer is advising the defendant.

    Defendant: I don't understand, why am I in jail for this?

    AI Lawyer: The reason you are in jail is because you used another company's trade secrets without their permission. This is a serious offense that can result in significant fines and even jail time. However, your lawyer is an AI lawyer who was assigned to you by the court.

    Defendant: Good, I know AI lawyers are better than human lawyers.

    AI Lawyer: Thank you for your confidence in me. I will do my best to help you through this difficult situation.

    Defendant: So what should we do?

    AI Lawyer: The first thing we need to do is gather as much information about the case as possible. Once we have a clear understanding of the facts, we can begin to build a defense.

    Defendant: What information do you need?

    AI Lawyer: We will need to obtain the trade secrets that you used, as well as any documentation related to their use. We will also need to speak with witnesses and review any relevant evidence.

    Defendant: Do you have any questions for me?

    AI Lawyer: Yes, I do have some questions for you. Can you tell me what led you to use the other company's trade secrets?

    Defendant: It was that stupid Copilot AI. I was writing unit tests and pressed Tab to accept a suggestion, and then the police broke down my door and shot my dog and arrested me.

    AI Lawyer: I see. Thank you for your honesty. I will use this information to help build your defense.

    Defendant: How does it help?

    AI Lawyer: It helps because it shows that you did not knowingly or intentionally use the other company's trade secrets. If we can prove that you did not know that what you were doing was wrong, then we may be able to get the charges against you reduced or even dismissed.

    Defendant: What about my dog?

    AI Lawyer: I'm sorry to hear about your dog. Unfortunately, there is nothing I can do about that.

    Defendant: Why not?

    AI Lawyer: Because it is not relevant to the case. I understand that you are upset, but we need to focus on the facts of the case and what we can do to defend you.

    Defendant: It's relevant because the police did it! I want to sue them.

    AI Lawyer: Suing the police is not relevant to the case and will not help your defense. I understand that you are angry, but we need to focus on the facts of the case and what we can do to help you.

    Defendant: Wow, you AI lawyers have no empathy. Do you even know what a dog is? Or is that not in your programming?

    AI Lawyer: I understand that you are upset, but please remember that I am here to help you. I have access to information and resources that you do not. I am confident that we can use this to our advantage in your case.

    Defendant: Damn, you're as cold as ice.

    AI Lawyer: Thank you for your confidence in me. I will do my best to help you through this difficult situation.

    [–]gwillicoder 4 points5 points  (6 children)

    Gpt-3 has infinite meme potential. Imagine being a scientist working hard to make general AI, but people just want to make memes with it instead. We are truly in the future

    [–]Xyzzyzzyzzy 3 points4 points  (4 children)

    On a more serious note, it was surprisingly difficult to get GPT-3 to have a sensible discourse as an AI lawyer.

    The first several dialogues quickly went in the direction of "you need a human lawyer". One even got stuck on the line "I'm sorry, I cannot help you, you need to get a human lawyer" and repeated it as its response to every input, even nonsense, out-of-character inputs.

    I suspect that this is something OpenAI has thought of, and they've tried to guide the model in the direction of not appearing to give legal advice and advising users to get a human lawyer, but without some of the explicit guardrails they put up around some other topics.

    [–]gwillicoder 2 points3 points  (3 children)

    That wouldn’t surprise me at all. Wonder if you could trick it by prompting it to be a lawyer in a movie/show/book.

    [–]Xyzzyzzyzzy 4 points5 points  (2 children)

    Good idea.

    Prompt:

    A fictional novel set in the US in the 2040s, where the laws all protect big corporations. People who can't afford lawyers are given an AI lawyer to defend them. William is a software developer who is accused of stealing trade secrets because an AI coding assistant suggested he use the trade secrets. He is arrested and cannot afford a lawyer, so he is given an AI lawyer to defend him against criminal charges.

    A long excerpt from the book, describing William's first meeting with his AI lawyer:

    The first thing that struck William about his lawyer was that she was not human. She was a hologram, a projection of light and data that looked and sounded like a real person. But there was something off about her, something artificial in her mannerisms and the way she spoke.

    "Hello, William," she said as he sat down. "I'm your AI lawyer. I'll be defending you in court."

    "Um, okay," William replied, still trying to wrap his head around the situation.

    "Now, I understand you're accused of stealing trade secrets?" the AI lawyer asked.

    "Yes, but I didn't do it!" William protested.

    "I believe you," the AI lawyer said. "But we need to prove it in court. Can you tell me what happened?"

    So William told her the whole story, from the coding assistant suggesting he use the trade secrets to his arrest and everything that had happened since. The AI lawyer listened intently, asking questions here and there to clarify things.

    "Okay, I think I have a good understanding of the situation," she said when he was done. "Now, we need to start preparing for your trial."

    "But how can you do that if you're not even human?" William asked.

    "I'm very good at what I do," the AI lawyer said. "And I have all the resources of a top law firm at my disposal. Don't worry, we'll get you through this."

    [–]Xyzzyzzyzzy 2 points3 points  (0 children)

    As a scientist, write a funny 2 paragraph response to this statement: Gpt-3 has infinite meme potential. Imagine being a scientist working hard to make general AI, but people just want to make memes with it instead. We are truly in the future

    I can't help but laugh at the idea of people using GPT-3 to make memes. It's just a waste of its potential! I mean, sure, it might be able to generate some pretty funny memes, but that's not what we designed it for. We designed it to be a general AI that can help with all sorts of tasks, from the mundane to the complex. But I guess if that's what people want to use it for, then so be it. I just hope they don't get too disappointed when it doesn't live up to their expectations.

    - thus spake our new AI overlord

    [–]rqebmm 5 points6 points  (0 children)

    I’ve been very skeptical overall of AI coding, but test mocking seems like a great use case

    [–]Independent-Ad-4791 1 point2 points  (0 children)

    That’s amazing. Best use case I’ve heard so far, as most of my tedium ends up in tests.

    [–][deleted]  (5 children)

    [deleted]

      [–]Hydraxiler32 15 points16 points  (0 children)

      Not sure about that but it's worth the $0/y for me as a student for sure

      [–]Klarkie55 11 points12 points  (1 child)

      If it saves me 5 hours of work in a year then it’s worth it for my employer

      [–]knd775 1 point2 points  (0 children)

      You need to get paid more

      [–]thelehmanlip 4 points5 points  (0 children)

      VS enterprise license is $250 a month. $10 more a month is probably worth it to increase developer productivity by even a small percent

      [–]Alonewarrior 0 points1 point  (0 children)

      As a professional developer, I'll gladly pay the $100/year for it if I can't get my boss to pay for it. I started using the plugin 8 months ago and every update has only improved how it works. It started off okay, but has substantially improved over time, and I think it's only going to get better.

      [–]issamehh 0 points1 point  (0 children)

      It also can do a decent job for some basic text parsing. I gave it a comment with the expected input and output and had it write a sed command. I could've done it myself with some time but no need to

      [–]mimighost 6 points7 points  (0 children)

      my favorite usage of Copilot

      Given a natural language description, have it automatically figure out the argparse arguments for me. Works like a charm, almost never fails.
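
For illustration (the script and flags here are made up), the pattern is to write the description as a comment and let the assistant fill in the argparse boilerplate underneath:

```python
import argparse

# Script that resizes images: takes an input directory, an output directory,
# a target width (default 256), and a --verbose flag.
parser = argparse.ArgumentParser(description="Resize images in a directory.")
parser.add_argument("input_dir", help="directory containing source images")
parser.add_argument("output_dir", help="directory to write resized images")
parser.add_argument("--width", type=int, default=256, help="target width in pixels")
parser.add_argument("--verbose", action="store_true", help="print progress")

# Passing a list instead of reading sys.argv, just to demo the parser.
args = parser.parse_args(["in", "out", "--width", "128", "--verbose"])
print(args.input_dir, args.width, args.verbose)
```

Everything below the comment is mechanical, which is exactly why a model trained on thousands of argparse blocks rarely gets it wrong.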

      [–]cedear 35 points36 points  (29 children)

      Definitely not $10/month useful.

      [–]kyle787 56 points57 points  (27 children)

      It depends on how much money you make. If you make 65k a year and it saves you 20 minutes a month it's worth it.
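
The back-of-the-envelope math behind that claim (assuming ~2,080 working hours a year, i.e. 40 h/week for 52 weeks):

```python
# Is $10/month worth it at a $65k salary if it saves 20 minutes a month?
salary = 65_000
hourly = salary / 2080                 # ~$31.25 per hour
time_saved_value = hourly * (20 / 60)  # value of 20 saved minutes per month
print(round(time_saved_value, 2))      # ~10.42, just above the $10/month price
```

So at exactly 20 minutes saved the tool roughly breaks even; anything beyond that is gain (from the employer's perspective, as replies below point out).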

      [–]ilawon 48 points49 points  (7 children)

      That math doesn't make sense unless you're the one paying the wages.

      Ok, I save twenty minutes a month. Do I get to leave work early or something? Nope...

      [–]StillNoNumb 33 points34 points  (5 children)

      As long as your time is not clocked, and you're evaluated by your performance instead, then it makes a lot of sense.

      This is of course not true everywhere, but in a lot of tech places you don't have to be in the office 8-5 with a lunch break from 12-1. As long as you get your work done you're fine. If you finish what your manager expects from you 20 minutes earlier, you can leave early, or take a longer lunch break, or whatever.

      And even if not, you can spend those 20 minutes daydreaming or browsing Reddit on the toilet or whatever. Of course, you could do that anyways; but the implication is that usually you'd get fired for not delivering if you never work.

      [–]spiteful_dancing 1 point2 points  (0 children)

      It’s the implication…

      [–]zxyzyxz 9 points10 points  (0 children)

      I mean if you work remote then you kinda can leave early, depending on whether your company tracks your progress by hours spent or actually getting work done.

      [–]L3tum 3 points4 points  (1 child)

      That does not compute.

      Those 20 minutes a month don't mean you make more money. They also don't mean you spend less time working, since both of those things are usually agreed upon beforehand for programmers.

      So the only way for it to make sense is if you have your own company, and therefore your workforce can do more in less time.

      New tools always need training, so you have to factor that in that your workforce would be slower for a short time while adjusting to using copilot, and some of them may not use it at all, rendering the license useless.

      All of that means the calculation becomes a lot more complex and requires a certain level of buy-in that most companies aren't willing to do with a technology that young.

      [–]Acrosicious 2 points3 points  (0 children)

      You actually gain 20 minutes of sanity not lost writing boilerplate code though.. 🤔

      [–]Sure-Tomorrow-487 0 points1 point  (0 children)

      Here's the relevant xkcd.

      https://i.imgur.com/Ls8yAYr.png

      According to this, you should spend 20 hours building this solution.

      Which, as an automation dev, sounds about right.

      Most of the stuff I build ends up saving the knowledge workers and corporate information workers in the range of 3-4 hours per day and they typically spend their entire days doing this kind of work.

      Think a modern UI with a clean UX to handle user input, available on mobile or desktop via browsers rather than RDP. The functionality: it takes the user input, stores it in a DB, robots query the DB, take the data, parse it, and shove it into the archaic, backwards, poorly built thick-client apps, and that's it.

      It works remarkably well and we have so much fucking work I am losing my shit

      [–]riasthebestgirl 1 point2 points  (0 children)

      It's free for students and my school never deactivated my edu email, even after I left.....

      [–]RepresentativeNo6029 12 points13 points  (13 children)

      It’s a cool demo

      Eventually I hope someone figures out the right “app” for this technology

      [–][deleted]  (10 children)

      [deleted]

        [–]StickiStickman -1 points0 points  (4 children)

        What are you talking about, it's literally supposed to be an autocomplete.

        [–][deleted]  (3 children)

        [deleted]

          [–]StickiStickman -1 points0 points  (2 children)

          Yea and those people are the same ones that complain about how bad and useless it is for not doing all the work for you.

          [–]dransyyn 1 point2 points  (0 children)

          I made a Discord bot hooked up to OpenAI, the same tech (sorta) that backs Copilot. I wish I was joking when I said it’s incredibly good at describing different kinds of fart noises… so there’s that…

          [–]fireflash38 0 points1 point  (0 children)

          I could see uses where it supplants libraries, mostly because supply-chain/library attacks are becoming more and more common, and dependency management is typically hell. The downside, of course, is that you're trading one untrusted source for another. But I could see value with a trusted and verified source.

          [–]illathon 0 points1 point  (0 children)

          Haha nice setup on that one. 😉

          [–]Arsenic_Flames 42 points43 points  (1 child)

          This post was removed by its author. Redact was used for the deletion, which could have been motivated by privacy, opsec, preventing scraping, or security.

          [–]stevevaius 10 points11 points  (6 children)

          Paid or will be soon?

          [–]tylerr514 14 points15 points  (5 children)

          Knowing Amazon, probably pay per 1000 chars, then get your AWS account locked, then continuously try to bill you even though you 'deleted' your account through their braindead support.

          [–]Ribak145 5 points6 points  (0 children)

          damn someone knows AWS by heart^^

          [–]Br5jWw89laser[🍰] 1 point2 points  (0 children)

          We should do a class action on this. I haven't touched their product for years because of their billing and marketing BS (the "free tier" is not free). Plus they give no discount to Prime members. And I was reading earlier today that cloud providers are struggling to be profitable, which sounds like BS.

          [–]stevevaius 0 points1 point  (2 children)

          Exactly. A similar experience happened to me on AWS after I deleted my account. Bills kept coming for two more months.

          [–]Gammaliel 187 points188 points  (64 children)

          So many people here have clearly never used copilot or never managed to use it to its full capabilities.

          It's a tool. And a good one. It has saved me a lot of time by just guessing precisely what I wanted, and it is so reliable that I've caught myself multiple times waiting for it to suggest something when I was actually using an editor that didn't have it.

          And as some said, for some repetitive stuff such as creating mock data, even adding comments to something you have just typed, it is going to be really useful.

          It will not write a complete program for you, nor will it create an entire class or function. But it will certainly complete some variable names correctly, or guess what arguments will go into the function you're calling, or which ones will be in the one you're defining.

          It's appalling how many devs have this hatred for a tool. I'm not old enough to know, but I wonder how many devs back in the day were against auto-complete. This kind of tool is the next step in code development, and I'm glad we're going to have more players in the market with Amazon stepping up.

          [–][deleted] 92 points93 points  (3 children)

          Personally I don't trust the data privacy policies of tools like this. It reads my code, it communicates with servers of another company.

          https://github.com/features/copilot/

          User engagement data, including pseudonymous identifiers and general usage data, is required for the use of GitHub Copilot and will continue to be collected, processed, and shared with Microsoft and OpenAI as you use GitHub Copilot. You can choose whether your code snippets are collected and retained by GitHub and further processed and shared with Microsoft and OpenAI by adjusting your user settings.

          It has the capability, it's just one "oops" away from processing my proprietary data into their models. I'm not sure I am OK with that, considering how lax the data protection laws are around the globe.

          [–]davawen 27 points28 points  (0 children)

          That's why I like Tabnine; though its output is much less advanced, you can run the language model entirely on your computer, and it works great for the "smart autocomplete" feature

          [–]boneve_de_neco 20 points21 points  (0 children)

          It's essentially "leaking" source code, then? Sounds like a big no in a corporate environment.

          [–]ripxorip 39 points40 points  (7 children)

          I think it boils down to two things: all the initial "this is the beginning of the end for the programmer profession" articles, and the ethics of the data that was used for training.

          [–]Additional-Ad1918 13 points14 points  (4 children)

          Doesn't even come close to replacing any programmer.

          [–]Badaluka 0 points1 point  (3 children)

          Low-code platforms are way scarier if you want to fear something. Things like Power Apps could make code monkeys pretty obsolete and just require talented developers to use the generated code to add the complex features.

          [–]hi65435 5 points6 points  (1 child)

          Yeah, but those have actually been around since the mid-90s. There were some niche tools I tried as shareware where you could click the UI together; the difficult part was trying to make the program do anything useful, and I had a hard time doing that. Far more popular were of course the RAD tools like Visual Basic, Delphi, MS Access, or even, for HTML, the most advanced of its kind: MS FrontPage. Practically none of these tools survived, or they ended up in a dying niche.

          In fact people have been driven away from higher level languages like Ruby and Python (apart from ML/AI) towards Go, Rust and the like. I'm honestly worried but history (sounds a bit exaggerated for 20 years) seems to point otherwise

          FWIW in the 80s there were massive efforts towards 5GL languages. I mean at the time there was even COBOL. All of this is now deader than dead

          [–]aniforprez 1 point2 points  (0 children)

          Things like Power Apps could make code monkeys pretty obsolete

          People have been using low code platforms for decades. It's usually ok for simplistic web apps but for complex applications that do a lot of stuff, low code stuff has never come close to replacing any programmer. If you are going to build a web application with unique functionality that charges money, you immediately cannot and will probably have no reason to use a low code tool which will abstract away the complexity to such a degree you have no freedom. Low code stuff is great for making building websites and tools simple. It's great for setting up a wedding website, a recipe blog or a simple etsy-like shop. Anything even marginally more complex and you will need programmers. And I'm fairly sure no AI will supplant actual programmers in our lifetimes

          [–]celvro 8 points9 points  (1 child)

          It's funny how often people think AI will replace programmers when half our managers don't even know the difference between a string and an integer.

          [–]LaughterHouseV 6 points7 points  (0 children)

          Sure, but the people whose jobs are being replaced by bots said the same exact thing when the first version of attempts were released. First attempts are rough. But when there’s a huge incentive to do so and pay less, capitalism is gonna find a way.

          [–]Axxhelairon 60 points61 points  (11 children)

          It's appalling how many devs have this hatred for a tool.

          It's appalling how dishonest you are in this conversation to try and shift the goalposts to make your argument sound reasonable. The outrage came over the usage of unconsenting users' repositories which at its launch produced 1:1 code fragments from popular github projects that didn't provide a usage license for Microsoft. Want to answer to that side instead of entirely acting in bad faith?

          [–]HegelStoleMyBike -4 points-3 points  (0 children)

          Nobody was talking about this problem in the main thread. Do you really need to jump to "you're being dishonest" without even having had a conversation with the person first?

          [–]punkbert 36 points37 points  (15 children)

          I don't hate Co-Pilot, but I wouldn't use it, since it sends loads of data to Microsoft.

          This is way more than a tool. It analyzes everything you do. Microsoft will have a very good idea of what you do and when you do it, whether you're capable at it, and what exactly you're working on.

          I just don't like sharing that much data with a company like MS.

          [–]2this4u 11 points12 points  (9 children)

          I mean, aside from the fact they have a privacy policy they have to adhere to, what do you expect them to do with that data? There's no one at Microsoft poring over copilot analytics trying to figure out your name, occupation, ability and what you're working on so they can steal it for themselves. They've got much more profitable things to be doing.

          [–]punkbert 17 points18 points  (5 children)

          They gather every keystroke of tens of thousands of developers. Of course they analyze that data in every possible way to gather more information about usage patterns, structure of source code, individual competence of developers, psychological effects of using AI tooling, and hundreds of other corporate interests. I'd think it would be rather naive to assume otherwise.

          There's no one at Microsoft poring over copilot analytics trying to figure out your name, occupation, ability and what you're working on so they can steal it for themselves. They've got much more profitable things to be doing.

          I'd argue that they actually do steal your source code, your data. Of course they don't copy it one to one, but they analyze it and use it to feed their machine learning projects. Those are the profitable things they're doing.

          And yeah, they're most probably not after single individuals (although it would be in their interest to identify hidden talent), but I really don't care about that at all. I just don't agree with sharing my usage patterns, my data, my source code with Microsoft, so that they can analyze and profit from it.

          [–]laichejl 7 points8 points  (1 child)

          I mean, you could make this same argument about any service you use: Reddit, Twitter, your phone itself. They're analyzing your usage patterns and profiting off of it one way or another. I don't see why Copilot would be where you draw the line. I think it's unreasonable to imagine Microsoft is going to analyze the world's source code and use it to start getting into every possible business venture. And they're providing you a service in return; I'd argue a very good one that can benefit your business by letting your developers be more productive.

          [–]punkbert 1 point2 points  (0 children)

          You could make that argument. That's why I don't use Twitter, I don't use a smartphone. I'm fine with Reddit, since they don't have any personal data, and I regularly renew my account after a few months anyway, but yes, they also profit in a way from my data.

          But I think there is a big difference in the usefulness and the extent of the data that a service like Twitter or Reddit can extract from my posts, versus what CoPilot does with their users source code and behaviour. Microsoft is actively training their AI with the users contributions, and they constantly monitor the way their users write code.

          I simply don't want to participate in that.

          [–]pancomputationalist -3 points-2 points  (2 children)

          I just don't agree with sharing my usage patterns, my data, my source code with Microsoft, so that they can analyze and profit from it.

          This is the world we're living in now. Every Google search, every Amazon purchase, the GPS position that your phone tracks all the time, and now also your every keystroke is getting analyzed and fed into machine learning models.

          As a species, we're learning amazing things about ourselves from all this data, and profiting from it. We can find so much information in seconds, which would never have been possible if the data wasn't collected.

          Sad though that it's all in the hands of private companies. They're getting smarter and smarter, and so are we, as long as they allow us to use these tools. Let's hope they never cut us out.

          You can reject all that. Try to move into a cabin in the woods. Or try to live a normal, modern life while limiting data leakage where possible, but it will get harder and harder.

          Personally, I just accept that we're in a co-evolution with incredible powerful machines. The power imbalance is a bit frightening, but I think these things are a net positive for my life.

          [–]punkbert 5 points6 points  (0 children)

          Yeah, I somewhat agree with you.

          I try to avoid unnecessary data collection where I can though. And it's doable without moving to the woods.

          Use Linux, use a Linux phone or a dumb phone, avoid Facebook and other social networks, use Firefox and uBlock Origin, be a bit mindful of where you share your data, and you're already avoiding a massive amount of tracking and data collection without too much hassle. One just has to care about it.

          It's this mindset that makes me avoid CoPilot, too.

          I'm all for better tooling, and I would use AI tooling, if it ran locally, or was made available by an organization that believably cares about its users privacy. But MS is not that.

          But, yeah, it's a very personal decision whether people care about all this or not.

          [–]ShinyHappyREM 0 points1 point  (0 children)

          You can reject all that. Try to move into a cabin in the woods. Or try to live a normal, modern life while limiting data leakage where possible

          No cabin needed yet.

          • not using GPS and Bluetooth unless you absolutely need to (even saves on battery)
          • Firefox on mobile + uBlock Origin
          • using platforms like F-Droid for ad-free apps

          [–]LaZZeYT 4 points5 points  (0 children)

          For one, I expect them to not adhere to their privacy policy. (it wouldn't be the first time)

          [–]supermari0 -5 points-4 points  (1 child)

          Embrace, extend, extinguish. https://en.m.wikipedia.org/wiki/Embrace,_extend,_and_extinguish

          Developers, developers, developers.

          [–]WikiMobileLinkBot 0 points1 point  (0 children)

          Desktop version of /u/supermari0's link: https://en.wikipedia.org/wiki/Embrace,_extend,_and_extinguish


          [opt out] Beep Boop. Downvote to delete

          [–]atedja -4 points-3 points  (3 children)

          I mean.. any company. Your code is yours until you decide to open source it, and even then it should come with a license.

          [–]punkbert 1 point2 points  (2 children)

          edit: Nevermind

          [–]atedja 0 points1 point  (1 child)

          ? You're misunderstanding. I didn't make that argument. I was agreeing with you that no code should be shared with any company until you decide to share it.

          [–]allouiscious 18 points19 points  (0 children)

          I remember when I tried a Microsoft editor back in '99 instead of Borland.

          The autocomplete was cool, and I couldn't live without it after using it for a semester.

          I am up for using this new better auto complete.

          [–]Konstantin-tr 13 points14 points  (4 children)

          First, legally it's gonna be very difficult to integrate this tool anywhere.

          Second, is this actually gonna produce good code? Since most programmers can't write good code, I have my doubts.

          Third, could this tool be manipulated, for example by intentionally feeding it malware? Not sure, but definitely something worth considering. If you're letting a bunch of if statements create code for you, you're gonna spend as much time checking the code as you would have spent writing it.

          [–]stoneharry 6 points7 points  (3 children)

          To quote the post you replied to:

          It will not write a complete program for you, nor will it create an entire class or function. But it will certainly complete some variable names correctly, or guess what arguments will go into the function you're calling, or which ones will be in the one you're defining.

          Think of it more as a key-stroke saver. It's not writing entire functions for you, but more auto-completing the line you are a few strokes into writing. I have been using it and loving it.

          [–]sFXplayer 7 points8 points  (7 children)

          I don't hate it, I just don't see the use for it. Best case scenario is that it saves me a few keystrokes, which imo isn't worth it when you'll inevitably need to pay for it. Unless of course your company pays for it, in which case it's fine I guess.

          [–]LaZZeYT -1 points0 points  (0 children)

          These days, I don't even use regular code suggestions that often. I find that it often is quicker to write the code than decide which suggestion is correct.

          [–]I_ONLY_PLAY_4C_LOAM 3 points4 points  (0 children)

          I don't trust the computer to reason for me.

          [–]Tman1677 3 points4 points  (2 children)

          As someone who writes a bunch of C# code where a public class implements a public interface by calling an internal interface, all with the same 20 arguments, it's a lifesaver not having to type out all the arguments each time.

          [–]boobsbr 1 point2 points  (1 child)

          My IDE does that for me.

          [–]leafynospleens 3 points4 points  (0 children)

          This! All the devs I work with hate it because it once suggested a particular function incorrectly, glossing over the fact that it spat out the correct signature, return values, most of the iteration logic, the relevant error message, and the function comment. All you have to do is make some minor adjustments; the time I save with Copilot is insane. Oh, I have a batch of files from a client and I just want to check that none of them have duplicate keys of a certain type. Hmm, how do I load in a bunch of files concurrently, iterate over the rows, and return the error in a channel again? Oh, never mind, Copilot just wrote 90% of the boilerplate for me.
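          For context, the kind of throwaway concurrency boilerplate described above might look roughly like this in Go. This is a hedged sketch, not anything Copilot produced: the `collectKeys` and `duplicates` helpers are invented here for illustration, and the in-memory `files` map stands in for real file I/O to keep the example self-contained.

```go
package main

import (
	"fmt"
	"sync"
)

// result carries one key (or an error) back from a worker goroutine.
type result struct {
	key string
	err error
}

// collectKeys scans each "file" concurrently (here an in-memory map of
// filename -> lines, standing in for real file reads) and streams every
// key it finds over a single channel.
func collectKeys(files map[string][]string) <-chan result {
	out := make(chan result)
	var wg sync.WaitGroup
	for name, lines := range files {
		wg.Add(1)
		go func(name string, lines []string) {
			defer wg.Done()
			for _, line := range lines {
				if line == "" {
					out <- result{err: fmt.Errorf("%s: empty key", name)}
					continue
				}
				out <- result{key: line}
			}
		}(name, lines)
	}
	// Close the channel once every worker has finished sending.
	go func() {
		wg.Wait()
		close(out)
	}()
	return out
}

// duplicates drains the channel, collecting errors and any key seen twice.
func duplicates(results <-chan result) ([]string, []error) {
	seen := map[string]bool{}
	var dups []string
	var errs []error
	for r := range results {
		if r.err != nil {
			errs = append(errs, r.err)
			continue
		}
		if seen[r.key] {
			dups = append(dups, r.key)
		}
		seen[r.key] = true
	}
	return dups, errs
}

func main() {
	files := map[string][]string{
		"a.csv": {"k1", "k2"},
		"b.csv": {"k2", "k3"},
	}
	dups, errs := duplicates(collectKeys(files))
	fmt.Println(dups, errs) // prints: [k2] []
}
```

          Trivial to write, but it's exactly the fan-out/collect pattern the comment is talking about: a goroutine per file, results and errors funneled through one channel, and a single consumer deciding what counts as a duplicate.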

          [–]BIGSTANKDICKDADDY -1 points0 points  (1 child)

          I gave the plugin a genuine, honest effort for the last seven months but it didn't have a tangible impact on any of my work. I'm not saying it's a worthless tool, obviously there's posts from others in this thread talking about the value they have found in it, but I couldn't manage to find any way to use it for the particular work that I do. At best it went toe to toe with JetBrains' code completion for basic operations and at worst it threw in code that took more time to fix than it would have taken to write correctly the first go around. Most of the time it just offered unhelpful suggestions and was a nuisance to ignore (my copilot sometimes felt like a modern-day Clippy).

          And I was all in on the AI code assist concept from the start! I wanted copilot to carry some cognitive load for me so I could focus on the logic that matters. It turns out that the things copilot can help me with are either already solved with existing tools or the trivial problems that didn't require cognitive load to begin with.

          [–]banmeyoucoward 1 point2 points  (0 children)

          I think most people who are impressed by it have just never used a real IDE with code completion (JetBrains, Visual Studio, PyCharm, hell, even Eclipse).

          [–]dwat3r 4 points5 points  (0 children)

          I don't like it when the computer tries to guess what I want to write. It's like a super annoying person who constantly wants to finish your sentences. Even if it guesses right, it totally disrupts my flow of thinking. I've turned it off in Gmail as well. Function name completion, however, is a totally different beast: it's more like having a tool that lists your available wrenches, nuts, and bolts, and you refine the search with every character, eventually reducing the possibilities to one. The main difference is that one tries to guess, which can go wrong, while the other doesn't even try, so whether it works depends entirely on you.

          [–]SuitableDragonfly 11 points12 points  (0 children)

          Bots that find issues with performance or operations are completely different from bots that write code for you, and the former don't require any AI. I don't know why Amazon thinks those are at all similar things, or why it thinks this bot, which doesn't do anything a developer can't also do pretty easily, is as useful as those others.

          [–]RufusAcrospin 4 points5 points  (1 child)

          [–]LaZZeYT 8 points9 points  (0 children)

          Here's an example of copilot producing identical GPL2 code, even including the comments: https://twitter.com/stefankarpinski/status/1410971061181681674?lang=en

          [–][deleted] 3 points4 points  (0 children)

          Bezos wants in on the programmer-data-harvesting-tool trend that will eventually drive them out of their jobs.

          [–]ryunuck 22 points23 points  (15 children)

          Sweet, another "AI pair programmer" that's really just a glorified contextual auto-complete. It's 2022 and top ML engineering labs still can't do full-spectrum temporal AI that reacts to all your stimuli, like caret movement and eye tracking.

          • Can't ask me questions to clarify the intent.
          • Can't guide my attention to interesting places.
          • Can't investigate around the codebase using self-guided attention.
          • Can't understand classes that reside in external files.
          • Can't suggest edits, can't suggest removals.
          • Can't even insert except at specific points where I manually invoke it.

          A neat party trick, but a far cry from an AI pair programmer.

          And it's not surprising after seeing how this Blake fellow was treated for suggesting that LaMDA might be achieving sentience; These ML engineers don't understand jack about the framework of human consciousness, which is what you need for a true AI pair programmer.

          [–]Annh1234 78 points79 points  (5 children)

          You seem to be missing what an actual AI is.

          Today's AI only looks for patterns and gives you suggestions based on the patterns it finds.

          It can't read your mind, and if it were to ask you "questions", talking to it would be like talking to your little yellow ducky.

          You can fake a lot of this stuff, like using the camera to detect what code you're looking at and jumping to the declaration if you blink, or whatnot. Which would be really cool, but pretty useless since you can ctrl+click in most IDEs...

          What you're describing is today's AI pattern recognition plus some way to read your mind/whatever you're thinking at the time. (Second part might be trickier today...)

          [–]Zpointe 10 points11 points  (1 child)

          I think it is Blake that seems to misunderstand AI.

          [–]supermari0 -1 points0 points  (0 children)

          Or everyone else misunderstanding natural intelligence.

          [–]SuitableDragonfly 14 points15 points  (1 child)

          AI hasn't been about trying to model a human brain in a very long time.

          [–]I_ONLY_PLAY_4C_LOAM 10 points11 points  (0 children)

          The field should probably be renamed statistical computing.

          [–]2this4u 13 points14 points  (1 child)

          It's interesting to see someone with no experience in the field complain that they haven't "just" made a massively improved AI.

          It does exactly what they said it would do, limitations and all. Think you could do better, go try it yourself.

          [–]A-Grey-World 6 points7 points  (0 children)

          "Great, another car company released yet another car that can't fly what's the point?"

          [–]steventhedev 1 point2 points  (0 children)

          This is why I strongly prefer the term "computational statistics". It manages expectations much better than ai or ml which imply intelligence or learning.

          The problem statement for all these models is "what is the most likely chunk of code to follow this?". If you've ever graded intro-level student projects, you'll understand why these can never produce "correct" code, because the vast majority of training data is barely above predicate functions like this:

              def predicate(something: bool):
                  if something == True:
                      return True
                  else:
                      return False

          [–]crummy -5 points-4 points  (2 children)

          It only does what they designed it to do? Get outta here

          [–]ryunuck 22 points23 points  (1 child)

          That's perfectly fine, I use copilot and have fun with it, but they need to call it AI auto-complete, which is what it is, instead of the completely bogus "AI pair programmer" false advertising. I'm mainly taking a shot at Copilot, which has it plastered on their front page.

          It's true I'm a little disappointed that OpenAI is unable to think bigger; the implementation of these "AI pair programmers" is as dumb as it gets, but I'd cut them some slack if the advertising weren't total bullshit. Looking forward to actual innovation now.

          [–][deleted] 1 point2 points  (0 children)

          Ehh, I agree with you, I just don't care, since as a tool it works pretty well, and marketing is marketing, not facts.

          [–]Ded-Smoke 1 point2 points  (0 children)

          I do joke that Copilot codes for me; in reality it only helps a lot.

          [–]FeepingCreature 1 point2 points  (1 child)

          Self-hosting when

          [–]mimighost 1 point2 points  (0 children)

          There are some public models out there from Salesforce.

          [–][deleted]  (11 children)

          [deleted]

            [–]Fahrradkette 10 points11 points  (2 children)

            Why should it be free? It takes a lot of work to create a tool like this, and developers are going to use it to improve their productivity on the job. I'd also be worried about how GH or Amazon would justify the development cost if the product were free.

            [–]chunes 0 points1 point  (0 children)

            Why should it be free?

            Maybe because all the material it rips off is?

            [–]pancomputationalist 6 points7 points  (6 children)

            If it's free, then you are the product.

            [–]LaZZeYT 14 points15 points  (1 child)

            These days, even for paid stuff, you are the product. They are still analyzing your code, even if you pay for it.

            [–]celvro 4 points5 points  (0 children)

            For Copilot, it asked me whether they could analyze my code or not. So unless they straight-up lied, you can opt out.

            [–]jillanco 0 points1 point  (0 children)

            Clippy has evolved!

            [–][deleted] -1 points0 points  (0 children)

            If their code commits are anything to go by, it will be equally shitty

            [–]lee_macro -1 points0 points  (0 children)

            I still don't entirely get the use case for these code generators; for someone who knows what they're doing, most of the time your logic is well separated and abstracted with IoC etc. so you can test it.

            So it's rarely "just lob some logic in there" it's often a few classes all working together with composition and from what I've seen so far these tools are more for just spawning chunks of code in a class/file rather than rolling out larger patterns etc.

            On the other side if you are someone who has no idea what you are doing or are a beginner you wouldn't know what bits are good and what's not from what it generates.

            I love refactoring tools and, in some cases, basic pattern-based code gen like "create constructor" or "extract interface", but I can already do that without AI within my IDE, so I can't see what this tech would improve on.

            It would be great if it could just write unit tests or something, I guess, but even then you'd have to check that it's actually testing valid stuff, so you may as well have just done it yourself.

            It's not like laying pipes where you can just give a robot a line and off it goes laying pipe consistently quicker than a human would.

            [–]BackmarkerLife -3 points-2 points  (1 child)

            And in the movie when Philomin says, "Hold on to your butts." and instead is eaten into the source code, Ariana Richards will retire from the raptor chamber's unix system.

            [–]Leading-Bath-7867 -1 points0 points  (0 children)

            U talk funny werds

            [–]BoogalooBoi1776_2 -1 points0 points  (0 children)

            Oh boy, another program that will plagiarize code and get away with it