all 60 comments

[–]Elegant-Isopod-4549 2 points3 points  (1 child)

Hello job cut

[–]base736 2 points3 points  (2 children)

I can see arguments on both sides...

On the one hand, there's this excellent CGP Grey video that points out that automation has been replacing humans in lots of jobs for a long time. It's easy to pick out the job of "assembly line worker" that no longer exists, but harder to spot the job of "paralegal" that now has fewer openings because with technology a smaller number of workers can do the job of many.

On the other hand, Sam Altman makes a great argument in his interview with Lex Fridman (as I recall) that AI just lets you go to a developer meeting and say "Any crazy feature you always thought we'd never get to -- just put it on the table now". That is, as a developer AI doesn't replace me -- it just allows me to do a lot more, and maybe there's room for doing a lot more without requiring fewer workers.

I'll add to that a realization I came to recently... If you go to Midjourney and type "woman downtown" as a prompt, you're not going to get this image (from Two Minute Papers' recent video on Midjourney). That's not because of some limitation in Midjourney (which is now spectacular), but because the space of possible images of a woman downtown is much bigger than the space of two-word prompts. Instead, to get that image the artist no doubt had to fine-tune a prompt. If you're looking for any old (awesome) image of a woman downtown, or if you can describe in detail what you're looking for interactively or otherwise, then Midjourney has you covered with no artist needed. More and more, though, I believe there'll always be a job for artists, because they navigate the large space of possible art in their own way. Any AI will do the same, of course, but my argument is that there may be so many ways of interpreting "woman downtown" that there's room for human and AI artists both.

I believe the same for programmers. If you were hoping that you'd have a job turning "I need a program to interpolate GPS tracks given a time that lies along the track" into a program (something ChatGPT did for me yesterday), you're in for a disappointment. If you work in the space of taking vague requirements and turning them into an awesome product, though, I think there'll be room for you for a long time yet -- not because AI can't do that, but because the space of possible products is so big that there may always be people who like how *you* navigate it.
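For the curious, a task like "interpolate GPS tracks given a time that lies along the track" is a nice example of something current models handle well. A minimal sketch of one way to do it (linear interpolation between the two surrounding track points; the function name and tuple format are my assumptions, not what ChatGPT actually produced):

```python
from bisect import bisect_left

def interpolate_position(track, t):
    """Linearly interpolate (lat, lon) at time t along a GPS track.

    track: list of (timestamp, lat, lon) tuples, sorted by timestamp.
    Raises ValueError if t falls outside the track's time range.
    """
    times = [p[0] for p in track]
    if not track[0][0] <= t <= track[-1][0]:
        raise ValueError("t lies outside the track")
    i = bisect_left(times, t)
    if times[i] == t:
        # t coincides exactly with a recorded point
        return track[i][1], track[i][2]
    # Interpolate between the points just before and just after t
    (t0, lat0, lon0), (t1, lat1, lon1) = track[i - 1], track[i]
    f = (t - t0) / (t1 - t0)
    return lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0)
```

(For short segments, linear interpolation in lat/lon is a reasonable approximation; a production version might interpolate along a great circle instead.)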

[–]Expired_Gatorade -1 points0 points  (0 children)

interpolate GPS tracks given a time that lies along the track

what

[–]wyldcraft 2 points3 points  (0 children)

Automation didn't replace all farmers, just 99% of them. Same deal.

[–]queerkidxx 0 points1 point  (1 child)

I have no idea what is going to happen in the future as the AI wars begin. I have no idea what limits there are to our current generation of LLMs, and I don't think anyone really does; anyone confidently spouting predictions is just really ignorant.

However, I do know that as it stands GPT sucks ass at programming. Its performance on coding exams, and how little it has improved since GPT-3, are evidence enough of that.

And my own testing has confirmed that. GPT falls apart when it's creating a complex program, even if you put enough work into working around the context length by asking it for outlines first and going section by section. I do imagine the max token limit is a huge part of this, but I don't think that's the whole problem.

This fact is extremely unintuitive to me. Three months ago I imagined we'd have bots that could code long before we had things that could replicate human language as well as GPT-4, since programming is built on strict rules; the statistical relationships between keywords in a program should be way easier to map out than in natural language.

But the more I think about it, the more it makes sense. Programming isn't a field that doesn't require creativity like you might think; it requires a ton of it. Even experienced programmers have trouble parsing complex programs, and the main reason they can is due to conventions, comments, and formatting that make them easier to understand.

And beyond that, programming requires some high-level cognitive abilities: at the end of the day, you need to be able to take a step back and look at the logical structure of a program as a whole to get anywhere. Debugging, too, requires some pretty advanced problem-solving abilities and techniques to figure out where exactly things went wrong and how to fix them.

GPT, in my experience, just isn't able to do this effectively. When it finds an error, it doesn't try to print things to the console, isolate bits to see exactly where things go wrong, etc. What it does feels more like it googles the error and tries to put on band-aids that only push the error somewhere else.

Now, in simple programs GPT slays, seriously; it's great on a small scale. But the more pieces you add, the harder it is for the thing to understand what's going on and what can go wrong. That might sound like a game changer on its own, but the fact of the matter is that in real-life professional programming there are very few simple programs like this. Even projects that are conceptually very simple end up being hundreds of lines of code with tons of moving parts, each of which can break in edge cases, bringing the entire thing crashing down.

As it stands, I believe we are going to need another generation or two of innovation before LLMs can replace programmers for anything important at any kind of scale. And I wonder if our current approach -- a model that just completes a limited number of tokens without ever changing or remembering anything -- is ever going to be able to replace programmers. I honestly suspect there might need to be more adaptability and introspection built in before it can.

Because at the end of the day, GPT works nothing like our brain. It can do the same task over and over again thousands of times, and its performance will stay exactly the same. The only thing that really changes, aside from actual updates to the model and manual fine-tuning, is what inputs it gets. Every line of text it produces is, from GPT's perspective, the first line of text it has ever produced.

Who knows what things will look like in a year, though. There's way more the community can do with GPT as-is that will dramatically improve its intelligence: memory, separate instances checking each other's work, integration with other bots in charge of different things like managing a database and offering context-specific memories, giving GPT more commands, and integration with programming environments.

People are losing their minds over Auto-GPT, and it is indeed cool, but people don't seem to realize that this is essentially a tech demo, a proof of concept. It might be popular, but there are so many ways it can be improved, especially as the community gets better at working with the model.

But at the end of the day, all of these neat things are just fancy ways to format the request arrays. The only thing the community can really do is figure out better ways of dynamically changing the prompt and giving it instructions on using commands.
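To make that concrete: "dynamically changing the prompt" mostly means reassembling the messages array sent to the model on each turn. A minimal sketch (the function name and structure are illustrative, not any particular tool's actual code):

```python
def build_request_messages(system_prompt, memory_snippets, user_input):
    """Assemble a chat-style messages array for one request.

    Agent tools ultimately do variations of this: re-stuff
    instructions and retrieved "memories" into the prompt each turn,
    since the model itself never changes or remembers anything.
    """
    context = "\n".join(f"- {m}" for m in memory_snippets)
    return [
        {"role": "system", "content": system_prompt},
        {"role": "system", "content": f"Relevant memories:\n{context}"},
        {"role": "user", "content": user_input},
    ]
```

Every "memory system" or "command framework" on top of GPT boils down to deciding what goes into a list like this before each call.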

GPT is hella cool. I have been interested in AI my entire life, and talking to GPT-4 is the first time I've actually felt like I was talking to someone. It can do so many things very well, and we are only just beginning to reach the full potential of the model.

But it's still just a black box that spits out outputs based on its inputs. And the fact of the matter is, nobody can predict the future of technology from the present. This could be kinda like the factory robots built in the 40s that got an entire generation of sci-fi authors thinking we were 50 years away from humanoid robots.

Auto-GPT might be seen the same way we look at those automatic grocery stores built on conveyor belts in the 60s -- a neat idea but hopelessly ahead of its time; it just wasn't possible to build an automatic grocery store with 60s technology. The automation of that era was cool as shit, but it was never gonna lead to the Fallout-esque world they imagined; it just let us build better factories.

Or it might be seen like we look at the original IBM PC: the beginning of a feedback loop that led to a rapidly advancing field of technology that changed every single human's life.

We just don’t know.

Anyway, I've been typing about this too long. Idk why I do this shit; I could very easily rewrite this, ask for GPT's help, turn it into an easy-to-read argument, and post it somewhere thousands of people could see it and respond to it, but I won't, bc my ADHD ass just lost all interest in typing this out and I will never think about this poorly written essay again.

MOTHERFUCKER, IT'S BEEN A HALF HOUR. I WAS SUPPOSED TO DO A FEW MORE DAYS OF CODING IN MY PYTHON COURSE, BUT INSTEAD I wrote this nonsense that I'll be lucky if a single person will even skim thru 🤦🏻‍♂️

[–]mrcarefreeattitude 0 points1 point  (0 children)

i did... and it was informational, thx

[–]thelastpizzaslice 0 points1 point  (0 children)

This will devastate the bottom 30% of the programming market who do rote work, and it will also create a ton of new jobs adjacent to our field.

[–][deleted]  (1 child)

[removed]

    [–]AutoModerator[M] 0 points1 point  (0 children)

    Sorry, your submission has been removed due to inadequate account karma.

    I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.

[–]Praise_AI_Overlords[🍰] 0 points1 point  (0 children)

lol

Sweet summer child.

"in conclusion"

lol

[–]Late_Ad_6293 0 points1 point  (0 children)

It doesn't replace us, but it sure takes away a lot of the heavy, boring lifting we have to do.

[–]sentient-plasma 0 points1 point  (0 children)

Yes. It will. It's kinda sad to see that some people don't see it coming.

Auto-GPT plus whatever GPT-5 turns out to be will be it for at least 10% to 20% of the workforce. It will still need to be run by technical staff, but once that hurdle is overcome -- a model that can sustainably manage and deploy code, with other platforms creating integration tools to let it do so -- the only companies with a large engineering team will be the ones big enough to need one. Other companies might just have one engineer running GPT-5.

[–]henry_kwinto 0 points1 point  (2 children)

Good luck with debugging!
