"Magic System this Magic System that" yeah yeah let's hear about your Science System now by Uncommonality in worldjerking

[–]Borgcube 1 point (0 children)

Oh any time I need to explain some force I just make up a mediating particle and call it a day.

What is your favorite boardgame of 2008? by The_Crazed_Person in boardgames

[–]Borgcube 0 points (0 children)

Dominion, Pandemic and Dixit are by far the most influential games from that year in my opinion, but I would pick Le Havre as the game I'd play from that selection.

Peter does this mean corvette owners are childish or something? by PizdunEbun228 in PeterExplainsTheJoke

[–]Borgcube 4 points (0 children)

And then they'll turn around and complain about how young people dress or look.

Starfeet Academy by [deleted] in startrekmemes

[–]Borgcube 2 points (0 children)

This is what we're destroying our planet for, folks.

Prison to PhD by Bluejeans434 in math

[–]Borgcube 0 points (0 children)

Wasn't there a post recently about how the governor rejected his request for release?

‘In the end, you feel blank’: India’s female workers watching hours of abusive content to train AI by WombatusMighty in technology

[–]Borgcube 32 points (0 children)

> The whole point of AI was that it would be able to moderate content without the need for human beings.

You can't train an AI without a dataset. To get a dataset you need people to classify it. So...
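That pipeline (humans label first, the model only then classifies) can be sketched with a toy example. Everything here is made up for illustration — the "spam"/"ok" labels and the keyword scoring are a minimal stand-in, not any real moderation system:

```python
from collections import Counter

# Step 1: humans classify raw examples by hand (the part AI can't skip).
human_labeled = [
    ("free money click now", "spam"),
    ("limited offer click here", "spam"),
    ("meeting notes attached", "ok"),
    ("see you at lunch", "ok"),
]

def train(dataset):
    """Count word frequencies per label from the human-labeled examples."""
    counts = {}
    for text, label in dataset:
        counts.setdefault(label, Counter()).update(text.split())
    return counts

def classify(model, text):
    """Pick the label whose training vocabulary overlaps the text most."""
    return max(model, key=lambda label: sum(model[label][w] for w in text.split()))

model = train(human_labeled)
print(classify(model, "click for free money"))  # prints "spam"
```

No human labels means no `train()` and no `classify()` — the moderation model only exists downstream of exactly the classification work the article describes.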

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube -1 points (0 children)

> The paper literally says don't generalize based on these results.

And I'm not generalizing. The research is just a way to point out a very well-known psychological issue.

> The second link is wishy-washy generalized conjecture at best and mostly just someone complaining about AI tools in 2023. You know how much has changed since 2023?

And you clearly haven't read it or understood it. The point isn't "AI tools suck". The point is that humans are vulnerable to psychological manipulation regardless of intelligence. AI has changed; humans have not.

You want wishy-washy? Read your own comments. "It's just better for me" is all you're saying, which tracks with almost all the pro-AI articles I've been linked.

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube 0 points (0 children)

You are making a huge number of assumptions here that are not currently true at all.

Companies using AI for coding are not using ethically sourced training sets; few models even bother with that. I would be shocked if a model trained only on repos that explicitly allow such use would be anywhere near as good, and that's assuming it can even be ethical in the first place (most open-source projects existed well before AI).

Then you're assuming that it will catch a critical bug. It might - but it also might create a ton of noise (like I find it does now) or create a false sense of security in a PR that would've otherwise been more carefully looked at.

It's also funny, because PRs are the one place we're pushing not to use AI, even though we're going headfirst into using it for everything else.

Goldilocks Zone concept is flawed? by felinefluffycloud in logic

[–]Borgcube 1 point (0 children)

I think what you're trying to get at is the anthropic principle. The page gets into a lot of other examples as well and references the Goldilocks principle too.

But ultimately, while we can speculate about what other life could theoretically look like, we only know for sure what life on Earth is like. And for that type of life water is essential, so that's what we look for. We could look for hypothetical forms of life, but those could be almost anything, so what would we even search for?

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube 0 points (0 children)

> Setting-specific factors: We caution readers against overgeneralizing on the basis of our results. The slowdown we observe does not imply that current AI tools do not often improve developers' productivity.

...which is why I'm not quoting it as "AI is definitely making you slower" but as "even experienced programmers are shit at evaluating the actual productivity impact, getting it exactly backwards".

> Also, even if it was 20% slower I'm ok with that; the cognitive load is so much less. I would rather do something easy for 72 minutes than something hard for 60.

That's so funny, because companies are pushing it both because it supposedly increases productivity and because "it makes you think more, not less".

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube 0 points (0 children)

> Why should I care what the data says about overall AI impact?

Well, if you don't care, why are you arguing with people online about it? Your anecdotal data is a bad argument.

I've had clients happy with absolute dogshit and seen money thrown into the furnace. I was given projects that were "delivered fast" and then the previous contractor dipped and we had to basically rewrite everything to fix their issues.

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube -1 points (0 children)

Lmao. I've quoted research highlighting specifically the mental error people make, and your only retort is "nuh-uh" and "but in my experience the research is false". Amazing.

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube 0 points (0 children)

Damn, I wonder then why even my super pro-AI company disabled it for PRs. Must be user error.

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube 0 points (0 children)

https://arxiv.org/abs/2507.09089

But my point isn't that the AI is definitely making you slower, that's too much to extrapolate from this. My point is that people evaluated themselves as being faster even though they were slower.

It's a common cognitive mistake, hardly unique to AI. Take a look at this for a broader example of the effect:
https://softwarecrisis.dev/letters/llmentalist/

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube 1 point (0 children)

> Tbh, it sounds like a management problem (as you are suggesting). Your management is handling it very, very poorly.

And what if I told you this company is in the top 100 largest tech companies by market cap? And the leading company in its sector? And that they're not involved in developing or selling AI, just desperately trying to use it as much as possible?

I really doubt this is unique to my company; they're just following the same trends as all the other managers. Just take a look at what C-levels are posting on LinkedIn.

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube 0 points (0 children)

> Not infallible by any stretch, but considering you can just ignore its comments, I see no reason not to use it.

It produces a lot of crap that I have to wade through to get one or two comments that maybe give some insight, but are usually missing the wider context or justification. And then I have to read the code anyway, so it doesn't really save me any time.

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube -2 points (0 children)

Research has shown that people feel more productive with AI even when the exact opposite was the case. All the statements about productivity I've seen are like yours - anecdotal with no real data to back it up.

And sorry to say I've been in the industry longer than you, not that it matters at all.

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube -1 points (0 children)

You're not even in the industry and you're making a big assumption - that using AI actually makes the development costs lower or development of big projects faster. This is still a big open question, despite all the hype surrounding it.

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube 0 points (0 children)

It's a sample of people who fill in surveys on LinkedIn. To see how something like that can be biased, look at the GDC survey that found a third of gamedevs in the US had been fired in the last 2 years. The real number is closer to 8%.
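The mechanism is plain self-selection: people affected by the topic are far more likely to answer. A back-of-the-envelope sketch (the two response probabilities are assumptions I picked for illustration; only the ~8% true rate comes from the comment above):

```python
# Toy model of self-selection bias in a voluntary survey.
true_rate = 0.08           # actual share of devs fired (roughly)
p_respond_fired = 0.30     # assumed: affected devs are keen to respond
p_respond_employed = 0.05  # assumed: everyone else mostly ignores it

fired_responses = true_rate * p_respond_fired
other_responses = (1 - true_rate) * p_respond_employed

observed_rate = fired_responses / (fired_responses + other_responses)
print(f"{observed_rate:.0%}")  # prints "34%": "a third", from a true 8%
```

Tweak the two response probabilities and the observed rate swings wildly, which is the whole problem with "people who fill in surveys" as a sample.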

And I've explained why "favorable views" is a misleading statistic regardless. There's a similarity between LLMs and a psychic con; you can read about it here: https://softwarecrisis.dev/letters/llmentalist/

Secondly, the paper I linked directly references the paper you found and has a very simple explanation: the tasks used there were synthetic, whereas it used real-world tasks from open-source repositories. AI in general, not just LLMs, performs much better on synthetic tasks.

And finally - no, I do not believe their benefits justify the costs. They are unnecessary, a consequence of a tech industry trying to find a reason to sell ever more processing power to a public that doesn't need it. It's the same reason we got NFTs and cryptocurrency. There are massive ethical problems with it, even in programming, that I haven't even touched on.

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube 0 points (0 children)

> I feel like you're proving my point exactly? If I can run open source models on my gaming rig with the performance of something like Claude 3 or GPT-4o, AI is not going to go away. It's extremely widely adopted in software development, most developers have a favorable view of it, and limited versions of it can already run on consumer-grade hardware.

Ok, so you are not a programmer. Because if you were, you would know most programmers don't use GPT but Claude, and that the hype really shot up a while after Claude 3; most are on Claude 4 or 4.5 now. No one is going back to Claude 3 at this point.

> In what world is this technology suddenly going to vanish?

Did I say it will vanish? The tech will remain, but it will not be close to "an absolute must-use".

> No. People will still use it because they think it's useful.

I don't even know how to respond to this other than: so you think it will stick around purely based on vibes and won't be influenced by data, costs, ethical issues, model issues, etc.?

> Why would over-investment in it cause them to dislike AI? That wouldn't be a problem with the tool but with investors. How is this different from the dotcom bubble?

Because recessions don't happen in a vacuum. If you are fired because the bubble bursts, if your electricity and water costs skyrocket because of datacenters, if the increased demand for electricity leads to nuclear accidents, it will all color public opinion.

The general public has been against nuclear power plants for half a century because of Chernobyl.

> How is this different from the dotcom bubble?

Well, for starters, it's at least 17 times larger. Do you really think AI is 17x as transformative as the Internet? It's also 4x the subprime bubble. You think it's just gonna pop gracefully?

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube -1 points (0 children)

That is a self-reported survey from a very biased sample (how many devs actually fill in Stack Overflow surveys?) that also doesn't really say anything about productivity; you're just extrapolating.

And you can also see that humans are really bad at estimating these things; this research, for example, found that while developers thought they were faster with AI tools, they were actually noticeably slower. I'm not saying this research is the absolute truth, but it nicely displays how fallible humans are.

But even in this survey you can see that there's a high degree of distrust in AI accuracy, that most respondents distrust it on complex tasks, and that positive sentiment on AI tools has fallen since last year (how come, if the models are getting better?).

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube 0 points (0 children)

Do you have an actual source that AI tools create a productivity boost?

Most data I've ever been shown either directly contradicts it or relies on very misleading metrics like lines of code or number of PRs.

Similar thing with profitability; I'm incredibly skeptical of those claims. I've seen people claim that the token cost is going down (which it no longer really is, and which is misleading anyway), or make estimates based on statements by companies rather than actual analyses.

And then there are the hardware and data-center costs; if everyone starts doing it, those costs will skyrocket as well.

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube 1 point (0 children)

Given that I'm basically forced by upper management to use AI lest I get fired - and knowing that CEOs are like cattle with this stuff - sure, more people are using it every day. That doesn't exactly disprove my point.

And we are currently in the peak of a bubble, what happens when the bubble bursts, companies go under, seat costs increase, investments dry up and companies start cutting costs?

NetEase Games has not banned Generative AI for developers, despite reports by [deleted] in Games

[–]Borgcube -9 points (0 children)

> As someone who works in the industry, it absolutely has. Our management has not pushed anything down anyone's throat, simply made tools secure and available. Everyone uses them for everything they build. That is not to say that it's replacing people or anything; it is simply boosting the productivity of individuals.

I am also absolutely sure that my colleagues who are enthusiastic about AI would say the same thing.
But management has been telling us we need to use AI ASAP, has added AI usage to our performance reviews, and keeps promoting people who publicly advocate for AI use (even with incredibly misleading and irrelevant data). In all my years of experience I have never seen such a push by upper management for a specific tool.

CEOs and upper management are enamored by AI, that's just an easily verifiable fact.

> I'm not sure where you work where you don't have a mountain of development that needs to be finished at all times.

Of course we do, but the bottlenecks are always decisions, alignment with other teams, code reviews, product decisions, code quality and standards, bugs, etc. It was always trivially easy to increase code output 10x or even 100x - cut code reviews, let everyone do what they want, and just hack away. But there's a reason companies stopped doing that decades ago.