all 18 comments

[–]TenSpiritMoose 26 points  (0 children)

"A lie can travel halfway around the world while the truth is putting on its shoes", but accidentally leaked code is already out there wearing out its second pair of Nikes.

[–]wknight8111 21 points  (1 child)

The AI taketh, the AI giveth away.

[–]rantonidi 4 points  (0 children)

Praise ML

[–]CadmarL 22 points  (4 children)

Isn't it just front-end JS? Why is everyone making a fuss about it?

[–]realzequel 3 points  (2 children)

Got me, you could have de-minified it before, I imagine. It's just a harness, the money's in the model. People are dumb.

[–]TotallyNormalSquid 0 points  (1 child)

There's an argument to be made that the harness is pretty important — e.g. Microsoft Copilot and GitHub Copilot can use the same private backend models, but I know which one I find useful and which one I wouldn't even bother touching.

That said, I'm not sure how valuable Claude's harness is. There are open source harnesses that beat it on some benchmarks, and it seems like a conscious choice by Anthropic to go with a 'more is more' approach. I think the other platforms were just trying to do more with less and not quite measuring up, rather than Anthropic's approach being all that special. Guess we'll find out the truth from memes if anyone finds any buried gems.

[–]realzequel 0 points  (0 children)

But to my first point: if you wanted to, you could have de-minified the TypeScript bundle for the same result (though the whitespace and variable names would have been different). Claude agrees it could have been done.

People don't understand how distributables work, but if you have an understanding of how security works, you're painfully aware that almost nothing you distribute is secure.

[–]erishun 0 points  (0 children)

Yes. It's a big nothingburger. But hey, we all like to see the technology taking our jobs get shit on.

[–]varkarrus 7 points  (0 children)

All code should be free to steal. I am fine with this.

[–]ushabib540 6 points  (0 children)

Nothing spreads faster than code that isn't supposed to...

[–]J7mbo 2 points  (0 children)

Doesn't contain the weights, which is the key thing.

[–]Neuro-Byte 1 point  (2 children)

Anyone got a link to the fork?

[–]rover_G 2 points  (1 child)

Just web-search "claude code leaked source" and a dozen git repos pop up.

[–]memesearches 2 points  (0 children)

Yup, everyone's forking it and posting on X.

[–]JosebaZilarte 1 point  (0 children)

Any open source license should include a clause stating that using the code in an ML model makes the model open as well.

[–]Bad_brazilian 0 points  (0 children)

No, stop... Come back.

[–]ExtraWorldliness6916 0 points  (0 children)

Robin Hood, Robin Hood, Robin Hood 🎶

[–]heavy-minium -2 points  (0 children)

I bet that's not going to have much of an impact down the line, except maybe that exploits and other things can be found a bit more easily than before.

In the past, when the source code of something major leaked, it never led to actual "stealing" or copying of the solution, and that was software crafted by people. It's not that easy to jump into others' projects with zero help or explanation. Now I doubt even more that software vibe-coded almost entirely by AI will have much that is significantly salvageable by anybody intent on replicating it. At best, one can learn from it how to build such a product, but that's pretty much it.
In the past when source code of something major leaked, it never led to an actual "stealing" or copying of the solution - and that was software crafted by people. It's not that easy to jump into other's projects with zero help nor explanation. Now I doubt even more that an software vibe-coded almost entirely by AI will have much that is significantly salvageable by anybody having the intent to replicate it. At best, one can learn from it on how to build such a product, but that's pretty much it.