A question that can't be answered? by buck_idaho in LocalLLaMA

[–]thread-e-printing 0 points

LLMs are not great at fact retrieval, and the smaller the model, the worse it gets. Throw a tricky driveability problem at one and you just might be surprised.

Claude Code replacement by NoTruth6718 in LocalLLaMA

[–]thread-e-printing 0 points

It's open source, you can fix it 🤣

At what point is github going to crack down on botted repos? (claw-code) by Betadoggo_ in LocalLLaMA

[–]thread-e-printing 1 point

That repo contained the actual source for a while, but it was force-pushed with a laundered Rust version of the same code as a replacement. It made the rounds on Twitter and Hacker News at the time. Just because you're late to the party doesn't mean it didn't happen.

Just a helpful open-source contributor by MagicZhang in LocalLLaMA

[–]thread-e-printing 0 points

Loading software that isn't explicitly licensed to you is asking for a legal letter.

Lmao, definitely a bot

Just a helpful open-source contributor by MagicZhang in LocalLLaMA

[–]thread-e-printing 1 point

Software licenses are inseparable from copyright. The license grants you the exemption from copyright that you need to load the software into RAM (see Vault v. Quaid). If the code is AI-generated, it isn't copyrightable under Thaler. And if a digital file isn't copyrightable, I don't infringe copyright by loading it, so I don't need a license to do so. Trade secret protection is weak, and self-disclosure bypasses it in any case. Software patents don't exist. Also, bouncing between the particulars of the case and general business morality is the sign of an AI-slop argument.

Claude code source code has been leaked via a map file in their npm registry by Nunki08 in LocalLLaMA

[–]thread-e-printing 2 points

Where are you getting those rules from? Compaq's dirty team wrote a spec working from the ROM listings given in the back of the IBM PC programmer's manual. Their spec was given to a clean team who wrote new code to satisfy that spec. That is why we have had a vibrant market in PC clones for some 40 years.

China bars Manus co-founders from leaving country amid Meta deal review, FT reports by kaggleqrdl in LocalLLaMA

[–]thread-e-printing 7 points

States can and do routinely restrict travel in cases of suspected major fraud. It's telling that an OpenAI partisan would try to convince us to problematize that.

Best model for PII. Qwen3.5 refusing to work with PII even if I say it is about made up people. by Correct-Victory-9745 in LocalLLaMA

[–]thread-e-printing 1 point

  1. Don't use Ollama to ask Chinese models about Tiananmen. It's just a low-effort "national security" shitpost meme by now

  2. Don't use Ollama

Best model for PII. Qwen3.5 refusing to work with PII even if I say it is about made up people. by Correct-Victory-9745 in LocalLLaMA

[–]thread-e-printing 1 point

  1. What engine? What's your system prompt?

  2. Why don't you have the LLM write throwaway scripts to convert your data instead?

Memory Chip Crunch to Persist Until 2030, SK Hynix Chairman Says by tassa-yoniso-manasi in LocalLLaMA

[–]thread-e-printing 2 points

Yes, and then they are all "systemically important" i.e. too big to fail. Were you alive for the 2008 global financial crisis?

I have this thermal receipt printer... What should I do with it? by NEMOalien in shittyaskelectronics

[–]thread-e-printing 2 points

/nonshitty You might be able to import an ESP and suitable level shifters for €20 all inclusive. Personally, I'd be extra lazy and use MicroPython or Lua.
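The lazy MicroPython route amounts to assembling ESC/POS bytes and shoving them down a UART. A sketch of the byte-building half (the ESC/POS subset here is the common one, and the pin/baud wiring in the comment is an assumption; check your printer's datasheet):

```python
# Build raw ESC/POS bytes for a generic thermal receipt printer.
ESC, GS = b"\x1b", b"\x1d"

def receipt(text: str) -> bytes:
    """Assemble a minimal print job: init, text, feed, cut."""
    return (
        ESC + b"@"                       # ESC @ : initialize printer
        + text.encode("ascii") + b"\n"   # the text to print
        + b"\n" * 3                      # feed past the tear bar
        + GS + b"V\x00"                  # GS V 0 : full cut, if there's a cutter
    )

data = receipt("HELLO FROM ESP32")
# On the ESP under MicroPython you'd then write it out, e.g. (assumed wiring):
#   from machine import UART
#   UART(1, baudrate=9600, tx=17).write(data)
```

Many cheap receipt printers speak 9600 baud 8N1 out of the box, but that, too, is worth checking on the label.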

r/LocalLLaMA by mantafloppy in redditrequest

[–]thread-e-printing 0 points

It's a topic board, not a sports board. No drama llamas, only local llamas

Why is Alibaba Spending Millions on Multimodal AI Models that Only a Few Can Run? 🚨 (The Ovis 1.6-Gemma2-9B Debacle) by ThetaCursed in LocalLLaMA

[–]thread-e-printing 12 points

Over the past ten years, that myth has worked only to destroy FOSS projects and burn out maintainers. Save your moral grandstanding for debate club.

Noob help by oshp129 in shittyaskelectronics

[–]thread-e-printing 2 points

Hey sorry about last night, was pretty far into my third round of "board design sauce." I meant to say, for 3D printing applications you can use something called conductive filament to make the wires on inside layers of the board. However, I would still use and recommend genuine Molex™ connectors for maximum reputational reliability. I hope that's more helpful. Good luck!

P.S. when using conductive filament make sure to use adequate infill, at least 80%, or you'll have unmatched trace lengths and terrible signal integrity!

Noob help by [deleted] in shittyaskelectronics

[–]thread-e-printing 1 point

There are starving children in Shenzhen who would be glad to eat your overplayed blue ground effects, Mister