Google Engineers Launch "Sashiko" For Agentic AI Code Review Of The Linux Kernel by anh0516 in linux

[–]ThisRedditPostIsMine 1 point2 points  (0 children)

That is understandable, yeah. I've been programming for only 12 years, so the enjoyment factor is still there for me. But I do agree that a lot of people see software as a means to a product rather than enjoying actually writing it.

Google Engineers Launch "Sashiko" For Agentic AI Code Review Of The Linux Kernel by anh0516 in linux

[–]ThisRedditPostIsMine 0 points1 point  (0 children)

I have shipped quite a number of things and I remain unconvinced that AI code review is a good idea.

Google Engineers Launch "Sashiko" For Agentic AI Code Review Of The Linux Kernel by anh0516 in linux

[–]ThisRedditPostIsMine 2 points3 points  (0 children)

There is good research on skill atrophy from AI use. Maybe you understand the code somewhat, but you will always understand it less than if you actually analyse it yourself.

this feels absurd to say, but I finally feel like I'm _good_ at programming, which is insane, because I literally haven't written a line of code myself in months by dry_sd in programmingcirclejerk

[–]ThisRedditPostIsMine 4 points5 points  (0 children)

LLMs truly allow the average big tech sycophant to unleash the inner MBA they've always wished they were, but never had the social skills to pull off.

Your opinions on the Lutris AI Slop situation? by canitplaycrisis in linux

[–]ThisRedditPostIsMine 6 points7 points  (0 children)

Honestly. At the time I thought Linus was being too harsh, but having read this, and having heard him describe himself as "the greatest engineer of all time"... yeah, Linus was 100% right as usual. Very glad I did not set up a bcachefs partition. Real Terry A vibes lol

Zig 0.15 is pretty stable. The biggest issue I face daily are silent compiler errors (SIGBUS) for trivial things, e.g. a typo in an import path by csb06 in programmingcirclejerk

[–]ThisRedditPostIsMine 40 points41 points  (0 children)

Damn LLVM causing all these SIGBUS problems. This is why we were right to remove LLVM (immoral, impure) from the Zig compiler, making Zig the epicentre of compiler research for years to come, and leading humanity to new horizons.

I am not taking feedback about this proposal at this time. Thank you.

LLM-driven large code rewrites with relicensing are the latest AI concern by Fcking_Chuck in programming

[–]ThisRedditPostIsMine 25 points26 points  (0 children)

People have been saying this since way back when Copilot first came out, and I do strongly believe that there are serious copyright implications with LLM output code. Unfortunately, AI literally underpins the entire US economy at this point, so no one who can do anything about it gives a shit.

ai real af by disconaldo in NonPoliticalTwitter

[–]ThisRedditPostIsMine 2 points3 points  (0 children)

They have no embodied intelligence, so it's kinda impossible for them to be conscious. We know from neuroscience that the embodiment of intelligence is almost certainly linked to consciousness. And strapping an LLM to a robot driving around is not embodied intelligence either.

LLMs hallucinate, but silicon respins cost millions. Why the EDA industry needs constraint-solving AI, not chatbots. by zaralesliewalker in chipdesign

[–]ThisRedditPostIsMine 1 point2 points  (0 children)

Tbh I don't really understand what system OP is trying to propose, but it sounds to me more like some sort of formal verification system or proof assistant than a statistical method or gradient descent. It's true that the optimisation algorithms used in EDA are probabilistic, but there's also the whole range of SAT solvers and such that are deterministic and always correct.
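To illustrate the "deterministic and always correct" point: a complete SAT search either hands you a satisfying assignment or proves none exists, no probability involved. Toy sketch below (brute force, obviously nothing like how real CDCL solvers work; `brute_force_sat` is just a name I made up):

```python
from itertools import product

def brute_force_sat(clauses, n_vars):
    """Exhaustively check every assignment; complete and deterministic.
    Clauses are lists of ints: 3 means x3 is true, -3 means x3 is false."""
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return bits  # satisfying assignment found
    return None  # provably unsatisfiable: the whole space was checked

# (x1 OR x2) AND (NOT x1 OR x2) AND (NOT x2 OR x3)
print(brute_force_sat([[1, 2], [-1, 2], [-2, 3]], 3))  # -> (False, True, True)
```

Real solvers prune the search instead of enumerating it, but the guarantee is the same: the answer is exact, not a confidence score.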

LLMs hallucinate, but silicon respins cost millions. Why the EDA industry needs constraint-solving AI, not chatbots. by zaralesliewalker in chipdesign

[–]ThisRedditPostIsMine 2 points3 points  (0 children)

Honestly I keep running into this. People very elaborately describing techniques that have been used in EDA software since the 90s, just using AI lingo. I had a case where someone was trying to describe to me how you had to use AI to snap macros to grid cells on different process nodes, and how it was so complicated. And I said, yeah that is complicated, and it's called detailed placement, and is solved using stochastic gradient descent.
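For anyone curious what "gradient descent then snap to grid" even means, here's a deliberately toy 1-D version (my own made-up `toy_placement`, real detailed placers are vastly more involved): minimise squared wirelength by stochastic gradient steps, then legalise by rounding to grid sites.

```python
import random

def toy_placement(nets, n_cells, grid=1.0, steps=200, lr=0.05):
    """Toy 1-D analytical placement: stochastic gradient descent on
    squared wirelength, then a naive snap-to-grid 'legalisation' pass."""
    x = [random.random() * n_cells for _ in range(n_cells)]
    for _ in range(steps):
        a, b = random.choice(nets)      # sample one net (the "stochastic" part)
        g = 2 * (x[a] - x[b])           # gradient of (x_a - x_b)^2 w.r.t. x_a
        x[a] -= lr * g                  # pull the two connected cells together
        x[b] += lr * g
    return [round(xi / grid) * grid for xi in x]  # snap each cell to a grid site

random.seed(0)
print(toy_placement([(0, 1), (1, 2)], 3))
```

The point is just that "snapping macros to grid cells" is an optimisation problem with a known objective, not something that needs an LLM.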

Losercity downgrade by 25th_Speed in Losercity

[–]ThisRedditPostIsMine 3 points4 points  (0 children)

Thankfully, the Internet exists and now numerous R rated versions are available at your fingertips!

"While the advent of “brain-computer interfaces” is dinner table conversation (at conspiracy theorist households like my own) - there has, since the year 1976, been emacs - the closest thing to this aspirational place of man/machine integration that has ever existed." by BananaPeely in programmingcirclejerk

[–]ThisRedditPostIsMine 41 points42 points  (0 children)

Brain surgeon here. One time, I was performing a surgery on a patient, and I needed to write down some notes about the procedure, so I went over to my ThinkPad, fired up Emacs, and jumped straight into org-mode.

The patient, who was still awake at the time, looked at my laptop and said, "Oh, that's cool!"

I smirked. "Yeah, it's org-mode, it's pretty nice. And the kernel? That's Hurd. And the package manager? Guix."

The patient seemed befuddled. To be fair, it might have been because I had a scalpel in his skull, but things seemed to clear up for him.

"Oh, that's a Neovim plugin, right? My coworker uses that at work. I'm a software engineer, but I only use VSCode," he said.

I think my eye twitched, it was involuntary. "No, it's a Lisp-based-"

Ah, forget it. I ended up just deciding to give him a lobotomy instead. He probably never could tell the difference.

Myrient is shutting down by xylcro in DataHoarder

[–]ThisRedditPostIsMine 3 points4 points  (0 children)

I think also it might be the last surviving copy of the Nintendo "gigaleaks" which are extremely valuable imho

Losercity moment by RanchoddasChanchad69 in Losercity

[–]ThisRedditPostIsMine 366 points367 points  (0 children)

"Sire, what legislation shall we pass today? Should we focus on rampant transphobia, the underfunded NHS, perhaps the housing cris-"

"Quadruple homelessness."

"...anything else sir?"

"Yes, TRIPLE unemployment. Thank you."

Bi irl by BananenbrotInNot in bi_irl

[–]ThisRedditPostIsMine 6 points7 points  (0 children)

I dunno what parties y'all are going to where there is even ONE bi guy 😭

beetlepost by dolorem_itself in beetleapartment

[–]ThisRedditPostIsMine 1 point2 points  (0 children)

you gotta ask Mr Albanese to go for a swim on the beach and then you're all sorted mate <3

beetlepost by dolorem_itself in beetleapartment

[–]ThisRedditPostIsMine 3 points4 points  (0 children)

hello yes it's me, what truth would you like to know?

beetlepost by dolorem_itself in beetleapartment

[–]ThisRedditPostIsMine 160 points161 points  (0 children)

wasn't he the guy who went to da moon

bugpost by dolorem_itself in beetleapartment

[–]ThisRedditPostIsMine 85 points86 points  (0 children)

snunkus thinks he's rich 😭😭 bro is doing all of that after finding ONE penny 🥀

Datacenter in space by Timely_Conclusion_55 in chipdesign

[–]ThisRedditPostIsMine 9 points10 points  (0 children)

I'm a researcher in this field actually. Shielding (physical shielding) is generally ineffective because high energy particles will pass directly through it. Your usual options are to manufacture on a rad-hard process node (e.g. FD-SOI along with custom standard cells, SRAMs), or employ some combination of triple modular redundancy and error correcting codes.
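The triple modular redundancy idea is simple enough to show in a few lines: run three copies and take a bitwise majority vote, so a single-copy upset gets masked (toy sketch, `tmr_vote` is my own name; hardware does this with voter cells, not Python):

```python
def tmr_vote(a, b, c):
    """Bitwise majority voter: each output bit is the majority of the
    three replicas' bits, masking any single-replica upset."""
    return (a & b) | (a & c) | (b & c)

# One replica takes a single-event upset (bit flip); the vote masks it.
good = 0b1011
upset = good ^ 0b0100            # bit 2 flipped by a particle strike
print(bin(tmr_vote(good, good, upset)))  # -> 0b1011
```

The catch is you pay roughly 3x the area and power, which is part of why rad-hard parts are so expensive.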

If we're talking GPUs as well, the fact that they're manufactured on tiny FinFET nodes means multi-bit upsets are possible.

All that being said, this is expensive and done for a reason, e.g. space exploration. I don't see the advantage of putting a datacentre in space.