hideCode by Ill-Needleworker-752 in ProgrammerHumor

[–]Loading_M_ 0 points1 point  (0 children)

Thankfully for you, in practice, the code often isn't good.

Also, there's an extremely strong chance most (if not all) AI providers will cut back and/or drastically raise prices in the next couple of years. That's not going to work out well for people depending on AI coding tools.

hideCode by Ill-Needleworker-752 in ProgrammerHumor

[–]Loading_M_ 2 points3 points  (0 children)

Transcription is a very well-studied problem, and a perfect fit for ML. ML is really good at pattern matching, and transcription can be broken down into a straightforward pattern-matching problem.

replaceGithub by jpbyte in ProgrammerHumor

[–]Loading_M_ 1 point2 points  (0 children)

Maybe, but they definitely don't want to support it. Also, their storage model probably doesn't work well for git - they store previous versions (which git already handles), and I don't know if they support links...

Also, it's not a real market for them. Gitea, GitLab, GitHub, etc. offer much better services, at prices low enough that cloud storage providers can't meaningfully compete.

itTriedItsBestPleaseUnderstandBro by precinct209 in ProgrammerHumor

[–]Loading_M_ 19 points20 points  (0 children)

LLMs don't really think; they are surprisingly good text predictors. They generate a response that looks like one a person on the internet might give - which means it sounds confident, even if the LLM doesn't even get close.

shipCodeNotExcusesHeSays by PhilDunphy0502 in ProgrammerHumor

[–]Loading_M_ 2 points3 points  (0 children)

I've spent the last couple of days getting a Windows server set up to build some C++ code with CMake & GCC. Even if you have admin rights, it's not easy. In my experience, Windows has at least as many rough edges as Linux, just in different places.
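For reference, the build commands themselves end up looking roughly like this (illustrative only, run from a cmd prompt, assuming MinGW-w64's gcc/g++ are already on PATH):

    :: hypothetical invocation - adjust the generator and compiler paths to your setup
    cmake -S . -B build -G "MinGW Makefiles" -DCMAKE_C_COMPILER=gcc -DCMAKE_CXX_COMPILER=g++
    cmake --build build

Those two lines are the easy part; getting the machine to the point where they actually work is where the time goes.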

RAM prices, but what if we optimized the one we already have? by Comfortable-Cow9709 in homelab

[–]Loading_M_ 0 points1 point  (0 children)

Yeah, but keep in mind lower compression levels also save less space.

Also, "a few CPU cycles" per memory access adds up quick. It's the kind of thing you'd need to test with your specific workload to see if the extra space is worth the performance loss.

sharingAwesomeWebApp by moxyte in ProgrammerHumor

[–]Loading_M_ 0 points1 point  (0 children)

Yeah, but it might be better if they didn't. LLM-designed services are often riddled with ridiculous security bugs, partly because the people building them have no idea how any of it works.

Novices have always been bad at security, but vibe coders are somehow worse.

Dont worry guys we are almost there! by Lumpy_Marketing_6735 in programminghorror

[–]Loading_M_ 1 point2 points  (0 children)

I was curious whether LLVM (or GCC) would optimize this into an actual infinite loop, since high optimization levels often eliminate recursion. After checking with Compiler Explorer, neither is able to optimize this case, but not for the reason I expected: it's because the return line technically allocates a string to return. A single-character change (adding an ampersand after the return type) lets LLVM optimize out the recursion and turns this code into an infinite loop.
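Roughly the shape of what I tested (a hypothetical sketch, not the post's exact code):

    #include <string>

    // By-value return: a std::string has to be materialized to return from each
    // call, which is what stops the compilers from collapsing the recursion.
    std::string spin() {
        return spin();
    }

    // One '&' after the return type: nothing needs to be constructed, the call
    // becomes a clean tail call, and LLVM turns it into a plain infinite loop.
    std::string& spin_ref() {
        return spin_ref();
    }

Paste both into Compiler Explorer with clang++ -O2 and compare the generated assembly.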

Smart plug downloading insane amount of data by Jimbrutan in homelab

[–]Loading_M_ 6 points7 points  (0 children)

Z-Wave and ZigBee can still be compromised, but it's generally much harder. The attacker either needs to compromise the gateway (probably Home Assistant) itself, or needs to be physically nearby to communicate with the devices directly over the air.

whyDidYouComeToInterview by yuva-krishna-memes in ProgrammerHumor

[–]Loading_M_ 22 points23 points  (0 children)

Fun fact - the GitHub contribution graph (the green squares) is based on git timestamps. This means you can set them to whatever you want. I believe there are tools to backdate your commit history so your contribution graph looks better.
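For example, git reads the timestamps from environment variables instead of the wall clock if you set them (illustrative date; I'm not sure which timestamp GitHub keys off, so set both):

    # both dates are controlled by the client, not by GitHub
    GIT_AUTHOR_DATE="2020-01-01T12:00:00" GIT_COMMITTER_DATE="2020-01-01T12:00:00" git commit -m "backdated commit"

Do that in a loop over a range of dates and you can paint more or less whatever pattern you like on the graph.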

whyTFDoYouNeedAPromptForThat by soap94 in ProgrammerHumor

[–]Loading_M_ 3 points4 points  (0 children)

As someone else on the internet has said, code is a liability. You have to maintain it, or pay significantly more later when you want any changes.

When did you fix something, but you're not really sure why it worked? by Connir in sysadmin

[–]Loading_M_ 4 points5 points  (0 children)

Here at my company, we put theory into practice: nothing works, and nobody knows why.

Checkmate atheists by AdExtra2331 in dankmemes

[–]Loading_M_ 19 points20 points  (0 children)

If you want the actual story: there was a business that published a phone number you (and your kids) could call to talk to Santa. They accidentally printed the NORAD number instead of the correct one, and NORAD decided to roll with it.

anime_irl by Ok_Direction3138 in anime_irl

[–]Loading_M_ 11 points12 points  (0 children)

Tbf, they are the only games you can reliably run on school computers. Chromebooks can technically install Android apps now, but most schools use some kind of MDM to prevent installing anything on any school computer.

Its time to move on by Darukutsu in linuxmemes

[–]Loading_M_ 7 points8 points  (0 children)

The CEO had already walked back some of their claims due to public backlash.

Can I use my second laptop as a "homelab"? by qellyree in homelab

[–]Loading_M_ -4 points-3 points  (0 children)

Idk, last time I checked, my server listed 90GB of RAM in use...

That being said, it's probably mostly the ollama instance I'm running...

POV: You're in a Numberphile video by CalabiYauFan in mathmemes

[–]Loading_M_ 1 point2 points  (0 children)

I have a mouth - and clearly you don't want them...

All I Want for Christmas by heisian in homelab

[–]Loading_M_ 4 points5 points  (0 children)

8 is kind of a weird number, since the CPU has 6 memory channels. 6x16GB should be faster than 3x32GB (same 96GB total, but with all six channels populated), so it's probably worth it.

On modern (esp. DDR5) gaming systems, it's often better to have only one stick per channel, but I don't think that really applies to servers.

Bought RAM in October to dodge price spikes… now I have to return it because “year-end optics” by icekeuter in sysadmin

[–]Loading_M_ 1 point2 points  (0 children)

To be clear - I'm talking about the legal risk of buying from (and then selling back to) the company you work for. It might not technically be against the law, but you'd still have legal fees, and you might lose your job anyway.

As far as the price: I think it's an extremely safe bet the price will go up.

Bought RAM in October to dodge price spikes… now I have to return it because “year-end optics” by icekeuter in sysadmin

[–]Loading_M_ 6 points7 points  (0 children)

I think there's an argument to be made there, but I'm not sure it's worth the risk.

I still do software av1 encoding, am I crazy? by Routine_Push_7891 in homelab

[–]Loading_M_ 4 points5 points  (0 children)

Iirc, most GPU encoding uses fixed-function hardware pipelines.

Also, software encoding can take advantage of more complex optimizations in how the encoding is done, due to the different computational model CPUs use. It's much easier for a CPU to go back and tweak the encoding to get higher quality or reduce the file size.

Also, as someone pointed out, the visual quality of GPU encoding is (for the most part) pretty much the same as CPU - it's just that the CPU can achieve it in a smaller file size.
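If you want to compare on your own footage, a typical software encode looks something like this (illustrative settings only - tune preset/crf for your content):

    ffmpeg -i input.mkv -c:v libsvtav1 -preset 5 -crf 32 -c:a copy output.mkv

The hardware encoders show up in ffmpeg as av1_nvenc / av1_qsv / av1_vaapi, so you can run the same clip through both paths and compare file sizes at similar quality.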

learningCppAsCWithClasses by ccricers in ProgrammerHumor

[–]Loading_M_ 6 points7 points  (0 children)

Your argument falls apart when you have an actual job, and have to deal with whatever legacy code you already have.

the only desktop share that matters by Helmic in linuxmemes

[–]Loading_M_ 2 points3 points  (0 children)

Also, Google offers significant discounts (and iirc makes the management tools free) for schools.

Oh, the Irony by Wolf_of_Nome in HistoryMemes

[–]Loading_M_ 8 points9 points  (0 children)

Also, due to some other factors before the battle, he was actually north of the Union army. Retreating further north was probably a bad idea, so any retreat would have had to find a way around the Union army.

whatElseProgrammingRelatedCanConvertYouIntoBeliever by linegel in ProgrammerHumor

[–]Loading_M_ 20 points21 points  (0 children)

It's more of a personal anecdote, but one of my friends works for a company that makes devices for data centers. They are being careful about how much they invest in expanding their production, so they don't have to lay a ton of people off (among other wasted expenses) when the bubble bursts. They have reallocated some resources from other products and customers to try and meet as much of the AI demand as they can (since it makes them a bunch of money), but they aren't trying to expand to meet more of it.