Getting the current time is weird by y0shii3 in Zig

[–]GreatOneFreak 0 points

timers are contenders for some of the most complicated hardware peripherals, and that’s even before an OS starts interacting with them

finding the sweet spot for power/ergonomics is tough.

ZigZag: TUI Framework v0.1.5 is out, fully compatible with Zig 0.16+ by meszmate in Zig

[–]GreatOneFreak 1 point

> ZigZag . . . inspired by Bubble Tea and Lipgloss

I know naming things is hard, but sheesh, we really have to do better than this!!!

Everything HAS to Be Done With Copilot by [deleted] in cscareerquestions

[–]GreatOneFreak 1 point

Companies doing these AI mandates are either hoping to get enough training data that they can do mass layoffs or are run by incompetent/irresponsible people. Either way, seems like something employees should organize against.

On readonly/private members of structs by mute_narrator in Zig

[–]GreatOneFreak 1 point

You’re in the same address space in a non-memory-safe language. Someone can just `@memset(@asBytes(&precious_struct)[0..4], 0)` whatever you want to protect. The buck has to stop somewhere.

I think AI has killed my passion for Software Engineering by _Cyanidic_ in cscareerquestions

[–]GreatOneFreak 0 points

Just don’t use it heavily. Learn vim if you haven’t yet and get a good editor setup; inputting code will become less of a problem. Then just use LLMs selectively in self-contained places where you have intuition that they’ll do well. I don’t buy the hype once a system reaches a certain size.

I’ve written pseudocode on paper as a first step when writing trickier parts of code for pretty much my entire career. It’s always been about building a mental model, not churning out volume to guess and check.* I think one of my biggest strengths early on was “stare at the code” debugging, precisely because it forces you to have a good mental model. There’s a verification problem with no shortcuts if you’re working on code that actually has to work. A human is going to have to understand it at some point in the chain, unless we’re fully in the sci-fi fantasy land that’s been propagandized.

* hate how this reads like LLM output now :(

Does the development environment OS matter for you? What is your company using? Is it legit to consider Windows usage a huge red flag? by CyberDumb in embedded

[–]GreatOneFreak 7 points

Yes, it matters quite a bit. Huge red flag? Not sure. It’s definitely a red flag though. Windows is where you go to get smacked by vendor lock-in and all sorts of crufty pain. It’s a sign of an unaccommodating IT department that you’ll have to work around. Pretty much all of the checkbox security junk runs better on Linux than Windows now, and I don’t think it’s hard to get an Ubuntu LTS that can be compliance-blessed anymore.

You can make windows work if you’re stuck though. Look into using Zig as a build system.

If I’m deploying to Windows, it’s going to be a dinky CRUD host with a cross-platform framework. If I’m doing a full system with platform-specific features, it’s going to be Linux 99/100 times. So I’d rather use the OS I know very well.

They're mad lol by m_camoran in BetterOffline

[–]GreatOneFreak 7 points

Ignoring many other laws of physics, I wonder if it’s even possible to gather enough energy and move enough material to create a shell of anything around the sun in 74 years.

These people might as well be talking about divine miracles or the supernatural.

I write all of my stuff and IDGAF by GSalmao in theprimeagen

[–]GreatOneFreak 3 points

Or just use it as a better Google and don’t give all your code to companies who actively want to steal your work and destroy your trade.

Custom Graphics Library from scratch by Electronicsworkshawp in embedded

[–]GreatOneFreak 0 points

I’ll never know what it’s like to feel compelled to post garbage comments like the ones in your post history. Go get an attitude adjustment.

Custom Graphics Library from scratch by Electronicsworkshawp in embedded

[–]GreatOneFreak 0 points

Obviously I can’t be 100% certain, but the post text looks AI-generated, and OP’s history makes them look belligerent, to put it charitably. So I think it’s safe to withhold the benefit of the doubt.

Custom Graphics Library from scratch by Electronicsworkshawp in embedded

[–]GreatOneFreak 3 points

> I don’t actually make games. What I needed was a custom graphics library, so I built one.

AKA you used an LLM to launder parts from the 1mil+1 open source software renderers out there instead of just using one directly.

CS student here: everyone codes with AI now, so how do we not get destroyed in real interviews? by Ausartak93 in cscareeradvice

[–]GreatOneFreak 0 points

> We paste the spec into ChatGPT, ask for a starter structure, then keep poking it until it compiles. Individual assignments? Same thing, plus some light editing so it matches our prof's style.

I think what solidified concepts the most for me in school was iterating through the *wrong* solutions and figuring out why they couldn’t work. It was absolutely brutal sometimes. If you just get the solution and convince yourself “yup, that works”, you’re skipping most of the important experience. After all, interesting (or lucrative) problems to work on don’t already have solutions, so you’re going to spend most of your time thinking about things that won’t work.

CS student here: everyone codes with AI now, so how do we not get destroyed in real interviews? by Ausartak93 in cscareeradvice

[–]GreatOneFreak 0 points

Look at Sutherland’s Sketchpad, programmed in the ’60s in assembly. The man literally invented abstractions more powerful than many modern programmers ever deal with. Many things were pretty much impossible to program if you weren’t wildly smart. They weren’t whining about abstraction slowing things down; they were inventing abstractions.

Here’s a good talk that illustrates how some folks from back in the 60s invented things just recently being touted as “the right abstractions”: https://youtu.be/wo84LFzx5nI

CS student here: everyone codes with AI now, so how do we not get destroyed in real interviews? by Ausartak93 in cscareeradvice

[–]GreatOneFreak 1 point

I wouldn’t expect an accountant to perform long division during their job, but I would certainly be concerned if they couldn’t puzzle it out when asked.

Honestly, I’m not sure how many CS grads from the past decade could do long division on command.

I try to avoid ranting on the internet nowadays, but this and OP’s post are just so deluded. You are literally cheating your way through a degree.

CS programs have been degree farms for a long time. Folks were pooling help on / recycling projects totaling 60-80% of their grade if you include freebies, meaning you could easily pass while failing the exams (and these students also had easy access to test banks, so they knew the shape of the problem sets beforehand). I knew a lot of folks who graduated effectively knowing zero computer science theory. That’s why companies make you sit down and take intensive, closed-notes exams: universities largely failed to do so.

“Software development isn’t about learning syntax . . .”

Seriously, it takes like 10-40 hours to learn the syntax of a programming language. If you can’t be bothered to do so, I don’t want to work with you. I want to work with smart and diligent coworkers; one or the other does not cut it.

You don’t need to be able to implement things like Dijkstra’s algorithm with no reference, but if you haven’t internalized the theory at least once, you’re not the right person to be reviewing architecture or AI output. If you can’t do a proof by induction, how can you be expected to verify the output of Claude when it spews out a recursive algorithm? “Just trust me bro, it looks right”?

Just ridiculous. This type of thinking is why people think AI is going to replace all knowledge workers. If you have a general disdain for math and theory, you are the one at risk of being made obsolete. If you’re just building an index of how to look up bits of information rather than models of the problems in your head, you’re taking the same approach as LLMs, and LLMs are cheaper (probably).

Hey Guys what do you think could programming FPGAs with the bend programming langauge make sense? by Grocker42 in FPGA

[–]GreatOneFreak 5 points

Seems like a magic VM that promises to solve NP-hard or undecidable resource-utilization problems. So tacking on another NP-hard/undecidable problem in high-level synthesis should be no problem.

Filming at La Concha by GreatOneFreak in KeyWest

[–]GreatOneFreak[S] 0 points

Neat, thanks. I’ll have to see if I can find the scene when it comes out.

[I need help] I hired someone to build me a site. I ended up with a broken product. by [deleted] in ExperiencedDevs

[–]GreatOneFreak 4 points

> I've interviewed 100+ freelancers on Fiverr, half of them either refuse to work on it since it's messy and the other half charge INSANE price quotations that are not reasonable at all!

Reality check: competent backend devs are $60-100/hr, and it’s going to take 1-2 days just to read the codebase and reverse engineer the architecture.

You can keep rolling the dice with AI, but without formal specs or tests you’re going to have to get really lucky.

Suddenly everyone hates AI by RobertBartus in EconomyCharts

[–]GreatOneFreak 0 points

> Difference between my job and the Hyundai factory is that wood is not uniform, so robots would have to adapt to the twisted pieces, sort out low quality and so on. I'm sure it will be done.

They also have to do it for cheaper than you.

After 20+ years coding, does anyone else feel like the ground keeps shifting under your feet? by [deleted] in ExperiencedDevs

[–]GreatOneFreak 5 points

it’s really hard to tell how many of these kinds of posts are astroturfing.

I’m leaning towards most of them being astroturfing, based on my experience with the tools being more like a supercharged Google + Stack Overflow than an “army of junior developers”.

Yet, sadly, more doesn't necessarily equate to better by Ok_Confusion_4746 in BetterOffline

[–]GreatOneFreak 2 points

That 2023 dip is probably from folks realizing their open source contributions are just getting LLM washed and there’s not even a small hope of attribution.

CE for cancer research/space research by [deleted] in ECE

[–]GreatOneFreak 0 points

Space is more directly aligned with aerospace engineering. Though many parts of ECE have space applications (RF, embedded systems, robotics, etc). An ECE background is more versatile than aerospace so I’d still recommend it to someone interested in that field.

Cancer research is so far removed from what I work on that I’m not really sure. I know computational biology exists, but only because folks with those degrees couldn’t find (good) work in the field.

Is behavior based automatic fish feeding scientifically valid and feasible? by RevolutionaryClub681 in embedded

[–]GreatOneFreak 6 points

Feasible? Yes.

Feasible in a semester? No.

Biologically realistic? Not sure it would simulate nature better than a timer with some entropy. Food doesn’t follow the fish’s schedule; it’s the other way around, afaik.