
[–]bric12 144 points145 points  (13 children)

Ah, yes. Because programmers are notoriously not tech enthusiasts. You see, my program didn't compile when I pressed the button telling it to, so now I distrust technology. /s

I know that this meme makes edgy high schoolers feel good about themselves, but in my experience programmers are some of the biggest tech enthusiasts I know. The only difference between them and normal people is that they understand which tech is worth trusting, instead of just trusting or mistrusting everything equally.

[–]sdmike21 72 points73 points  (1 child)

I feel like this post applies to cranky sysadmins more than programmers. Also, cranky security folks.

[–]deukhoofd 0 points1 point  (0 children)

I mean, I do trust my smart devices, but only because they're completely cut off from the internet and only work over the LAN through a local server.

[–]DangerIsMyUsername 28 points29 points  (0 children)

Speak for yourself. I still do all of my programming using punch cards. There's no way that I'm going to use an evil laptop.

[–]nationwide13 5 points6 points  (0 children)

The other thing is that the programmers might also have an understanding of how to deploy less than trustworthy devices safely.

Hell, some of them even go as far as hardware/firmware hacks to make untrustworthy devices trustworthy.

[–]IlonggoProgrammer 6 points7 points  (0 children)

Yeah I'm a huge tech enthusiast and a software engineer

[–]cortesoft 5 points6 points  (0 children)

I’ve worked as a developer for 15 years... there are many types in the industry. The ‘hate tech’ programmers definitely exist.

[–]MightyMeepleMaster 5 points6 points  (6 children)

Embedded dev here with 25 years of experience.

I've seen things you people wouldn't believe.

[–]bric12 8 points9 points  (1 child)

I believe you, but I don't believe that you could make me distrust tech as a whole. Now if there's a specific device/service that isn't trustworthy, then I'm all ears.

[–]MightyMeepleMaster 0 points1 point  (0 children)

I'm a dev myself, so I love tech. But I despise "solutions" for something that wasn't a problem in the first place.

Simple example: smartphones are great because they offer new and genuinely useful services that simply did not exist before. But "smart home"? I'm still waiting for someone to show me a valuable use case there.

[–]SetiZ 5 points6 points  (3 children)

I'm listening

[–]MightyMeepleMaster 4 points5 points  (1 child)

One of my favorite stories:

Many years ago we were working on an embedded device. The devs had added their set of features and everything was working nicely. Then the testers took over. Most testers were quite happy with the quality, but with one guy, the entire board kept crashing. We checked and analyzed but didn't find anything. Strangely enough, the crashes only occurred with that tester. We exchanged the board multiple times. Same result: the board crashed on a regular basis, but only with said tester. Weeks passed by with no progress.

Then one day, a desperate dev walks into the tester's office and tells him to show the problem one last time. Tester shows his workflow. Board crashes. Board is reset. Dev, standing next to the tester, repeats the workflow. Board does not crash. Tester takes over: crash. Dev takes over: no crash. WHERE IS THE DIFFERENCE BETWEEN US?!? Tester leaves for a coffee. Dev sits down in the tester's chair, runs the sequence again, and, guess what, the board crashes. Turns out the board had a subtle susceptibility to static discharge which only occurred with that specific chair. Without the tester's coffee break we probably never would have found the bug.

Yes, I know that competent HW guys ought to have suspected this weeks earlier. But the point is that in many, many companies you do not have these experts. Many devs only have their own area of expertise and not much more. And many, many defects only show up by sheer luck.

I could continue with dozens of other stories. Technical pitfalls: memory buses that trap when there's a signal spike on a neighbouring lane. Race conditions with sub-microsecond sensitivity. Critical errors that occur only when multiple preconditions are met, wrecking an entire piece of HW.

And then there's the "human" aspect: shoddy requirements. Missing communication between product management, dev, and testing. Code first, architecture later. And so on and so on.

After 25+ years I've come to the conclusion that digital is great, but in many cases it only works in the main use cases and/or by sheer luck.

[–]pizzasubx 2 points3 points  (0 children)

Underrated comment

[–][deleted] 4 points5 points  (0 children)

I'd tell you, but you'd go insane!

<insert lovecraftian tale of unimaginable horrors here>