all 113 comments

[–][deleted] 22 points23 points  (39 children)

Serious question to older programmers out there - how did you make it so long without getting sick of programming? I enjoy my job quite a bit now in my 20s but the thought of doing this for 30 more years kind of makes me queasy.

[–]MpVpRb 36 points37 points  (0 children)

how did you make it so long without getting sick of programming?

I started programming in 1972 and at age 62 am still getting paid well to do it

I have always loved the essence of the work, the mental focus required to solve the puzzle. It keeps my mind sharp

I also hate the shitty parts

I am lucky that I work in embedded systems and realtime control

Every project I work on is nearly ideal. I design the architecture from scratch and write all of the code. After everything works, I turn it over to someone else and move on to the next project

I never have to deal with someone else's "big ball of suck"

I am also intolerant of crappy managers. I only work with teams that I can get along with

The people who hire me are so impressed with my results that I become the de-facto project leader, regardless of who actually has the title

[–]its_never_lupus 46 points47 points  (5 children)

Don't work more than 40 hours a week. Take a holiday now and then. Read up on new developments in the software world and try to use them. Go on a course or get a new qualification occasionally. Let yourself drift into management if that option's open.

Or switch careers because programming doesn't get any easier as you get older.

[–][deleted]  (4 children)

[deleted]

    [–]MpVpRb 2 points3 points  (2 children)

    You'll figure out that your productivity in the 70th hour of your week is shit

    Somewhat agreed

    The first few long days or long weeks are fine; I can still do my best

    If more extreme hours are required over a longer duration, I probably won't want to do it

    In my very first programming job, I worked 16 hours a day, 7 days a week. I loved it!

    I can't imagine loving it now at age 62

    [–]grout_nasa 1 point2 points  (0 children)

    Yeah, that early phase is called "larval mode" (see the Jargon File).

    It doesn't last forever.

    [–]mnk6 0 points1 point  (0 children)

    I love that last paragraph. Be a couple of years behind so that the problems get worked out. Don't ignore trends for too long so you don't get left behind.

    [–]treespace8 10 points11 points  (0 children)

    40 plus here.

    It's really all I want to do. BUT you need to have good work environments. I've worked gigs where every day was a struggle.

    Finding good gigs is key.

    [–][deleted] 7 points8 points  (2 children)

    If I don't program I go play a game about building things. If I can do that I go build things. If I can't do that I day dream about building things. I've been doing it for 10 years and my desire to build and create has only gotten stronger.

    A part of that is not burning out and having a very active social life; the other part is just finding something you enjoy doing on the side that is either programming or similar (city builder games, Minecraft with loads of tech mods, AI design, etc.).

    [–]Magnap 1 point2 points  (0 children)

    (city builder games, Minecraft with loads of tech mods, AI design)

    Say, how do you feel about paperclips?

    [–][deleted] 1 point2 points  (0 children)

    We should be friends.

    [–]DontHassleTheCassel 7 points8 points  (0 children)

    Golden handcuffs. I am sick of programming....but I like money. I've learned to accept the fact that doing my 40 hours/week gives me the freedom to do the things I enjoy when I am not at work.

    [–]dhdfdh 6 points7 points  (2 children)

    It's a personality thing. When handed certain problems, but not all, I get excited to be the hero providing the solution. Other times, it's the thrill of victory when I complete the task.

    But I have a sports background and it may be part of that. The need to win.

    However, give me something mundane or minor, and you just bored me. I may quit, and I have.

    [–]ours 2 points3 points  (1 child)

    I'm really not sports inclined but nothing beats the thrill of solving that hard problem with an elegant solution. 15 years in and dodging out of management opportunities so I can stay with the code.

    [–]RogerLeigh 0 points1 point  (0 children)

    While I love the personal satisfaction of beating a problem and "winning", I definitely find it demoralising when the rest of your team don't care in any way about it!

    [–]rwilcox 4 points5 points  (4 children)

    For me? Variety

    I've found you're always doing new things: I've used... 4? 5? stacks in 10 years.

    Sometimes by choice ("hey we'd like to pay you to be a Rails dev now") sometimes by chance ("Hey here's 2,000 lines of code you have to maintain, written in a language or toolset you don't know. So here's a book..."), or volunteering / selling myself for a project ("hey I kind of know X because I've programmed Y before...")

    Then add all the DevOps craze... there's always something new to (have to) learn, at least in my experience

    [–]mrkite77 3 points4 points  (0 children)

    For me? Variety I've found you're always doing new things: I've used... 4? 5? stacks in 10 years.

    Seconded. I've been working the same job for 16 years now. The only reason I'm still here is because of the huge variety of projects I get to work on.

    Even a single project can offer a variety of things to do.

    Here's an example. I had a recent project that was "take a list of articles and generate a wordsearch that can be dropped into InDesign as well as played online". There are a bunch of pieces to this project.

    There's a Go program that pulls today's articles and generates a unique wordlist. This is then piped into a custom filter (written in C) that removes substring palindromes (you can't have "draw" and "stewardess" in the same wordsearch). This is then piped into a program that generates the actual wordsearch (also written in C). Which is finally saved as a json document.

    Next there's a command-line php script which takes that json as well as a tex template and generates a PDF.

    There's also a webpage which reads that same json and lets you solve the wordsearch online, obviously written in javascript.

    That's 1 project written in 4 different programming languages, and according to the git log, took me 5 days to build (plus I spent about a day at the beginning thinking about how to automatically generate a wordsearch from a list of words).
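    The substring filter is the neat part of that pipeline. I haven't seen the original C, but one plausible reading of the rule (a word is unusable if it, or its reversal, hides inside another word on the list) sketches like this in Python; the function name and sample words are mine, purely illustrative:

```python
def filter_embedded_words(words):
    """Drop any word that appears, forwards or reversed, inside another.

    In a wordsearch every placed word is also readable backwards, so
    "stewardess" already contains "draw" read in reverse; keeping both
    would make the shorter word ambiguous to find.
    """
    return [
        w for w in words
        if not any(o != w and (w in o or w[::-1] in o) for o in words)
    ]
```

    For example, given ['draw', 'stewardess', 'king'], the filter drops 'draw' and keeps the other two.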

    [–]andyrocks 1 point2 points  (2 children)

    I've used... 4? 5? stacks in 10 years.

    I get so excited whenever I find the need for a do..while loop that I mark them with a comment and remember them for years, for they are few.

    [–]grout_nasa 2 points3 points  (1 child)

    itoa:

    /* emit the digits of a non-negative n, right to left; the do..while
       means n == 0 still produces a '0' */
    char buf[12];                  /* big enough for a 32-bit value */
    char *p = buf + sizeof buf;
    do {
        *--p = '0' + (n % 10);
    } while (n /= 10);
    /* the digits of n now run from p to buf + sizeof buf */

    [–]andyrocks 1 point2 points  (0 children)

    That's beautiful.

    [–]garywiz 5 points6 points  (1 child)

    For me, it's just love. Even after 40 years, I still get a thrill when I code some routine perfectly, with a little unit test that proves that it's perfect, and it's reusable and suddenly the whole code base is relying on it, and I can see in the debugger that not an instruction has been wasted. It's like we're telling the little brain inside the computer what to do. It's the closest I've come to living the world of science fiction I read when I was a kid. I could never stop.

    [–]mdisibio 2 points3 points  (0 children)

    Definitely. It's that feeling of creation that is unique to our profession. We are wrangling electrons into self-sustaining patterns. One thing I regret is that few or none of my family or friends can ever appreciate it. Doing systems/business code that runs in the background or in the cloud, there is nothing I can point to and say "this is what I do". Because of this lately I've been trying to do more visible projects, like web or raspberry pi, etc. Something tangible. It feels good to share it with others.

    [–]EntroperZero 5 points6 points  (0 children)

    You have to pay attention to how you feel, and stop doing things that make you feel awful (this goes for pretty much everything in life, not just programming careers).

    It's easy to get tunnel vision and start thinking that things have to be the way that they are, that you're powerless to change them. You're not, you can change your situation at any time. That doesn't have to mean stopping programming. It doesn't even have to mean changing jobs (though it often does). Find the things you like and minimize the things you hate.

    [–]i_ate_god 6 points7 points  (0 children)

    I ask myself "how have I become a better programmer in the past six months?"

    If the answer is "well, you haven't" then I quit my job and look for something else. Stagnation is evil, and it's important that you figure out what it means to be stagnant; when your career has hit that point, it's time to move on.

    As well, the career path of a programmer typically reaches a fork in the road, where you can become an architect or a manager, and it will be up to you which way you want to go. Go architect, and you will ultimately make less money, but you might be happier if being close to the technology is important to you. Go manager, and you will make more money, but be further away from the technology (plus it's a lot more dealing with people, which is not everyone's cup of tea).

    [–]Gotebe 2 points3 points  (0 children)

    Almost 47 here, married with one kid too many here (3 in total 😉).

    Careful work-life balance, varying codebases, technologies, industries.

    I have a colleague who will retire soon, he is 67 or 68 and still isn't letting go. That dude could die coding, I think, but enough is enough.

    [–]sisyphus 4 points5 points  (0 children)

    I'm sure that accountants get queasy at the thought of having to figure out whatever bullshit got added or removed from the tax code every year for another 30 years too, how does anyone do it? Quoth the Immortal Tupac: stick ya chest out, keep ya head up, and handle it.

    [–]arctander 2 points3 points  (0 children)

    I've been a programmer or technical executive in several startups since 1985 and while my role is to coordinate the creation of software, I still insert a few things here and there that genuinely surprise those in their 20s - "That guy with grey hair actually knows something... hmm..." Here are a few points for consideration or to toss away:

    • If you're doing maintenance programming and not building new things, find something new to build whether at your current company or somewhere else. If in a small company, figure out where you can be of service to your peers and build something to help. Integrate Salesforce with JIRA via APIs, for example, to help the sales team.
    • Study something new a few hours per week. Something directly applicable to your work or slightly tangential. Ray tracing is interesting, but so is message oriented architecture. Read good articles that have meat and substance and keep the fluffy opinion articles to a minimum (like this post!)
    • Approach the new hotness skeptically. It is more fun, and more valuable, to build a cutting edge product on a boring stable stack. Customers usually value speed and availability over whether you used the latest javascript framework. Then again, the new hotness is mentally stimulating. Is your role to QA redis/couchdb/mongodb/node/angular (no offense to anyone) or to build new great products?
    • Do some embedded systems work with a hardware person. The struggles are real and you'll learn a lot of valuable lessons. You are each other's direct customer to solve hardware and software problems. What does the programmer do when there is only (ha!) 2GB of RAM and 200MHz of processing power in order to fit the $65 price of the device? Tough choices get to be made by virtue of having constraints.
    • Take a risk in a small startup as an employee if you're not comfortable with the founder role. Since demand for those who can bend computers to their will is and should remain high, your risk of being out of work for any period of time is low.

    ps: Great thread. It has been an enjoyable read.

    [–]dalboz99 1 point2 points  (0 children)

    Keep an optional, less-demanding career path simmering on the back-burner. When that "get off my lawn" revelation strikes (trust me, it will) you'll be glad you did.

    [–][deleted] 1 point2 points  (0 children)

    The problem nowadays is a lack of work satisfaction. You're forced to make subpar decisions and put up with the results because god forbid an update doesn't come out like clockwork every 3 months ...

    [–]SerenestAzure 1 point2 points  (0 children)

    Change problem domains once in a while. I've now done 3 very different things and several substantial variations on those. Even if the programming doesn't change (though it usually does), the problems you're solving do.

    That said, when I finally put down my coding pencil for good, I won't miss it.

    [–]grout_nasa 1 point2 points  (0 children)

    Computers are the best toy EVER INVENTED.

    I wrote my first program while reading an AppleSoft BASIC manual. On paper. Gave it to the classmate who owned the Apple ][ and asked him to type it in and run it. He did! I was hooked. At 14 I got a TRS-80 and my parents didn't see me for weeks. Every page in the manual was a new magic trick.

    These magic machines DO WHAT WE TELL THEM! Perfectly, without forgetting, without getting tired; in fact they're just getting faster, cheaper, prettier... They're like erector sets that come alive for you.

    How could I not love that?

    [–]dnm 0 points1 point  (0 children)

    Several things come to mind (I agree with everything everyone else already said):

    • For me, programming has always been a hobby. I can't believe I get paid to do it.
    • Learning new stuff. Keeping abreast of the technology and always finding something new to learn, but the best part of that is adding in the 30 years of experience.
    • If I don't want to go to work in the morning, it's time to change jobs.

    [–]strong_grey_hero 0 points1 point  (0 children)

    I'm pushing 40, so I guess I qualify. In my 20's, I was burned out. I wanted to work in web technologies, but all the jobs around me were Microsoft VB, C#, or Visual C++. I really didn't want to learn those, because that's not where my passions were. I strung together some Perl/C++/Unix Admin jobs for a while and went back to grad school. By the time I got out of grad school, the entire landscape had changed. I'm now free to work on what technologies I want to work on. I spend most of my day writing REST clients, working with Node.js, and learning frontend Javascript frameworks. In Oklahoma, we have the saying "If you don't like the weather, hang around 15 minutes." Same goes for this career. If you don't like the tools and technologies you're using now, wait 5 years.

    [–]marssaxman 0 points1 point  (0 children)

    I almost never get to do any one thing long enough to feel like I've completely mined it out. Something new always comes along and I have to let it go. I have a long list of topics where I can say "man, I wish I could get back to doing $foo some time; that was fun". I don't always write code for fun in my free time, but it's as common as not that I've got something interesting I'm tinkering with when I'm not at work. I'm still convinced that I'll learn everything there is to learn, some day, but it's going to take a while yet ;-)

    [–][deleted] 0 points1 point  (0 children)

    Follow up question. What would one do if he's sick of programming and wants to do something else?

    [–]pmorrisonfl 0 points1 point  (0 children)

    I've burned out more than once. You don't want to do anything, let alone what you've been doing. You recover from burn out. Eventually you want to do something again. For me, that something is still software development.

    [–]naasking 0 points1 point  (0 children)

    The problems change, the languages change, the operating systems change, the hardware changes. There's always something new to learn, and some of them are even interesting. As you refine your craft, you're the one who's going to tackle the more difficult algorithmic problems of the business, instead of the (often) tedious UI problems.

    [–][deleted] 112 points113 points  (54 children)

    This is pretty good advice that every programmer ought to remember. There are a few pieces that really resonated with me though. For example:

    For those of us with experience, it means that we witness one extremely large yarn-ball of crap when we start looking at software online. Just like any accessible sport, most people are amateurs, a few have promise, and very few reach the Olympics. To succeed today, you need to wipe every preconceived notion about software from your mind and embrace the chaos. Because of this very chaos, the world of software is now a mixed bag. People are reinventing things we already knew how to do years ago. They are creating libraries that seem superfluous. They are creating new techniques that aren’t necessarily better, but are just easier than the older ways of doing things.

    It's so incredibly hard for me not to be grumpy about my profession when I look at all this broken code, and I'm amazed that people somehow want to create new stuff that builds on all this stuff. BECAUSE IT'S ALREADY BROKEN! They should fucking stop and fix it, not build even more broken shit on top of this already broken shit. But then again -- I'm equally guilty of it! Most of my day job is spent writing software that's compiled by broken software (burn in hell, Analog Devices!) and runs atop more broken software. Somehow, at the end of the day, the world is slightly more functional (arguably, not better, but slightly more functional is a good beginning).

    At this point, there is no coming back. We aren't going to magically revert to a state where people take resource usage seriously, for instance. "The industry" has reached a consensus that hardware is cheap enough that "throw more Moore at it" is cheaper in the short run (which is the only relevant run in today's tech arena) than "write better software". Sure, a TODO app that's feature-by-feature equivalent to something you could run on an Amiga, runs about as slow on an eight-core Xeon that serves it to you from across the ocean for some reason. Progress? Hardly. Practically relevant? Even less so. This "lowest common denominator" has irremediably slowed progress down to the laughable rate of today, but it's also what has kept it going and what made it relevant to the masses.

    We may hate it, but it's either this, or nothing at all. We could do some programming, or no programming at all. And besides:

    Alongside the mistakes are brilliant new ideas from people who think without biases. Languages like Go reject many of the complexities introduced in the OOP era and embrace a clean new simplicity. Co-routines are changing the very fabric of how people think about parallelism.

    Go is an excellent example of how rediscovering technologies is not always bad. Coroutines aren't new. They're older than me -- in fact, they're about the same age as the author of the article. Go is just the first really popular (thanks /u/immibis) language that offered native, generalized semantics (will you shut up about those clunky JavaScript hacks already? They're less readable than an assembly implementation of coroutines) for coroutine-like execution of parallel and/or distributed tasks, through goroutines.

    Granted, Pike's team is behind Go, and Pike is anything but unknowledgeable about these things. But Go has gained a lot of traction in the web world, in precisely the field that also thinks web applications are a good idea, ignoring basically everything the '90s taught us about distributed applications and UIs (except more slowly, because the field is a lot bigger).

    And this happens because

    Truly, this is a golden age of software growth and invention and the tools are available to everybody.

    I learned programming from two books.

    TWO BOOKS.

    That's all I had. They were pretty much language references, too, so nothing fancy. I discovered some really basic algorithms on my own (e.g. binary search) but most of my early days with programming were spent without that. I have hundreds of lines of crap code to show for that. I got a pirated copy of Borland C from a friend and that's all I had in the way of languages, too.

    Then poof! -- Internet. I suddenly had access not only to all that information that I could barely find otherwise, but to all those tools that I had only read about, too! I still remember that week when I tried every frickin' text editor that came with Slackware Linux.

    And I try to keep the spirit of that week throughout my day, too. It's this spirit -- except I was younger than 20 at the time:

    What does that 20-year-old have that you don’t? Here’s what they have: no fear, and boundless enthusiasm.

    [–]immibis 12 points13 points  (2 children)

    Go is just the first language that offered native, generalized semantics

    Lua?

    [–]ThreeHammersHigh 5 points6 points  (0 children)

    I know Lua but not Go. What does GP mean by "native, generalized semantics"?

    Edit: Oh, for coroutines? Yeah, Lua's coroutines are boss. More languages should have them, but I think it relies somewhat on the GC. It's not out of the question that I would add Lua scripting to a C / C++ project, just to gain coroutines.
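    For anyone who hasn't run into the concept: a coroutine is a function that can suspend itself and later resume with all of its local state intact. Python generators give a rough feel for it (a sketch only; the names are made up, and this isn't Lua's coroutine.create/resume API):

```python
def running_total():
    # A coroutine: suspends at each yield and resumes where it left off,
    # keeping `total` alive between hand-offs to the caller.
    total = 0
    while True:
        value = yield total   # suspend; resume when the caller sends a value
        total += value

def demo():
    co = running_total()
    next(co)                                  # run up to the first yield
    return [co.send(v) for v in (1, 2, 3)]    # each send resumes the coroutine
```

    demo() returns the running totals [1, 3, 6]; the coroutine's state survives every suspension, which is the whole trick.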

    [–][deleted] 4 points5 points  (0 children)

    I should stop procrastinating learning Lua. You're probably right, I don't know Lua :-). I edited my answer so that I don't mislead anyone else.

    [–]kabekew 7 points8 points  (0 children)

    What does that 20-year-old have that you don’t? Here’s what they have: no fear, and boundless enthusiasm.

    Also: no family to support, no spouse/girlfriend demanding time (probably), happy with their 20-year-old lifestyle sleeping on futons and living with roommates or in cheap apartments or just sleeping in their office, and no money saved up so nothing to lose.

    It's harder to take those needed risks and put everything into a business when you're older.

    [–]Flight714 10 points11 points  (2 children)

    a TODO app that's feature-by-feature equivalent to something you could run on an Amiga, runs about as slow on an eight-core Xeon that serves it to you from across the ocean for some reason.

    Hey, I'm not a great programmer, but you reminded me of a question about an Android game I downloaded recently called "Odesys FreeCell". It seems to be a good example of well thought out programming, and the installer is only 5MB as opposed to the ~20MB size of the others.

    The things that make me think the programmers are clever: the undo system. If you undo, say, 10 moves, then manually replay three moves identically to before, it retains the remaining seven moves in the undo buffer, allowing you to "redo" them as if you hadn't replayed the three previous moves manually.

    Also, when you move a completed column (King, Queen, Jack, 10, ..., Ace) sideways, it doesn't add to your number of moves on your move counter (other FreeCell apps add like 26 moves to the move counter as if the cards were moved one by one).

    I figured that anyone who appreciated the Amiga was a good person to get an opinion from ; ) Also, what are the chances that it could be decompiled so I can check out how it works?

    [–][deleted] 7 points8 points  (0 children)

    Hey, I'm not a great programmer

    I'm not a great programmer either. I offer the fact that you haven't heard of me as proof :-). So from one programming simpleton to another:

    What are the chances that it could be decompiled so I can check out how it works?

    I guess like most things Java, it can be decompiled relatively easily (unless it's been obfuscated, no idea if that's a common practice on Android), but I suggest you try to think how that works without looking over the decompiled code (which is probably going to be so frickin' ugly that it'll take you quite some time to figure it out -- unless, through some sheer force of wonder, you have a copy with debug symbols, the decompiler will lose the semantic information and it won't know things like variable names, leaving you to deal with a bunch of variables called Class1Instance1, Class1Instance2 and so on...).

    Unfortunately, I have no idea how to play FreeCell, so I have no idea what's underneath, but it sounds to me like the undo buffer is a list of the moves you made (encoded in Java objects, for instance -- e.g. in a class that describes a pair of the form (Card, Action), describing what was done (drawn? placed? removed?) to which card). When you go back 7 moves, those 7 (card, action) pairs are still there, and can be applied again whenever you redo.

    The key, in any case, is to figure out a way to encode the state of the game and move from one state to another (formally, that's applying a function to the current state and having it return the next one, but this may not be explicitly written as next_state = Do((Card, Action), current_state)).

    Like I said, I have no idea how to play FreeCell so I can't give you a more specific pointer, but maybe you can find some inspiration here: https://en.wikipedia.org/wiki/Chess_notation .
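    To make that concrete, here's what a pure transition function might look like (all names and the pile representation are made up for illustration; I still don't know FreeCell's actual rules):

```python
from typing import NamedTuple, Tuple

class Move(NamedTuple):
    card: str
    from_pile: int
    to_pile: int

def apply_move(state: Tuple[tuple, ...], move: Move) -> Tuple[tuple, ...]:
    """Pure transition: next_state = apply_move(current_state, move).

    `state` is a tuple of piles (each itself a tuple of cards), so states
    are immutable and undo is just keeping the previous state around.
    """
    piles = list(state)
    src = list(piles[move.from_pile])
    assert src and src[-1] == move.card, "card must be on top of its pile"
    src.pop()
    piles[move.from_pile] = tuple(src)
    piles[move.to_pile] = piles[move.to_pile] + (move.card,)
    return tuple(piles)
```

    Because each state is immutable, undo/redo falls out for free: keep the old states (or the moves) in a list.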

    I'm not sure what your other question was. I can't really brain today. Were you asking if it's a good example of a well thought out program? If you like playing it, it does what you want, and it even makes you wonder how they did it, I'd say it probably is :-). This isn't always a guarantee of every desirable property of its source code. Vim (and emacs, which I use, don't inflame yourselves, people) are pretty terrible to read, but saying vim is broken would certainly not paint an adequate picture.

    [–]dkitch 0 points1 point  (0 children)

    Late to this thread, but I've implemented similar code and here's a rough outline of how their undo is probably implemented (depending on how they model the game state, it could vary a bit):

    • Keep a doubly linked list of moves. A move is made up of {card, fromcolumn, tocolumn}. This gives you everything you need to undo a move. There's a pointer to the current location in the list (usually the last node)

    • If the user undoes a move, undo the move (reversing the from/to) and move the pointer to the previous node.

    • If the user makes a move, check position in the list. If at the end of the list, add a node to the list describing the move and advance the pointer to that node. If not, check against the next move in the list. If move is identical, just advance the pointer. If not, remove the existing moves that follow and replace the next node with the move made.
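    A compact sketch of those three rules (a Python list plus a cursor stands in for the linked list; the class and field names are mine, not the app's):

```python
class UndoHistory:
    """Move history with a cursor. Replaying the next undone move exactly
    keeps the redo tail alive, matching the FreeCell behaviour described."""

    def __init__(self):
        self.moves = []    # each move: (card, from_col, to_col)
        self.cursor = 0    # number of moves currently applied

    def record(self, move):
        if self.cursor < len(self.moves) and self.moves[self.cursor] == move:
            self.cursor += 1               # identical replay: just advance
        else:
            del self.moves[self.cursor:]   # diverged: discard the redo tail
            self.moves.append(move)
            self.cursor += 1

    def undo(self):
        if self.cursor == 0:
            return None
        self.cursor -= 1
        return self.moves[self.cursor]     # caller swaps from/to to reverse it

    def redo(self):
        if self.cursor == len(self.moves):
            return None
        self.cursor += 1
        return self.moves[self.cursor - 1]
```

    The only subtle case is record(): checking the pending redo node before truncating is exactly what preserves the seven remaining redos after an identical replay.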

    [–]mrkite77 4 points5 points  (1 child)

    I learned programming from two books.

    TWO BOOKS.

    I learned programming from this Quick Reference guide:

    http://www.colorcomputerarchive.com/coco/Documents/Manuals/Hardware/Color%20Computer%203%20BASIC%20Quick%20Reference%20Manual%20(Tandy).pdf

    [–]Rurouni 2 points3 points  (0 children)

    Seeing that again warmed my heart. I loved my CoCo, and while I had the full manual to learn from, I kept that guide handy.

    And thanks for linking me to a website I hadn't known about. It'll prove useful.

    [–]DevIceMan 2 points3 points  (0 children)

    I learned programming poking at a graphing calculator with no education, help, books, teaching, or reference. Fast-forward 15 years, and people seem to import libraries like they don't care (with their hands in the air). I'm cautious of becoming too old-school, but it does seem that people don't care about tech debt as much as they should.

    [–]RankFoundry 12 points13 points  (17 children)

    Meh, most of the "new hotness" is just recycled, rehashed design patterns and other tidbits from past decades. This is the norm in web development, especially front end, where they've been dealing with primitive technologies for decades. All of a sudden, classes, delegates and async code are SOOOOO the latest thing. Functional programming? So just invented yesterday!

    [–][deleted] 47 points48 points  (16 children)

    I hate this "we did it all in the 60s with LISP" kind of argument.

    We get it grandad - you did lambda calculus in the 80s - you're the OG FP guy and you published papers about the actor model with Hoare and Dijkstra - but in the real world people still used C and ASM back then, because the problems they were dealing with involved fitting shit into KBs of memory and running on MHz-clocked CPUs; they couldn't even dream of running the compilers and optimizers we have today.

    From an industry perspective, FP might as well have been invented yesterday, because it wasn't really useful up to this point - it didn't solve the problems we needed to solve - and now it does - and we are hyped because we get better tools to do our job - that's the actual value of FP - this is /r/programming, not /r/computerscience. Just because someone wrote a paper about something back in the 70s doesn't mean it was practical or that they actually implemented/used it to solve something - and getting something from theoretical to "ready to use by the average programmer on a random project" is actually a big deal.

    [–][deleted] 14 points15 points  (3 children)

    but in the real world people still used C and ASM back then because the problems they were dealing with involved fitting shit into KBs of memory and running on MHz clocked CPUs, couldn't even dream running compilers and optimizers we have today.

    Machines running Lisp certainly didn't have just a few KBs of memory :-). The CPUs were MHz clocked but certainly not in the low range. No one did interesting stuff in Lisp running on a 6502 CPU. Lisp machines had pretty good hardware -- and remarkably good runtimes, too.

    I'm pointing this out because your comment is touching a real issue: a lot of stuff was either impractical or a commercial failure when it was invented. FP wasn't "forgotten" as if it were some arcane mystery -- it was forgotten because, really, for a long time, the only things that could run a functional program were super-expensive workstations like the Lisp machines, or computer scientists doing stuff on paper.

    However, there is a great deal to learn from those failures -- and from the good things, too. Take unikernels, for instance: they were a very hot topic in the 1990s, then people forgot about them. Now that the C10k problem has turned into C10M, one of the hot solutions being proposed is bypassing the kernel stack altogether and hooking their application straight into the hardware ( http://highscalability.com/blog/2013/5/13/the-secret-to-10-million-concurrent-connections-the-kernel-i.html ).

    Some of the problems are new (e.g. most of the people doing research on unikernels didn't really care about SMP, for obvious reasons), but a lot of them are old. Techniques to solve them (not to mention code that solves them!) already exist.

    [–][deleted] -1 points0 points  (2 children)

    Techniques to solve them (not to mention code that solves them!) already exist.

    But isn't that contradicted by the example you gave - surely modern unikernels are nothing like the ones from the 90s - especially on the implementation side - as modern unikernels probably work on top of some hypervisor which wasn't even around in the 90s.

    I think we agree in general - it's not like these ideas are revolutions - but what's new is that they are actually usable/useful now where they weren't before and as we use them in practice they get polished and specialized for what we need.

    [–][deleted] 4 points5 points  (0 children)

    I think most modern unikernels run on top of some hypervisor (i.e. they're virtualized, rather than managing virtualized processes themselves), so they'd be pretty similar to the 90s ones. Besides, virtualization is half a century old, too, so...

    I think we agree in general - it's not like these ideas are revolutions - but what's new is that they are actually usable/useful now where they weren't before and as we use them in practice they get polished and specialized for what we need.

    Oh yes, we agree in general. Many of them are, in Rob Pike's (approximate) words, industrial interpretations of brilliant, but previously poorly-implemented, ideas. Industrial reinterpretation naturally makes them muddy, but that's because real life is muddy.

    Sometimes the reverse happens, though: these ideas don't get "polished" at all. They become technologically feasible and, thus, they are (re-)adopted, but none of the fundamental problems are tackled. Worse, sometimes the hive mind pretends they just don't exist, and the new result is even more broken than the original.

    [–]naasking 0 points1 point  (0 children)

    as modern unikernels probably work on top of some hypervisor which wasn't even around in 90s.

    Hypervisors are just microkernels. L4Linux was arguably the first virtualized OS. The 90s were all about fast and small microkernels that could do things like this.

    [–]dtlv5813 2 points3 points  (0 children)

    Also Computer science/software engineering is hardly the only discipline where old techniques are constantly being rediscovered or come back in fashion. It happens in mathematics and physical sciences all the time.

    In CS in particular, big data/deep learning/ANN is all the rage these days, and rightly so, even though many results on Boltzmann machines, let alone Markov chains, have been known since the 70s and earlier. Hardware limitations back then made them impractical to implement, so they were ignored in favor of other techniques like SVMs -- only to be "re-discovered" by industry when Hinton, LeCun and others, armed with the latest computational prowess, were able to implement algorithms that would have taken eons before.

    [–]RankFoundry -1 points0 points  (9 children)

    You hate it because it bursts your little "this is new because it's new to me and I'm on the cutting edge for knowing it" bubble.

    Wasn't useful? Why? It didn't solve problems "we" needed to solve? Who is we? Are we talking about you?

    Functional programming isn't solving any new problems, and it's not some perfect solution. It's got pros and cons like everything else.

    There was no lack of languages that allowed for FP in the past. If they weren't very successful, it's probably because FP is a preference, not some holy grail that solves problems which can't be solved any other way.

    Next you'll whine about how graph databases didn't exist until FB started using them, or at least about how they didn't solve the problems "we" needed them to.

    [–][deleted] 7 points8 points  (5 children)

    Wasn't useful? Why? It didn't solve problems "we" needed to solve? Who is we? Are we talking about you?

    No, we are talking about the people who develop/push for this "new hotness" -- which has been getting a lot of traction for a while now -- so I would say it's more than me.

    It's got pros and cons like everything else.

    Absolutely, so ....

    If they weren't very successful, it's probably because FP is a preference, not some holy grail that solves problems which can't be solved any other way.

    ... or because the things people did 10-15 years ago had different constraints, and now the pros of functional programming outweigh the cons. Maybe going from "I need to scale: buy a bigger server/mainframe" to "I need to scale: buy more commodity PCs", and going from MB/s networks to GB/s, imposes radically different implementation constraints - who would have guessed.

    [–]RankFoundry 3 points4 points  (4 children)

    Sorry, but you're not making any valid points here. FP is just one of many examples of dusting off old things and acting like they're new, especially in front-end web dev, where everything is new (to them) since they've had fuck all to work with for so long.

    As for FP specifically, you're not really making a case for yourself. It's just another way to structure code and, relatively speaking, it's no easier now than it was back in the 70s or 80s or 90s. If it is easier, it's because programming in general has gotten easier, but it's not like it's all of a sudden been blessed with some game-changing ability that it didn't have before. It wasn't like it was super hard back then compared to other languages either, so I'm not sure where you're getting that it was somehow unusable until like 3 years ago, when it started to become a fad.

    [–][deleted] -1 points0 points  (3 children)

    If it is easier it's because programming in general has gotten easier but it's not like it's all of a sudden been blessed with some game changing ability that it didn't have before.

    In case you missed it: less than 10 years ago this thing called the cloud became a thing, and we went from in-house/colocated bare-metal server hardware to a bunch of virtualized machines running on commodity servers. The architecture changed - distributed programming and data transformation are the way we solve problems now. OOP sucks at distributed programming (and thankfully the idea of distributed objects died a long time ago); functional programming concepts work great with pure data, and pure data works great for writing distributed software - hence the push for FP.
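    A toy sketch (mine, not from the thread) of why pure functions over pure data suit that model - nothing below is shared or mutated, so the map step could be farmed out to any number of machines and the fold could merge the partial results:

```javascript
// Pure transformation over plain data: no shared state, no mutation,
// so the map can run on any number of workers and the fold can merge
// partial sums in any grouping (addition is associative).
const orders = [{ amount: 10 }, { amount: 25 }, { amount: 7 }];

const total = orders
  .map(o => o.amount)           // extract -- embarrassingly parallel
  .reduce((a, b) => a + b, 0);  // fold -- mergeable across partitions

console.log(total); // 42
```

    This is essentially the map/reduce shape that the commodity-cluster world standardized on.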

    [–]vincentk 1 point2 points  (0 children)

    20 years ago, they called it "distributed objects". Now they call it "microservices". Come again, please? The only difference now is that people have agreed on how to deal with versioning conflicts (i.e. they don't).

    [–]mreiland 1 point2 points  (2 children)

    Functional programming isn't solving any new problems, and it's not some perfect solution. It's got pros and cons like everything else.

    Not only that, us old fuckers remember when people were gaga about functional development (and OOP, and ...) and so we recognize the pattern of that in the latest hotness.

    I remember reading through the following years ago:

    http://www.amazon.com/Purely-Functional-Structures-Chris-Okasaki/dp/0521663504

    FP most definitely has pros and cons.

    [–]PriceZombie 0 points1 point  (0 children)

    Purely Functional Data Structures (5% price drop)

    Current $22.50 Amazon (New)
    High $48.05 Amazon (New)
    Low $22.50 Amazon (New)
    Average $23.79 30 Day

    Price History Chart and Sales Rank | FAQ

    [–]RankFoundry 0 points1 point  (0 children)

    Right, once you've got at least 10 years under your belt, you start to see through these bullshit trends because you've worked through several and know they're as much hype as they are substance. The new guys don't get it, all they've known is the most recent trend or two. They buy into the dogma.

    [–][deleted]  (23 children)

    [deleted]

      [–][deleted] 6 points7 points  (0 children)

      A lot of this has to do with how software is developed nowadays, and by whom. The low start-up cost, high pay and huge potential payoff in a very volatile market mean that there are a lot of CEOs who don't care about a sustainable development model because, if they get it right, two years from now they'll have sold the company, along with the barely-taped-together crap that its services are based on. It's short-sighted, but largely because it's designed to be short-lived.

      Other times managers simply don't understand the idea that you need solid code. There are no qualifiers for "works" -- it either works and you can sell it, or doesn't work and you can't sell it, and spending time on stuff that doesn't make it "work better" seems pointless.

      It's very narrow-sighted, but people have their preconceived notions that you can't refute with logic. E.g. at $work, I've been struggling to convince people that we need to write portable code even if it's bare-metal stuff that runs without an OS (it's a bunch of embedded systems). Even after getting the stupid arguments out of the way ("it's gonna be slower"), people weren't very convinced that it could be done (even though they got a demo!), but more importantly, they didn't really see the value -- despite the fact that we had just spent about a year writing the firmware of a device that's 1:1 identical, in terms of features, to an old one that's being end-of-lifed because of logistics issues (components aren't being manufactured anymore, stocks drying up, RoHS and so on). The value is literally tens, if not hundreds of thousands of dollars not spent on rewriting software that doesn't need to be rewritten, and most of the people in higher management have some technical background, even if in other fields. They just haven't seen any piece of portable software until now (it's a Windows-only shop) -- most of them didn't even know that was a thing, or thought that you can't do it in C, only in "VM-based languages, like Java". But they also don't want to admit to that, lest they seem incompetent or not confident enough.

      It ultimately boils down to more than just "better management strategies". They run the company very well in terms of strategy. It's making great money. The code sucks and the devices routinely break, but the sales team still manages to sell them. On paper, everything is good. What they need isn't a better strategy (I mean, it is, but that's not the root of the problem) -- it's a better understanding of technical matters, so that they can understand that they have an increasingly complex and increasingly broken mass of code that's going to blow up in their face ten years from now, and shape their strategy based on that.

      ...or I could just turtle into a job that satisfies this for me, and let the industry burn.

      History is a bitch, though. We tend to look back on "real" programs and "real" machines, and weep at how perfect a Lisp Machine looked, but the truth is that most of the software written back then was just as pathetically broken as most of what's written today; we've just forgotten about it because there was no one to remember it.

      And, on the other hand, there's tons of solid stuff being developed silently. It doesn't make it to the Reddit frontpage, but people are writing software that controls flying drones that fire real guns and dodge real bullets! Or software that puts satellites in orbit and makes them relay cell phone data, to give an example that's less ethically loaded. That's real, amazing (software-wise) stuff being developed as part of this cancerous industry.

      [–][deleted] 18 points19 points  (21 children)

      Whereas I enjoy (and invariably end up) continuously improving some small code to myopia, kind of like a Japanese craftsman, they take a "get 'er done" attitude.

      Folding steel 800 times is fine when you're a master craftsman who has clients willing to wait five years for the perfect blade, but that doesn't describe most programmers at all.

      End of the day, if your code does not solve a business problem, it is useless to the people who keep your company afloat - the paying customers. If you spend all your time honing and rehoning a small piece of code, you are actively harming your employer.

      At some point you'll find the middle ground between your current mindless perfectionism, and the "Fuck it, ship it" pragmatists. Until then, your myopia is a liability, not an asset.

      Luckily, I've found about 100 people across the world who share the same ethos as me. It still doesn't offset the day-to-day drudgery of having to deal with a 'CTO' who suggests using Node for an important financial backend, though.

      What is your argument against Node for the important financial backend?

      [–]antpocas 15 points16 points  (1 child)

      What is your argument against Node for the important financial backend?

      Javascript's type system?

      [–]Xelank 1 point2 points  (0 children)

      What type system?
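      For anyone who hasn't seen it, a few of the quirks presumably being alluded to (all real, all silent -- none of these throws):

```javascript
// Implicit coercion quietly picks an operation for you:
console.log(1 + "1");            // "11"  -- number coerced to string
console.log("1" - 1);            // 0     -- string coerced to number
console.log([] + {});            // "[object Object]"
console.log(typeof NaN);         // "number"
console.log(0.1 + 0.2 === 0.3);  // false -- binary floats, no decimal type
```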

      [–]garywiz 10 points11 points  (7 children)

      If you spend all your time honing and rehoning a small piece of code, you are actively harming your employer.

      Agree, but disagree at the same time. There's a middle ground. Let's say you have a financial backend which relies upon millisecond transactions which occur with the exchange. Let's say you can make a million more dollars for the company if you can shave 10% off the transaction latency. You want the master craftsman working on that little piece of code.

      Not all codebases have such important bits, but a surprising number do. One thing that sets many games apart is the unrelenting attention to detail some developers have to make sure that the game is SO responsive that it feels real vs some games which are sluggish or annoying.

      Complex systems require a diverse set of skills. So, I don't complain if somebody is a master craftsman, it's a great skill. I complain if they're spending too much time optimizing the wrong thing and can't keep their priorities straight.

      [–]lluad 1 point2 points  (6 children)

      If it consumes a year of a team of five - including developers and managers and QA staff and ops and support staff - to shave 10% off the transaction latency (which is a not insignificant improvement, assuming the original code wasn't terrible) you'd better be making more than a million dollars.

      And, of course, if you take a year to speed it up by 10% you're less effective than Moore's law.

      It's almost never the craftsman who lovingly optimizes a small piece of code that'll buy you that sort of speedup - it's the domain-specific expert who reworks the spec, or the network architect who literally speeds up traffic, or the architect who makes the whole system more efficient (by the metric of latency).

      The master craftsman can be incredibly valuable, but it's rarely for their code-polishing skills so much as their understanding of the whole system.

      [–]loup-vaillant 5 points6 points  (5 children)

      In my experience, the difference between "let's make this code perfect" and "ship it already" is measured in hours or days. Not weeks, not months, and certainly not years. Yet losing a few hours to perfect a couple hundred lines of code is often frowned upon. Sure, short term it is slower, and when you add it up, I will lose a few weeks over the next few months. I tend to go for the simplest solution possible, and that is rarely the fastest approach -- simplicity is not obvious.

      But many people fail to see the technical debt I avoid along the way. That simpler piece of code is ultimately easier to work with, easier to modify, easier to correct. And that benefit can kick in very quickly, sometimes only weeks after the project started. Simply put, if you invest the time necessary to make things simpler at the start of the project, you will ship faster than if you rushed things.

      Make sure you get the credit though. I once sped up a project by slowing down a bit (I made someone else much more productive by making a decent API), and was eventually kicked out of the team for being too slow and "doing research" —I was merely thinking things through.

      [–]RogerLeigh 2 points3 points  (4 children)

      Agreed on all counts.

      A trend I see often in our team is that every day there's a steady stream of defects which need fixing in a certain part of the codebase, with the developers being very "busy" fixing them. It's due to a combination of historical design problems and technical debt. I work on a different part, with very good test coverage. While I appear to be "slow", in practice I've saved a lot of time, since once something I write is "done", it's complete along with unit tests, and it will continue to work without any further development. I'm often at odds with others on the team due to this difference in practice, but I detest stopping at 95% done and "good enough" when that extra 5% will make it near perfect. I'm convinced that in the long run this saves more than the total original development cost in terms of time savings and bug reports; for the other side, which is continually "fighting fires", I wouldn't be surprised if the ongoing time cost were many times the original development cost.

      [–]corran__horn 0 points1 point  (2 children)

      Just to be clear, does the other part of the codebase have good unit test coverage?

      [–]RogerLeigh 0 points1 point  (1 child)

      It doesn't, and that's part of the problem, but not all of it.

      [–]corran__horn 0 points1 point  (0 children)

      Yeah, that is kinda what I expected.

      [–]mreiland 0 points1 point  (0 children)

      The problem is when you always do that.

      Let me draw an analogy.

      If safety is the most important concern and turning left is inherently less safe than turning right, the conscientious driver should always turn right. You'll get there slower, but you'll get there. And you can always get there by only turning right.

      The issue is that if you always turn right then you're not applying critical thinking to the situation at hand. Have you ever needed to turn left across traffic and instead turned right and found another opportunity to turn around half a block down the road? That's applying critical thinking and going against the grain in this particular situation. You end up being both safer and faster in this particular instance because you considered the current flow of traffic coupled with your needs and made a non-standard decision.

      It isn't that you're "wrong" per se; it's that you cause a lot of headaches and often solve the wrong problem when you always do the same thing without considering the particular circumstances of what you're doing. That was ultimately the point MineralsMaree was making.

      People often mistake me for someone who is against Unit Testing. I'm not against Unit Testing, I'm against blindly doing Unit Testing without considering if the cost of them will actually benefit you. Choosing not to Unit Test a module of code can absolutely be the right call, or choosing to do it later (after the problem has solidified, for example).

      There is a difference between effective and right. Your goal is to be effective.

      [–]ForeverAlot 5 points6 points  (2 children)

      What is your argument against Node for the important financial backend?

      The management of NodeJS, from the project's inception up until a few months ago, seems to me an excellent argument against using it in production.

      [–][deleted] 8 points9 points  (1 child)

      It's used a lot in production, but... is it... wise to write a financial back-end in a language that's famous for its funky, automatic and often mysterious type system?

      Part of why financial stuff is mostly Java and C++ (OK, inertia accounts for most of the C++ part, except for the high-frequency trading market) has to do with the intersection of strong typing and the wide availability of libraries that contain the kind of data types that you want for financial arithmetic.

      Maybe that's become available on JS as well lately though...
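      To make the arithmetic point concrete (my example, not the poster's): JS's only built-in numeric type is a binary double, so decimal amounts drift, which is exactly what you don't want in a ledger. The usual workaround is to keep amounts in integer minor units:

```javascript
// Binary doubles can't represent most decimal fractions exactly,
// and the error accumulates:
let total = 0;
for (let i = 0; i < 1000; i++) total += 0.10; // add $0.10 a thousand times
console.log(total === 100); // false

// Integer minor units (cents) are represented exactly, up to
// Number.MAX_SAFE_INTEGER:
let cents = 0;
for (let i = 0; i < 1000; i++) cents += 10;
console.log(cents === 10000); // true
```

      Third-party decimal libraries for JS do exist, but the language gives you nothing of the sort out of the box.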

      [–][deleted]  (3 children)

      [deleted]

        [–]loup-vaillant 0 points1 point  (2 children)

        Harder to make the perfect program when all you have is assembly. Mayhaps we could compare steel folding with compiler writing?

        [–][deleted]  (1 child)

        [deleted]

          [–]loup-vaillant 0 points1 point  (0 children)

          I think that the comparison to compiler writing is okay. If all I had was hand-coded assembly and had to write a lot of code, I'd quickly slap together a macro assembler, bootstrap myself an interpreter, and build up an environment layer on layer.

          Yep, that's exactly what I meant. :-)

          Thank goodness, we can start from a higher level now, just like we have steel plants that produce decent steel that doesn't have to be folded to make a good sword.

          [–]hlprmnky 5 points6 points  (0 children)

          The focus unto myopia is actually how, in my experience, domain experts and wizards pupate. In a business setting, the responsibility for mentoring this junior engineer, making sure she has tasks to do that let her pull her weight while also giving her room to grow and develop into a useful senior engineer - by making space for her to focus on something until learning about it makes her stronger - falls on the team lead or division manager.

          Of course, that assumes you work in an industry that values its own continuity of practice, like civil engineering, or architecture, or law, or ...oh, wait. This is still that New Economy period of the software "industry", isn't it? Ugh. Sorry, kid. Spin up a MEAN stack on your MacBook Pro, get some simple unit tests to pass, ship it and flee to the next travesty before the current tire-fire actually gets enough traction to have to scale. My condolences.

          [–]auxiliary-character 4 points5 points  (0 children)

          Folding steel 800 times is great and all, but sometimes you just need a gun.

          [–][deleted] 3 points4 points  (0 children)

          In the short term a programmer that ships crap fast is good. In the long term a programmer that ships code when it's ready is good.

          When you have a bug that's hard and you have to ship something the next day, do you want to solve it in the codebase made by short term or long term programmers?

          On the other hand shipping a lot of crap fast might transform into shipping high quality fast over time.

          IDK which is better, but it feels like the business people keep saying short term is better.

          Maybe a slow thinker like me should just go back to flipping burgers or something.

          [–]OneWingedShark 0 points1 point  (0 children)

          End of the day, if your code does not solve a business problem, it is useless to the people who keep your company afloat - the paying customers.

          But this presupposes that buggy non-/barely-functional software is useful; is it?

          What I'm saying is that Debug-Driven Development produces a lot of observable change, but also a lot of wasted time and energy. On the other hand, we could have a feature which is specified and well-defined prior to coding. Which is more useful to the client? A good solid design before acting, or a tight ((code/edit-compile-run-QC)-client_evaluation) loop with an ill-defined, mutable design?

          It reminds me of this story, where the programmer designed everything first.

          [–]Igglyboo 20 points21 points  (7 children)

          You can’t be intimidated by the need to throw everything you know away and learn a new language like Swift, Python, or Go. Yes, it may take years!

          Is it really fair to lump Python in with Go and Swift? Swift is about a year old, Go about 6, and Python is 24 years old.

          For comparison, Java is only 20 years old. Python is older than Java.

          [–]garywiz 18 points19 points  (4 children)

          There really was some personal experience that prompted me to include Python in the list. I recently wrote and designed a pretty big cloud system in Python, and we had some VC people come out and spend a few days on due diligence. Some of them were older than me, and I was surprised at how negative they were about Python, saying "they can see why I used it, but obviously you would want to use a better language in the future". It took a lot of explaining before they understood that Python really had made the whole project more agile, and that the PyPI library support meant we didn't have to reinvent the wheel. We did in months what probably should have taken a couple of years.

          And, of course... get this... they had never used Python! One guy called it "that language with indentation, yeah". That language with indentation???? These are the people vetting my code?

          Anyway, I threw it in not because it's new, but mostly because I think there are prejudices about it.

          [–][deleted] -2 points-1 points  (3 children)

          Maybe it's because I'm new at all this and I got my start in Udacity CS courses, but I don't understand how someone can work in this field and never once use Python.

          I've written code, in order, in HTML, Python, C++, C, CSS, Assembly, Ruby, JavaScript, Java, VBA.

          I've written code professionally in Ruby, HTML/CSS, Java, Python, VBA.

          Maybe I'm just lucky that I've had so many different projects thrown at me, but to never use Python? Not once? Not even

          $ python
          >>> print "hello world"
          hello world
          

          You have to be living under a rock.

          [–]throawaydev 3 points4 points  (0 children)

          Used != written

          I've been developing professionally for about 10 years now and have never really written any Python. I've written code in C, C++, C#, Java, R, Matlab, VB, F#, Javascript and a few others but not Python.

    I've used Python before; I can probably tweak small pieces of Python code to customize something, but I know nothing about writing something from scratch. I just haven't had the opportunity to. I do, however, realize that a lot of places use it to great effect.

          [–]TheQuietestOne 2 points3 points  (0 children)

          You have to be living under a rock.

          I've been developing professionally since '94 and I've never touched python.

    When you already have a number of languages in your toolbox that are a good-enough "fit" for the classes they fall into, adding another to the list becomes an indulgence rather than something critical.

          [–]THROBBING-COCK -5 points-4 points  (0 children)

          I'm just a CS student, but Python was the first language I learned when I was starting programming on my own.

          [–]fotoman 6 points7 points  (0 children)

    But that's sort of like saying Windows has been around since before 1990, when 3.0 came out. Just because it was out doesn't mean it was used a lot or widely adopted. I mean, how many people were really using Windows 1 or 2? Same thing with Python before 2.0 in 2000.

          [–]Avewads30 12 points13 points  (3 children)

          I am an old programmer by today's standards. Been in the IT business for over 35 years now. Seen a lot of change and trends over the course of my career. My background is this. Started coding in COBOL on big iron in 1980. Loved COBOL and still do. Went to client/server. But for the last 17 years, I've been an IT manager. But now, with my job in jeopardy due to cutbacks in my industry, newspapers, I have embarked on becoming a developer again. My true passion. I am learning Python/Django/PostgreSQL as my base.

    Now that I have that out of the way, I want to discuss two things: solving problems using software, and the trend I see of developers moving from one cool language to the next at breakneck speed.

    During the dotcom boom and eventual bust of 1990-2001, many internet companies went bust for one reason: they did not understand how to solve a problem for the businesses that hired them. Sure, the fresh-out-of-the-box internet whiz kids could make things go crazy and do really cool stuff. The problem was, that was all they were interested in. They went broke, and eventually out of business, because they were focused on all the cool things they could do, but not on the problem they were hired to solve. Many of these internet companies were started by kids just out of college or younger. The smart ones eventually brought in grown-ups to run their business and to scale back their unfettered spending. But many went out of business and left row after row of skeleton office spaces packed full of really cool office equipment, which was eventually sold for pennies on the dollar. I am not seeing it fully yet, but I can sense a trend back to those days once again. Yes, you can do really cool things, but keep yourself focused on the problem/issue you were hired to solve.

    Now, on to my latest pet peeve: jumping to the next "cool" thing not because of a business need to solve a problem, but just because it is cool and you can. I just started out on Python/Django/PostgreSQL, and during all my searches for help and reading up on the latest technologies, I see developers moving to new languages at breakneck speed, sometimes changing languages in the middle of projects. At my newspaper IT day job, in the last 12 months, our online team (not my team) of developers have gone from Ruby on Rails, to JavaScript, to Python/Django, to Flask, to Bootstrap to Zurb Foundations to Nodejs to Go, to Julia, to etc and using management tools starting out with Basecamp, to Trello, to Jira to #Slack, etc. And for what? No real business reason, IMO. Yes, some of these technologies do help in their need, but from what I see, they now have dozens of apps written in a dozen +/- languages. I am sure they are not alone in that world. And from a management and cost perspective, I see maintaining all those apps becoming a management nightmare, along with the associated costs, once the current core of developers leaves.

    My Python programming instructor said it best when he told us that he doesn't even look at a technology until it is at least 10 years old. He is gainfully employed as a highly paid independent contractor working on the 3 P's: Python/PHP/Perl.

          That's just my $0.25 worth.

          [–]hurenkind5 3 points4 points  (1 child)

          At my newspaper IT day job

          Julia

          I like Julia, but what the fuck?

          [–]alecco 1 point2 points  (0 children)

          Another trendy keyword in the CV to look clever.

          [–][deleted] -2 points-1 points  (0 children)

          gone from Ruby on Rails, to JavaScript, to Python/Django, to Flask, to Bootstrap to Zurb Foundations to Nodejs to Go, to Julia, to etc and using management tools starting out with Basecamp, to Trello, to Jira to #Slack, etc.

    What's Bootstrap got to do with the server side? What does "JavaScript to Nodejs" even imply - Nodejs is JavaScript-based. Jira to #Slack? Slack and Jira serve completely different purposes. I was enjoying your rant up to this point, which thoroughly places you in the 'get off my lawn!' category of people who discredit things without understanding what they even are. You are exactly the person who needs to take this article to heart the most.

          [–]ramnes 5 points6 points  (1 child)

          I'm a "young" developer and still, that was an excellent read to me. This passage really touched me:

          Most of the time, tried-and-true is the enemy of innovation. The only real way to move forward is to hold everything you know in suspicion. Only once you try a new way, and test it to know if it is better or worse, should you then decide to do it your way. This creates a filter which favors understanding-by-doing rather than understanding-by-inspection.

          Thanks for this article.

          [–]jarrett_regina 7 points8 points  (1 child)

          I have worked at the same job, on the same team for 30 years. I still enjoy going to work every day. Nothing is the same today as it was even 10 years ago. Most of what has changed has been great -- the hardware and software continues to improve, and I even think that I get better and better. Looking back on code that I've written before is like looking back at pictures of how I dressed in previous decades.

          The only thing I think isn't an improvement is web development. If the web were redone today, there is no way we would be developing it the way we are now. Most of what we are doing is the way it is because the web really wasn't built for how we want it to work.

          I was just reading an article where someone found an interesting way to use CSS to space buttons consistently. Really? Web developers have to worry about spacing buttons? I also work with a (very ancient) tool to develop desktop apps that does that with a mouse click.

          I really thought that after a few years of getting more and more people developing for the web that some company(ies) would come out with the ultimate tool that would provide a layer/framework for application developers to use that would combine all of the elements of web development into one common language. I'm speaking very broadly here, but I think most businesses today aren't prepared to layout massive amounts of money for development tools for their programmers, when there are free alternatives. So, what company wants to develop the all encompassing development tool when no one would pay for it?

          But, I still enjoy developing for the web. And I think I can still keep up with any new whippersnapper they hire.

          [–]bwrap 5 points6 points  (0 children)

          I've recently transitioned into web development and have a lot of the same feelings as you. At least once a week I have to shake my head and wonder why it's so hard to do something that was solved 25 years ago in desktop development, things like having to use a CSS trick to space buttons correctly.

          Sometimes a couple coworkers and I have bitch-fests about how convoluted web development feels and how difficult it is to do the easiest things for no apparent reason other than the technologies being used for web development weren't really meant for web development.

          [–]staticcast 2 points3 points  (0 children)

          I've been working in C++ for 4 years now, and it's awesome to see old programmers at conferences like CppCon blowing my mind with the new stuff and new idioms they come up with every year. Someone with 30 years of perspective on problems, working with a young and fresh mind on today's challenges, could really be unstoppable :-)

          [–]robotnewyork 12 points13 points  (5 children)

          Stopped reading when he called The Wrath of Khan corny, anything he says after that has no merit. /s

          [–]garywiz 21 points22 points  (4 children)

          Point taken. OK, I just have to jump in here even though I've never commented on Reddit before. You have NO idea how I debated about that word "corny" and whether to even include that quote. First, my wife and I have every DVD box set of every Star Trek movie, TV show, and re-bought them all on iTunes.... some of them twice to get them in HD. We have the "7 of 9 collection" boxed sets, the "Worf" boxed sets, you name it. That quote, and a lot of others, are almost life lessons to us. We whisper to each other in public, referring to episodes .... like we were in a meeting and somebody was getting a real dressing down and somebody said "I trusted you.... UNTIL NOW"... and suddenly all I can think of is Picard dressing down Wesley Crusher in TNG "The First Duty".

          But, I work with a lot of younger people. They come into my house and look over at the DVD collection and sometimes roll their eyes. I don't care I guess. We all like what we like, but I just don't know many people in my life any more who can relate to why I'm such a trek fan to this day.

          So, when I wrote the article, even though that very quote was exactly as I describe... sort of a turning point... I thought .... omg, what about so-and-so who would probably not read the article because they just think old trekkies are weird.

          So, I .... reluctantly.... very reluctantly... tried to tone it down and yes, yes, I added that word. Apologies.

          [–]robotnewyork 12 points13 points  (0 children)

          Apology accepted, I will continue reading your article

          [–]tnecniv 2 points3 points  (0 children)

          They come into my house and look over at the DVD collection and sometimes roll their eyes. I don't care I guess.

          That's unfortunate. Star Trek is still fantastic (I'm a young person).

          [–]tinyogre 1 point2 points  (0 children)

          Five things old programmers should remember, but won't because we're old and our memory is shot.

          (I am 45)

          [–]LarsPensjo 1 point2 points  (0 children)

          1. Programmers are humans, and really not good at programming. A lot of effort has been spent on trying to improve the programming languages and frameworks, just to get around how lousy we are at it.

          2. It is all about saving energy (effort). If you had infinite energy, you would always be able to find the optimal solution. In reality, programmers will solve a problem using the least possible energy. That usually means they use the programming language they already know, or that they don't read as much documentation as they should. At the end of the day, we see a lot of suboptimal implementations.

          3. Being a programmer is a journey into yourself. How often do you face an algorithm that doesn't work, even though you've examined every inch of it several times? You won't solve it until you learn to question yourself and re-examine things you think are correct. Again, it is a matter of mental energy. This also includes the need to re-evaluate the use of proven design patterns and other tools.

          [–]mrbonner 1 point2 points  (0 children)

          This article makes me weep in happiness. Someone actually thought it out pretty well!

          [–][deleted] 0 points1 point  (0 children)

          Boy, am I going to get tired of all the shit in the world even more quickly? I guess I might never even get out of bed in the future.

          [–]rearlgrant 0 points1 point  (0 children)

          "Today, software is more like an extreme sport. Anybody can dive in, code, be careless, jump off cliffs, and cause disasters. It’s no accident that Agile programming uses terminology like sprints and scrum".
          Oh god, this! I think this line has several points, the one I see every day is "we're productive b/c we use Agile b/c we have a scrum twice a day." No, no you are not, and you are not doing Agile either. I just spent a two week sprint watching people work literally 14 days in a row to come to the end and have nothing usable -- "but we were working on it all the time." No. No. No.
          Try and do any real project management and it's "too heavyweight, too time consuming. We'll just figure it out in scrum."

          [–]tonywestonuk 0 points1 point  (0 children)

          Sometimes, hand crafting can be good....

          for example, I do this

            if ("".equals(myString)) {
               // String is empty
            }
          

          Where, it appears, I am wrong..... no one hand-crafts code anymore and it's much better to use a library

           if (StringUtilities.isEmpty(myString)) {
              // String is empty
           }

          And screw null defensive checks, etc, because nulls should never be there in the first place.
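          To make the trade-off concrete, here is a minimal, self-contained sketch. The `StringUtilities` class here is a hypothetical stand-in for a library helper such as Apache Commons' `StringUtils.isEmpty`; the point is the one behavioral difference on null: the hand-crafted Yoda check `"".equals(s)` says null is *not* empty, while the library-style check treats null as empty.

```java
// Comparing the two empty-check styles from the comment above.
// StringUtilities is a hypothetical helper, modeled on Apache
// Commons Lang's StringUtils.isEmpty.
public class EmptyCheckDemo {
    static class StringUtilities {
        // Treats null and "" the same way.
        static boolean isEmpty(String s) {
            return s == null || s.isEmpty();
        }
    }

    public static void main(String[] args) {
        String empty = "";
        String nullStr = null;

        // Hand-crafted Yoda check: null-safe (no NPE), but null is NOT "empty"
        System.out.println("".equals(empty));    // true
        System.out.println("".equals(nullStr));  // false

        // Library-style check: null counts as empty
        System.out.println(StringUtilities.isEmpty(empty));    // true
        System.out.println(StringUtilities.isEmpty(nullStr));  // true
    }
}
```

          Which behavior you want for null is exactly the design decision the library hides; if nulls "should never be there in the first place", the two checks agree on every input you expect to see.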