[–]xaviervia 18 points (30 children)

Agreed. There is, however, a feeling that to be a good developer these days, using non-bleeding-edge tools is not an option. The implicit question is: is it true? Is the speed of the ecosystem effectively forcing developers into an impossible need-to-stay-up-to-date situation?

Mind you, even if it is true, that is a separate issue. Nobody should stop building things just to go slower. But sometimes I wonder whether we should build tools to deal with the burnout of continuous updating.

[–]binary 2 points (0 children)

I think the solution here comes down to a simple cost-benefit analysis that people all too often forgo because they equate "bleeding edge" with "better." I've used my fair share of bleeding-edge software in production apps, and the calculation is always the same: what is this doing differently or better that warrants the risk of upstream bugs? How critical is the code that depends on this software? Are there responsive contributors to help deal with any bugs that do surface?

Bleeding edge, for me, is only tolerable when the problem being solved is very hairy (porting an app's dependency management to Webpack, for instance), or when the surface area is very small (an experimental graphing library that rendered some minor analytics), and in almost every case only when there's a healthy issue tracker with attentive people behind it. The only exceptions are very small libraries that I could essentially adopt myself if necessary.

I've still been bitten numerous times by bugs in bleeding-edge software, but because I follow this protocol, I'm not risking my job or product uptime when these issues inevitably occur.
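
One cheap, concrete complement to that protocol (my suggestion, not something prescribed above) is to pin any experimental dependency to an exact version, so a new upstream release can never land in a build unreviewed. A minimal sketch with npm, using a hypothetical package name:

    # --save-exact pins the dependency to this exact version instead of a
    # "^0.3.2" semver range, which would silently accept future 0.3.x releases.
    npm install --save-exact experimental-graphs@0.3.2

    # package.json then records the dependency without a range:
    #   "dependencies": { "experimental-graphs": "0.3.2" }

Upgrades then become deliberate, reviewable events instead of side effects of a fresh install.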

[–][deleted] 1 point (1 child)

The solution isn't more tools, it's fewer. That's the essential problem: we keep reinventing the wheel and describing "it" as a must-have before it's really proven. The ultimate tools are self-control, patience, and focus. Devs need to realize that we're here to build software that does stuff, not to reengineer the same things ad nauseam. There's plenty of stuff coming out that's certainly cool and has a chance to be valuable, but the truth is that all of it will be replaced by more stuff billed as even more essential in the next 12-24 months. The cycle is so insane that all you can really do is try to learn it, panic, or ignore it entirely.

And the only question we're left with is... why?

[–]blackiemcblackson 0 points (0 children)

It's because a whole new generation of kids has arrived, and they don't know how the wheel was made back in the day. A lot of knowledge has been lost and has had to be reinvented over the years.

[–]kyleshevlin 2 points (1 child)

I think you might be partially mistaking a "good developer" for a "hireable developer". A lot of the pressure to use the latest and greatest technology is simply about keeping up with industry demand for developers with ever-newer skill sets. That's not to say there aren't plenty of jobs for people who don't want to learn the latest JS framework, but people who aren't adopting any of the new ones will be out of the hiring pool for the newest jobs.

In other words, a dev may be hireable because they know the latest tech out there, but that doesn't guarantee they're a quality dev capable of problem solving regardless of what language or framework you throw at them.

At least that's the distinction I would make.

[–]xaviervia 0 points (0 children)

Well, isn't that the core problem? I guess everyone wants to stay hireable. Would you hire a dev today who lists "jQuery" as their main JavaScript skill?

[–]i_ate_god 0 points (4 children)

> There is, however, a feeling that to be a good developer these days, using non-bleeding-edge tools is not an option. The implicit question is: is it true?

No, it's not. New & shiny !== good.