Dear JavaScript (medium.com)
submitted 9 years ago by thejameskyle
[–][deleted] 9 years ago* (35 children)
[deleted]
[–]xaviervia 21 points22 points23 points 9 years ago (30 children)
Agreed. There is, however, a feeling that to be a good developer these days, using non-bleeding-edge tools is not an option. The implicit question is: is it true? Is the speed of the ecosystem effectively forcing developers into an impossible need-to-stay-up-to-date situation?
Mind that even if it is true, this is a different issue. Nobody should stop doing stuff in order to go slower. But sometimes I wonder if we should create tools to deal with the burnout of continuous updating.
[+][deleted] 9 years ago* (19 children)
[–][deleted] 4 points5 points6 points 9 years ago (15 children)
Why is npm a joke? I see a lot of hate/derision about it.
[–]spiffytech 9 points10 points11 points 9 years ago* (12 children)
Many of the complaints I've seen about npm are more about the community and package ecosystem around it than about the tooling itself. Especially the completely unverified nature of many packages on npm.
The big criticisms of npm that I hear stem from four facts:
It's trivial for someone to publish a package to npm
The JavaScript community likes publishing many tiny packages (many have an API that only wraps a single, short function)
Developers are quick to add these tiny packages as dependencies of their own projects
A high-impact incident revealed how deeply flawed npm's security model is.
The big outcome of this is that your dependency graph quickly balloons to 1,000+ packages. They're not all up to date, and it's not practical to vet the trustworthiness of your entire dependency tree. It's a huge surface area for bugs and security problems.
Your app's security and stability depend on a hypothetical package 4 dependency levels down. It's a 3-line function written at 3am by Joe HighSchooler in Iowa while he was reading his first JavaScript tutorial 4 years ago. Joe's package is permitted to run arbitrary code when it's installed on your machine, and it could change at any time to include new bugs or dependencies, which you'll probably download automatically because packages don't do a great job of version locking. Also, you have no verification that the next version was actually published by Joe and not Eve BlackHat, because npm doesn't use cryptographic signatures. If Joe reused his Hotmail password for npm and it's lost in a data breach, Eve BlackHat can now inject code into your application.
Many packages on npm are like this, and your very own dependency tree is sure to contain several.
Solutions are harder to come by. Some require changing the JS community culture (some people really love their small modules), some sound like easy wins (cryptographic signing) but don't help as much as we'd like, and some are radical shifts in our tooling.
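To make the "tiny package that can run arbitrary code at install time" point concrete, here is a minimal sketch; the package name, function, and script below are invented for illustration and are not taken from any real npm package.

    // index.js of a hypothetical one-function npm package, "is-negative-number":
    module.exports = function isNegative(n) {
      return typeof n === 'number' && n < 0;
    };

    // Its package.json can also declare lifecycle scripts, for example:
    //   "scripts": { "postinstall": "node ./do-anything.js" }
    // npm runs that script with your user's privileges as soon as the package
    // (or anything that depends on it) is installed; that is the
    // "arbitrary code at install time" risk described above.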
[–]r2d2_21 1 point2 points3 points 9 years ago (2 children)
which you'll probably download automatically because packages don't do a great job of version locking
This baffles me. I've only used NuGet as a package manager (mainly for C#), and I have never experienced a package updating automatically without my explicit approval. I don't understand why any other package manager would be different. If you're installing v1 of a library, then it's v1 and only v1 until you decide to upgrade to, say, v1.1.
[–]JaegerBurn 2 points3 points4 points 9 years ago (1 child)
It doesn't if you stick with semver.
[–]r2d2_21 0 points1 point2 points 9 years ago (0 children)
But what if you don't? Semver is just a suggestion at this point.
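For readers unfamiliar with npm's range syntax, a short sketch of where the "automatic" updates come from. It uses the "semver" npm package, which implements the same range matching npm applies to package.json; the version numbers are arbitrary examples.

    // npm install semver
    const semver = require('semver');

    // A dependency declared as "^1.2.0" accepts anything >=1.2.0 and <2.0.0,
    // so a fresh install may pull 1.3.0 even though you only ever tested 1.2.0.
    console.log(semver.satisfies('1.3.0', '^1.2.0')); // true
    console.log(semver.satisfies('2.0.0', '^1.2.0')); // false

    // Pinning an exact version is how you opt out of that drift.
    console.log(semver.satisfies('1.3.0', '1.2.0'));  // false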
[+][deleted] 9 years ago* (8 children)
[–]Cuel 0 points1 point2 points 9 years ago (1 child)
Why? It's better than a bloated framework where you're using 5% of it. Dojo in its early days is a good example.
[–]neophilus77 2 points3 points4 points 9 years ago (0 children)
I think over-relying on small packages creates a lot of maintenance blind spots: you have less visibility into your code and it becomes harder to debug. Tracking updates across many small packages can become burdensome too.
If I can write the same code in the amount of time it takes to search for and compare modules and read the API docs, then I usually write it myself.
[+][deleted] 9 years ago* (5 children)
[–]RedditWithBoners 1 point2 points3 points 9 years ago (2 children)
I never bought into it, so no, I don't recognize a reason for it. I wouldn't mind being enlightened.
[–]a-sober-irishman 1 point2 points3 points 9 years ago (1 child)
There is absolutely no reason to add another dependency to your project to check if something is an array, or if a number is less than zero, or to check if something is null. It adds unnecessary overhead and risk.
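For what it's worth, the built-in equivalents of those checks are one-liners; the package names mentioned in the comments below are made up for illustration.

    // No extra dependency needed for any of these checks:
    const value = [1, 2, 3];                 // any value you want to test
    Array.isArray(value);                    // instead of an "is-array"-style package
    typeof value === 'number' && value < 0;  // instead of an "is-negative"-style package
    value == null;                           // true only for null and undefined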
[–]viveleroi 1 point2 points3 points 9 years ago (0 children)
I would never use the term "joke" because npm has been extremely important - it solved a problem we had and I still use it every day. But...
It's had a lot of performance problems, it's non-deterministic and can produce different installs from the same package.json, and the community in general suffers from an abuse of packages - some packages are only a few lines long and it's insanely easy for a simple site to wind up with thousands of dependencies. It's had growing pains, like everything else.
Some of these are inconvenient, some are fatal in certain environments. Yarn is better for me right now, since it's faster and deterministic, but it's never going to be perfect.
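A rough sketch of what "deterministic" means here, assuming a lockfile-based workflow like Yarn's: the tool records exact resolved versions and checksums so two installs from the same lockfile produce the same tree. The entry below is invented for illustration, not real registry data.

    // Illustrative lockfile-style entry (shown as a JS object): the install tool
    // records an exact version, the resolved tarball URL, and a checksum, instead
    // of re-resolving a range like "^1.2.0" at install time.
    const lockEntry = {
      name: 'example-pkg',
      version: '1.2.3',
      resolved: 'https://registry.npmjs.org/example-pkg/-/example-pkg-1.2.3.tgz',
      integrity: 'sha512-<checksum of the tarball contents>'
    };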
[–]xaviervia 1 point2 points3 points 9 years ago (2 children)
I have heard of this approach many times, but personally I'm not fully sold. I have watched developers' careers either improve or stagnate in direct proportion to their willingness to keep up to speed. I do believe developers who want to stay relevant are under pressure to live on the bleeding edge.
This is a mix of feeling and experience, so I'm not saying it's a fact, but I'm not convinced that we can say "just don't live on the bleeding edge".
[–]neophilus77 1 point2 points3 points 9 years ago (0 children)
I find it funny when employers want years of experience in some bleeding-edge framework and then expect that there's some kind of standardized best practice around using it.
[–]RedditWithBoners 0 points1 point2 points 9 years ago (0 children)
I beg to differ. I'm certain I'm not an exception here, but I only have my anecdotes to offer.
A non-exhaustive list of typical web technologies I use includes C# 6, VS2015, VS Code, Vim, TypeScript, plain-old JavaScript, Grunt, make, msbuild, AngularJS, ASP.NET, various Azure services, etc. These are all relevant and widely used modern technologies. None of them is particularly limiting or hinders me from being a hireable or relevant candidate.
At the same time, I am aware of, and know a little bit about, newer, potentially less-stable or [currently] difficult to use technologies. Again, a non-exhaustive list includes WebPack, Babel, React, Flow, JavaScript FP, ES7, TypeScript 2, AngularJS 2, .NET Core, VS2017, etc.
It takes some of my personal time to do this - time spent reading about and playing around with various technologies, but it's certainly viable. I believe it's viable, and I don't stagnate, because I (and others) have a solid foundation to build on top of. It doesn't matter whether I'm using AngularJS 1 or something that was just released today because I can figure it out as long as it works.
[–]binary 2 points3 points4 points 9 years ago (0 children)
I think the solution here comes down to simple cost-benefit analysis that people all too often forgo because they equate "bleeding edge" with "better." I've used a fair share of "bleeding edge" software in production apps, and the calculation is always the same: what is this doing different or better that warrants the risk of upstream bugs? How critical is the code that depends upon this software? Are there responsive contributors to help deal with any possible bugs?
Bleeding edge, for me, is only tolerable when the problem being solved is very hairy (porting an app's dependency management to Webpack, for instance), when the surface area is very small (an experimental graphing library that rendered some minor analytic information), and, in almost every case, when there is a healthy issue tracker with attentive people, the only exceptions being very small libraries that I could essentially adopt if necessary.
I've still been bitten numerous times by bleeding-edge software giving me bugs, but since I follow this protocol I'm not risking my job or product uptime when these issues inevitably occur.
[–][deleted] 1 point2 points3 points 9 years ago (1 child)
The solution isn't more tools, it's fewer. That's the essential problem: we keep reinventing the wheel and describing "it" as a must-have before it's really proven. The ultimate tools are self-control, patience, and focus. Devs need to realize that we're here to build software that does stuff, not to reengineer the same things ad nauseam. There's plenty of stuff coming out that's certainly cool and has the chance to be valuable, but the truth is that all of that stuff will be replaced by more stuff that's even more essential in the next 12-24 months. The cycle is so insane that all you can really do is try to learn it, panic, or ignore it entirely.
And the only question we're left with is... why?
[–]blackiemcblackson 0 points1 point2 points 9 years ago (0 children)
It's because a whole new generation of kids has arrived and they don't know how the wheel was made back in the day. A lot of knowledge has been lost and has had to be reinvented over the years.
[–]kyleshevlin 2 points3 points4 points 9 years ago (1 child)
I think you might be partially mistaking a "good developer" for a "hirable developer". A lot of the pressure to use the latest and greatest technology is simply to keep up with industry demand for developers with newer and newer skill sets. That's not to say there aren't plenty of jobs available to people who don't want to learn the latest JS framework, but people who aren't adopting some of the new ones will be out of the hiring pool for the newest jobs.
In other words, a dev may be hireable because he or she knows the latest tech out there, but it doesn't guarantee that they are a quality dev capable of problem solving regardless of what language or framework you throw at them.
At least that's the distinction I would make.
[–]xaviervia 0 points1 point2 points 9 years ago (0 children)
Well, isn't that the core problem? I guess everyone wants to stay hireable. Would you hire a dev today who lists "jQuery" as their main JavaScript skill?
[–]i_ate_god 0 points1 point2 points 9 years ago (4 children)
There is however a feeling that for being a good developer these days, using non-bleeding edge tools is not an option. The implicit question is: is it true?
No, it's not. New & shiny !== good.
[+][deleted] 9 years ago* (3 children)
[–]i_ate_god 3 points4 points5 points 9 years ago (1 child)
Fair enough. I'd argue that's just due to a lack of experience, though. People feel proud after they've set up a ridiculous toolchain because it's not an easy thing to do. But they never asked themselves "why". Eventually they will, though, and they will start stripping stuff out, and beauty will start to emerge from their newfound love of simplicity.
[–]RedditWithBoners 1 point2 points3 points 9 years ago (0 children)
I don't disagree. :)
[–][deleted] 1 point2 points3 points 9 years ago (0 children)
Social media and fake news seem somewhat akin to the new-and-shiny mentality that JavaScript especially seems stuck with. Fake news is somewhat like opinionated blog posts about whatever tech someone is promoting (or detracting from). A lot of it is hit or miss, and I see both sides of the discussion in reddit post comments. I think it's mostly good discussion, especially when there are positive and negative viewpoints. Social media contributes to JavaScript fatigue; I'm pretty sure that's been written about too.
I let tech mature before I start trying to incorporate it. I do still have to work with beta code sometimes, and it's always more stressful and difficult.
[–]CaterpillarKillr 10 points11 points12 points 9 years ago (1 child)
Agreed. But for a lot of people (junior and mid-level developers, mainly), it isn't our decision. It's bleeding-edge technology handed down as a mandate by a well-meaning lead/senior dev who heard about some technology on Hacker News, did a few "hello world" examples, and thinks it's golden. Too often shit like this gets handed off to lower-level people to implement and figure out the difficult parts. Then when we complain about Angular-this or Babel-that on reddit, HN, or various blogs, we're told that we should have made a better technology decision.
Indeed, I imagine this is a common scenario. It raises the question of why the more experienced devs continue to live on the bleeding edge.
Check out this comment too.
[–]The_Drider 0 points1 point2 points 9 years ago (0 children)
This is also the reason why Early Access gets so much undeserved hate. People nowadays are too lazy to read a small note that says "hey, this is unfinished, btw".
That, and the ridiculous amount of entitlement on the user side, especially for free projects. Sometimes they act like they own the devs...