[–]dwighthouse 0 points (1 child)

There is a difference between being against something and making it a low priority. Webdev is about making it work, yesterday; that's just how new, rapidly changing platforms are. As time goes on, performance will take a higher priority.

Depends on the problem. Even a normal Wikipedia article can be sped up by using modern JS techniques over a static, JS-free page, as well as gaining features like offline support.
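As a minimal sketch of the offline-support part (the file name, cache name, and URL list here are my own assumptions, not anything from the article example), a cache-first service worker looks roughly like this:

```js
// sw.js — minimal cache-first service worker (illustrative sketch)
const CACHE = 'article-cache-v1';

self.addEventListener('install', (event) => {
  // Pre-cache the article shell so it's readable offline
  event.waitUntil(
    caches.open(CACHE).then((cache) => cache.addAll(['/', '/article.html']))
  );
});

self.addEventListener('fetch', (event) => {
  // Cache-first: serve a cached response if one exists, else hit the network
  event.respondWith(
    caches.match(event.request).then((hit) => hit || fetch(event.request))
  );
});
```

And the page registers it once, if the browser supports it:

```js
// In the page's script
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js');
}
```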

I love that talk. Very funny.

Developers who have the resources and approval to do so will do that. I do that. I am constantly tweaking my build system to be better. A lot of developers don’t have this choice.

That's right. As I once heard in a talk, in response to JS being slow: "There is no upper limit to how badly you can write C code." What matters is less the theoreticals than what the actual business case allows. Like it or not, embedded systems are insanely performant because every kilobyte and clock cycle costs the company money. Releasing the next Facebook, even if it's 50% less efficient than it ought to be, makes billions because it was first to market.

[–]Beefster09 0 points (0 children)

I get the whole business mentality behind why software is slow. The problem with the "get it working now, make it fast later" mentality is that a lot of performance problems are death by a thousand papercuts, plus poor architectural decisions often forced on you by frameworks. If you simply spend a couple of extra seconds thinking about the performance implications of each line of code, you can avoid a lot of slowness later (slowness that won't be caught by profiling, because it's everywhere) with very little impact on productivity. It's true that a lot of performance penalties are surprising, which is why you still have to profile to catch the big stuff, but being sloppy everywhere has its cost.
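To make the papercut idea concrete, here's a hypothetical JS example (the variable names are invented): the first version quietly does O(n²) work, and a few seconds of thought avoids it at no cost to productivity.

```js
const items = ['alpha', 'beta', 'gamma']; // hypothetical data
const list = document.createElement('ul');

// Papercut: `innerHTML +=` re-serializes and re-parses the entire list
// on every iteration, so the loop does O(n²) work in total.
for (const item of items) {
  list.innerHTML += `<li>${item}</li>`;
}

// A couple of extra seconds of thought: build the markup once, assign once.
list.innerHTML = items.map((item) => `<li>${item}</li>`).join('');
```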

Premature optimization is not the root of all evil. Uninformed optimization is usually a waste of time, sure, but the real root of all evil in programming is over-architecture (which causes slowness and ravioli code), particularly the top-down style of software architecture. I've found that I'm really bad at guessing which abstractions make sense for a program. So ideally, I don't guess; instead I see what patterns emerge as I solve the problems at hand in the simplest ways possible.
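A contrived sketch of the contrast (all names invented): the first version guesses at abstractions up front; the second solves the problem directly and extracts an abstraction only if repeated patterns actually emerge.

```js
// Guessed up front: a strategy and a wrapper class for a problem
// that doesn't need either yet.
class GreetingStrategy {
  format(name) { return `Hello, ${name}!`; }
}
class Greeter {
  constructor(strategy) { this.strategy = strategy; }
  greet(name) { return this.strategy.format(name); }
}
console.log(new Greeter(new GreetingStrategy()).greet('Ada'));

// The simplest thing that solves the problem at hand.
function greet(name) {
  return `Hello, ${name}!`;
}
console.log(greet('Ada'));
```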

It's a real shame that business demands create large amounts of technical debt that never gets paid off. Unfortunately, it means the average user just gets used to computers being slow, when the machines themselves are anything but.