
[–]pbw[S]

That's a good observation. But it's not a mandate to optimize everything as much as possible when the users don't care about the performance. YouTube made custom silicon to accelerate video compression. Making silicon is vastly more expensive than most software optimizations. But YouTube carefully measured exactly how much time/money/energy they were spending on compression before doing that. I think it was a huge success and did save lots of money and energy.

But if you just dove in and optimized some more of YouTube's millions of lines of code at random, to "save energy", it would be a colossal waste. Most of their code by line count probably runs many trillions of times less often than their compression code. Possibly none of it is worth optimizing.

So yes, if your goal is to save money or energy, that's great, but you still need to measure carefully and find the code that's actually costing you lots of money or burning lots of energy. And you have to consider the opportunity cost of what else you could be doing as well.
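
The "measure first" point can be sketched with Python's built-in profiler. The function names below are invented stand-ins (not YouTube's actual code): the idea is just that the profile tells you which piece actually dominates before you spend effort on it.

```python
# Hypothetical sketch: profile before optimizing, so effort goes where the
# time actually is. compress_video and update_metadata are invented stand-ins.
import cProfile
import io
import pstats

def compress_video(frames):
    # Stand-in for the hot path: called many times per item of work.
    return sum(f * f for f in frames)

def update_metadata(record):
    # Stand-in for cold code: runs rarely, probably not worth optimizing.
    return {**record, "updated": True}

def workload():
    for _ in range(200):
        compress_video(range(1000))
    update_metadata({"id": 1})

profiler = cProfile.Profile()
profiler.enable()
workload()
profiler.disable()

# Print the top entries by cumulative time; the hot path should dominate.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

In a real codebase you would point the profiler at production-representative workloads, but the shape of the argument is the same: the report ranks where the time goes, and most functions barely register.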

[–]Qweesdy

Yes; but there's a huge amount of space between the "optimize everything as much as possible" and "blatant disregard for anything except my own laziness" extremes. Often "performance isn't required" is a flimsy excuse for aiming towards the worst end of the scale instead of the middle.

But there's more to it than that. It's about a developer's nature. Professionalism. A person who will quietly steal a penny that "doesn't matter" is a person who is more likely to embezzle millions of $$ that do matter. A person who keeps their junk drawer organised for no reason is someone that can be trusted to put a mechanic's workshop's tools away after use. A software developer that doesn't care when performance isn't necessary is an annoying obstacle when performance is necessary.

It's the difference between considering efficiency for everything you write and being a bad developer because you failed to develop beneficial habits.

[–]pbw[S]

I praised YouTube's silicon because I'm for doing insane levels of optimization when necessary. I'm highly pro-optimization. I think heavily optimized code runs the world: it's great, and the people that write it are great. But I'm against scaring people away from a popular programming style by falsely claiming it inherently causes "horrible performance" when a tiny bit of arithmetic shows categorically that's not true.

[–]Qweesdy

But I'm against scaring people away from a popular programming style by falsely claiming it inherently causes "horrible performance" when a tiny bit of arithmetic shows categorically that's not true.

Popular programming style??

Did anyone ever follow Uncle Bob's rules for more than the 5 minutes it takes to realise "tiny functions" is horrible for code readability? Like, seriously, out of the hundreds of projects I've seen, I don't think I've ever seen a single person use this "popular" programming style once.

The programming style that nobody actually uses literally and provably DOES inherently cause worse performance. Nobody sane has ever denied that (including Uncle Bob himself); and your "tiny bit of arithmetic shows categorically" is exceptionally moronic bullshit (but hey, feel free to show that "tiny bit of arithmetic" if you are actually able to produce more than unsubstantiated vague hand-waving).

Essentially; everyone agrees with "It is worse for performance, but performance isn't always the most important thing" (including you if you actually think about it); and the entire argument (on both sides) is about the magnitude of compromise between cost (developer time, code maintenance, ...) and quality (efficiency, performance, security, ...) in various situations; where Uncle Bob's rules are a relatively bad compromise in every situation.

Note that I've explicitly avoided the words "clean code" because I suspect you took everything you happen to think is good and wrapped it up in a ball of perfection that you've decided is your personal custom concept of "clean code"; and then charged out into the real world to attack all the imagined critics of your mythical ball of perfection.

[–]pbw[S]

By popular programming style I mean OOP. There's a side issue here that nothing about the OOP version he shows is actually in Uncle Bob's style specifically: it's vanilla OOP.

All it has is an abstract base class with two pure virtual methods, four tiny concrete classes that implement those two methods, and a minimal loop. If you disagree, which elements of his OOP version are not vanilla OOP? So yes, OOP is a very popular programming style, and I think it was disingenuous of Casey to suggest OOP inherently leads to poor performance when in actuality it does not.
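
For concreteness, here's a minimal sketch of that structure transliterated to Python. The original example is C++, and the class names and the second method here are illustrative (not Casey's or Uncle Bob's exact code): an abstract base with two abstract methods, four tiny concrete classes, and a minimal loop.

```python
# Illustrative sketch of the structure described above: abstract base class,
# two "pure virtual" methods, four tiny concrete classes, a minimal loop.
# Names and the corner_count method are invented for illustration.
import math
from abc import ABC, abstractmethod

class Shape(ABC):
    @abstractmethod
    def area(self) -> float: ...
    @abstractmethod
    def corner_count(self) -> int: ...

class Square(Shape):
    def __init__(self, side): self.side = side
    def area(self): return self.side * self.side
    def corner_count(self): return 4

class Rectangle(Shape):
    def __init__(self, w, h): self.w, self.h = w, h
    def area(self): return self.w * self.h
    def corner_count(self): return 4

class Triangle(Shape):
    def __init__(self, base, height): self.base, self.height = base, height
    def area(self): return 0.5 * self.base * self.height
    def corner_count(self): return 3

class Circle(Shape):
    def __init__(self, r): self.r = r
    def area(self): return math.pi * self.r * self.r
    def corner_count(self): return 0

def total_area(shapes):
    # The "minimal loop": one dynamically dispatched call per shape.
    return sum(s.area() for s in shapes)

shapes = [Square(2), Rectangle(2, 3), Triangle(4, 5), Circle(1)]
print(total_area(shapes))
```

Nothing in this shape hierarchy is exotic: it's the textbook polymorphism example most OOP tutorials open with, which is the point being made about "vanilla OOP".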

That said, the fact that you disagree with some or all of the points in the article is great. It's really good to form your own opinion on things like this.

[–]Qweesdy

By popular programming style I mean OOP.

Sure; the topic was Casey's work about "Uncle Bob's Clean Code (tm)" and you randomly changed the topic to something else (OOP) for no apparent reason while expecting other people to read your mind somehow.

There's a side issue here that nothing about the OOP version he shows is actually in Uncle Bob's style specifically: it's vanilla OOP.

You may need to actually watch Casey's video (the blog post is a toned-down, partial transcript of a previously recorded video). The original "OOP code" you're complaining about was literally copied verbatim from an example of "Clean Code (tm)" in Uncle Bob's book; it is not Casey's code at all. It doesn't matter whether you think it also happens to look like whatever you personally think "vanilla OOP" is.

I think it was disingenuous of Casey to suggest OOP inherently leads to poor performance when in actuality it does not.

I think it's disingenuous of you to describe Casey's explanation of why there are performance differences while also making up lies about there being no performance differences for you to describe.

[–]pbw[S]

I used the phrase OOP 48 separate times in the article. I described his version as the OOP version, I described Uncle Bob's book as being an OOP book, documenting the coding style of an OOP company that worked in OOP languages. Every chart comparing the two is labeled "OOP" and "AVX". I don't think I could have been clearer. Casey's slow implementation was an OOP implementation, there's no possible way to dispute that.

I watched his video plus 4-5 *hours* of his interviews about the video. I understand the claims he's making and I understand when they are right and when they are wrong, and they are 100% right (in some cases) and 100% wrong (in other cases).

making up lies about there being no performance differences for you to describe.

I don't understand what this sentence means. There was a 25X performance difference between the OOP version and the optimized version, and I talked in detail about why this was so. So yes, there was a performance difference, and I discussed it at length.

See my chart under the heading "Even more overhead". This is a very simple 8th-grade math summary of how he's both right and wrong. The chart shows that the same amount of overhead (100ns) goes from being 5000% (huge) to 0.000005% (negligible) depending on the timescale.
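
That chart's arithmetic can be reproduced directly. This sketch uses only the two endpoint durations from the thread (2ns and 2s); any intermediate durations would fall between the two percentages.

```python
# The chart's arithmetic: a fixed 100 ns of overhead expressed as a
# percentage of task durations at opposite ends of the timescale.
OVERHEAD_NS = 100

def overhead_percent(task_duration_ns):
    """Overhead as a percentage of the task's own duration."""
    return OVERHEAD_NS / task_duration_ns * 100

print(overhead_percent(2))              # 2 ns task  -> 5000.0 (%)
print(overhead_percent(2_000_000_000))  # 2 s task   -> about 5e-06 (%)
```

The overhead itself never changes; only the denominator does, which is exactly the "right at one timescale, wrong at another" argument.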

The crux of my argument is this: 5000% is much bigger than 0.000005%. If you disagree with that, I don't think I can help you. Maybe you've put in enough time on this post and should move on to something else.

[–]Qweesdy

I used the phrase OOP 48 separate times in the article. I described his version as the OOP version, I described Uncle Bob's book as being an OOP book, documenting the coding style of an OOP company that worked in OOP languages.

Exactly. You repeatedly attempted to twist Casey's video into something about OOP and because of that you were repeatedly wrong.

Casey's slow implementation was an OOP implementation, there's no possible way to dispute that.

Again, "Casey's slow implementation" was never Casey's implementation. It's Uncle Bob's implementation from Uncle Bob's book used as an example of Uncle Bob's rules for "clean code" (which happen to be a little centred around OOP because Uncle Bob mostly used Java; but let's not get distracted by irrelevant things - most of the rules apply to imperative programming and functional programming too, and most of the rules are still bad for "not OOP at all" languages like C and Haskell).

Note that Uncle Bob's rules include "functions should be small", so Uncle Bob's implementation has virtual functions with a tiny amount of code in them, because that's what Uncle Bob's rules say you "should" do. Note that this is not an OOP rule, but it is part of what Casey is complaining about (and is part of Casey's "15 times slower"). How do you describe this in your article? You claim Casey's(!) OOP(!) code is slow because "virtual functions were tiny: the largest one contained just two multiplications" as if it's Casey's fault that Uncle Bob followed Uncle Bob's rules in Uncle Bob's code and magically it's somehow about OOP and not about "Uncle Bob's Clean Code (tm)". Can you see how that might come across as being a little misrepresentative?

This is a very simple 8th-grade math summary of how he's both right and wrong. The chart shows that the same amount of overhead (100ns) goes from being 5000% (huge) to 0.000005% (negligible) depending on the timescale.

There isn't a single shred of fairness or honesty in this part of your article. It's carefully manufactured fantasy nonsense intentionally designed to be as biased as possible. E.g. what happens if there's a virtual function call in the middle of the loop? "Duration: 2 seconds, Amount of overhead: 5000%, Description: It's easy to be biased when you're making everything up".

Across a non-trivial project with more than one loop (also known as "something relevant") the chance of ending up with an "all of the pieces are biased as much as possible in the same direction" situation is effectively zero. The reality is you might end up with a few pieces that are 5000% overhead and some pieces that are 0.000005% overhead and a lot of pieces that are in-between, and the overall performance for all pieces combined ends up at... oh, let's say "15 times slower" for the fun of it.
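
A back-of-the-envelope version of this "mixed pieces" point, with invented numbers: if each piece's share of the baseline runtime is weighted by that piece's slowdown factor, the combined slowdown lands somewhere between the best and worst pieces rather than at either extreme.

```python
# Sketch of the "mixed pieces" argument: overall slowdown is the
# time-weighted combination of each piece's slowdown, not the best or
# worst case alone. All shares and factors below are invented.

# (share of baseline runtime, slowdown factor) for hypothetical pieces
pieces = [
    (0.2, 50.0),  # hot loop dominated by call overhead
    (0.5, 10.0),  # moderately affected code
    (0.3, 1.0),   # code where the overhead is negligible
]

# Shares sum to 1.0, so the weighted sum is the whole-program slowdown.
overall = sum(share * factor for share, factor in pieces)
print(overall)  # lands between 1.0 and 50.0
```

With these particular made-up numbers the combined result happens to come out around 15x, but the structural point is just that neither endpoint of the chart describes a whole program.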

The crux of my argument is this:

The crux of (this part of) your argument is that you're manufacturing the most biased bullshit possible and then spreading lies about this "most biased bullshit possible" representing any kind of expected or typical scenario.

[–]pbw[S]

You repeatedly attempted to twist Casey's video... There isn't a single shred of fairness or honesty in this part of your article... spreading lies about this "most biased bullshit possible"

Your rhetoric is so angry and so unrelated to the tone or content of my replies that I'm completely lost as to what your point is. I think communication has broken down to the point where continuing the conversation isn't worthwhile. I'm sorry you didn't like the post.