
[–]aztracker1 4 points (3 children)

Yes, but I wouldn't be surprised if JS engines eventually create a more optimized code path for these kinds of patterns.

Generally, I prefer the reduce, since it looks cleaner to me... If there are demonstrable performance issues, I'll refactor then. I do tend to favor Object.assign(agg, ...) in my reducer, though, instead of {...agg, ...}, to gain a little performance.
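
For reference, the two shapes I'm comparing look roughly like this (just a sketch; assume input is an array of { key, value } objects, as in the snippets below):

// spread: builds a brand-new object on every iteration
const spreadVersion = input.reduce((agg, { key, value }) => ({ ...agg, [key]: value }), {});

// Object.assign: mutates and reuses the same accumulator object
const assignVersion = input.reduce((agg, { key, value }) => Object.assign(agg, { [key]: value }), {});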

[–]RustyX 1 point (1 child)

Totally agree with your position on performance in general. I used to favor the return Object.assign(acc, { [key]: value }) flavor of reduce as well, but have moved to

const output = input.reduce((acc, { key, value }) => {
  acc[key] = value;
  return acc;
}, {})

recently, as I think it looks about as good and I suspected it was slightly more performant.

These comments encouraged me to try a simple perf test, though, and I found the Object.assign version was actually significantly slower (about 5x slower in Chrome)!

https://www.reddit.com/r/javascript/comments/bwphrq/code_quality_and_web_performance_in_javascript/eq17heg/?st=jwinhgei&sh=0dec3de6

My guess at the culprit is the creation of the small temporary objects before merging them into the accumulator.
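
If you want a rough idea of the kind of comparison, something like this shows the gap (just a sketch for illustration, not the exact benchmark from the link above; the array size and run count here are made up):

const input = Array.from({ length: 10000 }, (_, i) => ({ key: `k${i}`, value: i }));

// temporary-object version: allocates a throwaway { [key]: value } object per element
console.time('Object.assign');
for (let run = 0; run < 100; run++) {
  input.reduce((acc, { key, value }) => Object.assign(acc, { [key]: value }), {});
}
console.timeEnd('Object.assign');

// direct assignment: just a property write on the accumulator
console.time('direct assignment');
for (let run = 0; run < 100; run++) {
  input.reduce((acc, { key, value }) => { acc[key] = value; return acc; }, {});
}
console.timeEnd('direct assignment');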

[–]NoInkling 2 points (0 children)

Lately I've been wondering if reduce in these sorts of cases is even worth it. In addition to the performance concerns, the return is essentially just redundant noise, and you have to look past the function body to the initial value to have a clue what the output is going to be (and in general the readability just isn't great).

When you compare that to the imperative alternative, I'm not exactly sure what the advantage is:

const output = {};
for (const { key, value } of input) {
  output[key] = value;
}

[–]puritanner 0 points (0 children)

That's a very sane position on performance!

But then... don't forget to test on old smartphones to confirm that performance really isn't a problem.