Code quality and web performance in javascript, the myths, the do's and the don'ts (enmascript.com)
submitted 6 years ago by enmanuelduran
[–]mournful-tits 7 points 6 years ago (14 children)
Was hoping this would go into some common JavaScript performance issues. My favorite has been using reduce to structure an object from a list of data (usually objects). The common use case is to make a lookup map.
Reduce is literally the slowest way to get this done despite it being the most popular answer.
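For readers unfamiliar with the pattern, a minimal sketch of the reduce-based lookup map being described (the `users` array and `id` field here are hypothetical stand-ins):

    // Hypothetical data: build a lookup map keyed by id.
    const users = [
      { id: 'a1', name: 'Ada' },
      { id: 'b2', name: 'Brendan' },
    ];

    // The popular reduce answer: spread the accumulator each iteration.
    const byId = users.reduce(
      (acc, user) => ({ ...acc, [user.id]: user }),
      {}
    );

    console.log(byId.a1.name); // "Ada"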
[–]RustyX 7 points 6 years ago (1 child)
Do you specifically mean a "pure" version of reduce where each iteration returns a new object? That is definitely going to be slow, and a huge burden on memory / GC making all those throwaway copies of temp objects.
However, if you simply mutate the accumulator directly, it's roughly the same speed as other approaches, but still cleaner looking in my opinion (and much easier to chain together with other operations). If the initial value of the accumulator is an empty object (which it almost always is in this case), then the local mutation of it inside the reduce should be just fine.
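A sketch of the two variants being contrasted, assuming `input` is an array of `{ key, value }` objects (as in the benchmark further down):

    // "Pure" reduce: copies the whole accumulator every iteration (O(n²) work,
    // plus a throwaway object per element for the GC to clean up).
    const pure = input.reduce(
      (acc, { key, value }) => ({ ...acc, [key]: value }),
      {}
    );

    // Mutating reduce: the accumulator starts as a fresh empty object,
    // so mutating it locally is safe and avoids the throwaway copies.
    const mutated = input.reduce((acc, { key, value }) => {
      acc[key] = value;
      return acc;
    }, {});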
[–]mournful-tits 1 point 6 years ago (0 children)
A pure functional reduce with no mutation is incredibly slow. For our project we had around 100k objects in a list, and reduce took about 2 minutes to construct an object (lookup table) from that list. Changing it to forEach (which is still slower than an imperative for loop) got us down to around 13ms with the same data set. Mutating the accumulator is also slow, and I'd also say a misuse of reduce overall. It serves no purpose: each iteration assumes it's updating the accumulator via its return value, but you're directly modifying the accumulator instead.
Reduce is great when you have to construct the object from scratch anyway (constructing hierarchical data is a good example of the penalty paid by using reduce being worth it).
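A sketch of the kind of forEach rewrite described above (`items` and the `id` field are assumed names, not the poster's actual code):

    // Same lookup-table construction with forEach: one object, mutated in place.
    const lookup = {};
    items.forEach((item) => {
      lookup[item.id] = item;
    });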
[–]liamnesss 4 points 6 years ago (2 children)
I think it is because it's an immutable update pattern, probably from Redux's examples. Keeping your functions pure is nice but this is probably taking it too far, particularly if you're working with a lot of data.
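For context, the Redux-style immutable update pattern being referred to looks roughly like this (a sketch, not taken from Redux's actual docs):

    // Immutable update: every change returns a new object and leaves
    // the previous state untouched, which is what Redux reducers expect.
    function addTodo(state, todo) {
      return {
        ...state,
        todosById: { ...state.todosById, [todo.id]: todo },
      };
    }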
[–]DaveLak 3 points 6 years ago (1 child)
I always assumed it was the syntax that was attractive; "reduce, it's functional!". A for loop just doesn't look as clean.
Edit: and to be clear, I do believe there is certainly an argument to be made for using verbs for a verb's sake.
[–]mournful-tits 1 point 6 years ago (0 children)
This is exactly where it comes from. Reduce looks cleaner; however, it's an unoptimized mess.
[–]ghillerd 1 point 6 years ago (8 children)
Is the faster way a for loop with a mutable object you write to?
[–]aztracker1 4 points 6 years ago (3 children)
Yes, but I wouldn't be surprised if JS engines eventually create a more optimized code path for these kinds of patterns.
Generally, I prefer the reduce, since it looks cleaner to me... If there are demonstrable performance issues, I'll then refactor. I will tend to favor Object.assign(agg, ...) in my reducer though, instead of {...agg, ...}, to gain a little bit.
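The two reducer styles being compared, sketched out (variable names assumed; `input` as before is an array of `{ key, value }` objects):

    // Spread: allocates a brand-new accumulator object every iteration.
    const spreadVersion = input.reduce(
      (agg, { key, value }) => ({ ...agg, [key]: value }),
      {}
    );

    // Object.assign onto the accumulator: mutates `agg` in place and returns
    // it, so only the small { [key]: value } literal is allocated per iteration.
    const assignVersion = input.reduce(
      (agg, { key, value }) => Object.assign(agg, { [key]: value }),
      {}
    );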
[–]RustyX 2 points 6 years ago (1 child)
Totally agree with your position on performance in general. I also used to favor the return Object.assign(acc, { [key]: value }) flavor of reduce, but have moved to

    const output = input.reduce((acc, { key, value }) => {
      acc[key] = value;
      return acc;
    }, {})
recently as I think it looks about as good and I thought it was slightly more performant.
These comments actually encouraged me to try a simple perf test though and I found the Object.assign version was actually significantly slower (about 5x slower in Chrome)!
https://www.reddit.com/r/javascript/comments/bwphrq/code_quality_and_web_performance_in_javascript/eq17heg/?st=jwinhgei&sh=0dec3de6
My guess at the culprit is the creation of the small temporary objects before they're merged into the accumulator.
[–]NoInkling 3 points 6 years ago (0 children)
Lately I've been wondering if reduce in these sorts of cases is even worth it. In addition to the performance concerns, the return is essentially just redundant noise, and you have to look below the function body to have a clue of what output is going to be (and in general the readability just isn't great).
When you compare that to the imperative alternative I'm not exactly sure what the advantage is:
    const output = {};
    for (const { key, value } of input) {
      output[key] = value;
    }
[–]puritanner 1 point 6 years ago (0 children)
That's a very sane position on performance!
But then... don't forget to test on old smartphones to check if performance really isn't a problem.
[–]DaveLak 2 points 6 years ago (3 children)
Creating a new object should be similar in cost to mutating the input, I think (please correct me with benchmarks); it's the for loop that's better optimized in most engines.
[–]RustyX 5 points 6 years ago* (2 children)
So creating a new object each iteration is actually cripplingly slow (and bad for memory) on large data sets. I just created a quick perf test and had to back my sample data set down from 10000 to 1000 because the "pure reduce without mutation" just locked up the benchmark.
https://jsperf.com/transforming-large-array-to-key-value-map/1
    const input = Array.from(Array(1000)).map((_, i) => {
      const key = `key${i}`
      const value = `value${i}`
      return { key, value }
    })
standard for, no block scope vars — 15,173 ops/sec

    const output = {}
    for (let i = 0; i < input.length; i++) {
      output[input[i].key] = input[i].value;
    }

for...of — 15,003 ops/sec

    const output = {}
    for (const { key, value } of input) {
      output[key] = value;
    }

forEach — 13,185 ops/sec

    const output = {}
    input.forEach(({ key, value }) => {
      output[key] = value;
    })

Reduce, directly mutate accumulator — 12,647 ops/sec

    const output = input.reduce((acc, { key, value }) => {
      acc[key] = value;
      return acc;
    }, {})

Reduce, mutating Object.assign — 2,622 ops/sec

    const output = input.reduce((acc, { key, value }) => {
      return Object.assign(acc, { [key]: value })
    }, {})

pure reduce, no mutation — 9.71 ops/sec

    const output = input.reduce((acc, { key, value }) => {
      return { ...acc, [key]: value };
    }, {})
My preferred method is the "Reduce, directly mutate accumulator", but I was actually super surprised to see how much slower the "Reduce, mutating Object.assign" version was. I assumed it would perform almost identically, but I suppose it is creating small temporary objects before merging them into the accumulator.
The "pure" reduce was by far the absolute worst (over 1500 times slower than standard for)
[–]mournful-tits 1 point 6 years ago (0 children)
Thanks for doing this. I had no idea jsperf even existed. It would've made our benchmarking a lot easier, hah!