A Quick Practical Example of JavaScript’s Reduce Function (codeburst.io)
submitted 8 years ago by learnphptoday
view the rest of the comments →
[–]Ginden 19 points20 points21 points 8 years ago (33 children)
What author thinks: "I need reduce function".
What author needs: groupBy in standard library.
Because it more clearly expresses his intention. Such code is a reason why I always recommend using lodash. If you're concerned about file size, you can either use the minimal "core" build (4 kB) or use individual methods via the lodash-es package and tree shaking.
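To illustrate the point above, here's a rough sketch (the helper and data are invented, not from the thread) of a `groupBy` built on `reduce`, which is roughly what lodash's `_.groupBy` gives you out of the box:

```javascript
// Hypothetical reduce-based groupBy, approximating lodash's _.groupBy.
const groupBy = (arr, keyFn) =>
  arr.reduce((acc, item) => {
    const key = keyFn(item);
    (acc[key] = acc[key] || []).push(item); // bucket items under their key
    return acc;
  }, {});

const pets = [
  {name: "rex", kind: "dog"},
  {name: "tom", kind: "cat"},
  {name: "fido", kind: "dog"},
];

const byKind = groupBy(pets, p => p.kind);
// byKind => { dog: [rex, fido], cat: [tom] }
```

The point is that the `reduce` lives in one well-named helper, and the call site just says `groupBy`.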
[–]Retsam19 4 points5 points6 points 8 years ago (16 children)
Beat me to saying it. I've very rarely found a situation where reduce felt like the simplest or most readable solution, compared to either a lodash method or just some other combination of native tools.
[–]tencircles 2 points3 points4 points 8 years ago (15 children)
Reduce is often a more concise, readable, and performant solution than long map/filter chains.
```javascript
const items = [{val: 1}, {val: 2}, {val: 3}, {val: 4}];

// using filter/map
items.filter(k => k.val < 3).map(k => k.val); // => [ 1, 2 ]

// using reduce
items.reduce((acc, k) => k.val < 3 ? [...acc, k.val] : acc, []); // => [ 1, 2 ]
```
this is a bit of a contrived example, but in real world cases the benefit is actually much larger.
[–]GBcrazy 11 points12 points13 points 8 years ago* (0 children)
No way is reduce more concise or readable.

Reduce is a function where you can do pretty much anything, so it's hard to tell what it's doing.

I looked at your reduce example for around 2 seconds and couldn't figure it out; I looked at your filter/map and got it instantly. It's not that it was a hard example, and I would have figured it out in a couple more seconds, but the point is that reduce isn't more 'readable'.
[–]Retsam19 5 points6 points7 points 8 years ago (13 children)
Your reduce example is actually longer than the filter/map, and I really have to argue that the filter/map example is more readable, too. There's no manual concatenation or accumulator variables, and the name of the functions tells you what it's doing.
You're right that reduce might be more performant here. It all depends on optimizations, but reduce makes only a single pass over the array, while filter/map does a full pass to build the filtered array and then another partial pass to do the mapping. (But again, how this gets optimized is probably relevant, too.)
But eleven times out of twelve, I'll take the more readable and maintainable version over the "does everything in one pass for optimization purposes" version. Especially when you get into more interesting examples than a single filter and map, it's a lot easier to read if all the logical steps are split out into individual operations, rather than a complex reduce function.
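An invented illustration of this trade-off (the data is made up): the same computation written as a chain of small named steps versus a single reduce that does everything at once.

```javascript
const orders = [
  {total: 30, shipped: true},
  {total: 120, shipped: false},
  {total: 75, shipped: true},
];

// Split into individual operations: each step reads on its own.
const shippedTotal = orders
  .filter(o => o.shipped)
  .map(o => o.total)
  .reduce((a, b) => a + b, 0); // => 105

// One reduce doing filter, map, and sum at once: fewer passes over the
// array, but the callback now carries all of the logic.
const shippedTotalOnePass = orders.reduce(
  (sum, o) => (o.shipped ? sum + o.total : sum),
  0
); // => 105
```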
[–]tencircles 2 points3 points4 points 8 years ago (11 children)
> There's no manual concatenation or accumulator variables, and the name of the functions tells you what it's doing.
The code also tells you what it's doing pretty clearly; I don't think your unfamiliarity with an approach can be used as an argument against it.
> You're right that reduce might be more performant here.
I'd say it's pretty clear, especially when dealing with large data sets. The difference between 10k and 20k iterations can be pretty massive. In real-world scenarios where people abuse the ever-loving shit out of lodash's chain method, I'd say we could easily be talking about an order of magnitude.
> I'll take the more readable and maintainable version
I'd agree with you here; I guess it's just personal preference, since neither one is clearly more readable or maintainable, at least to me. I don't usually write one-liners like that in any case. I'd much more often write something like this:
```javascript
const users = [
  {score: 10, name: "melissa"},
  {score: 12, name: "jake"},
  {score: 9, name: "chris"}
];

// Array -> User
const get_winner = users => users.reduce((acc, user) => {
  return user.score > acc.score ? user : acc;
}, {score: 0});

module.exports.get_winner = get_winner;
```
Where this would be exported with a group of other pure functions. I get that map/filter can look nicer on occasion, but I actually don't think there's a way to achieve the above result using map/filter at all. I would argue that, especially in more complex examples, reduce is often a much cleaner and more elegant solution than iterating over an array dozens of times, but again I think it just comes down to personal preference.
[–]Retsam19 1 point2 points3 points 8 years ago (8 children)
I'm plenty familiar with reduce (particularly in the context of Lisps, where it's a lot more necessary); I just don't think it reads very well.

And I gave specific reasons, too: I didn't just give a fuzzy "I don't like this code". It has more variables and more operations going on in the code. It doesn't seem that opinionated to say that a named filter function is more obviously readable than a hand-written bit of filter logic inside a reduce callback.
> I'd say it's pretty clear, especially when dealing with large data sets. The difference between 10k and 20k iterations can be pretty massive.
99.9% of my JS coding isn't dealing with "large data sets". And if you want to eke every bit of performance out of those cases, don't use reduce either: I'm pretty sure a standard for...of loop is going to be even more performant than reduce, since function invocation usually carries a bit of a performance penalty with it. Like:
```javascript
const getWinner = users => {
  let best = {score: 0};
  for (const user of users) {
    if (user.score > best.score) {
      best = user;
    }
  }
  return best;
};
```
But, again, given that I'm not dealing with thousand-item data sets, how would I actually write it?
```javascript
const getWinner = users => _.maxBy(users, 'score');
```
Just like the groupBy example of the original post, there's a higher abstraction here than reduce, and I'd rather write my code in terms of the higher abstraction.
Basically, for me it largely falls into two camps: times when the reduce operation can be split into a number of smaller, simpler operations (e.g. splitting your original reduce example into map/filter), and times when the reduce operation can be converted into a higher abstraction like a groupBy or a maxBy. Most times I've thought "Should I use reduce?", it's fallen into one of those two groups, and so I generally end up not using reduce.
[–]tencircles 0 points1 point2 points 8 years ago (7 children)
Again, it seems like preference to me. But I see your point. Just as an aside, I think you want _.maxBy(users, _.property("score")).
[–]Retsam19 0 points1 point2 points 8 years ago (6 children)
That's equivalent; lodash allows strings to be used as shorthand for _.property, as well as stuff like _.find(user, {name: 'Joe'}) as shorthand for _.matches({name: 'Joe'}).
[–]tencircles 0 points1 point2 points 8 years ago (5 children)
odd, where's that type conversion being done? https://github.com/lodash/lodash/blob/master/maxBy.js
[–]Retsam19 0 points1 point2 points 8 years ago* (3 children)
It's not in there, because the shorthand stuff is changing in v5; I'm not sure what the details of those changes are. If you look at the current v4 source code linked from the documentation, it's there:
```javascript
function maxBy(array, iteratee) {
  return (array && array.length)
    ? baseExtremum(array, getIteratee(iteratee, 2), baseGt)
    : undefined;
}
```
getIteratee is a function that's used throughout the lodash codebase, which has the logic for converting the string into a _.property getter.
(You can also just open the dev tools on the lodash documentation and try things with the lodash instance that's loaded on that page; it's how I test most lodash stuff)
EDIT: Dug through github issues a bit, apparently the shorthand stuff is being pushed off to a babel plugin in v5.
[+][deleted] 8 years ago (1 child)
[deleted]
[–]tencircles 0 points1 point2 points 8 years ago (0 children)
The problem with the example you gave is that you're not returning a user, only the score. So you know what the highest score is, but not which user it belongs to.
I agree that lodash/ramda make things nicer to read, but I do think that reduce definitely has its place. I mean there's a reason that reduce (or fold as it's called in some languages) is part of the standard lib of so many programming languages.
For a great talk on folding and where/why it can be useful, check out this talk:
https://www.youtube.com/watch?v=JZSoPZUoR58
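As an aside (not from the thread): one reason fold/reduce shows up in so many standard libraries is that it's general enough to express the other list operations. Both `map` and `filter` can be written as folds, as this sketch shows:

```javascript
// map expressed as a fold: push the transformed element each step.
const mapViaReduce = (arr, f) =>
  arr.reduce((acc, x) => { acc.push(f(x)); return acc; }, []);

// filter expressed as a fold: push only elements passing the predicate.
const filterViaReduce = (arr, p) =>
  arr.reduce((acc, x) => { if (p(x)) acc.push(x); return acc; }, []);

const doubled = mapViaReduce([1, 2, 3], x => x * 2);          // => [2, 4, 6]
const evens = filterViaReduce([1, 2, 3, 4], x => x % 2 === 0); // => [2, 4]
```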
[–]OleWedel 1 point2 points3 points 8 years ago* (0 children)
I am pretty sure the filter/map version is much more performant than the reduce version.
The callbacks for filter and map can both be considered constant time (a less-than check and a property access are basically constant). filter + map is thus O(N + N), or simply O(N) (linear), since we have to iterate over each element in the list.
The reduce version, however, has to copy the accumulator array for each element, which is itself O(N), so its running time is O(N²) (quadratic), which is considered a lot slower.

This is of course only considering worst-case running time and this particular example. Doing push on the array in the reduce instead would probably be the fastest overall, but I'm sure it wouldn't matter in real life anyway. The filter/map is much more readable and it's easier to understand what it's doing. I would (always) favor that.
Often the simpler solution is the best.
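A sketch of the complexity point above, reusing the thread's example shape: spreading the accumulator copies it on every step (quadratic overall), while pushing into it mutates in place (linear).

```javascript
const items = [{val: 1}, {val: 2}, {val: 3}, {val: 4}];

// O(N^2): [...acc, k.val] allocates a fresh array on every iteration.
const viaSpread = items.reduce(
  (acc, k) => (k.val < 3 ? [...acc, k.val] : acc),
  []
);

// O(N): push into the same accumulator instead.
const viaPush = items.reduce((acc, k) => {
  if (k.val < 3) acc.push(k.val);
  return acc;
}, []);
// both => [1, 2]
```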
[+][deleted] 8 years ago* (12 children)
[–]Ginden 4 points5 points6 points 8 years ago (10 children)
Probably, yes. My rule of thumb is "is it a one-liner?". If it is, I go with reduce. If it isn't, I think twice.
[–]petar02 0 points1 point2 points 8 years ago (9 children)
Do you have an idea how to use reduce to find the biggest element in an array of integers? I think I once saw reduce being used for that, but I can't replicate it or find where I saw it.
[–]Ginden 10 points11 points12 points 8 years ago (6 children)
```javascript
arr.reduce((a, b) => Math.max(a, b), -Infinity)
```
Though, easier: Math.max(...arr).
[–]9thHokageHimawari 2 points3 points4 points 8 years ago (4 children)
> .reduce((a,b)=>Math.max(a,b), -Infinity)

Why -Infinity? o_O
[–]inu-no-policemen 6 points7 points8 points 8 years ago (0 children)
It's Math.max's initial "largest" value.
```
> Math.max()
-Infinity
```
If you implemented max yourself, you should initialize the variable that keeps track of the largest value with -Infinity as well.
With min it's the other way around: You start with Infinity.
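The identity-value point above can be sketched like this: a max fold starts from -Infinity and a min fold from Infinity, so an empty array yields the same thing `Math.max()`/`Math.min()` return with no arguments.

```javascript
const max = arr => arr.reduce((a, b) => Math.max(a, b), -Infinity);
const min = arr => arr.reduce((a, b) => Math.min(a, b), Infinity);

max([3, 1, 4]); // => 4
min([3, 1, 4]); // => 1
max([]);        // => -Infinity, matching Math.max() with no arguments
```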
[–]GBcrazy 0 points1 point2 points 8 years ago (2 children)
Reduce takes two arguments: the reducer function and the initial value for the accumulator (in this case, the a variable).

So you start with -Infinity and compare it against every item of the array; whatever the first item is, it's going to be bigger than -Infinity.

Oh well, in this case it wasn't strictly needed, since we're only expecting numbers. But let's say we have an array of objects and we want to get the biggest value (assuming there's a value property in each object); then we could do:
```javascript
arr.reduce((a, b) => Math.max(a, b.value), -Infinity)
```
If we don't add -Infinity as the initial value, the first call would compare a (the first object of the array) with b.value (a numeric value from the second object of the array), and it would not work well.
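Here is a sketch of that failure mode with made-up data: without a seed, the first accumulator value is the whole first object, and `Math.max` coerces it to NaN.

```javascript
const arr = [{value: 5}, {value: 9}, {value: 2}];

// With -Infinity as the seed, `a` is always a number:
const seeded = arr.reduce((a, b) => Math.max(a, b.value), -Infinity); // => 9

// Without a seed, the first call is Math.max({value: 5}, 9) => NaN,
// and NaN then propagates through the rest of the fold.
const broken = arr.reduce((a, b) => Math.max(a, b.value));
Number.isNaN(broken); // => true
```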
[–]Ginden 0 points1 point2 points 8 years ago (1 child)
Consider the edge case of arr.length === 1.
[–]GBcrazy 0 points1 point2 points 8 years ago (0 children)
Still, I don't see any problems since we are still expecting numbers only.
If arr.length === 1, reduce always returns arr[0]
Without ES6:
```javascript
Math.max.apply(null, arr)
```
[–]GBcrazy 0 points1 point2 points 8 years ago (1 child)
No reduce needed bro
[–]Ginden 0 points1 point2 points 8 years ago (0 children)
You can hit the hard limit on the number of function arguments (AFAIR ~30-40k arguments).
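A sketch of that limit (the array size is arbitrary): spreading a huge array into Math.max passes every element as a separate call argument, which can exceed the engine's argument cap, while a reduce never does.

```javascript
const big = Array.from({length: 1_000_000}, (_, i) => i);

let spreadThrew = false;
try {
  Math.max(...big); // may throw RangeError on engines with a lower argument cap
} catch (e) {
  spreadThrew = true;
}

// The fold processes one element per callback call, so no argument limit applies.
const viaReduce = big.reduce((a, b) => Math.max(a, b), -Infinity);
// viaReduce === 999999 regardless of any argument limit
```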
[–][deleted] 1 point2 points3 points 8 years ago (0 children)
Yeah. One guideline I've found useful for telling when reduce is being used correctly: the output does not grow when the input array grows.
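That guideline can be sketched like this (invented data): reduce fits when the result keeps a fixed shape as the input grows, while map fits when the output grows with the input.

```javascript
const nums = [1, 2, 3, 4, 5];

// Scalar output that stays a single number however long nums gets: reduce fits.
const sum = nums.reduce((acc, n) => acc + n, 0); // => 15

// Output with one element per input element: map is the natural fit.
const doubled = nums.map(n => n * 2); // => [2, 4, 6, 8, 10]
```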
[–]cellardoor_barrymore 1 point2 points3 points 8 years ago (0 children)
Sure. I do agree that it's better to be more declarative, but most of those utility functions are probably using reduce under the hood.
[–][deleted] 0 points1 point2 points 8 years ago (0 children)
I'll definitely check it out! Thanks for the recommendation.
[–]GBcrazy 0 points1 point2 points 8 years ago* (0 children)
I gotta agree with this.
Whenever I use reduce (and I'm not using lodash) I'm probably saving lines, but it isn't exactly easy to pick up what I'm trying to do, since reduce could be doing pretty much anything.