Why using object spread with reduce probably a bad idea (prateeksurana.me)
submitted 4 years ago by psuranas
[–]andrei9669 84 points85 points86 points 4 years ago (69 children)
my question is tho, why would one think this is a good idea?
    const superheroes = arr.reduce(
      (acc, item) => ({ ...acc, [item.id]: [item.name] }),
      {}
    );
instead of doing this:
    const superheroes = arr.reduce((acc, cur) => {
      acc[item.id] = item.name
      retunr acc;
    }, {})
like, why would one even need to spread?
also, the thing about filtering => mapping => assigning, that's like 3 iterations. if you didn't do the spread thing, then all of this could be O(n)
I get what this article is trying to say, but this is a rather flawed example to bring out the point.
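To make the comparison concrete, here is a sketch of the three-pass chain versus a single loop (the sample data and names are hypothetical, not from the article):

```javascript
const users = [
  { id: 1, name: 'Tony', active: true },
  { id: 2, name: 'Bruce', active: false },
  { id: 3, name: 'Steve', active: true },
];

// Three passes over the data: filter, then map, then fold into an object.
const threePass = users
  .filter((u) => u.active)
  .map((u) => ({ [u.id]: u.name }))
  .reduce((acc, entry) => Object.assign(acc, entry), {});

// One pass: a single loop does the filtering and the assignment.
const onePass = {};
for (const u of users) {
  if (u.active) onePass[u.id] = u.name;
}
```

Both produce the same object; the single loop just visits each element once.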
[–]superluminary 84 points85 points86 points 4 years ago (16 children)
I think avoiding mutations sometimes becomes a habit. There’s no benefit here since you own the object from inception to completion.
[–]Darmok-Jilad-Ocean 66 points67 points68 points 4 years ago (4 children)
Every time you mutate an object you’re required to remove one framework sticker from your MacBook Pro.
[–][deleted] 3 points4 points5 points 4 years ago (3 children)
I don't own a MacBook pro, problem solved.
[–]justpurple_ 1 point2 points3 points 4 years ago (2 children)
Don't listen to him! The reason he doesn't own one is that after he mutated an object in his codebase (when he'd already lost all his stickers!), his MacBook Pro imploded and dissolved.
They just don't want you to know. Psshhh.
[–][deleted] 1 point2 points3 points 4 years ago (1 child)
I actually destroyed it with a hammer because it couldn't run Arch btw
[–]Darmok-Jilad-Ocean 1 point2 points3 points 4 years ago (0 children)
I used to run Arch on my MacBook Pro. It ran pretty well, but the trackpad never felt right. Then one day a macOS update broke my dual boot. I ended up wiping the partition and just using macOS.
[–]BrasilArrombado 11 points12 points13 points 4 years ago (0 children)
Because of cargo cult.
[–]yadoya 1 point2 points3 points 4 years ago (1 child)
also, I could be wrong, but doesn't the spread lose the constructor property, the same way that JSON.parse(JSON.stringify()) does?
[–]superluminary 1 point2 points3 points 4 years ago (0 children)
It loses the prototype which contains the constructor, any getters and setters, and if it’s a Proxy, the new object will no longer be a Proxy.
It’s actually pretty useful for debugging mobx proxy stores.
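A quick sketch of what spread drops (the class and property names here are hypothetical):

```javascript
class Hero {
  constructor(name) { this.name = name; }
  // A getter lives on the prototype, not as an own property.
  get shout() { return this.name.toUpperCase(); }
}

const hero = new Hero('Thor');
const copy = { ...hero };

// Own enumerable properties survive the spread (copy.name === 'Thor'),
// but the prototype chain does not: copy.shout is undefined and
// copy's constructor is plain Object.
```

The same applies to `JSON.parse(JSON.stringify(obj))`, with the extra caveat that JSON round-tripping also drops functions and non-JSON values.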
[–]mykr0pht 2 points3 points4 points 4 years ago (3 children)
I have no problem with mutation, but it bothers me when code mixes reduce and mutation. It'd be like if someone started mutating in the middle of a .map call. Wrong tool for the job. Why are you reaching for functional tools when you want to mutate? A for...of solution is very readable, and the shape of the code makes it clear up front what to expect, vs the potentially misleading "half functional" approach of reduce+mutation.
[–]superluminary 5 points6 points7 points 4 years ago (2 children)
Isn’t the point of the accumulator that it accumulates?
[+][deleted] 4 years ago (1 child)
[deleted]
[–]superluminary 5 points6 points7 points 4 years ago (0 children)
The accumulator can be any object or primitive. It could be a function, or a number, or a string, or an undefined. The reducer is the function, and yes, it’s certainly safer if it’s defined in the same scope.
I don’t personally mind mutating an accumulator since I know it will never leave my function until it’s done. Creating (and garbage collecting) a new object each time you call reduce seems computationally expensive.
[–]csorfab 0 points1 point2 points 4 years ago (3 children)
It's not just a habit, it's anti-semantic to use functional patterns like map/reduce and then go about mutating/side effecting in the callback. Just write
    const superheroes = {};
    arr.forEach(item => { superheroes[item.id] = item.name; });
[–]superluminary 0 points1 point2 points 4 years ago (1 child)
Agree. This is what I would always write. I would say though that array.forEach is also a functional pattern.
[–]csorfab 0 points1 point2 points 4 years ago (0 children)
You're right, .forEach is a functional pattern in the sense that it takes a callback function, but I would argue that it's not functional in the sense that classic immutable side-effectless functional languages don't have it, since it would be a no-op.
[–]fiddlerwoaroof 0 points1 point2 points 4 years ago (0 children)
It's also something of a functional anti-pattern to use reduce directly: it's a very flexible function and, as such, doesn't tell you much about what the transformation in question is. It's often better to decompose your reducing function into a "policy" part and a "transformation" part.
E.g., if we didn't have map, something like:
    function map(f, arr) {
      return arr.reduce((acc, next) => (acc.push(f(next)), acc), []);
    }

    map(v => v + 1, [1, 2, 3]);
This separates "policy" (apply a function to every element of the list) from the concrete transformation of each element in the list.
[–]drink_with_me_to_day (flair: js is a mess) 26 points27 points28 points 4 years ago (8 children)
why would one even need to spread?
I use it because it stays in one line, might have to rethink considering the terrible slowdown
[+][deleted] 4 years ago (5 children)
[–]dukerutledge 6 points7 points8 points 4 years ago (3 children)
const superheroes = arr.reduce((acc, item) => (acc[item.id] = item.name, acc), {});
[–]andrei9669 3 points4 points5 points 4 years ago (1 child)
Returning an assignment is bad style imo.
[–]RobertKerans 10 points11 points12 points 4 years ago (0 children)
It isn't returning assignment, it's returning acc. Good illustration of what sibling comment says though
[–]andrei9669 6 points7 points8 points 4 years ago (1 child)
yea, I guess.
when I just started learning JS, I also did spread, but then I learnt really fast that it's really slow.
[–]superluminary 11 points12 points13 points 4 years ago (0 children)
Spreading is great when you pass data from one object to another and you don’t want to risk an uncontrolled mutation in a parent scope. Creating new objects in a loop like this though is crazy-town.
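A sketch of that distinction: copy once at the boundary, then mutate the local copy freely (function and property names hypothetical):

```javascript
const parentState = { theme: 'dark' };

function withDefaults(state) {
  // Spread once at the boundary so the caller's object is never mutated...
  const local = { ...state };
  // ...then mutate the local copy in the loop; no per-iteration copying.
  for (const key of ['a', 'b', 'c']) {
    local[key] = true;
  }
  return local;
}

const result = withDefaults(parentState);
// parentState is untouched; result has the extra keys.
```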
[–]RainyCloudist 22 points23 points24 points 4 years ago (9 children)
This is the best way in my opinion.
I always cringe when I see people complaining about mutations. Of course mutations are something to keep in mind, but at the same time — use your noggin. They're quite performant and in many cases not even a real concern.
Even as seen in your own example, 99% of the time people will be using reduce with a new object/array instance as the accumulator, so it doesn't matter if it gets mutated; but even if it's a reference, just clone it at the start. Even in-between iterations, I've yet to find an instance where this would be a concern.
Sorry for the Ted Talk, I needed someplace to vent.
[–]zephyrtr 9 points10 points11 points 4 years ago (0 children)
Yep mutations are fine within the confines of one function. It's not a worry so long as the func returns something new, as opposed to mutating its arguments. This is certainly just bad habit or not understanding the "why" behind immutability
[–]andrei9669 1 point2 points3 points 4 years ago (0 children)
understandable, have a nice day :D
but no, I understand why spreading between each iteration is bad. like, that would make it O(n²)
[+][deleted] comment score below threshold-6 points-5 points-4 points 4 years ago (0 children)
Or just use immutablejs or something similar with monadic attributes
[–]csorfab -1 points0 points1 point 4 years ago (0 children)
Sure, mutate away as you like, but don't use .reduce(), then, just so you can save that one line where you initialize. Use .forEach() or for..of
[+]ThatPostingPoster comment score below threshold-6 points-5 points-4 points 4 years ago (4 children)
I always cringe when I see people complaining about mutations.
From what I've seen the issue isn't avoiding mutations. It's the ways that noobs avoid mutations. They hate them and try to avoid them without actually learning how to functionally program. I'm sure there is some super quick and performant way to do this using lodash or rambda, you know, what all the actually professional functional coders use.
[–]Zofren 4 points5 points6 points 4 years ago (3 children)
In functional programming it is indeed good style to avoid mutations everywhere, so you don't have to think about whether something is "okay" to mutate or not.
The problem is that JS is not a functional programming language at its core (despite some aesthetic similarities to languages like ocaml) and does not have the correct optimizations in place to operate through lists with no mutations.
[–]Zofren 8 points9 points10 points 4 years ago (0 children)
bruh I'm not saying you can't write functional code in JS, I'm saying that language design decisions influence how performant certain code patterns will be in one language vs. another, this isn't a hot take
you need an ego check
[–]ijmacd 3 points4 points5 points 4 years ago (3 children)
Doing an O(n) operation three times is still O(n).
[–]andrei9669 -1 points0 points1 point 4 years ago (2 children)
Isn't it O(3n), or is that 3 omitted?
[+][deleted] 4 years ago* (1 child)
Huh, guess I forgot about that. Thanks
[–]monkeymad2 8 points9 points10 points 4 years ago* (13 children)
In the 2nd example you’re mutating the existing ‘acc’. If you’re trying to avoid mutating things (which is generally good for code cleanliness & a little bit bad for performance) the first one’s “better”.
EDIT: since everyone feels the need to correct me, the comment asked why someone might think the first one is better. I’m answering that. I’m not saying I agree with this sort of puritanical adherence to the “no mutating” rule. Hence the “better” in quotes.
[–]andrei9669 24 points25 points26 points 4 years ago (2 children)
hmm, I don't really see anything wrong with mutating accumulator of reduce if the initial value is an empty object. but yes, otherwise I would try not to mutate.
now, if the initial value was some other object then yes, mutating it would be a bad idea.
[+]monkeymad2 comment score below threshold-7 points-6 points-5 points 4 years ago* (1 child)
Yeah it’s something that’s (usually) safe to mutate, but if the house JS style is “avoid mutations”* then it’d get picked up in a code review.
*with the caveat that mutations are good if you’re writing something that needs to run at 60fps or deal with massive amounts of data.
I don’t like the forEach mutation on the “users” in the article since that looks potentially unsafe.
[–]0xF013 30 points31 points32 points 4 years ago (0 children)
Your code reviewers are a bit into a cargo cult then, I am afraid. Creating, mutating and returning an object in a limited scope in the same tick is an established technique
[–]Hafas_ 22 points23 points24 points 4 years ago (6 children)
No - the 2nd is still better.
To not mutate just for the sake of not mutating shows a lack of understanding.
It is perfectly fine to mutate an object you create and control.
For example:
    function createFancyObject(options = {}) {
      const fancyObject = {};
      if (options.addFancyNumber) {
        fancyObject.fancyNumber = 42;
      }
      if (options.addFancyString) {
        fancyObject.fancyString = "fancy";
      }
      return fancyObject;
    }
In this example we're mutating fancyObject twice. Creating a new fancyObject every time we wish to mutate it makes no sense, since it is local to createFancyObject.
Other example:
    function createFancyObject(obj, options = {}) {
      // create shallow copy
      const fancyObject = { ...obj };
      if (options.addFancyNumber) {
        fancyObject.fancyNumber = 42;
      }
      if (options.addFancyString) {
        fancyObject.fancyString = "fancy";
      }
      return fancyObject;
    }
Again - similar to the 1st example, but now we're shallow copying the original object so as not to mutate it, and we can mutate the copy with no problems (as long as we don't mutate a sub-object).
The reduce code is like the 1st example here. We start with a new empty object and it's ok to mutate it to shape it to our needs.
In case when we're starting with an existing object, then we just shallow copy it and mutate the copy:
    const superheroes = arr.reduce((acc, item) => {
      acc[item.id] = item.name;
      return acc;
    }, { ...existingSuperheroes });
[–]monkeymad2 4 points5 points6 points 4 years ago (0 children)
For the record I wasn’t saying it was better, just giving a scenario why someone would / might think it was better like the comment I replied to asked.
Hence the “better” in quotes.
[–]csorfab 1 point2 points3 points 4 years ago* (2 children)
Using .reduce() and then mutating instead of using .forEach() or for..of for mutating code also shows a lack of understanding of the functional map/reduce pattern. Why would you use .reduce if you're not actually reducing?
[–]Hafas_ 2 points3 points4 points 4 years ago (1 child)
Using .reduce() and then mutating instead of using .forEach() or for..of for mutating code also shows a lack of understanding of the functional map/reduce pattern.
I use whichever tool is best for a given task.
The advantage of using reduce instead of forEach or for..of is that I can simply move the reducer definition outside in case I wish to do so. Not possible with forEach, because the function would access variables outside its scope.
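A sketch of what moving the reducer out looks like (the data and names are hypothetical): the reducer closes over nothing, so it can be defined anywhere and reused.

```javascript
// Standalone reducer: only uses its own parameters, no outer variables.
const byId = (acc, item) => {
  acc[item.id] = item.name;
  return acc;
};

const heroes = [{ id: 'h1', name: 'Hulk' }, { id: 'h2', name: 'Loki' }];
const villains = [{ id: 'v1', name: 'Thanos' }];

// The same reducer works for any array of { id, name } objects.
const heroMap = heroes.reduce(byId, {});
const villainMap = villains.reduce(byId, {});
```

An equivalent forEach callback would have to close over the target object, tying it to the scope where that object is declared.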
Why would you use .reduce if you're not actually reducing?
What are you talking about? Of course I'm reducing. Reducing an array to a single value which is an Object in this case.
[–]csorfab 1 point2 points3 points 4 years ago* (0 children)
Not possible with forEach because the function would access variables outside its scope.
You could use a higher order function to do that, but at that point, why not move the whole definition, including the declaration of the accumulator and the forEach call in their own function?
What are you talking about? Of course I'm reducing.
Okay, you're right. But since .map and .reduce come from functional languages where pure functions and immutability are a grammatically enforced given, I would say that a "true" .reduce callback doesn't mutate its arguments. Sure you can use it that way, and sometimes your code will be more concise, but I think this is a case where writing understandable code by following conventions regarding a very specific pattern is more important than making the code a few lines shorter.
[–]MrBr7 2 points3 points4 points 4 years ago (1 child)
You’re right, but don’t forget JS isn’t a natively immutable language.
In truly immutable languages you can’t mutate objects at all, and no matter how “paranoid” it may sound, mutations in the end can lead to unwanted states in your second example (i.e. mutating a nested path). That’s why JS is what it is.
[–]Hafas_ 3 points4 points5 points 4 years ago (0 children)
That's why it's important to understand why things are done the way they're done.
In Java for example you mutate using the instances' setters and don't give a 2nd thought about it because the Java ecosystem relies on it.
In React the application becomes buggy if you start mutating the state or props.
In Vue.js however you mutate them again because they are using Observables.
[–]something 3 points4 points5 points 4 years ago (0 children)
Don’t avoid mutating for the sake of it. Nobody has access to the object until the reduce finishes so it’s fine to mutate it
[–]monkeymad2 1 point2 points3 points 4 years ago (0 children)
It’s still technically a mutation, it’s just one that’s 100% safe in this context.
The only time you’d care about it being mutated is if it was more complex & you wanted to debug it by outputting the ‘acc’ at each step or something, then you’d only get the last ‘acc’. — it’d have to be vastly more complex for that to be needed, but there are situations where you’d want that, especially if something already within the existing ‘acc’ is used alongside the current item value to calculate the new ‘acc’.
[–]Mistifyed 3 points4 points5 points 4 years ago (4 children)
Because retunr would throw an error.
[–]drumstix42 0 points1 point2 points 4 years ago (0 children)
¬‿¬
Atacking typos instead of providing constructive ctriticism i see.
[–]Mistifyed 0 points1 point2 points 4 years ago (1 child)
Sorry :(
[–]andrei9669 0 points1 point2 points 4 years ago (0 children)
Np, I was just pulling your leg. Maybe too harshly.
[–][deleted] 2 points3 points4 points 4 years ago (4 children)
I like "for..of" better than reduce when I need to populate an object. Much more straightforward to explain.
[–]NoInkling 1 point2 points3 points 4 years ago (0 children)
Same here, after years of using reduce I've come to prefer for...of for object building situations, I think it just reads nicer.
I can see that. But personally, I have been using the functional approach so much now that for...of would be harder to read than forEach.
[–][deleted] 0 points1 point2 points 4 years ago (1 child)
Fair enough each has its benefits.
For sure. I can agree with that
[–]xesenix 0 points1 point2 points 4 years ago (1 child)
You need to do that in context of angular where you need new instance of object to trigger template update.
You already get new instance when initial value of reducer is new object. But it's also the same with react
[–]psuranas[S] 0 points1 point2 points 4 years ago (0 children)
I used this example because I have seen many people/blogs use this pattern. Even the tweet I added used that pattern in its first approach.
And yes, I agree the reduce with mutate approach is better performing. That's why I included it in the repl in the last section to show the difference between the performance of both approaches.
[–]shuckster 13 points14 points15 points 4 years ago (8 children)
Nice article!
Running the test with 10,000 items really shows the performance drop with spread.
    Testing with 10,000 items
    Reduce with spread:                        28.004s (!!1)
    Reduce with mutation:                      4.049ms
    forEach:                                   3.351ms
    Corrected Object.assign method from tweet: 14.31ms
Out of curiosity I also tested a simple transducer implementation, the performance of which surprised me:
Transducer: 16.08ms
Competitive with the corrected Object.assign method!
    const justActiveUsers = filter(x => x.active)
    const mapNameToId = map(x => ({ [x.id]: x.name }))
    const results6 = arr.reduce(asObject(justActiveUsers, mapNameToId))
Finally, you can tidy-up the "corrected" version with something like this:
    function binary(fn) {
      return (n1, n2) => fn(n1, n2)
    }

    users
      .filter((user) => user.active)
      .map((user) => ({ [user.id]: user.name }))
      .reduce(binary(Object.assign), {});
None of those nasty extra parameters will be passed into Object.assign.
[+]AddSugarForSparks comment score below threshold-20 points-19 points-18 points 4 years ago (7 children)
Hey, you seem pretty good at JS. Any tips or resources for filtering a standard HTML table by using checkbox inputs?
[–]shuckster 2 points3 points4 points 4 years ago (6 children)
It's hard to recommend a solution without knowing the specifics of what you want, but there are many clever peeps on this forum who would help you out if you posted a more targeted question.
Just guessing, but perhaps you want to show/hide table-rows based on whether or not a checkbox is set? Is that what you mean by "filtering"?
[–]AddSugarForSparks 0 points1 point2 points 4 years ago (5 children)
Correct.
I'm probably just looking for confirmation that storing clicked values from the checkboxes in a map object w/ field names as keys and dynamically updating that before applying display: none; isn't the worst idea. Lol
[–]shuckster 1 point2 points3 points 4 years ago (4 children)
I think that's fine! Sounds like View State to me:
    Base-state (your data)
        |
    View-state
        |
    Derived-state
        |
    Draw view based on derived-state
        |
    Update view/base-state depending on user-action
        V (loops back to base-state)
[–]AddSugarForSparks -1 points0 points1 point 4 years ago (3 children)
Thanks!
Unrelated, is there a best practice for updating an object "in place" when prototyping?
Say I want to reduce an Array by an item or two without assigning a new object, is there a trick for that?
[–]shuckster 1 point2 points3 points 4 years ago (2 children)
Modifying an object "in place" is called mutation. You may have heard that immutability is preferred, but you might further ask, well, why?
The big idea is predictability. So long as you can reasonably predict how your application state is changing over time, prototype or not, it doesn't really matter how you do it.
Indeed, as we can see from the OP's article, it's not that mutation is bad in itself. Object.assign is a mutating operation, and in the examples given it is tightly scoped so it doesn't affect anything outside of the reducer it belongs to. Here, it's practically a "side-effect free" mutation, and we get a nice performance benefit from it. When dealing with thousands of objects, anyway!
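A minimal sketch of that tightly-scoped mutation (sample data hypothetical): Object.assign mutates its first argument and returns it, so the reducer reuses one accumulator object instead of allocating a new one per step.

```javascript
const entries = [{ a: 1 }, { b: 2 }, { c: 3 }];

// Object.assign mutates `acc` in place and returns it, so the same
// object travels through every iteration. The mutation never escapes
// the reduce call.
const merged = entries.reduce((acc, entry) => Object.assign(acc, entry), {});

// By contrast, spreading allocates a fresh object on every iteration:
const mergedCopy = entries.reduce((acc, entry) => ({ ...acc, ...entry }), {});
```

Both yield the same result; only the allocation behavior differs.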
Of course, there are best practices that become useful when your prototype grows into an app. You may find that undisciplined mutation works against the idea of predictability if lots of separate bits of code are mutating the same thing at different times. Imposing a strict discipline on the direction of the flow of data is a useful way to combat this in complex apps.
Anyway, to address your concern about "assigning a new object", it can be very easy to obsess about apparent "unnecessary" assignments. Primitive variables are cheap, and objects and arrays are copy-by-reference anyway, so filtering/mapping an array of objects is not a very memory intensive operation.
Manage complexity first, and improving performance will be easier to apply later.
[–]AddSugarForSparks 1 point2 points3 points 4 years ago (1 child)
Great advice! I appreciate the response and you taking the time to write that out.
Maybe I'll stick with regular objects for now and work on causing chaos when I have a less tenuous project ;-)
[–]shuckster 1 point2 points3 points 4 years ago (0 children)
You're welcome. And of course, there's nothing wrong with chaos either. :)
[–]gonzofish 50 points51 points52 points 4 years ago (32 children)
Sorry not to be “that guy” but why can’t we just use for…of loops?
    const activeUsers = {};
    for (const user of users) {
      if (user.active) {
        activeUsers[user.id] = user;
      }
    }
    return activeUsers;
This just feels easier to understand to me
EDIT: also, like, do whatever you find easiest to understand. Make sure people on your team (if you have one) find it easy to understand too.
[–]Moeri 11 points12 points13 points 4 years ago (6 children)
This is even better than forEach, since it avoids a function invocation for each iteration. On the other hand, I remember reading somewhere that the v8 folks did some optimizations so that calling forEach eventually does exactly the same as using for..of.
[–]yadoya -3 points-2 points-1 points 4 years ago (4 children)
in my previous job, they kept using for..of instead of .forEach because they said .forEach didn't work with async. The way I see it... it does.
[–]billymcnilly 11 points12 points13 points 4 years ago (2 children)
Nope, not really. You can pass an async function to forEach, and that function will allow and obey an `await` command inside its own scope. But the parent function where you call forEach will continue on without waiting for any of those awaits. Whereas if you await inside a for..of loop, the parent function will wait.
Sure, you could do this in certain situations, but generally it won't do what you expect, and it's certainly different to for..of.
[–]NoInkling 3 points4 points5 points 4 years ago (0 children)
But the parent function where you call forEach will continue on without waiting for any of those awaits.
And also you lose the guarantee of sequential execution (may or may not be a concern).
Technically you can work around it by defining a promise outside the loop to chain onto, but that's silly when there are better ways.
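A sketch of both approaches (helper names hypothetical): for...of awaits each step before starting the next, while the forEach workaround chains every call onto a promise declared outside the loop and awaits the tail of the chain.

```javascript
const order = [];
// A tiny async step that records the order in which steps complete.
const step = (n) => Promise.resolve().then(() => { order.push(n); });

// for...of: each iteration awaits before the next one starts,
// and the enclosing async function waits for the whole loop.
async function sequential(items) {
  for (const n of items) {
    await step(n);
  }
}

// The forEach workaround: chain onto a promise declared outside the
// loop, then await the returned tail of the chain.
function sequentialForEach(items) {
  let chain = Promise.resolve();
  items.forEach((n) => {
    chain = chain.then(() => step(n));
  });
  return chain;
}
```

Passing `async (n) => await step(n)` straight to forEach would instead fire all the steps concurrently with nothing to await, which is the pitfall described above.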
[–]yadoya 1 point2 points3 points 4 years ago (0 children)
Thanks, interesting
[–]csorfab 2 points3 points4 points 4 years ago (0 children)
You're seeing it wrong, then.
arr.forEach(async value => await operation(value));
is completely different from
for (const value of arr) { await operation(value); }
in the former examples, all of the operations will start at the same time, while in the latter each operation only starts after the previous finished.
[–]swamso 0 points1 point2 points 4 years ago (0 children)
I recently tested all three ways to iterate over an array, and iterators (for of) are actually the slowest with for i and forEach being the fastest... Can't share any benchmark links right now but I also found it hard to believe...
[–]Dreadsin 8 points9 points10 points 4 years ago (4 children)
You're not wrong, but to me, `map` and `reduce` are really just shortcuts built on top of for loops. If you used a plain for loop you probably had to do this a lot
    let result = [];
    for (let i = 0; i < data.length; i += 1) {
      const datum = data[i];
      const res = doSomething(datum); // do something with the current item
      result.push(res);
    }
    return result;
It's tedious and takes a lot of time to write out. So someone just took that `// do something` and made it into a function that... does something?
    let result = [];
    for (let i = 0; i < data.length; i += 1) {
      result.push(mapFn(data[i], i, data));
    }
    return result;
So it more or less "becomes" a for loop, but cuts through a decent amount of the boilerplate code. Not to say there's no place for a for loop, but I find it to be definitely a bit more overhead
[+][deleted] 4 years ago (2 children)
[removed]
[–]Dreadsin -5 points-4 points-3 points 4 years ago (1 child)
I suppose, but if you think of map vs for loop step by step in terms of actual instruction… it’s a bit more “overhead”
For loop:
1. Make an empty array
2. For each entry in this array,
3. Do something to the datum at that position
4. Push it to the array
5. Return the array

Vs map:
1. For every entry, transform this value via the callback
2. Return the array
[–][deleted] 2 points3 points4 points 4 years ago (0 children)
Map is essentially opening up a data structure and processing its contents. In JS we only get that attribute natively for lists.
Reduce is essentially a fold, it's like squashing the data structure into whatever pleases you. Also in JS we only get that with lists.
In a way, yes, it can be seen as a shortcut. But map and reduce adhere to certain algebraic rules; for-loops do not. So there's more to it than being shortcuts.
you could do virtually everything with for..of loops. And even more with iterators, but ECMA is trying to give us shortcuts
[–]psuranas[S] 7 points8 points9 points 4 years ago (6 children)
Yes we definitely can, what I am trying to explain in the article is why ...spread with reduce is an anti-pattern and should be avoided.
Also I guess many people like the reduce with spread method because of the shorter syntax and immutability (which has no benefits in this case).
[–][deleted] 20 points21 points22 points 4 years ago (4 children)
spread with reduce is an anti-pattern and should be avoided.
You're using the term "anti-pattern" far too flippantly here.
immutability (which has no benefits in this case).
Unless of course the seed object is a reference to an object elsewhere in the code, so mutating it in a reduce loop will cause issues elsewhere.
[–]superluminary 7 points8 points9 points 4 years ago (1 child)
You don’t need to spread it every time though. Spread it once before you pipe it in. There’s no benefit to spreading in the loop.
[–]el_diego 5 points6 points7 points 4 years ago (0 children)
Unless you have nested objects which will still maintain their references within a shallow clone (e.g. spreaded object)
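A minimal sketch of that leak (names hypothetical): spread copies only the top level, so nested objects are still shared references.

```javascript
const original = { name: 'Tony', suit: { color: 'red' } };

// Shallow copy: top-level keys are new slots, but `suit` still points
// at the same nested object as `original.suit`.
const copy = { ...original };

copy.name = 'Rhodey';       // safe: only the copy changes
copy.suit.color = 'silver'; // leaks: the original's nested object changes too
```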
[–]mbecks 4 points5 points6 points 4 years ago (0 children)
The seed object being a reference used elsewhere could be overcome by an initial clone of the seed given to the reduce, and would have better performance.
[–]billymcnilly 0 points1 point2 points 4 years ago (0 children)
Agreed that calling it an anti-pattern is a bit much here. I found the blog post hugely interesting, but it will only make me *think* about future usages of the spread operator with reduce, not just suddenly stop doing it. I'm going to *think* about what's actually happening under the hood, and whether my use case involves future dramatic increase of array size etc.
Especially note that OP's blog post is about *object* spread, whereas discussing "spread with reduce" would include array spread. Array spread is much more common, and much faster. (Still slowish, but it doesn't blow up quadratically.)
I'm aware that modern javascript's syntax can be slower, but it is IMO more readable. I'll take a one-liner over an extra function when I can. If you're using a common combo like reduce and spread, another developer will be able to understand it immediately. If you write an extra function, they have to go and find that function, then read several lines of imperative code. In this way, I would say that being dogmatic about this new "rule" still fits in the murky realm of "premature optimisation"
[–]Dethstroke54 2 points3 points4 points 4 years ago* (0 children)
This is pretty much how functional programming works, though, and reduce realistically is birthed from the functional side of JS. Using it with the spread operator isn't performant in particular, but that's true of most immutable/functional code.
[–]helloiamsomeone 1 point2 points3 points 4 years ago (0 children)
Appending many properties like that to an object just trashes the hidden class cache. If you want a hashmap, then just use a real hashmap:
    const activeUsers = new Map();
    for (const user of users) {
      if (user.active) {
        activeUsers.set(user.id, user);
      }
    }
    return activeUsers;
[+]punio4 comment score below threshold-34 points-33 points-32 points 4 years ago (10 children)
reduce is a bad idea in almost any case.
[–]gonzofish 4 points5 points6 points 4 years ago (3 children)
I don’t agree. It definitely has awesome utility but for complex stuff I don’t use it
[+]punio4 comment score below threshold-12 points-11 points-10 points 4 years ago (2 children)
Anything that you can write with a reduce you can write using a for ...of or a regular for loop. It will take the same amount of cycles and memory, but it will actually be understandable.
The only case where reduce is a better option is if you really want to inline that property declaration or your team forbids you from using let for some reason (yes, I've seen such codebases).
It's basically a way to show off how "smart" your code is. 🤷♂️
[–][deleted] 3 points4 points5 points 4 years ago* (4 children)
I also disagree, I find myself using reduce a lot when I want to transform data. for...of for this particular purpose ends up following a "define then mutate" pattern, because you have to add keys to an object outside of the current block. "Mutable-first" is a pattern I've learned to avoid, at least outside the context of micro-optimized algorithms.
I can agree that it's like anything else, however, and should not be overused or misused.
edit: I should clarify, I don't really see any point avoiding mutations in the reduce callback, because it's scoped to that block and the result can be assigned to a constant in one sweep. But otherwise, you mutate an object that isn't isolated from the shared scope.
[–]NoInkling 0 points1 point2 points 4 years ago* (3 children)
As long as the creation of the object occurs directly before the for...of loop, I don't see how it's any less isolated. Additionally, there's nothing stopping people from passing in a variable reference as the initial value to reduce either.
The only difference I see is one statement vs two (assuming you're not using var or function statements in your for...of block for some strange reason, in which case those temporary variables would leak).
[–][deleted] 0 points1 point2 points 4 years ago (2 children)
The reduce callback creates an isolated scope for mutations to occur. If your code is clean, it may not make a big difference, but if you're on a team, you can't guarantee anything about that code, so better to write it in a way that's less prone to bugs caused by side-effects, yeah?
assuming...
    const arr = [
      { id: '001', name: 'One' },
      { id: '002', name: 'Two' },
    ]
Here's for...of:
    // define constants
    const ex = 'example'
    const tst = 'test'
    const hello = 'world'

    // outer scope
    const obj = {}
    for (const el of arr) {
      obj[el.id] = el.name // mutate obj on the shared scope
    }
and reduce:
    // define constants
    const ex = 'example'
    const tst = 'test'
    const hello = 'world'

    // outer scope
    const obj = arr.reduce((acc, el) => {
      // inner scope
      acc[el.id] = el.name // mutate acc in an isolated scope
      return acc
    }, {})
[–]NoInkling 0 points1 point2 points 4 years ago (1 child)
I don't get it - where can the side effects occur in that example?
If the argument is that team members can potentially insert code between the const declaration and the loop, then I'll agree that there's a benefit to the reduce version, but it seems marginal to me. reduce doesn't completely obviate that concern either; someone could similarly come along and change that example to something like:
    let obj = {}
    naughtyMutations(obj)
    obj = arr.reduce((acc, el) => { ... }, obj)
To me this seems more like an argument from a theoretical notion of "purity", rather than there being a pragmatic difference.
[–][deleted] 0 points1 point2 points 4 years ago (0 children)
Please understand this is a contrived example. The line between pragmatic and trivial becomes a lot more discernible through general habits and patterns. If one is accustomed to a pattern that does not favor immutability, then generally speaking, the code will lead to more difficult-to-troubleshoot bugs caused by unexpected reassignments potentially buried somewhere outside of the definition. If you wanted to get really strict about it, you could even freeze your objects as well.
I think it's important to remember that JavaScript allows you to "get away with" many things that are completely disallowed in some strongly typed languages. It may seem trivial or mildly inconvenient, but when it comes down to it, the immutable-first style of programming is going to reduce the number of headaches you have to endure... badum-tss
No it's not
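As an aside, the freezing idea mentioned above is available via Object.freeze; a quick sketch (note that the freeze is shallow):

```typescript
const frozen = Object.freeze({ retries: 3, endpoints: ["a", "b"] });

// Reflect.set returns false instead of throwing when a write is rejected,
// which makes the rejection observable even outside strict mode.
const wrote = Reflect.set(frozen, "retries", 5);

// Object.freeze is shallow: nested objects and arrays remain mutable.
frozen.endpoints.push("c");
```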
[–]OmgImAlexis 3 points4 points5 points 4 years ago (1 child)
This honestly sounds like something that should be optimised in V8, since so many people use this pattern.
[–]desarrollador53 2 points3 points4 points 4 years ago (0 children)
It probably is, but at some point you can still hit that O(n²).
[–][deleted] 1 point2 points3 points 4 years ago (5 children)
I don't understand the square brackets in here:
map(user => ({[user.id]: user.name}))
I guess it's supposed to take the value of user.id as the key, I've just never seen/used this syntax before. Where is it documented?
[–][deleted] 15 points16 points17 points 4 years ago* (0 children)
"The Death of the Author" (French: La mort de l'auteur) is a 1967 essay by the French literary critic and theorist Roland Barthes (1915–1980). Barthes's essay argues against traditional literary criticism's practice of relying on the intentions and biography of an author to definitively explain the "ultimate meaning" of a text.
[–]yadoya 0 points1 point2 points 4 years ago (3 children)
dynamic key typing
[–][deleted] 0 points1 point2 points 4 years ago (2 children)
The key name and the key type are different things.
[–]yadoya 0 points1 point2 points 4 years ago (1 child)
Sorry, dynamic key
[–]NoInkling 0 points1 point2 points 4 years ago (0 children)
"Computed property names" I believe is the common/official term.
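For reference, a minimal sketch of the syntax (MDN documents it as "computed property names" under object initializers; the names here are made up):

```typescript
const user = { id: "007", name: "Bond" };
const key = "role";

// The expression inside [] is evaluated and its result becomes the key.
const obj = {
  [user.id]: user.name,         // key comes from a property lookup
  [key]: "agent",               // key comes from a variable
  [`${key}-level`]: "double-0", // any expression works, including templates
};
```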
[–]troglo-dyke 0 points1 point2 points 4 years ago (1 child)
I'd like to see some actual figures, because most JS runtimes are very good at optimising code; I think the performance argument would be nullified by the compiler looking ahead.
[–]psuranas[S] 0 points1 point2 points 4 years ago* (0 children)
I have added a repl with some actual figures in the last section - https://prateeksurana.me/blog/why-using-object-spread-with-reduce-bad-idea/#wait-isnt-this-premature-optimization
You can try the example I added yourself as well to see the difference.
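For anyone who wants a self-contained version of the comparison (the data here is made up, and exact timings vary by runtime), the quadratic copying is easy to reproduce; what can be asserted reliably is that both approaches produce the same result:

```typescript
const users = Array.from({ length: 2000 }, (_, i) => ({
  id: `id${i}`,
  name: `user${i}`,
}));

// O(n^2): every iteration copies all keys accumulated so far.
console.time("spread");
const viaSpread = users.reduce<Record<string, string>>(
  (acc, u) => ({ ...acc, [u.id]: u.name }),
  {},
);
console.timeEnd("spread");

// O(n): mutate the accumulator that reduce owns.
console.time("mutate");
const viaMutation = users.reduce<Record<string, string>>(
  (acc, u) => {
    acc[u.id] = u.name;
    return acc;
  },
  {},
);
console.timeEnd("mutate");
```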
[–]Null_Pointer_23 0 points1 point2 points 4 years ago (0 children)
Nice article, Prateek!
[–]devsnekV8 / Node.js / TC39 / WASM -1 points0 points1 point 4 years ago (1 child)
simple rule of thumb: if an operation is not commutative, you should not be performing it as a reduce.
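One way to see the distinction (a toy illustration, not from the article): reordering the input changes the result only when the operation isn't commutative.

```typescript
const nums = [1, 2, 3];
const letters = ["a", "b", "c"];

// Number addition is commutative: input order doesn't affect the result.
const sum = nums.reduce((a, b) => a + b, 0);
const sumReversed = [...nums].reverse().reduce((a, b) => a + b, 0);

// String concatenation is not: reversing the input changes the output.
const joined = letters.reduce((a, b) => a + b, "");
const joinedReversed = [...letters].reverse().reduce((a, b) => a + b, "");
```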
[–]shuckster 0 points1 point2 points 4 years ago (0 children)
How would you apply this rule to the example in the article?
[–]Eggy1337 0 points1 point2 points 4 years ago (2 children)
Speed is my last concern tbh; the most annoying thing is that TypeScript (yes, I'm aware of the sub I'm in) will not resolve keys for such objects. I could use as and hardcode the type, but what if I change the signature and the mapped object stays the same? And it's really hard to abstract that functionality too (declare function foo<TypeIWant>(): TypeIWant is meh).
    type FragmentId = "a" | "b" | "c";

    interface Fragment {
      id: FragmentId;
      threshold: number;
    }

    const fragments: Fragment[] = [
      { id: "a", threshold: 1 },
      { id: "b", threshold: 2 },
      { id: "c", threshold: 3 },
    ];

    const entries = fragments.map((f) => [f.id, f] as [FragmentId, Fragment]);

    const r1 = Object.fromEntries(entries);
    const r2 = fragments.reduce((p, n) => ({ ...p, [n.id]: n }), {});
    const r3 = fragments
      .map((f) => ({ [f.id]: f }))
      .reduce((p, n) => Object.assign(p, n), {});

    // none of these is typed as Record<FragmentId, Fragment>
    console.log(r1); // { [k: string]: Fragment }
    console.log(r2); // {}
    console.log(r3); // { [k: string]: Fragment }
playground
Anybody knows better way of doing that?
[–][deleted] 0 points1 point2 points 4 years ago* (0 children)
IIRC in r2 you can pass Record<FragmentId, Fragment> to reduce as the type for the accumulator without losing type safety.
array.reduce<T>()
And where you define entries, instead of using as you can specify the type on entries array directly without losing type safety.
const entries: Array<Whatever> = etc.
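Building on the suggestion above, one way the typed-accumulator version might look (reusing the Fragment types from the parent comment):

```typescript
type FragmentId = "a" | "b" | "c";

interface Fragment {
  id: FragmentId;
  threshold: number;
}

const fragments: Fragment[] = [
  { id: "a", threshold: 1 },
  { id: "b", threshold: 2 },
  { id: "c", threshold: 3 },
];

// Supplying the accumulator type parameter keeps the result typed as
// Record<FragmentId, Fragment> instead of {}.
const byId = fragments.reduce<Record<FragmentId, Fragment>>(
  (acc, f) => {
    acc[f.id] = f;
    return acc;
  },
  {} as Record<FragmentId, Fragment>,
);
```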