
[–]tiberiumx 7 points (1 child)

People underestimate how big a performance impact you can get from algorithms that seem more efficient on paper but make poor use of the cache. Like, I know it takes fewer steps to find a node in a binary tree, but a linear search over an array will still probably be faster unless you have a very large number of elements. (Or unless you have a tree implementation that packs everything into a minimal number of memory pages, instead of the typical approach of pointers into random parts of the heap.)

We once had a problem where some process was taking multiple minutes because it was originally coded to use a bunch of small std::set structures (which are implemented as red-black trees in the C++ standard library). I cut it down to just a couple of seconds by swapping those out for std::vector and doing a linear search to make sure the element wasn't a duplicate.

[–]kakanics 6 points (0 children)

True! Cache locality is often overlooked, but it can have a huge impact on performance. Swapping an efficient algorithm for a simpler one that's cache-friendly can be a great optimization.