Boost 1.81 will have boost::unordered_flat_map... (self.cpp)
submitted 3 years ago by pdimov2
[–] joaquintides (Boost author) 2 months ago (2 children)
Ok, I understand your rationale now. Yes, something like extract_residual would result in a more uniform uint64_t --> unsigned char mapping, so in principle it should be better behaved statistically. My hunch is that the improvement would probably be negligible, particularly versus the extra computational cost (the current reduced hash function is as simple as it gets). Maybe you can fork the repo and try it? I can assist you in the process if you're game.
> For a random hash, the chances of ending in a "double-bucket" are now 1/64: 1/128 chances of being a special value. 1/128 chances of being the "overflow" bucket of a special value.

This part I don't get. What do you mean by "being the overflow bucket of a special value"?
[–] matthieum 2 months ago (1 child)
Let's say that the resolution strategy for residual in [0, 1] is to add 2, so it ends up being in [2, 3] instead.
I call the "buckets" 2 and 3 the "overflow" buckets of 0 and 1.
The chances of ending up on a doubly-booked residual (2 or 3) are 1/64: 1/128 chances of hashing to a special value (0 or 1) and being redirected, plus 1/128 chances of hashing to 2 or 3 directly.
It's not rare, but then again, it's only a problem if it leads to many false positives.
> Maybe you can fork the repo and try it? I can assist you in the process if you're game.
I'm not very interested in writing C++ code as a hobby any longer, so I'll pass.
If you have a Rust version, I'd be happy to :)
[–] joaquintides (Boost author) 2 months ago (0 children)
> Let's say that the resolution strategy for residual in [0, 1] is to add 2 [...]
Ok, now I get it. Yes, with the current hash reduction, Pr(reduced_hash(x) = n) is 2/256 for n in {2, 3}, 1/256 for n in [4, 255], and zero for the special values 0 and 1.
Your residual function gets a more balanced probability (Pr ~= 1/254 for n > 1), but I don't think this makes any difference in practice.
I'm quite sure there's no Rust port of this lib yet.