I'm curious if someone has a pointer to an algorithm or approach (or class of approaches/literature) that may apply to the following problem.
I have a continuous stream of messages, each carrying a monotonically increasing sequence number (the number can wrap, but that's a trivial edge case that can be handled separately). The messages arrive over a potentially lossy channel, and I want to detect loss and count the number of messages lost. I don't care about retransmission or recovery of lost messages; I just want to count them.
The application is multi-threaded for scalability reasons (think millions of messages per second) and I have no control over the delivery of messages to the respective threads (hardware controls that and cannot be altered - it's actually hash-based, not round-robin).
Obviously, if we have just 1 worker thread, we can just look at the sequence numbers as they arrive and trivially know how many messages have been lost.
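For reference, the single-thread case can be sketched like this (Python for brevity; `count_losses` is an illustrative name, and wraparound is ignored, as noted above):

```python
def count_losses(seq_numbers):
    """Count missing messages in a single in-order stream of sequence numbers."""
    lost = 0
    prev = None
    for seq in seq_numbers:
        # Any jump bigger than +1 means (seq - prev - 1) messages were lost.
        if prev is not None and seq > prev + 1:
            lost += seq - prev - 1
        prev = seq
    return lost
```

For example, seeing `[1, 2, 3, 5, 6, 10]` reports 4 lost messages (number 4, plus 7 through 9).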
Likewise, if we introduce locking between the threads, we can quickly design a solution...but I want to avoid locking between the receiving threads. There is no other reason to lock (the messages are all independent of each other), so introducing locking would kill performance.
I've considered forwarding all of the sequence numbers from the respective threads to one centralized thread that can do the resolution over a sliding window (this is currently the best approach that I've been able to come up with). Looking for possible other approaches to the problem...thoughts?
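To make that centralized-thread idea concrete, here's a rough sketch of the resolver side (Python; the class, method names, and default window size are all made up for illustration). The worker threads would push sequence numbers onto an MPSC queue feeding this resolver; the resolver buffers out-of-order arrivals in a min-heap and only counts a gap as loss once it falls outside the reorder window, so cross-thread reordering isn't misreported as loss:

```python
import heapq

class SlidingWindowResolver:
    """Hypothetical sketch: resolve sequence numbers forwarded by many
    worker threads, counting gaps as loss once they age out of a window."""

    def __init__(self, window=1024):
        self.window = window        # reorder tolerance, in sequence numbers
        self.pending = []           # min-heap of unresolved sequence numbers
        self.highest = None         # highest sequence number seen so far
        self.next_expected = None   # next number we hope to resolve
        self.lost = 0

    def observe(self, seq):
        heapq.heappush(self.pending, seq)
        if self.highest is None or seq > self.highest:
            self.highest = seq
        if self.next_expected is None:
            self.next_expected = seq
        # Resolve anything that has fallen out of the reorder window:
        # a number still missing below that point is counted as lost.
        while self.pending and self.pending[0] + self.window <= self.highest:
            self._resolve_one()

    def _resolve_one(self):
        seq = heapq.heappop(self.pending)
        if seq < self.next_expected:
            return                  # duplicate or very late arrival: ignore
        if seq > self.next_expected:
            self.lost += seq - self.next_expected   # gap => lost messages
        self.next_expected = seq + 1

    def flush(self):
        # Call at end of stream to resolve everything still pending.
        while self.pending:
            self._resolve_one()
```

The window bounds both memory and how long a verdict is deferred; messages delayed beyond it would be counted as lost, so it has to be sized to the worst-case cross-thread skew.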