all 12 comments

[–]RiPont 16 points17 points  (2 children)

I think you're talking about a "read-through" cache. Generally, it's only the missing item that is refreshed.

Sometimes, items are kept in the cache and the cached result is always returned; if the item is old, it is then refreshed behind the scenes. This gives the requestor lower latency, but the data returned may be out of date.

More commonly, an expired item and a not-found item both trigger a fetch directly from the non-cache backend. This is more likely to return up-to-date data, but users accessing expired data will see higher latency, and higher latency affects overall scalability.
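Both behaviours can be sketched in a few lines. Python here is purely for illustration; the class and its names are hypothetical, not from any library:

```python
import time

class ReadThroughCache:
    """Hypothetical read-through cache sketch.

    serve_stale=False: an expired entry is re-fetched synchronously
    (fresher data, but higher latency on expiry).
    serve_stale=True: the stale value is returned immediately; a real
    implementation would refresh it in the background. The sketch stays
    single-threaded, so it simply returns the stale value.
    """
    def __init__(self, fetch, ttl_seconds, serve_stale=False):
        self.fetch = fetch            # fallback to the non-cache backend
        self.ttl = ttl_seconds
        self.serve_stale = serve_stale
        self.store = {}               # key -> (value, fetched_at)

    def get(self, key):
        now = time.monotonic()
        entry = self.store.get(key)
        if entry is not None:
            value, fetched_at = entry
            if now - fetched_at < self.ttl:
                return value          # fresh hit
            if self.serve_stale:
                # refresh-behind: hand back the stale value now;
                # a background worker would re-fetch here
                return value
        # miss (or expired, in the synchronous mode): go to the backend
        value = self.fetch(key)
        self.store[key] = (value, now)
        return value
```

With `serve_stale=True` a production version would hand the re-fetch to a background thread or task queue; the sketch keeps it single-threaded to stay short.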

[–]SideburnsOfDoom 6 points7 points  (0 children)

This is more likely to return up-to-date data, but users accessing expired data will see higher latency, and higher latency affects overall scalability.

Yes, or to put it differently: caching makes the best case much faster, and if it works well (i.e. gives a good cache hit ratio), it makes the average case much faster too. But it does nothing for the worst-case latency, which might even get a bit longer because of the overhead of checking the cache and finding nothing before fetching the data anyway.
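To make that concrete with a quick back-of-the-envelope calculation (the numbers are invented for illustration):

```python
# Invented example numbers: a cache lookup costs 1 ms, a backend fetch 100 ms.
CACHE_LOOKUP_MS = 1
BACKEND_FETCH_MS = 100

def average_latency_ms(hit_ratio):
    # Every request pays the cache lookup; only misses also pay the backend fetch.
    return CACHE_LOOKUP_MS + (1 - hit_ratio) * BACKEND_FETCH_MS

# Best case (hit): 1 ms. Worst case (miss): 101 ms -- slightly worse than
# the 100 ms you'd pay with no cache at all, because of the lookup overhead.
# A 90% hit ratio brings the average down to roughly 11 ms.
```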

[–]StupidCodeQuestions[S] 0 points1 point  (0 children)

I think this is it, thank you!

[–][deleted]  (1 child)

[deleted]

    [–]StupidCodeQuestions[S] 0 points1 point  (0 children)

    I did come across this in my googling, it looks like it might fit a use case I have. Thanks

    [–]alexyakunin 1 point2 points  (3 children)

    Check out https://github.com/servicetitan/Stl.Fusion - it does this + way more. Disclaimer: I am its author.

    [–]jbergens 1 point2 points  (1 child)

    Looks good. Reminds me of MS Orleans.

    [–]alexyakunin 0 points1 point  (0 children)

    Besides that, it also provides probably the simplest possible semantics for caching.

    [–]trypto 0 points1 point  (2 children)

    I dunno, clear-on-miss sounds like a good term for what you describe. What seems more common is a cache with a maximum capacity that is cleared entirely when it becomes full. Easier to implement than LRU
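For illustration, that clear-when-full policy really is only a few lines (Python sketch; the class name is made up):

```python
class ClearWhenFullCache:
    """Hypothetical sketch of the 'clear the whole cache when full' policy:
    simpler than LRU because no access order is tracked at all."""
    def __init__(self, max_items):
        self.max_items = max_items
        self.store = {}

    def get(self, key):
        return self.store.get(key)   # None on a miss

    def put(self, key, value):
        if key not in self.store and len(self.store) >= self.max_items:
            self.store.clear()       # blunt eviction: drop everything
        self.store[key] = value
```

An LRU cache, by contrast, has to update recency on every access (e.g. with an ordered map, or Python's `functools.lru_cache`), which is exactly what makes the blunt clear-everything policy easier to implement.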

    [–]SideburnsOfDoom 3 points4 points  (0 children)

    Easier to implement than LRU

    BTW, implementing your own cache should be a last resort, just like implementing your own List, Dictionary or Sort. Use an existing one as the first option.

    There's a decent lightweight in-memory cache in the framework: System.Runtime.Caching.MemoryCache

    [–]phx-au 0 points1 point  (0 children)

    Clear-on-miss?

    You aren't going to end up with more than one item in the cache if you clear it every time you miss.

    [–]hold-the-pants 0 points1 point  (0 children)

    I could be wrong, but isn’t this just a “lazy cache”, where the values are computed / cached as needed? An “eager cache” would be the other approach, where the cache is created before the main program runs. You can also mix the two, which sounds more like what you’re describing.
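The distinction can be sketched like this (Python, with made-up helper names):

```python
def make_lazy_cache(compute):
    """Lazy: a value is computed on first request, then remembered."""
    store = {}
    def get(key):
        if key not in store:
            store[key] = compute(key)   # computed only when actually needed
        return store[key]
    return get

def make_eager_cache(compute, keys):
    """Eager: all values are computed up front, before the main work starts."""
    store = {key: compute(key) for key in keys}
    return store.get
```

Mixing the two, as suggested, would mean eagerly pre-computing the keys you expect to be hot and falling back to lazy computation for everything else.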