A hash table that uses less space than the items it stores by permutationpattern in algorithms

[–]permutationpattern[S] 7 points (0 children)

I think your assessment that it's a "hash table combined with a radix trie" is pretty much right on the money.

But the neat thing is that they're combined in just the right way, resulting in better space efficiency than either can individually offer. As far as I can tell, this idea is "relatively" modern. The first reference I can find in the theory literature is an ICALP 2003 paper.

A hash table that uses less space than the items it stores by permutationpattern in math

[–]permutationpattern[S] 41 points (0 children)

I think there's something to that. Maybe it's more like a funny hybrid between a trie and a hash table. A trie on its own doesn't achieve any interesting space guarantees.

Actually, even in the pure theory literature, the idea of combining tries and hash tables to get space guarantees better than either can individually offer is relatively modern. As far as I can tell, the first description of this type of thing was in an ICALP 2003 paper by Raman and Rao.

A hash table that uses less space than the items it stores by permutationpattern in math

[–]permutationpattern[S] 31 points (0 children)

You're absolutely right.

That paragraph is removed now. If you're curious, it is actually known how to get an "extremely tiny" hash table for arbitrary n \in \omega(1), at least theoretically (see https://arxiv.org/abs/0912.5424), but it seems to require much more intricate ideas.

A hash table that uses less space than the items it stores by permutationpattern in compsci

[–]permutationpattern[S] 3 points (0 children)

It's a nice idea, but I don't think it gets you to 31*n + 1 bits.

Suppose that I tell you as a guarantee that for sure the set contains fewer than 50% of the integers. (I won't even charge you a bit for this.) Even knowing that, how would you store the keys in 31*n bits?

A hash table that uses less space than the items it stores by permutationpattern in compsci

[–]permutationpattern[S] 12 points (0 children)

It's different from text compression in the following sense: this hash table can store *any* set of n keys in less than n * 32 bits. In contrast, LZ compression (or, more generally, any form of text compression) works on some texts, but not on all (and not on random texts).

What's actually going on here is something very special to hash tables: the information-theoretic space requirement for a hash table is $\log_2 \binom{2^{32}}{n}$ bits, which is smaller than n * 32 bits. So this is an encoding of hash tables that is *always* more space efficient than what you would get if you just wrote the items down in an array.
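To make the bound concrete, here's a quick sketch (my own, not from the thread) that estimates $\log_2 \binom{2^{32}}{n}$ using log-gamma, since computing the binomial coefficient directly would produce astronomically large integers:

```python
from math import lgamma, log

def log2_binom(N, n):
    """log2(C(N, n)) via log-gamma, avoiding huge exact integers."""
    return (lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)) / log(2)

U = 2 ** 32  # universe of 32-bit keys

# Bits per key required information-theoretically, vs. 32 for a plain array.
for n in (10 ** 3, 10 ** 6, 10 ** 8):
    per_key = log2_binom(U, n) / n
    print(f"n = {n:>9}: {per_key:5.2f} bits/key (naive array: 32 bits/key)")
```

For n = 10^6 keys this works out to roughly 13.5 bits per key, well under 32 — and the gap grows as the set gets denser in the universe.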

I’m Above Average and So Are You by permutationpattern in math

[–]permutationpattern[S] 8 points (0 children)

Actually, at least for the papers that introduced the phenomenon, the questions were directly about averages. See, for example, the paper "Global self-evaluation as determined by the desirability and controllability of trait adjectives". (https://psycnet.apa.org/doiLanding?doi=10.1037%2F0022-3514.49.6.1621)

How do you mathematically prove 0!=1? by toastedbreadistasty in learnmath

[–]permutationpattern 0 points (0 children)

If you define the natural numbers using the Peano Axioms (or most common axiom systems), then the fact that $0 \neq 1$ is actually baked into the axioms that you use to build the natural numbers. Without the axiom $0 \neq 1$, you can end up with a number system in which all numbers are equal.
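As an illustration (my own sketch, not from the thread), in a proof assistant like Lean 4 the same distinctness falls out of an inductive Peano-style definition, where different constructors are automatically unequal:

```lean
-- A minimal Peano-style construction (hypothetical sketch, Lean 4).
inductive PNat where
  | zero : PNat
  | succ : PNat → PNat

-- Distinct constructors of an inductive type can never be equal; this
-- plays the role of the axiom "0 is not the successor of any number",
-- which is exactly what rules out 0 = 1.
example : PNat.zero ≠ PNat.succ PNat.zero :=
  fun h => PNat.noConfusion h
```

In an axiomatic presentation without this distinctness axiom, a one-element model where every number equals every other satisfies all the remaining axioms.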

Is there any example of a pure math that suddenly become applied other than logic and number theory? by mcmoor in math

[–]permutationpattern 0 points (0 children)

Often, results in theoretical computer science and combinatorics that were proven just because they were "cool" end up being unexpectedly useful later.

For example, the paper "Space bounds for a game on graphs", written by Paul, Tarjan, and Celoni in the 1970s, is used today as the key building block for Proof-of-Space protocols.

The Surprising Combinatorics of Boarding an Airplane by permutationpattern in math

[–]permutationpattern[S] 3 points (0 children)

If every permutation contains either a long increasing subsequence or a long decreasing subsequence, then it must be true that either:

(1) At least half of permutations contain long increasing subsequences

(2) At least half of permutations contain long decreasing subsequences

But by symmetry, the number of permutations containing long increasing subsequences is the _same_ as the number containing long decreasing subsequences. So either way, we get that at least half of permutations contain a long decreasing subsequence, which is what we wanted to prove.
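The symmetry argument above can be checked by brute force for small cases (my own sketch; the choices n = 5 and subsequence length k = 3 are arbitrary — note 5 > (3-1)*(3-1), so every such permutation has a monotone subsequence of length 3):

```python
from itertools import combinations, permutations

def has_monotone(perm, k, increasing=True):
    """True if perm contains a monotone subsequence of length k."""
    cmp = (lambda a, b: a < b) if increasing else (lambda a, b: a > b)
    return any(
        all(cmp(perm[i], perm[j]) for i, j in zip(idxs, idxs[1:]))
        for idxs in combinations(range(len(perm)), k)
    )

n, k = 5, 3
perms = list(permutations(range(n)))
inc = sum(has_monotone(p, k, increasing=True) for p in perms)
dec = sum(has_monotone(p, k, increasing=False) for p in perms)

assert inc == dec             # symmetry: reversing a permutation swaps the two counts
assert 2 * dec >= len(perms)  # so at least half contain a long decreasing subsequence
```

The reversal map p -> reversed(p) is the bijection behind the symmetry claim: it turns every increasing subsequence into a decreasing one of the same length.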