David Eppstein asks: how many riffles does it take until all permutations are possible?
A riffle shuffle permutation, in the mathematics of shuffling cards, is a permutation of the cards that can be obtained by a single riffle shuffle — that is, you cut the deck into two packets and then interleave the packets. There are $2^n - n$ distinct riffle shuffles of n cards. For example, consider a five-card deck, which is initially in the order 1, 2, 3, 4, 5. Then a riffle shuffle consists of:
- cutting the deck in one of the six possible ways: 12345/ (no cut), 1234/5, 123/45, 12/345, 1/2345, /12345 (also no cut) where the slash represents the cut.
- riffling. For example look at 123/45. There are ${5 \choose 2} = 10$ possible ways to make a permutation from this – we decide which two slots to put the 4 and the 5 in, and then everything else is forced. For example if we decide that 4 and 5 will go in the second and fourth positions, we must get 14253. In general if we have k cards in the left-hand pile and $n-k$ cards in the right-hand pile, we have ${n \choose k}$ possible shuffles. Furthermore we can decompose any one of these permutations into two “rising sequences” in exactly one way, so it can come from exactly one cut — with a single exception. That exception is the identity permutation $12 \cdots n$, which we can obtain from any of the n+1 possible cuts — so summing ${n \choose k}$ over all the cuts gives $2^n$, and we must subtract n for the duplications. (If you put probabilities on this it becomes the Gilbert-Shannon-Reeds model.) A quick computational check of the $2^n - n$ count follows this list.
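As that sanity check, here is a minimal Python sketch (my own illustration; the `riffle_shuffles` helper is hypothetical, not from the post) that enumerates every cut and every interleaving of an n-card deck and counts the distinct results.

```python
from itertools import combinations

def riffle_shuffles(n):
    """All distinct orderings of cards 0..n-1 reachable by one riffle shuffle."""
    results = set()
    for k in range(n + 1):                      # cut the deck after the first k cards
        left, right = list(range(k)), list(range(k, n))
        for pos in combinations(range(n), k):   # positions taken by the left packet
            perm = [None] * n
            for p, card in zip(pos, left):      # left packet keeps its internal order
                perm[p] = card
            rest = (i for i in range(n) if i not in pos)
            for p, card in zip(rest, right):    # right packet fills the remaining slots
                perm[p] = card
            results.add(tuple(perm))
    return results

print(len(riffle_shuffles(5)))                  # prints 27 == 2**5 - 5
```

For n = 5 this gives 27 orderings, matching $2^5 - 5$: each non-identity result arises from exactly one cut, and the identity is the only duplicate.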
Eppstein asks how many riffle shuffles it takes for each permutation to have nonzero probability. Since there are $2^n - n$ outcomes of a single riffle shuffle, in k iterations there are at most $(2^n - n)^k$ possible results. There will actually be fewer, because there are relations in the subgroup of permutations generated by the riffle shuffles. (In less fancy language, there are sequences of different shuffles which give the same result. It’s the pedigree collapse of shuffling.)
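To see that collapse concretely, here is a rough follow-on sketch in Python (again my own, reusing the `riffle_shuffles` helper sketched above) that applies the 27 one-riffle outcomes of a five-card deck repeatedly and counts how many distinct orderings are reachable after k riffles.

```python
def reachable(n, k):
    """Deck orderings reachable by at most k riffle shuffles of n cards."""
    single = riffle_shuffles(n)        # the 2**n - n one-riffle outcomes (sketch above)
    decks = {tuple(range(n))}          # start from the deck in order
    for _ in range(k):
        # apply every one-riffle outcome s to every deck reached so far
        decks = {tuple(d[s[i]] for i in range(n)) for d in decks for s in single}
    return decks

for k in range(4):
    print(k, len(reachable(5, k)))
```

The $27^2 = 729$ two-riffle sequences collapse to far fewer than 729 distinct orderings, and all $5! = 120$ orderings first appear at k = 3, which is $\lceil \log_2 5 \rceil$.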
Let’s replace $2^n - n$ with $2^n$ to make the math easier. Now, there are $n!$ permutations; this is greater than $(n/e)^n$ by a standard bound. So just in order to have enough sequences of shuffles of length k to produce all $n!$ permutations, we have to have $(2^n)^k \ge (n/e)^n$, or, taking nth roots of both sides, $2^k \ge n/e$. Taking base-2 logs, we get $k \ge \log_2 n - \log_2 e$.
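To put numbers on this bound (my own arithmetic, not from the post), here is a short Python check for a standard 52-card deck; it also evaluates the slightly sharper version that compares $(2^n)^k$ with $n!$ directly instead of with $(n/e)^n$.

```python
import math

n = 52
# the bound above: k >= log2(n) - log2(e)
print(math.log2(n) - math.log2(math.e))       # ~4.26
# sharper version: (2**n)**k >= n!  <=>  k >= log2(n!) / n
print(math.lgamma(n + 1) / math.log(2) / n)   # ~4.34
```

Either way, the counting argument alone only forces at least 5 riffles for 52 cards, a bit below the $\lceil \log_2 52 \rceil = 6$ in Eppstein's bound below.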
Eppstein gives a dynamical systems argument that you need at least $\lceil \log_2 n \rceil$ shuffles – and it turns out that that’s enough. This is in comparison to the classic result of Bayer and Diaconis, which says that you need about $\frac{3}{2} \log_2 n$ shuffles to get a well-shuffled deck.
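For a concrete sense of the gap (again my own arithmetic, not from either paper), take a standard 52-card deck:

```python
import math

n = 52
print(math.ceil(math.log2(n)))    # 6: after 6 riffles every ordering is possible
print(1.5 * math.log2(n))         # ~8.55: the Bayer-Diaconis (3/2) log2 n scale
```

So making every ordering merely possible takes noticeably fewer riffles than making all orderings close to equally likely.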