 # Coding Blackjack in Java Extra: Efficient Shuffles

In my previous post I mentioned that my implementation of shuffling was probably not very efficient if we were shuffling more items.

Which got me wondering: What’s the most efficient way to fairly shuffle N items?

#### Fisher-Yates Shuffling

Turns out there’s a clean algorithm floating around called the Fisher-Yates shuffle, which runs in O(N) time. This is an in-place shuffle, meaning it operates over an existing array. Allegedly, if we wanted to both create and shuffle the array at the same time, we could be a bit more optimal with an inside-out type algorithm, but let’s stick with the in-place version for now, i.e. we have an existing array that needs shuffling.
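For the curious, that inside-out variant might look something like this. This is my own sketch rather than anything from the original algorithm description: here I fill the array with the values 1 to N as an example source, but any sequence would do.

```java
import java.util.Arrays;
import java.util.Random;

public class InsideOut {
    // Inside-out Fisher-Yates: builds and shuffles the array in one pass.
    public static int[] shuffledRange(int n) {
        Random random = new Random();
        int[] a = new int[n];
        for (int i = 0; i < n; i++) {
            int j = random.nextInt(i + 1); // 0 <= j <= i
            a[i] = a[j];                   // move whatever was at j into the new slot
            a[j] = i + 1;                  // place the next value (1..n) at j
        }
        return a;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(shuffledRange(10)));
    }
}
```

Each new value gets dropped into a uniformly random spot among the slots filled so far, so no separate shuffle pass is needed afterwards.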

The idea of Fisher-Yates:

```
1. Start with N objects, let n = N
2. Generate a random number i between 1 and n
3. Swap element n (at the end) and element i (randomly picked to the left of, or equal to, n)
4. Decrease n by 1
5. Repeat steps 2 to 4 until n = 1
```

I mentioned that any implementation of a shuffle would require “remembering” what had been randomly picked. This algorithm implicitly does this by swapping elements out to the end of the array; an area that has already been iterated over.

#### Java Implementation

```
import java.util.Random;

public int[] FisherYates(int[] a) {
    Random random = new Random();
    for (int n = a.length - 1; n > 0; n--) {
        int i = random.nextInt(n + 1); // random index from 0 to n inclusive
        int temp = a[i];               // remember a[i] before overwriting it
        a[i] = a[n];
        a[n] = temp;
    }
    return a;
}
```

The `a.length - 1` is necessary because in Java the first element is at index 0, so we have to adjust by one from the steps outlined above. `nextInt()` has to be passed `n + 1` because it returns a number from 0 up to, but not including, its input; we want it to be possible for the element at the end to swap with itself (otherwise the shuffle isn’t fair). The temporary integer is needed to remember what was in position `i` before we overwrite it, so that elements `i` and `n` can be swapped properly.
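To make the `nextInt()` bound concrete, here’s a quick sketch (my own illustration) showing that `nextInt(3)` only ever returns 0, 1 or 2:

```java
import java.util.Arrays;
import java.util.Random;

public class NextIntDemo {
    public static void main(String[] args) {
        Random random = new Random();
        boolean[] seen = new boolean[3];
        // nextInt(3) returns 0, 1 or 2 - the bound itself is excluded,
        // which is why the shuffle passes n + 1 to include index n.
        for (int k = 0; k < 1000; k++) {
            seen[random.nextInt(3)] = true;
        }
        // With 1000 draws, all three values will almost certainly appear.
        System.out.println(Arrays.toString(seen));
    }
}
```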

#### Ensuring random shuffle

Let’s see if the algorithm looks fair. To be fair, we expect every permutation to be equally likely. Let’s test with [1, 2, 3] as the list to sort.

```
import java.util.Arrays;

public static void main(String[] args) {
    int[] a = new int[]{1, 2, 3};
    Shuffling shuffler = new Shuffling();
    int[] comboCounter = new int[6]; // one counter per permutation
    int[][] combos = new int[][]{
        new int[]{1, 2, 3}, new int[]{1, 3, 2},
        new int[]{2, 1, 3}, new int[]{2, 3, 1},
        new int[]{3, 1, 2}, new int[]{3, 2, 1},
    };
    for (int k = 0; k < 100000; k++) {
        int[] shuffled = shuffler.FisherYates(a);
        for (int j = 0; j < 6; j++) {
            if (Arrays.equals(shuffled, combos[j])) {
                comboCounter[j]++;
            }
        }
    }
    System.out.println(Arrays.toString(comboCounter));
}
```

Don’t worry too much about the code; the main thing is that it tells us how many times each arrangement of 1, 2, 3 ends up being the result of the shuffle:

`[16533, 16489, 16667, 16782, 16773, 16491]`
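One way to quantify “similar enough” is a chi-squared test against the uniform expectation of 100000 / 6 ≈ 16667 per permutation. This is my own addition rather than part of the original analysis:

```java
public class FairnessCheck {
    public static void main(String[] args) {
        // Counts from the run above; expected is 100000 / 6 per permutation.
        int[] counts = {16533, 16489, 16667, 16782, 16773, 16491};
        double expected = 100000.0 / 6.0;
        double chiSq = 0;
        for (int c : counts) {
            chiSq += (c - expected) * (c - expected) / expected;
        }
        // With 5 degrees of freedom, anything below ~11.07 is consistent
        // with a uniform distribution at the 5% significance level.
        System.out.println(chiSq); // ~6.3 for these counts
    }
}
```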

These are all similar enough to conclude the shuffle is fair, and if you don’t believe me, here’s a plot of 50 of these 100,000-shuffle runs: no line is unusually above or below the expected count. An even easier way to see this is to order each combination from most to least occurrences.

As a final nod to what made me look into this, let’s look at the difference between my blackjack shuffle and Fisher-Yates. I went back and re-coded my blackjack shuffle so that it shuffles integers instead of cards, keeping everything else the same.

###### Time taken for 1000 shuffles

It’s easy to see that doubling N increases the run-time by a factor of 4 ( O(N^2) ) for MBJack, whereas doubling N only doubles the run-time ( O(N) ) for Fisher-Yates, which is exactly as hypothesised in the original blog post.
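A comparison along those lines can be sketched as below. Since the original blackjack shuffle isn’t shown here, `naiveShuffle` is a hypothetical stand-in with the same O(N^2) character: it repeatedly picks a random element and removes it from a list, where each removal is O(N).

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class ShuffleTiming {
    static Random random = new Random();

    // Hypothetical O(N^2) shuffle: pick a random element, remove it from
    // the pool (an O(N) operation), and append it to the output.
    static int[] naiveShuffle(int[] a) {
        List<Integer> pool = new ArrayList<>();
        for (int x : a) pool.add(x);
        int[] out = new int[a.length];
        for (int i = 0; i < a.length; i++) {
            out[i] = pool.remove(random.nextInt(pool.size()));
        }
        return out;
    }

    // O(N) in-place Fisher-Yates, as above.
    static int[] fisherYates(int[] a) {
        for (int n = a.length - 1; n > 0; n--) {
            int i = random.nextInt(n + 1);
            int temp = a[i];
            a[i] = a[n];
            a[n] = temp;
        }
        return a;
    }

    public static void main(String[] args) {
        for (int n : new int[]{1000, 2000, 4000}) {
            int[] a = new int[n];
            for (int i = 0; i < n; i++) a[i] = i;

            long start = System.nanoTime();
            for (int k = 0; k < 1000; k++) naiveShuffle(a);
            long naive = System.nanoTime() - start;

            start = System.nanoTime();
            for (int k = 0; k < 1000; k++) fisherYates(a);
            long fy = System.nanoTime() - start;

            System.out.printf("N=%d  naive=%dms  fisherYates=%dms%n",
                    n, naive / 1_000_000, fy / 1_000_000);
        }
    }
}
```

Each doubling of N should roughly quadruple the naive timings while only doubling the Fisher-Yates ones, matching the table above.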

#### Conclusion

It’s a good thing there are only 52 cards in a deck.