Is there any way to measure how random something is?

In summary, the thread discusses the concept of randomness and ways to measure it, specifically with regard to shuffling a deck of cards. While it is impossible to determine from a single arrangement whether a deck of cards is truly random, there are tests designed for pseudorandom number generators that can indicate how well shuffled a sequence of decks is. However, these tests require a large amount of data and may not be practical for a deck of cards. Another measure of randomness is Boltzmann entropy, which takes into account the arrangement of cards with respect to a specific game. The thread also mentions the issue of bad shuffling causing repetitive card combinations, and suggests using statistics to track and improve the shuffling technique.
  • #1
DyslexicHobo
Odd question that at first seemed to have an obvious answer: no.

It all started when I realized that my friend is horrible at shuffling cards. The cards fall in large packets, so the deck ends up in layers of unshuffled runs rather than being fully randomized. Then I thought to myself: no matter how they are shuffled, aren't they just as random as if they had been shuffled 100 times?

Is there any way to measure how "well shuffled" cards are, or more generally, how random something is?
 
  • #3
You are correct that no matter what order the cards are in, there's no way to say from that order alone whether they are random or not. However, there are tests designed to determine whether pseudorandom number generators (http://en.wikipedia.org/wiki/Pseudorandom_number_generator) are producing decent results. Again, it is possible for a truly random source to produce a sequence that fails these tests, but it would be extremely unlikely. It's also possible for a sequence to pass the tests and still have a complex underlying pattern.

It's been a while since I messed with this stuff, but Diehard is more or less the industry standard. I found some newer suites that also worked well. The problem with these tests is that they are generally intended for computer-based PRNGs, so they expect input in binary form and typically require millions of bits worth of data. A single deck of cards carries only about 226 bits (log2 52! ≈ 225.6), so your friend would need to do a lot of shuffling to produce enough data for those tests.
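As a sanity check on that bit count, and on how much shuffling a PRNG test suite would demand: log2(52!) ≈ 225.6, so one fully shuffled deck carries about 226 bits. A quick stdlib-only sketch (the 10-million-bit figure below is just an assumed ballpark for a Diehard-style suite, not a documented requirement):

```python
import math

# Entropy of a uniformly random deck: log2(52!) bits.
deck_bits = math.log2(math.factorial(52))
print(round(deck_bits, 1))  # about 225.6 bits per fully shuffled deck

# Assuming a test suite wants on the order of 10 million bits of input,
# the number of independently shuffled decks needed is roughly:
decks_needed = math.ceil(10_000_000 / deck_bits)
print(decks_needed)  # tens of thousands of shuffles
```

So even under generous assumptions, gathering enough data by hand-shuffling is hopeless; these suites only make sense for machine-generated sequences.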


http://www.google.com/search?q=diehard+prng+test
 
  • #4
I agree with what the other posters have said about random number tests, but if we look at what the OP is trying to quantify I think that Boltzmann Entropy would be a better measure.

Boltzmann Entropy for a deck of cards D with respect to a particular game G could be defined as:

S = natural log of (the number of distinct decks D' equivalent to D for the purposes of a round of G)

From this definition it follows that the maximum-entropy state is one where the arrangement of the cards is 'most typical' with respect to G, while low-entropy states represent an atypical shuffling of the deck. Here is an analysis from another page (http://www.cs.unm.edu/~saia/infotheory.html):

Now since all permutations have equal probability in a random deck of cards, the entropy of that deck is log 52! = 225.6 bits. When we shuffle a deck of cards, that shuffle has entropy equivalent to log(52 choose 26) = 48.8 bits (we assume the deck is divided in half and a "riffle" shuffle is used). This means we should use a "riffle" shuffle 225.6/48.8 = 4.6, or about 5, times on average to ensure complete randomness. This computation is relatively simple because the probabilities of all events are assumed to be equal.
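The figures in the quoted passage are easy to reproduce with a couple of stdlib calls; a minimal check:

```python
import math

# Entropy of a uniformly random deck, and of one riffle-shuffle cut
# (choosing which 26 of the 52 positions the top half lands in),
# matching the quoted 225.6- and 48.8-bit figures.
deck_entropy = math.log2(math.factorial(52))
shuffle_entropy = math.log2(math.comb(52, 26))

print(round(deck_entropy, 1))                    # 225.6
print(round(shuffle_entropy, 1))                 # 48.8
print(round(deck_entropy / shuffle_entropy, 1))  # 4.6
```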

In my experience the problem with bad shuffling is that it causes card combinations similar to those in the previous round of the game to reappear, which is repetitive and therefore boring. It should be pretty simple to generate statistics about these kinds of repeats, just by counting them. Say something like: "Your shuffling has caused X incidents of repetition in the last Y rounds of the game, while with proper shuffling the probability of this is negligible." Finding out exactly how small "negligible" is would be a worthy homework problem in combinatorics.
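One way to generate such statistics is to count how many adjacent pairs from the original order survive a shuffle. The simulation below is only a rough sketch: the packet-dropping riffle is an assumed model of a "bad" shuffler, not any standard shuffling algorithm, and the pair count is just one possible repeat statistic.

```python
import random

def riffle(deck):
    """One crude riffle: cut near the middle, then interleave in packets
    of 1-4 cards, mimicking a shuffler who drops several cards at a time."""
    cut = len(deck) // 2
    left, right = deck[:cut], deck[cut:]
    out = []
    while left or right:
        # Drop a packet from a randomly chosen (non-empty) half.
        src = left if (left and (not right or random.random() < 0.5)) else right
        k = min(len(src), random.randint(1, 4))
        out.extend(src[:k])
        del src[:k]
    return out

def preserved_pairs(deck):
    """Count adjacent pairs (i, i+1) surviving from the sorted order 0..51."""
    return sum(1 for a, b in zip(deck, deck[1:]) if b == a + 1)

random.seed(1)
deck = list(range(52))
for _ in range(2):       # only two lazy riffles
    deck = riffle(deck)
bad = preserved_pairs(deck)

random.shuffle(deck)     # a properly uniform shuffle for comparison
rnd = preserved_pairs(deck)

print(bad, rnd)  # the lazy riffles leave far more original pairs intact
```

For a uniformly random deck the expected count is 51/52 ≈ 1 preserved pair, so a count well above that is direct evidence of under-shuffling.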
 

FAQ: Is there any way to measure how random something is?

How do we define randomness?

The concept of randomness refers to the lack of a predictable pattern or sequence in a set of data or events. It is often used to describe a phenomenon or process that is not subject to human control or influence.

Can we measure randomness objectively?

Yes, there are statistical methods and tests that can be used to measure the degree of randomness in a set of data. One common approach is to compare the observed data to a theoretical random distribution and compute a test statistic, as in the chi-square or Kolmogorov-Smirnov tests.
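As a concrete illustration of the chi-square approach, here is a stdlib-only sketch that tests die rolls for uniformity (a real analysis would use a library routine such as scipy.stats.chisquare to get a p-value):

```python
import random
from collections import Counter

# Chi-square goodness-of-fit: compare observed die-roll counts
# to the counts expected under the uniform hypothesis.
random.seed(0)
rolls = [random.randint(1, 6) for _ in range(6000)]
observed = Counter(rolls)
expected = len(rolls) / 6  # 1000 per face if the die is fair

chi2 = sum((observed[face] - expected) ** 2 / expected for face in range(1, 7))
print(round(chi2, 2))
# With 5 degrees of freedom, values below about 11.07 are
# consistent with uniformity at the 5% significance level.
```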

What factors can affect the randomness of a process?

There are many factors that can influence the randomness of a process, such as the design of the experiment, the selection of samples, and the precision of measurements. Additionally, external factors such as environmental conditions and human intervention can also impact the randomness of a process.

Is there a perfect way to measure randomness?

No, there is no universally agreed upon method for measuring randomness. Different statistical tests and measures may produce different results, and the choice of which method to use may depend on the specific context and purpose of the analysis.

How can we apply the concept of randomness in real-world situations?

Randomness is a fundamental concept in many fields of science, such as statistics, physics, and biology. It is used to model and understand complex systems, make predictions, and design experiments. In everyday life, we encounter randomness in activities like gambling, weather forecasting, and random number generation for security purposes.
