JoaLi
TL;DR Summary: Could the observed excess of matter over antimatter simply be the outcome of a statistical process with a mean and a standard deviation, so that a deviation from perfect symmetry would be the norm rather than the exception? Take as a model system Hawking radiation, where two particles from a vacuum fluctuation are separated at the event horizon, and this separation randomly decides whether the escaping particle is matter or antimatter. Could it have been like that at the beginning of our universe?
Dear colleagues,
I would like to share and discuss a simple explanation to the matter-antimatter asymmetry. Maybe this explanation is too simple, but your feedback is appreciated.
I would like to start with what inspired this idea: Hawking radiation. You know the event horizon of a black hole. Hawking radiation comes from particle pairs that are created by vacuum fluctuations and then separated by the event horizon, so that they cannot recombine into nothing. Since these vacuum fluctuations generate matter-antimatter pairs, there is always a separation between the two partners. Treated statistically, this process should produce equal amounts of matter and antimatter on each side of the event horizon - by the law of large numbers, the expected net value is zero.
Really? We can translate this into a coin-flipping experiment: after each throw the coin shows one of its two sides - call them matter and antimatter. If we flip 10 times, we expect on average 5 times side A and 5 times side B. In reality, however, there is not just the average but also the standard deviation. Doing a real coin-flipping experiment with, say, 10 throws, we may well end up with an asymmetry such as 6:4 or 7:3. In fact, a deviation from 5:5 is more likely than exactly 5:5.
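To make this concrete, here is a minimal Python sketch (purely illustrative; the run count RUNS is an arbitrary choice) that simulates many runs of 10 fair coin flips and measures how often the exact 5:5 split actually occurs:

```python
import random

RUNS = 100_000   # number of simulated 10-flip experiments (arbitrary)
FLIPS = 10       # flips per experiment: "matter" vs. "antimatter"

exact_split = 0
for _ in range(RUNS):
    matter = sum(random.randint(0, 1) for _ in range(FLIPS))
    if matter == FLIPS // 2:
        exact_split += 1

print(f"fraction of runs ending exactly 5:5: {exact_split / RUNS:.3f}")
# Theory: C(10,5) / 2^10 = 252/1024 ≈ 0.246,
# so roughly three out of four runs show some asymmetry.
```

So an exact 5:5 outcome occurs in only about a quarter of all runs; some asymmetry is the typical result.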
The interesting question is: will this deviation shrink, grow, or stay the same if we significantly increase the number of throws, e.g. from 10 up to 10^X, the number of pair-creation events at the birth of the universe (so presumably a really large number)?
The formula for the standard deviation of such a Bernoulli experiment is well known:
d = SQRT(n * p * (1 - p))
  = SQRT(n) * SQRT(p) * SQRT(1 - p)
with n the number of coin throws and p the probability of either outcome - here obviously 1/2.
Given p=1/2 as a constant, one can simplify the formula to:
d = SQRT(n) * const.
  = SQRT(n) * 1/2
So this answers the question of whether the standard deviation shrinks, grows, or stays the same when this Bernoulli experiment is run a very large number of times: the absolute deviation grows with the square root of the number of trials. (The relative deviation d/n shrinks like 1/(2 * SQRT(n)), but the absolute excess keeps growing.)
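As a sanity check of the SQRT(n)/2 scaling, here is another illustrative Python sketch (the trial sizes and the number of Monte Carlo runs are arbitrary assumptions) that evaluates the formula and compares it against a simulation:

```python
import math
import random

def theoretical_std(n: int, p: float = 0.5) -> float:
    """Standard deviation of the count in n Bernoulli trials: sqrt(n*p*(1-p))."""
    return math.sqrt(n * p * (1 - p))

def simulated_std(n: int, runs: int = 1_000) -> float:
    """Monte Carlo estimate of the standard deviation of the 'matter' count."""
    # getrandbits(n) yields n independent fair bits; the ones count as "matter".
    counts = [bin(random.getrandbits(n)).count("1") for _ in range(runs)]
    mean = sum(counts) / runs
    return math.sqrt(sum((c - mean) ** 2 for c in counts) / runs)

for n in (10, 100, 10_000, 100_000):
    d = theoretical_std(n)
    print(f"n={n:>7}: d={d:8.2f}  simulated={simulated_std(n):8.2f}  d/n={d/n:.5f}")
# The absolute deviation d grows like SQRT(n)/2,
# while the relative deviation d/n shrinks like 1/(2*SQRT(n)).
```

The simulated values track SQRT(n)/2, so the expected absolute excess of one side over the other keeps growing with n, even though it becomes an ever smaller fraction of the total.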
So if, at the beginning, matter was created at a separating layer like an event horizon, it would be normal to expect asymmetrical amounts of matter and antimatter on either side of that separator. No special rules or yet unknown laws of nature that somehow violate the symmetry would be needed - just plain statistics of random events, with randomness as a constitutive driving force of quantum mechanics. We see quantum mechanics behaving randomly all the time, so why not also in this case at the beginning?