- #141
vanesch
Vanadium 50 said: Personally, I think the standard Kent sets for risk ([tex]10^{-22}[/tex]) is absurd. Consider the risk that if we open a bottle, an angry genie will emerge from it and wipe out all life on earth. Since we've made maybe a trillion bottles to date, and this hasn't happened yet, we only know that the risk is less than around ([tex]10^{-12}[/tex]), a full ten billion times larger than Kent would permit. According to his argument, we should ban bottles.
I think this is an important point to make indeed. There are two kinds of risks associated with "doing new things". One risk is the objective probability that it will go wrong and do something evil, like eat up the Earth. If the objective probability of such an event is stated to be p (say, 1 in 1 000 000 for instance), that means that if we were to repeat this experiment on, say, 100 million earth-like planets, then on average 100 of them would be eaten up/blown up/whatever it was, and 99 999 900 of them would be OK.
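Just to make the frequentist reading of that number concrete, here is a minimal sketch in Python, using exactly the figures from the paragraph above (p = 1 in 1 000 000, 100 million planets); the variable names are mine, purely for illustration:

```python
# Frequentist reading of an "objective" probability p: repeat the
# experiment on many independent earth-like planets and ask how many
# suffer the catastrophe on average.
p = 1e-6            # stated objective probability of catastrophe
n_planets = 100_000_000   # earth-like planets repeating the experiment

expected_destroyed = n_planets * p      # average number destroyed
expected_ok = n_planets - expected_destroyed

print(expected_destroyed)   # 100 planets destroyed, on average
print(expected_ok)          # 99 999 900 planets OK, on average
```

This is all the "objective risk" statement claims: an average over an ensemble of identical trials.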
The other risk is the subjective risk (Bayesian "belief" risk) of weird things happening. This is not a probability in the frequentist sense, but a degree of belief in the existence of a specific phenomenon, like evil genies popping out of bottles. Although certain theorists might feel insulted, the hypothesis that tiny black holes made at LHC energies would not undergo Hawking radiation is, to me, not very far from genies popping out of a bottle. What "risk" is associated with such hypothetical phenomena? If the phenomenon doesn't exist - that is, if it was just a quirk in the mind of a theorist or a storyteller - then you can repeat the experiment on a gazillion earth-like planets and it will NEVER happen. If, on the other hand, the phenomenon exists, it can occur at ANY rate, even "for sure". What do we do then? It cannot be excluded that totally unexpected and weird things happen, and if we really do something totally new for the very first time in the universe, it is totally impossible to know. We might offend a deity which collapses the entire universe just by uttering a new phrase, we might get a genie out of a bottle, we might invoke the "coming of the great white handkerchief", or we might turn the Earth into a black hole.
Happily, most things we can do (except for uttering phrases) already happen somewhere in the universe, so by analyzing these occurrences we can sometimes obtain upper limits on the probability of the phenomenon happening, under the hypothesis that the phenomenon is possible in the first place. But note that these estimates are not estimates of the probability that this will happen: they are conservative upper bounds.
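The bottle argument quoted above is exactly this kind of conservative upper bound. A minimal sketch of the standard calculation (sometimes called the "rule of three"), assuming each bottle-opening is an independent Bernoulli trial with the same unknown probability p, and that the trial count of a trillion is taken from the quote:

```python
# Conservative upper bound on the probability of an event that has
# never been observed in n independent trials: the largest p still
# consistent (at the given confidence level) with seeing zero events,
# i.e. the largest p with (1 - p)**n >= 1 - confidence.
def upper_bound(n_trials, confidence=0.95):
    return 1.0 - (1.0 - confidence) ** (1.0 / n_trials)

n_bottles = 10**12   # roughly a trillion bottles opened, no genie yet
print(upper_bound(n_bottles))   # about 3e-12
```

For zero observed events the bound is approximately 3/n, hence the [tex]10^{-12}[/tex] scale in the quote; and note again that this is an upper bound under the hypothesis that the phenomenon exists at all, not an estimate of its actual probability.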
Of course, from a decision PoV, Bayesian "ignorance" probability is probably to be treated on the same footing as genuine frequentist probability.
The danger of fixing too low a probability threshold for weird things to happen is that we will forever be stifled in attempting new things - even, at first, totally inoffensive things, like writing phrases (which might invoke magical powers).