Different probabilities depending on meaning?

In summary: the probability of a 6 on any single roll of a fair die is 1/6, regardless of what has come before; 1/7776 is the probability, before any rolls are made, of five 6s in a row. By contrast, a sequence such as 1,2,3,4,5,6,1,2,3,4,5,6,... would not show independence, since its conditional probabilities are solidly skewed.
  • #1
homer5439
Here it is.
The probability of getting a 6 when rolling a die is 1/6.
The probability of getting 5 consecutive 6s when rolling the die 5 times is 1/7776 (1/6*1/6*1/6*1/6*1/6).

So far so good.

But let's assume I've rolled the die 4 times, and I got a 6 every time so far.
The probability of getting a 6 at the next roll is 1/6, but if that happens, that also means I get five 6s in a row, which has a probability of 1/7776. So one may say that the probability of getting a 6 at the next roll (and thus of getting five 6s in a row) is 1/7776.

Clearly, one of the two views is wrong, and I think it's the 1/7776 one, but I'd like to understand why. It seems that depending on the meaning you give to the event, its probability changes.

Thanks
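A quick Monte Carlo check of the two numbers being compared (a minimal sketch, assuming a fair six-sided die; the two estimates should land near 1/7776 and 1/6 respectively):

[code]
# Minimal sketch: compare "five 6s in five rolls" with "a 6 on the next roll,
# given four 6s already rolled" (assumes a fair six-sided die).
import random

TRIALS = 1_000_000

# (1) Five 6s in five fresh rolls: expect roughly 1/7776 ~ 0.000129.
five_sixes = sum(
    all(random.randint(1, 6) == 6 for _ in range(5))
    for _ in range(TRIALS)
)
print("P(five 6s in five rolls) ~", five_sixes / TRIALS)

# (2) A 6 on the fifth roll when four 6s are already on the table: the four 6s
#     are fixed history, so only the fifth roll is simulated. Expect ~ 1/6.
next_six = sum(random.randint(1, 6) == 6 for _ in range(TRIALS))
print("P(6 on the fifth roll | four 6s) ~", next_six / TRIALS)
[/code]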
 
  • #2
homer5439 said:
The probability of getting 5 consecutive 6s when rolling the die 5 times is 1/7776 (1/6*1/6*1/6*1/6*1/6). But let's assume I've rolled the die 4 times, and I got a 6 every time so far.
The probability of getting a 6 at the next roll is 1/6, but if that happens, that also means I get five 6s in a row, which has a probability of 1/7776. So one may say that the probability of getting a 6 at the next roll (and thus of getting five 6s in a row) is 1/7776.

Clearly, one of the two views is wrong, and I think it's the 1/7776 one, but I'd like to understand why. It seems that depending on the meaning you give to the event, its probability changes.

Thanks

This is a common misunderstanding of what we call independent events. You can only properly assign probabilities to unknown outcomes. Once an outcome is known, it exists with a probability of one. The probability of a given outcome with each roll of a fair die is 1/6. The probability of getting the same result n consecutive times before the first roll is [tex](1/6)^{n}[/tex].
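For n = 5 this gives the figure from the first post:

[tex]
(1/6)^{5} = \frac{1}{7776} \approx 0.000129
[/tex]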
 
  • #3
They are two very different situations. The first one is A: "rolling five consecutive 6s", the other is B: "rolling five consecutive 6s, given that we have rolled four consecutive 6s". Notationally these are respectively P(A) and P(B)=P(A|C), where C: "rolling four consecutive 6s". The first is the expectation of rolling five consecutive 6s; the second is the expectation of rolling five consecutive 6s when we have already rolled four. Naturally, the more positive information you have, the more you can expect the desired final outcome.
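Written out with the definition of conditional probability (a short check, assuming a fair die and independent rolls, and using that A is contained in C, so [itex]A \cap C = A[/itex]):

[tex]
P(A \mid C) = \frac{P(A \cap C)}{P(C)} = \frac{P(A)}{P(C)} = \frac{(1/6)^{5}}{(1/6)^{4}} = \frac{1}{6}
[/tex]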
 
  • #4
Jarle said:
They are two very different situations. The first one is A: "rolling five consecutive 6s", the other is B: "rolling five consecutive 6s, given that we have rolled four consecutive 6s". Notationally these are respectively P(A) and P(B)=P(A|C), where C: "rolling four consecutive 6s". The first is the expectation of rolling five consecutive 6s; the second is the expectation of rolling five consecutive 6s when we have already rolled four. Naturally, the more positive information you have, the more you can expect the desired final outcome.

That's just wrong. These are not conditional probabilities. They are assumed to be independent probabilities. The prospective probabilities in rolling a fair die are not in any way influenced by what happened before.

There are many threads in this forum and others where this or similar situations have been discussed.
 
  • #5
SW VandeCarr said:
That's just wrong. These are not conditional probabilities. They are assumed to be independent probabilities. The prospective probabilities in rolling a fair die are not in any way influenced by what happened before.

Surely "Rolling 6 five consecutive times in five rolls, given that we have rolled four 6" (written A|C in my notation) is a conditional probability. And obviously P(A|C) = 1/6, if that is what you are questioning.
 
  • #6
Jarle said:
Surely "Rolling 6 five consecutive times in five rolls, given that we have rolled four 6" (written A|C in my notation) is a conditional probability. And obviously P(A|C) = 1/6, if that is what you are questioning.

Yes. The probability for a given outcome in one roll is 1/6. It's not a conditional probability in a practical sense. It's an independent probability. You could write a statement P(A|C) = P(C|A=1/6)P(A=1/6)/P(C=1/6) = 1/6. The point is that prior events have no effect on the outcome.
 
  • #7
SW VandeCarr said:
Yes. The probability for a given outcome in one roll is 1/6. It's not a conditional probability. It's an independent probability.

I may be wrong here, but for true independence, as you have said, SW VandeCarr, the previous results do not affect the future results. If this is the case, then over a very long run of rolls, the conditional probabilities for all the different combinations will all be the same as well and thus prove independence.

Like for example if we did an experiment and we got

1,2,3,4,5,6,1,2,3,4,5,6,1,2,3,4,5,6...

then obviously we don't have independence since the conditional probabilities are solidly skewed.

But yeah, if you had a sequence of rolls with no definitive pattern, where the conditional probabilities showed no dependence on past values, then that should prove that all events are independent for all rolls in the experiment.
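One way to see this concretely is to estimate the frequency of a 6 immediately following a 6 in a long sequence, once for independent fair-die rolls and once for the deterministic cycle above (a minimal sketch, assuming a fair die):

[code]
# Minimal sketch: estimate P(next roll = 6 | current roll = 6) from a long
# sequence, for (a) independent fair-die rolls and (b) the cycle 1,2,3,4,5,6,...
import random

def freq_six_after_six(seq):
    # Relative frequency of a 6 immediately following a 6 in the sequence.
    pairs = [(a, b) for a, b in zip(seq, seq[1:]) if a == 6]
    return sum(b == 6 for _, b in pairs) / len(pairs)

N = 600_000
iid_rolls = [random.randint(1, 6) for _ in range(N)]   # independent rolls
cycle = [(i % 6) + 1 for i in range(N)]                 # 1,2,3,4,5,6,1,2,3,...

print("independent rolls:", freq_six_after_six(iid_rolls))  # close to 1/6
print("deterministic cycle:", freq_six_after_six(cycle))    # exactly 0, i.e. dependent
[/code]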
 
  • #8
chiro said:
yeah, if you had a sequence of rolls with no definitive pattern, where the conditional probabilities showed no dependence on past values, then that should prove that all events are independent for all rolls in the experiment.

Yes. Outcomes of processes like flipping fair coins or rolling fair dice are assumed to be independent, and experiments bear this out. The onus is on the one who argues otherwise to show that the fact that I toss heads at time 1 has any effect on the probability that I will toss heads again at time 2.

In fact the coins or dice don't even need to be fair, as long as their particular probabilities don't change over time. Even if such changes occurred, it would have to be shown that the changes were related to the previous handling of the dice or coins.

I don't agree that a pattern proves dependence. Patterns will occur with known probability.
 
  • #9
SW VandeCarr said:
Yes. Outcomes of processes like flipping fair coins or rolling fair dice are assumed to be independent, and experiments bear this out. The onus is on the one who argues otherwise to show that the fact that I toss heads at time 1 has any effect on the probability that I will toss heads again at time 2.

In fact the coins or dice don't even need to be fair, as long as their particular probabilities don't change over time. Even if such changes occurred, it would have to be shown that the changes were related to the previous handling of the dice or coins.

I don't agree that a pattern proves dependence. Patterns will occur with known probability.

I agree with you for the most part, with the exception of the example I gave above, where we have conditional probabilities equaling 100 per cent, in which case there really is no randomness.

Of course most systems don't work like this, but for the ones that somehow do, it's pretty clear that they are not random, even though each individual event has the same probability overall (i.e. the probability of getting any number = 1/6).

Edit: Note that I'm talking about the probability for a single event that doesn't include conditional probabilities.
 
  • #10
SW VandeCarr said:
Yes. The probability for a given outcome in one roll is 1/6. It's not a conditional probability in a practical sense. It's an independent probability. You could write a statement P(A|C) = P(C|A=1/6)P(A=1/6)/P(C=1/6) = 1/6. The point is that prior events have no effect on the outcome.

You are making no sense. This is a conditional probability simply because it is of the form "A, given C".

Consider the similarity to the example given by Wikipedia: http://en.wikipedia.org/wiki/Conditional_probability#Introduction
 
  • #11
Jarle said:
You are making no sense. This is a conditional probability simply because it is of the form "A, given C".

Consider the similarity to the example given by Wikipedia: http://en.wikipedia.org/wiki/Conditional_probability#Introduction

You defined A as rolling five consecutive sixes and C as rolling four consecutive sixes and it's true that P(A|C)=P(A)=1/6. However using Bayes Theorem:

P(A|C)=P(C|A)P(A)/P(C) we are required to condition C on A where C occurs before A. This does not make sense. You can write what you wish like P(A|C)=P(A) but you can't properly derive it since P(C)=1 and P(A)=1/6. That is, you are conditioning an outcome that is already known on one that has not yet occurred.
 
  • #12
SW VandeCarr said:
P(A|C)=P(C|A)P(A)/P(B) we are required to condition C on A where C occurs before A.

P(C|A) is one. If all five dice come up on sixes (A happens), then C happens necessarily. Stop trying to think about these things purely with intuition and really look at the sample space and how probabilities and conditional probabilities are defined

You can write what you wish like P(A|C)=P(A) but you can't properly derive it since P(C)=1 and P(A)=1/6.

P(C) is 1/6^4 and P(A) is 1/6^5 actually. So let's see what happens.

P(A|C) = P(C|A) P(A)/P(C) = 6^4/6^5 = 1/6
And that's what you want it to equal!
 
  • #13
Office_Shredder said:
P(C|A) is one. If all five dice come up on sixes (A happens), then C happens necessarily. Stop trying to think about these things purely with intuition and really look at the sample space and how probabilities and conditional probabilities are defined
P(C) is 1/6^4 and P(A) is 1/6^5 actually. So let's see what happens.

P(A|C) = P(C|A) P(A)/P(C) = 6^4/6^5 = 1/6
And that's what you want it to equal!

If P(A|C)=P(A)=1/6, how does P(A) also equal [tex]1/6^{5}[/tex], and if P(C)=1, how does P(C) also equal [tex]1/6^{4}[/tex]? Four in a row is given, so P(C)=1. Five in a row requires that P(A)=1/6.
 
  • #14
SW VandeCarr said:
If P(A|C)=P(A)=1/6, how P(A) also equal 4/6 and if P(C) =1, how does P(C) also equal 5/6?

A is rolling five sixes and C is rolling four sixes. These are not independent events. P(A|C) is 1/6, P(A) is not. I don't know where you got P(A) being 4/6 and P(C) being 5/6.
 
  • #15
SW VandeCarr said:
If P(A|C)=P(A)=1/6, how does P(A) also equal [tex]6^{4}[/tex], and if P(C)=1, how does P(C) also equal [tex]6^{5}[/tex]? Four in a row is given, so P(C)=1. Five in a row requires that P(A)=1/6.

To clear up this:

P(A)= 1/6^5 != 1/6
P(C)= 1/6^4 != 1
P(A|C) = 1/6
P(C|A) = 1
 
  • #16
Jarle said:
To clear up this:

P(A)= 1/6^5 != 1/6
P(C)= 1/6^4 != 1
P(A|C) = 1/6
P(C|A) = 1

This gives the right answers although I don't know how you go from [tex]1/6^{5}[/tex] to 1/6 or
[tex] 1/6^{4}[/tex] to 1. It doesn't matter how long the strings are; only how much they differ in length.

Note that in the correct Bayesian analysis P(C) is actually the prior probability and P(A|C) is actually the likelihood.

P(C)=1 P(A)=1/6 and P(A|C)=1/6

Then

P(C|A)=P(A|C)P(C)/P(A)=1 (posterior probability)

So the numbers come out right despite the fact that the conditional terms were reversed. You got away with it because of the fact that P(A|C)=P(A) and P(C|A)=P(C) which are the "conditions" of independence. If it were a true conditional probability problem, you would have gotten the wrong answers.
 
  • #17
SW VandeCarr said:
Note that in the correct Bayesian analysis P(C) is actually the prior probability and P(A|C) is actually the likelihood.

P(C)=1 P(A)=1/6 and P(A|C)=1/6

Then

P(C|A)=P(A|C)P(C)/P(A)=1 (posterior probability)

So the numbers come out right despite the fact that the conditional terms were reversed. You got away with it because of the fact that P(A|C)=P(A) and P(C|A)=P(C) which are the "conditions" of independence. If it were a true conditional probability problem, you would have gotten the wrong answers.

As previously mentioned, P(C) is not 1, it is 1/6^4; P(A) is not 1/6, it is 1/6^5; and "P(A|C)=P(A) and P(C|A)=P(C)" is not true. You have also come up with several unexplained terms which I fear have no well-defined mathematical equivalences, such as "prior probability" (presumably in your opinion different from "likelihood"), "posterior probability" and "true conditional probability". In any case they are clearly not necessary in this trivial problem and only serve to confuse both you and everyone else. You should brush up on elementary probability if you are still convinced of your calculations.
 
  • #18
SW VandeCarr said:
You defined A as rolling five consecutive sixes and C as rolling four consecutive sixes and it's true that P(A|C)=P(A)=1/6. However using Bayes Theorem:

P(A|C)=P(C|A)P(A)/P(B) we are required to condition C on A where C occurs before A. This does not make sense. You can write what you wish like P(A|C)=P(A) but you can't properly derive it since P(C)=1 and P(A)=1/6. That is, you are conditioning an outcome that is already known on one that has not yet occurred.

I don't think you meant to have B in the mix. I also disagree a little: I calculate

[tex]
P(A \mid C) = \frac{P(C \mid A) P(A)}{P(C)}
[/tex]

as you started. With the events as defined, if you know A has occurred (5 sixes in a row) then it has to be true that C has occurred, so that

[tex]
P(C \mid A) = 1
[/tex]

and this gives

[tex]
P(A \mid C) = \frac{P(C \mid A)P(A)}{P(C)} = \frac{1 \cdot \left(\frac 1 6\right)^5}{\left(\frac 1 6\right)^4} = \frac 1 6
[/tex]

as it must.
 
  • #19
Jarle said:
You have also come up with several unexplained terms which I fear have no well-defined mathematical equivalences, such as "prior probability" (presumably in your opinion different from "likelihood"), "posterior probability" and "true conditional probability".

There is so much wrong with this statement - and the rest of it that I haven't included - that I am not sure where to begin. If you find these terms unexplained and not well-defined, it is you who needs to study more probability.
 
  • #20
statdad said:
There is so much wrong with this statement - and the rest of it that I haven't included - that I am not sure where to begin. If you find these terms unexplained and not well-defined, it is you who needs to study more probability.

Granted, I withdraw my statement, which was too hasty, but there is nothing "untrue" about this conditional probability problem as he proposed it.
 
  • #21
Jarle said:
To clear up this:

P(A)= 1/6^5 != 1/6
P(C)= 1/6^4 != 1
P(A|C) = 1/6
P(C|A) = 1

You have stated (correctly) that P(A)=1/6 and P(C)=1, and correctly that P(A|C)=1/6. The probability 1/6^5 of a string of that length is irrelevant, as is 1/6^4. What matters is the difference between them, which is the probability of extending the string by one toss, or 1/6, the independent probability P(A).

Just to rephrase: The string of length four is given. Therefore the probability of this string is 1. The probability of extending this string to a string of length five is 1/6, given the string of length four.

I have shown that P(A|C)=P(A). You have already conceded that. I have also shown that P(C|A)=P(C) proving that A and C are independent.

Look up likelihood to see how it relates to the posterior probability in Bayes Theorem.

I'm afraid it's you who doesn't understand conditional probability. Are you seriously saying that P(A|C) doesn't equal P(A)? You already conceded that it does. Are you seriously going to argue that this doesn't establish that A and C are independent, and that if they are independent, they are not conditional on one another? Go back to the wiki article you linked for my edification.

I'm not going to continue this since I'm already repeating myself.
 
  • #22
statdad said:
There is so much wrong with this statement - and the rest of it that I haven't included - that I am not sure where to begin. If you find these terms unexplained and not well-defined, it is you who needs to study more probability.

Thanks statdad. I didn't see your posts until I posted my last post. It bugs me that some people who seem (and may be) pretty smart don't know what they don't know.

EDIT: I corrected B for C in post 11. Thanks.
 
  • #23
SW VandeCarr said:
You have stated (correctly) that P(A)=1/6 and P(C)=1, and correctly that P(A|C)=1/6. The probability 1/6^5 of a string of that length is irrelevant, as is 1/6^4. What matters is the difference between them, which is the probability of extending the string by one toss, or 1/6, the independent probability P(A).

Just to rephrase: The string of length four is given. Therefore the probability of this string is 1. The probability of extending this string to a string of length five is 1/6, given the string of length four.

I have shown that P(A|C)=P(A). You have already conceded that. I have also shown that P(C|A)=P(C) proving that A and C are independent.

As you can see, I have stated that P(A) is not 1/6, and that P(C) is not 1, which is obvious. A is, as I have defined it, "rolling five 6s in five rolls". Also, P(A|C) is not equal to P(A), for obvious reasons. P(C|A) is not equal to P(C), also for obvious reasons. I have given the correct values in a list, which I can restate here:
P(A)= 1/6^5
P(C)= 1/6^4
P(A|C) = 1/6
P(C|A) = 1
 
  • #24
It seems to me that there is a misunderstanding here, because you guys appear to be working in different probability spaces.

It appears Jarle is arguing in the probability space of five independent die rolls (here, there are 6^5 different outcomes, each occurring with probability 1/6^5), with A the event consisting of the single outcome (6, 6, 6, 6, 6), and C the event consisting of those outcomes in which the first four rolls are 6 (there are 6 outcomes in C).

As far as I can tell, SW VandeCarr is arguing in the probability space consisting of the fifth roll of a die, after it has come up 6 four times (there are 6 outcomes: (6, 6, 6, 6, 1), (6, 6, 6, 6, 2), (6, 6, 6, 6, 3), (6, 6, 6, 6, 4), (6, 6, 6, 6, 5), and (6, 6, 6, 6, 6), each occurring with probability 1/6), with A the event consisting of the single outcome (6, 6, 6, 6, 6), and C the event consisting of those outcomes in which the first four rolls are 6 (i.e. the whole probability space).

If this is the case, then you're both right, using your own definitions. Please let me know if I've misrepresented anybody's viewpoint.
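To check both readings explicitly, one can enumerate the two sample spaces described above (a minimal sketch; the probability values are my own restatement of the figures already given in this thread):

[code]
# Minimal sketch: enumerate the two probability spaces described above.
from itertools import product
from fractions import Fraction

# Framing 1: the space of all 6^5 equally likely outcomes of five rolls.
space = list(product(range(1, 7), repeat=5))
A = [w for w in space if w == (6, 6, 6, 6, 6)]      # five 6s in a row
C = [w for w in space if w[:4] == (6, 6, 6, 6)]     # first four rolls are 6
print("P(A)   =", Fraction(len(A), len(space)))     # 1/7776
print("P(C)   =", Fraction(len(C), len(space)))     # 1/1296
print("P(A|C) =", Fraction(len(A), len(C)))         # 1/6

# Framing 2: the restricted space of the 6 outcomes whose first four rolls are 6.
restricted = C
A_restricted = [w for w in restricted if w == (6, 6, 6, 6, 6)]
print("P(A) in the restricted space =", Fraction(len(A_restricted), len(restricted)))  # 1/6
[/code]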
 
  • #25
Moo Of Doom said:
It seems to me that there is a misunderstanding here, because you guys appear to be working in different probability spaces.

It appears Jarle is arguing in the probability space of five independent die rolls (here, there are 6^5 different outcomes, each occurring with probability 1/6^5), with A the event consisting of the single outcome (6, 6, 6, 6, 6), and C the event consisting of those outcomes in which the first four rolls are 6 (there are 6 outcomes in C).

As far as I can tell, SW VandeCarr is arguing in the probability space consisting of the fifth roll of a die, after it has come up 6 four times (there are 6 outcomes: (6, 6, 6, 6, 1), (6, 6, 6, 6, 2), (6, 6, 6, 6, 3), (6, 6, 6, 6, 4), (6, 6, 6, 6, 5), and (6, 6, 6, 6, 6), each occurring with probability 1/6), with A the event consisting of the single outcome (6, 6, 6, 6, 6), and C the event consisting of those outcomes in which the first four rolls are 6 (i.e. the whole probability space).

If this is the case, then you're both right, using your own definitions. Please let me know if I've misrepresented anybody's viewpoint.

The issue is whether the probability of rolling four 6s in a row is relevant. The run could be of any length; it could be 24. Once the sequence is rolled, it exists with P=1. Given any prior string of n consecutive 6s, what is the probability of rolling another 6 (or extending the sequence to n+1)? Regardless of what happened before, the probability of rolling a 6 is ALWAYS 1/6. It is NOT conditional on the previous outcome.

Each roll of a fair die is independent just as each toss of a fair coin is independent. Using the notation of conditional probabilities P(A|C)=P(A) can only be true if P(C)=1, which is the same as saying P(A) is independent of C. We can say that C consists of 4 consecutive sixes but P(C) after 4 (or 40) consecutive rolls of 6 is 1 after the fact and it doesn't affect the probability of the next roll. I don't think that Jarle meant to say that it did, at least I hope not. I think he simply doesn't understand that if P(A|C)=1/6 (with which he agrees) then P(A) MUST equal 1/6; that is, P(A|C)=P(A) if P(A) is independent.
 
  • #26
First of all, you need to know what randomness is: a series of events or a single event. But the problem is that we do not know what randomness is. So both of your points of view are right.
 
  • #27
SW VandeCarr said:
Each roll of a fair die is independent just as each toss of a fair coin is independent. Using the notation of conditional probabilities P(A|C)=P(A) can only be true if P(C)=1, which is the same as saying P(A) is independent of C. We can say that C consists of 4 consecutive sixes but P(C) after 4 (or 40) consecutive rolls of 6 is 1 after the fact and it doesn't affect the probability of the next roll. I don't think that Jarle meant to say that it did, at least I hope not. I think he simply doesn't understand that if P(A|C)=1/6 (with which he agrees) then P(A) MUST equal 1/6; that is, P(A|C)=P(A) if P(A) is independent.

This is ridiculous. I think anyone in this thread can agree that P(A) = 1/6^5, P(C) = 1/6^4, P(A|C) = 1/6, and P(C|A) = 1, all following from elementary probabilistic observations. I have clearly defined A and C, so there should be no ambiguity there.

It is absolutely clear here that A and C are not independent variables. If you have rolled four 6s, then it obviously affects the probability of rolling five 6s at that point, since you only have one more to go. It pains me to even explain that.
 
  • #28
Aleksey said:
First of all, you need to know what randomness is: a series of events or a single event. But the problem is that we do not know what randomness is. So both of your points of view are right.

Regardless of your opinion regarding randomness, it is true that the probability of rolling four sixes in a row is [tex](1/6)^{4}[/tex] before the fact. After the fact, it is a given and has a probability of one. If you condition the next roll of the die on a probability of one, it is the same as saying the next roll of the die is independent.

If you say that the probability of a six on the fifth roll is conditional on a probability of [tex]1/6^{4}[/tex], either you do not understand the difference between independent probabilities and conditional probabilities OR you are saying that the fifth roll of the die is not independent. Is that what you're saying? Do you (and other prior and would be posters) understand my question?
 
  • #29
Jarle said:
It is absolutely clear here that A and C are not independent variables. If you have rolled four 6s, then it obviously affects the probability of rolling five 6s at that point, since you only have one more to go. It pains me to even explain that.

Who said anything about independent variables? We are talking about independent probabilities. I guess you're coming right out and saying that the probability of the outcome of the fifth roll of the die depends on the previous four rolls. That pains me and should pain anyone else who knows that the probability of the outcome of each roll of a fair die or each flip of a fair coin is assumed to be independent.
 
  • #30
SW VandeCarr said:
Who said anything about independent variables? We are talking about independent probabilities.

Do you believe independent probabilities are something detached from independent variables and events? In that case I would like to hear your definition of independent probabilities.

SW VandeCarr said:
I guess you're coming right out and saying that the probability of the outcome of the fifth roll of the die depends on the previous four rolls.

You have grossly misunderstood my posts; of course I haven't said such an absurd thing. I have explicitly defined my variables and assigned them explicit probabilities, which I believe no one here will disagree with. I have no idea what you are talking about.
 
  • #31
Jarle said:
You have grossly misunderstood my posts; of course I haven't said such an absurd thing. I have explicitly defined my variables and assigned them explicit probabilities, which I believe no one here will disagree with. I have no idea what you are talking about.

It is absolutely clear here that A and C are not independent variables. If you have rolled four 6s, then it obviously affects the probability of rolling five 6s at that point, since you only have one more to go. It pains me to even explain that.

Perhaps you can explain what you mean then. The probability of getting five in a row (A) given you have four in a row (C) is 1/6. But it doesn't have to be four in a row. It can be any number n. The probability of extending the string to n+1 is always going to be 1/6. It's not dependent on the length of C or the probability of a string of that length.

I'm not sure I'm ever going to convince you, but I'm responding for the benefit of others who are following this thread.
 
  • #32
SW VandeCarr said:
Perhaps you can explain what you mean then. The probability of getting five in a row (A) given you have four in a row (C) is 1/6.

Yes, this event is called A|C, or "A given C", and P(A|C) = 1/6. You are confusing A|C with A.

SW VandeCarr said:
But it doesn't have to be four in a row. It can be any number n. The probability of extending the string to n+1 is always going to be 1/6. It's not dependent on the length of C or the probability of a string of that length.

We are talking about explicit events here, A and C.
 
  • #33
SW VandeCarr said:
Regardless of your opinion regarding randomness, it is true that the probability of rolling four sixes in a row is [tex](1/6)^{4}[/tex] before the fact. After the fact, it is a given and has a probability of one. If you condition the next roll of the die on a probability of one, it is the same as saying the next roll of the die is independent.

If you say that the probability of a six on the fifth roll is conditional on a probability of [tex]1/6^{4}[/tex], either you do not understand the difference between independent probabilities and conditional probabilities OR you are saying that the fifth roll of the die is not independent. Is that what you're saying? Do you (and other prior and would be posters) understand my question?

"The fifth roll of the die is not independent" - exactly. It depends of my opinion regarding randomness.
 
  • #34
"As previously mentioned P(C) is not 1"

In less sophisticated language than SW would use (I hope I'm not butchering your argument, SW - if so, feel free to slap me about).

If you ask at the start of the rolls "what is the probability of obtaining four sixes in a row?" then [itex] P(C) = (1/6)^4 [/itex].

If you have already observed those four sixes in a row, and are starting calculations from that point, you treat [itex] P(C) = 1 [/itex] - you know they've occurred. In the first case [itex] A [/itex] refers to this question: "What is the probability that in the first five rolls you obtain five sixes?" This leads to [itex] P(A \mid C) = 1/6 [/itex], as we've seen.

In the second case [itex] A [/itex] can be interpreted as "What is the probability that, continuing on, we finish the next roll and have five sixes in a row?" The only way that can happen is for the next roll to be a six, since we've seen four. In this sense, all calculation refers to one roll only, but we still obtain [itex] 1/6 [/itex] as the answer; in this case, however, [itex] P(A) = 1/6 = P(A \mid C) [/itex] because the setting of the problem is different from our first case.
 
  • #35
"The fifth roll of the die is not independent" - exactly. It depends of my opinion regarding randomness."

Not an opinion - you would need to put forth a description of how one result could possibly be influenced by previous rolls.
 