- #1
Zacarias Nason
I just started learning some statistical mechanics from Leonard Susskind's lectures and am at a part that briefly reviews basic probability. One of the things brought up was F(i), some quantity associated with the i-th state of a system, and the important average of F(i) over the probability distribution:
[tex] \langle F(i) \rangle = \sum_{i} F(i)P(i) [/tex]
where F(i) is again some quantity associated with the i-th state and P(i) is the probability of that particular outcome. Susskind emphasizes that the average of F(i) does not have to be any of the values F(i) can actually take: e.g., if your system is a coin toss where heads and tails are assigned the values F(H) = 1 and F(T) = -1 respectively, then for a fair coin <F(i)> = 0, and the average over a very large number of tosses should approach zero, even though no single toss ever gives 0. If <F(i)> can't represent one of the values of F(i), what is the value or purpose of <F(i)>? What does it tell you? Why is it important?
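The coin-toss example in the question can be sketched numerically. This is a minimal illustration (the state labels, probabilities, and tolerance are my own choices, not from the lectures): it computes the exact expectation sum F(i)P(i) and then shows that the empirical average over many simulated fair tosses gets close to it, even though every individual toss yields +1 or -1, never 0.

```python
import random

# F assigns a value to each state; P is the (fair-coin) distribution.
# These labels and numbers mirror the coin-toss example in the question.
F = {"H": 1, "T": -1}
P = {"H": 0.5, "T": 0.5}

# Exact expectation: <F> = sum_i F(i) P(i)
expectation = sum(F[s] * P[s] for s in F)
print(expectation)  # 0.0 for the fair coin

# Empirical average over many tosses approaches the expectation,
# even though no single toss ever produces the value 0.
random.seed(0)  # arbitrary seed for reproducibility
n = 100_000
tosses = [random.choice(["H", "T"]) for _ in range(n)]
sample_mean = sum(F[t] for t in tosses) / n
print(sample_mean)  # close to 0, within statistical fluctuation ~1/sqrt(n)
```

The point the simulation makes is exactly Susskind's: <F> is not a possible outcome but the value the long-run average of outcomes converges to (the law of large numbers).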