Calculating Entropy for 26 Particles in Box

  • Thread starter: Fruitbraker
  • Tags: Entropy
In summary, the total number of microstates for this configuration is (52 choose 26) × 1000^52, so the entropy is ln[(52 choose 26) × 1000^52] ≈ 393 in units of Boltzmann's constant; the initial answer of 213.439 was missing the 1000^26 sub-state factor from the right half.
  • #1
Fruitbraker

Homework Statement


52 distinguishable particles have been in a box long enough to reach equilibrium. The box is divided into two equal-volume cells. Let's say that there are 10^3 sub-states (s1 through s1000) available to each particle on each side, regardless of how many other particles are around. (In a more realistic case that number would be much greater and depend on the temperature, but our key results would not change.) So a microstate is specified by giving the position (left half or right half) of every particle and its sub-state within this half.

Edit: forgot to add the question

What is the entropy of this configuration? (26 particles in the left side of the box)

Homework Equations


1. Use the binomial coefficient (from the binomial distribution) to count the number of ways 26 of the 52 particles can be in the left side of the box.
2. S = ln(Ω) for the entropy (in units of Boltzmann's constant), where Ω is the number of accessible microstates. (See the sketch after this list.)
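A minimal Python sketch of these two equations, assuming entropy is measured in units of Boltzmann's constant (math.comb gives the binomial coefficient):

```python
import math

# 1. Binomial coefficient: ways to choose which 26 of the 52
#    distinguishable particles sit in the left half.
ways_left = math.comb(52, 26)

# 2. Entropy in units of k_B is the natural log of the number of
#    accessible microstates Omega.
def entropy(omega):
    return math.log(omega)

print(ways_left)           # 495918532948104
print(entropy(ways_left))  # ~33.84 (positional factor alone)
```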

The Attempt at a Solution


Using the binomial formula, 52 choose 26 yields 495918532948104 possibilities for 26 particles to be in the left side of the box. Each of those 26 particles also has 1000 different sub-states, which means there are a total of 1000^26 different sub-state combinations on the left side for a particular combination of particles.

Therefore, the total number of microstates is (52 choose 26) * 1000^26. Taking the natural log of that yields 213.439, which isn't the entropy of this particular configuration.
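As a quick numerical check, this attempt corresponds to the following sketch in Python (it reproduces the 213.44 that the online system rejects):

```python
import math

# Original attempt: only the 26 particles on the left are given
# their 1000 sub-states each.
omega_attempt = math.comb(52, 26) * 1000**26
print(math.log(omega_attempt))  # ~213.44 -- rejected by the online check
```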

What am I doing wrong?

Thanks all!
 
  • #2
Hello Fruit,

What about the other side of the divider?

(and how do you know your answer isn't right?)
 
  • #3
Hi BvU,

I know my answer isn't right because it's an online homework assignment with real time feedback.

If I consider the right side as well and add it on top of the left side, then I should have my original answer doubled. But that is still wrong.
 
  • #4
Fruitbraker said:
then I should have my original answer doubled
No. Do you understand why not?
 
  • #5
Aha! I had the right idea.

There are 52 choose 26 different combinations for 26 particles to be in the left side. Then there is a factor of 1000^26 in sub-states from each side of the box, so 1000^52 in total. Apparently I was a factor of 1000^26 off in the natural log. Thanks!
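A quick Python check of the corrected count (entropy again in units of Boltzmann's constant):

```python
import math

# Corrected count: the positional factor C(52, 26) appears once, and
# every one of the 52 particles (left and right) has 1000 sub-states.
omega = math.comb(52, 26) * 1000**52
print(math.log(omega))  # ~393.04, i.e. 213.44 + 26*ln(1000)
```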
 
  • #6
Well done. The factor 495918532948104 only appears once.
 

FAQ: Calculating Entropy for 26 Particles in Box

What is entropy?

Entropy is a measure of the disorder or randomness of a system. In the context of particles in a box, it is proportional to the logarithm of the number of possible arrangements (microstates) of the particles in the box.

How do you calculate entropy for 26 particles in a box?

The formula for the entropy of a system is S = k ln(W), where S is the entropy, k is the Boltzmann constant, and W is the number of microstates, i.e. possible arrangements of the particles in the box. For the distinguishable particles in this problem, W = (52 choose 26) × 1000^52: the binomial coefficient counts the ways to choose which 26 of the 52 particles are in the left half, and each of the 52 particles independently occupies one of its 1000 sub-states. Taking the natural log gives S/k = ln[(52 choose 26) × 1000^52] ≈ 393.
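A short sketch of this calculation in Python, using the exact SI value of the Boltzmann constant (the variable names here are just illustrative):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

# Microstate count for this problem: choose which 26 of the 52
# distinguishable particles are on the left, times 1000 sub-states
# for each of the 52 particles.
W = math.comb(52, 26) * 1000**52

S = k_B * math.log(W)
print(math.log(W))  # ~393 (entropy in units of k_B)
print(S)            # ~5.4e-21 J/K
```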

What is the relationship between entropy and the number of particles in a box?

Entropy increases with the number of particles. In this model each distinguishable particle has the same fixed set of accessible single-particle states, so the total number of microstates grows exponentially with the particle number N, and the entropy S = k ln(W) grows linearly with N, i.e. it is directly proportional to N. More particles means more possible arrangements or microstates, and hence a higher degree of disorder or randomness (see the sketch below).
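As a rough illustration under this thread's model (two halves, 1000 sub-states per particle per half, distinguishable particles), summing over all left/right splits gives (2 × 1000)^N microstates, so S/k = N ln(2000) grows linearly with N:

```python
import math

M = 1000  # sub-states per particle per half, as in the thread

# Summing C(N, k) * M**N over all splits k gives (2*M)**N microstates,
# so the entropy in units of k_B is N * ln(2*M): linear in N.
for N in (13, 26, 52):
    omega_total = sum(math.comb(N, k) for k in range(N + 1)) * M**N
    print(N, round(math.log(omega_total), 2))
```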

How does increasing the volume of the box affect entropy?

In this simplified model, where each particle has a fixed number of sub-states regardless of the size of its half of the box, increasing the volume does not change the number of microstates, so the entropy is unchanged. For a real gas the number of accessible states does grow with volume (and depends on temperature), so expanding the box would increase the entropy.

Can entropy ever decrease?

According to the Second Law of Thermodynamics, the entropy of an isolated system can never decrease. This means that in an isolated system, the overall disorder or randomness will not decrease over time, and it increases in any irreversible process. However, the entropy of a specific component within a system can decrease if it exchanges energy with its surroundings; the total entropy of the component plus its surroundings still does not decrease.
