Entropy Calculation for Small Systems

In summary, the conversation discusses the calculation of entropy for systems with small numbers of molecules or cards. The equations used are the well-known entropy formulas, together with the binomial coefficient and the Boltzmann constant. The results for both examples are numerically small, but this is expected because the Boltzmann constant sets such a tiny scale. The conversation also touches on entropy as a macroscopic variable that relates to temperature and energy.
  • #1
Redsummers

Homework Statement



My question is more global than a specific exercise, but I will illustrate my question with two examples.

My question is: when determining entropy for systems of, say, 1000 molecules, 200 cards, or other small amounts (in contrast to the 6.022×10^23 molecules in one mole), is it okay to report the result using the well-known formulas (cited in section 2 below)?

Let me put two examples:

a) You have 5 identical sets of cards, each with 24 different cards. You shuffle the five sets together; calculate the entropy.

b) You have 1000 molecules of N2 and 250 molecules of O2 in two separate containers, and then you mix them together. Assume they reach equilibrium and that there is no temperature change, since the volume is large enough.

Homework Equations



For a) I would just use:

[tex]S = k_B \ln W[/tex]

Where W is the number of possible combinations, so:

[tex]S = k_B \ln \frac{n!}{(n-r)!\,r!}[/tex]
Now for b), similarly:

[tex]S = k_B \ln \Omega[/tex]

And [tex]\Omega[/tex] is the number of possible arrangements, i.e.:

[tex]S = k_B \ln \frac{N!}{N_1!\,N_2!}[/tex]

Where [tex]N = N_1 + N_2[/tex]
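
(For large N, I believe Stirling's approximation would give essentially the same answer without evaluating the factorials directly, written in terms of the fractions [tex]x_i = N_i/N[/tex]:)

[tex]S \approx -k_B N \left( x_1 \ln x_1 + x_2 \ln x_2 \right)[/tex]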

The Attempt at a Solution



The calculations are really straightforward at this point.

For a) I would use n = 24 and r = 5, which at the end gives me S = 10.65 k_B ≈ 1.47×10^-22 J/K.

And for b) I use N1 = 1000 and N2 = 250, so N = 1250. After plugging the numbers into the computer (since the number of different arrangements is rather large), I get S = 621.93 k_B ≈ 8.58×10^-21 J/K.
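
For reference, here is a minimal Python sketch of how the numbers can be obtained (the helper name ln_binom is my own choice; the log-gamma trick just avoids evaluating the huge factorials):

[code]
from math import lgamma

K_B = 1.380649e-23  # Boltzmann constant in J/K

def ln_binom(n, r):
    # Natural log of the binomial coefficient n!/((n-r)! r!),
    # computed with log-gamma so the large factorials never overflow.
    return lgamma(n + 1) - lgamma(n - r + 1) - lgamma(r + 1)

S_a = K_B * ln_binom(24, 5)      # a) cards:  ln W     ~ 10.66 -> ~1.5e-22 J/K
S_b = K_B * ln_binom(1250, 250)  # b) mixing: ln Omega ~ 621.9 -> ~8.6e-21 J/K

print(S_a, S_b)
[/code]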

As you can see, both results are really small. I don't know if I'm missing something, or whether there are other ways to deal with these statistical-entropy cases.
 
  • #2
I wouldn't worry about the units, as the Boltzmann constant will always produce such an unfamiliar exponent. Rather, think of it in relative terms, i.e. the multiplicity of one situation vs. another; the entropy is just a rescaling of that value. The inherent stat mech is in the multiplicity.
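
In other words, only the ratio of multiplicities matters when you compare two situations:

[tex]\Delta S = S_2 - S_1 = k_B \ln \frac{W_2}{W_1}[/tex]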

That being said, the binomial coefficient you are using may not be appropriate for the card situation. I agree that it works for the molecules, because there you essentially have a binary scenario: each molecule is of one type or the other. I'm not sure about card shuffling, though.
 
  • #3
Oh okay, that makes sense. But yeah, the card-shuffling approach was my guess based on what I know about combinations and permutations. I don't know either whether I should be using the Boltzmann constant or whether there is another way of quantifying that disorder.

Thanks anyway! Now I'm more relieved about the smallness of the results.
 
  • #4
If this helps make the results seem more sensible:

Entropy is a macroscopic variable that relates to temperature and energy. Imagine you drop an ice cube into a pot of water. The temperature will eventually change, maybe by 10 degrees, but think how many new microstates you created by dumping 10^23 new molecules into the system. This is why a huge change in multiplicity doesn't manifest as a huge change in entropy: the multiplicities only translate into a few degrees of temperature, etc.
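
As a rough illustration (the numbers are only order-of-magnitude: say 10 g of ice melting at 273 K, with a latent heat of about 334 J/g):

[tex]\Delta S = \frac{Q}{T} \approx \frac{(10\ \mathrm{g})(334\ \mathrm{J/g})}{273\ \mathrm{K}} \approx 12\ \mathrm{J/K} \quad\Rightarrow\quad \ln\frac{W_f}{W_i} = \frac{\Delta S}{k_B} \approx 9\times10^{23}[/tex]

So the multiplicity grows by a factor of roughly [tex]e^{9\times10^{23}}[/tex], yet the entropy change is only a dozen joules per kelvin.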

If you learn anything more about the cards, post back. I think there is a lot of information on it if you google card riffling.
 
  • #5
That's a good conceptual picture, definitely. And yeah, I will ask my professor in the next lecture, which will be this Friday, so if there are no answers before then I will post a more definitive result for the card example.
 

Related to Entropy Calculation for Small Systems

1. What is entropy and why is it important to calculate for small systems?

Entropy is a measure of the disorder or randomness in a system. It is important to calculate for small systems because it can help us understand the behavior and predict the changes in these systems.

2. How is entropy calculated for small systems?

Entropy can be calculated using statistical mechanics, which uses the number of possible microstates of a system and the probability of each microstate occurring to determine the overall disorder of the system.
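
For example, with microstate probabilities [tex]p_i[/tex] this is the Gibbs entropy, which reduces to the familiar Boltzmann form when all [tex]\Omega[/tex] microstates are equally likely:

[tex]S = -k_B \sum_i p_i \ln p_i \qquad\longrightarrow\qquad S = k_B \ln \Omega[/tex]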

3. Can entropy be negative for small systems?

No, entropy cannot be negative for small systems. According to the Second Law of Thermodynamics, the entropy of a closed system will either remain constant or increase over time, but it can never decrease.

4. What factors can affect the entropy of a small system?

The number of particles in the system, the energy of the particles, and the volume of the system are all factors that can affect the entropy of a small system. Changes in these factors can lead to changes in the disorder of the system.

5. How is entropy related to the concept of equilibrium in small systems?

Entropy is closely related to the concept of equilibrium in small systems. When a system reaches equilibrium, its entropy is at a maximum, meaning that the system is in its most disordered state. This is because at equilibrium, all possible microstates are equally likely to occur, resulting in maximum entropy.
