Physics 301    24-Sep-2003

Entropy and Probabilities

We've been using the idea that the entropy is the logarithm of the number of states accessible to the system. We've also said that each state is equally likely. At this point, I'd like to make the connection between entropy and probability. This allows one to construct an expression for the entropy of a system that isn't in equilibrium. It should also improve our intuition about the entropy and the partition function.

We might expect that an expression for entropy can be written in terms of the probabilities that a system is in a particular state. If there are g states, and the probabilities are p_1, p_2, ..., p_g, then we would like to write

    σ = σ(p_1, p_2, ..., p_g) .

One of the things that will guide us in our selection of the function is that the entropy should be additive (i.e., an extensive parameter). If we have two non-interacting systems with total numbers of states g_1 and g_2, entropies σ_1 and σ_2, and probabilities p_{1i} and p_{2j} (the first index is the system, the second index is the state), we can also think of it as a single system with g = g_1 g_2 states, σ = σ_1 + σ_2 and, since any state in system 1 can be combined with any state in 2, the probability of a state in the combined system must be p_{ij} = p_{1i} p_{2j}.

Since the probabilities multiply, while the entropies add, we might expect that the entropy should involve the log of the probability. The first guess might be

    σ_1 = −∑_i log p_{1i}     (wrong) .

Since p_{1i} ≤ 1, the minus sign is inserted to make the entropy positive. Why doesn't this expression work? There are several reasons. First, suppose one has a totally isolated system. Then only states with the exact energy of the system are allowed. Disallowed states have p_{1i} = 0 and this will lead to problems with the logarithm. In addition, with the above expression, the entropy is not additive. To fix up the problem with p_{1i} = 0, we might try multiplying by p_{1i} since in the limit x → 0, x log x → 0. Does this make the entropy additive? Consider

    σ = −∑_{i,j} p_{1i} p_{2j} log(p_{1i} p_{2j})
      = −∑_{i,j} p_{1i} p_{2j} log p_{1i} − ∑_{i,j} p_{1i} p_{2j} log p_{2j}
      = −∑_i p_{1i} log p_{1i} − ∑_j p_{2j} log p_{2j}
      = σ_1 + σ_2 .

We used the fact that ∑_i p_{1i} = ∑_j p_{2j} = 1. We adopt the following expression for the entropy in terms of the probabilities:

    σ = −∑_i p_i log p_i ,

where we can include or omit states with probability 0 without affecting the value of the entropy.

What set of probabilities maximizes the entropy? The answer depends on the conditions under which we seek a maximum. Suppose we are dealing with a completely isolated system. Then a state can have non-zero probability only if it has the required energy (and any other conserved quantities). So let's limit our sum to allowed states. (Here, we're doing this for convenience, not because our expression might blow up!) The other thing we know is that the probabilities of the allowed states sum to 1. The problem we want to solve is maximizing the entropy under the constraint that the probabilities sum to 1. How do we maximize with a constraint? Lagrange multipliers! So we seek to maximize

    X(p) = σ(p) + λ (1 − ∑_i p_i)
         = −∑_i p_i log p_i + λ (1 − ∑_i p_i) .

We set the derivative of X with respect to p_i to zero,

    0 = ∂X/∂p_i = −log p_i − 1 − λ .

This gives

    p_i = e^{−(λ+1)} ,

so the probabilities of all allowed states are the same when the entropy is a maximum. We also set the derivative of X with respect to λ to 0, which recovers the condition that the probabilities sum to 1. Solving for λ, we find λ = log g − 1. (g is the number of allowed states and the number of terms in the sum.) Finally,

    σ = −∑_i (1/g) log (1/g) = −log (1/g) = log g ,

as we had before.

Now suppose we consider a system which is not isolated, but is in equilibrium thermal contact with a heat bath so that the average value of its internal energy is U. Again, we sum only over allowed states. This time states with energies other than U are allowed, provided the average turns out to be U. We want to find the probabilities that maximize the entropy under the constraints that the probabilities sum to 1 and the average energy is U. We find the maximum of

    X(p) = −∑_i p_i log p_i + λ_1 (1 − ∑_i p_i) + λ_2 (U − ∑_i p_i E_i) ,

where E_i is the energy of state i. We want

    0 = ∂X/∂p_i = −log p_i − 1 − λ_1 − λ_2 E_i .

It follows that

    p_i = e^{−1 − λ_1 − λ_2 E_i} .

You are asked to show in the homework that λ_2 = 1/τ. So, the probabilities wind up with a Boltzmann factor!

Consider an ensemble of systems. The case in which the energy of each system is identical and equal probabilities are assigned is known as the micro-canonical ensemble. The case in which the energies vary and the probabilities are assigned with Boltzmann factors is known as the canonical ensemble.
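These results are easy to check numerically. The following is a minimal sketch in Python (not part of the printed notes), assuming NumPy is available; the energies E_i and temperature τ are invented values for illustration. It verifies that σ = −∑ p_i log p_i is additive for independent systems, that the uniform distribution over g allowed states gives σ = log g, and that the Boltzmann-weighted probabilities are properly normalized.

    import numpy as np

    def entropy(p):
        # sigma = -sum_i p_i log p_i; zero-probability states are
        # omitted, using the limit x log x -> 0 discussed above.
        p = np.asarray(p, dtype=float)
        nz = p > 0
        return -np.sum(p[nz] * np.log(p[nz]))

    # Additivity: two independent systems, p_ij = p_1i * p_2j.
    p1 = np.array([0.5, 0.3, 0.2])
    p2 = np.array([0.6, 0.4])
    p12 = np.outer(p1, p2).ravel()               # g = g_1 g_2 = 6 states
    assert np.isclose(entropy(p12), entropy(p1) + entropy(p2))

    # Isolated system: uniform probabilities over g states give
    # sigma = log g, and any other normalized choice gives less.
    g = 6
    print(entropy(np.full(g, 1 / g)), np.log(g))  # equal
    tilted = np.array([0.4, 0.3, 0.1, 0.1, 0.05, 0.05])
    assert entropy(tilted) < np.log(g)

    # Thermal contact: p_i proportional to exp(-E_i / tau),
    # with invented energies and temperature.
    E = np.array([0.0, 1.0, 2.0, 3.0])
    tau = 1.5
    p = np.exp(-E / tau)
    p /= p.sum()                                  # probabilities sum to 1
    U = np.sum(p * E)                             # average energy U
    print(p, U, entropy(p))

With the Boltzmann probabilities one can also check numerically that σ = log Z + U/τ, where Z = ∑_i e^{−E_i/τ}, which ties these probabilities back to the partition function mentioned at the start.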
Heat Capacity

In general, the amount of energy added to a system in the form of heat, dQ, and the rise in temperature dτ resulting from this addition of heat are proportional,

    dQ = C dτ ,

where C is the "constant" of proportionality. Why is "constant" in quotes? Answer: the bad news is that it can depend on just about everything. The good news is that over a small range of temperature it doesn't vary too much, so it can be treated as a constant.

One of the things it obviously depends on is the amount of material in the system. To remove this dependence, one often divides by something related to the amount of material and then speaks of the specific heat. For example, dividing by the mass of the system gives the heat capacity per unit mass, c = C/m. Of course, this is only useful if one is dealing with a homogeneous material. That is, you might speak of the specific heat of aluminum and the specific heat of water, but for boiling water in an aluminum pan you would be concerned with the heat capacity (which you could calculate from the masses of the pan and the water and the specific heats from the Handbook of Chemistry and Physics). In the case of gases, the amount of material is usually measured in moles and the heat capacity is divided by the number of moles to give the molar specific heat or molar heat capacity. This is usually a number of order the gas constant. In statistical physics, we often speak of the heat capacity per molecule. This is usually a number of order Boltzmann's constant.

All the above is mainly bookkeeping. Of
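As a quick numerical sketch of the pan-and-water bookkeeping above (again Python, not from the printed notes; the masses are invented and the specific heats are rounded handbook values):

    # Heat capacity bookkeeping for water in an aluminum pan.
    c_water = 4.18    # J / (g K), specific heat of water (rounded)
    c_al    = 0.90    # J / (g K), specific heat of aluminum (rounded)
    m_water = 1500.0  # g, invented mass of water
    m_al    = 800.0   # g, invented mass of pan

    # Total heat capacity: C = m_water c_water + m_al c_al.
    C = m_water * c_water + m_al * c_al           # J / K
    print(f"C = {C:.0f} J/K")

    # dQ = C dtau: heat needed for a small 5 K rise,
    # treating C as constant over this range.
    dQ = C * 5.0
    print(f"dQ = {dQ:.0f} J for a 5 K rise")

    # For gases: molar heat capacity is of order the gas constant,
    # and per-molecule heat capacity is of order Boltzmann's constant
    # (in conventional units), e.g. (3/2)R for a monatomic ideal gas.
    R  = 8.314        # J / (mol K)
    kB = 1.381e-23    # J / K
    print(f"(3/2) R  = {1.5 * R:.2f} J/(mol K)")
    print(f"(3/2) kB = {1.5 * kB:.3e} J/K per molecule")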

