Princeton PHY 301 - Entropy and the Number of States


Physics 301    15-Sep-2003

Reading

This week, you should read the first two chapters of K&K.

Entropy and the Number of States

As we discussed last time, in the statistical view, entropy is related to the number of "microstates" of a system. In particular, the entropy is the log of the number of states that are accessible to the system when it has specified macroscopic parameters (its "macrostate").

The fact that entropy always increases is just a reflection of the fact that a system adjusts its macroscopic parameters, within the allowed constraints, so as to maximize the number of accessible states and hence the entropy.

So, a large part of statistical mechanics has to do with counting states, and another large part has to do with deriving interesting results from these simple ideas.

Why is the Number of States Maximized?

Good question. We are going to take this as an axiom or postulate. We will not attempt to prove it. However, we can give some plausibility arguments.

First, remember that we are typically dealing with something like Avogadro's number of particles, N_0 = 6.02 × 10^23. As we discussed last time, this makes the probability distributions very sharp. Or put another way, improbable events are very improbable. (For N independent particles, relative fluctuations scale as 1/sqrt(N); with N ~ 10^23, that is roughly one part in 10^12.)

The other thing that happens with a large number of particles has to do with the randomness of the interactions. Molecules in a gas are in continual motion and collide with each other (we will see later in the term how often). During these collisions, molecules exchange energy, momentum, angular momentum, etc. The situation in a liquid is similar; one of the differences between a liquid and a gas has to do with the distance a molecule travels between collisions: in a gas, a molecule typically travels many molecular diameters; in a liquid, the distance between collisions is of the order of a molecular diameter. In a solid, molecules tend to be confined to specific locations, but they oscillate around these locations and exchange energy, momentum, etc. with their neighbors.

OK, molecules are undergoing collisions and interactions all the time. As a result, the distribution of molecular positions and speeds is randomized. If you pick a molecule and ask things like where is it located, how fast is it going, etc., the answers can only be given in terms of probabilities, and these answers will be the same no matter which molecule you pick. (Provided you pick the same kind of molecule: you'll probably get different answers for an N_2 molecule and an Ar atom, but you'll get the same answers for two N_2 molecules.)

Sticky point: suppose we assume that the world is described by classical mechanics. Also suppose we know the interactions between molecules in some isolated system. Suppose we also know all ~N_0 positions r_i and momenta p_i (and whatever else we might need to know to specify the system, perhaps the angular momenta of the molecules, etc.). Then in principle, the equations of motion can be solved, and the solution tells us the exact state of the system for all future times. That is, there is nothing random about it! How do we reconcile this with the probabilistic view espoused in the preceding paragraphs?

So far as I know, there are reasonable practical answers to this question, but there are no good philosophical answers. The practical answers have to do with the fact that one can't really write down and solve the equations of motion for ~N_0 particles. But we can in principle!

A somewhat better answer is that we can only know the initial conditions with some precision, not infinite precision. As we evolve the equations of motion forward, the initial uncertainties grow and eventually dominate the evolution. This is one of the basic concepts of chaos, which has received a lot of attention in recent years: small changes in the initial conditions can lead to large changes in the final result. (Have you ever wished you could get a 10 day or 30 day weather forecast? Why do they stop with the 5 day forecast?)
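To see this sensitive dependence on initial conditions concretely, here is a minimal numerical sketch (my addition, not part of the original notes; the logistic map is a standard textbook example of a chaotic system, and the choice of map and parameters is mine). Two trajectories start one part in 10^10 apart and are iterated forward:

```python
# Sensitive dependence on initial conditions, illustrated with the
# logistic map x -> r x (1 - x) at r = 4, where the map is chaotic.
# (Illustrative example only; not taken from K&K or these notes.)

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.3, 0.3 + 1e-10          # initial conditions differing by 1e-10
for step in range(1, 61):
    x, y = logistic(x), logistic(y)
    if step % 10 == 0:
        print(f"step {step:2d}: x = {x:.6f}  y = {y:.6f}  |x-y| = {abs(x - y):.1e}")
```

Running this, the separation grows roughly exponentially (for this map it doubles per step on average), so after a few dozen steps the two trajectories are completely uncorrelated, even though the rule generating them is strictly deterministic. That is the mechanism by which limited knowledge of the initial conditions turns deterministic dynamics into effectively statistical behavior.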
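In the same spirit, the claim above that probability distributions become very sharp for large N can be checked by direct counting. A minimal sketch (again my illustration, not from the notes): for N two-state spins, each up or down with probability 1/2, the number of microstates with n up-spins is the binomial coefficient C(N, n); the mean of n is N/2 and the standard deviation is sqrt(N)/2, so the fractional width of the distribution falls off as 1/sqrt(N):

```python
# Fractional width of the distribution of up-spin counts for N
# two-state spins (p = 1/2): mean = N/2, std dev = sqrt(N)/2,
# so std/mean = 1/sqrt(N), which shrinks as N grows.
from math import sqrt

for N in (100, 10_000, 1_000_000):
    mean = N / 2
    std = sqrt(N) / 2
    print(f"N = {N:>9,}:  std/mean = {std / mean:.1e}")
```

Extrapolated to N ~ 10^23, the fractional width is of order 10^-12; fluctuations that small are why macroscopic observables appear perfectly deterministic even though the underlying description is probabilistic.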
Of course, the fact that we can neither measure the initial conditions with infinite precision nor solve such a large number of equations does not mean (still assuming classical mechanics) that it couldn't be done in principle. (This is the philosophical side coming up again!) So perhaps there is still nothing random going on. At this point one might notice that it's impossible to make a totally isolated system, so one expects (small) random perturbations from outside the system. These will disturb the evolution of the system and have essentially the same effect as uncertainties in the initial conditions. But perhaps one just needs to include a larger system!

If we recognize that quantum mechanics is required, then we notice that quantum mechanics is an inherently probabilistic theory. Also, I'm sure you've seen or will see in your QM course that in general, uncertainties tend to grow with time (the spreading out of a wave packet is a typical example). On the other hand, the system must be described by a wave function (depending on ~N_0 variables), whose evolution is determined by Schroedinger's equation...

As you can see, this kind of discussion can go on forever.

So, as said before, we are going to postulate that a system is equally likely to be in any state that is consistent with the constraints (macroscopic parameters) applied to the system.

As it happens, there is a recent Physics Today article on exactly this subject: trying to go from the reversibility of classical mechanics to the irreversibility of statistical mechanics. It's by G. M. Zaslavsky and is called "Chaotic Dynamics and the Origin of Statistical Laws," Physics Today, 1999, vol. 52, no. 8, pt. 1, p. 39. I think you can read this article and get a feel for the problem even if some of it goes over your head (as some of it goes over my head).

Aside: Entropy and Information

In recent times, there has been considerable interest in the information content of data streams and in what manipulating (computing with) those data streams does to the information content. It is found that concepts in information theory are very similar to concepts in thermodynamics. One way out of the "in principle" problems associated with classical entropy is to consider two sources of entropy: a physical entropy and an information or algorithmic entropy. This goes something like the following:

