MIT 8.08 - Statistical Ensembles


Chapter 1  Statistical Ensembles

1.1 Principle of statistical physics and ensembles

1.1.1 Partial information to partial result

Statistical systems are complex systems: so complex that we cannot obtain all the information needed to characterize them completely. For example, a liter of gas may contain $10^{22}$ atoms. To characterize such a system completely (or, more precisely, a state of such a system), we would need to know the three components of the velocity and the three components of the position of each atom. It is impossible to obtain the $6 \times 10^{22}$ real numbers needed to characterize the gas completely.

However, not knowing all the information needed to characterize the gas does not prevent us from developing a theory of gases. This is because we are only interested in certain average properties of the gas, such as its pressure, volume, and temperature. Those properties do not depend on every little detail of each atom, so not knowing everything about the atoms does not prevent us from calculating them. This is the kind of problem studied in statistical physics: we try to understand the properties of a complex system without knowing all the information about it. This is possible because the properties we are interested in do not depend on all the details of the system.

1.1.2 All possible states appear with equal probability

In statistical physics there is only one principle: all possible states appear with equal probability. Let us explain what we mean by this statement. Suppose we know certain quantities, such as the pressure, the total energy, etc., of a complex system. Those quantities do not characterize the system completely: the system has a number of states for which those quantities take the same values. Thus, even after knowing the values of those quantities, we still do not know which of those possible states the system is actually in.
Then, according to the principle of statistical physics, we say that all the possible states are equally likely.

1.1.3 Time average and ensemble average

But the system can only be in one state at a given time. What do we mean by "all the possible states are equally likely"? There are two points of view. In the first point of view, we imagine that we have many copies of the system, all characterized by the same set of quantities, such as total energy, pressure, etc. Each copy, however, may be in a different possible state. Then "equally likely" means that each possible state appears the same number of times among the copies of the system. The collection of copies is called an ensemble. We need an ensemble even to define the probabilities. Under this first interpretation, statistical physics is a science that deals with ensembles, rather than with individual systems.

The second point of view applies only to the situation where the environment of the system is independent of time. In this case we interpret "equally likely" as meaning that all the possible states appear for the same amount of time during a long period of time. The second point of view is related to the first if we view the system at different times as the different copies of the system.

The two points of view may not be equivalent. They are equivalent only when the system can visit all the possible states, many times, during a long period of time. This is the ergodicity hypothesis.

Not all systems are ergodic. For a non-ergodic system, statistical physics applies only to its ensemble. For an ergodic system, statistical physics also applies to the time average of the system.

1.2 Microcanonical ensemble

A microcanonical ensemble is an ensemble formed by isolated systems. All the systems in the ensemble have the same energy (and possibly some other properties).
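The equivalence of the two averages for an ergodic system can be illustrated with a toy numerical sketch (Python, not part of the notes; the random state changes are a hypothetical stand-in for real dynamics that visits every state equally often):

```python
import random

random.seed(0)

# A single free spin has two equally likely states, +1 and -1.
def random_state():
    return random.choice([+1, -1])

# Ensemble average: many copies of the system, one snapshot of each.
n_copies = 100_000
ensemble_avg = sum(random_state() for _ in range(n_copies)) / n_copies

# Time average: one copy of the system, observed over many time steps.
n_steps = 100_000
time_avg = sum(random_state() for _ in range(n_steps)) / n_steps

# For this (trivially ergodic) toy dynamics, both averages converge
# to the same value, 0, as the number of samples grows.
print(ensemble_avg, time_avg)
```

For a non-ergodic system the single trajectory would be stuck in a subset of states and the two averages would differ, which is the content of the last paragraph above.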
Here by "same energy" we really mean that all the systems have an energy which lies within a small window between $E$ and $E + \Delta E$.

1.2.1 Number of states and entropy

Let us first study a simple example: $N$ spins in a magnetic field. The energy of an up-spin is $E_\uparrow = \epsilon_0/2$ and that of a down-spin is $E_\downarrow = -\epsilon_0/2$.

We would like to ask: how many states are there with a total energy $E$? The total energy is given by
$$E = M\frac{\epsilon_0}{2} - (N - M)\frac{\epsilon_0}{2},$$
where $M$ is the number of up-spins. So the states with total energy $E$ are the states with $M$ up-spins. But how many states are there with $M = \frac{E}{\epsilon_0} + \frac{N}{2}$ up-spins? The answer is $C^M_N = \frac{N!}{M!(N-M)!}$. (Here $C^M_N$ is the number of ways to pick $M$ objects from $N$ objects.) So the number of states with total energy $E$ is
$$\Gamma(E) = C^{\frac{E}{\epsilon_0}+\frac{N}{2}}_N.$$

After obtaining the number of states as a function of the total energy $E$, we can define the entropy of the system: the entropy is $k_B$ times the log of the number of states,
$$S(E) = k_B \ln \Gamma(E), \qquad (1.2.1)$$
where $k_B = 1.3807 \times 10^{-16}\,\mathrm{erg/K} = 8.617343(15) \times 10^{-5}\,\mathrm{eV/K}$ is the Boltzmann constant. (The Boltzmann constant is a conversion factor between energy and temperature: $1\,\mathrm{K} = 1.3807 \times 10^{-16}\,\mathrm{erg} = 8.617343(15) \times 10^{-5}\,\mathrm{eV}$, or $1\,\mathrm{eV} = 11605\,\mathrm{K}$. We will introduce the definition of temperature shortly.)

For a microcanonical ensemble, the entropy is a function of the energy $E$. So for our $N$-spin system, the entropy is
$$S(E) = k_B \ln C^{\frac{E}{\epsilon_0}+\frac{N}{2}}_N. \qquad (1.2.2)$$

To calculate $\ln C^M_N$, we can use Stirling's approximation,
$$\ln(n!) = n \ln n - n + \tfrac{1}{2}\ln(2\pi n) + O(1/n). \qquad (1.2.3)$$

[Figure 1.1: The entropy per spin, $S(E)/N$, as a function of $\epsilon$, the average energy per spin. The maximum entropy of a spin-1/2 spin is $k_B \ln(2) = 0.69314718056\,k_B$.]

Thus (see Fig. 1.1)
$$k_B^{-1} S(E) = \ln C^M_N \approx N\ln N - M\ln M - (N-M)\ln(N-M) = -M\ln\frac{M}{N} - (N-M)\ln\frac{N-M}{N} = N\left(-f_\uparrow \ln f_\uparrow - f_\downarrow \ln f_\downarrow\right), \qquad (1.2.4)$$
where $f_\uparrow \equiv \frac{M}{N}$ (or $f_\downarrow \equiv 1 - \frac{M}{N}$) is the probability for a spin to be up (or down). Since $E = M\frac{\epsilon_0}{2} - (N-M)\frac{\epsilon_0}{2}$, we have $f_\uparrow = \frac{1}{2} + \frac{E}{E_0}$ and $f_\downarrow = \frac{1}{2} - \frac{E}{E_0}$, where $E_0 = N\epsilon_0$.
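The approximation step in (1.2.4) can be checked numerically. The sketch below (Python, not from the notes) compares the exact $\ln C^M_N$ against the leading large-$N$ form $N(-f_\uparrow \ln f_\uparrow - f_\downarrow \ln f_\downarrow)$:

```python
import math

def exact_log_states(N, M):
    # ln Gamma(E) = ln [N! / (M! (N-M)!)], via log-gamma to avoid
    # huge factorials: ln n! = lgamma(n + 1).
    return math.lgamma(N + 1) - math.lgamma(M + 1) - math.lgamma(N - M + 1)

def stirling_log_states(N, M):
    # Leading Stirling form of (1.2.4): N (-f_up ln f_up - f_down ln f_down).
    f_up = M / N
    f_down = 1 - f_up
    s = 0.0
    for f in (f_up, f_down):
        if f > 0:  # 0 ln 0 -> 0
            s -= f * math.log(f)
    return N * s

N, M = 1000, 300
exact = exact_log_states(N, M)
approx = stirling_log_states(N, M)
print(exact, approx)  # close for large N; approx slightly overshoots
```

The overshoot comes from dropping the $\tfrac{1}{2}\ln(2\pi n)$ terms of (1.2.3), which grow only logarithmically and so are negligible next to the extensive $O(N)$ part. At $M = N/2$ the formula reproduces the maximum entropy per spin, $\ln 2$, quoted in Fig. 1.1.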
Thus
$$k_B^{-1} S(E) = N\left[-\left(\frac{1}{2}+\frac{E}{E_0}\right)\ln\left(\frac{1}{2}+\frac{E}{E_0}\right) - \left(\frac{1}{2}-\frac{E}{E_0}\right)\ln\left(\frac{1}{2}-\frac{E}{E_0}\right)\right]. \qquad (1.2.5)$$

Clearly, from this definition, the physical meaning of the entropy is
$$\text{number of states with energy } E = e^{S(E)/k_B}. \qquad (1.2.6)$$

1.2.2 Concept of temperature

To introduce the concept of temperature, let us put two systems of spins together. System 1 has $N_1$ spins and system 2 has $N_2$ spins. Let $\tilde{E}_{1,2}$ be the energies of the two systems at the beginning. The total energy is $E = \tilde{E}_1 + \tilde{E}_2$. If we allow the two systems to exchange energy, then the spins in the two systems may flip up and down, and sample all the possible states with total energy $E$. Now we would like to ask: what is the probability for system 1 to have a new energy $E_1$? Certainly, the system-2
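Since each up-spin carries energy $\epsilon_0/2$, fixing the total energy $E$ fixes the total number of up-spins shared between the two systems, and the probability that system 1 holds $M_1$ of them is proportional to the microstate count $\Gamma_1 \Gamma_2 = C^{M_1}_{N_1} C^{M-M_1}_{N_2}$. A numerical sketch (Python, with illustrative system sizes not taken from the notes) shows this weight is sharply peaked:

```python
import math

def log_comb(n, k):
    # ln C(n, k) via log-gamma, to stay in floating point.
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

N1, N2 = 60, 40   # spins in each system (illustrative sizes)
M_total = 50      # total up-spins, fixed by the total energy E

# Weight of each way of sharing the up-spins: Gamma_1(M1) * Gamma_2(M_total - M1),
# computed in log form. M1 must leave a valid count for system 2.
weights = []
for M1 in range(max(0, M_total - N2), min(N1, M_total) + 1):
    weights.append((M1, log_comb(N1, M1) + log_comb(N2, M_total - M1)))

best_M1, _ = max(weights, key=lambda t: t[1])
print(best_M1, best_M1 / N1, (M_total - best_M1) / N2)
```

The most probable sharing equalizes the up-spin fraction (and hence the energy per spin) of the two systems, which is the condition that will define equal temperature.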

