Brandeis BCHM 104A - Lectures on Entropy and Free Energy

Lecture 1

After a formal course on thermodynamics, most chemistry or biochemistry students remember two formulas,

$$\Delta G = \Delta H - T\Delta S$$

and

$$\Delta G^\circ = -RT \ln K,$$

and that is a very good thing. These two formulas are the center of chemical thermodynamics, and if you do not remember them or where they came from, go back and look them up.

Beyond those two formulas, most students leave their P-chem courses pretty confused. In particular, they are confused about entropy: where it comes from and how it influences equilibria. This is not your fault, and it is not entropy's fault. The problem is simply that most introductory thermodynamics courses use the historical derivation of thermodynamics, which relies on macroscopic properties of substances. This version of thermodynamics was developed long before it was clear that there are atoms and molecules made up of those atoms. Thermodynamics was originally derived to explain things like steam engines and to determine the maximal amount of work that could be extracted from those machines. In modern chemistry, on the other hand (and in biochemistry in particular), we think about reactions on a molecular basis. When we think about DNA, we do not think about a bucket full of one mole of DNA, and we are not interested in the amount of heat needed to warm this bucket of DNA by one degree C. We think about DNA as a single molecule made up of individual atoms, and we want to understand the properties of this single molecule and how it may interact with other molecules.

Fortunately, there is an approach to thermodynamics that takes this same molecular view of the world. It is called statistical thermodynamics; Ludwig Boltzmann is its father, and we will later see and use his famous formula for entropy.

So let's start from scratch and see if we can arrive at a view of thermodynamics that is more appropriate for our type of problems. The state at which a system consisting of many molecules comes to equilibrium is determined by the battle between two tendencies of every physical system. These tendencies are expressed in the following two extremum principles:

1) The principle of minimal potential energy
2) The principle of maximal multiplicity

The currency in which these two principles barter over the exact location of the minimum is energy, as expressed in the equations

$$\Delta G = \Delta H - T\Delta S \qquad \text{and} \qquad \Delta G^\circ = -RT \ln K.$$

At the end of the next two lectures, you should have a pretty good idea of how these two extremum principles act at the molecular level, how they are related to enthalpy and entropy, and how they influence the equilibrium of every chemical reaction.

The principle of minimal potential energy.

Let's start with the principle of minimal potential energy. This one is pretty easy to understand, because it acts on macroscopic systems in just the same way as it does on microscopic systems. Just think of a ball in a mechanical well, or two balls connected by a spring. If we leave these systems alone, they will eventually adopt their state of minimal potential energy. This potential energy is directly equivalent to the enthalpy of a system of microscopic particles: the enthalpy of a mole of molecules is simply the sum of the minute potential energies of each of the molecules in the system.

Let's say we have a molecule that likes to form a straight line. If we force this molecule to adopt anything but a straight conformation, we need to expend energy, and this energy is now stored in the molecule. The tendency of the molecule is now to straighten itself out again (just like a macroscopic spring) and to release the energy that we put into the system. In other words, the principle of minimal potential energy would predict that all molecules would be perfectly straight.
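To make the spring analogy concrete, here is a minimal numerical sketch (in Python; the spring constant and displacements are made-up illustrative values, not from the lecture) of a Hooke's-law potential $U = \frac{1}{2}kx^2$. The stored energy is zero at the rest (straight) conformation and grows the further we bend the molecule away from it:

```python
# Potential energy of a Hooke's-law spring: U = 0.5 * k * x**2
# (illustrative only: the spring constant k and the displacements are made up)

def potential_energy(x, k=1.0):
    """Energy stored when the spring is displaced by x from its rest position."""
    return 0.5 * k * x**2

for x in [0.0, 0.5, 1.0, 2.0]:
    print(f"displacement {x:3.1f} -> stored energy {potential_energy(x):4.2f}")

# The minimum (energy 0.0) is at x = 0, the rest conformation, which is
# exactly the state the principle of minimal potential energy predicts.
```

Left alone, and ignoring everything else for the moment, the system rolls downhill on this curve until it sits at the bottom of the well.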
The principle of maximal multiplicity.

This is the second extremum principle. Unlike the first principle, it cannot be explained by a macroscopic equivalent of a single molecule-like object. Instead, it is a principle that is statistical in nature: it emerges only when we look at a system of a large number of individual objects. So understanding this principle, and how it relates to entropy, will take a lot more attention.

Let's think about the simple example of a series of coin tosses. Let's say we toss a coin 10 times. (As a quick reminder: to determine the probability of a specific series of independent events, we simply multiply the probabilities of the individual events.) So what is the probability of finding 10 heads?

HHHHHHHHHH

$$p_h p_h p_h p_h p_h p_h p_h p_h p_h p_h = 0.5 \cdot 0.5 \cdot 0.5 \cdot 0.5 \cdot 0.5 \cdot 0.5 \cdot 0.5 \cdot 0.5 \cdot 0.5 \cdot 0.5 = 0.5^{10} = 0.000976\ldots$$

Wow, that is pretty unlikely, but that sort of makes sense; after all, what are the chances of getting 10 heads in a row?

Now in the back of your head you are thinking that the principle of maximal multiplicity somehow represents entropy. You also vaguely remember that entropy has something to do with probability and that disordered systems are favored by entropy. So what is the probability of finding a disordered series? Let's say:

HTTHHHTHTT

$$p_h p_t p_t p_h p_h p_h p_t p_h p_t p_t = 0.5 \cdot 0.5 \cdot 0.5 \cdot 0.5 \cdot 0.5 \cdot 0.5 \cdot 0.5 \cdot 0.5 \cdot 0.5 \cdot 0.5 = 0.5^{10} = 0.000976\ldots$$

Darn, it has exactly the same probability as the all-heads case, even though it is "disordered" and even fulfills the statistically expected ratio of 1/2 heads to 1/2 tails.

We can keep going with this, and we will find that the probability of any particular sequence resulting from sequential, independent trials is

$$p(n_1, n_2, n_3, \ldots, n_i; N) = p_1^{n_1} p_2^{n_2} p_3^{n_3} \cdots p_i^{n_i}$$

where

$$\sum_i n_i = N \qquad \text{and} \qquad \sum_i p_i = 1.$$

Each specific sequence of heads and tails is just as likely to occur as any other. What we are really interested in is not the probability of a particular series of events, but the probability of finding a sequence that has a particular "macroscopic" property. Macroscopic, in this context, means that we do not care about the order of the heads and tails, but simply how many of the coins are heads or tails. This sort of question involves multiplicity rather than probability.

Multiplicity

So what is multiplicity? The multiplicity of a set of combined events is the number of different ways in which this set of events could possibly occur. For a simple example, let's say a car company makes two models of cars and paints them in three different colors. The multiplicity of the "event" called car is then 2 x 3 = 6. In our case, we want to know the multiplicity of a series of coin tosses with a given number of heads, as sketched below.
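Here is a quick numerical check of the coin-toss argument, as a minimal Python sketch (assuming a fair coin; the binomial coefficient counting sequences with a given number of heads is standard combinatorics, not something spelled out in the text above). Every specific 10-toss sequence has the same probability, $0.5^{10}$, but the multiplicity of a macrostate, i.e. the number of sequences with exactly n heads, varies enormously:

```python
from math import comb

N = 10           # number of tosses
p_seq = 0.5**N   # probability of any ONE specific sequence
print(f"probability of a specific sequence: {p_seq:.6f}")   # 0.000977

# Multiplicity W(n): the number of distinct sequences with exactly n heads,
# given by the binomial coefficient N! / (n! * (N - n)!)
for n in (0, 5, 10):
    W = comb(N, n)
    print(f"{n:2d} heads: multiplicity {W:3d}, "
          f"macrostate probability {W * p_seq:.6f}")
```

There is exactly one all-heads sequence (W = 1), but 252 sequences with 5 heads, so the "half heads" macrostate is 252 times more likely even though every individual sequence is equally probable. This is the sense in which "disordered" outcomes dominate.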

