This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.


From the course by University of Minnesota

Statistical Molecular Thermodynamics


From the lesson

Module 3

This module delves into the concepts of ensembles and the statistical probabilities associated with the occupation of energy levels. The partition function, which is to thermodynamics what the wave function is to quantum mechanics, is introduced, and the manner in which the ensemble partition function can be assembled from atomic or molecular partition functions for ideal gases is described. The components that contribute to molecular ideal-gas partition functions are also described. Given specific partition functions, derivations of ensemble thermodynamic properties, like internal energy and constant-volume heat capacity, are presented. Homework problems will provide you the opportunity to demonstrate mastery in the application of the above concepts.

- Dr. Christopher J. Cramer, Distinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics


In the last video, we looked at the exponential dependence of Boltzmann probabilities on the energy. In this video, I'd like to focus on the concept of the Boltzmann population. So let me recap what we have talked about so far when it comes to our thought construct: the very large water cooler, or an ensemble at fixed temperature T, if we want to use more sophisticated thermodynamic speak. What we determined was that the number of bottles in our large water cooler having an energy E_i is given by a_i = c e^(-beta E_i), where c is a constant we have yet to determine and beta is another constant we have yet to determine. And I'll ask a key question I asked previously, although in slightly modified form. What I want to know is: what is the normalized probability that a bottle I select randomly (I reach into the cooler and pull one out) will be in state i, having characteristic energy E_i? Normalized means that all the probabilities together add up to 1. So let's continue to work with this equation and see what we can do with it to transform it into a probability.

To begin, let's take another value, capital A, which I'll define to be the total number of bottles in the water cooler. In that case, the probability, indexed by j, that a randomly chosen bottle will be in state j, that is, having characteristic energy E_j, is the number of bottles in that state divided by the total number of bottles in the cooler: P_j = a_j / A. And what is capital A? Well, of course, it's the sum over all energy states, indexed by k here, of how many bottles are in each one of those states. If I take each of these two number expressions and replace them by their exponential expressions, in the numerator I get c e^(-beta E_j). In the denominator, I get a c in every term, so I'll just carry it out front, and it multiplies a sum of e^(-beta E_k) over all the possible E_k's. The two c's cancel each other, and I'm left with e^(-beta E_j) above and the sum over all possible states of e^(-beta E_k) below. So, writing that more compactly, the probability that I will find a bottle in state j when I randomly select it is

P_j = e^(-beta E_j) / sum over k of e^(-beta E_k)

That is the normalized probability that a randomly chosen bottle will be in quantum state j. The expression that appears in the denominator is so important and so recurring that it has its own name.
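That normalized probability is easy to evaluate numerically. Here is a minimal Python sketch; the three state energies and the beta value are made-up illustrative numbers, not anything from the lecture:

```python
import math

def boltzmann_probabilities(energies, beta):
    """Return normalized probabilities e^(-beta*E_j) / sum_k e^(-beta*E_k)."""
    weights = [math.exp(-beta * e) for e in energies]
    total = sum(weights)  # this denominator is the partition function
    return [w / total for w in weights]

# Three states with arbitrary energies, in units where beta*E is dimensionless.
probs = boltzmann_probabilities([0.0, 1.0, 2.0], beta=1.0)
```

By construction the probabilities sum to 1, and lower-energy states always receive the larger share.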

It's called the partition function, and in our course we'll typically indicate it by a capital Q for an ensemble. One specifies for capital Q certain arguments: the quantities on which the energy depends, and the things that are being held fixed. In this case, we're holding the number of molecules fixed (in a bottle, which is our picture of an ensemble member), we're holding the volume fixed (the bottle), and finally we're holding beta, that constant, fixed. So here is the expression that appears in the denominator, a sum over states, indexed by j in this instance:

Q(N, V, beta) = sum over j of e^(-beta E_j(N, V))

and it is upon number and volume that the energy depends. Now, you can actually make a connection between classical thermodynamics and statistical mechanics, which is a bit beyond what I want to do here (it's more serious statistical mechanics), that allows you to derive what beta is. Instead, I'm just going to present it: beta is one over Boltzmann's constant times the temperature, beta = 1/(k_B T). And k_B T has units of energy; if you look back at the units on Boltzmann's constant, you'll discover that when you multiply by temperature, you get energy. So we could also write, completely equivalently, that the partition function depends not on N, V, and beta, but on N, V, and T. Because Boltzmann's constant is a constant, beta varies only with temperature, so if we hold beta fixed, we're holding the temperature fixed. The sum over states is then of e^(-E_j / k_B T), the energy now divided by kT rather than multiplied by beta. I'll always indicate in formulae that it's Boltzmann's constant with a subscript capital B, but I'll nearly always just say k; I won't say "k B," it takes too long. So let's think about this partition function. We see the mathematical form, but again, it's nice to have some intuitive feel. What does it mean? What does the value of the partition function tell you?

Well, just for simplicity, in order to think about the meaning of the partition function, let's take the ground state of the system, that is, the lowest possible energy, to be non-degenerate: there are not multiple ways to make that energy, there is just one state with it. And we will define that energy to be zero. It's arbitrary where you set the zero of energy; we'll just pick that because it's convenient. Well, in that case, when I sum over the states, the very first state will be the ground state, and I'll get e to the minus 0. E to the 0 is 1. So I'm going to pull that 1 out front, and then I'll keep all the remaining states, the excited states (because I said my ground state was non-degenerate). The same expression appears inside the sum, except that I know each E will be above 0, because it belongs to an excited state:

q = 1 + sum over excited states j of e^(-E_j / k_B T)

So notice a few features of the partition function. As I let the temperature go to 0, I will be dividing by 0 in each exponent, so each exponent becomes infinitely large: I get e to the minus infinity, which is 0. And so, as the temperature goes to 0, the only term that survives in the partition function is the 1. Temperature goes to 0, q goes to 1. Now, what about if T goes to infinity? Well, in that case, I will be dividing each energy by infinity, so every exponent will be 0. I get e to the minus 0, which is 1. So for every single state I add a 1: 1, 1, 1, 1, and I just count all the states. So as the temperature becomes infinite, q goes to the total number of accessible states. The partition function can thus be thought of as an effective measure of the number of accessible energy states, from only one at absolute zero (for a non-degenerate ground state) to everything that can be accessed at infinitely high temperature.
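You can check both limits numerically. This is a small sketch, not from the lecture; the five state energies are arbitrary illustrative values with a non-degenerate ground state at zero:

```python
import math

def partition_function(energies, kT):
    """q = sum over states of e^(-E_j / kT); energies and kT in the same units."""
    return sum(math.exp(-e / kT) for e in energies)

levels = [0.0, 1.0, 2.0, 3.0, 4.0]  # ground state at zero, non-degenerate

q_cold = partition_function(levels, kT=1e-6)  # T -> 0 limit
q_hot = partition_function(levels, kT=1e6)    # T -> infinity limit
```

At the cold limit q comes out essentially 1 (only the ground state survives), while at the hot limit q approaches 5, the total number of states.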

So, with that in mind, I'm going to let you work on a conceptual question having to do with the partition function, and then we'll return to consider this further.

Alright, let's take one further look at the behavior of the partition function, and what I want to look at is its behavior with respect to the "density of states." I have that in quotes; it's something you'll hear physical chemists talk about with some frequency. What does the density of states mean? It means: how closely spaced are the energy levels relative to one another? You might remember that in Week 1 we talked about the relative spacing of energy levels, and we mentioned that electronic energy levels are usually spaced quite far from one another. Closer together are vibrational levels, still closer are rotational levels, and finally very, very closely spaced indeed are translational energy levels. But what happens as these levels get closer and closer to one another? Well, let's do the easy case first: let's make them very far from one another, that is, let the density of states go to zero because the levels are so far separated. If the density of states goes to 0, that means the first excited state is very high in energy, so I get e to the minus a very large number. (I'm working at some fixed temperature now; we looked at the dependence on temperature before, and now we're looking at the dependence on energy.) E to the minus a very large number is 0. I get nothing, and I'm back to q equal to 1.

I've got one accessible state. On the other hand, suppose the density of states becomes infinite, meaning the first excited state is at, for instance, 10^(-94) joules, some incredibly small number, and all the levels are really closely spaced. Well, in that case, because this is such a small number, I'll really be taking roughly e to the minus 0, since an energy of order 10^(-94) joules divided by kT is pretty close to 0. And so this sum will do a whole lot of counting of states before the energy gets large enough that the terms actually start to die off exponentially. So, as the density of states goes to infinity, Q will go to the total number of states, which itself will be infinite in that case; it will be a continuum of states.
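The same contrast can be made numerically. In this sketch (illustrative numbers only, not the lecture's figures), I hold the thermal energy kT fixed and compare a sparse ladder of levels against a dense one:

```python
import math

def partition_function(energies, kT):
    """q = sum over states of e^(-E_j / kT)."""
    return sum(math.exp(-e / kT) for e in energies)

kT = 1.0  # fixed thermal energy, arbitrary units

# 100 levels each: widely spaced (low density of states) vs. closely spaced (high density).
sparse = [10.0 * n for n in range(100)]
dense = [0.001 * n for n in range(100)]

q_sparse = partition_function(sparse, kT)
q_dense = partition_function(dense, kT)
```

q_sparse stays essentially 1, while q_dense comes out close to the full count of 100 levels.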

So, once again, just to emphasize: the partition function is a measure of the number of accessible energy states. If you ever see a number associated with the partition function, 5, 30, 10^16, that's giving you some feel for how many energy states the system can access at the N, V, and T that are specified. Let's actually take a look at a specific example to hopefully make that even more clear.

First, let me focus on the graph to the left. What I'm showing you here is for a given temperature, 300 kelvin. At 300 kelvin, kT, that is, Boltzmann's constant times temperature, expressed in energy units of wave numbers, is about 200 wave numbers. So let me imagine that my density of states is such that the first excited state, and every state thereafter, is up by 1000 wave numbers: the first at 1000, the second at 2000, and so on. If I look at what I'm plotting here, with the red line corresponding to this gap, the first thing is the total partition function, and it's actually 1.01. That's very, very close to 1, implying that really only one state is accessible. Which kind of makes sense: thermal energy is about 200 wave numbers' worth, but I've got to go up a thousand wave numbers to get to the first excited state. And so, plotted on the left is the fraction, that is, the normalized probability, that a randomly chosen system would be found in a given state. It's very close to 1 for energy state 0, the ground state; maybe about 1% for the first excited state; and just about 0 for everything else.

And you're done. On the other hand, let the density of states get larger: so now let's make the gap 200 wave numbers, which is about equal to kT. What you find is that q has increased to 1.62. So there are more accessible states, though still not a huge number. The ground state is now at about 60%, the first excited state looks like about 22%, the second excited state maybe 8%, and so on; eventually, again, you can't access the highest states.

Finally, what if I shrink the gap still more and make it 50 wave numbers? Well, now, as I evaluate the contributions to the total partition function, it adds up to 4.69: getting bigger, many more accessible states. Note that the number itself doesn't mean that exactly 4.69 states are populated; you can't have a fractional state, and some population certainly extends beyond that value. So it's a quantitative measure, but try to work with it in a qualitative way.
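Those three partition-function values can be reproduced in a few lines of Python. A sketch, assuming (as the plots do) evenly spaced, non-degenerate levels, and truncating the sum at an arbitrarily chosen number of terms:

```python
import math

# Thermal energy at 300 K expressed in wave numbers (cm^-1):
# kT / hc = (1.380649e-23 J/K * 300 K) / (6.62607015e-34 J s * 2.99792458e10 cm/s)
kT = 1.380649e-23 * 300 / (6.62607015e-34 * 2.99792458e10)  # ~208.5 cm^-1

def q_even_ladder(gap, kT, nmax=5000):
    """Partition function for evenly spaced non-degenerate levels 0, gap, 2*gap, ..."""
    return sum(math.exp(-n * gap / kT) for n in range(nmax))

q1000 = q_even_ladder(1000.0, kT)  # ~1.01, the lecture's red-curve value
q200 = q_even_ladder(200.0, kT)    # ~1.62
q50 = q_even_ladder(50.0, kT)      # ~4.69
```

For an infinite evenly spaced ladder the sum is a geometric series, q = 1/(1 - e^(-gap/kT)), so the truncation at nmax terms changes nothing visible here.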

In any case, what we see is a slowly dropping probability, eventually reaching 0, with the sum of all of the individual exponentials adding up to about 4.7. Now, alternatively, let me look at a different thing: let me keep the density of states constant, so the spacing between the various states is still 200 wave numbers, but look at what happens when I change the temperature. I already showed you 300 kelvin for a 200 wave number gap; that's this green curve, and it's the exact same green curve that's found over here.

So, 60% in the ground state, 22% in the first excited state, and so on. But now let me drop the temperature by an order of magnitude: instead of having thermal energy of about 200 wave numbers, I go to about 20 wave numbers. And sure enough, q is 1. Really only one state is populated; the excited-state population just drops like a bomb. The ground state has everything, and there is no population of excited states.

On the other hand, instead of reducing the temperature by an order of magnitude, let me increase it by an order of magnitude, to 3,000 kelvin. Now the partition function is up to 10.9, and you see that the populations almost look linear on this plot, although they should be dropping with an exponential dependence. The ground state is populated; the first excited state, just a little less; the second excited state, just a little less again. And as I add up all those exponentials, the sum converges to 10.9.
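The temperature sweep for the fixed 200 wave number ladder can be checked the same way; a sketch with an arbitrarily chosen truncation of the sum:

```python
import math

def q_even_ladder(gap_cm, T, nmax=5000):
    """Partition function for levels 0, gap, 2*gap, ... (gap in cm^-1, T in kelvin)."""
    # Thermal energy kT converted to wave numbers via division by hc.
    kT = 1.380649e-23 * T / (6.62607015e-34 * 2.99792458e10)
    return sum(math.exp(-n * gap_cm / kT) for n in range(nmax))

q_30K = q_even_ladder(200.0, 30.0)      # ~1: only the ground state is populated
q_300K = q_even_ladder(200.0, 300.0)    # ~1.62, as before
q_3000K = q_even_ladder(200.0, 3000.0)  # ~10.9: many accessible states
```

The three values track the lecture's plots: dropping T by an order of magnitude collapses q to 1, raising it by an order of magnitude pushes q up to about 10.9.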

So, a bigger number. We finish, then, with the more complete mathematical expression: the probability of being found in state j, given a fixed N (number of particles), V (volume), and T (temperature), is equal to the exponential of minus the energy associated with that number and volume, divided by Boltzmann's constant times the temperature, all divided by the partition function:

P_j(N, V, T) = e^(-E_j(N, V) / k_B T) / Q(N, V, T)

This equation, which is a key equation in statistical mechanics, can actually be derived directly from S = k log W, Boltzmann's famous equation.

We're not going to do it that way, but just so you know, there are multiple ways to approach that particular problem, and Boltzmann approached it in the other direction as well. This is one form of the so-called Boltzmann distribution law. It's a central result of physical chemistry, and it is enormously useful for telling us what energy states we should expect to see populated in a system that is held at certain fixed values, in this case N, V, and T. So that brings us to the end of the formal statistical analysis of the ensemble, probabilities, and the partition function. I'd like to move next to considering how this analysis can give us insight into the properties of an ideal gas.

Â Coursera provides universal access to the worldâ€™s best education,
partnering with top universities and organizations to offer courses online.