This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.


From the course by University of Minnesota

Statistical Molecular Thermodynamics



From the lesson

Module 3

This module delves into the concepts of ensembles and the statistical probabilities associated with the occupation of energy levels. The partition function, which is to thermodynamics what the wave function is to quantum mechanics, is introduced, and the manner in which the ensemble partition function can be assembled from atomic or molecular partition functions for ideal gases is described. The components that contribute to molecular ideal-gas partition functions are also described. Given specific partition functions, derivations of ensemble thermodynamic properties, like internal energy and constant-volume heat capacity, are presented. Homework problems will provide you with the opportunity to demonstrate mastery in the application of the above concepts.

- Dr. Christopher J. Cramer, Distinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics


In the last video, we looked at the exponential dependence of Boltzmann

probabilities on the energy. In this video, I'd like to focus on the

concept of the Boltzmann population. So let me recap what we have talked about

so far when it comes to our thought construct.

The very large water cooler, or an ensemble at fixed temperature T, if we

want to use more sophisticated thermodynamic speak.

What we determined was that the number of bottles in our large water cooler having

an energy e sub i, the number a sub i, is given by a constant c that we have

yet to determine, times e to the minus beta, another constant we

have yet to determine, times that energy. And I'll ask a key question I asked

previously, although I've got it in slightly modified form.

What I want to know is: what is the normalized probability,

when I reach into the cooler and pull out a bottle at random,

that the bottle will be in state i, having characteristic

energy e sub i? So normalized means that all the

probabilities together add up to 1. So, let's continue to work with this

equation and see what we can do with it to transform it to a probability.

To begin let's take another value, capital A.

And I'll define capital A to be the total number of bottles in the water cooler.

In that case, the probability, indexed by j, that a randomly chosen bottle will be

in state j, that is, having characteristic energy e sub j,

is the total number of bottles in that state divided by the total number of

bottles in the cooler, that is, a sub j divided by capital A.

And so what is capital A? Well, of course it's the sum over all

energy states, indexed by k here, of how many bottles are in each one of

those states k. If I take each of these two number

expressions and replace them by their exponential expressions, then in the

numerator I'll get c e to the minus beta e sub j.

In the denominator, I get a c in every term.

I'll just carry it out front. And it's a sum over e to the minus beta,

all the possible e sub k's. So the two c's will cancel each other.

And I'm left with e to the minus beta e sub j above and sum over all possible

states e to the minus beta e sub k below. Right, so just writing that all more

compactly. The probability that I will find a bottle

in state j when I randomly select it is equal to the exponential of minus beta

times its characteristic energy, divided by the sum of the exponentials of negative

beta times all of the energies.
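A minimal numeric sketch of that normalized probability; the energy levels and the beta value below are made-up numbers for illustration only:

```python
import math

def boltzmann_probabilities(energies, beta):
    """Normalized probability of each state: exp(-beta*E_j) / sum_k exp(-beta*E_k)."""
    weights = [math.exp(-beta * e) for e in energies]
    q = sum(weights)                  # the denominator: the sum over all states
    return [w / q for w in weights]

# Hypothetical energy levels in arbitrary units, with beta = 1 in matching units.
energies = [0.0, 1.0, 2.0, 3.0]
probs = boltzmann_probabilities(energies, beta=1.0)
print(probs)       # probabilities decay exponentially with energy
print(sum(probs))  # normalized: they add up to 1
```
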

That is the probability normalized that a randomly chosen bottle will be in quantum

state j. That expression, that appears in the

denominator, is so important and so recurring that it has its own name.

It's called the partition function, and for our course we'll typically indicate

it by a capital Q for an ensemble. And so one specifies for capital Q certain

arguments on which the energy depends and, in this case, things that are

being held fixed. So in this case, we're holding the number

of molecules fixed in a bottle, our picture of one member of the

ensemble. We're holding the volume fixed, the volume of the

bottle. And finally we're holding beta, that

constant, fixed. And so here is that expression

appearing in the denominator: a sum over states, indexed by j in this

instance, of e to the minus beta e sub j, and it is upon number and volume that the

energy depends. Now you can actually make a connection

between classical thermodynamics and statistical mechanics, which is a bit

beyond what I want to do here; it's more serious statistical mechanics.

That connection allows you to actually derive what beta is.

Instead, I'm just going to present it: beta is one over Boltzmann's constant

times the temperature, right? And that has units of energy.
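As a quick sanity check on those units, a sketch using the SI value of Boltzmann's constant and an arbitrarily chosen temperature:

```python
k_B = 1.380649e-23      # Boltzmann's constant, J/K (SI value)
T = 300.0               # a fixed temperature, K
kT = k_B * T            # thermal energy in joules, roughly 4.14e-21 J
beta = 1.0 / kT         # beta therefore carries units of inverse energy, J^-1
print(kT, beta)
```
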

If you look back at the units on Boltzmann's constant, you'll discover

when you multiply times temperature, you'll have energy.

So we could also write, then, completely equivalently, that the partition function

depends not on n, v, and beta, but on n, v, and t. Because Boltzmann's

constant is a constant, beta varies only with temperature, so if we hold beta

fixed, we're holding the temperature fixed. So we sum over states e to the

minus e sub j, now not multiplied by beta, but divided by kT.

And I'll always indicate in formulae that it's Boltzmann's constant with a

subscript capital B, but I'll nearly always just say k.

I won't say k B; it takes too long. So let's think about this partition

function. We see the mathematical form, but again

it's nice to have some intuitive feel. What does it mean?

What does the value of the partition function tell you?

Well, just for simplicity in order to think about the meaning of the partition

function, let's take the ground state of the system, that is the lowest possible

energy. Let's take that to be non-degenerate, so

there's not multiple ways to make that energy, there's just one state with

energy zero. And we will define that energy to be

zero. It's arbitrary where you set it we'll

just pick that because it's convenient. Well in that case when I sum over the

states the very first state will be the ground state, and I'll get e to the minus

0. E to the 0 is 1.

So I'm going to pull that 1 out front. And then I'll keep all the remaining

states; they're excited states, because I said my ground state was non-degenerate.

And so it's the same expression on the inside, except that I know that e will be above 0

because it's an excited state. So notice a few features then of the

partition function. As I let the temperature go to 0, I will

be dividing by 0 in this expression, so the exponent will become infinitely large.

I get e to the minus infinity, that's 0. And so as the temperature goes to 0, the

only term that survives in the partition function is 1.

Temperature goes to 0, q goes to 1. Now what about if T goes to infinity?

Well in that case, I will be dividing something by infinity, so it will be 0.

I'll get e to the minus 0, that's 1. So for every single state I'll add a 1.

So 1, 1, 1, 1, 1, 1 I just count all the states.

So as the temperature becomes infinite, q goes to the total number of accessible

states. So the partition function can be thought

of as an effective measure of the accessible number of energy states, from

only one at absolute 0 for a non-degenerate ground state, to everything

that can be accessed at infinitely high temperature.
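Those two limits are easy to check numerically. A minimal sketch, assuming a hypothetical system of five states with a non-degenerate ground state at energy zero, in units where k equals 1:

```python
import math

energies = [0.0, 1.0, 2.0, 3.0, 4.0]   # made-up levels; ground state at 0

def q(T):
    """Partition function q = sum_j exp(-E_j / (k*T)), with k = 1."""
    return sum(math.exp(-e / T) for e in energies)

print(q(1e-6))   # T -> 0: only the ground-state term survives, q -> 1
print(q(1e6))    # T -> infinity: every term -> 1, q -> 5, the number of states
```
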

So, with that in mind I'm going to let you work on a conceptual question having

to do with the partition function, and then we'll return to consider this

further. Alright, let's take one further look at

the behavior of the partition function and what I want to look at is its

behavior with respect to the density of states.

I have that in quotes; it's something you'll hear physical chemists talk about

with some frequency. What does the density of states

mean? It means: how closely spaced are the

relative energy levels? So you might remember in Week 1 we talked

about the relative spacing of energy levels, and we mentioned that electronic

energy levels are usually spaced quite far from one another.

Closer are vibrational levels, still closer are rotational levels and finally

very, very closely spaced indeed are translational energy levels.

But what happens as these levels get closer and closer and closer to one

another? Well, let's do the easy case first.

Let's make them get very far from one another, that is the density goes to zero

because they're so far separated. Well if the density of states is going to

0, that means this first excited state is very high in energy, so I get e to the

minus a very large number. And I'm at some fixed temperature now.

We looked at the dependence on temperature before.

Now we're looking at the dependence on energy.

So e to the minus a very large number is 0.

I get nothing. I'm back to q equals 1.

I've got one accessible state. On the other hand, as the density of

states becomes infinite, so that means the first excited state is at 10 to the

minus 94th joules, for instance, some incredibly small number, and they're all

really closely spaced. Well in that case, because this is such a

small number, I'll really be taking roughly e to the minus 0.

10 to the minus 90 is pretty close to 0. And so again, this sum will do a whole

lot of counting of states before the energy gets large enough that it actually

starts to exponentially die off. And so, as the density goes to infinity,

Q will go to the total number of states, which itself will be infinite in that

case. It will be a continuum of states.

So once again, just to emphasize: the partition function is a measure of

the accessible number of energy states. So, if you ever see a number associated

with the partition function, 5, 30, 10 to the sixteenth,

that's giving you some feel for how many energy states the system can access at

the n, v, and t that are specified. And let's actually take a look at a

specific example to hopefully make that even more clear.

And so, first, let me focus on the graph to

the left. So what I'm showing you here is, for a

given temperature, 300 kelvin. And at 300 kelvin, kT,

that is, Boltzmann's constant times temperature, expressed in energy units of

wave numbers, is about 200 wave numbers.

And so let me imagine that my density of states is such that the first excited

state and every state thereafter is up by a 1000 wave numbers.

So first at a 1000, second at 2000 and so on.

So if I look, what I'm plotting here with the red line corresponding to this gap is

first off, what is the total partition function, and it's actually 1.01.

So very very close to 1 implying the really only one state is accessible.

Which kind of makes sense, thermal energy is about 200 wave numbers worth, but I've

gotta go up a thousand wave numbers to get to the first state.

And so, plotted on the left is the fraction, that is the normalized

probability that would be found in a given state.

So it's very close to 1 for the energy state 0, the ground state, and maybe it's

about 1% for the first excited state, and it's just close to 0 for everything else.

And you're done. On the other hand, let the density of

states get larger, that is, let the gap get smaller. So now let's make the gap 200 wave

numbers. That's about equal to kT.

And what you find is that q has increased to 1.62.

So there are more accessible states. It's still not a huge number.

And the ground state is now about 60%. Looks like the first excited state is

about 22%. The second excited state, maybe 8%.

And so on and eventually again you can't access the highest states.

Finally, what if I shrink the gap still more? I make the gap 50 wave numbers.

Well, now, as I evaluate the contributions

to the total partition function, it adds up to 4.69, so it's getting

bigger: many more accessible states.

Note that the number itself doesn't mean that there are only 4.69 states, you can't have a

fractional state, but anyway, you certainly go beyond that value in terms

of some population. So it's a quantitative measure, but one

best interpreted in a qualitative way.
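The numbers quoted above can be reproduced with a short script. This is a sketch assuming evenly spaced, non-degenerate levels at 0, gap, 2 times gap, and so on, in which case the sum over states is a geometric series, q = 1/(1 - exp(-gap/kT)); Boltzmann's constant in wavenumber units is about 0.695 cm^-1 per kelvin:

```python
import math

k_B = 0.695                  # Boltzmann's constant in cm^-1 / K
kT = k_B * 300.0             # ~208.5 cm^-1 at 300 K ("about 200 wave numbers")

for gap in (1000.0, 200.0, 50.0):                # level spacings in cm^-1
    q = 1.0 / (1.0 - math.exp(-gap / kT))        # geometric-series sum over states
    print(f"gap = {gap:6.0f} cm^-1  ->  q = {q:.2f}")
# gap = 1000 -> q = 1.01;  gap = 200 -> q = 1.62;  gap = 50 -> q = 4.69
```
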

In any case, what we see is a slowly dropping probability, once more

eventually reaching 0 with a sum of all of the individual exponentials adding to

about 4.7. Now alternatively, let me look at a

different thing. Let me keep the density of states

constant. So the spacing between the various

states is still 200 wave numbers. But now look at what happens when I

change the temperature. So I already showed you 300 kelvin for a

200 wave number gap. That's this green curve, and it's the

exact same green curve that's found over here.

So 60% in the ground state, and 22% in the first excited state and so on.

But, now, let me drop the temperature by an order of magnitude.

So instead of having thermal energy of about 200 wave numbers, I go to about 20

wave numbers. And sure enough, q is essentially 1.

So really only 1 state is populated. It just drops like a bomb.

Ground state has everything, no population of excited states.

On the other hand, instead of reducing the temperature by an order of magnitude,

let me increase the temperature by an order of magnitude.

At 3,000 kelvin. Now, the partition function is up to

10.9. And you see that it almost

looks linear on this plot, although it should be dropping with some

exponential dependence. And so the ground state is populated.

The first excited state, just a little less.

The second excited state, just a little less.

And as I add up all those exponentials, the sum converges to 10.9.
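The temperature sweep just described can be checked the same way, again a sketch assuming evenly spaced levels 200 wave numbers apart and Boltzmann's constant in wavenumber units:

```python
import math

k_B = 0.695                  # Boltzmann's constant, cm^-1 / K
gap = 200.0                  # fixed level spacing, cm^-1

for T in (30.0, 300.0, 3000.0):
    q = 1.0 / (1.0 - math.exp(-gap / (k_B * T)))   # geometric-series partition function
    print(f"T = {T:6.0f} K  ->  q = {q:.2f}")
# T = 30 K -> q ~ 1.00;  T = 300 K -> q ~ 1.62;  T = 3000 K -> q ~ 10.93
```
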

So, a bigger number. We finish then with the more complete

mathematical expression: the probability of being found in state j,

given a fixed number of particles n, volume v, and temperature t,

is equal to the exponential of minus the energy associated with that number and

volume, divided by Boltzmann's constant times the

temperature, all divided by the partition

function.

This equation, which is a key equation in statistical mechanics, can actually be

derived directly from S equals k log W. That was Boltzmann's famous equation.

We're not going to do it that way, but just so you know, there are multiple ways

to approach that particular problem, and Boltzmann approached it in the other

direction as well. But this is one form of the so-called

Boltzmann Distribution Law. It's a central result of physical

chemistry, and enormously useful for telling us what energy states we should

expect to see populated in a system that is held at certain fixed

values, in this case N, V, and T. So that brings us to the end of the sort

of formal statistical analysis of the ensemble and probabilities and the

partition function. I'd like to actually move next to

considering how this analysis can give us insight

into the properties of an ideal gas.
