This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.

From the course by University of Minnesota

Statistical Molecular Thermodynamics

122 ratings

From the lesson

Module 6

This module introduces a new state function, entropy, that is in many respects more conceptually challenging than energy. The relationship of entropy to extent of disorder is established, and its governance by the Second Law of Thermodynamics is described. The role of entropy in dictating spontaneity in isolated systems is explored. The statistical underpinnings of entropy are established, including equations relating it to disorder, degeneracy, and probability. We derive the relationship between entropy and the partition function and establish the nature of the constant β in Boltzmann's famous equation for entropy. Finally, we consider the role of entropy in dictating the maximum efficiency that can be achieved by a heat engine based on consideration of the Carnot cycle. Homework problems will provide you the opportunity to demonstrate mastery in the application of the above concepts.

- Dr. Christopher J. Cramer, Distinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics


Welcome to Week 6 of Statistical Molecular Thermodynamics.

In this first video of the week, I want to begin discussing entropy. So, it turns out this nice studio I'm sitting in is maybe 200 feet from the banks of the Mississippi River, and there's a bridge over the Mississippi not far from here. I'd like you to imagine an experiment maybe many of you have seen. What if I were carrying a chunk of metallic potassium, standing in the middle of the bridge, and I tossed it into the Mississippi River, well below me, fortunately? Well, many of you probably have not actually seen it with potassium; you've seen it with sodium. Toss a chunk of sodium in water and it skitters around for a while, it makes hydrogen gas, the hydrogen gas ignites, there's a little explosion; it's all pretty dramatic. It turns out it's a lot more dramatic with potassium. This is one of those don't-try-this-at-home experiments. But what I want to emphasize is this: I toss that potassium into the river, K plus H2O. There's a big explosion, and I make potassium hydroxide and half a mole of hydrogen gas, assuming I don't ignite that hydrogen gas and end up making water with the oxygen in the atmosphere.

Now let me ask you to consider a different process. What if I take a flask full of potassium hydroxide in water, just like that little section of the river, and I bubble hydrogen gas through it? What happens?

Not really much, right? We know which direction the process goes. But why?

Now, some might think: oh, well, it goes from potassium plus water to KOH plus one-half H2 because that's the exothermic direction. Heat is obviously released, so maybe it's the exothermicity. Well, let me tell you about a process or two where there is no exothermicity, so the enthalpy change would be zero, and yet the process, we would all agree from common sense and observation, is spontaneous.

So let me take one example. Here I have two vessels separated by a stopcock, and at the moment that stopcock is closed. On the left side is an ideal gas, and on the right side is a vacuum; that vessel is completely evacuated.

I open the stopcock and immediately (this is a little bit of a thought experiment, because in practice it would take a second, maybe two) imagine that I've got a big enough hole in my stopcock that immediately all the gas is equally distributed across both vessels.

I'm going to do this in an insulated system, all right? So I build a lot of insulation around it. It's adiabatic; there's no heat transfer, so q is equal to 0. But I expanded against a vacuum, so the external pressure was also 0 and no work was done. As a result, the change in internal energy, delta U, which is q plus w, is also zero.

For an ideal gas, for which the internal energy depends only on temperature, since there has been no change in internal energy, there must also have been no change in temperature. This is isothermal. And if I look at the enthalpy change, which is delta U plus delta(PV): given that it's an ideal gas, PV is just equal to nRT, and there's been no change in T, and there has been no change in U. So the enthalpy change is zero.
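The bookkeeping for this free expansion can be summarized compactly in standard notation (just restating the steps above):

```latex
\begin{align*}
q &= 0 && \text{(adiabatic: the system is insulated)}\\
w &= -\int P_{\text{ext}}\,dV = 0 && \text{(expansion against vacuum, } P_{\text{ext}} = 0\text{)}\\
\Delta U &= q + w = 0 && \text{(first law)}\\
\Delta T &= 0 && \text{(ideal gas: } U \text{ depends only on } T\text{)}\\
\Delta H &= \Delta U + \Delta(PV) = \Delta U + nR\,\Delta T = 0
\end{align*}
```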

So the change in everything is zero. But I think we all agree that this only ever goes one way: the gas will leave the full vessel and occupy the empty vessel until both vessels hold equal amounts of gas, half as much as was in the original if I started with equal volumes. Hm. So why does it do that?

Let's talk about another process. Let's imagine now that instead of having a full vessel of gas and a vessel of vacuum, I actually just have two different gases: bromine on the left, a nice pretty brownish color, and nitrogen on the right, clear and colorless.

And again, I open the stopcock. I think we all suspect, know, feel certain that, given a certain amount of time, we would have two vessels, each looking about half as brown as the original vessel did, as the bromine and the nitrogen distribute themselves equally across the two vessels.

Again, I'm going to do this in an insulated system. Adiabatic: no heat transfer. In this case, it's not that the external pressure is zero; it's that there's no change in volume. The volume available to these gases is always the same, so delta V is zero for the overall system, and hence the work done is zero. And so, once again, delta U, which is the sum of heat and work, is 0.

And, by the same argument as before, these are ideal gases: if the internal energy doesn't change, the temperature doesn't change, and so the enthalpy doesn't change. And yet again, I'll mention, we're all pretty certain it only ever goes one way. You wouldn't start with a 50/50 mixture of gases and expect to come back a little later and find that all the bromine had randomly collected on one side and all the nitrogen on the other.
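That intuition can be made quantitative with a quick back-of-the-envelope calculation (an illustration added here, not part of the lecture's derivation): if each of N molecules independently has a 1/2 probability of being in either vessel, the probability that all N are found on one chosen side is (1/2)^N.

```python
# Probability that all N molecules of a gas are found in one chosen
# vessel, assuming each molecule independently occupies either vessel
# with probability 1/2 (an illustrative model).
def prob_all_one_side(N):
    return 0.5 ** N

for N in (10, 100, 1000):
    print(N, prob_all_one_side(N))
# Already for N = 1000 the probability is below 1e-300; for a mole of
# gas (~6e23 molecules) spontaneous unmixing is effectively impossible.
```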

So, what is going on? Well, evidently, the universe responds to driving forces beyond those associated with heat, work, and internal energy. There also appears to be an influence of what you might call disorder: there is a tendency for a system to go to higher degrees of disorder, or randomness.

And so you can find spontaneous processes, like the two I just outlined, that may be thermoneutral, so delta H equals 0. You can even see spontaneous processes that are endothermic, where delta H is actually greater than 0; later in the course we'll have a demonstration of at least one of those. But what they all have in common, if they're spontaneous, is that there is an increase in the disorder of the system.

So we might offer a hypothesis, for instance, that systems spontaneously evolve to increase their disorder, recognizing that there may be an interplay between lowering the energy of a system and increasing its disorder, so that both have to be considered when evaluating whether a process is spontaneous.

And so I want to explore this a little bit further, and I want to consider the heat transfer associated with a small reversible change in the temperature and volume of an ideal gas. So I'm going to rearrange the first law of thermodynamics to isolate the heat transfer on one side: the reversible heat, delta q_rev, is dU minus the reversible work. Remember, what is the change in internal energy for an ideal gas? It depends only on temperature; it's the heat capacity times dT. And for an ideal gas undergoing a reversible process, we know the pressure is nRT over V, so the reversible work contributes a term nRT over V times dV. This expression is how one might go about computing the reversible heat.

Now, it's actually that last term, nRT over V dV, that's slightly problematic. It's what makes delta q a path function instead of a state function. The issue is that you have a commingling of variables in that term: you've got a dV and a V, and that's fine, that's the same variable, but you've also got temperature. So you have one term that involves two different variables of the system, and that makes it an inexact differential, as we would say.
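Written out, the expression for the reversible heat just described is:

```latex
\delta q_{\text{rev}} = dU - \delta w_{\text{rev}}
                      = C_V\,dT + \frac{nRT}{V}\,dV
```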

But let me attempt to ameliorate that situation. In particular, let me divide both sides of that prior equation by T, the temperature. In that case I eliminate the T that used to be in the last term, and I get delta q_rev over T equal to C_V over T times dT plus nR over V times dV. Now, C_V may be a function of temperature, or we may be working over a region where it's constant; we'll just note that it can depend on temperature. But in any case, I now have a term that depends only on T and a term that depends only on V.

So that means that the quantity on the left-hand side is no longer an inexact differential; it's an exact differential, which allows me to integrate both sides. And so, if I think about integrating the expression currently on the right-hand side, or, more accurately, if I want to express it in a form that makes the integral more obvious: the right-hand side is the derivative of the integral of C_V over T with respect to T, plus, since n and R are just constants, nR times the natural log of V. Of course there's also a constant of integration, since the derivative of a constant is zero, but we won't worry about that; in thermodynamics you're more often interested in changes than in absolute values, so the constant is arbitrary.
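In other words, the right-hand side is the total differential of a single quantity:

```latex
\frac{\delta q_{\text{rev}}}{T}
  = d\!\left[\int \frac{C_V(T)}{T}\,dT \;+\; nR\ln V \;+\; \text{const}\right]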

So clearly, then, delta q_rev, the reversible heat, divided by temperature, is a state function, because it's an exact differential. And we give it a name: we call it entropy, and we indicate it by a capital S. So dS (the exact differential, which is why there's a d) is the reversible heat divided by T. We tend to keep the delta around just to remind ourselves that heat by itself is usually a path function, not a state function; but when divided by T, it gives this exact differential dS, the entropy.
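As a concrete check (a sketch added here, not code from the course): for an ideal gas at constant temperature, dS = (nR/V) dV, which integrates to delta S = nR ln(V2/V1). The snippet below compares that closed form against a direct numerical integration.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def delta_S_closed_form(n, V1, V2):
    """Entropy change for an ideal gas at constant T:
    integrating dS = nR/V dV gives nR ln(V2/V1)."""
    return n * R * math.log(V2 / V1)

def delta_S_numeric(n, V1, V2, steps=100_000):
    """Midpoint-rule integration of dS = nR/V dV from V1 to V2."""
    dV = (V2 - V1) / steps
    return sum(n * R / (V1 + (i + 0.5) * dV) * dV for i in range(steps))

# One mole doubling its volume, as in the expansion into a vacuum.
# The initial and final states are the same, so delta S is the same,
# because S is a state function, even though the free expansion
# itself is irreversible.
print(delta_S_closed_form(1.0, 1.0, 2.0))  # R ln 2, about 5.76 J/K
print(delta_S_numeric(1.0, 1.0, 2.0))
```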

And if you're wondering about the word entropy itself, it comes from the original Greek, where entropia would mean internal transformation. So this concept, as we'll see, relates to disorder; we'll see how entropy relates to disorder, and it involves internal transformations.

All right. Well, that tells us how to construct this state function we call entropy. What we need to do next is explore its properties in a little bit more detail, and so that's where we'll be heading: entropy as a state function.
