This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.


From the course by University of Minnesota

Statistical Molecular Thermodynamics



From the lesson

Module 6

This module introduces a new state function, entropy, that is in many respects more conceptually challenging than energy. The relationship of entropy to extent of disorder is established, and its governance by the Second Law of Thermodynamics is described. The role of entropy in dictating spontaneity in isolated systems is explored. The statistical underpinnings of entropy are established, including equations relating it to disorder, degeneracy, and probability. We derive the relationship between entropy and the partition function and establish the nature of the constant β in Boltzmann's famous equation for entropy. Finally, we consider the role of entropy in dictating the maximum efficiency that can be achieved by a heat engine based on consideration of the Carnot cycle. Homework problems will provide you the opportunity to demonstrate mastery in the application of the above concepts.

- Dr. Christopher J. Cramer, Distinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics


Well, let's move on and consider entropy in a little more detail, and in particular take a look at its properties as a state function.

So, remember what being a state function means. It says that irrespective of what path you take between one state point and another, as long as you end up at the same destination, you should see the same change in the state function. A state point implies the specification of, say, temperature, pressure, and number of particles. (You can't specify all of temperature, pressure, and volume independently; the substance dictates that relationship.) Indeed, if you end up taking a circular path, that's what this equation says: a cyclic integral follows a path back to its original point. Since you're back at the original point, you must still have the same value of the state function; that is, it will have changed by zero:

∮ dS = 0

So ΔS is 0 for a cyclic process.

But what I'd like to do now is go back to our ideal gas roadmap, which we used to look at changes in internal energy and to observe that U was a state function. So here's our old friend, where we considered different ways to go from an original pressure, volume, and temperature to a second pressure and volume, but still the same temperature: the isothermal expansion; the adiabatic expansion followed by constant-volume warming; and the constant-pressure expansion followed by constant-volume cooling.

I'll start with path A versus path B plus path C. What's the change in entropy along these two different paths? If entropy is a state function, it's got to be the same change in entropy. So let's just do the math and check.

So, remember the reversible heat for the isothermal path. Since the path is isothermal, the change in internal energy is zero all along it. That means δq = -δw, and the reversible work comes from the pressure of the ideal gas being the external pressure, so

δq_rev = -δw_rev = (nRT1/V) dV

where T1 is the temperature we're operating at. To get the actual heat change, I integrate dV/V from V1 to V2:

q_rev = nRT1 ln(V2/V1)

We did this in week five, and you can go back and review those videos if you want to see those steps again. So, given that, this is just recapitulating that equation: I've got the change in the heat here.

If I instead want to compute the change in entropy, I integrate not δq but δq/T from state point one to state point two. Because T is a constant here, that's 1/T1 times nRT1; the T's cancel, and I'm left with the integral from V1 to V2 of (nR/V) dV:

ΔS = ∫ δq_rev/T = nR ln(V2/V1)

All right, so that is the change in entropy along path A. I'll ask you to notice that since volume 2 is greater than volume 1 (we've moved from left to right on the volume axis), the ratio V2/V1 is greater than 1, which makes the logarithm positive; n and R are positive as well, the number of moles and a constant. And so the change in entropy is positive: entropy has increased as I've gone from a lesser volume to a greater volume. That's consistent with our idea of entropy as a measure of disorder. If I have the same amount of gas in a larger volume, there are, in a sense, more ways to imagine where the gas molecules might be.
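To make path A concrete, here is a short Python sketch that evaluates ΔS = nR ln(V2/V1) and checks it against a direct numerical integration of δq_rev/T. The amount of gas, volumes, and temperature are illustrative numbers, not values from the lecture:

```python
import math

R = 8.314  # molar gas constant, J/(mol·K)

def dS_isothermal(n, V1, V2):
    """Entropy change for a reversible isothermal ideal-gas expansion:
    dS = n R ln(V2/V1)."""
    return n * R * math.log(V2 / V1)

# Illustrative numbers: 1 mol of gas doubling in volume at 300 K.
n, V1, V2, T1 = 1.0, 1.0, 2.0, 300.0

# Closed form.
dS = dS_isothermal(n, V1, V2)

# Numerical check: integrate delta-q_rev / T = (n R T1 / V) dV / T1 = n R dV / V
# with a simple midpoint sum.
steps = 100_000
dV = (V2 - V1) / steps
dS_numeric = sum(n * R * dV / (V1 + (i + 0.5) * dV) for i in range(steps))

print(round(dS, 4), round(dS_numeric, 4))  # both ≈ 5.7628 J/K, and positive
```

The T1 in δq_rev cancels against the 1/T in dS, which is why the numerical sum never actually uses the temperature.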

Now, let's consider paths B and C. Path B, the adiabatic expansion, is particularly simple: adiabatic means there is no heat transfer, so δq = 0. In that case, the entropy change is the integral of zero divided by T; it doesn't really matter what T is, I'm integrating zero and I get zero. So the change in entropy for an adiabatic expansion is zero.

And then I'll ask you to remember that we worked out (you can look at video 5.4 if you'd like to see the individual steps) that the heat transfer for step C, the constant-volume heating, is equal to the internal energy change, since no work is done. That is the integral from T2, the starting temperature, to T1, the ending temperature, of C_V dT. And we showed (this is now in video 5.5 if you want to see the individual steps; I'm just going to recall the answer) that the resulting entropy change is

ΔS_C = nR ln(V2/V1)

So when I sum the path B result with the path C result, 0 + nR ln(V2/V1), I indeed recover this term, and that's just what we found for path A.
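The path-independence claim can also be checked numerically. Here is a minimal Python sketch, assuming a monatomic ideal gas (so C_V = 3R/2) and illustrative values not taken from the lecture:

```python
import math

R = 8.314      # molar gas constant, J/(mol·K)
Cv = 1.5 * R   # molar heat capacity at constant volume; assumes a monatomic ideal gas

n, T1, V1, V2 = 1.0, 300.0, 1.0, 2.0  # illustrative state points

# Path A: reversible isothermal expansion V1 -> V2 at T1.
dS_A = n * R * math.log(V2 / V1)

# Path B: reversible adiabatic expansion V1 -> V2 (delta-q = 0, so dS_B = 0).
# Along the adiabat, T * V**(R/Cv) is constant, which fixes the
# intermediate temperature T2 reached at the end of the expansion.
T2 = T1 * (V1 / V2) ** (R / Cv)
dS_B = 0.0

# Path C: constant-volume heating T2 -> T1, dS_C = integral of (n Cv / T) dT.
dS_C = n * Cv * math.log(T1 / T2)

print(round(dS_A, 4), round(dS_B + dS_C, 4))  # the two routes agree: ≈ 5.7628 J/K
```

Algebraically, C_V ln(T1/T2) = C_V (R/C_V) ln(V2/V1) = R ln(V2/V1), so the agreement is exact, not just numerical.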

So entropy is obeying its necessary behavior as a state function: independent of path, when we arrive at the final point, we have the same net entropy change. I'm going to pause here for a second and let you consider what the entropy change is for path E, to see if you've appreciated the development so far.

All right, we've begun to get some experience working with entropy, and hopefully it's becoming a little bit more comfortable and familiar.

This is just the definition of dS again:

dS = δq_rev/T

I'll point out one feature, I suppose, that's also worth bearing in mind as a conceptual understanding point. If entropy is related to the disorder of a system, and you increase the entropy by adding heat (so δq is positive), notice that the change in entropy depends on the current temperature. This implies that at very low temperatures a certain quantity of heat will increase the disorder considerably more than adding that same quantity of heat would at very high temperatures. Right? The same heat delivered at low T increases entropy more. So that's just an appreciation, if you will, of the definition of dS.

Well, having looked at the ideal gas expansion paths, we've got some feel for how entropy relates to heat, work, and different paths. Next, I want to consider the role of spontaneity more generally in thermodynamics, and indeed express it in terms of the second law.
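The earlier point that the same heat raises entropy more at low temperature is quick to illustrate; the heat and temperatures below are arbitrary illustrative numbers:

```python
# dS = delta-q_rev / T: the same reversible heat input changes entropy
# more when delivered at a lower temperature.
q = 10.0              # J, a small reversible heat input (illustrative)
dS_cold = q / 50.0    # delivered at 50 K
dS_hot = q / 500.0    # delivered at 500 K
print(dS_cold, dS_hot)  # 0.2 0.02
```

A factor of ten in temperature gives a factor of ten in the entropy change for the same heat.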

Â Coursera provides universal access to the worldâ€™s best education,
partnering with top universities and organizations to offer courses online.