Chapter 5.5: Thermodynamics and systems

The study of how energy in its various forms moves through a system is called thermodynamics; in chemistry specifically, it is called thermochemistry. The First Law of Thermodynamics tells us that energy can be neither created nor destroyed, but it can be transferred from a system to its surroundings and vice versa. For any system, if we add up the kinetic and potential energies of all of the particles that make up the substance we get the total energy – this is called the system’s internal energy, E. It turns out that it is not really possible to measure the total internal energy of a system, but what can be measured and calculated is the change in internal energy, ΔE (the Greek letter Δ stands for “change in”). There are two ways that the internal energy of a system can change: we can change the total amount of thermal energy in the system (denoted q), or the system can do work or have work done on it (work is denoted w). The change in internal energy is therefore ΔE = q + w.
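The first-law bookkeeping above can be sketched as a short calculation. The numbers here are made up for illustration; the sign convention assumed is that q > 0 means heat flows into the system and w > 0 means work is done on the system.

```python
# First-law bookkeeping: ΔE = q + w (illustrative numbers only).
# Sign convention assumed: q > 0, heat flows into the system;
#                          w > 0, work is done ON the system.

def delta_E(q, w):
    """Change in internal energy of the system, in joules."""
    return q + w

# A system absorbs 500 J of heat and does 200 J of work on its
# surroundings (so w = -200 J from the system's point of view):
print(delta_E(q=500.0, w=-200.0))  # 300.0 J increase in internal energy
```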

 



At the molecular level, it should now be relatively easy to imagine the effects of adding or removing thermal energy from a system. Work (usually defined as force x distance) done on or by the system, on the other hand, is a macroscopic phenomenon. If the system expands, it does work on the surroundings; if it contracts, the surroundings do work on the system. Imagine pushing the plunger of a syringe to expel a liquid - that process involves work (and it can be carried out under conditions where temperature does not change). That said, most systems we study in chemistry do not expand or contract significantly (except for gases). In these situations ΔE = q, the thermal energy (heat) change. In addition, most of the systems we will study in chemistry (and biology) are at constant pressure (usually atmospheric pressure). It turns out that the heat change at constant pressure is what is known as a “state function” and is called enthalpy (H). In a system at constant pressure with no volume change, it is the change in enthalpy (ΔH) that we will be primarily interested in (together with the change in entropy - see below).

Since we cannot measure energy changes directly, we have to use some observable (and measurable) change in the system. Typically we measure the temperature change and then relate it to the energy change. For changes that occur at constant pressure this energy change is the enthalpy change, ΔH. If we know the temperature change (ΔT), the amount (mass) of material, and its specific heat, we can calculate the enthalpy change: q (J) = mass (g) x specific heat (J/g ºC) x ΔT (ºC). When considering the enthalpy change for a process, the direction of energy transfer is important. By convention, if thermal energy goes out of the system to the surroundings (that is, the surroundings increase in temperature) the sign of ΔH is negative and we say the process is exothermic (literally, heat out). Combustion reactions, such as burning wood or gasoline, are probably the most common examples of exothermic processes. In contrast, if a process requires thermal energy from the surroundings to make it happen - that is, energy is transferred from the surroundings to the system - the sign of ΔH is positive and we say the process is endothermic.
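The calculation described above can be sketched in a few lines. The specific heat of water (about 4.184 J/g ºC) is a well-established value; the mass and temperatures are arbitrary illustrative choices.

```python
# Heat transferred at constant pressure: q = mass x specific heat x ΔT.
# The specific heat of water (~4.184 J/g.°C) is a standard value;
# the 100 g sample and the 20 °C -> 25 °C warming are made up.

def enthalpy_change(mass_g, specific_heat, delta_t):
    """Heat absorbed by the system, q = m * c * dT, in joules."""
    return mass_g * specific_heat * delta_t

# Warming 100 g of water by 5 °C:
q = enthalpy_change(100.0, 4.184, 25.0 - 20.0)
print(q)  # 2092.0 J absorbed by the system (endothermic, ΔH > 0)
```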


Questions to answer:

  • You have systems (at 10ºC) composed of water, methanol, ethanol or propanol.
  • Predict the final temperature of each system if equal amounts of thermal energy (q) are added to equal amounts of a substance (m).
  • What do you need to know to do this calculation?
  • Draw a simple sketch of a system and surroundings. Indicate by the use of arrows what we mean by an endothermic process, an exothermic process.
  • What is the sign of ΔH for each process?
  • Draw a similar diagram and show the direction and sign of w (the work), when the system does work on the surroundings (expands), and when the surroundings does work on the system (contracts).
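One way to work the temperature-prediction question above: with equal q and equal mass, ΔT = q / (m x c), so the final temperature depends only on the specific heat. The specific heats below are approximate literature values; the amounts of heat and mass are arbitrary illustrative choices.

```python
# Final temperature when equal heat q is added to equal masses m of
# each liquid, starting at 10 °C. Specific heats (J/g.°C) are
# approximate literature values; q and m are arbitrary.
specific_heats = {
    "water": 4.18,
    "methanol": 2.53,
    "ethanol": 2.44,
    "1-propanol": 2.39,
}

q = 1000.0        # joules added (same for every system)
m = 50.0          # grams of each substance (same for every system)
t_initial = 10.0  # °C, as in the question

for name, c in specific_heats.items():
    t_final = t_initial + q / (m * c)
    print(f"{name}: {t_final:.1f} °C")
```

Water, with the largest specific heat, shows the smallest temperature rise; propanol, with the smallest specific heat, shows the largest.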

Questions to ponder and questions for later:

  • What does the difference in behavior of water, methanol, ethanol, and propanol tell us about their molecular behavior/organization/structure?

While the First Law of Thermodynamics states that you cannot get more energy out of a system than is already present in some form, the Second Law of Thermodynamics tells us that we cannot even get back all of the energy that we use to bring about a change in a system. This idea is captured by the phrase: for any change in a system, the total entropy of the universe must increase. As we will see, this means that some of the energy is changed into a form that can no longer do work.

There are lots of subtle and not so subtle implications captured in these statements, and we will need to look at them carefully to identify them. You probably already have ideas about what entropy means, but what does it mean scientifically? As you might imagine, it is not a simple idea. As you may know, the word entropy is often used to designate randomness or disorder, but this is not a very useful or accurate way to define entropy (although randomly disordered systems do have high entropy). A better way to think about it - a way that will allow us to measure, calculate, and predict outcomes - is to think about entropy in terms of probabilities. As we will see, thermal energy transfers from hot to cold systems because that outcome is the most probable one. A drop of dye disperses in water because the resulting dispersed distribution of dye molecules is the most probable. Osmosis occurs when water passes through a membrane from a dilute solution to a more concentrated solution because the resulting system is more probable. In fact, whenever a change occurs, the overall entropy of the universe always increases. The Second Law has (as far as we know) never been observed to be violated. In fact, the direction of entropy change has been called “time’s arrow” - the forward direction of time is determined by the direction of entropy change. At this point you should be shaking your head - all this can’t possibly be true. First: if entropy is always increasing, then was there a time in the past when entropy was 0? [link]. Second, you can probably think of situations where entropy decreases (things become more ordered), for example when you clean up a room. Finally, while it is common sense that time flows in only one direction (toward the future), it is also the case that at the atomic and molecular scale all events are reversible - how does that work?

Probability and Entropy

Before we look at entropy in detail, let us remind ourselves about probability. Let us look at a few systems and think about what is more probable. For example: if you take a deck of cards and shuffle it, which is more probable - that the cards will end up in the order ace, king, queen, jack, 10, 9 … and so on for each suit, or that they will end up in some random jumbled order? Of course the answer is obvious: the random order is much more probable, since there are many sequences of cards that “count” as random order but only one that is “ordered”. This is true even though whatever order the shuffled cards actually assume is, itself, just as unlikely as the perfectly ordered one. But because we “care” about a particular order, we lump all other possible orders of the cards together - we do not distinguish between them.

We can calculate, mathematically, the probability of the result we care about. To determine the probability of an event (for example, a particular order of cards), we divide the number of outcomes we care about by the total number of possible outcomes. For 52 cards there are 52! (“52 factorial”, that is, 52 x 51 x 50 x 49 … x 1) ways that the cards can be arranged [link]; this number is approximately 8.07 x 10^67, a number on the same order of magnitude as the number of atoms in our galaxy. So the probability of shuffling cards to produce any one particular order is 1/52! - a very small number indeed. But since it is > 0, this is an event that can happen, and in fact it always happens, since the probability that some arrangement of cards will occur is 1. Now that is a mind bender, but true nevertheless: highly improbable events occur all the time! [link]
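The counting argument above is easy to check numerically:

```python
import math

# Number of distinct orderings of a 52-card deck: 52!
arrangements = math.factorial(52)
print(f"{arrangements:.3e}")  # ~8.066e+67

# Probability of any one particular order after a fair shuffle:
p = 1 / arrangements
print(p)  # astonishingly small, but still greater than zero
```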

This idea of entropy in terms of probabilities can help us understand why different substances or systems have different entropies. We can actually calculate entropies for many systems from the formula S = k ln W, where S is the entropy, k is the Boltzmann constant, and W is the number of distinguishable arrangements (or states) available to the system. The larger the value of W (the number of arrangements), the larger the entropy.
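The formula is simple enough to evaluate directly. The value of the Boltzmann constant used here is the exact SI value.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def entropy(W):
    """Entropy S = k ln W for W distinguishable arrangements."""
    return K_B * math.log(W)

print(entropy(10))  # the four-atom, two-quanta solid discussed below
print(entropy(1))   # a single possible arrangement gives S = 0
```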

In some cases it is relatively easy to figure out which system will have more arrangements. For example, in a solid substance (such as ice) the molecules are fixed in place and can only vibrate, while in a liquid the molecules are free to roam around, so each molecule could be anywhere within the liquid mass (not confined to one position). In a gas the molecules are not confined at all, and can be found anywhere (or at least anywhere in the container). In general, gases have more entropy than liquids, and liquids more than solids. This so-called positional entropy can be extended to mixtures. In most (but not all - as we will see) mixtures the number of distinguishable arrangements is larger for the mixture than for the unmixed components, so the entropy of a mixture is usually larger.

So let us return to the idea that the direction of change in a system is determined by probabilities. We will consider the transfer of thermal energy (heat) and see if we can make sense of it. First, remember that energy is quantized, so for any substance at a particular temperature there will be a certain number of energy quanta. To make things simple(r) we will consider a four-atom solid that contains 2 quanta of energy. These quanta can be distributed so that a particular atom has 0, 1, or 2 quanta of energy. You can now either calculate, or determine by trial and error, the number of different possible arrangements of these quanta (there are 10). Remember that W is the number of distinguishable arrangements, so for this system W = 10 and S = k ln 10. Now what happens if we consider two similar systems, one with 4 quanta and the other with 8 quanta? The system with 4 quanta will be at a lower temperature than the system with 8 quanta. We can calculate the value of W for the 4 quanta (4 atom) system by counting the possible ways to arrange the quanta over the 4 atoms: for the 4 atom, 4 quanta system W = 35. If we do the same calculation for the 4 atom, 8 quanta system, W = 165. Taken together, the total number of arrangements of the two systems considered together is 35 x 165 = 5775.
But what about temperature?

The 4 quanta system is at a lower temperature than the 8 quanta system, since the 8 quanta system has more energy. What happens if we put the two systems in contact? Energy will transfer from the hotter (8 quanta) to the colder (4 quanta) system until the temperatures are equal. At this point each will have 6 quanta (which corresponds to a W of 84); since there are two systems (each with 6 quanta), the total W for the combined systems is 84 x 84 = 7056 states. You will note that 7056 is greater than 5775. [We multiply, rather than add, the values of W when we combine systems.]
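All of the counts quoted above can be reproduced with a standard counting formula: the number of ways to distribute q identical quanta over N distinguishable atoms is the binomial coefficient C(q + N - 1, N - 1) (the "stars and bars" result).

```python
from math import comb

def W(atoms, quanta):
    """Number of distinguishable ways to distribute identical energy
    quanta over distinguishable atoms (stars-and-bars counting)."""
    return comb(quanta + atoms - 1, atoms - 1)

print(W(4, 2))             # 10: the four-atom, two-quanta example
print(W(4, 4), W(4, 8))    # 35 and 165: the two separate systems
print(W(4, 4) * W(4, 8))   # 5775 total arrangements before contact
print(W(4, 6) * W(4, 6))   # 7056 total arrangements after equilibration
```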

 


Our conclusion: there are more distinguishable arrangements in the two systems after the energy transfer than before - the final state is more probable and so has a higher entropy.

Now you might well object - given that we are working with systems of only a few atoms each, it is easy to imagine that random fluctuations could lead to the movement of quanta from cold to hot, and that is true - that is why behavior at the nanoscale is reversible. But when we are talking about macroscopic systems, such a possibility quickly becomes increasingly improbable as the number of atoms/molecules increases. Remember that a very small drop of water, weighing 0.05 grams, contains approximately 1.7 x 10^21 molecules (perhaps you can also calculate the volume of such a drop). What is reversible at the nanoscale is irreversible at the macroscopic scale – yet another wacky and (perhaps) counter-intuitive fact. The realization that change is driven simply by the move to more probable states is, for some, quite difficult to accept, but it is true even for living systems (when considered in the context of their surroundings).
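The drop-of-water estimate is a quick mole calculation. Avogadro's number is the exact SI value; the molar mass and the room-temperature density of water (about 1 g/mL) are standard approximate values.

```python
# Molecules in a 0.05 g drop of water, and its approximate volume.
AVOGADRO = 6.02214076e23   # 1/mol (exact SI value)
MOLAR_MASS_WATER = 18.015  # g/mol
DENSITY_WATER = 1.0        # g/mL, approximate, near room temperature

mass = 0.05  # g
molecules = mass / MOLAR_MASS_WATER * AVOGADRO
volume_mL = mass / DENSITY_WATER

print(f"{molecules:.2e} molecules")  # ~1.67e+21 molecules
print(f"{volume_mL} mL")             # 0.05 mL
```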

 

It is generally true that we are driven to seek a purpose for why things happen, and in the grand scheme of things, the idea that the overarching principle of change in the universe is towards more probable states can be difficult to accept [link].

 



Questions to answer:

  • Which has more entropy - in each case explain why your choice has more entropy:
    • A new deck of cards or a shuffled deck?
    • Separated dye and water - or a mixed up solution
    • H2O(s) or H2O(l)
    • CaCO3(s) or CaO(s) + CO2(g)
    • H2O(l) (at 25 °C) or H2O(l) (at 50 °C)
  • When a change occurs in a system, does it affect anything else? What effect does the change have on the surroundings?
  • Do you think that the structure of a compound affects its entropy? Why?
  • Predict the relative entropies of diamond and sodium chloride. What factors influence your prediction?
  • Look up the entropies - were you correct?
  • Predict the relative molar entropies of carbon dioxide, oxygen and HF. What factors influence your prediction?
  • Look up the entropies - were you correct?

Questions to ponder and questions for later:

  • Can you think of changes that occur that seem to produce more order?
  • Why don’t living systems (organisms) violate the second law of thermodynamics?
  • Does the second law rule out evolution? or argue for a supernatural soul running the brain?

28-Jun-2012