Entropy

In the Second and Third Laws of Thermodynamics we touched briefly on the concept of entropy. Here we start with a short summary and then discuss it in a little more detail.

Simply put, entropy is a measure of the disorder of a system: the more disordered a system is, the higher its entropy.

The equation for entropy is given by

S = k_{\mathrm{B}} \ln W

where

  • k_B is the Boltzmann constant (seen in the Ideal Gas Equation pV = Nk_BT), and
  • W is the number of ways of arranging the molecules of the system over its possible energy states (the counts multiply combinatorially); see the small sketch below.
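
To make the counting concrete, here is a minimal Python sketch of the formula. The toy system (4 distinguishable molecules, each with 3 possible energy states) and the helper name boltzmann_entropy are assumptions for illustration only, not something taken from the discussion above.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_microstates):
    """Return S = k_B * ln(W) for W accessible microstates."""
    return K_B * math.log(num_microstates)

# Toy system (assumed for illustration): 4 distinguishable molecules, each of
# which can sit in any of 3 energy states independently, so W = 3**4 = 81.
W = 3 ** 4
print(f"W = {W}, S = {boltzmann_entropy(W):.3e} J/K")  # ~6.07e-23 J/K
```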

Accordingly, the change in entropy is given by the difference in entropies between the final and initial states of the system.

\Delta S = S_{\mathrm{final}} - S_{\mathrm{initial}} = k_{\mathrm{B}} \ln\!\left(\frac{W_{\mathrm{final}}}{W_{\mathrm{initial}}}\right)
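
As a worked illustration (this free-expansion scenario is a standard textbook example, assumed here rather than taken from the post), the sketch below evaluates the change in entropy for one mole of ideal gas whose volume doubles at constant temperature, where W_final/W_initial = 2^N. The astronomically large ratio is handled through its logarithm.

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro constant, 1/mol

# One mole of ideal gas expands into a vacuum until its volume doubles, at
# constant temperature. Each molecule then has twice as many positions
# available, so W_final / W_initial = 2**N.
N = N_A                           # number of molecules in one mole
ln_W_ratio = N * math.log(2)      # ln(W_final / W_initial) = N * ln 2
delta_S = K_B * ln_W_ratio        # ΔS = k_B * ln(W_final / W_initial)

print(f"ΔS ≈ {delta_S:.2f} J/K")  # ≈ 5.76 J/K, i.e. R * ln 2
```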

Change in Entropy

The standard change in entropy is written as

\Delta_{\mathrm{r}} S^{\ominus}_{298}

whose indices you should already be familiar with (if not, see the discussion on Enthalpy under the First Law). Similarly, the standard change in entropy of a reaction is

\Delta_{\mathrm{r}} S^{\ominus}_{298} = \sum_i \nu_i \, \Delta_{\mathrm{f}} S^{\ominus}_{298}(\mathrm{products}) - \sum_i \nu_i \, \Delta_{\mathrm{f}} S^{\ominus}_{298}(\mathrm{reactants})

whose indices you should also already be familiar with. (In practice, data tables list absolute standard molar entropies S°298, available thanks to the Third Law, so the sum is usually evaluated directly with those values; the result is the same.)
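
The products-minus-reactants bookkeeping is easy to carry out in code. Below is a minimal Python sketch under stated assumptions: the reaction (N2 + 3H2 → 2NH3), the approximate S°298 values, and the names S_STANDARD_298 and reaction_entropy are all illustrative, and the sketch works with absolute standard molar entropies rather than entropies of formation.

```python
# Approximate standard molar entropies S°298 in J/(mol·K); the values and the
# species chosen are illustrative assumptions, not data from the post.
S_STANDARD_298 = {
    "N2(g)": 191.6,
    "H2(g)": 130.7,
    "NH3(g)": 192.5,
}

def reaction_entropy(reactants, products):
    """Return sum of nu_i * S°298(products) minus sum of nu_i * S°298(reactants).

    `reactants` and `products` map species names to their stoichiometric
    coefficients nu_i.
    """
    def side_total(side):
        return sum(nu * S_STANDARD_298[species] for species, nu in side.items())
    return side_total(products) - side_total(reactants)

# N2(g) + 3 H2(g) -> 2 NH3(g)
dS = reaction_entropy({"N2(g)": 1, "H2(g)": 3}, {"NH3(g)": 2})
print(f"Δ_rS°298 ≈ {dS:.1f} J/(mol·K)")  # ≈ -198.7 J/(mol·K)
```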

System at Constant Temperature

There is an easy way to determine the change in entropy of a closed system if

  • heat is being supplied to the system, and
  • the temperature of the system does not change.

The change in entropy in this case is given by

\Delta S = \frac{q}{T}

where

  • q is the heat supplied to the system (strictly, the heat transferred reversibly), and
  • T is the temperature of the system in kelvin (a worked example follows below).
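
As a quick worked example (the numbers are assumed, approximate textbook values, not taken from the post), the sketch below applies this relation to melting one mole of ice at 273.15 K, where the heat supplied equals the enthalpy of fusion.

```python
# Melting one mole of ice at its normal melting point: the heat supplied is the
# enthalpy of fusion, and the temperature stays constant while the ice melts.
q = 6010.0       # heat supplied to the system, J/mol (≈ ΔH_fus of water)
T = 273.15       # temperature, K

delta_S = q / T  # ΔS = q / T for a constant-temperature process

print(f"ΔS ≈ {delta_S:.1f} J/(mol·K)")  # ≈ 22.0 J/(mol·K)
```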
