1. Entropy:
The second law of thermodynamics is related to practical devices such
as heat engines and refrigerators. There exists another statement of
this law that involves a quantity called the entropy.
2. Entropy is a variable of state :
That is, it depends only on the initial and final states of the system.
A system, at thermodynamic equilibrium, is governed by its equation
of state (PV = nRT = NkT for an ideal gas). In other words, it is
characterized by the values of its variables of state, such as P, V, n,
and T. Recall that, for an ideal gas, the internal energy depends
only on the temperature. Thus, we can write U(T). Recall also that the
change in a variable of state is zero over a cycle; for instance,
ΔU = 0, ΔT = 0, ΔV = 0, Δn = 0, for a cycle. Ui and
Uf are independent of the path connecting the states "i" and "f".
In contrast, Q (net heat) and W (net work) exchanged during a process
depend on the path followed between those states, not only on the
end states themselves.
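The following is a minimal Python sketch (not part of the original
notes; the numbers are illustrative and the gas is assumed monatomic)
contrasting the path independence of ΔU with the path dependence of W
between the same two states.

    # Sketch: DeltaU is path-independent, while the work W done by the gas is not.
    # Assumes a monatomic ideal gas, for which U = (3/2) n R T = (3/2) P V.

    P1, V1 = 1.0e5, 1.0e-3   # initial state: 100 kPa, 1 L
    P2, V2 = 2.0e5, 2.0e-3   # final state:   200 kPa, 2 L

    def internal_energy(P, V):
        return 1.5 * P * V   # U = (3/2) P V for a monatomic ideal gas

    dU = internal_energy(P2, V2) - internal_energy(P1, V1)

    # Path A: isochoric pressure rise at V1, then isobaric expansion at P2
    W_A = P2 * (V2 - V1)
    # Path B: isobaric expansion at P1, then isochoric pressure rise at V2
    W_B = P1 * (V2 - V1)

    print(f"Delta U (same for both paths): {dU:.1f} J")
    print(f"W along path A: {W_A:.1f} J; W along path B: {W_B:.1f} J")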
3. Macroscopic definition of the entropy :
We can state a mathematical definition only for a reversible process. The
related equation is:
dS = dQ/T
Between two states, we have :
ΔS = ∫dQ/T [from i → f]
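As an illustration (not part of the original notes; the values of n, T,
Vi, and Vf are chosen arbitrarily), here is a short Python sketch that
evaluates ΔS = ∫dQ/T for a reversible isothermal expansion of an ideal
gas and compares it with the closed form nR ln(Vf/Vi):

    import math

    n, R, T = 1.0, 8.314, 300.0      # mol, J/(mol K), K  (illustrative values)
    Vi, Vf = 1.0e-3, 2.0e-3          # m^3

    # Along a reversible isotherm dU = 0, so dQ = P dV = n R T dV / V,
    # hence dS = dQ / T = n R dV / V.  Integrate numerically in small steps.
    steps = 100_000
    dV = (Vf - Vi) / steps
    dS_numeric = sum(n * R * dV / (Vi + (k + 0.5) * dV) for k in range(steps))

    dS_closed_form = n * R * math.log(Vf / Vi)   # n R ln(Vf / Vi)

    print(f"numerical:   {dS_numeric:.4f} J/K")    # ~5.7628 J/K
    print(f"closed form: {dS_closed_form:.4f} J/K")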
In the case of an irreversible process, it is always possible
to devise a reversible process (starting in "i" and ending in "f")
that matches the initial and final states of the irreversible process
(typically a combination of isothermal and adiabatic processes, as in
a Carnot cycle: [+dQ1 at T1, -dQ2 at T2]).
For the cycle, we have:
dQ1/T1 - dQ2/T2 = 0, that is, dQ1/T1 = dQ2/T2, so
S2 = S1, or ΔS = 0.
ΔS = 0 [for a cycle], which implies that
the entropy S is a variable of state.
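A quick numerical check of this statement (illustrative numbers, not
from the notes): for an ideal Carnot cycle the heats exchanged satisfy
Qh/Th = Qc/Tc, so the entropy change of the working gas summed around
the cycle vanishes.

    # Sketch: the entropy change of the gas around an ideal Carnot cycle is zero.
    T_hot, T_cold = 500.0, 300.0     # reservoir temperatures in K (illustrative)
    Q_hot = 1000.0                   # J absorbed from the hot reservoir

    # For a Carnot cycle, Q_cold / T_cold = Q_hot / T_hot.
    Q_cold = Q_hot * T_cold / T_hot  # J rejected to the cold reservoir

    dS_cycle = Q_hot / T_hot - Q_cold / T_cold
    print(f"Q_hot/T_hot   = {Q_hot / T_hot:.3f} J/K")
    print(f"Q_cold/T_cold = {Q_cold / T_cold:.3f} J/K")
    print(f"Entropy change over the cycle: {dS_cycle:.3e} J/K")  # ~0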
Heat goes spontaneously from a hot source to a cold one; in such an
irreversible transfer, the cold source gains more entropy than the hot
source loses. Furthermore, the total entropy never decreases:
ΔS >= 0
- For a reversible process (equilibrium), ΔS = 0,
- For an irreversible process, ΔS > 0.
If we write Universe = System + Surroundings, we always have
ΔSUniverse >= 0
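A small Python sketch of this inequality (illustrative values, not from
the notes): when heat Q flows irreversibly from a hot reservoir to a
cold one, the entropy gained by the cold reservoir exceeds the entropy
lost by the hot one.

    # Sketch: total entropy increases when heat flows irreversibly from hot to cold.
    T_hot, T_cold = 400.0, 300.0   # reservoir temperatures in K (illustrative)
    Q = 600.0                      # J of heat transferred from hot to cold

    dS_hot  = -Q / T_hot           # entropy lost by the hot reservoir
    dS_cold = +Q / T_cold          # entropy gained by the cold reservoir
    dS_universe = dS_hot + dS_cold

    print(f"dS_hot = {dS_hot:.3f} J/K, dS_cold = {dS_cold:.3f} J/K")
    print(f"dS_universe = {dS_universe:.3f} J/K  (> 0, as required)")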
4. Microscopic definition of the entropy :
Now let's write things as follows:
Suppose that we have a function S of variables P, V, n, and T:
S = S(P, V, n, T). S is an extensive (additive) quantity; then for
two parts "1" and "2" of a system, we can write the total
entropy as follows:
S = S1 + S2
But if their numbers of microstates are Ω(1) and Ω(2), the total
number of microstates for the system is Ω(1) x Ω(2). A macrostate
depends also on the same variables of state: P, V, n, and T.
To recap:
S = S(P, V, n, T), S = Σ Si
Ω = Ω(P, V, n, T) = Π Ω(i)
The function that satisfies both requirements, turning the product of
microstate counts into a sum of entropies, is the natural
logarithm (ln). So,
S = κ ln(Ω)
where κ is a constant (Boltzmann's constant).
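As a closing illustration (not part of the original notes; the
microstate counts are made up), a short Python sketch showing that
S = κ ln(Ω) makes entropy additive when microstate counts multiply:

    import math

    # Sketch: the logarithm turns the product of microstate counts into a
    # sum of entropies, S = k ln(Omega).
    k = 1.380649e-23        # Boltzmann's constant, J/K

    Omega_1 = 1.0e20        # microstates of subsystem 1 (made-up count)
    Omega_2 = 4.0e18        # microstates of subsystem 2 (made-up count)

    S1 = k * math.log(Omega_1)
    S2 = k * math.log(Omega_2)
    S_total = k * math.log(Omega_1 * Omega_2)   # combined system: product of counts

    print(f"S1 + S2 = {S1 + S2:.3e} J/K")
    print(f"S_total = {S_total:.3e} J/K")       # equals S1 + S2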