In statistical mechanics, entropy (following Boltzmann) is defined, in dimensionless form, as

\(S=Log\left( W \right) \)

where \(W\) is the number of accessible microstates. For a complex system composed of subsystems \(A, B, C, \dots\) interacting with each other, we define the total entropy as

\(S=Log\left( { W }_{ A }{ W }_{ B }{ W }_{ C }\dots \right) \)

A subsystem can be the phase space of an independent variable, such as position or velocity; it does not necessarily refer to physically separate systems, like bottles of gas in contact. For example, for an ideal gas composed of independently moving point-like atoms, one phase space plots the current positions of all the individual atoms, while another plots their current velocities.
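As a toy illustration of this subsystem picture (a sketch with hypothetical numbers, not from the text): if each of \(N\) atoms can sit independently in any of \(M\) position cells and move in any of \(K\) velocity cells, then \(W_{position} = M^N\) and \(W_{velocity} = K^N\), and the log of the product splits into a sum of subsystem entropies:

```python
import math

# Toy discretized phase space: N atoms, each independently occupying one of
# M position cells and one of K velocity cells (hypothetical values).
N, M, K = 10, 4, 5

W_position = M ** N   # microstates of the position subsystem
W_velocity = K ** N   # microstates of the velocity subsystem

# Total entropy S = Log(W_A * W_B) splits into a sum over subsystems.
S_total = math.log(W_position * W_velocity)
assert abs(S_total - (N * math.log(M) + N * math.log(K))) < 1e-9
```

The additivity of entropy over independent subsystems is exactly why the product of microstate counts appears inside the logarithm above.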

One property of entropy is that if there are any changes among the subsystems, the total entropy never decreases, and it reaches a maximum when equilibrium is reached. This is known as the 2nd Law of Thermodynamics [which is yet to be rigorously proven]. In statistical mechanics, this property rests on a number of assumptions, some of which are

1) \(W\) is the number of **accessible** microstates, each of which has [at least roughly] equal probability of occurring, or being accessed

2) Each subsystem circulates freely within its phase space of microstates

3) There is some conserved quantity, constraint, or interdependence between the subsystems, so that the number of microstates cannot grow without bound; otherwise there would be no equilibrium state at which entropy is maximized

4) Probability favors a subsystem being in a larger phase space rather than a smaller one

The simplest example of such a system is where

\({ W }_{ A }+{ W }_{ B }=4\)

Subject to this constraint, the product \({ W }_{ A }{ W }_{ B }\) is largest when \({ W }_{ A }={ W }_{ B }=2\), so total entropy has its maximum at

\( S=Log\left( 2\cdot 2 \right) \)
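As a quick numerical check of this simplest example (a sketch, not part of the original argument), we can enumerate the positive-integer splittings of \(W_A + W_B = 4\) and confirm that the product, and hence the entropy, peaks at \(W_A = W_B = 2\):

```python
import math

# Enumerate the ways to split W_A + W_B = 4 into positive integers
# and find the split that maximizes S = log(W_A * W_B).
best = max(((wa, 4 - wa) for wa in range(1, 4)),
           key=lambda pair: pair[0] * pair[1])

assert best == (2, 2)                  # the equilibrium split
S_max = math.log(best[0] * best[1])    # = Log(2 * 2) = Log(4)
assert abs(S_max - math.log(4)) < 1e-12
```

Maximizing the log of a product subject to a fixed sum always favors equal factors, which is the toy version of equilibrium here.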

For an ideal \(3D\) thermodynamic gas, we assume that the governing variables are

\(m, v, L, N\), or mass, root-mean-square velocity, container length, and particle number

For two subsystems of phase-space volumes \({ v }^{ 3N }\) and \({ L }^{ 3N }\), and for some dimensional proportionality constant \(k\), we have (with \(m\) contributing a phase-space factor of \(1\) because it remains constant)

\(S=kLog\left( W \right) =kLog\left( { v }^{ 3N }{ L }^{ 3N } \right) \)

Defining physical volume to be \(V={ L }^{ 3 }\), we have

\(S=3NkLog\left( v\cdot { V }^{ \frac { 1 }{ 3 } } \right) \)

\(dS=3Nk\left( \dfrac { dv }{ v } +\dfrac { 1 }{ 3 } \dfrac { dV }{ V } \right) \)

\(dS=\left( Nmvdv+\dfrac { 1 }{ 3 } Nm{ v }^{ 2 }\dfrac { dV }{ V } \right) { \left( \dfrac { 1 }{ 3k } m{ v }^{ 2 } \right) }^{ -1 }\)
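The rearrangement above can be spot-checked numerically (a sketch with arbitrary illustrative values for \(N\), \(m\), \(v\), \(V\), which are not physical data from the text): both forms of \(dS\) should agree term by term.

```python
# Numerical spot check that the two forms of dS agree:
#   dS = 3Nk (dv/v + dV/(3V))
#   dS = (Nmv dv + (1/3) Nmv^2 dV/V) / ((1/(3k)) m v^2)
# All values below are arbitrary illustrative choices.
k = 1.380649e-23        # Boltzmann constant, J/K
N, m = 1.0e22, 6.6e-27  # particle count and particle mass (illustrative)
v, V = 1000.0, 0.01     # rms speed (m/s) and volume (m^3)
dv, dV = 0.5, 1e-6      # small variations

dS_form1 = 3 * N * k * (dv / v + dV / (3 * V))
dS_form2 = (N * m * v * dv + N * m * v**2 * dV / (3 * V)) \
           / (m * v**2 / (3 * k))

assert abs(dS_form1 - dS_form2) / abs(dS_form1) < 1e-10
```

The agreement is exact up to floating-point rounding, since the second form is just the first multiplied and divided by \(\frac{1}{3k}mv^2\).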

From statistical mechanics of an ideal gas, we make use of the following definitions and relations

\(U=\dfrac { 1 }{ 2 } Nm{ v }^{ 2 }\)

\(dU=Nm{ v }dv\)

\(pV=\dfrac { 1 }{ 3 } Nm{ v }^{ 2 }\)

\(pdV=\dfrac { 1 }{ 3 } Nm{ v }^{ 2 }\dfrac { dV }{ V } \)

\(\dfrac { 3 }{ 2 } NkT=U\)

\(T=\dfrac { 1 }{ 3k } m{ v }^{ 2 }\)

where \(T\) and \(p\) are temperature and pressure, and \(U\) is internal energy

From these, and the expression for \(dS\), we have

\(dS=\dfrac { dU+pdV }{ T } \)

If we define \(dQ=dU+pdV\), where \(Q\) is heat, then we have the classic thermodynamic expression

\(dS=\dfrac { dQ }{ T } \)
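To close the loop, here is a finite-difference check (again with arbitrary illustrative values, not data from the text) that the thermodynamic expression \(dS=(dU+pdV)/T\) matches the change in the statistical entropy \(S=3Nk\,Log(v\,V^{1/3})\):

```python
import math

k = 1.380649e-23         # Boltzmann constant, J/K
N, m = 1.0e22, 6.6e-27   # illustrative particle count and mass

def S(v, V):
    """Statistical entropy S = 3Nk log(v V^(1/3)) from the derivation."""
    return 3 * N * k * math.log(v * V ** (1.0 / 3.0))

v, V = 1000.0, 0.01      # state point (illustrative)
dv, dV = 1e-4, 1e-9      # small perturbations

# Thermodynamic side: dS = (dU + p dV) / T, using the ideal-gas relations.
dU = N * m * v * dv                 # dU = Nmv dv
p = N * m * v**2 / (3 * V)          # from pV = (1/3) Nmv^2
T = m * v**2 / (3 * k)              # from T = (1/(3k)) m v^2
dS_thermo = (dU + p * dV) / T

# Statistical side: finite difference of S.
dS_stat = S(v + dv, V + dV) - S(v, V)

assert abs(dS_thermo - dS_stat) / abs(dS_stat) < 1e-4
```

For small enough \(dv\) and \(dV\) the two sides agree to high precision, which is the content of the identification \(dS = dQ/T\) for this ideal-gas model.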

\(k\) is the Boltzmann constant, with the value \(1.3806\ldots\times { 10 }^{ -23 }\ \mathrm{kg\cdot m^{2}\cdot s^{-2}\cdot K^{-1}}\), so that classical entropy has the physical dimension of energy divided by temperature.
