
Entropy: is it what we think it is and how should we teach it?

David Sands

Dept. Physics and Mathematics

University of Hull

UK

Institute of Physics, December 2012

We owe our current view of entropy to Gibbs:

“For the equilibrium of any isolated system it is necessary and sufficient that in all possible variations of the system that do not alter its energy, the variation of its entropy shall vanish or be negative.”

Equilibrium of Heterogeneous Substances, 1875

And Maxwell: “We must regard the entropy of a body, like its volume, pressure, and temperature, as a distinct physical property of the body depending on its actual state.”

Theory of Heat, 1891

Clausius:

Was interested in what he called “internal work” – work done in overcoming inter-particle forces;

Sought to extend the theory of cyclic processes to cover non-cyclic changes;

Actively looked for an equivalent equation to the central result for cyclic processes:

∮ dQ/T ≥ 0

Clausius:

The positive sign arises because of Clausius’ view of heat; not caloric, but still a property of a body

The transformation of heat into work was something that occurred within a body – led to the notion of “equivalence value”, Q/T

In modern thermodynamics the sign is negative, because heat must be extracted from the system to restore the original state if the cycle is irreversible.

Clausius:

T dZ = dI + dW

Invented the concept “disgregation”, Z, to extend the ideas to irreversible, non-cyclic processes;

Inserted disgregation into the First Law;

dQ − dH − T dZ ≤ 0

Clausius:

Changed the sign of dQ; (originally dQ=dH+AdL; dL=dI+dW)

Derived:

dQ/T − dH/T − dZ ≤ 0

Called

dH/T + dZ

the entropy of a body.

Disgregation: Measures “the force of the heat”.

‘The law does not speak of the work which the heat actually does, but that which it can do. Similarly, in the first form of the law … it is not of the resistances which the heat overcomes but of those which it can overcome that mention is made’ - (Clausius, 1862)

‘In order … to determine the force of the heat we must evidently not consider the resistance which actually is overcome, but that which can be overcome’ - (Clausius, 1862)

Therefore:

dS ≥ dQ/T

But what does the inequality really mean?

T dS − dQ ≥ 0

The difference T dS − dQ has the units of energy but is not dU, dQ, dW or any combination of them!

Moreover:

Clausius applied the concept of disgregation to the ideal gas.

There are no inter-particle forces, and therefore no internal work.

There is no need to introduce the disgregation.

In a free expansion:

T dZ = p dV

dQ = dW = 0

dS = dZ ≠ 0

Some property of the gas with units of energy is changing, but p dV is a fictitious work term that has its origins in the notion of “force of the heat”.
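A standard ideal-gas calculation, added here only as an illustration of the point, makes it concrete: in a free expansion from V1 to V2, dQ = dW = 0 throughout, yet

ΔS = nk ln(V2/V1) > 0,

a change recovered only by integrating dQ/T along a reversible path between the same end states; the “work” T dZ = p dV is never actually performed.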

“If we define thermodynamics, as I think we may now do, as the investigation of the dynamical and thermal properties of bodies, … all speculations as to how much of the energy in a body is in the form of heat are quite out of place.”

Maxwell, Tait’s Thermodynamics, 1878

The microscopic view

The emphasis in thermodynamics shifted away from engines and cyclic processes to the microscopic.

Statistical mechanics: canonical distribution

p(E_i) = (1/Z) e^(−E_i/kT)

S = k_B ln Z + ⟨E⟩/T

Often given a physical interpretation: the probability of finding a small system in contact with a reservoir in a given energy state.

k_B ln p(E) ∝ S_R(E_R) = S_R(E_tot − E) ≈ S_R(E_tot) − E ∂S_R/∂E_R

∂S_R(E_R)/∂E_R = 1/T

The entropy depends on parameters that do not themselves fluctuate
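A minimal Python sketch of the two canonical relations above (the five-level spectrum and the temperature are arbitrary assumptions, not taken from the talk):

import numpy as np

kB = 1.380649e-23                                   # Boltzmann constant, J/K
T = 300.0                                           # temperature, K (arbitrary)
E = kB * T * np.array([0.0, 0.5, 1.3, 2.7, 4.0])    # assumed toy level scheme, J

beta = 1.0 / (kB * T)
Z = np.sum(np.exp(-beta * E))                       # partition function
p = np.exp(-beta * E) / Z                           # canonical probabilities p(E_i)
E_mean = np.sum(p * E)                              # mean energy <E>

S_gibbs = -kB * np.sum(p * np.log(p))               # -k sum p ln p
S_canon = kB * np.log(Z) + E_mean / T               # k ln Z + <E>/T
print(S_gibbs, S_canon)                             # the two agree to rounding error

The two expressions agree because, for the canonical p, −Σ p ln p = β⟨E⟩ + ln Z.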

Statistical mechanics and information theory:

The work of Jaynes in particular, and of Tribus, has attempted to explain thermodynamics in terms of information theory as developed by Shannon.

Jaynes published two papers on the subject in 1957:

He acknowledged that Statistical Mechanics was not a physical theory,

But went further and claimed that it need not be: statistical inference allows physically useful properties of systems, such as mean energy and mean square deviation, to be derived.

Is this true of entropy?

Statistical mechanics agrees with thermodynamics.

If the increase in thermodynamic entropy in an irreversible adiabatic process has no physical basis in energy, what does this mean for the statistical entropy?

If we derive a set of physically descriptive distributions and the entropy is different from that of statistical mechanics, what does that mean?

Gibbs:

A small part of a larger system is canonically distributed.

p(v) dv = (m/2πkT)^(3/2) e^(−mv²/2kT) dv

d^nP = p(v₁) p(v₂) p(v₃) … p(v_n) dv₁ dv₂ dv₃ … dv_n

System probability: a product of single-particle Maxwellians.

This leads directly to the Gamma distribution

P(E) dE = [β^α / Γ(α)] E^(α−1) e^(−βE) dE,   β = (kT)^(−1),   α = 3n/2,   ⟨E⟩ = α/β

P(ε) dε = [1/Γ(α)] ε^(α−1) e^(−ε) dε

p(E_i) = (1/Z) e^(−E_i/kT)

Histograms: 10^5 samples. Solid lines: Gamma distribution.

Computer simulations of a hard-sphere fluid confirm the Gamma distribution

400 particles in total.
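A simplified Python sketch of such a check, sampling velocities directly from the Maxwellian rather than running a hard-sphere molecular-dynamics simulation; the particle number and sample count are assumptions chosen to echo the captions:

import numpy as np
from scipy import stats

kT, m = 1.0, 1.0                         # reduced units
n = 20                                   # particles in the small system
samples = 10**5

# three Gaussian velocity components per particle, each with variance kT/m
v = np.random.normal(0.0, np.sqrt(kT / m), size=(samples, n, 3))
E = 0.5 * m * (v**2).sum(axis=(1, 2))    # total kinetic energy of the n particles

alpha = 1.5 * n                          # Gamma shape parameter 3n/2
# compare the sampled energies with Gamma(alpha, scale = kT)
print(stats.kstest(E, 'gamma', args=(alpha, 0.0, kT)))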

Extension to open systems:

Insert permeable partition – a series of fixed spheres dividing the chamber, concentrate on one side only.

Look at a small part (n = 6, n = 20) of a larger system – 400 particles.

System occupancy – equal volumes

The energy distributions for three of the occupation numbers (7, 10 and 13) of the small system of 20 particles

solid lines: Gamma distribution

For very large ratios of volumes the larger system becomes both a thermal and a particle reservoir – the small-system occupancy is given by the Poisson distribution

Ratio of volumes 24:1

Total numbers of particles:

100, 300, 500, 700
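A Python sketch of the occupancy statistics (uniform random placement rather than the hard-sphere simulation of the talk; the numbers follow the captions above):

import math
import numpy as np

N = 700                                  # total particles (largest case in the caption)
frac = 1.0 / 25.0                        # small volume / total volume for a 24:1 ratio
trials = 10000

x = np.random.rand(trials, N)            # uniform positions along one axis
occupancy = (x < frac).sum(axis=1)       # particles found inside the small volume

lam = N * frac                           # expected mean occupancy
for n in range(int(lam) - 5, int(lam) + 6):
    observed = np.mean(occupancy == n)
    poisson = math.exp(-lam) * lam**n / math.factorial(n)
    print(n, observed, poisson)

Strictly the occupancy of independent uniform placements is binomial; it approaches the Poisson form as the larger volume becomes a particle reservoir, which is the point of the slide.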

The distribution of energy states in the smaller system is again given by the weighted sum of Gamma distributions

Ratio of volumes 24:1

Total numbers of particles:

100, 300, 500, 700

p(n, E_{i,n}) = p(n) · p(E_{i,n})

p(E_i) = Σ_n p(n, E_{i,n})
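Written out explicitly (one reading of the two relations above, using the Poisson occupancy of the reservoir limit together with the Gamma form from the earlier slides):

P(E) dE = Σ_n p(n) [β^(3n/2) / Γ(3n/2)] E^(3n/2 − 1) e^(−βE) dE,   p(n) = e^(−⟨n⟩) ⟨n⟩^n / n!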

Entropy: Is the information entropy the same as thermodynamic entropy?

Which variable? Energy or phase?

Energy:

P(ε) dε = [1/Γ(α)] ε^(α−1) e^(−ε) dε

P(E) dE = [β^α / Γ(α)] E^(α−1) e^(−βE) dE

H(ε) = ln Γ(α) + (1 − α) ψ(α) + α

H(E) = H(ε) + ln kT

Phase:

d^nP = (1/V^n) p(v₁) p(v₂) p(v₃) … p(v_n) dv₁ … dv_n dV₁ … dV_n

H = (3/2) n ln T + n ln V + …

For physically descriptive statistical distributions, the entropy taken over the distribution of energy states differs from the entropy taken over the phase-space probability.

Only the latter agrees with thermodynamics as we know it.
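A small Python sketch of the comparison (illustrative only; it uses scipy's Gamma entropy and reduced units in which k = 1 and the volume is held fixed):

import numpy as np
from scipy import stats

n = 20
alpha = 1.5 * n
for kT in (1.0, 2.0, 4.0):
    H_energy = stats.gamma.entropy(alpha, scale=kT)   # Shannon entropy of P(E)
    H_phase_T = 1.5 * n * np.log(kT)                  # temperature part of (3/2) n ln T + n ln V + ...
    print(kT, H_energy, H_phase_T)
# Doubling kT adds only ln 2 to H_energy, but (3/2) n ln 2 to the phase-space
# entropy; only the latter tracks the thermodynamic entropy change.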

S = ∫ dQ/T = ∫ dU/T + ∫ (p/T) dV = (3/2) nk ln T + nk ln V + …

But, Gibbs’ paradox!

Entropy in open systems

p(n, E_{i,n}) = p(n) · p(E_{i,n})

S(n, E) = k_B [ H(n) + Σ_n p(n) H_n ]

S = (3/2) nk ln T + nk ln V + k H(n) + …

H(n) ≈ ½ ln(2πe⟨n⟩)

H(n) is clearly non-thermodynamic
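A short Python check of how large H(n) actually is (the mean occupancies are assumed values in the range of the earlier captions):

import numpy as np
from scipy import stats

for mean_n in (4.0, 12.0, 28.0):                       # assumed mean occupancies
    n = np.arange(0, int(mean_n + 10 * np.sqrt(mean_n)) + 1)
    p = stats.poisson.pmf(n, mean_n)
    p = p[p > 0]
    H = -np.sum(p * np.log(p))                         # Shannon entropy of the occupancy
    print(mean_n, H, 0.5 * np.log(2 * np.pi * np.e * mean_n))
# H(n) grows only logarithmically with <n>: it is not extensive and has no
# thermodynamic counterpart.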

What about H(V)?

Planck: correct for over-counting by dividing the partition function by N!

Information theory: doesn’t deal with states but with probability

p(V) dV = (1/V) dV is normalized

Remove constraints: entropy increases

Thermodynamic and information entropies would appear to be different!

Statistical entropy should be regarded as an aspect of probability theory and taught as such.

Maximising the entropy is equivalent to finding the most probable distribution!
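The standard counting argument behind this equivalence, sketched for completeness: for N independent trials with occupation numbers N_i = N p_i, Stirling's approximation gives

ln W = ln( N! / Π_i N_i! ) ≈ −N Σ_i p_i ln p_i = N H,

so the distribution that maximises the Shannon entropy H is precisely the most probable one.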

Conclusions

1. Doubts over the concept of the entropy of a body.

2. Gibbs’ statistical mechanics is not a physical theory, and attempts to relate it to physical phenomena through entropy fluctuations are flawed.

3. Physically representative probability distributions in the classical ideal gas differentiate between energy and phase.

4. Information entropy contains non-thermodynamic components: H(n) and possibly H(V).

5. Thermodynamic and information entropies would appear to be different.

6. Statistical entropy is a very useful property for characterizing uncertainty and for finding the most probable distribution.

Fin.

Thank you for your time.

Addenda

Gibbs’ statistical mechanics

Presented the canonical ensemble in 1902.

Not intended to represent a physical system.

“I have found it convenient, instead of considering one system of material particles, to consider a large number of systems similar to each other in all respects except in the initial circumstances of the motion, which are supposed to vary from system to system, the total energy being the same in all. In the statistical investigation of the motion, we confine our attention to the number of these systems which at a given time are in a phase such that the variables which define it lie within given limits.”

J.C. Maxwell, On Boltzmann’s theorem on the average distribution of energy in a system of material points, Cambridge Philosophical Society’s Transactions, Vol. XII., 1876

solid lines: weighted sum of Gamma distributions

The distribution of energy states for four small systems: 6, 10, 15 and 20 particles, out of 400 in total, in equal volumes

p(n, E_{i,n}) = p(n) · p(E_{i,n})

p(E_i) = Σ_n p(n, E_{i,n})

1. Fluctuations in energy

The single-particle Gamma distribution shows that the velocity of a single particle is always governed by the Maxwellian.

Entropy in IT represents uncertainty; it should not fluctuate with the state of the system.

Distribution of velocities across 300 particles (intervals of 30 m s^-1)

Mean H-function for systems of different sizes

Fluctuations in the H-function for a system of 2000 particles
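A Python sketch of this behaviour (an illustrative histogram estimate of the H-function, not the author's simulation; the bin width and repeat count are assumptions):

import numpy as np

rng = np.random.default_rng(0)
bins = np.linspace(-4.0, 4.0, 41)        # velocity bins, in units of sqrt(kT/m)
width = bins[1] - bins[0]

def h_estimate(n_particles):
    # histogram estimate of -sum f ln f dv from one snapshot of velocities
    v = rng.normal(size=n_particles)     # one Maxwellian component, kT/m = 1
    f, _ = np.histogram(v, bins=bins, density=True)
    f = f[f > 0]
    return -np.sum(f * np.log(f)) * width

for n in (100, 300, 1000, 2000):
    h = [h_estimate(n) for _ in range(200)]
    print(n, np.mean(h), np.std(h))      # the spread shrinks as n grows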
