
  • 8/13/2019 17.3rd Law of Thermo


    School of Physics and Astronomy

Junior Honours Thermodynamics GJA 2013-2014

Lecture TOPIC 17¹ (Finn: 11.1, 11.3, 11.4, 11.5, 5.10)

Synopsis: The Third Law of Thermodynamics. Entropy, probability, disorder.

    The Third Law of Thermodynamics: introduction.

    The Nernst Heat Theorem

Like most theorems and laws in classical thermodynamics, Nernst's theorem has been amended from its initial statement. A compromise version is:

Consider a system undergoing a process between initial and final equilibrium states as a result of external influences, such as pressure. The system experiences a change in entropy, and the change tends to zero as the temperature characterising the process tends to zero.

The theorem can be traced back to the observations by Nernst summarised in the diagram. When processes having the same initial and final temperature T are observed for different values of T, the change ΔG in the value of the Gibbs function decreases whilst the change ΔH in the value of the enthalpy H increases, in a way that suggests that ΔG → ΔH asymptotically as T → 0.

Because

$$
\Delta G = G_f - G_i = \Delta H - \Delta(TS) = H_f - H_i - T(S_f - S_i) = \Delta H - T\Delta S
$$

for the situation considered (the initial and final temperatures are equal), the two curves touch asymptotically only if ΔG → ΔH is achieved by ΔS → 0.
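As a toy numerical illustration (the linear model ΔS = aT is an assumption for this sketch, not from the lecture): if ΔS vanishes linearly with T, the gap ΔH − ΔG = TΔS closes quadratically as T → 0:

```python
# Toy model (assumption): the entropy change of the process vanishes
# linearly with temperature, Delta_S = a * T, for some arbitrary a.
a = 2.0

def gap(T):
    """Gap Delta_H - Delta_G = T * Delta_S = a * T**2 in this toy model."""
    delta_S = a * T
    return T * delta_S

# The enthalpy and Gibbs-function curves approach each other as T -> 0.
for T in [1.0, 0.1, 0.01]:
    print(T, gap(T))
```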

An alternative statement of the Third Law, attributed to Planck, ensures that it can be applied generally by referring to a perfect crystalline state, to which a large number of systems can approximate. The better the approximation, the simpler is the thermodynamic feature which they share:

The entropy of all perfect crystals is the same at the absolute zero, and may be taken to be zero.

The key feature is that all perfect crystals would have the same value of entropy at the absolute zero of the thermodynamic temperature T. That this value should be uniquely chosen is not just a matter of convenience. The choice S₀ = 0 at T = 0 allows a strong link to be made between thermodynamics and statistical mechanics, where the same assumption is made and supported by evidence from a very wide range of physical phenomena. All this leads naturally to the Simon statement of the Third Law:

The contribution to the entropy from each aspect of a system which is in thermodynamic equilibrium disappears at absolute zero.

¹ file topic17.pdf


If a succession of these processes is considered, operating between a fixed value of magnetic induction and zero induction, with the final temperature of one process being the initial temperature of the next one, the temperature drop becomes progressively smaller. The two curves of entropy versus temperature, one for zero field (B = 0) and the other for fixed field B = B₀, both end on T = 0 with S = 0 according to the Nernst statement of the 3rd law. Graphically it is seen that the decrease of entropy in successive processes becomes smaller in such a way that for a finite number of processes the entropy never reaches zero, and neither does the temperature.

    Disobeying the Third Law: Ideal Gas Entropy and Free Energy, and glasses

The heat capacity for a monatomic ideal gas is c_V = 3R/2, independent of temperature.

Using $c_V/T = (\partial S/\partial T)_V$ and a Maxwell relation $(\partial S/\partial V)_T = (\partial P/\partial T)_V$ we can easily calculate changes in S:

$$
\Delta S = \int_{T_0}^{T} \frac{C_V}{T}\,dT + \int_{V_0}^{V} \left(\frac{\partial P}{\partial T}\right)_V dV
         = \int_{T_0}^{T} \frac{3NR}{2T}\,dT + \int_{V_0}^{V} \frac{NR}{V}\,dV
         = \frac{3NR}{2}\ln\frac{T}{T_0} + NR\ln\frac{V}{V_0}
$$

This implies that the entropy difference between absolute zero and finite T for an ideal gas is infinite. Obviously this is nonsense: the absolute values of entropy (and of the free energies G and F) are undefined for an ideal gas. The best we can do is find their value relative to some reference state.²
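A short numerical sketch of the formula above (all numbers arbitrary): holding the final state fixed and pushing the reference temperature T₀ towards zero makes ΔS diverge, which is why only entropy differences are defined.

```python
import math

R = 8.314  # molar gas constant, J/(mol K)

def delta_S(N, T, T0, V, V0):
    """Entropy change of N moles of monatomic ideal gas from (T0, V0) to (T, V)."""
    return 1.5 * N * R * math.log(T / T0) + N * R * math.log(V / V0)

# Fixed final state (arbitrary values); reference temperature heading to zero.
for T0 in [100.0, 1.0, 1e-6, 1e-12]:
    print(T0, delta_S(N=1, T=300.0, T0=T0, V=0.025, V0=0.025))
# Delta_S grows without bound as T0 -> 0: the absolute entropy is undefined.
```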

Glasses look as if their entropy cannot be reduced to zero even if they can be cooled to 0 K: there are many ways to arrange the atoms and still have a glass. A glass does not have the long-range order of a crystalline substance; it has entropy frozen in. It is well known, however, that, like a liquid, a glass can flow (though extremely slowly), with the structure gradually crystallising.

So the get-outs are that ideal gases aren't quantised, and glasses are not equilibrium systems!

    The Microscopic origin of Thermodynamics: Statistics, entropy, probability and disorder

DEFINITION: A micro-state is a way the particles could be arranged at one instant in a given phase.

DEFINITION: Ergodicity means that it is actually possible to move from any microstate to any other.

The number of microstates W is a rapidly increasing function of the energy. In the Boltzmann definition, S = k_B ln W, so zero entropy means there is only one way of arranging the atoms (W = 1). At the lowest temperature there will eventually be only one state, the ground state, and the entropy is S = k_B ln(1) = 0. Some examples of how to count...
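Boltzmann's formula can be evaluated directly; a minimal sketch (the values of W are chosen purely for illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """S = k_B ln W for a macrostate realisable in W ways."""
    return k_B * math.log(W)

print(boltzmann_entropy(1))       # unique ground state (W = 1): S = 0 exactly
print(boltzmann_entropy(10**23))  # many microstates: S of order k_B * 23 ln 10
```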

    coordinate space: 1-dimensional

Let the position coordinate of each particle in a system of a finite number of identical particles be in one of the ranges x₁ to x₂, x₂ to x₃, x₃ to x₄, and so on. Ergodicity (freedom to move around) means each particle is as likely to be in range 1 as in range 2.

Consider a system of only three particles and two ranges. The chance of finding all three particles in range 1 is the same as the chance of finding all three in range 2. However, there are more ways of arranging the atoms with two in range 1 and one in range 2.³ There is only one arrangement with all particles in range 1 [W = 1]. There are three arrangements with two particles in range 1 and one in range 2, since the particle in range 2 can be any of the three available [W = 3].

The closer the arrangement is to uniformity, the more likely it is that the arrangement will be observed. Note that although the particles are identical, exchanging two particles within any one range does not increase the possible number of arrangements.
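The three-particle count can be verified by brute force. This sketch enumerates all 2³ = 8 placements and tallies how many microstates give each occupancy of range 1:

```python
from itertools import product
from collections import Counter

# Each of 3 particles independently sits in range 1 or range 2;
# count how many particles chose range 1 in each of the 8 placements.
counts = Counter(placement.count(1) for placement in product([1, 2], repeat=3))

# counts[n] = number of microstates with n particles in range 1
print(dict(counts))  # occupancies 0 and 3 have W = 1; occupancies 1 and 2 have W = 3
```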

² This is why Standard Temperature and Pressure has such iconic status in chemistry.
³ Some people find this obvious, others say it defies common sense. Look up the Monty Hall problem or the principle of restricted choice.


The argument can be extended for 4, 5, 6, ... particles, becoming longer but more convincing. Ultimately the numbers of ways with 0, 1, 2, ..., N particles in range 1 are given by Pascal's triangle: 1 1; 1 2 1; 1 3 3 1; 1 4 6 4 1; and so on.

When all this is extended to a large number of particles, the thermodynamic probability of finding all particles in a single range becomes minute compared with that of equal numbers in all ranges. Assuming ergodicity, the most likely state is the one which is observed, even if we start in an unlikely one: free expansion takes us from a low-probability (low-entropy) state to a high-probability one.
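Extending the binomial counting to a larger (hypothetical) N shows how lopsided the odds become; Python's math.comb supplies the Pascal's-triangle entries:

```python
import math

N = 100
total = 2 ** N  # total number of microstates for N particles in two ranges

# Probability of all N particles in range 1 versus an even 50/50 split.
p_all_one_side = math.comb(N, 0) / total
p_even_split = math.comb(N, N // 2) / total

print(p_all_one_side)
print(p_even_split)
print(p_even_split / p_all_one_side)  # the even split is overwhelmingly likelier
```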

phase space

For a full description of a system, there are six coordinates per particle: three for space coordinates and three for momentum (or, alternatively, velocity). For a gas in equilibrium, the molecules are distributed uniformly in space, but although their velocity directions are uniformly distributed, all velocity magnitudes are not equally likely. The distribution tends to zero for zero velocity and for large velocity: how large "large" is depends on the energy content of the gas sample. The distribution has a single maximum, and is referred to as the Maxwell-Boltzmann distribution.
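The shape just described can be checked by sampling. Assuming, as in equilibrium statistical mechanics, that each velocity component is Gaussian, the sampled speeds are rare near zero, rise to a single peak, and fall off at large values; a sketch in arbitrary units:

```python
import math
import random

random.seed(0)

# Speed = |v| with three independent Gaussian velocity components
# (arbitrary units), giving a Maxwell-Boltzmann speed distribution.
speeds = [
    math.sqrt(sum(random.gauss(0, 1) ** 2 for _ in range(3)))
    for _ in range(100_000)
]

# Crude histogram: bins of width 0.5 covering speeds from 0 to 5.
bins = [0] * 10
for s in speeds:
    if s < 5.0:
        bins[int(s / 0.5)] += 1

print(bins)  # few counts near v = 0, a single peak, then a decaying tail
```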

The important factor is ergodicity: the particles must be able to move from one microstate to another. At its simplest, this ensures that, for a sample of a very large number of particles, each arrangement (microstate) is equally probable, but there is an overwhelming probability of observing the system in one or another of the very large number of equivalent arrangements (microstates) which make up the macrostate that can be constructed in the largest number of ways, i.e. with the largest statistical weight.

An interesting outcome from all this is that the Second Law of Thermodynamics is frequently referred to as a statistical law, which means there's a chance of breaking it.

Suppose it were possible to restrict the more energetic molecules of a sample of gas to one part of the containing vessel if and when these more energetic molecules appeared in the course of random fluctuations in the sample. Then energy would be transferred preferentially from the cooler part of the sample (which would become progressively cooler) to the warmer part (which would become progressively warmer), putting the Second Law at risk. The means of achieving this without doing any mechanical work was the intervention (proposed by J. C. Maxwell) of an alert and well-informed finite being in total control of a special massless shutter mechanism, soon becoming known (to Maxwell's disapproval) as Maxwell's demon.

Indeed, for small systems on small timescales it is sometimes possible to violate the Kelvin statement, and extract work from a system without supplying heat. Imagine pulling a piston out from a cylinder containing ideal gas atoms at temperature T and volume V against external pressure P₀. If the internal pressure P < P₀, this would require doing work to pull the piston. However, the internal pressure is supplied by atoms bouncing off the piston at random. If there are few enough atoms, and a short enough time, the number of atoms striking the piston may be greater than expected, in which case the piston will be pushed out, the system doing work without losing heat. The catch is that more often work is done on the system, and we can't predict when the violation will occur. This sort of thing has been shown experimentally.⁴
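A toy model of the piston fluctuation (all numbers invented for illustration): treat the atom-piston hits in one short window as a binomial count; occasionally the count exceeds the average by enough to "push the piston out", but only in a small fraction of windows.

```python
import random

random.seed(1)

N, p = 200, 0.1     # toy numbers: mean of 20 atom-piston hits per window
threshold = 28      # hits needed for the impulse to beat the external pressure (toy)
windows = 10_000

# Count the windows in which random arrivals exceed the threshold.
excess = sum(
    1
    for _ in range(windows)
    if sum(random.random() < p for _ in range(N)) >= threshold
)

print(excess / windows)  # small but nonzero fraction of "violating" windows
```

The point is not the exact fraction but that it is nonzero yet rare: averaged over many windows, the gas does net work on nothing.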

    Other definitions of entropy beyond this course

The probability interpretation of entropy leads to yet another definition, the Gibbs entropy:

$$
S = -k_B \sum_i p_i \ln p_i
$$

where p_i is the probability of finding the system in microstate i. The generalisation of this to quantum mechanics gives the von Neumann entropy⁵. These are equivalent to the heat definition, and also to the Shannon information entropy, which quantifies how much information is contained in a message (and therefore how much the message can be compressed). Remarkably, they are all the same, and the missing entropy in Maxwell's demon is the information in the demon's brain!
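A quick consistency check (a sketch): for W equally likely microstates, p_i = 1/W, and the Gibbs formula reduces to the Boltzmann value k_B ln W.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k_B * sum_i p_i ln p_i (terms with p_i = 0 contribute nothing)."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

W = 1000
uniform = [1 / W] * W
print(gibbs_entropy(uniform))  # equals the Boltzmann value k_B ln W
print(k_B * math.log(W))       # ... the same number
print(gibbs_entropy([1.0]))    # one certain microstate: S = 0
```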

⁴ G. M. Wang, E. M. Sevick, E. Mittag, D. J. Searles & D. J. Evans (2002). Experimental demonstration of violations of the Second Law of Thermodynamics for small systems and short time scales. Physical Review Letters 89 (5): 050601.
⁵ Similar to Gibbs, using quantum mechanical probabilities (the so-called density matrix), which you won't see in Junior Honours.