
Is Measurement Itself an Emergent Property?

PHILIP W. ANDERSON

Nobel laureate Philip Anderson is currently engaged in theoretical research on high-Tc superconductivity, and his most recent book, The Theory of Superconductivity, was published this year. His research in theoretical physics has focused on quantum theory of condensed matter, spectral line broadening, magnetism, superconductivity, broken symmetry, superfluidity in 3He and neutron stars, transport theory and localization, random statistical systems, and prebiotic evolution. Based at Princeton, where he is Professor of Physics, Emeritus, he also serves on the boards of the Aspen Center for Physics and the Santa Fe Institute. For additional information about his current research, visit his web page at http://www.princeton.edu/~pmiinfo/faculty/PWA.html.

© 1997 John Wiley & Sons, Inc. Complexity, Vol. 3, No. 1. CCC 1076-2787/97/01014-03

How can you predict the result when you can’t predict what you will be measuring?

I do not argue with either of these two basic propositions: validity of all of the laws of physics in each microscopic process, and determinism of the quantum theory: that quantum physics is deterministic, in that the state of the whole system at any given instant determines the past and the future, and that all interactions can be understood as local in space-time. But complete knowledge of the state is necessarily impossible in most measurement situations.

This is the first and perhaps the least important of four fundamental sources of unpredictability in science, or, more accurately, of the inaccessibility of the Cartesian ideal of deterministic computation by a super-duper computer from the fundamental laws: the idea that "the universe runs like clockwork." I see the difficulty in each case as, in a sense, "physical" uncomputability, an exponential explosion in the computer power necessary, rather than uncomputability in the mathematical sense.

These four fundamental sources, then, are:

1. The measurement process: coupling of quantum system to dissipative classical variable.

2. Emergence of measurable quantities: space, time, phase, field strength, etc.

3. Sensitivity to boundary conditions of classical variables.

4. Emergence of categories and concepts. What is relevant? What to measure?

1) Jim Hartle [1] has given us an adequate description of the decoherence process which leads to quantum "uncertainty." For example, the apparatus in the canonical Stern-Gerlach experiment entangles one quantum variable, orientation, with another, position, which can then be encouraged to trigger a "dissipative process," i.e., to couple to a large entropic bundle of histories that rapidly decohere from the alternative bundle from the other half of the wave function. (This is what happens in the detector.) I have nothing to add to his description, except to say that even this basic description of measurement sees it as an emergent phenomenon, in that it is irrevocably tied to the possibility of irreversibility, which emerges only in a sufficiently large system. Your result only exists after the measurement.
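As a schematic sketch (my notation, not Hartle's): writing the spin state as $\alpha|\uparrow\rangle + \beta|\downarrow\rangle$, the magnet entangles orientation with position, and the detector then couples position to a macroscopic environment $E$ whose two branches rapidly become orthogonal:

$$ (\alpha|\uparrow\rangle + \beta|\downarrow\rangle)\otimes|x_0\rangle \;\to\; \alpha|\uparrow\rangle|x_+\rangle + \beta|\downarrow\rangle|x_-\rangle \;\to\; \alpha|\uparrow\rangle|x_+\rangle|E_+\rangle + \beta|\downarrow\rangle|x_-\rangle|E_-\rangle, \qquad \langle E_+|E_-\rangle \approx 0. $$

The near-orthogonality of $|E_+\rangle$ and $|E_-\rangle$ is the decoherence of the two bundles of histories; it can only be achieved in a large system, which is the sense in which the measurement is emergent.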

2) There is another aspect of the measurement process that is emergent in a quite different sense. The quantities that we measure on the quantum system of interest are variables which are defined not by that system but by the macroscopic apparatus with which we do the measurement: orientation, position, velocity, field strength, etc. We cannot imagine a way to set up a coordinate system for orientation or position, for instance, without a rigid body to refer to. It was not by accident that Einstein's writings on relativity were full of clocks and meter sticks.

What in fact is the role of the rigid body in the quantum measurement process? It exhibits no dynamics, acting only as a boundary condition (slits) or static Hamiltonian (magnetic field) or at most a low-frequency driving field. It is not expected to undergo a quantum transition on scattering against the system; in fact, if it does so the measurement is spoiled—as in the Debye-Waller factor of x-ray scattering or the Mössbauer effect, which cause reductions in the number of measurements proportional to the number of recoil quanta. It is this kind of "zero-phonon process," in which an arbitrary unquantized amount of momentum or other quantum variable may be absorbed by the macroscopic order parameter, which is responsible for some of the puzzling features of the measurement process. As far as I can see, it is a crucial part of all measurements. The question becomes, then: are space and time, for instance, inevitably the variables to measure, or are they a consequence of the dynamics of large systems, i.e., emergent concepts which were meaningless before condensation occurred? Did they exist in the Big Bang?
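For reference, the standard harmonic-approximation form of the Debye-Waller factor (a textbook formula, not specific to this essay) gives the fraction of scattering events that are recoilless, i.e., in which the rigid body as a whole absorbs the momentum transfer $\hbar\mathbf{q}$ with no internal quantum transition:

$$ \frac{I_{\text{zero-phonon}}}{I_{\text{total}}} = e^{-2W}, \qquad 2W = \langle(\mathbf{q}\cdot\mathbf{u})^2\rangle, $$

where $\mathbf{u}$ is the thermal displacement of a scattering atom. Such zero-phonon events are exactly the role the rigid apparatus plays here: the macroscopic body takes up the momentum without being changed as a quantum system.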

That this is not an idle question is made clear by the example of phase of a superconductor or superfluid. This is a quantity which can be deliberately switched on or off as a macroscopic thermodynamical variable by manipulating temperature. A given piece of metal can be a reference system for phase measurement in the same sense that a solid body is one for position measurement, or not. And, if there is no superconductor in the vicinity, there is no meaning to the phase variable.
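In standard Ginzburg-Landau notation (a sketch, not Anderson's own formulation): below the transition temperature $T_c$ the condensate acquires a complex order parameter

$$ \Psi(\mathbf{r}) = |\Psi(\mathbf{r})|\,e^{i\varphi(\mathbf{r})}, \qquad |\Psi| = 0 \ \text{for} \ T > T_c, $$

so the phase $\varphi$ simply does not exist as a variable above $T_c$; cooling the metal through $T_c$ literally switches the measurable quantity on.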

A magnetic field may be measured by means of a ferromagnetic needle which has the relevant broken symmetry, by a current generated by a dissipative process, or by a charged beam, ditto; each of these is a spontaneous, emergent source of a field as a macroscopic coherent variable [2].

We cannot imagine another system of measurable variables; but then, we wouldn't be able to, would we? The example of superconductivity is enough of an argument that new measurable entities can arrive spontaneously.

3) The third source of unpredictability is the notorious sensitivity to boundary conditions. This has been discussed by David Ruelle [3] and needs no further comment from me. Even at its most predictable, the world is not predictable, but much can still be done about it.

4) Emergence of concepts and structure. This to some extent melds into (2), but goes somewhat further. The question here is not only the absolute unpredictability of frozen accidents—could anyone have predicted the genetic code in detail?—but, even more serious, the unpredictability of what kind of accident could happen—could anyone have predicted that a code would happen? The structure of Nature is so completely hierarchical, built up from emergence upon emergence upon emergence, that the very concepts and categories on which the next stage can take place are themselves arbitrary. A functional eye can be built upon several entirely different structural principles, and which a given organism uses depends on its evolutionary history, not on any traceable logical process. But there are much deeper examples, such as the concept of "agenthood" which underlies life: is "wetware" life the only example? The emergent concept, at the higher levels, becomes completely independent of its physical substrate: money can be wampum, stones, paper, gold, … Categories and concepts can have enormous and permanent evolutionary effects. Agriculture is a concept with provably independent substrates of domesticates, which arose and caused population explosions and ecological disasters, and enormous modifications in organism populations, wherever it occurred.

I would argue that predictability is certainly an illusion at even the "Darwinian" level of evolution and from there on. Futurology is a mug's game. (As one can easily tell by looking at its exponents.)

There is a fascinating side issue here: how well can we extrapolate, even within the narrow confines of the physical sciences? We have some remarkable successes: the black hole, the neutron star, the anisotropic superfluid 3He, all of which were predicted long before observation; as was the Josephson effect, shortly before. However, in each case there were vital aspects which remained for observation to tell us. But my impression is that the dismal failures, the cases where we had to receive a very heavy hint from Nature before we caught on, are much more common. It took 15 years, from 1923 to 1938, before Bose-Einstein's idea of condensation was even mentioned in connection with superfluidity, and another ten for superconductivity (Landau, late 1940s). In 1956, 45 years after its observation, Feynman publicly declared his inability to calculate superconductivity. I won't mention how many broad hints eventually forced us to the Standard Model. In the case of the Quantum Hall Effects, the discovery was wholly due to the experimentalists—and it took a shocking period of years for the theory to be constructed. The fact of the matter is, Nature is much better at lateral thinking than is the Turing machine equivalent we are supposed to have in our heads—and it is only when we leave that machine behind that we do make real discoveries (and we don't do it with microtubules and quantum gravity).


POSTSCRIPT: VERIFICATION AND VALIDATION OF SCIENCE

This is my response to some of the recent comments on the verifiability of scientific results from philosophers like Naomi Oreskes, quoted by Horgan in his attack on the Santa Fe Institute [4,5]. "Verification and validation of numerical models of natural systems is impossible" is the quote he misuses. Oreskes is expressing an epistemological viewpoint very current among modern philosophers, one that allows validity only to tautological statements such as those of a closed mathematical system. What these philosophers miss is the false dichotomy of this distinction. Of course, solipsism is a logically acceptable view, epitomized by the figure quoted by Dennett [6] of "the brain in the vat," all of whose sense-input is controlled by a Cartesian supercomputer to correspond to natural experience, but with no outside world actually supposed to be there.

As Dennett points out, the "brain in the vat" is requiring a literally impossibly difficult task of the supercomputer, not only because of the enormous density of sensory data it is having to be fed, but because it has a "schema"—a picture of the world and a map of the relationships in it—which it is constantly and actively checking and updating successfully. The enormous level of "compression" which the schema achieves is its evidence for validity; the solipsistic view may be logical, but it is also idiotic in the sense of ignoring this compression. We can distinguish, then, three different types of candidates for validity: 1) the closed tautological world of logic and mathematics; 2) simple induction, where we have O(1) measurements per fact: what is the length of a rod, or the infectivity of AIDS, or the carcinogenicity of rhubarb? Here, of course, there is no true validation in any sense. And yet a third form of validity could be: 3) the multiply connected scheme which lies behind our system of modern science, as well as our understanding of the real world around us, where enormous compression of a lot of correlated fact has taken place, and where we can check the validity of any new theory (which Oreskes misnames a "model") by applying Ockham's razor: does it compress the description of reality, or expand it? From the "nature spirits" of primitive man to the inept theorist adding one parameter per fact, it has always been easy to tell the real theories from the failures. This is what I like to call the "seamless web" of science, and it represents an incredibly powerful criterion for truth, one which the "philosophers" of science have not yet appreciated.
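One modern formalization of this compression criterion (my gloss; Anderson does not cite it) is the minimum description length principle: prefer the theory $T$ that minimizes

$$ L(T) + L(D \mid T), $$

the length of the theory plus the length of the data $D$ encoded given the theory. The inept theorist "adding one parameter per fact" increases $L(T)$ as fast as he shrinks $L(D \mid T)$, achieving no net compression; a real theory shortens the total description.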

REFERENCES
1. J. Hartle: Sources of predictability. Complexity 3(1): pp. 22-25, 1997.
2. This aspect has been partly captured by W. Zurek in his idea of the "pointer state" and its "quantum halo." But he does not remark on the emergence properties of the pointer state. See his review article in Physics Today, Oct. 1991, p. 36.
3. D. Ruelle: Chaos, predictability, and idealization in physics. Complexity 3(1): pp. 26-28, 1997.
4. J. Horgan: The End of Science. Addison-Wesley, New York, 1996.
5. N. Oreskes, K. Shrader-Frechette, and K. Belitz: Verification, validation, and confirmation of numerical models in the earth sciences. Science 263: p. 641, 1994.
6. D. C. Dennett: Consciousness Explained. Little Brown, New York, 1991, chapter 1.
